Sir Beregond


About Sir Beregond

  • CPU
    [Intel] Core i7-4790k
  • Motherboard
  • RAM
    [Corsair] Vengeance Pro Series 16GB DDR3-2400
  • GPU
    [ASUS] GTX 980 Strix
  • Case
    [Corsair] Carbide Air 540
  • Storage
    [Crucial] M550 512GB | [WD] Black 3TB x 2
  • PSU
    [Corsair] AX750
  • Display(s)
    [LG] 32GK650F-B
  • Cooling
    [EK] Supremacy EVO | [EK] GTX 980 Strix - Fullcover w/ Backplate | [EK] D5 X-RES Top 100 w/ D5 Vario | [XSPC] EX240 x 2 | [Corsair] SP120 x 5 / AF120 x 1
  • Keyboard
    [Corsair] K70 - Red switches
  • Mouse
    [Razer] Naga Epic
  • Sound
    [Audio Technica] ATH-A700 | [Antlion] ModMic
  • Operating System
    [Windows] 8.1 Pro 64-bit
  • Laptop
    [ASUS] Zenbook UX305FA
  • Phone
    [Samsung] Galaxy S9+
  1. Blocks are very specific to whatever card design they are made for. Gen to gen, you'll get completely new PCB designs, necessitating a new block.
  2. And any brand-new 20-series cards are still being sold at their MSRP if you can find them, I've noticed. (Source: my local Micro Center.)
  3. They called it "Titan class", and yeah, that makes more sense for non-gaming use, but they also made a big deal about supposed 8K gaming with this card. So Nvidia definitely pushed a gaming angle with this card as well. 10-15% more performance (maybe; very cherry-picked, I'm sure) for 114% more money. Terrible gaming card.
  4. No worries, had plenty of those moments at work today.
  5. 20GB would be if they used 2GB chips, or did like the 3090 and used the front and back of the PCB, since I don't think 2GB chips exist yet for GDDR6X.
  6. If you look at the PCB, it clearly has blank spots available for 2 additional 1GB GDDR6X chips, which, if populated, would have made it a 12GB card, but also a 384-bit bus, so maybe they are segmenting it away from the 3090 by doing it this way. Who knows.
  7. No. It's "Titan class" per the announcement video, but it is very clearly not a Titan, since it is not a fully unlocked GA102 chip. That aligns it more with an xx80 Ti tier product, with a $300 price increase over last gen and a rebrand under a new xx90 naming scheme so people would buy into the marketing. The 24GB of vRAM is definitely Titan-like, but the not-fully-unlocked chip is more like an xx80 Ti.
  8. SLI support will only continue to be deprecated over time, so in my opinion it should not be seen as any long-term gaming feature in your logic. It is knocking on death's door. A lot of your post clearly indicates you are making assumptions. It essentially boils down to this: do you want to spend 114% more money over the 3080 for maybe 10% more performance at best and 24GB of GDDR6X? The memory alone won't carry you to the 60-series if we figure generational cycles are 2 years now. That's 6 years. The 3090 will have other, more basic performance issues before the vRAM is s
  9. Well, the other problem is that the xx80-tier card used to be the big, fully unlocked chip, and then it wasn't, starting with the GTX 680. The price never changed to reflect that. To the 3080's credit, at least it is the big chip and not the 104 mid-range chip like the 680, 980, 1080, and 2080 all were. So as far as I am concerned, pricing has been a joke for 8 years now. At least back before Turing, xx60- and xx70-class cards were priced like they should have been, not the $400/$500 respectively nonsense. Now that we've been seeing all the testing and benchmark reviews for the 3080, I'
  10. The problem with comparing current-gen pricing based on x% performance gain over last gen at $x is that it leads to pricing treadmills. Lots of people were doing this 2 years ago, trying to justify the 2080 Ti pricing because it's x% faster than the 1080 Ti. Pricing fluctuation or creep based on competition, inflation, R&D, node process and yields, etc. makes sense and would be expected. Going from $500 fully unlocked chips with Fermi, to $1000 not even buying you a fully unlocked chip with Kepler in the span of 2-3 years, to $2500 fully unlocked chips these days, does not and is
  11. I believe Nvidia discontinued production of those cards a while back in preparation for Ampere and to clear stock.
  12. Meh. I'll wait till we have reviews and GamersNexus has put the different models through the wringer. Though as a watercooler, I am more concerned with block compatibility, VRM/power delivery, and overclockability.
  13. I have no sympathy for anyone who bought a $1,200 graphics card. You knew you were getting shafted and bought it anyway. We also have no actual benchmark data on the 3070 vs. the 2080 Ti yet, so it is premature to just take Nvidia's marketing at face value and call it "faster". If you noticed on their graph, it was on the same level as the 2080 Ti, not above it. My speculation is it will win some, lose some. Its strength will probably be its next-gen Tensor/RT cores over the 2080 Ti for titles that support RT/DLSS; otherwise the rest of its stats look worse to me. It
  14. All that for a graphics card? Sheesh. I'll wait for Jan/Feb/March. Heck, by then we might get Ti variants.
  15. RTX as a feature is not that compelling to me yet. DLSS, on the other hand, definitely is. I'm somewhat hyped, but at the same time I feel Nvidia is still holding back until RDNA2 launches. I think we will see 16GB/20GB variants of the 3070 and 3080 respectively at the very least, which would make me feel better about holding onto a card for 4-5 years or more, like I did with my GTX 980. That said, one thing that concerns me is the much higher power draw for Ampere. Performance per watt will definitely be something I factor in. Remember Fermi?
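For what it's worth, the arithmetic running through the posts above (the "114% more money" figure and the chip-count/bus-width trade-off in points 5-7) can be sanity-checked with a quick sketch. The MSRPs used are the commonly cited launch prices ($699 for the 3080, $1,499 for the 3090), and the memory model assumes 1GB GDDR6X chips on a 32-bit channel each, which is the layout those posts describe:

```python
# Rough sanity check of the price and memory figures discussed above.

def price_premium(base, premium):
    """Percent more money the pricier card costs over the base card."""
    return (premium - base) / base * 100

# Launch MSRPs: RTX 3080 at $699, RTX 3090 at $1,499.
print(f"3090 over 3080: {price_premium(699, 1499):.0f}% more money")  # ~114%

def memory_config(num_channels, gb_per_channel=1):
    """GDDR6X uses a 32-bit channel per chip position, so channel count
    sets both capacity and bus width. Returns (capacity GB, bus bits)."""
    return num_channels * gb_per_channel, num_channels * 32

print(memory_config(10))     # 3080 as shipped: (10, 320) -> 10GB, 320-bit
print(memory_config(12))     # the two blank PCB spots filled: (12, 384)
print(memory_config(12, 2))  # 12 channels at 2GB each, i.e. the 3090's
                             # clamshell front-and-back layout: (24, 384)
```

This also shows why a populated 12-chip 3080 would land on the same 384-bit bus as the 3090, which is consistent with the segmentation argument in point 6.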