Alleged GeForce RTX Turing "SUPER" Refresh Specs Leaked (Update 2)

5 minutes ago, leadeater said:

I'm guessing Nvidia was counting on more people buying higher-end cards than actually are; seems Nvidia didn't quite get the pricing right for that.

Turing has sold poorly overall. The 2070 Super is the bin the 2070 should have been at launch. Nvidia got lazy; they weren't offering performance-per-dollar improvements over mid-2018 prices.


19 minutes ago, Taf the Ghost said:

AMD and Nvidia have been pretty static in their market share for a few years now. This result is really about the fact that they put 2 SKUs on the TU106 die, so their orders are imbalanced. They need to spread the bins out more, especially since the 2070 ends up eating dies that would otherwise go to Pro cards.

True, but if AMD had competitive parts at competitive prices, they would have to adjust. That's basically what we saw with the RX Vega 56 and RX Vega 64. It was a little simpler then, as the 1070 Ti did the trick on its own (along with some faster-memory SKUs on other cards), forcing people who were on the fence between cards (1070, Vega 56, 1070 Ti, Vega 64, 1080) to pick one.

 

18 minutes ago, leadeater said:

Very unlikely to go EOL; no OEM would allow that, and the product investment by Nvidia and everyone else involved would be wasted if they did.

 

There's plenty of price gap between current models to slot new SKUs in with minor to no changes to existing prices. Pricing follows a pretty basic principle: it's aligned to performance, and performance is aligned to pricing. So I can say with 99.99% confidence that Nvidia won't make the RTX 2070 SUPER faster than the RTX 2080, and the RTX 2080 won't be cheaper than the RTX 2070 SUPER.

 

Unless I'm wrong and those products do go EOL, but I just can't see that happening.

I'm not saying EOL anytime soon; maybe about 6-7 months out. While I agree they don't necessarily have to drop prices, that is sort of what happened simply from the introduction of the 1070 Ti, which I touched on (you might not have seen it, because I added that part at the last minute). Essentially, when the 1070 Ti launched, 1070 and 1080 prices were all over the place. It nearly killed both of those SKUs in many instances because all three models from various board partners were priced so inconsistently (though, on average, the 1070 Ti was the better deal). Most people said it didn't make sense to buy the 1080 or the 1070 because of where the 1070 Ti slotted in.

 

As I stated, the 2070 Super would be around 13-15% slower than a standard 2080 on paper, so I don't believe the 2070 Super will be faster than the original 2080 either. What I do believe is that a minor price gap is plausible, as we see from the 1660 to the 1660 Ti to the 2060, for example. Of course, AIB models sometimes make things a little wonky, but generally speaking, I think a price difference of $50-70 between the 2070 SUPER and the original 2080 is acceptable and plausible.
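If you want to sanity-check that on-paper figure, the usual napkin math is shader count × boost clock. Here's a minimal sketch: the RTX 2080's 2,944 cores are the known spec, the 2,560-core count for the 2070 SUPER comes from this leak, and both boost clocks are placeholder assumptions, so treat the output as illustrative only.

```python
# Napkin-math FP32 throughput comparison (illustrative only).
# RTX 2080 core count is the known spec; the 2070 SUPER figure is from the leak,
# and both boost clocks are placeholder assumptions.

def fp32_tflops(cuda_cores: int, boost_mhz: float) -> float:
    """Peak FP32 TFLOPS = cores * clock(MHz) * 2 ops per FMA / 1e6."""
    return cuda_cores * boost_mhz * 2 / 1e6

rtx_2080 = fp32_tflops(2944, 1710)        # ~10.1 TFLOPS
rtx_2070_super = fp32_tflops(2560, 1770)  # ~9.1 TFLOPS with the assumed clock

deficit = 1 - rtx_2070_super / rtx_2080
print(f"RTX 2080: {rtx_2080:.1f} TFLOPS")
print(f"RTX 2070 SUPER (leaked): {rtx_2070_super:.1f} TFLOPS")
print(f"On-paper deficit: {deficit:.0%}")  # ~10% here; ~13% at equal clocks
```

Depending on the clocks you assume, that lands anywhere from roughly 10% (if the SUPER boosts higher) to about 13% (at equal clocks), which is in the same ballpark as the 13-15% above.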


2 minutes ago, BiG StroOnZ said:

True, but if AMD had competitive parts at competitive prices, they would have to adjust. That's basically what we saw with the RX Vega 56 and RX Vega 64. It was a little simpler then, as the 1070 Ti did the trick on its own, forcing people who were on the fence between cards (1070, Vega 56, 1070 Ti, Vega 64, 1080) to pick one.

 

I'm not saying EOL anytime soon; maybe about 6-7 months out. While I agree they don't necessarily have to drop prices, that is sort of what happened simply from the introduction of the 1070 Ti, which I touched on (you might not have seen it, because I added that part at the last minute). Essentially, when the 1070 Ti launched, 1070 and 1080 prices were all over the place. It nearly killed both of those SKUs in many instances because all three models from various board partners were priced so inconsistently (though, on average, the 1070 Ti was the better deal). Most people said it didn't make sense to buy the 1080 or the 1070 because of where the 1070 Ti slotted in.

 

As I stated, the 2070 Super would be around 13-15% slower than a standard 2080 on paper, so I don't believe the 2070 Super will be faster than the original 2080 either. What I do believe is that a minor price gap is plausible, as we see from the 1660 to the 1660 Ti to the 2060, for example. Of course, AIB models sometimes make things a little wonky, but generally speaking, I think a price difference of $50-70 between the 2070 SUPER and the original 2080 is acceptable and plausible.

On the bolded part: no, they wouldn't, as the last generation showed. 80-85% of the market just looks for the GPU in their price category and any sales running on that card. Only a small slice of consumers actually crosses the current market-share boundary. Nvidia doesn't have to cut prices unless AMD really does launch the 5700 XT at $250, and I doubt they would even then, given AMD would be making almost nothing on the product.


3 minutes ago, Taf the Ghost said:

On the bolded part: no, they wouldn't, as the last generation showed. 80-85% of the market just looks for the GPU in their price category and any sales running on that card. Only a small slice of consumers actually crosses the current market-share boundary. Nvidia doesn't have to cut prices unless AMD really does launch the 5700 XT at $250, and I doubt they would even then, given AMD would be making almost nothing on the product.

 

I was speaking more in terms of adjusting by introducing a newer, more competitive product, which I believe the 1070 Ti definitely was.


9 minutes ago, BiG StroOnZ said:

 

I was speaking more in terms of adjusting by introducing a newer, more competitive product.

We're in the normal launch window for the 2080 Ti, had they not released it at launch.


26 minutes ago, BiG StroOnZ said:

While I agree they don't necessarily have to drop prices, that is sort of what happened simply from the introduction of the 1070 Ti, which I touched on (you might not have seen it, because I added that part at the last minute). Essentially, when the 1070 Ti launched, 1070 and 1080 prices were all over the place. It nearly killed both of those SKUs in many instances because all three models from various board partners were priced so inconsistently (though, on average, the 1070 Ti was the better deal). Most people said it didn't make sense to buy the 1080 or the 1070 because of where the 1070 Ti slotted in.

I think what happened with the 1070 Ti is that it was such a good value proposition that Nvidia had to lower 1080 prices to maintain sales volume, and similarly with the 1070. It's pretty expensive to maintain a product that doesn't actually sell very well; even though you're selling the same number of GPUs overall, you want to optimize that balance as best you can.

 

I actually tend to think Nvidia isn't as attuned to the mid-tier market as they believe, and doesn't really understand it that well, which, if true, I find amusing, because they sell the most GPUs in that segment. Mid-tier buyers are more pragmatic with their purchases by necessity, so even if they could afford the next card up, that product has to demonstrate its value for the cost increase. This is where RTX missed the mark so badly: these buyers can't just go up a price segment, or won't by default, just to keep buying the same model class of card. If you were a 1070 buyer, you weren't going to buy an RTX 2070 without it proving its value well enough.

 

 

Edit:

This will happen with any 'SUPER' cards also.


Even with these reported performance increases and price drops, the RTX 2070 Super is the new GTX 1080, not the 20XX version of the XX70 card.

 

The price-gouging is real.



18 minutes ago, leadeater said:

I think what happened with the 1070 Ti is that it was such a good value proposition that Nvidia had to lower 1080 prices to maintain sales volume, and similarly with the 1070. It's pretty expensive to maintain a product that doesn't actually sell very well; even though you're selling the same number of GPUs overall, you want to optimize that balance as best you can.

 

I actually tend to think Nvidia isn't as attuned to the mid-tier market as they believe, and doesn't really understand it that well, which, if true, I find amusing, because they sell the most GPUs in that segment. Mid-tier buyers are more pragmatic with their purchases by necessity, so even if they could afford the next card up, that product has to demonstrate its value for the cost increase. This is where RTX missed the mark so badly: these buyers can't just go up a price segment, or won't by default, just to keep buying the same model class of card. If you were a 1070 buyer, you weren't going to buy an RTX 2070 without it proving its value well enough.

 

 

Edit:

This will happen with any 'SUPER' cards also.

Gamers Nexus' own stats showed that the RTX launch caused the biggest spike in 1080 Ti sales. That was kinda what RTX brought.


Really hoping Nvidia does a significant price drop (but have they ever dropped prices...?).

Also, please don't give us any of that Founders Edition bullshit.


1 hour ago, Taf the Ghost said:

We're in the normal launch window for the 2080 Ti, had they not released it at launch.

Yeah, they definitely mixed things up with Turing. Yields must have been weird at first, or maybe they actually have something coming sooner (a new arch) that's affecting their "normal" release timeline (Ampere in Q3 2020?).

 

For one thing (and something to keep in mind), the die sizes are quite large compared to what we've seen in the past (TU106 = 445 mm² vs. GP106 = 200 mm² vs. GM206 = 228 mm²). Now the 2070 SUPER is using TU104 instead of TU106... which is what should have been done originally, and what we would have expected on the original 2070. Something weird is definitely going on over at HQ. This is why I'm skeptical about the rush for this 'refresh'; something is slightly off about this whole release.
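To put those die sizes into rough supply terms, here's a minimal dies-per-wafer sketch using the standard approximation for a 300 mm wafer; it assumes roughly square dies and ignores scribe lines, edge exclusion, and defect yield, so the numbers are directional only.

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Standard approximation for candidate dies on a round wafer:
    wafer area / die area, minus an edge-loss term. Ignores scribe lines,
    edge exclusion, and defect yield."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

for name, area in [("TU106", 445), ("GP106", 200), ("GM206", 228)]:
    print(f"{name} ({area} mm^2): ~{dies_per_wafer(area)} candidate dies per wafer")
```

Before defect yield is even considered, TU106 gets well under half the candidate dies per wafer that GP106 did, which goes a long way toward explaining why Turing pricing looks the way it does.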

 

Lastly, we can't dismiss the fact that AMD has the jump on the newer process node (7nm), and whatever NVIDIA is conjuring up has to be planned knowing they're on the "inferior" process.

 

1 hour ago, leadeater said:

I think what happened with the 1070 Ti is that it was such a good value proposition that Nvidia had to lower 1080 prices to maintain sales volume, and similarly with the 1070. It's pretty expensive to maintain a product that doesn't actually sell very well; even though you're selling the same number of GPUs overall, you want to optimize that balance as best you can.

 

I actually tend to think Nvidia isn't as attuned to the mid-tier market as they believe, and doesn't really understand it that well, which, if true, I find amusing, because they sell the most GPUs in that segment. Mid-tier buyers are more pragmatic with their purchases by necessity, so even if they could afford the next card up, that product has to demonstrate its value for the cost increase. This is where RTX missed the mark so badly: these buyers can't just go up a price segment, or won't by default, just to keep buying the same model class of card. If you were a 1070 buyer, you weren't going to buy an RTX 2070 without it proving its value well enough.

 

Edit:

This will happen with any 'SUPER' cards also.

This is essentially what it came down to with my recent video card upgrade (GTX 780 to GTX 1660 Ti). Firstly, at least for me (though I know many others feel the same), triple-A gaming is not as popular as it was four, five, even six years ago. There are so many good indie and free-to-play games that simply don't require excessive GPU horsepower. It's not that these games have bad graphics, either; they have a unique art style or aesthetic, or use (surprisingly) well-optimized engines. Basically, you can get a great gaming experience out of a card like an RX 570, 580, 1060, 1660, 1070, etc.

 

I wanted a 2060, but I knew deep down I really didn't need it; it simply didn't make sense for me to go one card up from the 1660 Ti. Don't get me wrong, I have a ton of backlog games I want to play that would benefit (mainstream, triple-A titles), but I honestly don't mind disabling AA or post-processing, or using SSAO instead of HBAO, for example, to get a smoother gaming experience (I was on a GTX 780 since 2013; I think I'll manage just fine). Besides, the main games I play get new content all the time, and I can't even catch up on the DLC/expansions, to be honest. So even when I do finish my main games and finally get to the backlog, it's more of a no-rush attitude at my current age (stop and smell the roses, if you will). Those games aren't going anywhere; the only thing that will change is that prices will drop, including for future DLC/expansions.

 

While I think the RTX stuff may have missed the mark, they made up for it with TU116. People will be happy (or at least "happier") if the prices of the original RTX cards settle as a result of the SUPER launch, and if not, we'll have to see what Navi brings to the table.


11 minutes ago, BiG StroOnZ said:

This is essentially what it came down to with my recent video card upgrade (GTX 780 to GTX 1660 Ti). Firstly, at least for me (though I know many others feel the same), triple-A gaming is not as popular as it was four, five, even six years ago. There are so many good indie and free-to-play games that simply don't require excessive GPU horsepower. It's not that these games have bad graphics, either; they have a unique art style or aesthetic, or use (surprisingly) well-optimized engines. Basically, you can get a great gaming experience out of a card like an RX 570, 580, 1060, 1660, 1070, etc.

Not just that, but even the bigger titles really don't need that much GPU power at 1080p and 1440p; a lot of the highest detail settings don't really do anything anyway. I mean, I still max out everything I can whenever I can, but I know most of it does nothing worthy of note lol


1 hour ago, leadeater said:

Not just that, but even the bigger titles really don't need that much GPU power at 1080p and 1440p; a lot of the highest detail settings don't really do anything anyway. I mean, I still max out everything I can whenever I can, but I know most of it does nothing worthy of note lol

We're right at the edge of 1080p becoming trivial, in the same way we don't talk about 720p or 640x480. 4K is doable on high-end hardware, so in 2 generations 1080p will be completely trivial.


2 minutes ago, Taf the Ghost said:

We're right at the edge of 1080p becoming trivial, in the same way we don't talk about 720p or 640x480. 4K is doable on high-end hardware, so in 2 generations 1080p will be completely trivial.

That's not true. There are endless numbers of 1080p monitors and TVs; only the most bottom-of-the-barrel LCD TVs are still 720p. 1080p is not going anywhere for a while. We'll move to 4K when mid-range graphics cards can run it maxed out at 60 fps, which again isn't happening any time soon...


8 minutes ago, Taf the Ghost said:

We're right at the edge of 1080p becoming trivial, in the same way we don't talk about 720p or 640x480. 4K is doable on high-end hardware, so in 2 generations 1080p will be completely trivial.

I think 1080p has been 'trivial' since the RX 570, a card that can do either 1080p ultra at 60 fps or higher refresh rates with some graphics tweaking. Anyone sticking to 1080p (which is still fine, honestly, especially something like 2560x1080) will have absolutely no problem playing the latest games at great image quality for as cheap as a console.

 

Especially with the further optimization, and the quality improvements, expected with the next generation of mainstream consoles.


4 minutes ago, RejZoR said:

That's not true. There are endless numbers of 1080p monitors and TVs; only the most bottom-of-the-barrel LCD TVs are still 720p. 1080p is not going anywhere for a while. We'll move to 4K when mid-range graphics cards can run it maxed out at 60 fps, which again isn't happening any time soon...

It has nothing to do with monitors; GPUs are near the point where even the lower-end ones can do 1080p on high settings without much issue, hence 1080p becoming trivial. Also, the next step isn't 4K, even though that's what gets touted so much; we never jumped resolutions that far before, and back then there wasn't some fixed standard resolution like we treat it today. 1440p is the next resolution step, and that is still reasonably challenging even for good mid-range cards.
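For concreteness, the raw pixel counts show how big each step actually is; this is just arithmetic on the standard resolutions, nothing else assumed.

```python
# Pixel counts relative to 1080p; nothing assumed beyond the standard resolutions.
resolutions = {
    "1080p (1920x1080)": 1920 * 1080,
    "1440p (2560x1440)": 2560 * 1440,
    "4K (3840x2160)":    3840 * 2160,
}
base = resolutions["1080p (1920x1080)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels / 1e6:.1f} MP, {pixels / base:.2f}x 1080p")
```

1440p is roughly a 1.8x jump in pixels shaded per frame, while 4K is a 4x jump, which is why 1440p is the natural next step and 4K is still a stretch for mid-range cards.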


7 minutes ago, RejZoR said:

That's not true. There are endless numbers of 1080p monitors and TVs; only the most bottom-of-the-barrel LCD TVs are still 720p. 1080p is not going anywhere for a while. We'll move to 4K when mid-range graphics cards can run it maxed out at 60 fps, which again isn't happening any time soon...

1080p res monitors are going nowhere anytime soon. That wasn't my point. My point was gaming at 1080p/High is going to be completely trivial for GPUs. We're moving into high refresh 1440p gaming already in the mid-range. CPUs will be the limiters at 1080p within the next 2 years, not the mid-range GPUs.


1 minute ago, leadeater said:

It has nothing to do with monitors; GPUs are near the point where even the lower-end ones can do 1080p on high settings without much issue, hence 1080p becoming trivial. Also, the next step isn't 4K, even though that's what gets touted so much; we never jumped resolutions that far before, and back then there wasn't some fixed standard resolution like we treat it today. 1440p is the next resolution step, and that is still reasonably challenging even for good mid-range cards.

There's this weird situation where GPUs might actually be running ahead of the monitor market for the first time. For most of the last two decades GPUs struggled to game properly on the then upper-mid range of monitors. While both monitors and GPUs have gotten more expensive, it's really the monitors that are behind the GPUs, since 1440p is only starting to become fairly common (i.e. prices are finally starting to make sense for fairly minor improvements over 1080p, and really, that's only because of better panels).


2 hours ago, Taf the Ghost said:

Turing has sold poorly overall.

 

Please elaborate. I ran a poll on this forum previously, and even at the time (closer to the launch of the GTX Turing cards) there were as many Turing respondents as there were for AMD in total. Not a bad position, even if likely skewed by the LTT demographic.

 

12 minutes ago, Taf the Ghost said:

There's this weird situation where GPUs might actually be running ahead of the monitor market for the first time.

From my perspective, GPUs are still tracking monitors, at the high end at least. With my 1080 Ti, I didn't find it good enough for 4K high/ultra at around 60 fps, let alone higher refresh rates. The jump to a 2080 Ti might just about provide that.

 

Now, for someone who thinks 1080p is plenty, more GPU power doesn't necessarily help, except indirectly as you pay ever less for that performance. If there were a GPU power surplus, I wonder if game devs could make use of it to offer better visuals, e.g. increasing poly counts, or just doing more "stuff".



32 minutes ago, Taf the Ghost said:

1080p res monitors are going nowhere anytime soon. That wasn't my point. My point was gaming at 1080p/High is going to be completely trivial for GPUs. We're moving into high refresh 1440p gaming already in the mid-range. CPUs will be the limiters at 1080p within the next 2 years, not the mid-range GPUs.

What about the used market? Those cards will still have a hard time, depending on the card.



6 minutes ago, porina said:

 

Please elaborate. I ran a poll on this forum previously, and even at the time (closer to the launch of the GTX Turing cards) there were as many Turing respondents as there were for AMD in total. Not a bad position, even if likely skewed by the LTT demographic.

 

From my perspective, GPUs are still tracking monitors, at the high end at least. With my 1080 Ti, I didn't find it good enough for 4K high/ultra at around 60 fps, let alone higher refresh rates. The jump to a 2080 Ti might just about provide that.

 

Now, for someone who thinks 1080p is plenty, more GPU power doesn't necessarily help, except indirectly as you pay ever less for that performance. If there were a GPU power surplus, I wonder if game devs could make use of it to offer better visuals, e.g. increasing poly counts, or just doing more "stuff".

We'll know more in August with Nvidia's Q2 financials, but Q1 showed a pretty big hole in their dGPU sales. We'll see what it looks like for the first half of the year soon enough. But the reports have been pretty consistent that Turing is selling slower than Pascal did, which isn't too surprising: Nvidia wants people to pay the same price for the same performance.

 

The main thing with monitors is that they're now quite disconnected from TVs. With the rise of 4K TVs (I believe this year is the first time 4K will be the majority), the console space has a much bigger need for high-end GPUs than the PC gaming space does. 4K screens are nice for a computer, but they'll never make sense for the vast majority of sales, at least not until the tech gets super cheap and manufacturers just stop mass-producing 1080p panels for economic reasons.


39 minutes ago, porina said:

Please elaborate. I ran a poll on this forum previously, and even at the time (closer to the launch of the GTX Turing cards) there were as many Turing respondents as there were for AMD in total. Not a bad position, even if likely skewed by the LTT demographic.

Not to kick AMD/RTG when they're down, but selling only about as much as them is pretty much 'not selling well'? More joking than serious. When more accurate market data comes out, it'll be interesting no matter the outcome.


13 minutes ago, Taf the Ghost said:

Nvidia wants people to pay the same price for the same performance. 

The value-add was RTX, and I guess that is proving harder to swallow while support for it builds up. From a quick look at one retailer right now, the 2060 doesn't seem bad actually; it is significantly cheaper than the 1070 was even before mining inflation hit. When I got a 2070 it was near cost parity with the then still current 1080, so even without a general uplift I still got RTX and better power efficiency. I'm kinda debating selling off my Pascal cards while they're not worthless, as the power efficiency of Turing is a noticeable jump. Literally giving away my Maxwell cards...

 

13 minutes ago, Taf the Ghost said:

The main thing with monitors is that they're now quite disconnected from TVs. With the rise of 4K TVs (I believe this year is the first time 4K will be the majority), the console space has a much bigger need for high-end GPUs than the PC gaming space does. 4K screens are nice for a computer, but they'll never make sense for the vast majority of sales, at least not until the tech gets super cheap and manufacturers just stop mass-producing 1080p panels for economic reasons.

I hadn't thought about console display needs that way before, but wouldn't the need be mitigated somewhat by the fact that people tend to sit much further from TVs? They also already use render-scaling techniques to give a good experience with limited GPU resources.

 

1080p has been the standard for PC displays for what feels like forever. Personally I feel 1440p is the sweet spot, keeping 100% scaling while still having a usable pixel size. I hate using 4K as Windows scaling still really sucks. If MS ever sorts that out, it would be good even for the masses, for the kind of improved perceived sharpness we more commonly see on mobile phones. I'd also like to see better panels in general. Having used OLED for the first time on my current phone, why can't my computer monitor be that good?

1 minute ago, leadeater said:

Not to kick AMD/RTG when they're down, but selling only about as much as them is pretty much 'not selling well'? More joking than serious. When more accurate market data comes out, it'll be interesting no matter the outcome.

 

I don't want to over-analyse the poll I ran previously, as it has many flaws, but note I didn't ask "what did you buy recently" but "what do you use". So on that note, as many people voted for the shiny new Turing as for Polaris+Vega over their entire lifetime up to that point. The poll also doesn't consider what people might have used in the past, so it could be that some Polaris/Vega owners moved on to Turing and would no longer be counted.



21 minutes ago, porina said:

The value-add was RTX, and I guess that is proving harder to swallow while support for it builds up. From a quick look at one retailer right now, the 2060 doesn't seem bad actually; it is significantly cheaper than the 1070 was even before mining inflation hit. When I got a 2070 it was near cost parity with the then still current 1080, so even without a general uplift I still got RTX and better power efficiency. I'm kinda debating selling off my Pascal cards while they're not worthless, as the power efficiency of Turing is a noticeable jump. Literally giving away my Maxwell cards...

 

I hadn't thought about console display needs that way before, but wouldn't the need be mitigated somewhat by the fact that people tend to sit much further from TVs? They also already use render-scaling techniques to give a good experience with limited GPU resources.

 

1080p has been the standard for PC displays for what feels like forever. Personally I feel 1440p is the sweet spot, keeping 100% scaling while still having a usable pixel size. I hate using 4K as Windows scaling still really sucks. If MS ever sorts that out, it would be good even for the masses, for the kind of improved perceived sharpness we more commonly see on mobile phones. I'd also like to see better panels in general. Having used OLED for the first time on my current phone, why can't my computer monitor be that good?

 

I don't want to over-analyse the poll I ran previously, as it has many flaws, but note I didn't ask "what did you buy recently" but "what do you use". So on that note, as many people voted for the shiny new Turing as for Polaris+Vega over their entire lifetime up to that point. The poll also doesn't consider what people might have used in the past, so it could be that some Polaris/Vega owners moved on to Turing and would no longer be counted.

I'd sell any Pascal cards while they still have value. The real benefit of Turing was async compute finally working well. Having talked to a few people who follow GPU tech, RTX v1 is DOA. To manage the BVH intersections, the RT cores induce so many wasted cycles that they drastically slow down the GPU even when nothing is being traced (this behavior shows up in many tests of the tech). It's a hardware-level issue, which means RTX v2 is the first place it'll make sense, and that's mostly what Nvidia is setting up with the next round of support. (It should work much better in Ampere at the end of 2020, though AMD may have better RT hardware out before Nvidia. Next year could be interesting for GPUs.)

 

In about 2 years, 1440p will be the standard monitor resolution at 24 inches and above, whereas we hit that point with 1080p around 2010. 1440p won't be the last step, as 4K will eventually overtake it just because of economies of scale in panel production, but 4K is probably the realistic end point for consumer desktop monitors. However, as mentioned, consoles will keep pushing towards higher resolutions, and towards the technology to reach them without directly calculating every one of those pixels in the pipeline. It's going to be a strange dynamic over the next 2 console generations.

