
RTX GPUs should see a 35%-45% improvement in current games over the GTX 10 series without DLSS and other RTX enhancements - Tom Peterson

D13H4RD

Taking the lower number (35%), because, you know... Nvidia... that's REALLY not enough of a performance boost to justify the price. It REALLY isn't.

CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w Corsair RM 750w Gold (2021)|

VDU: Panasonic 42" Plasma | GPU: Gigabyte 1080ti Gaming OC & Barrow Block (RIP)...GTX 980ti | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + WD Blue 1TB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P | NF-A12x25 fans |


Just now, SolarNova said:

Taking the lower number (35%), because, you know... Nvidia... that's REALLY not enough of a performance boost to justify the price. It REALLY isn't.

Maybe not for you, but people buy new CPUs for smaller gains than that.

 


11 minutes ago, SolarNova said:

Taking the lower number (35%), because, you know... Nvidia... that's REALLY not enough of a performance boost to justify the price. It REALLY isn't.

I would agree if the RTX lineup didn't have Tensor and RT cores. With the Tensor cores, that 35% may well jump to 70%, and that's without counting RT at all. It may not be worth it to you, but it certainly is for others, considering most 2080 Tis are already out of stock.


I wait with bated breath for the independent benchmarks. But if this is true, then Nvidia might have finally found a way to convince me to part with more money for a luxury item.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


13 hours ago, D13H4RD2L1V3 said:

That's usually been the case. 

 

Pascal's improvements, for instance, saw the GTX 1060 6GB being roughly equivalent to a 980 and the 1070 being roughly equivalent to a 980 Ti.

Yep, I had GTX 980 Tis back then (very good ones that broke past 1500MHz) that traded blows with GTX 1070s.

Still, some more game benchmarks would be nice even if I am skipping this gen. At least they would give me an idea of what to look forward to when 7nm drops, which should bring more refinements to the newer features as well.

 

Though, I am more interested in what the RTX cards can do in folding and BOINC. Those applications probably won't leverage the new cores, but the extra CUDA cores should bump the output a bit.

 

6 hours ago, d0mini said:

I’ve been using this forum much more since overclock.net had a redesign. Just thought I’d say I really appreciate this community, you guys are great. That’s all :)

Ah, another from OCN. Yeah, the site redesign over there is bad. By the looks of it, a good chunk of folks left or don't post much anymore.

2023 BOINC Pentathlon Event

F@H & BOINC Installation on Linux Guide

My CPU Army: 5800X, E5-2670V3, 1950X, 5960X J Batch, 10750H *lappy

My GPU Army: 3080Ti, 960 FTW @ 1551MHz, RTX 2070 Max-Q *lappy

My Console Brigade: Gamecube, Wii, Wii U, Switch, PS2 Fatty, Xbox One S, Xbox One X

My Tablet Squad: iPad Air 5th Gen, Samsung Tab S, Nexus 7 (1st gen)

3D Printer Unit: Prusa MK3S, Prusa Mini, EPAX E10

VR Headset: Quest 2

 

Hardware lost to Kevdog's Law of Folding

OG Titan, 5960X, ThermalTake BlackWidow 850 Watt PSU


These numbers sound good... except it's doublespeak. All the cards went up a tier, especially in price. The 2080 Ti replaces the Titan, the 2080 replaces the 1080 Ti, etc. So if the 2080 is 40% better than the 1080, that's meaningless: it's 1080 Ti money. It needs to be ~25% better than the 1080 Ti to be worth it, but it will most likely be a close tie with the 1080 Ti.

Main Rig: http://linustechtips.com/main/topic/58641-the-i7-950s-gots-to-go-updated-104/ | CPU: Intel i7-4930K | GPU: 2x EVGA Geforce GTX Titan SC SLI| MB: EVGA X79 Dark | RAM: 16GB HyperX Beast 2400mhz | SSD: Samsung 840 Pro 256gb | HDD: 2x Western Digital Raptors 74gb | EX-H34B Hot Swap Rack | Case: Lian Li PC-D600 | Cooling: H100i | Power Supply: Corsair HX1050 |

 

Pfsense Build (Repurposed for plex) https://linustechtips.com/main/topic/715459-pfsense-build/

 

 

 

 


8 hours ago, d0mini said:

Having them clocked to the wall at stock would make them look worse for power consumption; instead, they can advertise more overclocking headroom and leave the choice of higher clocks or lower power usage to users.

When I see really good performance I really don't give a damn about power draw :).

 

Maybe they don't want such a huge clock and performance difference between desktop and mobile GPUs as well, could be something like that?


I'm going to laugh if they are perpetually sold out for months after launch and everyone who blasted me for preordering can't find one.


The "without DLSS and other enhancements" caveat only matters if Nvidia can't get game makers to implement these (fairly simple) features in their engines.


4 hours ago, leadeater said:

When I see really good performance I really don't give a damn about power draw :).

 

Maybe they don't want such a huge clock and performance difference between desktop and mobile GPUs as well, could be something like that?

Maybe, but remember Nvidia is on limited cross-patent licenses with Samsung now.

 


Just now, pas008 said:

Maybe, but remember Nvidia is on limited cross-patent licenses with Samsung now.

 

What? Samsung? Nvidia's mobile GPUs go in laptops; they are mobile SKUs and are referred to as such by Nvidia.

 

Cellphones? Pfffff, I never talked about those B|.


2 minutes ago, leadeater said:

What? Samsung? Nvidia's mobile GPUs go in laptops; they are mobile SKUs and are referred to as such by Nvidia.

 

Cellphones? Pfffff, I never talked about those B|.

But they have cross-patent licenses.

Maybe that's how they are able to stack that shit up on a wafer.

Take some Intel, Samsung, and TSMC process tech together and bam, these huge dies.


1 minute ago, pas008 said:

But they have cross-patent licenses.

Maybe that's how they are able to stack that shit up on a wafer.

What does that have to do with what I was replying to? The conversation was about overclocking and the commonly achievable clocks well above stock. Nvidia could raise the spec as suggested, since pretty much all of these chips can hit ~2GHz, but the mobile SKUs could never get near that (cooling + power). Even now there is a small difference between a desktop 1080 and a mobile 1080 (both the mobile SKU and a desktop SKU in a laptop), but if you raise the desktop clock standard up to that 2GHz mark, all the 1080s in laptops would look rather bad next to their desktop counterparts. That would look bad for Nvidia and the laptop makers (they get enough flak about bad cooling as it is).


5 hours ago, ltguy said:

These numbers sound good... except it's doublespeak. All the cards went up a tier, especially in price. The 2080 Ti replaces the Titan, the 2080 replaces the 1080 Ti, etc. So if the 2080 is 40% better than the 1080, that's meaningless: it's 1080 Ti money. It needs to be ~25% better than the 1080 Ti to be worth it, but it will most likely be a close tie with the 1080 Ti.

In Australia, the 2080 is the same price as the 1080 Ti, performs a little better, and has all the new tech goodies. Not exactly doublespeak.


1 minute ago, leadeater said:

What does that have to do with what I was replying to? The conversation was about overclocking and the commonly achievable clocks well above stock. Nvidia could raise the spec as suggested, since pretty much all of these chips can hit ~2GHz, but the mobile SKUs could never get near that (cooling + power). Even now there is a small difference between a desktop 1080 and a mobile 1080 (both the mobile SKU and a desktop SKU in a laptop), but if you raise the desktop clock standard up to that 2GHz mark, all the 1080s in laptops would look rather bad next to their desktop counterparts. That would look bad for Nvidia and the laptop makers (they get enough flak about bad cooling as it is).

Just rambling, that's all this is.

Intel is only coming into the game because of the push for RT, which they had tried before with Larrabee. Nvidia blocked them then.

But on your mobile comment: they could be clocking their cards accordingly for mobile considerations, which is why we see hard locks in firmware.

 

My point was that Intel sacrifices their node with some extra spacing, which allows for the higher clocks (even on 14nm+++) but gives up some efficiency.

I'm just having drunk talk that Nvidia is playing the cross-patent game to get what they need and insert it into their AI programs to achieve their purpose.


2 hours ago, VegetableStu said:

I'll counterlaugh when the 2180ti DOUBLES performance over the 2080ti

 

or the 2080 mark ii inherits the 2080ti's performance

I will laugh and say at least I got to live with ray tracing a little sooner :P


7 hours ago, leadeater said:

When I see really good performance I really don't give a damn about power draw :).

 

Maybe they don't want such a huge clock and performance difference between desktop and mobile GPUs as well, could be something like that?

Hey moderator, thanks for the reply :)

 

You or I might not care about power draw, but looking at Nvidia's history of GPUs, it seems clear that, for whatever reason, they want to keep their GPUs within a certain TDP (around 150-165W for x80 and 250W for x80 Ti).

 

This is a new release and they are competing against themselves; they have no reason to deviate from their historical TDP trend other than to ensure an acceptably higher level of performance over the 10-series cards.

 

With that in mind, you could even argue that Nvidia is already pushing these cards harder than they would ideally have liked in order to meet adequate performance levels. If the 10 series were worse than it is, we might have seen even lower-clocked 20-series cards with TDPs more in line with previous generations.


9 hours ago, ltguy said:

These numbers sound good... except it's doublespeak. All the cards went up a tier, especially in price. The 2080 Ti replaces the Titan, the 2080 replaces the 1080 Ti, etc. So if the 2080 is 40% better than the 1080, that's meaningless: it's 1080 Ti money. It needs to be ~25% better than the 1080 Ti to be worth it, but it will most likely be a close tie with the 1080 Ti.

This.

 

When my argument was "the 2080 Ti is too expensive compared to previous generations", people for some strange reason started trying to defend Nvidia and justify it (I mean, what kind of consumer openly goes out of their way to defend spending more money for something that is usually less? It's absurd) by saying "the 2080 Ti is a Titan". That doesn't justify it; it simply moves the tier list, which still means things are screwed up.

 

As the above quoted poster correctly stated, if what would have been the Titan is now the 2080 Ti, then what would have been the 2080 Ti is now the 2080, so it 'should' be compared to the 1080 Ti, not the 1080.

When you make that comparison, you see it is more expensive than it 'should' be and will likely not have the performance jump you would expect. I hope I'm wrong here, I really do.


I don't understand the "35% increase is not worth the price" crowd. Since when has the ratio of performance increase to price increase been linear? People have been paying exorbitant prices for the last few percent of performance for as long as enthusiasts have existed. This has been true in almost any industry.

 

Looking at the highest end of performance through the metric of value is silly.

i9-9900k @ 5.1GHz || EVGA 3080 ti FTW3 EK Cooled || EVGA z390 Dark || G.Skill TridentZ 32gb 4000MHz C16

 970 Pro 1tb || 860 Evo 2tb || BeQuiet Dark Base Pro 900 || EVGA P2 1200w || AOC Agon AG352UCG

Cooled by: Heatkiller || Hardware Labs || Bitspower || Noctua || EKWB



 

RTX GPUs should see a 35%-45% improvement in current games over the GTX 10 series without DLSS and other RTX enhancements - Tom Peterson

 

All these releases and comparisons showcasing the power of RTX... why didn't they just tack this on at the announcement? What settings were used for these benches? On what hardware? Why no side-by-side comparisons with the only difference being the GPU?... *eats some mud* Meh, I'll preorder it and find out later!

Bolivia.


3 hours ago, SolarNova said:

I mean, what kind of consumer openly goes out of their way to defend spending more money for something that is usually less? It's absurd.


Well, it really is not absurd at all.

Comparing something new, with new features and options, to something that does not include them... that is absurd.

 

There are plenty of reasons the new cards are more expensive; at least four come to mind right away. You don't have to like it, but it is what we have right now.

The market always regulates itself. If people pay it, then the price is fine. If they don't the price will drop.

 

As of right now (considering most 2080 Tis are sold out), I'd say enough people appreciate the new features and options. For the rest, the 1080 Ti is still alive and kicking.

I don't go out of my way to hate BMW because the new i8 is too expensive for me and not worth the higher price, just because I only drive a 530i. If I ignore all the hybrid functionality of it, then yeah the 530i is superior and like 1/3rd the price. But those that like the features don't even consider a 5xx version, because it simply does not have the features of an i8.

 

The same goes for RTX.

They are damn expensive, not gonna lie. But they also pack unique features PLUS a performance jump.

And again, using your reasoning, it is absurd to compare RTX to GTX and ignore all the new stuff.


19 hours ago, SolarNova said:

Taking the lower number (35%), because, you know... Nvidia... that's REALLY not enough of a performance boost to justify the price. It REALLY isn't.

You do understand you're paying for a massive core, right?


OK, I'm just going to post the past price points of high-end cards.

 

The arguments being made are absurd. Yes, it's a bigger core; yes, it has new features; yes, yes. That's irrelevant. Every new series has had something over the older one, but that doesn't mean each one had a price hike over the last, and most certainly not to this extent. Observe.

 

7800 GTX $600

8800 GTX $650 (Ultra $830)

GTX 280 $650

GTX 480 $500

GTX 580 $500

GTX 680 $500

GTX 780ti $700

GTX 980ti $650

GTX 1080ti $700

 

RTX 2080 Ti $1200 (RTX 2080 $800)

 

Facts and figures don't lie. These are historical data points. Each card listed was placed to replace the prior one, both in terms of performance and price point. With AMD's lack of high-end performance in the 200 series vs the GTX 700 series, there was a price hike back up to around pre-GTX-200-series levels.

Now the 2080 Ti comes along and people make the excuse "well, it's a Titan". OK, exclude it from the list if you must, and use the 2080 as the replacement for the 1080 Ti. It is still a $100 MSRP price hike.

So far, word from Nvidia is that the 2080 is 35-45% faster than the 1080. Since it is the manufacturer saying it, assume at most 35% over the 1080. The 1080 Ti is 'at least' 30% faster than the 1080. So, by Nvidia's own numbers, the 2080 is effectively about 5% faster than the 1080 Ti.

So you're OK with paying $100 more for 5%? 5%, that's 5 FPS if you can run a game at 100 FPS on a 1080 Ti. And you're OK defending that?

 

Historically, at the price point of the previous generation's primary flagship card (+/- $50), the new gen has delivered a not-insignificant performance bump of 'at least' 25%.

The 2080 (if you think the 2080 Ti should be forgiven its price because "it's a Titan") is, until actual user benchmarks arrive, around 5% faster than the 1080 Ti for at least a $100 bump.

 

These are the facts and figures I'm using here; please do correct me if I'm wrong. Really. Feel free to nitpick, and I'll adjust the figures. I'm sure the results will be the same, but I'm open to being proven wrong.
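For what it's worth, the back-of-the-envelope math in the post above can be checked with a quick sketch. The inputs are the MSRPs and Nvidia's marketing claims quoted in the post, not measured benchmarks; the implied performance ratio actually lands just under the cited 5%:

```python
# Back-of-the-envelope check of the figures above.
# All inputs are claims/MSRPs quoted in the post, not measured benchmarks.

msrp_1080ti = 700
msrp_2080 = 800

# MSRP hike of the 2080 over the card it arguably replaces (1080 Ti):
price_hike = msrp_2080 / msrp_1080ti - 1
print(f"2080 vs 1080 Ti MSRP: +{price_hike:.0%}")  # about +14%

# Implied performance of the 2080 over the 1080 Ti, using Nvidia's
# low-end claim (2080 = 1.35x a 1080) and the common ~30% 1080 Ti uplift:
gain_2080 = 1.35
gain_1080ti = 1.30
implied = gain_2080 / gain_1080ti - 1
print(f"Implied 2080 vs 1080 Ti: +{implied:.0%}")  # about +4%
```

Note the implied gain is a ratio (1.35/1.30), not a simple subtraction of percentages, which is why it comes out slightly under 5%.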

 

 


3 minutes ago, SolarNova said:

These are the facts and figures I'm using here; please do correct me if I'm wrong. Really. Feel free to nitpick, and I'll adjust the figures. I'm sure the results will be the same, but I'm open to being proven wrong.


You are not wrong on the numbers.

 

You just ignore the first graphical improvement in a decade, a 50% bigger die size, the lack of competition, and Nvidia not wanting to push their own cards (1080, 1080 Ti) out of the market themselves.

 

Prices are higher, no one doubts that. Everyone can see it. There is no discussion to be had there.

The discussion we can (and should) have is whether the mentioned differences are worth the price. And that is likely down to each individual buyer.

 

For me? Hell yeah. The Tensor cores alone are worth it, as I work on machine learning. They mean I can work from home and not spend $3k on a Titan V.

I also run 1440p with AA enabled, so DLSS is a godsend for me and will make the performance jump extremely nice.

I also enjoy better graphics and HATE how shadows and light are handled currently. This seems to be an argument I am alone on, but I personally got excited by the room they showed with the only light coming through the window. If it were only that, it would be worth it FOR ME.

 

How about the average CS:GO player who only runs low settings to get 300+ FPS?

They will hate RTX. It is a big price jump for added features they don't use at all; all they care about is FPS. Then again, do they need a new card though? The 1080 Ti is still around and should be enough to run CS:GO at 300+ FPS on low, no?

 

The price hike is hugely due to the new features, which are NOT OPTIONAL. Not gonna lie here. And I fully understand that people don't want to pay for them. But then, simply don't.

We can argue whether making them optional would be possible, but I honestly don't see that as a good strategy for a company trying to establish a new tech.

Maybe they will put the 2030, 2050, and 2060 under the GTX brand, who knows. But I actually hope they don't. I would love to see this new tech become the standard in 2 years, and we are not going to get that if it is optional. As sad as it is.

 

No one should feel entitled to a company like Nvidia doing exactly what they want. Nvidia picks what to do and releases the product; you either like it and buy it, or you don't.

Again, having a discussion about the new stuff is fine and great. But it really should be about the new tech, not about entitlement.

 

People acting like Nvidia owes them a card that has exactly X% performance gain, must have the name Y, and must have price Z are really annoying. It is Nvidia's product; they choose. If you can do it better, then go ahead and be the third (fourth) player in the market.


1 minute ago, Rattenmann said:

You just ignore the first graphical improvement in a decade

You mean 3D game graphics didn't improve in the past decade?

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |

