
People are a little like spoilt brats, except they spend their own money, and they set their expectations way too high.

IronMidHeaven

Take the recent nVidia release: RTX.

It is new tech. Is it great right now? No. Will it, or can it, be great in the future? Perhaps so.

Why are so many of you missing some patience & common sense? nVidia has done this before with the GTX 400 series: it touted superior tessellation hardware and better DX11 performance than AMD. It was not showing this at first, which is what everyone remembered it for, but move on a year or so later and the GTX 400 lineup decimated the HD 5K series and even toasted the 6K series in a lot of later DX11 titles.

 

I expect RTX to be similar, unless it is tech that does not catch on, in which case nVidia will have a lot to pay for; but knowing nVidia, they will have thought this out properly.

FragBox: i7-2700k 4.7-12GB Samsung 1600mhz CL8-ASUS P8Z68-V/GEN 3-R9 Fury Nitro custom bios-NZXT HALE90 750W-NZXT s340

AOC i2267WH IPS


6 minutes ago, IronMidHeaven said:

Take the recent nVidia release: RTX.

I agree, except that the GTX 400 lineup never toasted the 6K series and never even kept up with it.


Just now, Settlerteo said:

I agree, except that the GTX 400 lineup never toasted the 6K series and never even kept up with it.

http://gpu.userbenchmark.com/Compare/Nvidia-GTX-480-vs-AMD-HD-6970/3157vsm7712

 

I am sure we can add some memes for the toasting xD

 

The HD 6K series OC'd poorly; Fermi OC'd great and had big performance gains.


33 minutes ago, IronMidHeaven said:

http://gpu.userbenchmark.com/Compare/Nvidia-GTX-480-vs-AMD-HD-6970/3157vsm7712

 

I am sure we can add some memes for the toasting xD

 

The HD 6K series OC'd poorly; Fermi OC'd great and had big performance gains.

Both of these cards released in 2010... what the hell are you talking about? They're performing about the same.

Quote or Tag people so they know that you've replied.


1 minute ago, syn2112 said:

Both of these cards released in 2010... what the hell are you talking about? They're performing about the same.

You must not know the history.

 

The HD 5870 was AMD's first DX11 GPU, and it came out in 2009.

The GTX 480 was nVidia's idea of what a DX11 GPU should be; it came out in 2010.

AMD's answer was the HD 6K series, which did nothing to stop Fermi taking the lead as heavier games came out using more DX11 features.

 

The HD 7970 set in stone what AMD was capable of and was a great architecture... still is, in fact.

 


Thread cleaned and moved to the Off-Topic section.

 

Please keep this on-topic and clean; there's no need to be condescending toward each other.

If you need help with your forum account, please use the Forum Support form!


1 hour ago, IronMidHeaven said:

nVidia has done this before with the GTX 400 series: it touted superior tessellation hardware and better DX11 performance than AMD. It was not showing this at first, which is what everyone remembered it for, but move on a year or so later and the GTX 400 lineup decimated the HD 5K series and even toasted the 6K series in a lot of later DX11 titles.

Yes, but the GTX 480 wasn't $500 more expensive than the GTX 280 while offering only a moderate performance boost.

 

Almost every GPU architecture you bring up will be a bad example because we have not seen a price rise like this before.

 

Also, Fermi is remembered as being incredibly hot more than anything.

 

 

if you have to insist you think for yourself, i'm not going to believe you.


Just now, Suika said:

Yes, but the GTX 480 wasn't $500 more expensive than the GTX 280 while offering only a moderate performance boost.

 

Almost every GPU architecture you bring up will be a bad example because we have not seen a price rise like this before.

That's a solid point!

 

 


8 minutes ago, Suika said:

Yes, but the GTX 480 wasn't $500 more expensive than the GTX 280 while offering only a moderate performance boost.

 

Almost every GPU architecture you bring up will be a bad example because we have not seen a price rise like this before.

 

Also, Fermi is remembered as being incredibly hot more than anything.

 

 

Pascal released with $1,200, $700, and $550 price points.

 


27 minutes ago, pas008 said:

Pascal released with $1,200, $700, and $550 price points.

You're including the Titan, and Pascal offered way more performance compared to Maxwell than Turing does compared to Pascal.


More like people have a compulsion (or no self-restraint) that they MUST have the latest thing; it's the new "I don't wanna be the uncool kid on the playground".


3 minutes ago, Suika said:

You're including the Titan, and Pascal offered way more performance compared to Maxwell than Turing does compared to Pascal.

What does tier matter if the tier changed? lol

And the performance boost of Turing isn't done yet.

Remember, you are getting Tensor, RT, and CUDA cores, along with better NVENC, and, as leadeater pointed out in the review thread, a G-Sync/HDR boost.

 


2 hours ago, pas008 said:

What does tier matter if the tier changed? lol

And the performance boost of Turing isn't done yet.

Remember, you are getting Tensor, RT, and CUDA cores, along with better NVENC, and, as leadeater pointed out in the review thread, a G-Sync/HDR boost.

Tier does matter, because the tier hasn't changed. The 2080 Ti is still a cut-down TU102, so we very well may see a fully enabled TU102 Titan-class card.

 

Yes, certain games that use NVIDIA's DLSS will see a larger performance uplift, but in the games I play, which don't have DLSS planned for a future update, the 2080 is functionally indistinguishable from a 1080 Ti. I'm certain that there are other consumers who won't benefit from DLSS either.


8 minutes ago, Suika said:

Tier does matter, because the tier hasn't changed. The 2080 Ti is still a cut-down TU102, so we very well may see a fully enabled TU102 Titan-class card.

 

Yes, certain games that use NVIDIA's DLSS will see a larger performance uplift, but in the games I play, which don't have DLSS planned for a future update, the 2080 is functionally indistinguishable from a 1080 Ti.

lol, weren't the 1080 Ti and Titan cut-downs too?

Tier doesn't matter.

 

And you should look at how Pascal matured in drivers.

 

And if your games will not use DLSS or take advantage of the new tech, then why say shit?

That is your buying power, not everyone else's.


5 minutes ago, pas008 said:

lol, weren't the 1080 Ti and Titan cut-downs too?

The 1080 Ti and Titan X (Pascal) were cut down; the Titan Xp, which came out shortly after the 1080 Ti, is a fully unlocked GP102. As I mentioned, we may still see a Titan Turing, so tier matters as far as I'm concerned.

6 minutes ago, pas008 said:

And you should look at how Pascal matured in drivers.

Pascal may have matured, but it still launched with objectively better performance than Maxwell. Turing launched with similar performance to Pascal.

8 minutes ago, pas008 said:

And if your games will not use DLSS or take advantage of the new tech, then why say shit?

Because NVIDIA has a stranglehold on the market, and they're allowed to launch disappointing products with little to no repercussion because of fanboys and apologists. The less stink people make about how disappointing Turing has been so far, the more likely NVIDIA is to make this a pattern.

 

It's worth mentioning, Pascal also introduced new technologies, and was still objectively better in raw performance than Maxwell.

12 minutes ago, pas008 said:

That is your buying power, not everyone else's.

Which is why I'm making a point here. Turing, as of right now, is a bad product and nobody should be jumping on board. People may have their reasons to, but right now it's a bad purchase.


980 Tis are still performing like beasts today. The testimony I have is my friend's rig with an i7 X CPU and a 980 Ti. The only game he has had to dumb down graphics in at 4K (we game daily together and have for 10+ years) was S.C.U.M. - he is only RECENTLY losing the performance he has had for years.

 

This is completely in line with all of their releases as far as I can tell: barely outperform the previous generation, but stay ahead of AMD. Why release their best GPU when they can sell you the R&D from last year or previous years, since consumers will STILL BUY it?

 

Consumers are the reason NVIDIA is able to have the business practices it has. That, and the lack of a competitor.

Workstation Laptop: Dell Precision 7540, Xeon E-2276M, 32gb DDR4, Quadro T2000 GPU, 4k display

Wifes Rig: ASRock B550m Riptide, Ryzen 5 5600X, Sapphire Nitro+ RX 6700 XT, 16gb (2x8) 3600mhz V-Color Skywalker RAM, ARESGAME AGS 850w PSU, 1tb WD Black SN750, 500gb Crucial m.2, DIYPC MA01-G case

My Rig: ASRock B450m Pro4, Ryzen 5 3600, ARESGAME River 5 CPU cooler, EVGA RTX 2060 KO, 16gb (2x8) 3600mhz TeamGroup T-Force RAM, ARESGAME AGV750w PSU, 1tb WD Black SN750 NVMe Win 10 boot drive, 3tb Hitachi 7200 RPM HDD, Fractal Design Focus G Mini custom painted.  

NVIDIA GeForce RTX 2060 video card benchmark result - AMD Ryzen 5 3600,ASRock B450M Pro4 (3dmark.com)

Daughter 1 Rig: ASrock B450 Pro4, Ryzen 7 1700 @ 4.2ghz all core 1.4vCore, AMD R9 Fury X w/ Swiftech KOMODO waterblock, Custom Loop 2x240mm + 1x120mm radiators in push/pull 16gb (2x8) Patriot Viper CL14 2666mhz RAM, Corsair HX850 PSU, 250gb Samsun 960 EVO NVMe Win 10 boot drive, 500gb Samsung 840 EVO SSD, 512GB TeamGroup MP30 M.2 SATA III SSD, SuperTalent 512gb SATA III SSD, CoolerMaster HAF XM Case. 

https://www.3dmark.com/3dm/37004594?

Daughter 2 Rig: ASUS B350-PRIME ATX, Ryzen 7 1700, Sapphire Nitro+ R9 Fury Tri-X, 16gb (2x8) 3200mhz V-Color Skywalker, ANTEC Earthwatts 750w PSU, MasterLiquid Lite 120 AIO cooler in Push/Pull config as rear exhaust, 250gb Samsung 850 Evo SSD, Patriot Burst 240gb SSD, Cougar MX330-X Case

 


30 minutes ago, Suika said:

The 1080 Ti and Titan X (Pascal) were cut down; the Titan Xp, which came out shortly after the 1080 Ti, is a fully unlocked GP102. As I mentioned, we may still see a Titan Turing, so tier matters as far as I'm concerned.

Pascal may have matured, but it still launched with objectively better performance than Maxwell. Turing launched with similar performance to Pascal.

Because NVIDIA has a stranglehold on the market, and they're allowed to launch disappointing products with little to no repercussion because of fanboys and apologists. The less stink people make about how disappointing Turing has been so far, the more likely NVIDIA is to make this a pattern.

 

It's worth mentioning, Pascal also introduced new technologies, and was still objectively better in raw performance than Maxwell.

Which is why I'm making a point here. Turing, as of right now, is a bad product and nobody should be jumping on board. People may have their reasons to, but right now it's a bad purchase.

Isn't Turing still the 3rd-largest jump in performance without DLSS or RT in the equation? In what, the last 10 years? Easily.

And also, visuals are performance increases when it comes to RT etc.

FPS isn't the only way to increase performance.


4 hours ago, Suika said:

Also, Thermi is remembered as being incredibly hot more than anything.

ftfy

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


31 minutes ago, pas008 said:

Isn't Turing still the 3rd-largest jump in performance without DLSS or RT in the equation? In what, the last 10 years? Easily.

Not... really?

 

As I mentioned, the RTX 2080 is functionally indistinguishable from the 1080 Ti currently; they perform on par. The 2080 Ti is the only story here, and it only performs a disappointing 20-30% better at a larger power draw, whereas the GTX 1080 Ti had a 60-80% performance uplift compared to the GTX 980 Ti.

 

Also go out and compare RTX visuals in video games right now. Do it, I dare you.

 

You can't, so Turing is so far extremely disappointing.


25 minutes ago, Suika said:

Not... really?

 

As I mentioned, the RTX 2080 is functionally indistinguishable from the 1080 Ti currently; they perform on par. The 2080 Ti is the only story here, and it only performs a disappointing 20-30% better at a larger power draw, whereas the GTX 1080 Ti had a 60-80% performance uplift compared to the GTX 980 Ti.

 

Also go out and compare RTX visuals in video games right now. Do it, I dare you.

 

You can't, so Turing is so far extremely disappointing.

lol, 4xx to 5xx

5xx to 6xx

6xx to 7xx

All those series were what, typically 25%, and that was fine.

Now you think you are entitled to 70% all the time.

lol, look at CPUs: you could get 10 cores, but that didn't mean it was going to be 2.5x better performance for apps, gaming, etc. years ago, and pretty much still now.

And SO FAR, yes, but you can't get new games using the tech if the tech isn't out there yet.

And the best always have price premiums; the 2080 Ti is it.


The pricing structure isn't in line with inflation or currency devaluation - it's in line with consumers who buy stuff they shouldn't because they need to have that feel-good moment (which is okay, it's their money, their life):

 

Nvidia has announced the price of its new graphics card, the Nvidia GeForce GTX 780 Ti, will be $699 when it hits shops on 7 November. The new card - the specs of which still haven't been revealed - will come in above the GTX 780, so accordingly that card will be getting a price cut, dropping from $649 to $499. Oct 28, 2013

 

Five years later, the newest Ti coming in at $1,250 isn't even in line with anything I can correlate... it's consumers' fault alone that this is acceptable (price vs. perf).
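For a rough sanity check on the inflation angle, here is a minimal back-of-the-envelope sketch in Python; the ~2% average annual inflation rate is an assumption used purely for illustration, not official CPI data, and the prices are the ones already quoted in this post:

# What would the $699 GTX 780 Ti launch price (2013) look like in 2018 if it
# had only tracked inflation? The 2% annual rate is an assumed, illustrative
# figure, not official CPI data.

launch_price_2013 = 699      # GTX 780 Ti launch price, from the quote above
new_ti_price_2018 = 1250     # new Ti price cited in this post
annual_inflation = 0.02      # assumed average annual rate (illustration only)
years = 5

inflation_adjusted = launch_price_2013 * (1 + annual_inflation) ** years
print(f"Inflation-adjusted 780 Ti price: ${inflation_adjusted:.0f}")                       # ~$772
print(f"Premium over the inflation trend: ${new_ti_price_2018 - inflation_adjusted:.0f}")  # ~$478

Even with a generous inflation assumption, the gap is several hundred dollars, which is the point being made here.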


5 hours ago, IronMidHeaven said:

It was not showing this at first, which is what everyone remembered it for, but move on a year or so later and the GTX 400 lineup

A year or so later you could buy the best bang for your buck DX11 GPU available on the market. You didn't need a GTX 400, and you didn't need to pay GTX 400 prices for that.

 

At the current state of game development, I don't see RTX as anything but an afterthought in games a year from now (especially taking into account that only the 2070 - not reviewed yet - and above will have RTX, meaning game developers still need to focus on rasterization for the time being). But regardless, let's say that in X time it does effectively become ubiquitous and fundamental for game rendering: you still need to argue that when time X arrives there won't be a better card for RTX, or a better value, or a lower price for the RTX 2000 series. Because if I can buy a better or cheaper RTX card when RTX actually matters, then everyone is right: there's no point in paying $1200 today for ~30% more performance than a 1080 Ti today.


41 minutes ago, pas008 said:

lol, 4xx to 5xx

5xx to 6xx

6xx to 7xx

All those series were what, typically 25%, and that was fine.

Now you think you are entitled to 70% all the time.

The GTX 580 launched the exact same year as the 480, largely to fix the issue of being Fermi. I'm not going to say it's an unfair comparison, but we're talking a difference of 7 months. Turing came over two years after Pascal, or about 18 months after the 1080 Ti. The performance difference was admittedly small.

 

The GTX 680 vs. 580 is more significant than 25%, usually 30-50% depending on the title.

 

The GTX 780 Ti to the 680 is pretty much the same gain that Pascal saw over Maxwell, roughly 70% or in that ballpark. 25% is pretty insulting.

 

So no, I'm not suddenly entitled. You're also ignoring the massive price discrepancy: the second most polarizing price jump we've seen in the last decade was with Kepler, where the 780 Ti was 40% more expensive than the GTX 680 but was also 60-70% more powerful. The 2080 Ti is now up to 80% more expensive... but 20-30% more powerful. All of the other generations you listed didn't see that much of a pricing discrepancy in the x80 class cards and still offered performance gains.
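To put those ballpark figures side by side, here is a minimal perf-per-dollar sketch in Python; the percentages are the rough numbers quoted above in this post, not measured benchmarks:

# Back-of-the-envelope perf-per-dollar comparison, using the ballpark
# percentages quoted above (illustrative figures, not measured benchmarks).

def relative_value(price_increase, perf_increase):
    # Perf-per-dollar of the new card relative to its predecessor (predecessor = 1.0).
    return (1 + perf_increase) / (1 + price_increase)

# Kepler: 780 Ti vs GTX 680 -- roughly 40% pricier, 60-70% faster (use 65%).
kepler = relative_value(0.40, 0.65)

# Turing: 2080 Ti vs 1080 Ti -- roughly 80% pricier, 20-30% faster (use 25%).
turing = relative_value(0.80, 0.25)

print(f"780 Ti vs 680:      {kepler:.2f}x the perf/$ of its predecessor")   # ~1.18x
print(f"2080 Ti vs 1080 Ti: {turing:.2f}x the perf/$ of its predecessor")   # ~0.69x
# By these rough numbers Kepler still improved value despite the price hike,
# while the 2080 Ti regresses on perf-per-dollar.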

41 minutes ago, pas008 said:

lol, look at CPUs: you could get 10 cores, but that didn't mean it was going to be 2.5x better performance for apps, gaming, etc. years ago, and pretty much still now.

That really doesn't relate to any of this discussion. Regardless, some apps scale differently: some will offer 2.5x the performance, some won't. CPUs are a different beast for a different discussion.

41 minutes ago, pas008 said:

And SO FAR, yes, but you can't get new games using the tech if the tech isn't out there yet.

So I'll reiterate my point: Turing is a terrible buy. Don't buy it right now. You would be the early adopter of literally nothing.

Quote

And the best always have price premiums; the 2080 Ti is it.

...What? So just because it's a flagship, it ought to be a ridiculously expensive product outside the reach of many consumers?

 

That's an absolutely ridiculous argument to hold.


32 minutes ago, Suika said:

The GTX 580 launched the exact same year as the 480, largely to fix the issue of being Fermi. I'm not going to say it's an unfair comparison, but we're talking a difference of 7 months. Turing came over two years after Pascal, or about 18 months after the 1080 Ti. The performance difference was admittedly small.

 

The GTX 680 vs. 580 is more significant than 25%, usually 30-50% depending on the title.

 

The GTX 780 Ti to the 680 is pretty much the same gain that Pascal saw over Maxwell, roughly 70% or in that ballpark. 25% is pretty insulting.

So no, I'm not suddenly entitled. You're also ignoring the massive price discrepancy: the second most polarizing price jump we've seen in the last decade was with Kepler, where the 780 Ti was 40% more expensive than the GTX 680 but was also 60-70% more powerful. The 2080 Ti is now up to 80% more expensive... but 20-30% more powerful. All of the other generations you listed didn't see that much of a pricing discrepancy in the x80 class cards and still offered performance gains.

That really doesn't relate to any of this discussion. Regardless, some apps scale differently: some will offer 2.5x the performance, some won't. CPUs are a different beast for a different discussion.

So I'll reiterate my point: Turing is a terrible buy. Don't buy it right now. You would be the early adopter of literally nothing.

...What? So just because it's a flagship, it ought to be a ridiculously expensive product outside the reach of many consumers?

 

That's an absolutely ridiculous argument to hold.

Do you realize the 780 Ti wasn't released until six months after the 780, and the Titan came first?

The 6xx series had no Ti or Titan.

But now naming tiers matter?

 

And the best products do have price premiums; they always did and always will.

Hence the Titan series being hot sellers, period.

 

You can have your opinion on RTX all you want, but people will pay for the best. I'd buy a 2080 Ti if I wasn't upgrading living stuff this year.

On the RTX 2080: well, when has performance ever decreased or stagnated after a new product launch, especially after a few drivers? But wait, there are also new features to improve.

