Nvidia Super Card Reviews

6 hours ago, thorhammerz said:

Percentages bad! Raw numbers good! Because I say so!

 


Another "wow 20% performance upgrade". Which ends up being fucking 5fps in the end. Basically margin of error in real world coz 5fps means dick. But hey, % good, framerate, what the fuck even is that other than the very fucking reason we all buy stupid expensive graphic cards. But hey, go on, fucking mock me. Pathetic.


3 hours ago, RejZoR said:

Another "wow 20% performance upgrade". Which ends up being fucking 5fps in the end. Basically margin of error in real world coz 5fps means dick. But hey, % good, framerate, what the fuck even is that other than the very fucking reason we all buy stupid expensive graphic cards. But hey, go on, fucking mock me. Pathetic.

 

If 20% is the margin of error then you have some serious data collection issues.  The most common margin of error is quoted at a confidence level of 95%.  For more malleable data it can be 90%, and for strict data control it's 99%.  If you have 80%, the information is basically trash to begin with.

 

Suffice it to say, the main issue remains that if you use FPS you have the same confidence level and margin of error, so the information and its accuracy do not change regardless.
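For reference, a minimal sketch (run-to-run numbers are hypothetical, and it assumes the usual normal approximation) of how a 95%-confidence margin of error is estimated from repeated benchmark runs, which is a very different quantity from a 20% card-to-card gap:

```python
import statistics as stats

# Hypothetical repeated runs of the same benchmark scene (FPS).
runs = [101.2, 99.8, 100.5, 100.9, 99.4, 100.1]

mean = stats.mean(runs)
stdev = stats.stdev(runs)              # sample standard deviation
n = len(runs)

# ~95% confidence: roughly +/- 1.96 standard errors (normal approximation).
margin_of_error = 1.96 * stdev / n ** 0.5

print(f"mean: {mean:.1f} FPS, margin of error: +/- {margin_of_error:.2f} FPS")
# Run-to-run noise on the order of a fraction of an FPS is not the same
# thing as a 20% (e.g. 100 -> 120 FPS) difference between two cards.
```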

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


This is the whole frigging point. You say 20% is a lot coz it's a big number. I say it's margin of error coz if 20% means 5fps of actual framerate, I kinda fucking rest my case. 5fps is an insignificant, worthless difference that no one can sense or make any good use of. 100 or 105 fps, no damn difference. Just like there is no god damn difference between 10 and 15 fps or 25 and 30. It's just too insignificant. But muh 20% performance boost and everyone loses their shit coz it's 20%!!!!!!!! TWENTY!!!!! And that's like... a lot. Or 10%. Or 15%. In terms of framerate, all these are useless numbers coz they always end up providing insignificant framerate. Always. Seen a lot of reviews on reputable pages where they specified overclocks in % and were "happy" about sub-15%. Turning that into framerate almost always gave such small framerate differences it's not even worth bothering to overclock. Same reason why I don't bother manually overclocking my GTX 1080Ti. Anything I'd try would give me such insignificant framerate gains it's not even worth the effort. But it would look nice in %, tho.

 

I made up 20% being 5fps to make my point obvious, but apparently people still don't get it. Blimey.
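To make the arithmetic behind this argument explicit, here is a small sketch (the baseline numbers are purely illustrative) of how one fixed percentage gain turns into very different absolute framerate gains depending on the starting framerate:

```python
# The same relative gain maps to very different absolute FPS deltas
# depending on the baseline framerate (numbers are illustrative).
gain = 0.20  # a "20% performance upgrade"

for baseline_fps in (25, 60, 100, 144):
    new_fps = baseline_fps * (1 + gain)
    print(f"{baseline_fps:>3} FPS -> {new_fps:.0f} FPS (+{new_fps - baseline_fps:.0f} FPS)")

# 20% is only a 5 FPS difference when the baseline is 25 FPS;
# at 100 FPS the same 20% is already a 20 FPS difference.
```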


Welp glad I went with a 1080Ti over any of the RTX cards

CPU: Intel i7 7700K | GPU: ROG Strix GTX 1080Ti | PSU: Seasonic X-1250 (faulty) | Memory: Corsair Vengeance RGB 3200Mhz 16GB | OS Drive: Western Digital Black NVMe 250GB | Game Drive(s): Samsung 970 Evo 500GB, Hitachi 7K3000 3TB 3.5" | Motherboard: Gigabyte Z270x Gaming 7 | Case: Fractal Design Define S (No Window and modded front Panel) | Monitor(s): Dell S2716DG G-Sync 144Hz, Acer R240HY 60Hz (Dead) | Keyboard: G.SKILL RIPJAWS KM780R MX | Mouse: Steelseries Sensei 310 (Striked out parts are sold or dead, awaiting zen2 parts)


5 hours ago, RejZoR said:

This is the whole frigging point. You say 20% is a lot coz it's a big number. I say it's margin of error coz if 20% means 5fps of actual framerate, I kinda fucking rest my case. 5fps is an insignificant, worthless difference that no one can sense or make any good use of. 100 or 105 fps, no damn difference. Just like there is no god damn difference between 10 and 15 fps or 25 and 30. It's just too insignificant. But muh 20% performance boost and everyone loses their shit coz it's 20%!!!!!!!! TWENTY!!!!! And that's like... a lot. Or 10%. Or 15%. In terms of framerate, all these are useless numbers coz they always end up providing insignificant framerate. Always. Seen a lot of reviews on reputable pages where they specified overclocks in % and were "happy" about sub-15%. Turning that into framerate almost always gave such small framerate differences it's not even worth bothering to overclock. Same reason why I don't bother manually overclocking my GTX 1080Ti. Anything I'd try would give me such insignificant framerate gains it's not even worth the effort. But it would look nice in %, tho.

 

I made up 20% being 5fps to make my point obvious, but apparently people still don't get it. Blimey.

Lol, why did you get a 1080ti in the first place?

It's 30% faster than a 1080?

Wait, 30% means shit, right?

 

Exactly, you are laughable.

Why overclock?

Why get faster RAM?

Why get more cores?

Why get newer stuff, period?

Etc., etc.

They're only percentages faster, even if it's 5fps lol


If 20% more performance is giving you 5 FPS that means you were running the game in question at 25 FPS in the first place. What?

AMD Ryzen 7 3700X | Thermalright Le Grand Macho RT | ASUS ROG Strix X470-F | 16GB G.Skill Trident Z RGB @3400MHz | EVGA RTX 2080S XC Ultra | EVGA GQ 650 | HP EX920 1TB / Crucial MX500 500GB / Samsung Spinpoint 1TB | Cooler Master H500M


I'm only interested in how the 2080S will perform.

 

I won't be buying one ofc, I already made my choice recently and bought a 1080ti; so far these new cards and the results have just reinforced that I made the correct decision.

 

Even if the 2080S turns out to be the real 1080ti replacement for this series, it's too little, too late. Nvidia really screwed up this series, though I'm sure they're happy counting money from all the people who fell for their marketing: those that bought a 2080ti when in the past they never would have forked out for a Titan, and those who bought a more expensive 2080 over a cheaper 1080ti.

CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w Corsair RM 750w Gold (2021)|

VDU: Panasonic 42" Plasma | GPU: Gigabyte 1080ti Gaming OC & Barrow Block (RIP)...GTX 980ti | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + WD Blue 1TB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P | NF-A12x25 fans |


40 minutes ago, pas008 said:

Lol, why did you get a 1080ti in the first place?

It's 30% faster than a 1080?

Wait, 30% means shit, right?

 

Exactly, you are laughable.

Why overclock?

Why get faster RAM?

Why get more cores?

Why get newer stuff, period?

Etc., etc.

They're only percentages faster, even if it's 5fps lol

Funny you talk about how I'm crazy about absolutes, yet you don't understand analogies or exaggerations. I even fucking said I exaggerated it as an example for 20% to be 5fps. Yet you're all droning your shit like I'm speaking fucking Chinese. I'm done with this idiocy here. People who are presumably native English speakers and still don't fucking get it. Dafaq...


21 minutes ago, RejZoR said:

Funny you talk about how I'm crazy about absolutes, yet you don't understand analogies or exaggerations. I even fucking said I exaggerated it as an example for 20% to be 5fps. Yet you're all droning your shit like I'm speaking fucking Chinese. I'm done with this idiocy here. People who are presumably native English speakers and still don't fucking get it. Dafaq...

 

35 minutes ago, melete said:

If 20% more performance is giving you 5 FPS that means you were running the game in question at 25 FPS in the first place. What?

Exactly, isn't that your own idiocy? Read the quote above, hahaha


I don't know, I don't do stupid %. If you haven't noticed yet... The performance boost of the Ti over the vanilla GTX 1080 was from 20-40fps, usually toward the upper end of that. I literally don't care how much that is in % coz that tells me nothing useful. Framerate does. It tells me everything. But whatever, go on, mock me some more. :rolleyes:


1 hour ago, RejZoR said:

I don't know, I don't do stupid %. If you haven't noticed yet... The performance boost of the Ti over the vanilla GTX 1080 was from 20-40fps, usually toward the upper end of that. I literally don't care how much that is in % coz that tells me nothing useful. Framerate does. It tells me everything. But whatever, go on, mock me some more. :rolleyes:

In what? At what resolution? What settings?

 

Oh, around 30% on average with the same variables.

 

Wow lol

 

Hahaha

Math is hard

 


3 hours ago, RejZoR said:

Funny you talk about how I'm crazy about absolutes, yet you don't understand analogies or exaggerations. I even fucking said I exaggerated it as an example for 20% to be 5fps. Yet you're all droning your shit like I'm speaking fucking Chinese. I'm done with this idiocy here. People who are presumably native English speakers and still don't fucking get it. Dafaq...

I'm sorry, this seems like everyone is droning on and some people are being rude, but I think the important thing here is that you have presented something that just doesn't work factually. It further doesn't help your argument rationally, as introducing margin of error and exaggerating only makes the argument less relevant.

 

You are dead right when looking at single games or specific benchmarks (which is why they are all presented in FPS), because for any given setting FPS tells you more than a percentage.  But when comparing card to card in this way, percentage is just as accurate, if not more so, due to the nature of the averages and the data presented. 
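As a rough illustration (the per-game numbers are made up for the example), the FPS view and the percentage view of the same card-to-card comparison are built from the same data; the percentage summary just normalises away each game's baseline:

```python
# Made-up per-game results for two hypothetical cards.
results = {              # game: (card_a_fps, card_b_fps)
    "Game 1": (60.0, 72.0),
    "Game 2": (120.0, 138.0),
    "Game 3": (45.0, 54.0),
}

n = len(results)
avg_a = sum(a for a, _ in results.values()) / n
avg_b = sum(b for _, b in results.values()) / n
mean_ratio = sum(b / a for a, b in results.values()) / n  # average per-game relative performance

print(f"average FPS      : {avg_a:.1f} vs {avg_b:.1f}")
print(f"ratio of averages: {avg_b / avg_a:.3f}")
print(f"average of ratios: {mean_ratio:.3f}")
# Both summaries come from the exact same per-game FPS numbers, so
# neither representation is inherently more or less accurate.
```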

 

 

 

 



On 7/6/2019 at 9:32 AM, mr moose said:

You are dead right when looking at single games or specific benchmarks (which is why they are all presented in FPS), because for any given setting FPS tells you more than a percentage.  But when comparing card to card in this way, percentage is just as accurate, if not more so, due to the nature of the averages and the data presented. 

Never mind


2 minutes ago, leadeater said:

I'm fairly sure TechPowerUp is the source of this discussion chain and their percentages per game is what is being referred to as the problem.

The source of contention was just the claim that FPS is more accurate than percentage.  My stance is they are essentially the same because the card to card comparisons are averages across multiple games for each resolution. 



Just now, mr moose said:

The source of contention was just the claim that FPS is more accurate than percentage.  My stance is they are essentially the same because the card to card comparisons are averages across multiple games for each resolution. 

Yeah, I was looking back over the post with the graphs, then went to the actual site, and they have the game sections all in FPS and that page was the summary page, so eh, dunno what the issue is... which is why I hid my post. Was hoping you wouldn't see it lol.


1 minute ago, leadeater said:

Yeah, I was looking back over the post with the graphs, then went to the actual site, and they have the game sections all in FPS and that page was the summary page, so eh, dunno what the issue is... which is why I hid my post. Was hoping you wouldn't see it lol.

It's alright, it's a fairly long and perplexing discussion anyway. 



I gotta say.

 

I'm still unimpressed with everything I'm seeing from both Nvidia and AMD in terms of GPU's.

 

I've yet to see any benchmark overclocks that beat my 1080ti FE, which sits at a 2.076/2.078 GHz (I can't specifically remember which, 2.076 or 2.078) overclock.

 

While yes, I am watercooling, and yes someone has probably beaten that, and yes core clocks are not an exacting thing to base performance on,

 

I've yet to encounter a game that really made this thing struggle to run at 4k 60hz at ultra or equivalent graphics. The only exceptions being heavily modded Fallout 4 with god rays ALL the way up, ridiculously modded Skyrim, and Star Citizen, which is an alpha.

 

The last of which I was able to run at 45-50ish fps, in 4k high, with Fallout 4 left running in the background because I completely forgot I had been playing it.

 

Far as I can tell, the only upgrade I need is from this 7700k to an AMD Ryzen CPU.

 

Gonna launch a game and see if it was 2.076 or 2.078

 

Edit: So apparently, my 1080ti will no longer do 2076 or 2078, only 2062. No matter how far up I push the GPU offset clock or whatever EVGA Precision X is calling it.

 

Seems odd. My GPU has always run very cool, and is still doing so, but is now running slower than it used to.

Ketchup is better than mustard.

GUI is better than Command Line Interface.

Dubs are better than subs


2 hours ago, Trik'Stari said:

I've yet to encounter a game that really made this thing struggle to run at 4k 60hz at ultra or equivalent graphics. The only exceptions being heavily modded Fallout 4 with god rays ALL the way up, ridiculously modded Skyrim, and Star Citizen, which is an alpha.

Well, this is an expectation brought over from the Geforce 900 to Geforce 10 series, due completely to the 28nm to 16nm shift, which allowed for a very large clock increase and a large SM increase. The raw compute power doubled between those generations; to expect that again, or that easily, is a bit unjustified. Are we forgetting the three-generation 28nm stagnation, why that happened, and how the die size just got bigger and bigger while clocks actually decreased from the 600 to the 700 series because of that? The specifications for clocks between the 10 series and 20 series also show a decrease, for the same reasons.

 

The Geforce 10 series started out with a large die, and TSMC did such a good job with 16nm that there was little to improve, so the only way to increase performance significantly was to make the die bigger and more expensive, which is what happened. 12nm isn't a 28nm-to-16nm shift, and 12nm isn't anything more than 16nm+ anyway.

 

Welcome to buying at the top end of the market, I guess, where you don't need to replace every generation, or benefit much from doing so, but you can if you wish.

 

Edit: Also, a 30% performance increase between generations, regardless of all other factors, is a good performance increase.


On 7/2/2019 at 8:58 AM, Trixanity said:

I called it when I said they’d discontinue the old cards rather than drop the price down. People were too quick to assume that Nvidia would hand out the old cards with nice discounts. That’s simply not how Nvidia works.

Well, the cards in stock will have to be sold, so at least until the stock of the old cards runs out, they'll probably go for pretty cheap.

Main Desktop: CPU - i9-14900k | Mobo - Gigabyte Z690 Aorus Elite AX DDR4 | GPU - ASUS TUF Gaming OC RTX 4090 RAM - Corsair Vengeance Pro RGB 64GB 3600mhz | AIO - H150i Pro XT | PSU - Corsair RM1000X | Case - Phanteks P500A Digital - White | Storage - Samsung 970 Pro M.2 NVME SSD 512GB / Sabrent Rocket 1TB Nvme / Samsung 860 Evo Pro 500GB / Samsung 970 EVO Plus 2tb Nvme / Samsung 870 QVO 4TB  |

 

TV Streaming PC: Intel Nuc CPU - i7 8th Gen | RAM - 16GB DDR4 2666mhz | Storage - 256GB WD Black M.2 NVME SSD |

 

Phone: Samsung Galaxy Z Fold 4 - Phantom Black 512GB |

 


17 hours ago, Ha? said:

 

On the other hand, the 2080ti performs 25% better than the 1080ti with a $1,200 MSRP.

 

Is 25% more performance worth a $500/42% price premium? Most people able to think objectively/logically would say no.

Depending on the application, paying the $500 for the 25% extra performance may be worth it. 3D modeling and rendering applications would easily benefit enough to outweigh the cost in time saved.
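For context, a quick back-of-the-envelope sketch using the figures from the quoted post ($1,200 MSRP, a $500 premium implying roughly $700 for the 1080ti, and a 25% performance delta); the prices and delta are taken as given, not measured:

```python
# Figures taken from the quoted post; treat them as illustrative.
price_1080ti = 700.0   # $1,200 MSRP minus the quoted $500 premium
price_2080ti = 1200.0
perf_1080ti = 1.00     # normalised gaming performance
perf_2080ti = 1.25     # "25% better"

for name, price, perf in [("1080 Ti", price_1080ti, perf_1080ti),
                          ("2080 Ti", price_2080ti, perf_2080ti)]:
    print(f"{name}: {perf / price * 1000:.2f} performance per $1,000")

# Performance per dollar is clearly worse on the 2080 Ti; whether the
# absolute speedup is still worth it depends on how much the saved
# time is worth, which is the point made in the reply above.
```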

My eyes see the past…

My camera lens sees the present…


23 minutes ago, Zodiark1593 said:

Depending on the application, paying the $500 for the 25% extra performance may be worth it. 3D modeling and rendering applications would easily benefit enough to outweigh the cost in time saved.

No, it still doesn't, because now you are getting into compute territory where AMD cards vastly outperform the Nvidia competition at a much lower price point.

 

The entire RTX lineup is a very poor value for the consumer.


42 minutes ago, Ha? said:

No, it still doesn't, because now you are getting into compute territory where AMD cards vastly outperform the Nvidia competition at a much lower price point.

 

The entire RTX lineup is a very poor value for the consumer.

Not necessarily true there. This really depends on the application itself, but, for a notable example here, some render engines perform considerably faster on Nvidia hardware (Octane, Cycles), while some perform faster on AMD hardware (LuxRender). Premiere Pro (prefers Nvidia hardware) and DaVinci Resolve (favors AMD) also provide another fantastic example.

 

https://www.pugetsystems.com/labs/articles/Premiere-Pro-CC-2019-AMD-Radeon-VII-vs-NVIDIA-GeForce-RTX-1395/#AMDRadeonVegavsNVIDIAGeForceRTXforPremiereProCC2019

 

http://boostclock.com/show/000242/gpu-rendering-nv-blender-tile-2019-01-01.html

 

Oftentimes, it is far less expensive to cough up a little extra dough for a guaranteed return than it is to alter your entire workflow to switch to different software. So I maintain my point that not only is Nvidia perfectly competitive in the productivity workspace, it is also perfectly reasonable to spend more on a GPU if the time saved outweighs the cost. The extra time the speedup allows can be spent on other projects, hence more income to be had.

 

 

 

Screenshot_20190707-122459.png



17 hours ago, Zodiark1593 said:

Not necessarily true there. This really depends on the application itself, but, for a notable example here, some render engines perform considerably faster on Nvidia hardware (Octane, Cycles), while some perform faster on AMD hardware (LuxRender). Premiere Pro (prefers Nvidia hardware) and DaVinci Resolve (favors AMD) also provide another fantastic example.

 

https://www.pugetsystems.com/labs/articles/Premiere-Pro-CC-2019-AMD-Radeon-VII-vs-NVIDIA-GeForce-RTX-1395/#AMDRadeonVegavsNVIDIAGeForceRTXforPremiereProCC2019

 

http://boostclock.com/show/000242/gpu-rendering-nv-blender-tile-2019-01-01.html

 

Oftentimes, it is far less expensive to cough up a little extra dough for a guaranteed return than it is to alter your entire workflow to switch to different software. So I maintain my point that not only is Nvidia perfectly competitive in the productivity workspace, it is also perfectly reasonable to spend more on a GPU if the time saved outweighs the cost. The extra time the speedup allows can be spent on other projects, hence more income to be had.

 

 

 

Screenshot_20190707-122459.png

A $200 premium for 29 seconds is not what I consider a worthwhile "little extra dough". This chart is also now outdated, if Navi on garbage early drivers is anything to judge by, where your previous examples no longer seem to be valid at the same price point.


6 hours ago, Ha? said:

A $200 premium for 29 seconds is not what I consider a worthwhile "little extra dough". This chart is also now outdated, if Navi on garbage early drivers is anything to judge by, where your previous examples no longer seem to be valid at the same price point.

Whoosh. Sound of the point going 2000 feet over your head. 29 seconds over 100 frames is... yeah, for just 1 or 2 seconds of video, you are gaining days of work if you are encoding/rendering an entire advert... or even just a YouTube intro (see Gamers Nexus).

 

This is why Linus sometimes goes crazy on the video editors' PCs... because if it saves them just 5 mins a day, that's 25 mins in the week and, well, an extra loo break every day for staff. If you scale that up to 5 or 6 PCs/GPUs doing 4k/8k rendering, you are probably saving more than an hour a day compared to "cheap" components.
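To put a rough number on that kind of saving, here is a small sketch; the 29 seconds per 100 frames comes from the chart discussed above, while the project length and delivery framerate are purely illustrative assumptions:

```python
# 29 seconds saved per 100 rendered frames, per the chart discussed above;
# the project length and delivery framerate are illustrative assumptions.
saving_per_frame_s = 29 / 100     # seconds saved per rendered frame
delivery_fps = 30                 # framerate of the finished video
video_length_min = 5              # e.g. a 5-minute advert or intro

total_frames = delivery_fps * video_length_min * 60
total_saving_h = total_frames * saving_per_frame_s / 3600

print(f"{total_frames} frames -> ~{total_saving_h:.1f} hours saved per render pass")
# Multiply that across revisions, projects and several editing
# workstations and the premium can pay for itself in working time.
```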

