
Why Nvidia's 30 series is not going to be 2x faster than the 20 series

shadowcalen1

I really don't get the hype around the 30 series. Nvidia boasting a 2x performance increase is a bold claim, and assuming my math is correct (which it admittedly may not be), the reality is far less impressive. Fair warning: wall of text incoming. If you don’t like walls of text, you can just read the conclusion.

 

Nvidia claims an RTX 3070 is faster than a 2080 Ti. You know what else performs better than a 2080 Ti? A 2070 with RTX off.

All the performance comparisons we have seen to date either explicitly state that they are RTX-on numbers or fail to specify whether RTX is on or off. Now, I can't say for certain, but the unspecified numbers align with the RTX-on figures, so I will be working on the assumption that they are RTX-on as well.

 

So why does RTX being on matter? As stated earlier, a 2070 with RTX off will perform better than a 2080 Ti with it on. This is because the RT and tensor cores used to drive RTX are not powerful enough on their own, so some of the load is pushed to the shader cores. How much load is shifted to the shader cores determines how much of a performance hit enabling RTX incurs.

As such, by increasing the performance of its RT and tensor cores, Nvidia can increase FPS in RTX-on scenarios. However, since RT and tensor cores are useless for regular rasterized gaming, the GPU will be no faster in those scenarios.
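To make that argument concrete, here is a toy frame-time model. All the millisecond figures are invented purely for illustration, not real GPU data: RTX-off FPS depends only on shader work, while RTX-on adds RT work that partly spills onto the shaders, so speeding up the RT cores shrinks only the RTX-on cost.

```python
# Toy model: frame time = shader work + RT work that spills to the shaders.
# Every millisecond figure below is an invented illustration, not a measurement.

def fps(shader_ms: float, rt_ms: float = 0.0) -> float:
    """FPS for a frame costing shader_ms of shading plus rt_ms of RT spillover."""
    return 1000.0 / (shader_ms + rt_ms)

# Old generation: 10 ms of shading, plus 6.7 ms of RT spillover with RTX on
old_off, old_on = fps(10.0), fps(10.0, 6.7)   # ~100 FPS vs ~60 FPS

# New generation: identical shaders, but faster RT cores cut spillover to 1 ms
new_off, new_on = fps(10.0), fps(10.0, 1.0)   # ~100 FPS vs ~91 FPS

# RTX-on FPS improves ~1.5x while RTX-off FPS does not move at all.
print(f"RTX-on speedup: {new_on / old_on:.2f}x, RTX-off speedup: {new_off / old_off:.2f}x")
```

The point of the sketch: a big RTX-on uplift is perfectly compatible with zero rasterization uplift if only the RT hardware got faster.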

Now, it's been two years since RTX first came out, and in that time only a handful of games have shipped with support for the technology, and even fewer that look noticeably better with it enabled. I'm going to assume that the average gamer does not care much about RTX performance and is far more interested in Ampere's non-RTX performance.

 

Time for some napkin math. We know Control, like most games, takes a roughly 40% performance hit when enabling RTX (i.e. it runs at 60% performance). We also know, thanks to a Digital Foundry video, that in Control a 3080 is 1.8x faster than a 2080. 0.6 * 1.8 = 1.08, or a whopping 8% faster in non-RTX applications. Now, this number assumes that enabling RTX on Ampere does not hurt performance at all. I think that unlikely, so in actuality the non-RTX gap between a 2080 and a 3080 is probably larger than 8%. At the same time, the only way Nvidia could realize a full 1.8x improvement in non-RTX performance is if the RT cores are no faster than in the prior generation. To put that 8% in perspective, that's about the difference between the 2080 and the 2080 Super.
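The napkin math above, as a quick sketch. Note that the 40% hit and the 1.8x figure are the post's assumptions, not measured data, and the "worst case" here assumes the new card takes no hit at all from enabling RTX:

```python
# Derive an implied non-RTX speedup from an RTX-on speedup, assuming
# (worst case, per the post) the new card loses nothing by enabling RTX.

def implied_non_rtx_speedup(rtx_on_speedup: float, old_rtx_fraction: float) -> float:
    """old_rtx_fraction: the old card's RTX-on FPS as a fraction of its RTX-off FPS."""
    return rtx_on_speedup * old_rtx_fraction

# Control: ~40% hit with RTX on, so the 2080 runs at 60% of its RTX-off FPS;
# Digital Foundry showed the 3080 at 1.8x the 2080 with RTX on.
speedup = implied_non_rtx_speedup(1.8, 0.6)
print(f"Implied non-RTX speedup: {speedup:.2f}x")  # 1.08x, i.e. ~8% faster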

 

In conclusion, unless you really care about RTX performance, there is insufficient information to gauge how well the 30 series performs in non-RTX tasks (aka 99% of games). While the 3080 could perform like a 2080 Super with better ray-tracing support, it's probably going to do better than that. However, unless you assume that the RTX performance hit on the 30 series is the same as it was on the 20 series, we are not going to see a 2x improvement in anything but RTX performance. I strongly suggest waiting until we get third-party benchmarks before hyping it up any more than it already has been.

Edit: It seems Digital Foundry ran Battlefield V with RTX off, which showed a 65% performance improvement between a 2080 and a 3080.


7 minutes ago, shadowcalen1 said:

If you don’t like walls of text, you can just read the conclusion.

One step ahead of you. ;)

7 minutes ago, shadowcalen1 said:

In conclusion, unless you really care about RTX performance, there is insufficient information to gauge how well the 30 series performs in non RTX tasks

Most of us are already aware of this. I feel like it's all I've been saying to people the last few days.

 

"wait for benchmarks"

BabyBlu (Primary): 

  • CPU: Intel Core i9 9900K @ up to 5.3GHz, 5.0GHz all-core, delidded
  • Motherboard: Asus Maximus XI Hero
  • RAM: G.Skill Trident Z RGB 4x8GB DDR4-3200 @ 4000MHz 16-18-18-34
  • GPU: MSI RTX 2080 Sea Hawk EK X, 2070MHz core, 8000MHz mem
  • Case: Phanteks Evolv X
  • Storage: XPG SX8200 Pro 2TB, 3x ADATA SU800 1TB (RAID 0), Samsung 970 EVO Plus 500GB
  • PSU: Corsair HX1000i
  • Display: MSI MPG341CQR 34" 3440x1440 144Hz Freesync, Dell S2417DG 24" 2560x1440 165Hz Gsync
  • Cooling: Custom water loop (CPU & GPU), Radiators: 1x140mm(Back), 1x280mm(Top), 1x420mm(Front)
  • Keyboard: Corsair Strafe RGB (Cherry MX Brown)
  • Mouse: MasterMouse MM710
  • Headset: Corsair Void Pro RGB
  • OS: Windows 10 Pro

Roxanne (Wife Build):

  • CPU: Intel Core i7 4790K @ up to 5.0GHz, 4.8GHz all-core, relidded w/ LM
  • Motherboard: Asus Z97A
  • RAM: G.Skill Sniper 4x8GB DDR3-2400 @ 10-12-12-24
  • GPU: EVGA GTX 1080 FTW2 w/ LM
  • Case: Corsair Vengeance C70, w/ Custom Side-Panel Window
  • Storage: Samsung 850 EVO 250GB, Samsung 860 EVO 1TB, Silicon Power A80 2TB NVME
  • PSU: Corsair AX760
  • Display: Samsung C27JG56 27" 2560x1440 144Hz Freesync
  • Cooling: Corsair H115i RGB
  • Keyboard: GMMK TKL(Kailh Box White)
  • Mouse: Glorious Model O-
  • Headset: SteelSeries Arctis 7
  • OS: Windows 10 Pro

BigBox (HTPC):

  • CPU: Ryzen 5800X3D
  • Motherboard: Gigabyte B550i Aorus Pro AX
  • RAM: Corsair Vengeance LPX 2x8GB DDR4-3600 @ 3600MHz 14-14-14-28
  • GPU: MSI RTX 3080 Ventus 3X Plus OC, de-shrouded, LM TIM, replaced mem therm pads
  • Case: Fractal Design Node 202
  • Storage: SP A80 1TB, WD Black SN770 2TB
  • PSU: Corsair SF600 Gold w/ NF-A9x14
  • Display: Samsung QN90A 65" (QLED, 4K, 120Hz, HDR, VRR)
  • Cooling: Thermalright AXP-100 Copper w/ NF-A12x15
  • Keyboard/Mouse: Rii i4
  • Controllers: 4X Xbox One & 2X N64 (with USB)
  • Sound: Denon AVR S760H with 5.1.2 Atmos setup.
  • OS: Windows 10 Pro

Harmonic (NAS/Game/Plex/Other Server):

  • CPU: Intel Core i7 6700
  • Motherboard: ASRock FATAL1TY H270M
  • RAM: 64GB DDR4-2133
  • GPU: Intel HD Graphics 530
  • Case: Fractal Design Define 7
  • HDD: 3X Seagate Exos X16 14TB in RAID 5
  • SSD: Inland Premium 512GB NVME, Sabrent 1TB NVME
  • Optical: BDXL WH14NS40 flashed to WH16NS60
  • PSU: Corsair CX450
  • Display: None
  • Cooling: Noctua NH-U14S
  • Keyboard/Mouse: None
  • OS: Windows 10 Pro

NAS:

  • Synology DS216J
  • 2x8TB WD Red NAS HDDs in RAID 1. 8TB usable space

7 minutes ago, shadowcalen1 said:

Nvidia claims a RTX 3070 is faster then a 2080ti. You know what also performs better then a 2080ti? A 2070 when RTX is off.

And this is where I stopped reading.

 

On a more serious note, Digital Foundry's video of the 3080 also covered non-DXR performance in a couple of titles. The improvement over a 2080 was not "8%".

Desktop: Intel Core i9-9900K | ASUS Strix Z390-F | G.Skill Trident Z Neo 2x16GB 3200MHz CL14 | EVGA GeForce RTX 2070 SUPER XC Ultra | Corsair RM650x | Fractal Design Define R6

Laptop: 2018 Apple MacBook Pro 13"  --  i5-8259U | 8GB LPDDR3 | 512GB NVMe

Peripherals: Leopold FC660C w/ Topre Silent 45g | Logitech MX Master 3 & Razer Basilisk X HyperSpeed | HIFIMAN HE400se & iFi ZEN DAC | Audio-Technica AT2020USB+

Display: Gigabyte G34WQC


2 minutes ago, shadowcalen1 said:

As stated prior, a 2070 with RTX off will perform better then a 2080ti on.

I doubt Nvidia would do us in like this; the backlash isn't worth it.

They might say 2x better and hide a term like "perf/dollar" in the smallest of fine print, but they'd never make an outright unfair comparison.

 

They're not Intel.

 

I'm somewhat excited because of the CUDA core counts compared to last gen, which is what matters for non-RTX applications.

But of course, only reviews will tell the full story.

-sigh- feeling like I'm being too negative lately


5 minutes ago, shadowcalen1 said:

.

ur comparing non-rtx performance to rtx performance...

 

though i'll agree that the 3070 rast performance is unlikely to be better than a 2080 ti.

5950x 1.33v 5.05 4.5 88C 195w ll R20 12k ll drp4 ll x570 dark hero ll gskill 4x8gb 3666 14-14-14-32-320-24-2T (zen trfc)  1.45v 45C 1.15v soc ll 6950xt gaming x trio 325w 60C ll samsung 970 500gb nvme os ll sandisk 4tb ssd ll 6x nf12/14 ippc fans ll tt gt10 case ll evga g2 1300w ll w10 pro ll 34GN850B ll AW3423DW

 

9900k 1.36v 5.1avx 4.9ring 85C 195w (daily) 1.02v 4.3ghz 80w 50C R20 temps score=5500 ll D15 ll Z390 taichi ult 1.60 bios ll gskill 4x8gb 14-14-14-30-280-20 ddr3666bdie 1.45v 45C 1.22sa/1.18 io  ll EVGA 30 non90 tie ftw3 1920//10000 0.85v 300w 71C ll  6x nf14 ippc 2000rpm ll 500gb nvme 970 evo ll l sandisk 4tb sata ssd +4tb exssd backup ll 2x 500gb samsung 970 evo raid 0 llCorsair graphite 780T ll EVGA P2 1200w ll w10p ll NEC PA241w ll pa32ucg-k

 

prebuilt 5800 stock ll 2x8gb ddr4 cl17 3466 ll oem 3080 0.85v 1890//10000 290w 74C ll 27gl850b ll pa272w ll w11

 


It's not unheard of. I owned two cards that doubled the performance of the previous gen, and they weren't slow by any means when they were new, so the jump was serious, not a doubling of weak results. The Radeon HD 4850 doubled the performance of the HD 3850. And later, the HD 5850 doubled the performance of the HD 4850. And that was across the series. So I'm not doubtful or questioning things; I'm just excited.


10 minutes ago, xg32 said:

ur comparing non-rtx performance to rtx performance...

 

though i'll agree that the 3070 rast performance is unlikely to be better than a 2080 ti.

Yes, I am comparing RTX to non-RTX performance. Because RTX is largely run on separate hardware from the rest of the game, you can improve RTX performance without improving raw performance. The point I am making is that a 2x RTX improvement could mean only an 8% non-RTX improvement.

11 minutes ago, Mateyyy said:

Digital Foundry's video of the 3080 also covered non-DXR performance in a couple of titles

Timestamp? I have watched the video four times now and can't find any mention of performance in non-RTX scenarios.


Just now, shadowcalen1 said:

Timestamp? I have watched the video 4 times now, can't find any mention of performance in non RTX scenarios. 

BL3 and Doom weren't RTX.


 


5 minutes ago, xg32 said:

bl3 and doom wasn't RTX

Yeah, they were. They did not say so, but if you look at the reflections you can see that RTX was indeed on. It would also be weird of Nvidia to choose games with RTX support and then let Digital Foundry run them without it on.


7 minutes ago, shadowcalen1 said:

Timestamp? I have watched the video 4 times now, can't find any mention of performance in non RTX scenarios. 

[screenshots from the Digital Foundry video: benchmark overlays with no DXR label]

 

When DXR was used, it looks to be clearly labeled:

[screenshots: benchmark overlays clearly labeled DXR]

 

1 minute ago, shadowcalen1 said:

Yea they were. They did not say that they were, but if you look in the reflections you can see that RTX was indeed on.

Funny how they ran the games with ray tracing enabled, when they don't even have ray tracing support in the first place at the moment.



14 minutes ago, Mateyyy said:

[screenshots: benchmark overlays with no DXR label]

 

When DXR was used, it looks to be clearly labeled:

[screenshots: benchmark overlays clearly labeled DXR]

By extension of your logic, when RTX is not used it should also be clearly labeled. That said, I admit I seem to have been wrong about Battlefield V using RTX, though there do appear to be some non-screen-space reflections in the footage. I stand corrected and will edit the original post to reflect this.


5 minutes ago, shadowcalen1 said:

By extension of your logic, when RTX is not used it should also be clearly labeled. Though I admit I seem to be wrong about battlefield 5 using RTX, though there seems like there are some non screen space reflections in the footage. I stand corrected.

There does seem to be a bit of inconsistency in how the settings are labeled on screen, though from what I remember when I watched the video in full, Richard did mention whether the testing he was discussing was run with DXR on or off, depending on the benchmark shown.

 

Perhaps the inconsistency could be attributed to a rush in trying to get the video edited and published, since I can't imagine they got the card for testing with a lot of time in advance.

Still, I'm looking forward to seeing actual reviews on the GPUs being done, with non-cherry-picked games and workloads, and raw data. Then we'll have a clear idea of how Ampere fares compared to Turing, and how dramatic the performance uplift really is.



7 minutes ago, Mateyyy said:

There does seem to be a bit of inconsistency in the way the settings are labeled on the screen, though from what I remember when I actually watched the video completely, Richard did mention whether the testing he was talking about was being run with DXR on or off, depending on the benchmark shown.

 


I watched the video through again; during the initial benchmarks he did not mention RTX once. He did, however, mention Borderlands 3 running at 40 FPS on a 2080, which seems to indicate RTX on for that benchmark even though it was not labeled as such. The rest of the video was Quake and Control, where RTX was clearly on.


I have 2 computers with i7 8086ks listed below.

One has a 2080 Ti and one has a GTX 1080 Ti (a 1080 Ti is roughly equal to a 2070).

 

If I run the SOTTR benchmark at 1440p highest settings, I get 85 FPS average with the GTX 1080 Ti.

On the 2080 Ti setup with the same settings but with ray-traced shadow quality on Ultra, I get 84 FPS on average. If I lower ray tracing to High, the 2080 Ti wins.

 

So with my EVGA FTW3 Ultra 2080 Ti, your statement is not correct.

 

In Control with DLSS 2 I can do 4K 60 with ray tracing in the hardest area of the game according to Digital Foundry (the corridor of doom). I have not seen better from a 2070 without RT on YouTube so far.

 

I did not get the 2080 Ti for ray tracing, since none of my favorite games have it. I got it for 4K with ultra settings, and it does that very well.

RIG#1 CPU: AMD, R 7 5800x3D| Motherboard: X570 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3200 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX500 2.5" 2TB | Monitor: ASUS ROG Swift PG42UQ

 

RIG#2 CPU: Intel i9 11900k | Motherboard: Z590 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3600 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1300 G+ | Case: Lian Li O11 Dynamic EVO | Cooler: Noctua NH-D15 | SSD#1: SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX300 2.5" 1TB | Monitor: LG 55" 4k C1 OLED TV

 

RIG#3 CPU: Intel i9 10900kf | Motherboard: Z490 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 4000 | GPU: MSI Gaming X Trio 3090 | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Crucial P1 1TB | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV

 

RIG#4 CPU: Intel i9 13900k | Motherboard: AORUS Z790 Master | RAM: Corsair Dominator RGB 32GB DDR5 6200 | GPU: Zotac Amp Extreme 4090  | PSU: EVGA 1000 G+ | Case: Streacom BC1.1S | Cooler: EK 360mm AIO | SSD: Corsair MP600 1TB  | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV


7 minutes ago, shadowcalen1 said:

He did however mention borderlands 3 running at 40FPS on a 2080, which seems to me to indicate RTX on for that benchmark even though it was not labeled as such.

 

Spoiler

[screenshot: Borderlands 3 benchmark settings]

 



Then there's this:

 

[Nvidia slide: RTX 30-series relative performance]

 

Borderlands 3, Doom, and RDR2 appear not to have RTX on. Obviously, this is based on Nvidia's word.

Before you reply to my post, REFRESH. 99.99% chance I edited my post. 

 

My System: i7-13700KF // Corsair iCUE H150i Elite Capellix // MSI MPG Z690 Edge Wifi // 32GB DDR5 G. SKILL RIPJAWS S5 6000 CL32 // Nvidia RTX 4070 Super FE // Corsair 5000D Airflow // Corsair SP120 RGB Pro x7 // Seasonic Focus Plus Gold 850w //1TB ADATA XPG SX8200 Pro/1TB Teamgroup MP33/2TB Seagate 7200RPM Hard Drive // Displays: LG Ultragear 32GP83B x2 // Royal Kludge RK100 // Logitech G Pro X Superlight // Sennheiser DROP PC38x


8 minutes ago, Mateyyy said:

[screenshot: Borderlands 3 benchmark settings]

Wow, I did not realize Borderlands 3's ultra preset was so demanding. I stand corrected again. Maybe the 3080 will indeed be as good as the rumors stated.


6 minutes ago, Mister Woof said:

Then there's this:

 

[Nvidia slide: RTX 30-series relative performance]

 

Borderlands 3, Doom, and RDR2 appear to not have RTX on. Obviously this is based on Nvidia's word.

If those numbers are true, that would indeed be amazing. I have not seen this slide before; I don't think it was in the official launch video (correct me if I'm wrong), but it's on reputable sites, so it seems to be official. However, it is odd that the graph looks the same for RTX-on and RTX-off titles. That would mean there is little if any performance improvement in the RT cores themselves with Ampere.


31 minutes ago, shadowcalen1 said:

If those numbers are true that would indeed be amazing. I have not seen this slide before, I don't think it was in the official launch video (correct me if I am wrong please), but its on reputable sites so it seems like its official. However, the graph looking the same for RTX on titles and RTX off games is weird. That would mean that there is little if any performance improvement in the RT cores themselves with Ampere. 

 

Also remember the 2080S is NOT a 2080Ti..... ;)

 

Wait for the real benchmarks, as others have been saying.

 

I predict the 3070 isn't going to beat or even match the 2080 Ti in most games. They might be within a few FPS here and there, depending on the game, especially without RTX or DLSS enabled.

 

I can see the 3080 being faster by a margin that actually makes a difference, however.

 

 

 

 

i9 9900K @ 5.0 GHz, NH D15, 32 GB DDR4 3200 GSKILL Trident Z RGB, AORUS Z390 MASTER, EVGA RTX 3080 FTW3 Ultra, Samsung 970 EVO Plus 500GB, Samsung 860 EVO 1TB, Samsung 860 EVO 500GB, ASUS ROG Swift PG279Q 27", Steel Series APEX PRO, Logitech Gaming Pro Mouse, CM Master Case 5, Corsair AXI 1600W Titanium. 

 

i7 8086K, AORUS Z370 Gaming 5, 16GB GSKILL RJV DDR4 3200, EVGA 2080TI FTW3 Ultra, Samsung 970 EVO 250GB, (2)SAMSUNG 860 EVO 500 GB, Acer Predator XB1 XB271HU, Corsair HXI 850W.

 

i7 8700K, AORUS Z370 Ultra Gaming, 16GB DDR4 3000, EVGA 1080Ti FTW3 Ultra, Samsung 960 EVO 250GB, Corsair HX 850W.

 

 


From what I have read, I suspect a 60-70% improvement over the equivalent card of last gen. That should put the 3070 about 5% ahead of the 2080 Ti in non-RTX workloads. I would be satisfied with a 50% gain and happy with 60%+.
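For what it's worth, that prediction can be reconstructed roughly like this. Both ratios below are assumptions for illustration only: the ~65% generational uplift the poster suspects, and a guessed ~57% lead of the 2080 Ti over the 2070.

```python
# Hypothetical reconstruction of the prediction above; both ratios are
# assumptions for illustration, not benchmark results.
gen_uplift = 1.65      # assumed 3070-over-2070 non-RTX uplift (60-70% midpoint)
ti_over_2070 = 1.57    # assumed 2080 Ti vs 2070 ratio (illustrative guess)

# If the 3070 is gen_uplift times a 2070, its lead over the 2080 Ti follows:
lead_pct = (gen_uplift / ti_over_2070 - 1.0) * 100.0
print(f"Implied 3070 lead over 2080 Ti: {lead_pct:.0f}%")  # ~5%
```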


2 hours ago, shadowcalen1 said:

A bold claim perhaps with Nvidia boating a 2x performance increase... a 2070 with RTX off will perform better then a 2080ti... I strongly suggest waiting until we get third party benchmarks before hyping it up any more then it already has been.

Than... You sound thoughtful and intelligent, but every time I see "a 2070 with RTX off will perform better then a 2080ti"... ugh. THAN. "Than" is the proper term for comparison, not "then".

