HelpfulTechWizard

3080 benchmarks are in! Are they good?

Recommended Posts

35 minutes ago, xg32 said:

For last gen I don't have the detailed math anymore, but if there was a budget limit the 3600 was almost always the choice, since a better GPU gives more performance. Once we get into 2080S territory, it gets very interesting whether you want to go from a 3600 to a 10600K for the extra performance at ~$150 USD more (CPU cooler + CPU cost difference). I almost never recommended a 3700X, but that's just me. There's also a meme that a 3600 + 2080 Ti is about the same as a 10600K + 2080S in some use cases.

The 3600 was always my go-to CPU recommendation. If you're building a system from scratch or upgrading a very old one, the $150 USD may be worth it for the 10600K. However, if you've already built an R5 3600 system and are worried about upgrading from, let's say, a 1070/1080 to a 3080, I think that worry is misplaced: you will see a boost in attainable graphical fidelity regardless, at least at 1440p and especially at 4K. At 1080p, your screen is technically the bottleneck unless it is 144 Hz and you care more about high frame rates (maybe you're into competitive shooters). In that case, you will need a better CPU to hit those frame rates.
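To make the bottleneck point concrete, here's a toy sketch: the delivered frame rate is roughly the minimum of what the CPU and the GPU can each sustain. All the frame rates below are made-up illustrations, not benchmarks.

```python
# Toy bottleneck model: delivered FPS is capped by the slower of the
# CPU's frame-prep rate and the GPU's render rate.
# Every number here is hypothetical, for illustration only.

def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Frame rate the system can actually deliver."""
    return min(cpu_fps, gpu_fps)

cpu = 160.0  # hypothetical CPU-limited frame rate
for gpu in (90.0, 140.0, 200.0):  # hypothetical GPU-limited rates
    print(f"GPU cap {gpu:.0f} -> delivered {delivered_fps(cpu, gpu):.0f} fps")
```

At 1440p/4K the GPU cap is usually the lower number, so a faster GPU still helps; at 1080p high-refresh, the CPU cap starts to dominate.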

 

39 minutes ago, xg32 said:

Another way to look at bottlenecks is to base them off the top CPUs at the moment. Most of the time a 10600K/10700K has the same gaming performance as a 10900K, so for most people those are enough. When I talk about bottlenecks in those scenarios, it's really how much slower CPU X is compared to at least a 10600K at a given resolution/frame rate. For a $700 GPU I'd really not skip the $150 difference between a 3600 and a 10600K and lose ~10% performance. For older CPUs, just compare them to a 10600K or 10700K to see the % bottleneck; at least with a number, users get an idea of how much they lose instead of just "fine, a little bit, severe".

If someone were choosing between building an R5 3600 system with a 3080 or a 10600K system with a 3070, I would probably recommend the former. While the 3600 will leave some performance on the table, the 3080 will be more future-proof, and the performance gap between the 3600 and the 10600K will shrink over time as games push graphical fidelity further, which disproportionately loads the graphics card.

 

1 hour ago, xg32 said:

I'm excited for Zen 3 now that we know the current CPUs are all too slow for the 3080 in some ways.

I wouldn't say that current CPUs are too slow. Current CPUs are fast enough to benefit from an upgrade to the 3080 so long as you are at 4K. 


I should be writing papers...


I am upgrading from my 1070. Glad I waited, because man, it is going to be a jump.

 


Be sure to @Pickles - Lord of the Jar if you want me to see your reply!

Stopping by to praise the all mighty jar Lord pickles... * drinks from a chalice of holy pickle juice and tossed dill over shoulder* ~ @WarDance
3600x | NH-D15 Chromax Black | 32GB 3200MHz | GTX 1070 Hybrid (2100c/2241m) | Gigabyte X570 Aorus Elite | Seasonic X760w | Phanteks Evolv X | 500GB WD_Black SN750 | Sandisk Skyhawk 3.84TB SSD | 4TB HDD 

1 hour ago, SolarNova said:

You know what's even more interesting...

 

Ethereum hash rates.

 

The 3080 is roughly the same as a Radeon VII, but the Radeon VII uses slightly less power and will have already paid for itself for those who have one now.

 

So this worry about miners buying up the 3080s may not be as big a problem as we think. If they are smart (which is questionable), they would stick with their current Radeon VIIs.

As an ex-miner, the numbers really don't impress me. Unless some optimization comes in time, they won't change the market or cause miners to chase them. I think AMD might be in more danger there, with them supposedly going for a massive L4 cache, which could speed up some coins quite a bit.
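The back-of-envelope math an ex-miner runs looks roughly like this. Every number in the sketch is a placeholder; real hash rates, per-hash revenue, and power prices vary constantly.

```python
# Rough mining profitability sketch. All inputs are placeholders,
# not real figures for any card or coin.

def daily_profit(hashrate_mh: float, rev_per_mh_day: float,
                 watts: float, kwh_price: float) -> float:
    """Mining revenue minus electricity cost, per day."""
    revenue = hashrate_mh * rev_per_mh_day
    electricity = watts / 1000.0 * 24.0 * kwh_price
    return revenue - electricity

def payback_days(card_price: float, profit_per_day: float) -> float:
    """Days until the card pays for itself; infinite if unprofitable."""
    return float("inf") if profit_per_day <= 0 else card_price / profit_per_day

# Hypothetical: 100 MH/s card, $0.05/MH/day, 250 W, $0.10/kWh
profit = daily_profit(100.0, 0.05, 250.0, 0.10)
print(round(profit, 2), round(payback_days(700.0, profit), 1))
```

The key point in the post above falls out of this: a card that's already paid for itself has a payback of zero days, so a new card with a similar hash rate but a $700 sticker price is a hard sell.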

8 minutes ago, DutchGuyTom said:

I wouldn't say that current CPUs are too slow. Current CPUs are fast enough to benefit from an upgrade to the 3080 so long as you are at 4K. 

Even outside of 4K, only 1080p sees massive bottlenecks. At 1440p you are fine in most cases, and where a bottleneck does exist, it's usually at quite a high frame rate anyway.

8 hours ago, Kierax said:

I am pondering now whether the 3070 is a worthwhile upgrade from a 1080 for 1440p 165 Hz gaming?

Your CPU will be your bottleneck at 165 FPS, so it might be time for an upgrade; even 8th gen will give you a bit of a boost there.

It was mentioned that even going from 8th gen to 10th gen is a ~20% increase in FPS at stock GPU settings, at the 100+ FPS end of the spectrum.


CPU | Intel i7-8086K Overclocked 5.4Ghz @ 1.44v | GPU | EVGA 1080 FTW2 Overclocked 2190Mhz | PSU | Corsair RM850i | RAM | 4x8GB Corsair Vengeance 3200MHz |MOTHERBOARD | Asus ROG Maximus X Formula | STORAGE | 2x Samsung EvO 256GB NVME  | COOLING | Hard Line Custom Loop D5 Pump Monsta 480mm Radiator  | MONITOR | Acer Predator X34 | OS | Windows 10


AMD is going to have a hard time convincing me to come over to their side. The 3070 cooler looks better than AMD's GPUs and has more features at the same price. This is pure speculation, as I've heard that AMD is trying not to position itself as the cheaper Nvidia option, but don't quote me on that.


If you're new to the forum, it would be highly appreciated if you visited this sub-forum.

 

13 minutes ago, Gundar said:

AMD is going to have a hard time convincing me to come over to their side. The 3070 cooler looks better than AMD's GPUs and has more features at the same price. This is pure speculation, as I've heard that AMD is trying not to position itself as the cheaper Nvidia option, but don't quote me on that.

I haven't cared about any air cooler I've seen until the 3080. It is a piece of art, and it's fully functional. Great job, Nvidia. But as a hardline cooling fan, for the first time I feel bad about planning to rip this thing apart.



Just now, Maticks said:

I haven't cared about any air cooler I've seen until the 3080. It is a piece of art, and it's fully functional. Great job, Nvidia. But as a hardline cooling fan, for the first time I feel bad about planning to rip this thing apart.

I mean, for small-form-factor water cooling, that PCB is a godsend.



 


Great to see benchmarks finally. I personally favor TPU's benchmarks, since I've been reading them for almost a decade now and have never been steered wrong.

 

3080 vs 2080 Ti ($999.99, Sept 2018):

14.94% better at 1080p
23.45% at 1440p
31.57% at 4K

3080 vs 2080 Super ($699.99, July 2019):

29.87% at 1080p
42.85% at 1440p
56.25% at 4K

3080 vs 2080 ($699.99, Sept 2018):

35.13% at 1080p
51.51% at 1440p
66.66% at 4K

3080 vs 1080 Ti ($699.99, March 2017):

49.25% at 1080p
69.49% at 1440p
88.67% at 4K

 

So looking at 1440p (since that's what I run) and focusing on the $699 price point: in 1 year we got 42.85% more performance, in 2 years 51.51%, and in 3 years 69.49%.

 

So the jump between the 1080 Ti and the 2080S was only 18.64%, and we had to wait 2 years for it.

 

1080 Ti to 3080 is 69% in 3 years. That's roughly 43% more performance, at the same price, for waiting an extra year after the 2080S. Yeah, Turing sucked big time for price-to-performance.
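As a sanity check on these numbers: gains quoted against a common baseline compose by division, not subtraction. A quick sketch using the 1440p figures from this post:

```python
# If the 3080 is +a over card A and +b over card B (both as fractions
# of the same baseline comparison), then B's gain over A is
# (1 + a) / (1 + b) - 1, not a - b.

def relative_gain(gain_a: float, gain_b: float) -> float:
    """Gain of B over A, where both gains are vs the same baseline card."""
    return (1.0 + gain_a) / (1.0 + gain_b) - 1.0

# 1440p: the 3080 is +69.49% over a 1080 Ti and +42.85% over a 2080 Super,
# so the 2080 Super's gain over the 1080 Ti works out to:
print(round(relative_gain(0.6949, 0.4285) * 100, 2))  # -> 18.65
```

This matches the ~18.64% 1080 Ti to 2080S jump quoted above (up to rounding of the inputs).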

 

Even though this isn't "2x 2080 Super", this is still a HUGE jump, and it makes me wonder: why these prices? Why did Nvidia decide to sell at such a competitive price compared to previous generations? They could have sold the 3080 for $999 easily, no problem. Hell, they could have sold the 3080 for $1,200 and bumped the 3090 to $2,000.

 

I'm glad I returned the 2060 I recently bought for an SFF build and am using my old GTX 970 in its place, because this is quite a big jump. Now I'm super curious about what AMD has in store.

 

AMD has not been competitive in the past, especially these past 2 years during Turing, when Nvidia seemed to release only small upgrades at high cost. But with Nvidia coming down in price-to-performance, this really makes me want to believe AMD has something this time around, especially since the new consoles (albeit with cut-down GPUs) do seem to claim great performance (4K 60 fps, with some multiplayer titles at 4K 120 fps).

 

Could this be AMD's redemption period? I don't think I would place a monetary bet on it, but I'm definitely not buying a 3000 series card while AMD seemingly has potential, if Nvidia is acting this way.

 

November is going to be quite interesting!

2 minutes ago, EChondo said:

Why did Nvidia decide to sell at such a competitive price compared to previous generations? They could have sold the 3080 for $999 easy, no problem. Hell, they could have sold the 3080 for $1200 and bumped the 3090 to $2000.

 

 

One theory for why it's at that price point:

Nvidia is competing with AMD on two fronts: RDNA2 desktop chips are about to release, and the 3070 sits right at the same price as the new game consoles. Some might even see the 3070 as a gateway: hey, if you are willing to spend $500 on a GPU, maybe you're willing to spend a bit more and get the 3080 instead. They also went with Samsung's 8 nm node, which from what I've read/watched is considerably cheaper per wafer than TSMC's 7 nm. Combine that with the sourness over the significant price increase of the 2080 Ti over the 1080 Ti, and it just adds up.


Rock On!


Today, 09/17 at 7:59 AM GMT+1, orders for the RTX 3080 still haven't opened...

 

https://www.nvidia.com/fr-fr/geforce/graphics-cards/30-series/


PC #1 : Gigabyte Z170XP-SLI | i7-7700 | Cryorig C7 Cu | 16GB DDR4-2400 | LSI SAS 9211-8i | 240GB NVMe M.2 PCIe PNY CS2030 | SSD&HDDs 41.5TB total | Quantum LTO5 HH SAS drive | GC-Alpine Ridge | Corsair HX750i | Cooler Master Stacker STC-T01 | ASUS TUF Gaming VG27AQ 2560x1440 @ 60 Hz (plugged HDMI port, shared with PC #2) | Win10
PC #2 : Gigabyte MW70-3S0 | 2x E5-2667 v3 | 2x Intel BXSTS200C | 32GB DDR4-2133 ECC Reg | Gigabyte GeForce RTX 2080 SUPER Gaming OC 8G | 6x 120GB SSD SATA RAID0 SanDisk Plus | Seasonic SSR-850TR | Lian Li PC-A77 | ASUS TUF Gaming VG27AQ 2560x1440 @ 144 Hz (plugged DP port, shared with PC #1) | Win10
PC #3 : Mini PC Zotac 4K | Celeron N3150 | 8GB DDR3L 1600 | 250GB M.2 SATA WD Blue | Sound Blaster X-Fi Surround 5.1 Pro USB | Samsung Blu-ray writer USB | Genius SP-HF1800A | TV Panasonic TX-40DX600E UltraHD | Win10
PC #4 : ASUS P2B-F | PIII 500MHz | 512MB SDR 100 | Leadtek WinFast GeForce 256 SDR 32MB | 2x Guillemot Maxi Gamer 3D² 8MB in SLI | Creative Sound Blaster AWE64 ISA | 80GB HDD UATA | Fortron/Source FSP235-60GI | Zalman R1 | DELL E151FP 15" TFT 1024x768 | Win98SE

Laptop : Lenovo ThinkPad T61p | T9500 | 4GB DDR2 667 | Quadro FX 570m | 120GB SSD OCZ Vertex 2 | 15.4" TFT 1920x1200 | Win10

PC tablet : Fujitsu Point 1600 | PMMX 166MHz | 160MB EDO | 20GB HDD UATA | external floppy drive | 10.4" DSTN 800x600 touchscreen | AGFA SnapScan 1212u blue | Win98SE

Laptop collection #1 : IBM ThinkPad 340CSE | 486SLC2 66MHz | 12MB RAM | 360MB IDE | internal floppy drive | 10.4" DSTN 640x480 256 color | Win3.1 with MS-DOS 6.22

Laptop collection #2 : IBM ThinkPad 380E | PMMX 150MHz | 80MB EDO | NeoMagic MagicGraph128XD | 2.1GB IDE | internal floppy drive | internal CD-ROM drive | Intel PRO/100 Mobile PCMCIA | 12.1" FRSTN 800x600 16-bit color | Win98

Laptop collection #3 : Toshiba T2130CS | 486DX4 75MHz | 32MB EDO | 520MB IDE | internal floppy drive | 10.4" STN 640x480 256 color | Win3.1 with MS-DOS 6.22

And 5 other computers (2 classic Apples, 1 mini PC with WinXP, and 2 pocket PCs with WinCE)

1 hour ago, EChondo said:

Great to see benchmarks finally. I personally favor TPU's benchmarks since I've been reading them for almost a decade now and never been steered wrong.

 

 

 

3080 vs 2080 ($699.99, Sept 2018):

26% at 1080p
34% at 1440p
40% at 4K

 

 

I think you've made an error in your calculations:

https://www.techpowerup.com/review/nvidia-geforce-rtx-3080-founders-edition/34.html

 

 

3080 vs 2080:

1080p - 74 to 100 = 35% increase

1440p - 66 to 100 = 51% increase

4K - 60 to 100 = 66% increase

 

 

Remember, 50 to 100 is a 100% increase, not a 50% increase. "The 2080 gets 74% of the performance of a 3080" is not the same as "the 3080 is 26% faster than the 2080".

So the 3080 is actually even better than what you calculated in your post.
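For anyone following along, the conversion is just the reciprocal. A quick sketch, using the relative-performance values quoted above:

```python
# Converting "card X gets p of card Y's performance" (p as a fraction
# of 1.0) into "Y is faster than X by 1/p - 1".

def percent_faster(relative_perf: float) -> float:
    """How much faster the baseline card is, given the slower card's
    score as a fraction of the baseline's 1.0."""
    return 1.0 / relative_perf - 1.0

# 2080 relative scores at 1080p/1440p/4K, plus the 50% example:
for rel in (0.74, 0.66, 0.60, 0.50):
    print(f"{rel:.2f} of baseline -> {percent_faster(rel) * 100:.1f}% faster")
```

Note the last line: a card at 0.50 of the baseline makes the baseline 100% faster, which is exactly the "50 to 100 is a 100% increase" point.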

14 hours ago, IAmAndre said:

Now all AMD has to do is match the performance of a 2080/2080Ti, make it consume much less power and sell it for less.

Press X to doubt.


Main Rig :

Ryzen 7 2700X | Powercolor Red Devil RX 580 8 GB | Gigabyte AB350M Gaming 3 | 16 GB TeamGroup Elite 2400MHz | Samsung 750 EVO 240 GB | HGST 7200 RPM 1 TB | Seasonic M12II EVO | CoolerMaster Q300L | Dell U2518D | Dell P2217H | 

 

Laptop :

Thinkpad X230 | i5 3320M | 8 GB DDR3 | V-Gen 128 GB SSD |


I don't think AMD was expecting NVIDIA to make such a massive jump in performance. Or maybe AMD had a massive jump, and NVIDIA managed to find out and leapfrogged them. Time will tell, soon. Whatever happens, I hope AMD does well, because we need strong competition for this to be a regular occurrence. Otherwise we'll go back to 15% boosts between generations...

1 hour ago, RejZoR said:

I don't think AMD was expecting NVIDIA to make such a massive jump in performance. Or maybe AMD had a massive jump, and NVIDIA managed to find out and leapfrogged them. Time will tell, soon. Whatever happens, I hope AMD does well, because we need strong competition for this to be a regular occurrence. Otherwise we'll go back to 15% boosts between generations...

Both sides would have made their design decisions, with target performance levels, years ago. As time goes on, it gets increasingly difficult to make major changes to what they produce. They can't know for sure ahead of time what they'll get out at the end, even if they have an idea based on the design intent and the characteristics of whichever fab is used. Only later on can they fine-tune some details: trade off power vs. clocks; if not using the full die, decide how much to disable. And of course, they can adjust the price.

 

It does feel like Nvidia decided to go all out this generation, with a mix of architectural changes as well as the benefit of an updated process. For AMD we'll have to wait and see, but my feeling is they'll be ballpark performance-competitive on the parts up to the 3080, and will do what they usually do: try to offer more in some area (VRAM) and/or price aggressively to increase perceived value.


Main system: Asus Maximus VIII Hero, i7-6700k stock, Noctua D14, G.Skill Ripjaws V 3200 2x8GB, Gigabyte GTX 1650, Corsair HX750i, In Win 303 NVIDIA, Samsung SM951 512GB, WD Blue 1TB, HP LP2475W 1200p wide gamut

Desktop Gaming system: Asrock Z370 Pro4, i7-8086k stock, Noctua D15, Corsair Vengeance Pro RGB 3200 4x16GB, Asus Strix 1080Ti, NZXT E850 PSU, Cooler Master MasterBox 5, Optane 900p 280GB, Crucial MX200 1TB, Sandisk 960GB, Acer Predator XB241YU 1440p 144Hz G-sync

TV Gaming system: Asus X299 TUF mark 2, 7920X @ 8c8t, Noctua D15, Corsair Vengeance LPX RGB 3000 8x8GB, Gigabyte RTX 2070, Corsair HX1000i, GameMax Abyss, Samsung 970 Evo 500GB, LG OLED55B9PLA

VR system: Asus Z170I Pro Gaming, i7-6700T stock, Scythe Kozuti, Kingston Hyper-X 2666 2x8GB, Zotac 1070 FE, Corsair CX450M, Silverstone SG13, Samsung PM951 256GB, Crucial BX500 1TB, HTC Vive

Gaming laptop: Asus FX503VD, i5-7300HQ, 2x8GB DDR4, GTX 1050, Sandisk 256GB + 480GB SSD

2 hours ago, porina said:

 

It does feel like Nvidia decided to go all out this generation, with a mix of architectural changes as well as the benefit of an updated process. For AMD we'll have to wait and see, but my feeling is they'll be ballpark performance-competitive on the parts up to the 3080, and will do what they usually do: try to offer more in some area (VRAM) and/or price aggressively to increase perceived value.

Nvidia is willing to break the taboo of the "power hungry" image; that label used to belong to AMD. Nvidia has always been better at performance per watt. If Nvidia had kept the power envelope at the usual 250 W for the high-end card, there wouldn't have been a big jump in performance.

Posted · Original Poster (OP)
7 hours ago, EChondo said:

 

Even though this isn't "2x 2080 Super"

They didn't claim that. They claimed up to 2x the RTX 2080, and in fact, in Minecraft, V-Ray, and various other games/programs, it is.


I am still TechWizardThatNeedsHelp, just less of a mouthful.

 

My beautiful, but not that powerful, main PC:

My new PC I'm saving for:

  • NZXT H1 Matte Black
    • Comes with a 650W NZXT PSU
    • NZXT AIO
  • Ryzen 5 3600
  • MSI B450i GAMING PLUS AC
  • XPG ADATA 2800 MHz
  • RX 480 until I have more for a 2060 or 2070
  • 2 TB Sabrent Rocket Q PCIe Gen 3 NVMe SSD
  • Samsung 470 128 GB SATA SSD
7 hours ago, BroliviaWilde said:

One theory for why it's at that price point:

Nvidia is competing with AMD on two fronts: RDNA2 desktop chips are about to release, and the 3070 sits right at the same price as the new game consoles. Some might even see the 3070 as a gateway: hey, if you are willing to spend $500 on a GPU, maybe you're willing to spend a bit more and get the 3080 instead. They also went with Samsung's 8 nm node, which from what I've read/watched is considerably cheaper per wafer than TSMC's 7 nm. Combine that with the sourness over the significant price increase of the 2080 Ti over the 1080 Ti, and it just adds up.

People are forgetting that the RTX cards sold terribly. They blew a hole in Nvidia's quarterly financials until the GTX cards came out. Consumer Turing 1.0 (RTX) is likely Nvidia's worst-selling product stack since the early 2000s. Turing 1.5 (GTX) sold enough to cover it up in their numbers, while the server versions of the Turing 1.0 chips are why their stock is skyrocketing: when you can sell the silicon of a $500 gaming GPU for >$5k in a server card, the profit margins are massive.

 

What we're seeing is that Nvidia screwed up, they know it, and they also know AMD isn't taking a couple of years off in the >$500 price bracket. Nvidia knows they can move cards at $800, but they can't move them well at $1,200. The 3080 is really a $750-800 card, judging by what Nvidia is charging AIBs, so Nvidia is also semi-suppressing prices out of the gate, before AMD can cause them issues.

5 hours ago, Deli said:

 Nvidia has always been better at performance per watt.

Not really. They've pulled a Fermi before, and it seems they're doing it again: they couldn't get enough of a performance-per-watt increase out of Samsung's fabs, so they went with a more brute-force approach to hitting the absolute performance they targeted. Something similar happened back then.

2 minutes ago, SpaceGhostC2C said:

Not really. They've pulled a Fermi before, and it seems they're doing it again. It seems they couldn't get enough of a performance-per-watt increase out of Samsung's fabs, so they went with a more brute-force approach to hitting the absolute performance they targeted. Something similar to what happened back then.

I almost wanted to mention Fermi, the infamous GTX 480. But that was a long time ago; not sure how many people still rock Fermi cards now.

Posted · Original Poster (OP)
1 minute ago, Deli said:

GTX480.

I read that as "RX 480" and almost said that I had one.


3 hours ago, Deli said:

I almost wanted to mention Fermi, the infamous GTX 480. But that was a long time ago; not sure how many people still rock Fermi cards now.

The point is that AMD, Nvidia, and Intel will all sacrifice their perf/watt numbers when they deem it necessary to hit their intended perf/$ targets in the consumer space (you can't dismiss perf/watt as easily for servers), and each has shown it in the past.

11 hours ago, Gundar said:

I mean, for small-form-factor water cooling, that PCB is a godsend.

Is it though? It's a weird shape; it will probably be hard to hide.

Just now, cj09beira said:

Is it though? It's a weird shape; it will probably be hard to hide.

It's probably a lot better than trying to stick a foot-long PCB in an ITX case.



 

