3080 benchmarks are in! Are they good?

Helpful Tech Witch
5 hours ago, RejZoR said:

Would be nice if they compared RTX 3080 to GTX 1080Ti since those people are most likely to be doing upgrades, realistically...

GN compared it in their review. 3080 is about 70-90% faster at 1440p and 4k. Looking like a nice upgrade for those of us on 1080Ti that smelled the 20 series scam from a mile away. I'll just be kicking back and waiting to see what AMD does and for stock to stabilize.


RTX 3080 tested on various CPUs to measure bottlenecks: https://www.tomshardware.com/features/nvidia-geforce-rtx-3080-ampere-cpu-scaling-benchmarks

 

Edit: I just saw that @SolarNova already posted it.

 

[Attached chart: Tom's Hardware RTX 3080 CPU scaling, 4770K OC (1 of 2)]

 

[Attached chart: Tom's Hardware RTX 3080 CPU scaling, 4770K OC (2 of 2)]

You own the software that you purchase - Understanding software licenses and EULAs

 

"We’ll know our disinformation program is complete when everything the american public believes is false" - William Casey, CIA Director 1981-1987


15 minutes ago, DutchGuyTom said:

.

For last gen I don't have the detailed math anymore, but if there's a budget limit, the 3600 was almost always the choice, since a better GPU will give more performance. Once we get into 2080S territory, it gets very interesting whether you want to go from a 3600 to a 10600K for the extra performance at ~$150 more (CPU cooler + CPU cost difference). I almost never recommended a 3700X, but that's just me. There's also a meme that a 3600 + 2080 Ti is about the same as a 10600K + 2080S in some use cases.

 

Another way to look at bottlenecks is to baseline off the top CPUs. Most of the time a 10600K/10700K has the same gaming performance as a 10900K, so for most people those are enough. When I talk about bottlenecks in that scenario, it's really how much slower CPU X is compared to at least a 10600K at a given resolution/frame rate. For a $700 GPU I'd really not skip the $150 difference between a 3600 and a 10600K and lose ~10% performance. For older CPUs, just compare them to a 10600K or 10700K to see the % bottleneck; at least with a number, users get an idea of how much they lose instead of just "fine, a little bit, severe".
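The % bottleneck math described above can be sketched in a few lines; all the FPS numbers below are hypothetical placeholders I made up for illustration, not benchmark results:

```python
# Express each CPU's average FPS as a percentage deficit versus a baseline
# CPU (a 10600K here), which is the "% bottleneck" number described above.
baseline_fps = 144.0  # hypothetical 10600K average at 1440p

cpus = {
    "R5 3600": 130.0,   # hypothetical
    "i7-4770K": 105.0,  # hypothetical
}

for name, fps in cpus.items():
    deficit = (1 - fps / baseline_fps) * 100
    print(f"{name}: {deficit:.1f}% slower than the 10600K baseline")
```

Giving users a concrete number like "9.7% slower" is exactly the kind of answer argued for above, instead of "fine / a little bit / severe".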

 

The Tom's Hardware link is awesome, thanks @SolarNova 

https://www.tomshardware.com/features/nvidia-geforce-rtx-3080-ampere-cpu-scaling-benchmarks

 

It's good that we now know the PCIe 3.0 bottleneck is only 1-2%, so we can throw that concern out the window, for the 3080 at least.

 

I'm excited for Zen 3 now that we know the current CPUs are all too slow for the 3080 in some ways.

5950x 1.33v 5.05 4.5 88C 195w ll R20 12k ll drp4 ll x570 dark hero ll gskill 4x8gb 3666 14-14-14-32-320-24-2T (zen trfc)  1.45v 45C 1.15v soc ll 6950xt gaming x trio 325w 60C ll samsung 970 500gb nvme os ll sandisk 4tb ssd ll 6x nf12/14 ippc fans ll tt gt10 case ll evga g2 1300w ll w10 pro ll 34GN850B ll AW3423DW

 

9900k 1.36v 5.1avx 4.9ring 85C 195w (daily) 1.02v 4.3ghz 80w 50C R20 temps score=5500 ll D15 ll Z390 taichi ult 1.60 bios ll gskill 4x8gb 14-14-14-30-280-20 ddr3666bdie 1.45v 45C 1.22sa/1.18 io  ll EVGA 30 non90 tie ftw3 1920//10000 0.85v 300w 71C ll  6x nf14 ippc 2000rpm ll 500gb nvme 970 evo ll l sandisk 4tb sata ssd +4tb exssd backup ll 2x 500gb samsung 970 evo raid 0 llCorsair graphite 780T ll EVGA P2 1200w ll w10p ll NEC PA241w ll pa32ucg-k

 

prebuilt 5800 stock ll 2x8gb ddr4 cl17 3466 ll oem 3080 0.85v 1890//10000 290w 74C ll 27gl850b ll pa272w ll w11

 


9 hours ago, TVwazhere said:

They do every time. Hence why I tell people: wait for the goddamn benchmarks.

 

Now people may or may not get appropriately hyped for these cards. 

I have exactly zero sympathy for people that preordered any of the 30-series cards and now feel like they don't live up to the hype because they listened to the marketing. Stop listening to the marketing, WTF.

🌲🌲🌲

 

 

 

◒ ◒ 


You know what's even more interesting...

 

Ethereum hash rates.

 

The 3080 is roughly the same as a Radeon VII, but the R7 uses slightly less power and will have already paid for itself for those that have one now.

 

So this worry about miners buying up 3080s may not be as big a problem as we think. If they are smart (which is questionable), they would stick with their current Radeon VIIs.
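The payback argument can be sketched as a rough break-even calculation; the cost, revenue, and power figures below are invented for illustration, not real mining numbers:

```python
def payback_days(card_cost, daily_revenue, daily_power_cost):
    """Days until mining revenue covers the card's purchase price."""
    net = daily_revenue - daily_power_cost
    if net <= 0:
        return float("inf")  # never pays itself off
    return card_cost / net

# Hypothetical figures: a newly bought 3080 vs a Radeon VII you already own.
# Similar hash rate means similar revenue, but the R7's capital cost is sunk.
print(payback_days(700.0, 4.00, 0.75))  # new 3080: ~215 days to break even
print(payback_days(0.0, 4.00, 0.70))    # paid-off R7: already at break-even
```

With similar revenue on both cards, the card with zero remaining capital cost wins, which is the point above about miners sticking with their Radeon VIIs.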

CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w Corsair RM 750w Gold (2021)|

VDU: Panasonic 42" Plasma | GPU: Gigabyte 1080ti Gaming OC & Barrow Block (RIP)...GTX 980ti | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + WD Blue 1TB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P | NF-A12x25 fans |


Can't lie, I'm disappointed with the numbers at 1440p. A 2080 Super averages 107 fps vs 154 fps for the 3080. I can't see spending $700 when I'm already over 100 fps in most of my games.

This is a good improvement that we needed to see, with 4K trying to be the thing nowadays. I'm happy that we are seeing playable frame rates at 4K instead of the 40-60 numbers we had before.

I'll stick with my $250 RTX 2080 Super that I bought when people panic-sold cards after the reveal. It still does the job just fine on my 1440p 144Hz monitor.

To everyone that buys this, I hope you enjoy your card. I myself can't spend the money and will wait until the RTX 4000 series comes out, when ray tracing should really be a thing in games and not some random-ass add-on some developers put in.

No cpu mobo or ram atm

2tb wd black gen 4 nvme 

2tb seagate hdd

Corsair rm750x 

Be quiet 500dx 

Gigabyte m34wq 3440x1440

Xbox series x


4 minutes ago, Gohardgrandpa said:

$250 rtx 2080 super

Wow! That is one incredible deal right there

Current System: Ryzen 7 3700X, Noctua NH L12 Ghost S1 Edition, 32GB DDR4 @ 3200MHz, MAG B550i Gaming Edge, 1TB WD SN550 NVME, SF750, RTX 3080 Founders Edition, Louqe Ghost S1


22 minutes ago, Hymenopus_Coronatus said:

Wow! That is one incredible deal right there

Yeah, it's an EVGA XC Ultra that was purchased in November, so it still has over 2 years of warranty on it. I went from a 2060 Super to the 2080 Super. I play games that ran fine on the 2060 Super, so I don't need to upgrade.

Maybe in the next year or two we'll get affordable 4K 144Hz monitors.

No cpu mobo or ram atm

2tb wd black gen 4 nvme 

2tb seagate hdd

Corsair rm750x 

Be quiet 500dx 

Gigabyte m34wq 3440x1440

Xbox series x


I am upgrading from my 1070. Glad I waited, because man, it is going to be a jump.

 

Be sure to @Pickles von Brine if you want me to see your reply!

Stopping by to praise the all mighty jar Lord pickles... * drinks from a chalice of holy pickle juice and tossed dill over shoulder* ~ @WarDance
3600x | NH-D15 Chromax Black | 32GB 3200MHz | ASUS KO RTX 3070 UnderVolted and UnderClocked | Gigabyte Aorus Elite AX X570S | Seasonic X760w | Phanteks Evolv X | 500GB WD_Black SN750 x2 | Sandisk Skyhawk 3.84TB SSD 


1 hour ago, SolarNova said:

You know what's even more interesting...

 

Ethereum hash rates.

 

The 3080 is roughly the same as a Radeon VII, but the R7 uses slightly less power and will have already paid for itself for those that have one now.

 

So this worry about miners buying up 3080s may not be as big a problem as we think. If they are smart (which is questionable), they would stick with their current Radeon VIIs.

As an ex-miner, the numbers really don't impress. Unless some optimization comes in time, they won't change the market and cause miners to chase them. I think AMD might be in more danger there, with them supposedly going for a massive L4 cache, which could speed up some coins quite a bit.

8 minutes ago, DutchGuyTom said:

I wouldn't say that current CPUs are too slow. Current CPUs are fast enough to benefit from an upgrade to the 3080 so long as you are at 4K. 

Even outside of that, only 1080p sees massive bottlenecks. At 1440p you are fine in most cases, and where there is a bottleneck, it's usually at quite a high frame rate anyway.


8 hours ago, Kierax said:

I am pondering now if the 3070 is worthwhile upgrade for a 1080 for 1440p 165hz gaming?

Your CPU will be your bottleneck at 165 FPS. Might be time for an upgrade; even 8th gen will give you a bit of a boost there.

It was mentioned that even going from 8th gen to 10th gen is a 20% increase in FPS at stock GPU settings on the 100+ fps side of the spectrum.

CPU | AMD Ryzen 7 7700X | GPU | ASUS TUF RTX3080 | PSU | Corsair RM850i | RAM 2x16GB X5 6000Mhz CL32 MOTHERBOARD | Asus TUF Gaming X670E-PLUS WIFI | 
STORAGE 
| 2x Samsung Evo 970 256GB NVME  | COOLING 
| Hard Line Custom Loop O11XL Dynamic + EK Distro + EK Velocity  | MONITOR | Samsung G9 Neo


AMD is going to have a hard time convincing me to go over to their side. The 3070's cooler looks better than AMD's GPUs, and it has more features at the same price. This is pure speculation, as I've heard that AMD is trying not to position themselves as the cheaper Nvidia option, but don't quote me on that.

Quote me for a reply, React if I was helpful, informative, or funny

 

AMD blackout rig

 

cpu: ryzen 5 3600 @4.4ghz @1.35v

gpu: rx5700xt 2200mhz

ram: vengeance lpx c15 3200mhz

mobo: gigabyte b550 pro 

psu: cooler master mwe 650w

case: masterbox mbx520

fans:Noctua industrial 3000rpm x6

 


13 minutes ago, Gundar said:

AMD is going to have a hard time convincing me to go over to their side. The 3070's cooler looks better than AMD's GPUs, and it has more features at the same price. This is pure speculation, as I've heard that AMD is trying not to position themselves as the cheaper Nvidia option, but don't quote me on that.

Every air cooler I've seen, I didn't care about until the 3080's. It is a piece of art, and it's fully functional. Great job, Nvidia, but as a hardline cooling fan, for the first time I feel bad about planning to rip one of these apart.

CPU | AMD Ryzen 7 7700X | GPU | ASUS TUF RTX3080 | PSU | Corsair RM850i | RAM 2x16GB X5 6000Mhz CL32 MOTHERBOARD | Asus TUF Gaming X670E-PLUS WIFI | 
STORAGE 
| 2x Samsung Evo 970 256GB NVME  | COOLING 
| Hard Line Custom Loop O11XL Dynamic + EK Distro + EK Velocity  | MONITOR | Samsung G9 Neo


Just now, Maticks said:

Every air cooler I've seen, I didn't care about until the 3080's. It is a piece of art, and it's fully functional. Great job, Nvidia, but as a hardline cooling fan, for the first time I feel bad about planning to rip one of these apart.

I mean, for small form factor water cooling that PCB is a godsend.

Quote me for a reply, React if I was helpful, informative, or funny

 

AMD blackout rig

 

cpu: ryzen 5 3600 @4.4ghz @1.35v

gpu: rx5700xt 2200mhz

ram: vengeance lpx c15 3200mhz

mobo: gigabyte b550 pro 

psu: cooler master mwe 650w

case: masterbox mbx520

fans:Noctua industrial 3000rpm x6

 


Great to see benchmarks finally. I personally favor TPU's benchmarks since I've been reading them for almost a decade now and never been steered wrong.

 

3080 vs 2080 Ti ($999.99, Sept 2018):

14.94% better at 1080p
23.45% at 1440p
31.57% at 4K

3080 vs 2080 Super ($699.99, July 2019):

29.87% at 1080p
42.85% at 1440p
56.25% at 4K

3080 vs 2080 ($699.99, Sept 2018):

35.13% at 1080p
51.51% at 1440p
66.66% at 4K

3080 vs 1080 Ti ($699.99, March 2017):

49.25% at 1080p
69.49% at 1440p
88.67% at 4K

 

So looking at 1440p (cause that's what I run) and focusing on the $699 price point: in 1 year we got 42.85% more performance, in 2 years 51.51%, and in 3 years 69.49%.

 

So the jump between the 1080 Ti and the 2080S was only 18.64%, and we had to wait 2 years for it.

 

1080 Ti to 3080 is 69% in 3 years. That's roughly 43% more performance, at the same price, for waiting an extra year after the 2080S... yeah, Turing sucked big time for price-to-performance.
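The composition in the last few lines can be checked directly; the two input ratios are the 1440p numbers from the list above:

```python
# Relative gains compose multiplicatively, so the 2080S-over-1080Ti jump
# falls out by dividing the two 3080 ratios quoted above.
gain_3080_vs_2080s = 1.4285   # 3080 is 42.85% faster than a 2080 Super
gain_3080_vs_1080ti = 1.6949  # 3080 is 69.49% faster than a 1080 Ti

gain_2080s_vs_1080ti = gain_3080_vs_1080ti / gain_3080_vs_2080s
print(f"{(gain_2080s_vs_1080ti - 1) * 100:.2f}%")  # ~18.65%
```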

 

Even though this isn't "2x 2080 Super", this is still a HUGE jump and makes me question...why these prices? Why did Nvidia decide to sell at such a competitive price compared to previous generations? They could have sold the 3080 for $999 easy, no problem. Hell, they could have sold the 3080 for $1200 and bumped the 3090 to $2000.

 

I'm glad I returned the 2060 I bought recently for an SFF build and am using my old GTX 970 in its place, because this is quite a big jump. But now I'm super curious about what AMD has in store.

 

AMD has not been competitive in the past, especially these past 2 years during Turing, when Nvidia seemed to only release small upgrades at high cost. But with Nvidia coming down in price-to-performance, this really makes me want to believe AMD has something this time around, especially since the new consoles (albeit with cut-down GPUs) do seem to claim great performance (4K 60fps, with some multiplayer at 4K 120fps).

 

Could this be AMD's redemption period? I don't think I would place a monetary bet, but I'm definitely not buying a 3000 series while AMD seemingly has potential if Nvidia is acting this way.

 

November is going to be quite interesting!


2 minutes ago, EChondo said:

Why did Nvidia decide to sell at such a competitive price compared to previous generations? They could have sold the 3080 for $999 easy, no problem. Hell, they could have sold the 3080 for $1200 and bumped the 3090 to $2000.

 

 

One theory why it's at that price point: competing with AMD on two fronts. RDNA2 desktop chips are about to release, and the 3070 sits right at the same price as the new game consoles; some might even see the 3070 as a gateway: if you're willing to spend $500 on a GPU, maybe you're willing to spend a bit more and get the 3080 instead. They also went with Samsung's 8nm node, which from what I've read/watched is considerably cheaper per wafer than TSMC's 7nm. Combine that with the sourness over the significant price increase of the 2080 Ti over the 1080 Ti, and it all adds up.

Rock On!


Today, 09/17 at 7:59 AM GMT+1, orders for the RTX 3080 still haven't opened...

 

https://www.nvidia.com/fr-fr/geforce/graphics-cards/30-series/

PC #1 : Gigabyte Z170XP-SLI | i7-7700 | Cryorig C7 Cu | 32GB DDR4-2400 | LSI SAS 9211-8i | 240GB NVMe M.2 PCIe PNY CS2030 | SSD&HDDs 59.5TB total | Quantum LTO5 HH SAS drive | GC-Alpine Ridge | Corsair HX750i | Cooler Master Stacker STC-T01 | ASUS TUF Gaming VG27AQ 2560x1440 @ 60 Hz (plugged HDMI port, shared with PC #2) | Win10
PC #2 : Gigabyte MW70-3S0 | 2x E5-2689 v4 | 2x Intel BXSTS200C | 32GB DDR4-2400 ECC Reg | MSI RTX 3080 Ti Suprim X | 2x 1TB SSD SATA Samsung 870 EVO | Corsair AX1600i | Lian Li PC-A77 | ASUS TUF Gaming VG27AQ 2560x1440 @ 144 Hz (plugged DP port, shared with PC #1) | Win10
PC #3 : Mini PC Zotac 4K | Celeron N3150 | 8GB DDR3L 1600 | 250GB M.2 SATA WD Blue | Sound Blaster X-Fi Surround 5.1 Pro USB | Samsung Blu-ray writer USB | Genius SP-HF1800A | TV Panasonic TX-40DX600E UltraHD | Win10
PC #4 : ASUS P2B-F | PIII 500MHz | 512MB SDR 100 | Leadtek WinFast GeForce 256 SDR 32MB | 2x Guillemot Maxi Gamer 3D² 8MB in SLI | Creative Sound Blaster AWE64 ISA | 80GB HDD UATA | Fortron/Source FSP235-60GI | Zalman R1 | DELL E151FP 15" TFT 1024x768 | Win98SE

Laptop : Lenovo ThinkPad T460p | i7-6700HQ | 16GB DDR4 2133 | GeForce 940MX | 240GB SSD PNY CS900 | 14" IPS 1920x1080 | Win11

PC tablet : Fujitsu Point 1600 | PMMX 166MHz | 160MB EDO | 20GB HDD UATA | external floppy drive | 10.4" DSTN 800x600 touchscreen | AGFA SnapScan 1212u blue | Win98SE

Laptop collection #1 : IBM ThinkPad 340CSE | 486SLC2 66MHz | 12MB RAM | 360MB IDE | internal floppy drive | 10.4" DSTN 640x480 256 color | Win3.1 with MS-DOS 6.22

Laptop collection #2 : IBM ThinkPad 380E | PMMX 150MHz | 80MB EDO | NeoMagic MagicGraph128XD | 2.1GB IDE | internal floppy drive | internal CD-ROM drive | Intel PRO/100 Mobile PCMCIA | 12.1" FRSTN 800x600 16-bit color | Win98

Laptop collection #3 : Toshiba T2130CS | 486DX4 75MHz | 32MB EDO | 520MB IDE | internal floppy drive | 10.4" STN 640x480 256 color | Win3.1 with MS-DOS 6.22

And 6 others computers (Intel Compute Stick x5-Z8330, Giada Slim N10 WinXP, 2 Apple classic and 2 PC pocket WinCE)


1 hour ago, EChondo said:

Great to see benchmarks finally. I personally favor TPU's benchmarks since I've been reading them for almost a decade now and never been steered wrong.

 

 

 

3080 vs 2080 ($699.99(Sept 2018));

26% at 1080p
34% at 1440p
40% at 4k

 

 

I think you made an error in your calculations.

https://www.techpowerup.com/review/nvidia-geforce-rtx-3080-founders-edition/34.html

 

 

3080 vs 2080:

1080p - 74 to 100 = 35% increase

1440p - 66 to 100 = 51% increase

4K - 60 to 100 = 66% increase

 

 

Remember, 50 to 100 is a 100% increase, not a 50% increase. "The 2080 gets 74% of the performance of a 3080" is not the same as "the 3080 is 26% faster than the 2080".

So the 3080 is actually even better than what you calculated in your post.
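The same distinction in code, as a minimal sketch using the TPU-style relative numbers:

```python
def pct_increase(old, new):
    """Percentage increase going from old to new (not 'new as a % of old')."""
    return (new - old) / old * 100

# TPU relative performance at 1080p: 2080 = 74, 3080 = 100
print(pct_increase(74, 100))  # ~35.1% faster, not 26%
print(pct_increase(50, 100))  # doubling is a 100% increase, not 50%
```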


14 hours ago, IAmAndre said:

Now all AMD has to do is match the performance of a 2080/2080Ti, make it consume much less power and sell it for less.

Press X to doubt.

Main Rig :

Ryzen 7 2700X | Powercolor Red Devil RX 580 8 GB | Gigabyte AB350M Gaming 3 | 16 GB TeamGroup Elite 2400MHz | Samsung 750 EVO 240 GB | HGST 7200 RPM 1 TB | Seasonic M12II EVO | CoolerMaster Q300L | Dell U2518D | Dell P2217H | 

 

Laptop :

Thinkpad X230 | i5 3320M | 8 GB DDR3 | V-Gen 128 GB SSD |


I don't think AMD was expecting NVIDIA to make such a massive jump in performance. Or maybe AMD had a massive jump and NVIDIA managed to find out and leapfrogged them. Time will tell. Soon. Whatever happens, I hope AMD does well, because we need strong competition for this to be a regular occurrence. Otherwise we'll go back to 15% boosts between generations...


1 hour ago, RejZoR said:

I don't think AMD was expecting NVIDIA to make such a massive jump in performance. Or maybe AMD had a massive jump and NVIDIA managed to find out and leapfrogged them. Time will tell. Soon. Whatever happens, I hope AMD does well, because we need strong competition for this to be a regular occurrence. Otherwise we'll go back to 15% boosts between generations...

Both sides made their design decisions, with target performance levels, years ago. As time goes on, it gets increasingly difficult to make major changes to what they produce. They can't know for sure ahead of time what they'll get out at the end, even if they have an idea based on the design intent and the target performance of whichever fab is used. Only later on can they fine-tune some details: trade off power vs. clocks, decide how much of the die to disable if not using the full die, and of course adjust the price.

 

It does feel like Nvidia decided to go all out this generation, with a mix of architectural changes as well as the benefit of an updated process. For AMD we'll have to wait and see, but my feeling is they'll be ballpark competitive on performance for the parts up to the 3080, and will do what they usually do: offer more in some area (VRAM) and/or price aggressively to increase perceived value.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


2 hours ago, porina said:

 

It does feel like Nvidia decided to go all out this generation, with a mix of architectural changes as well as the benefit of an updated process. For AMD we'll have to wait and see, but my feeling is they'll be ballpark competitive on performance for the parts up to the 3080, and will do what they usually do: offer more in some area (VRAM) and/or price aggressively to increase perceived value.

Nvidia is willing to break the taboo of the "power hungry" image, which used to belong to AMD. Nvidia has always been better at performance per watt. If Nvidia had kept the power envelope at 250W for the high-end card like before, there wouldn't have been a big jump in performance.


7 hours ago, EChondo said:

 

Even though this isn't "2x 2080 Super"

They didn't claim that. They claimed up to 2x the RTX 2080, and in fact, in Minecraft, V-Ray, and various other games/programs, it is.

I could use some help with this!

please, pm me if you would like to contribute to my gpu bios database (includes overclocking bios, stock bios, and upgrades to gpus via modding)

Bios database

My beautiful, but not that powerful, main PC:



7 hours ago, BroliviaWilde said:

One theory why it's at that price point: competing with AMD on two fronts. RDNA2 desktop chips are about to release, and the 3070 sits right at the same price as the new game consoles; some might even see the 3070 as a gateway: if you're willing to spend $500 on a GPU, maybe you're willing to spend a bit more and get the 3080 instead. They also went with Samsung's 8nm node, which from what I've read/watched is considerably cheaper per wafer than TSMC's 7nm. Combine that with the sourness over the significant price increase of the 2080 Ti over the 1080 Ti, and it all adds up.

People are forgetting that RTX cards sold terribly. They blew a hole in Nvidia's quarterly financials until the GTX cards got out. Consumer Turing 1.0 (RTX) is likely Nvidia's worst-selling product stack since the early 2000s. Turing 1.5 (GTX) sold enough to cover it up in their data, while the server versions of the Turing 1.0 cards are why their stock is skyrocketing. When you can sell $500-class gaming GPUs for >$5k each, the profit margins are massive.

 

What we're seeing is that Nvidia screwed up, they know it, and they also know AMD isn't taking a couple of years off in the >$500 price bracket. Nvidia knows they can move cards at $800, but they can't move them well at $1200. Based on what Nvidia is charging AIBs, the 3080 is really a $750-800 card, so Nvidia is also semi-suppressing prices out of the gate, before AMD can cause them issues.

