3080 benchmarks are in! Are they good?

Helpful Tech Witch

CPU scaling

 

https://www.tomshardware.com/features/nvidia-geforce-rtx-3080-ampere-cpu-scaling-benchmarks

 

TLDR:

If you have an old and/or slow CPU and you want to get a 3080, make sure you're running at 4K.

CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w Corsair RM 750w Gold (2021)|

VDU: Panasonic 42" Plasma | GPU: Gigabyte 1080ti Gaming OC & Barrow Block (RIP)...GTX 980ti | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + WD Blue 1TB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P | NF-A12x25 fans |


3 hours ago, ONOTech said:

You could use DSR in the meantime.

 

2 hours ago, CTR640 said:

You can still game at 1440p on your current 1080p monitor using Nvidia DSR. I do that too: I play GTAV at 1440p on my 1080p monitor, and GTA4 at 2160p.

I guess that makes sense. Would it look any good though?

 

1 hour ago, comander said:

At 1080p you'll probably have a roughly equally CPU- and GPU-bottlenecked system.

Wouldn't my CPU bottleneck the 3080 though? Why would it also be the other way around?


I'm now pondering whether the 3070 is a worthwhile upgrade from a 1080 for 1440p 165Hz gaming.

PC - NZXT H510 Elite, Ryzen 5600, 16GB DDR3200 2x8GB, EVGA 3070 FTW3 Ultra, Asus VG278HQ 165hz,

 

Mac - 1.4ghz i5, 4GB DDR3 1600mhz, Intel HD 5000.  x2

 

Endlessly wishing for a BBQ in space.


Well, that's just marketing, isn't it? 80%? That's close to 100%, so let's just say double the performance. It rolls off the tongue nicer and really gets people's attention. But it's still no slouch, so I don't see a reason why people should be absolutely disappointed. Would you rather get an 80% improvement, or just the 20% generational improvement that's typical in the tech space?

Intel® Core™ i7-12700 | GIGABYTE B660 AORUS MASTER DDR4 | Gigabyte Radeon™ RX 6650 XT Gaming OC | 32GB Corsair Vengeance® RGB Pro SL DDR4 | Samsung 990 Pro 1TB | WD Green 1.5TB | Windows 11 Pro | NZXT H510 Flow White
Sony MDR-V250 | GNT-500 | Logitech G610 Orion Brown | Logitech G402 | Samsung C27JG5 | ASUS ProArt PA238QR
iPhone 12 Mini (iOS 17.2.1) | iPhone XR (iOS 17.2.1) | iPad Mini (iOS 9.3.5) | KZ AZ09 Pro x KZ ZSN Pro X | Sennheiser HD450bt
Intel® Core™ i7-1265U | Kioxia KBG50ZNV512G | 16GB DDR4 | Windows 11 Enterprise | HP EliteBook 650 G9
Intel® Core™ i5-8520U | WD Blue M.2 250GB | 1TB Seagate FireCuda | 16GB DDR4 | Windows 11 Home | ASUS Vivobook 15 
Intel® Core™ i7-3520M | GT 630M | 16 GB Corsair Vengeance® DDR3 |
Samsung 850 EVO 250GB | macOS Catalina | Lenovo IdeaPad P580


Well, at least my guesses on the power draw were right: 325W stock, 375W OC. 650W is enough for most people.
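That 650W figure checks out with some rough arithmetic. A minimal sketch; the CPU and "rest of system" figures below are illustrative assumptions, not measurements:

```python
# Rough PSU headroom check for a 3080 build.
# GPU figures are the reported stock/OC draws; CPU and peripheral
# numbers are assumed worst-case values for illustration.
GPU_STOCK_W = 325
GPU_OC_W = 375

def system_draw(gpu_w, cpu_w=150, rest_w=75):
    """Estimate worst-case system draw: GPU + CPU + board/drives/fans."""
    return gpu_w + cpu_w + rest_w

def headroom(psu_w, draw_w):
    """Fraction of PSU capacity left at the given draw."""
    return 1 - draw_w / psu_w

draw = system_draw(GPU_OC_W)                # 600 W worst case
print(draw, round(headroom(650, draw), 2))  # 600 0.08
```

So even an overclocked 3080 plus a hot CPU leaves a 650W unit with a little margin; 750W just buys more comfort.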

 

MAIN PC:

CPU: Intel® Core™ i9-9900K Processor  Motherboard: Gigabyte Z390 Aorus Pro Wifi  CPU Cooler: Scythe Fuma 2  GPU: EVGA RTX 3080 FTW3 Ultra  RAM: Corsair Vengeance 32GB (4x8GB) 3000Mhz CL15

Case: CoolerMaster TD500 Mesh PSU: Thermaltake GF1 PE 750w Storage: 1TB Western Digital Blue 3D + 1TB Crucial P1 + 1TB ADATA XPG Gammix S11 Pro + 4TB Seagate Barracuda 5400RPM OS: Windows 10 Home

Headphones: Philips SHP9500s   Keyboard: Corsair K70 RGB MK.2 Cherry MX Red  Displays: Gigabyte M27Q (27" 1440p 170hz IPS), Samsung UN32EH4003FXZA (32" 768p 60hz TV)

 

SECONDARY PC:

CPU: Intel® Core™ i3-9100F Processor  Motherboard: ASRock Z390 Phantom Gaming 4-CB  CPU Cooler: Arctic Alpine 12 CO  GPU: EVGA RTX 3060 XC RAM: ADATA XPG 16GB (2x8GB) 2400Mhz CL16

Case: CyberpowerPC Onyxia  PSU: ATNG ATA-B 800w 80 Plus Bronze  Storage: 500GB Samsung 850 EVO + 2TB Seagate FireCuda SSHD 5400RPM    OS: Windows 10 Home

 

Former parts that I've used: Acer XG270HU, Asus Dual OC 2080, Gigabyte Aorus Master 3080, Gigabyte Gaming OC 3080, EVGA XC3 Ultra 3080, EVGA FTW3 Ultra 3080 Ti


As usual, a good product that seems a little better than it actually is due to modern marketing practices.


2 hours ago, GDRRiley said:

Don't rush out and buy cards; wait for RDNA2 to be out.

I'm not in a rush to upgrade. It would be nice if AMD had something that could at least compete with a 3070 with more VRAM, but I'm not getting hyped for anything from AMD.

And wow, that 325W power draw at stock. I wonder how much power AIB cards will use.


So Nvidia showed the numbers that put the card in the very best light compared to the last generation.
I feel like we go through this every generation, and every couple of generations a hype train gets built up.
 


5 minutes ago, valdyrgramr said:

I literally told everyone, with evidence, that Nvidia lied on the Doom test and exaggerated. What they were doing was using older drivers on the 2080 Ti to misrepresent the data.

...did they though?

[Image: Doom Eternal 3840x2160 benchmark chart]

Desktop: Intel Core i9-9900K | ASUS Strix Z390-F | G.Skill Trident Z Neo 2x16GB 3200MHz CL14 | EVGA GeForce RTX 2070 SUPER XC Ultra | Corsair RM650x | Fractal Design Define R6

Laptop: 2018 Apple MacBook Pro 13"  --  i5-8259U | 8GB LPDDR3 | 512GB NVMe

Peripherals: Leopold FC660C w/ Topre Silent 45g | Logitech MX Master 3 & Razer Basilisk X HyperSpeed | HIFIMAN HE400se & iFi ZEN DAC | Audio-Technica AT2020USB+

Display: Gigabyte G34WQC


Seems like the 3000 series is going to be a pretty convincing win along the lines of 900 series -> 1000 series.

 

2000 series owners are getting it from both ends: crushed on resale value, and never really had good enough performance for 4K anyway.

Workstation:  13700k @ 5.5Ghz || Gigabyte Z790 Ultra || MSI Gaming Trio 4090 Shunt || TeamGroup DDR5-7800 @ 7000 || Corsair AX1500i@240V || whole-house loop.

LANRig/GuestGamingBox: 9900nonK || Gigabyte Z390 Master || ASUS TUF 3090 650W shunt || Corsair SF600 || CPU+GPU watercooled 280 rad pull only || whole-house loop.

Server Router (Untangle): 13600k @ Stock || ASRock Z690 ITX || All 10Gbe || 2x8GB 3200 || PicoPSU 150W 24pin + AX1200i on CPU|| whole-house loop

Server Compute/Storage: 10850K @ 5.1Ghz || Gigabyte Z490 Ultra || EVGA FTW3 3090 1000W || LSI 9280i-24 port || 4TB Samsung 860 Evo, 5x10TB Seagate Enterprise Raid 6, 4x8TB Seagate Archive Backup ||  whole-house loop.

Laptop: HP Elitebook 840 G8 (Intel 1185G7) + 3080Ti Thunderbolt Dock, Razer Blade Stealth 13" 2017 (Intel 8550U)


2 hours ago, HelpfulTechWizard said:

It looks like they exaggerated how much faster the 3080 is over the 2080.

how so?


11 minutes ago, valdyrgramr said:

See above. I never hit under 144 with mine, and they're using a higher-binned card with an older driver.

So now you're implying not just that Nvidia purposely gimped the 2080 Ti to make the 3080 look better in comparison... but that TechPowerUp did too?

And Eurogamer too, apparently:

Spoiler: [Image: benchmark chart]

 

Maybe they just ran their benchmark in a different level/area than the one you're in in the screenshots? In my experience playing the game, performance varies quite a bit depending on the level and where exactly I am in it.



I laugh at the so-called bottlenecks. It was never a great way of thinking unless your system is ludicrously unbalanced; any consumer 6-core since Coffee Lake or Zen 2 should be fine for any regular gamer with a 3080. I propose the "good enough" test for deciding whether you should pair a CPU and GPU together. In short, both have to be good enough to do what you want them to do. That's it. It doesn't matter if one is limiting the other as long as it reaches the performance target.
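The "good enough" rule can be sketched as a toy model: delivered frame rate is simply the lower of the CPU ceiling and the GPU ceiling, and which side is lower doesn't matter if the target is met. All numbers here are made up for illustration:

```python
# Toy model of the "good enough" pairing rule: delivered fps is
# whichever of the CPU or GPU frame-rate ceilings is lower.
def delivered_fps(cpu_fps_ceiling, gpu_fps_ceiling):
    return min(cpu_fps_ceiling, gpu_fps_ceiling)

def pairing_ok(cpu_fps, gpu_fps, target_fps):
    """It doesn't matter which part limits, only whether the target is met."""
    return delivered_fps(cpu_fps, gpu_fps) >= target_fps

# Hypothetical: CPU caps at 160 fps, a 3080 caps at 200 fps here.
print(pairing_ok(160, 200, 144))  # True  -- CPU "bottleneck", still fine
print(pairing_ok(160, 200, 165))  # False -- target missed, upgrade something
```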

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


2 hours ago, SolarNova said:

CPU scaling

 

https://www.tomshardware.com/features/nvidia-geforce-rtx-3080-ampere-cpu-scaling-benchmarks

 

TLDR:

If you have an old and/or slow CPU and you want to get a 3080, make sure you're running at 4K.

 

How dare you! I'll be getting the 3090, and running at 720p!

5950X | NH D15S | 64GB 3200Mhz | RTX 3090 | ASUS PG348Q+MG278Q

 


3 minutes ago, porina said:

I laugh at the so-called bottlenecks. It was never a great way of thinking unless your system is ludicrously unbalanced; any consumer 6-core since Coffee Lake or Zen 2 should be fine for any regular gamer with a 3080. I propose the "good enough" test for deciding whether you should pair a CPU and GPU together. In short, both have to be good enough to do what you want them to do. That's it. It doesn't matter if one is limiting the other as long as it reaches the performance target.

You can have bottlenecks that aren't desirable though (e.g. a 2200G with a GTX 1060), but in the case of the 3000 series I'd rather see it as "reserve", as long as you have a decent CPU (3600 and up).

The direction tells you... the direction

-Scott Manley, 2021

 

Softwares used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


16 minutes ago, AnonymousGuy said:

900 series -> 1000 series

Nah, this is never coming back, not with conventional tech. Moore's law (or rather its inevitable death) is a real thing and not just a meme.



1 minute ago, Mark Kaine said:

Nah, this is never coming back, not with conventional tech. Moore's law (or rather its inevitable death) is a real thing and not just a meme.

I mean... is this not already happening with 2000 -> 3000? The 3080 is already about 30% better than a 2080 Ti, so the 50-60% jump that happened from the 980 Ti to the 1080 Ti is plausible for the 2080 Ti vs the 3090?



My 2 cents:

You can't just compare the TFLOPS numbers; 2x TFLOPS =/= 2x real life performance.

RTX 3070 > RTX 2080 Ti because TFLOPS numbers...cuz nVidia said so...

 

RTX 3080 still a decent card, though.
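To put numbers on the TFLOPS point: Ampere's doubled FP32 throughput per SM makes the paper ratio over the 2080 Ti roughly 2.2x, while early reviews put the real-world gaming gain nearer 30%. A rough sketch; the core counts and boost clocks are the public specs, and the ~30% figure is an assumption taken from review data:

```python
# Why paper TFLOPS overstate gaming gains.
def fp32_tflops(cuda_cores, boost_ghz):
    # 2 FLOPs (one FMA) per core per clock, scaled to TFLOPS
    return 2 * cuda_cores * boost_ghz / 1000

rtx_3080   = fp32_tflops(8704, 1.71)   # ~29.8 TFLOPS
rtx_2080ti = fp32_tflops(4352, 1.545)  # ~13.4 TFLOPS

paper_ratio = rtx_3080 / rtx_2080ti    # ~2.2x on paper
real_ratio = 1.30                      # ~30% faster in games (assumed from reviews)
print(round(paper_ratio, 2), real_ratio)  # 2.21 1.3
```

The gap exists because games can't keep all those FP32 units fed every clock; compute-bound workloads get much closer to the paper number.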

Intel Z390 Rig ( *NEW* Primary )

Intel X99 Rig (Officially Decommissioned, Dead CPU returned to Intel)

  • i7-8086K @ 5.1 GHz
  • Gigabyte Z390 Aorus Master
  • Sapphire NITRO+ RX 6800 XT S.E + EKwb Quantum Vector Full Cover Waterblock
  • 32GB G.Skill TridentZ DDR4-3000 CL14 @ DDR-3400 custom CL15 timings
  • SanDisk 480 GB SSD + 1TB Samsung 860 EVO +  500GB Samsung 980 + 1TB WD SN750
  • EVGA SuperNOVA 850W P2 + Red/White CableMod Cables
  • Lian-Li O11 Dynamic EVO XL
  • Ekwb Custom loop + 2x EKwb Quantum Surface P360M Radiators
  • Logitech G502 Proteus Spectrum + Corsair K70 (Red LED, anodized black, Cherry MX Browns)

AMD Ryzen Rig

  • AMD R7-5800X
  • Gigabyte B550 Aorus Pro AC
  • 32GB (16GB X 2) Crucial Ballistix RGB DDR4-3600
  • Gigabyte Vision RTX 3060 Ti OC
  • EKwb D-RGB 360mm AIO
  • Intel 660p NVMe 1TB + Crucial MX500 1TB + WD Black 1TB HDD
  • EVGA P2 850W + White CableMod cables
  • Lian-Li LanCool II Mesh - White

Intel Z97 Rig (Decomissioned)

  • Intel i5-4690K 4.8 GHz
  • ASUS ROG Maximus VII Hero Z97
  • Sapphire Vapor-X HD 7950 EVGA GTX 1070 SC Black Edition ACX 3.0
  • 20 GB (8GB X 2 + 4GB X 1) Corsair Vengeance DDR3 1600 MHz
  • Corsair A50 air cooler  NZXT X61
  • Crucial MX500 1TB SSD + SanDisk Ultra II 240GB SSD + WD Caviar Black 1TB HDD + Kingston V300 120GB SSD [non-gimped version]
  • Antec New TruePower 550W EVGA G2 650W + White CableMod cables
  • Cooler Master HAF 912 White NZXT S340 Elite w/ white LED stips

AMD 990FX Rig (Decommissioned)

  • FX-8350 @ 4.8 / 4.9 GHz (given up on the 5.0 / 5.1 GHz attempt)
  • ASUS ROG Crosshair V Formula 990FX
  • 12 GB (4 GB X 3) G.Skill RipJawsX DDR3 @ 1866 MHz
  • Sapphire Vapor-X HD 7970 + Sapphire Dual-X HD 7970 in Crossfire  Sapphire NITRO R9-Fury in Crossfire *NONE*
  • Thermaltake Frio w/ Cooler Master JetFlo's in push-pull
  • Samsung 850 EVO 500GB SSD + Kingston V300 120GB SSD + WD Caviar Black 1TB HDD
  • Corsair TX850 (ver.1)
  • Cooler Master HAF 932

 

<> Electrical Engineer , B.Eng <>

<> Electronics & Computer Engineering Technologist (Diploma + Advanced Diploma) <>

<> Electronics Engineering Technician for the Canadian Department of National Defence <>


3 minutes ago, Mark Kaine said:

You can have bottlenecks though that aren't desirable (for ex 2200g and gtx 1060) but in the case of 3xxx I'd rather see it as "reserve" as long you have a decent CPU (3600 and up)

I guess the biggest things I hate about "bottleneck" posts are twofold:

1, the same "bottleneck" won't happen in all possible variations of settings. You have to define the test case.

2, for some reason it is better for the GPU to be limiting, not the CPU.

I can kinda understand #2 if you're doing a tight-budget build, where you probably want to overspend on one part or the other. The first part is usually not well defined, and varies enough with game title even if you pick a resolution and quality-class target.

While on the topic, the other thing people have been flooding the forum with is the PCIe 3.0 vs 4.0 question. It was linked somewhere earlier in the thread, but TechPowerUp did the testing, and we're looking at around a 1% difference. It is really not something to be concerned about. They have also done testing putting red and blue CPUs against each other, specifically the 3900XT and the 10900K. In short, red is still behind on average: 10% down at 1080p, 7% down at 1440p, and near enough parity at 4K. The PCIe speed difference is nothing in comparison.
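For context on why the PCIe difference is so small, here are back-of-the-envelope bandwidth numbers (a sketch, not a benchmark): gen4 doubles the link rate, but games rarely saturate even a gen3 x16 link.

```python
# Theoretical PCIe link bandwidth in GB/s.
# Gen3 (8 GT/s) and gen4 (16 GT/s) both use 128b/130b encoding:
# 128 payload bits are carried per 130 line bits.
def pcie_gb_s(gt_per_s, lanes):
    return gt_per_s * lanes * (128 / 130) / 8

gen3_x16 = pcie_gb_s(8, 16)    # ~15.75 GB/s
gen4_x16 = pcie_gb_s(16, 16)   # ~31.5 GB/s
print(round(gen3_x16, 2), round(gen4_x16, 2))  # 15.75 31.51
```

Doubling a pipe that isn't full doesn't buy frames, which matches the ~1% result.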



2 minutes ago, -rascal- said:

My 2-cents.

You can't just compare the TFLOPS numbers; 2x TFLOPS =/= 2x real life performance.

RTX 3070 = RTX 2080 Ti because TFLOPS numbers...smh

 

RTX 3080 still a decent card, though.

I think it's a much better measurement than reviews or benchmarks, because both will inevitably be biased in one way or another (and not even necessarily on purpose).

The fact that you don't get 2x in-game performance with 2x TFLOPS is something one should know, though, I agree.

(Though you do, "in theory".) :)

5 minutes ago, AnonymousGuy said:

2080Ti vs 3090?

Shouldn't the 3090 be compared to the Titan RTX, though?



I recently upgraded to a 1440p 144Hz monitor, and my 2070 is struggling even without any RTX games yet. I think I might spring for a 3080 and try to get some resale value for my old card; thankfully my CPU isn't a bottleneck at all.

Mini-ITX Desktop: i9-9900K@5GHz, 32GB TridentZ RGB 3200MHz, Asus Strix Z390-i, EVGA 3090 Hybrid FTW3, Samsung 970 EVO+ NVMe 1TB, Lian Li O11 Air Mini White

Plex/Minecraft Server: Dell PowerEdge T320, Xeon E5-2470 v2, 48GB RAM, 19.25TB storage, RTX A2000 6GB

Tablet: iPad Pro 11” M1


14 minutes ago, porina said:

I guess the biggest things I hate about "bottleneck" posts are twofold:

1, the same "bottleneck" won't happen in all possible variations of settings. You have to define the test case.

2, for some reason it is better for the GPU to be limiting, not the CPU.

I can kinda understand #2 if you're doing a tight-budget build, where you probably want to overspend on one part or the other. The first part is usually not well defined, and varies enough with game title even if you pick a resolution and quality-class target.

While on the topic, the other thing people have been flooding the forum with is the PCIe 3.0 vs 4.0 question. It was linked somewhere earlier in the thread, but TechPowerUp did the testing, and we're looking at around a 1% difference. It is really not something to be concerned about. They have also done testing putting red and blue CPUs against each other, specifically the 3900XT and the 10900K. In short, red is still behind on average: 10% down at 1080p, 7% down at 1440p, and near enough parity at 4K. The PCIe speed difference is nothing in comparison.

Just adding on to this: I think for last gen we were able to somewhat recommend a 3300X/3600 with up to a 2070 Super. For this gen it's important to note that every CPU can bottleneck a 3080 below 4K, and sometimes even at 4K. We need faster CPUs; waiting for Zen 3 will make your $700 GPU run faster.

 

Here are some key points I saw:

 

In general, performance was ~20% faster.

With RT on, it jumps to about 25-30%.

The CPU bottleneck is severe in some cases.

PCIe bottleneck: ~1%.

If doing a new build, wait for Zen 3.

I looked at the 3090 specs again and am now expecting the 3090 to be 25% faster than the 3080.

The 3070 will take the price-to-performance crown, but the 3080 is king for now.

 

Just waiting on the hero who buys a 3080 to use with a 1080p/144Hz monitor and plays with RT on/DLSS off.
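On the "3090 25% faster" guess: scaling the announced shader counts and boost clocks linearly suggests closer to 20%, and real games usually scale worse than linearly (memory bandwidth helps the 3090 a bit more). A quick sanity-check sketch, assuming the announced specs and linear scaling:

```python
# Naive spec-ratio estimate: relative shader throughput = cores * clock.
# Real-world scaling will be lower; this is an upper-bound sketch.
def spec_ratio(cores_a, ghz_a, cores_b, ghz_b):
    return (cores_a * ghz_a) / (cores_b * ghz_b)

# 3090: 10496 cores @ ~1.70 GHz boost; 3080: 8704 cores @ ~1.71 GHz boost
r = spec_ratio(10496, 1.70, 8704, 1.71)
print(round(r, 2))  # 1.2
```

So ~20% on paper; 25% in games would be optimistic.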

 

5950x 1.33v 5.05 4.5 88C 195w ll R20 12k ll drp4 ll x570 dark hero ll gskill 4x8gb 3666 14-14-14-32-320-24-2T (zen trfc)  1.45v 45C 1.15v soc ll 6950xt gaming x trio 325w 60C ll samsung 970 500gb nvme os ll sandisk 4tb ssd ll 6x nf12/14 ippc fans ll tt gt10 case ll evga g2 1300w ll w10 pro ll 34GN850B ll AW3423DW

 

9900k 1.36v 5.1avx 4.9ring 85C 195w (daily) 1.02v 4.3ghz 80w 50C R20 temps score=5500 ll D15 ll Z390 taichi ult 1.60 bios ll gskill 4x8gb 14-14-14-30-280-20 ddr3666bdie 1.45v 45C 1.22sa/1.18 io  ll EVGA 30 non90 tie ftw3 1920//10000 0.85v 300w 71C ll  6x nf14 ippc 2000rpm ll 500gb nvme 970 evo ll l sandisk 4tb sata ssd +4tb exssd backup ll 2x 500gb samsung 970 evo raid 0 llCorsair graphite 780T ll EVGA P2 1200w ll w10p ll NEC PA241w ll pa32ucg-k

 

prebuilt 5800 stock ll 2x8gb ddr4 cl17 3466 ll oem 3080 0.85v 1890//10000 290w 74C ll 27gl850b ll pa272w ll w11

 


2 hours ago, Consul said:

 

I guess that makes sense. Would it look any good though?

 

Wouldn't my CPU bottleneck the 3080 though? Why would it also be the other way around?

No, your CPU is fine. No need to worry. Just use DSR and crank the resolution up to 1440p, or even 2160p if your GPU can handle it.

I'm currently using an i7 4770 and a GTX 1080 Ti, and I too use DSR. No problems.
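For anyone wondering what DSR factor those targets correspond to on a 1080p panel, it's just the ratio of total pixel counts, which can be sketched as:

```python
# DSR factor = rendered pixel count / native pixel count.
def dsr_factor(native, target):
    nw, nh = native
    tw, th = target
    return (tw * th) / (nw * nh)

print(round(dsr_factor((1920, 1080), (2560, 1440)), 2))  # 1.78
print(round(dsr_factor((1920, 1080), (3840, 2160)), 2))  # 4.0
```

The 1.78x and 4.00x entries in the driver's DSR factor list line up with 1440p and 2160p on a 1080p display.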

DAC/AMPs:

Klipsch Heritage Headphone Amplifier

Headphones: Klipsch Heritage HP-3 Walnut, Meze 109 Pro, Beyerdynamic Amiron Home, Amiron Wireless Copper, Tygr 300R, DT880 600ohm Manufaktur, T90, Fidelio X2HR

CPU: Intel 4770, GPU: Asus RTX3080 TUF Gaming OC, Mobo: MSI Z87-G45, RAM: DDR3 16GB G.Skill, PC Case: Fractal Design R4 Black non-iglass, Monitor: BenQ GW2280


Tfw I don't feel disappointed about it, 'cause all I want is a silent 1080p system.

One day I will be able to play Monster Hunter Frontier in French/Italian/English on my PC, it's just a matter of time... 4 5 6 7 8 9 years later: It's finally coming!!!

Phones: iPhone 4S/SE | LG V10 | Lumia 920 | Samsung S24 Ultra

Laptops: Macbook Pro 15" (mid-2012) | Compaq Presario V6000

Other: Steam Deck

<>EVs are bad, they kill the planet and remove freedoms too some/<>


Part of me wants to just grab an RTX 3080 and call it a day, and the other part of me wants to wait for the RX 6000 series and better general availability. The only problem is Cyberpunk 2077 getting released in the meantime, and I'd really like to play it all maxed out with ray tracing and all that stuff. Also, the amount of raw compute horsepower should help me run the RTGI ReShade shader more easily on non-RTX games, since it's done through regular shaders. Argh.

