
Upgrade from PCIe 3.0 to 4.0 for RTX 3080

Kasenumi

Hello guys,
It's less than two weeks until we see the new RTX graphics cards (in our PC cases). Time for an upgrade and getting ready for new games.
My question is about upgrading my motherboard for the RTX 3080. My current AB350 Gaming 3 motherboard runs an RTX 2080, a Ryzen 7 2700X, and 16 GB of RAM at 3000 MHz (it's rated for 3200 MHz, but that speed causes BSODs; 3000 MHz is stable, probably because of the motherboard).
A friend is advising me to upgrade my B350 board to an X570 for better RAM and GPU performance. He keeps telling me it's a big upgrade in terms of performance.
Would it really be a significant upgrade for 1440p gaming? Or is it better to wait for DDR5 before replacing my B350 board?

Case: Fractal Design Torrent;
GPU: MSI GeForce RTX 2080 GAMING X TRIO 8GB;
RAM: G.Skill Flare X, DDR4, 16 GB, 3200 MHz, CL14;
CPU: AMD Ryzen 7 2700X;
CPU Cooler: Noctua NH-D15;
MOBO: Gigabyte GA-AB350-GAMING 3;
PSU: be quiet! Pure Power 11 CM 700W 80+ Gold;
SSD 1: SanDisk Extreme PRO 500GB PCIe x4 NVMe;
SSD 2: Plextor PX-1TM9PeY 1TB SSD


No. Gen 3 x16 should, for the most part, be enough bandwidth for a 3080. Wait for Ampere to actually launch and check, but it should be fine.

And at higher resolutions the 2700X shouldn't struggle that badly with a 3080, but again, wait and check.

And either way, the 2700X is Gen 3 lol.
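To put rough numbers on the bandwidth claims in this thread, here's a quick back-of-the-envelope sketch using the standard PCIe per-lane rates (function and dict names are just illustrative):

```python
# Approximate usable bandwidth per PCIe lane, in GB/s, after encoding overhead.
# Gen 1/2 use 8b/10b encoding; Gen 3/4 use 128b/130b.
PER_LANE_GBPS = {
    1: 2.5 * 8 / 10 / 8,     # 0.25 GB/s
    2: 5.0 * 8 / 10 / 8,     # 0.5 GB/s
    3: 8.0 * 128 / 130 / 8,  # ~0.985 GB/s
    4: 16.0 * 128 / 130 / 8, # ~1.97 GB/s
}

def pcie_bandwidth(gen: int, lanes: int) -> float:
    """Usable one-direction bandwidth in GB/s for a PCIe link."""
    return PER_LANE_GBPS[gen] * lanes

# Gen 3 x16 and Gen 4 x8 provide the same bandwidth (~15.75 GB/s),
# which is why Gen 3 x16 is rarely the bottleneck for a single GPU.
print(f"Gen3 x16: {pcie_bandwidth(3, 16):.2f} GB/s")
print(f"Gen4 x16: {pcie_bandwidth(4, 16):.2f} GB/s")
print(f"Gen3 x8:  {pcie_bandwidth(3, 8):.2f} GB/s")
```

Doubling the generation or the lane count doubles the link, so Gen 3 x8 = Gen 2 x16, and Gen 4 x8 = Gen 3 x16.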

 

10 hours ago, CircleTech said:

Nvidia GPUs have never used more than PCIe 3.0x4

x4?? They saturated x8 easily, just not x16.


PC: Motherboard: ASUS B550M TUF-Plus, CPU: Ryzen 3 3100, CPU Cooler: Arctic Freezer 34, GPU: GIGABYTE WindForce GTX1650S, RAM: HyperX Fury RGB 2x8GB 3200 CL16, Case: CoolerMaster MB311L ARGB, Boot Drive: 250GB MX500, Game Drive: WD Blue 1TB 7200RPM HDD.

 

Peripherals: GK61 (Optical Gateron Red) with Mistel White/Orange keycaps, Logitech G102 (Purple), BitWit Ensemble Grey Deskpad. 

 

Audio: Logitech G432, Moondrop Starfield, Mic: Razer Siren Mini (White).

 

Phone: Pixel 3a (Purple-ish).

 

Build Log: 


You'll also need to get a Ryzen 3000 CPU in order to get PCIe 4.0.

 

I'd suggest a B550 motherboard instead of an X570. It's less expensive, and gives you all the features.

CPU: Ryzen 7 5800X Cooler: Arctic Liquid Freezer II 120mm AIO with push-pull Arctic P12 PWM fans RAM: G.Skill Ripjaws V 4x8GB 3600 16-16-16-30

Motherboard: ASRock X570M Pro4 GPU: ASRock RX 5700 XT Reference with Eiswolf GPX-Pro 240 AIO Case: Antec P5 PSU: Rosewill Capstone 750M

Monitor: ASUS ROG Strix XG32VC Case Fans: 2x Arctic P12 PWM Storage: HP EX950 1TB NVMe, Mushkin Pilot-E 1TB NVMe, 2x Constellation ES 2TB in RAID1

https://hwbot.org/submission/4497882_btgbullseye_gpupi_v3.3___32b_radeon_rx_5700_xt_13min_37sec_848ms


3 minutes ago, Kasenumi said:

My friend keeps telling me it's a big upgrade in terms of performance. 

No.

Your CPU doesn't support PCIe 4.0 to begin with.

-sigh- feeling like I'm being too negative lately


Just now, CircleTech said:

Nvidia GPUs have never used more than PCIe 3.0 x4 bandwidth anyway, so there should be nothing to worry about.

That is flat-out false. The 2080 Ti was bottlenecked by a 3.0 x8 connection.



3 minutes ago, CircleTech said:

Nvidia GPUs have never used more than PCIe 3.0x4 bandwidth

Pretty sure they do; I think even a 2060 Super does.



10 hours ago, CircleTech said:

I keep thinking the GTX 900 series was only released a couple of years ago

Maxwell is quite old at this point. :P



NVIDIA already confirmed that the GPU's PCIe 4.0 connection will not perform any worse on PCIe 3.0. Also, they ran all their tests on a 10900K system, which only supports PCIe 3.0 to begin with.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


1 minute ago, TofuHaroto said:

Maxwell is quite old at this point. :P

6 years.



Just now, Stahlmann98 said:

NVIDIA already confirmed that the PCIe 4.0 connection of the GPU will not perform any worse on PCIe 3.0.

No, they didn't. They said that CPU choice typically affects it more than PCIe generation, which is not the same thing. PCIe 3.0 on a Ryzen might mean a ~5% performance drop compared to 4.0, but we don't know yet.



Just for context, the benchmarks NVIDIA showed were all run on a 10900K, which doesn't even support PCIe 4.0. The real thing holding you back in gaming performance compared to the 10900K is the CPU, not the PCIe gen.

GUITAR BUILD LOG FROM SCRATCH OUT OF APPLEWOOD

 

- Ryzen Build -

R5 3600 | MSI X470 Gaming Plus MAX | 16GB CL16 3200MHz Corsair LPX | Dark Rock 4

MSI 2060 Super Gaming X

1TB Intel 660p | 250GB Kingston A2000 | 1TB Seagate Barracuda | 2TB WD Blue

be quiet! Silent Base 601 | be quiet! Straight Power 550W CM

2x Dell UP2516D

 

- First System (Retired) -

Intel Xeon 1231v3 | 16GB Crucial Ballistix Sport Dual Channel | Gigabyte H97 D3H | Gigabyte GTX 970 Gaming G1 | 525 GB Crucial MX 300 | 1 TB + 2 TB Seagate HDD
be quiet! 500W Straight Power E10 CM | be quiet! Silent Base 800 with stock fans | be quiet! Dark Rock Advanced C1 | 2x Dell UP2516D

Reviews: be quiet! Silent Base 800 | MSI GTX 950 OC

 


40 minutes ago, Stahlmann98 said:

NVIDIA already confirmed that the PCIe 4.0 connection of the GPU will not perform any worse on PCIe 3.0. Also they did all their tests on a 10900K system, which only supports PCIe 3.0 to begin with.

That doesn't mean it won't do better on PCIe 4.0, though. Nvidia is in a really tough spot. They need to sell these cards to Intel owners, because up until this point Intel has always had an edge in gaming performance, so a lot of hardcore gamers (i.e. exactly the people who will be rushing out on launch day to scoop these cards up) are on Intel. However, Intel doesn't support PCIe 4.0, and probably won't for another year or two.

 

If I were Nvidia, that's exactly how I would have played it. Bench it on Intel as a worst-case scenario. Then, if the AMD folks with their shiny PCIe 4.0 get even better performance, great. That's a lot better than advertising performance Intel users can't get, and then having unhappy customers because they can't reach the numbers you promised.

 

That doesn't mean 4.0 will actually do any better. I'm just saying that the fact that Nvidia demoed on Intel doesn't prove anything either way. It was the smart move regardless.

CPU: AMD Ryzen 9 5900X · Cooler: Arctic Liquid Freezer II 280 · Motherboard: MSI MEG X570 Unify · RAM: G.Skill Ripjaws V 2x16GB 3600MHz CL16 (2Rx8) · Graphics Card: ASUS GeForce RTX 3060 Ti TUF Gaming · Boot Drive: 500GB WD Black SN750 M.2 NVMe SSD · Game Drive: 2TB Crucial MX500 SATA SSD · PSU: Corsair White RM850x 850W 80+ Gold · Case: Corsair 4000D Airflow · Monitor: MSI Optix MAG342CQR 34” UWQHD 3440x1440 144Hz · Keyboard: Corsair K100 RGB Optical-Mechanical Gaming Keyboard (OPX Switch) · Mouse: Corsair Ironclaw RGB Wireless Gaming Mouse


3 minutes ago, Chris Pratt said:

 

Who knows, maybe it'll all change once NVIDIA's RTX IO comes into play and uses the PCIe 4.0 bandwidth. Of course, this will only be a factor if it's paired with a PCIe 4.0 SSD as well. A lot of interesting benchmarks coming up! :D



1 minute ago, Stahlmann98 said:

Who knows, maybe it'll all change once NVIDIA's RTX IO comes into play and uses the PCIe 4.0 bandwidth. Of course, this will only be a factor if it's paired with a PCIe 4.0 SSD as well. A lot of interesting benchmarks coming up! :D

Oh yeah. I think I'm more excited about that than the actual cards themselves. Lots of good incoming content.



58 minutes ago, Chris Pratt said:

Oh yeah. I think I'm more excited about that than the actual cards themselves. Lots of good incoming content.

For me, the most exciting part is de-bottlenecking my TV. With HDMI 2.1 and the extra performance, I can finally run my OLED TV at 4K 120Hz with 12-bit color and HDR all at the same time. Right now I have to use either 4K 60Hz 8-bit or 1440p 120Hz 8-bit, and neither is ideal. Not to mention I can actually play 4K at 60 fps and above ;)
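A rough sanity check of why 4K 120Hz 12-bit needs HDMI 2.1: counting only raw pixel data (ignoring blanking intervals and link-level overhead, so the real requirement is somewhat higher; helper names are illustrative):

```python
def raw_video_gbps(width, height, refresh_hz, bits_per_channel, channels=3):
    """Raw uncompressed video data rate in Gbit/s (no blanking overhead)."""
    return width * height * refresh_hz * bits_per_channel * channels / 1e9

four_k_120_12bit = raw_video_gbps(3840, 2160, 120, 12)  # ~35.8 Gbit/s
four_k_60_8bit = raw_video_gbps(3840, 2160, 60, 8)      # ~11.9 Gbit/s

HDMI_2_0_GBPS = 18.0  # total TMDS link bandwidth
HDMI_2_1_GBPS = 48.0  # total FRL link bandwidth

print(f"4K 120Hz 12-bit needs ~{four_k_120_12bit:.1f} Gbit/s -> HDMI 2.1 only")
print(f"4K 60Hz 8-bit needs ~{four_k_60_8bit:.1f} Gbit/s -> fits in HDMI 2.0")
```

Even before overhead, 4K 120Hz 12-bit RGB is roughly double what an HDMI 2.0 link can carry, which is why those compromise modes were necessary.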



As some others said, all the benchmarks Nvidia showed were on PCIe 3.0. They would have used a PCIe 4.0 system if it delivered more performance. In the Reddit AMA, Nvidia clearly stated that the improvement with PCIe 4.0 was very minor, only a few percentage points. Regarding PCIe x4 vs x8, Hardware Unboxed has done a video on exactly this. Their findings showed that PCIe 3.0 x4 was indeed a bottleneck for Nvidia cards, but PCIe 3.0 x8 and above doesn't make a difference.
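If you want to see what link your own GPU has actually negotiated, on Linux `sudo lspci -vv` reports it on the `LnkSta:` line. A small sketch that decodes such a line (the sample string and function names are illustrative, though the field format matches real lspci output):

```python
import re

def parse_link_status(lnksta_line: str) -> tuple[float, int]:
    """Extract (raw speed in GT/s, lane width) from an lspci 'LnkSta:' line."""
    m = re.search(r"Speed ([\d.]+)GT/s.*Width x(\d+)", lnksta_line)
    if not m:
        raise ValueError("not a LnkSta line")
    return float(m.group(1)), int(m.group(2))

# PCIe generation by raw signalling rate:
# 2.5 GT/s = Gen 1, 5 GT/s = Gen 2, 8 GT/s = Gen 3, 16 GT/s = Gen 4
GEN_BY_SPEED = {2.5: 1, 5.0: 2, 8.0: 3, 16.0: 4}

sample = "LnkSta: Speed 8GT/s, Width x16"  # typical line for a Gen 3 x16 GPU
speed, width = parse_link_status(sample)
print(f"Negotiated link: PCIe Gen {GEN_BY_SPEED[speed]} x{width}")
```

Note that many GPUs drop to a lower speed at idle, so check the link while the card is under load.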

