AMD vs Nvidia for 4K in 2024?

Aereldor

Currently using a 6850M XT eGPU with my Flow X13 laptop. It's essentially a desktop 6700 XT and performs about the same as one, and it does okay at 4K, running most games at high settings, 60fps, with FSR. But it has a bunch of issues, and I'm going to just build a PC instead.

 

That said, I'm not super satisfied with FSR; its upscaling is subpar, and while it's a lot better than nothing, it doesn't hold a candle to DLSS.

 

On the other hand, XeSS 1.3 is amazing, works on all GPUs, and that makes me think...

 

I have a $1300 budget for the build, and in terms of raster performance I can get a WHOLE lot more GPU for my money with AMD (7900 XTX, $800) than with Nvidia (4070 Ti Super, $800).

 

I intend to use heavy upscaling and frame generation to hit my 4K 120fps target, and DLSS is still by far the best-looking upscaler. However, Intel looks to be catching up.

 

So... What should I get? Hedging my bets.


If you're just going from a 6700 XT to a 7900, I don't think you'll see a very big difference regardless, so whichever is the cheapest option would be fine.


3 minutes ago, emosun said:

If you're just going from a 6700 XT to a 7900, I don't think you'll see a very big difference regardless, so whichever is the cheapest option would be fine.

How? A 7900 XTX is almost three times as fast. It gets around 31,000 in Time Spy; a 6700 XT manages 12,000 on a good day.


Just now, soundlogic said:

How? A 7900 XTX is almost three times as fast. It gets around 31,000 in Time Spy; a 6700 XT manages 12,000 on a good day.

Oh, is the machine used for Time Spy runs or video games?


2 minutes ago, emosun said:

If you're just going from a 6700 XT to a 7900, I don't think you'll see a very big difference regardless, so whichever is the cheapest option would be fine.

The 7900 XTX has a solid lead over the 6950 XT at 4K, and it'd be even bigger vs the 6700 XT, considering the 6950 XT already has a massive lead over that card. Tom's Hardware has their 4K roundup with both the 7900 XTX and the 6950 XT on the charts here: https://www.tomshardware.com/reviews/amd-radeon-rx-7900-xtx-and-xt-review-shooting-for-the-top/4. And the 6950 XT numbers alongside the 6700 XT here: https://www.tomshardware.com/reviews/amd-radeon-rx-6950-xt-review/4

29 minutes ago, soundlogic said:

I intend to use heavy upscaling and frame generation to hit my 4K 120fps target, and DLSS is still by far the best-looking upscaler. However, Intel looks to be catching up.

4K wants the most GPU you can throw at it. If you have no reason other than DLSS to prefer Nvidia, I'd say you're better off with more rasterization performance and VRAM. You should be able to get away with a higher native resolution, which should look as good as or better than DLSS at a lower internal resolution. The only sticking point would be games that only have FSR or DLSS, no XeSS; you'd have to make do with FSR there.

Intel HEDT and Server platform enthusiasts: Intel HEDT Xeon/i7 Megathread 

 

Main PC 

CPU: i9 7980XE @4.5GHz/1.22v/-2 AVX offset 

Cooler: EKWB Supremacy Block - custom loop w/360mm +280mm rads 

Motherboard: EVGA X299 Dark 

RAM: 4x8GB HyperX Predator DDR4 @ 3200MHz CL16 

GPU: Nvidia FE 2060 Super/Corsair HydroX 2070 FE block 

Storage:  1TB MP34 + 1TB 970 Evo + 500GB Atom30 + 250GB 960 Evo 

Optical Drives: LG WH14NS40 

PSU: EVGA 1600W T2 

Case & Fans: Corsair 750D Airflow - 3x Noctua iPPC NF-F12 + 4x Noctua iPPC NF-A14 PWM 

OS: Windows 11

 

Display: LG 27UK650-W (4K 60Hz IPS panel)

Mouse: EVGA X17

Keyboard: Corsair K55 RGB

 

Mobile/Work Devices: 2020 M1 MacBook Air (work computer) - iPhone 13 Pro Max - Apple Watch S3

 

Other Misc Devices: iPod Video (Gen 5.5E, 128GB SD card swap, running Rockbox), Nintendo Switch


29 minutes ago, soundlogic said:

Currently using a 6850M XT eGPU with my Flow X13 laptop. It's essentially a desktop 6700 XT and performs about the same as one, and it does okay at 4K, running most games at high settings, 60fps, with FSR.

That said, I'm not super satisfied with FSR; its upscaling is subpar, and while it's a lot better than nothing, it doesn't hold a candle to DLSS.

On the other hand, XeSS 1.3 is amazing, works on all GPUs, and that makes me think...

I have a $1300 budget for the build, and in terms of raster performance I can get a WHOLE lot more GPU for my money with AMD (7900 XTX, $800) than with Nvidia (4070 Ti Super, $800).

I intend to use heavy upscaling and frame generation to hit my 4K 120fps target, and DLSS is still by far the best-looking upscaler. However, Intel looks to be catching up.

 

So... What should I get? Hedging my bets.

The RTX 4090 XG Mobile would be a significant upgrade, and you could probably flip that 6850M XT for a decent return. I have a friend who's considering one of the XG Mobiles (he isn't bothering with 4K) and wishes those 6850M XT models were still in stock. At $2k, though, it's quite steep, and it's effectively a TDP-limited RTX 4080.

 

I don't see how you'd hook up just a desktop GPU to your Flow X13, so you'd be building a whole rig. You'd want at minimum an RX 6800 to get at least 16GB of VRAM for 4K. Even then, you're not running ultra textures in a lot of games and will likely end up at medium-high based on VRAM requirements alone.

Ryzen 7950x3D PBO +200MHz / -15mV curve CPPC in 'prefer cache'

RTX 4090 @133%/+230/+1000

Builder/Enthusiast/Overclocker since 2012  //  Professional since 2017


32 minutes ago, soundlogic said:

Currently using a 6850M XT eGPU with my Flow X13 laptop. It's essentially a desktop 6700 XT and performs about the same as one, and it does okay at 4K, running most games at high settings, 60fps, with FSR.

That said, I'm not super satisfied with FSR; its upscaling is subpar, and while it's a lot better than nothing, it doesn't hold a candle to DLSS.

On the other hand, XeSS 1.3 is amazing, works on all GPUs, and that makes me think...

I have a $1300 budget for the build, and in terms of raster performance I can get a WHOLE lot more GPU for my money with AMD (7900 XTX, $800) than with Nvidia (4070 Ti Super, $800).

I intend to use heavy upscaling and frame generation to hit my 4K 120fps target, and DLSS is still by far the best-looking upscaler. However, Intel looks to be catching up.

 

So... What should I get? Hedging my bets.

If you play on a TV, you can use 1080p native on a 4K screen and it'll look okay.

Not really on a standard monitor you sit about 15 inches from, though.

System: AMD R9 5900X / Gigabyte X570 AORUS PRO / 2x16GB Corsair Vengeance 3600CL18 / ASUS TUF Gaming AMD Radeon RX 7900 XTX OC Edition GPU / Phanteks P600S case / Eisbaer 280mm AIO (with 2x Arctic P14 fans) / 2TB Crucial T500 NVMe + 2TB WD SN850 NVMe + 4TB Toshiba X300 HDD / Corsair RM850x PSU / Alienware AW3420DW 34" 120Hz 3440x1440p monitor / Logitech G915TKL keyboard (wireless) / Logitech G PRO X Superlight mouse / Audeze Maxwell headphones


16 minutes ago, Zando_ said:

4K wants the most GPU you can throw at it. If you have no reason other than DLSS to prefer Nvidia, I'd say you're better off with more rasterization performance and VRAM. You should be able to get away with a higher native resolution, which should look as good as or better than DLSS at a lower internal resolution. The only sticking point would be games that only have FSR or DLSS, no XeSS; you'd have to make do with FSR there.

Well, I also have the idiotic ambition to run games at 4K 120fps (I have a very nice display, lol, and want to make full use of it), so native resolution is out of the question and frame generation will be necessary, even though I run everything at medium settings with high textures.

 

FSR is REALLY far behind DLSS, to the point where DLSS Performance looks better than FSR Quality at 4K, and XeSS 1.3, while impressive and easy to mod into many games, doesn't have any frame generation. AFMF is terrible.

And honestly, XeSS is better than FSR but isn't as good as DLSS.

 

Is it really that much of a non-issue? Right now the 4070 Ti Super and the 7900 XTX can both be had for $800-ish, and as a rough indication of raster performance, they score around 24k and 30k in Time Spy respectively.
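For a rough sense of what those scores imply, here's a quick sketch in Python (the values are just the rounded Time Spy numbers quoted in this thread, and Time Spy is only a synthetic proxy for real-game raster performance):

```python
# Relative standing implied by the Time Spy scores quoted in this thread.
# Synthetic scores only loosely track real-game performance, so treat the
# ratios as ballpark figures.
timespy = {
    "RX 7900 XTX": 30_000,
    "RTX 4070 Ti Super": 24_000,
    "RX 6700 XT (current eGPU)": 12_000,
}

baseline = timespy["RTX 4070 Ti Super"]
for card, score in timespy.items():
    print(f"{card}: {score:,} -> {score / baseline:.2f}x the 4070 Ti Super")
# 7900 XTX ~1.25x (about 25% ahead in this synthetic), 6700 XT ~0.5x
```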


3 minutes ago, Agall said:

The RTX 4090 XG Mobile would be a significant upgrade, and you could probably flip that 6850M XT for a decent return. I have a friend who's considering one of the XG Mobiles (he isn't bothering with 4K) and wishes those 6850M XT models were still in stock. At $2k, though, it's quite steep, and it's effectively a TDP-limited RTX 4080.

 

I don't see how you'd hook up just a desktop GPU to your Flow X13, so you'd be building a whole rig. You'd want at minimum an RX 6800 to get at least 16GB of VRAM for 4K. Even then, you're not running ultra textures in a lot of games and will likely end up at medium-high based on VRAM requirements alone.

I'm not spending $2k on an eGPU locked into a shitty ecosystem. I could outperform the $2,000 4090 XG Mobile with a $1200 PC.

 

And yeah, it's a 150W 4080. It does about 20k in Time Spy and is generally okay, but not worth $2,000. If it were $1,300 there'd honestly be no debate, because it doesn't have the weird disconnection issues.

 

I edited the post for clarity, but yes, I'm selling the XG Mobile and building a PC with the money. I can get $900 for it in a heartbeat and put in $400 of my own towards a PC; the only compromise is size.

 

I don't notice a difference between high and ultra textures, FWIW. I've pixel-peeped in like 12 AAA titles.


6 minutes ago, PDifolco said:

If you play on a TV, you can use 1080p native on a 4K screen and it'll look okay.

Not really on a standard monitor you sit about 15 inches from, though.

I play on a 48" LG C2 OLED (4k 120hz) but I only sit 2.5 feet from it. For all intents and purposes, it is a monitor. 1080p looks like trash on a 48 inch display.

Before anyone asks why I'm doing this, I got it for $200 through a Best Buy pricing error.
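For context on why 1080p falls apart on this setup, here's a rough pixels-per-degree estimate (a sketch: the 48" diagonal and ~2.5 ft viewing distance are the figures from this thread, while the 27" 4K comparison and the ~60 PPD "can't see the pixels" rule of thumb are just common reference points, not anything the posters cited):

```python
import math

def pixels_per_degree(h_px, v_px, diag_in, dist_in):
    """Approximate pixels per degree of visual angle at a given distance."""
    ppi = math.hypot(h_px, v_px) / diag_in           # pixel density
    inches_per_degree = 2 * dist_in * math.tan(math.radians(0.5))
    return ppi * inches_per_degree

dist = 2.5 * 12  # ~30 inches, the viewing distance mentioned above
print(f'48" at 3840x2160: {pixels_per_degree(3840, 2160, 48, dist):.0f} PPD')
print(f'48" at 1920x1080: {pixels_per_degree(1920, 1080, 48, dist):.0f} PPD')
print(f'27" at 3840x2160: {pixels_per_degree(3840, 2160, 27, dist):.0f} PPD')
# Roughly 48, 24, and 85 PPD; ~60 PPD is the usual "pixels become invisible"
# threshold, so native 1080p on this setup is well into visibly blocky territory.
```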


36 minutes ago, Aereldor said:

Well, I also have the idiotic ambition to run games at 4K 120fps (I have a very nice display, lol, and want to make full use of it), so native resolution is out of the question and frame generation will be necessary, even though I run everything at medium settings with high textures.

FSR is REALLY far behind DLSS, to the point where DLSS Performance looks better than FSR Quality at 4K, and XeSS 1.3, while impressive and easy to mod into many games, doesn't have any frame generation. AFMF is terrible.

Is it really that much of a non-issue? Right now the 4070 Ti Super and the 7900 XTX can both be had for $800-ish, and as a rough indication of raster performance, they score around 24k and 30k in Time Spy respectively.

Gotta compromise somewhere. I didn't realize the performance difference between FSR and DLSS; I only use FSR when DLSS or XeSS isn't an option (I have Nvidia and Intel GPUs). If you think you'll be fully dependent on DLSS and Nvidia's frame gen, then just get the best Nvidia GPU you can afford (seems to be the 4070 Ti Super in this case).

 

I play at 4K60, and I have a 2060 Super and an Intel Arc A770. The Arc is better at rasterization but has some compatibility issues with the games I play, so I end up using the 2060 Super more often. Enough of my games support DLSS that I don't notice the drop in raw GPU power, and the ones that don't are usually old enough that they'll run playably on not-eyesore settings. It's not exactly the same situation you're in, but similar enough that I thought it was worth sharing.



15 minutes ago, Zando_ said:

Gotta compromise somewhere. I didn't realize the performance difference between FSR and DLSS; I only use FSR when DLSS or XeSS isn't an option (I have Nvidia and Intel GPUs). If you think you'll be fully dependent on DLSS and Nvidia's frame gen, then just get the best Nvidia GPU you can afford (seems to be the 4070 Ti Super in this case).

I play at 4K60, and I have a 2060 Super and an Intel Arc A770. The Arc is better at rasterization but has some compatibility issues with the games I play, so I end up using the 2060 Super more often. Enough of my games support DLSS that I don't notice the drop in raw GPU power, and the ones that don't are usually old enough that they'll run playably on not-eyesore settings. It's not exactly the same situation you're in, but similar enough that I thought it was worth sharing.

No performance difference, but a MASSIVE image quality difference. FSR is a hand-tuned algorithm with sharpening on top; it doesn't use any machine learning.

 

If I were to rate them 1-5 in terms of image quality, with 5 being native:

FSR 1.0: 1/5 (might as well not use any upscaling)
FSR 2.x: 2/5 (usable, but lots of artifacting, oversharpening, and TERRIBLE flickering!)
XeSS 1.2: 3/5 (less flickering, some overall softness, inconsistent AA, moving objects leave huge trails)
XeSS 1.3: 3.5/5 (flickering reduced further, better AA, no trails, but a little softer)
DLSS 2.x: 4/5 (sharp, but ghosting issues and weird smearing on slow-moving objects)
DLSS 3.x: 4.5/5 (almost no issues, especially in 3.7)

 

FSR 2.0 is usable, XeSS 1.3 is good, DLSS 3.x is amazing. With the right amount of sharpening and ultra textures, 1080p internal upscaled to 4K looks indistinguishable from native during gameplay.
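For reference, here's roughly what the common quality presets actually render internally at a 4K output (a sketch; the per-axis scale factors below are the widely published DLSS/FSR 2 defaults, XeSS 1.3 shifted to somewhat more aggressive ratios for the same preset names, and individual games can override any of this):

```python
# Internal render resolution for common upscaler presets at 4K (3840x2160) output.
OUTPUT_W, OUTPUT_H = 3840, 2160
PRESETS = {
    "Quality":           1 / 1.5,   # ~67% per axis
    "Balanced":          1 / 1.7,   # ~58-59% per axis
    "Performance":       1 / 2.0,   # 50% per axis -> 1080p internal
    "Ultra Performance": 1 / 3.0,   # ~33% per axis
}

for name, scale in PRESETS.items():
    w, h = round(OUTPUT_W * scale), round(OUTPUT_H * scale)
    rendered_share = scale * scale  # fraction of the output pixels actually rendered
    print(f"{name:<17} {w}x{h}  ({rendered_share:.0%} of the pixels of native 4K)")
# "DLSS Performance at 4K" in the posts above therefore means a 1080p internal
# render, i.e. only a quarter of the pixels, before the upscaler fills in the rest.
```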
 

 

In terms of raw raster performance, the 7900 XTX is roughly 20% faster than the 4070 Ti Super for similar money. But the upscalers on offer are SO far apart.


1 hour ago, emosun said:

Oh, is the machine used for Time Spy runs or video games?

Sis, you're tripping if you think there's little difference between a 6700 XT and a 7900 XTX.


9 minutes ago, Aereldor said:

Sis, you're tripping if you think there's little difference between a 6700 XT and a 7900 XTX.

The performance difference doesn't appear to be linear, so no, I don't consider it to be much different.


1 hour ago, Aereldor said:

FSR is REALLY far behind DLSS, to the point where DLSS Performance looks better than FSR Quality at 4K

I see this a lot in channels that cover this from a pixel-peeping perspective, but I have to say, this just isn't my experience in person. I played through all of Jedi Survivor using FSR Quality before they added DLSS, and frankly, once they did and I switched, I really couldn't tell the difference, let alone have it affect my enjoyment of the game. (4090, 4K LG C2.)

 

Personally, at this point just having any one of these technologies available (at least if you're on Nvidia) should be sufficient.

 

Anyway..


1 minute ago, emosun said:

The performance difference doesn't appear to be linear, so no, I don't consider it to be much different.

Over twice as fast is pretty different. [benchmark charts attached]


Just now, GuiltySpark_ said:

I see this a lot in channels that cover this from a pixel-peeping perspective, but I have to say, this just isn't my experience in person. I played through all of Jedi Survivor using FSR Quality before they added DLSS, and frankly, once they did and I switched, I really couldn't tell the difference, let alone have it affect my enjoyment of the game. (4090, 4K LG C2.)

Personally, at this point just having any one of these technologies available (at least if you're on Nvidia) should be sufficient.

 

Anyway..

There's a pretty huge difference between FSR and XeSS for me in Hogwarts Legacy and Horizon Forbidden West. I have no choice but to pixel peep, because I'm on a 48" 4K OLED screen and I'm 2.5 feet away from it, lol; it's more noticeable than at traditional display sizes.

 

There's the frame generation too. AMD has AFMF for all games, but it's not very good, adds a troubling amount of input lag, and doesn't have software to offset it, while Nvidia has lower input lag plus Reflex.

 

There's talk that Intel is adding AI frame generation to the next XeSS, with lower input lag than either of them, but that's pretty far out, and it's a bit of a gamble to buy an AMD GPU and hope for that.
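To put rough numbers on why frame generation and input lag go hand in hand, here's an illustrative sketch (not measured data: interpolation-style frame gen has to hold back at least one rendered frame, so the added delay scales with the base frame time; the generation-pass cost is a made-up placeholder):

```python
# Illustrative only: interpolating frame generation buffers one real frame
# before it can insert a generated one, so the floor on added latency is
# roughly one base frame time plus the cost of the generation pass itself.
def added_latency_ms(base_fps, gen_cost_ms=3.0):  # gen_cost_ms is a placeholder
    return 1000 / base_fps + gen_cost_ms

for base_fps in (40, 60, 90):
    print(f"{base_fps} fps base -> ~{added_latency_ms(base_fps):.0f} ms extra, "
          f"~{base_fps * 2} fps displayed with 2x frame gen")
# The higher the real framerate going in, the smaller the penalty, which is why
# a Reflex/Anti-Lag style latency reducer matters so much alongside frame gen.
```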


Upscaling/frame generation from a GPU with better raster performance will make it easier to meet or exceed a high framerate target. Not subjective.

 

Nitpicking image quality? Subjective

 

If it's strictly between the 7900 XTX and the 4070 Ti Super and you're really anal about having both 120fps and good image quality, I would pick the 7900 XTX and run 4K native with high graphics settings plus frame gen.

 

If you look up videos of the 4070 Ti Super upscaling to 4K (DLSS Quality), the framerate in demanding games is only 60-70fps, and the image is already not true 4K.
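Quick back-of-the-envelope math on that 120fps goal (a sketch; the ~60-70fps DLSS Quality figure is the one quoted above, and the 2x multiplier is the best case for current single-frame generation, ignoring its own overhead):

```python
# What the 4K 120fps target demands if frame generation at best doubles output.
TARGET_DISPLAYED_FPS = 120
FRAME_GEN_FACTOR = 2  # best case for current single-frame generation

needed_rendered = TARGET_DISPLAYED_FPS / FRAME_GEN_FACTOR
print(f"Need at least ~{needed_rendered:.0f} rendered fps "
      f"(~{1000 / needed_rendered:.1f} ms per frame) before frame gen kicks in.")

quoted_fps = 65  # middle of the 60-70fps DLSS Quality range quoted above
print(f"At ~{quoted_fps} rendered fps, 2x frame gen lands around "
      f"{quoted_fps * FRAME_GEN_FACTOR} fps displayed.")
# So either card only clears 120 with frame gen doing the heavy lifting, which is
# exactly why the latency and image-quality gaps between the vendors matter here.
```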


DLSS plus frame gen will get you close to your target.

DLSS is better than FSR, and Nvidia's frame gen is better too.

I get 120 fps in Forbidden West at 4K with DLSS Quality and frame gen.

-13600kf 

- 4000 32gb ram 

-4070ti super duper 


On 4/18/2024 at 12:10 PM, Salted Spinach said:

Upscaling/frame generation from a GPU with better raster performance will make it easier to meet or exceed a high framerate target. Not subjective.

Nitpicking image quality? Subjective

If it's strictly between the 7900 XTX and the 4070 Ti Super and you're really anal about having both 120fps and good image quality, I would pick the 7900 XTX and run 4K native with high graphics settings plus frame gen.

If you look up videos of the 4070 Ti Super upscaling to 4K (DLSS Quality), the framerate in demanding games is only 60-70fps, and the image is already not true 4K.

 

I know the 7900 XTX is about 20% faster for the same money.

 

Unfortunately, the frame gen on AMD right now is either not great through FSR 3 or really not great through AFMF in the Radeon software. The upside is that it works in every game; the downside is that it isn't very good.

 

I'm gonna be running at medium settings with high textures almost always. The benchmarks I'm quoting are run maxed out (even though maxed settings don't look much better).


On 4/19/2024 at 4:30 PM, Ebony Falcon said:

DLSS plus frame gen will get you close to your target.

DLSS is better than FSR, and Nvidia's frame gen is better too.

I get 120 fps in Forbidden West at 4K with DLSS Quality and frame gen.

 

Yeah, I have a 6850M XT right now, which is basically a 6700 XT, and I use FSR and AFMF. It's not good, like visibly not good. I use XeSS whenever I can.

 

AMD also doesn't have a competitor to Nvidia Reflex. DLSS frame gen plus Reflex has input lag similar to frame gen off; with AMD it always adds LOADS of input lag.


14 hours ago, Aereldor said:

Unfortunately, the frame gen on AMD right now is either not great through FSR 3 or really not great through AFMF in the Radeon software.

I am proposing AFMF at native 4K. No upscaling.

 

DLSS is not going to match native image quality.

 

Also, FSR and AFMF, however inferior they may be to DLSS right now, can always be improved.

 

Meanwhile, no amount of software updates will let the 4070 Ti Super match the XTX in raw raster.


IMO, Nvidia all the way because of DLSS. DLSS and frame gen are awesome.

PC Setup: 

HYTE Y60 White/Black + Custom ColdZero ventilation sidepanel

Intel Core i7-10700K + Corsair Hydro Series H100x

G.SKILL TridentZ RGB 32GB (F4-3600C16Q-32GTZR)

ASUS ROG STRIX RTX 3080Ti OC LC

ASUS ROG STRIX Z490-G GAMING (Wi-Fi)

Samsung EVO Plus 1TB

Samsung EVO Plus 1TB

Crucial MX500 2TB

Crucial MX300 1TB

Corsair HX1200i

 

Peripherals: 

Samsung Odyssey Neo G9 G95NC 57"

Samsung Odyssey Neo G7 32"

ASUS ROG Harpe Ace Aim Lab Edition Wireless

ASUS ROG Claymore II Wireless

ASUS ROG Sheath BLK LTD'

Corsair SP2500

Beyerdynamic TYGR 300R + FiiO K7 DAC/AMP

RØDE VideoMic II + Elgato WAVE Mic Arm

 

Racing SIM Setup: 

Sim-Lab GT1 EVO Sim Racing Cockpit + Sim-Lab GT1 EVO Single Screen holder

Svive Racing D1 Seat

Samsung Odyssey G9 49"

Simagic Alpha Mini

Simagic GT4 (Dual Clutch)

CSL Elite Pedals V2

Logitech K400 Plus


On 4/22/2024 at 10:30 PM, Aereldor said:

AMD also doesn't have a competitor to Nvidia Reflex.

It's called Anti-Lag. It's in the Radeon software. Same with everything AMD to you, of course: it's "not good" or "terrible".

 

You are not asking which card/brand to get, BTW. Not at any point in the topic have you asked the question correctly. The only thing you've asked is "Why shouldn't I get a 4070 Ti Super?" Everything said to you about AMD you've explained away with "It's not good" or "It's terrible". So you've clearly already made your choice, and judging from the reactions here, no one is going to convince you to buy AMD again. So go order the 4070 Ti Super and be happy.

I have no signature


9 hours ago, Helly said:

It's called Anti-Lag. It's in the Radeon software. Same with everything AMD to you, of course: it's "not good" or "terrible".

You are not asking which card/brand to get, BTW. Not at any point in the topic have you asked the question correctly. The only thing you've asked is "Why shouldn't I get a 4070 Ti Super?" Everything said to you about AMD you've explained away with "It's not good" or "It's terrible". So you've clearly already made your choice, and judging from the reactions here, no one is going to convince you to buy AMD again. So go order the 4070 Ti Super and be happy.

Yeah, and Anti-Lag+ is only in like 12 games.

 

'Asking the question correctly'? Dude, who do you think you are? I don't normally do this, but here are two of MANY videos from authorities on the subject (also, literally any Daniel Owen comparison video). FSR is MILES behind in terms of image quality, and it's readily apparent during gameplay too, and that's compared to XeSS, which in turn is behind DLSS, though not obviously so anymore.

 

Your message is fanboying and ragebaiting and thoroughly unhelpful.

[embedded comparison videos]
