AMD-sponsored Avatar: Frontiers of Pandora PC requirements. FSR3, DLSS and XeSS support. AMD bundles shown now

filpo

Summary

Requirements for Avatar: Frontiers of Pandora have been released, with upscaling at all performance levels and a 16GB RAM minimum.

[Image: PC system requirements chart]

Bundles showcased on the 7th of November:

[Image: AMD game bundle offers]

 

Quotes

Quote

The game will provide support for FSR3 including Frame Generation, Intel XeSS and Nvidia DLSS2, with all these technologies available right from the game's launch.

 

Additionally, the developers have confirmed that the game will incorporate ray-traced reflections and shadows. The game is set to offer compatibility with ultrawide resolutions and high-resolution multi-monitor configurations. Furthermore, it has been confirmed that the game will be optimized for multicore systems.

 

Quote

The PC specs list Radeon RX 5700, GTX 1070 or Arc A750 graphics as minimum for 1080p gameplay at 30 FPS. This includes FSR2 set to Quality mode. The 1080 60 FPS experience will require at least RX 6700 XT or RTX 3060 Ti graphics, while RX 6800XT and RTX 3080 will allow the resolution to be bumped to 1440p. For 4K gameplay, a high-end CPU such as Ryzen 7 5800X3D or i7-12700K will be necessary, alongside RTX 4080 or RX 7900 XTX graphics.

 

My thoughts

It's nice to see that even in AMD-sponsored games, AMD is still looking out for other GPUs with DLSS and XeSS support. However, the requirements are quite disappointing to me. At first glance they don't look TOO bad, but then you see that for Ultra 4K with a 4080 and 5800X3D, FSR Balanced is needed to hit 60 FPS. A GTX 1070 for minimum specs also means this won't be a game for budget gamers (sub-$500 builds and the like). The 16GB RAM requirement isn't too concerning, but it may push some people to upgrade. And the need for upscaling at every single performance tier (even if it is only Quality up until Ultra) means we might have another Remnant II situation on our hands.

 

Sources

AMD-sponsored Avatar: Frontiers of Pandora PC specs list FSR3, DLSS and XeSS support - VideoCardz.com

 


Message me on discord (bread8669) for more help 

 

Current parts list

CPU: R5 5600 CPU Cooler: Stock

Mobo: Asrock B550M-ITX/ac

RAM: Vengeance LPX 2x8GB 3200mhz Cl16

SSD: P5 Plus 500GB Secondary SSD: Kingston A400 960GB

GPU: MSI RTX 3060 Gaming X

Fans: 1x Noctua NF-P12 Redux, 1x Arctic P12, 1x Corsair LL120

PSU: NZXT SP-650M SFX-L PSU from H1

Monitor: Samsung WQHD 34 inch and 43 inch TV

Mouse: Logitech G203

Keyboard: Rii membrane keyboard

 

 

 

 

 

 

 

 

 


 

 

 

 

 

 

Damn this space can fit a 4090 (just kidding)


5 minutes ago, filpo said:

At first glance they don't look TOO bad but then you see that for Ultra 4K with a 4080 and 5800X3D, FSR BALANCED is needed for it to hit 60 fps.

Sounds about right, from a quick google that's an internal resolution of a little over 1440p. I assume that's with ray-tracing on, which depending on how it's implemented can be light (see Metro Exodus Enhanced Edition) or incredibly heavy (see Cyberpunk 2077). 

Intel HEDT and Server platform enthusiasts: Intel HEDT Xeon/i7 Megathread 

 

Main PC 

CPU: i9 7980XE @4.5GHz/1.22v/-2 AVX offset 

Cooler: EKWB Supremacy Block - custom loop w/360mm +280mm rads 

Motherboard: EVGA X299 Dark 

RAM:4x8GB HyperX Predator DDR4 @3200Mhz CL16 

GPU: Nvidia FE 2060 Super/Corsair HydroX 2070 FE block 

Storage:  1TB MP34 + 1TB 970 Evo + 500GB Atom30 + 250GB 960 Evo 

Optical Drives: LG WH14NS40 

PSU: EVGA 1600W T2 

Case & Fans: Corsair 750D Airflow - 3x Noctua iPPC NF-F12 + 4x Noctua iPPC NF-A14 PWM 

OS: Windows 11

 

Display: LG 27UK650-W (4K 60Hz IPS panel)

Mouse: EVGA X17

Keyboard: Corsair K55 RGB

 

Mobile/Work Devices: 2020 M1 MacBook Air (work computer) - iPhone 13 Pro Max - Apple Watch S3

 

Other Misc Devices: iPod Video (Gen 5.5E, 128GB SD card swap, running Rockbox), Nintendo Switch


1 minute ago, Zando_ said:

Sounds about right, from a quick google that's an internal resolution of a little over 1440p

Interesting. So FSR's scaling quality is different to DLSS's since DLSS balanced is about half of the output resolution (1080p)


Just now, filpo said:

Interesting. So FSR's scaling quality is different to DLSS's since DLSS balanced is about half of the output resolution (1080p)

DLSS Balanced should be about 1440p as well. DLSS Performance (at 4K output) is 1080p internal, DLSS Ultra Performance is 720p internal. 1080p is ~2 million pixels, 1440p is ~3.6 million pixels, and 4K is a massive ~8.2 million pixels, so there's a gargantuan leap between 1440p and 4K internal resolution. Thus cards struggling to run it, especially with RT enabled. 
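The pixel-count arithmetic above checks out; here's a quick sketch in Python (resolution values are the standard 16:9 sizes, the helper name is made up):

```python
# Pixel counts for common output resolutions, showing the jump to 4K.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
}

def megapixels(width: int, height: int) -> float:
    """Total pixel count in millions."""
    return width * height / 1e6

for name, (w, h) in RESOLUTIONS.items():
    print(f"{name}: {megapixels(w, h):.2f} MP")
# 1080p: 2.07 MP
# 1440p: 3.69 MP
# 4K: 8.29 MP  (4x the pixels of 1080p, ~2.25x those of 1440p)
```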


1 minute ago, filpo said:

Interesting. So FSR's scaling quality is different to DLSS's since DLSS balanced is about half of the output resolution (1080p)

No, they're the same. It's DLSS Performance that uses a 1080p internal resolution at 4K.


1 minute ago, Zando_ said:

DLSS Balanced should be about 1440p as well. DLSS Performance (at 4K output) is 1080p internal, DLSS Ultra Performance is 720p internal. 1080p is ~2 million pixels, 1440p is ~3.6 million pixels, and 4K is a massive ~8.2 million pixels, so there's a gargantuan leap between 1440p and 4K internal resolution. Thus cards struggling to run it, especially with RT enabled. 

 

1 minute ago, YoungBlade said:

No, they're the same. It's DLSS Performance that uses a 1080p internal resolution at 4K.

Never mind then. I just got confused. Thanks for clearing that up guys


1 minute ago, filpo said:

Never mind then. I just got confused. Thanks for clearing that up guys

No problem. I stubbornly try to play at 4K with an ARC A770 and 2060 Super, so I'm painfully aware of the numbers involved :old-laugh:.


Nice to see all kinds of upscaler support. Sadly it's not as good-looking a game as I thought it would be. It also might be similar to Far Cry, but weaker than some of those games, though it might have some nice assets.


... can't they just list the native resolution instead of making us calculate what is what because of upscaling? It's incredibly annoying if this is going to become standard, especially when all resolutions say FSR2 Quality except at 4K, where it is suddenly Balanced.

 

1080p Quality = 1280 x 720

1440p Quality = 1706 x 960

4K Balanced = 2259 x 1270
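Those internal resolutions follow from FSR 2's per-axis scale factors (1.5x for Quality, 1.7x for Balanced, per AMD's published presets); a quick sketch, with rounding possibly off by a pixel from the hand-calculated figures above:

```python
# FSR 2 per-axis scale factors for each quality mode (AMD's published presets).
FSR2_SCALE = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution FSR 2 upscales from for a given output size."""
    scale = FSR2_SCALE[mode]
    return round(out_w / scale), round(out_h / scale)

print(internal_resolution(1920, 1080, "Quality"))   # (1280, 720)
print(internal_resolution(3840, 2160, "Balanced"))  # (2259, 1271)
```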


7 minutes ago, WereCat said:

... can't they just list the native resolution instead of making us calculate what is what because of upscaling? It's incredibly annoying if this is going to become standard.

I guess we are entering the era where upscaling is expected by default for normal play. On the Steam Hardware Survey, RTX GPUs are at about 45%, so DLSS alone covers nearly half the Steam gaming market, and there's a good chance it'll cross that mark by the end of the year. XeSS/FSR2 can fill in the gaps for other fast-enough GPUs.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


29 minutes ago, porina said:

I guess we are entering the era where upscaling is expected by default for normal play. On the Steam Hardware Survey, RTX GPUs are at about 45%, so DLSS alone covers nearly half the Steam gaming market, and there's a good chance it'll cross that mark by the end of the year. XeSS/FSR2 can fill in the gaps for other fast-enough GPUs.

I can't find the WAN Show video (it's probably almost 9 years old now), but I'm getting "Resolution is just a number" vibes from all of this.


I hate how FSR/DLSS became a necessity to play any game. This stuff should be a BONUS, not a necessity. Meaning, if a game can run at 60 FPS maxed out with ray tracing at 1440p on an RTX 3080, then DLSS and the like should be a bonus for when I want a higher framerate, not a necessity to even hit those 60 FPS. I knew it would go this route and I've hated it the whole way here. And I wasn't the only one predicting this. Yet here we are.

 

Just like throwing moar specs at games used to be the solution instead of optimizing them, these upscalers are now the lazy way out for devs. Sigh.


7 minutes ago, WereCat said:

I can't find the WAN Show video (it's probably almost 9 years old now), but I'm getting "Resolution is just a number" vibes from all of this.

I don't care if a game needs upscaling, or if the preset is labelled low, high or whatever. If it can give me good-looking output at enough fps on my native display, with or without upscaling, I'm happy. I do "see" resolution though: I dislike my laptop's native 1080p display, which I feel is a bit low. My main desktop at 1440p is OK, and if I want something special I stick it on a 4K TV. Settings are a struggle, and upscaling helps a lot there. Much game content scales very well, and especially if the UI is rendered at native resolution, you'd hardly tell the difference short of analysing pixels in screenshots.


2 hours ago, porina said:

I don't care if a game needs upscaling, or if the preset is labelled low, high or whatever. If it can give me good-looking output at enough fps on my native display, with or without upscaling, I'm happy. I do "see" resolution though: I dislike my laptop's native 1080p display, which I feel is a bit low. My main desktop at 1440p is OK, and if I want something special I stick it on a 4K TV. Settings are a struggle, and upscaling helps a lot there. Much game content scales very well, and especially if the UI is rendered at native resolution, you'd hardly tell the difference short of analysing pixels in screenshots.

I'd agree if it weren't so hit and miss from game to game. I don't mind it in some games, and actually prefer to run upscaling, but I can't stand it at all in others. The FSR AA flicker especially gets on my nerves, and it's unfortunately present in many games; DLSS is sometimes just way too soft. Only a few games have actually managed to implement FSR well, like No Man's Sky.


I think the issue here is that it only has ray-traced lighting; that's why the minimum specs are so modern.

Specs: Motherboard: Asus X470-PLUS TUF gaming (Yes I know it's poor but I wasn't informed) RAM: Corsair VENGEANCE® LPX DDR4 3200Mhz CL16-18-18-36 2x8GB

            CPU: Ryzen 9 5900X          Case: Antec P8     PSU: Corsair RM850x                        Cooler: Antec K240 with two Noctura Industrial PPC 3000 PWM

            Drives: Samsung 970 EVO plus 250GB, Micron 1100 2TB, Seagate ST4000DM000/1F2168 GPU: EVGA RTX 2080 ti Black edition


If the game looks as average as the official screenshots and gameplay demos make it out to be... these specs are NOT justified.

🌲🌲🌲

 

 

 

◒ ◒ 


On 10/31/2023 at 8:00 PM, RejZoR said:

I hate how FSR/DLSS became a necessity to play any game. This stuff should be a BONUS, not a necessity. Meaning, if a game can run at 60 FPS maxed out with ray tracing at 1440p on an RTX 3080, then DLSS and the like should be a bonus for when I want a higher framerate, not a necessity to even hit those 60 FPS. I knew it would go this route and I've hated it the whole way here. And I wasn't the only one predicting this. Yet here we are.

 

Just like throwing moar specs at games used to be the solution instead of optimizing them, these upscalers are now the lazy way out for devs. Sigh.

 

Why shouldn't upscaling be part of the optimization process? It's just a tool, same as literally any other graphics setting.

 

This is now the 2nd game that has been announced so far where upscaling is part of the recommended specs. Yet here we are, already preaching that this is the entire future. Sure, just ignore all the other thousands of games that have been released this year where upscaling is entirely optional, if even implemented to begin with.

 

I think people put way too much importance on running a game at native resolution. Why does it really matter if the upscaled output still looks good? FSR is still a bit lacking in some situations, but DLSS Quality with a 4K target resolution is practically indistinguishable from native 4K while performing much better.

 

Look at the music industry. Most people will not be able to hear the difference between a compressed MP3 audio file and a lossless FLAC file. Many audiophiles rave about how terrible Spotify songs sound compared to the likes of Tidal, etc., but Spotify is still by far the most widely used way of listening to music. That is because listening to music isn't as one-dimensional as audiophiles make it out to be. It's not just about quality; there are many other factors involved.

 

Now apply that to upscaling in video games. If upscaling is good enough, I don't see a problem in it being the default option. AMD still has to put in some work to achieve that with FSR, and Intel isn't really a significant competitor yet, so XeSS doesn't really matter right now. But especially with DLSS on its Quality setting, the output image is good enough that it can be considered a default option even at 1080p and 1440p.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


2 hours ago, Stahlmann said:

 

Why shouldn't upscaling be part of the optimization process? It's just a tool, same as literally any other graphics setting.

 

This is now the 2nd game that has been announced so far where upscaling is part of the recommended specs. Yet here we are, already preaching that this is the entire future. Sure, just ignore all the other thousands of games that have been released this year where upscaling is entirely optional, if even implemented to begin with.

 

I think people put way too much importance on running a game at native resolution. Why does it really matter if the upscaled output still looks good? FSR is still a bit lacking in some situations, but DLSS Quality with a 4K target resolution is practically indistinguishable from native 4K while performing much better.

 

Look at the music industry. Most people will not be able to hear the difference between a compressed MP3 audio file and a lossless FLAC file. Many audiophiles rave about how terrible Spotify songs sound compared to the likes of Tidal, etc., but Spotify is still by far the most widely used way of listening to music. That is because listening to music isn't as one-dimensional as audiophiles make it out to be. It's not just about quality; there are many other factors involved.

 

Now apply that to upscaling in video games. If upscaling is good enough, I don't see a problem in it being the default option. AMD still has to put in some work to achieve that with FSR, and Intel isn't really a significant competitor yet, so XeSS doesn't really matter right now. But especially with DLSS on its Quality setting, the output image is good enough that it can be considered a default option even at 1080p and 1440p.

"Still looks good" was applied when it was free performance boost and not a necessity to even get half playable framerate. I didn't mind FSR or DLSS if game was already running 80fps and I wanted it at 150fps because I have a high refresh rate monitor. If you need this stuff to barely hit 60fps, then it's not even remotely the same.


3 minutes ago, RejZoR said:

"Still looks good" was applied when it was free performance boost and not a necessity to even get half playable framerate. I didn't mind FSR or DLSS if game was already running 80fps and I wanted it at 150fps because I have a high refresh rate monitor. If you need this stuff to barely hit 60fps, then it's not even remotely the same.

This is just the recommended specs, not a set-in-stone rule for playing the game. I have never seen so many people care about recommended specs before Alan Wake 2 and this game. You probably also still have the choice to turn off upscaling and turn down the graphics settings if you want to hit your fps numbers without upscaling.

 

The way they decided to use upscaling is to spend the added performance headroom on further enhancing the visuals by default. An image using higher graphics settings plus upscaling will probably look better than one using lower graphics settings at native resolution.

 

And why is upscaling a bad thing at lower fps numbers? Why is it a worse use case to use it to get to the 60 fps threshold compared to getting over 100? This doesn't make any sense.


2 minutes ago, Stahlmann said:

This is just the recommended specs, not a set-in-stone rule for playing the game. I have never seen so many people care about recommended specs before Alan Wake 2 and this game. You probably also still have the choice to turn off upscaling and turn down the graphics settings if you want to hit your fps numbers without upscaling.

 

The way they decided to use upscaling is to spend the added performance headroom on further enhancing the visuals by default. An image using higher graphics settings plus upscaling will probably look better than one using lower graphics settings at native resolution.

 

And why is upscaling a bad thing at lower fps numbers? Why is it a worse use case to use it to get to the 60 fps threshold compared to getting over 100? This doesn't make any sense.

I want developers to actually OPTIMIZE games to look great at max settings and still run great at that. We're talking high end hardware here, not low and mid end cards. Why could Doom Eternal do that? That thing runs crazy fast at max settings and still looks better than most games released today. With ray tracing, it looks just jaw dropping, yet still runs at insane framerate. HOW? And why other games can't?


11 minutes ago, RejZoR said:

I want developers to actually OPTIMIZE games to look great at max settings and still run great at that. We're talking high end hardware here, not low and mid end cards. Why could Doom Eternal do that? That thing runs crazy fast at max settings and still looks better than most games released today. With ray tracing, it looks just jaw dropping, yet still runs at insane framerate. HOW? And why other games can't?

Doom is a good example of a game that looks pretty good and performs well. But it's obviously an outlier, not the norm, since no other game franchise has been able to reproduce this balance. Probably the main reason no one else can do it is that the id Tech 7 engine is completely proprietary, so it's not available to other developers. The engine lays the groundwork and is one of the most significant factors in game development, and this one is developed and optimized specifically for ONE SINGLE game. That's obviously not applicable to the whole industry.


31 minutes ago, RejZoR said:

I want developers to actually OPTIMIZE games to look great at max settings and still run great at that. We're talking high end hardware here, not low and mid end cards. Why could Doom Eternal do that? That thing runs crazy fast at max settings and still looks better than most games released today. With ray tracing, it looks just jaw dropping, yet still runs at insane framerate. HOW? And why other games can't?

I'm not familiar with the game, but at first look it's a rather basic RT implementation. RT isn't a single thing that is on or off; there's a load of things to tweak: shadows, reflections, global illumination, path tracing, ambient occlusion, and probably a ton more I don't know about.

 

I don't know how many of those Doom Eternal implements, but the reflections promoted by Nvidia aren't even a full implementation; it's a hybrid with screen-space reflections.

Quote

In many locations, ray-traced reflections are combined with screen space reflections in a hybrid solution for optimum image quality and performance. This scene demonstrates the added off-screen detail that ray-traced reflections can add - now, the ceiling and occluded detail is rendered, adding greatly to image quality

https://www.nvidia.com/en-gb/geforce/news/doom-eternal-ray-tracing-nvidia-dlss-upgrade-available-now/

 

Games like CP2077 Overdrive and AW2 throw pretty much the whole RT toolbox at it, and the lighting in those is about as good as it gets in the current state of the art.


48 minutes ago, Stahlmann said:

This is just the recommended specs, not a set-in-stone rule for playing the game. I have never seen so many people care about recommended specs before Alan Wake 2 and this game. You probably also still have the choice to turn off upscaling and turn down the graphics settings if you want to hit your fps numbers without upscaling.

 

The way they decided to use upscaling is to spend the added performance headroom on further enhancing the visuals by default. An image using higher graphics settings plus upscaling will probably look better than one using lower graphics settings at native resolution.

 

And why is upscaling a bad thing at lower fps numbers? Why is it a worse use case to use it to get to the 60 fps threshold compared to getting over 100? This doesn't make any sense.

Upscaling, to me at least, should be thought of as a way to smooth out little hitches in performance and get a stable experience, not cranked all the way up to push graphics settings higher than your hardware should aim for. But some people do that, and that's fine. As you said, there are no rules set in stone here, and requirements are rarely that accurate anyway.

 

With that said, how should I expect this to run on a 3090? I'm thinking about getting one specifically to play Avatar. I'm currently running on a 3070 so a used 3090 would be something that works with my budget.


7 minutes ago, iSynthMan said:

With that said, how should I expect this to run on a 3090? I'm thinking about getting one specifically to play Avatar. I'm currently running on a 3070 so a used 3090 would be something that works with my budget.

It sits somewhere between a 3080 and a 4080, so expect 1440p high settings with DLSS Quality.


12 hours ago, williamcll said:

I think the issue here is that it only has ray-traced lighting; that's why the minimum specs are so modern.

Yes, if one is going to use only RT, one needs that upscaling plus custom/special solutions to get away with lower ray counts, and some of DLSS, XeSS and FSR play into the RT-to-PT parts.

 

Not sure how it would be if they used RTXGI; still expensive, but I think normal or older GPUs could manage it, though they'd still struggle.

