
AMD Radeon RX 6000 Specifications Revealed / Leaked from 2 Separate Locations: Up to 5120 Cores & 2.5GHz Clock Speeds

1 hour ago, LAwLz said:

MLID has pretty much never been correct so why does it matter what he says?

No?

 

He's been correct at least several times. However, nobody recognises that a lot of people got 30 series leaks wrong and yet we're still calling them reputable.

1 hour ago, LAwLz said:

He is about as accurate as WCCFTech in his "leaks" and predictions.

I respectfully but completely disagree. WCCFTech has no real sources whereas MLID does, and reviewing which of his 30 series leaks turned out to be reliable has helped him solidify his quality sources; he's ditched the sources that lied to him.

 

Also, I realize now that he also discussed RDNA2's big cache structure as a reason it won't need huge memory bandwidth. Even assuming, arguendo, that this is somehow untrustworthy or a flat-out lie, the jump from Kepler to Maxwell showed us that more L2 cache reduces memory bandwidth needs, so the only way this is untrue is if RDNA2 doesn't have more cache.
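To put rough numbers on that argument, here's a minimal back-of-the-envelope sketch in Python. The demand figure and hit rates are made-up illustrations, not leaked RDNA2 specs:

```python
# Back-of-the-envelope: how a large on-die cache reduces the DRAM bandwidth
# a GPU needs. Every number here is an illustrative assumption, not a spec.

def dram_bandwidth_needed(demand_gbs: float, hit_rate: float) -> float:
    """Only the requests that miss the cache have to go out to DRAM."""
    return demand_gbs * (1.0 - hit_rate)

demand = 900.0  # GB/s the shader cores would like to move (assumed)

for hit_rate in (0.0, 0.3, 0.5, 0.7):
    needed = dram_bandwidth_needed(demand, hit_rate)
    print(f"cache hit rate {hit_rate:.0%}: ~{needed:.0f} GB/s of DRAM traffic")

# At a 50% hit rate the same workload needs ~450 GB/s, which a 256-bit
# GDDR6 bus (~512 GB/s at 16 Gbps) can supply; at 0% it would need 900 GB/s.
```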

1 hour ago, LAwLz said:

You just have to go and look at his Ampere "leak" video to see how many things he got wrong.

And everybody else got things wrong.

 

Point is he learned which of his sources he needed to ditch and kept the ones that gave him reliable info.

1 hour ago, LAwLz said:

Even if you buy into the whole conspiracy theory that Nvidia deliberately sent out incorrect info and MLID's insider source accidentally leaked this incorrect info, that still means MLID's sources have a terrible track record and therefore we can't trust MLID either.

See above.

 

His unreliable sources were ditched and he's only got more accurate sources now.

Judge a product on its own merits AND the company that made it.

How to setup MSI Afterburner OSD | How to make your AMD Radeon GPU more efficient with Radeon Chill | (Probably) Why LMG Merch shipping to the EU is expensive

Oneplus 6 (Early 2023 to present) | HP Envy 15" x360 R7 5700U (Mid 2021 to present) | Steam Deck (Late 2022 to present)

 

Mid 2023 AlTech Desktop Refresh - AMD R7 5800X (Mid 2023), XFX Radeon RX 6700XT MBA (Mid 2021), MSI X370 Gaming Pro Carbon (Early 2018), 32GB DDR4-3200 (16GB x2) (Mid 2022)

Noctua NH-D15 (Early 2021), Corsair MP510 1.92TB NVMe SSD (Mid 2020), beQuiet Pure Wings 2 140mm x2 & 120mm x1 (Mid 2023),


33 minutes ago, suicidalfranco said:

Hope it's not an over-90°C card when running the latest games. Cause if I wanted a jet engine in my room I'd get a console.

No, that'd be the Nvidia 3080 or 3090 cards.



48 minutes ago, AluminiumTech said:

He's been correct at least several times

Even a broken clock is correct twice a day.

 

48 minutes ago, AluminiumTech said:

a lot of people got 30 series leaks wrong and yet we're still calling them reputable.

I don't.

 

49 minutes ago, AluminiumTech said:

WCCFTech has no real sources whereas MLID does

MLID has real sources and WCCFTech doesn't? According to whom? MLID himself?

 

50 minutes ago, AluminiumTech said:

And everybody else got things wrong.

Yes, and you shouldn't listen to those either.

 

50 minutes ago, AluminiumTech said:

Point is he learned which of his sources he needed to ditch and kept the ones that gave him reliable info.

Yeah right. Until he is wrong again and uses the same excuse?

Or do you mean we should expect 100% accuracy from him from now on because he has only kept his reliable sources? He has already shown that his judgement when it comes to classifying "reliable sources" is shit. In his Ampere video he even said multiple times that he was sure the info he had was correct because multiple insider sources had told him the same things. Guess all of them were unreliable?

 

 

The way I see it, leakers get a score. Every time they are correct they get 1 point. Every time they are wrong they get -5 points.

Right now, MLID has like -100 points or something. He is garbage as a source. Absolutely garbage. He gets paid to drum up rumors and he has an accuracy rate of like 20% on the stuff we have been able to verify. I don't see that changing either, no matter how much he tries to convince the people who give him money to keep giving it to him.


1 hour ago, AluminiumTech said:

snip

He was right on basically everything that mattered: the high power consumption, the weird cooler, the overall performance. The only important thing he got wrong was the doubling of ray-tracing performance.


13 hours ago, Energycore said:

The sad truth is Nvidia will continue to have a massive market cap advantage even if AMD steals the top end in 2021/2022. There's a huge number of people who are 100% convinced that Nvidia is the only GPU manufacturer.

The problem is NVIDIA also releases features people get attached to. If the cards are neck and neck but DLSS pushes NVIDIA 30% past AMD at the same image quality, do you care how you got those FPS? AMD's software/driver team needs to step up its game.

AMD 7950x / Asus Strix B650E / 64GB @ 6000c30 / 2TB Samsung 980 Pro Heatsink 4.0x4 / 7.68TB Samsung PM9A3 / 3.84TB Samsung PM983 / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator


15 hours ago, BiG StroOnZ said:

Navi 21a @ 2050MHz, and also Navi 21b @ up to 2200MHz. The faster Navi 21b @ 2200MHz would result in performance somewhere around 22 TFLOPs; meaning on paper, it's slower than an RTX 3080 @ about 30 TFLOPs.

Ampere's TFLOPs numbers are not indicative of gaming performance compared to Turing or RDNA1. Ampere has 2x the FP32 cores, but one of the two datapaths can process either FP32 or INT32, not both at the same time. If we disregard the second datapath, the 3080 has 4352 CUDA Cores, the same as the 2080 Ti, yet it's 20-25% faster; so call it 4352 × 1.25 = 5440 Turing CUDA Cores. And since RDNA1 has very similar IPC to Turing, even assuming no IPC improvements at all, at worst the 6900XT (21b) would be slightly faster than or just as fast as a 3080, since it has higher clocks.
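For anyone who wants to check that arithmetic, here it is spelled out as a quick Python sketch. The 25% uplift, the shader counts, and the Turing-IPC assumption come from the paragraph above; the sustained clocks are guesses taken from the rumored 21a/21b figures:

```python
# Re-deriving the estimate above. Assumptions: RDNA1/RDNA2 IPC ~= Turing IPC,
# and the 3080 performs like a 2080 Ti (4352 Turing cores) that is 25% faster.

def fp32_tflops(shaders: int, clock_ghz: float) -> float:
    """Paper FP32 TFLOPs = shaders * 2 ops/clock (FMA) * clock in GHz / 1000."""
    return shaders * 2 * clock_ghz / 1000.0

# On-paper TFLOPs (not gaming performance):
print(f"RTX 3080 : {fp32_tflops(8704, 1.71):.1f} TFLOPs")  # ~29.8
print(f"Navi 21b : {fp32_tflops(5120, 2.20):.1f} TFLOPs")  # ~22.5

# "Turing-equivalent" cores for the 3080, per the argument above:
turing_equiv_3080 = 4352 * 1.25  # = 5440

for clock in (2.05, 2.20):  # rumored Navi 21a and 21b clocks
    ratio = (5120 * clock) / (turing_equiv_3080 * 1.71)
    print(f"Navi 21 @ {clock:.2f} GHz vs 3080: {ratio:.2f}x")
# -> roughly 1.13x to 1.21x on these assumptions, i.e. "as fast or faster"
```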

Quote or Tag people so they know that you've replied.


7 hours ago, AluminiumTech said:

snip

 

I'm not necessarily suggesting that they're "lying", so to speak, but rather that we should have a "don't get your hopes up" mentality to avoid disappointment if they don't hit that 50% mark exactly.

 

1 hour ago, Syn. said:

Ampere's TFLOPs numbers are not indicative of gaming performance compared to Turing or RDNA1

 

Yup, I agree. Which is why I made sure to point out that "TFLOPs performance doesn't always translate as faster".


15 minutes ago, BiG StroOnZ said:

 

I'm not necessarily suggesting that they're "lying", so to speak, but rather that we should have a "don't get your hopes up" mentality to avoid disappointment if they don't hit that 50% mark exactly.

Okay, but Mark Cerny straight up says, in so many words, that there will be a 36CU RDNA2 graphics card for PC gamers and that it'll perform similarly to the PS5.



11 minutes ago, AluminiumTech said:

Okay, but Mark Cerny straight up says, in so many words, that there will be a 36CU RDNA2 graphics card for PC gamers and that it'll perform similarly to the PS5.

 

Yes, but you do realize that the performance numbers they're seeing (even if accurate) can come from specific benchmarks that are known to favor AMD / Radeon. That's why we say wait for independent reviewers, ad nauseam.

 

Capisce? 


3 minutes ago, BiG StroOnZ said:

snip

I'm still maintaining my position that there will be certain cards launched, including a 36CU Navi 2.0 card.



11 minutes ago, AluminiumTech said:

I'm still maintaining my position that there will be certain cards launched, including a 36CU Navi 2.0 card.

 

So, a 6600 XT (with 2,304 SPs) and a performance target around RTX 2070 Super / RTX 2080?


Just now, BiG StroOnZ said:

 

So a 6600 XT (with 2,304 SPs) and a performance target around RTX 2070 Super / RTX 2080?

More like between 2080 and 2080Ti perf.

 

I mean, best case scenario, it would beat a 2080Ti and be on par with a 3070, but I feel that's a tad optimistic for just 36CUs when they're launching an 80CU beast.



27 minutes ago, AluminiumTech said:

More like between 2080 and 2080Ti perf.

 

I mean, best case scenario, it would beat a 2080Ti and be on par with a 3070, but I feel that's a tad optimistic for just 36CUs when they're launching an 80CU beast.

 

I feel it's a bit optimistic too. Maybe, though, 2080 Super performance is possible if they happen to get those clock speeds up higher.


4 hours ago, AluminiumTech said:
Quote

Hope it's not an over-90°C card when running the latest games. Cause if I wanted a jet engine in my room I'd get a console.

No, that'd be the Nvidia 3080 or 3090 cards.

Yeah, it's the 3080 and 3090. lol

 

[Charts: RTX 3080 / 3090 temperature and noise benchmarks]

 

Desktop: Intel Core i9-9900K | ASUS Strix Z390-F | G.Skill Trident Z Neo 2x16GB 3200MHz CL14 | EVGA GeForce RTX 2070 SUPER XC Ultra | Corsair RM650x | Fractal Design Define R6

Laptop: 2018 Apple MacBook Pro 13"  --  i5-8259U | 8GB LPDDR3 | 512GB NVMe

Peripherals: Leopold FC660C w/ Topre Silent 45g | Logitech MX Master 3 & Razer Basilisk X HyperSpeed | HIFIMAN HE400se & iFi ZEN DAC | Audio-Technica AT2020USB+

Display: Gigabyte G34WQC


5 hours ago, suicidalfranco said:

Hope it's not an over-90°C card when running the latest games. Cause if I wanted a jet engine in my room I'd get a console.

That was only with blower coolers. I don't think we'll be seeing those anymore except for very specific card designs: ones that aren't such a furnace that a blower can't keep up, but are still hot enough that you want to blow the hot air directly out of the case. I'm surprised blowers aren't more common on those kinds of cards.

 

Or they could make a fat stack of fins like on the huge coolers we have now, but with the fins parallel to the card and exhausting at the back, so you actually have a big stack of fins. If you open any typical blower card, it's mostly a massive shroud and a relatively tiny fin stack underneath it. I wouldn't mind a blower card with a massive 2.5 or even 3-slot design...


5 hours ago, AluminiumTech said:

No, that'd be the Nvidia 3080 or 3090 cards.

Not what I've seen in Linus' reviews.

One day I will be able to play Monster Hunter Frontier in French/Italian/English on my PC, it's just a matter of time... 4 5 6 7 8 9 years later: It's finally coming!!!

Phones: iPhone 4S/SE | LG V10 | Lumia 920 | Samsung S24 Ultra

Laptops: Macbook Pro 15" (mid-2012) | Compaq Presario V6000

Other: Steam Deck

<>EVs are bad, they kill the planet and remove freedoms too some/<>


5 minutes ago, RejZoR said:

That was only with blower coolers. I don't think we'll be seeing those anymore except for very specific card designs: ones that aren't such a furnace that a blower can't keep up, but are still hot enough that you want to blow the hot air directly out of the case. I'm surprised blowers aren't more common on those kinds of cards.

It's not just blower-style coolers. Even non-blower coolers sound like jet engines when cranked to full speed.

Though if I've learned something from deshrouding mods, I blame the awful fans being used. If 2 Noctuas at 1500RPM can keep my GPU at the same temp as the stock fans going full throttle... well, I don't know, maybe drop the fancy stylised plastic shrouds and spend more money on the fans doing the actual work?



Assuming their chip can work to its fullest with the 256-bit memory bus and GDDR6, its final performance figures will depend mostly on how the 80 CUs scale versus the 40 of the 5700XT.

 

It also depends on what you're comparing.

4K results heavily favor the 3080 over the 5700XT, while 1440p results reduce the 3080's lead.

I think you'd want an average of 75% better performance than the 5700XT to compete with the 3080. Grab a few games and their 5700XT and RTX 3080 performance results (1440p), add 75% onto the 5700XT, and on average you'll see 3080-level performance (88% at 4K).

 

At 'only' 70% CU scaling, a clock bump to 2200MHz (10%), and possible IPC improvements (5%), totaling an improvement over the 5700XT of 85%, it should handily compete with the 3080.

 

With perfect 100% CU scaling, that 85% improvement goes up to 115%, which outright beats the 3080.

 

So it's going to depend on how things scale.

I think it's safe to assume that even in the worst case of truly poor CU and clock scaling, say a total of 60%, at 1440p that's enough to get close to the 3080. It's not as impressive at 4K, but if it's cheap enough, probably a good buy. I expect better than a 60% improvement over the 5700XT, however.
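Here's a quick sketch of those scenarios in Python. One caveat worth flagging: the figures above sum the CU, clock, and IPC gains; if the gains compound multiplicatively instead, the totals land somewhat higher:

```python
# Scenario table for a hypothetical 80-CU card vs the 5700 XT (40 CUs).
# Assumed gains: +10% clocks (2200 MHz) and +5% IPC; CU scaling varies.
# Reference targets from above: ~+75% over the 5700 XT ~= 3080 at 1440p,
# ~+88% at 4K.

CLOCK_GAIN = 0.10
IPC_GAIN = 0.05

for cu_gain in (0.60, 0.70, 1.00):
    additive = cu_gain + CLOCK_GAIN + IPC_GAIN                        # simple sum
    compound = (1 + cu_gain) * (1 + CLOCK_GAIN) * (1 + IPC_GAIN) - 1  # if gains multiply
    print(f"CU scaling +{cu_gain:.0%}: additive +{additive:.0%}, "
          f"compounded +{compound:.0%}")

# CU +60%: additive +75%, compounded +85%
# CU +70%: additive +85%, compounded +96%
# CU +100%: additive +115%, compounded +131%
```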

CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w Corsair RM 750w Gold (2021)|

VDU: Panasonic 42" Plasma | GPU: Gigabyte 1080ti Gaming OC & Barrow Block (RIP)...GTX 980ti | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + WD Blue 1TB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P | NF-A12x25 fans |


The odd thing is the memory config, though. But whatever, we'll see it soon enough.

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lanccool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver)Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


Still think the RX 6000 series will be better than 2080Ti performance in 4K, but not 3080 performance. Those are some massive gains that AMD would have to make in one leap.

However... Nvidia has heavily focused on 4K and 8K gains. If you look at 1440p, the gains didn't scale the same way because of how the Nvidia GPU is designed; there were gains, but not the same as at 4K.

 

I really think the RX 6000 will be a gaming-focused GPU design, and as such its gains at 1080p and 1440p might be better than Nvidia's; that's possibly where it will match or beat the 3080.

You know what, that's OK, because there are a lot of 1080p and 1440p gamers out there, myself included.

 

 

CPU | AMD Ryzen 7 7700X | GPU | ASUS TUF RTX3080 | PSU | Corsair RM850i | RAM | 2x16GB X5 6000MHz CL32 | MOTHERBOARD | Asus TUF Gaming X670E-PLUS WIFI | STORAGE | 2x Samsung Evo 970 256GB NVME | COOLING | Hard Line Custom Loop O11XL Dynamic + EK Distro + EK Velocity | MONITOR | Samsung G9 Neo


On 9/28/2020 at 8:13 PM, RejZoR said:

Hot Chips? All I get searching for it is hot chips challenges on youtube... With literal spicy chips...

https://www.anandtech.com/show/15994/hot-chips-2020-live-blog-microsoft-xbox-series-x-system-architecture-600pm-pt

MOAR COARS: 5GHz "Confirmed" Black Edition™ The Build
AMD 5950X 4.7/4.6GHz All Core Dynamic OC + 1900MHz FCLK | 5GHz+ PBO | ASUS X570 Dark Hero | 32 GB 3800MHz 14-15-15-30-48-1T GDM 8GBx4 |  PowerColor AMD Radeon 6900 XT Liquid Devil @ 2700MHz Core + 2130MHz Mem | 2x 480mm Rad | 8x Blacknoise Noiseblocker NB-eLoop B12-PS Black Edition 120mm PWM | Thermaltake Core P5 TG Ti + Additional 3D Printed Rad Mount

 


On 9/28/2020 at 3:59 PM, valdyrgramr said:

Do a 6900 XT vs 690 vid, Linus.

Do not do this. It’s a trap.  If you do 6900 and then 690 the next inevitable step is 69.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


4 hours ago, Bombastinator said:

Do not do this. It’s a trap.  If you do 6900 and then 690 the next inevitable step is 69.

69 is using full ray tracing so the others will look quite weak in comparison 😛 


Interestingly, the last AMD card I bought was also a 69xx card (HD 6950) 10 years ago, and it was an overclocking beast. Perhaps I will own another 69xx card this generation if the specs shake out to be at the higher end of the rumors.

 

I’ve purchased and owned cards from many brands, some of which are dead now (S3, 3dfx) so I have no particular loyalty to Nvidia or AMD/ATI. I would think all but the most fervent Nvidia fanboys are hoping that AMD’s 69xx cards are NICE 😎 as it will only mean good things for consumers either way, whether it is a strong value in the midrange or forcing Nvidia to change its game.

 

I would expect AMD's hardware RT to be competitive this generation, but I agree that DLSS is Nvidia's silver bullet. If AMD can deliver a card that kicks ass at traditional rasterized performance, they will have a good shot.

 

Current build: AMD Ryzen 7 5800X, ASUS PRIME X570-Pro, EVGA RTX 3080 XC3 Ultra, G.Skill 2x16GB 3600C16 DDR4, Samsung 980 Pro 1TB, Sabrent Rocket 1TB, Corsair RM750x, Scythe Mugen 5 Rev. B, Phanteks Enthoo Pro M, LG 27GL83A-B


DLSS is not a real threat for as long as it has to be specifically coded into each game. However, if NVIDIA somehow manages to make it work in ANY game without any coding involved (not even Temporal AA, as was stated in some leaks), then it'll become a huge threat that I'm not sure AMD will be able to counter. If you could just flip a switch in NVCP and gain a 40-50% performance boost without any noteworthy degradation in quality, I'd probably have it enabled all the time. We've had a bunch of "optimisations" as an option, and all they do is make the image look like crap while you gain maybe 2fps at best. DLSS is an entirely different beast: the image is hardly any different while the performance boosts are massive.

The quality upgrades NVIDIA has made to DLSS are massive, and it's a shame developers like the guys who made Metro Exodus didn't care to update their game to support the latest DLSS. This is why making it game-agnostic would be the most important thing ever for NVIDIA: base the entire rendering pipeline around DLSS and have it work with any game. That would entirely change the way things are rendered, and it would probably give them an unmatched advantage.

