
AMD announces RDNA2 and Zen3 dates

10 minutes ago, PineyCreek said:

Last system I used is still a backup... built in 2010 with a first-gen i7 (and I've got a 5.25" floppy working on it, playing games almost as old as I am in DOSBox). The only reason I'm thinking of building another system is that my current board has blown out one of its USB chips, and I'd have to change motherboards anyway to upgrade CPU and memory. Might as well do the whole shebang at once. If I don't do that, I'd probably get a new GPU anyway. I'll always have a use for another GPU as long as I keep doing anything with BOINC or F@H, I figure. It's a fine excuse.

Ouch. That's a good reason alright. Mad respect for the 5.25". I've got a Q66-something in my basement as my backup; I don't know if I mounted a floppy or not, and if I did it's 3.5" though. My current is a 4770K. It's dicey for me: I don't know if the machine will give out, or if game devs will kill off 4-core/8-thread before DDR5 comes out. If I have to go DDR4 it will probably be AM4, a 3700X or something Zen 3 if it winds up being any good. I think Intel will get it together; I don't know if they'll do it before DDR5 though.

Edited by Bombastinator
Typos

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


1 hour ago, GDRRiley said:

Just so that we are 100% clear here.

You claimed that Asus posted the wrong specifications on launch day.

Your evidence for this seems to be a Videocardz article that was posted before launch and that people say is just a PR statement that Videocardz has appended with the rumored specs. I think that sounds pretty reasonable since that would be a great way to get lots of clicks during the peak of the hype.

 

Now you have moved on from "Asus posted the wrong specs" since you couldn't find any solid evidence for that and are now talking about Gainward (not Asus) accidentally posting the wrong specs on their website. 

But when we look at the source for the Gainward leak, we can see that nowhere on the page does it specify the wrong number of CUDA cores. In fact, it doesn't even specify the number of CUDA cores at all.

 

Luckily for us, we actually have an archive of the Gainward page specified in the leaks. Here it is:

https://web.archive.org/web/20200831033826/https://www.gainward.com/main/vgapro.php?id=1087&tab=ov&lang=en

It's from the same date as the Tweet.

 

Notice anything strange with that page? Or rather, something normal? It is exactly the same as the current page:

https://www.gainward.com/main/vgapro.php?id=1087&tab=ov&lang=en

 

Which means it contained the correct information. 
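For anyone who would rather script that check than eyeball the two pages, here is a minimal Python sketch (an illustration, not part of the original post; it assumes both URLs are still reachable, and uses the Wayback Machine's "id_" timestamp suffix to fetch the raw snapshot without the archive toolbar):

```python
# Diff the archived Gainward page against the live one.
# An empty diff means the snapshot matches the current page,
# i.e. the listed specs were never corrected after the fact.
import difflib

import requests

ARCHIVED = ("https://web.archive.org/web/20200831033826id_/"
            "https://www.gainward.com/main/vgapro.php?id=1087&tab=ov&lang=en")
LIVE = "https://www.gainward.com/main/vgapro.php?id=1087&tab=ov&lang=en"

archived_html = requests.get(ARCHIVED, timeout=30).text.splitlines()
live_html = requests.get(LIVE, timeout=30).text.splitlines()

for line in difflib.unified_diff(archived_html, live_html,
                                 fromfile="archived", tofile="live",
                                 lineterm=""):
    print(line)
```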

 

What exactly were you trying to prove by pointing out the Gainward page? That AIB partners weren't fed the wrong info by Nvidia? Because clearly they weren't. They knew the specs before the launch. Which isn't a weird concept because the partners are literally the companies that manufacture the bloody thing.


21 minutes ago, LAwLz said:

Just so that we are 100% clear here.

You claimed that Asus posted the wrong specifications on launch day.

Your evidence for this seems to be a Videocardz article that was posted before launch and that people say is just a PR statement that Videocardz has appended with the rumored specs. I think that sounds pretty reasonable since that would be a great way to get lots of clicks during the peak of the hype.

 

Now you have moved on from "Asus posted the wrong specs" since you couldn't find any solid evidence for that and are now talking about Gainward (not Asus) accidentally posting the wrong specs on their website.

I used that article not to talk about Gainward but because it talked some about Asus and them leaking some. I don't like either source, but I can't find one that directly claims Asus leaked more than coolers; supposedly a PR statement came out with the wrong specs right before the livestream. Most of these news sites just quote one another as a source.

 

You can give AIBs a TDP and a reference board design and let them make a custom cooler for that without them knowing any of the other specs.

 

Good luck, Have fun, Build PC, and have a last gen console for use once a year. I should answer most of the time between 9 to 3 PST

NightHawk 3.0: R7 5700x @, B550A vision D, H105, 2x32gb Oloy 3600, Sapphire RX 6700XT  Nitro+, Corsair RM750X, 500 gb 850 evo, 2tb rocket and 5tb Toshiba x300, 2x 6TB WD Black W10 all in a 750D airflow.
GF PC: (nighthawk 2.0): R7 2700x, B450m vision D, 4x8gb Geli 2933, Strix GTX970, CX650M RGB, Obsidian 350D

Skunkworks: R5 3500U, 16gb, 500gb Adata XPG 6000 lite, Vega 8. HP probook G455R G6 Ubuntu 20. LTS

Condor (MC server): 6600K, z170m plus, 16gb corsair vengeance LPX, samsung 750 evo, EVGA BR 450.

Spirt  (NAS) ASUS Z9PR-D12, 2x E5 2620V2, 8x4gb, 24 3tb HDD. F80 800gb cache, trueNAS, 2x12disk raid Z3 striped

PSU Tier List      Motherboard Tier List     SSD Tier List     How to get PC parts cheap    HP probook 445R G6 review

 

"Stupidity is like trying to find a limit of a constant. You are never truly smart in something, just less stupid."

Camera Gear: X-S10, 16-80 F4, 60D, 24-105 F4, 50mm F1.4, Helios44-m, 2 Cos-11D lavs


26 minutes ago, GDRRiley said:

I used that article not to talk about Gainward but because it talked some about Asus and them leaking some. I don't like either source, but I can't find one that directly claims Asus leaked more than coolers; supposedly a PR statement came out with the wrong specs right before the livestream. Most of these news sites just quote one another as a source.

 

You can give AIBs a TDP and a reference board design and let them make a custom cooler for that without them knowing any of the other specs.

So, no reliable source for any of your claims?


8 hours ago, tim0901 said:

Why are you listening to this guy? MLID is not only clearly AMD biased,

In what way?

8 hours ago, tim0901 said:

but he has a terrible track record with leaks - most of the stuff he “leaked” about RTX 3000 was proven completely false by Nvidia’s presentation. 
 

In terms of relative performance, not really.

 

Most of what was leaked by everybody about the 30 series was wrong.

 

Nvidia’s presentation was, as usual, a marketing presentation. It’s not trustworthy, and we’ll need to wait for reviews for the full information.

8 hours ago, tim0901 said:

For example, he claimed the RTX 3000 series would launch with a complete rework of Nvidia’s software stack, where they merged GeForce Experience and the Nvidia control panel, which would then no longer require a login. It would also launch alongside DLSS 3.0, which would supposedly work on any game that could use TSAA.

It’s possible this will be announced later on.

The thing is that Nvidia rushed these cards out the door so there was likely no time to rework the software stack.

8 hours ago, tim0901 said:

 

The cards themselves would supposedly “look similar to the 2000 series cards, but with 3 fans”. He quoted an incorrect CUDA core count for GA102 (one that doesn’t even line up with the doubling from 2x FP32) as well as an incorrect memory layout. USB-C would also return after being ditched by the Super cards.

 

Pretty much the only thing he got right was PCIe 4.0. Which wasn’t exactly a hard guess.

The incorrect CUDA core count fiasco is because Nvidia lied to their AIBs, and their AIBs subsequently leaked the wrong numbers to everyone.

 

Also, the 2x FP32 thing is marketing. Each individual CUDA core is now substantially weaker than a previous-gen CUDA core.
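To put rough numbers on that (a back-of-envelope sketch using Nvidia's launch figures of 8704 CUDA cores for the 3080 versus 2944 for the 2080, and their own "up to 2x" performance claim, which is marketing until reviews confirm it):

```python
# If the advertised core count triples but performance only doubles,
# each advertised "core" does far less work than a Turing core did.
cores_3080, cores_2080 = 8704, 2944
claimed_uplift = 2.0  # Nvidia's marketing claim, not a measured result

core_ratio = cores_3080 / cores_2080      # ~2.96x the advertised cores
per_core = claimed_uplift / core_ratio    # ~0.68x the per-core throughput

print(f"{core_ratio:.2f}x the cores, ~{per_core:.2f}x the work per core")
```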

Judge a product on its own merits AND the company that made it.

How to setup MSI Afterburner OSD | How to make your AMD Radeon GPU more efficient with Radeon Chill | (Probably) Why LMG Merch shipping to the EU is expensive

Oneplus 6 (Early 2023 to present) | HP Envy 15" x360 R7 5700U (Mid 2021 to present) | Steam Deck (Late 2022 to present)

 

Mid 2023 AlTech Desktop Refresh - AMD R7 5800X (Mid 2023), XFX Radeon RX 6700XT MBA (Mid 2021), MSI X370 Gaming Pro Carbon (Early 2018), 32GB DDR4-3200 (16GB x2) (Mid 2022)

Noctua NH-D15 (Early 2021), Corsair MP510 1.92TB NVMe SSD (Mid 2020), beQuiet Pure Wings 2 140mm x2 & 120mm x1 (Mid 2023),


27 minutes ago, LAwLz said:

So, no reliable source for any of your claims?

Well, we saw it when it was there. The only real source you might want to believe would be the original page, but obviously it's been corrected since...

F@H
Desktop: i9-13900K, ASUS Z790-E, 64GB DDR5-6000 CL36, RTX3080, 2TB MP600 Pro XT, 2TB SX8200Pro, 2x16TB Ironwolf RAID0, Corsair HX1200, Antec Vortex 360 AIO, Thermaltake Versa H25 TG, Samsung 4K curved 49" TV, 23" secondary, Mountain Everest Max

Mobile SFF rig: i9-9900K, Noctua NH-L9i, Asrock Z390 Phantom ITX-AC, 32GB, GTX1070, 2x1TB SX8200Pro RAID0, 2x5TB 2.5" HDD RAID0, Athena 500W Flex (Noctua fan), Custom 4.7l 3D printed case

 

Asus Zenbook UM325UA, Ryzen 7 5700u, 16GB, 1TB, OLED

 

GPD Win 2


16 minutes ago, Kilrah said:

Well, we saw it when it was there. The only real source you might want to believe would be the original page, but obviously it's been corrected since...

Which page?

 

 

29 minutes ago, AluminiumTech said:

The incorrect CUDA core count fiasco is because Nvidia lied to their AIBs, and their AIBs subsequently leaked the wrong numbers to everyone.

[Citation Needed]


31 minutes ago, LAwLz said:

Which page?

You can go watch a few hours of videos from our favorite YouTubers from just before the announcement and locate it yourself, if you want to find it that badly.

GN explained the misunderstanding BTW.

F@H
Desktop: i9-13900K, ASUS Z790-E, 64GB DDR5-6000 CL36, RTX3080, 2TB MP600 Pro XT, 2TB SX8200Pro, 2x16TB Ironwolf RAID0, Corsair HX1200, Antec Vortex 360 AIO, Thermaltake Versa H25 TG, Samsung 4K curved 49" TV, 23" secondary, Mountain Everest Max

Mobile SFF rig: i9-9900K, Noctua NH-L9i, Asrock Z390 Phantom ITX-AC, 32GB, GTX1070, 2x1TB SX8200Pro RAID0, 2x5TB 2.5" HDD RAID0, Athena 500W Flex (Noctua fan), Custom 4.7l 3D printed case

 

Asus Zenbook UM325UA, Ryzen 7 5700u, 16GB, 1TB, OLED

 

GPD Win 2


3 hours ago, Stahlmann98 said:

If RTX 2000 is anything to go by, the first gen of ray-tracing-ready tech from AMD won't even come close to RTX 3000 performance while using any DXR features. Not to mention new games help NVIDIA cards via DLSS. AMD still has a lot of catching up to do.

I wouldn't be so negative, although of course we don't really know how it performs yet. AMD's RT tech is going into both next-gen consoles; it isn't going away however it performs. Just because Nvidia's first-gen RT had a notable performance impact, it doesn't necessarily follow that AMD's first offering will have the same limitations. By waiting a whole generation before releasing their offering, they had more time to get it to a good level.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


It's not like any AMD GPU matters or will matter for at least 2 more generations... All they're good for is lowering the prices of the almighty Nvidia cards so you can all buy those. And if they don't get lowered, you still buy them. If AMD gets close for cheaper, you'll think of an excuse, like features you imagine AMD not having, to still buy Nvidia. When AMD gets better, you'll just say their drivers suck and still buy Nvidia. It will take at least 2 generations of AMD being better than Nvidia (and by better I of course mean having the fastest, most expensive card on the market) before most will even CONSIDER buying AMD. Source? This topic, and the many comments under the articles on the different tech news sites.

So everyone wants competition just so they can buy the cards of one side, at the price they want to spend on them.

 

And BTW, the Nvidia prices ARE outrageous, ridiculous, and stupid. But it's the almighty Nvidia asking them, so it's OK. They spent all that R&D on it, so they've earned it. Except you're forgetting that if no one pays it, they'll have no choice but to lower it. But you have to have the latest and greatest from your overlords at Nvidia.

 

Am I an AMD fanboy? Nope, I'm an Nvidia hateboy if anything. I've had too many shit cards with shit drivers crashing good systems (confirmed by replacing the Nvidia cards with AMD cards and never crashing again) to even care about anything they do anymore.

 

And people claim the overlords have such a great feature set over AMD. What was it... DLSS, ray tracing support, Shadow Play, RTX Voice, G-Sync. Let's go over it:

  • DLSS: I have never cared for AA at all. Never understood why anyone does. At first because of the performance impact; later I just bought better cards so that I had the best AA possible: higher resolutions. Still do. I don't even see aliased edges at 4K, so why do I need it?
  • Ray tracing support: Well, with the new cards it's no longer an Nvidia exclusive, so that's off your list then...
  • Shadow Play: Same thing here, AMD has had something similar for years now. Besides, how important is this? Are you all content creators or streamers all of a sudden? CPU encoding is still better anyway if you want to create some real quality content... oh right, YT screws it up anyway, so it doesn't even matter.
  • RTX Voice: Really? Is that some big thing? I'll admit I don't watch a lot of streamers, but the ones I have watched are never in a noisy environment. Why the HELL would you want to record/play in such an environment anyway... go somewhere more quiet. How many people were actually hoping for anything like this to come around? No one I know... I can tell you that. Also, if a streamer is typing, I don't mind hearing it so I don't have to wonder wtf they're doing.
  • G-Sync: Sorry to destroy your hopes and dreams, but this is going away sooner or later. Certainly now that it's also working on FreeSync monitors. They might stretch this out a while by adding "features" to the G-Sync module, but it's just fighting the inevitable moment when it's obsolete and FreeSync can do it all for much cheaper.

So ehm... not much left of your list there...

 

Anyway, I'll now just continue enjoying everyone's ridiculous excuses for the Nvidia prices and why AMD won't compete. It's a blast, really. Nice of you to forget they are absolutely destroying Intel right now with a mere FRACTION of the budget. All your comments were under the Ryzen threads as well, before the first generation was released... If you factor in the budgets both companies are working with, it's actually AMD who's amazing and Nvidia that just plain sucks...

 

Please come at me... I'll have a laugh and not respond ;) just so you know right now.

I have no signature


3 hours ago, Stahlmann98 said:

If RTX 2000 is anything to go by, the first gen of ray-tracing-ready tech from AMD won't even come close to RTX 3000 performance while using any DXR features. Not to mention new games help NVIDIA cards via DLSS. AMD still has a lot of catching up to do.

At least in Control, according to Digital Foundry's tests, there is no difference in the performance hit caused by RT between the 2080 and 3080; the performance gap between the cards stayed the same regardless of RT being on or off. Naturally there are no tests of other games, so that doesn't confirm it, but there's also no proper confirmation that the RT performance hit will be lower, or gone, on the new GPUs.
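To be explicit about what "the performance hit stayed the same" means, here is a toy calculation with made-up frame rates (purely illustrative; Digital Foundry's actual numbers are in their Control video):

```python
# The RT "hit" is the fraction of frame rate lost when RT is enabled.
# These fps values are hypothetical, chosen only to show the comparison.
fps = {"2080": {"rt_off": 60, "rt_on": 40},
       "3080": {"rt_off": 90, "rt_on": 60}}

for card, f in fps.items():
    hit = 1 - f["rt_on"] / f["rt_off"]
    print(f"{card}: RT costs {hit:.0%} of the frame rate")

# If both cards lose the same fraction, the 3080's advantage with RT on
# is just its raw-performance advantage, not better RT hardware per se.
```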


2 hours ago, AluminiumTech said:

In what way?

Watch enough of his content and it becomes blatantly obvious that he’s rooting for team red. 
 

Not to mention the AMD shirt he wore while trying to downplay Nvidia’s announcement (the 50th anniversary shirt you got by buying a gold edition 2700X). He may not be a raging AMD fanboy irl, but he still shouldn’t have done that if he wants to try and come across as an unbiased source. 

 

Quote

In terms of relative performance, not really.

Was wrong about relative RTX performance. Didn’t get the 1.9x perf/watt. Claimed 400W of power consumption. A broken clock is right twice a day.

 

Quote

Most of what was leaked by everybody about the 30 series was wrong.

So none of them are reliable. Cool.
 

Quote

Nvidia’s presentation was, as usual, a marketing presentation. It’s not trustworthy, and we’ll need to wait for reviews for the full information.

Yes. Absolutely. But that doesn’t mean some random guy on the internet is any better. If anything he’s worse: nobody’s holding him responsible if he lies or gets things wrong. Nvidia can’t get away with that as much. They’ve said the 3070 is better than the 2080 Ti (in both RTX on and off modes, according to their Q&A), so if that’s not true (a very objective thing to measure) then the community will know and hold them accountable.
 

Quote

It’s possible this will be announced later on.

 

The thing is that Nvidia rushed these cards out the door so there was likely no time to rework the software stack.

Except they’ve just finished bringing their Quadro cards to GeForce Experience? Why would they waste resources moving their professional cards to the current software stack if they’re planning on getting rid of it?

 

As for DLSS 3.0? Again, pretty easy thing to guess about. Doesn’t exactly take much reasoning to guess that the end goal of something like DLSS would be for it to work on any game, without needing developer support. Not to mention Nvidia is very open about a lot of their research, so you can probably find papers about it on their website and make educated guesses off of that.

 

Quote

The incorrect CUDA core count fiasco is because Nvidia lied to their AIBs, and their AIBs subsequently leaked the wrong numbers to everyone.

So? They’re still incorrect. 
 

Honestly, I’m not sure if this is true or not. The evidence I’ve seen is sketchy at best. But if it is true, all it means is that those so-called ‘sources’ (who always pick tiny YouTubers with no audience to leak to...) will be silenced pretty soon. If Nvidia lied to the AIBs, they will have given different specs to each one to identify the source of the leaks.
 

Quote

Also, the 2x FP32 thing is marketing. Each individual CUDA core is now substantially weaker than a previous-gen CUDA core.

I’m aware. But this is how Nvidia wants to market it, so we have to deal with it. 

CPU: i7 4790k, RAM: 16GB DDR3, GPU: GTX 1060 6GB


8 minutes ago, Helly said:

 

  • Shadow Play: Same thing here, AMD has had something similar for years now. Besides, how important is this? Are you all content creators or streamers all of a sudden? CPU encoding is still better anyway if you want to create some real quality content... oh right, YT screws it up anyway, so it doesn't even matter.

 

Just FYI, the reason Shadow Play and AMD's ReLive are great is that you can capture the fun moments you have with your friends online. Because you can capture after the fact (with an X-minute buffer), you can also capture things you weren't expecting to need to capture. I don't have a YouTube account (not really), but I have a good collection of funny clips thanks to this.
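That "capture after the fact" trick is essentially a ring buffer: the recorder keeps only the last few minutes of frames, and dumping the buffer on a hotkey saves the moment that already happened. A toy Python sketch of the idea (frame capture and encoding are stubbed out; real implementations do this with the GPU's hardware encoder):

```python
from collections import deque

FPS = 60
BUFFER_SECONDS = 300  # keep the last 5 minutes, like the "X minutes" above

# Old frames fall off the front automatically once the buffer is full,
# so memory use stays bounded no matter how long you play.
ring = deque(maxlen=FPS * BUFFER_SECONDS)

def on_new_frame(frame: bytes) -> None:
    ring.append(frame)  # O(1) per frame

def save_clip(path: str) -> None:
    # Called *after* the funny moment; the buffer already contains it.
    with open(path, "wb") as f:
        for frame in ring:
            f.write(frame)

# Usage: feed frames continuously, save when the hotkey is pressed.
for i in range(FPS * 10):
    on_new_frame(f"frame {i}".encode())
save_clip("clip.bin")
```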


Well, RIP AMD GPUs, one month too late. Seriously, allowing this to be released after Nvidia will destroy any chance of anyone switching teams. Not sure who gave AMD the green light to push the release back so far; one month is forever in the electronics world. Nvidia buyers have been waiting a long time for a GPU that is a major upgrade over previous graphics cards, and Nvidia gave them one before AMD did, so they will buy Nvidia now.


29 minutes ago, Helly said:

  • DLSS: I have never cared for AA at all. Never understood why anyone does. At first because of the performance impact; later I just bought better cards so that I had the best AA possible: higher resolutions. Still do. I don't even see aliased edges at 4K, so why do I need it?

AA is more important at lower resolutions; of course it will be less impactful at 4K or higher. Missing AA was one of the main reasons why games looked so terrible on the PS3 and Xbox 360.

 

Also, DLSS is not only about delivering a sharper image; it's mainly about increasing performance. DLSS renders the image at a lower resolution than the target and uses AI upscaling to deliver, for example, native-4K image quality while only needing the resources to render a 1440p image. THIS is the reason why DLSS is impressive. It's not taking performance away, like all other AA methods do; it's giving you extra performance, and thus more headroom for things like high-refresh-rate displays or performance-heavy options like DXR ray-tracing.

 

This has nothing to do with fanboying for NVIDIA; this is a very impressive technology that AMD currently doesn't offer, and AMD hasn't announced anything similar yet. It's also a reason why the same GPU power can deliver better results in new upcoming games on NVIDIA cards compared to AMD cards that lack similar technologies.
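The arithmetic behind that 1440p-to-4K example is worth spelling out (a quick sketch; the ratio is an upper bound on the headroom, since the upscale itself isn't free):

```python
render = (2560, 1440)   # internal render resolution
target = (3840, 2160)   # output ("native") resolution

render_px = render[0] * render[1]   # 3,686,400 pixels per frame
target_px = target[0] * target[1]   # 8,294,400 pixels per frame

# Native 4K shades 2.25x the pixels of 1440p, which is roughly the
# shader workload DLSS avoids before spending some of it on upscaling.
print(f"{target_px / render_px:.2f}x")
```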

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


It will be quite a fun month. 

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lancool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver) | Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


I'm not sure what AMD could announce that would compete with DLSS. I'd previously written it off, but after doing some research, it's close to being indispensable, certainly going forward. 


1 hour ago, Stahlmann98 said:

AA is more important at lower resolutions; of course it will be less impactful at 4K or higher. Missing AA was one of the main reasons why games looked so terrible on the PS3 and Xbox 360.

 

Also, DLSS is not only about delivering a sharper image; it's mainly about increasing performance. DLSS renders the image at a lower resolution than the target and uses AI upscaling to deliver, for example, native-4K image quality while only needing the resources to render a 1440p image. THIS is the reason why DLSS is impressive. It's not taking performance away, like all other AA methods do; it's giving you extra performance, and thus more headroom for things like high-refresh-rate displays or performance-heavy options like DXR ray-tracing.

 

This has nothing to do with fanboying for NVIDIA; this is a very impressive technology that AMD currently doesn't offer, and AMD hasn't announced anything similar yet. It's also a reason why the same GPU power can deliver better results in new upcoming games on NVIDIA cards compared to AMD cards that lack similar technologies.

Definition of “native” is a bit squishy here. Upscaling is upscaling; “native appearing” it might do.
This is about adding information to an image consistent with other similar images. This kind of tech was used to add theoretical definition to faces, but it can’t be used in court because it doesn’t add real definition; it adds assumed, standardized definition. That may amount to the same thing in video games, if not in perpetrator identification on security cameras. I begin to wonder if this might actually be used as invisibility or as a decoy in some instances in competitive FPS games at long range.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


Just now, Bombastinator said:

Definition of “native” is a bit squishy here. Upscaling is upscaling; “native appearing” it might do.

HUB already had an in-depth look at it. In most of the games where it's implemented, it just gives better performance for the same image quality. Maybe not really "native", but at least indistinguishable. In some cases the image even looked better DLSS-upscaled than the native picture.

 

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


2 minutes ago, Stahlmann98 said:

HUB already had an in-depth look at it. In most of the games where it's implemented, it just gives better performance for the same image quality. Maybe not really "native", but at least indistinguishable. In some cases the image even looked better DLSS-upscaled than the native picture.

 

So native appearing. 

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


Just now, Bombastinator said:

So native appearing. 

If it makes you happy, call it "native appearing" then. My point still stands.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


1 hour ago, Stahlmann98 said:

AA is more important at lower resolutions; of course it will be less impactful at 4K or higher. Missing AA was one of the main reasons why games looked so terrible on the PS3 and Xbox 360.

 

Also, DLSS is not only about delivering a sharper image; it's mainly about increasing performance. DLSS renders the image at a lower resolution than the target and uses AI upscaling to deliver, for example, native-4K image quality while only needing the resources to render a 1440p image. THIS is the reason why DLSS is impressive. It's not taking performance away, like all other AA methods do; it's giving you extra performance, and thus more headroom for things like high-refresh-rate displays or performance-heavy options like DXR ray-tracing.

 

This has nothing to do with fanboying for NVIDIA; this is a very impressive technology that AMD currently doesn't offer, and AMD hasn't announced anything similar yet. It's also a reason why the same GPU power can deliver better results in new upcoming games on NVIDIA cards compared to AMD cards that lack similar technologies.

AMD has a response to DLSS (https://www.techspot.com/article/1873-radeon-image-sharpening-vs-nvidia-dlss/), and we don't know how many games are going to have DLSS, so I'm not sure I would consider it a huge make-or-break feature right now.

1 hour ago, Shorty88jr said:

Well, RIP AMD GPUs, one month too late. Seriously, allowing this to be released after Nvidia will destroy any chance of anyone switching teams. Not sure who gave AMD the green light to push the release back so far; one month is forever in the electronics world. Nvidia buyers have been waiting a long time for a GPU that is a major upgrade over previous graphics cards, and Nvidia gave them one before AMD did, so they will buy Nvidia now.

Well, that's if they are able to find any RTX 3000 GPUs. There are rumors that stock is going to be much lower than at a typical launch, since Samsung 8nm has terrible yields.


1 minute ago, Stahlmann98 said:

If it makes you happy, call it "native appearing" then. My point still stands.

It’s a definition of type rather than appearance; I was attempting to draw a distinction. I don’t dispute the usefulness of DLSS. Appearance and reality are squishy too when dealing with video games.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


1 hour ago, Stahlmann98 said:

AA is more important at lower resolutions; of course it will be less impactful at 4K or higher. Missing AA was one of the main reasons why games looked so terrible on the PS3 and Xbox 360.

Well, I didn't think I would respond, but here I am... I have a PS3 and played plenty of games on it; they never looked terrible to me, and even there I barely, if ever, noticed the lack of AA. Besides, AA comes from the age when PCs could barely run games at a resolution like 1280x1024. The same argument was made then: it makes things look better at lower resolutions, and it wasn't as impactful at 1280x1024 or higher. So... when is that argument actually true?

 

1 hour ago, Stahlmann98 said:

Also, DLSS is not only about delivering a sharper image; it's mainly about increasing performance. DLSS renders the image at a lower resolution than the target and uses AI upscaling to deliver, for example, native-4K image quality while only needing the resources to render a 1440p image. THIS is the reason why DLSS is impressive. It's not taking performance away, like all other AA methods do; it's giving you extra performance, and thus more headroom for things like high-refresh-rate displays or performance-heavy options like DXR ray-tracing.

If this is true (not sure it is), then why do people bitch so much at consoles for having "fake 4K" when they're using it on their PCs as well... I've always wondered about that...

I have no signature

