
AMD announces RDNA2 and Zen3 dates

GDRRiley
2 hours ago, Mateyyy said:

That's not their fault though, is it? Some people get overly excited over even the smallest leak/rumour. Don't be those people.

AMD's fanbase consists of these people. Thus, any attempt by AMD to battle Nvidia on the mindshare front gets overblown into unrealistic expectations. The product launches, fails to meet said expectations, the hype train runs straight off a cliff, people get disappointed, and Nvidia wins in the end. Rinse and repeat next release cycle.

Main Rig :

Ryzen 7 2700X | Powercolor Red Devil RX 580 8 GB | Gigabyte AB350M Gaming 3 | 16 GB TeamGroup Elite 2400MHz | Samsung 750 EVO 240 GB | HGST 7200 RPM 1 TB | Seasonic M12II EVO | CoolerMaster Q300L | Dell U2518D | Dell P2217H | 

 

Laptop :

Thinkpad X230 | i5 3320M | 8 GB DDR3 | V-Gen 128 GB SSD |


1 hour ago, porina said:

AMD sharpening is NOT a direct replacement for DLSS. If you want basic sharpening, Nvidia also offers that as a separate setting.

And NVIDIA's sharpening is pure garbage. I've tried a bunch of different settings and it always creates this sparkling, oversharpened artefacting. It's horrible. Whereas CAS based on AMD's design (via ReShade, since I have an NVIDIA card) is very subtle and creates an amazingly pleasant effect. No sparkling, aliasing or artefacting, just nicely sharper surfaces. Can't imagine playing anything without it anymore.
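For anyone curious why CAS stays so subtle: the algorithm adapts its sharpening strength to local contrast, backing off wherever the neighbourhood is already near black or white, which is what avoids halos and sparkle. A rough grayscale sketch of that idea in Python (an illustration of the published CAS approach, not AMD's actual FidelityFX shader code; the `sharpness` parameter, cross-shaped tap pattern, and constants are simplified):

```python
import numpy as np

def cas_sharpen(img, sharpness=0.0):
    """Simplified grayscale sketch of contrast-adaptive sharpening.

    img: 2D float array in [0, 1]; sharpness: 0.0 (subtle) .. 1.0 (max).
    Illustrative only -- not AMD's FidelityFX source.
    """
    # Tunable peak negative lobe, as in typical CAS ports.
    peak = -1.0 / (8.0 - 3.0 * sharpness)
    out = img.copy()
    h, w = img.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Cross-shaped neighbourhood around the centre pixel.
            n, s = img[y - 1, x], img[y + 1, x]
            wst, e, c = img[y, x - 1], img[y, x + 1], img[y, x]
            mn = min(n, s, wst, e, c)
            mx = max(n, s, wst, e, c)
            # Adaptive amount: less sharpening where local contrast is
            # already high (mx near 1 or mn near 0).
            amp = np.sqrt(max(0.0, min(mn, 1.0 - mx) / max(mx, 1e-6)))
            wgt = amp * peak
            # Weighted average of the ring (negative weights) and centre.
            val = ((n + s + wst + e) * wgt + c) / (1.0 + 4.0 * wgt)
            out[y, x] = np.clip(val, 0.0, 1.0)
    return out
```

In a flat region `amp` drops to zero and the pixel passes through untouched, which is the "subtle" behaviour being described above.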


This particular launch combination has me in a pickle. I normally would have jumped on the high end, i.e. the 3090, but since it is going to be bottlenecked by pretty much any CPU I toss at it unless I'm playing at 8K... this might be the first generation where I only go up to the 3080.

 

Then we have the CPU side of things. I keep putting off a CPU upgrade and still have an 8700K at 5.2 GHz (atm). With AMD saying we will see big IPC increases this generation as well as a decent boost in clocks, I am really hoping this closes the gap with, or even surpasses, Intel's 10xxx series in single-thread performance. Ideally I want to see that gap drop to 5% or less. If the gap doesn't drop, then I am probably better served riding out the 2080 Ti for another year. I will for sure be upgrading my CPU this generation.
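As a back-of-the-envelope model, single-thread performance scales roughly as IPC × clock, which is why both claims matter together. A quick sketch with made-up numbers (the IPC ratios and clocks below are illustrative assumptions, not benchmark results):

```python
def single_thread_score(ipc_relative, clock_ghz):
    """Rough model: single-thread perf ~ IPC x clock frequency."""
    return ipc_relative * clock_ghz

# Illustrative assumptions only -- not measured values.
intel_10900k = single_thread_score(ipc_relative=1.00, clock_ghz=5.3)
zen2_3950x   = single_thread_score(ipc_relative=0.95, clock_ghz=4.7)

# Suppose Zen 3 brings ~15% IPC over Zen 2 and ~200 MHz more clock.
zen3_guess   = single_thread_score(ipc_relative=0.95 * 1.15, clock_ghz=4.9)

gap = (intel_10900k - zen3_guess) / intel_10900k
print(f"hypothetical single-thread gap vs 10900K: {gap:+.1%}")
```

Under those (hypothetical) numbers the gap lands comfortably inside the 5% window; the point is just that a double-digit IPC gain plus a small clock bump compounds multiplicatively.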

 

Now, if the benchmarks come out and show big performance gains from the 24 GB of memory, or for high-refresh-rate 4K monitors, then I'm not opposed to jumping on the 3090 bandwagon. But as it stands I don't play at 8K and I don't know anyone who does, so it just doesn't feel worth the investment this generation, outside of something on the productivity side.


2 hours ago, spartaman64 said:

Well, in many ways it's better than the current DLSS. I haven't seen DLSS 3.0, so I don't know about that.

It was better than DLSS 1.0 (Battlefield, Metro and some others), because DLSS 1.0 was worse than a simple upscale; the sharpening was a nice bonus that took away some of the blurry look of a traditional upscale. DLSS 2.0 (Control, Death Stranding, Wolfenstein: Youngblood) is straight-up better than the native image in some cases, which is literally impossible with traditional upscaling methods, and it often looks really good even when it isn't as good as native. Naturally it does have issues, like trails behind some moving objects in Death Stranding, visual artifacts in certain situations, and the lack of support in most games, but for the most part DLSS is currently the best option for upscaling in games that support it.


On 9/9/2020 at 11:42 AM, SolarNova said:

Frank Azor teased that there was going to be something announced.

 

Turns out it was a teaser for an announcement that there will be an announcement on 28th Oct for RDNA 2.

 

AMD loves to disappoint it seems.

 

I guess all those comments about AMD's marketing department being terrible are true, lol.

They're trying to build hype, I guess.

"If a Lobster is a fish because it moves by jumping, then a kangaroo is a bird" - Admiral Paulo de Castro Moreira da Silva

"There is nothing more difficult than fixing something that isn't all the way broken yet." - Author Unknown

Spoiler

Intel Core i7-3960X @ 4.6 GHz - Asus P9X79WS/IPMI - 12GB DDR3-1600 quad-channel - EVGA GTX 1080ti SC - Fractal Design Define R5 - 500GB Crucial MX200 - NH-D15 - Logitech G710+ - Mionix Naos 7000 - Sennheiser PC350 w/Topping VX-1


15 hours ago, RejZoR said:

And NVIDIA's sharpening is pure garbage. I've tried a bunch of different settings and it always creates this sparkling, oversharpened artefacting. It's horrible. Whereas CAS based on AMD's design (via ReShade, since I have an NVIDIA card) is very subtle and creates an amazingly pleasant effect. No sparkling, aliasing or artefacting, just nicely sharper surfaces. Can't imagine playing anything without it anymore.

Really? Nvidia's sharpening has actually worked fine for me, at least if you're conservative with it. It looks miles better than any other sharpening filter I've used before, in ReShade or elsewhere.

This actually got me curious to try it out.

 

6 hours ago, valdyrgramr said:

They had them, but they weren't allowed to post a video like he did. In order to do such a video you need to lack journalistic integrity. GN, HC, LTT, and several others don't get to review/show/whatever early like DF was able to, because DF bends over for Jensen in order to do early videos. Then says you can still trust them because of his own word. That's

I don't get why/how you're so sure of that, but alright.

Desktop: Intel Core i9-9900K | ASUS Strix Z390-F | G.Skill Trident Z Neo 2x16GB 3200MHz CL14 | EVGA GeForce RTX 2070 SUPER XC Ultra | Corsair RM650x | Fractal Design Define R6

Laptop: 2018 Apple MacBook Pro 13"  --  i5-8259U | 8GB LPDDR3 | 512GB NVMe

Peripherals: Leopold FC660C w/ Topre Silent 45g | Logitech MX Master 3 & Razer Basilisk X HyperSpeed | HIFIMAN HE400se & iFi ZEN DAC | Audio-Technica AT2020USB+

Display: Gigabyte G34WQC


16 hours ago, AngryBeaver said:

This particular launch combination has me in a pickle. I normally would have jumped on the high end, i.e. the 3090, but since it is going to be bottlenecked by pretty much any CPU I toss at it unless I'm playing at 8K... this might be the first generation where I only go up to the 3080.

 

CPU load is framerate-dependent; resolution mostly doesn't matter. It only correlates because framerates go nuts at low resolution and then the CPU maxes out. Unless you've seen something showing that 4K144, which a 3090 has a shot at driving, somehow needs far more CPU than 2K144.

Workstation:  13700k @ 5.5Ghz || Gigabyte Z790 Ultra || MSI Gaming Trio 4090 Shunt || TeamGroup DDR5-7800 @ 7000 || Corsair AX1500i@240V || whole-house loop.

LANRig/GuestGamingBox: 9900nonK || Gigabyte Z390 Master || ASUS TUF 3090 650W shunt || Corsair SF600 || CPU+GPU watercooled 280 rad pull only || whole-house loop.

Server Router (Untangle): 13600k @ Stock || ASRock Z690 ITX || All 10Gbe || 2x8GB 3200 || PicoPSU 150W 24pin + AX1200i on CPU|| whole-house loop

Server Compute/Storage: 10850K @ 5.1Ghz || Gigabyte Z490 Ultra || EVGA FTW3 3090 1000W || LSI 9280i-24 port || 4TB Samsung 860 Evo, 5x10TB Seagate Enterprise Raid 6, 4x8TB Seagate Archive Backup ||  whole-house loop.

Laptop: HP Elitebook 840 G8 (Intel 1185G7) + 3080Ti Thunderbolt Dock, Razer Blade Stealth 13" 2017 (Intel 8550U)


49 minutes ago, Mateyyy said:

Really? Nvidia's sharpening has actually worked fine for me, at least if you're conservative with it. It looks miles better than any other sharpening filter I've used before, in ReShade or elsewhere.

This actually got me curious to try it out.

 

I don't get why/how you're so sure of that, but alright.

It just doesn't. Even the Adaptive Sharpen or Smart Sharpen filters in ReShade work better, and don't get me started on the amazing CAS shader (based on AMD's implementation). And I'm talking at really high levels, whereas NVIDIA's is all sparkly, aliased and artefacty even at 25%.


15 minutes ago, RejZoR said:

It just doesn't. Even the Adaptive Sharpen or Smart Sharpen filters in ReShade work better, and don't get me started on the amazing CAS shader (based on AMD's implementation). And I'm talking at really high levels, whereas NVIDIA's is all sparkly, aliased and artefacty even at 25%.

I use it in Modern Warfare at about 45% strength and it makes the whole image quite a bit clearer while remaining natural. Odd.

I'll give CAS a shot.

Desktop: Intel Core i9-9900K | ASUS Strix Z390-F | G.Skill Trident Z Neo 2x16GB 3200MHz CL14 | EVGA GeForce RTX 2070 SUPER XC Ultra | Corsair RM650x | Fractal Design Define R6

Laptop: 2018 Apple MacBook Pro 13"  --  i5-8259U | 8GB LPDDR3 | 512GB NVMe

Peripherals: Leopold FC660C w/ Topre Silent 45g | Logitech MX Master 3 & Razer Basilisk X HyperSpeed | HIFIMAN HE400se & iFi ZEN DAC | Audio-Technica AT2020USB+

Display: Gigabyte G34WQC


4 minutes ago, Mateyyy said:

I use it in Modern Warfare at about 45% strength and it makes the whole image quite a bit clearer while remaining natural. Odd.

I'll give CAS a shot.

CAS is very subtle, but when you switch it on and off in ReShade, you can see what a massive difference it actually makes, and it doesn't cause any artefacts. I actually prefer ReShade because you can tune it in real time, whereas with NVIDIA's own you have to exit the game, adjust, and repeat, and I just couldn't get the desired results that way. I have CAS cranked to max and it looks great.


13 hours ago, AnonymousGuy said:

CPU load is framerate-dependent; resolution mostly doesn't matter. It only correlates because framerates go nuts at low resolution and then the CPU maxes out. Unless you've seen something showing that 4K144, which a 3090 has a shot at driving, somehow needs far more CPU than 2K144.

You are correct. It has to do with draw calls and the fact that they are still very much single-threaded. So you basically have a scale: the higher the resolution, the more load the GPU carries compared to the CPU; as resolution decreases, this flips.
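That flip can be sketched with a trivial min() model: delivered framerate is capped by whichever side is slower, and only the GPU cost scales with resolution. The numbers are made up for illustration, and the assumption of GPU cost linear in pixel count is a simplification:

```python
def delivered_fps(cpu_fps_cap, gpu_fps_at_1080p, resolution_scale):
    """Toy bottleneck model.

    cpu_fps_cap: frames/s the CPU can prepare (draw calls etc.),
                 roughly independent of resolution.
    gpu_fps_at_1080p: GPU throughput at 1080p.
    resolution_scale: pixel count relative to 1080p (4K = 4.0).
    """
    gpu_fps = gpu_fps_at_1080p / resolution_scale
    return min(cpu_fps_cap, gpu_fps)

# Made-up numbers: a fast GPU paired with a CPU that caps out at 160 fps.
for res, scale in [("1080p", 1.0), ("1440p", 1.78), ("4K", 4.0)]:
    fps = delivered_fps(cpu_fps_cap=160, gpu_fps_at_1080p=400,
                        resolution_scale=scale)
    bound = "CPU-bound" if fps == 160 else "GPU-bound"
    print(f"{res}: {fps:.0f} fps ({bound})")
```

With these numbers the system is CPU-bound at 1080p and 1440p and only becomes GPU-bound at 4K, which is the scale being described above.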


3 hours ago, Results45 said:

Pricing rumors arise from "Tier 1" Nvidia partners:

 


Those prices would be great. Let's just hope they don't f up the software side again ;)

PC: 
MSI B450 Gaming Pro Carbon AC (motherboard) | ASRock Radeon RX 6950 XT Phantom Gaming D 16G (GPU)

Ryzen 7 5800X3D (CPU) | LG 32GK650F, 2560x1440 144Hz (monitor)

Arctic Liquid Freezer II 240 A-RGB (CPU cooler) | Seasonic Focus Plus Gold 850W (PSU)

Cooler Master MasterBox MB511 RGB (case) | Kingston Fury Beast 32GB (16x2) DDR4 @ 3600MHz (memory)

Corsair K95 RGB Platinum (keyboard) | Razer Viper Ultimate (mouse)


The 3060 isn’t even out yet, completely ignoring the concept of a 3060 Ti. Nvidia isn’t even selling any 3090/3080/3070 cards yet. This seems early.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


3 hours ago, Bombastinator said:

The 3060 isn’t even out yet, completely ignoring the concept of a 3060 Ti. Nvidia isn’t even selling any 3090/3080/3070 cards yet. This seems early.

Yep. Artificial scarcity on Nvidia's end means the 3060 Ti will likely end up priced at the 3070's MSRP, the 3070 Ti at the 3080's MSRP, and the 3080 at nearly $1000. Things probably won't settle down until after the holidays.

 

I expect the same to happen with AMD cards, but not to the same extent.

