AMD confirms RDNA 2 "Big NAVI" will release for PC before the release of next gen consoles, calls it their "Halo Product"

Master Disaster
1 hour ago, RejZoR said:

The 10th gen is actually not as much of a stinker as we all thought it would be. It's still far from something I'd be super thrilled about, but if you're a gamer, and a gamer primarily, I can see someone being excited. But it's not that same thrill as with AMD, where with every release we're more and more amazed by their products. Intel was like that too. Once. Quite some time in the past...

I have a 9900K and see no reason to upgrade, even though I primarily game on my PC. I play at higher resolutions, so the only possible bottleneck I can have would come from my graphics card (the triforce consists of a 9900K, 2080 Ti and 32GB RAM). Big Navi is again bringing considerably better IPC, so even gamers would be looking forward to it. 10th gen doesn't even hit the tip of the iceberg, in my opinion.


6 hours ago, jasonvp said:

That's the wrong target for AMD to be shooting at, of course.  Pascal is 4+ years old at this point.  In fact, AMD shouldn't be shooting for Turing, either.  They need to guess where NVidia is going to be with the release of Ampere and target a point beyond that.  This is the thing AMD continues to fail at doing with their GPUs, and there's no hint that will change any time soon.

The mid range is much more important than the high end for AMD's bottom line; it's quite silly to compare AMD's $400 cards to $600+ cards. That's not how companies look at the market. What matters is how the products they make will compete with the ones being sold at that time by the competition.

 

There are plenty of hints that they will be targeting the high end as well this generation:

1st - an OpenVR benchmark showing an AMD GPU beating the 2080 Ti by quite a bit

2nd - the die sizes of the new Navi 2x cards being 505mm², 340mm² and 240mm² (source: komachi)

3rd - it's long been rumored that the next card will have 80 CUs, which lines up with the previous two leaks

4th - AMD saying they will have a new top-to-bottom stack, and just now saying Big Navi will be their halo product
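Those leaks are at least self-consistent: the rumored 505mm² die is almost exactly twice Navi 10's area, which lines up with doubling 40 CUs to 80. A quick back-of-envelope check (the ~251mm² / 40 CU Navi 10 figures are my assumption, and die area doesn't scale purely with CU count, so treat this as a sketch, not a prediction):

```python
# Rough sanity check on the rumored Navi 2x die sizes vs Navi 10 (5700 XT).
# Assumes performance-relevant area scales linearly with CU count, which
# ignores fixed-function blocks, memory controllers, etc.
NAVI10_MM2 = 251  # assumed Navi 10 die size
NAVI10_CUS = 40

for die_mm2 in (505, 340, 240):  # leaked Navi 2x die sizes
    scale = die_mm2 / NAVI10_MM2
    est_cus = round(NAVI10_CUS * scale)
    print(f"{die_mm2} mm^2 -> ~{scale:.2f}x Navi 10 -> ~{est_cus} CUs")
```

Naive as it is, the 505mm² die works out to roughly 80 CUs, which is exactly what the long-running rumor says.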

 

5 hours ago, leadeater said:

GPU forced induction?

 


 

I see you are a man of culture as well. "Hell yeah brother"


5 hours ago, jasonvp said:

Your 1080 Ti isn't even remotely "today's high end", either.

 

I don't know if I entirely agree with that. There is only one card that would be considered a genuine upgrade from a 1080 Ti, which is the 2080 Ti... a $1,200 GPU, and the fastest consumer GPU. Yes, a 2080 Super is faster, but only marginally, and not to the point that justifies an upgrade. So how is a 1080 Ti not high end in today's market?

GPU: XFX RX 7900 XTX

CPU: Ryzen 7 7800X3D


6 minutes ago, Orangeator said:

Yes, a 2080 Super is faster, but only marginally, and not to the point that justifies an upgrade. So how is a 1080 Ti not high end in today's market?

Whether the justification is there or not given the price, the simple fact is: the 1080Ti isn't "high end" today.  A single 1080Ti can't do 4K at high refresh; a pair of them will struggle to do so.  It's a 4+ year old card, and isn't the target AMD should be aiming for.  If they do and it is, then they've failed (again).

 

You HAVE to take price out of the equation when it comes to all out performance.  HAVE.  TO.  Yes, there are diminishing returns, but when it comes to high end, that doesn't matter.  It's "performance at any cost" versus "performance per cost".  Hell, even the 2080Ti isn't the top dog, the Titan is.

Editing Rig: Mac Pro 7,1

System Specs: 3.2GHz 16-core Xeon | 96GB ECC DDR4 | AMD Radeon Pro W6800X Duo | Lots of SSD and NVMe storage |

Audio: Universal Audio Apollo Thunderbolt-3 Interface |

Displays: 3 x LG 32UL950-W displays |

 

Gaming Rig: PC

System Specs:  Asus ROG Crosshair X670E Extreme | AMD 7800X3D | 64GB G.Skill Trident Z5 NEO 6000MHz RAM | NVidia 4090 FE card (OC'd) | Corsair AX1500i power supply | CaseLabs Magnum THW10 case (RIP CaseLabs ) |

Audio:  Sound Blaster AE-9 card | Mackie DL32R Mixer | Sennheiser HDV820 amp | Sennheiser HD820 phones | Rode Broadcaster mic |

Display: Asus PG32UQX 4K/144Hz display | BenQ EW3280U display |

Cooling:  2 x EK 140 Revo D5 Pump/Res | EK Quantum Magnitude CPU block | EK 4090FE waterblock | AlphaCool 480mm x 60mm rad | AlphaCool 560mm x 60mm rad | 13 x Noctua 120mm fans | 8 x Noctua 140mm fans | 2 x Aquaero 6XT fan controllers |


50 minutes ago, cj09beira said:

 what matters is how the products they make will compete with the ones being sold at that time by the competition.

That's the fail, right there.  If AMD is going to make product to compete with NVidia's GPUs today, they're going to get stomped on (again) when Ampere is launched.  It's akin to the runner who's pacing himself to come in third place in the race; he's going to get passed by the three guys trying to win.

 

I'm not sure why that's so difficult for folks to understand.  But it is clear that IF AMD does that (again), they're further proving that they can't innovate in the GPU space.  All they can do is play "catch-up" with Jensen and co.



9 minutes ago, Orangeator said:

I don't know if I entirely agree with that. There is only one card that would be considered a genuine upgrade from a 1080 Ti, which is the 2080 Ti... a $1,200 GPU, and the fastest consumer GPU. Yes, a 2080 Super is faster, but only marginally, and not to the point that justifies an upgrade. So how is a 1080 Ti not high end in today's market?

When the latest gen's 750€ card (the cheapest RTX 2080 I could find) can barely rival a 4-year-old card, I'd say the GTX 1080 Ti is pretty damn fast. The RTX part doesn't really set it apart that much, because even today not enough games use it to make a big deal out of it. It's a nice addition, but it's not the end of the world if you don't have RT. The next tier that has a performance lead costs 1050€ (the cheapest I could find in Europe, and it has a crappy blower-style cooler). And even at that point, the performance gap is nowhere near the one the GTX 1080 Ti had over the regular GTX 1080 back in the day, and nowhere near a gap that would justify the investment.

 

In fact, the GTX 1080 Ti is still so fast that I'm questioning an upgrade to the RX 6000 series or RTX 3000 series at all. Both would have to offer something really radical that I can use in basically all games today, without any special hassle, for me to consider them. Like ReShade-style RTGI functionality that would work with almost any game; then my curiosity would probably open up the wallet. A small general performance bump and faster RT just won't cut it. It's too big an investment for nearly zero benefit.


19 minutes ago, jasonvp said:

That's the fail, right there.  If AMD is going to make product to compete with NVidia's GPUs today, they're going to get stomped on (again) when Ampere is launched.  It's akin to the runner who's pacing himself to come in third place in the race; he's going to get passed by the three guys trying to win.

 

I'm not sure why that's so difficult for folks to understand.  But it is clear that IF AMD does that (again), they're further proving that they can't innovate in the GPU space.  All they can do is play "catch-up" with Jensen and co.

Read that again, because that's not what I said:

Quote

 what matters is how the products they make will compete with the ones being sold at that time by the competition

If it's not clear enough: "that time" is obviously not just the launch date.

 

 

And even if AMD decided not to go for the high end, how does that mean AMD is not innovating? Just as an example, Navi increased IPC so much that a 40 CU card handily trades blows with their previous 64 CU card on the same node, one that had more than twice the bandwidth.
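The per-CU uplift implied by that comparison is easy to put a number on: if 40 RDNA CUs trade blows with 64 GCN CUs, each CU is doing roughly 60% more work. This ignores clock-speed differences (clock-adjusted figures are usually quoted nearer 50%), so it's an upper-bound sketch under an equal-performance assumption:

```python
# Implied per-CU uplift if a 40 CU Navi card matches a 64 CU GCN card.
gcn_cus = 64   # Vega 64 / Radeon VII class
rdna_cus = 40  # Navi 10 (5700 XT)

# At roughly equal overall performance, per-CU throughput
# scales inversely with CU count (clocks ignored).
per_cu_uplift = gcn_cus / rdna_cus - 1
print(f"~{per_cu_uplift:.0%} more performance per CU")  # ~60%
```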

 


I agree with the premise that a new GPU should never be "on par" with currently available GPUs, but straight out better.

Turing is nearing two years old now. If AMD can only match it in September, NVidia will just laugh and launch a slight upgrade/refresh, or just lower prices for the luls.

 

That's not gonna be enough.

AMD has to push NVidia, not trail behind all the time. For competition, we need actual progress, not the same product we already have with a red sticker on it, two years later.


32 minutes ago, cj09beira said:

And even if AMD decided not to go for the high end, how does that mean AMD is not innovating? Just as an example, Navi increased IPC so much that a 40 CU card handily trades blows with their previous 64 CU card on the same node, one that had more than twice the bandwidth.

Well, if the same GPU is still inferior to the competition, we can call it innovation, but not enough innovation.

 

If my last software was a calculator that could add and subtract, and my next calculator can multiply and divide... that is innovation for ME. But no one would care if other people sell calculators that can do square roots, brackets, exponentials, etc.


17 minutes ago, Tech Enthusiast said:

Well, if the same GPU is still inferior to the competition, we can call it innovation, but not enough innovation.

 

If my last software was a calculator that could add and subtract, and my next calculator can multiply and divide... that is innovation for ME. But no one would care if other people sell calculators that can do square roots, brackets, exponentials, etc.

Except the 5700/XT competes just fine.


That is one GPU that competes in one bracket out of all the possible price brackets.

It is a start, don't get me wrong. The 5700 came as a pleasant surprise!

 

Yet, I don't think that is even remotely enough, unless you only shop in exactly that price and performance bracket.

So you might be happy about it. I am not. Neither the 5700 nor the 5700 XT is in my preferred performance bracket, so I got nothing out of the good start myself. 😞


I almost bought a 5700 last week. I'm glad I didn't. I'll just wait and see if my Vega 56 should be upgraded this summer.


The best news to come from the Big Navi and Ampere leaks and hype is that there may well be legitimate competition for the first time in many, many years.

That of course results in both better-performing cards and better prices. The latter is very important nowadays, with Nvidia having royally screwed over consumers with the 20 series and their tier shuffle, putting the X80 Ti up at Titan price levels.

One can hope this undoes that, but I for one fully expect Nvidia to try their damnedest to keep their flagship consumer gaming card above £/$1000. This is where we need to hope AMD can compete and push that price back down to the £/$700 mark, or better yet below it.

 

One can hope... but one doesn't "expect". :(

 

Regardless, I will be upgrading; I need HDMI 2.1 for 4K 120 on an LG 48" OLED.

Meanwhile, I'll still be rocking a Sandy Bridge 3930K CPU :) Still fully capable.

CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w Corsair RM 750w Gold (2021)|

VDU: Panasonic 42" Plasma | GPU: Gigabyte 1080ti Gaming OC & Barrow Block (RIP)...GTX 980ti | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + WD Blue 1TB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P | NF-A12x25 fans |


I am not sure you will get a GPU that can actually push 4K@120 with the next gen, though.

Sadly my 2080 Ti even drops as low as 70fps in some games at 1440p, let alone 4K. Sure, I could turn down some settings, but damn... I'd rather have the eye candy, haha.


3 hours ago, jasonvp said:

It's a 4+ year old card, and isn't the target AMD should be aiming for.  If they do and it is, then they've failed (again).

Yes, I 100% agree with you on this. Perfectly said.

Quote

Whether the justification is there or not given the price, the simple fact is: the 1080Ti isn't "high end" today.  A single 1080Ti can't do 4K at high refresh; a pair of them will struggle to do so. You HAVE to take price out of the equation when it comes to all out performance.  HAVE.  TO.  Yes, there are diminishing returns, but when it comes to high end, that doesn't matter.  It's "performance at any cost" versus "performance per cost".  Hell, even the 2080Ti isn't the top dog, the Titan is.

I believe the issue here is misconstruing the definition of "high end". I am using the term on a spectrum: you've got the top cards, the middle-of-the-road cards, and the entry-level cards, with many cards fitting into each category. Yes, the 1080 Ti is NOT the highest-end graphics card. Neither is the 2080 Super. However, of all the graphics cards on the market, they are absolutely in the "high end" category, as there are only one or maybe two cards above them (depending on whether you count the Titan as consumer).



13 minutes ago, Tech Enthusiast said:

I am not sure you will get a GPU that can actually push 4K@120 with the next gen, though.

 

 

 

Yeah, not gonna happen. Game developers design their graphics level around the expected hardware. You might get a huge jump in graphics power, but that will just be accompanied by an equal jump in the workload thrown at the GPU. You're going to need to wait until 4K 120fps becomes a low-end monitor before it becomes possible to routinely run at that resolution and framerate.


For those thinking a December launch for the consoles: expect before Black Friday, which means likely early November.

Which means we may get AMD CPUs and GPUs dropping about a month or two apart.

 

I'm happy to see a full stack upgrade. Maybe that will finally mean a replacement for my 580 that gains me VRAM and double or more the compute power, for sub-$400.

Good luck, Have fun, Build PC, and have a last gen console for use once a year. I should answer most of the time between 9 to 3 PST

NightHawk 3.0: R7 5700x @, B550A vision D, H105, 2x32gb Oloy 3600, Sapphire RX 6700XT  Nitro+, Corsair RM750X, 500 gb 850 evo, 2tb rocket and 5tb Toshiba x300, 2x 6TB WD Black W10 all in a 750D airflow.
GF PC: (nighthawk 2.0): R7 2700x, B450m vision D, 4x8gb Geli 2933, Strix GTX970, CX650M RGB, Obsidian 350D

Skunkworks: R5 3500U, 16gb, 500gb Adata XPG 6000 lite, Vega 8. HP probook G455R G6 Ubuntu 20. LTS

Condor (MC server): 6600K, z170m plus, 16gb corsair vengeance LPX, samsung 750 evo, EVGA BR 450.

Spirt  (NAS) ASUS Z9PR-D12, 2x E5 2620V2, 8x4gb, 24 3tb HDD. F80 800gb cache, trueNAS, 2x12disk raid Z3 stripped

PSU Tier List      Motherboard Tier List     SSD Tier List     How to get PC parts cheap    HP probook 445R G6 review

 

"Stupidity is like trying to find a limit of a constant. You are never truly smart in something, just less stupid."

Camera Gear: X-S10, 16-80 F4, 60D, 24-105 F4, 50mm F1.4, Helios44-m, 2 Cos-11D lavs


2 hours ago, Tech Enthusiast said:

That is one GPU that competes in one bracket out of all the possible price brackets.

It is a start, don't get me wrong. The 5700 came as a pleasant surprise!

 

Yet, I don't think that is even remotely enough, unless you only shop in exactly that price and performance bracket.

So you might be happy about it. I am not. Neither the 5700 nor the 5700 XT is in my preferred performance bracket, so I got nothing out of the good start myself. 😞

This time it will be top to bottom, so there should be a card for everyone.

30 minutes ago, GDRRiley said:

For those thinking a December launch for the consoles: expect before Black Friday, which means likely early November.

Which means we may get AMD CPUs and GPUs dropping about a month or two apart.

 

I'm happy to see a full stack upgrade. Maybe that will finally mean a replacement for my 580 that gains me VRAM and double or more the compute power, for sub-$400.

Fingers crossed. I am in the same situation with a 480.


7 minutes ago, cj09beira said:

this time it will be top to bottom so there should be a card for everyone

Fingers crossed for that and for a "less sucky" launch as usual.

Hope the CPU team did a few presentations for the GPU team on how to release stuff that is great and does not under deliver. 🙂


1 minute ago, Tech Enthusiast said:

Fingers crossed for that and for a "less sucky" launch as usual.

Hope the CPU team did a few presentations for the GPU team on how to release stuff that is great and does not under deliver. 🙂

Well, it should be better: it's no longer a whole new architecture with fundamental changes, and they did finally say they won't release cards with blowers. We should be back to just launch drivers not having as much performance, which isn't as bad.


Can't wait for this to follow every other AMD GPU release and perform marginally within previous gen Nvidia's high end cards.

Just remember: Random people on the internet ALWAYS know more than professionals, when someone's lying, AND can predict the future.

i7 9700K (5.2Ghz @1.2V); MSI Z390 Gaming Edge AC; Corsair Vengeance RGB Pro 16GB 3200 CAS 16; H100i RGB Platinum; Samsung 970 Evo 1TB; Samsung 850 Evo 500GB; WD Black 3 TB; Phanteks 350x; Corsair RM19750w.

 

Laptop: Dell XPS 15 4K 9750H GTX 1650 16GB Ram 256GB SSD



Personally I'd be very impressed if AMD manages to compete above the 2080 Ti level by any significant margin. AMD haven't exactly got an amazing springboard to work off of. They had to do a complete architecture shift with their last generation, and unlike with Zen 1, when they've done that they haven't brought anything significant to the table compared to their competition. They're still playing catch-up across the board.


While I do hope for AMD to release a GPU that can properly compete with NVIDIA's latest and greatest on most fronts, it's also best to temper expectations a bit. I've seen cases where people get massively hyped for a new Radeon GPU and then get disappointed when it matches NVIDIA at best.

 

The RDNA1 line has actually proven mostly quite competitive in the midrange: the 5700 XT trades well with the 2070 Super (especially higher-tier AIB cards like the Sapphire Nitro+), and the 5600 XT and 5700 are compelling alternatives to the 2060/2060S for those who don't really need stuff like NVENC or other NVIDIA-specific features, with buggy drivers being their main criticism. However, catching up and being competitive with your main rival again, while noteworthy, is only the first step.

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


22 minutes ago, CarlBar said:

Personally I'd be very impressed if AMD manages to compete above the 2080 Ti level by any significant margin. AMD haven't exactly got an amazing springboard to work off of. They had to do a complete architecture shift with their last generation, and unlike with Zen 1, when they've done that they haven't brought anything significant to the table compared to their competition. They're still playing catch-up across the board.

The difference between GCN and RDNA isn't as pronounced because they made it at 40 CUs, which was also a pretty good spot for GCN (it's what the 290X used, after all). Even still, they were able to increase performance per CU by around 50% (the 5700 XT has roughly the same performance as a Radeon VII). They are pretty well positioned; all they need is to lower power consumption, which by all metrics seems to be exactly what they've done, if the consoles are anything to go by (the GPU in the Xbox should be roughly 40% more power efficient).

With how RDNA is made, Navi 10 is the equivalent of a GCN card with just 2 of the usual 4 shader engines, so scaling should be no issue.

If clocks don't change much, they should be around the 2080 Ti +30% mark.
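That +30% mark follows from the chain of assumptions above, which can be made explicit. Every input here is a rumor or a rough guess of mine (including the baseline that a 5700 XT lands around 70% of a 2080 Ti, and the CU-scaling efficiency), so this illustrates the reasoning rather than predicting anything:

```python
# Rough projection for a hypothetical 80 CU RDNA 2 card vs the 2080 Ti.
# All inputs are assumptions/rumors, not measurements.
relative_5700xt = 0.70  # assumed 5700 XT performance relative to a 2080 Ti
cu_scale = 80 / 40      # rumored doubling of CUs over Navi 10
scaling_eff = 0.93      # assumed (imperfect) scaling with CU count

projection = relative_5700xt * cu_scale * scaling_eff
print(f"~{projection - 1:+.0%} vs 2080 Ti")  # ~+30%
```

Nudge any of those three inputs and the result moves accordingly, which is why leak-based projections like this span such a wide range.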

