
AMD Radeon RX 680 Navi 10 GPU With GDDR6 Coming 2019?

AtlasWraith

It seems that despite Sony taking so many hits from Fortnite fans, they may be helping the PC gaming scene in the long run. AMD is supposedly planning a Navi GPU with GDDR6 for 2019.

 

Quote

Radeon RX 680, according to my sources, will be powered by the Navi GPU architecture and feature 8GB of GDDR6 memory, with performance somewhere between the GTX 1080 and GTX 1080 Ti. We should expect a price in the $299-$399 range, and it will battle the GTX 1080 Ti in higher-resolution games because of its faster GDDR6 RAM.

 

Now do remember, children, SALT SAVES LIVES AND BRAIN CELLS! I am highly curious about these "sources" the article is talking about, but one can never be too sure these days whether leaks are real, so we might as well spread some optimism where we can while keeping that NaCl handy!

 

Source: https://www.tweaktown.com/news/62226/amd-radeon-rx-680-navi-10-gpu-gddr6-2019/index.html

 

An earlier post by @Deedot78 referenced this specific article but did not use it in the post itself:

 

Edited by AtlasWraith
Tagged an earlier post that also references this article.

I just want to ask, what does Sony have to do with AMD possibly releasing new GPUs?

Quote or tag me( @Crunchy Dragon) if you want me to see your reply

If a post solved your problem/answered your question, please consider marking it as "solved"

Community Standards // Join Floatplane!


I think he means people will get pissed off and switch to PC when they find out we got the RX 680, but I highly doubt they would even know the power of it.

http://pcpartpicker.com/list/Mf3Zcc My build

 

R.I.P Donny- Got banned. We will always remember your spamming of "Cancerbooks"

 

iPhones are like 1-ply toilet paper with a logo slapped on them and years-old hardware in them - A Wise Man


2 minutes ago, Crunchy Dragon said:

I just want to ask, what does Sony have to do with AMD possibly releasing new GPUs?

Long story short, AMD is the one making the graphics component for the PS5 (rumored to release in 2020) and that historically means AMD will release a dGPU for PC using that same chip about a year or so before the console.


Just now, AtlasWraith said:

Long story short, AMD is the one making the graphics component for the PS5 (rumored to release in 2020) and that historically means AMD will release a dGPU for PC using that same chip about a year or so before the console.

AMD's been making GPUs for PlayStation and Xbox since the PS3 and Xbox 360.

 

Also, they use different GPUs than they do on PC. That's why there are rough equivalents you can compare console GPUs to in the PC market, but the comparison won't be exact since it's not the same GPU.



It remains to be seen whether the rumors are more "poor Volta" or more "40% IPC improvement".

4 minutes ago, Crunchy Dragon said:

I just want to ask, what does Sony have to do with AMD possibly releasing new GPUs?

One of the other rumors going around is that the reason Navi was pushed back and is no longer GCN-based is because Sony (and presumably Microsoft as well) demanded custom 7nm Vega chips for the PS5.


5 minutes ago, Crunchy Dragon said:

Also, they use different GPUs than they do on PC.

You are technically correct, yes. However, it is my understanding that the name of the graphics on the console is the main difference. The underlying chip is still the same, just "optimized" for the console lol


8 minutes ago, AtlasWraith said:

You are technically correct, yes. However, it is to my understanding that the name of the graphics on the console is the main difference. The underlying chip is still the same, just "optimized" for the console lol

The chip is often different from the PC version. That doesn't mean it's a separate effort as such; it uses some of the same designs. That's the nature of semi-custom: things get changed or mixed in the design process. For example, the Xbox One S uses the same GPU as the original Xbox One but with a die shrink, a slight overclock, and a new display controller.


51 minutes ago, Crunchy Dragon said:

I just want to ask, what does Sony have to do with AMD possibly releasing new GPUs?

Sony asks AMD to make semi-custom GPUs.

 

In this case, Sony wants AMD to make a very powerful GPU for cheap and the only way to do that going forward is to use multiple dies instead of a single monolithic die.

 

AMD can use this approach with upcoming PC GPUs.

40 minutes ago, AtlasWraith said:

You are technically correct, yes. However, it is to my understanding that the name of the graphics on the console is the main difference. The underlying chip is still the same, just "optimized" for the console lol

Nope.

 

The GPU in the Xbox One X isn't available elsewhere. It is a semi-custom solution and is what Microsoft asked for.

47 minutes ago, AtlasWraith said:

Long story short, AMD is the one making the graphics component for the PS5 (rumored to release in 2020) and that historically means AMD will release a dGPU for PC using that same chip about a year or so before the console.

Again, no. Same architecture? Most likely yes. Same chip? No way. Not happening.

Judge a product on its own merits AND the company that made it.

How to setup MSI Afterburner OSD | How to make your AMD Radeon GPU more efficient with Radeon Chill | (Probably) Why LMG Merch shipping to the EU is expensive

Oneplus 6 (Early 2023 to present) | HP Envy 15" x360 R7 5700U (Mid 2021 to present) | Steam Deck (Late 2022 to present)

 

Mid 2023 AlTech Desktop Refresh - AMD R7 5800X (Mid 2023), XFX Radeon RX 6700XT MBA (Mid 2021), MSI X370 Gaming Pro Carbon (Early 2018), 32GB DDR4-3200 (16GB x2) (Mid 2022)

Noctua NH-D15 (Early 2021), Corsair MP510 1.92TB NVMe SSD (Mid 2020), beQuiet Pure Wings 2 140mm x2 & 120mm x1 (Mid 2023),


Frankly speaking, 8GB of VRAM is more than is needed for gaming on the RX 480/580. Some may argue 4GB is not enough, but the truth is it's sufficient at 2560x1440, which is about as high as you'd want to go with this GPU, and at 1080p it's plenty.

 

Why doesn't AMD make it 6GB instead? We've seen from the GTX 1060 that it's the right amount for both resolutions, and their processing chip doesn't need the extra 2GB. With memory pricing as insane as it is right now, and the belief that GDDR6 will be even more expensive, why not quit using VRAM amount as marketing and just put the right amount on it?

 

This also bothered me with the Vega 56: HBM2 makes no sense for that level of processing power. If they had used GDDR5 from the start, performance (especially in mainstream use like gaming) would have been much the same, but we'd have better availability of the card.

Personal Desktop:

CPU: Intel Core i7 10700K @5ghz |~| Cooling: bq! Dark Rock Pro 4 |~| MOBO: Gigabyte Z490UD ATX|~| RAM: 16gb DDR4 3333mhzCL16 G.Skill Trident Z |~| GPU: RX 6900XT Sapphire Nitro+ |~| PSU: Corsair TX650M 80Plus Gold |~| Boot:  SSD WD Green M.2 2280 240GB |~| Storage: 1x3TB HDD 7200rpm Seagate Barracuda + SanDisk Ultra 3D 1TB |~| Case: Fractal Design Meshify C Mini |~| Display: Toshiba UL7A 4K/60hz |~| OS: Windows 10 Pro.

Luna, the temporary Desktop:

CPU: AMD R9 7950XT  |~| Cooling: bq! Dark Rock 4 Pro |~| MOBO: Gigabyte Aorus Master |~| RAM: 32G Kingston HyperX |~| GPU: AMD Radeon RX 7900XTX (Reference) |~| PSU: Corsair HX1000 80+ Platinum |~| Windows Boot Drive: 2x 512GB (1TB total) Plextor SATA SSD (RAID0 volume) |~| Linux Boot Drive: 500GB Kingston A2000 |~| Storage: 4TB WD Black HDD |~| Case: Cooler Master Silencio S600 |~| Display 1 (leftmost): Eizo (unknown model) 1920x1080 IPS @ 60Hz|~| Display 2 (center): BenQ ZOWIE XL2540 1920x1080 TN @ 240Hz |~| Display 3 (rightmost): Wacom Cintiq Pro 24 3840x2160 IPS @ 60Hz 10-bit |~| OS: Windows 10 Pro (games / art) + Linux (distro: NixOS; programming and daily driver)

5 minutes ago, Princess Cadence said:

Frankly speaking, 8GB of VRAM is more than is needed for gaming on the RX 480/580. Some may argue 4GB is not enough, but the truth is it's sufficient at 2560x1440, which is about as high as you'd want to go with this GPU, and at 1080p it's plenty.

 

Why doesn't AMD make it 6GB instead? We've seen from the GTX 1060 that it's the right amount for both resolutions, and their processing chip doesn't need the extra 2GB. With memory pricing as insane as it is right now, and the belief that GDDR6 will be even more expensive, why not quit using VRAM amount as marketing and just put the right amount on it?

 

This also bothered me with the Vega 56: HBM2 makes no sense for that level of processing power. If they had used GDDR5 from the start, performance (especially in mainstream use like gaming) would have been much the same, but we'd have better availability of the card.

I see almost 7GB of VRAM usage at 1440p in PUBG, for example... BUT... I don't know whether that's total allocated VRAM or whether it actually needs that much.

 

Vega 56 with GDDR5 would have at least a 50W higher TDP than with HBM2, though.


1 minute ago, Princess Cadence said:

Frankly speaking, 8GB of VRAM is more than is needed for gaming on the RX 480/580. Some may argue 4GB is not enough, but the truth is it's sufficient at 2560x1440, which is about as high as you'd want to go with this GPU, and at 1080p it's plenty.

 

Why doesn't AMD make it 6GB instead? We've seen from the GTX 1060 that it's the right amount for both resolutions, and their processing chip doesn't need the extra 2GB. With memory pricing as insane as it is right now, and the belief that GDDR6 will be even more expensive, why not quit using VRAM amount as marketing and just put the right amount on it?

Because making a 6GB graphics card means using either a 192-bit or a 384-bit memory bus, and right now neither is optimal for AMD.

 

AMD graphics cards benefit from lots of memory bandwidth. Anything less than 256-bit would probably hurt the performance of the alleged RX 680.

 

AMD has stuck with 256-bit for the mid range since GCN 3rd gen came out in 2014 with the Tonga graphics chip. In their own slides at one point they said (if I remember right) that moving from 384-bit to 256-bit helped reduce power consumption (because it did help a lot, actually), with their then-new colour compression compensating for the loss of memory bandwidth in the transition.
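The bus-width/capacity coupling is easy to sketch in code (rough numbers, not AMD-official): each GDDR5/GDDR6 chip exposes a 32-bit interface, so the chip count, and with it the capacity, falls straight out of the bus width.

```python
# Sketch: why VRAM capacity and bus width are coupled on GDDR cards.
# Each GDDR5/GDDR6 chip has a 32-bit interface, so
# chip count = bus width / 32, and capacity = chips * per-chip density.

CHIP_INTERFACE_BITS = 32

def gddr_config(bus_width_bits, chip_density_gb, data_rate_gbps):
    chips = bus_width_bits // CHIP_INTERFACE_BITS
    capacity_gb = chips * chip_density_gb
    # Bandwidth in GB/s: bus width (bits) * per-pin data rate (Gb/s) / 8
    bandwidth_gbs = bus_width_bits * data_rate_gbps / 8
    return chips, capacity_gb, bandwidth_gbs

# 256-bit bus with 1GB (8Gbit) chips -> 8 chips, 8GB, 448 GB/s at 14Gbps GDDR6
print(gddr_config(256, 1, 14))  # (8, 8, 448.0)

# 192-bit bus with 1GB chips -> 6 chips, 6GB, 192 GB/s at 8Gbps GDDR5
# (this is the GTX 1060 configuration)
print(gddr_config(192, 1, 8))   # (6, 6, 192.0)
```

So a "6GB card" with 1GB chips forces a 192-bit bus (or 12 chips on 384-bit), which is exactly the bandwidth trade-off described above.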

1 minute ago, Princess Cadence said:

This also bothered me with the Vega 56: HBM2 makes no sense for that level of processing power. If they had used GDDR5 from the start, performance (especially in mainstream use like gaming) would have been much the same, but we'd have better availability of the card.

Well, to be honest, Vega 56 and Vega 64 would probably not be marketable products people would actually buy at all if they came with GDDR5.

 

It would probably have a 70-100W higher TDP because of the GDDR5. They used HBM2 as a power-saving measure, not so much for performance.



3 minutes ago, WereCat said:

I see almost 7GB of VRAM usage at 1440p in PUBG for example... BUT... I don't know whether that's total allocated VRAM or if it actually needs that much.

 

Vega56 with GDDR5 would have at least 50W TDP more than with HBM2 though.

lol. 50W extra? Nah. More like 70-100W extra. They'd need at least a 384-bit memory bus if they went GDDR5. Possibly even 512-bit, which we know from the R9 290X days was very inefficient to do.



1 minute ago, AluminiumTech said:

Oh no lol. 50w extra? Nah. More like 70-100w extra. They'd need at least a 384 Bit memory bus for it if they went GDDR5. Possibly even 512 Bit which we know from the R9 290X days was very inefficient to do.

Well, I said "at least", which implies it's most likely more than the number I said, but yeah, you're right.

Just looking at the power consumption on my 1080 Ti stock vs. VRAM OC gives me a rough idea... but that's GDDR5X.


6 minutes ago, ravenshrike said:

It remains to be seen whether the rumors are more "poor Volta" or more "40% IPC improvement".

One of the other rumors going around is that the reason that Navi was pushed back and is no longer GCN based is because Sony(and presumably Microsoft as well) demanded custom 7nm Vega chips for the PS5.

Both turned out to be false :P but let's hope it'll be better!

Wasn't it instead because Sony asked for Navi? That forced AMD to give them Navi as a post-GCN arch, because Sony doesn't want a GCN one for obvious reasons.


16 minutes ago, hobobobo said:

And then there is this thing https://www.pcgamesn.com/amd-navi-monolithic-gpu-design?tw=PCGN1 which claims Navi won't be MCM and won't be big.

 

I think it might deserve a thread of itself but I'm too lazy.

Doesn't inherently mean Navi won't be MCM, just that console Navi (assuming Sony does want Navi) and consumer Navi won't. That might be why their consumer card is rumored to be limited to GTX 1080 performance levels, since they're not tossing multiple chips on there. Of course, that means if they do want to go MCM in the gaming sphere, they'll have to invest quite a lot into working with engine makers and developers on mGPU coding. The interesting thing is that their relationship with Microsoft and Sony might make that significantly easier in the long run, if they can get them to cooperate on either enhanced 9th-gen consoles or 10th gen.


AMD is the child in the parents' divorce lol


3 minutes ago, ravenshrike said:

Doesn't inherently mean Navi won't be MCM, just that console Navi (assuming Sony does want Navi) and consumer Navi won't. That might be why their consumer card is rumored to be limited to GTX 1080 performance levels, since they're not tossing multiple chips on there. Of course, that means if they do want to go MCM in the gaming sphere, they'll have to invest quite a lot into working with engine makers and developers on mGPU coding. The interesting thing is that their relationship with Microsoft and Sony might make that significantly easier in the long run, if they can get them to cooperate on either enhanced 9th-gen consoles or 10th gen.

They practically have no choice. If they don't go MCM then there's physically no way they can deliver GTX 1080 performance at that price point.

 

They have to use MCM because monolithic GCN GPUs are limited to 64 CUs max; you physically can't make a traditional single-die GCN GPU with more than that. And AMD's GPU for Sony is rumored to have around 70-80 CUs, which is simply not achievable without MCM.
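The arithmetic behind that claim is simple (a sketch using the commonly cited GCN figures, not insider numbers):

```python
# GCN's scaling ceiling: at most 64 CUs per die, each CU containing
# 64 stream processors, for a 4096 SP maximum on a monolithic chip.
# A rumored 70-80 CU part therefore needs more than one die.

GCN_MAX_CUS_PER_DIE = 64
SPS_PER_CU = 64

def dies_needed(target_cus):
    # Ceiling division: minimum number of GCN dies an MCM package
    # would need to reach the target CU count
    return -(-target_cus // GCN_MAX_CUS_PER_DIE)

print(GCN_MAX_CUS_PER_DIE * SPS_PER_CU)  # 4096 SP monolithic ceiling
print(dies_needed(64))                   # 1 die: fits the limit exactly
print(dies_needed(80))                   # 2 dies: exceeds the limit
```

So if the 70-80 CU rumor is true and the chip is still GCN-derived, a two-die package is the only way the numbers work out.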



1 minute ago, AluminiumTech said:

They practically have no choice. If they don't go MCM then there's physically no way they can deliver GTX 1080 performance at that price point.

 

They have to use MCM because monolithic GCN GPUs are limited to 64 CUs max; you physically can't make a traditional single-die GCN GPU with more than that. And AMD's GPU for Sony is rumored to have around 70-80 CUs, which is simply not achievable without MCM.

Navi is no longer going to be based on GCN, it's their new architecture.


Just now, ravenshrike said:

Navi is no longer going to be based on GCN, it's their new architecture.

Source?

 

I'm inclined to say that is factually incorrect.

 

Quote

Navi will be the last GCN-based architecture, will be succeeded by brand new macro-architecture in 2020/2021 timeframe

https://wccftech.com/amd-new-major-gpu-architecture-to-succeed-gcn-by-2020-2021/



6 minutes ago, AluminiumTech said:

Source?

 

I'm inclined to say that is factually incorrect.

 

https://wccftech.com/amd-new-major-gpu-architecture-to-succeed-gcn-by-2020-2021/

I bring you the latest from WCCF

https://wccftech.com/exclusive-amd-navi-gpu-roadmap-cost-zen/

 

Quote
  • Vega 7nm will not be coming to gamers.
  • Navi 10 will be the first Navi part to arrive and will be landing sometime in 2H 2019 or early 2020, depending on a couple of factors. The performance level of this part will be equivalent to Vega and it will be a small GPU based on 7nm.
  • Navi 14 will follow Navi 10 soon after.
  • Navi 20 is going to be the true high-end GPU built on the 7nm node and as things stand right now, you are tentatively looking at it landing sometime around 2020 – 2021.
  • Navi will also be the first architecture to transition away from GCN (and along with it, the 4096 SP / 64 CU limit that is inherent to the uArch implementation).
  • ‘Next-Gen’ architecture is the uArch formerly codenamed KUMA internally before AMD decided it didn’t like that name too much (oops) and will be based on the same brand new major architecture that AMD rolls out with Navi.

 
 

 

Now, how accurate this is remains to be seen, but if true it's certainly interesting for what the future holds.

 

 

 

God damned stupid linus quote system. That last part shouldn't be in quotes.


5 minutes ago, ravenshrike said:

Well, I think we should wait for official news before we rush to judgement.


