
AMD RDNA Doc release

AlexOak

So AMD has released a whitepaper on their RDNA architecture, and it covers the video encoder as well.

 
 
 
 
Quote

Graphics processors (GPUs) built on the RDNA architecture will span from power-efficient notebooks and smartphones to some of the world’s largest supercomputers. To accommodate so many different scenarios, the overall system architecture is designed for extreme scalability while boosting performance over the previous generations. Figure 4 below illustrates the 7nm Radeon RX 5700 XT, which is one of the first incarnations of the RDNA architecture. -pg 6

Quote

In many cases, video encoding and decoding can be performed in software on the RDNA dual compute units for the highest quality. However, the dedicated hardware will always yield the best throughput and power efficiency while also freeing up dual compute units for other tasks. The video engine has been enhanced with support for VP9 decoding, whereas prior generations relied on software implementations.

 

The video engine can decode H.264 streams at high throughput: 1080p at 600 frames/sec (fps) and 4K at 150 fps. It can simultaneously encode at about half the speed: 1080p at 360 fps and 4K at 90 fps. 8K decode is available at 24 fps for both HEVC and VP9. For more advanced compression, the video engine delivers high throughput decoding for either 8-bit or 10-bit color: 360 fps for a 1080p stream, 90 fps for a 4K stream, and 24 fps for an 8K stream. The encoder is designed for HEVC, while the decoder can also handle VP9.

-Pg 19
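A quick back-of-the-envelope check on those figures (my own sketch, not something from the whitepaper): the quoted frame rates scale almost exactly with pixel count, i.e. each codec path sustains a roughly constant pixels-per-second throughput regardless of resolution.

```python
# Sanity check of the page 19 figures: does each mode sustain a roughly
# constant pixel throughput across resolutions?

RESOLUTIONS = {
    "1080p": 1920 * 1080,   # ~2.07 Mpixels
    "4K":    3840 * 2160,   # 4x the pixels of 1080p
    "8K":    7680 * 4320,   # 16x the pixels of 1080p
}

# fps figures quoted on page 19 of the RDNA whitepaper
MODES = {
    "H.264 decode":    {"1080p": 600, "4K": 150},
    "H.264 encode":    {"1080p": 360, "4K": 90},
    "HEVC/VP9 decode": {"1080p": 360, "4K": 90, "8K": 24},
}

for mode, rates in MODES.items():
    for res, fps in rates.items():
        mpix_per_s = RESOLUTIONS[res] * fps / 1e6
        print(f"{mode:>16} @ {res:>5}: {fps:>3} fps = {mpix_per_s:,.0f} Mpixels/s")
```

Running that gives roughly 1,244 Mpixels/s for H.264 decode at both 1080p and 4K, and about 750-800 Mpixels/s for HEVC/VP9 decode at all three resolutions, so the quoted numbers hang together.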

Opinion:

So if they fix the encoder problem, this might turn out to be a good card? You should read the doc, since it covers way more about the new architecture.

 

https://www.amd.com/system/files/documents/rdna-whitepaper.pdf


I may be thinking about it wrong, but I think this will come back to bite them at some point.

Quote

The memory controllers and interfaces for the RDNA architecture are designed to take advantage of GDDR6, the fastest mainstream graphics memory. Each memory controller drives two 32-bit GDDR6 DRAMs with a 16 Gbit/s interface, doubling the available bandwidth of previous generation GDDR5 within roughly the same power budget.
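Rough math behind that claim (my own sketch, not from the whitepaper): two 32-bit GDDR6 devices give each controller a 64-bit interface, and at 16 Gbit/s per pin that works out to 128 GB/s per controller, double what the same width would deliver at typical ~8 Gbit/s GDDR5 speeds.

```python
# Peak bandwidth per RDNA memory controller, from the whitepaper's figures:
# two 32-bit GDDR6 DRAMs per controller at 16 Gbit/s per pin.

def controller_bandwidth_gbs(bus_width_bits: int, gbit_per_pin: float) -> float:
    """Peak bandwidth of one memory controller in GB/s."""
    return bus_width_bits * gbit_per_pin / 8  # bits -> bytes

gddr6 = controller_bandwidth_gbs(2 * 32, 16.0)  # 128 GB/s per controller
gddr5 = controller_bandwidth_gbs(2 * 32, 8.0)   # ~64 GB/s, assuming ~8 Gbit/s GDDR5

print(f"GDDR6: {gddr6:.0f} GB/s per controller, GDDR5: {gddr5:.0f} GB/s, ratio {gddr6 / gddr5:.1f}x")
```

For reference, the retail 5700 XT pairs a 256-bit bus (four controllers) with 14 Gbit/s GDDR6, which comes out to 448 GB/s.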

 

✨FNIGE✨


8 minutes ago, SlimyPython said:

I may be thinking about it wrong, but I think this will come back to bite them at some point.

 

Why would that bite them?

For Sale: Meraki Bundle

 

iPhone Xr 128 GB Product Red - HP Spectre x360 13" (i5 - 8 GB RAM - 256 GB SSD) - HP ZBook 15v G5 15" (i7-8850H - 16 GB RAM - 512 GB SSD - NVIDIA Quadro P600)

 


6 minutes ago, dalekphalm said:

Why would that bite them?

for when GDDR7 comes around

✨FNIGE✨


4 minutes ago, SlimyPython said:

for when GDDR7 comes around

When is GDDR7 being released?

 

I imagine by the time it becomes available for mass production at reasonable costs, they will be able to upgrade the architecture for the new memory spec. 

 

I wouldn't overthink that one quote. 

For Sale: Meraki Bundle

 

iPhone Xr 128 GB Product Red - HP Spectre x360 13" (i5 - 8 GB RAM - 256 GB SSD) - HP ZBook 15v G5 15" (i7-8850H - 16 GB RAM - 512 GB SSD - NVIDIA Quadro P600)

 


2 hours ago, dalekphalm said:

When is GDDR7 being released?

 

I imagine by the time it becomes available for mass production at reasonable costs, they will be able to upgrade the architecture for the new memory spec. 

 

I wouldn't overthink that one quote. 

AMD has also already confirmed that RDNA can be adapted for either GDDR6 or HBM...


Read it last night; over 2x the triangle culling rate of Vega 64, which should help it perform better in games with high geometry counts, usually old GCN's Achilles heel imo.

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


4 hours ago, SlimyPython said:

I may be thinking about it wrong, but I think this will come back to bite them at some point.

 

You do realize everything can evolve with time? Currently GDDR6 is what's available, and it hasn't even been out for a year...


Ditching my Vega 64 for a 5700 XT on Wednesday; my Vega is suffering from serious overheating and the SoC can easily reach over 100°C while gaming. Gonna pick up either the Sapphire Pulse or the PowerColor Red Devil (which is currently out of stock) and put the Vega on eBay.

 

I know it's almost a sidegrade; I've done my research, but tbh I've lost faith in the Vega now, and once I've replaced it I see no need to upgrade anything else in my PC for a few years.

 

I run a high-refresh 1440p FreeSync monitor, so the 5700 XT is perfect.

Main Rig:-

Ryzen 7 3800X | Asus ROG Strix X570-F Gaming | 16GB Team Group Dark Pro 3600Mhz | Corsair MP600 1TB PCIe Gen 4 | Sapphire 5700 XT Pulse | Corsair H115i Platinum | WD Black 1TB | WD Green 4TB | EVGA SuperNOVA G3 650W | Asus TUF GT501 | Samsung C27HG70 1440p 144hz HDR FreeSync 2 | Ubuntu 20.04.2 LTS |

 

Server:-

Intel NUC running Server 2019 + Synology DSM218+ with 2 x 4TB Toshiba NAS Ready HDDs (RAID0)


Quite cool. I wonder how much the 2020 Navi will be improved, not just in performance but also in architectural changes and RT additions.

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lancool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver) | Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


4 hours ago, Master Disaster said:

Ditching my Vega 64 for a 5700 XT on Wednesday; my Vega is suffering from serious overheating and the SoC can easily reach over 100°C while gaming. Gonna pick up either the Sapphire Pulse or the PowerColor Red Devil (which is currently out of stock) and put the Vega on eBay.

 

I know it's almost a sidegrade; I've done my research, but tbh I've lost faith in the Vega now, and once I've replaced it I see no need to upgrade anything else in my PC for a few years.

 

I run a high-refresh 1440p FreeSync monitor, so the 5700 XT is perfect.

Sometimes you have to go this way. I remember when I bought a Sapphire HD 6870 Toxic and it was obnoxiously loud, so I ended up returning it and buying a Sapphire HD 6950 (it looked similar to the Flex Edition; I think it said Silent Efficiency on it). They were about the same in performance (yeah, that Toxic was insanely fast, but also insanely hot and loud), but the HD 6950 was way quieter.


9 hours ago, Humbug said:

AMD has also already confirmed that RDNA can be adapted for either GDDR6 or HBM...

Yeah, and even if they hadn't said it before, I'd be shocked if they were limited to one type of memory controller.

For Sale: Meraki Bundle

 

iPhone Xr 128 GB Product Red - HP Spectre x360 13" (i5 - 8 GB RAM - 256 GB SSD) - HP ZBook 15v G5 15" (i7-8850H - 16 GB RAM - 512 GB SSD - NVIDIA Quadro P600)

 


21 hours ago, SlimyPython said:

for when GDDR7 comes around

  1. There's no point in designing a memory controller around a future spec that likely isn't finalized yet, if the process of specifying it has even begun.
  2. The memory controller is an implementation detail; it's not integral to the design of the GPU architecture itself.
  3. GPU implementations (i.e., video cards) likely aim for the point of "just enough", balancing build cost against performance gained. Yes, there may be some room for improvement, but is that worth being on the bleeding edge and likely inflating the video card's price even more?

Exynos with Radeon graphics would kick ass.

Specs: Motherboard: Asus X470-PLUS TUF gaming (Yes I know it's poor but I wasn't informed) RAM: Corsair VENGEANCE® LPX DDR4 3200Mhz CL16-18-18-36 2x8GB

            CPU: Ryzen 9 5900X          Case: Antec P8     PSU: Corsair RM850x                        Cooler: Antec K240 with two Noctura Industrial PPC 3000 PWM

            Drives: Samsung 970 EVO plus 250GB, Micron 1100 2TB, Seagate ST4000DM000/1F2168 GPU: EVGA RTX 2080 ti Black edition


40 minutes ago, williamcll said:

Exynos with Radeon graphics would kick ass.

I was thinking the same thing, but I still have my doubts.

Can this really scale down that well? It will be interesting to see, but I am worried that performance or power consumption won't be competitive with offerings from, for example, Qualcomm, Apple, or ARM. Those GPUs were designed from the ground up to be in phones and tablets; they weren't designed to scale up to 20 or 200 watts of power consumption.

 

Also, Samsung needs to fine-tune their core design a bit more for it to be truly top-tier.


19 hours ago, LAwLz said:

I was thinking the same thing, but I still have my doubts.

Can this really scale down that well? It will be interesting to see, but I am worried that performance or power consumption won't be competitive with offerings from, for example, Qualcomm, Apple, or ARM. Those GPUs were designed from the ground up to be in phones and tablets; they weren't designed to scale up to 20 or 200 watts of power consumption.

 

Also, Samsung needs to fine-tune their core design a bit more for it to be truly top-tier.

 

Well, the fact that Apple managed to double the performance of their integrated graphics and video encoding from the A10X in 2017 to the A12X in 2018 means Samsung has to up their graphics game by partnering with AMD if they want to stay on par with Apple in SoC performance, especially if foldable Android phones are poised to go head-to-head against iPads.

