
AMD Discusses 2016 Radeon Visual Technologies Roadmap (HDR, HDMI 2.0, DisplayPort 1.3)

Albert Song

AMD Slides: http://www.slideshare.net/pertonas/amd-radeon-technology-group-summit?ref=https://benchlife.info/amd-next-gen-gpu-will-add-dp-1-3-and-hdmi-2-support-12082015/

 

Source 1: http://www.anandtech.com/show/9836/amd-unveils-2016-vistech-roadmap

AnandTech: AMD Discusses 2016 Radeon Visual Technologies Roadmap

 

Source 2: http://wccftech.com/amd-bringing-better-pixels-to-pc-hdr-and-larger-color-space-to-consumers-10bpp/
WCCF: AMD Bringing Better Pixels to PC, HDR and Larger Color Space to Consumers

 

Source 3: https://benchlife.info/amd-next-gen-gpu-will-add-dp-1-3-and-hdmi-2-support-12082015/

(This is where I found the slides)

 

Source 4: http://www.tomshardware.com/news/amd-freesync-over-hdmi-hdr,30711.html

 

 

Things to expect in 2016: HDR, HDMI 2.0, DisplayPort 1.3

 

HDMI 2.0, DisplayPort 1.3:

[Slides: HDMI 2.0a and DisplayPort 1.3 support on next-generation GPUs]

 

Next-gen GPUs will have HDMI 2.0a and DisplayPort 1.3.

 

FreeSync over HDMI will work on any Radeon card capable of variable refresh over DisplayPort, including the full lineup of GCN 1.1- and 1.2-based processors. Tahiti, Pitcairn, Cape Verde and their rebranded derivatives aren’t compatible.

 

 

HDR:

[Slides: AMD HDR overview]

 

The basics are that AMD wants to bring 10-bit color rendering to every type of workflow, especially games, and that they also want to introduce true HDR to consumers. We'll go over what HDR actually is at the end.

This philosophy starts with being able to render and send a full 10bpp color signal to the monitor, which all current-generation Radeon GPUs can do; on the NVIDIA side, only the professional line actually supports that. AMD doesn't want to stop there, though.

The idea is to eventually have 10bpp-capable monitors available in the mainstream, along with monitors and TVs capable of higher peak brightness (nits), not just to produce a "brighter" image but to provide a better one. Currently, very few monitors outside the professional space actually support true 10-bit color. Most panels that claim to display over a billion colors are really 8-bit panels using FRC, dithering between the colors they can actually display to approximate the rest, and the result is generally poor. AMD expects more genuinely 10-bit monitors to be available in the second half of 2016.
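
Purely to illustrate that FRC trick (a minimal sketch of temporal dithering in general, not anything from AMD's slides): an 8-bit panel can fake a 10-bit level by alternating between the two nearest 8-bit levels across frames, so the time-average lands near the target.

```python
# Minimal sketch of temporal dithering (FRC): an 8-bit panel approximates a
# 10-bit level by flickering between the two nearest 8-bit levels over frames.

def frc_frames(level_10bit, num_frames=4):
    """8-bit levels shown over `num_frames` frames for one 10-bit target level."""
    target = level_10bit / 1023 * 255                # ideal (fractional) 8-bit value
    low = int(target)
    high = min(low + 1, 255)
    high_count = round((target - low) * num_frames)  # how often to show the higher level
    return [high] * high_count + [low] * (num_frames - high_count)

frames = frc_frames(600)                             # a 10-bit level with no exact 8-bit match
print(frames)                                        # [150, 150, 149, 149]
print(sum(frames) / len(frames), 600 / 1023 * 255)   # time-average 149.5 vs target ~149.56
```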

In 2016, all new Radeon cards will support HDR.

AMD said 1080p HDR looks nicer than 4K SDR.

AMD expects more HDR-capable monitors in 2016.
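
The slides don't spell out a transfer function, but for context (my addition, not AMD's): consumer HDR as standardized in HDR10 pairs 10-bit signals with the SMPTE ST 2084 "PQ" curve, which is what lets code values address luminance up to 10,000 nits rather than the roughly 100 nits SDR assumes. A rough sketch:

```python
# Sketch of the SMPTE ST 2084 (PQ) EOTF used by HDR10: maps a normalized
# 10-bit code value (0..1) to absolute luminance in cd/m^2 (nits).
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(code):
    """Normalized code value (0.0-1.0) -> luminance in nits, up to 10,000."""
    e = code ** (1 / m2)
    return 10000 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

for ten_bit_code in (0, 256, 512, 768, 1023):
    print(ten_bit_code, round(pq_eotf(ten_bit_code / 1023), 1), "nits")
# Roughly half of the code range covers SDR-like levels (under ~100 nits); the
# upper codes are reserved for highlights, which is why 8 bits fall short for HDR.
```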


HDR =/= 10-bit colour

Yeah they're talking about both.

Daily Driver:

Case: Red Prodigy CPU: i5 3570K @ 4.3 GHZ GPU: Powercolor PCS+ 290x @1100 mhz MOBO: Asus P8Z77-I CPU Cooler: NZXT x40 RAM: 8GB 2133mhz AMD Gamer series Storage: A 1TB WD Blue, a 500GB WD Blue, a Samsung 840 EVO 250GB


It looks like they're talking about I/O more than actual graphics technologies, which makes sense because they'll talk about those when they announce the cards.

Daily Driver:

Case: Red Prodigy CPU: i5 3570K @ 4.3 GHZ GPU: Powercolor PCS+ 290x @1100 mhz MOBO: Asus P8Z77-I CPU Cooler: NZXT x40 RAM: 8GB 2133mhz AMD Gamer series Storage: A 1TB WD Blue, a 500GB WD Blue, a Samsung 840 EVO 250GB


???? I don't get what they're talking about. Are they making a screen with more than 3 LED colors? Because without that, what does this feature do?

https://linustechtips.com/main/topic/631048-psu-tier-list-updated/ Tier Breakdown (My understanding)--1 Godly, 2 Great, 3 Good, 4 Average, 5 Meh, 6 Bad, 7 Awful

 


???? I don't get what they're talking about. Are they making a screen with more than 3 LED colors? Because without that, what does this feature do?

HDR displays can be designed with the supreme black depth of OLED, or the vivid brightness of local dimming LCD. Both are suitable for gaming. A selection of TVs are already available, and consumer monitors are expected to reach the market in the second half of 2016. Such displays will offer unrivaled color accuracy, saturation, brightness, and black depth - in short, they will come very close to simulating the real world.


HDR displays can be designed with the supreme black depth of OLED, or the vivid brightness of local dimming LCD. Both are suitable for gaming. A selection of TVs are already available, and consumer monitors are expected to reach the market in the second half of 2016. Such displays will offer unrivaled color accuracy, saturation, brightness, and black depth - in short, they will come very close to simulating the real world.

So, better than 99% accuracy? Because some monitors already have that level of accuracy without this; just look at any high-end Dell screen.

https://linustechtips.com/main/topic/631048-psu-tier-list-updated/ Tier Breakdown (My understanding)--1 Godly, 2 Great, 3 Good, 4 Average, 5 Meh, 6 Bad, 7 Awful

 


Man, graphics cards next year are going to be insane. Not just double the transistor count at 16nm FF, but also DP 1.3 with HDR and 10-bit. You could use a standard AMD card for professional video and photo editing.

Can't wait to see a beautiful game on an HDR TV or monitor.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


So, better than 99% accuracy? Because some monitors already have that level of accuracy without this; just look at any high-end Dell screen.

Guess we are gonna have to wait and see.


Man, graphics cards next year are going to be insane. Not just double the transistor count at 16nm FF, but also DP 1.3 with HDR and 10-bit. You could use a standard AMD card for professional video and photo editing.

Can't wait to see a beautiful game on an HDR TV or monitor.

I'm pretty sure AMD's cards are going to be 14nm.

Pixelbook Go i5 Pixel 4 XL

Eh, not much to hype over. Nvidia will match this feature-wise even down to HDMI-based G-Sync. I'm waiting until the cards hit the table.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


I'm pretty sure AMD's cards are going to be 14nm.

16nm FF. Only the APUs will be based on GloFo's 14nm tech. Yes, they will contain full Greenland dies, but AMD will still be using TSMC as its dGPU supplier.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


So, better than 99% accuracy? Because some monitors already have that level of accuracy without this; just look at any high-end Dell screen.

 

That 99% accuracy is based on an old standard. It is 99% accurate compared to what that standard specifies, but far from 99% compared to "real life" accuracy. BT.2020 is a new "standard" for colour (it implements 10-bit colour, where the old one was just 8-bit), and it contains roughly 64 times as many colours as the old BT.709 "standard".

 

[Slide: CIE chromaticity diagram comparing the sRGB/BT.709, Digital Cinema (DCI-P3), Adobe RGB, and Rec. 2020 gamuts]

 

If you look at the graph above, you'll see that the black triangle is what has been the norm for many years: sRGB. It is also the colour space that Blu-ray uses and that Windows uses by default.

The yellow triangle is the "Digital Cinema" (DCI-P3) colour space, which is roughly what modern 10-bit TVs manage today.

The green one, almost the same size as the yellow but covering a different area in places, is what we know as "AdobeRGB", which professional-grade (very expensive) monitors offer.

Then the last triangle, the blue one, is what Rec. 2020 covers, and it is what AMD will technically support with these new cards. No monitor or TV can yet reproduce all of those colours, but in a few years they're supposed to come close.

Lastly, the outer shape (dark blue with orange dots) is the full range visible to the human eye.

So the black triangle is what most normal TVs and monitors have aimed for for many, many years, and it is the reference against which today's monitors measure 99 to 100% accuracy.

The blue one is the new range they are aiming for. 
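
To put rough numbers on those triangles (my own back-of-the-envelope using each standard's published CIE 1931 xy primaries, not figures read off the slide), the shoelace formula gives the area each gamut covers:

```python
# Rough comparison of the gamut triangles in CIE 1931 xy space, using each
# standard's published chromaticity primaries and the shoelace formula.
# Note: xy space is not perceptually uniform, so the ratios are only indicative.
PRIMARIES = {                 # (red, green, blue) as (x, y) chromaticity points
    "sRGB / BT.709": [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":        [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Adobe RGB":     [(0.640, 0.330), (0.210, 0.710), (0.150, 0.060)],
    "BT.2020":       [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def triangle_area(points):
    (x1, y1), (x2, y2), (x3, y3) = points
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

base = triangle_area(PRIMARIES["sRGB / BT.709"])
for name, points in PRIMARIES.items():
    area = triangle_area(points)
    print(f"{name:13s} area {area:.3f} ({area / base:.2f}x BT.709)")
```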

Ryzen 7 5800X     Corsair H115i Platinum     ASUS ROG Crosshair VIII Hero (Wi-Fi)     G.Skill Trident Z 3600CL16 (@3800MHzCL16 and other tweaked timings)     

MSI RTX 3080 Gaming X Trio    Corsair HX850     WD Black SN850 1TB     Samsung 970 EVO Plus 1TB     Samsung 840 EVO 500GB     Acer XB271HU 27" 1440p 165hz G-Sync     ASUS ProArt PA278QV     LG C8 55"     Phanteks Enthoo Evolv X Glass     Logitech G915      Logitech MX Vertical      Steelseries Arctis 7 Wireless 2019      Windows 10 Pro x64


That 99% accuracy is based on an old standard. It is 99% accurate compared to what that standard specifies, but far from 99% compared to "real life" accuracy. BT.2020 is a new "standard" for color (it implements 10-bit color, where the old one was just 8-bit), and it contains roughly 64 times as many colors as the old BT.709 "standard".

For those who may challenge you on that claim:

 

((2^10)^3)/((2^8)^3) = 64
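
Or, for anyone who wants to sanity-check it in a Python shell:

```python
# 10 bits per channel vs 8 bits per channel, across three channels
print((2 ** 10) ** 3 / (2 ** 8) ** 3)   # prints 64.0, i.e. 2**(3 * (10 - 8))
```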

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


For those who may challenge you on that claim:

 

((2^10)^3)/((2^8)^3) = 64

Thanks :)

Edited in a bit more information to illustrate. 

Ryzen 7 5800X     Corsair H115i Platinum     ASUS ROG Crosshair VIII Hero (Wi-Fi)     G.Skill Trident Z 3600CL16 (@3800MHzCL16 and other tweaked timings)     

MSI RTX 3080 Gaming X Trio    Corsair HX850     WD Black SN850 1TB     Samsung 970 EVO Plus 1TB     Samsung 840 EVO 500GB     Acer XB271HU 27" 1440p 165hz G-Sync     ASUS ProArt PA278QV     LG C8 55"     Phanteks Enthoo Evolv X Glass     Logitech G915      Logitech MX Vertical      Steelseries Arctis 7 Wireless 2019      Windows 10 Pro x64


Yeah, Nvidia will match the HBM memory that AMD developed and already uses, then they will dump the SLI crap and use technology that AMD already has, and then maybe they will have DX12 hardware support. Until then, keep buying their out-of-date cards.


Yeah, Nvidia will match the HBM memory that AMD developed and already uses, then they will dump the SLI crap and use technology that AMD already has, and then maybe they will have DX12 hardware support. Until then, keep buying their out-of-date cards.

They're using GPU-to-GPU NVLink, not XDMA, so no, they're not following AMD on that one.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


4K at 120Hz, let's hope we get cards capable of pushing that in 2016. Let's not hype it too much though.
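
For a rough sense of whether DisplayPort 1.3 even has the headroom for that (my own estimate, not something from AMD's slides, and the blanking figures are assumptions): HBR3 gives 4 lanes × 8.1 Gbit/s with 8b/10b coding, and a reduced-blanking 3840×2160 @ 120 Hz stream only just squeezes in at 8 bits per channel, while 10-bit does not fit without chroma subsampling.

```python
# Back-of-the-envelope: does 3840x2160 @ 120 Hz fit into DisplayPort 1.3?
# Assumes reduced blanking (~80 px horizontal, ~5% vertical); real CVT-R2
# timings vary a little, so treat the numbers as ballpark only.
LINK_GBPS = 4 * 8.1 * 8 / 10   # 4 lanes of HBR3 minus 8b/10b coding overhead = 25.92

def stream_gbps(h, v, refresh_hz, bits_per_channel, h_blank=80, v_blank_frac=0.05):
    pixels_per_frame = (h + h_blank) * v * (1 + v_blank_frac)
    return pixels_per_frame * refresh_hz * bits_per_channel * 3 / 1e9

for bpc in (8, 10):
    need = stream_gbps(3840, 2160, 120, bpc)
    verdict = "fits" if need < LINK_GBPS else "does not fit"
    print(f"{bpc} bpc 4K120 needs ~{need:.1f} Gbit/s vs ~{LINK_GBPS:.2f} Gbit/s payload: {verdict}")
```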

Asus B85M-G / Intel i5-4670 / Sapphire 290X Tri-X / 16GB RAM (Corsair Value 1x8GB + Crucial 2x4GB) @1333MHz / Coolermaster B600 (600W) / Be Quiet! Silent Base 800 / Adata SP900 128GB SSD & WD Green 2TB & SG Barracuda 1TB / Dell AT-101W / Logitech G502 / Acer G226HQL & X-Star DP2710LED


I'm not looking forward to 5K with my shitty eyes. I am, however, looking forward to 1080p144 or 1440p144 in AAA games.

To save mother earth. xD

Well it's pretty annoying to have power hungry parts. And having heat kicking into your room is a nightmare.

I think we'll have to disagree on that. I live in Québec, where temperatures are pretty low in the winter, and I haven't heated my apartment once since BTC mining/BOINC became a thing.

I think we'll have to disagree on that. I live in Québec, where temperatures are pretty low in the winter, and I haven't heated my apartment once since BTC mining/BOINC became a thing.

Yes, but heating with electricity is very inefficient as you lose ~60% of the primary energy in production, distribution and storage.

EDIT: on topic: monitor development has to stay in line with GPU development, as it has been since the beginning. Now that GPUs are making a big step, it's nice to see displays get some love too.

Mineral oil and 40 kg aluminium heat sinks are a perfect combination: 73 cores and a Titan X, Twenty Thousand Leagues Under the Oil


I just want more 1440p144 IPS/PLS options without defects. The MG279Q I had was of unacceptable build quality, with massive bleed in the bottom-right corner causing blacks to be discolored yellow.


They're using GPU-to-GPU NVLink, not XDMA, so no, they're not following AMD on that one.

NVLink only works on the IBM HPC motherboards used in Sierra and Summit. It is not a replacement for SLI.

 

http://info.nvidianews.com/rs/nvidia/images/An%20Inside%20Look%20at%20Summit%20and%20Sierra%20Supercomputers-3-1.pdf

 

Page 7.

 

Intel i7 5820K (4.5 GHz) | MSI X99A MPower | 32 GB Kingston HyperX Fury 2666MHz | Asus RoG STRIX GTX 1080ti OC | Samsung 951 m.2 nVME 512GB | Crucial MX200 1000GB | Western Digital Caviar Black 2000GB | Noctua NH-D15 | Fractal Define R5 | Seasonic 860 Platinum | Logitech G910 | Sennheiser 599 | Blue Yeti | Logitech G502

 

Nikon D500 | Nikon 300mm f/4 PF  | Nikon 200-500 f/5.6 | Nikon 50mm f/1.8 | Tamron 70-210 f/4 VCII | Sigma 10-20 f/3.5 | Nikon 17-55 f/2.8 | Tamron 90mm F2.8 SP Di VC USD Macro | Neewer 750II


NVLink only works on the IBM HPC motherboards used in Sierra and Summit. It is not a replacement for SLI.

 

http://info.nvidianews.com/rs/nvidia/images/An%20Inside%20Look%20at%20Summit%20and%20Sierra%20Supercomputers-3-1.pdf

 

Page 7.

 

Nope, read again.

http://blogs.nvidia.com/blog/2014/11/14/what-is-nvlink/

http://www.enterprisetech.com/2014/11/18/early-nvlink-tests-show-gpu-speed-burst/

 

SLI is being replaced altogether with a new external bridging technique. There are 2 forms of NVLink. Only the CPU-GPU interconnect is motherboard-dependent.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd

