
AMD Discusses 2016 Radeon Visual Technologies Roadmap (HDR, HDMI 2.0, DisplayPort 1.3)

Albert Song

I want 27" 120 Hz HDR 5K displays to become a thing. That would be the perfect monitor for me.

 

Must be a Tasmanian thing.


Then you recalled incorrectly, because AMD worked on HBM for almost a decade.

So AMD has been working with Hynix since just after DDR3 came out...

A. I doubt it took them that long to work on it.

B. Then why did AMD push for GDDR5 instead of HBM several years ago?


So AMD has been working with Hynix since just after DDR3 came out...

A. I doubt it took them that long to work on it.

B. Then why did AMD push for GDDR5 instead of HBM several years ago?

And this is when I realized that this guy has just been trolling the thread.


So AMD has been working with Hynix since just after DDR3 came out...

A. I doubt it took them that long to work on it.

B. Then why did AMD push for GDDR5 instead of HBM several years ago?

A. You severely underestimate the amount of time needed to develop a new tech.

B. Because HBM wasn't ready? For example, DDR4 started development in 2005, a couple of years before DDR3 even launched, and you only saw it become mainstream this year, after very limited availability last year: only ECC modules were available in Q2 2014, with non-ECC coming after that. The same thing happened here: HBM started development before GDDR5 (and GDDR4, actually) even officially launched, and that's not unusual in any way.


And this is when I realized that this guy has just been trolling the thread.

Best response/10


A. You severely underestimate the amount of time needed to develop a new tech.

B. Because HBM wasn't ready? For example, DDR4 started development in 2005, a couple of years before DDR3 even launched, and you only saw it become mainstream this year, after very limited availability last year: only ECC modules were available in Q2 2014, with non-ECC coming after that. The same thing happened here: HBM started development before GDDR5 (and GDDR4, actually) even officially launched, and that's not unusual in any way.

Sources?


Sources?

Just put these keywords in your address bar: amd hbm decade

 

Google will give you at least 10 different sources.


Umm, just about anyone can; it is super easy to spot a TN panel.

No, they can spot a crappy TN panel. Good ones are nearly indistinguishable.


Compare a TN panel with an IPS panel side by side and tell me you can notice a difference in an average game.

I don't have an IPS at the moment, because I decided to go high refresh rate instead of quality when I got mine, but as soon as you sit in front of a TN it's just obvious. Maybe not the newer ones, but the one I got a few years ago was like that.

Btw, you can always find an LTT video on it.


reply removed due to being stupid


Well fuck, it's 2016 and we'll already be saturating DP 1.3.

And before people start commenting with the bullshit that nobody will be using 4K 120 Hz - please stop.
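
For anyone who wants to sanity-check that, here's a rough sketch in Python (the 8b/10b encoding overhead on DP 1.3 is exact; the ~6% CVT-R2 blanking overhead is an approximation):

# Rough check: does 4K 120 Hz fit in DisplayPort 1.3?
LINK_GBPS = 8.1 * 4                    # HBR3: 8.1 Gbit/s per lane, 4 lanes
PAYLOAD_GBPS = LINK_GBPS * 8 / 10      # 8b/10b encoding -> 25.92 Gbit/s usable
BLANKING = 1.06                        # approximate CVT-R2 blanking overhead

def required_gbps(w, h, hz, bits_per_channel):
    # Full RGB, no chroma subsampling
    return w * h * hz * bits_per_channel * 3 * BLANKING / 1e9

for bpc in (8, 10):
    print(f"4K 120 Hz @ {bpc} bpc needs ~{required_gbps(3840, 2160, 120, bpc):.1f} "
          f"Gbit/s vs {PAYLOAD_GBPS:.2f} Gbit/s available")

# 8 bpc (~25.3 Gbit/s) just barely fits; 10 bpc HDR (~31.7 Gbit/s) already doesn't.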


murica without internet? rip .... eventually all the major companies will move to the EU... and everything will be fine again...

Wrong thread?


It seems like 10-bit will take more bandwidth - that's why most monitors using it have lower refresh rates. Does that mean it's going to have a performance hit, which will be addressed with the new GPUs?


It seems like 10-bit will take more bandwidth - that's why most monitors using it have lower refresh rates. Does that mean it's going to have a performance hit, which will be addressed with the new GPUs?

 

Not necessarily a direct performance hit, but it will impact VRAM usage, as each pixel requires up to 6 bits more data (2 extra bits per RGB channel, plus whatever HDR needs) for storing the 10-bit color.
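
For the framebuffer itself the picture is a bit more nuanced, since GPUs usually pack 10-bit color into the same 32 bits per pixel that 8-bit-plus-alpha uses. A back-of-the-envelope sketch (format choices are illustrative; engines vary):

# Framebuffer sizes at 4K for a few common GPU pixel formats.
W, H = 3840, 2160
formats = {
    "RGBA8 (8-bit color + alpha)":     32,  # bits per pixel
    "RGB10A2 (10-bit color, packed)":  32,  # 10+10+10+2 still fits in 32 bits
    "RGBA16F (FP16, typical for HDR)": 64,
}
for name, bpp in formats.items():
    print(f"{name:33s} {W * H * bpp / 8 / 2**20:5.1f} MiB per buffer")

# RGBA8 and RGB10A2 both come to ~31.6 MiB, so a packed 10-bit render target
# costs no extra VRAM; the FP16 targets HDR pipelines tend to use double it.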


I just want more 1440p 144 Hz IPS/PLS options without defects. The MG279Q I had was of unacceptable build quality, with massive bleed in the bottom right corner that discolored blacks yellow.

^ this


A 240 Hz FreeSync monitor with the new DP would be amazing. Maybe 4K at 120 Hz, with the option to set 1080p at 240 Hz for FPS games. I'd want that for sure.
Though, when will we finally see OLED monitors?


Not necessarily a direct performance hit, but it will impact VRAM usage, as each pixel requires up to 6 bits more data (2 extra bits per RGB channel, plus whatever HDR needs) for storing the 10-bit color.

I don't know how textures are compressed, but 10-bit does not necessarily mean more bandwidth and storage. If you have a texture in a lossless format, then yes, it will need more VRAM. If the textures are compressed in a lossy format, however, 10-bit has a good chance of actually using LESS VRAM.

 

If you encode a video at 10-bit depth, then you will not only get a better image, you will also get a smaller file. Here is a super brief and non-detailed explanation of why. For a very slightly more detailed explanation, see this post (any more detail than that and lots of math gets involved and it gets very complicated).
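
If you want to test that claim yourself, a minimal sketch along these lines would do it (file names are hypothetical, and it assumes an ffmpeg build with 10-bit libx265 support):

# Encode the same source at 8-bit and 10-bit with the same quality target,
# then compare file sizes.
import os
import subprocess

SRC = "source.y4m"  # any lossless test clip (hypothetical file name)

for out, pix_fmt in [("x265_8bit.mkv", "yuv420p"),
                     ("x265_10bit.mkv", "yuv420p10le")]:
    subprocess.run(["ffmpeg", "-y", "-i", SRC,
                    "-c:v", "libx265", "-crf", "22",
                    "-pix_fmt", pix_fmt, out], check=True)
    print(out, round(os.path.getsize(out) / 2**20, 1), "MiB")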

 

But I don't think this will have any impact on performance in the near future. Game developers have already been able to take advantage of 10-bit color depth for certain things if they wanted to, and they opted not to. Your monitor won't magically become 10-bit just because your GPU is, either, so the market share will be tiny.

This news is a very good first step in the right direction, though. Good job, AMD.


I don't know how textures are compressed, but 10-bit does not necessarily mean more bandwidth and storage. If you have a texture in a lossless format, then yes, it will need more VRAM. If the textures are compressed in a lossy format, however, 10-bit has a good chance of actually using LESS VRAM.

If you encode a video at 10-bit depth, then you will not only get a better image, you will also get a smaller file. Here is a super brief and non-detailed explanation of why. For a very slightly more detailed explanation, see this post (any more detail than that and lots of math gets involved and it gets very complicated).

But I don't think this will have any impact on performance in the near future. Game developers have already been able to take advantage of 10-bit color depth for certain things if they wanted to, and they opted not to. Your monitor won't magically become 10-bit just because your GPU is, either, so the market share will be tiny.

This news is a very good first step in the right direction, though. Good job, AMD.

Hopefully, affordable cards capable of producing 10-bit colour will help attract monitor manufacturers to make more 10-bit panels.


I don't have an IPS at the moment, because I decided to go high refresh rate instead of quality when I got mine, but as soon as you sit in front of a TN it's just obvious. Maybe not the newer ones, but the one I got a few years ago was like that.

Btw, you can always find an LTT video on it.

I'm not talking about old panels, I'm talking about new ones.

 

Compare two similarly priced high-end panels, one TN, one IPS, and tell me you can see a difference. 10-bit color will be a smaller difference than that. 10-bit color for gamers is stupid imo, and just a tool to hype up GPUs.

 

Also, look at the HDR picture in the OP. There is no way it'll be that different in the real world.


Just put these keywords in your address bar: amd hbm decade

 

Google will give you at least 10 different sources.

They give one actual source, from AMD themselves, and a bunch of people reporting on it/forum posts...

I personally don't trust manufacturer sources (especially AMD's) to be unbiased. Any besides that? Maybe from Hynix?

Funnily enough, Wikipedia says development with AMD began in 2008, AMD's own slides say 2007, and other sources say 2006.


I'm not talking about old panels, I'm talking about new ones.

 

Compare two similarly priced high-end panels, one TN, one IPS, and tell me you can see a difference. 10-bit color will be a smaller difference than that. 10-bit color for gamers is stupid imo, and just a tool to hype up GPUs.

 

Also, look at the HDR picture in the OP. There is no way it'll be that different in the real world.

Easy. Even the best TN will have more washed-out dark colors... bright colors are the same, but with dark colors you CAN spot the difference.

 

 

They give one actual source, from AMD themselves, and a bunch of people reporting on it/forum posts...

I personally don't trust manufacturer sources (especially AMD's) to be unbiased. Any besides that? Maybe from Hynix?

Funnily enough, Wikipedia says development with AMD began in 2008, AMD's own slides say 2007, and other sources say 2006.

AMD probably started looking into the concept internally in 2007, then got Hynix in on HBM around 2008.

AMD and Hynix ARE the "official" creators of HBM. Known fact.

Other sources are probably making guesses as to when AMD started its internal R&D process.


Yeah, Nvidia will match the HBM memory that AMD developed and already uses, then they will dump the SLI crap and use technology that AMD already has, and then maybe they will have DX12 hardware support. Until then, keep buying their out-of-date cards.

 

Hold on there: AMD cards have Tier 3 DX12 support, yes, but Nvidia has Tier 2 DX12.1 support. People seem to forget about that .1 at the end... not that it will make any difference in the real world. If you think game developers are going to utilize things in DX12 that 70% of the market (Nvidia's current GPU share) won't have, you can think again.

 

---

 

They claim that 5K is 78% more than 4K - that's just misleading as hell. When we talk about an (x)K display, we are talking about the horizontal pixel count. The HDR section is just total garbage; we already have 10-bit displays and GPUs, nothing new there at all. I don't like the images they are using at the top either - I have never seen a picture that looks like the left one.
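
For reference, the 78% figure does check out if you count total pixels rather than horizontal resolution (assuming 5K = 5120x2880 and UHD "4K" = 3840x2160):

# 5K vs 4K: horizontal resolution vs total pixel count.
uhd, fivek = (3840, 2160), (5120, 2880)

h_gain = fivek[0] / uhd[0] - 1
px_gain = (fivek[0] * fivek[1]) / (uhd[0] * uhd[1]) - 1

print(f"Horizontal pixels: {h_gain:.0%} more")   # ~33% more
print(f"Total pixels:      {px_gain:.0%} more")  # ~78% more -- AMD's number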


Hold on there: AMD cards have Tier 3 DX12 support, yes, but Nvidia has Tier 2 DX12.1 support. People seem to forget about that .1 at the end... not that it will make any difference in the real world. If you think game developers are going to utilize things in DX12 that 70% of the market (Nvidia's current GPU share) won't have, you can think again.

A specific feature-set Nvidia was pushing rather hard to get through. But a good point, nevertheless.

Of course, Nvidia and AMD aren't going to watch passively as studios choose what to feature and what not to.


Just when I bought the PG279Q, this news comes out. Good job, AMD. /sadface


Just when I bought the PG279Q, this news comes out. Good job, AMD. /sadface

Return it, or sell it and buy a new one in 2-3 years.

 

Most of what they're announcing will likely not make it into consumer monitors for at least another year.

 

This is nothing but good news. Monitors have been relatively stagnant for years. It's good to see new trends pushing the technology forward.

