GPU color quality

Nik Balor

Is there any difference in color quality between Nvidia and AMD cards? Switching to AMD gave me a better-quality image. Or is it just how Nvidia GPUs handle the HDMI signal?

You can tune color accuracy in their control apps, so the only difference is in the default settings, not in the actual ability to deliver accurate colors.

As mentioned above, you can change the color accuracy or temperature through their control apps.

But if you want better color with any GPU, buy a monitor with an IPS panel.

Hello, I don't know why you see color differences between the two. I have had similar situations where I connected different cards to the same monitor, but I have never spotted a difference. Color accuracy is mainly related to the monitor's quality. It is adjustable with the built-in tools that come with your monitor, and good monitors come calibrated from the factory. You can fine-tune the calibration afterwards, but it is a tricky and expensive process. Make sure both cards' software is set to default color settings, and technically there shouldn't be any difference. Though I might be wrong, and it would be interesting if someone could provide more knowledge about it.

1 hour ago, pentotark said:

Hello, I don't know why you see color differences between the two. I have had similar situations where I connected different cards to the same monitor, but I have never spotted a difference. Color accuracy is mainly related to the monitor's quality. It is adjustable with the built-in tools that come with your monitor, and good monitors come calibrated from the factory. You can fine-tune the calibration afterwards, but it is a tricky and expensive process. Make sure both cards' software is set to default color settings, and technically there shouldn't be any difference. Though I might be wrong, and it would be interesting if someone could provide more knowledge about it.

 

 

I just read an article about how poorly Nvidia GPUs handle the HDMI signal! Will it be the same with DisplayPort? Read this PCMonitors article.

12 minutes ago, pentotark said:

Interesting read. Now that I think of it, it has been a while since I last used HDMI.

Yeah, maybe that's why you don't notice the color difference.

31 minutes ago, Nik Balor said:

I just read an article about how poorly Nvidia GPUs handle the HDMI signal! Will it be the same with DisplayPort? Read this PCMonitors article.

Did you try the solutions proposed by the article? In my case, DisplayPort automatically sets itself to the full dynamic range of the RGB space.

Is it really the graphics card though? Sometimes it's the monitor.

My own monitor is trashy when using HDMI because of its post-processing (older models have this), but it's still very good on VGA.

Seriously though, colour quality depends on the display, not the graphics card. Tuning it is a must if you'll work with it professionally.


 

1 hour ago, Princess Cadence said:

Seriously though, colour quality depends on the display, not the graphics card. Tuning it is a must if you'll work with it professionally.

This is normally true, but there are some exceptions. The article the author posted earlier says the GPU sometimes classifies the monitor as a TV, limiting the displayed color range and therefore the color quality. It is indeed a GPU issue, but an easily fixable one.
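To make the washed-out symptom concrete, here is a minimal Python sketch, assuming 8-bit RGB and the standard 16-235 "limited"/TV video levels (the level numbers are the common video-levels convention, not something taken from the article):

# Limited ("TV") range maps black to level 16 and white to level 235;
# a full-range ("PC") monitor expects black at 0 and white at 255.
def full_to_limited(level):
    """Map a full-range 8-bit level (0-255) onto limited range (16-235)."""
    return round(16 + level * (235 - 16) / 255)

print(full_to_limited(0))    # 16: black displays as dark grey -> washed out
print(full_to_limited(255))  # 235: white displays slightly dim

If the GPU sends this limited-range signal to a monitor expecting full range, blacks sit at level 16 and whites at 235, which is exactly the washed-out look being described.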

On 11/27/2017 at 6:33 AM, pentotark said:

Did you try the solutions proposed by the article? In my case, DisplayPort automatically sets itself to the full dynamic range of the RGB space.

No, I haven't actually tried it. In that case, maybe I'll just drop HDMI and use DisplayPort instead.

 

On 11/27/2017 at 6:46 AM, Princess Cadence said:

Seriously though, colour quality depends on the display, not the graphics card. Tuning it is a must if you'll work with it professionally.

 

On 11/27/2017 at 7:04 AM, pentotark said:

This is normally true, but there are some exceptions. The article the author posted earlier says the GPU sometimes classifies the monitor as a TV, limiting the displayed color range and therefore the color quality. It is indeed a GPU issue, but an easily fixable one.

That's exactly what I'm talking about: sometimes the GPU classifies the monitor as a TV.

 

On 11/27/2017 at 6:40 AM, Schiwata said:

Is it really the graphics card though? Sometimes it's the monitor.

My own monitor is trashy when using HDMI because of its post-processing (older models have this), but it's still very good on VGA.

What monitor are you using?

  • 1 year later...

Usually it's because the GPU screws up and tries to treat the monitor as a TV, setting the dynamic range to limited instead of full. You can change it in the Nvidia Control Panel under the resolution settings: scroll down and change the dynamic color range from limited to full. If you have washed-out colors, you are using limited-range settings on a full-range display; if you have the opposite and dark areas are overly black, you are using the full color range on a limited-range display. Any other difference is just down to the color profile and can be changed. People say it's because of texture compression, but both companies use delta compression, which is lossless. Some people say it's because AMD gaming cards support 10-bit color while for Nvidia you have to have a Quadro, but games don't support 10-bit color and not many apps do either, so it really makes no difference (HDR excepted).
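As a sketch of the fix being described (assuming the same 16-235 limited levels as above), expanding limited range back to full is just a linear remap:

# Hypothetical illustration: expand 8-bit limited-range levels (16-235)
# back to full range (0-255), which is what a correctly configured
# display chain does with a limited-range signal.
def limited_to_full(level):
    """Map a limited-range level to full range, clamping stray values."""
    expanded = round((level - 16) * 255 / (235 - 16))
    return max(0, min(255, expanded))

print(limited_to_full(16))   # 0   -> true black again
print(limited_to_full(235))  # 255 -> true white again

When the setting is wrong, this expansion effectively happens twice or not at all, producing the crushed or washed-out results described above.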

  • 2 weeks later...
On 1/22/2019 at 9:34 AM, bobhumplick said:

Usually it's because the GPU screws up and tries to treat the monitor as a TV, setting the dynamic range to limited instead of full. You can change it in the Nvidia Control Panel under the resolution settings: scroll down and change the dynamic color range from limited to full. If you have washed-out colors, you are using limited-range settings on a full-range display; if you have the opposite and dark areas are overly black, you are using the full color range on a limited-range display. Any other difference is just down to the color profile and can be changed. People say it's because of texture compression, but both companies use delta compression, which is lossless. Some people say it's because AMD gaming cards support 10-bit color while for Nvidia you have to have a Quadro, but games don't support 10-bit color and not many apps do either, so it really makes no difference (HDR excepted).

But some games really benefit from 10-bit color. And wouldn't it affect performance and lower the FPS if I changed the color range from limited to full?

  • 3 weeks later...
On 1/30/2019 at 8:16 AM, Nik Balor said:

But some games really benefit from 10-bit color. And wouldn't it affect performance and lower the FPS if I changed the color range from limited to full?

No games benefit from 10-bit color; an app or a game has to be written to take advantage of it. Limited and full range are something different: they are not the same thing as 10-bit or 8-bit. With limited range, the lowest and highest shades are cut off (8-bit limited range uses levels 16-235 instead of the full 0-255). If your display supports full range, you need to use full; if it supports limited, you need to use limited. Some TVs need full and some limited. A very small number of monitors use limited, but most are full. There is an easy way to tell, as I said above:

 

"if you have washed out colors then you are using limited range settings on a full range display and if you have the opposite and its overly black in dark areas then you are using full color range on a limited display.  any other difference is just down to color profile and can be changed."

 

As for 10-bit color, that is for video editing (like in Adobe Premiere). NO GAME SUPPORTS 10-BIT COLOR, PERIOD!!
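For context on what bit depth changes (a separate axis from limited vs full range, as the post says), here is a small Python sketch comparing the shades available across a subtle gradient at 8-bit and 10-bit per channel; the shade count is what shows up on screen as banding:

# Illustrative only: bit depth sets how many shades exist per channel
# (256 at 8-bit, 1024 at 10-bit).
def levels_in_span(span, bits):
    """Approximate count of distinct levels across a 0.0-1.0 intensity span."""
    return round(span * ((1 << bits) - 1))

span = 0.05  # a gentle gradient covering 5% of full brightness
print(levels_in_span(span, 8))   # ~13 shades at 8-bit  -> visible banding
print(levels_in_span(span, 10))  # ~51 shades at 10-bit -> much smoother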

  • 11 months later...
On 2/17/2019 at 6:57 PM, bobhumplick said:

No games benefit from 10-bit color; an app or a game has to be written to take advantage of it. Limited and full range are something different: they are not the same thing as 10-bit or 8-bit. With limited range, the lowest and highest shades are cut off (8-bit limited range uses levels 16-235 instead of the full 0-255). If your display supports full range, you need to use full; if it supports limited, you need to use limited. Some TVs need full and some limited. A very small number of monitors use limited, but most are full. There is an easy way to tell, as I said above:

"If you have washed-out colors, you are using limited-range settings on a full-range display; if you have the opposite and dark areas are overly black, you are using the full color range on a limited-range display. Any other difference is just down to the color profile and can be changed."

As for 10-bit color, that is for video editing (like in Adobe Premiere). NO GAME SUPPORTS 10-BIT COLOR, PERIOD!!

The statement that no games support 10-bit is incorrect. In Forza, if you run HDR at 8-bit vs 10-bit, there is a dramatic difference in the banding on the car bodies: at 8-bit it almost looks like stripes or CAD; at 10-bit, smooth as silk.

 
