
NVIDIA Titan RTX Yields 99% Adobe RGB?

Roger3006

Hello Everyone,

 

This is my first post. Let me warn Y'all, I am most likely the oldest SOB here, not to mention a photographer, not a gamer. I think the last video game I played was Pong in a hotel bar. I do need help like a dead man needs a coffin, so please put up with me.

 

This has me a little confused. A Titan RTX was installed in a workstation with a BenQ SW271 attached that will display 99% Adobe RGB. The monitor was calibrated with a Datacolor Spyder5 and, sure enough, the report showed 99% Adobe RGB.

 

It is my understanding that only NVIDIA Quadro-series GPUs will display 99% Adobe RGB utilizing 10-bit color depth. The Titans are, in my opinion, between the consumer GPUs and the professional GPUs. I believe this is an 8-bit card; however, I cannot find "Color Depth" anywhere in published specifications. Why am I getting 99% Adobe RGB? Something does not seem right. Can Y'all explain this?

 

 

Have a great day, evening or whatever is appropriate.

 

R

 

As far as I know you can also get 10-bit color from a 1080:
"High Dynamic Range, or HDR, isn't a new concept in photography. It isn't even new to PC gaming, as some of the oldest games with HDR (using simple bloom effects) date back to the Valve Source engine (early 2000s). Those apps, however, used the limited 24-bit (8-bit per color, 16.7 million colors in all) color palette to emulate HDR. Modern bandwidth-rich GPUs such as the GTX 1080 have native support for large color palettes, such as 10-bit (1.07 billion colors) and 12-bit (68.7 billion colors), to accelerate HDR content without software emulation. This includes support for 10-bit and 12-bit HEVC video decoding at resolutions of up to 4K @ 60 Hz, or video encoding at 10-bit for the same resolution."
Source: https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080/3.html 
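Just to make the numbers in that quote concrete, here is a quick back-of-the-envelope check (a minimal sketch; the bit depths are per color channel, and the totals assume three channels with no alpha):

```python
# Per-channel bit depth -> levels per channel and total RGB colors (3 channels, no alpha)
for bits in (8, 10, 12):
    levels = 2 ** bits          # e.g. 256 levels per channel at 8 bits
    total = levels ** 3         # R x G x B combinations
    print(f"{bits}-bit: {levels} levels/channel, {total:,} colors")

# 8-bit : 256 levels/channel,      16,777,216 colors (~16.7 million)
# 10-bit: 1024 levels/channel,  1,073,741,824 colors (~1.07 billion)
# 12-bit: 4096 levels/channel, 68,719,476,736 colors (~68.7 billion)
```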

 


Welcome to the forum!

11 minutes ago, Roger3006 said:

I do need help like a dead man needs a coffin

First off, I would like to say that is a great analogy, so I'm stealing it.

 

Second, the GPU doesn't affect color accuracy; that is all on the monitor (which would also explain why color depth is not listed in the GPU specs). The Spyder5 Elite is used to calibrate monitors. You will get the same color accuracy between a 1080 and the Titan RTX, assuming you use the same monitor. What the GPU will affect, photography-wise, is the time it takes to render a photo or video.
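As an aside, the bit-depth setting and the gamut figure measure different things: the Spyder report is about which colors the panel's primaries can reach, while 8-bit vs 10-bit only controls how finely the signal between those primaries is sliced. A minimal sketch of that second part (hypothetical numbers, nothing measured here):

```python
import numpy as np

def quantize(ramp, bits):
    """Snap a 0-1 gradient to the nearest level representable at the given bit depth."""
    levels = 2 ** bits - 1
    return np.round(ramp * levels) / levels

ramp = np.linspace(0.0, 1.0, 100_000)   # an ideal, continuous gray ramp
for bits in (8, 10):
    q = quantize(ramp, bits)
    print(f"{bits}-bit: {2 ** bits} levels, "
          f"max quantization error ~{np.abs(q - ramp).max():.5f}")

# 10-bit slices the same 0-1 range four times finer than 8-bit, which is what
# reduces banding in smooth gradients; the gamut (sRGB vs Adobe RGB) is set by
# the monitor's primaries, not by this step size.
```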


16 minutes ago, Roger3006 said:

I cannot find "Color Depth" anywhere in published specifications.

From NVIDIA's whitepaper for the 1080: it mentions support for up to 12-bit Rec. 2020, which is more than 10-bit 100% Adobe RGB :)
And the 1080 is a pure consumer card, so I would guess the same information holds for the Titan X as well.

 

[screenshot: excerpt from the GTX 1080 whitepaper describing 12-bit color and Rec. 2020 display support]


Go to the NVIDIA Control Panel (right-click on the desktop);

Go to "Display", "Change resolution".

Then set the color depth:

[screenshot: NVIDIA Control Panel "Change resolution" page with the output color depth setting]

I only see your reply if you @ me.



Thank you for the replies.

 

A little more information on what I do and why. I do product photography professionally for the web. Pardon the pun, but I shoot firearms that are unique. I bought the Titan RTX to go in an Intel Core i9-7900X machine for shooting tethered, meaning a Nikon D810 is connected to the computer via a USB 3 cable. I use Capture One Pro 12.0 as the RAW processing engine and, normally, it works great and is fast. I cannot set up for the next shot until the current shot is rendered and I see it. The combination I just mentioned renders a 40 MB+ file in about 2.5 seconds from when the shutter is tripped until I see the image (currently using 2x 1080 Ti). I also do some printing, both in-house and with custom labs for specialty stuff. That is where 99% Adobe RGB comes in. It is important to see on the screen exactly what prints.

 

I upgraded two workstations right before the end of the year. I am still in the process of getting things worked out. I do have a couple of AMD cards that are 10-bit color but are disappointing from a performance standpoint. One is a WX 8200, which has a certified driver. The others are 2x Vega Frontier Edition GPUs, which do support 10-bit color depth but whose driver is not certified. I could not care less about a certified driver as long as it works. The AMD drivers are terrible.

 

If NVIDIA prosumer/consumer GPUs will support 10-bit color, the AMDs are heading for eBay. It has always been my impression that the only way to get 10-bit color depth was with a professional card. The only exception I found was the AMD Vega Frontier Edition.

 

Thank Y'all again.

 

R

 

 


5 minutes ago, Roger3006 said:

 

If NVIDIA prosumer/consumer GPUs will support 10-bit color, the AMDs are heading for eBay. It has always been my impression that the only way to get 10-bit color depth was with a professional card. The only exception I found was the AMD Vega Frontier Edition.

From what I know, the professional cards like the Quadros have specific strengths where they outperform a consumer card, but those are features consumers would not use anyway. In the current generations (9, 10 and 20 series) lots of these features were made available to consumers, like RTX now supporting NVLink, the "better SLI". For most people a professional GPU makes no sense.


42 minutes ago, Klemmbrett said:

From NVIDIA's whitepaper for the 1080: it mentions support for up to 12-bit Rec. 2020, which is more than 10-bit 100% Adobe RGB :)
And the 1080 is a pure consumer card, so I would guess the same information holds for the Titan X as well.

 


 

Our terminology may be confusing. To me, HDR in photography means exactly what it says. My Nikons are capable of, I believe, 13 stops of dynamic range. HDR processing is when identical shots with different exposures are combined to create an image that has a higher dynamic range than the camera is capable of reproducing.
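For what it's worth, a "stop" is a doubling of light, so the dynamic-range figure converts directly to a contrast ratio (the 13 stops used here is the figure quoted above, not a measured value):

```python
# Each photographic stop doubles the light, so usable contrast ratio ~ 2 ** stops.
for stops in (12, 13, 14):
    print(f"{stops} stops ≈ {2 ** stops:,}:1 scene contrast")

# 13 stops ≈ 8,192:1 -- bracketing and merging exposures (HDR) is how you capture
# a scene whose contrast exceeds what a single exposure can record.
```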

 

I have a couple of LG monitors that are 10-bit, display 1.07 billion colors, and support 100% sRGB but not 99% Adobe RGB. I believe it has to do with accuracy. In my opinion, one is splitting hairs when comparing sRGB to Adobe RGB from a practical standpoint. For me, the only time it matters is when printing. It would take a very good eye to see the difference between an image displayed in sRGB or 99% Adobe RGB. It may be me trying to convince myself my old eyes can see the difference.
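To put rough numbers on the sRGB vs Adobe RGB difference: Adobe RGB's green primary lies outside the sRGB gamut, which is exactly the saturated green/cyan region that tends to matter in print. A minimal sketch in linear light using the commonly published D65 matrices (transfer functions and rendering intents deliberately ignored):

```python
import numpy as np

# Linear RGB -> XYZ matrices (D65 white point), as commonly published.
ADOBE_RGB_TO_XYZ = np.array([[0.5767, 0.1856, 0.1882],
                             [0.2973, 0.6274, 0.0753],
                             [0.0270, 0.0707, 0.9911]])
SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])
XYZ_TO_SRGB = np.linalg.inv(SRGB_TO_XYZ)

adobe_green = np.array([0.0, 1.0, 0.0])          # fully saturated Adobe RGB green
srgb = XYZ_TO_SRGB @ (ADOBE_RGB_TO_XYZ @ adobe_green)
print(np.round(srgb, 3))   # roughly [-0.40  1.00 -0.04]: negative channels mean
                           # this color falls outside what sRGB can represent.
```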

 

Thanks again,

 

R


3 minutes ago, Roger3006 said:

I have a couple of LG monitors that are 10-bit, display 1.07 billion colors, and support 100% sRGB but not 99% Adobe RGB. I believe it has to do with accuracy. In my opinion, one is splitting hairs when comparing sRGB to Adobe RGB from a practical standpoint. For me, the only time it matters is when printing. It would take a very good eye to see the difference between an image displayed in sRGB or 99% Adobe RGB. It may be me trying to convince myself my old eyes can see the difference.

Well, I'm currently not sure what display to buy, because part of me wants high-refresh-rate gaming with G-Sync and the other part wants an HDR display with nice colors, brightness and contrast. Currently the only display that does both is 2-2.5K euro, depending on stock and where you look. I don't necessarily need 4K; 2560x1440 QHD is fine, but there you can't find a display that does it all. Not willing to spend 2K+ on a display at the moment.


You are right about Pro GPUs being very good at some things and not others. The big deal with a Pro GPU is a certified driver that produces accurate results in things that do not concern me. I am not simulating anything. I just want accurate color, specifically 99% Adobe RGB. It is little known that consumer cards are capable of 10-bit Adobe RGB. It was a pleasant surprise. Granted, a little late, but nothing that cannot be undone.

 

There have been many discussions about this topic in photography forums. I have never seen it mentioned that any of the NVIDIA consumer cards are capable of 10-bit color, much less 99% Adobe RGB. As mentioned above, the only non-certified GPU that advertises true 10-bit color is the AMD Vega Frontier Edition. Between now and the end of the weekend I will test one of my GTX 1080 Ti GPUs with the BenQ SW271, which is designed specifically for photography.

 

I started a similar thread on a photography forum.  I am anxious to see what is discussed.

 

Thanks again,

 

R


2 hours ago, Klemmbrett said:

Well, I'm currently not sure what display to buy, because part of me wants high-refresh-rate gaming with G-Sync and the other part wants an HDR display with nice colors, brightness and contrast. Currently the only display that does both is 2-2.5K euro, depending on stock and where you look. I don't necessarily need 4K; 2560x1440 QHD is fine, but there you can't find a display that does it all. Not willing to spend 2K+ on a display at the moment.

I can't help you on the gaming side, but I can tell you what I am currently using. One workstation has a BenQ PD2700U and a BenQ SW271. The SW271 was designed for photographers and does support 99% Adobe RGB. The PD2700U is excellent but only displays 100% sRGB. Both are 10-bit and I like both. They do not have a fast refresh rate, which, to my understanding, is important for gaming.

 

The other machine has one BenQ PD2700U and two LG 27UD68P-B monitors. The latter are not used for editing, just looking. All are more than adequate. One of the LGs is not even on my desk; it is supported by an arm in a position that is easily visible from my shooting position. It displays the image I just shot, or I use it in "Live View", meaning I can see what the camera sees before I take the shot. All are helpful with my old eyes, especially the sharpness I am getting with 4K resolution. It has come a long way from the IBM PC I bought in 1984.

 

Thanks again,

 

R


8 hours ago, Roger3006 said:

 

There have been many discussions about this topic in photography forums. I have never seen it mentioned that any of the NVIDIA consumer cards are capable of 10-bit color, much less 99% Adobe RGB. As mentioned above, the only non-certified GPU that advertises true 10-bit color is the AMD Vega Frontier Edition. Between now and the end of the weekend I will test one of my GTX 1080 Ti GPUs with the BenQ SW271, which is designed specifically for photography.

Didn't you look at my earlier reply?

I showed you how I enabled 10-bit color on my 1080 Ti, which is very much a consumer card.

But I think I can only do that with a DisplayPort cable, and with 10-bit color enabled on my LG monitor.

 

11 hours ago, Origami Cactus said:

Go to the NVIDIA Control Panel (right-click on the desktop);

Go to "Display", "Change resolution".

Then set the color depth:


 

I only see your reply if you @ me.



11 hours ago, Origami Cactus said:

Didn't you look at my earlier reply?

I showed you how I enabled 10-bit color on my 1080 Ti, which is very much a consumer card.

But I think I can only do that with a DisplayPort cable, and with 10-bit color enabled on my LG monitor.

 

 

Sorry, I did see your post and thought I had commented on it. I don't always get the option; however, when I do, it makes no difference.

 

I have done calibrations with both the workstation cards and the NVIDIA Titan RTX. I get the same results regardless of whether it is set to 8-bit or 10-bit.

 

The monitors are assigned a custom color profile after calibration. I am not sure, but that might override everything.

The good news is great results. I am getting a tremendous boost for a whole lot less money, performance/$$$.


  • 4 months later...

I have a 1080 connected to a pair of P2715Q displays. The displays are advertised and spec'd as 10-bit, but the 1080 is not capable of 10-bit. I have a similar question regarding the Titan RTX: can it push full 4K over DP to two displays in 10-bit?
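For the bandwidth part of that question, here is a rough back-of-the-envelope check. It ignores blanking intervals (which add several percent), assumes DisplayPort 1.4 outputs, and assumes each monitor hangs off its own connector so each display gets a full link to itself:

```python
def payload_gbps(width, height, hz, bits_per_channel, channels=3):
    """Raw pixel data rate in Gbit/s, ignoring blanking and protocol overhead."""
    return width * height * hz * bits_per_channel * channels / 1e9

needed = payload_gbps(3840, 2160, 60, 10)        # 4K60 RGB at 10 bpc
print(f"4K60 @ 10 bpc needs ~{needed:.1f} Gbit/s per display")   # ~14.9 Gbit/s

# Effective (post-8b/10b) DisplayPort bandwidth per connector:
#   DP 1.2 (HBR2): ~17.28 Gbit/s   DP 1.4 (HBR3): ~25.92 Gbit/s
# A single DP 1.4 link therefore has comfortable headroom for 4K60 at 10 bpc,
# and each connector is an independent link, so two displays do not share it.
```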

[screenshot attachments: display settings]

VR Snob ... looking for ultimate full-power full-portable no-compromise VR Box ... Streacom's DA2 starting to look good ...

