Ghetto 10-bit colour?

NeverNotExhausted

Evening gents, I've got a bit of a weird question to ask.

 

My partner is currently studying and working as a photographer, and she's been doing her editing work on a terrible MSI all-in-one i3 setup that's on the very verge of dying. So I'm planning to get her a nice rig and monitor to help her out. My intention is something along the lines of a Xeon E3-1231 v3 with a decent amount of RAM and a nice GPU for OpenCL, something like a 290 or 390 depending on prices at release. But my issue here is the lack of 10-bit colour output on consumer cards (at least I think so?), and I've not got the cash to shell out for a workstation card, a new rig and a hella awesome monitor.

 

So here's my theory, which I'm not sure will work: get an extremely low-end workstation card like a Quadro K420 and connect it to one of the monitor's DisplayPort inputs, with the consumer GPU connected to the other DisplayPort input. That way she gets the GPU grunt to game with, using the consumer card's output while gaming. But for photo and video editing she can switch over to the K420's input, get 10-bit colour for her editing needs, and still let the consumer GPU do the actual editing work. Essentially the K420 would just be a way of adding 10-bit output support to the editing rig, without ever really putting any strain on it.
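
For what it's worth, the compute half of that plan should work on its own: OpenCL applications enumerate every GPU the drivers expose and can target whichever device they like, regardless of which card the monitor cable is plugged into. A minimal sketch of what that looks like, using Python's pyopencl package (the name matching here is just illustrative, not how an editing app actually picks its device):

```python
import pyopencl as cl

# List every OpenCL device the installed drivers expose; with both cards
# fitted you'd expect the Radeon and the Quadro to show up separately.
for platform in cl.get_platforms():
    for device in platform.get_devices():
        print(platform.name, "->", device.name)

# Build a compute context on the Radeon so the heavy kernels never touch
# the K420. Filtering on the device name is a stand-in for however the
# editing app lets you choose a compute device.
gpus = [d for p in cl.get_platforms()
        for d in p.get_devices()
        if d.type & cl.device_type.GPU]
radeon = next((d for d in gpus if "Radeon" in d.name or "Hawaii" in d.name), None)
if radeon is not None:
    ctx = cl.Context(devices=[radeon])
    print("Compute context on:", radeon.name)
```

The K420 would still appear in that list; it just never gets handed the heavy work.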

 

Also, could I just have the K420 hooked up to the monitor at all times, have the consumer GPU do the work in games, and still get the same FPS as I would with the consumer GPU hooked straight up to the screen?

 

Is something like this possible or even logical to do?

 

 


My R9 290s in CCC were set to 10-bit colour by default (there's an 8-, 10- and 12-bit option, IIRC). I'm pretty sure they're 10-bit capable, or CCC is messed up. I'll look again later when I get home.

 

*edit* never mind, found a forum post elsewhere with a picture of the option in CCC.

[attached screenshot: the colour depth option in CCC]


 



 

Thanks anyways mate :)

 

 



 

Yeah, checked my own control center and same deal: can select from 8-, 10- and 12-bit color options for regular Hawaii GPUs.

[attached screenshot: the 8/10/12-bit colour option in Catalyst Control Center]


 



 

Yeah, I've seen this in a few other forum posts, and apparently it doesn't actually enable 10-bit colour output on consumer GPUs, so I'm not really sure what that option does. One forum (AnandTech, I think?) theorised that it controls the mode the panel works in but doesn't change the actual output of the card. It also shows up on my 7870 rig I've got kicking about, but I've not got a 10-bit panel to test the theory with yet :/
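
If it helps once you do have a panel to test with: the usual eyeball check is a wide greyscale gradient, which shows distinct bands on an 8-bit path and a much smoother ramp on a genuine end-to-end 10-bit path. A rough sketch for generating one (assuming Python with numpy and imageio installed):

```python
import numpy as np
import imageio.v3 as iio

# Wide enough that individual 8-bit steps become visible as bands.
WIDTH, HEIGHT = 3840, 400

# One horizontal ramp from black to white across the full 16-bit range.
ramp = np.linspace(0, 65535, WIDTH, dtype=np.uint16)
img = np.tile(ramp, (HEIGHT, 1))

# PNG stores 16 bits per channel, so the file itself loses nothing;
# whether you see 8 or 10 bits is then down to the app/GPU/driver/panel.
iio.imwrite("gradient_ramp.png", img)
```

Open it in a viewer with 30-bit display support on each of the two inputs and compare the banding.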

 

 

