
GPU advice for planned build (10-bit worth it?)

PaulC1

Hey everyone, hope you are all well!

 

I'm just looking for some advice on an upcoming build I'm planning for my friend's dad and was hoping some of you would be able to provide some insight. I'll try and keep this short.

 

I have pretty much everything for the build sorted apart from the GPU. Basically I am stuck deciding between a 'standard' card (probably a 1650 Super or 5500 XT, or similar) and a Radeon Pro card (either a WX 3100 or WX 3200) - the Quadro options in the <£200 price range seem largely underwhelming, which is why I'm only considering the Radeon Pros here.

 

The person who will eventually be using this PC uses Photoshop professionally on an almost daily basis. For this, he has a Wacom Cintiq Pro which is 4K and 10-bit colour capable. That last part is where my dilemma stems from, as it is my understanding that only the Quadro and Radeon/FirePro series of cards are capable of displaying full, true 10-bit colour in Photoshop and similar software. However, what I don't know is whether this would actually be of any real benefit to him. The work he normally creates is 2-D, comic-style drawings - not what would usually be considered intensely colour-complicated, I suppose - but it is still his daily line of work and I want to try and give him the best possible experience.

 

I have personally never used photoshop for an extensive amount of time or ever created colour-sensitive digital artwork, so as it stands I really have no idea how much of a real-world impact having true 10-bit colour would make to his overall experience. The 10-bit capability is the only reason I'm considering a Radeon Pro card - he won't be using this PC for 3-D modelling, physical simulations or really any of the other 'Pro' features these cards enable. I just know his Cintiq can display 10-bit.
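For anyone else weighing this up, the raw difference between the two depths is easy to quantify: each extra bit per channel doubles the number of tonal steps, which is mainly noticeable as reduced banding in smooth gradients. A quick sketch of the arithmetic:

```python
# Compare per-channel tonal steps and total RGB colours at 8-bit vs 10-bit.
for bits in (8, 10):
    shades = 2 ** bits        # distinct levels per colour channel
    colours = shades ** 3     # total combinations across R, G and B
    print(f"{bits}-bit: {shades} shades/channel, {colours:,} total colours")
# 8-bit:  256 shades/channel,  16,777,216 total colours
# 10-bit: 1024 shades/channel, 1,073,741,824 total colours
```

So 10-bit gives four times as many steps per channel - which matters most in wide, subtle gradients (skies, vignettes), and far less in flat, hard-edged comic-style colouring.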

 

So, what do you all think? In every other way, the Radeon Pro card seems to be worse than, say, a GTX card at a similar price, but then again I have never experienced 10-bit so can't say whether it is a significant upgrade over standard 8-bit colour depth. Does anyone here use Photoshop in this way, or know if 10-bit is only advantageous in very controlled, specific circumstances? I would rather save him £40-£50 if he won't truly be able to use this feature.

 

Regardless of the final decision, he is currently driving his Cintiq from an old iMac's on-board Intel graphics, which locks the Cintiq to 1440p and 8-bit - so either way, I'm sure his experience will be greatly improved. He also doesn't intend to game on this machine, nor, as far as I know, has he ever used 10-bit in the past.

 

Any help with this will be much appreciated, thank you! :)

Mirror's Edge 2. One day.


Hey there,

The 5500 XT should be a good workhorse for this job - Radeon usually holds up well against GeForce in productivity workloads - so I've linked a decent model for your use case:

https://skinflint.co.uk/2196514?hloc=uk&v=e 
As for 10-bit, I don't see why a DisplayPort 1.4 link shouldn't carry it just as well as an HDMI 2.0 port, so the achievable depth depends more on the monitor you're using than on the card tier.
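One caveat on the cable side: at 4K60 the two ports are not quite equivalent. A back-of-envelope check, assuming the standard CTA-861 4K60 pixel clock of 594 MHz (which includes blanking) and the nominal effective payload rates of roughly 14.4 Gbit/s for HDMI 2.0 and 25.92 Gbit/s for DisplayPort 1.4 HBR3:

```python
# Rough check: does 4K60 RGB at a given bit depth fit on each link?
# Assumptions: CTA-861 4K60 pixel clock (594 MHz, blanking included),
# HDMI 2.0 effective payload ~14.4 Gbit/s, DP 1.4 HBR3 ~25.92 Gbit/s.
PIXEL_CLOCK_HZ = 594e6
links = {"HDMI 2.0": 14.4e9, "DisplayPort 1.4": 25.92e9}

for bits_per_channel in (8, 10):
    needed = PIXEL_CLOCK_HZ * bits_per_channel * 3  # RGB, bits per second
    for name, capacity in links.items():
        verdict = "fits" if needed <= capacity else "does NOT fit"
        print(f"4K60 RGB {bits_per_channel}-bit over {name}: "
              f"needs {needed / 1e9:.2f} Gbit/s -> {verdict}")
```

By this estimate, 10-bit 4K60 RGB fits comfortably on DP 1.4 but exceeds HDMI 2.0's payload rate (HDMI 2.0 would need chroma subsampling to get there), so for a 4K 10-bit Cintiq the DisplayPort output is the safer bet.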
Cheers! 

CPU: AMD Ryzen 9 3900X | Motherboard: ASUS ROG Crosshair VIII Formula | GPU: AMD Radeon RX 6900XT | RAM: 2*16GB 3200MHz 14-14-14-34 G.Skill Trident Z Royal Silver | SSD: 2TB Gigabyte Aorus Gen 4; 500GB Samsung 860 EVO | CPU Cooler: Corsair H115i RGB Platinum | PSU: Corsair AX850 | Case: Corsair Crystal 680X | Monitor: Eve Spectrum 4K | Keyboard: Logitech G513 Romer-G Tactile | Mouse: Logitech G502 Lightspeed

