MandicReally

10-Bit color, Quadro or Blackmagic Decklink?

Recommended Posts

Posted · Original PosterOP

So I'm going to be moving to a GH5 as my primary camera very soon. One factor in this decision is 10-bit color. I already have a 4K 10-bit (8-bit+FRC) monitor as my reference monitor while editing. However, I've only just realized that consumer graphics cards apparently only put out 8-bit except in DirectX applications, which is quite useless for video editing.

 

So I'm looking at what makes sense for me. I need a new graphics card soon anyway, but the price of even a Quadro P4000 is silly for what the card is. So I'm trying to figure out if a Blackmagic Decklink would allow me to use a consumer card for acceleration and the Decklink for my 10-bit display. This would also let me give DaVinci Resolve a better shot, since I hate single-monitor editing.

 

Is anyone familiar with the Decklinks, and whether one of them will allow me to do what I need? I dislike that they don't have a DisplayPort version from what I can tell, but I think HDMI should be OK.
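For context on the bit depths being discussed in this thread, the jump from 8-bit to 10-bit quadruples the levels per color channel. A quick sketch of the arithmetic (RGB assumed):

```python
# Levels per channel and total colors at a given bit depth.

def levels(bits_per_channel: int) -> int:
    """Distinct values one color channel can represent."""
    return 2 ** bits_per_channel

def total_colors(bits_per_channel: int, channels: int = 3) -> int:
    """Total displayable colors for an RGB signal."""
    return levels(bits_per_channel) ** channels

print(levels(8), f"{total_colors(8):,}")    # 256 per channel, 16,777,216 colors
print(levels(10), f"{total_colors(10):,}")  # 1024 per channel, 1,073,741,824 colors
```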


Ryzen 7 2700X , Gigabyte AX370-Gaming-K5, EKWB Monoblock, Nvidia GTX 980, Bitspower Full coverage block, g.Skill Ripjaws V 32GB PC2800, Dual Alphacool Pump/Res Combos, Dual EKWB SE360 Radiators, Corsair RM750x PSU. All in a Thermaltake Tower 900 case.

27 minutes ago, MandicReally said:

So I'm going to be moving to a GH5 as my primary camera very soon. One factor in this decision is 10-bit color. I already have a 4K 10-bit (8-bit+FRC) monitor as my reference monitor while editing. However, I've only just realized that consumer graphics cards apparently only put out 8-bit except in DirectX applications, which is quite useless for video editing.

 

So I'm looking at what makes sense for me. I need a new graphics card soon anyway, but the price of even a Quadro P4000 is silly for what the card is. So I'm trying to figure out if a Blackmagic Decklink would allow me to use a consumer card for acceleration and the Decklink for my 10-bit display. This would also let me give DaVinci Resolve a better shot, since I hate single-monitor editing.

 

Is anyone familiar with the Decklinks, and whether one of them will allow me to do what I need? I dislike that they don't have a DisplayPort version from what I can tell, but I think HDMI should be OK.

Well, HDMI 1.2 will be OK. You get your 10-bit with 4K resolution. Most good 4K monitors are 10-bit and whatnot.


Asus Sabertooth x79 / 4930k Hexa @ 4617 @ 1.440v / Gigabyte WF 2080 RTX / Corsair VG 64GB @ 1915 & AX1600i & H115i Pro @ 2x Noctua NF-A14 / Carbide 330r Blackout Edition

AOC 40" 4k Curved Monitor / Samsung 40" 4k TV Flat 120 MR / LaCie Porsche Design 2TB & 500GB / Samsung 950 Pro 500GB / 850 Pro 500GB / Crucial m4 500GB / m.2 Card

Scarlett 2i2 Audio Interface / KRK Rokits 10" / Sennheiser HD 650 & HD 4.50BT SE / Logitech G910 & G700s & C920 / SL 88 Grand / Cakewalk / Win10 / NF-A14 Intake / NF-P12 Exhaust


HDMI 1.2 can only do 2K.

 

OP what is your budget? 

 

The new Radeon VII cards may be able to do 10-bit.

 

I'll look into it in a bit
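On the HDMI-version question above, a back-of-envelope data-rate calculation makes the constraint concrete. The figures below count active pixels only; TMDS encoding and blanking intervals add further overhead, so real margins are tighter than they look:

```python
# Uncompressed video data rate vs. approximate HDMI link capacity.
# Active-pixel figures only; 8b/10b encoding and blanking shrink the
# usable margin further.

def data_rate_gbps(width, height, fps, bits_per_channel, channels=3):
    """Raw active-pixel data rate in gigabits per second."""
    return width * height * fps * bits_per_channel * channels / 1e9

HDMI_1_4_MAX = 10.2  # Gbps link rate
HDMI_2_0_MAX = 18.0  # Gbps link rate

uhd60_10bit = data_rate_gbps(3840, 2160, 60, 10)  # ~14.93 Gbps
uhd60_8bit  = data_rate_gbps(3840, 2160, 60, 8)   # ~11.94 Gbps
uhd30_10bit = data_rate_gbps(3840, 2160, 30, 10)  # ~7.46 Gbps

# 4K60 at any bit depth is out of reach for HDMI 1.4, and 10-bit 4K60 RGB
# presses against HDMI 2.0's limit too, which is why spec sheets pair
# 10-bit 4K60 over HDMI 2.0 with 4:2:2 chroma subsampling.
print(f"{uhd60_10bit:.2f} {uhd60_8bit:.2f} {uhd30_10bit:.2f}")
```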


Current Build

Spoiler

System

  • CPU
    Ryzen 2700x
  • Motherboard
    ASrock x470 Fatal1ty k4
  • RAM
    16GB
  • GPU
    EVGA RTX 2080 Ti Black
  • Case
    Corsair 570x
  • Storage
    480gb SSD
  • PSU
    Thermaltake Smart M 650W 80+ Bronze
  • Display(s)
    27 inch Dell S2716DG
  • Cooling
    Wraith Prism
  • Keyboard
    Razer Huntsman
  • Mouse
    Corsair M65 pro
  • Sound
    Beyerdynamic dt770
  • Operating System
    Windows 10

 

Posted · Original PosterOP
4 minutes ago, Emanbaird said:

HDMI 1.2 can only do 2K.

 

OP what is your budget? 

 

The new Radeon VII cards may be able to do 10-bit.

 

I'll look into it in a bit

I haven't heard anything about the VII and 10-bit yet. Its predecessor, the Vega 64, didn't support it, and since the VII is the same architecture I doubt it does. The AMD FirePro cards do, however.

 

My budget is as little as I can get away with. I have a very small side business doing videography that is growing but not yet profitable, so the more I put out, the worse it is. It's also my primary hobby, which mitigates the pain a bit. However, I deal with car stuff, so color accuracy is going to become more and more important for me.



OP, I've been looking into it, and it looks like you may be getting bullied into getting a FirePro or a Quadro. I'm seeing some evidence that the Vega Frontier supports 10-bit, but online there seems to be some confusion between DirectX full-screen 10-bit and 10-bit in professional applications.


Posted · Original PosterOP
36 minutes ago, Emanbaird said:

OP, I've been looking into it, and it looks like you may be getting bullied into getting a FirePro or a Quadro. I'm seeing some evidence that the Vega Frontier supports 10-bit, but online there seems to be some confusion between DirectX full-screen 10-bit and 10-bit in professional applications.

Yeah, that's what I keep coming up with as well: folks not understanding the difference between the 10-bit scenarios, and that the only cards that support it are professional cards.

The issue is that I'm having a hard time wading through Blackmagic's information, but it seems to indicate that their Decklink series CAN output 10-bit from a very affordable card. However, they only have SDI or HDMI outputs, and any affordable 10-bit professional monitors I've found have 30 Hz HDMI inputs, which is just a no-go for real use.



Do they make an SDI-to-DP converter that would support 4K, I wonder?

 

I've been using SDI for a while on my CRT BVMs and PVMs; it's the best quality I've ever gotten.

 


Posted · Original PosterOP
55 minutes ago, Emanbaird said:

Do they make an SDI-to-DP converter that would support 4K, I wonder?

 

I've been using SDI for a while on my CRT BVMs and PVMs; it's the best quality I've ever gotten.

 

The only thing I can find is SDI to HDMI 2.0. It seems like other people are using HDMI 2.0 to DP 1.2 active converters, which is probably my best overall option. But Blackmagic doesn't support that or make one themselves, so it could just not work and I'd be up a creek.

 

It's looking like I either have to do that or save my pennies and buy a used Quadro P5000, which still wouldn't allow me to use DaVinci Resolve, but Blackmagic is making that unappealing anyway.



OP, I would call AMD or post on their forums and ask if the VII can do 10-bit, because I know the VII is based on one of their pro cards.

 


Posted · Original PosterOP
50 minutes ago, Emanbaird said:

OP, I would call AMD or post on their forums and ask if the VII can do 10-bit, because I know the VII is based on one of their pro cards.

 

I completely forgot that folks noted it's just a repurposed workstation card. I'll have to look into it. That would give me 16GB of VRAM, be half the price of a used P5000, and make my system fully AMD (like that really matters).


7 hours ago, Emanbaird said:

OP, I would call AMD or post on their forums and ask if the VII can do 10-bit, because I know the VII is based on one of their pro cards.

 

6 hours ago, MandicReally said:

I completely forgot that folks noted it's just a repurposed workstation card. I'll have to look into it. That would give me 16GB of VRAM, be half the price of a used P5000, and make my system fully AMD (like that really matters).

10-bit support in OpenGL applications is, unfortunately, still limited to AMD's FirePro and Nvidia's Quadro cards. It's not so much a matter of those cards having a different or superior architecture, but rather the drivers that accompany them.

 

That being said, if you absolutely need the precision of 10-bit for some reason, then using an SMPTE-approved video card to a broadcast reference display over SDI is the way to go.

Outputting a 10-bit signal to your 8-bit monitor is not going to give you that reference image.

The monitor simply doesn't have the capability to show the full set of colors. 

You need a professionally calibrated, reference quality monitor. Without it, any fancy signal being provided to your end display is worthless.
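The precision argument above can be made concrete with a small quantization sketch: a subtle gradient spanning only a couple of percent of the brightness range lands on very few distinct 8-bit codes, which is exactly what shows up on screen as banding.

```python
# Why smooth gradients band at 8 bits: count the distinct codes available
# across a narrow brightness span (0..1 normalized signal assumed).

def step_size(bits: int) -> float:
    """Smallest representable brightness increment at a given bit depth."""
    return 1.0 / (2 ** bits - 1)

# A subtle sky gradient covering 2% of the brightness range:
lo, hi = 0.50, 0.52

steps_8  = round((hi - lo) / step_size(8))    # distinct 8-bit values available
steps_10 = round((hi - lo) / step_size(10))   # distinct 10-bit values available
print(steps_8, steps_10)  # 5 20 -> 8-bit shows visible bands, 10-bit stays smooth
```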

 


With all of these driver mods flying around, I wonder if someone could edit a driver for it.


Posted · Original PosterOP
5 hours ago, LyondellBasell said:

 

That being said, if you absolutely need the precision of 10-bit for some reason, then using an SMPTE-approved video card to a broadcast reference display over SDI is the way to go.

Outputting a 10-bit signal to your 8-bit monitor is not going to give you that reference image.

The monitor simply doesn't have the capability to show the full set of colors. 

You need a professionally calibrated, reference quality monitor. Without it, any fancy signal being provided to your end display is worthless.

 

That is a fairly ridiculous stance, quite honestly. Is the setup you're referring to ideal? Yes, but saying that 10-bit output on anything other than the best of the best is pointless is ridiculous.

 

True 10-bit monitors are available for around $1,000 and are calibrated for color work. The 4K 8-bit+FRC (faux 10-bit) monitor I have is a factory-calibrated Dell unit. Colors look better on that monitor than on either of my other two, and it seems to better represent what I see across a wide spectrum of monitors.

 

How can improving the display accuracy of what I'm seeing be a bad thing? It could always be better, but saying that incremental improvements are pointless unless you go all the way is just stupid. If I'm shooting 4:2:2 10-bit footage, why wouldn't I want to view it as clearly as I can afford to?
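On the 8-bit+FRC point debated here: FRC ("frame rate control") approximates a 10-bit level by rapidly alternating two adjacent 8-bit levels so they average out to an in-between value. A minimal illustrative sketch; the dithering pattern below is simplified, not any panel's actual algorithm:

```python
# Simplified FRC (temporal dithering): approximate one 10-bit level by
# cycling 8-bit frames whose time-average matches it.

def frc_cycle(level_10bit: int, n_frames: int = 4) -> list[int]:
    """8-bit values for one 4-frame FRC cycle approximating a 10-bit level.
    Illustrative only; real panels combine spatial and temporal patterns."""
    base, frac = divmod(level_10bit, 4)  # 10-bit = 4 * 8-bit + remainder
    hi = min(base + 1, 255)              # clamp at the 8-bit ceiling
    return [hi if i < frac else base for i in range(n_frames)]

frames = frc_cycle(513)                   # 10-bit 513 = 8-bit 128, remainder 1
avg_as_10bit = sum(frames) / len(frames) * 4
print(frames, avg_as_10bit)               # [129, 128, 128, 128] 513.0
```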


Posted · Original PosterOP

The Radeon VII supports 10-bit as best as I can tell. It's had some bugs with 10-bit display that the new drivers are going to address.

 

So the Radeon VII may be for me if someone makes a good waterblock for it.

 

 

Screenshot_20190212-121112.png



Support for 10-bit color/signal works differently with a Quadro GPU and a broadcast card like a Decklink.  Do not assume they provide the same solution to your requirements.  A Decklink card should be paired with a broadcast/reference monitor that supports 10-bit signals, while a Quadro should be paired with a computer monitor that supports 10-bit color.

On 2/11/2019 at 3:07 AM, MandicReally said:

That is a fairly ridiculous stance, quite honestly. Is the setup you're referring to ideal? Yes, but saying that 10-bit output on anything other than the best of the best is pointless is ridiculous.

I'm sorry you view it as ridiculous.

 

The point I'm trying to convey is that there's no point in chasing a 10-bit workflow pipeline if the display, the end link in the chain, isn't 10-bit. If you TRULY need 30-bit color for critical color work, then get an actual 10-bit reference monitor that supports the full spectrum you need to work in.

 

It is perfectly okay to use an 8-bit monitor to work on photos and video. The majority of people will not notice, and the majority of consumer-level tasks will not *greatly* benefit from the increased colors available in 10-bit. 

On 2/11/2019 at 3:07 AM, MandicReally said:

True 10-bit monitors are available for around $1,000 and are calibrated for color work. The 4K 8-bit+FRC (faux 10-bit) monitor I have is a factory-calibrated Dell unit. Colors look better on that monitor than on either of my other two, and it seems to better represent what I see across a wide spectrum of monitors.

You're right! Having a properly *calibrated* monitor of any sort is FAR more valuable and impactful than any sort of extra color tech you can buy. I would definitely recommend your Dell over your other two for any work you're doing in this field.

 

On 2/11/2019 at 3:07 AM, MandicReally said:

How can improving the display accuracy of what I'm seeing be a bad thing? It could always be better, but saying that incremental improvements are pointless unless you go all the way is just stupid. If I'm shooting 4:2:2 10-bit footage, why wouldn't I want to view it as clearly as I can afford to?

It's not a bad thing; it's just pointless. If you're going to the trouble and expense of making sure all your applications, display adapters, and drivers support a 10-bit workflow, *just* to send that information to a monitor that's going to be "inventing" those extra colors by alternating flashes... you shouldn't bother. Save the money and buy more camera gear, lenses, or an external recorder. If you can't justify the expense by naming a specific scenario in which your end product or customer has been negatively impacted by not working in a larger color space, it's better to save the money for something else that will forward your creative endeavors in a more impactful way.

