
Apple Mac OS X now supports 1.07 billion colors (10-bit color per channel)

GoodBytes

Apple reveals that, with the El Capitan version of Mac OS X, it will now support 30-bit color on select systems.

This joins Windows, which has supported 30-bit color since Windows XP. Windows 7 supports up to 48-bit color, if memory serves correctly.

[Image: Fullbleed.jpg]

Source: http://www.engadget.com/2015/10/30/apple-imac-5k-el-capitan-billions-colors/

What does this mean? Absolutely nothing for ~97-99% of users; the same goes for users on the Windows side.

In order to enjoy the 1.07+ billion colors, you need:

  • A supported monitor (the iMac 4K and 5K displays are believed to support it).
  • A supported graphics card (select FirePro and Quadro cards only).
  • Supported software (the OS itself will look the same, as everything is rendered in 8-bit color per channel, or ~16.7 million colors if you prefer; you need Photoshop or other select supported software).
  • And content. Yes! If your camera is like the majority of cameras on the market, it doesn't capture 10-bit color per channel unless you shoot in RAW format. So 10-bit content is not everywhere.
What does it mean for professionals who already enjoy 30-bit color on their setups?

Simply put, they can now work with pictures with zero visible dithering on gradients. Why? Because they no longer need the dithering technique(s) applied by their photo editor (or by the camera). Dithering is a technique of blending colors, usually on gradients (the effect of passing from one color to another in a smooth way), so that the transition looks smooth, because 16.7 million colors isn't actually a lot of colors to work with.

Let me explain:

Let's take an example. Without dithering, an image taken straight from your camera can look like this:

[Image: The_Foston_by_pass_on_the_A1_North — a photo where the sky shows visible stepping]

 

As you can see, the sky is not smooth; you see what is called stepping, despite the image being in 16.7 million colors (8 bits per channel). 16.7 million colors might sound like a lot, but it really isn't.

 

This is important for professionals working with 10-bit-per-channel printers, like printers for posters or magazines, so that pictures are displayed in the best possible way. Dithering is normally applied to the images your camera produces so that they look smooth. If you use 10-bit-per-channel equipment, you don't need dithering at all, whether applied by you, your image software, or your camera.

What are 'channels'?

Channels are the base colors that get mixed together to produce every color in the overall image.

Red, green, and blue are examples of channels.

So when we say a monitor is, for example, "8-bit color", it always means 8 bits per channel, and because we are talking about non-specialized monitors we mean RGB (unless we are talking about printers, in which case we talk about CMYK, another example of a set of channels).

So, 8-bit per channel on RGB is:

  • 8 bits of red = 2^8 = 256 shades of red
  • 8 bits of green = 2^8 = 256 shades of green
  • 8 bits of blue = 2^8 = 256 shades of blue
Mixed together, that gives you: 256 * 256 * 256 = 16,777,216 colors, or nicely rounded down: 16.7 million colors.

So a 10-bit panel (2^10 = 1024 shades per channel) gives you 1024 * 1024 * 1024 = 1,073,741,824 colors, or nicely rounded down: 1.07 billion colors.
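If you want to double-check the arithmetic, here is a tiny Python sketch of the same calculation (the bit depth is the only input; nothing here is tied to any particular monitor):

# Total displayable colors for a given bit depth per channel (3 RGB channels).
def total_colors(bits_per_channel, channels=3):
    shades = 2 ** bits_per_channel   # shades available per channel
    return shades ** channels        # every combination of the channels

print(total_colors(8))   # 16777216   -> ~16.7 million colors
print(total_colors(10))  # 1073741824 -> ~1.07 billion colors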

 

What is the problem with dithering?

It adds a grain effect to the areas of an image where the color shifts from one color to another (this is called a gradient), like the sky above, which passes between two shades of color from the top to the center of the image. Basically, you lose 'definition' by mixing pixels where the stepping happens in order to make it less visible.

 

For example:

Image source: http://www.itexico.com/blog/bid/99548/Mobile-Design-Considerations-Throughout-the-UI-Design-Process

[Image: dithering.png — the same picture twice: left without dithering (visible stepping), right with dithering applied]

 

If you have a decent display, you should see the stepping on the left; the right shows the same image with the dithering technique applied to hide the stepping. Notice the grain effect that you can see in the lens. You can still kind of see the stepping, but it is much harder to spot.

 

So if you have ever wondered why your background picture has a grain effect, including the Windows 10 default wallpaper where the smoke and light intersect, that is why. It's not that the image is highly compressed; it's that 16.7 million colors is not enough.
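If you want to reproduce the effect yourself, here is a minimal Python sketch (I'm using NumPy and Pillow purely for illustration; any image library would do) that quantizes a smooth gradient down to very few shades, once plainly and once with simple random-noise dithering, so the stepping and the grain trade-off become obvious:

import numpy as np
from PIL import Image

# A smooth horizontal gradient, 0.0 -> 1.0, 256 pixels tall and 1024 wide.
gradient = np.tile(np.linspace(0.0, 1.0, 1024), (256, 1))

levels = 8  # deliberately very few shades so the stepping is easy to see

# Plain quantization: hard steps (banding) appear across the gradient.
banded = np.round(gradient * (levels - 1)) / (levels - 1)

# Noise dithering: add a little random noise before quantizing, so the
# hard steps are broken up into grain instead of visible bands.
noise = (np.random.rand(*gradient.shape) - 0.5) / (levels - 1)
dithered = np.clip(np.round((gradient + noise) * (levels - 1)) / (levels - 1), 0.0, 1.0)

Image.fromarray((banded * 255).astype(np.uint8)).save("banded.png")
Image.fromarray((dithered * 255).astype(np.uint8)).save("dithered.png")

Open banded.png and dithered.png side by side: the first shows clean stepping, the second trades it for exactly the grain described above.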

 

If it is so great, and we have had the technology since 'forever', why aren't we switching everything to 10-bit color per channel instead of staying at 8-bit color per channel?

  • Monitors that support 10-bit color are costly, even those that support it via FRC (a technique of rapidly switching between two colors the panel can produce to make you believe you are seeing a color it can't natively display; such panels are not true 10-bit panels; see the sketch after this list). People on this very forum already have trouble investing in a true 8-bit panel and instead opt for a 6-bit panel with FRC to emulate 8-bit color per channel, even when they pick an IPS panel.

  • Game and image files will be bigger, making them heavier to download. 30-bit images (10 bits per channel for red, green, and blue) carry more color information and therefore consume more storage space. So they take up more HDD/SSD space and take longer to download; websites will take more time to load due to the larger images, and server load will increase, for not much real-life benefit.

  • Related to the above point, games will consume more GPU memory. Textures stored at 30-bit (10 bits per channel) instead of 24-bit (8 bits per channel) take more space, which means more GPU memory is needed to hold them and work with them (apply them to objects and the environment). It can also reduce performance due to the increased processing required when shader effects are applied to the texture.

  • Consumers don't demand it.
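To make the FRC technique from the first point above concrete, here is a rough Python sketch of the principle only (not of any vendor's actual implementation): an 8-bit panel approximates a 10-bit shade by flickering between the two nearest shades it can actually produce, in the right proportion over successive frames:

def frc_frames(value_10bit, num_frames=4):
    # Approximate a 10-bit shade (0-1023) on an 8-bit panel (0-255) by
    # alternating between the two nearest 8-bit shades over num_frames.
    low = value_10bit // 4               # nearest 8-bit shade below
    high = min(low + 1, 255)             # nearest 8-bit shade above
    remainder = value_10bit % 4          # how far between the two shades we are
    return [high if i < remainder else low for i in range(num_frames)]

print(frc_frames(514))  # [129, 129, 128, 128] -> averages to 128.5 over time

Averaged over time your eye sees roughly 128.5, a shade the panel cannot natively display, which is why FRC panels get marketed as 10-bit without being 'true' 10-bit.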
But it is nice to see that Apple now finally supports it, and (in my opinion) it will probably push this in its advertising: how its products can push 1.07 billion colors while your PC can't, and how crummy things look in comparison, even though they will compare a cheap ~$80 TN monitor (which, to be honest, is what most people have) against a high-end, high-pixel-density consumer-grade IPS panel in a high-DPI environment, which would definitely highlight the wow effect and probably help push sales.

This is great for all the Mac users out there :) especially for colour-sensitive work



Took long enough....



48-bit? FFS that's stupid. Most people can't even tell the difference when adding 1 to a color channel at 8 bits per channel.

It's useful for extremely precise color toning, but even there it's pointlessly accurate.



48-bit? FFS that's stupid. Most people can't even tell the difference when adding 1 to a color channel at 8 bits per channel.

Microsoft always tries to go a step beyond, so that its OS doesn't put a limit on the technology.

 

For example, Windows 7 supports up to 192GB of RAM.

Make that 512GB of RAM with Windows 8, and 2TB, yes! 2TB! in Windows 10.

Windows Server 2012 users get to enjoy support for up to 4TB.

https://msdn.microsoft.com/en-ca/library/windows/desktop/aa366778%28v=vs.85%29.aspx#physical_memory_limits_windows_10

 

Can you get me 2TB of RAM? (DDR4 in 2 sticks preferably) :P


Microsoft always tries to go a step beyond, so that its OS doesn't put a limit on the technology.

 

For example, Windows 7 supports up to 192GB of RAM.

Make that 512GB of RAM with Windows 8, and 2TB, yes! 2TB! in Windows 10.

Windows Server 2012 users get to enjoy support for up to 4TB.

https://msdn.microsoft.com/en-ca/library/windows/desktop/aa366778%28v=vs.85%29.aspx#physical_memory_limits_windows_10

2TB RAM? Why would anyone have that?


2TB RAM? Why would anyone have that?

Servers -- although any server that would likely use that much would also (hopefully) be split into multiple servers.

Plus, long ago people also asked why anyone would need more than 64kb of ram...so



Only like 10 years behind the rest of the industry... Better late than never though.

48-bit? FFS that's stupid. Most people can't even tell the difference when adding 1 to a color channel at 8 bits per channel.

I am pretty sure most people could tell the difference between 8-bit and 10-bit if you put them side by side and the image was a very fine gradient.

Microsoft always tries to go a step beyond, so that its OS doesn't put a limit on the technology.

 

For example, Windows 7 supports up to 192GB of RAM.

Make that 512GB of RAM with Windows 8, and 2TB, yes! 2TB! in Windows 10.

Windows Server 2012 users get to enjoy support for up to 4TB.

https://msdn.microsoft.com/en-ca/library/windows/desktop/aa366778(v=vs.85).aspx#physical_memory_limits_windows_10

 

Can you get me 2TB of RAM? (DDR4 in 2 sticks preferably) :P

Well, I wouldn't say always (they have often put artificial limitations on their lower-tier editions), but for their Ultimate editions and such they usually pull out all the stops.

I am pretty sure most people could tell the difference between 8-bit and 10-bit if you put them side by side and the image was a very fine gradient.

It would reduce or eliminate the grain effect you see when zoomed in on pictures that contain a gradient, like a picture of the sky.


This is great for all the Mac users out there :) especially for colour-sensitive work

Came here to say this. It might have been around longer on Windows, but there aren't really many things on Windows that make use of it, compared to OS X applications, which often promote that they support xx-bit colour.



And all this is useless unless you have a Mac Pro with FirePros in it...



Servers -- although any server that would likely use that much would also (hopefully) be split into multiple servers.

Plus, long ago people also asked why anyone would need more than 64kb of ram...so

Why would a server use standard W10?


NOW they have 10-bit color?!   :wacko:  I assumed they'd at least have 10-bit color on their Mac Pros.

 

 

 


Came here to say this. It might have been around longer on Windows, but there aren't really many things on Windows that make use of it, compared to OS X applications, which often promote that they support xx-bit colour.

They do on the Windows side. Why do you think Photoshop is so popular? It is part of its "must have" features, and this is just an example. It is critical in any software used by professionals who work in the movie industry and the printing industry (I mean those making high-quality posters, magazines, and photo edits they expect to be printed on a high-quality printer, not the shitty pharmacy/supermarket Kodak printers/stations or a home color printer).


Why would a server use standard W10?

It wouldn't. The option is just there. Same with many things: not necessarily used completely, but if it doesn't hamper anything, why not?

"Unofficially Official" Leading Scientific Research and Development Officer of the Official Star Citizen LTT Conglomerate | Reaper Squad, Idris Captain | 1x Aurora LN


Game developer, AI researcher, Developing the UOLTT mobile apps


G SIX [My Mac Pro G5 CaseMod Thread]


They do on the Windows side. Why do you think Photoshop is so popular? It is part of its "must have" features, and this is just an example. It is critical in any software used by professionals who work in the movie industry and the printing industry (I mean those making high-quality posters, magazines, and photo edits they expect to be printed on a high-quality printer, not the shitty pharmacy/supermarket Kodak printers/stations or a home color printer).

 

Yeah, I know, but honestly (as a hobby photographer), when it comes to Windows I could have guessed that Photoshop supports 10-bit color since it also supports that on OS X, but beyond that I wouldn't know any other (non-Adobe) software on Windows that supports 10-bit. It is probably more common than I suspect, but when I'm on my Mac I just type "photo batch converter" or something similar into the App Store and it finds me multiple apps that have 10-bit color capability, included or as an additional purchase.

But when I enter that on Google or the Windows (Win10) app store, I can find a few converters, but without any info regarding color depth.

Since I never looked into the supported color depth of either OS, I'm surprised that Apple has to catch up on this, knowing its fair share of the hobbyist/professional market when it comes to creative content creators.

 

(Sorry if that doesn't make sense and it's a mess; I had trouble finding the right words to describe it.)

 

NOW they have 10-bit color?!   :wacko:  I assumed they'd at least have 10-bit color on their Mac Pros.

 

They already did; if you reread the OP you can see it says 10-bit per channel, coming together as 30-bit total.



Updated the news post to improve the explanation: it now shows the limitation of 16.7 million colors, what dithering is, how images compare with and without it, and what it means to you.


Updated the news post to improve the explanation: it now shows the limitation of 16.7 million colors, what dithering is, how images compare with and without it, and what it means to you.

Coolio. Btw, 10bit color is one of those things that just never seemed to hit mainstream. I mean I understand why, but it's amusing that it's been around for a decade plus and outside of extremely niche areas, it's basically been stagnant.

Perhaps with the focus on IPS and OLED increasing color accuracy, we will see more incentive to deliver 10-bit and higher color.



Coolio. Btw, 10bit color is one of those things that just never seemed to hit mainstream. I mean I understand why, but it's amusing that it's been around for a decade plus and outside of extremely niche areas, it's basically been stagnant.

Perhaps with the focus on IPS and OLED increasing color accuracy, we will see more incentive to deliver 10-bit and higher color.

It is more that consumers must demand it. If they demand it from AMD or Nvidia, then they can 'unlock' it in future graphics cards instead of making it exclusive to select Quadros. Sure, we are back to no content and no free, simple software that supports it, but it would be a starting point.

And for those who wonder, yes, DisplayPort does support 4K 60Hz at 10-bit colors, so don't worry about that.
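For those who want the back-of-the-envelope math, here is a rough uncompressed estimate (my own numbers, ignoring blanking overhead, and assuming DisplayPort 1.2's ~17.28 Gbit/s effective data rate):

width, height, refresh, bits_per_pixel = 3840, 2160, 60, 30   # 4K at 60 Hz, 10 bits per RGB channel
required_gbit_s = width * height * refresh * bits_per_pixel / 1e9
print(required_gbit_s)  # ~14.93 Gbit/s, which fits under DisplayPort 1.2's ~17.28 Gbit/s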


It is more that consumers must demand it. If they demand it from AMD or Nvidia, then they can 'unlock' it in future graphics cards instead of making it exclusive to select Quadros. Sure, we are back to no content and no free, simple software that supports it, but it would be a starting point.

And for those who wonder, yes, DisplayPort does support 4K 60Hz at 10-bit colors, so don't worry about that.

Well, I meant more that if consumers have been demanding better color just from screen tech, then perhaps that alone will make monitor makers tout 10-bit color again, and that in turn brings pressure on graphics makers to push it mainstream.

That's how the resolution wars started anyways.



Well, I meant more that if consumers have been demanding better color just from screen tech, then perhaps that alone will make monitor makers tout 10-bit color again, and that in turn brings pressure on graphics makers to push it mainstream.

That's how the resolution wars started anyways.

Market share is still minimal, and Nvidia and AMD are probably going "meh, there is no content in any case... GeForce/Radeon is for gaming". So consumers should really demand it, and not just buy 10-bit-via-FRC monitors. It's also about competition: if AMD does it, you can bet Nvidia will enable it in future cards.

Also, will reviewers care? For example, any GeForce 900 series user can notice that AA stops at 8x, while 700 series and older cards had 16x. Why? Because no reviewer ever mentioned it (go check yourself, zero mentions), since AMD doesn't support it and so they can't compare it with anything. So the feature was cut, and so far no one has even noticed the option being gone. It also means that in games, when you simply pick the "Ultra/Max" preset, AMD users get 8x and Nvidia users now only get 8x instead of 16x, so Nvidia GPUs don't run slower than AMD's.

