Why isn't there more than 32-bit color?

Nukemann

Junior Member
Mar 29, 2014
18
0
0
I remember that circa 1997 or 1998, 24-bit/16.7-million-color "True Color" was the maximum display color quality, and by ~2000, 32-bit "True Color" became the norm. But why haven't there been any improvements in color quality since? FP precision has increased, so technically the GPU runs FP32 (128-bit color) or FP64 (256-bit color?) in the rendering units, but display output is only 32-bit? What's preventing output from going over 32-bit color? Would you be able to notice the difference between 32-bit and 64-bit color?
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
What's preventing output from going over 32-bit color? Would you be able to notice the difference between 32-bit and 64-bit color?

Your eyeballs are the limiter. Why deal with the performance hit from 64/128/256bit color when the result will still look the same to human retinas?
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
And just to be clear, most consumer monitors are still 24-bit. Some are 18-bit with dithering to appear like 24-bit. They use 32-bit color on the PC side only because it is a size that computers handle easily, even though it never holds more than 24 bits of color (30 bits in rare cases).

When I say 24-bit, I'm referring to all 3 colors (red, green, blue) combined. Monitors advertise the size of one of those channels rather than all 3 added together. So you can find 6-, 8- and 10-bit monitors, which, when the channels are put together into a single color, add up to 18-bit, 24-bit and 30-bit, and that is what gets stored in that 32-bit color value.
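
As a rough sketch of that packing (0xAARRGGBB is one common layout; real APIs differ on channel order, so treat this as an illustration rather than how any particular driver does it):

```python
# Pack three 8-bit channels (plus an 8-bit alpha) into one 32-bit word.
# 0xAARRGGBB is one common layout; actual layouts vary by API/platform.

def pack_argb8888(r, g, b, a=255):
    """Return a 32-bit integer holding 8-bit A, R, G, B fields."""
    return (a << 24) | (r << 16) | (g << 8) | b

def unpack_argb8888(pixel):
    """Split a 32-bit ARGB word back into its four 8-bit channels."""
    return ((pixel >> 24) & 0xFF,   # alpha
            (pixel >> 16) & 0xFF,   # red
            (pixel >> 8) & 0xFF,    # green
            pixel & 0xFF)           # blue

white = pack_argb8888(255, 255, 255)      # 0xFFFFFFFF
print(hex(white), unpack_argb8888(white))
```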
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Before such colours can be useful the monitor must be calibrated far better than is possible today.
 

Pottuvoi

Senior member
Apr 16, 2012
416
2
81
Your eyeballs are the limiter. Why deal with the performance hit from 64/128/256bit color when the result will still look the same to human retinas?
This is only true with low dynamic range monitors.
Proper HDR monitors certainly need more than 256 values for brightness to work properly.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
At any one time, 24 bit colour seems like a reasonable approximation of the human eye's capabilities. But because the iris can change the amount of light coming in, we can in actuality see a substantially wider range of brightness levels at those colours, and right now games are simulating that in software with high dynamic range (HDR) algorithms. Basically they render in 64 bit/128 bit float colour, then a shader program looks at the brightness range of the scene, chooses a particular mid point, and sets the colour appropriately for the monitor. The game then also needs to simulate the iris movement, so over a number of frames the mapping is changed. This is all due to limitations in the monitors: they aren't capable of outputting the full brightness and colour range of human perception.
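
Roughly, that mapping step looks something like this (a minimal Reinhard-style sketch in Python/NumPy rather than real shader code; the 0.18 "key" value is a common textbook default, not what any particular engine uses):

```python
import numpy as np

# Minimal tone-mapping sketch: take floating-point HDR radiance values,
# pick a "mid point" from the scene's average brightness, and squash the
# result into the 0-255 range a standard monitor can display.
def tonemap(hdr, key=0.18):
    luminance = 0.2126 * hdr[..., 0] + 0.7152 * hdr[..., 1] + 0.0722 * hdr[..., 2]
    log_avg = np.exp(np.mean(np.log(luminance + 1e-6)))   # scene "mid point"
    scaled = hdr * (key / log_avg)                         # exposure adjustment
    mapped = scaled / (1.0 + scaled)                       # compress highlights
    return np.clip(mapped * 255.0, 0, 255).astype(np.uint8)

# Fake HDR frame: radiance values well above 1.0 (e.g. a bright sky).
frame = np.random.rand(4, 4, 3).astype(np.float32) * 50.0
print(tonemap(frame))
```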

The problem, however, is that it's actually quite hard to make a monitor that is better. We have 10 bit/30 bit monitors, but they are quite expensive; most people actually buy 6 bit/18 bit monitors which dither to try and simulate the RGB (24 bit colour) space as best as possible. But we know that doesn't give us the complete colour space accurately, and it certainly doesn't cover all the colours we can see. Read http://www.tftcentral.co.uk/articles/pointers_gamut.htm on gamut and colour spaces; it's really interesting and explains a lot about the 6 bit/8 bit/10 bit issues and why none of it is actually sufficient yet to match the colours we can see.

But in essence the problem is the hardware: we can't produce a higher bit depth monitor yet. Although this does make me think that a design for a 4- or 5-colour monitor with 10 bits on each channel might really shake the market up, as it would look fantastic in comparison to the current crop of monitors.
 
Last edited:

Kippa

Senior member
Dec 12, 2011
392
1
81
Don't the mainstream and high end 4K monitors support 10bit depth (not counting the ultra low end)?
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
Don't the mainstream and high end 4K monitors support 10bit depth (not counting the ultra low end)?

Haha, no. Almost all of them are actually 6 bit + dithering (FRC); they aren't even 8 bit RGB, let alone attempting to be 10 bit Adobe RGB. Not to mention the fact that most people don't even have a 10-bit pipeline anyway (another problem with GPUs they don't tell you about).

Is there even bandwidth assigned in DisplayPort for 10 bit colour? I don't even know if there is. I don't really feel that is a failure of knowledge on my part; it's a failure of the review sites to do their jobs. I can't even go and look in the specification, because it's locked behind a paywall and you need to be a member of VESA to read it.

But simply, the answer is no: I don't know of any 4K monitors that have been released that are even remotely 10 bit; as far as I know they are 6 bit FRC.
 

Hi-Fi Man

Senior member
Oct 19, 2013
601
120
106
32-bit colour usually refers to 24-bit colour (8 bits per channel) with an 8-bit alpha channel.
 

Mand

Senior member
Jan 13, 2014
664
0
0
Mostly, it's that we can't control the gradations in the LCDs finely enough to get more bit depth.

Most TN panels are 6-bit with dithering. The Asus ROG Swift is a true 8-bit display, which is unusual for TN. Most IPS displays aimed at professional use are 8-bit, but some of them have 10-bit color (1024 levels per channel). But you're really hitting the limit of LCDs at that point. The next step up would be 12-bit color, which is 4096 levels; that's well beyond what we can drive LCD cells to accurately.
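
To put rough numbers on those steps (trivial arithmetic, but it shows how quickly the level count grows):

```python
# Levels per channel and total colors for the bit depths discussed above.
for bits in (6, 8, 10, 12):
    levels = 2 ** bits              # distinct steps per R/G/B channel
    total = levels ** 3             # combined colors across all three channels
    print(f"{bits}-bit/channel: {levels:5d} levels, {total:,} colors total")
```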

Bandwidth over the display cables and through the GPU is another matter entirely, but that's trivial compared to actually displaying the levels required.

I don't know of any 4k monitors that have been released that are even remotely 10 bit.

http://www.amazon.com/ASUS-PQ321Q-31.../dp/B00DJ4BIKA

The ASUS PQ321Q True 4K UHD monitor is a 10-bit RGB panel that delivers more natural transitions between different hues and offers a more flawless image than ever before.
I'm not clear on whether this is true 10-bit or whether it is 8-bit plus dithering; the product pages tend to be a bit vague. This one is 8-bit plus dither, but it's also a hell of a lot cheaper.
 
Last edited:

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
Before such colours can be useful the monitor must be calibrated far better than is possible today.

You have no idea what you're talking about.

Infraction issued for personal attack.
-- stahlhart
 
Last edited by a moderator:

dorion

Senior member
Jun 12, 2006
256
0
76
Bandwidth over the display cables and through the GPU is another matter entirely, but that's trivial compared to actually displaying the levels required.

DisplayPort, DVI, and HDMI have support for 10-bit color at various resolutions. In HDMI it's called Deep Color and it's supposed to offer 30-bit, 36-bit, and 48-bit color. (Windows) GPU support seems to be mostly limited to the Nvidia Quadro and AMD FirePro series.

So besides spending an extra $2000 on your GPU and an extra $500 on your monitor, it's quite trivial.
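
For what it's worth, the raw pixel rate really is the easy part; a back-of-the-envelope sketch (uncompressed video only, ignoring blanking intervals and link encoding overhead, so real cables need headroom beyond these numbers):

```python
# Rough uncompressed video bandwidth for a few bit depths at 4K 60 Hz.
width, height, refresh = 3840, 2160, 60

for bits_per_channel in (8, 10, 12):
    bits_per_pixel = bits_per_channel * 3
    gbps = width * height * refresh * bits_per_pixel / 1e9
    print(f"{bits_per_channel} bpc ({bits_per_pixel}-bit color): {gbps:.1f} Gbit/s")
```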
 

Mark R

Diamond Member
Oct 9, 1999
8,513
14
81
High end monitors for professional use (e.g. graphic design, publishing, medical) do use 30 bit color (i.e. 10 bits for R, G and B).

The difference in performance between these and regular monitors is very difficult to spot, but regular monitors can show slightly more banding. The issue becomes more apparent with "wide gamut" monitors.

A "wide gamut" monitor can display a much richer selection of colors, but at 8 bit, it still only has 16.7 to choose from; as a result, individual shades are further apart, and banding tends to be a lot more visible. This can be very objectionable if you are doing software (or GPU) color correction to emulate a narrow gamut, such as sRGB.

The real limiting factor is lack of software support; most OSs don't really offer any useful "30 bit" support. Most 30 bit cards basically use driver tricks: the card runs in a 30 bit mode, but emulates a 24 bit mode for the OS - so when the OS sends data to the card, the hardware translates it to 30 bit mode, and vice versa. However, when applications use hardware accelerated functions, the card actually renders in 30 bit direct to VRAM.

This type of trickery tends to be restricted to pro-level cards, and it requires specific software support in the application.

Finally, it is important to distinguish between the panel color resolution, and the display input resolution.

The lowest end panels typically have 6 bit digital-to-analog converters. As this gives a very obviously unsatisfactory picture, many such displays use temporal dithering (also called 3D dithering or FRC) to simulate near 8 bit performance.

Some panels are 8 bit, and the display may or may not use FRC to provide near 10-bit performance. Higher end panels may be 10 or 12 bit natively, and FRC may be used to emulate even higher color resolutions. Certainly, in the very high end market, I've come across displays with 12 bit panels using FRC to achieve a "14 bit" response.
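
A toy model of that temporal dithering (real FRC patterns are considerably more sophisticated than simple averaging, but the principle is the same):

```python
# Fake an 8-bit level on a 6-bit panel by alternating between the two
# nearest 6-bit levels so that the average over several frames lands
# close to the target.
def frc_frames(target_8bit, frames=4):
    exact = target_8bit / 255 * 63              # where the level falls on a 6-bit scale
    low, high = int(exact), min(int(exact) + 1, 63)
    high_count = round((exact - low) * frames)  # frames that show the upper step
    return [high] * high_count + [low] * (frames - high_count)

target = 128                                    # an 8-bit level between two 6-bit steps
sequence = frc_frames(target)
average_as_8bit = sum(sequence) / len(sequence) / 63 * 255
print(sequence, f"averages to ~{average_as_8bit:.1f} (target {target})")
```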

But what is the point of these super-high native color resolutions for a panel, if the monitor and computer only support 8, or maybe 10 bit? The answer is for calibration. You cannot calibrate more precisely than one "step" in the panel response. By having finer panel response, you can more precisely match each input to the desired pixel brightness.
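
A rough sketch of why the extra panel bits help there (the gamma-2.2 target curve is just an example correction, not a real calibration workflow):

```python
# When each 8-bit input is mapped through a correction curve, a panel with
# finer native steps can land closer to the exact target value.
def worst_case_error(panel_bits, gamma=2.2):
    panel_max = 2 ** panel_bits - 1
    worst = 0.0
    for level in range(256):
        target = (level / 255) ** gamma                  # desired output, 0..1
        nearest = round(target * panel_max) / panel_max  # closest step the panel can hit
        worst = max(worst, abs(nearest - target))
    return worst

for bits in (8, 10, 12, 14):
    print(f"{bits}-bit panel: worst-case error {worst_case_error(bits):.5f} of full scale")
```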
 
Last edited:

Throckmorton

Lifer
Aug 23, 2007
16,830
3
0
And just to be clear, most consumer monitors are still 24-bit. Some are 18-bit with dithering to appear like 24-bit. They use 32-bit color on the PC side only because it is a size that computers handle easily, even though it never holds more than 24 bits of color (30 bits in rare cases).

When I say 24-bit, I'm referring to all 3 colors (red, green, blue) combined. Monitors advertise the size of one of those channels rather than all 3 added together. So you can find 6-, 8- and 10-bit monitors, which, when the channels are put together into a single color, add up to 18-bit, 24-bit and 30-bit, and that is what gets stored in that 32-bit color value.

There is no 32 bit color. When you choose 32 bit in Windows it's really 24 bit. I have no idea why they call it 32 bit!
 

Mand

Senior member
Jan 13, 2014
664
0
0
There is no 32 bit color. When you choose 32 bit in Windows it's really 24 bit. I have no idea why they call it 32 bit!

The other 8 bits are for the alpha channel, used for transparency typically. It's useful for image processing, but not so much for sending pixels out to a display, as the display only cares about what the end result color it has to show is. It's not going to show two semi-transparent pixels simultaneously - it's going to show one merged pixel in 24-bit color space.
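
For example, a standard "over" blend merges the two before anything reaches the display (the color values here are arbitrary, just to show the idea):

```python
# Standard "over" alpha blend: the GPU/compositor merges a semi-transparent
# foreground pixel with whatever is behind it, and only the resulting opaque
# 24-bit color is sent to the display.
def blend_over(fg_rgb, fg_alpha, bg_rgb):
    a = fg_alpha / 255
    return tuple(round(f * a + b * (1 - a)) for f, b in zip(fg_rgb, bg_rgb))

foreground = (255, 0, 0)       # red pixel...
alpha = 128                    # ...at roughly 50% opacity
background = (0, 0, 255)       # over a blue pixel
print(blend_over(foreground, alpha, background))   # -> one merged opaque pixel
```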
 
Last edited:

cytg111

Lifer
Mar 17, 2008
23,538
13,109
136
High end monitors for professional use (e.g. graphic design, publishing, medical) do use 30 bit color (i.e. 10 bits for R, G and B).

Nice. Thanks for that.
 