Color depth


Color depth, also known as bit depth, is either the number of bits used to indicate the color of a single pixel, or the number of bits used for each color component of a single pixel. When referring to a pixel, the concept can be defined as bits per pixel. When referring to a color component, the concept can be defined as bits per component, bits per channel, bits per color, and also bits per pixel component, bits per color channel or bits per sample. Modern standards tend to use bits per component, but historical lower-depth systems used bits per pixel more often.
Color depth is only one aspect of color representation, expressing the precision with which the amount of each primary can be expressed; the other aspect is how broad a range of colors can be expressed. The definition of both color precision and gamut is accomplished with a color encoding specification which assigns a digital code value to a location in a color space.
The number of bits of resolved intensity in a color channel is also known as radiometric resolution, especially in the context of satellite images.

Comparison

Indexed color

With relatively low color depths, the stored value is typically a number representing an index into a color map or palette. The colors available in the palette itself may be fixed by the hardware or modifiable by software. Modifiable palettes are sometimes referred to as pseudocolor palettes.
Old graphics chips, particularly those used in home computers and video game consoles, often have the ability to use a different palette per sprite or tile in order to increase the maximum number of simultaneously displayed colors while minimizing use of then-expensive memory. For example, on the ZX Spectrum the picture is stored in a two-color format, but these two colors can be separately defined for each rectangular block of 8×8 pixels.
The palette itself has a color depth. While the best VGA systems only offered an 18-bit palette from which colors could be chosen, all color Macintosh video hardware offered a 24-bit palette. 24-bit palettes are nearly universal in recent hardware and in file formats that use palettes.
If instead the color can be derived directly from the pixel value, it is "direct color". Palettes were rarely used for depths greater than 12 bits per pixel, as the memory consumed by the palette would begin to exceed the memory needed to store every pixel in direct color.
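The memory trade-off between indexed and direct color can be sketched with a quick calculation. The figures below are illustrative only: they assume a 640×480 frame and 24-bit (3-byte) palette entries, not any particular hardware.

```python
# Sketch: memory cost of indexed color vs. direct color for one frame.
# Assumes a 640x480 frame and 3 bytes per palette entry (illustrative
# values, not tied to any specific system).

def indexed_bytes(width, height, bits_per_pixel):
    """Frame buffer bytes plus palette bytes for an indexed mode."""
    framebuffer = width * height * bits_per_pixel // 8
    palette = (2 ** bits_per_pixel) * 3      # 3 bytes per palette entry
    return framebuffer + palette

def direct_bytes(width, height, bits_per_pixel):
    """Frame buffer bytes for a direct-color mode (no palette)."""
    return width * height * bits_per_pixel // 8

# At 8 bpp the palette overhead (256 * 3 = 768 bytes) is negligible:
print(indexed_bytes(640, 480, 8))    # 307968 bytes
# A hypothetical 16 bpp palette would need 65536 * 3 = 196608 bytes,
# a large fraction of the 614400-byte frame buffer itself:
print(indexed_bytes(640, 480, 16))   # 811008 bytes
print(direct_bytes(640, 480, 16))    # 614400 bytes
```

At still higher depths the palette alone would outgrow the frame buffer, which is why palettes were rarely used beyond 12 bits per pixel.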

List of common depths

1-bit color

2 colors, often black and white, as direct color. Sometimes 1 meant black and 0 meant white, the inverse of modern standards. Most of the first graphics displays were of this type; the X Window System was developed for such displays, and a 1-bit display was assumed for a "3M computer". In the late 1980s there were professional 1-bit displays with resolutions up to 300 dpi, but color proved more popular.

2-bit color

4 colors, usually from a selection of fixed palettes. Gray-scale early NeXTstation, color Macintoshes, Atari ST medium resolution.

3-bit color

8 colors, almost always all combinations of full-intensity red, green, and blue. Many early home computers with TV displays, including the ZX Spectrum and BBC Micro.

4-bit color

16 colors, usually from a selection of fixed palettes. Used by IBM CGA, EGA, and by the least common denominator VGA standard at higher resolution. Color Macintoshes, Atari ST low resolution, Commodore 64, and Amstrad CPCs also supported 4-bit color.

5-bit color

32 colors from a programmable palette, used by the original Amiga chipset.

6-bit color

64 colors. Used by the Master System, Enhanced Graphics Adapter, GIME for TRS-80 Color Computer 3, Pebble Time smartwatch, and Parallax Propeller using the reference VGA circuit.

8-bit color

256 colors, usually from a fully programmable palette: Most early color Unix workstations, Super VGA, color Macintosh, Atari TT, Amiga AGA chipset, Falcon030, Acorn Archimedes. Both X and Windows provided elaborate systems that tried to allow each program to select its own palette, often resulting in incorrect colors in any window other than the one with focus.
Some systems placed a color cube in the palette for a direct-color system. Usually fewer levels of blue were provided than others, since the normal human eye is less sensitive to the blue component than to either red or green.
Popular sizes were:
  • 6×6×6, leaving 40 colors for a gray ramp, or for programmable palette entries.
  • 8×8×4: 3 bits each for R and G, 2 bits for B. The palette index can be computed from a color value without using multiplication. Used, among others, in the MSX2 series of computers.
  • a 6×7×6 color cube, leaving 4 colors for a programmable palette or grays.
  • a 6×8×5 cube, leaving 16 colors for a programmable palette or grays.
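The 8×8×4 cube maps directly onto a 3-3-2 bit layout, which is why no multiplication is needed: the index can be assembled with shifts and masks alone. A minimal sketch follows; the channel order shown (R in the top 3 bits, G next, B in the low 2) is one common convention, and actual systems such as the MSX2 may order the bits differently.

```python
# Sketch: quantizing a 24-bit RGB color to an 8x8x4 (3-3-2) palette
# index with shifts and masks only -- no multiplication. The bit
# layout used here (RRRGGGBB) is an assumed, common convention.

def rgb_to_332(r, g, b):
    """Quantize 8-bit R, G, B values to a single 3-3-2 palette index."""
    return (r & 0xE0) | ((g & 0xE0) >> 3) | (b >> 6)

print(rgb_to_332(255, 255, 255))  # 255 (white maps to the top index)
print(rgb_to_332(0, 0, 0))        # 0
print(rgb_to_332(255, 0, 0))      # 224 (pure red: index 0b11100000)
```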

12-bit color

4,096 colors, usually from a fully programmable palette. Some Silicon Graphics systems, Color NeXTstation systems, and Amiga systems in HAM mode have this color depth.
RGBA4444, a related 16 bpp representation providing the color cube and 16 levels of transparency, is a common texture format in mobile graphics.

High color (15/16-bit)

In high-color systems, two bytes (16 bits) are stored for each pixel. Most often, each component is assigned 5 bits, plus one unused bit; this allows 32,768 colors to be represented. An alternate assignment that gives the unused bit to the green channel allows 65,536 colors to be represented, but without transparency. These color depths are sometimes used in small devices with a color display, such as mobile phones, and are sometimes considered sufficient to display photographic images. Among the first hardware to use the standard were the Sharp X68000 and IBM's Extended Graphics Array.
The term "high color" has recently been used to mean color depths greater than 24 bits.

18-bit

Almost all of the least expensive LCDs provide 18-bit color to achieve faster color transition times, and use either dithering or frame rate control to approximate 24-bit-per-pixel true color, or throw away 6 bits of color information entirely. More expensive LCDs can display 24-bit color depth or greater.

True color (24-bit)

24-bit color almost always uses 8 bits each for R, G, and B. As of 2018, 24-bit color depth is used by virtually every computer and phone display and the vast majority of image storage formats. Almost all cases of 32 bits per pixel assign 24 bits to the color, with the remaining 8 used as an alpha channel or left unused.
2²⁴ gives 16,777,216 color variations. The human eye can discriminate up to about ten million colors, and since the gamut of a display is smaller than the range of human vision, this means the depth should cover that range with more detail than can be perceived. However, displays do not evenly distribute the colors in human perception space, so humans can see the changes between some adjacent colors as color banding. Monochromatic images set all three channels to the same value, resulting in only 256 different shades; some software attempts to dither the gray level into the color channels to increase this, although in modern software this technique is more often used for subpixel rendering to increase the spatial resolution on LCD screens, where the subpixels of each color have slightly different positions.
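The arithmetic behind these figures is simply the number of distinct codes each case allows:

```python
# 8 bits per channel across three channels gives 2**24 distinct codes,
# but a monochromatic image (R == G == B) can use only 2**8 of them.
full_color = 2 ** 24   # all 24-bit codes
gray_only = 2 ** 8     # codes where all three channels are equal

print(full_color)  # 16777216
print(gray_only)   # 256
```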
The DVD-Video and Blu-ray Disc standards support a bit depth of 8 bits per color in YCbCr with 4:2:0 chroma subsampling. YCbCr can be losslessly converted to RGB.
macOS and Mac OS refer to 24-bit color as "millions" of colors.
The term true color is sometimes used to mean what this article calls direct color. It is also often used to refer to all color depths greater than or equal to 24 bits.

Deep color (30-bit)

Deep color consists of a billion or more colors. 2³⁰ is 1,073,741,824. Usually this is 10 bits each of red, green, and blue (30 bits in total). If an alpha channel of the same size is added, then each pixel takes 40 bits.
Some earlier systems placed three 10-bit channels in a 32-bit word, with 2 bits unused; the Cineon file format, for example, used this. Some SGI systems had 10-bit digital-to-analog converters for the video signal and could be set up to interpret data stored this way for display. BMP files define this as one of their formats, and Microsoft calls it "HiColor".
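This 10-10-10-2 layout can be sketched as a simple bit-packing step. The channel order below (R in the highest bits) is an assumed convention; real formats differ on ordering and on what the spare 2 bits hold.

```python
# Sketch: three 10-bit channels packed into one 32-bit word with the
# top 2 bits left unused, in the style of the layout described above.
# Channel order (R highest) is an illustrative assumption.

def pack_rgb10(r, g, b):
    """Pack 10-bit R, G, B values (0-1023 each) into a 32-bit word."""
    assert 0 <= r < 1024 and 0 <= g < 1024 and 0 <= b < 1024
    return (r << 20) | (g << 10) | b

print(hex(pack_rgb10(1023, 1023, 1023)))  # 0x3fffffff -- top 2 bits clear
print(hex(pack_rgb10(0, 0, 0)))           # 0x0
```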
Video cards with 10 bits per component started coming to market in the late 1990s. An early example was the Radius ThunderPower card for the Macintosh, which included extensions for QuickDraw and Adobe Photoshop plugins to support editing 30-bit images. Some vendors call their 24-bit color depth with FRC panels 30-bit panels; however, true deep color displays have 10-bit or more color depth without FRC.
The HDMI 1.3 specification defines a bit depth of 30 bits.
Nvidia Quadro graphics cards manufactured after 2006 support 30-bit deep color, as do Pascal or later GeForce and Titan cards when paired with the Studio Driver, and some models of the Radeon HD 5900 series such as the HD 5970. The ATI FireGL V7350 graphics card supports 40- and 64-bit pixels.
The DisplayPort specification also supports color depths greater than 24 bpp in version 1.3 through "VESA Display Stream Compression, which uses a visually lossless low-latency algorithm based on predictive DPCM and YCoCg-R color space and allows increased resolutions and color depths and reduced power consumption."
At WinHEC 2008, Microsoft announced that color depths of 30 bits and 48 bits would be supported in Windows 7, along with the wide color gamut scRGB.
High Efficiency Video Coding defines the Main 10 profile, which allows for 8 or 10 bits per sample with 4:2:0 chroma subsampling. The Main 10 profile was added at the October 2012 HEVC meeting based on proposal JCTVC-K0109 which proposed that a 10-bit profile be added to HEVC for consumer applications. The proposal stated that this was to allow for improved video quality and to support the Rec. 2020 color space that will be used by UHDTV. The second version of HEVC has five profiles that allow for a bit depth of 8 bits to 16 bits per sample.
As of 2020, some smartphones have started using 30-bit color depth, such as the OnePlus 8 Pro, Oppo Find X2 & Find X2 Pro, Sony Xperia 1 II, Xiaomi Mi 10 Ultra, Motorola Edge+, ROG Phone 3 and Sharp Aquos Zero 2.