Computer monitor
A computer monitor is an output device that displays information in pictorial or textual form. A discrete monitor comprises a visual display, support electronics, power supply, housing, electrical connectors, and external user controls.
The display in modern monitors is typically an LCD with an LED backlight, which by the 2010s had replaced CCFL-backlit LCDs. Before the mid-2000s, most monitors used a cathode ray tube (CRT) as the image output technology. A monitor is typically connected to its host computer via DisplayPort, HDMI, USB-C, DVI, or VGA. Less commonly, monitors use other, sometimes proprietary, connectors and signals to connect to a computer.
Originally, computer monitors were used for data processing while television sets were used for video. From the 1980s onward, computers have been used for both data processing and video, while televisions have implemented some computer functionality. Since 2010, the typical display aspect ratio of both televisions and computer monitors has changed from 4:3 to 16:9.
Modern computer monitors are often functionally interchangeable with television sets and vice versa. As most computer monitors do not include integrated speakers, TV tuners, or remote controls, external components such as a DTA box may be needed to use a computer monitor as a TV set.
History
Early electronic computer front panels were fitted with an array of light bulbs where the state of each particular bulb would indicate the on/off state of a particular register bit inside the computer. This allowed the engineers operating the computer to monitor the internal state of the machine, so this panel of lights came to be known as the 'monitor'. As early monitors were only capable of displaying a very limited amount of information and were very transient, they were rarely considered for program output. Instead, a line printer was the primary output device, while the monitor was limited to keeping track of the program's operation.
Computer monitors were formerly known as visual display units, particularly in British English. This term mostly fell out of use by the 1990s.
Technologies
Multiple technologies have been used for computer monitors. Until the 21st century, most used cathode ray tubes, but they have largely been superseded by LCD monitors.
Cathode ray tube
The first computer monitors used cathode ray tubes. Prior to the advent of home computers in the late 1970s, it was common for a video display terminal using a CRT to be physically integrated with a keyboard and other components of the workstation in a single large chassis, typically limiting them to emulation of a paper teletypewriter, thus the early epithet of 'glass TTY'. The display was monochromatic and far less sharp and detailed than on a modern monitor, necessitating the use of relatively large text and severely limiting the amount of information that could be displayed at one time. High-resolution CRT displays were developed for specialized military, industrial and scientific applications, but they were far too costly for general use; wider commercial use became possible after the release of the slow but affordable Tektronix 4010 terminal in 1972.
Some of the earliest home computers were limited to monochrome CRT displays, but color display capability was already a possible feature for a few MOS 6500 series-based machines, and color output was a specialty of the more graphically sophisticated Atari 8-bit computers, introduced in 1979. These computers could be connected to the antenna terminals of an ordinary color TV set or used with a purpose-made CRT color monitor for optimum resolution and color quality. Lagging several years behind, in 1981 IBM introduced the Color Graphics Adapter, which could display four colors at a resolution of 320 × 200 pixels, or two colors at 640 × 200. In 1984 IBM introduced the Enhanced Graphics Adapter, which was capable of producing 16 colors at a resolution of 640 × 350.
By the end of the 1980s, color progressive-scan CRT monitors were widely available and increasingly affordable, and the sharpest prosumer monitors could clearly display high-definition video. Meanwhile, repeated efforts at HDTV standardization from the 1970s to the 1980s failed, leaving consumer SDTVs to stagnate increasingly far behind the capabilities of computer CRT monitors well into the 2000s. During the following decade, maximum display resolutions gradually increased and prices continued to fall as CRT technology remained dominant in the PC monitor market into the new millennium, partly because it remained cheaper to produce. CRTs still offer color, grayscale, motion, and latency advantages over today's LCDs, but improvements to the latter have made these much less obvious. The dynamic range of early LCD panels was very poor, and although text and other motionless graphics were sharper than on a CRT, an LCD characteristic known as pixel lag caused moving graphics to appear noticeably smeared and blurry.
Liquid-crystal display
There are multiple technologies that have been used to implement liquid-crystal displays (LCDs). Throughout the 1990s, the primary use of LCD technology in computer displays was in laptops, where the lower power consumption, lighter weight, and smaller physical size of LCDs justified the higher price versus a CRT. Commonly, the same laptop would be offered with an assortment of display options at increasing price points: monochrome, passive color, or active-matrix color. As volume and manufacturing capability improved, the monochrome and passive color technologies were dropped from most product lines.
TFT-LCD is a variant of LCD which is now the dominant technology used for computer monitors.
The first standalone LCD monitors appeared in the mid-1990s, selling for high prices. As prices declined, they became more popular, and by 1997 they were competing with CRT monitors. Among the first desktop LCD computer monitors were the Eizo FlexScan L66 in the mid-1990s, and the SGI 1600SW, Apple Studio Display and ViewSonic VP140 in 1998. In 2003, LCDs outsold CRTs for the first time, becoming the primary technology used for computer monitors. The physical advantages of LCDs over CRT monitors are that LCDs are lighter, smaller, and consume less power. In terms of performance, LCDs produce little or no flicker (reducing eyestrain), a sharper image at native resolution, and better checkerboard contrast. On the other hand, CRT monitors have superior blacks, viewing angles, and response time, and can use arbitrary lower resolutions without aliasing; their flicker can be reduced with higher refresh rates, though this flicker can also be used to reduce motion blur compared to less flickery displays such as most LCDs. Many specialized fields such as vision science remain dependent on CRTs, the best LCD monitors having achieved moderate temporal accuracy, and so can be used only if their poor spatial accuracy is unimportant.
High dynamic range (HDR) has been implemented in high-end LCD monitors to improve grayscale accuracy. Since around the late 2000s, widescreen LCD monitors have become popular, in part because television series, motion pictures and video games have transitioned to widescreen, which squarer monitors are poorly suited to displaying correctly.
Organic light-emitting diode
OLED monitors provide most of the benefits of both LCD and CRT monitors with few of their drawbacks, though much like plasma panels or very early CRTs they suffer from burn-in, and they remain very expensive.
Measurements of performance
The performance of a monitor is measured by the following parameters:
- Display geometry:
- * Viewable image size – is usually measured diagonally, but the actual width and height are more informative since they are not affected by the aspect ratio in the same way (a conversion sketch follows this list). For CRTs, the viewable size is typically smaller than the tube itself.
- * Aspect ratio – is the ratio of the horizontal length to the vertical length. Monitors usually have an aspect ratio of 4:3, 5:4, 16:10 or 16:9.
- * Radius of curvature – is the radius that a circle would have if it had the same curvature as the display. This value is typically given in millimeters, but expressed with the letter "R" instead of a unit.
- Color characteristics:
- * Luminance – measured in candelas per square meter.
- * Contrast ratio is the ratio of the luminosity of the brightest color to that of the darkest color that the monitor is capable of producing simultaneously. For example, a ratio of 20,000:1 means that the brightest shade is 20,000 times brighter than the darkest shade. Dynamic contrast ratio is measured with the black level taken while the LCD backlight is turned off, whereas ANSI contrast is measured with black and white patches displayed simultaneously and adjacent on screen (a worked sketch follows this list).
- * Color depth – measured in bits per primary color or bits for all colors. Monitors with 10 bpc (bits per channel) or more can display more shades of color than traditional 8 bpc monitors, and can do so more precisely without having to resort to dithering.
- * Gamut – measured as a subset of the CIE 1931 color space. sRGB and Adobe RGB have different gamuts, with Adobe RGB able to show a wider range of color than sRGB.
- * Color accuracy – measured in ΔE (delta E); the lower the ΔE, the more accurate the color representation. A ΔE below 1 is imperceptible to the human eye. A ΔE of 2–4 is considered good and requires a sensitive eye to spot the difference.
- * Viewing angle is the maximum angle at which images on the monitor can be viewed, without subjectively excessive degradation to the image. It is measured in degrees horizontally and vertically.
- Input speed characteristics:
- * Refresh rate is the number of times per second that the display is illuminated. In LCDs it is the number of times the image can be changed per second, expressed in hertz. It determines the maximum number of frames per second a monitor is capable of showing. The maximum refresh rate is limited by response time (see the timing sketch after this list).
- * Response time is the time a pixel in a monitor takes to change between two shades, usually measured grey-to-grey in milliseconds. The particular shades depend on the test procedure, which differs between manufacturers. In general, lower numbers mean faster transitions and therefore fewer visible image artifacts such as ghosting.
- * Input latency is the time it takes for a monitor to display an image after receiving it, typically measured in milliseconds.
- Power consumption is measured in watts.
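As a worked illustration of the display-geometry figures above, the following sketch converts a diagonal viewable size and an aspect ratio into width and height. It is a minimal Python example; the 27-inch 16:9 panel is an illustrative assumption, not a figure taken from this article.

```python
import math

def dimensions_from_diagonal(diagonal, ratio_w, ratio_h):
    """Return (width, height) in the same unit as the diagonal
    for a display with aspect ratio ratio_w:ratio_h."""
    hyp = math.hypot(ratio_w, ratio_h)   # diagonal of the aspect-ratio rectangle
    return diagonal * ratio_w / hyp, diagonal * ratio_h / hyp

# Illustrative example: a 27-inch 16:9 panel
w, h = dimensions_from_diagonal(27, 16, 9)
print(f"{w:.1f} x {h:.1f} in")           # ~23.5 x 13.2 in
```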
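The contrast-ratio and ΔE figures above reduce to simple arithmetic. The sketch below divides white luminance by black luminance and evaluates the CIE76 colour difference (ΔE*ab, the simplest of the ΔE formulas) between two CIELAB triples; the luminance and Lab numbers are illustrative assumptions, not measured data.

```python
import math

def contrast_ratio(white_cd_m2, black_cd_m2):
    """Static contrast ratio: brightest over darkest luminance (cd/m^2)."""
    return white_cd_m2 / black_cd_m2

def delta_e_cie76(lab1, lab2):
    """CIE76 colour difference between two (L*, a*, b*) triples."""
    return math.dist(lab1, lab2)          # Euclidean distance in Lab space

print(contrast_ratio(300, 0.3))           # 1000.0, i.e. a 1000:1 panel
print(delta_e_cie76((53.2, 80.1, 67.2),   # displayed red (illustrative)
                    (54.0, 78.5, 66.0)))  # reference red -> ΔE ≈ 2.2, "good"
```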
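Refresh rate and response time interact: at a given refresh rate each frame is shown for 1000/Hz milliseconds, and a grey-to-grey response time longer than that frame time means pixel transitions spill into the next frame, which is one reason the usable refresh rate is limited by response time. A small sketch with illustrative numbers:

```python
def frame_time_ms(refresh_hz):
    """Duration of one frame in milliseconds at a given refresh rate."""
    return 1000.0 / refresh_hz

GTG_MS = 5.0  # illustrative grey-to-grey response time
for hz in (60, 144, 240):
    ft = frame_time_ms(hz)
    verdict = "fits within" if GTG_MS <= ft else "exceeds"
    print(f"{hz} Hz: frame time {ft:.2f} ms; {GTG_MS} ms GtG {verdict} one frame")
```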