Display resolution

The display resolution or display mode of a digital television, computer monitor, or display device is the number of distinct pixels in each dimension that can be displayed. It can be an ambiguous term, especially as the displayed resolution is controlled by different factors in cathode ray tube displays, flat-panel displays, and projection displays using fixed picture-element arrays.
It is usually quoted as width × height, with the units in pixels: for example, 1024 × 768 means the width is 1024 pixels and the height is 768 pixels. This example would normally be spoken as "ten twenty-four by seven sixty-eight" or "ten twenty-four by seven six eight".
One use of the term display resolution applies to fixed-pixel-array displays such as plasma display panels, liquid-crystal displays, Digital Light Processing projectors, OLED displays, and similar technologies, and is simply the physical number of columns and rows of pixels creating the display. A consequence of having a fixed-grid display is that, for multi-format video inputs, all displays need a "scaling engine" to match the incoming picture format to the display.
For device displays such as phones, tablets, monitors and televisions, the use of the term display resolution as defined above is a misnomer, though common. The term display resolution is usually used to mean pixel dimensions, the maximum number of pixels in each dimension, which says nothing about the pixel density of the display on which the image is actually formed: resolution properly refers to the pixel density, the number of pixels per unit distance or area, not the total number of pixels. In digital measurement, the display resolution would be given in pixels per inch. In analog measurement, if the screen is 10 inches high, then the horizontal resolution is measured across a square 10 inches wide. For television standards, this is typically stated as "lines horizontal resolution, per picture height"; for example, analog NTSC TVs can typically display about 340 lines of "per picture height" horizontal resolution from over-the-air sources, which is equivalent to about 440 total lines of actual picture information from left edge to right edge.
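The distinction between pixel dimensions and pixel density can be made concrete with a little arithmetic: density is the pixel diagonal divided by the physical diagonal. A minimal sketch, with illustrative (assumed) screen sizes:

```python
import math

def pixels_per_inch(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixel density (PPI) from pixel dimensions and the physical diagonal."""
    diagonal_px = math.hypot(width_px, height_px)  # pixels along the diagonal
    return diagonal_px / diagonal_in

# Identical pixel dimensions, very different densities:
phone = pixels_per_inch(1920, 1080, 6.1)    # roughly 360 PPI
tv = pixels_per_inch(1920, 1080, 55.0)      # roughly 40 PPI
```

Both screens are "1920 × 1080", yet their resolution in the proper sense differs by almost an order of magnitude.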


Some commentators also use display resolution to indicate a range of input formats that the display's input electronics will accept and often include formats greater than the screen's native grid size even though they have to be down-scaled to match the screen's parameters. In the case of television inputs, many manufacturers will take the input and zoom it out to "overscan" the display by as much as 5% so input resolution is not necessarily display resolution.
The eye's perception of display resolution can be affected by a number of factors; see image resolution and optical resolution. One factor is the display screen's rectangular shape, which is expressed as the ratio of the physical picture width to the physical picture height. This is known as the aspect ratio. A screen's physical aspect ratio and the individual pixels' aspect ratio may not necessarily be the same. An array of 1280 × 720 (WXGA) on a 16:9 display has square pixels, but an array of 1024 × 768 (XGA) on a 16:9 display has oblong pixels.
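The square-versus-oblong distinction follows directly from dividing the display's aspect ratio by the pixel grid's aspect ratio. A small sketch using the WXGA and XGA grids mentioned above:

```python
from fractions import Fraction

def pixel_aspect_ratio(width_px: int, height_px: int, display_ar: Fraction) -> Fraction:
    """Shape of each pixel when the given grid fills a display of the
    stated aspect ratio; a value of 1 means square pixels."""
    storage_ar = Fraction(width_px, height_px)
    return display_ar / storage_ar

print(pixel_aspect_ratio(1280, 720, Fraction(16, 9)))   # 1 -> square pixels
print(pixel_aspect_ratio(1024, 768, Fraction(16, 9)))   # 4/3 -> oblong pixels
```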
An example of pixel shape affecting "resolution" or perceived sharpness: displaying more information in a smaller area using a higher resolution makes the image much clearer or "sharper". However, most recent screen technologies are fixed at a certain resolution; making the resolution lower on these kinds of screens will greatly decrease sharpness, as an interpolation process is used to "fix" the non-native resolution input into the display's native resolution output.
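The interpolation a fixed-grid panel applies to non-native input can be illustrated in its crudest form, nearest-neighbor resampling of a single row of pixels; this is a deliberately simplified sketch, not how a real scaling engine works:

```python
def nearest_neighbor(row: list, out_len: int) -> list:
    """Resample a row of pixel values to a new length by picking the
    nearest source sample for each output position."""
    in_len = len(row)
    return [row[i * in_len // out_len] for i in range(out_len)]

# A 4-pixel pattern stretched to 6 pixels: source samples repeat
# unevenly, which is perceived as softness or blockiness.
print(nearest_neighbor([0, 1, 2, 3], 6))  # [0, 0, 1, 2, 2, 3]
```

Real panels use smoother filters (bilinear, bicubic), but the loss of one-to-one pixel mapping is the same, which is why non-native resolutions look less sharp.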
While some CRT-based displays may use digital video processing that involves image scaling using memory arrays, ultimately "display resolution" in CRT-type displays is affected by different parameters such as spot size and focus, astigmatic effects in the display corners, the phosphor pitch of the shadow mask in color displays, and the video bandwidth.

Interlacing versus progressive scan

Overscan and underscan

Most television display manufacturers "overscan" the pictures on their displays, so that the edges of the nominal picture area fall outside the visible screen and the effective on-screen picture is somewhat smaller than the signal's pixel dimensions. The size of the invisible area somewhat depends on the display device. HD televisions do this as well, to a similar extent.
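The effect of overscan on the usable picture can be estimated with the roughly 5% figure quoted earlier; the symmetric-percentage model below is a common rule of thumb, not a standard:

```python
def visible_area(width: int, height: int, overscan_pct: float = 5.0) -> tuple:
    """Approximate visible picture size after symmetric overscan
    cropping of the given percentage per axis."""
    factor = 1 - overscan_pct / 100
    return round(width * factor), round(height * factor)

print(visible_area(720, 576))   # (684, 547): the edges are lost to overscan
```

This is why broadcast graphics are traditionally kept inside a "safe area" well away from the frame edges.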
Computer displays, including projectors, generally do not overscan, although many models allow it. CRT displays tend to be underscanned in stock configurations to compensate for the increasing distortions at the corners.

Current standards


Televisions have commonly used the following resolutions: 480i and 576i for standard definition, 720p (1280 × 720) and 1080i/1080p (1920 × 1080) for high definition, and 2160p (3840 × 2160, 4K UHD) and 4320p (7680 × 4320, 8K UHD) for ultra-high definition.
Computer monitors have traditionally possessed higher resolutions than most televisions.


In 2002, eXtended Graphics Array (XGA, 1024 × 768) was the most common display resolution. Many web sites and multimedia products were redesigned from the previous 800 × 600 format to layouts optimized for 1024 × 768.
The availability of inexpensive LCD monitors made the 5:4 aspect ratio resolution of 1280 × 1024 more popular for desktop usage during the first decade of the 21st century. Many computer users, including CAD users, graphic artists, and video game players, ran their computers at 1600 × 1200 resolution or higher, such as 2048 × 1536 (QXGA), if they had the necessary equipment. Other available resolutions included oversize aspects like SXGA+ and wide aspects like WXGA, WXGA+, WSXGA+, and WUXGA; monitors built to the 720p and 1080p standards were also not unusual among home media and video game players, due to their exact compatibility with movie and video game releases. A new more-than-HD resolution of 2560 × 1600 (WQXGA) was released in 30-inch LCD monitors in 2007.


By 2012, 1366 × 768 had become the most common display resolution.
In 2010, 27-inch LCD monitors with a 2560 × 1440 resolution were released by multiple manufacturers including Apple, and in 2012, Apple introduced a 2880 × 1800 display on the MacBook Pro. Panels for professional environments, such as medical use and air traffic control, support still higher resolutions.

Common display resolutions

The following table lists the usage share of display resolutions from two sources, as of June 2020. The numbers are not representative of computer users in general.
Standard    Aspect ratio    Width    Height    Steam    StatCounter
4K UHD      16:9            3840     2160      2.12%    n/a

When a computer display resolution is set higher than the physical screen resolution, some video drivers make the virtual screen scrollable over the physical screen, thus realizing a two-dimensional virtual desktop with its viewport. Most LCD manufacturers do make note of the panel's native resolution, as working at a non-native resolution on LCDs will result in a poorer image, due to the dropping of pixels to make the image fit or insufficient sampling of the analog signal. Few CRT manufacturers will quote a true native resolution, because CRTs are analog in nature and can vary their display from as low as 320 × 200 to as high as the internal board will allow, or until the image becomes too detailed for the vacuum tube to recreate. Thus, CRTs provide a variability in resolution that fixed-resolution LCDs cannot provide.
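The scrolling virtual desktop described above amounts to clamping a physical-screen-sized viewport inside the larger virtual screen as the pointer moves. A minimal sketch (the specific sizes are illustrative):

```python
def pan_viewport(virtual_w, virtual_h, screen_w, screen_h, pointer_x, pointer_y):
    """Top-left corner of the physical viewport centred on the pointer,
    clamped so it never scrolls past the virtual desktop's edges."""
    x = min(max(pointer_x - screen_w // 2, 0), virtual_w - screen_w)
    y = min(max(pointer_y - screen_h // 2, 0), virtual_h - screen_h)
    return x, y

# A 1920 x 1080 physical screen panning over a 2560 x 1440 virtual desktop:
print(pan_viewport(2560, 1440, 1920, 1080, 2560, 1440))  # (640, 360)
```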
In recent years the 16:9 aspect ratio has become more common in notebook displays. 1366 × 768 has become popular for most notebook sizes, while 1600 × 900 and 1920 × 1080 are available for larger notebooks.
As far as digital cinematography is concerned, video resolution standards depend first on the frames' aspect ratio in the film stock and then on the actual point count. Although there is not a unique set of standardized sizes, it is commonplace within the motion picture industry to refer to "nK" image "quality", where n is an integer that translates into a set of actual resolutions depending on the film format. As a reference, consider that, for a 4:3 aspect ratio which a film frame is expected to horizontally fit in, n is the multiplier of 1024 such that the horizontal resolution is exactly 1024 × n points. For example, 2K reference resolution is 2048 × 1536 pixels, whereas 4K reference resolution is 4096 × 3072 pixels. Nevertheless, 2K may also refer to other resolutions with roughly the same horizontal point count but different aspect ratios. It is also worth noting that while a frame's aspect ratio may be, for example, 3:2, that is not necessarily what appears on-screen.
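The nK convention above reduces to a one-line computation: the width is exactly 1024 × n, and the height follows from the assumed frame aspect ratio. A small sketch:

```python
def reference_resolution(n: int, aspect=(4, 3)) -> tuple:
    """nK reference resolution: width is exactly 1024 * n points,
    height follows from the assumed frame aspect ratio (4:3 here)."""
    width = 1024 * n
    height = width * aspect[1] // aspect[0]
    return width, height

print(reference_resolution(2))  # (2048, 1536) -- 2K reference
print(reference_resolution(4))  # (4096, 3072) -- 4K reference
```

Film formats with other aspect ratios keep the 1024 × n width but change the height, which is why a single "2K" label covers several actual pixel grids.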

Evolution of standards

Many personal computers introduced in the late 1970s and the 1980s were designed to use television receivers as their display devices, making the resolutions dependent on the television standards in use, including PAL and NTSC. Picture sizes were usually limited to ensure the visibility of all the pixels in the major television standards and the broad range of television sets with varying amounts of overscan. The actual drawable picture area was, therefore, somewhat smaller than the whole screen, and was usually surrounded by a static-colored border. Also, interlace scanning was usually omitted in order to provide more stability to the picture, effectively halving the vertical resolution. Resolutions such as 320 × 200 and 640 × 200 on NTSC were relatively common in the era. In the IBM PC world, these resolutions came to be used by 16-color EGA video cards.
One of the drawbacks of using a classic television was that the computer display resolution was higher than the television could decode. Chroma resolution for NTSC/PAL televisions is bandwidth-limited to a maximum of 1.5 megahertz, or approximately 160 pixels wide, which led to blurring of the color for 320- or 640-wide signals and made text difficult to read. Many users upgraded to higher-quality televisions with S-Video or RGBI inputs that helped eliminate chroma blur and produce more legible displays. The earliest, lowest-cost solution to the chroma problem was offered in the Atari 2600 Video Computer System and the Apple II+, both of which offered the option to disable the color and view a legacy black-and-white signal. On the Commodore 64, GEOS mirrored the Mac OS method of using black-and-white to improve readability.
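The "approximately 160 pixels" figure follows from the bandwidth arithmetic: each cycle of bandwidth can carry one light/dark pixel pair across the active portion of a scanline. A rough sketch, assuming about 52.6 microseconds of active NTSC line time:

```python
def chroma_pixels(bandwidth_mhz: float, active_line_us: float = 52.6) -> float:
    """Rough horizontal chroma resolution from channel bandwidth.
    Assumes ~52.6 us of active NTSC line time and two pixels
    (one light/dark pair) per cycle of bandwidth."""
    cycles_per_line = bandwidth_mhz * active_line_us
    return 2 * cycles_per_line

print(round(chroma_pixels(1.5)))   # ~158, close to the ~160 quoted above
```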
The 640 × 400i resolution was first introduced by home computers such as the Commodore Amiga and, later, the Atari Falcon. These computers used interlace to boost the maximum vertical resolution. These modes were only suited to graphics or gaming, as the flickering interlace made reading text in word processor, database, or spreadsheet software difficult.
The advantage of an overscanned computer display was an easy interface with interlaced TV production, leading to the development of NewTek's Video Toaster. This device allowed Amigas to be used for CGI creation in various news departments, drama programs such as NBC's seaQuest and The WB's Babylon 5, and early computer-generated animation by Disney for The Little Mermaid, Beauty and the Beast, and Aladdin.
In the PC world, the IBM PS/2 VGA on-board graphics chips used a non-interlaced 640 × 480, 16-color resolution that was easier to read and thus more useful for office work. It was the standard resolution from 1990 to around 1996; 800 × 600 was then the standard until around 2000. Microsoft Windows XP, released in 2001, was designed to run at 800 × 600 minimum, although it is possible to select the original 640 × 480 in the Advanced Settings window.
Programs designed to mimic older hardware such as Atari, Sega, or Nintendo game consoles, when attached to multiscan CRTs, routinely use much lower resolutions for greater authenticity. Some emulators instead recognize geometric features such as circles, squares, and triangles in the low-resolution output and render them as smoother scaled vectors, and at higher resolutions some can even mimic the aperture grille and shadow masks of CRT monitors.

Commonly used

The list of common resolutions article lists the most commonly used display resolutions for computer graphics, television, films, and video conferencing.