Digital photography
Digital photography uses cameras containing arrays of electronic photodetectors interfaced to an analog-to-digital converter to produce images focused by a lens, as opposed to an exposure on photographic film. The digitized image is stored as a computer file ready for further digital processing, viewing, electronic publishing, or digital printing. It is a form of digital imaging based on gathering visible light.
Until the advent of such technology, photographs were made by exposing light-sensitive photographic film and paper, which were processed in liquid chemical solutions to develop and stabilize the image. Digital photographs are typically created solely by computer-based photoelectric and mechanical techniques, without wet-bath chemical processing.
In consumer markets, apart from enthusiast digital single-lens reflex cameras, most digital cameras now come with an electronic viewfinder, which approximates the final photograph in real time. This enables the user to review, adjust, or delete a captured photograph within seconds, making digital photography a form of instant photography, in contrast to most photochemical cameras of the preceding era.
Moreover, the onboard computational resources can usually adjust aperture and focus and set the exposure level automatically, so these technical burdens are removed from the photographer unless the photographer chooses to intervene. Electronic by nature, most digital cameras are instant, mechanized, and automatic in some or all functions. A digital camera may emulate traditional manual controls or may instead provide a touchscreen interface for all functions; most camera phones fall into the latter category.
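As a rough illustration of this kind of automation, the sketch below (in Python, with hypothetical names) adjusts the exposure time until the average brightness of a metered frame reaches a target level; real cameras use far more sophisticated metering and also drive aperture and ISO.

```python
def auto_expose(meter_frame, exposure_s, target=0.18, tolerance=0.02, max_iters=10):
    """Very simplified auto-exposure loop (hypothetical sketch).

    meter_frame(exposure_s) -> 2-D array of normalized pixel intensities in [0, 1].
    The exposure time is adjusted until the mean scene brightness is close to
    the target (18% gray is a common photographic reference).
    """
    for _ in range(max_iters):
        frame = meter_frame(exposure_s)
        mean = frame.mean()
        if abs(mean - target) <= tolerance:
            break
        # Brightness scales roughly linearly with exposure time (ignoring noise
        # and clipping), so scale the exposure by the ratio to the target.
        exposure_s *= target / max(mean, 1e-6)
    return exposure_s
```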
Digital photography spans a wide range of applications with a long history. Much of the technology originated in the space industry, where it was used in highly customized embedded systems combined with sophisticated remote telemetry. The output of any electronic image sensor can be digitized; this was first achieved in 1951. The modern era in digital photography is dominated by the semiconductor industry, which evolved later. An early semiconductor milestone was the advent of the charge-coupled device (CCD) image sensor, first demonstrated in April 1970; since then, the field has advanced rapidly, with concurrent advances in photolithographic fabrication.
The first consumer digital cameras were marketed in the late 1990s. Professionals gravitated to digital slowly, converting as their work required digital files to meet demands for faster turnaround than conventional methods allowed. Starting around 2000, digital cameras were incorporated into cell phones; in the following years, cell phone cameras became widespread, particularly owing to their connectivity to social media and email. Since 2010, digital point-and-shoot and DSLR cameras have also faced competition from mirrorless digital cameras, which typically provide better image quality than point-and-shoot or cell phone cameras while being smaller than typical DSLRs. Many mirrorless cameras accept interchangeable lenses and offer advanced features through an electronic viewfinder, which replaces the through-the-lens optical viewfinder of single-lens reflex cameras.
History
While digital photography has only relatively recently become mainstream, the late 20th century saw many small developments leading to its creation. The history of digital photography began in the 1950s: in 1951, the first digital signals were saved to magnetic tape via the first video tape recorder. Six years later, in 1957, the first digital image was produced on a computer by Russell Kirsch; it was an image of his son.

The first semiconductor image sensor was the charge-coupled device (CCD), invented by physicists Willard S. Boyle and George E. Smith at Bell Labs in 1969. While researching the metal-oxide-semiconductor (MOS) process, they realized that an electric charge was analogous to a magnetic bubble and that the charge could be stored on a tiny MOS capacitor. As it was fairly straightforward to fabricate a series of MOS capacitors in a row, they applied a suitable voltage to the capacitors so that the charge could be stepped along from one to the next. This semiconductor circuit was later used in the first digital video cameras for television broadcasting, and its invention was recognized by a Nobel Prize in Physics in 2009.
The first close-up image of Mars was taken as Mariner 4 flew by it on July 15, 1965, with a digital camera system designed by NASA and JPL. In 1976, the twin Mars landers Viking 1 and Viking 2 produced the first images from the surface of Mars. The imaging process was different from that of a modern digital camera, though the result was similar; Viking used a mechanically scanned facsimile camera rather than a mosaic of solid-state sensor elements. This produced a digital image that was stored on tape for later, relatively slow transmission back to Earth.
The first published color digital photograph was produced in 1972 by Michael Francis Tompsett using CCD sensor technology and was featured on the cover of Electronics Magazine. It was a picture of his wife, Margaret Tompsett.
The Cromemco Cyclops, a digital camera developed as a commercial product and interfaced to a microcomputer, was featured in the February 1975 issue of Popular Electronics magazine. It used MOS technology for its image sensor.
An important development in digital image compression technology was the discrete cosine transform (DCT), a lossy compression technique first proposed by Nasir Ahmed while he was working at Kansas State University in 1972. DCT compression is used in the JPEG image standard, which was introduced by the Joint Photographic Experts Group in 1992. JPEG compresses images down to much smaller file sizes and has become the most widely used image file format. The JPEG standard was largely responsible for popularizing digital photography.
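As a rough illustration of how DCT-based compression works, the Python sketch below applies an orthonormal 2-D DCT-II to one 8×8 block of pixel values and coarsely quantizes the coefficients. It is a minimal sketch, not Ahmed's original formulation or a full JPEG codec; the uniform quantization divisor is a placeholder rather than a real JPEG quantization table, and a complete codec also includes zig-zag ordering and entropy coding.

```python
import numpy as np

def dct_2d(block):
    """Orthonormal 2-D DCT-II of an N x N block (N = 8 in JPEG)."""
    n_pts = block.shape[0]
    n = np.arange(n_pts)
    # DCT-II basis matrix: row k holds the k-th cosine basis vector.
    basis = np.sqrt(2.0 / n_pts) * np.cos(
        np.pi * (2 * n[None, :] + 1) * n[:, None] / (2 * n_pts))
    basis[0, :] = np.sqrt(1.0 / n_pts)
    return basis @ block @ basis.T

# Example: transform one 8x8 block of pixel values shifted to be zero-centered.
block = np.random.default_rng(0).integers(0, 256, (8, 8)).astype(float) - 128
coeffs = dct_2d(block)
# Coarse uniform quantization; most small high-frequency coefficients round to
# zero, which is where the bulk of the compression comes from.
quantized = np.round(coeffs / 16)
```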
The first self-contained digital camera was created in 1975 by Steven Sasson of Eastman Kodak. Sasson's camera used CCD image sensor chips developed by Fairchild Semiconductor in 1973. The camera weighed 8 pounds (3.6 kg), recorded black-and-white images to a cassette tape, had a resolution of 0.01 megapixels, and took 23 seconds to capture its first image in December 1975. The prototype camera was a technical exercise, not intended for production. While it was not until 1981 that the first consumer electronic camera was produced by Sony, the groundwork for digital imaging and photography had been laid.
The first digital single-lens reflex camera was the Nikon SVC prototype demonstrated in 1986, followed by the commercial Nikon QV-1000C released in 1988. The first widely commercially available digital camera was the 1990 Dycam Model 1; it was also sold as the Logitech Fotoman. It used a CCD image sensor, stored pictures digitally, and connected directly to a computer for downloading images. Originally offered to professional photographers at a hefty price, digital cameras became commonly available to the general public by the mid-to-late 1990s as the technology advanced.
The advent of digital photography also gave rise to cultural changes in the field of photography. Unlike film photography, digital photography did not require darkrooms and hazardous chemicals for the post-production of an image; images could now be processed and enhanced on a personal computer. This allowed photographers to be more creative with their processing and editing techniques. As the field became more popular, digital photography and photographers diversified, expanding the field from a small, somewhat elite circle to one that encompassed many people.
The camera phone further helped popularize digital photography, along with the Internet, social media, and the JPEG image format. The first cell phones with built-in digital cameras were produced in 2000 by Sharp and Samsung. Small, convenient, and easy to use, camera phones have made digital photography ubiquitous in the daily life of the general public.
Digital camera
Sensors
Image sensors are arrays of electronic devices that convert the optical image created by the camera lens into a digital file that is stored in some digital memory device, inside or outside the camera. Each element of the image sensor array measures the intensity of light hitting a small area of the projected image and converts it to a digital value. The two main types of sensors are charge-coupled devices (CCDs), in which the photo charge is shifted to a central charge-to-voltage converter, and CMOS or active-pixel sensors.
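As a highly simplified sketch of that conversion, the Python snippet below maps the charge collected at each photosite to a raw digital number; the full-well capacity and bit depth are illustrative values, not the parameters of any particular sensor.

```python
import numpy as np

def photosite_to_dn(photoelectrons, full_well=30000, bit_depth=12):
    """Toy model of a sensor's analog-to-digital conversion.

    Each element of `photoelectrons` is the charge collected by one photosite.
    The charge is clipped at the full-well capacity and mapped linearly onto
    the ADC's output range (0 .. 2**bit_depth - 1), giving the raw digital
    numbers that make up the stored image.
    """
    levels = 2 ** bit_depth - 1
    charge = np.clip(photoelectrons, 0, full_well)
    return np.round(charge / full_well * levels).astype(np.uint16)
```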
Most cameras for the general consumer market create color images, in which each pixel has a color value from a three-dimensional color space like RGB. Although there is light-sensing technology that can distinguish the wavelength of the light incident on each pixel, most cameras use monochrome sensors that can only record the intensity of that light, over a broad range of wavelengths that includes all the visible spectrum. To obtain color images, those cameras depend on color filters applied over each pixel, typically in a Bayer pattern, or on movable filters or light splitters such as dichroic mirrors. The resulting grayscale images are then combined to produce a color image. This step is usually performed by the camera itself, although some cameras may optionally provide the unprocessed grayscale images in a so-called raw image format.
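As a rough illustration of that combining step, the sketch below performs simple bilinear demosaicing of an RGGB Bayer mosaic, filling in the two missing color values at each pixel by averaging neighboring photosites; in-camera pipelines use more elaborate, often proprietary, interpolation, and the function name here is only illustrative.

```python
import numpy as np
from scipy.ndimage import convolve

def demosaic_bilinear(raw):
    """Bilinear demosaicing of an RGGB Bayer mosaic (H x W) into an RGB image."""
    h, w = raw.shape
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1   # red photosites
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1   # blue photosites
    g_mask = 1 - r_mask - b_mask                        # green photosites

    # Interpolation kernels: each missing value becomes the mean of its
    # nearest sampled neighbors; sampled pixels keep their own value.
    k_g = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0

    r = convolve(raw * r_mask, k_rb, mode='mirror')
    g = convolve(raw * g_mask, k_g, mode='mirror')
    b = convolve(raw * b_mask, k_rb, mode='mirror')
    return np.dstack([r, g, b])
```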
However, some special-purpose cameras, such as those for thermal imaging, low-light viewing, or high-speed capture, may record only monochrome images. The Leica Monochrom cameras, for example, opted for a grayscale-only sensor to obtain better resolution and dynamic range. The reduction from three-dimensional color to grayscale or simulated sepia toning may also be performed by digital post-processing, often as an option in the camera itself. On the other hand, some multispectral cameras may record more than three color coordinates for each pixel.
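One common way such post-processing collapses color to grayscale is a weighted sum of the red, green, and blue channels; the Rec. 601 luma weights used in this sketch are one conventional choice, not necessarily what any particular camera applies.

```python
import numpy as np

def rgb_to_grayscale(rgb):
    """Collapse an H x W x 3 RGB image to a single luminance channel
    using Rec. 601 luma weights (one conventional choice)."""
    weights = np.array([0.299, 0.587, 0.114])
    return rgb @ weights
```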