Touchscreen


A touchscreen is a type of display that can detect touch input from a user, combining an input device and an output device. The touch panel is typically layered on top of the electronic visual display of a device, which is often an LCD, AMOLED, or OLED panel.
A person can give input or control the information processing system through simple or multi-touch gestures by touching the screen with a special stylus or one or more fingers. Some touchscreens work with ordinary or specially coated gloves, while others respond only to a special stylus or pen. The user can use the touchscreen to react to what is displayed and, if the software allows, to control how it is displayed; for example, zooming to increase the text size.
A touchscreen allows users to interact directly with on-screen content by tapping, swiping, and pinching, rather than through indirect input devices such as a mouse or touchpad.
Touchscreens are common in devices such as smartphones, tablets, handheld game consoles, and personal computers, as well as in point-of-sale systems, automated teller machines, electronic voting machines, and automobile infotainment systems and controls. They can also be attached to computers or, as terminals, to networks. They play a prominent role in the design of digital appliances such as personal digital assistants and some e-readers, and they are important in educational settings such as classrooms and college campuses.
The popularity of smartphones, tablets, and many types of information appliances has driven demand for, and acceptance of, touchscreens in portable and functional electronics. Touchscreens are also found in the medical field, heavy industry, automated teller machines, and kiosks such as museum displays or room-automation panels, where keyboard and mouse systems do not allow a suitably intuitive, rapid, or accurate interaction by the user with the display's content.
Historically, the touchscreen sensor and its accompanying controller-based firmware have been made available by a wide array of after-market system integrators, and not by display, chip, or motherboard manufacturers. Display manufacturers and chip manufacturers have acknowledged the trend toward acceptance of touchscreens as a user interface component and have begun to integrate touchscreens into the fundamental design of their products.

History

Predecessors of the modern touchscreen include stylus-based systems.

1946: Direct light pen

A patent was filed by the Philco Company for a stylus, designed for sports telecasting, which, when placed against an intermediate cathode-ray tube display, would amplify and add to the original signal. Effectively, this was used for temporarily drawing arrows or circles onto a live television broadcast.

1960s

1962: Optical

The first version of a touchscreen that operated independently of the light produced by the screen was patented by AT&T Corporation. This touchscreen used a matrix of collimated light beams shining orthogonally across the touch surface. When a stylus interrupted a beam, the photodetectors that no longer received a signal indicated where the interruption lay. Later matrix-based touchscreens built on this design by adding more emitters and detectors to improve resolution, pulsing emitters to improve the optical signal-to-noise ratio, and using a non-orthogonal matrix to remove shadow readings during multi-touch.
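The beam-interruption principle can be illustrated with a minimal sketch (hypothetical code, not from any cited patent): each axis has a row of emitter/detector pairs, and a touch is reported where blocked horizontal and vertical beams cross.

```python
# Hypothetical sketch of locating a touch on an optical-matrix screen.
# Each axis has a row of emitter/detector pairs; True means the detector
# still receives its beam, False means something opaque blocks it.

def locate_touch(x_beams, y_beams):
    """Return the (x, y) centre of the blocked region, or None if no
    beam on either axis is interrupted."""
    blocked_x = [i for i, received in enumerate(x_beams) if not received]
    blocked_y = [i for i, received in enumerate(y_beams) if not received]
    if not blocked_x or not blocked_y:
        return None  # nothing is breaking beams on both axes
    # A fingertip blocks several adjacent beams; averaging their indices
    # gives sub-beam resolution, one reason later designs added more
    # emitters and detectors.
    x = sum(blocked_x) / len(blocked_x)
    y = sum(blocked_y) / len(blocked_y)
    return (x, y)

# A finger covering beams 4-5 horizontally and beam 7 vertically:
x_beams = [True] * 16
y_beams = [True] * 16
x_beams[4] = x_beams[5] = False
y_beams[7] = False
print(locate_touch(x_beams, y_beams))  # -> (4.5, 7.0)
```

One limitation this sketch shares with the real hardware: two simultaneous touches block two sets of beams on each axis, producing ambiguous "shadow" crossings, which is why later designs moved to non-orthogonal beam layouts.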

1963: Indirect light pen

Later inventions built upon this system to free telewriting styli from their mechanical bindings. In 1963, Robert E. Graham patented an “indirect” light-pen telewriting apparatus that allowed users to draw on a separate surface while the system electronically transmitted and reproduced the strokes on a computer display. This design reduced mechanical limitations of earlier stylus-based systems and demonstrated an early form of electronic handwriting capture, enabling drawings and annotations to be stored or transmitted for later use.

1965: Finger-driven touchscreen

The first finger-driven touchscreen was developed by Eric Johnson of the Royal Radar Establishment in Malvern, England, who described his work on capacitive touchscreens in a short article published in 1965 and then more fully, with photographs and diagrams, in an article published in 1967.

Mid-60s: Ultrasonic Curtain

Another precursor of touchscreens, an ultrasonic-curtain-based pointing device in front of a terminal display, had been developed by a team at Telefunken Konstanz for an air traffic control system. In 1970, this evolved into a device named "Touchinput-Einrichtung" ("touch-input device") for the SIG 50 terminal, using a conductively coated glass screen in front of the display. This was patented in 1971, and the patent was granted a few years later. The same team had already invented and marketed the Rollkugel mouse RKS 100-86 for the SIG 100-86 a few years earlier.

1968: ATC

The application of touch technology for air traffic control was described in an article published in 1968. Frank Beck and Bent Stumpe, engineers from CERN, developed a transparent touchscreen in the early 1970s, based on Stumpe's work at a television factory in the early 1960s. Then manufactured by CERN, and shortly after by industry partners, it was put to use in 1973.

1970s

1972

A group at the University of Illinois filed for a patent on an optical touchscreen that became a standard part of the Magnavox Plato IV Student Terminal; thousands were built for this purpose. These touchscreens had a crossed array of 16×16 infrared position sensors, each composed of an LED on one edge of the screen and a matched phototransistor on the opposite edge, all mounted in front of a monochrome plasma display panel. This arrangement could sense any fingertip-sized opaque object in close proximity to the screen.

1973: Multi-Touch Capacitance

In 1973, Beck and Stumpe published another article describing their capacitive touchscreen. This indicated that it was capable of multi-touch but this feature was purposely inhibited, presumably as this was not considered useful at the time. "Actual contact between a finger and the capacitor is prevented by a thin sheet of plastic".

1977: Resistive

An American company, Elographics – in partnership with Siemens – began work on developing a transparent implementation of an existing opaque touchpad technology (U.S. patent 3,911,215, 7 October 1975), which had been developed by Elographics' founder George Samuel Hurst. The resulting resistive touchscreen was first shown at the World's Fair in Knoxville in 1982.

1980s

1982: Multi-touch Camera

Multi-touch technology began in 1982, when the University of Toronto's Input Research Group developed the first human-input multi-touch system, using a frosted-glass panel with a camera placed behind the glass.

1983: HP-150

An optical touchscreen was used on the HP-150 starting in 1983, making it one of the world's earliest commercial touchscreen computers. HP mounted infrared transmitters and receivers around the bezel of a 9-inch Sony cathode-ray tube.

1983: Multi-touch force sensing touchscreen

Bob Boie of AT&T Bell Labs used capacitance to track the mechanical changes in thickness of a soft, deformable overlay membrane as one or more physical objects interacted with it; the flexible surface could easily be replaced if damaged by these objects. The patent states that "the tactile sensor arrangements may be utilized as a touch screen".
Many derivative sources retrospectively describe Boie as making a major advance with his touchscreen technology, but no evidence has been found that Boie ever developed or patented a rugged multi-touch capacitive touchscreen that could sense through a rigid, protective overlay (the sort later required for a mobile phone). Many of these citations rely on anecdotal evidence from Bill Buxton of Bell Labs. However, Buxton himself was unable to obtain the technology. As he states in the citation: "Our assumption was that the Boie technology would become available to us in the near future. Around 1990 I took a group from Xerox to see this technology since I felt that it would be appropriate for the user interface of our large document processors. This did not work out".

Up to 1984: Capacitance

Although, as cited earlier, Johnson is credited with developing the first finger-operated capacitive and resistive touchscreens in 1965, these worked by the finger directly touching wires across the front of the screen.
Stumpe and Beck developed a self-capacitance touchscreen in 1972, and a mutual capacitance touchscreen in 1977. Both these devices could only sense the finger by direct touch or through a thin insulating film. This was 11 microns thick according to Stumpe's 1977 report.

1984: Touchpad

Fujitsu released a touch pad for its Micro 16 to accommodate the complexity of kanji characters, which were stored as tiled graphics.

1986: Graphic Touchpad

A graphic touch tablet was released for the Sega AI Computer.

Early 80s: Evaluation for Aircraft

Touch-sensitive control-display units were evaluated for commercial aircraft flight decks in the early 1980s. Initial research showed that a touch interface would reduce pilot workload, as the crew could select waypoints, functions, and actions directly rather than being "head down", typing latitudes, longitudes, and waypoint codes on a keyboard. An effective integration of this technology was aimed at helping flight crews maintain a high level of situational awareness of all major aspects of vehicle operations, including the flight path, the functioning of various aircraft systems, and moment-to-moment human interactions.