MIDI
Musical Instrument Digital Interface is an American-Japanese technical standard that describes a communication protocol, digital interface, and electrical connectors that connect a wide variety of electronic musical instruments, computers, and related audio devices for playing, editing, and recording music. A single MIDI cable can carry up to sixteen channels of MIDI data, each of which can be routed to a separate device. Each interaction with a key, button, knob, or slider is converted into a MIDI event, which specifies musical instructions such as a note's pitch, timing, and velocity. One common MIDI application is to play a MIDI keyboard or other controller to trigger a digital sound module, which generates sounds the audience hears through a keyboard amplifier. MIDI data can be transferred via MIDI or USB cable, or recorded to a sequencer or digital audio workstation to be edited or played back.
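The channel and event structure described above can be sketched in code. The byte layout follows the published MIDI 1.0 specification (a Note On status byte carries the message type in its high nibble and the channel in its low nibble, followed by two 7-bit data bytes); the helper function is illustrative, not part of any standard API.

```python
def note_on(channel: int, note: int, velocity: int) -> bytes:
    """Pack a MIDI 1.0 Note On message.

    channel:  0-15  (the sixteen channels a single cable can carry)
    note:     0-127 (pitch; 60 = middle C)
    velocity: 0-127 (how hard the key was struck)
    """
    if not (0 <= channel <= 15 and 0 <= note <= 127 and 0 <= velocity <= 127):
        raise ValueError("value out of range for MIDI 1.0")
    status = 0x90 | channel  # 0x90-0x9F: Note On, low nibble selects the channel
    return bytes([status, note, velocity])

# Middle C played moderately hard on the first channel:
msg = note_on(0, 60, 100)  # three bytes: 0x90, 60, 100
```

Because each message names its channel, sixteen independent instrument parts can share one cable, and a receiving device simply ignores channels it is not set to listen on.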
MIDI also defines a file format that stores and exchanges the data. Advantages of MIDI include small file size, ease of modification and manipulation, and a wide choice of electronic instruments and synthesized or digitally sampled sounds. A MIDI recording of a keyboard performance could sound like a piano or other keyboard instrument; however, because MIDI records the messages that describe the notes rather than the sounds themselves, the recording can be reassigned to many other sounds, ranging from synthesized or sampled guitar or flute to a full orchestra.
Before the development of MIDI, electronic musical instruments from different manufacturers could generally not communicate with each other. This meant that a musician could not, for example, plug a Roland keyboard into a Yamaha synthesizer module. With MIDI, any MIDI-compatible keyboard can be connected to any other MIDI-compatible sequencer, sound module, drum machine, synthesizer, or computer, even if they are made by different manufacturers.
MIDI technology was standardized in 1983 by a panel of music industry representatives and is maintained by the MIDI Manufacturers Association. All official MIDI standards are jointly developed and published by the MMA in Los Angeles, and the MIDI Committee of the Association of Musical Electronics Industry in Tokyo. In 2016, the MMA established The MIDI Association to support a global community of people who work, play, or create with MIDI.
History
In the early 1980s, there was no standardized means of synchronizing electronic musical instruments manufactured by different companies. Manufacturers had their own proprietary standards to synchronize instruments, such as CV/gate, DIN sync and Digital Control Bus. Ikutaro Kakehashi, the president of Roland, felt the lack of standardization was limiting the growth of the electronic music industry. In June 1981, he proposed developing a standard to the Oberheim Electronics founder Tom Oberheim, who had developed his own proprietary interface, the Oberheim Parallel Bus.
Kakehashi felt that Oberheim's system was too cumbersome, and spoke to Dave Smith, the president of Sequential Circuits, about creating a simpler, cheaper alternative. While Smith discussed the concept with American companies, Kakehashi discussed it with Japanese companies Yamaha, Korg and Kawai. Representatives from all companies met to discuss the idea in October. Initially, only Sequential Circuits and the Japanese companies were interested. Using Roland's DCB as a basis, Smith and Sequential Circuits engineer Chet Wood devised a universal interface to allow communication between equipment from different manufacturers. Smith and Wood proposed this standard in a paper, Universal Synthesizer Interface, at the Audio Engineering Society show in October 1981. The standard was discussed and modified by representatives of Roland, Yamaha, Korg, Kawai, and Sequential Circuits. Kakehashi favored the name Universal Musical Interface, pronounced you-me, but Smith felt this was "a little corny". However, he liked the use of instrument instead of synthesizer, and proposed Musical Instrument Digital Interface. Robert Moog, the president of Moog Music, announced MIDI in the October 1982 issue of Keyboard.
At the 1983 Winter NAMM Show, Smith demonstrated a MIDI connection between a Prophet 600 and a Roland JP-6 synthesizer. The MIDI specification was published in August 1983. For unveiling the standard, Kakehashi and Smith received Technical Grammy Awards in 2013. The first instruments released with MIDI, the Roland Jupiter-6 and the Prophet 600, appeared in 1983, followed the same year by the first MIDI drum machine, the Roland TR-909, and the first MIDI sequencer, the Roland MSQ-700.
The MIDI Manufacturers Association was formed following a meeting of "all interested companies" at the 1984 Summer NAMM Show in Chicago. The MIDI 1.0 Detailed Specification was published at the MMA's second meeting at the 1985 Summer NAMM Show. The standard continued to evolve, adding standardized song files in 1991 and adapted to new connection standards such as USB and FireWire. In 2016, the MIDI Association was formed to continue overseeing the standard. In 2017, an abridged version of MIDI 1.0 was published as an international standard IEC 63035. An initiative to create a 2.0 standard was announced in January 2019. The MIDI 2.0 standard was introduced at the 2020 Winter NAMM Show.
The BBC cited MIDI as an early example of open-source technology. Smith believed MIDI could only succeed if every manufacturer adopted it, and so "we had to give it away".
Impact
MIDI's appeal was originally limited to professional musicians and record producers who wanted to use electronic instruments in the production of popular music. The standard allowed different instruments to communicate with each other and with computers, and this spurred a rapid expansion of the sales and production of electronic instruments and music software. This interoperability allowed one device to be controlled from another, which reduced the amount of hardware musicians needed. MIDI's introduction coincided with the dawn of the personal computer era and the introduction of samplers and digital synthesizers. The creative possibilities brought about by MIDI technology are credited for helping revive the music industry in the 1980s.
MIDI introduced capabilities that transformed the way many musicians work. MIDI sequencing makes it possible for a user with no notation skills to build complex arrangements. A musical act with as few as one or two members, each operating multiple MIDI-enabled devices, can deliver a performance similar to that of a larger group of musicians. The expense of hiring outside musicians for a project can be reduced or eliminated, and complex productions can be realized on a system as small as a synthesizer with integrated keyboard and sequencer.
MIDI also helped establish home recording. By performing preproduction in a home environment, an artist can reduce recording costs by arriving at a recording studio with a partially completed song. In 2022, the Guardian wrote that MIDI remained as important to music as USB was to computing, and represented "a crucial value system of cooperation and mutual benefit, one all but thrown out by today's major tech companies in favour of captive markets". In 2005, Smith's MIDI Specification was inducted into the TECnology Hall of Fame, an honor given to "products and innovations that have had an enduring impact on the development of audio technology." As of 2022, Smith's original MIDI design is still in use.
Applications
Instrument control
MIDI was invented so that electronic or digital musical instruments could communicate with each other and so that one instrument could control another. For example, a MIDI-compatible sequencer can trigger beats produced by a drum sound module. Analog synthesizers that have no digital component and were built prior to MIDI's development can be retrofitted with kits that convert MIDI messages into analog control voltages. When a note is played on a MIDI instrument, it generates a digital MIDI message that can be used to trigger a note on another instrument. The capability for remote control allows full-sized instruments to be replaced with smaller sound modules, and allows musicians to combine instruments to achieve a fuller sound, or to create combinations of synthesized instrument sounds, such as acoustic piano and strings. MIDI also enables other instrument parameters to be controlled remotely.
Synthesizers and samplers contain various tools for shaping an electronic or digital sound. Filters adjust timbre, and envelopes automate the way a sound evolves over time after a note is triggered. The frequency of a filter and the envelope attack are examples of synthesizer parameters that can be controlled remotely through MIDI. Effects devices have different parameters, such as delay feedback or reverb time. When a MIDI continuous controller number is assigned to one of these parameters, the device responds to any messages it receives that are identified by that number. Controls such as knobs, switches, and pedals can be used to send these messages. A set of adjusted parameters can be saved to a device's internal memory as a patch, and these patches can be remotely selected by MIDI program changes.
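The controller and patch-selection mechanisms described above can be sketched as bytes. The status-byte values below follow the MIDI 1.0 specification (a Control Change carries a controller number and a value; a Program Change selects a patch with a single data byte); the function names and the example controller mapping are illustrative assumptions, since which knob drives which parameter is up to the device or the user.

```python
def control_change(channel: int, controller: int, value: int) -> bytes:
    # 0xB0-0xBF: Control Change; the controller number identifies the
    # parameter (e.g. 7 is channel volume, 64 is the sustain pedal)
    return bytes([0xB0 | channel, controller, value])

def program_change(channel: int, program: int) -> bytes:
    # 0xC0-0xCF: Program Change; one data byte recalls a stored patch
    return bytes([0xC0 | channel, program])

# A knob assigned to controller 74 sweeping a filter parameter from 0 to 112:
sweep = [control_change(0, 74, value) for value in range(0, 128, 16)]

# Remotely recalling a device's sixth stored patch (programs count from 0):
patch = program_change(0, 5)
```

A receiving device that has controller 74 mapped to, say, filter cutoff will apply each value in the sweep as it arrives; devices ignore controller numbers they have no mapping for.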
Composition
MIDI events can be sequenced with computer software, or in specialized hardware music workstations. Many digital audio workstations are specifically designed to work with MIDI as an integral component. MIDI piano rolls have been developed in many DAWs so that the recorded MIDI messages can be easily modified. These tools allow composers to audition and edit their work much more quickly and efficiently than older methods, such as multitrack recording. Compositions can be programmed for MIDI that are impossible for human performers to play.
Because a MIDI performance is a sequence of commands that create sound, MIDI recordings can be manipulated in ways that audio recordings cannot. It is possible to change the key, instrumentation or tempo of a MIDI arrangement, to reorder its individual sections, or even to edit individual notes. The ability to compose ideas and quickly hear them played back enables composers to experiment.
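Because the recording is commands rather than audio, an edit such as the key change mentioned above reduces to simple arithmetic on the stored note numbers. A minimal sketch (the tuple format for events here is an assumption for illustration, not a standard):

```python
def transpose(events, semitones):
    """Shift every pitch in a sequence of (tick, note, velocity) events.

    Note numbers are clamped to the valid MIDI range 0-127; timing and
    velocity are untouched, which is why a key change costs nothing.
    """
    return [(tick, max(0, min(127, note + semitones)), velocity)
            for tick, note, velocity in events]

# A C-major triad moved up a whole tone becomes D major:
chord = [(0, 60, 90), (0, 64, 90), (0, 67, 90)]
print(transpose(chord, 2))  # [(0, 62, 90), (0, 66, 90), (0, 69, 90)]
```

The same idea extends to the other edits listed: tempo changes rescale tick values, and re-instrumentation swaps the sound assigned to a channel without touching the notes at all.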
Algorithmic composition programs provide computer-generated performances that can be used as song ideas or accompaniment.
Some composers may take advantage of the standard, portable set of commands and parameters in MIDI 1.0 and General MIDI (GM) to share musical data files among various electronic instruments. Sequenced MIDI recordings can be saved as a Standard MIDI File (SMF), digitally distributed, and reproduced by any computer or electronic instrument that adheres to the same MIDI, GM, and SMF standards. MIDI data files are much smaller than corresponding recorded audio files.
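To illustrate how compact the format is, the sketch below assembles a complete one-note Standard MIDI File in memory. The chunk layout ("MThd" and "MTrk" headers, big-endian lengths, variable-length delta times, and the FF 2F 00 end-of-track event) follows the SMF specification; the helper names are illustrative.

```python
import struct

def smf_header(fmt: int = 0, ntracks: int = 1, division: int = 480) -> bytes:
    # "MThd" chunk: 6-byte body holding format, track count, and
    # timing resolution in ticks per quarter note
    return b"MThd" + struct.pack(">IHHH", 6, fmt, ntracks, division)

def vlq(value: int) -> bytes:
    """Encode a delta time as an SMF variable-length quantity
    (7 bits per byte, high bit set on all but the last byte)."""
    out = [value & 0x7F]
    value >>= 7
    while value:
        out.append((value & 0x7F) | 0x80)
        value >>= 7
    return bytes(reversed(out))

def track(events) -> bytes:
    """events: list of (delta_ticks, message_bytes) pairs."""
    body = b"".join(vlq(delta) + msg for delta, msg in events)
    body += b"\x00\xff\x2f\x00"  # mandatory end-of-track meta event
    return b"MTrk" + struct.pack(">I", len(body)) + body

# Middle C held for one quarter note (480 ticks) on channel 1:
data = smf_header() + track([
    (0,   bytes([0x90, 60, 100])),  # Note On
    (480, bytes([0x80, 60, 0])),    # Note Off
])
print(len(data))  # 35 bytes for the whole file
```

A 35-byte file describing a note that audio would need thousands of samples to store makes the size advantage concrete; real SMFs simply extend the same track structure with more events.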