MIDI (Musical Instrument Digital Interface) has long been as essential to audio engineering as language is to communication: the MIDI protocol is the basis for exchanging information, in this case between computers, electronic instruments and other hardware. Initially, in the early eighties, the MIDI protocol was developed to "daisy-chain" (connect in series) different synthesizers with each other, making it possible to play several synthesizers from a single keyboard.
This was a genuinely revolutionary development in the field. Thousands of devices from different manufacturers suddenly shared the same specification (MIDI 1.0, originally tied to a specific hardware implementation) and the same ports (MIDI Out, MIDI In, MIDI Thru). Ever since, these devices have been able to "talk" to and understand each other in a common language by means of a MIDI cable. To be clear: MIDI is not audio, it is just a set of instructions, mere control commands.
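To make the "control commands, not audio" point concrete, here is a minimal sketch of what one such command looks like on the wire. A MIDI 1.0 "Note On" message is just three bytes: a status byte (0x9n, where n is the channel), a note number, and a velocity. The helper name `note_on` is chosen for illustration:

```python
def note_on(channel: int, note: int, velocity: int) -> bytes:
    """Build a 3-byte MIDI 1.0 Note On message.

    channel: 0-15, note: 0-127 (60 = middle C), velocity: 0-127.
    """
    if not (0 <= channel <= 15 and 0 <= note <= 127 and 0 <= velocity <= 127):
        raise ValueError("value out of range for MIDI 1.0")
    status = 0x90 | channel          # 0x9n = Note On on channel n
    return bytes([status, note, velocity])

# Middle C on channel 1, moderately hard keystroke:
msg = note_on(channel=0, note=60, velocity=100)
print(msg.hex())  # -> "903c64"
```

Those three bytes carry no sound at all; the receiving synthesizer decides what middle C actually sounds like.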
Originally limited to synthesizers and keyboards, MIDI has naturally gained almost unlimited possibilities from advances in computer technology. Today, professionals program their entire sound backdrop via MIDI, using a digital interface such as a DAW (Digital Audio Workstation). Thanks to the 2020 extension of the comparatively "old" MIDI 1.0 protocol to MIDI 2.0 (with its new Universal MIDI Packet format), MIDI now also works with modern, state-of-the-art devices via USB or Ethernet, for example.
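As a rough illustration of the Universal MIDI Packet (UMP) mentioned above: in MIDI 2.0, a channel-voice Note On is no longer three bytes but a 64-bit packet, two 32-bit words, with message type 0x4 and a 16-bit velocity in place of MIDI 1.0's 7-bit value. The sketch below is an assumption-laden simplification of the published packet layout (attribute fields are left at zero), not a complete implementation:

```python
def ump_note_on(group: int, channel: int, note: int, velocity16: int) -> tuple:
    """Pack a MIDI 2.0 channel-voice Note On as two 32-bit words (a sketch).

    group/channel: 0-15, note: 0-127, velocity16: 0-65535.
    """
    # Word 0: [type 0x4][group][status 0x9n][note][attribute type = 0]
    word0 = (0x4 << 28) | (group << 24) | ((0x90 | channel) << 16) | (note << 8)
    # Word 1: 16-bit velocity in the high half; attribute data left at zero.
    word1 = velocity16 << 16
    return word0, word1

w0, w1 = ump_note_on(group=0, channel=0, note=60, velocity16=0xFFFF)
print(f"{w0:08x} {w1:08x}")  # -> "40903c00 ffff0000"
```

The jump from 7-bit to 16-bit velocity is one concrete reason the newer format matters for expressive, modern controllers.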
Even though some users view this development critically, and every musician and sound engineer will have their own preferences, MIDI as such offers a diverse range of solutions for use on and off stage, in the recording studio, and while working on a computer at home.