The Complete Timeline of PC Monitors
Monitors are output devices that computers use to display information. In the early days of computing, people didn’t need monitors because they communicated with computers on paper. Punched-card machines allowed operators to write instructions on cards that a computer read. After processing the instructions, the computer marked its output on other cards or on paper tape for people to decipher.
Early computers occupied row after row of cabinets full of electronics, usually with little or no visual indication of the information they were processing. To see the results, operators needed a paper printout.
Two Major Monitor Innovations
In the early days of the IBM PC, users needed a different monitor for every display standard, be it MDA, CGA, EGA, or another. To solve this problem, NEC invented the first multisync monitor (named “MultiSync”), which dynamically supported a range of resolutions, scan rates, and refresh rates in a single unit. That capability quickly became the industry standard. In 1987, IBM introduced the VGA video standard and the first VGA monitors, in conjunction with its PS/2 line of computers. Almost all analog video standards since have been based on VGA (and its well-known 15-pin connector).