Video display standards have evolved from early monochrome text to today's high-resolution color. That evolution is summarized here.
Initial video standards were developed by IBM, then one of the few players in the PC marketplace. As IBM's influence over the hardware waned (or got diluted, whichever viewpoint you care to take), the Video Electronics Standards Association (VESA) was formed to define new standards for computer video displays.
But, it all started with the...
Introduced in 1981, MDA was a pure text display showing 25 lines of 80 characters each. Typically, the display was green text on a black background. Individual characters were 9 pixels wide by 14 pixels high (7x11 for the character itself, the rest for spacing). Multiplying that out gives a resolution of 720x350, but since individual pixels could not be addressed there were no graphics. Even so, some programs managed interesting bar charts and line art using various ASCII characters, particularly those above 128 in code page 437.
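As a quick check of that arithmetic, here is a minimal, purely illustrative C sketch that derives the pixel dimensions from the character grid and cell size:

    #include <stdio.h>

    /* The arithmetic behind the MDA "resolution": an 80x25 character grid
       drawn with 9x14-pixel character cells. */
    int main(void)
    {
        int cols = 80, rows = 25;
        int cell_w = 9, cell_h = 14;

        printf("%d x %d pixels\n", cols * cell_w, rows * cell_h);  /* 720 x 350 */
        return 0;
    }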
The IBM MDA card had 4 KB of video memory. Display attributes included: invisible, underline, normal, bright/bold, reverse video, and blinking. Some attributes could be combined. IBM's card also contained a parallel printer port, giving it the full name Monochrome Display and Printer Adapter.
The monitor's refresh rate was 50 Hz, and users tended to complain of eyestrain after long days in front of the screen.
Noting the 720x350 resolution of the MDA display, a company called Hercules Computer Technology (founded by Van Suwannukul) developed, in 1982, an MDA-compatible video card that could display MDA text as well as graphics by allowing each pixel on the screen to be individually addressed. Because the screen height had to be a multiple of four, the full resolution of the Hercules Graphics Card was 720x348.
The Hercules card addressed two graphics pages, one at B0000h and the other at B8000h. When the second page was disabled there was no conflict with other adapters, and the Hercules card could run in a dual-monitor mode alongside a CGA or other graphics card in the same computer. Hercules even made a CGA-compatible card called the Hercules Color Card, later followed by the Hercules Graphics Card Plus (June 1986) and the Hercules InColor Card (April 1987), which had capabilities similar to EGA cards.
The graphics caught on: not only did Hercules cards multiply rapidly, but clones of them started to appear, the ultimate homage to success. Most major software included a Hercules driver.
However, despite its attempts to keep up, Hercules started to fail as a company and was acquired by ELSA in August 1998 for $8.5 million. ELSA then declared bankruptcy in 1999 and the Hercules brand was bought by Guillemot Corporation, a French company, for $1.5 million. In 2004 Guillemot stopped producing graphics cards, but Hercules, the name, lives on in some of their software and other products.
But, color was still the ultimate goal and Hercules was pushed out by other IBM specifications...
IBM came back to the fore when color started to appear in computer displays. The CGA standard, introduced in 1981 and primitive by today's measures, was nonetheless color, even if only 16 of them. Because the first PCs were aimed at business, color did not catch on at first and the monochrome MDA standard was more often used. As prices came down and clones of the IBM PC were introduced, CGA became more of a standard.
The CGA card came with 16 KB of video memory and supported several different modes:
The 16-color graphics mode used composite color techniques rather than the direct 16-color palette of the CGA text modes above. Because the technique was not supported in the BIOS (Basic Input/Output System), that mode saw little adoption outside of some games.
The CGA card was built around the Motorola MC6845 display controller. Red, green, and blue were created by the monitor's three electron guns, with black being the absence of all three. The other colors were mixes of two beams, and white used all three. An "intensifier" bit gave a brighter version of the basic 8 colors for a total of 16. There was one exception. In the normal RGB model, color #6 should be a dark yellow (#AAAA00); however, IBM changed the monitor circuitry to detect it and lower its green component to more closely match a brown (#AA5500). Other monitor makers mimicked this, which is why the intense version of #6 turned out to be a bright yellow rather than a lighter brown: the intense version was not so modified. No clear reason was ever given for why IBM did this, but it is speculated they wanted to match 3270 mainframe colors. So, the colors appeared as...
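A minimal, illustrative C sketch of that RGBI-to-RGB mapping, including the brown adjustment for color #6, prints the full set of 16 colors:

    #include <stdio.h>

    /* Convert a 4-bit CGA color index (intensity, red, green, blue bits)
       to a 24-bit RGB value, including the monitor-side "brown fix" that
       halves the green component of color 6. */
    unsigned long cga_to_rgb(unsigned index)
    {
        unsigned i = (index >> 3) & 1;   /* intensity bit */
        unsigned r = (index >> 2) & 1;
        unsigned g = (index >> 1) & 1;
        unsigned b = index & 1;

        /* 0xAA for a lit gun, plus 0x55 when the intensity bit is set */
        unsigned red   = r * 0xAA + i * 0x55;
        unsigned green = g * 0xAA + i * 0x55;
        unsigned blue  = b * 0xAA + i * 0x55;

        if (index == 6)                  /* dark yellow becomes brown */
            green = 0x55;

        return ((unsigned long)red << 16) | (green << 8) | blue;
    }

    int main(void)
    {
        for (unsigned c = 0; c < 16; c++)
            printf("color %2u -> #%06lX\n", c, cga_to_rgb(c));
        return 0;
    }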
There were several tweaks to the CGA text and graphics systems that produced different default background colors, colored borders, and other effects, giving the appearance of more modes than those listed above; but these were all tweaks, not changes to the basic system itself.
The refresh rate for CGA monitors was increased to 60 Hz as a result of the eyestrain complaints about MDA's 50 Hz rate. (The higher the refresh rate, the less the screen appears to flicker, since the phosphor is refreshed more often.)
But, the low resolution of CGA begged for higher resolutions. To fill those demands IBM developed EGA...
The Enhanced Graphics Adapter was introduced by IBM in 1984 as the primary display for the new Intel 286-based PC-AT. EGA increased resolution to 640x350 pixels in 16 colors. The card itself contained 16 KB of ROM to extend the system BIOS with graphics functions. Cards started with 64 KB of video memory, but later IBM and clone cards came with 256 KB to allow full implementation of all EGA modes, which included...
Some EGA clones extended the EGA features to include 640x400, 640x480, and 720x540 along with hardware detection of the attached monitor and a special 400-line interlace mode to use with older CGA monitors. None of these became standard however.
EGA's life was fairly short as VGA was introduced by IBM in April of 1987 and quickly took over the market. In the meantime, IBM had a brief go with a specialized graphics system called PGC and the 8514 Display Standard...
The Professional Graphics Controller (PGC) enjoyed a short lifetime between 1984 and 1987. It offered the "high" resolution of 640x480 pixels with 256 colors out of a palette of 4,096 colors. Refresh rate was 60 Hz. The card had 320 KB of video RAM and an on-board microprocessor. The card had a CGA mode as well but this could be turned off in order to maintain a CGA card in the same computer if necessary.
Designed for high-end use, the controller was composed of three(!) circuit boards: two cards, each taking a single adapter slot, plus a third board sandwiched between and attached to both. All were physically connected together with cables.
The price of several thousand dollars and the complicated hardware brought the PGC to a quick end even though it was a very good graphics card for its day.
IBM introduced the 8514 Display Standard in 1987, about the same time as VGA. The companion monitor (model 8514) was also sold by IBM. The pair (8514/A Display Adapter and 8514 monitor) make up the 8514 Display Standard, which is generally regarded as the first mass-market video accelerator. It was certainly not the first in the industry, but earlier accelerators were largely designed for workstations. Workstation accelerators were programmable; the 8514 was not. It was a fixed-function accelerator and could therefore be sold at a much lower price for mass-market use. The card typically had 2D drawing functions such as line-draw, color-fill, and BITBLT offloaded to it while the CPU worked on other tasks.
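To make "offloaded" concrete, here is a small illustrative C sketch (not the 8514's actual interface) of the kind of BITBLT work the CPU would otherwise have to do in software:

    #include <string.h>

    /* A plain software BITBLT: copy a w x h block of 8-bit pixels from one
       buffer to another.  A fixed-function accelerator like the 8514 performs
       this kind of copy itself, freeing the CPU for other work. */
    void bitblt(unsigned char *dst, int dst_pitch, int dx, int dy,
                const unsigned char *src, int src_pitch, int sx, int sy,
                int w, int h)
    {
        for (int row = 0; row < h; row++)
            memcpy(dst + (dy + row) * dst_pitch + dx,
                   src + (sy + row) * src_pitch + sx,
                   (size_t)w);
    }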
The basic modes the 8514 was designed to operate at were...
Note the difference between the interlaced and non-interlaced modes and their refresh frequencies above. While the 8514 displayed a much higher resolution screen than most other mass-market solutions of the day, its use of an interlaced display was unusual.
8514 was replaced by IBM's XGA standard which we'll talk about later on this page. For now, we'll get back in sequence with VGA...
With VGA you see a change in the terminology from adapter to array. This was a result of the fact that VGA graphics started to come on the motherboard as a single chip and not as plug-in adapter boards that took up an expansion slot in the computer. While since replaced with other standards for general use, VGA's 640x480 remains a sort of lowest common denominator for all graphics cards. Indeed, even the Windows splash screen logo comes in at 640x480 because it shows before the graphics drivers for higher resolution are loaded into the system.
VGA supports both graphics and text modes of operation and can be used to emulate most (but not all) of the EGA, CGA, and MDA modes of operation. The most common VGA graphics modes include:
The VGA specification dictated 256 KB of video RAM, 16- and 256-color modes, a 262,144-color palette (six bits for each of red, green, and blue), a selectable master clock (25 MHz or 28 MHz), up to 720 horizontal pixels, up to 480 lines, hardware smooth scrolling, split-screen support, soft fonts, and more.
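The 262,144 figure follows directly from the six bits per channel; a minimal illustrative C sketch shows the arithmetic, along with a commonly used (but here assumed) way of scaling a 6-bit DAC value up to 8 bits:

    #include <stdio.h>

    /* Six bits per red/green/blue channel gives 64 levels each, so the
       palette holds 64 * 64 * 64 = 262,144 possible colors.  A common way
       to show a 6-bit DAC value on modern 8-bit hardware is to scale it. */
    static unsigned dac6_to_8(unsigned v6)
    {
        return (v6 * 255u) / 63u;            /* 0..63 -> 0..255 */
    }

    int main(void)
    {
        printf("palette size: %d colors\n", 64 * 64 * 64);
        printf("DAC value 0x2A -> 0x%02X\n", dac6_to_8(0x2A));   /* 0xAA */
        return 0;
    }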
Another VGA programming trick essentially created another graphics mode: Mode X. By "unchaining" the 256 KB of video RAM into its four separate planes, 256-color graphics could use all of video memory. Mode X also shifted some video memory operations to the video hardware instead of leaving them to the CPU. This sped up the display for things like games and was most often seen at 320x240 resolution, since that produced square pixels at a 4:3 aspect ratio. Mode X also allowed double buffering, a method of keeping multiple video pages in memory in order to flip quickly between them; a sketch of plotting a pixel in this planar arrangement follows below. All VGA 16-color modes supported double buffering; only Mode X could do it in 256 colors.
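Here is a minimal sketch of plotting one pixel in 320x240 Mode X, assuming the unchained mode has already been set up and a DOS-era compiler (such as Borland/Turbo C) that provides far pointers, MK_FP(), and outportb():

    #include <dos.h>   /* MK_FP(), outportb() */

    #define SEQ_INDEX   0x3C4   /* VGA sequencer index port */
    #define SEQ_DATA    0x3C5   /* VGA sequencer data port  */
    #define MAP_MASK    0x02    /* Map Mask register index  */
    #define MODEX_WIDTH 320

    static unsigned char far *vga = (unsigned char far *)MK_FP(0xA000, 0);

    /* Each byte of video memory holds one 256-color pixel, but the plane
       holding it (x modulo 4) must be selected before writing. */
    void putpixel_modex(int x, int y, unsigned char color)
    {
        outportb(SEQ_INDEX, MAP_MASK);
        outportb(SEQ_DATA, 1 << (x & 3));

        /* Each scanline occupies 320/4 = 80 bytes per plane. */
        vga[y * (MODEX_WIDTH / 4) + (x >> 2)] = color;
    }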
Many other programming tweaks to VGA could be (and were) performed. Some, however, caused display problems such as flicker, roll, and other abnormalities, so they were not used commercially. Commercial software typically stuck to "safe" VGA modes.
Video memory was typically mapped into a PC's real-mode address space at the following locations...
Note that by using different memory areas it is possible to have two different monitors attached and running on a single computer. Early on, Lotus 1-2-3 took advantage of this by displaying "high resolution" text on an MDA display alongside color (low-resolution) graphics showing a graph of some part of the spreadsheet. Other uses included coding on one screen with debugging information on another.
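A minimal sketch of how such dual-monitor output works, again assuming a DOS-era compiler with far pointers and MK_FP(): each text buffer is written directly at the addresses mentioned earlier (MDA at B0000h, CGA at B8000h), two bytes per character cell (character code, then attribute):

    #include <dos.h>   /* MK_FP() */

    int main(void)
    {
        unsigned char far *mono  = (unsigned char far *)MK_FP(0xB000, 0);
        unsigned char far *color = (unsigned char far *)MK_FP(0xB800, 0);

        /* Top-left character cell of each screen */
        mono[0]  = 'M';  mono[1]  = 0x07;   /* normal monochrome attribute */
        color[0] = 'C';  color[1] = 0x1E;   /* yellow on blue              */
        return 0;
    }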
VGA also had a subset called...
Multi-Color Graphics Array (MCGA)
MCGA shipped first with the IBM PS/2 Model 25 in 1987. MCGA graphics were built into the motherboard of the computer. A sort of step between EGA and VGA, MCGA had a short life: it shipped with only two IBM models, the PS/2 Model 25 and the PS/2 Model 30, and was fully discontinued by 1992. Its capabilities were incorporated into VGA. Note: Some say that the 256-color mode of VGA is MCGA but, to be accurate, no MCGA cards were ever made; only the two IBM PS/2 models indicated had true MCGA chips. The 256-color mode of VGA, while similar, stands alone as part of the VGA specification.
The specific MCGA display modes included:
Like the other IBM standards, clone makers quickly cloned VGA. Indeed, while IBM produced later graphics specifications as we'll see below, the VGA specification was the last IBM standard that other manufacturers followed closely. Over time, as extensions to VGA appeared, they were loosely grouped under the name Super VGA.
Super VGA was first defined in 1989 by the Video Electronics Standards Association (VESA), an association dedicated to providing open standards instead of the closed standards of a single company (IBM). While initially defined as 800x600 with 16 colors, SVGA evolved to 1024x768 with 256 colors and even higher resolutions and color depths as time went on.
As a result, SVGA is more of an umbrella than a fixed standard. Indeed, almost any graphics system released between the early 1990s and early 2000s (a decade!) has generally been called SVGA, and it was up to the user to determine from the specifications whether a given graphics system met their needs.
The VESA SVGA standard was also called the VESA BIOS Extension (VBE). VBE could be implemented in either hardware or software. Often you would find a version of the VBE in a graphics card's hardware BIOS with extensions in software drivers.
How could a standard be so fractured? With the introduction of VGA, the video interface between the adapter and the monitor changed from digital to analog. An analog system can support what is effectively an infinite number of colors, so color depth largely became a function of how the video adapter was constructed, not the monitor. As a result, for a given set of monitors there could be thousands of different video adapters capable of connecting to and driving them. Of course, the monitors had to be able to handle the various refresh frequencies, and some had to be larger to support the increasing number of pixels, but it was easier to produce a few large multi-frequency monitors than to produce the graphics computing power necessary to drive them.
Thus, while SVGA is an accepted term, it has no specific meaning except to indicate a display capability generally somewhere between 800x600 pixels and 1024x768 pixels at color depths ranging from 256 colors (8-bits) to 65,536 colors (16-bits). But, even those values overlap the various XGA standards...
IBM's XGA was introduced in 1990 and is generally considered to be a 1024x768 pixel display. It would be wrong, however, to consider XGA a successor to SVGA as the two were initially released about the same time. Indeed, the SVGA "definition" has expanded as seen above and one might consider XGA to have been folded under the SVGA umbrella.
Initially, XGA was an enhancement to VGA and added two modes to VGA...
The graphics-offloading features of the 8514 system were incorporated into and expanded under XGA: the number and type of drawing primitives were increased over the 8514, and a 16-bit color mode was added.
Later, an XGA-2 specification added 640x480 at true color, increased the 1024x768 mode to high color (16 bits per pixel for 65,536 colors), and improved graphics accelerator performance.
Note: XGA was an IBM standard; VESA released a similar standard called Extended Video Graphics Array (EVGA) in 1991. The two should not be confused. EVGA, as a standalone term, never really caught on.
XGA, over time, developed into a family of different standards. The following entries summarize this family...
Super XGA was another step up in resolution and became a family of its own...
Ultra XGA was another step up in resolution, based on four times the 800x600 resolution of the SVGA standard. Its basic format is 1600x1200 pixels, and it also became a family of its own...
As of 2005, the QXGA and related standards are among the highest resolutions defined. As of this writing, there are few commercial monitors with these resolutions and only a very few higher-end digital cameras. Expect more as fabrication techniques improve.
The Quad term derives from a multiple of 4 against a lower resolution standard. QXGA, for example, has 4 times the number of pixels of XGA at the same aspect ratio (4 times 786,432 pixels = 3,145,728 pixels, which at a 4:3 aspect ratio becomes a display of 2048x1536 pixels). Sometimes the name quantum is used instead to suggest there are so many pixels you'd have to measure them at the quantum level; a bit of an exaggeration, but in keeping with the fun people have inventing names.
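The same arithmetic can be sanity-checked in a few lines of illustrative C; multiplying the pixel count by 4 (or by 16, for the HXGA family discussed further below) doubles (or quadruples) each linear dimension:

    #include <stdio.h>

    int main(void)
    {
        int xga_w = 1024, xga_h = 768;

        printf("XGA  : %d x %d = %d pixels\n", xga_w, xga_h, xga_w * xga_h);

        /* QXGA: 4x the pixels -> 2x each dimension */
        printf("QXGA : %d x %d = %d pixels\n",
               2 * xga_w, 2 * xga_h, (2 * xga_w) * (2 * xga_h));

        /* HXGA: 16x the pixels -> 4x each dimension */
        printf("HXGA : %d x %d = %d pixels\n",
               4 * xga_w, 4 * xga_h, (4 * xga_w) * (4 * xga_h));
        return 0;
    }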
The QXGA family therefore can be summarized as...
As of 2005, the HXGA and related standards are the highest resolution presently defined. As of this writing, there are no commercial monitors with these resolutions and only a very few high-end digital cameras.
The Hex[adecatuple] term derives from a multiple of 16 against a lower resolution standard. HXGA, for example, is 16 times the number of pixels of XGA at the same aspect ratio (16 times 786,432 pixels = 12,582,912 pixels which, at a 4:3 aspect ratio becomes a display of 4096x3072 pixels).
The HXGA family therefore can be summarized as...