
EGA graphics iterations and versions

snowman

I grew up in the late 70s and 80s. I lived through all the PC graphics revisions. What intrigues me the most are the EGA revisions. I don't know much about them. I'm talking about what I remember marketing calling "super EGA": 640x480 EGA, 800x600 EGA, etc. I think Genoa EGA cards supported those and others.
The question I have is: what monitors supported those resolutions for EGA? Definitely not analog VGA monitors. Were those enhanced EGA mode monitors common, or were those EGA revisions "flash in the pan" gimmicks to squeeze more juice out of EGA when other, better options existed, and there really was no support? I never remember seeing any real-life monitors supporting EGA modes outside the standard modes set by the original IBM EGA card. My friend had EGA. Heck, even I had EGA for a short time. Currently I have two EGA monitors in my collection. I've never experienced a monitor that supports those enhanced modes, and I don't remember reading about options in PC World or PC Magazine (both of which I subscribed to), except some articles discussing a few EGA cards that had support. Thanks for any info or comment on popularity and/or monitor model support.
 
Multisync monitors were common enough on the high end. Those would do the frequencies required for the various modifications to EGA.

I think PC Magazine had done several articles that described how the higher resolution modes could be made to work. One was split between Aug and Sep 1986 if my searches are correct.
 
Multisync monitors were common enough on the high end. Those would do the frequencies required for the various modifications to EGA.

I think PC Magazine had done several articles that described how the higher resolution modes could be made to work. One was split between Aug and Sep 1986 if my searches are correct.
Average folks who could not yet afford VGA settled for the more common 640x200 EGA. My Boca Raton EGA video card and 'Harry Swartz' EGA monitor alone ran over $700 back in 1989 or so. That was about $350 apiece.
 
Thanks, quagmire. I did read the wiki, but from what I see it gives only brief mention of the extended modes and no examples of monitors that support them. OK, I agree: from my memory and experience most had 640x350, or even just ran a CGA monitor on an EGA card, flipping the DIP switches to CGA monitor resolutions (640x200 would be a CGA monitor resolution).

I've never seen a 9-pin multisync TTL monitor that went to 640x480 or 800x553 (the odd super EGA resolution) except (if memory serves me) an NEC model and maybe a Sony model. I guess I'll do more research on monitor models. Standard EGA monitors seem so rare these days, much less these other TTL multisync monitors.
 
I can't recall any home users that I knew who went for the fancy multisync this-and-that unless your job called for it. Professional and maintenance people swear by them. By the time I was able to get into VGA, I was fortunate enough to be able to sell my EGA setup, so that took the edge off of things.
 
One important point was that the extra resolution was nearly free to add to a clone EGA card. The clone card typically had 256K, so the memory was there to handle 640x480. Toss on the correct oscillator and BIOS extensions to handle switching to a new resolution, and the project was done. The monitor might have a cost, but many users could easily make a business case for getting crisp high resolution.
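To put rough numbers on the "memory was there" part (a back-of-the-envelope sketch; the figures are mine, not from any particular card's documentation), a 16-color planar mode stores one bit per pixel in each of four planes:

#include <stdio.h>

int main(void)
{
    const int planes = 4;                        /* 16 colors = 4 bit planes */
    int modes[][2] = { {640, 350}, {640, 480}, {800, 600} };

    for (int i = 0; i < 3; i++) {
        int w = modes[i][0], h = modes[i][1];
        long bytes = (long)(w / 8) * h * planes; /* 1 bit per pixel per plane */
        printf("%dx%d needs %ld KB of video RAM\n", w, h, bytes / 1024);
    }
    return 0;   /* 640x350 -> 109 KB, 640x480 -> 150 KB, 800x600 -> ~234 KB,
                   so even 800x600 squeezes into a 256K clone card */
}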
 
I've never seen a 9-pin multisync TTL monitor that went to 640x480 or 800x553 (the odd super EGA resolution) except (if memory serves me) an NEC model and maybe a Sony model.

Mitsubishi made several models (Diamondscan?) that went to at least 640x480 and supported both analog and digital. Also be aware that the monitor end of the cable might not be nine-pin; there were some of these multi-standard monitors that had DIN plugs or different D-shell pin counts.

Another oddball "SuperEGA" variant: there were a few cards that could support TTL 26 kHz "640x400" monitors like those used with the Tandy 2000 and some Japanese machines like the PC-9801 series. These monitors didn't support the full six-bit palette but could otherwise do the job, displaying a slightly squished 640x350 and double-scanning the 200-line modes.
 
One important point was that the extra resolution was nearly free to add to a clone EGA card. The clone card typically had 256K, so the memory was there to handle 640x480. Toss on the correct oscillator and BIOS extensions to handle switching to a new resolution, and the project was done. The monitor might have a cost, but many users could easily make a business case for getting crisp high resolution.
Yes, but that's hindsight. Unless you were into hardware development, who would have known? Most folks were just simple users back then.
 
I have an EGA card with both connectors that can output either TTL or analog 31KHz. That seems like a natural choice for cards that came out after the PS/2 line had already launched. Probably didn't take long for 31KHz VGA monitors to outnumber 21.8KHz EGA monitors.
Yes, but that's hindsight. Unless you were into hardware development, who would have known? Most folks were just simple users back then.
I suspect the poster you replied to was referring to the makers of clone EGA cards who decided to add these non-standard, high-resolution modes, not to end users modifying cards themselves. The point is that once you have an EGA-compatible design, you can program a variety of non-standard modes so long as you have a clock generator and adequate memory/bandwidth.
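To sketch the "clock generator" half of that point (the 800x525 total raster is an assumption for blanking overhead, purely illustrative):

#include <stdio.h>

int main(void)
{
    /* 640x480 active area, padded to an assumed 800x525 total raster */
    long h_total = 800, v_total = 525;
    double refresh = 60.0;

    double dot_clock = (double)h_total * v_total * refresh;   /* pixels/second */
    printf("Oscillator needed: about %.1f MHz\n", dot_clock / 1e6);
    /* ~25.2 MHz, versus the 16.257 MHz crystal a stock EGA uses for its
       350-line mode -- hence "toss on the correct oscillator" above. */
    return 0;
}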
 
I learned some basic CRT theory a long time ago. Haven't really worked with them since.

Why are sub-30 kHz CRTs rare, and why are they called multisync?
The direct translation would be many syncs. If it syncs from 15 to 70 kHz, it's still one sync region.
And as far as I know there is a circuit or IC there for deflection; naturally the EM parts need to be able to handle the driving, but is it so hard making it work below 30kHz?
 
but is it so hard making it work below 30kHz?

Obviously not. The reason multisync monitors that A: go below 31.5 kHz and B: support both TTL and analog (which is a necessity for working with all popular IBM PC video standards before VGA) are so rare is simply that there really wasn't a point in making them anymore after 1991 or so. VGA replaced EGA remarkably quickly and became *the* baseline for PC monitors. Taking the TTL support out is a cost savings, and other than the shambling ghosts of the Commodore Amiga and Atari ST there weren't any NTSC-frequency analog RGB machines left on the market either.
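For what it's worth, the horizontal rate is basically just total scan lines times refresh rate, which is why the standards land where they do (the line totals below are approximate):

#include <stdio.h>

int main(void)
{
    struct { const char *name; double total_lines, refresh; } m[] = {
        { "CGA (NTSC-ish)", 262.5, 59.94 },   /* ~15.7 kHz */
        { "EGA 350-line",   364.0, 60.00 },   /* ~21.8 kHz */
        { "VGA 480-line",   525.0, 59.94 },   /* ~31.5 kHz */
    };

    for (int i = 0; i < 3; i++)
        printf("%-15s horizontal rate ~%.1f kHz\n",
               m[i].name, m[i].total_lines * m[i].refresh / 1000.0);
    /* A multisync simply locks onto anything between its lower and upper
       limits; once nobody needed the sub-30 kHz end, dropping it let the
       deflection stage be designed for a narrower span. */
    return 0;
}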

What I find really annoying personally is how this VGA floor has been inherited by LCD monitors and scaler boards. Look at the datasheets for the scaler chips in any multistandard LCD monitor or TV set and you'll find they all treat NTSC/PAL frequencies as a special case restricted to the composite/component inputs and put the floor on the analog RGB inputs at 30 kHz. There's no technical reason for that; it's not like there's a big expensive flyback transformer you need to tune to handle a wider range…
 
I have an EGA card with both connectors that can output either TTL or analog 31KHz. That seems like a natural choice for cards that came out after the PS/2 line had already launched. Probably didn't take long for 31KHz VGA monitors to outnumber 21.8KHz EGA monitors.

I suspect the poster you replied to was referring to the makers of clone EGA cards who decided to add these non-standard, high-resolution modes, not to end users modifying cards themselves. The point is that once you have an EGA-compatible design, you can program a variety of non-standard modes so long as you have a clock generator and adequate memory/bandwidth.
Very few were 'modifying' their cards. Maybe folks like the ham crowd? It's all rather academic, as the EGA thing passed fairly quickly, with VGA pricing rapidly descending and the gaming industry as a driving force. By the mid-90s, EGA was all but gone except for hobbyists and those on a strict budget.
 
By the mid-90s, EGA was all but gone except for hobbyists and those on a strict budget.

Anyone buying an EGA monitor in the mid-90s was buying a used one. Just for laughs I quick-scrolled through a January 1991 issue of PC Magazine, and I didn't see *any* EGA cards or monitors listed for sale. CGA and Hercules mono both outlived EGA; there were still a few bottom-of-the-barrel XT-class starter machines around that used CGA (and the rare ad still listed *one* terrible .52 dot pitch Packard Bell monitor to go with those machines), and Hercules cards/monitors were still the bottom choice offered in the "choose your own adventure" PC price matrixes (and, sure, were still a legit choice for your Netware server or whatever). But, yeah, EGA monitors were *firmly* in the dustbin of history at this point.

Perhaps ironically, the early 1990s were the golden age of video games that used the 16-color 320x200 video mode (this is when companies like Apogee really figured out how to leverage EGA's strengths), and I imagine a lot of these games *did* get played on older 286s and 386es with "real" EGA cards, but since VGA was almost perfectly backwards compatible, well, there was no *reason* to buy an EGA card over a VGA card to play them.
 
What I find really annoying personally is how this VGA floor has been inherited by LCD monitors and scaler boards. Look at the datasheets for the scaler chips in any multistandard LCD monitor or TV set and you'll find they all treat NTSC/PAL frequencies as a special case restricted to the composite/component inputs and put the floor on the analog RGB inputs at 30 kHz. There's no technical reason for that; it's not like there's a big expensive flyback transformer you need to tune to handle a wider range…

Yes, I find it extremely annoying too. It's all in the LCD driver board. You have odd mid-2010s Dells capable of syncing to 15 kHz while their documentation explicitly says 30. It's like someone put a 'wrong' number in the build configuration.

CGA and Hercules mono both outlived EGA; there were still a few bottom-of-the-barrel XT-class starter machines around that used CGA

I agree, XTs outlived the 286 because they were far more cost-efficient as bottom-of-the-barrel machines. Buying a 286/EGA in 1990 cost twice as much as an XT/CGA, and you still couldn't run contemporary software.

The XT and CGA/MDA were designed as low-cost; the 286 and EGA, not exactly; the 386 and VGA, certainly not. People forget that buying a 386SX in 1990 meant buying one of the faster CPUs on the market, with 4.77 MHz machines still being sold...

This quote from wikipedia, sourced from 1988 publication, sums it up well - "The EGA standard was made obsolete in 1987 by the introduction of MCGA and VGA with the PS/2 computer line."
 
When I picked up my 286 in 1988, the entire system was about $1,000 including an EGA card with monitor. Dropping down to either CGA or MDA would have saved about $50*. The XT clone was about the same price if an equivalent hard disk was included. There was a brief window when the 286 offered the best performance for the price. IIRC, the 20 MHz 286 was $80 while the 20 MHz 386SX was $200 in 1990.

* Of course, this was as the EGA cards and monitors were being remaindered as production switched to very similar VGA devices. Many of the early clone VGA cards showed the design of the EGA card that was used as a base with enough additional circuitry to generate the VGA signal. EGA cards available later tended to be old stock, and getting a new VGA card that included EGA output was often cheaper.
 
Those "super EGA" 640x480 cards never made much sense to me. I can't think of a single monitor that supports TTL at 640x480 that doesn't also do analog. If you have a monitor that can do VGA, then you might as well get a VGA card since it would be a lot more capable for not that much more money than an EGA card.

"Multisync" was a trademark of NEC. Other brands had their own similar sounding trademarks such as Sony's "Multiscan". It just meant the monitor could sync to any frequency within it minimum and maximum limits. This was compared to most monitors at the time which supported just a single mode, or maybe 2 or 3 specific modes. Syncing down to 15.75kHz actually adds a fair amount of cost to a monitor that also supports higher scanning rates, so it was dropped as soon as most people didn't need it.
 
In retrospect EGA seems kind of like a weird detour. Before it came out there were already a number of machines on the market that used those 26 kHz 640x400 displays, which essentially cost the same as EGA's sort-of-oddball 21 kHz monitor. I've always wondered why IBM didn't just use one of those. 640x400 has squarer pixels, and it also would have let IBM double-scan the CGA modes instead of needing to make the monitor switch frequencies.

And on the flip side, at the same time as EGA (within a month, counting by public announcement), IBM also introduced the "Professional Graphics Controller", which used a 640x480 monitor almost the same as VGA's. (Weirdly it has a slightly slower HSYNC of 30.5 kHz instead of 31.5, but it'll lock onto 31.5 kHz just by tweaking the horizontal hold.) The PGC monitor didn't need the TTL circuitry, so I doubt it actually cost meaningfully more than the 5154 to make despite its higher sync rate, so... I dunno, it seems like in a slightly saner universe IBM might have just had EGA and PGC use the same monitor and made the distinction between them solely the video acceleration features of the latter.(*)

Many of the early clone VGA cards showed the design of the EGA card that was used as a base with enough additional circuitry to generate the VGA signal.

Quite a few early VGA chipsets had both analog and digital outputs to allow backwards compatibility with whatever monitor a user might already have, allowing someone to upgrade in phases... and, ironically perhaps, some of the last "EGA" cards sold actually used a VGA chipset with this capability and just didn't populate the analog connector.

Really the only major differences between an EGA and a VGA are:
  1. VGA has the 8-bit-in, 18-bit-out DAC *in addition to* EGA's 4-bit-in, 6-bit-out "attribute controller", and
  2. Both EGA and VGA are fundamentally planar, but VGA supports "chaining" to fake being a chunky pixel device in the low-res 256 color mode.
It's trivial to modify programs that use EGA's 350-line 16-color high-res mode 10h to use VGA's 480-line mode 12h instead; they look/act identical in every respect other than the size of the planes, and if you leave the DAC alone the default palette on start-up will emulate EGA's color set for any program that just touches the attribute controller. Some of the late "SuperEGA" cards (I know Genoa did in particular) took advantage of this by tweaking their BIOSes so their 640x480 mode was accessible as mode 12h, thereby allowing them to run high-res VGA software unmodified. But this was only a brief stopgap; within a year after the PS/2 came out all these companies had turned their EGA cards completely into VGA.
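A minimal sketch of that mode-10h-versus-12h point, assuming a 16-bit DOS compiler (Open Watcom, Turbo C, or similar) where <dos.h> provides int86():

#include <dos.h>
#include <conio.h>

static void set_video_mode(unsigned char mode)
{
    union REGS r;
    r.h.ah = 0x00;            /* INT 10h, AH=00h: set video mode */
    r.h.al = mode;
    int86(0x10, &r, &r);
}

int main(void)
{
    set_video_mode(0x12);     /* 640x480x16 on VGA; use 0x10 for EGA 640x350 */
    /* Planar drawing code is the same in either mode; only the plane size
       changes (28,000 bytes at 350 lines vs. 38,400 at 480). */
    getch();
    set_video_mode(0x03);     /* back to 80x25 color text */
    return 0;
}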

(*) EDIT: I guess it occurs to me that one semi-decent justification for EGA sticking with a TTL monitor is the fact that technically EGA counted as a "universal" video card when it came out in 1984, in that in addition to running with its own 5154 monitor it was backwards compatible with the 5151 and 5153 MDA/CGA monitors as well, but... I dunno. IBM's EGA card was so expensive when it came out it was pretty hard to justify buying it to go with one of those monitors, especially considering that:

A: Hercules already existed as a graphics-capable replacement for MDA, was much cheaper, and mono EGA isn't compatible with it, and
B: The 16 color low-res EGA modes didn't really make that much difference for "business software", and if you wanted 16 color games in NTSC resolution, well, you could buy a half of a whole PCjr (or within a couple months, a Tandy 1000) for the price of an EGA card.

An EGA card plus a CGA monitor did become a fairly common in-the-wild combo by the late 1980s, but only because EGA cards were all moved out at fire-sale prices and there were plenty of CGA monitors floating around to use them with.
 
Those "super EGA" 640x480 cards never made much sense to me. I can't think of a single monitor that supports TTL at 640x480 that doesn't also do analog. If you have a monitor that can do VGA, then you might as well get a VGA card since it would be a lot more capable for not that much more money than an EGA card.
Three years separated the introduction of EGA and VGA, plenty of time to make improvements to the IBM design. One saw much the same pattern a few years later as all the VGA clone cards rapidly headed to 1024x768, followed by a long pause as 1024x768 became the resolution that would not die.
 
One saw much the same pattern a few years later as all the VGA clone cards rapidly headed to 1024x768, followed by a long pause as 1024x768 became the resolution that would not die.

In the latter half of the 90s I was a huge fan of 1152x864, myself. Most multisyncs could *just* manage it, and if you do the math it's the highest of the standard resolutions that will fit for a given color depth in a given amount of video memory. (I.e., if you have a 1 MB video card and want 256 colors, 1024x768 will take up 768K of RAM, 1152x864 will *juuust* fit at 972K, and the next common step up, 1280x1024, strikes out, needing 1.25 MB.)
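Checking that arithmetic (one byte per pixel at 256 colors, 1 MB taken as 1024 KB):

#include <stdio.h>

int main(void)
{
    int modes[][2] = { {1024, 768}, {1152, 864}, {1280, 1024} };

    for (int i = 0; i < 3; i++) {
        long kb = (long)modes[i][0] * modes[i][1] / 1024;   /* 1 byte/pixel */
        printf("%4dx%-4d at 256 colors: %4ld KB %s\n",
               modes[i][0], modes[i][1], kb,
               kb <= 1024 ? "(fits in 1 MB)" : "(does not fit)");
    }
    return 0;   /* 768 KB, 972 KB, and 1280 KB respectively */
}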

It was a pretty common resolution on workstations and Macs, but for some reason it never seemed to get much traction on PCs. I discovered it when messing around with XFree86 configurations early in my Linux adventures.
 