
EGA graphics iterations and versions

PC video cards moved in baby steps: 64K, 128K, 256K, and 512K, working up through the XP era and finally arriving at 1GB and 2GB heading into W7. As the gaming industry pushed its boundaries, the video graphics people were happy to oblige. Today's latest 5090s sport 32GB, with the RTX Pro 6000 Blackwell series recently listing at around $8,565, which includes 96GB of video memory.
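For scale, here's a quick sketch of how the capacities mentioned above compare. The card/size pairings are just this thread's examples, not a rigorous timeline:

```python
KB, MB, GB = 1024, 1024**2, 1024**3

# Example capacities from the discussion above (illustrative, not exhaustive)
steps = [
    ("EGA era",          64 * KB),
    ("VGA",              256 * KB),
    ("early SVGA",       512 * KB),
    ("late-90s 3D card", 8 * MB),
    ("RTX 5090",         32 * GB),
    ("RTX Pro 6000",     96 * GB),
]

base = steps[0][1]
for name, size in steps:
    # Express each step as a multiple of the 64K baseline
    print(f"{name:18} {size // base:>9,}x the 64K baseline")
```

Running it shows the jump from 64K to 96GB is a factor of about 1.5 million.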
 
Yes, but not for the same reasons.
After a few MB we get to hi-res true-color display, and it's no longer about the framebuffer, but about acceleration and the GPU.
 
Yes, but not for the same reasons.
After a few MB we get to hi-res true-color display, and it's no longer about the framebuffer, but about acceleration and the GPU.

Yep. Even a "4K" display of 3840 × 2160 in 32 bit color is "only" 32MB for just a flat 2D bitmap. I mean, sure, that's a lot, but when you consider that true-color megapixel resolutions requiring more than 2MB of RAM go back to the 1980's, well, there's not a lot else in the realm of computer tech that's only gone up by an order of magnitude in the last forty years. If displays had scaled like everything else we should be running monitors with molecule-sized pixels... which sure, it'd be pointless, human eyes wouldn't be able to tell the difference, but it'd sure look great on a spec sheet.
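The arithmetic here is simple enough to sketch with a hypothetical helper (ignoring double buffering and scanline padding):

```python
def framebuffer_bytes(width, height, bits_per_pixel):
    """Flat 2D framebuffer size; ignores double buffering and row padding."""
    return width * height * bits_per_pixel // 8

# CGA mono graphics, VGA 16-color, and a modern 4K desktop
for w, h, bpp in [(640, 200, 1), (640, 480, 4), (3840, 2160, 32)]:
    size = framebuffer_bytes(w, h, bpp)
    print(f"{w}x{h} @ {bpp}bpp -> {size / 2**20:.2f} MiB")
```

The 4K case works out to about 31.6 MiB, which is the "32MB" figure above.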

The advent of compositing window managers in the 2000's seems like it was specifically calculated to try to make PC UIs suck up all that video RAM and texturing power that was otherwise being wasted on games. I mean, sure, I'll grant it looks... kind of neat? but I gotta be honest, I'd probably be just as happy if I *couldn't* just slightly see through my terminal windows. It genuinely isn't the killer app everyone seemed to think it was at the time.
 
Yes, but not for the same reasons.
Sorry, but I'm lost on your point. Weren't hi-res and acceleration more or less the same quality, and they simply progressed? Yes, there were hall-of-fame ISA graphics cards which dominated the W95/98 era with their 1-2 MB of RAM prior to PCI, but once you had seen a 3DFX Voodoo 1 PCI or a RIVA 128 in action, you forgot all about them.
 
Yes, but not for the same reasons.
After a few MB we get to hi-res true-color display, and it's no longer about the framebuffer, but about acceleration and the GPU.
Most of the advances in early video were for office apps in Windows 3.x. Color depth came later, when people started doing desktop publishing and photo editing. It took a while for Windows accelerators to scroll around massive spreadsheets without lag.

It was cool seeing a Matrox Ultima Plus 4MB VLB card doing 24-bit or 16-bit color at 1600x1200 at a good refresh rate.

Once the 2D world was conquered real 3D showed up and we are still figuring that out today.
 
Sorry, but I'm lost on your point. Weren't hi-res and acceleration more or less the same quality, and they simply progressed?

I think @Zare 's point was that early on the amount of RAM you had on your video card *directly* related to the quality of the static picture you could get out of it, which is a thing that's not even remotely true anymore. I.e., when you're comparing the 16K on a CGA card to the 256K on a VGA card, every doubling of the byte count translated to a significant difference in color depth or pixel count. These are real quality-of-life improvements that mattered just as much, if not more, for business software, not just games; anyone who's ever had to slog through doing page layout on a 640x200 monochrome screen (* hands up right here) will attest that the difference between that and 640x480 *alone* is a very big deal. The amount of video memory on your video card directly correlated with how high a resolution and color depth you could drive up until the mid-1990's... and that's really all that's relevant when making comparisons to EGA. 3D is a completely different animal. All that extra memory was used for textures up until GPUs started turning into general-purpose processors; now most GPUs are more powerful than the computers they're slotted into.

Gaming really wasn't that much of a driver for PC video card improvements in the 1980's, simply because PCs were too freaking expensive for that; if you wanted to play a good video game you were better off getting a Commodore 64. People spending $3,000-$5,000 for a PC with a high-end video card were mostly concerned with how well it did AutoCAD or drew Lotus 1-2-3 pie charts, and it's telling that most of the games people really remember as showing off what EGA could do didn't come out until the late 1980s/early 1990s, after actual EGA cards were already obsolete. The real driver for improvements in PC graphics for years was the advent of Windows, which most games other than Solitaire didn't run under until the second half of the 1990's. And while 2D acceleration was definitely "a thing" that manufacturers implemented, in part as an attempt to work around the hideous bus restrictions imposed by ISA, for the most part the improvements were all about increasing color depths and resolutions until things started plateauing at high/true-color megapixel modes. Gaming was mostly a trailing indicator to these improvements until every new PC sold had a PCI slot, which is what finally cleared the decks for 3D acceleration to be a thing you could start tacking onto cards that were already better than they needed to be for all those mainstream business applications.

To put it another way, if you have 4MB or 8MB of VRAM to play with because the business expectation is you need to be able to do 1600x1200 in 24 bit color for spreadsheets then you've just backdoored your way into having enough RAM for implementing Z-buffers and textures at 640x480. This is how 3D moved from the early dedicated cards like Voodoos into the mainstream, and of course we've never looked back from that.
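A rough back-of-the-envelope of that budget (the buffer layout is an illustrative assumption, using the 16-bit color and Z common in early consumer 3D):

```python
MiB = 2**20

# Business-desktop mode the post describes:
desktop = 1600 * 1200 * 3                 # 24-bit color, single buffer
print(f"1600x1200x24 desktop: {desktop / MiB:.2f} MiB")

# A 640x480 3D mode on the same card: front + back color buffers plus
# a Z-buffer, all 16-bit (a typical layout for the era, assumed here).
buffer_16bit = 640 * 480 * 2
three_d = buffer_16bit * 3
print(f"640x480 double-buffered + Z: {three_d / MiB:.2f} MiB")
print(f"Left for textures on an 8 MiB card: {(8 * MiB - three_d) / MiB:.2f} MiB")
```

The desktop mode needs about 5.5 MiB, and the full 640x480 3D setup under 2 MiB, so an 8 MiB "business" card has over 6 MiB free for textures.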
 
The advent of compositing window managers in the 2000's seems like it was specifically calculated to try to make PC UIs suck up all that video RAM and texturing power that was otherwise being wasted on games. I mean, sure, I'll grant it looks... kind of neat? but I gotta be honest, I'd probably be just as happy if I *couldn't* just slightly see through my terminal windows. It genuinely isn't the killer app everyone seemed to think it was at the time.

I was quite happy to use them. One of the mainstays of my personal and professional workstations was Window Maker running a GNOME2/MATE session behind it and Compiz on top.
With focus-follows-mouse, which is the mode for me, it's kind of cool peeking "behind" into a stacked-down terminal window.

I only dropped that kind of desktop for KDE Plasma when resolutions went full HD and above.

Sorry, but I'm lost on your point.

The memory was utilized differently: framebuffer (and pages, if you're lucky) for classic cards, texture memory nowadays...
You did pin down correctly that VRAM numbers tended to scale up linearly through time. That is possible because the usage changed.
 
I think @Zare 's point was that early on the amount of RAM you had on your video card *directly* related to the quality of the static picture you could get out of it, which is a thing that's not even remotely true anymore.

Exactly. Today's video RAM is the memory of a dedicated compute engine in the "graphics card" which as we all know can be used for many other things that aren't graphics.

Windows accelerators, e.g. 2D stuff with a few megs of RAM capable of a hi-res true-color desktop experience, are still just graphics cards; there may be no point in reusing their 2D geometry hardware to turn them into a general computer. The end game is still picture generation.
 
Gaming really wasn't that much of a driver for PC video card improvements in the 1980's, simply because PCs were too freaking expensive for that

CGA was not meant for 'gaming', certainly not through the RGBI port. It's there to draw a pie chart and to outline a table in vector fashion.
It would be interesting to know whether the successes of home computers such as the C64, Atari, and Amiga shaped EGA or VGA in any way. I don't believe so. I think those were thoroughly designed as visualization systems for the PC, everything else aside.

It's the success of the PC that brought games to CGA and EGA.
Many cool CGA ports were released when it was already obsolete, but still heavily present on the market.

People think it's vice versa, that the "crappy gaming adapter" and its successors hampered PC gaming and drove developers away to better platforms. I think that's a load of crap.
 
I was quite happy to use them. One of the mainstays of my personal and professional workstations was Window Maker running a GNOME2/MATE session behind it and Compiz on top.
With focus-follows-mouse, which is the mode for me, it's kind of cool peeking "behind" into a stacked-down terminal window.

I think most of the gripe I have with them is strictly historical; when they came out in the mid-aughts you'd run into all these goofy glitches, like finding out that your Radeon card that otherwise works fine has a hardware limit of 2048x2048 on texture sizes, which means things go all pear-shaped when you plug in your second monitor and the combined desktop is wider than that. (Those same Radeons did multi-head off of one large virtual framebuffer, so even though the two monitors you connected were each less than 2048 pixels wide it was the combined size that counted for the limitation.) You'd also run into performance limitations and glitches that, well... again, 20 years down the river don't seem to be a problem anymore.

It's sort of funny, actually, how much advances in video cards have rendered these problems moot. Lately I've been getting a surprising amount of use out of a 2016 vintage Lenovo Chromebook with a humble Celeron N3060 SoC that I converted to a very oldskool linux. I'm not sure *why* I'm using it so much, I have a much better laptop, but there's just something comfortingly retro about it; it somehow reminds me in a good way of running Linux on a really underpowered 486 laptop back in the 90's, I guess. Despite having integrated video with all the power of a boiled potato (for its time) I have to give it credit, it seems to do compositing better than I remember it ever working in the mid-aughts and... yeah, I think it *is* actually a little useful on its little tank-slit of a screen to be able to kind of see through things. It's definitely something the 486 with its 512K Cirrus Logic VGA could never dream of doing, and it does it effortlessly.

People think it's vice versa, that the "crappy gaming adapter" and its successors hampered PC gaming and drove developers away to better platforms. I think that's a load of crap.

Back in the day if a computer was popular people wrote games for it, no matter how lousy its graphics were. There are games for the black-and-white TRS-80's that I would rank pretty highly in my personal top 40 of best retro video games simply because they play so well, despite looking ludicrously crude. CGA wasn't great but it actually wasn't *that bad* by the standards of the time, especially when you factor in stupid pet tricks like composite color. The PC port of a given video game might never be the best one, but, eh, it'd probably at least outclass the Apple II port. This made it a ridiculous improvement over the graphics capabilities of the previous standard for business machines, i.e. CP/M, which was no graphics at all, not even TRS-80 blobs. In short, it was more than good enough to keep developers pounding on it until the march of time made better graphics standards mainstream enough to target.
 
CGA was not meant for 'gaming', certainly not through the RGBI port. It's there to draw a pie chart and to outline a table in vector fashion.
It would be interesting to know whether the successes of home computers such as the C64, Atari, and Amiga shaped EGA or VGA in any way. I don't believe so. I think those were thoroughly designed as visualization systems for the PC, everything else aside.

CAD, business graphics, and desktop publishing would have greatly influenced early graphics decisions. Early PC resolutions were probably intended to display on TV sets as an option - but by the time EGA came out, I think IBM would have realized that wasn't going to happen - still, games did make use of the NTSC colors... And of course, nothing like that made it to the PAL countries of the world.

EGA feels pretty normal for the 80s. I remember wanting to use it for PCB design, because the image was majorly improved from the CGA version. 640x400x16 was fantastic. 640x480x64 was even better, but was rare to see, even with multisync monitors.

EGA also supported this 6-bit color model in TTL that had 64 colors. These had two pins per color, IIRC... all on a 9-pin connector. This was less common and required a specific EGA-supporting monitor, though I think it was possible to use some monitors with a slight loss of color information.

It was "Super VGA" that seems the odd one out to me. Early SVGA cards weren't even VESA compatible, and software had to be written specifically for them above the normal VGA modes, which outside of high resolution text display, seemed somewhat limited.

VESA compatibility wasn't much better - and certainly didn't translate to games, outside of someone called Moraff (or something like that) who specifically made VESA-compatible SVGA games, like pinball.

VGA though could be thanked for simplifying the whole thing - and that it carried on for 30 years until digital interfaces were more common was amazing.
 
VESA compatibility wasn't much better - and certainly didn't translate to games, outside of someone called Moraff (or something like that) who specifically made VESA-compatible SVGA games, like pinball.
Eh? Nearly all SuperVGA DOS games used VESA modes. And there were quite a few.
 
Eh? Nearly all SuperVGA DOS games used VESA modes. And there were quite a few.

Sorry, I should have said "wasn't much better at first" - though I still don't recall a lot of games coming out for SVGA later on either - and by then we had moved on from 16 bit VGA cards to VLB interfaces for a lot of that.

I remember there was nothing much for the Trident 8900 I got back in the day.... Outside of Windows - and it seemed that if you didn't get specific drivers with the card, you were SOL when it came to selection and range.

Perhaps because I tended to play action games, larger resolutions meant that the game ran slower, as the CPUs didn't have the bandwidth to move a lot of data around quickly, and even then video RAM was sometimes considerably slower than system memory.

Also, I was thinking more around 1024x768x256 VESA mode for SVGA rather than 640x480x256 which is probably what those games were using. Actually, I forgot 640x480x256 was a VESA mode to begin with.
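For reference, the memory math behind which of these modes a given card could even set (the hex mode numbers are the standard VESA VBE assignments):

```python
# VRAM needed for a single frame in common VBE modes, versus the 512K
# that early SVGA boards often shipped with.
modes = [
    ("100h", 640, 400, 8),
    ("101h", 640, 480, 8),
    ("102h", 800, 600, 4),
    ("103h", 800, 600, 8),
    ("104h", 1024, 768, 4),
    ("105h", 1024, 768, 8),
]
for num, w, h, bpp in modes:
    need = w * h * bpp // 8
    print(f"mode {num}: {w}x{h} {2**bpp:>3} colors -> {need // 1024:>4} KiB"
          f" (fits in 512K: {need <= 512 * 1024})")
```

Only 1024x768x256 breaks the 512K budget (it needs 768K), which is why that resolution was the dividing line for so long.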
 
To put it another way, if you have 4MB or 8MB of VRAM to play with because the business expectation is you need to be able to do 1600x1200 in 24 bit color for spreadsheets then you've just backdoored your way into having enough RAM for implementing Z-buffers and textures at 640x480. This is how 3D moved from the early dedicated cards like Voodoos into the mainstream, and of course we've never looked back from that.
In my experience I can't recall the spreadsheet people using anything more than b/w Hercules. I don't recall any Voodoo types being implemented especially for word processing or accounting either. What happened was that PCs were being constantly upgraded, and the newer and better graphics cards were just part of the plan. Gaming was a factor in graphics card design starting in the mid-to-late 90's. I have a box full of them to attest to that, and they weren't purchased for office work.
 
I don't recall any Voodoo types being implemented especially for word processing or accounting either.

When did I say they were? My point was that at a certain point the fact that normie 2D graphics cards were getting fitted with four or eight megabytes of VRAM in order to display high “business oriented” resolutions meant it stopped really making sense to slap a separate dedicated card in to do 3D. Some level of 3D capabilities just started coming along for the ride, and it’s only after that when the RAM on video cards started growing much faster than pixel counts.

In my experience I can't recall the spreadsheet people using anything more than b/w Hercules.

They were in the mid-1980’s, but what were they using when everyone was using Excel on Windows 95?

Things changed *really fast* between about 1993 and 1997. At the start of that, sure, an accountant or lawyer’s office very well might mostly still have desks full of 386sx machines with Hercules mono monitors, but just a few years later it’s all Pentiums with megapixel monitors. Comparisons to the pre-Windows graphics standards stop making sense at this point.
 
Gaming wasn't a factor in video card design until 3D came about. When I was going to computer shows in the 90's, every vendor had 486 VLB systems with DOOM 2 playing as a demo to sell machines and cards (which just happened to be faster than other cards in games). Then when Quake came out, everybody used that as a demo with a Pentium and whatever PCI video card was out that worked well enough.

Gaming was a factor in sound card design, and probably the only reason you needed a sound card until DVD decoding and multimedia presentations became the norm.
 
Early SVGA cards weren't even VESA compatible, and software had to be written specifically for them above the normal VGA modes, which outside of high resolution text display, seemed somewhat limited.
This may be a stupid question... but don't the first SVGA cards predate the VESA BIOS extensions, at least the commonly used versions (VBE 1.2 and 2.0)? Can't fault them for not having support out of the box.

In any case, most vendors shipped a TSR to add VESA support (and later games shipped UniVBE or similar), which are fine in my opinion. It is a pure software standard, after all.

EGA also supported this 6-bit color model in TTL that had 64 colors. These had two pins per color, IIRC... all on a 9-pin connector. This was less common and required a specific EGA-supporting monitor, though I think it was possible to use some monitors with a slight loss of color information.
EGA can only show 16 colors at a time[*], but it has 64 colors to choose from. There are only four bitplanes.

[*] CPU-intensive hackery aside. Outside of a few demos, this was not done.
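That two-bits-per-channel scheme is easy to sketch. The bit order below follows the common rgbRGB description of the EGA palette registers (primary RGB in bits 2..0, secondary rgb in bits 5..3); treat it as illustrative, since sources differ on how they name the two halves:

```python
# Each EGA color channel gets 2 bits (a primary and a secondary line on
# the connector), so a channel level is 0..3 and 4*4*4 = 64 colors total.
def ega_to_rgb(value):
    """Expand a 6-bit EGA palette value to an 8-bit-per-channel triple."""
    r = 2 * ((value >> 2) & 1) + ((value >> 5) & 1)
    g = 2 * ((value >> 1) & 1) + ((value >> 4) & 1)
    b = 2 * (value & 1)        + ((value >> 3) & 1)
    return tuple(85 * c for c in (r, g, b))   # levels 0, 85, 170, 255

print(ega_to_rgb(0b000000))                       # black
print(ega_to_rgb(0b111111))                       # white
print(len({ega_to_rgb(v) for v in range(64)}))    # 64 distinct colors
```

The 16 palette registers each hold one of these 6-bit values, which is exactly the "16 at a time out of 64" arrangement described above.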

VESA compatability wasn't much better -and certainly didn't translate to games, outside of someone called Moraff ( or something like that ) who specifically made VESA compatible SVGA games, like pinball.
Since 640x480x8 is already a (widely used) SVGA mode, you'd need a VESA driver (or specific chipset support) to use it. My oddball ISA card even has a utility (not VESA) to make it pretend to be a different chipset, so that some games can use it at 800x600x4.

Older games did not rely on high resolution - even if the card itself could produce 1024x768, the monitor might not want to (especially notebook displays). On top of that, older 512K cards can only do it at 16 colors, often only interlaced, and ISA bandwidth prevents smooth scrolling or animation at all. Consequently, most games at the time were designed for 640x480 only.

As soon as you target VLB or PCI hardware and higher resolutions, you might as well target Windows as well. Doing that gets you hardware acceleration (especially BitBlt, hardware cursor and sometimes color conversions), which VBE did not provide at all. So VESA support did not matter.
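The ISA bandwidth point is easy to check with a back-of-the-envelope number. The ~5 MB/s figure below is an assumed ballpark for 16-bit ISA in practice; real throughput varied with chipset and wait states:

```python
# Assumed practical ISA throughput (ballpark, not a measured figure)
ISA_BYTES_PER_SEC = 5 * 10**6

def max_full_frames_per_sec(width, height, bytes_per_pixel):
    """Upper bound on full-frame redraws/sec if the bus is the bottleneck."""
    frame = width * height * bytes_per_pixel
    return ISA_BYTES_PER_SEC / frame

print(f"320x200x8:  {max_full_frames_per_sec(320, 200, 1):.1f} fps")
print(f"640x480x8:  {max_full_frames_per_sec(640, 480, 1):.1f} fps")
print(f"1024x768x8: {max_full_frames_per_sec(1024, 768, 1):.1f} fps")
```

About 78 fps at 320x200, but only ~6 fps at 1024x768x256: plenty for VGA action games, hopeless for smooth full-screen animation at SVGA resolutions over ISA.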
 
Older games did not rely on high resolution - even if the card itself could produce 1024x768, the monitor might not want to (especially notebook displays). On top of that, older 512K cards can only do it at 16 colors, often only interlaced, and ISA bandwidth prevents smooth scrolling or animation at all. Consequently, most games at the time were designed for 640x480 only.

Most games of the era were designed for 320x200, though more static games did go up from there... Myst was one, but it was nearly entirely static.

As soon as you target VLB or PCI hardware and higher resolutions, you might as well target Windows as well. Doing that gets you hardware acceleration (especially BitBlt, hardware cursor and sometimes color conversions), which VBE did not provide at all. So VESA support did not matter.

Yes, Windows had a huge effect, and even if it was bad at the time, the first time Microsoft shipped me some "Games For Windows" I could see the writing on the wall that signalled the end of the console era. PC games were pretty good by then, but once Microsoft got involved it was clear that graphics accelerators would move to Windows and that it was going to be the new platform... That only took about 4 years, and then VGA card drivers seemed only to care about Windows support.

I was a tech journalist at the time, and saw how quickly Microsoft dominated less catered-to markets such as education - and it was good stuff too. Magic School Bus was a great idea to move to the PC, and nothing else of the era came close, but then they shifted their focus in the late 90's to start dominating games, and while they were a little slow at first, they quickly caught up. I can't recall the last DOS game I ever played... it would have been really close to the turn of the century... I've got a strange feeling it might have been Strike Commander. Though by then Flight Unlimited had shown what would come in the future.

@Eudimorphodon is correct about the era of the most change - I'd agree with 1993 to 1997. From 386s to Pentiums. Even Intel started supporting OpenGL on some of their cards. But DirectX was just emerging at the time, and it was clear which way it would go, even though OpenGL was smashing them until the end of the decade.

The history of DirectX is quite interesting.
 
There was a brief period from 1995-1996 where Pentium CPUs were common, but 3D accelerators were not, and most games were still in DOS. Quite a few of those support SVGA and are far from static, such as Apache Longbow, Need for Speed, Screamer, Duke Nukem 3D, etc. VESA is also not specific to ISA cards. So this idea that the only "VESA games" were static is just not accurate.
 
There was a brief period from 1995-1996 where Pentium CPUs were common, but 3D accelerators were not, and most games were still in DOS. Quite a few of those support SVGA and are far from static, such as Apache Longbow, Need for Speed, Screamer, Duke Nukem 3D, etc. VESA is also not specific to ISA cards. So this idea that the only "VESA games" were static is just not accurate.

That was a long time, from 1990/91 to 1995/96, during which the VESA BIOS extensions were available but not really used much, IMO. And how many DOS games from 1990 to 1996 played in 1024x768?

And fast ones at that?

And how long did this last before Windows made more sense to developers than VESA/DOS?

My comments were comparing EGA to SVGA as interim formats. SVGA outlasted EGA over a long period of time, but outside of Windows we rarely saw the benefits of SVGA over direct drivers and natural resolution progress, while EGA left its mark on future standards.

This is the guy IIRC who made 1024x768 VESA based games - https://en.wikipedia.org/wiki/Steve_Moraff

Google Gemini's take on his contribution -
"Steve Moraff and his company, Moraffware, were notable for pushing the boundaries of DOS graphics. In the early 1990s, his shareware games were among the first to support "tweaked VGA modes" and SuperVGA chipset support, allowing for resolutions up to 1024x768x256. This was a significant step forward in DOS gaming graphics."

I can't recall anyone else doing that kind of graphics to really push SVGA in the early years. I'm not sure I would have thought of 640x480x256 as SVGA at the time - and a lot of games did support that mode later on.
 