Sorry, but I'm lost on your point. Weren't hi-res and acceleration more or less among those qualities, and didn't they simply progress?
I think
@Zare 's point was that early on the amount of RAM you had on your video card *directly* related to the quality of the
static picture you could get out of it, which is a thing that's not even remotely true anymore. I.e., when you're comparing the 16K on a CGA card to the 256K on a VGA card, every doubling of the byte count translated into a significant difference in color depth or pixel count. These were real quality-of-life improvements that mattered just as much for business software as for games, if not more; anyone who's ever had to slog through doing page layout on a 640x200 monochrome screen (* hands up right here) will attest that the difference between that and 640x480 *alone* is a very big deal. The amount of video memory on your video card directly correlated with how high-res and how colorful a picture you could drive right up until the mid-1990s... and that's really all that's relevant when making comparisons to EGA. 3D is a completely different animal: all that extra memory went to textures up until GPUs started turning into general-purpose processors, and now most GPUs are more powerful than the computers they're slotted into.
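If anyone wants to sanity-check that, the framebuffer arithmetic fits on a napkin: bytes = width x height x bits-per-pixel / 8. A quick sketch (the mode list here is just my illustration):

```python
# Back-of-the-envelope framebuffer math: bytes a packed, single-buffered
# frame needs at a given resolution and color depth.
def framebuffer_bytes(width, height, bits_per_pixel):
    return width * height * bits_per_pixel // 8

# Classic PC display modes: (name, width, height, bits per pixel)
modes = [
    ("CGA 320x200, 4 colors",   320, 200, 2),  # ~16K: fills the whole card
    ("EGA 640x350, 16 colors",  640, 350, 4),  # ~109 KiB
    ("VGA 640x480, 16 colors",  640, 480, 4),  # ~150 KiB
    ("VGA 320x200, 256 colors", 320, 200, 8),  # ~62 KiB ("mode 13h")
]

for name, w, h, bpp in modes:
    print(f"{name}: {framebuffer_bytes(w, h, bpp) / 1024:.0f} KiB")
```

Every step up the mode list eats memory fast, which is exactly why the byte count on the card *was* the quality of the picture back then.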
Gaming really wasn't that much of a driver for PC video card improvements in the 1980s, simply because PCs were too freaking expensive for that; if you wanted to play a good video game you were better off getting a Commodore 64. People spending $3,000-$5,000 on a PC with a high-end video card were mostly concerned with how well it did AutoCAD or drew Lotus 1-2-3 pie charts, and it's telling that most of the games people really remember as showing off what EGA could do didn't come out until the late 1980s and early 1990s, after actual EGA cards were already obsolete.

The real driver of improvements in PC graphics for years was the advent of Windows, which most games other than Solitaire didn't run under until the second half of the 1990s. And while 2D acceleration was definitely "a thing" that manufacturers implemented, in part as an attempt to work around the hideous bus restrictions imposed by ISA, for the most part the improvements were all about increasing color depths and resolutions until things started plateauing at high/true-color megapixel modes. Gaming was mostly a trailing indicator of these improvements until every new PC sold had a PCI slot, which is what finally cleared the decks for 3D acceleration to become something you could start tacking onto cards that were already better than they needed to be for all those mainstream business applications.
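To put some rough numbers on those ISA restrictions (ballpark figures on my part, not measurements):

```python
# Why pushing pixels over ISA hurt: compare what a full-screen redraw
# needs against what the bus could realistically move.
ISA_BYTES_PER_SEC = 5 * 1024**2  # ~5 MB/s: ballpark for 16-bit ISA in practice

def full_redraws_per_second(width, height, bytes_per_pixel):
    frame = width * height * bytes_per_pixel
    return ISA_BYTES_PER_SEC / frame

# 800x600 at 8bpp: the CPU can push only ~11 full-screen repaints per
# second across the bus. That's why on-card blit/fill engines (2D
# acceleration) mattered for Windows long before 3D did.
print(f"{full_redraws_per_second(800, 600, 1):.0f} full redraws/sec")
```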
To put the memory side of it another way: if you have 4MB or 8MB of VRAM to play with because the business expectation is that you need to be able to do 1600x1200 in 24-bit color for spreadsheets, then you've just backdoored your way into having enough RAM to implement Z-buffers and textures at 640x480. This is how 3D moved from the early dedicated cards like the Voodoos into the mainstream, and of course we've never looked back from that.
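A quick sketch of that budget, assuming a double-buffered 16bpp color buffer plus a 16-bit Z-buffer (one plausible mid-90s setup, not the only one):

```python
# VRAM budget argument: a card sized for big 2D business modes has room
# left over for the buffers early 3D needed.
MB = 1024**2

def texture_budget_mb(total_vram_mb, w, h):
    front = w * h * 2  # 16bpp front buffer
    back  = w * h * 2  # 16bpp back buffer
    zbuf  = w * h * 2  # 16-bit depth buffer
    return (total_vram_mb * MB - (front + back + zbuf)) / MB

# Even a 4MB card sized for big 2D desktop modes still leaves ~2.2MB
# free for textures when a game runs at 640x480.
print(f"{texture_budget_mb(4, 640, 480):.1f} MB left for textures")
```

Swap in whatever buffer layout you like; the point is the business-driven headroom, not the exact split.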