
640x400 256 color mode on standard VGA

The basic issue is that a bog-standard VGA requires two pixel clocks to load an 8-bit pixel. That only leaves you 320 (640/2) or 360 (720/2) horizontal 8-bit pixels at standard refresh rates. If your VGA supports faster pixel clocks and has higher-bandwidth memory, there is probably a way to support many other resolutions.
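To see where those numbers come from, here's the arithmetic written out as a throwaway sketch (the 25.175 and 28.322 MHz figures are the two standard VGA dot clocks; nothing else is assumed):

```c
/* Trivial sketch: why a two-clocks-per-8-bit-pixel VGA tops out at
 * 320 or 360 horizontal pixels in 256-color modes at standard timings. */
#include <stdio.h>

int main(void)
{
    double clocks[2] = { 25.175, 28.322 };   /* standard VGA dot clocks, MHz */
    int    active[2] = { 640, 720 };         /* active 4-bit pixels per line */
    int i;

    for (i = 0; i < 2; i++)
        printf("%6.3f MHz dot clock: %d 4-bit pixels -> %d 8-bit pixels"
               " (effective %.2f MHz 8-bit pixel rate)\n",
               clocks[i], active[i], active[i] / 2, clocks[i] / 2.0);
    return 0;
}
```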
 
This may not have come across in the https://www.vcfed.org/forum/forum/te...722#post828722 thread, but just because I was able to get 640x400x256 working doesn't mean I recommend trying it. I don't recall which card it worked on, but I do recall the timing frequencies were making my multisync monitor whine quite loudly, which spooked me out of using the mode in anything production-worthy. I'm sure no "real" software ever used this mode. It was much, much safer to support VESA and instruct the user to install a VESA 1.2 or better driver for their specific video card, then enumerate all of the modes available on startup (you can't always rely on 100h being 640x400x256) and pick the one you want.
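For what it's worth, that startup enumeration boils down to something like the sketch below (my illustrative code, not anything shipped: it assumes a Borland-style real-mode DOS compiler with dos.h's int86x/segread/MK_FP, the buffer names and find_640x400x256 are made up here, and the field offsets are the standard VBE 1.2 VbeInfoBlock/ModeInfoBlock layouts):

```c
/* Sketch: enumerate VBE modes at startup and search for a 640x400x256
 * mode instead of assuming mode 100h is it.  Returns the mode number
 * to pass to AX=4F02h, or -1 if none is found. */
#include <dos.h>
#include <string.h>

static unsigned char vbe_info[512];    /* VbeInfoBlock  */
static unsigned char mode_info[256];   /* ModeInfoBlock */

static int vbe_call(unsigned ax, unsigned cx, void far *buf)
{
    union REGS r;
    struct SREGS s;

    segread(&s);                       /* start from current segment regs */
    r.x.ax = ax;
    r.x.cx = cx;
    r.x.di = FP_OFF(buf);
    s.es   = FP_SEG(buf);
    int86x(0x10, &r, &r, &s);
    return r.x.ax == 0x004F;           /* 004Fh = supported and successful */
}

int find_640x400x256(void)
{
    unsigned far *modes;
    unsigned mode;

    if (!vbe_call(0x4F00, 0, (void far *)vbe_info) ||
        memcmp(vbe_info, "VESA", 4) != 0)
        return -1;                     /* no VBE BIOS/TSR present */

    /* VideoModePtr at offset 0Eh: far pointer (offset word, then segment
       word) to a 0FFFFh-terminated list of 16-bit mode numbers. */
    modes = (unsigned far *)MK_FP(*(unsigned *)(vbe_info + 0x10),
                                  *(unsigned *)(vbe_info + 0x0E));

    while ((mode = *modes++) != 0xFFFF) {
        if (!vbe_call(0x4F01, mode, (void far *)mode_info))
            continue;
        if (*(unsigned *)(mode_info + 0x12) == 640 &&  /* XResolution  */
            *(unsigned *)(mode_info + 0x14) == 400 &&  /* YResolution  */
            mode_info[0x19] == 8)                      /* BitsPerPixel */
            return (int)mode;
    }
    return -1;
}
```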

By roughly 1991, effectively every card and chipset manufacturer was producing TSRs that provided at least VESA 1.1 or 1.2, so there was no reason to try to support chipset-specific (ATI, Cirrus, etc.) modes. I have a crapload of them archived at ftp://ftp.oldskool.org/pub/drivers/VESA Drivers if anyone needs to leech them.

IMO, the only unchained VGA modes that are worth anyone's time are the following (a minimal mode-set sketch for the 320x240 case follows below):
  • 320x200 - 70Hz, 4 video pages. Can use 2 pages for pageflipping, use the rest for storing sprites/tiles and using the card to copy them around.
  • 320x240 - 59.71Hz, 3+ video pages. Square pixels = win. Resist the urge to triple-buffer, it's not worth it.
  • 320x400 - 70Hz, 2 video pages. Double the resolution for free and still have 2 pages.
  • 360x480 - 59.97Hz, 1 video page. Good for higher-res static title screens.
  • 320x600 - 56.55Hz, 1 video page. If you want to make an "18-bit color" fakemode, this works; see https://www.pouet.net/prod.php?which=6094 for an example of it used in the real world.
All others are not worth using IMO because they don't offer benefits over the above. For example, 360x270 might seem appealing because it is larger than 320x240 and still has square pixels, but it has a 56Hz refresh, might not work with fixed-frequency monitors, and has 2 pages instead of 3. You'd be hard-pressed to find better unchained video mode choices than the five I listed above, and the first four are 100% compatible with all VGA cards and monitors (which also translates into being the most compatible with emulators).
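If anyone wants to play with these, here is a minimal sketch of setting up the 320x240 entry using the usual Abrash-style unchaining sequence (start from mode 13h, turn off chain-4, then rewrite the CRTC vertical timing). outport/outportb are the Borland names for port I/O, and this is sketched from memory rather than lifted from production code:

```c
/* Sketch: switch to 320x240 unchained 256-color ("Mode X") on a
 * standard VGA.  Borland-style dos.h port I/O assumed. */
#include <dos.h>

#define SC_INDEX    0x3C4   /* Sequencer index port      */
#define CRTC_INDEX  0x3D4   /* CRT Controller index port */
#define MISC_OUTPUT 0x3C2   /* Miscellaneous Output reg  */

/* CRTC parameters for 320x240: value in the high byte, register index in
 * the low byte, so each entry is written with one 16-bit OUT. */
static const unsigned crtc_params[] = {
    0x0D06,  /* vertical total */
    0x3E07,  /* overflow (bit 8 of vertical counts) */
    0x4109,  /* cell height (2 to double-scan) */
    0xEA10,  /* vsync start */
    0xAC11,  /* vsync end, and protect CR0-CR7 again */
    0xDF12,  /* vertical displayed end */
    0x0014,  /* turn off dword mode */
    0xE715,  /* vblank start */
    0x0616,  /* vblank end */
    0xE317   /* turn on byte mode */
};

void set_320x240_unchained(void)
{
    union REGS r;
    int i;

    r.x.ax = 0x0013;                 /* start from BIOS mode 13h */
    int86(0x10, &r, &r);

    outport(SC_INDEX, 0x0604);       /* disable chain-4 */
    outport(SC_INDEX, 0x0100);       /* synchronous sequencer reset */
    outportb(MISC_OUTPUT, 0xE3);     /* 25 MHz dot clock, 480-line timing */
    outport(SC_INDEX, 0x0300);       /* restart the sequencer */

    outportb(CRTC_INDEX, 0x11);      /* vsync end reg holds the protect bit */
    outportb(CRTC_INDEX + 1, inportb(CRTC_INDEX + 1) & 0x7F);

    for (i = 0; i < sizeof(crtc_params) / sizeof(crtc_params[0]); i++)
        outport(CRTC_INDEX, crtc_params[i]);
}
```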
 
From what I can tell, almost no card except the IBM original is actually a 'standard' VGA, so just because a mode works on one card which isn't a 'super' VGA doesn't mean it's possible on any non-'super' card. As far as I can tell, 'super' doesn't actually mean anything unless the card supports the VESA standard.

FWIW, this wasn't a problem that started with VGA. Many third-party EGA cards also support oddball modes that a true-blue IBM card won't. Run the setup program for an old version of "PC Paintbrush" sometime and marvel at just how much diversity there was. And of course none of them were mutually compatible in their extensions. Something like VESA standardization was needed for years before it came along. (And even longer before it really worked well.)

Nonetheless, I will try out CompuShow, but I suspect it's using a special driver which is aware of the extra registers.

It's right in the documentation that it knows about OAK VGA cards.

Like Resman says, the deal-killer is that a regular VGA card can't fetch 8-bit pixels at the same rate it can fetch 4-bit ones. The RAM on a standard VGA card is 32 bits wide. (Look at the card and you'll see its 256K of RAM is in the form of four 64Kx8 banks, not a single 256Kx8 bank.) In planar 16-color mode this means the hardware only has to do one 32-bit fetch from memory every eight pixels. (You can have four shift registers in parallel, each latching a byte from its own bank and each clocking out one bit per pixel tick, so the four planes collectively feed four bits per pixel clock to the output DAC.) This means that even at the higher of the two pixel clocks the memory only needs to run at an effective ~3.5 MHz fetch rate to supply the ~14 Mbyte/second worst-case bandwidth needed for pixel painting. 256-color modes shuffle the deck so each memory fetch gives you four pixels instead of eight, but that's fine as long as there are half as many pixels. But, yeah, your pixel clock is effectively halved, which means you're limited to modes you can render with clocks of ~12.5 or ~14 MHz.

What you're trying to do is get 256-color pixels at the same pixel clock as 16-color pixels, which means you're doubling the bandwidth you're asking from memory. By modern standards, expecting sustained 25 or 28 Mbyte/s performance from a 32-bit-wide memory array doesn't sound like too much to ask, but in 1987 that was actually a little bit ambitious for consumer-grade hardware. (By IBM's standards, anyway. Something to remember is that IBM was never ambitious when it came to the PC line; the hardware engineering was always conservative for the time.) But more to the point, the original IBM VGA wasn't built to do that so far as I know. I assume the clones that could do 256-color modes at the same resolutions as 16-color modes had memory that could keep up?
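Putting numbers on that doubling, as a quick sketch (worst case at the faster 28.322 MHz dot clock, with the 32-bit-wide memory array described above):

```c
/* Sketch of the bandwidth arithmetic: planar 16-color vs. 256-color
 * at the same dot clock, on a 32-bit-wide memory array. */
#include <stdio.h>

int main(void)
{
    double dot_clock   = 28.322e6; /* faster standard dot clock, pixels/s  */
    double fetch_bytes = 4.0;      /* one fetch reads all four 8-bit banks */

    double fetches_16  = dot_clock / 8.0;  /* 16-color: 8 pixels per fetch  */
    double fetches_256 = dot_clock / 4.0;  /* 256-color, same clock: 4/fetch */

    printf("16-color : %.2f MHz fetch rate, %.1f Mbyte/s\n",
           fetches_16 / 1e6,  fetches_16  * fetch_bytes / 1e6);
    printf("256-color: %.2f MHz fetch rate, %.1f Mbyte/s\n",
           fetches_256 / 1e6, fetches_256 * fetch_bytes / 1e6);
    return 0;
}
```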

Trixter's workaround, as he explains, was to reduce the frames per second so a 14 MHz pixel clock paints a 640x400 screen thirty-something times a second instead of 60. This would melt an IBM 8513 into slag. (Or, hopefully, just make it not sync.)
 
I haven't looked into whether this is actually possible, but I'm wondering if there is a way to pan the screen horizontally by half a pixel in 320x400x256 mode. If so, then you could set up a sort-of 640x400x256 mode on a standard VGA by flipping between two pages and two pan positions at each vsync. The technique would be similar to how you can double the vertical resolution for a given hsync frequency by using interlacing, except that you'd be "interlacing" horizontally instead of vertically. The CPU would need to be involved (it's not a "set and forget" mode that can be used with arbitrary code). And there would be some flicker in areas of high horizontal detail but (for typical images) not nearly as bad as the flickering you'd get by halving the vsync rate.
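Mechanically, the per-vsync work would look something like the sketch below: wait for vertical retrace, point the CRTC start address at the other page, and write the Attribute Controller pel-panning register. Whether any pel-pan setting actually gives a half-pixel shift in a 256-color mode is exactly the part I haven't verified, and the pan value here is a placeholder (Borland-style port I/O assumed):

```c
/* Sketch of the per-frame flip for the horizontal-"interlace" idea:
 * each vsync, show the other page at the other pan position. */
#include <dos.h>

#define INPUT_STATUS 0x3DA   /* bit 3 = vertical retrace               */
#define CRTC_INDEX   0x3D4
#define AC_INDEX     0x3C0   /* attribute controller index/data port   */

#define PAGE0 0x0000         /* (320/4)*400 bytes per unchained page   */
#define PAGE1 0x7D00

static void wait_vsync(void)
{
    while (inportb(INPUT_STATUS) & 0x08) ;    /* wait for retrace end   */
    while (!(inportb(INPUT_STATUS) & 0x08)) ; /* wait for retrace start */
}

static void set_start_and_pan(unsigned start, unsigned char pan)
{
    outport(CRTC_INDEX, 0x0C | (start & 0xFF00));        /* start high */
    outport(CRTC_INDEX, 0x0D | ((start & 0x00FF) << 8)); /* start low  */

    (void)inportb(INPUT_STATUS);      /* reset AC index/data flip-flop  */
    outportb(AC_INDEX, 0x13 | 0x20);  /* pel panning reg, screen stays on */
    outportb(AC_INDEX, pan);
}

/* Call once per frame from the main loop. */
void flip_field(int field)
{
    wait_vsync();
    if (field)
        set_start_and_pan(PAGE1, 1);  /* placeholder "half-pixel" pan   */
    else
        set_start_and_pan(PAGE0, 0);
}
```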
 
My concern with this is that, like CGA RGB monitors, the phosphor in most VGA monitors is fast decay, so the flickering would range from "noticeable" to "irritating".
 
Actually, I wonder whether the pixel clock is truly halved or whether the pixels are just doubled.
 