
How does a MDA/Hercules/CGA/EGA/VGA/SVGA display know the pixel aspect ratio to output in each video mode?

juj

Hi all,

I have been looking into the outputs of VGA and other display standards recently. If I understand correctly, VGA is able to output essentially two different vertical timings, 400 lines and 480 lines.

How does a monitor know the pixel aspect ratio it should output on the display in a given mode?

In 320x200 resolution (which is practically a 640x400 resolution as far as the VGA output on the wire is concerned, iiuc?) pixels have an aspect ratio of 5:6, since the 320x200 image is getting stretched to cover essentially a 4:3 screen area.

In 320x240 Mode-X resolution, or in 640x480 VGA resolution, pixels have a square aspect ratio, since this pixel count is shown on the same 4:3 screen area, and the aspect ratios match.

However, how does a VGA monitor know essentially how much downwards to increment the electron beam on each horizontal retrace pulse?

In 400-line modes there are 400 horizontal retraces per frame, and in 480-line modes there will be 480 horizontal retraces. Wouldn't the electron beam advance vertically by the same amount each retrace period, independent of how many lines there are in total? Well, we know that is not the case, since otherwise both 400- and 480-line modes would have the same pixel aspect ratio. Instead, the VGA monitor somehow knows that in 320x200 modes (400-line modes?) it needs to traverse down the screen "quicker" than in a 480-line mode, since both modes produce a 4:3 image.

How does the monitor know how to do that? There is no extra information pin on the VGA connector to carry that info, and as far as I can tell it is not e.g. the hsync pulse duration that controls this either?

Do VGA monitors implement some hardcoded line counting logic? If so, what would happen if one creates a slightly nonstandard video mode, e.g. a 399-line mode instead of a 400-line mode?

What about the pixel aspect ratio of other video modes? 640x480 modes have a 1:1 pixel aspect ratio, 640x400 is 5:6 (stretched vertically 20% taller so that 640x400 covers the 4:3 screen area). There are a lot of other video modes across the different video card standards. What kind of pixel aspect ratios are in use in those modes? Are there others than 1:1 and 5:6?

How do the monitors know which pixel aspect ratio to implement?

Thanks!
 
For the original VGA, the switch between the available modes was controlled by the polarity of the sync signals. (Both the horizontal and vertical sync could flip, which gave it the necessary combinations for the 350, 400, and 480 line modes.) EGA used a similar system on a single sync line to switch between 200 and 350 line modes.

Newer multisync monitors do essentially count lines (# of hsync pulses between vsync) to adjust themselves to the far more varied set of options they need to support.
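
Purely as an illustration of that counting idea (not how any particular monitor's logic actually works; the totals and tolerance below are just for the example):

# Hedged sketch: infer the mode by counting hsync pulses between two vsync
# pulses. A real multisync monitor does this in its own circuitry/firmware;
# the event list, known totals and tolerance here are illustrative only.

def count_lines(sync_events):
    # sync_events: iterable of 'hsync'/'vsync' markers for one or more frames
    frames, current = [], 0
    for ev in sync_events:
        if ev == "hsync":
            current += 1
        elif ev == "vsync":
            frames.append(current)   # total scanlines, including blanking
            current = 0
    return frames[-1] if frames else 0

def snap_to_known_total(total, known=(449, 525, 628, 806), tolerance=5):
    # Snap a measured scanline total to the nearest known standard, if close.
    for k in known:
        if abs(total - k) <= tolerance:
            return k
    return total   # slightly off-spec modes just get shown with measured timing

# A 400-line VGA mode has 449 total scanlines per frame on the wire:
events = ["hsync"] * 449 + ["vsync"]
print(snap_to_known_total(count_lines(events)))   # -> 449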
 
For other modes and oddball resolutions, just go by the PAR = DAR / SAR formula, with DAR always being 4:3 for these pre-widescreen standards. E.g. some pixel aspect ratios (there's a quick calculation sketch after the list):

MDA/HGC 720x350: (4:3) / (720:350) = 35:54 (≈2:3)
VGA 80-column text, 720x400: (4:3) / (720:400) = 20:27 (≈3:4)
VGA 40-column text, 360x400: (4:3) / (360:400) = 40:27 (≈3:2)
Hi-res EGA, 640x350: (4:3) / (640:350) = 35:48 (≈3:4)
CGA 80-column text, 640x200: (4:3) / (640:200) = 5:12
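
For reference, here's how those numbers fall out of the formula; a throwaway calculation (the mode list is just the ones above plus a couple more), nothing the monitor itself ever computes:

from fractions import Fraction

def pixel_aspect_ratio(width, height, display_aspect=Fraction(4, 3)):
    # PAR = DAR / SAR, with SAR = width:height of the pixel grid
    return display_aspect / Fraction(width, height)

modes = [
    ("MDA/HGC 720x350",  720, 350),
    ("VGA text 720x400", 720, 400),
    ("VGA text 360x400", 360, 400),
    ("EGA 640x350",      640, 350),
    ("CGA text 640x200", 640, 200),
    ("VGA 320x200",      320, 200),
    ("VGA 640x480",      640, 480),
]
for name, w, h in modes:
    par = pixel_aspect_ratio(w, h)
    print(f"{name}: {par.numerator}:{par.denominator}")
# -> 35:54, 20:27, 40:27, 35:48, 5:12, 5:6, 1:1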

So yep, there are definitely lots of weird pixel aspects out there. Just keep in mind that, strictly speaking, the monitor doesn't know or care much about the size of a pixel. It does have to worry about scanlines (= the vertical dimension), but each of those scanlines is basically a stream of voltages, and in the horizontal dimension it can be 'sliced up' pretty much arbitrarily. That depends more on the video hardware's dot clock. For instance, a VGA monitor doesn't really distinguish between 720x400 and 640x400.
 
How does a monitor know the pixel aspect ratio it should output on the display in a given mode?
It doesn't. It knows how long a line is - in microseconds, not in pixels - and simply draws it.
For pre-VGA standards, this time is fixed, and multisync screens use the time between
horizontal sync pulses to figure that out.

Analog video is not about pixels. Each line has infinitely many pixels in theory.
In practice, the analog bandwidth is limited, so you can't differentiate them all,
and in color CRTs, the shadow mask puts a limit on the achievable resolution.

However, how does a VGA monitor know essentially how much downwards to increment the electron beam on each horizontal retrace pulse?
The number of lines is fixed. Again, for MDA/CGA, the increment is fixed and
the vertical sync just starts over. EGA has two different modes (200 and 350 lines),
and it uses the vertical sync signal polarity to differentiate. The same holds true for
early VGA (400 and 480 lines), and the polarity is used here, too.
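
If I remember the original VGA convention correctly (worth double-checking against the IBM documentation), the polarity combinations map like this; a trivial lookup just to make the idea concrete:

# Sketch of the sync-polarity signalling used by fixed-frequency VGA monitors.
# '+' = positive-going pulse, '-' = negative-going. This table is from memory,
# so treat it as an assumption rather than gospel.
VGA_POLARITY_TO_LINES = {
    ("+", "-"): 350,   # e.g. 640x350 EGA-compatible modes
    ("-", "+"): 400,   # e.g. 720x400 text, 320x200 / 640x400 graphics
    ("-", "-"): 480,   # e.g. 640x480 graphics
    # ("+", "+") was left unused/reserved
}

def visible_lines(hsync_polarity, vsync_polarity):
    return VGA_POLARITY_TO_LINES.get((hsync_polarity, vsync_polarity))

print(visible_lines("-", "+"))   # -> 400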

Multisync screens simply count the number of horizontal sync pulses
between vertical sync pulses to detect the correct line count.

There is no difference between 200/400 or 240/480 line modes in the signal,
because it is the video card which doubles each line. This keeps software
written for the old 200/240-line modes working even though the signal timing
in hardware is different.

Do VGA monitors implement some hardcoded line counting logic? If so, what would happen if one creates a slightly nonstandard video mode, e.g. a 399-line mode instead of a 400-line mode?
Older VGA monitors do not implement any logic. Depending on the mode, the
beam is advanced by a specific amount (this is analog, not digital). If no vsync
happens, then the beam will advance below the visible area and eventually
reach some limit (e.g. saturating the capacitor).

Each vsync causes the beam to return to the top. Send only 300 lines and the
bottom part will never be changed.


What about the pixel aspect ratio of other video modes? 640x480 modes have a 1:1 pixel aspect ratio, 640x400 is 5:6 (stretched vertically 20% taller so that 640x400 covers the 4:3 screen area).
Your problem is thinking about "pixels". The video signal has no pixels, only lines.
In standard VGA mode (640x480), the line length is 31.7 us. It is the pixel clock of
25 MHz which turns that into a number of pixels on the screen. Increase the pixel
clock and you get more pixels per line.

I used to run an old fixed-frequency VGA monitor at 1056x480, which gave me
an aspect ratio of about 7:3. But it worked fine, because it was a standard signal.
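
To put rough numbers on that: the "horizontal total" in pixels is just the dot clock times the line period, and the monitor never sees that division. Back-of-the-envelope only; the 41.5 MHz figure is my guess at roughly what a 1056-wide mode would need, not a measured value:

# How the dot clock slices a fixed-length scanline into pixels. The monitor
# only sees the ~31.8 us line; the pixel count is the video card's business.
line_period_us = 31.778                      # 31.469 kHz line rate
for dot_clock_mhz in (25.175, 28.322, 41.5):
    total_clocks = dot_clock_mhz * line_period_us
    print(f"{dot_clock_mhz} MHz -> ~{total_clocks:.0f} clocks per line")
# 25.175 MHz -> ~800 clocks (640 visible), 28.322 MHz -> ~900 (720 visible),
# and ~41.5 MHz lands around the ~1320 clocks a 1056-visible mode would need
# if the blanking fraction stays similar to the standard 640x480 timing.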

How do the monitors know which pixel aspect ratio to implement?
They don't really care: A line is from "left" to "right", and a frame is from "top" to "bottom".
Whatever those may be.

On a digital display (LCD), the situation is different. They look at the timing values
by measuring the line and frame lengths, then they take a guess at the mode
(this is why most OSDs show "720x400" even in 320x200 modes), then they
read the complete frame into a buffer and upscale it to whatever native resolution
the screen has. Therefore, LCD screens introduce lag compared to CRTs.
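
As a concrete illustration of that guessing step (heavily simplified; the mode table and tolerances below are illustrative, not from any scaler datasheet):

# Simplified sketch of an LCD scaler guessing the input mode from measured
# sync timings. Real scalers have much larger mode tables and smarter logic.
MODE_TABLE = [
    # (hsync kHz, vsync Hz, reported mode, sampling clock MHz)
    (31.469, 70.087, "720x400", 28.322),
    (31.469, 59.940, "640x480", 25.175),
    (37.500, 75.000, "640x480", 31.500),
]

def guess_mode(hsync_khz, vsync_hz, tolerance=0.5):
    for h, v, name, clock in MODE_TABLE:
        if abs(hsync_khz - h) < tolerance and abs(vsync_hz - v) < tolerance:
            return name, clock
    return "unknown", None

# A DOS game in 320x200 still puts a 31.47 kHz / 70 Hz signal on the wire,
# so the scaler reports "720x400" and samples the line accordingly.
print(guess_mode(31.47, 70.1))   # -> ('720x400', 28.322)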

Hope this helps.
 
Your problem is thinking about "pixels". The video signal has no pixels, only lines.
In standard VGA mode (640x480), the line length is 31.7 us. It is the pixel clock of
25 MHz which turns that into a number of pixels on the screen. Increase the pixel
clock and you get more pixels per line.

I used to run an old fixed-frequency VGA monitor at 1056x480, which gave me
an aspect ratio of about 7:3. But it worked fine, because it was a standard signal.

Note of course that if you're dealing with an LCD monitor or something else that does digital scaling, using non-standard pixel clocks (which is totally kosher on analog displays) may well have suboptimal effects, because the monitor might be hardwired to think "well, I'm getting HSYNC and VSYNC that feel like they should be the 640x480 pixel mode, so I'll sample pixels at ~25 MHz, give or take a little slop". I.e., the monitor might be too dumb to realize your pixel clock is non-standard and just do a lousy job aliasing your input into 640 horizontal chunks, even if the LCD it's destined for has enough pixels to provide a better representation. People fooling around with video output from microcontrollers run into this annoyance a lot.

But, yes, this imposition of "pixelization" in the horizontal direction is a digitization artifact, not something that an old analog monitor would do.
 