
Gateway P5-133XL on a modern DisplayPort/HDMI monitor without using a VGA converter - is it possible?

Peregrinans

New Member
Joined
Jul 28, 2023
Messages
3
Good evening, first post here. I have a Gateway P5-133XL computer in very good condition that I use for running DOS and Win9x software and reading old software on 5.25" floppies. The machine has what may be the original video card - a Matrox Millennium VGA card in a PCI slot. The motherboard has four PCI slots and several ISA slots. Right now, I have it plugged into a Dell 2407 monitor with VGA ports and it works fine. I'd like to see if I could set it up on a KVM switch to allow me to use my main monitor, which only has DisplayPort and HDMI inputs. I can do that (kind of) using a VGA-to-HDMI converter, which works fine for Windows 98 and associated Windows software. However, it doesn't display anything until the machine loads Win98, so I can't see the boot screen or a basic DOS screen. Even worse, when I run an old DOS-based game (e.g., Maniac Mansion or KOEI's Genghis Khan) I can hear the music but get no screen output.

I suspect the machine is too old to support a video card with HDMI output, but I am wondering if there is an early DVI card that would fit in an early-gen PCI slot and play nicer with a converter. Any suggestions for how I could display boot screens/DOS screens on a modern monitor would be greatly appreciated. I could simply get a new monitor with a VGA input, but I eventually want to wean myself completely off of VGA. While I built several computers back in the 90s, my memory of 90s tech is now very dim, and I always upgraded whole machines rather than trying to match a new video card with a really old motherboard. Thanks!
 
GeForce 6200 should work.

That's not going to fix his problem. No gender changer or different video port is going to solve the problem he has.

His modern monitor simply does not support the old CGA/EGA/VGA/VESA resolutions that his legacy machine runs. That is why the monitor can sync once Windows is loaded, since Windows uses a standard resolution the monitor supports, but shows a black screen in all other cases.

He will either need to keep his older monitor, or invest in a scan doubler that can take the resolutions that DOS and various games run at and turn them into a format the modern monitor can sync to.

A different video card can be tried, but it's most likely not going to work. There's also the additional problem of those late PCI video cards often having broken VESA BIOS support, so he could run into problems trying to run some software.
 
His modern monitor simply does not support the old CGA/EGA/VGA/VESA resolutions that his legacy machine runs.

Nearly every monitor supports 720x400. His problem is the VGA to HDMI converter. Many of them only support 640x480, 800x600, etc. GeForce 6200 has a native DVI output and only needs a passive adapter.
 
I don't know of any that support 320x200.
Most likely because a VGA card simply cannot output that resolution, so VGA monitors don't need to deal with it. Ever.
If you set a 320x200 resolution, a VGA card will double-scan and use an appropriate pixel clock to actually output a 720x400 video timing.

Please stop spreading misinformation.
 
Most likely because a VGA card simply cannot output that resolution, so VGA monitors don't need to deal with it. Ever.

VGA monitors, and monitors in general, had to put up with all sorts of signal abuse and non-standard nonsense; they did have to deal with it, all the time. Third-party VGA clone manufacturers all treated VGA and its backward compatibility with EGA and CGA differently; there was no conformity between manufacturers.


If you set a 320x200 resolution, a VGA card will double-scan and use an appropriate pixel clock to actually output a 720x400 video timing.

Please stop spreading misinformation.

No, you stop spreading misinformation. Please learn how to maths. Doubling a 320x200 signal would be 640x400. And not all modes are double scanned. Straight from the horse's mouth:


Pages 2-12 and 2-13: only the 200-line modes are double-scanned, the rest are not. So a VGA monitor is expected to be able to display weird modes like 320x350, 360x400, 640x350, etc. Never mind LCD monitors hating these resolutions, late CRT monitors didn't like them either. I remember on more than one occasion having to set up a custom geometry for these modes when playing old DOS games that used them, or they'd end up squished and sometimes off the left or right of the screen. The only monitors that generally didn't have issues were those from the period when these resolutions were common.

An easy test to see how much abuse your monitor can take is to fire up Quake and go to its video mode section; it supported many of the unusual and lesser-used display modes.
 
Pages 2-12 and 2-13: only the 200-line modes are double-scanned, the rest are not. So a VGA monitor is expected to be able to display weird modes like 320x350, 360x400, 640x350, etc.

Just FYI: the 350-line modes in VGA have the same “400 line at 70 Hz” framing and scan rate as VGA text 720x400 and the double-scanned 200-line modes. (Which are effectively 640x400; VGA has two pixel clocks, 25 and 28 MHz, which give you the 640 or 720 pixels across with the same sync framing.) VGA switches sync *polarity* in the “350 line” mode to signal the monitor to stretch out its vertical margins. And yes, this requirement kind of broke the 350-line mode late in analog VGA’s lifecycle because some monitors wouldn’t honor the stretch.
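
For reference, the polarity signaling boils down to the little table below. This is just a sketch of the original IBM convention written out as a Python mapping (the name is mine, nothing pulled from an actual card's registers):

Code:
# Original VGA convention: a fixed-frequency VGA monitor infers the vertical
# size from the *polarity* of the sync pulses rather than by counting lines.
# "+" = positive-going pulse, "-" = negative-going pulse.
VGA_SYNC_POLARITY_TO_LINES = {
    ("+", "-"): 350,  # hsync +, vsync -  -> 640x350-style (EGA-compatible) modes
    ("-", "+"): 400,  # hsync -, vsync +  -> 720x400 text, double-scanned 200-line modes
    ("-", "-"): 480,  # hsync -, vsync -  -> 640x480 modes
}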

In short, though:

If you set a 320x200 resolution, a VGA card will double-scan and use an appropriate pixel clock to actually output a 720x400 video timing.

This. If a monitor can display VGA text it can at least frame VGA’s 320x200 mode. Some LCD monitors and scalers do a s***ty job actually displaying it because they get thrown by the pixel clock change, but they still throw *something* up on the screen.

(I have to hit the auto sync function on the LCD I have on my Tandy 1000 + old VGA card when I go to a 70 Hz graphics mode because, yes, I get this vertical bar distortion that resolves when the monitor is forced to resync its dot lock. It sucks a little but it’s not uncommon.)

Anyway. Honestly, the only way to know if the new monitor *really* can’t sync to the 400-line @ 70 Hz framing or if it’s the converter is to read the manual or try a selection of native-DVI-equipped VGA cards made before 2005 or so. Why do I throw that “2005-ish” proviso in there? Because it’s around that timeframe that PC video card BIOSes started looking at the EDID info coming from the monitor and automatically scaling *on their end* their BIOS video modes to fit the resolutions the monitor says it supports.
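
If you want to see what the monitor itself claims before buying anything, the EDID block actually has a bit for this: 720x400 @ 70 Hz is one of the "established timings" in byte 35. A rough Python sketch is below; it assumes a Linux box that exposes the EDID under /sys/class/drm (adjust the path, or feed it an EDID dump from whatever tool you use), and of course a monitor advertising the bit doesn't guarantee its scaler handles the mode gracefully - it's just a quick sanity check.

Code:
# Quick check: does the monitor's EDID advertise 720x400 @ 70 Hz,
# i.e. the framing used by VGA text and the double-scanned 200-line modes?
import glob

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])
BIT_720x400_70 = 0x80   # Established Timings I (byte 35), bit 7
BIT_640x480_60 = 0x20   # same byte, bit 5

for path in glob.glob("/sys/class/drm/card*-*/edid"):
    with open(path, "rb") as f:
        edid = f.read()
    if len(edid) < 128 or not edid.startswith(EDID_HEADER):
        continue  # nothing connected, or not a valid base EDID block
    established = edid[35]
    print(path)
    print("  720x400 @ 70 Hz:", "yes" if established & BIT_720x400_70 else "no")
    print("  640x480 @ 60 Hz:", "yes" if established & BIT_640x480_60 else "no")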

Anecdote: in late 2006 they bought me a Mac Pro tower with dual 23” monitors at work. Apple’s DVI monitors from that era *do not support* PC-centric resolutions or scaling; they report their native pixel resolutions and expect any scaling to happen on the PC end. For “reasons” I tried using those monitors on some DVI-equipped PCs, and long story short: anything “older” like a GeForce 4 or whatever would not display boot text; the screen would only come on when the graphical desktop loaded. But the Radeon X200 in a 2005-vintage Dell GX280 worked 100%. So… yeah. Your mileage is going to vary.
 
VGA monitors, and monitors in general, had to put up with all sorts of signal abuse and non-standard nonsense; they did have to deal with it, all the time.
A standard VGA card only has two clock generators (25 MHz and 28 MHz) and does not divide those pixel clocks. This design severely limits the number of video timings a VGA card can produce. Note that I am talking about timings here, not modes.

You know that a CGA card outputs 320x200 at 60 Hz using a 15 kHz line frequency. But you also know that normal VGA monitors* do not handle a 15 kHz/28 kHz signal. Why do you think this is? Because straight VGA cards do not output such a video timing.

If they kept the 200-line frame at VGA's ~31.5 kHz line rate, they would end up outputting 320x200 at roughly 120 Hz instead of 60 Hz. Instead, they double-scan each line and output 320x400 at VGA's 70 Hz vertical rate.

* Some multi-sync monitors do, but these often come with additional inputs or use TV-derived circuitry internally.

No, you stop spreading misinformation. Please learn how to maths. Doubling a 320x200 signal would be 640x400.
Yes, and the video timing of a 640x400 signal is identical to a 320x400, 720x400 or anything-times-400 signal, because the line timing is identical and there is only one 400-line option.

The number of pixels per line does not matter - it is an analog video signal and the receiver decides how many pixels it wants to see. This is why many LCD screens always show "720x400" for a "640x400" mode and scale badly - on the wire, both are identical.
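
You can sanity-check that with the textbook VGA totals (active plus blanking). These are the standard numbers, not anything measured off this particular machine - just a quick sketch:

Code:
# Standard VGA totals (active + blanking):
#   25.175 MHz clock, 800 clocks per line -> the 640-wide modes
#   28.322 MHz clock, 900 clocks per line -> the 720-wide modes
#   449 total lines -> the 400-line (and double-scanned 200-line) framing
#   525 total lines -> the 480-line framing

def rates(pixel_clock_hz, clocks_per_line, total_lines):
    line_khz = pixel_clock_hz / clocks_per_line / 1000.0
    frame_hz = pixel_clock_hz / (clocks_per_line * total_lines)
    return line_khz, frame_hz

for name, clock, clocks_per_line, total_lines in [
    ("320x200 double-scanned / 640x400", 25_175_000, 800, 449),
    ("720x400 text",                     28_322_000, 900, 449),
    ("640x480",                          25_175_000, 800, 525),
]:
    line_khz, frame_hz = rates(clock, clocks_per_line, total_lines)
    print(f"{name:34s} {line_khz:5.2f} kHz  {frame_hz:5.2f} Hz")

Both 400-line rows come out to the same ~31.5 kHz / ~70 Hz wire timing; only the pixel clock (and therefore how many samples the receiver takes per line) differs.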
 
So just a brief followup... I tried a Matrox G450 dual-output (VGA/DVI) PCI card (I think it is a G450; the card was marked "MATROX 7003-0301 G45FMDVP32DB"). I took the chance because that card came out in 2000, which is only five years younger than the computer. The DVI worked on my digital monitor with a simple DVI-to-HDMI cable, even at the power-up screens, and it displayed several old DOS game screens perfectly. Unfortunately, in the process I appear to have corrupted the BIOS - I get a persistent "NVRAM data invalid, NVRAM data cleared" error that doesn't go away even after resetting the BIOS and swapping back in the original Matrox video card. Not sure if it was the card itself or if something got fried during the card swap. The system still seems to work; I just need to hit ESC at every bootup. Planning to source another P5-133 motherboard and see if I can eliminate the NVRAM error. Thanks for the input!
https://www.ebay.com/itm/166669992925
 
Did you go into the BIOS setup and save the settings? If you just press ESC, the checksum won't update. Also make sure your CMOS battery is good.
 
I tried resetting the BIOS and removing (and replacing) the CMOS battery several times, but no joy. However, this evening I flashed the BIOS with a replacement AMI 1.00.10.BR0T image and that seemed to clear the problem. It also reset the CPU speed to 100 MHz, so it's probably not the exact right BIOS, but it did clear the NVRAM messages and seems to be working well so far. Really wish I could find the original Gateway P5-133XL BIOS image, but c'est la vie. Now I just need to find drivers for the Matrox card... the journey is reminding me of why it used to be called "Plug and Pray" :). Thanks!
 
I very much doubt changing a video card corrupted your BIOS. It may have already been corrupted, and the act of changing the hardware made the BIOS write nonsense to the NVRAM. I've had motherboards with slightly corrupted BIOSes that still worked but caused all sorts of odd behavior with settings.

As for getting a correct image, try digging through the Evergreen Spectra BIOS images. Evergreen Technologies made modified versions of hundreds of BIOSes for computers of the time to allow their upgrade to work properly.


I had a Gateway 2000 in the same line as yours in 1997 and had an Evergreen Spectra upgrade. Your motherboard is most likely in there. I think the Evergreen software can detect which BIOS image you need.
 