
How long is too long to disable interrupts? (x86-16)

sqpat
Experienced Member | Joined: Mar 21, 2009 | Messages: 263 | Location: Seattle, WA
I've been working on a project for the past year porting DOOM (a 32-bit x86 program) to real-mode x86-16 code. The architecture is about where it needs to be, and I'm now digging into the rendering code where most of the CPU time is spent, so I need all the performance I can get. There are long runs of pixels drawn at a time where texture u/v coordinates are calculated per pixel, and I've found that having BP and SP available to hold some numbers essentially lets me get away with 3 memory accesses per pixel instead of 5, which adds up (320x200 resolution means as many as 64,000 pixels at multiple frames per second). Of course I can only do this if I wrap the relevant code with cli/sti, and I've mathed that out to about 5,000 cycles (240 memory accesses plus some ADDs and ANDs) on an 8088 for runs of 80 pixels in worst-case scenarios. Honestly it may be worse if the prefetch queue struggles, but there are no instructions longer than 2 bytes.
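Roughly, the shape of it is something like this - a simplified NASM-style sketch, not my actual routine; the register roles and the constant parked in SP are just for illustration:

draw_run:
        cli                         ; nothing may touch SS:SP while it isn't a real stack
        mov     [saved_sp], sp      ; park the real stack pointer somewhere reachable via DS
        mov     sp, 320             ; SP now just holds a number (a stride, say)
        ; --- unrolled run starts here ---
        lodsb                       ; fetch texel from DS:SI
        stosb                       ; write pixel to ES:DI
        add     di, sp              ; step DI using SP - no memory operand needed
        ; ... repeated for the rest of the 80-pixel run ...
        mov     sp, [saved_sp]      ; restore the real stack pointer
        sti
        ret

saved_sp        dw 0                ; plain variable in the data segment, not on the stack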

I'm travelling and away from my old hardware at the moment, so I can't test for sure, but emulators like 86Box don't seem to have issues. Does anyone have any experience with anything like this? I haven't implemented sound yet, and I assume that could cause trouble eventually, since I think sound hardware uses frequent interrupts.
 
I imagine that if you block interrupts for longer than the time between two timer ticks (18.2 Hz) then you'd miss some and throw off the DOS time of day. That would be enough to throw off sound and input code as well. But at 18.2 Hz, one tick interval is over 260,000 cycles even on a 4.77 MHz 8088 (4,770,000 / 18.2 ~= 262,000). 5,000 cycles is fine. 50,000 is probably OK. Unless you were planning to have multiplayer DOOM using serial ports...
 
Unless you were planning to have multiplayer DOOM using serial ports...
Multiplayer code is long gone... Funny thing: dumping DOOM's multiplayer code is like a free 5-10% FPS bump even in single player.

Speaking of the serial port, the mouse does use it, though. I have never looked into how that really worked, because it "just worked", and honestly I rarely used the mouse in DOOM myself. I wonder if I've broken it...

There is some use of the programmable interval timer in the code to manage the game's internal framerate and some other things - it's set to fire 35 times per second, which is about twice the 18.2 Hz you mentioned but still leaves a lot of cycles between interrupts.
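For anyone curious, the usual recipe for that is to reprogram PIT channel 0 with a divisor of 1,193,182 / 35 ~= 34091 - something along these lines (not my exact code), with the old 18.2 Hz INT 08h handler still getting chained into on schedule from the new handler so the DOS clock doesn't fall behind:

set_timer_35hz:
        cli
        mov     al, 36h             ; channel 0, lo/hi byte access, mode 3 (square wave)
        out     43h, al             ; PIT command register
        mov     ax, 34091           ; 1,193,182 Hz / 35 Hz ~= 34091
        out     40h, al             ; low byte of the divisor
        mov     al, ah
        out     40h, al             ; high byte of the divisor
        sti
        ret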
 
I don't see why you need to disable interrupts to preserve BP and SP. Any interrupt handler that changes them is required to restore them before returning. (And that's true of any registers that get touched under an interrupt handler, not just BP and SP.)
 

As I understand it, an interrupt may write to memory just below SP in the stack segment, the same way a function call would. So if your BP/SP are “nonsense” numbers in the context of the stack - just values being used for computation - then the interrupt will restore BP and SP, but it will have written garbage into whatever memory location SS:SP happened to point at.
 
Oh, I see what you are doing... Never mind, you have to disable interrupts if you are just going to use them as general-purpose registers.

You might be able to use BP safely, but you can't safely use SS or SP like that. (You know this, but for anybody else reading: the first thing an x86 in 16-bit real mode does on an interrupt is push the flags and a far return address onto the stack.)
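In instruction form, the entry sequence is equivalent to the old pushf / call far trick used to chain to a previous handler (the real thing also clears IF and TF after pushing the flags). Using vector 9, the keyboard, purely as an example:

        xor     ax, ax
        mov     es, ax              ; ES = 0000h, the segment of the interrupt vector table
        mov     bx, 9 * 4           ; IVT entry for INT 9 (keyboard) at 0000:0024h
        pushf                       ; FLAGS -> SS:[SP-2]
        call    far [es:bx]         ; CS    -> SS:[SP-4]
                                    ; IP    -> SS:[SP-6], then jump through the vector
        ; Six bytes land at whatever SS:SP points to before the handler runs a single
        ; instruction - which is why SP must be a real stack pointer whenever IF=1.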
 

Oh yes now that you mention it that sounds correct about BP.
 
I recall from a discussion with mills26, the author of Little Game Engine, that they used to have simple VGA vsync synchronization code that would first disable interrupts, then wait for the start of vblank, and then re-enable interrupts.

On VGA 320x200, this would result in at worst an extra 1/70th of a second spent with interrupts disabled, so about 4.77 MHz / 70 ~= 68143 clocks. And they reported occasionally losing keyboard presses (delivered via the keyboard interrupt handler) with this scheme.

This led to refactoring the code so that it would not spend up to a full frame with interrupts disabled while waiting for vblank. Here is the GitHub code diff: https://github.com/mills32/Little-G...354ab998827385c6e40aca06539948368af2L600-L632

(The left, in red, is the original code that waited for a full frame; the right, in green, is the code that did not exhibit lost keypresses.)
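For reference, a vsync wait that leaves interrupts enabled the whole time is just a polling loop on bit 3 of the VGA Input Status #1 register (port 3DAh) - something like this, which is only the general idea and not the code from the diff:

wait_vsync:
        mov     dx, 3DAh            ; VGA Input Status #1
.not_in_retrace:
        in      al, dx              ; wait until we are *outside* vertical retrace,
        test    al, 08h             ; so we always catch the start of a fresh one
        jnz     .not_in_retrace
.in_retrace:
        in      al, dx              ; now wait for retrace to begin
        test    al, 08h
        jz      .in_retrace
        ret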
 
Any chance of dividing your work up into smaller segments with interrupts enabled between segments?

I think it is broken up as much as makes sense. It corresponds to drawing a horizontal run of pixels across the screen for a single texture. It's an unrolled loop - 80 instances corresponding to 320 pixels in VGA planar mode. It's as short as I can get it, so breaking it up further would be pretty arbitrary.

Anyhow - I should be able to test this on an XT class machine early next week. I can confirm that I noticed nothing wrong running this code on a Pentium MMX 233, but that's of course a much, much faster machine.
 
Well, it took a while (I had to write a driver for the above board), but I can report the code ran fine on a 5150 last night without any noticeable issue. Since it's a 5150 and lacks an RTC, I can't say whether the system clock fell behind or not.
 
Set the clock before running the code. Run the code. Execute TIME and see if the PC clock lost anything compared to the clock on a different device. Admittedly, unless you're shutting down interrupts for a very long time (like loading from cassette), lost time would be difficult to discern from the inaccuracy of the line clock.
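If you want something less fuzzy than eyeballing TIME, read the BIOS tick count (INT 1Ah, function 00h) before and after the test run and compare the delta against wall time from another device - the counter ticks at about 18.2065 Hz, so one minute should add roughly 1092 ticks. A sketch:

get_ticks:                          ; returns the BIOS tick count since midnight in DX:AX
        xor     ah, ah              ; INT 1Ah, AH = 00h: read system timer
        int     1Ah                 ; CX:DX = tick count, AL = midnight rollover flag
        mov     ax, dx              ; low word
        mov     dx, cx              ; high word
        ret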
 
Serial mice send data in packets of 3 bytes each and generally run at 1200 bps, either 7N1 (Microsoft) or 8N1 (Mouse Systems). Worst case (7N1: 1 start + 7 data + 1 stop = 9 bit times per byte) is about 134 interrupts/second, or one every ~7.5 ms. For graphics tablets or other devices communicating at 9600 bps 8N1, you'd expect up to 960 interrupts/second, or about 1 ms between interrupts.

So the serial port is far worse than the timer interrupt. Your target systems most likely don't use 16550A UARTs with FIFO buffers.
 