
AT to XT Keyboard Converter

Forgive me if I haven't thoroughly combed 38 pages of this thread, but I've noticed something interesting while debugging two keyboards I have. One is a pre-Windows 95 Compaq keyboard (it lacks Windows keys) with a microcontroller from American Megatrends. The other is an older keyboard from BTC that came with a Windows 98 computer.

When I tell the MSP430 to send a Request-to-Send signal (bring CLK low for 100 microseconds, then release CLK, then bring data low), the Compaq controller sends the data once according to my logic probe, then does nothing. The BTC keyboard sends the data repeatedly, according to the pulse LED on the probe. I deliberately never bring the data line high after sending an RTS signal, just to check that the keyboard is responding (the logic probe won't trigger the pulse LED from a single high-to-low transition).
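In case it helps anyone reproduce this, the bit-banged sequence I'm describing is roughly the following; clk_low(), clk_release(), data_low() and delay_us() are stand-ins for whatever GPIO and timing code the firmware actually uses, not names from my source:

/* Request-to-send, in the order described above.  Helper names are placeholders. */
void send_rts(void)
{
    clk_low();        /* drive CLK low to inhibit the keyboard          */
    delay_us(100);    /* hold it for at least 100 microseconds          */
    clk_release();    /* release CLK (open-collector, floats back high) */
    data_low();       /* pull DATA low; never released in this test     */
}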

From what I understand according to http://www.computer-engineering.org/ps2protocol/, a PS/2 keyboard is supposed to send additional clock pulses after the 11th clock pulse if the host has not released the data line by then, and then send an error code representing a framing error. Does this imply that this is an optional portion of the spec, or that the spec changed at some point between the PC/AT era and the Windows 95/98 era keyboards?
 
Parity is correct... but I can CONSISTENTLY verify that the keyboard is NEVER sending an ACK bit back... I am so unbelievably sick of this... I can receive data from the keyboard just fine... I have no idea why it's so off.

EDIT (Mainly for MSP430 users- if you value your sanity, you will ignore this section): I have no clue what TI's C compiler is doing to my code but...

I have a single interrupt so far for P1 (I/O ports). The ISR is divided into two halves, selected by an if statement and a flag: a 'receive from keyboard' routine and a 'send command to keyboard' routine. I have two counting variables: one counts the number of interrupt triggers from the keyboard's clock (the only I/O pin on P1 which has interrupts enabled) which take the 'receive' branch; the other counts the number of triggers for the 'send command' branch. In my code's current state, the flag which determines which branch to take is set to FALSE (in other words, 'don't take the send command branch') and is never altered anywhere in my code! The counter for the 'send' section reflects this: the microcontroller never takes the 'send' branch in the code's current state, and that counter is always 0.

Keeping the if statement and the 'send command' branch of my code causes neither section to work correctly. Commenting out the if statement and the 'send' branch causes the 'receive' section to work properly.

So why is dead code causing the working part of my interrupt to fail?!

Is it possible that the branch is causing the ISR to take too much time to execute? A single comparison?!
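For anyone trying to picture the layout, a stripped-down skeleton (illustrative names and CCS-style interrupt syntax, not my actual source) looks something like this:

#include <msp430.h>

#define KBD_CLK BIT4                       /* placeholder pin assignment */

volatile unsigned char send_to_kbd = 0;    /* FALSE: always take the 'receive' branch */
volatile unsigned int recv_count = 0;      /* triggers that took the 'receive' branch */
volatile unsigned int send_count = 0;      /* triggers that took the 'send' branch    */

#pragma vector = PORT1_VECTOR
__interrupt void port1_isr(void)
{
    if (send_to_kbd) {
        send_count++;
        /* ...clock out the next bit of a host-to-keyboard command... */
    } else {
        recv_count++;
        /* ...sample DATA on this falling edge of the keyboard clock... */
    }
    P1IFG &= ~KBD_CLK;                     /* clear the interrupt flag for the clock pin */
}

(Anything shared between the ISR and main() is marked volatile in the sketch; I'm not saying that's the problem here, just that it's the usual suspect when adding or removing dead code changes behavior.)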
 
I have no experience with MSP430 C; I tend not to use HLLs for simple (and time-critical) tasks. I can only offer my PIC source code and a couple of observations. First of all, the little 12F PIC is no speed demon--I'm using the 2MHz internal RC oscillator to run things; the MSP430, being a 16-bit MCU with a more-or-less orthogonal register set (as compared to the PIC's use of the W register for nearly everything), certainly can run rings around the PIC.

I do use a single interrupt as well to service the keyboard--and the interrupt code is not short. So, unless the 430's C code is really long, I doubt that that's your problem either.
 
I have no experience with MSP430 C; I tend not to use HLLs for simple (and time-critical) tasks. I can only offer my PIC source code and a couple of observations. First of all, the little 12F PIC is no speed demon--I'm using the 2MHz internal RC oscillator to run things; the MSP430, being a 16-bit MCU with a more-or-less orthogonal register set (as compared to the PIC's use of the W register for nearly everything), certainly can run rings around the PIC.

I do use a single interrupt as well to service the keyboard--and the interrupt code is not short. So, unless the 430's C code is really long, I doubt that that's your problem either.

I'm using C mainly for portability reasons, but the interrupt is not exceptionally long- about 100 lines in its current state... I'm still not sure what's going wrong, but I can verify that the addition of that if statement is causing the ISR to fail- God knows what I did to the source code. Not sure how much it matters, but the MSP430 defaults to 1.1 MHz.

For the time being, I'm going to keep things simple, get a minimum-functionality version working, and scale from there. I'm considering doing a few PCBs as well- something I've never done before, but have been meaning to do for a long time. It sure would be more visually appealing than the through-hole veroboard mess I have right now XD.
 
I have no experience with MSP430 C; I tend not to use HLLs for simple (and time-critical) tasks. I can only offer my PIC source code and a couple of observations. First of all, the little 12F PIC is no speed demon--I'm using the 2MHz internal RC oscillator to run things; the MSP430, being a 16-bit MCU with a more-or-less orthogonal register set (as compared to the PIC's use of the W register for nearly everything), certainly can run rings around the PIC.

I do use a single interrupt as well to service the keyboard--and the interrupt code is not short. So, unless the 430's C code is really long, I doubt that that's your problem either.

I don't exactly like 'stealing' other people's code, but I'm taking your PIC assembly (which isn't difficult to learn) and converting it to semi-equivalent C code. Using my implementation based on the directions you provided in this thread: http://www.vintage-computer.com/vcforum/archive/index.php/t-20999.html, I got my IBM PC to receive bytes, but for the time being all that happens is that the PC receives 8 bytes, lets out a beep indicating overflow, and never prints any characters to the screen (in my case, I'm sending the scancode for K, or 0x25). Repeat ad nauseam.

EDIT: Turns out I reversed the Data and Clock ports (on the XT side) in my code from how I soldered them on the board, so the DATA output from the MSP430 was going into the CLOCK pin on my PC, and vice versa... that was boneheaded...

Haven't tried fixing my code yet, but let's see if that fixes my issues...

EDIT2: Of course that didn't fix it... that would've been too easy. Now the PC's not responding at all... of course if I hook up a real keyboard, everything's fine... I guess I don't have a choice but to use a logic analyzer... for something so damn trivial too!

EDIT3: Okay, I don't need the logic analyzer... my second mistake was the following:
for(num_bits = 0; num_bits > 7; num_bits++)
{
    SendXTBit((test_keycode & 0x01));
    test_keycode = test_keycode >> 1;
}

I managed to forget that a for loop runs while its condition is true; it's not an until loop that runs until the condition becomes true. I've done this before, but... I really shouldn't be making these types of mistakes...
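For anyone skimming, the fixed loop just flips the comparison so the body actually runs eight times:

for (num_bits = 0; num_bits < 8; num_bits++)   /* runs WHILE the condition is true */
{
    SendXTBit(test_keycode & 0x01);            /* least-significant bit first */
    test_keycode = test_keycode >> 1;
}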
 
I remember reading somewhere on this thread and elsewhere that converting AT codes to XT keycodes is a royal pain, because AT-codes are variable length codes, and XT-codes are all single byte codes.

Over the past 24 hours, I created a Moore finite-state machine representation of AT-to-XT conversion (more possible states, but easier for me to understand while drawing it). For anyone wanting to make their own AT2XT keyboard converter, the following (ANSI C) code is meant to be a drop-in for the main routine in a microcontroller which handles keycode conversion; an interrupt is meant to handle capturing keycodes from an AT keyboard:

https://dl.dropboxusercontent.com/u/20852311/KEYFSM.zip

As the code currently stands, this is ABSOLUTELY minimum functionality. With the setup I currently have, this code (minus the printf statements, which I just used to debug the FSM on my laptop, and with the static routines updated accordingly) will allow a 101-key keyboard to serve as an 83-key PC (PC in my case) or XT keyboard. Additional keycodes (such as those beginning with E0, plus PRNT SCRN and PAUSE, the latter two of which require their own special states), LED synchronization, and host-to-device communication are currently NOT handled. While some of those keys (E0 arrow keys) might work, consistent behavior is not guaranteed. Any keycode that is NOT recognized as valid currently sends 0x00 to the XT (though I think I should've made it 0xFF, which as I recall sends a beep).
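To give a rough idea of the shape of the FSM without downloading the zip, the core is just a switch on the current state, with one transition per byte received from the AT keyboard. The names below are illustrative rather than the actual identifiers in KEYFSM.zip, and the real machine has more states for the special cases:

typedef enum { ST_IDLE, ST_BREAK, ST_E0 } fsm_state;    /* minimal subset of states */

static fsm_state state = ST_IDLE;

void fsm_feed(unsigned char at_code)          /* called once per byte from the AT keyboard */
{
    switch (state) {
    case ST_IDLE:
        if (at_code == 0xF0)      state = ST_BREAK;            /* release prefix  */
        else if (at_code == 0xE0) state = ST_E0;               /* extended prefix */
        else                      SendXTByte(at2xt[at_code]);  /* make code       */
        break;
    case ST_BREAK:
        SendXTByte(at2xt[at_code] | 0x80);    /* XT break code = make code + 0x80 */
        state = ST_IDLE;
        break;
    case ST_E0:
        SendXTByte(0x00);                     /* E0 sequences not handled yet     */
        state = ST_IDLE;
        break;
    }
}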

Also I do not know how to make an XT keyboard pass POST yet, so I'm pretty sure the good ol' 301 error remains (haven't checked yet)... I'll check the technical reference manual in a bit and fix that before uploading the microcontroller code proper and my own circuit diagrams.
 
Okay, having to monitor the XT for the clock line being held low for 20 milliseconds or more throws a wrench into making this circuit work... while the IBM PC seems content to let me plug in my circuit and AT keyboard after getting a 301, the PC XT does one of two things:

1. If my AT2XT circuit is plugged in at boot, the XT will trigger a 301 and refuse to acknowledge any keystrokes from the microcontroller (CLK doesn't go low after sending a byte), as if it's expecting the '0xAA' scancode to magically appear. Note that I have a 256KB-to-640KB XT, if that makes any difference.

2. Except for one time (where I verified that the circuit works as intended), swapping my IBM Model F keyboard with an AT keyboard attached to my circuit crashes my XT (video framebuffer is written with garbage), causes the speaker to output numerous beeps, and requires power off.

Using a high-to-low edge interrupt to handle the CLK input from the XT sounds obvious, but I cannot guarantee whether the CLK (to XT) pin is an input or an output at any given moment, and I have already seen a case where the SendKeytoXT routine locks up waiting for the CLK line to go high while CLK is already configured as an output (and is low), indicating that the interrupt is called multiple times! Disabling interrupts within the loop doesn't help either! Well, at least this keeps me busy, if anything.

Chuck: Does any of what I've written above sound consistent with your experiences debugging this circuit? My current setup is that I'm using a separate power supply to run my circuit while the XT keyboard connector is present, letting the 5V inputs from different sources share a common ground.
 
Take a look at the routine "PollHost" in my code and the loop in Main that calls it. PollHost is entered between every sent character. If the XT holds the interface low for at least 20 msec, we respond with a hex AA code. If it's high when we enter, we just exit.

Does this make any sense? You have to assume that you can get a reset request from the XT at any time (a lot of XTs have RESET buttons, so it's not just on power-on).
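In C terms, the behaviour is roughly the following (my actual routine is PIC assembly; xt_clk_is_low(), delay_ms() and SendXTByte() here are just stand-ins for the real I/O code):

void PollHost(void)
{
    unsigned int ms = 0;

    if (!xt_clk_is_low())
        return;                    /* line is high on entry: nothing to do      */

    while (xt_clk_is_low()) {      /* the XT is holding CLK low: time it        */
        delay_ms(1);
        ms++;
    }
    if (ms >= 20)
        SendXTByte(0xAA);          /* held for 20 msec or more: answer with AA  */
}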

The cool thing about all of this is that you can leave the converter connected and plug and unplug the PS/2 keyboard at will without hanging the PC.
 
Warning, long post ahead. I apologize in advance.

Take a look at the routine "PollHost" in my code and the loop in Main that calls it. Pollhost is entered between every sent character. If the XT holds the interface low for at least 20 msec, we respond with a hex AA code. If it's high when we enter, we just exit.

Does this make any sense? You have to assume that you can get a reset request from the XT at any time (a lot of XTs have RESET buttons, so it's not just on power-on).
Yes, it makes sense... it also conflicts with being low-power and interrupt-driven :p... Ah well, if push comes to shove I'll implement that. In my FSM implementation, the MSP430 only polls (well, sleeps) if the buffer for sending characters is empty. So basically, you're saying that all idle time should be spent polling the XT CLK port (no sleeping)? I'll take a look at your code and see what I can do (i.e. poll after every character sent and while the buffer is empty).

Now, what I think I could ALSO do to keep the program completely interrupt-driven is:
1. Use two I/O pins for XT CLK... one that is bidirectional with interrupts disabled, and one that is always input with interrupts enabled (high-to-low)...
2. If the clock goes low at any time, an interrupt triggers on the always-input CLK pin. Switch the interrupt edge to low-to-high.
3. Start a timer for 20 ms and exit.
4. Wait for the 20 ms to elapse and do whatever in the meantime. If the always-input pin goes from low to high beforehand, stop the timer and switch the interrupt edge back to high-to-low.
5. If the line is still low after 20 ms, the timer will trigger an interrupt, which then sends the 0xAA keycode using my C implementation of SendXTByte. SendXTByte waits until the clock goes high. Disable the timer. Switch the always-input CLK pin back to high-to-low.
6. Disable said interrupts while within the SendXTByte routine itself since the bidirectional CLK port will have control of the bus.

I'm probably missing something important here, but it's 3 in the morning. I'll fix it later.
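Something like the following is what I have in mind (MSP430 value-line register names; the pin assignment, timer setup and constants are untested guesses, and SendXTByte plus the bidirectional CLK pin would still need to be wired in):

#include <msp430.h>

#define XT_CLK_SENSE BIT3                    /* always-input copy of XT CLK (placeholder) */

volatile unsigned char xt_reset_request = 0; /* set when CLK has been low for ~20 ms      */

#pragma vector = PORT1_VECTOR
__interrupt void xt_clk_edge(void)
{
    if (P1IES & XT_CLK_SENSE) {              /* we were armed for a high-to-low edge      */
        P1IES &= ~XT_CLK_SENSE;              /* re-arm for the low-to-high edge           */
        TA0CCR0 = 22000;                     /* ~20 ms at the ~1.1 MHz default DCO        */
        TA0CCTL0 = CCIE;
        TA0CTL = TASSEL_2 | MC_1 | TACLR;    /* SMCLK, up mode, restart from zero         */
    } else {                                 /* CLK went high again before the timeout    */
        TA0CTL = MC_0;                       /* stop the timer                            */
        P1IES |= XT_CLK_SENSE;               /* back to high-to-low edge detection        */
    }
    P1IFG &= ~XT_CLK_SENSE;
}

#pragma vector = TIMER0_A0_VECTOR
__interrupt void xt_reset_timeout(void)
{
    TA0CTL = MC_0;                           /* still low after ~20 ms: reset request     */
    xt_reset_request = 1;                    /* main loop responds with SendXTByte(0xAA)  */
}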

I do have a few concerns now that I realize that I still have some work left to do:
Does your XT refuse to accept incoming keycodes if the keyboard does not send 0xAA? My XT will recognize a Model F keyboard if I wait to hook it up until after receiving a 301 error. I can then press 'F1' and continue normally. Currently, if I attach my circuit to the XT before or when the 301 error occurs and press 'F1', it will not work. According to my logic probe, CLK never goes low after I send the keycode (it's not acknowledged by the XT), as if the XT is still waiting for the keycode. If I attempt to attach a Model F keyboard at this point, the XT STILL won't accept any input- I need to power off to get the keyboard to work. If I start with (or first use) a Model F keyboard and THEN swap keyboards, the circuit works on my XT.

Is this consistent with what an XT is supposed to do in your experience? I suppose since the PC (5150) doesn't wait for a user to press F1, I don't have any issues other than a glaring '301' error telling my circuit that it's not a PC keyboard :p.

EDIT: Apparently I'm wrong about the above two paragraphs. If my XT gets a 301 error at all, it 'gives up' accepting anything from the keyboard port. Is THAT behavior consistent with an XT? I must've confused myself because I was working on both the PC and XT simultaneously.

Lastly, and equally important, does the software reset that occurs on the XT also occur on the PC? I only have a PC Model F keyboard (not an XT keyboard), so it's obvious that a PC keyboard understands the software reset command, since my PC keyboard works on the XT. But does the 5150 use the reset line, or does it also use the same software reset procedure (and if it uses the reset line, do XT keyboards work on PCs)?

The cool thing about all of this is that you can leave the converter connected and plug and unplug the PS/2 keyboard at will without hanging the PC.
My PC/XT appears to be sensitive to transient voltages :(... I have to unplug the Model F keyboard and replace it with a PS/2 keyboard very gently, otherwise my XT will just give up with a garbage framebuffer. It may be worth noting that I'm not using any filter capacitors currently (hence the transient voltage spikes), except for the one required on the reset line of the MSP430.
 
You'll note that on my implementation, I used a 47uF tantalum on the board to decouple the power supply.

One unnecessary thing that I did was to use two pins to control the PS/2 keyboard clock, pulling the line down through a diode. I could have simply changed the state of the input pin to output and set it low. That would have reduced the cost of the adapter by about a dime. I felt that having a pin change state from input to output and back to input in less than a millisecond was a little risky, but in retrospect, it probably would have worked.

Whether or not the XT reacts to a keyboard presence after a 301 error depends on the BIOS. Some clone BIOSes are quite happy with a keyboard being attached after a 301; others (such as the 5160) assume that you're running headless and don't bother to check after that.

The clone that I was using for testing (ERSO BIOS) did reset the keyboard upon a soft reset. Otherwise, the keyboard LEDs would never be reset to the proper state. You can check the XT BIOS code in the Tech Ref to see what happens on a 5160.

Low-power is sort of ho-hum. Your adapter is probably using less total current than the LS323 is drawing on the motherboard.
 
I remember reading somewhere on this thread and elsewhere that converting AT codes to XT keycodes is a royal pain, because AT-codes are variable length codes, and XT-codes are all single byte codes.

The multi-byte codes were originally chosen so you could ignore the E0 or E1 prefixes and still have the result make sense. So rather than go to the trouble of distinguishing them, you could just get away with dropping the E0/E1 on the floor or passing them through unchanged. This works even for the multi-byte sequences from Pause and PrtSc.
 
There are indeed XT-type keyboards that send E0 codes. Some non-English BIOSes/drivers look for them, so it's not entirely safe to ignore them. See, for example, the setting of the jumper on the AT-to-XT converter board. If omitted, the converter "eats" the E0s, if jumpered, the converter passes them.

Similarly, it's not safe to assume that the "make" codes from a PS/2 or AT keyboard are all 7-bit. In particular, key 118 on the PS/2 keyboard has a value of 83 hex and translates to XT scan code of 41 hex.
 
Similarly, it's not safe to assume that the "make" codes from a PS/2 or AT keyboard are all 7-bit. In particular, key 118 on the PS/2 keyboard has a value of 83 hex and translates to XT scan code of 41 hex.

Which is a weirdness dating back to the 3270PC and/or the 3197 terminal. The keyboard has 127 key positions, with scancodes from 01h-7Fh. The firmware passes all scancodes unchanged to the host, except for 02h (Ident) which it translates to 83h, and 7Fh (Space) which it translates to 84h. The AT keyboard, which is based on the same hardware, does the same, except those keys now have the caps F7 and SysRQ. And the keyboard controller on the motherboard hand-converts them straight back to 02h and 7Fh before looking them up in its AT->XT translation table.
 
My converter passes E0, and E1 triggers a special case in the FSM, but I haven't implemented their entries into the AT2XT table yet, so the converter sends them as 0x00 (it was supposed to be 0xFF to sound a beep- i.e. 'don't use these keys', but I'll change it later).

Additionally, this page mentions that PrtSc and Pause send out different scan codes if certain modifier keys are pressed, which means more special cases: http://www.win.tue.nl/~aeb/linux/kbd/scancodes-1.html
Section 1.6 talks about sending 'fake shifts' as well, but I have no clue what that section is referring to, if I'm honest.

My full-blown FSM, as drawn on an 8.5x11 sheet, handles PrtSc and Pause as special cases. I'll keep the FSM simple for now and add to it later... I still have 1kB to work with- my entire program only takes up half of flash memory, even including the 132-byte conversion table (since 'F7' uses 0x83 for the reasons stated above).
 
Good heavens, man--the PIC version takes up 469 (decimal) 13-bit program words--less than half of the 1K program memory, and 160 of that is the translation table. What on earth are you doing with all that code?
 
Additionally, this page mentions that PrtSc and Pause send out different scan codes if certain modifier keys are pressed, which means more special cases: http://www.win.tue.nl/~aeb/linux/kbd/scancodes-1.html
Section 1.6 talks about sending 'fake shifts' as well, but I have no clue what that section is referring to, if I'm honest.

It's all in the name of compatibility. Using AT / Set 2 scancodes throughout:

On the AT keyboard, PrtSc was shift-Keypad *, and SysRQ was a separate key. The 102-key keyboard combines them on one key. So PrtSc sends E012 E07C; a PC that doesn't understand E0 will treat this as 12 7C, shift-Keypad *. Alt+PrtSc sends 84, SysRQ.

The same thing happens with Pause. On the AT keyboard, Pause was Ctrl+NumLock and Break was Ctrl+ScrollLock. On the 102-key keyboard, Pause sends E114 77 E1F014 F077, and a PC that doesn't understand E1 will treat it as 14 77 F014 F077 (Ctrl+NumLock down, Ctrl+NumLock up). Ctrl+Pause is break, so if Ctrl is pressed, Pause sends E07E E0F07E, which without the E0s is 7E F07E (ScrollLock down, ScrollLock up).

(The 122-key Host Connected Keyboard has five keys with this sort of behaviour, not just two).

The fake shift business is for the same reason. A PC that doesn't understand E0 scancodes will treat cursor left (E06B) as number pad 4 (6B). If Num Lock is off, then the keyboard will just send E06B. If Num Lock is on, then the keyboard will send E012 E06B (fake-shift, cursor left) so that the PC treats it as 12 6B (shift-numpad 4). If Num Lock is off but Shift is pressed, the keyboard will send E0F012 E06B (fake-shift-up, cursor left) so that the PC treats it as F012 6B (unshifted numpad 4).

It seems to me there are three approaches that can be taken when dealing with this.

The easiest one is to behave like an AT keyboard controller -- translate scancodes as single bytes and pass the E0/E1 codes unchanged. So if left Ctrl (14) was mapped to 1D, the same mapping would convert right Ctrl (E014) to E01D. The only state you need is 'was the previous code F0?' and if it was, set the top bit of the translated code.
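In code, that first approach needs almost nothing; something like the sketch below, where at2xt[] is the usual make-code lookup table and the routine names are only illustrative:

void translate(unsigned char at_code)
{
    static unsigned char break_pending = 0;    /* 'was the previous code F0?'           */

    if (at_code == 0xE0 || at_code == 0xE1) {
        SendXTByte(at_code);                   /* pass the prefix through unchanged     */
    } else if (at_code == 0xF0) {
        break_pending = 1;                     /* remember: the next code is a release  */
    } else {
        unsigned char xt = at2xt[at_code];
        SendXTByte(break_pending ? (xt | 0x80) : xt);
        break_pending = 0;
    }
}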

The next easiest is to put the keyboard in scancode set 3, where all the keys have unique one-byte scancodes. The problem here is that you can't trust every keyboard to support set 3 properly.

The most difficult is to try and parse the set 2 scancode sequences and work out which key they correspond to. In this scenario, the fake shifts E012 / E0F012 can be ignored completely, and you'll need to keep track of at least two more items of state: 'was the last code E0?' and 'is fake Ctrl pressed?' (if it is, scancode 77 is Pause. If it isn't, it's Num Lock).
 
The big problem with making the converter too smart is that there's always the possibility that the host will get out of sync with the converter regarding internal states. In the old DOS/games world, some programs did their own keyboard scan code translation.
 
Good heavens, man--the PIC version takes up 469 (decimal) 13-bit program words--less than half of the 1K program memory, and 160 of that is the translation table. What on earth are you doing with all that code?

It's C, and debug symbols are enabled. You sound surprised at my lack of optimization skills :p. I haven't tested the code size when I disable debug symbols.

Additionally, I was originally going to have the microcontroller autodetect which DIN port was connected to the host and which was connected to the keyboard, and also detect whether an XT or AT keyboard was present and do dual conversion... a portion of the code I haven't removed yet reflects this (i.e. it has switch statements which provide identical functionality to both ports)... I may still do this if I have the memory and can get microcontroller-to-keyboard AT communication to work properly!

Also, I recall you discussing that the MSP430 runs circles around the PIC, and that I shouldn't be getting into interrupt hell between bytes sent from the keyboard. I believe I was getting into interrupt hell between bits (like you, I interrupt when the clock goes low on the AT keyboard), thanks to a badly placed section of code that could exit the wait for a byte before a whole byte had arrived from the keyboard. The keyboard_head pointer was then incremented, and since the buffer is empty only when head == tail, all hell broke loose.

I have a counter which counts the number of poorly framed 11-bit streams... I also have a counter which counts the number of bits that have arrived (cleared once all 11 bits have arrived). When I originally got this program working yesterday, it was possible for the bit counter to remain uncleared while waiting in a loop doing nothing- while the number of bad frames was still 0- and changing the number of statements in the interrupt routine aggravated or eliminated this problem. So I DO think interrupt routines are timing sensitive. I'm also running the MSP430 at a slower speed than your PIC12 (1.1 MHz vs 2 MHz). I don't exactly know what issue I was having, but to be honest, I could spend far more time than is wise worrying about it- unless the problem comes back, of course.
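For what it's worth, the buffer in question is just a small circular FIFO; a sketch (with illustrative names and size) of what I mean by 'empty only when head == tail':

#define KBD_BUF_SIZE 16

static volatile unsigned char kbd_buf[KBD_BUF_SIZE];
static volatile unsigned char keyboard_head = 0;    /* advanced by the ISR  */
static volatile unsigned char keyboard_tail = 0;    /* advanced by main()   */

static void kbd_put(unsigned char code)    /* ISR: only after a complete, valid 11-bit frame */
{
    kbd_buf[keyboard_head] = code;
    keyboard_head = (keyboard_head + 1) % KBD_BUF_SIZE;
}

static int kbd_get(unsigned char *code)    /* main(): buffer is empty exactly when head == tail */
{
    if (keyboard_head == keyboard_tail)
        return 0;
    *code = kbd_buf[keyboard_tail];
    keyboard_tail = (keyboard_tail + 1) % KBD_BUF_SIZE;
    return 1;
}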

ADDENDUM: As of about an hour before this edit, I have a fully working, minimum-functionality converter. Not something I'd sell to anyone (it's a soldered prototype and doesn't even implement host-to-keyboard communication), but sufficient for my needs- it can pass XT POST and make an XT-class machine think a keyboard is attached. I'll work on the host-to-keyboard component and autodetection when I'm less exhausted.
 
Oh, I'm just yankin' yer chain. :)

I was interested in the minimum hardware that could do the job. I played around a bit with a 6-pin PIC, but decided that there wasn't quite enough I/Os to really do the job well. The 12F MCU was the next step up--with 1K of ROM and all of 64 bytes of RAM (the register set), 32 of which are used as a FIFO for keystrokes going to the XT. Interestingly, the thing can run at 20MHz and is about as cheap as an MCU as one can find.

The lack of a real stack (it has 8 levels for holding return addresses and is not accessible by other than CALL and RETURN) and the inability of program code to be directly addressed by data instructions (the lookup table is an indexed RET instruction list) pretty much meant that anything but assembly was out of the question. For all of that, you get an idle current in nanoamperes and a 4MHz operating current in microamps at 5V.

Maybe it's my age or training, but I love minimalist solutions, particularly in this era of megabyte file-copy utilities.
 