NEC V20: BRKEM, CALLN and RETEM

And if you have wwaaayyyyy too much time on your hands, you could run multiple CP/M programs at once, à la MP/M.
 
Speaking of NEC V40...

I built this using a V40, ca. 1989.
Simple ECU for a car project, controlling the ignition and an automatic transmission I had added solenoids to.

[Image: wrljet1988-ecm40.jpg]
 
It's all the same to me. Microcode vs. program emulation. Program emulation has the advantage of portability. If you have an 8088 or a Core i13 CPU, you're still good.

Sorry for the necro, but I've been mulling this off and on since you wrote it, and I think I now have my own answer to this question.

I don't have a boatload of old CP/M source or executables that I want to run as fast as possible to do anything remotely resembling real work. In fact - I have none at all (and don't bother giving me a link to a boatload of the same - there's nothing I actually want to run even if I had every CP/M program in the world).

I have an interesting chip I wish to exercise. Actually two of them, because I bought two Book 8088s in case one of them broke. So now I have the chip that was designed to run 8080 executables without recompilation, which was supposed to give NEC an edge over Intel's 8088 and win in the marketplace.

They didn't actually win, but that doesn't bother me.

I predicted the Amiga would win over the IBM PC. I was wrong about that too. That still interests me today, and one aspect of my PDOS work is getting C90 and ANSI X3.64-compliant software ready to make the jump to the Amiga, which can support exactly that.

I'm very very late to market. Like 35 years late or something. But that doesn't bother me either.
 
Well, just before the 5150 was officially revealed, the 68K lab computer was debuted. That started some people talking about IBM introducing a 68K-based PC. Of course, that didn't happen and many people who were expecting something phenomenally earthshaking from IBM were disappointed. The 68K had to wait for the Atari ST debut, I guess.
 

But IBM didn't need to be involved. I'm more interested in the culture. If people (programmers) had been writing their programs according to the appropriate specs - which were either available or close to being available - ie ANSI X3.159-1989 (as a draft, or else just use K&R 1) and ANSI X3.64 - then both line mode programs and fullscreen applications could have been portable to everywhere. I've even demonstrated EBCDIC ANSI terminals working on a mainframe (using emulation).

OK, so I've heard that ANSI.SYS was slow because it went through MSDOS, which in turn used the BIOS to write to the screen.

So? Replace ANSI.SYS with one that writes directly to the screen buffer, rather than changing every application to statically link libraries that write directly to the screen buffer.

In fact, there's nothing wrong with still doing the static link, so long as it is isolated in the code. Whatever screen-writing package or functions you use (microemacs 3.6 manages), the end result should be ANSI codes sent to a function called scrwrite that takes the same parameters as fwrite, one of them being stdout. If scrwrite is not redefined via a compiler override, it defaults to fwrite, so that by default your app conforms to the spec. For MSDOS, feel free to define scrwrite as a different name that is a statically linked module.

You need to recompile for the Amiga anyway, so none of that matters.

There could have been a scrwrite.c that has an #ifdef scrwrite surrounding it, so that by default it compiles to nothing, and the same command line could have been used to compile for both targets - see the sketch below.
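
A minimal sketch of that convention (the names app.h and dosscrw are hypothetical, and the MSDOS build is assumed to pass something like -Dscrwrite=dosscrw on the command line):

Code:
/* ---- app.h (common header) ---- */
#include <stdio.h>

#ifndef scrwrite
#define scrwrite fwrite       /* default build: pure C90, ANSI stream to stdout */
#else
/* the -D name gets a proper prototype via macro expansion */
size_t scrwrite(const void *, size_t, size_t, FILE *);
#endif

/* ---- app.c: application code is written once, e.g.: ---- */
void cls(void)
{
    scrwrite("\x1B[2J", 1, 4, stdout);   /* ANSI clear screen */
}

/* ---- scrwrite.c (does NOT include app.h; relies only on the
   command-line -D, so by default it compiles to nothing) ---- */
#ifdef scrwrite
#include <stdio.h>

size_t dosscrw(const void *buf, size_t size, size_t nmemb, FILE *stream)
{
    /* a real version would interpret the ANSI stream and poke the
       video buffer directly; falling back to fwrite keeps this
       sketch self-contained */
    return fwrite(buf, size, nmemb, stream);
}
#endif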

Similar issue for keyboard input - you need to translate to ANSI escape codes and interpret them rather than using the extended codes given by MSDOS.

Keystrokes aren't time-critical anyway.

ANSIPLUS (or equivalent) can give you the ANSI X3.64 keystrokes instead, if (like me) you don't even want to see the pollution of a kbdread.c - or whatever you want to call it - in your source tree. Ditto for not wanting scrwrite.c in your source tree.
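
For those who do want it in their source tree, a minimal sketch of what such a kbdread.c could look like (assuming a Turbo C-style int86() and the classic INT 16h scan codes; only the cursor keys are handled here):

Code:
/* kbdread.c - translate BIOS keystrokes to ANSI X3.64 sequences */
#include <dos.h>
#include <string.h>

/* fill buf with the ANSI bytes for the next keystroke;
   returns the number of bytes written (0 if untranslatable) */
int kbdread(char *buf)
{
    union REGS r;

    r.h.ah = 0x00;               /* INT 16h, AH=00h: wait for a key */
    int86(0x16, &r, &r);

    if (r.h.al != 0)             /* ordinary ASCII key */
    {
        buf[0] = (char)r.h.al;
        return 1;
    }
    switch (r.h.ah)              /* extended key: translate scan code */
    {
    case 0x48: memcpy(buf, "\x1B[A", 3); return 3;  /* up */
    case 0x50: memcpy(buf, "\x1B[B", 3); return 3;  /* down */
    case 0x4D: memcpy(buf, "\x1B[C", 3); return 3;  /* right */
    case 0x4B: memcpy(buf, "\x1B[D", 3); return 3;  /* left */
    default:   return 0;         /* no standard ANSI sequence - drop it */
    }
}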

I also wouldn't mind if there were a de facto standard/convention that, when writing ANSI controls, you write them in a way that is optimized to allow ANSIPLUS or a custom replacement to rapidly convert your buffer into direct screen writes. For example: an escape sequence to clear the screen, followed by your full screen of data. ANSIPLUS would be specifically designed to do a memchr looking for an additional ESC character, and so long as none is present, and the buffer is exactly 80*25 in size, it uses an efficient algorithm to write to the screen.
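
A minimal sketch of that fast path (the names are hypothetical; on a real XT, video would be a far pointer to the text buffer, e.g. MK_FP(0xB800, 0) on Turbo C for a color adapter):

Code:
#include <string.h>

#define SCREEN_CELLS (80 * 25)

/* returns 1 if buf was a clear-screen plus exactly one screenful of
   plain text and was written directly; 0 means fall back to the
   normal ANSI interpreter */
int fast_path(const char *buf, size_t len, unsigned char *video)
{
    size_t i;

    /* expect ESC [ 2 J (clear screen) then exactly one screenful */
    if (len != 4 + SCREEN_CELLS || memcmp(buf, "\x1B[2J", 4) != 0)
        return 0;

    /* any further ESC means attributes/cursor moves - slow path */
    if (memchr(buf + 4, 0x1B, SCREEN_CELLS) != NULL)
        return 0;

    for (i = 0; i < SCREEN_CELLS; i++)
    {
        video[i * 2]     = (unsigned char)buf[4 + i];  /* character */
        video[i * 2 + 1] = 0x07;                       /* grey on black */
    }
    return 1;
}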
 
Actually it just occurred to me that I could put both the keyboard and screen logic into pdpclib for MSDOS. I already do that in pdpclib for EFI.
 
But IBM didn't need to be involved. I'm more interested in the culture. If people (programmers) had been writing their programs according to the appropriate specs - which were either available or close to being available - ie ANSI X3.159-1989 (as a draft, or else just use K&R 1) and ANSI X3.64 - then both line mode programs and fullscreen applications could have been portable to everywhere. I've even demonstrated EBCDIC ANSI terminals working on a mainframe (using emulation).
Non-sequitur. The IBM PC was under development in 1981.
 
But IBM didn't need to be involved. I'm more interested in the culture. If people (programmers) had been writing their programs according to the appropriate specs - which were either available or close to being available - ie ANSI X3.159-1989 (as a draft, or else just use K&R 1) and ANSI X3.64 - then both line mode programs and fullscreen applications could have been portable to everywhere. I've even demonstrated EBCDIC ANSI terminals working on a mainframe (using emulation).

...

What you are saying is that every program, for every system, should either be written in C, or link with a C library. Or you're even assuming that this is already the case, as if C was somehow fundamental to how computers function and there was no way - or at least no reason - to avoid it, ever. That's also an attitude common among UNIX/Linux/GPL zealots...

The ANSI standard provides a lowest common denominator for console i/o, but if you want a program (in any language) to run well on a particular machine or OS, using its full capabilities, you have to adjust it anyway. Or have some abstraction layer that is a superset of what the hardware provides.

Many key combinations can't be represented by escape sequences, or at least there is no widely implemented standard for it. Certainly no ANSI one.

I have written a somewhat functional TSR for DOS that scans the video buffer for updates and sends them over the serial port, translated to UTF-8 and terminal escape sequences. It also attempts to translate input to the scan/ASCII codes expected by programs which use INT 16h. At 115200 bps, I'd call it "usable", but still with noticeable lag of course, and occasional screen corruption because of dropped characters.

Now I can use my 286 PC from a terminal window on Linux by running "screen /dev/ttyUSB0 115200".

But for example Ctrl+Enter (copies a filename into the command line in Norton/Volkov Commander) doesn't work, because there is no escape sequence for it that my Linux terminal emulator sends. Worse than that, it ignores the Ctrl and just sends ASCII CR, running whatever is in the partial command line. Yet somehow, the same key combination does work locally in Midnight Commander, and I actually looked at the source code to find out how this magic is possible -- turns out that when the DISPLAY environment variable is set, this console program will connect to the X server, and poll it for the state of modifier keys when receiving CR on stdin!

I'd call that a gross hack, one made necessary by the "standards-compliant" console i/o layer.

Back in the day, people ported games written in assembly language from one processor architecture to a different one, by rewriting the machine instructions but keeping the logic the same. And rewriting the i/o (which was usually part of the code, not a separate driver or library) to interact with whatever hardware existed in the target system.

"Write code once, compile/run anywhere" may be convenient, but giving up so much for it as you want is hardly an optimal state.

I also wouldn't mind if there were a de facto standard/convention that, when writing ANSI controls, you write them in a way that is optimized to allow ANSIPLUS or a custom replacement to rapidly convert your buffer into direct screen writes. For example: an escape sequence to clear the screen, followed by your full screen of data. ANSIPLUS would be specifically designed to do a memchr looking for an additional ESC character, and so long as none is present, and the buffer is exactly 80*25 in size, it uses an efficient algorithm to write to the screen.

And this would completely defeat the goal of portability, while still being inefficient compared to direct hardware access, and unable to deal with color/attribute changes.
 
Some programs were able to get good performance with portable code. UCSD Pascal's major problem was the very limited memory space allowed, which meant a lot of swapping compared to other programs that could take advantage of more than 128K.

I would not want to program in C on any micro platform before 1985. The compilers yielded slow yet buggy code. A fictional world with systems having enough memory to run complete, thoroughly debugged compilers would have resulted in different development decisions.
 

People were still doing direct screen writes when I started PC programming around 1987.

They were already writing in C, so that wasn't an issue.

I noticed when I had a similar discussion on this topic, someone said that supporting ANSI terminals on a mainframe would bring a z machine to its knees. I think I calculated how many instructions it took every time I pressed a key on my ANSI terminal (connected to PDOS/3X0) and it was 2000 instructions. I asked him what numbers he was using for performance of a modern z machine and he decided that was a good time to end the discussion.
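
(A rough back-of-envelope, with numbers that are purely assumptions on my part: if one modern z core retires on the order of 10^9 instructions per second, then at 2000 instructions per keystroke a single core could absorb around 500,000 keystrokes per second. Even 10,000 users typing 5 keys per second would cost about 10^8 instructions per second - roughly a tenth of that one core.)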

Apparently there is a "monkey experiment" where you zap the monkeys with electricity every time one of them goes for a banana. And when you put a new monkey in, the others hold him back when he goes for the banana. Keep replacing monkeys until there are only new monkeys, and they still refuse to go for the banana. The person doing the zapping may have died since then, and no-one bothered to write down the reason why we couldn't eat the banana, to see if that circumstance was still valid.

So supporting multiple (numerous) terminals was one reason for using block mode terminals.

And the reason on the PC may have been that people were calling MSDOS which in turn was calling the BIOS, instead of changing that equation.
 
Non-sequitur. The IBM PC was under development in 1981.
1978: [attachment]

1979: [attachment]

(I mentioned K&R 1 in my original message)

(BTW - even into the mid-late 1990s, some people got irate when I converted some K&R 1 code into C90, even though I was the one who had to maintain that code. They called it the "nancy standard", and when I pointed out that Ritchie himself supported the standardization effort, they told me that's his problem for going insane. I never bothered to ask what the issue was, because I started in 1987 with C89 drafts available, and I wasn't willing to budge from C90 until I was knowledgeable enough about languages to know what is allegedly wrong with C90 - I still don't see anything wrong.)
 
What you are saying is that every program, for every system, should either be written in C, or link with a C library. Or you're even assuming that this is already the case, as if C was somehow fundamental to how computers function and there was no way - or at least no reason - to avoid it, ever. That's also an attitude common among UNIX/Linux/GPL zealots...

I don't think I'm saying that.

I'm saying the programmers are already writing (or were, circa 1987) in C - but their programs couldn't be compiled on the Amiga and there wasn't an Amiga version because of the required porting effort.

The ANSI standard provides a lowest common denominator for console i/o, but if you want a program (in any language) to run well on a particular machine or OS, using its full capabilities, you have to adjust it anyway. Or have some abstraction layer that is a superset of what the hardware provides.

Well - I'm talking about fullscreen text applications. That's not using the full capabilities of the hardware. I'm looking at a fullscreen application written in C and wondering "why isn't this available on the Amiga? It's C. C is allegedly portable. Just recompile".

Many key combinations can't be represented by escape sequences, or at least there is no widely implemented standard for it. Certainly no ANSI one.

I have written a somewhat functional TSR for DOS that scans the video buffer for updates and sends them over the serial port, translated to UTF-8 and terminal escape sequences. It also attempts to translate input to the scan/ASCII codes expected by programs which use INT 16h. At 115200 bps, I'd call it "usable", but still with noticeable lag of course, and occasional screen corruption because of dropped characters.

Now I can use my 286 PC from a terminal window on Linux by running "screen /dev/ttyUSB0 115200".

But for example Ctrl+Enter (copies a filename into the command line in Norton/Volkov Commander) doesn't work, because there is no escape sequence for it that my Linux terminal emulator sends. Worse than that, it ignores the Ctrl and just sends ASCII CR, running whatever is in the partial command line. Yet somehow, the same key combination does work locally in Midnight Commander, and I actually looked at the source code to find out how this magic is possible -- turns out that when the DISPLAY environment variable is set, this console program will connect to the X server, and poll it for the state of modifier keys when receiving CR on stdin!

I'd call that a gross hack, one made necessary by the "standards-compliant" console i/o layer.

I would say the fundamental problem here is cultural/knowledge. Why use ctrl-enter as a key combination (at least - with no alternative) when you know that isn't something that can be transmitted over an ANSI terminal? How are you supposed to connect a VT100 to your MSDOS machine? Or allow its use from a BBS? Or work on the Amiga?

Back in the day, people ported games written in assembly language from one processor architecture to a different one, by rewriting the machine instructions but keeping the logic the same. And rewriting the i/o (which was usually part of the code, not a separate driver or library) to interact with whatever hardware existed in the target system.

"Write code once, compile/run anywhere" may be convenient, but giving up so much for it as you want is hardly an optimal state.
I'm not expecting games to be ported.

I'm expecting command line utilities to be ported, plus fullscreen text applications. Ported to the mainframe too. I wondered why there was nothing that looks like MSDOS on the mainframe. Took me decades to chase that one down. I now know how to do it, but haven't implemented it (unless you count the current z/PDOS that doesn't have sub-directories, just a root directory).

And this would completely defeat the goal of portability, while still being inefficient compared to direct hardware access, and unable to deal with color/attribute changes.

My flavor of microemacs 3.6 is working fine. And I don't need separate code for mainframe versus PC, other than taking care of EBCDIC vs ASCII - which, ironically, is exactly the one thing I think is missing from C90, and why I would like to fork C90. I need defines for ctrl-A, ctrl-B etc. plus ESC. Any suggestions on a header file and defines for those?
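
A minimal sketch of what such a header could look like (keydefs.h and all the names are hypothetical; the values shown are the ASCII ones, and an EBCDIC build would need its own table, since the control characters don't sit in one contiguous block there):

Code:
/* keydefs.h - hypothetical; portable names for control keys,
   since C90 itself provides none */
#ifndef KEYDEFS_H
#define KEYDEFS_H

/* ASCII only: Ctrl-A == 0x01, Ctrl-B == 0x02, ... Ctrl-Z == 0x1A */
#define KEY_CTRL(c) ((c) & 0x1F)

#define KEY_CTRL_A KEY_CTRL('A')
#define KEY_CTRL_B KEY_CTRL('B')
#define KEY_ESC    0x1B        /* ASCII ESC; an EBCDIC target differs */

#endif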
 
My non-sequitur comment was about citing ANSI x3.4 1989--note the date. Was I familiar with VT100 escape sequences before the PC? I sure was--I even did terminal software on contract before the PC gained momentum (I have the Z80 source to prove it). I was using C in the late 70s as well, on Unix; Lattice C for the PC was not very good (2 floppy set)--one can hardly assert that its code generation was optimal.

One may as well ask why the PC didn't use NAPLPS or Videotex for graphics.
 
I think we can all agree that life would have been a lot easier if Michael Shrayer had had the foresight to design the original Electric Pencil word processor to leverage the completely contemporary-to-1976 Open XML file format and set a good example for subsequent editors like Wordstar. Could have saved us a good forty years of having to deal with incompatible document formats if everyone had just gotten it all 100% correct and portable from the start.

For that matter, why can't I CLOADM a .PNG directly into the framebuffer of my TRS-80 Color Computer without having to jump through all these ridiculous hoops to dumb it down into a specific size, color depth, and byte ordering using external software? I mean, jeeze, these standards exist for good reasons, they should have built COLOR BASIC 1.0 to handle all this.
 
My non-sequitur comment was about citing ANSI x3.4 1989--note the date. Was I familiar with VT100 escape sequences before the PC? I sure was--I even did terminal software on contract before the PC gained momentum (I have the Z80 source to prove it). I was using C in the late 70s as well, on Unix; Lattice C for the PC was not very good (2 floppy set)--one can hardly assert that its code generation was optimal.

One may as well ask why the PC didn't use NAPLPS or Videotex for graphics.

Sorry, I'm confused.

Are you referring to ANSI X3.159-1989 or ANSI X3.64?

The former I already said "or draft, or K&R 1".

The latter I just looked up again and it clearly says:

The name "ANSI escape sequence" dates from 1979 when ANSI adopted ANSI X3.64

As to your question about graphics - my question isn't about graphics. And I'm not questioning the hardware decisions at all. That's what the language is supposed to sort out for you. K&R 1 was on mainframes too - it was one of the early ports.

People didn't need to wait for formal ratification of C89 - they could have used a draft, or K&R 1 (as I said in my original).

The C89 standard had the benefit of very careful wording to cover the different platforms (without stating them). But someone who already knew what the mainframe looked like in 1980 and was now programming on a PC could have at least figured out what they needed to do to get command line programs to work. People didn't need to freeze until 1989.

And regardless - the problem still existed in 1989.

And also C89 only covers command line programs anyway. My question is about extending that to fullscreen text applications.

For the last 35 years I have rarely strayed beyond command line programs, and rarely strayed beyond C90.

I am now wanting to go bananas and develop fullscreen text apps. Getting the existing Microemacs 3.6 to work was basically my first foray - only a couple of months ago or something. But although it works, it is still proof of concept, still only has cursor keys working, not page down etc, and I don't yet have the infrastructure in place to get it to compete against direct screen manipulation on an XT.

And that is my question - can it compete? If you're willing to change the C library and the OS. Not necessarily the C compiler. And that could potentially be the answer - "it was only in xyz year that the C compilers started generating the code necessary to bundle up an ANSI data stream that could have competed with direct screen manipulation".

Indeed - there was probably a year where the mainframes had enough memory such that even the largest mainframe user with the largest number of 3270 terminals in recorded history could have switched to EBCDIC ANSI terminals (fairly trivial if you're using emulated terminals anyway). But - like the monkeys - no-one reevaluated the reason why the non-standard 3270 needed to be used in the first place, making mainframe fullscreen applications (but not command line) inherently incompatible with the PC.

Although you could argue that the 3270 should have been the standard in the first place, to allow efficient block mode. Even though the PC and/or DEC terminals didn't need it to be efficient since they were serving a single user.
 
And this would completely defeat the goal of portability, while still being inefficient compared to direct hardware access, and unable to deal with color/attribute changes.

Sorry - I didn't fully answer this.

Arranging for ANSI escapes to be carefully lined up would still be portable. It's still a valid ANSI data stream.

And yes, it would be optimized ready for transfer to an XT screen buffer.

But the main competition is exactly that - the XT screen buffer.

I don't see a problem. If it can compete - ie be viable - the program will port. There is nothing at all lost compared to the alternative - using a method which won't port, because you can't make ANSI fast enough.
 
I think we can all agree that life would have been a lot easier if Michael Shrayer had had the foresight to design the original Electric Pencil word processor to leverage the completely contemporary-to-1976 Open XML file format and set a good example for subsequent editors like Wordstar. Could have saved us a good forty years of having to deal with incompatible document formats if everyone had just gotten it all 100% correct and portable from the start.

That's a question to ask ANSI/ISO - why didn't they create an early standard for documents?

And it's a separate question from what I asked.

For that matter, why can't I CLOADM a .PNG directly into the framebuffer of my TRS-80 Color Computer without having to jump through all these ridiculous hoops to dumb it down into a specific size, color depth, and byte ordering using external software? I mean, jeeze, these standards exist for good reasons, they should have built COLOR BASIC 1.0 to handle all this.

And this is a separate question - a standard for graphics. Why isn't there the equivalent of ANSI X3.64 for graphics (even in 2023)?

I may well ask that exact question at a later date, and maybe ANSI X3.64 can be extended to address each pixel in graphics mode, and use an "X" to write a pixel in foreground color, blank to write in background color.

But at the moment I am just trying to do the easier task of plain text - as fullscreen instead of command line.

Command line was already done in 1990 when ISO signed off without non-whitespace modifications to the ANSI standard.

ANSI X3.64 was only added to Windows with Windows 10 - and not even the first version. Despite the fact that they supported it for output (but not input) in the 1980s.

Seems pretty bizarre to me.
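
(For reference, a minimal sketch of turning that support on under Windows 10, using the documented ENABLE_VIRTUAL_TERMINAL_PROCESSING console-mode flag - the surrounding program is just illustrative:)

Code:
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HANDLE out = GetStdHandle(STD_OUTPUT_HANDLE);
    DWORD mode;

    /* turn on ANSI X3.64 (VT) interpretation for this console */
    if (GetConsoleMode(out, &mode))
        SetConsoleMode(out, mode | ENABLE_VIRTUAL_TERMINAL_PROCESSING);

    printf("\x1B[2J\x1B[1;1H");    /* ANSI clear screen + home cursor */
    printf("hello from X3.64\n");
    return 0;
}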

But that's my question.

If I hop into my time machine and go back to the 1980s and ask people to use ANSI X3.64 in preparation to port to both the Amiga and a future mainframe, is there any technical reason to object, or just cultural?

Note that if there is a technical reason, I may update microemacs 3.6 and pollute it with code to do direct screen buffer writes for use on my Book 8088.

More likely I'll just upgrade to a Hand 386, but I would at least pencil it in as a "valid body of work to do".
 