A list of support chips for the Intel 4004

No, General Instrument was sliced and diced. The microelectronics division was spun off as Microchip in the '80s; the power-semiconductor business became part of Vishay; the cable and antenna business became part of Motorola.

Donald Rumsfeld (yes, him) was CEO between 1990 and 1993.
 
...and we're back to the "what's so special about a single-chip CPU" question again. By 1976 we had several 16-bit CPUs. Weren't the MicroNova and the Fairchild 9440 a reality in '76? You could run whole database and business-management packages on those.

MicroNOVA (mN601) came in '77, but MicroFlame (9440) was out in '76. I've heard rumors of MicroFlame samples in '75. And, of course, the IMP-16 was first.

They're all interesting as technical achievements, to my mind. Though as practical processors they all suffered from poor performance for the overall system cost. They were all dog-slow, even compared to other processors of their time, 16 bits or no. Plus you had to pay to put the full 16-bit bus into the system. At the time, that often meant building the bus up out of hex or quad parts, since octal MSI chips were, where available at all, prohibitively expensive compared to the older six-gate or four-gate per-package ICs--buffering a 16-bit bus takes two octal packages versus three hex or four quad ones.

At any rate, if you were in the position of putting money down on apps that required a larger data space or a wider word, then the price differential between a micro and a mini would not have been enough to tip you toward the microcomputer. Minis were quite cheap for the power, especially once the VAX had moved into the top of the market with 32 bits.

16-bit micros weren't cheap enough to compete with the 8-bitters in opening up the new market at the bottom end, and weren't cheap enough to erode the market for 16-bit minis, either.

And the early ones were really slow, partly because the process technology they were implemented in was slow, and partly because the other compromises made to get the designs onto a single die hurt their performance. Not so much compared to an 8008, but they were competing with the 8080, 6800, 6502, etc.--not the 8008, which was already old and slow by the time they came out.

So are there any benchmarks, say, comparing floating-point BASIC performance on the 1802 with other CPUs of the time?

As far as sixteen 16-bit registers go, I think the GI CP1600 precedes the 1802 by a few months--and it was a true 16-bit CPU.

The point I was trying to make with the 1802 relates to its built-in DMA. Basically I was positing: if we had gotten the microprocessor without the simultaneous nonvolatile-memory revolution, would microprocessors with a DMA feature like the 1802's--which lets you load memory and run with a bit of simple logic--have had a marked advantage over processors that are really only practical with a body of nonvolatile memory?

I wasn't trying to make out the 1802 as the be-all and end-all of processors. I was referring to its LOAD-mode DMA. With LOAD mode you can make the 1802 boot off practically anything--paper tape, cassette tape, artfully arranged bricks, you name it. This is part of what makes it a good chip for things like space probes, of course. The question was whether we would have seen more of this in a world without the EPROM/PROM revolution of the mid-'70s.
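For anyone who hasn't played with one, here's a minimal sketch in C of what LOAD mode amounts to. The names (mem, r0, dma_in_byte, next_byte) are made up for illustration, not taken from any real emulator; the behavior assumed is the documented one: with /CLEAR and /WAIT both held low the CPU idles, and each DMA-IN request stores one byte at the address in R0, which auto-increments, so any dumb byte source can fill memory from 0000 before you let the chip run.

[code]
/* Illustrative sketch of the CDP1802 LOAD-mode boot path.
 * Assumes the documented behavior: R0 doubles as the DMA
 * pointer, and each DMA-IN cycle stores a byte and bumps it. */
#include <stdint.h>

#define MEM_SIZE 65536
static uint8_t  mem[MEM_SIZE];
static uint16_t r0;                  /* R0: program counter and DMA pointer */

/* One DMA-IN cycle: latch a byte from the external device. */
static void dma_in_byte(uint8_t data)
{
    mem[r0++] = data;                /* store, then auto-increment R0 */
}

/* LOAD mode: /CLEAR and /WAIT both low. Clock bytes in until the
 * source runs dry, then release /CLEAR to run from address 0000 --
 * no ROM anywhere in the system. next_byte() stands in for the
 * paper-tape reader, cassette interface, or front-panel switches;
 * it returns -1 when the "tape" runs out. */
static void load_mode(int (*next_byte)(void))
{
    int b;
    r0 = 0;                          /* reset forces R0 to 0000 */
    while ((b = next_byte()) >= 0)
        dma_in_byte((uint8_t)b);
}
[/code]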

As to 1802 performance, I've already gone into that. By taking advantage of the voltage tolerance and high clock rates of the CMOS process, an 1802 system designed for performance could outpace anything up to (and often including) a Z-80A. Most implementations of the 1802 traded performance away for simple video interfacing by running the system clock at half the colorburst frequency (about 1.79 MHz) and no faster.
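To put rough numbers on that--a back-of-the-envelope sketch, with timing as I remember it from the CDP1802 datasheet (8 clock pulses per machine cycle, two machine cycles for most instructions) and clock maxima that are my assumptions, worth double-checking:

[code]
/* Rough 1802 throughput arithmetic. Timing assumed: 8 clocks per
 * machine cycle, 2 machine cycles (16 clocks) for most instructions.
 * The clock-rate figures are from memory, not from this thread. */
#include <stdio.h>

int main(void)
{
    const double clocks_per_instr = 16.0;     /* 2 cycles x 8 clocks  */
    const double mhz[] = { 1.7897725,         /* half NTSC colorburst */
                           3.2,               /* CDP1802A at 5 V      */
                           6.4 };             /* CDP1802A at 10 V     */
    for (int i = 0; i < 3; i++)
        printf("%.2f MHz -> ~%.0fk instructions/s\n",
               mhz[i], mhz[i] * 1e6 / clocks_per_instr / 1e3);
    return 0;
}
[/code]

Run at half colorburst you get on the order of 112k instructions/s; at the 10 V rating it's more like 400k, which is where the "outpace a Z-80A" claim starts to make sense.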

Of course, if you wanted real compute performance, you didn't go to a micro if you could afford better. You bought time on a PDP, an IBM, an Eclipse--or a VAX once they appeared.

Getting back to the memory/microprocessor discussion: personally, I'm convinced the real revolution was in memory. From the perspective of the foundries, the microprocessor was practically nothing more than a device designed to sell memory. Which is why the low-end, low-cost designs dominated. Once market demand for semiconductor memory in mainframes and minis started growing more slowly than production capacity, the micro opened up new markets at the bottom end, where the costs of a mini weren't justifiable and the advantages of computers were often unknown.

So we got microprocessors because semiconductor memory had already caused a revolution. And 8 bits dominated the new low-end market specifically because it was a lot cheaper to build a reasonably competent system around than 12 or 16 bits. SuperCalc and dBase II (or its predecessor Vulcan, if you want to go a little earlier) were enough for that market segment.

What's interesting about the 4004 is how it's perceived today. It looks like it was more of a technical achievement than it really was. It's got a strange, crabbed design that looks like it was a result of really being out on the cutting edge. In fact, it was a design shaped by the constraints of Intel's packaging ability and inexperience at designing processors.

The much more sophisticated designs--based on long experience with processor design, and with better packaging options from chip houses that had a broader range of equipment than Intel--don't have that unfinished, odd, Victorian look. Which somehow makes them seem less "cutting edge," no matter how technically amazing it was to put a full 16-bit design on a single die. (And disregarding the fact that the price/performance ratio of a system built around one turned out to be a poor business investment for the manufacturer.)
 