A list of support chips for the Intel 4004

Personally, I think too much is made of the significance of the first MPUs.

People tend to treat them as if they arrived like a bolt from the blue, but in fact, they were just part of the evolutionary process. From SSI circuits, complexity started increasing. I recall that the 1-chip ALU, the 74181, was greeted with quite a bit of interest, but it was bipolar because MOS was slow-slow-slow. For more ideas on where bipolar design was going, consider the Fairchild Macrologic family.

Let's not forget that the early micros right up through 8080 required a fair amount of external chip support with clock generation and system bus control. It's pretty much a stretch saying that the 4004, 8008 or even 8080 was a "complete" microprocessor.

A year or two after the 4004, National Semi came out with the IMP-16, a chipset that implemented a 16-bit minicomputer. Far more versatile than the 8008 or even the 8080. Yet almost no one remembers it because it wasn't a one-chip solution.
 
I agree with Chuck. After all, the 8080 didn't even really start to get traction until the latter half of 1975. The IMP and PACE were both more capable, but didn't have the "mindshare". The 1801 was another good CPU that got overlooked because it was a two-chip solution: low power, with a powerful register set. (When I read the 8080 data sheets, my familiarity with the 1800 series actually made me break out laughing when Intel called their registers "general purpose".) These chips were RISC before there was RISC.

The adulation of the 4004 baffles me, to be honest. It was a step in the process, but its contemporary impact was far less than is made out. It's only in retrospect that it can be made to appear important. It was an expensive, complex, multichip solution. I avoided it back in the day and I'd never want to build around one today unless I was feeling especially masochistic. What it did the SC/MP did better in every possible way. I'd rather base a system on an 8041 or 8048 than a 4004.

The 8008 was crippled as well. Why waste your time with a chip that Intel's limited packaging abilities forced to be spread across so many support chips when you could get a nice two-chip solution like the 1801? I have to say that what I was really looking for in a processor at the time was a 12-bit processor rather than an 8-bit one. I ended up doing plenty with 8-bit chips and having a great time with them, more than I expected coming from an IBM/DEC background, but I wasn't alone among those who at the time looked at the 8008 and 8080 and said "when are you going to make a real computer chip?"

The 8085 was the first of the Intel line that I really liked, and by the time it shipped in quantity the Z-80 was already out. The 8085 finally had a single supply, no external bus expander required, and for good measure an internal clock. But Intel was beat to market on all these features by other CPU manufacturers.

The single-chip CPU was coming, 4004 or not. There were pressures from several areas pushing toward the integration of the different functions. Witness the differences between the different early CPUs, like the F8 and 1802 and so on. Each was shaped by a different view of how the processor should be organized and used, and by each company's manufacturing abilities. What later began distorting the market was the effect of the software base. That has continued to this day, making certain processors stand out more in retrospect than they did in their own day.

Anyway, the 4004 is an ugly, ugly, ugly chip set. The instruction set is narsty as can be. It's technically interesting and it has an interesting history, but it wasn't the pinnacle of achievement in 1971-2 by any means--even the dates are mythical retro-inventions. But that's another story. ;)
 
Now that I've thought about it, I'll go one farther by saying the one-chip microcomputer was not the greatest evolution of the early 1970s. In my opinion it was the MOS DRAM.

Consider a terminal that used the 8008, such as the Beehive SuperBee. What to use for memory for a screen buffer (24x80, with several pages)? Core was a possibility but was power-hungry and a bit difficult to use (after a read, core needs to be rewritten, as reading is destructive). There were other types of storage, such as magnetostrictive delay lines (too slow and too sensitive to mechanical shock). In the end, a recirculating shift register was used. 1Kbit MOS shift registers were fairly inexpensive and fast, as long as you could wait for your bit to come along.
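Just to make the "wait for your bit" point concrete, here's a little Python sketch (purely illustrative, not modeled on any particular part) of a 1Kbit recirculating shift register used as a screen buffer--there's no random access, you just clock until the bit you want comes around:

```python
# Minimal model of a 1Kbit recirculating MOS shift register used as a
# screen buffer.  Illustrative only: the real serial parts had no random
# access at all -- you waited for the data to come past the output tap.

class RecirculatingShiftRegister:
    def __init__(self, length=1024):
        self.bits = [0] * length
        self.pos = 0          # index of the bit currently at the output tap

    def tick(self):
        """Advance one clock; the register recirculates its own output."""
        self.pos = (self.pos + 1) % len(self.bits)

    def read(self, addr):
        """Read the bit at 'addr', counting how many clocks we had to wait."""
        waited = 0
        while self.pos != addr:
            self.tick()
            waited += 1
        return self.bits[addr], waited

# Worst case: the bit you want just went by, so you wait most of a full loop.
sr = RecirculatingShiftRegister()
_, wait = sr.read(1023)
print("clocks waited:", wait)   # 1023 on a freshly initialized register
```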

Remember that the original MITS Altair had 256, count 'em, 256 bytes of static RAM. The TV Typewriter used six 2101 SRAMs to provide an uppercase-only display--and they ran hot.

The 1101 SRAM and then the 1103 DRAM were the answer, followed by the 2107 (4Kx1) and then the 2116 (16Kx1)...

Memory is what made the personal computer practical.
 
Memory is what made the personal computer practical.

Right on the nose.

Though I'll say that the PIA was up there in importance--but it was more a part of what made the uP successful than a standalone technical achievement. Using general-purpose MSI for I/O has always been my preference over LSI and VLSI, but the programmable I/O chip brought a lot more engineers, and therefore applications, to uPs than would have come otherwise. The uP, whether a single chip like the sixers or a 3-chip set like the 8080A, had been reduced to a cookbook component for most designs. But they would have remained a curio without the sorts of apps you could get with uP + PIA (of whatever family or, often, mixed families).

At the time I remember a lot of designers treating I/O as a black art. They saw the programmable I/O chips as a panacea, and the design solutions offered around a uP plus one of those chips brought uPs into a lot of places they wouldn't have been otherwise.

Another point on semi memory's importance--it would have revolutionized the world even if we'd never developed a single-chip processor (doubtful as that may be, given the demand for microcontrollers--but imagine the 8041 as an apex there, perhaps). There were so many things to do with it that had nothing to do with tying it to a processor. Now we can hardly imagine anything else.

For those who can't imagine, take a look at a memory data/applications book from around 1969-1973. Intel was strongly targeting their memories toward use in mainframes and minicomputers, so it's almost all digital computer apps in their books, but Moto, Nat'l Semi and others had all sorts of other applications in their books. State machines galore, with no CPU.
 
My memory is that the Altair 8800 used an 8212 for its parallel interface initially; my OEM Diablo 630 was driven from a 3P board that had no programmable I/O whatsoever--just ports whose direction was set manually. Serial I/O came from a TR1602-type UART, with everything set manually.

So, I tend to discount the PIA or PPI--the only one I ever used was the 8255--and it's got its own quirks that persist to this day.
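For anyone who never wrestled with one, here's a small Python sketch (my own illustration, not vendor code) of the 8255's mode-0 setup--the whole "programmable" part boils down to one control byte that sets each port's direction, the same thing the manually strapped ports on those boards did in copper:

```python
# Build an 8255 PPI mode-0 control word.  Bit layout (mode-set flag in D7):
#   D7=1 mode set | D6..D5 group A mode | D4 port A dir | D3 port C upper dir
#   D2 group B mode | D1 port B dir | D0 port C lower dir   (1 = input)

def ppi_mode0_control(a_in, b_in, c_hi_in, c_lo_in):
    word = 0x80                      # D7 = 1: mode-definition word, mode 0 everywhere
    word |= 0x10 if a_in else 0
    word |= 0x08 if c_hi_in else 0
    word |= 0x02 if b_in else 0
    word |= 0x01 if c_lo_in else 0
    return word

print(hex(ppi_mode0_control(True, True, True, True)))      # 0x9b: everything input
print(hex(ppi_mode0_control(False, False, False, False)))   # 0x80: everything output
# One of the quirks alluded to above: writing a new mode word also resets
# the output latches, so reconfiguring a port glitches whatever it was driving.
```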

On the other hand, the UART made connecting computers to each other--and ultimately the first internet access--possible.
 
If you are talking about influential chips in that era, don't forget the 6502. Not so much technologically, but for the way it was sold for $25 a pop by Chuck Peddle at a trade show. This not only helped bring out the Apple I, but, more importantly I think, helped create the idea of mass-market computers in general. Also, a low chip count (including a single-chip CPU and good companion chips) doesn't make a computer better, but it does usually make it cheaper.
 
Well, MOS Technology definitely broke the pattern of immense margins on CPUs, but that was an effect of a competitive environment rather than technology. Prices had already been falling; CPUs that had been over $300 had fallen to about $100 before the WESCON where the 6502 debuted.

I think what's at issue is the sort of after-the-fact idolization the 4004 and 8008 have gotten. Much more has been attributed to them than was the case at the time. In my mind, the 6800, 1801, and 8080 were the real turning point for the use of processors in small/inexpensive system design. The 4004 and 8008 were no better than many other chipsets available at the time; treating them as if they were single-chip, bolt-out-of-the-blue wonders is silly in historical context.

Integration and dropping prices were the nature of the business. Neither design had a level of integration that really stood out from other developments at the time--they were just a slightly different mix of processor components relative to other chipsets of the day, one that happened to put processing, control, and a register file on a single chip of the set, technically checking the boxes for being a microprocessor.

Lest it be thought that I'm a 4004 basher, have a look at the 4004 article I put in the wiki months ago. I'm quite familiar with it and the 8008. Both look more important in retrospect than they were contemporaneously. They were significant, but not technically dominant in any way.

When a pair of guys left Fairchild to build the semiconductor memories that Fairchild said weren't worth building, however...the Earth shook. ;)
 
To what degree do newer x86 designs (starting from the 8086/8088, I suppose) inherit addressing modes and instruction-set features from the 4004, and perhaps even more from the 8008 and 8080? Or in other words, is there any legacy or heritage from the 4004 in newer CPUs? If so, I would assume a bit of the idolization comes from that fact. Many of the other microprocessors you mention don't live on in modern times, at least not as obviously as the Intel line has.
 
A better question would be what do x86 CPUs suffer from the 8008 design? The answer to that is, of course, considerably.

And it's not as if Intel didn't try to get away from the x86--it was, after all, mostly intended as a stopgap design until the 432 (a very advanced architecture) was ready. Even the N10 project, started in 1986 and ultimately resulting in the i860, showed that Intel was desperately trying to drop the x86 notion (BillG at Microsoft thought it would revolutionize computing). They tried again with the Itanium.

No soap.

It's sort of like replacing a cart and horse with a jet fighter, only to discover a feedbag for the horse is part of the fighter's standard equipment.
 
Hi,
About the only thing one can think of that was brought forward was the decimal adjust.
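To make that concrete, here's a rough Python sketch (mine, not anything out of an Intel manual) of the decimal-adjust style of fix-up--the correction applied to the accumulator after adding packed-BCD digits, which lived on as the 8080's DAA and is still in x86's 16/32-bit modes:

```python
# Rough sketch of a DAA-style decimal adjust after packed-BCD addition.
# Illustrative only -- flag handling on the real chips is more involved.

def bcd_add(a, b):
    """Add two packed-BCD bytes and apply a decimal-adjust correction."""
    result = a + b
    if (result & 0x0F) > 0x09 or ((a & 0x0F) + (b & 0x0F)) > 0x0F:
        result += 0x06                 # fix up the low digit
    if result > 0xFF or (result & 0xF0) > 0x90:
        result += 0x60                 # fix up the high digit (sets carry on real HW)
    return result & 0xFF

print(hex(bcd_add(0x19, 0x28)))   # 0x47  (19 + 28 = 47)
print(hex(bcd_add(0x99, 0x01)))   # 0x0   (100; the carry out isn't modeled here)
```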
Still, one has to realize that the 4004 was more like the typical mainframes and minis of the time than later processors. All the I/O and memory watched the instruction stream and knew when they had action to take; the processor had nothing to do during these cycles other than read or write data to the correct register. Later uPs would explicitly send I/O addresses and data to ports (even if memory mapped).

As for the 4004, you have to realize it was a relatively specialized design meant to run calculators, and it had to fit into DRAM chip packages.

I agree about solid state memory as being a significant step. EPROMs were also a window opener. Before that, the only way to test code was a diode matrix or simulation on a mainframe, and making a mask ROM had a high entry fee. I doubt any of the PCs would have even started without the ability to develop code 'on the cheap'.
Dwight
 
I agree about solid state memory as being a significant step. EPROMs were also a window opener. Before that, the only way to test code was a diode matrix or simulation on a mainframe, and making a mask ROM had a high entry fee. I doubt any of the PCs would have even started without the ability to develop code 'on the cheap'.
Dwight

Memory does seem to be it, doesn't it?

And yeah, EROM/EPROM/EEPROM were really important to development.

A funny thought, though. In an alternate history without user programmable nonvolatile memories, does the 1802 (usable for ROMless development thanks to LOAD mode) become a hobbyist favorite over the eighters and sixers in the 1976-78 time frame? :D

...Actually, I expect not. Instead I'd think a 1K masked monitor ROM would become a "standard" part for the other processors until user programmable memories do appear. The 1802 was interesting, but RCA just let too many opportunities slide to ever become a top tier uP supplier.
 
The 1802 was awkward to program (everything funneled through the D register) and a bit spare in its instruction set. The LOAD mode was interesting, but could be emulated on other CPUs (e.g., the Altair and IMSAI machines got along fine without a CPU that had such a mode, and when cheap fusible-link PROMs came along, it got even simpler).

Another thing that hampered the 1802 was the lack of a comprehensive support chip set.

Where the 1802 was useful was in low-power remote telemetry applications that demanded battery power supplies. Since it was a fully static CPU, you could turn off the clock and the power consumption would drop to microwatts.

RCA wasn't known for their business acumen outside of radio and TV. After "General" Sarnoff turned over the firm to his kid, Robert, it was all downhill.

They shed their profitable mainframe business (remember the Spectrola? (Spectra 70)), got out of micro-electronics and instead went into car rental and TV dinners, both of which turned out to be disasters.

Reminds me a lot of An Wang and his kid Fred, who essentially destroyed Wang Labs.
 
I have been eyeing the 1802 for at least five years. I printed Tom Pittman's short introductory course and even managed to squeeze out a few instructions of my own, but didn't really get into it. I found a reference claiming that 1802 programs are usually more compact than the exact same program written for the 8080, 6800, 6502, and Z8, but also that the 1802 uses more clock cycles than the rest.
 
I'd seen the claim that since most of the 1802 instructions were one byte, programs were more compact, but I took the claim with a grain of salt. The one-byte instruction claim comes about because most instructions are one-address, with that address being a 4-bit quantity specifying a single register; the D register as an operand is implied. There are no 16-bit transfer instructions, so address registers have to be loaded 8 bits at a time.
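To put numbers on that, here's a tiny hand-rolled sketch (hypothetical helper, not from any real toolchain) of what it takes just to get a 16-bit address into one of the sixteen registers--everything goes through D, one byte at a time:

```python
# Tiny illustrative "assembler" for the point above: on the 1802 there is
# no 16-bit immediate load, so filling an address register means going
# through D one byte at a time -- LDI / PHI / LDI / PLO, six bytes total.

def load_r16(n, addr):
    hi, lo = (addr >> 8) & 0xFF, addr & 0xFF
    return bytes([
        0xF8, hi,        # LDI hi   -- immediate byte into D
        0xB0 | n,        # PHI Rn   -- D into high half of Rn
        0xF8, lo,        # LDI lo
        0xA0 | n,        # PLO Rn   -- D into low half of Rn
    ])

code = load_r16(0x7, 0x1234)
print(code.hex())        # f812b7f834a7 -- six bytes to do what LXI does in three on an 8080
```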

I find the "smaller" claim suspicious because I've never run into a real large-scale application for the 1802, such as a PL/I compiler, relational database manager, accounts receivable package, etc. Implementing the data mechanisms to support such stuff is pretty cumbersome on an 1802. For example, write the code for a C-type calling sequence with on-stack local variables in the routine being called. It gets very messy.

It might be true for simple operations, but that's not characteristic of serious applications.
 
If you try to program an 1802 like another processor, it'll be a disaster. If you approach it with the right mindset it will have tighter code than other chips of the era.

It's up to the same tasks as any other processor with the limited memory space of that time. With its register set it handles sophisticated software well and efficiently. Personally I'd put it at better than an 8080A or 68A00, about on a par or slightly better than an 8085 (comparable I/O capabilities, better code/register set in 1802, higher speeds in later 8085s), the Z-80 beats it by a nose and the Z-80A clearly trumps it, without considering the 8080 codebase--just on technical merits.

That said, the system implementations that RCA came out with for general purpose computing were pretty limited, giving the chip the appearance of being limited.

But it was a good, solid chip in 1976. ;)
 
...and we're back to the "what's so special about a single-chip CPU" question again. By 1976, we had several 16-bit CPUs. Weren't the MicroNova and Fairchild 9440 a reality in '76? You could run whole database and business-management packages on those.

So are there any benchmarks, say, comparing floating-point BASIC performance on the 1802 with other CPUs of the time?

As far as 16 16-bit registers go, I think the GI CP1600 preceded the 1802 by a few months--and it was a 16-bit CPU.
 
...As far as 16 16-bit registers go, I think the GI CP1600 preceded the 1802 by a few months--and it was a 16-bit CPU.
GI is a name that's largely ignored these days, but they were very much a presence way back then with their CPUs, PICs, UARTs, EEROMs, etc., not to mention their game, sound and speech chips, and some of their grandchildren are still around today...
 
GI is a name that's largely ignored these days, but they were very much a presence way back then with their CPUs, PICs, UARTs, EEROMs, etc., not to mention their game, sound and speech chips, and some of their grandchildren are still around today...

...amongst which is the PIC MCU. Dates back to 1976 or so--the instruction set is clearly recognizable.
 