
When did the 8-bit era end?

Chuck and others have taken the technical aspect of the question but from a marketing point of view (and the public's perception) I'd say the 8-bit era ended on 12th August 1981 - the day the IBM PC was announced.

You get 1 vote from me on that.
:toast:
 
Chuck and others have taken the technical aspect of the question but from a marketing point of view (and the public's perception) I'd say the 8-bit era ended on 12th August 1981 - the day the IBM PC was announced.

But Texas Instruments had the first 16-bit home computer, in 1979, finally... :)

[attached image: ti99_finally_large.jpg]
 
Mattel had one that year too, but they forgot to sell the keyboard, so it was doomed to play video games. :(
 
It should also be observed that, in 1981, virtually no original x86 applications existed for the then-64KB PC. The ones that were available were mostly translated from x80 products and were little different from (or faster than) their 8-bit originals.
 
My first computer was a TRS-80 Color Computer, around 1980 or 1981. For me, the 8-bit era ended when I got my Amiga 1000 in 1986.
 
Never really had an 8-bit era save a couple of C64s for the kids. A 286/16 was my primary system in the early 90s and being a generic clone was upgraded quite a bit over time using second hand components.
 
Chuck and others have taken the technical aspect of the question but from a marketing point of view (and the public's perception) I'd say the 8-bit era ended on 12th August 1981 - the day the IBM PC was announced.

How do you figure?
As far as I know, IBM never tried to market the PC as a 16-bit machine.
We always spoke of PC/XT-class machines as being 8-bit, as far as 'public perception' goes.
 
And indeed, some 16-bit processors with 16-bit buses have been successfully interfaced to 8-bit buses.

There's even the 68008, which is a 68000 with an 8-bit bus. So that's technically a 32-bit processor (okay, purists will note that the ALUs aren't fully 32-bit, but it will perform 32-bit operations on 32-bit registers, so to the software it appears as 32-bit).
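
To make that concrete, here's a minimal C sketch (the function is hypothetical) of what "32-bit to the software" means. Compiled for anything in the 68000 family, the 68008 included, this is one 32-bit ADD.L on a data register; the 8-bit bus only changes how many cycles the fetches take.

/* Hypothetical example: 32-bit arithmetic as software sees it on a
   68008. A 68000-family compiler emits a single ADD.L on a 32-bit
   data register here -- no multi-byte emulation, unlike a true
   8-bit CPU, where a long add takes a chain of 8-bit operations. */
long add32(long a, long b)
{
    return a + b;   /* one 32-bit add instruction */
}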

The Atari ST was called ST for a reason: Sixteen/Thirty-two.
Likewise, the Amiga was marketed as a 16-bit machine, not 32-bit.
I don't think anyone ever used register/ALU size as the definition of how many bits a machine was.
Let alone the FPU, because those were very wide even in the early machines... 80-bit for the 8087, and 96-bit for the 68881.
I think more often than not the data bus was the metric.

Which means the PC world missed a huge marketing opportunity when the Pentium arrived: it was a 32-bit CPU on a 64-bit data bus feeding its dual-pipeline architecture.
Could have been marketed as an early 64-bit machine :)
 
The 8-bit *microcomputer* era may have ended (except for hobbyists still building them), but the 8-bit CPU era hasn't ended yet. The 65C02 is going strong, and 8-bit microcontrollers are used everywhere. The alarm company replaced my home alarm system because I got rid of the land line, and the board they ripped out of the old system (and gave to me) was Z8-based.

-Tor
 
How do you figure?
As far as I know, IBM never tried to market the PC as a 16-bit machine.
We always spoke of PC/XT-class machines as being 8-bit, as far as 'public perception' goes.

Yes, sure, the 8088 physically has 8-bit I/O paths, but logically the registers and instruction set are those of the 16-bit 8086.
Here's how BYTE magazine described it in the opening paragraph of their in-depth review of the IBM PC in the January 1982 issue:

"What microcomputer has color graphics like the Apple II, an 80-column display like the TRS-80 Model II, a redefinable character set like the Atari 800, a 16-bit microprocessor like the Texas Instruments TI 99/4, an expanded memory space like the Apple III, a full-function uppercase and lowercase keyboard like the TRS-80 Model III, and BASIC color graphics like the TRS-80 Color Computer? Answer: the IBM Personal Computer, which is a synthesis of the best the microcomputer industry has offered to date."
 
a redefinable character set like the Atari 800

If only... :)
That would have made CGA a LOT better... and MDA would already be halfway Hercules.
But alas, no, the character set is NOT redefinable. Not sure where they got that idea.

Answer: the IBM Personal Computer, which is a synthesis of the best the microcomputer industry has offered to date."

Apparently this was before the Commodore 64 was introduced :)
 
If only... :)
But alas, no, the character set is NOT redefinable. Not sure where they got that idea.

In CGA graphics mode, the upper ASCII characters are redefinable. In fact, they're not defined to anything in the first place unless you load GRAFTABL.
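
For the curious, here's roughly what that hook looks like from a program's side -- a minimal sketch, assuming a 16-bit DOS compiler in the Turbo C family (dos.h with int86/getvect/setvect/MK_FP); the glyph data is made up:

/* Sketch: redefine graphics-mode characters 128-255 on CGA by pointing
   the INT 1Fh vector at our own 8x8 font table -- the same slot that
   GRAFTABL fills in. INT 1Fh is never executed as code; the BIOS just
   reads the vector as a far pointer to the glyph bitmaps. */
#include <dos.h>
#include <stdio.h>

static unsigned char far font[128 * 8];   /* chars 128..255, 8 bytes each */

int main(void)
{
    union REGS r;
    void interrupt (*old1f)();
    int i;

    for (i = 0; i < 8; i++)
        font[i] = 0xFF;                    /* char 128 becomes a solid block */

    r.x.ax = 0x0004;                       /* INT 10h AH=00h: CGA 320x200 mode 4 */
    int86(0x10, &r, &r);

    old1f = getvect(0x1F);                 /* save whatever was there */
    setvect(0x1F, (void interrupt (*)()) MK_FP(FP_SEG(font), FP_OFF(font)));

    printf("\x80\x80\x80");                /* BIOS software-blits our glyph */
    getchar();                             /* pause so the result can be seen */

    setvect(0x1F, old1f);                  /* restore before our table vanishes */
    r.x.ax = 0x0003;                       /* back to 80x25 text mode */
    int86(0x10, &r, &r);
    return 0;
}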
 
In CGA graphics mode, the upper ASCII characters are redefinable. In fact, they're not defined to anything in the first place unless you load GRAFTABL.

Uhh yeah, *graphics mode*... But in graphics mode you are basically just plotting raw pixels anyway, since there's no hardware handling of fonts or blitting or anything. It's just a BIOS routine that does a software blit of the font in ROM.
I was talking about using custom characters in text mode, which is how most C64 graphics are done (and the Atari mentioned): dynamically switching character sets, or even drawing into the character set as if it were a bitmap.
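
For comparison, a rough sketch of that C64 trick, assuming cc65's C64 target (peekpoke.h and 6502.h are cc65 headers); parking the RAM charset at $3000 is my own choice, and a real program would keep that area clear of its own code:

/* Sketch: text-mode "graphics" on the C64 by moving the character set
   into RAM and drawing into it as if it were a bitmap. */
#include <string.h>
#include <peekpoke.h>   /* cc65 POKE/PEEK macros */
#include <6502.h>       /* cc65 SEI()/CLI() */

int main(void)
{
    unsigned char i;

    /* Copy the ROM font to RAM at $3000. The character ROM sits under
       the I/O area at $D000, so bank it in with interrupts off. */
    SEI();
    POKE(0x0001, PEEK(0x0001) & 0xFB);     /* CPU port: char ROM in */
    memcpy((void *)0x3000, (void *)0xD000, 2048);
    POKE(0x0001, PEEK(0x0001) | 0x04);     /* I/O back in */
    CLI();

    /* The low nibble of $D018 picks the char base within the VIC bank:
       $0C means "fetch character shapes from $3000". */
    POKE(0xD018, (PEEK(0xD018) & 0xF0) | 0x0C);

    /* Put a row of '@' (screen code 0) on screen so the effect shows. */
    memset((void *)0x0400, 0x00, 40);

    /* The charset is now just RAM: turn '@' into a solid block, and
       every '@' on screen changes instantly. */
    for (i = 0; i < 8; ++i)
        POKE(0x3000 + i, 0xFF);

    for (;;)
        ;                                  /* park so the screen stays up */
}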
 
The 8-bit *microcomputer* era may have ended (except for hobbyists still building them), but the 8-bit CPU era hasn't ended yet. ...

-Tor

Z80s and derivatives are still used in many DVD burners as the microcontroller. Z80s were also at the core of many alarm systems in the 1990s. But the question was about 'personal computers.'

As to the bittedness of a CPU, I have always used 'the basic word size of the system' as the definition. Data bus size is just about irrelevant, as modern PC data buses are not 64-bit (PCIe is one bit per lane, serial data; HyperTransport, used in the AMD Athlon 64 etc., is a 2- to 32-bit bus, but the Athlon 64 is a 64-bit CPU; QPI and DMI have yet different bittedness, with QPI using two 20-lane links and DMI implementable in 4 bits; etc.).

Virtually no one defines a first generation Pentium as a 64-bit machine, yet the Pentium has a 64-bit data bus.

The 8088 has a 16-bit word; it's a 16-bit CPU. The Z80 has an 8-bit word, and is an 8-bit CPU (16-bit instructions notwithstanding, and 16-bit SP and PC notwithstanding, and 4-bit ALU notwithstanding). The 68000 has a 32-bit word and is a 32-bit CPU, data bus size notwithstanding (many people even write '16/32-bit' for the original 68K and 68010, as many also write '8/16-bit' for the 8088 and 80188 chips). The PDP-8/s has a 12-bit word and is a 12-bit CPU, 1-bit serial ALU notwithstanding.

I don't see why this is so hard.

CPU bittedness has nothing to do with peripheral bittedness, either, as 32-bit PCI slots are found in 64-bit PCs, 8-bit ISA is found in 32-bit PCs (and you can get 8- and 16-bit ISA in 64-bit PCs with industrial motherboards and passive-backplane systems).

But that's all just my own opinion.....
 
The definitions were blurred by the "16-bit" NEC TurboGrafx-16 having an 8-bit 6502-based CPU (but a 16-bit graphics processor), and the "world's first 64-bit video game console", the Atari Jaguar, having the 16/32-bit 68000 as its main CPU (but a 64-bit graphics co-processor).

Early on, the Sega Dreamcast and Nintendo GameCube were claimed to be "128-bit", but no one believed that.
 
Oh, the definition was blurred long before that. What's the bit-size of an IBM 1620? 1 digit? 2 digits? 5 digits? 20,000 digits? Well, you can't do arithmetic on a single digit, since a number has to have both a sign and a field mark, so that's 2 digits. But addresses are 5 digits--and the longest quantity that a single arithmetic instruction can work with is a shade less than 20,000 digits. But then, the system is strictly memory-to-memory, so no user-addressable registers. The same observation can be made on the IBM 1401.

On machines with user-accessible register files, you can create a general rule-of-thumb by saying "what length register do most arithmetic operations use?". And then, the 68000 becomes a 32-bit machine....
 
...and this is what I've been trying to hint at. The 8-bit era did not end in 1981, as the original 5150 was scarcely better than a 5 MHz "8-bit" Z80. Remember the GI CP1600 or the National PACE--both ostensibly 16-bit CPUs from the mid-70s that offered no material benefit over their 8-bit contemporaries.

An "era" is defined not solely by hardware or software, but by what can and is done with both.
 