
Maximum bitrate for storing information on home-quality cassette tapes?

Floppy disks of the era had 250 kbps raw rates, but many also used an interleave (not the IBM, but not uncommon on CP/M machines), in which case your effective data rate is halved or worse. And the raw rate, as you note, is 5 (rev/sec) x 512 (bytes) x 9 (sectors) x 8 (bits) = 184,320 bps at 1:1 interleave, 92,160 bps at 2:1, and 61,440 bps at 3:1.
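As a quick sanity check on those numbers, here's a minimal sketch, assuming 300 RPM and a simple model where an N:1 interleave takes roughly N revolutions to read a full track:

```python
# Raw and effective data rates for a 300 RPM, 9-sector, 512-byte/sector
# double-density floppy (numbers from the post above; interleave N:1 is
# modelled as simply dividing throughput by N).

def floppy_rate_bps(rpm=300, sectors=9, bytes_per_sector=512, interleave=1):
    revs_per_sec = rpm / 60          # 5 rev/s at 300 RPM
    raw_bps = revs_per_sec * sectors * bytes_per_sector * 8
    return raw_bps / interleave      # N:1 interleave -> ~N revolutions/track

for n in (1, 2, 3):
    print(f"{n}:1 interleave -> {floppy_rate_bps(interleave=n):,.0f} bps")
# 1:1 -> 184,320 bps; 2:1 -> 92,160 bps; 3:1 -> 61,440 bps
```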

And while you had relatively fast track-to-track switching and settling times, that's for a single track change, and files are sometimes scattered across the disk, so this assumes all the data placement in sectors is optimised. If you don't have the data defragmented as much as possible, you lose out both on effective data rate and on time spent changing tracks; and if the sector you want is missed because the tracks aren't staggered to allow for track switching, that's another overhead.

Finally, you have the "number of files" issue, where the system needs to refer back to the directory and seek to track zero again.

Compared to a single 96 kbps linear stream, I imagine the tape might actually be faster in some practical cases. I picked 256K as an arbitrary number that would be more practical in the early 80s... Anything smaller than 256K, and such a blisteringly fast result from a tape is pretty good! I think it would have detracted from the value of floppy drives, slowing their uptake and affecting how quickly they evolved.

And if they leveraged a UART, that's around 10 KBytes/sec of real speed. Most 80s 8-bit CPUs could manage that with a UART delivering the bytes, or a simple buffer.
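For reference, the arithmetic behind that estimate: an async serial frame carries 8 data bits plus start and stop bits, so a quick sketch (nothing chip-specific assumed) gives the effective byte rate:

```python
# Effective byte rate of an async serial link: each byte costs start +
# data + stop bits on the wire. At 115,200 bps with 8N1 framing that is
# 11,520 bytes/s, close to the "around 10 KBytes/sec" figure above.

def uart_bytes_per_sec(baud, data_bits=8, start_bits=1, stop_bits=1):
    bits_per_frame = start_bits + data_bits + stop_bits
    return baud / bits_per_frame

print(uart_bytes_per_sec(115200))   # 11520.0
```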

They would be completely impractical though for anything requiring random access to files.

And of course, this is hypothetical: not only did 256-QAM not exist pre-1990, but we've already discussed why it probably wouldn't be possible even in the current era. Some level of multi-bit symbol encoding would be possible, though.

I am curious as to how fast a tape could reliably run. And a 5-minute tape load versus a 30-second disk load is a valid reason to upgrade. A 1-minute tape load versus a 30-second load? Not so much.
 
The Coleco Adam (at least when it works) is able to achieve 19.2 kbps from its high-speed "Digital Data Pack" cassette tapes.


That is very quick, and they too look just like normal cassettes, though they clearly work in a different way.

It's apparent that the drive to keep things cheap while disk drives were expensive was prevalent in the 80s, and so was the desire for speed.

It's a shame they aren't an external device that can be moved between systems.
 
Floppy disks...usually included an interleave as well...
And while you had relatively fast switching between tracks and settling time, that's a single track change, and sometimes the files are located in other locations....
Yes, but I'm assuming here that whoever is creating the floppy to "load stuff fast" is setting an interleave of 0, regardless of the system, has a load routine that can handle that, and is not using directory structures etc. that would slow things down, at least not for the initial 256K load you're talking about. This is no problem for someone, e.g., writing a game that's a booter, or building install diskettes, or whatever.
 
That is very quick, and they too look just like normal cassettes, though they clearly work in a different way.
The Adam tapes can't be flipped over to use the other side, so they're using a special head to record on both "sides" of the tape simultaneously, which effectively doubles the transfer rate. (The aforementioned streamer cassettes work the same way.)
 
I am curious as to how fast a tape could reliably run.
High-speed cassette duplicators in the 80s reached a peak of 80 times normal speed with reasonable fidelity. That's 150 IPS, with a 960 kHz high-frequency response (and a 3.2 kHz low-frequency cutoff). At that point, true FM centered at 480 kHz carrying 256-QAM might very well get 400 kBps, or 3.2 Mbps.
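A rough sketch of where a figure in that ballpark comes from, with my own assumptions spelled out: a ~12 kHz usable baseline bandwidth at normal 1-7/8 IPS speed that scales linearly with tape speed, and a symbol rate of about half the channel bandwidth (conservative Nyquist-style filtering):

```python
import math

# Back-of-envelope throughput for QAM over a sped-up cassette channel.
# Assumptions (mine, not measured): usable bandwidth scales linearly
# with tape speed from a ~12 kHz baseline, and the symbol rate is
# roughly half the channel bandwidth.

def qam_throughput_bps(speed_factor, qam_order=256, base_bw_hz=12_000):
    bw = base_bw_hz * speed_factor            # 80x -> 960 kHz
    symbol_rate = bw / 2                      # ~480 ksym/s
    bits_per_symbol = math.log2(qam_order)    # 256-QAM -> 8 bits/symbol
    return symbol_rate * bits_per_symbol

print(f"{qam_throughput_bps(80) / 1e6:.2f} Mbps")  # 3.84 Mbps
```

Same ballpark as the 400 kBps / 3.2 Mbps estimate above; the exact figure depends on how aggressively the channel is filtered.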

Sony made a few prosumer duplicators that ran somewhat slower; I want to say 16x was the speed of the Sony one of my client radio stations had.

In practice, the highest speed I know of for a consumer product using an audio cassette was the Fisher-Price PXL2000 "PixelVision" camcorder in 1987; I worked at Kmart at the time, and those PXL2000s were really cool, but we got a lot of returns. I had the opportunity to grab a few that were being 605ed, but passed on it. 11 minutes of rough monochrome video on a C90; 16.875 IPS is the speed I just read. That's 9 times normal speed, giving a 108 kHz high-frequency response and a 360 Hz low-frequency response; you can do a lot with FM in a 100 kHz-wide audio channel. Those PXL2000s used special heads; standard audio heads' magnetic material isn't optimized for those frequencies, so, just like with high-speed duplicators, special heads are required. Same gap; different magnetic material and electronics.
 
How much overhead? With FM? That's a good question. I haven't done the math, so I don't have a ready answer. EDIT: speed fluctuations would show up on the demodulated signal as amplitude and phase shifts in the QAM constellation; the resistance to wow and flutter would be determined by the spacing in the constellation.
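As a toy illustration of that constellation-spacing point (16-QAM rather than 256-QAM, ideal points, and a pure phase rotation standing in for flutter):

```python
import cmath
import math

# How a phase error (e.g. from flutter) corrupts a QAM constellation:
# rotate each ideal point, then re-decide by nearest constellation
# point. Outer points travel furthest under rotation, so denser
# constellations fail first. 16-QAM here to keep the example small.

levels = (-3, -1, 1, 3)
constellation = [complex(i, q) for i in levels for q in levels]

def nearest(p):
    return min(constellation, key=lambda c: abs(p - c))

def symbol_errors(phase_deg):
    """Count constellation points mis-decided after a phase rotation."""
    rot = cmath.exp(1j * math.radians(phase_deg))
    return sum(1 for c in constellation if nearest(c * rot) != c)

for deg in (0, 5, 15, 30):
    print(f"{deg:2d} deg phase error -> {symbol_errors(deg)}/16 symbols wrong")
```

With noise added on top of the rotation the errors appear at much smaller phase offsets, which is where the constellation spacing sets the flutter tolerance.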

Techniques similar to those used by the CCSDS spacecraft protocols that are resistant to Doppler effects could be employed, but I again haven't done a detailed analysis to determine just how much overhead would be needed.

EDIT: there would need to be some measurements and tests made on real equipment before a really detailed analysis could be done. I'd love to see it done, or to get paid to do it, for that matter.
 
Yes, but I'm assuming here that whoever is creating the floppy to "load stuff fast" is setting an interleave of 0, regardless of the system, has a load routine that can handle that, and is not using directory structures etc. that would slow things down, at least not for the initial 256K load you're talking about. This is no problem for someone, e.g., writing a game that's a booter, or building install diskettes, or whatever.

Isn't an interleave of zero a single sector per track? Or multiple repeating (redundant) sectors?

But yes, for a lot of reasons tapes can't compete with floppy disks. Still, if tapes were a LOT faster, then in many cases users would perceive them as faster, due to the way tapes just keep streaming data in.

One of the key elements is feedback. They used to put counters in some loaders so you could see how much there was to go, which made a long load feel faster. Likewise on long floppy loads, I like to gently touch the disk with my finger and then I can hear the floppy drive working and stepping through bone conduction even if the computer is too noisy to hear it otherwise. The feedback makes it feel like the disk is working faster even though it's working just the same :)
 
Techniques similar to those used by the CCSDS spacecraft protocols that are resistant to Doppler effects could be employed, but I again haven't done a detailed analysis to determine just how much overhead would be needed.

If the change in tape velocity is relatively consistent, it's easy to address. If it changes slowly enough, it's also consistent enough to compensate for. After all, you don't get a more inconsistent read than with the old swipe-style credit card readers. They resync on every bit.
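Those swipe readers use F2F (Aiken biphase) coding: a "1" has an extra flux transition mid-cell, a "0" doesn't, and the decoder re-estimates the bit-cell length on every bit. A hedged sketch of the idea, with the interval threshold and smoothing factor being arbitrary choices of mine:

```python
# F2F / Aiken-biphase decoding, the self-clocking scheme behind swipe
# card readers: the decoder tracks the bit-cell length bit by bit, so a
# slowly changing tape/swipe speed is absorbed automatically.
# Input: times between flux transitions, in arbitrary units.

def decode_f2f(intervals, cell_estimate):
    bits = []
    i = 0
    while i < len(intervals):
        if intervals[i] < 0.75 * cell_estimate:   # half-cell gap: a "1"
            bits.append(1)
            cell_estimate = 0.9 * cell_estimate + 0.1 * 2 * intervals[i]
            i += 2                                # consume both half-cells
        else:                                     # full-cell gap: a "0"
            bits.append(0)
            cell_estimate = 0.9 * cell_estimate + 0.1 * intervals[i]
            i += 1
    return bits

# A stream that slowly speeds up, as a swiped card does:
intervals = [100, 50, 50, 95, 45, 45, 90, 85, 40, 40]
print(decode_f2f(intervals, 100))   # [0, 1, 0, 1, 0, 0, 1]
```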

The problem with cassettes at the cheaper end is that something as simple as a kink in the drive belt will cause a periodic speed change. It's fairly slow in terms of bit-to-bit transitions, though, so as long as some kind of bit index exists it should be possible to do something.

You keep using terms I've never encountered, too - so I'm reading to catch up. I've never heard of a QAM constellation before. My understanding of it is pretty rudimentary, from reading about early Wi-Fi technology about 30 years ago.

:-)

I know, but that level of work isn't free. The lab-grade equipment gets powered up for that, and would then need a system to test with, and a sample encoded data stream. Doable, but not free.

It's not a crazy idea, it's just anachronistic now since all that research lacks relevance in the modern world with flash memory. I'm just heading down this rabbit hole for fun :)

But back in the 80s and even the early 90s, I imagine this work already got done more than once, and the results were lost to time.
 
The problem with cassettes at the cheaper end is that something as simple as a kink in the drive belt will cause a periodic speed change. It's fairly slow in terms of bit-to-bit transitions, though, so as long as some kind of bit index exists it should be possible to do something.

It's definitely an interesting problem. Periodic speed changes... I still remember trying to use a square section rubber band as a belt in my old Sears walkman-style cassette recorder. That did not work well, unless gurgling audio is your thing. Special ordered a replacement at the local Radio Shack and fixed it, for a while.

You keep using terms I've never encountered too - so I'm reading to catch up. I've never heard of a QAM constellation before.

A good page about QAM is https://www.electrical4u.com/quadrature-amplitude-modulation-qam/

I work in a couple of niche fields full time: radio astronomy and space communications, and they both have a different set of jargon, for sure. I'm always reading to catch up; thankfully I enjoy reading!
But back in the 80s and even the early 90s, I imagine this work already got done more than once, and the results were lost to time.
The Pixelvision patent (https://image-ppubs.uspto.gov/dirsearch-public/print/downloadPdf/5010419 ) might be interesting to you. I would love to see the research behind it, but those results are highly likely to have been lost to time.

Now, more to Al's question about how much overhead for error correction might be needed. To answer that, I would need to get an idea of what the bit error rates look like at various data rates and under various encodings. It would be a fun task. If you'll indulge me for a minute, here's how I would approach trying to find the answer to Al's question empirically.

As far as test equipment goes, I put a bit of thought into how to do this with hardware that's more easily accessible than some of the lab equipment I have at $dayjob. The hardest part is a signal generator with sufficient linearity that can cover the low-frequency bandwidth required. Doing this at VHF and higher frequencies is easy: get a PlutoSDR, write some GNURadio flow graphs, and call it a day; there are GNURadio modules for various modulators and demodulators, and you can experiment to your heart's content for relatively low money. There are books galore; No Starch Press just put out a new one called Practical SDR that is a must-read if you're into this sort of thing.

But you're talking baseband, DC to a few hundred kHz. That's much harder, and there isn't a lot of cheap hardware out there that covers it on the generator side (plenty of LF and HF receivers, though). A high-rate 192 kHz audio interface is one possibility. Not ideal, but usable, depending upon how accurate the results need to be. I'd recommend a Focusrite Scarlett 4th-gen interface for 192 kHz 24-bit goodness if this route is chosen.

National Instruments has a broad selection of very nice and rather expensive DAQ hardware that includes signal generation that can go to DC. Well supported in software, too. But did I mention it's expensive?

To get something a little more accessible, I would probably go back a few years in hardware design and pull out a first-generation Ettus Research USRP with the LFTX and LFRX daughterboards, which can be had second-hand for fairly low money if you look. Software support isn't as good as it used to be, but they can be made to work. Plus, I already have some of those available to me, so that's my DAC/ADC pair of choice.

The USRP gives me highly linear 14-bit generation and 12-bit reception, sufficient for the dynamic range of audio cassette, the LFTX/LFRX pair are linear from DC to 30MHz, and an 8MHz bandwidth can be managed over its USB 2.0 interface, even though the DACs clock at 128Msps and the ADCs at 64Msps.

The linearity of the signal generator would need to be measured first, then the receiver's characteristics can be measured based on the known characteristics of the generator. My spectrum analyzers at work are built for RF testing; none of them go below 100kHz, so I'll have to do it the old fashioned way, with an RMS signal meter. I have one of those that goes to 500kHz. Characterizing the signal generator is likely the step that will take the longest.

Then it's time to generate/encode signals with various modulations, record on the equipment under test, and then playback into the demodulator, upping the data rate and measuring the error rates. The answer to Al's question will show in the BER data.

This would be a really cool project to do....

EDIT: Why so far oversampled? To reduce or eliminate the contribution of the quantization noise of the test equipment in the result.
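The arithmetic behind that: the ideal quantization SNR of an N-bit converter is about 6.02N + 1.76 dB, and band-limiting to the signal bandwidth adds a processing gain of 10·log10(fs / 2·BW) dB. A small sketch with the USRP's 12-bit, 64 Msps receive side and an assumed 500 kHz signal bandwidth:

```python
import math

# Why oversampling helps: quantization noise is spread uniformly over
# DC..fs/2, so filtering down to the signal bandwidth removes most of
# it, adding 10*log10(fs / (2*BW)) dB of processing gain on top of the
# ideal 6.02N + 1.76 dB.

def quantization_snr_db(bits, fs_hz, bw_hz):
    ideal = 6.02 * bits + 1.76
    gain = 10 * math.log10(fs_hz / (2 * bw_hz))
    return ideal + gain

# 12-bit ADC at 64 Msps, measuring a 500 kHz-wide cassette signal:
print(f"{quantization_snr_db(12, 64e6, 500e3):.1f} dB")  # 92.1 dB
```

Well beyond the dynamic range of the cassette itself, which is the point: the test equipment's quantization noise then barely contributes to the measured BER.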
 
Isn't an interleave of zero a single sector per track? or multiple repeating (redundant) sectors?
Yes, sorry. I meant an interleave of 1.

But yes, for a lot of reasons tapes can't compete with Floppy Disks, but if tapes were a LOT faster, then in many cases the user would perceive them as faster due to the way tapes just keep pushing in data.
My point is that, if such tape systems existed, and diskette systems had to compete with them, the diskette systems would achieve or probably exceed the speed of those tape systems by simply tweaking how they do their loads.

In fact, even without such competition, they did do this; look at e.g. PC hard disk backup programs from the mid- to late-80s, which IIRC saved data at "format" speed. (I don't recall if they actually formatted the disk at the same time as they saved the data, which would be most efficient, or if they needed pre-formatted disks, but I think it was the former.)

Well, and booter games, too, I'm sure did this. After all, why would they do otherwise?
 
My point is that, if such tape systems existed, and diskette systems had to compete with them, the diskette systems would achieve or probably exceed the speed of those tape systems by simply tweaking how they do their loads.

In fact, even without such competition, they did do this; look at e.g. PC hard disk backup programs from the mid- to late-80s, which IIRC saved data at "format" speed. (I don't recall if they actually formatted the disk at the same time as they saved the data, which would be most efficient, or if they needed pre-formatted disks, but I think it was the former.)

Well, and booter games, too, I'm sure did this. After all, why would they do otherwise?

I've no doubt disks can go faster - it's simply a better technology for the purpose. I'm just noting that a lot of the disk systems of the era were far slower in practice than the technology supported.

But had tapes been as fast as, say, 115,200 bps - the practical limit of RS-232 back in the 80s - I'm not sure that home computers would have popularized disk interfaces as quickly as they did, and even the IBM would have seen a lot of tape use. It didn't, of course, but the logic behind it is entirely understandable. It was the right decision, and I think it highlights that they themselves didn't realize the market they had just opened.
 
But had tapes been as fast as, say, 115,200 bps - the practical limit of RS-232 back in the 80s - I'm not sure that home computers would have popularized disk interfaces as quickly as they did
Convenience and reliability are what justified the extra cost of disk drives to consumers, not necessarily speed. The Atari 810 and especially the Commodore 1541 disk drives are dog-slow, yet they quickly made cassettes obsolete on those computers in North America.
 
In fact, even without such competition, they did do this; look at e.g. PC hard disk backup programs from the mid- to late-80s, which IIRC saved data at "format" speed. (I don't recall if they actually formatted the disk at the same time as they saved they data, which would be most efficient, or if they needed formatted disks, but I think it was the former.)
I know the FD1793 and family (FD1771, WD1770, WD1772, WD1773, WD2793/7, etc.) use a Write Track command for formatting, where actual sector data is part of the data that is written. But the data is not transparent: there are special byte values that write gaps, IDAMs, DAMs, CRCs, etc., and those values cannot appear in the data area of a sector. There is a corresponding Read Track command that does similar but is data-transparent. So with special encoding to avoid those reserved bytes, you could stream out a track at a time.
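One way such an encoding could look: on the WD179x family in MFM mode, 0xF5-0xF7 in the Write Track stream are control codes rather than data, so a payload streamed this way has to avoid them. A hypothetical byte-stuffing sketch (the ESC value 0xF4 and the XOR trick are my invention, not anything the chip defines):

```python
# Sketch of byte-stuffing to stream arbitrary sector data through a
# WD179x Write Track command, where (in MFM mode) 0xF5-0xF7 are control
# codes, not data. ESC is an arbitrary choice of mine: reserved bytes
# (and ESC itself) are sent as ESC followed by the byte XOR 0x80.

ESC = 0xF4
RESERVED = {0xF5, 0xF6, 0xF7, ESC}

def escape(data):
    out = bytearray()
    for b in data:
        if b in RESERVED:
            out += bytes([ESC, b ^ 0x80])   # two bytes on the wire
        else:
            out.append(b)
    return bytes(out)

def unescape(data):
    out, it = bytearray(), iter(data)
    for b in it:
        out.append(next(it) ^ 0x80 if b == ESC else b)
    return bytes(out)

raw = bytes([0x00, 0xF5, 0x41, 0xF7, 0xF4])
assert unescape(escape(raw)) == raw
```

The cost is a slightly variable track length (each reserved byte doubles), which the loader would have to budget for when laying out the track.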

The most impressive display of floppy streaming (albeit read-only) I know of is George Phillips' Dr. Who intro video on the TRS-80 4P: four full floppies in thirty seconds.

But that uses the Western Digital FDC chips, which can write a track with data in the sectors, not a PC FDC. I don't know enough about the 765 FDC to know whether sector data can be written during formatting or not.
 
Convenience and reliability is what justified the extra cost of disk drives to consumers, not necessarily speed. The Atari 810 and especially the Commodore 1541 disk drives are dog-slow, yet quickly made cassettes obsolete on those computers in North America.

Tapes aren't that inconvenient for home use - it's only once you need to work with directories that it becomes an issue. They're also relatively reliable - certainly I have little trouble recovering software from old cassettes, which is not always the case with disks.

The Commodore 1541 is very slow, sure, but it was worlds faster than using a tape.

[Attached image: LoadSpeed10-12-2020.png]

The C64 tape was 37.5 bytes per second - more than ten times slower!

Had tape on a C64 been over 10 KBytes/sec, no one would ever have bought a disk drive for it. Still, disk drives did get faster as they progressed. I get the feeling this post supports my original assertion more than it disproves it ;)
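To put those rates into minutes (the ~400 bytes/s drive figure and the 38 KB program size are my round-number assumptions, not taken from the chart):

```python
# Load-time comparison for the C64 numbers above: stock tape at 37.5
# bytes/s versus a drive at ~400 bytes/s (rough 1541 figure, assumed),
# loading a 38 KB program.

def load_time_s(size_bytes, rate_bps):
    return size_bytes / rate_bps

size = 38 * 1024
print(f"tape:  {load_time_s(size, 37.5) / 60:.1f} min")
print(f"drive: {load_time_s(size, 400) / 60:.1f} min")
```

Roughly seventeen minutes versus under two, which is the "more than ten times slower" gap in concrete terms.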
 