
Self Modifying Code... Best Practice or avoid?

I'll admit to exaggerating somewhat there; it was certainly a question that was asked and, yeah, there was a market for CP/M modifications. But I really do think it had very little to do with the ultimate success (and downfall) of the platform. The TRS-80 crashed and burned by 1982 because Radio Shack was selling a barely-improved five year old computer for too much money; being able to run an even staler OS wouldn't have helped. (And didn't, the Model 4 proved that.) In some alternate universe where the Model III had, I dunno, just fixed the bugs without breaking backwards compatibility, added high-res graphics as a standard feature and *hadn't* jacked the base price up $300, maybe it would have fought things out with the Apple II a little longer. (The Apple had color, sure, but a *lot* of Apple IIs ended up being paired to mono monitors.) But probably not much.

LoL! It's usually me exaggerating :)

And I agree, it definitely wouldn't have saved it, as you note - the PC was a juggernaut.

I've come to the conclusion that looking for saviours for CP/M in the '80s is like saying "Maybe if we gave the knight two swords instead of one, the dragon wouldn't have eaten him". It's a nice sentiment, but the dragon was going to eat him even if he came in with a Davy Crockett.

Though I still think there was scope, and a realistic likelihood, that things could have changed back in the mid '80s... There were gaps in the PC's armor - EISA and VLB, then PCI, proved that.

One interesting thing to note - when a new and more powerful computer came out, no one made software for it; they all made it for the lowest common denominator. So a machine either had to be backwards compatible, or it had to sell a huge range of software (or have the potential to) to bring in software writers.

That nearly killed the PC too, until Windows started supporting games and suddenly games would get better if you had a better system. You buy a computer, buy a game, buy a better computer, and the game gets better. That was never an option on early computers - the only real upgrade was the maths coprocessor. But modern PCs and games for them just keep getting better. There are a lot of early games I love playing on a modern PC because they are so much better on the new technology.

CP/M was kind of a salve for those fears when buying a new computer. You might never use it. You might never even be able to use it if you wanted to. But if you believed it would still support a reasonable lowest common denominator, then it was OK. You had protection.

Of course, for CP/M to have had any chance, either the industry needed to change CP/M or Digital Research needed to examine what the business market needed... In fairness, even IBM didn't know the answer to that... But color graphics were clearly the first thing on their mind, as was connecting to the NTSC televisions of the era... Not something that was often talked about back in the day, though it's quite popular to discuss now.

Backwards compatibility is both a curse and a blessing. After all, for a long time, the only thing VGA cards were good for (aside from a nicer picture) was showing pictures of that girl's face with the GIF software to your friends to show how superior your graphics were... Then you loaded up your software in CGA or EGA modes if you were lucky. Only a few programs that really needed the extra graphics capability were written to use it.
 
One interesting thing to note - when a new and more powerful computer came out, no one made software for it; they all made it for the lowest common denominator. So a machine either had to be backwards compatible, or it had to sell a huge range of software (or have the potential to) to bring in software writers.

To my mind, the Model III sealed Tandy's fate because it failed on both counts: it had all sorts of severe compatibility gotchas with the Model I(*), while at the same time being hardly improved over it. (Other than adding double-density support to the disk controller, essentially everything else was just bugfixes; bugfixes that were already implemented on the Model I either by Radio Shack or third parties.)

The "least common denominator" curse is certainly a thing that happened to some companies that tried to introduce enhanced versions of older machines (witness machines like the Commodore 128), but it's also true that other platforms have been very successful in incrementally advancing a baseline forward. (The IBM PC is the poster child here, but I think the Apple II is a pretty instructive example of how a "baseline config" can evolve over time. The majority of Apple II software titles want either a 64K Apple II Plus, which is the best config an original Apple II can be upgraded to, or an Apple IIe, which happened to be almost perfectly backwards compatible with that previous baseline. It is at least *possible* to move the bar forward relatively smoothly.)

What Radio Shack needed was a "Model I Plus" or "Model Ie", but instead they tried to do what Apple did with the Apple III (a machine that's interesting because its feature set is a lot like the Apple IIe's, but its backwards compatibility with the II is severely limited and broken by comparison), except Tandy forgot to actually include the equivalent of the Apple III's enhanced features; it's *just* the broken, half-***ed compatibility mode. Other than having more disk space available there's nothing on the Model III that makes it a better/more exciting software development environment than the Model I; they *should* be able to run exactly the same software but they can't *quite*. This isn't the sort of package you come up with if you're trying to convince customers you're fully invested in the future of your product.

(* Even Radio Shack's marketing for the III leaned more in the direction of it being a "new" computer that happened to have a one-way migration path from the older machine with a few less potholes than switching to a different computer entirely. Perhaps most ironically, they even replicated the worst feature of CP/M, i.e., they made it impossible to just pull a working disk out of a Model III and use it in a Model I, or vice versa; this is something you *could* do if you ran a third-party DOS on both machines, but Tandy wouldn't carry such things in their stores. It's, I dunno, kind of like they wanted to fail.)
 
I dunno, kind of like they wanted to fail.

I doubt they wanted it to fail, but the people who would have been making the decisions probably had no idea what the market really wanted, or would pay for. Market research on computers was non-existent in that era, and even if they had it, it was a crazy period of time. New models were coming out with incredible feature improvements every year; the home market went from 2K/4K to 16K to 64K in the space of 3 to 4 years. It was the first true "bubble" period I can remember, when even if you failed, you still probably sold some computers based on the promises inexperienced customers believed. Few people knew what they really wanted, so they bought based on simple metrics and look and feel, without understanding compatibility - or perhaps while assuming the same groundswell of software support would happen again and they would have the same level of support in just a couple more years...

The PC wasn't really much different, but it was faster, and you could upgrade the whole thing piecemeal, or just buy the part you needed for compatibility. There were lots of sound cards, but if your latest one wasn't supported, well, you just went and bought another sound card, this time a Sound Blaster. Latest game you love won't run on your old Trident 8900? Dump it and get a Tseng Labs. Game's a little slow? That old 286 motherboard can be sold and a 386SX will replace it nicely. Only laptops got old - PCs became modern Methuselahs and made the grandfather's axe problem pale by comparison, because all of the old parts could be reused and sold until truly unusable.

So if I upgraded my original PC 15 times, and now my entire family has PCs containing most of the parts, which is the original PC?

If you had a Model III with a modular graphics system and standard memory maps for different peripherals, you could have made the "downgrade" to Model I compatibility.

I never bought a new hard drive for around... Hmmm... I think it was around 20 years. I don't think I ever bought a new hard drive back last century... I just kept getting better ones second-hand when people upgraded... Started with a second-hand 225 that had alignment issues... That one I eventually gave to my brother-in-law, who had to low-level format it every time it switched off. So he would keep it running all week just to play a game loaded on from floppies.

Manufacturers tried to integrate "All In One" motherboards time and time again. All of them were limited to a single cycle of use because they couldn't be upgraded. Video is integrated into motherboards now, but it's very basic video, and mainly because the chipsets support it. People who just run Windows to browse use it, but once you need to play a game, you get a proper GPU.

That is the true magic of the PC. It's not the processor - it's that it was completely modular. Everyone was talking about modular computers in 1984, as I remember, and I imagine it started long before that, but very few computers were modular to the extent that you could replace entire parts of the system. The Apple II was amazing in that respect. The Mac, not so much. Needless to say, it's one of the things I don't like about the Mac.

And it's why I respect Microsoft so much for making Windows run on more machines than even Linux supports.
 
That is the true magic of the PC.
I would say that it was the true magic of the PC. Most people buy non-upgradable technology nowadays.

And it's why I respect Microsoft so much for making Windows run on more machines than even Linux supports.
Worth noting that Microsoft did have a strong hand in making life hard for Linux.
On the other hand, Linux supports far more machines than Microsoft ever did (by most criteria; comparing a single system with a company is a bit weird).
 
Other than having more disk space available there's nothing on the Model III that makes it a better/more exciting software development environment than the Model I; they *should* be able to run exactly the same software but they can't *quite*. This isn't the sort of package you come up with if you're trying to convince customers you're fully invested in the future of your product.
Come now, come now. Think of this not as a fairly notable f*ck-up, but instead as a vast improvement in how you design a new model over the Model II, where the floppy diskettes weren't even the same size! :p
 