
A pretty good rant!

8008guy
I enjoyed this blog. Many of us came from the days where we wrote SW for resource-limited machines. Now it seems the goal is to consume every resource just because it's there. It brought to mind a posting where someone said nothing should be written in C... Really? Why, because you can't...

Enjoy

http://tonsky.me/blog/disenchantment/
 
The young 'uns don't know what "resource limited" means. The Control Data 6600 was classed as a "Supercomputer". Any guess on the CPU clock speed? 10MHz. The peripheral processors ran at 1MHz. Okay, that was 1965. How about memory-limited? Look at the memory requirement for DOS/360's "kernel" or "resident". 8KB to multitask 2 foreground jobs and one background job. If you had, say, a Model 40, full memory complement was 128KB; you could also get the basic machine with 32KB or 64KB. Disk and tape storage was similarly limited.

We could go back to many second-generation machines that were even more resource-limited.

And we sent men to the moon with computing support like this--almost 50 years ago.
 
I guess you could program the trajectory of a rocket and hit the moon using a C64 if needed. Most of the space race centered on building rockets that could escape Earth's gravity with the required payload without blowing up.
 
<shrug> I've been saying all this for years, and everyone laughs it off. I'm just stupid because I believe a 50MHz Amiga can do what I need a home computer to do. I still make good use of the HP-41, and I still know how to optimise bits of 6502 until they seemingly can't be optimised any more (in many cases they still can, especially through unconventional programming techniques).

But what does he mean by "Linux kills random processes by design"? I'm not exactly a fan of Linux. I run several computers with it, and several others with Unix. The Unix ones seem to be more reliable, but I haven't ever noticed "random processes being killed" by the OS. I think I'd notice that.
 
I've run Linux since before 1.0, and I've never seen a random process killed. I run racks of Linux servers and get uptimes of years; with my Windows laptop I get excited when I make it to two weeks! I think people just get used to the instability of Windows and think it is normal.

I don't discount the complexity and size of the data that modern PCs handle, but that should have nothing to do with the stability of a computer. Anymore it seems developers are chasing the holy grail of self-writing languages. Previously the definition of a lazy programmer was to write code once and reuse as much as possible; that's what I do. The new definition is to never have to write a line of code. Sigh...
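
For what it's worth, the blog is presumably pointing at the kernel's out-of-memory (OOM) killer: under the default overcommit settings an allocation can "succeed" without the physical memory to back it, and when the pages are actually touched and the system runs short, the kernel picks a victim process to kill. It isn't random, but from the victim's point of view it can look that way. A minimal C sketch of the scenario (an illustration only, assuming a 64-bit Linux box, not code from any post here):

Code:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    /* Ask for 64 GiB -- far more than most machines have installed. */
    size_t huge = (size_t)64 * 1024 * 1024 * 1024;

    char *p = malloc(huge);            /* may "succeed" thanks to overcommit */
    if (p == NULL) {
        puts("malloc refused up front (strict overcommit mode?)");
        return 1;
    }

    puts("malloc succeeded; now touching the pages...");
    memset(p, 0xAA, huge);             /* faulting the pages in is what can
                                          end with a process being killed   */
    puts("survived -- plenty of RAM (or swap) on this machine");
    free(p);
    return 0;
}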


 
I enjoyed this blog. Many of us came from the days where we wrote SW for resource-limited machines. Now it seems the goal is to consume every resource just because it's there. It brought to mind a posting where someone said nothing should be written in C... Really? Why, because you can't...

Enjoy

http://tonsky.me/blog/disenchantment/

As much as I enjoy writing assembly language, and I do a LOT of things in assembly, it would be hard to imagine everyone abandoning C and other high-level languages and doing EVERYTHING in assembler. It does run fast.
 
Well, if Adobe did all their apps in nothing but assembly language, we would get a new version of Photoshop once every 20 years.
 
You mean updates wouldn't be so frequent? :)

Abandoning C is not the answer. Reducing dependencies is, partly. But that's just as much work.

I'm glad we're finally seeing a reasonable amount of platform-independence. Reverting to assembly would negate that, unfortunately.
 
Assembly pretty much restricts you to a given ISA. If you wanted to move from, say, 16-bit x86 to 32- or 64-bit x86, you would pretty much have to rewrite most of the code. If you wanted to move to ARM, even worse.
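
To make that concrete, here is a small sketch (not from anyone's post; GCC/Clang extended-asm syntax is assumed): the plain C function compiles for whatever target you ask for, while the inline-assembly version has to be rewritten for every new ISA.

Code:
#include <stdint.h>

/* Portable: the compiler emits whatever the target ISA needs. */
uint32_t add_c(uint32_t a, uint32_t b)
{
    return a + b;
}

/* The same operation tied to specific ISAs: each new target means
   writing (and re-debugging) new assembly. */
uint32_t add_asm(uint32_t a, uint32_t b)
{
    uint32_t r;
#if defined(__x86_64__) || defined(__i386__)
    __asm__("addl %2, %0" : "=r"(r) : "0"(a), "r"(b));
#elif defined(__aarch64__)
    __asm__("add %w0, %w1, %w2" : "=r"(r) : "r"(a), "r"(b));
#else
#error "New ISA, new assembly -- port me"
#endif
    return r;
}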

I remember hand-timing code for CDC 6600 and STAR-100 systems. You thought a lot about instruction and register choice and ordering. The 6600 was a bit easier because of its limited instruction set. The STAR had a huge instruction set--there was no way that I could write good assembly for that thing without a reference manual open in front of me. That whole business could really put a crimp in your productivity. You not only worried about the algorithm you used, but how that algorithm was coded; i.e., the individual instructions that made up the thing.

I think it's remarkable that I can get versions of systems with millions of lines of code for various ISAs in fairly short order (i.e. not years). Had OS/2 been written in an HLL, I wonder what its future would have been.

I'll also venture the guess that proportionally, there's a lot more really awful assembly out there than there is C.
 
I'll also venture the guess that proportionally, there's a lot more really awful assembly out there than there is C.
I rather doubt that - or, at least, I'd imagine that it's only true on platforms where HLLs were not generally available or feasible for a certain class of project. It is certainly harder to write good assembly than good C, but that also means that most of the sloppiest programmers are too lazy to bother with it in the first place if they can avoid it.
 
I'm not sure why C was brought up here as the antithesis of optimization, or something. After all there are about twenty rungs up (or down?) the ladder from "write everything in C" to "write everything using specialized software stacks containing dozens of tiers of interpreted, scripted, managed, virtualized code, each with its own labyrinthine river-deltas of dependencies, comprised mostly of frameworks and libraries knitted together from all over the place". Which is what that blog post is rightfully targeting.

It's even worse when you consider that modern software development is increasingly being dominated by mobile and web development, which is where these things get really ugly. The web wasn't conceived as a framework for applications, and all the advances we've gotten up until Thee Current Year don't fundamentally make it the right tool for the job.

Writing complex ("rich") software applications for the web (and that includes mobile apps, which largely *are* web apps to varying degrees) is still like shoehorning a woolly mammoth into a rat. No matter how much grease you have, and how much power you're given to do the shoving - your best-case result is just a very bloated rat, being used for purposes that nature certainly didn't intend.

I'm not a developer by trade but I've seen all of these things in action in my last few jobs... and it's managed to convince me I wouldn't *want* to be a software developer (except as a hobby and mostly in outdated systems/environments).
 
C is essentially portable assembly, and requires the same attention to detail to write properly.

From what I know, C is not assembly at all, but C compilers tend to have assembler-like functionality (inline assembly) which is technically a separate language from C, though most C users incorporate it into their code.
 
Writing complex ("rich") software applications for the web (and that includes mobile apps, which largely *are* web apps to varying degrees) is still like shoehorning a woolly mammoth into a rat. No matter how much grease you have, and how much power you're given to do the shoving - your best-case result is just a very bloated rat, being used for purposes that nature certainly didn't intend.

Android apps, at least, don't resemble web apps in my experience, except insofar as some web apps also use Java.
 
I enjoyed this blog. Many of us came from the days where we wrote SW for resource-limited machines. Now it seems the goal is to consume every resource just because it's there.

Yeah, I think most everyone in these forums is on board with the blogger's take, at least about how no one bothers to write clean, efficient code anymore because they don't have to.

That said, there are certain programmers who are still required to pull every ounce out of the system--specifically the high-end game developers (virtual-world games such as GTA5, etc.), as well as VR/AR, self-driving cars, etc. Why on earth things like Microsoft Office and Chrome are so bloated, I do think is laziness, but there are still programmers who are being forced not to be lazy ... in certain areas, anyway.
 
My personal opinion is that a large part of this is the lack of sufficient/adequate domain-specific languages and various management's refusal to use anything other than the "one true language" of the day.

I recall one place where management was constantly stomping up and down, blathering "we're going to re-write everything in Java!" Before that, it was XML (never mind that it is not a programming language; I saw "XML" data and hacked-in, hand-built parsers shoehorned into places where they had no business because of this), and before that it was "let's put it on the web!"

A number of years back I had to take my car to the dealer's shop for a repair. I recall watching the service-center guy struggle to enter data into an awful, sluggish, web-based entry form. Recently I had to go back to them for something, and I saw what at least looked like a proper Windows desktop application running via Remote Desktop. The difference: they were zooming around, viewing and entering much more data at once, and the UI controls were snappy and seemed to do more than any web crap ever could.

I used to be a fan of Oracle Forms. That was a nice Win32 GUI client/server form and report builder and runtime with a development environment similar to Visual Basic, but PL/SQL all the way down! Then some nuclear physicist got the idea to re-write it all in Java and stuff it in a web browser. Complete slow-ass garbage, and there was no reason whatsoever to put it in a web browser. The last I checked, they were forced to eliminate the web browser part as everyone stopped supporting the Java plugin.

And these days it is all "make it run on my SellPhone!"
 
I rather doubt that - or, at least, I'd imagine that it's only true on platforms where HLLs were not generally available or feasible for a certain class of project. It is certainly harder to write good assembly than good C, but that also means that most of the sloppiest programmers are too lazy to bother with it in the first place if they can avoid it.

Hence my qualification "proportionally"... I've seen some remarkably horrible stuff done by professional programmers--even in assembly. In assembly programming, the comments are key. Any idiot who doesn't write meaningful commentary (and there are several types of both) for assembly is asking for trouble, both for himself and for others.
 
From what I know, C is not assembly at all, but C compilers tend to have assembler-like functionality (inline assembly) which is technically a separate language from C, though most C users incorporate it into their code.

From the very beginnings of C, it was considered low-level enough to do all the jobs of assembly languages. Here are a few references to that mindset, along with the opposite mindset.
https://stackoverflow.com/questions/3040276/when-did-people-first-start-thinking-c-is-portable-assembler
https://www.amazon.com/C-Programming-Introducing-Portable-Assembler/dp/1977056954

https://zenhack.net/2015/08/02/c-is-not-portable-assembly.html

And you are correct that most C compilers allow inline assembly mnemonics, but that's not quite what I was talking about.
 
If not assembly, C (not C++) is a close cousin. I base my opinion on the fact that C makes several assumptions about the hardware that are required for it to work. Of course, it's possible for any Turing-complete machine to emulate those requirements, but that's not the same thing.

Consider: C pretty much requires two's-complement binary arithmetic and logical operators. It assumes some sort of stack facility for scoping (function calls and call-local variables).

And the list goes on. FORTRAN (say, basic FORTRAN 66) is a real HLL because it really doesn't care much about the hardware. Run it on a variable-word-length decimal machine with no stack? Easy--been done many times, even from the earliest days. You could similarly run BASIC on such a machine. But I've never seen a C for an IBM 1401 or 7070 or 1620. Does one exist for a 7090?

In my humble opinion, C represented shorthand to get at a PDP-11-type hardware interface.
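
As a small illustration of that closeness (just a sketch, not anyone's code from the thread): everyday C leans directly on byte-addressed binary memory, pointer arithmetic, shifts and masks -- things a PDP-11 gives you for free and a decimal, variable-word-length machine does not.

Code:
#include <stddef.h>
#include <stdint.h>

/* Sums raw bytes and folds the result to 16 bits.  Byte-addressed
   pointers, pointer increments, shifts and masks: the whole routine
   presumes the flat binary machine model that C grew up on. */
uint16_t fold16(const uint8_t *buf, size_t len)
{
    uint32_t sum = 0;
    while (len--)
        sum += *buf++;                   /* walk memory a byte at a time */
    sum = (sum & 0xFFFF) + (sum >> 16);  /* binary mask and shift        */
    return (uint16_t)sum;
}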
 