
When does BASIC not become BASIC?

Nobody (I think) is disparaging BASIC. Like a lot of other languages, however, it's very tolerant of bad programming practices.

I simply meant in general, not here in the forums. And yes, I do agree that BASIC can have a tendency to foster bad programming practices. I believe that was because of the people who taught themselves to code in BASIC on their '80s home computers. I cringe when looking at my BASIC code from the '80s, and some of it even made it into magazines at the time. Once I started taking programming courses in other languages, though (ASM, Pascal, COBOL, Ada), proper programming practices started emerging and my BASIC code became much better.
 
I cringe when looking at my BASIC code from the '80s, and some of it even made it into magazines at the time.
I remember writing a bubble sort routine in MS-DOS BASIC that sorted close to a thousand telephone numbers. The thing worked through 4 or 5 fields; the office staff ran it on an AT 286 at quitting time, and it was still churning away in the morning when they came in. I soon learned to use the DOS-based 'Alpha 4' database with a comma-delimited routine for that. None of us in the office were programmers, although I did have some college-level assembly and BASIC behind me.
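For the curious, here is a minimal sketch (in Python rather than the original BASIC, and with made-up phone numbers, not the original data) of the approach described above, and why roughly a thousand records could take an interpreted 286 all night: bubble sort does on the order of n² comparisons, and an interpreter pays overhead on every one.

```python
import random

def bubble_sort(records):
    """Classic bubble sort: O(n^2) comparisons in the worst case.
    For ~1000 records that's up to ~500,000 comparisons, each one
    interpreted -- an overnight job on a few-MHz 286, instant today."""
    items = list(records)
    n = len(items)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:  # early exit once a pass makes no swaps
            break
    return items

# ~1000 fabricated phone-number records for illustration
phones = [f"555-{random.randint(0, 9999):04d}" for _ in range(1000)]
assert bubble_sort(phones) == sorted(phones)
```

A library sort (or the database the poster switched to) uses an O(n log n) algorithm, which is the real fix; faster hardware merely hides the problem.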
 
Running BASIC programs on a 286 would have been slow, which is why people used C (not C++) and assembly back in the day for speed. The thing is, today most CPUs have very fast cores, and quite a few of them, so I am surprised BASIC has not made a huge comeback. It probably has to do with most people using a computer like consumers and not needing to program anything, since they can find dozens of finished apps online that will do the job. But for the few who need to do some custom task, BASIC on a modern machine would be kind of fun.
 
I remember writing a bubble sort routine in MS-DOS BASIC that sorted close to a thousand telephone numbers. The thing worked through 4 or 5 fields; the office staff ran it on an AT 286 at quitting time, and it was still churning away in the morning when they came in. I soon learned to use the DOS-based 'Alpha 4' database with a comma-delimited routine for that. None of us in the office were programmers, although I did have some college-level assembly and BASIC behind me.

I remember a home computer magazine that compared quite a few different sorting routines and showed which ones were faster. I also liked a magazine that showed how to get a Timex 2068 to do 80-column text with a custom font.
 
Running BASIC programs on a 286 would have been slow, which is why people used C (not C++) and assembly back in the day for speed. The thing is, today most CPUs have very fast cores, and quite a few of them, so I am surprised BASIC has not made a huge comeback. It probably has to do with most people using a computer like consumers and not needing to program anything, since they can find dozens of finished apps online that will do the job. But for the few who need to do some custom task, BASIC on a modern machine would be kind of fun.

I hear you. What's even more astonishing is why FORTRAN hasn't come roaring back.
 
I hear you. What's even more astonishing is why FORTRAN hasn't come roaring back.

For the same reason Uknown_K pointed out ... people see no need to write their own programs any longer, which is a shame. The wonderment and enjoyment of creating a working piece of code has been lost to the ages. I partly blame Microsoft for this. All of the early home computers either used BASIC as their OS or had BASIC as an available language. When Microsoft finally dominated the PC OS world, they first hid QBASIC on the 95/98 CD-ROMs and then completely abandoned any included form of programming language after that. Not including a programming tool for users was just wrong, in my opinion.

Here is an example of what I write in QB64, the modern QuickBasic:

https://www.qb64sourcecode.com/games/WidescreenAsteroids.zip

It's a modern version of Asteroids for widescreen monitors, source code included. My own kids see my creations and are not even interested in programming. They think it's cool but in the end .. meh.
 
Don't forget FreeBASIC, either. There definitely are modern-ish dialects available for modern operating systems, but as has been noted, the set of people who want or need to write their own programs but aren't interested enough in programming as a discipline to learn more complicated but widely-accepted "standard" languages is much, much smaller today than it was in the '80s - plus, it's been displaced as a teaching and small-time utility hacking language by Python, in most cases. (Bafflingly, to me, since Python is, if anything, more complex, and also brings in grody bizarro-isms like significant whitespace.)
 
I hear you. What's even more astonishing is why FORTRAN hasn't come roaring back.

Who said it ever left? Well, it's "Fortran" today--"FORTRAN" was dropped in F90. Still the language of choice for scientific number-crunching, AFAIK.

Of course, today's Fortran doesn't look a lot like the stuff you learned in school; Fortran 2018 is the current standard.
 
The University of Cambridge still lists one. https://www.training.cam.ac.uk/course/ucs-fortran

It's a short form seminar-style course, but a course nonetheless.

Here's a full eleven week course in .ch-land (Switzerland): http://jupiter.ethz.ch/~pjt/FORTRAN/FortranClass.html

And, of course, if Cambridge has one then Oxford must surely: http://www.chem.ox.ac.uk/fortran/

Five minutes with Google and the search term "university offering fortran courses" turns up plenty more.

Fortran is still in heavy use in the science community, even with excellent tools like MATLAB, Octave, SciLab, and scipy (and friends numpy and astropy). Fortran is known, it produces repeatable results, and it's relatively simple and straightforward to write and to read. It does the job of FORmula TRANslation exceptionally well.
 
The point is, it's an old language whose use is in decline. Talking to the younger generation, you'd get the impression scientists primarily use Python; I suspect there isn't absolute truth to that. More are using C, and yes, still using GF*. I wasn't knocking it; Fortran is a simple, easy-to-understand language, like BASIC. But just because relatively few colleges offer it doesn't mean it's used to the extent it was 25+ years ago.

Intel still offers it. They used to give it away with magazines 15+ years ago (I still have my CD for Windows and Linux). Curious whether any other languages go through the levels of optimization that some F* compilers do.
 
The point is, it's an old language whose use is in decline. Talking to the younger generation, you'd get the impression scientists primarily use Python; I suspect there isn't absolute truth to that.

Python, especially in the form of Jupyter Notebooks, is definitely on the upswing. With modules like astropy, scipy, and numpy, the pip module installer, and the anaconda distribution, newer analysis pipelines are being built in Python in many, many places. I'm using Python myself to do protocol translation between the Gpredict satellite tracker and its rotator control and a radio telescope control system; Gpredict outputs azimuth and elevation over a simple socket, and the radio telescope software accepts a more complex protocol using right ascension and declination, also over a network socket but in a much weirder format (the radio telescope control software is written in Delphi). The astropy module makes the AZ-EL to RA-DEC (J2000 epoch) transformation easy, so that I can concentrate on the mechanics of the network protocols and not have to troubleshoot the frame of reference and coordinate transformations. I use the WingPro IDE for this, and it works very well indeed.
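To give a flavor of the transformation astropy handles in that pipeline, here is a stdlib-only sketch of the underlying spherical trigonometry. It is an illustration, not the poster's actual code: the function name is made up, the local sidereal time is passed in as a plain parameter, and refraction, epoch, and precession (which astropy handles properly) are ignored.

```python
import math

def altaz_to_radec(alt_deg, az_deg, lat_deg, lst_hours):
    """Convert horizontal (altitude/azimuth, azimuth measured from
    north) to equatorial (RA in hours, Dec in degrees) coordinates
    using plain spherical trigonometry for an observer at lat_deg.
    lst_hours is the local sidereal time, supplied by the caller."""
    alt, az, lat = map(math.radians, (alt_deg, az_deg, lat_deg))
    # Declination from the altitude triangle
    sin_dec = (math.sin(alt) * math.sin(lat)
               + math.cos(alt) * math.cos(lat) * math.cos(az))
    dec = math.asin(sin_dec)
    # Hour angle; clamp guards against rounding just outside [-1, 1]
    cos_ha = ((math.sin(alt) - math.sin(lat) * sin_dec)
              / (math.cos(lat) * math.cos(dec)))
    ha = math.acos(max(-1.0, min(1.0, cos_ha)))
    if math.sin(az) > 0:      # object east of the meridian: negative HA
        ha = -ha
    ra_hours = (lst_hours - math.degrees(ha) / 15.0) % 24.0
    return ra_hours, math.degrees(dec)

# Sanity check: pointing at the zenith, declination equals the latitude
ra, dec = altaz_to_radec(90.0, 0.0, 47.0, lst_hours=12.0)
assert abs(dec - 47.0) < 1e-9 and abs(ra - 12.0) < 1e-9
```

In practice you would let astropy's coordinate frames do this, precisely because the frame-of-reference details are easy to get subtly wrong by hand.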

Interactive python is very commonly used, very much like how MATLAB is used and BASIC was used, directly at an interpreter command line. The beauty of python is the nonexistent cost; have you priced MATLAB? Be sitting down when you do! Jupyter Notebooks allows a very interactive approach where you ask the modules questions directly in python, and you get immediate results as part of your notebook. It is a very powerful paradigm, much like how the interactive BASIC command prompts were revolutionary back in the late 1970's.

And with MicroPython and serious microcontrollers like the Raspberry Pi RP2040 (as used on the $4 Pico), even the Arduino-modified Processing language feels like slow development; connect via USB, pop a REPL command line, and work with the microcontroller interactively. (https://docs.micropython.org/en/latest/reference/repl.html)

... Curious if any other languages go through the levels of optimization to the extent that some F* compilers do.

The largest package I know of still written predominantly in Fortran is IRAF. IRAF is the gold standard for astronomical image processing, but thanks to astropy specifically its use is in decline. But it's still the standard that scientific results are compared against. With the datasets IRAF can process, even relatively minor compiler optimizations can have a significant impact.

The point I was making is that Fortran is still the gold standard for numerical accuracy and precision, with scientifically-valid means of controlling both. The numpy module gives python similar capabilities, and the scientific community is slowly proving out how well it works and how it compares to Fortran.
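A tiny stdlib illustration of the kind of numerical care being talked about here (no numpy assumed): naive floating-point summation accumulates rounding error, while an error-compensated summation like `math.fsum` does not.

```python
import math

# Naive left-to-right float summation accumulates rounding error,
# because 0.1 has no exact binary representation:
naive = sum([0.1] * 10)
assert naive != 1.0          # comes out as 0.9999999999999999

# math.fsum tracks exact partial sums, avoiding the accumulated
# error -- the sort of control Fortran numerical codes rely on:
exact = math.fsum([0.1] * 10)
assert exact == 1.0
```

Scaled up to millions of data points, choices like summation order and compensated algorithms are exactly where "numerical accuracy with scientifically-valid means of controlling it" gets decided.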
 
The point is, it's an old language whose use is in decline. Talking to the younger generation, you'd get the impression scientists primarily use Python; I suspect there isn't absolute truth to that. More are using C, and yes, still using GF*. I wasn't knocking it; Fortran is a simple, easy-to-understand language, like BASIC. But just because relatively few colleges offer it doesn't mean it's used to the extent it was 25+ years ago.

Easy to understand? Have you even looked at F2005? It's not your daddy's language. In fact, I think it's fair to say that it mostly occupies a spot in the heavy number-crunching trade.

You're not going to get Python to do that.
 
I find modern Fortran easy enough to read, even with the addition of C-inspired structures. It isn't like the languages that saw APL's use of special symbols and bolted on their own set just to make the language more opaque.
 
I was an alternate for my company at the X3J3 F90 confabulation; my area was vector extensions. At times, I thought there was going to be a knock-down, drag-out genuine donnybrook erupting. Both DEC and IBM threatened to leave unless X3J3 certified their way of doing things. This was supposed to be F88, but took two years longer to get everyone equally unhappy.

There was a lot of griping about "This isn't the way that F77 did it." But a standard did emerge.

One thing that distinguished F90 from earlier versions was that "vendor extensions" were verboten unless explicitly enabled. Take a look at an old McCracken FORTRAN IV book in the back and note how many "unique" features there were among various vendors. It's a wonder that people wrote portable programs back then.

Of course, with FORTRAN IV, the vendor environment was quite a bit different from today, to say the least. Decimal architectures, 6- and 8-bit characters, ones' and two's complement binary... the list is long--and we're not counting OS-specific stuff.

Remember that FORTRAN originated with diskless vacuum-tube systems--and there was even a dialect (FORTRANSIT) for the IBM 650. It's remarkable that the language survives today. I recall running the FORTRAN II card version compiler on the 1620--you read the compiler binary deck first, then the source program, which punched an intermediate deck, then the second pass deck and the library deck and you got another deck punched that you could actually load and run. It was laborious enough that you did a lot of desk-checking before pushing the LOAD button...
 
I'm enjoying reading this thread. I have very fond memories of BASIC since it was, as for many other people, my first contact with computer programming. The first time I ever wrote a line in BASIC was around 1986. I didn't have a computer at home yet, but one of my cousins had an Amstrad CPC 6128. Well, most of the time we used that computer to play games, but on some rare occasions we read the book that came with it and tried some of the example programs in BASIC.

But the great moment came three years after that: at my new school, they had a computer room and they offered extracurricular classes. The computers were some generic XT clones with green or amber monitors. Right there we learned to write programs, using GWBASIC. It was so amazing. We also learned that there existed other languages, more complicated and advanced than BASIC, and that a very complex yet powerful thing named assembler existed, but we never touched any programming language other than BASIC there.

Soon after, the first computer came to my home: an 8086 with a hard drive and 640 KB of RAM. I continued toying with GWBASIC, but I felt it came up short for what I most wanted to do: programming video games! It was slow, it had quite limited graphics capabilities, and the GOTO/GOSUB way of programming made programs a mess. I also wanted to make an executable of my programs. I heard somewhere that BASIC compilers existed, so via some contacts I got, on diskettes, of course, QuickBasic 4.0 and, soon after, Turbo Basic 1.0. The leap from that line-numbered spaghetti code to the structured programming that both QB and TB offered was almost mind-blowing. But still I felt BASIC couldn't do the games I was playing. So, researching, I bought a book on graphics programming and animation for the PC. Then I learned that QB and TB were rather limited for some of the very essential things needed for game programming. That way I jumped from BASIC to Turbo Pascal (3.0 was the version I could get, thanks to a teacher of my brother: no WWW yet at that time, no Internet as we know it today...). After that I started learning C and assembler.

In my opinion, the most limiting things about QB and TB are that the compiled code is huge (40 KB or more just to say Hello World...), there are no unsigned integers, no byte types and, I think most importantly, no native pointer management. Of course, even in GWBASIC you can execute custom machine code, and in QB and TB you can interface with OBJ files from assembler or other languages, but it's still too complex in comparison with Pascal or C. That's why from that moment I used BASIC only for prototyping and for creating some auxiliary tools. The immediacy of BASIC, I think, had no rival at the time.

Many years later, when DOS had been dead for quite a while, I returned to BASIC with Visual Basic 6.0. It was a very impressive piece of software that allowed you to create real Windows applications in a very natural way. I just never understood why you need so much C code just to create a darn empty window! And the Microsoft Foundation Classes made that almost even worse: even more code, more gibberish! Who chose those class names? VB had some limitations, but it was quite powerful enough for many kinds of office applications. But to me Borland C++ Builder was my favorite tool for making Windows desktop applications, as it had all that familiar C/C++ syntax and libraries plus a visual environment for RAD, just like VB. Anyway, I no longer program for Windows; it's no longer of interest to me.
 
VB.Net is actually a pretty good product - I've been using it on a daily basis since .Net 1.1 hit the street.

I've found that VB6 users that hate on VB.Net do so because it won't permit the kind of stupid shenanigans that you could get away with in VB6.

The VB.Net haters fall into two camps - the ones that do it because it's "Cool" to make stupid, sweeping statements like Dijkstra and have no concept of how a language can evolve over 55+ years. The other camp is basically no-talent language snobs. ;)

BTW, VB.Net has full support in .Net 5, including WinForms.

Another great RAD tool is Lazarus - the core language is Object Pascal and it's incredibly cross-platform. (and open source!)

g.
 
What puzzles me is the diagram of programming languages since Fortran I in 1957 on the chart below, which goes on to list all the other versions, Fortran II & IV and so on, but with BASIC (and yes, it's incorrectly spelt 'Basic'!) nothing changes until QuickBASIC. rigaux.org don't even regard BASIC as a major language when they simplify the chart down to what they consider the main languages, though they include Pascal, which people might argue shouldn't be a main language, as shown on the lower diagram.

diagram.png


diagram-light.png


Their main chart suggests Fortran IV and JOSS were the main influences when BASIC was created, though charts on other websites suggest Fortran II influenced BASIC rather than Fortran IV, and Algol 60 instead of JOSS (JOSS doesn't even get mentioned there), which I guess is where the debates come from when information is unreliable.
 