
Looking for "BASIC/Z - Native Code Compiler" software

nullvalue

I acquired an S100 system from the original owner last year, and along with it a bunch of manuals. Among them were two binders for "BASIC/Z". The owner said he actually knew the developer of the language, and told me that he developed some kind of software, and used it to run his electronics/security business for many years, on the MP/M S100 system that I now own. Unfortunately all the disks/software are long gone.

I'd really love to get my hands on this compiler. One notable thing about it is that it compiled the BASIC code directly to machine language - pretty cool!

I researched this once before and thought I found some information about it (perhaps the name was changed? the original programmer passed away and his wife continued processing orders for a while?), but now I can't find that page. I could be wrong.

EDIT: I think the language I was looking at before was called "ZBASIC", and I'm not convinced it was the same as "BASIC/Z".

So... has anyone heard of this compiler?

[Photos of the BASIC/Z binders attached]
 

Name was changed to PowerBASIC. I think I used this briefly back in the '80s, but ended up mostly on CBASIC, then C. I had a look for a copy, but was unable to find mine. I have a pallet of my old gear on the way to me, though... if I do find it, I'll post it here.
 
Ah, thank you! Mystery almost solved! But I'm guessing that when it was TurboBASIC, it was DOS/x86 only... I found PowerBASIC/TurboBASIC on WinWorld, but I'd like to find a copy of the old CP/M Z80 version (which I'm assuming was only distributed under the original name, BASIC/Z?)
 
Yes. I think I may have one on my old disks (if they are readable, and make the voyage here). I tried to find a copy in the online archives, but did not have any luck.
 
Awesome! Please let me know if it turns up. It would be great to get it into an online archive. I've searched many of the CP/M archives and haven't found it.
 
That would be a really useful addition to the community.

TurboBASIC was a very good product when I used it on my first MS-DOS system (an ACT Apricot) - so it is nice to know that it had a former life on Z80/8080 CP/M systems prior to that.

It would be nice to get that running on the Cromemco...

Dave
 
This "compile to native code" aspect is a bit of a nonstarter on old 8-bit CPUs, though. Compiling to p-code can actually result in a faster-running BASIC program.
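
The usual argument: a native-code BASIC compiler mostly emits calls into the same runtime routines (floating point, strings) an interpreter uses, so the dispatch overhead it saves is small, while the code it produces is far bigger - which matters in 64K. Here is a minimal sketch in C of the kind of dispatch loop a p-code interpreter runs; the opcodes and the expression are invented for illustration, not BASIC/Z's or anyone's actual instruction set.

#include <stdio.h>

/* Hypothetical p-code opcodes -- invented for illustration. */
enum { OP_PUSH, OP_ADD, OP_MUL, OP_PRINT, OP_HALT };

int main(void)
{
    /* p-code for: PRINT (2 + 3) * 4
       Ten bytes on a byte-coded machine; inline native Z80 code for the
       same expression would be several times larger. */
    int code[] = { OP_PUSH, 2, OP_PUSH, 3, OP_ADD,
                   OP_PUSH, 4, OP_MUL, OP_PRINT, OP_HALT };
    int stack[16], sp = 0, pc = 0;

    for (;;) {                       /* the dispatch loop */
        switch (code[pc++]) {
        case OP_PUSH:  stack[sp++] = code[pc++];         break;
        case OP_ADD:   sp--; stack[sp - 1] += stack[sp]; break;
        case OP_MUL:   sp--; stack[sp - 1] *= stack[sp]; break;
        case OP_PRINT: printf("%d\n", stack[--sp]);      break;
        case OP_HALT:  return 0;
        }
    }
}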
 
From what I remember of TurboBASIC under MS-DOS, it was pretty impressive (both size- and speed-wise)...

You wouldn't generate everything inline anyhow - you'd link to libraries of code. The trick is to include only the library code that is absolutely required by the application at hand (as any sensible linker would) - see the sketch below.
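
As a modern illustration of that selective inclusion (the file names and the bas_* routine names are made up, and this uses a Unix-style toolchain rather than anything CP/M-era): put one runtime routine per object file into a static library, and the linker copies in only the members the compiled program actually references.

/* bas_print.c -- hypothetical runtime routine backing PRINT */
#include <stdio.h>
void bas_print_int(int v) { printf("%d\n", v); }

/* bas_sqr.c -- hypothetical runtime routine backing SQR; never used below */
#include <math.h>
double bas_sqr(double x) { return sqrt(x); }

/* main.c -- roughly what "10 PRINT 42" compiles down to: no inline code
   for PRINT itself, just a call into the runtime library */
void bas_print_int(int v);
int main(void) { bas_print_int(42); return 0; }

/* Build the runtime as a static library, then link the program:
 *
 *   cc -c bas_print.c bas_sqr.c
 *   ar rcs libbasrt.a bas_print.o bas_sqr.o
 *   cc main.c -L. -lbasrt -o prog
 *
 * The linker pulls in only the archive members that resolve a reference,
 * so bas_print.o ends up in the binary and bas_sqr.o never does. */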

It would be an interesting activity (not that I have time for it though) to go back and look at the code that was actually generated.

Dave
 
I recall an informal competition conducted between our P-code STAR BASIC and Microsoft's BASCOM for the x80 platform. We didn't design the benchmark or even conduct the tests, but we beat BASCOM every time - and our numeric precision was higher, and our programs were smaller. It really paid off when we implemented a multi-user version. Clever design in the interpreter is what did it.
I also recall a much earlier competition between an aggressively optimizing FORTRAN compiler and hand-coded assembly. In many cases the FORTRAN version did better, because it was able to exploit characteristics of the hardware that the assembly author was unaware of. In fact, it became not unusual that if an assembly coder had a particularly complex bit of code, he'd code it in FORTRAN and look at the generated code for optimization hints.
 
Chuck(G), I'd like to learn more about STAR BASIC. I can't find anything online about it, only North Star Basic. Ideally an online manual would be nice. Thanks
 
It's a pretty big manual--more than I can reliably scan. If someone wants to do the work and post the result online, please contact me. It's about an inch and a quarter of two-sided letter-sized pages.
You can find mentions of it (this was all pre-Web), for example,
here and here

This was by no means a hobbyist system--it was intended for small business use and featured the entire suite of MCBA small-business applications as well as spreadsheet and word processing. We squeezed a lot out of a 3.5 MHz 8085...

The BASIC was my brainchild--I still have a couple of design documents. I also did the ISAM implementation for the operating system, as well as the floppy driver and a few other things. It was fun.
 
Thanks Chuck. I see - a proprietary system, and an impressive one for 1980 by the looks of it. Thanks for the links.
 
"You can find mentions of it (this was all pre-Web), ..."

Man, there's a world of things hinted at in that phrase. The "post-web" generation, and even some folks born before the advent of the web, often assume that the sum total of all human knowledge is stored out there somewhere, accessible to anyone with enough Google-fu. If I pull out old hardcopy of stock market data and challenge them to find it online, they discover that what is stored is often incomplete and/or incorrect. Their fallback position is often that the information is too old to be useful. But when I show them studies pointing out that today's information generators create new data faster than they can acquire the storage hardware to hold it - let alone the time to store or retrieve it - their eyes start to glaze over.

Just consider for a moment the volume of printed material that was not generated digitally, has never been scanned, and likely never will be before it is discarded or rots away.
 
Just read through the product announcements in that CW (what's that?) issue from the first link. How much of this stuff does any under-60 "computer literate" person know about? It's mostly vanished from human consciousness.
The multi-user word processing program that's mentioned in the article, StarText, FWIW, is a WYSIWYG editor written entirely in Star BASIC...

In a way, looking back at the old stuff, you have to wonder why modern software today can't seem to get by without hardware capabilities not even dreamed about 40 years ago. It could be a rule of nature that as software progresses, it's less efficient in its use of hardware. I don't know.
 

Yeah, and it's all cyclical too. I started much later, in the mid '90s, as a Windows software developer. With the push to cloud applications, we lost so much of the functionality of a native app. Only in maybe the past 3-4 years have we finally reached the point where we can offer the end user matching functionality. And even then, what would have taken me 5 minutes to develop in, say, Visual Basic requires 5 different JavaScript/CSS libraries plus backend server-side code, and takes much longer to develop/test/deploy.
 
"In a way, looking back at the old stuff, you have to wonder why modern software today can't seem to get by without hardware capabilities not even dreamed about 40 years ago. It could be a rule of nature that as software progresses, it's less efficient in its use of hardware. I don't know."

"In za ole days" developers worked miracles given what modern folks would consider absurdly minimal hardware because they simply had no choice.

My first RAM expansion, nine 4116s, cost $90.00 US. And those were "1980 dollars." Whether personal or corporate, you ran what you could on the hardware you could budget for, and new products had to fit into those constraints or they simply didn't sell.
 
You got away cheaply--you should have seen RAM costs in the 1970s--or core in the 1960s. :) Now we measure RAM in GB--amazing. As a point of reference, the first "supercomputer", the CDC 6600 circa 1964, was clocked at a blistering 10 MHz.
 
This was also in the days when hardware was much more 'intelligent' than it is today (the balance has since shifted in favour of software).

On our computer control systems we have digital and analogue input systems that automatically scan and transfer the data values directly into memory (using what is effectively DMA). Ditto for the disk subsystems. The software specifies the drive, track, sector and transfer address and the hardware gets on and does it. Ditto for the graphics units. The software specifies a descriptor list of graphics primitives and data values and the hardware gets on and 'draws the pictures'.
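
A rough sketch of that "fill in a descriptor, let the hardware do the work" model (the struct layouts and field names below are invented for illustration, not taken from any real controller):

#include <stdint.h>
#include <stdio.h>

/* Hypothetical disk-transfer descriptor: software fills in the request,
   the controller DMAs the data on its own. */
struct disk_request {
    uint8_t           drive;
    uint16_t          track;
    uint8_t           sector;
    void             *buffer;  /* where the controller should place the data */
    volatile uint8_t  done;    /* set by the controller when the transfer ends */
};

/* Hypothetical graphics primitive for a display list that a graphics
   unit walks on its own, as with the descriptor lists mentioned above. */
struct gfx_primitive {
    uint8_t  op;               /* e.g. 0 = line, 1 = fill, 255 = end of list */
    uint16_t x0, y0, x1, y1;
};

/* The CPU's whole job: build the descriptor and kick the hardware.
   A real driver would now write the descriptor's address to a controller
   register and get on with other work until req->done goes non-zero. */
static void start_read(struct disk_request *req, uint8_t drive,
                       uint16_t track, uint8_t sector, void *buf)
{
    req->drive  = drive;
    req->track  = track;
    req->sector = sector;
    req->buffer = buf;
    req->done   = 0;
}

int main(void)
{
    static uint8_t buf[512];
    struct disk_request req;

    start_read(&req, 0, 5, 3, buf);
    printf("queued: drive %u, track %u, sector %u\n",
           (unsigned)req.drive, (unsigned)req.track, (unsigned)req.sector);
    return 0;
}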

This leaves the software to do what it is good for - coordinating all of the hardware (the Operating System) and running the desired application software.

There is (in principle) no reason why the same can't be implemented today (and of course it was, in the days of the Commodore Amiga etc., where dedicated chips were used).

Software 'bloat' these days is just (I am afraid) a fact of life (with the odd exception - some of which are identified above), until the large corporations come 'full circle' and are forced into looking at alternative solutions...

It is also a symptom of the 'be all things to everyone' computer. Our computers are dedicated to controlling power stations! Although the off-line systems can be used to produce Christmas-tree pictures etc.!

Dave
 