My journey into personal computer software development in 1983

166 points
13 days ago
by saloama

Comments


dang

Gosh, it's telling that as early as 1983 (!) the inventors of the spreadsheet thought that spreadsheets were 'done' and they needed to move to more important things. This is like Rickenbacker in 1938 deciding that electric guitars were 'done' and moving to, I don't know, Theremins or something.

13 days ago

jetti

I think the attitude is more the result of lack of competition. All of this was from before my time but doing a quick check on Wikipedia shows that VisiCalc came out in 1979 where 1-2-3 came out in 1983. I’m not familiar enough with the spreadsheet software landscape from the early 1980s but given that the article mentions 1-2-3 specifically I am going to assume there wasn’t really any real competition for at least 4 years. It can be easy to think that the future is in a new product since they seemingly had little to no competition in their original market for 4 years

13 days ago

whartung

1-2-3 was 3rd generation. VisiCalc, while groundbreaking, was quite crude. Its fundamental utility outweighed its lack of sophistication. It was quite 1.0, and once the cat was out of the bag, it was hardly secret tech.

There was at least SuperCalc, and Multiplan from MS. But those were really still from the 8-bit world. Lotus was able to start again from scratch with the large memory potential of the PC.

MS did a version of Multiplan for the Macintosh. It was amazing! Mice and spreadsheets were a match made in heaven. But it was just a precursor to Excel, and did not last long.

13 days ago

[deleted]
13 days ago

ralphc

These comments are burying the lede, or burying the (lede).

The author said that VisiCalc was written in Lisp and I said "wtf?"

Then I reread it and said "seriously, wtaf?"

I've never heard that before. Was the original Apple ][ version written in Lisp?

13 days ago

dang

I wish! But no. It was written in 6502 assembly language:

https://rmf.vc/implementingvisicalc

Edit: Oh - the OP is talking about a later version of VisiCalc. Bricklin and Frankston were both MIT CS grads, so it would make sense. But at what point did VisiCalc get ported to Lisp? You're right—this is burying the lede!

13 days ago

rozzie

At Software Arts I wrote or worked on the IL interpreter for the TRS 80 Model III, the DEC Rainbow, the Vector Graphic, the beginnings of the Apple Lisa port, as well as the IBM PC port. To put you into the state of mind at the time,

- in the pre-PC era, the microcomputer ecosystem was extremely fragmented in terms of architectures, CPUs, and OS's. 6502, Z80, 68K, Z8000, 8088. DOS, CP/M, CP/M-86, etc. Our publisher (Personal Software) wanted as much breadth of coverage as possible, as you might imagine.

- one strong positive benefit of porting from 6502 assembly to IL and using an interpreter was that it enabled the core code to remain the same while leaving the complex work of paging and/or memory mapping to the interpreter, enabling access to 'extended memory' without touching or needing to re-test the core VisiCalc code. Same goes for display architectures, printer support, file system I/O, etc.

- another strong benefit was the fact that, as the author alludes to, the company was trying to transition to being more than a one-hit wonder by creating a symbolic equation solver app - TK!Solver - that shared the interpreter.

Of course, the unavoidable result is that the interpreter - without modern affordances such as JIT compilation - was far less snappy than native code. We optimized the hell out of it and it wasn't unusable, but it did feel laggy.

Fast forward to when I left SoftArts and went across the street to work for my friend Jon Sachs who had just co-founded Lotus with Mitch Kapor. Mitch & Jon bet 100% that the PC would reset the ecosystem, and that the diversity of microcomputers would vanish.

Jon single-handedly wrote 1-2-3 in hand-tuned assembly language. Yes, 1-2-3 was all about creating a killer app out of 1.spreadsheet+2.graphics+3.database. That was all Mitch. But, equally, a killer aspect of 1-2-3 was SPEED. It was mind-blowing. And this was all Jon. Jon's philosophy was that there is no 'killer feature' that was more important than speed.

When things are moving fast and the industry is taking shape, you make the best decisions you can given hunches about the opportunities you spot, and the lay of the technical and market landscape at that moment. You need to make many key technical and business decisions in almost an instant, and in many ways that determines your fate.

Even in retrospect, I think the IL port was the right decision by Dan & Bob given the microcomputing ecosystem at the time. But obviously Mitch & Jon also made the right decision for their own time - just a matter of months later. All of them changed the world.

12 days ago

ralphc

What versions of Visicalc can we find in the wild that would have used the IL interpreter?

12 days ago

dang

Thank you!—that fills out the story very nicely.

12 days ago

gilbetron

"Ceruzzi: For legal reasons?

Bricklin: Both. We couldn’t afford to spend money on time-sharing. We could buy our own machine. In those days, development – a lot of people did development, as Bob explained, on another machine. And then from that other machine, you loaded to the micro because the development systems on the micros weren’t up to it. That’s how Microsoft Basic was done – that’s how Microsoft ended up having a PDP-10 I think. They eventually used XENIX to do their development. We did the same thing. We developed all our own tools over the years. We improved the tools. We wrote our own implementation language, a higher level language. In fact, that’s one of the issues that eventually came in, is that in the early days of the PC industry, there were so many different machines and you didn’t know which was a winner. You sold to each manufacturer. So you had to port all over the place. That’s what Digital Research was – a porting company. Microsoft was a porting company. That’s what we did. We had to figure out ways to port the same product and cookie cutter it out. And everybody went a little different and you had to fight with them. Otherwise, the costs would go up. So we eventually moved things to a higher level language. We did our version 2 of VisiCalc, in a higher level language.

Ceruzzi: What language?

Bricklin: We wrote it in something we called IL, which is a Lisp derivative. It was like writing in Java or something like that today. An interpreter. Microsoft had a similar type of thing for Multiplan. They wrote in a language which let them use a cookie cutter to put it on many different machines. But the Apple II version of the VisiCalc II (VAV) was written in assembly code. We realized that to port that was going to be so expensive. When we ported from the Apple II and Apple III, doing the IIe was next, then to port to the IBM PC it’s a different code base. The way we did the VisiCalc code base is, since we had our own tools, we hired Seth Steinberg, who had worked at the [MIT] Architectural Machine Group – Media Lab, real experience, real bright guy, helped bring a culture into our company. Free Cokes came from him. He ordered it and all that stuff. Lotus and others all copied this. It helped bring that type of environment from MIT into our company and spread hopefully to others. What Seth did is he said, “I’m going to do an idiomatic translation, basically, from the 6502 code to a Z80 code to do the TRS80.” And what he did was he modified the compiler to list the two sources synced on labels. The compiler we used had macros and it had no ‘go-to’s in it. Basically it was all IF THEN ELSE and stuff. It was a macro assembler"

https://conservancy.umn.edu/bitstream/handle/11299/113026/oh...

12 days ago

lispm

the "version 2" of VisiCalc was written in IL, a "Lisp derivative"

https://conservancy.umn.edu/bitstream/handle/11299/113026/oh...

13 days ago

ralphc

I don't see it mentioned anywhere else. If someone gets a copy of VisiCalc and looks at it in a hex editor, are there ways to tell that it's written in a Lisp?

13 days ago

whartung

Yes. Apply common sense. Especially considering the state of Lisp back in the day, not to mention the utter lack of computational power of a 1MHz 6502.

It’s not even worth considering the concept of them writing in some high level proto-lisp that is cross compiled into 6502 from a larger machine.

While a lot of software was cross-assembled on larger hardware for microcomputers back in the day, almost nothing of note was written in a high-level language. (To wit, someone will mention things like the Canon Cat being written in a Forth dialect, which is why I said "almost".)

13 days ago

eichin

Or to hit more closely, in 1983 or so there was Grammatik https://en.wikipedia.org/wiki/Grammatik , a grammar-checking tool for CP/M (on Z80 so it had a little more power than the 6502) written entirely in Forth - I spent a bit of time prying an interpreter prompt out of it.

Consider that the PDP-10 was effectively around 500 kHz, and the 704 that Lisp was invented on (and gave us the CAR and CDR names) "could execute up to 12,000 floating-point additions per second".

It was a couple of years before Pascal and C really caught on for micro development, but it really wasn't the barren wasteland of raw machine code that you seem to be suggesting...

13 days ago

ralphc

Look up a couple of comments. From Dan Bricklin himself, "We wrote it in something we called IL, which is a Lisp derivative."

12 days ago

abraae

Say what you will about MS (and I said a lot when I worked at Lotus on the mainframe port of Lotus 1-2-3) but they knew what was important for success.

A spreadsheet on its own is a thing of technical beauty, but for market domination you don't want to keep pouring resources into that one product; you want a suite of complementary products.

You want to be able to embed your spreadsheet into a document, into a slide presentation. You want cutting and pasting to work sensibly between products. You want consistency in the menus and layouts.

Bill Gates understood all of this from the beginning, the same as he understood that the strength of a PC operating system is not how reliable, memory safe and performant it is, it's in how flashy it looks and how important the windows paradigm is.

13 days ago

hnlmorg

What you’re describing there better fits Microsoft’s competition than Windows itself. Microsoft don’t even follow their own UI guidelines with regards to toolbars and menus (how often have they built bespoke widgets for Office rather than using their public APIs?).

Bill Gates was a great businessman. Microsoft succeeded because Gates knew how to make deals with suppliers et al. Much has been written about the good (IBM bundling) and bad (threatening retailers who shipped PCs with alternative operating systems) already though.

However if you want to talk about the UI consistency or flashiness of computers in the 80s and 90s, then you’re better off looking at Apple Macs, Amiga, Acorn Electron, or even Atari before you look at Windows and DOS.

13 days ago

wslh

Microsoft's success cannot be explained by the old story with IBM. Obviously Bill Gates has/had amazing skills, was lucky to be born into a rich family, to be in the IBM deal, etc. All ingredients, but they don't explain Microsoft's success up to today. I think Microsoft has a record of unsuccessful projects while being successful as a business; as you say, they don't use their own UI offerings and have had zillions of them, but look at their balance sheet...

If the original IBM tale were the whole story, Xerox and other dead companies would be thriving. I recommend reading "Idea Man" [1], Paul Allen's (Microsoft cofounder) autobiography. You will realize there was an incredible Bill Gates before the IBM deal. Also check the unofficial chronicle of Bill Gates in "Hard Drive: Bill Gates and the Making of the Microsoft Empire" [2] (1993).

[1] https://en.wikipedia.org/wiki/Idea_Man

[2] https://www.amazon.com/Hard-Drive-Making-Microsoft-Empire/dp...

12 days ago

hnlmorg

Every successful company has had unsuccessful products and of course people born into money are more likely to be successful themselves.

You’re also focusing too much on IBM specifically. I didn’t say IBM made Microsoft successful, I said it was an example of a deal Gates successfully negotiated.

12 days ago

wslh

My main point is that Microsoft is a different company now, and people will probably only understand Bill Gates's genius^3 in hindsight. I think Bill Gates is in another league of intelligence. I also think Steve Jobs was in another league, different from Bill's, and with more detours.

I expected Google's founders to lead Google, but they quit. Jeff Bezos's work was also amazing, and he continued for a long time. We have Mark Zuckerberg at Meta; I don't doubt he is really smart, but the Oculus execution was completely wrong, as the business literature records. I remember when Microsoft launched the Xbox there were a lot of concerns about attracting AAA games the way Sony or Nintendo did. They did it. Mark launched and maintained the Oculus without a set of apps to play with. Basic mistake: you can show the device to your father and he will be amazed, then forget about it by the next time.

12 days ago

hnlmorg

I really don’t understand how your point relates to my original comment.

Also I’m old, so have far more experience with early Microsoft than most people. Some of the software I’ve written is probably older than a lot of people who chat on here.

Edit: you've added a whole bunch more to your post. It's now sounding like you took issue with my saying Gates was a great businessman without acknowledging that there are other great business people. A weird thing to correct someone on, given that the mere existence of other companies proves there have been other great business people. But yeah, Gates isn't the only one. However, we were talking about Microsoft's success, not Facebook/Meta nor any other company.

12 days ago

wslh

I am following the thread that initiates with a parent comment. Please let me know where I am wrong or unfocused on the topic and we can continue from there.

12 days ago

canucker2016

"how often have they built bespoke widgets for Office rather than using their public APIs"

I don't know. How many?

UI/UX development is a two-way street. The OS includes a basic set of UI controls for common scenarios. But it can't anticipate every situation.

MS Office's Ribbon is designed to show only the actions available given the current selection and document. A toolbar would show available actions, but also have the invalid actions disabled, increasing required screen space. Is there some other standard Windows control that could have worked instead of the custom Ribbon?

12 days ago

hnlmorg

I can think of 3 different occasions.

First was when the menu bar would show a bezel when you hovered over it, which didn't happen normally in Windows.

The second was when the menu bar became detachable like a tool bar. That also wasn't supported by public Windows APIs.

The third was the Ribbon. That one I can at least forgive a little more because it's a massive shift in UI/UX. But the first two instances were so subtle that most people wouldn't have noticed, which raises the question: why even waste resources reinventing their own APIs?

11 days ago

mixmastamyk

Yes, however this story is from before that, the early DOS era. No standards for UI or drivers etc.

13 days ago

smackeyacky

Please write a blog post about this - are you saying you had Lotus 1-2-3 working on a 3270 terminal and they sold it?

13 days ago

abraae

No blog posts coming, but yes, a port of Lotus 1-2-3 that worked beautifully on terminals of the day, particularly the 3279 colour terminal.

The premise was that people were already pushing the limits and building incredibly elaborate systems on spreadsheets, so why not go to the next level and leverage the huge processing power of big iron. Actually a very solid idea and I worked with a few progressive customers that bought into it.

My job was as a sales engineer, helping customers adopt it in Europe, particularly Scandinavia. It's Tuesday, so it must be Copenhagen. What a great job and a super team at Lotus at the time.

13 days ago

smackeyacky

Now that is awesome. I worked at a place that had the Sun version of 1-2-3 running for a guy who had blown the memory limit of the MS-DOS version during that awkward period when Windows still wasn't quite a thing. That spreadsheet gave us nightmares.

13 days ago

electroly

This product was publicly known; it was called Lotus "1-2-3/M" and you can find a little information (not a lot) by googling that name. The Wikipedia article for 1-2-3 also cites a few scanned articles from the time, although sadly some of those citations are broken.

13 days ago

kragen

a lot of good indian or indian-american programmers signed on at microsoft in the early 80s and didn't get whispering campaigns started about them in management and then didn't quit. retaining adept hackers (without strong morals, at least) was and is one of microsoft's strong points. imagine where visicorp could be if they'd been able to retain people like this

13 days ago

jnaina

I remember being on a call, while at CSC, to discuss moving one of our projects to our team at our Indian subsidiary. One of the folks on the call, in his Texan drawl, coolly mentioned, "What next? Train dogs and cats on software development?"

Oh the joys of being brown in the Tech Industry.

12 days ago

kragen

ugh

not to limit this to texans (racism is pretty widespread, especially in the usa) but the especial intensity of racism was one of the things my mother called out as a primary reason she left texas as soon as she could

was this recently

do you suppose he'd heard of sanjay ghemawat or raj reddy

11 days ago

jnaina

About 8 years ago. I doubt he would have heard of either of those gentlemen.

11 days ago

BlueTemplar

> the strength of a PC operating system is not how reliable, memory safe and performant it is

Didn't OS/2 fail because it wasn't performant enough, compared to Windows/DOS which could run on pretty shitty PCs?

13 days ago

Blackstrat

OS/2 ran circles around Windows NT. Its API was one of the best defined that I had worked with. Certainly better than anything for Windows. I worked for an IBM subsidiary in the early 90s. IBM was just never sufficiently committed to OS/2 IMO. Hence it ultimately went away. But programming on Warp was a great experience. And for a while at least, OS/2 was very well established in the banking industry.

13 days ago

BlueTemplar

But wouldn't the banking industry have been able to afford the (relatively) powerful computers to run it well?

12 days ago

Blackstrat

We were running the same hardware as the rest of the world - nothing high end. OS/2 always outperformed NT regardless of the hardware. And it didn’t need to be rebooted everyday.

12 days ago

abraae

As I remember it, OS/2 failed mainly because IBM were a day late and a dollar short with Presentation Manager, the GUI for OS/2. Customers unfavourably compared the dull character-mode appearance of OS/2 to the flashiness of Windows.

13 days ago

timbit42

OS/2 v2 and later versions had replaced Presentation Manager with Workplace Shell. It was great. Fully object-oriented. Very powerful.

12 days ago

readyplayernull

> You want to be able to embed your spreadsheet into a document, into a slide presentation.

I mean, MS made ActiveX and it was a security risk:

https://www.wired.com/1996/11/will-activex-threaten-national...

13 days ago

mtmail

If I recall being able to embed an Excel spreadsheet into a Word document was OLE https://en.wikipedia.org/wiki/Object_Linking_and_Embedding

13 days ago

tn1

It's still possible, even in places like WordPad. When it's rendered (i.e., when you're not currently editing the embedded item) it becomes a bitmap (at a usually not-great resolution), so text in your spreadsheet isn't selectable.

13 days ago

kragen

it got rebranded activex for browsers

13 days ago

cmpalmer52

My first professional software development job was circa 1982-83. I was in high school, working part time for my step-brother. He’d just got a luggable PC like the one in the picture and he paid me like $20 to write a MS-BASIC program to calculate payroll withholdings.

13 days ago

mlhpdx

Around the same time (maybe a bit earlier) I was in 7th grade and paid a few hundred dollars to port a Basic program to C. I’d never seen C before and had to buy the book, port the code on paper, then go to work with my mom on the graveyard shift to use computers in her lab to type it in and eventually compile it.

A long way from the git push to CI/CD I did a few times today…

13 days ago

zubairq

Great article about a guy who gets loads of stuff fixed in VisiCalc. I have seen it many times: when someone gets loads of productive tasks done for a company, instead of the coworkers thinking that the person is productive and encouraging them, they think that the job is "easy" and try to find a way to get rid of the productive person!

13 days ago

orangesite

There's an inverse relationship between how good folk think they are and how good they actually are.

Anyone who's spent time in musician communities will be intimately familiar with the phenomenon.

Anyone who's spent time in musician communities will also understand just how good you need to be if you're still doing it after two decades.

13 days ago

YZF

Good musicians know they're good, though. I'm not sure the comparison holds that well for software people. With music you have a lot more immediate feedback. You can record yourself and play it back. You can see how long it takes you to learn something technically complex (if you can learn it at all). With software, the outcome of decisions can sometimes only be seen years later, and there are really very few absolute metrics you can rely on for feedback. Music is a hobby for me, but in my circle I haven't seen people who thought they were amazing musicians but really were terrible.

13 days ago

ChrisMarshallNY

Isn't that pretty much the definition of the Dunning-Kruger effect?

I was actually a pretty decent bassist, way back, when mullets were en vogue, but I also knew that I wasn't good enough to stand out from the crowd.

13 days ago

SaberTail

It's a common misunderstanding of the Dunning-Kruger effect.

In actuality, there's a direct relationship between how good a person thinks they are and how good they actually are. Not inverse. The Dunning-Kruger effect is that the people at the low end of the scale tend to rate themselves as slightly better than they are, and people at the high end of the scale tend to rate themselves as slightly worse than they are. The best people know they're good, but they tend to think they're not the best. The worst people know they're bad, but they tend to think they're not the worst.
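That "direct but compressed" pattern is easy to reproduce with a toy simulation: noisy self-assessment on a clamped scale produces it on its own. Nothing below is Dunning and Kruger's actual data or methodology; it's purely an illustrative sketch.

```python
import random

random.seed(1)

# Toy model: true skill is uniform on [0, 100]; self-rating is
# true skill plus Gaussian noise, clamped back to the same scale.
people = []
for _ in range(100_000):
    skill = random.uniform(0, 100)
    rating = min(100.0, max(0.0, skill + random.gauss(0, 20)))
    people.append((skill, rating))

people.sort()                       # sort by true skill
quartile = len(people) // 4
bottom, top = people[:quartile], people[-quartile:]

def avg(xs):
    return sum(xs) / len(xs)

# The worst overestimate themselves, the best underestimate themselves,
# yet ratings still rise with skill: a direct, not inverse, relationship.
print(f"bottom quartile: skill {avg([s for s, _ in bottom]):.0f}, "
      f"self-rating {avg([r for _, r in bottom]):.0f}")
print(f"top quartile:    skill {avg([s for s, _ in top]):.0f}, "
      f"self-rating {avg([r for _, r in top]):.0f}")
```

The bottom quartile's average self-rating comes out above its average skill, the top quartile's below it, which is exactly the shape described in the comment above.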

13 days ago

richrichie

This is the best and most succinct explanation of DK.

DK is the most abused pop psych in the world.

13 days ago

smackeyacky

There is a good underlying story here about what makes a "good" programmer.

There are plenty of incredibly smart folk out there programming giant, indecipherable messes where somebody like me would write something very boring that did the job. My boring code can always be tizzied up, but you can't always fix an awesomely complex abstraction written in some niche language. What scratches your itch as a "super programmer" is often directly at odds with actually solving a problem.

I am kinda tired of inheriting those kinds of messes - the guys who write code that way inevitably leave when the bug list pile gets too high.

13 days ago

spitfire

Or even worse, they're CTO.

Ask me how I know.

Or even worse, that pile of indecipherable mess runs a good solid business.

Ask me how I know.

Edit: They were CTO, not CEO.

13 days ago

richrichie

How do you know?

13 days ago

spitfire

Well, I started a job at a small startup doing something relatively mathematically and technically complex.

I missed a few red flags.

The CTO had built a pile of indecipherable mess that runs a good solid business. And that's the rub: it was a good business. The CTO had built a steaming pile, but had it all in his head. If others couldn't wade through it, well, it was still a good business.

Being a good business, they could afford to run through bodies. I discovered my "coworkers" were all contract. Another red flag.

So as my work extended out into the rest of the system I slowed down, and was eventually moved on. After all the CTO who wrote most of it could handle it, why couldn't I?

I'll take the Lisp over 6502 assembly any day of the week.

13 days ago

richrichie

> I discovered my "coworkers" were all contract. Another red flag.

Why is this a red flag?

13 days ago

[deleted]
13 days ago

shrubble

Note that at the time many programmers were familiar with assembler, since even mainframe shops coded in assembler for some projects.

13 days ago

richrichie

>Besides my educational background, at that time Indians weren't particularly considered to be suitable software material. (Amazing how the world turns, eh?)

Has this really changed? The sheer size of the base (a billion plus each from China and India) distorts comparisons and produces massive survivorship bias. Like soccer or tennis, are there countries that produce ridiculously great programming talent per capita?

13 days ago

kragen

there are still dumb white proto-hackers who are convinced that they're better at everything than sanjay ghemawat, raj reddy, vinod khosla, and umesh vazirani, but almost all of us have at some point had indian coworkers who were much better than us by now. and it takes a special kind of racism to consider yourself smarter than all four of that list above just because you have less melanin; people that bigoted are rare indeed

13 days ago

theragra

Probably countries with strong math education, like Russia or China. Still, I'd guess the top programmers from those countries will be great, but not the average ones.

13 days ago

ongytenes

Followed the link but it was flagged as spam with the message "Document Unavailable".

9 days ago

ongytenes

What happened? Followed the link and got

Publication Not Available

The page you are attempting to access is unavailable.

10 days ago

mixmastamyk

Great story. Surprised that he had experience with C but wanted to rewrite their product in assembly instead. The timeline given is understandably not very precise, but C and even Pascal compilers were starting to become available around that time.

13 days ago

Animats

Available, yes. Good, no. Nobody had enough memory space on PC-class machines to do a good compiler.

(AutoCAD for the original MacOS was compiled on Sun machines, because the Apple compilers were so bad.)

13 days ago

seanmcdirmid

I’m pretty sure the memory/storage resources and performance available on Sun machines had something to do with it also. Since this is pre-Sparc, they would have been running the same CPU architecture (Motorola 68k).

13 days ago

SoftTalker

Were Borland Turbo-C and Turbo-Pascal not "good" compilers? I seem to recall they were pretty popular back in the early PC era.

13 days ago

mixmastamyk

Those were a decade later, on a 386+. Understandably, 1983 with an 8080 and compiler version 0.9 was probably a lot dicier.

It's hard to remember now, but the original PC was downright primitive, and most didn't have even the full 640k RAM installed!

I assumed a decent lightweight compiler existed, but sounds like not.

13 days ago

bruce511

As an anecdotal reference, in high school, 1985-1987, we used Turbo Pascal 3 on 8086 machines. 640k ram, but it was a pretty zippy compiler even then.

13 days ago

MichaelRo

Precisely. I started high school in 1992, and the only two IBM-PC compatibles in the computer class (the rest being CP/M machines) were an XT-8086 and an AT-80286. The XT had a very fast Turbo Pascal compiler; I don't recall the exact version, but it could have been 3.0. The AT machine had 5.0 or something and was significantly slower, although not so slow as to render it unusable. That lasted until someone wiped 3.0 from the XT machine and replaced it with the same 5.0. The compile time of a simple "hello world" program rose from under 1 second to at least a minute. That rendered it unusable, and the worst part was that 3.0 was completely gone, with no floppies to re-install it from. They had it solely on the hard drive (probably 20MB or so), but now it was gone.

13 days ago

tie-in

The 1-min compile time for TP5 doesn't sound right to me. I used Turbo Pascal versions from 3 to 7 back in the day, and all of them were quite fast (one-pass compiler). Turbo C++ was another matter though. The same example programs compiled drastically slower (seconds vs minutes).

13 days ago

theragra

That is one thing I really miss. TP was the de facto teaching language in the ex-USSR.

In many cases TP, or later Delphi, followed as the IDE for real apps. Pascal had its issues, but the compilation speed was insane. When most people switched to Java or C#, we lost this. The authors of both could have avoided making their languages so similar to C, but it happened.

I dunno, maybe I need to adjust and think before compiling, or switch to Go.

13 days ago

mixmastamyk

Ok, but no IBM PCs shipped with an 8086 processor. Maybe you mean an upgraded clone, or the 286, which was a significant leap.

Wikipedia says the first line of PCs maxed out at 256k of RAM. Except on expansion card. Not until the XT did it support 640k on board.

Also, importantly, I meant 8088 above! Not easy to stay accurate after all these years, right? We were reading Petzold's Code recently, and it goes on and on about the 8080, so I believe that's how it made its way into my writing.

10 days ago

tyingq

The most popular targets for VisiCalc at the beginning were 6502 and Z80 machines, and no C compiler was ever really good for those targets.

13 days ago

MikePlacid

There was a Mini C (C subset) compiler, written in Mini C, running on a Russian version of the Z80 in the early '80s. It was small and very easy to follow (I changed the code generator there to produce code for the IBM/370, and so we bootstrapped the compiler onto a Russian IBM/370 to run some games written in Mini C).

Not sure if such a thing is much better than assembly language in an industrial context: it would speed coding up, but you would need to check the results thoroughly. Only a real test could tell whether there was a net time/quality benefit.

13 days ago

eichin

There were definitely pascal compilers though (you could boot p-system on the trs-80 model I even, though you needed at least 32k of RAM.)

13 days ago

kragen

as i understand it, turbo pascal generated pretty bad code and i don't think turbo c existed yet. it isn't impossible to do an optimizing compiler in 64k data segments and half a mip but usually people used multiple tape drives to make that kind of thing feasible

13 days ago

droptablemain

Looks like the article got flagged as spam?

11 days ago

kbutler

Substack now reports "flagged as spam".

Not sure why that is, but article content is available at

https://web.archive.org/web/20240420183201/https://farrs.sub...

11 days ago

NeilSmith2048

Experienced veteran programmer, hats off to you!

13 days ago

[deleted]
13 days ago