Operating Systems: reminiscences

Speaking of DOS, anyone here use 4DOS, QEdit, PC-Write, or TSE (The Semware Editor)?

I remember QEdit very well, and I also used WordStar on CP/M. In WordStar I used Ctrl-A as a shift key, because my Apple II+ didn't have one. Maybe that's why I collect (mechanical) keyboards now. I love a nice keyboard.

On DOS I used Word, but I never liked it, and I still don't.
 
I used WordStar on my twin floppy ICL CP/M machine. I worked for ICL at the time, and they had a special offer on their PC, which, IIRC, was over £3000, but you could pay it off over, I think, three years via a salary deduction. I don’t think I did much with it except use it as a storage device for my BBC Micro (someone else at work wrote drivers to do that).
 
It's never too late to revisit the past.

DEC H-500 Computer Lab Reproduction [Instructables]

Many people reading this will be familiar with the Digital Equipment Corporation (DEC) lines of PDP machines. I would guess, though, that far fewer have encountered the H-500 Computer Lab. Launched in the late '60s, the H-500 was part of a COMPUTER LAB curriculum to introduce students and engineers to digital electronics. It's not surprising that DEC would undertake this, since more than half of its PDP machines at the time were installed in educational institutions.

The machine itself shipped with a wonderful workbook that contained a complete course in digital electronics. Together the COMPUTER LAB package was intended to accompany courses in binary arithmetic, Boolean algebra, digital logic or computer technology. While not a true computer, the H-500 could be "wired" to perform many of the underlying operations of a true computer using a point-to-point patch cord mechanism.
 
For those who weren't there, this will explain a few things.


[image: old_days_2.png]
 
Immortalizing the half-Scottish, half-Yorkshire inventor of the poke instruction. Originally done with a poker (from the fireside set used to keep the coal fires powering the earlier systems) to bridge connections, as you will know.

Also where we get the term "burn in" for when you poke too many times on the same bit.
 
And of course, "programming on the bare metal".

And, again of course, the "bit bucket" was a real thing for spare parts. In fact we had two (hence binary bits): one for the burnt-in ones and one for new, hence also known as off and on.
 
4.2BSD on the University of York CS department's VAX-11/750 with 6 MB of RAM. I had a friend whose first-year project never bothered to free any of its heap space. You knew when the scheduler decided it was time her process got to run, because everybody else's terminal sessions froze while it swapped in her address space.

Which year was that? When I was there ('87-'89), "Minster" was an 11/780 (can't remember the rest of the details).
 
LOL, of course there was memory protection, just not on THOSE STUPID UNIX BOXES!

We had a thousand people working on insurance policies over 9600 baud modems with sub second response times. :thumbsup:

Was it Unix that didn't have memory protection, or the C programming language? I compile and run C programs on my AMD processor (which can mark parts of memory as read-only once they've been initialized) and they still crash due to memory issues. By contrast, Perl and Python, which are interpreted and have automatic garbage collection, never crash due to dangling or overwritten pointers.
 
I used WordStar on my twin floppy ICL CP/M machine. I worked for ICL at the time, and they had a special offer on their PC, which, IIRC, was over £3000, but you could pay it off over, I think, three years via a salary deduction. I don’t think I did much with it except use it as a storage device for my BBC Micro (someone else at work wrote drivers to do that).

Ugh!!! CP/M!! OMG. Multiple slot boards of Static RAM, big power consumption, running at about 50°C with big noisy fans to keep it from overheating...

...ah, those were the days!
 
Here's my story, from 10 years before an IMSAI-8080 was on my desk.

I was working in the back office of a stock brokerage as a figure clerk. We had an IBM 360, fed by an army of punch operators. I found out we had one guy whose entire job was to compute the commission on bond trades. He used a mechanical calculator, and every time a bond trade was done somewhere in the world by our traders, he would fill out a form with the correct commission.

I asked him what his procedure was. He had a very simple formula, not much more than A * B, lookup in a table, multiply by C, and there you have it. Next transaction...

Although my entire experience with computers was remembering not to spindle or mutilate my electric bill, it seemed obvious to me that a computer, even the lowly 360, could handle the job. So I went to the IT department and explained it.

Mr. IT boss thought it might be a good idea. He asked me to write down each step in detail so they could consider implementing it. He wanted something like "Take value A, multiply it by value B..." If I didn't do this, he had no interest.

I realized that I was being asked to program his IBM 360, something that highly paid "priests" were doing in an isolated environment. So I applied for a job in the IT department -- if I was going to be doing their work for them, I wanted to be paid appropriately, and I envisioned a short learning curve.

I was turned down, since I didn't have a computer science degree, an absolute prerequisite. So I quit.

A few years later, I was offered a job at a large missile-guidance company as a Senior Software Engineer. I still didn't have that Computer Science degree, but I took it.

The moral? Some people think inflexible, obsolete job requirements are more important than innovation and initiative. The brokerage went out of business years ago.
 
The moral? Some people think inflexible, obsolete job requirements are more important than innovation and initiative. The brokerage went out of business years ago.

A day or two ago I was reading a collection of such stories: a guy was turned down for a job because he didn't have ten years of experience with a technology, the very technology he himself had written six years earlier.
 