In the Beginning Was the Command Line (1999)
The analogy is definitely a bit outdated now, what with Windows 8 then 10 then 11 getting aggressively less user-friendly each year.
I've ridden in people's cars that are still displaying "agree to the terms of service"; I think a number of cars are starting to become far too much like computers.
But now that we've trained users that they'll need to click accept on the screen, we can sneak any conditions we want in there about how we collect and use their data...
Sadly, the most reliable signal American tech companies send is that they are building a surveillance state. Whether it's for the US government or just their own fiefdoms (franchulates?) seems not to concern them, but neither is particularly appealing to me as a prospective customer.
Sagacious point, with emphasis. This is how non-European web businesses look to everyone.
This bit of libel needs to be put to bed. The Pinto did not have a greater propensity to explode than other "in-class" cars and arguably had a better safety record than Beetles or Corollas of the time. Nader made himself a nice career of this libel, but it does not make it true. Of course, other cars didn't have a "memo" but that's beside the point.
The 'crime' of the Pinto was not that it was an unsafe car; it was that it could have been safer with a minimal (even by my standards, and I'm on the record as being opposed to mandatory backup cameras) increase in cost. That was why it grew the reputation: it was pure cost engineering (a.k.a. cheapness, on the same level as the ignition switch failure issue GM had in the 2000s).
Just give it a couple of years.
I don't have that rosy '50s Chevy picture; it's more like a luxury coupe with a tightly locked hood. Sleek, desirable, you pay through the nose for every upgrade, and don't attempt to fix it yourself.
... that someone occasionally decides to wrap with a shiny covering to make it look like a luxury SUV. The covering sometimes peels off when travelling on the highway.
That's a city-driver, never-gonna-see-mud truck owner thing.
It hasn't crapped out on me even once in about two years, thanks to my choice of a so-called 'rolling' gamer distro.
Looks very nice and comfy to me with KDE Plasma and its Breeze (light) style, which is "automagically" applied to apps written for other toolkits/DEs like GTK/GNOME. Everything I do (mostly just browsing, some LibreOffice, remoting into other systems) runs ultra-smooth without lag or stuttering, while almost always some music plays via YT in the background, without resorting to solutions which would pipe it via yt-dlp into mpv. That isn't necessary for me. All on obsolete systems with Kaby Lake Core i5/7t :-) The only thing which could be called special or unusual about them is that they have 32GB RAM. That may help, too. Oh, and the BIOS/UEFI/firmware is from Lenovo.
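(For the curious, that yt-dlp-into-mpv pipe usually looks something like the sketch below; the URL is just a placeholder:)

    # Hypothetical example of the yt-dlp | mpv pattern mentioned above:
    # yt-dlp streams the media to stdout, and mpv plays from stdin.
    yt-dlp -o - 'https://www.youtube.com/watch?v=EXAMPLE' | mpv -
    # mpv can also hand the URL to yt-dlp itself: mpv 'https://www.youtube.com/watch?v=EXAMPLE'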
Just don't buy crap.
Please. They resold an already existing OS created by another individual. The idea that there was some "vision" here in being an IBM contractor is a total misunderstanding of the history of the time.
https://en.wikipedia.org/wiki/Z-80_SoftCard
Imagine how different the world might be if Gates' mom didn't work at IBM.
That’s not a maybe, he’s talked about it in interviews.
I met Bill Gates a couple of times at Microsoft. He wasn't an average man who got lucky. He was/is a hard-working, extraordinarily brilliant man who got lucky.
I know the playing field is not level. We don't all have an equal chance to be a billionaire. But I do know that most of us have not reached our full potential. Most of us could be better (on whatever dimension you desire) if only we tried harder.
Imagine how different the world might be if we did.
The 8086 was out there and selling for years. AT&T ported UNIX™ to it, meaning it was the first ever microprocessor to run Unix.
But even so, DR didn't offer an 8086 OS, although it was the dominant OS vendor and people were calling for it. CP/M-86 was horribly horribly late -- it shipped after the IBM PC, it shipped about 3-4 years after the chip it was intended for.
The thing is, that's common now, but late-1970s OSes were tiny simple things.
Basically the story is that there was already an industry-standard OS. Intel shipped a newer, better, more powerful successor chip, which could run the same assembly-language code although it wasn't binary compatible. And the OS vendor sat on its hands, promising the OS was coming.
IBM comes along, wanting to buy it or license it, but DR won't deal with them. It won't agree to IBM's harsh terms. It thinks it can play hardball with Big Blue. It can't.
After waiting for a couple of years a kid at a small company selling 8086 processor boards just writes a clone of it, the hard way, directly in assembler (while CP/M was written in PL/M), using the existing filesystem of MS Disk BASIC, and puts it out there. MS snaps up a licence and sells it on to IBM. This deal is a success so MS buys the product.
IBM ships its machine, with the MS OS on it. DR complains, gets added to the deal, and a year or so later it finally ships an 8086 version of its OS, which costs more and flops.
The deal was very hard on Gary Kildall who was a brilliant man, but while MS exhibited shark-like behaviour, it was a cut-throat market, and DR needed to respond faster.
2020 (179 points, 64 comments) https://news.ycombinator.com/item?id=24998305
2019 (148 points, 50 comments) https://news.ycombinator.com/item?id=20684764
2018 (102 points, 13 comments) https://news.ycombinator.com/item?id=16843739
2016 (145 points, 55 comments) https://news.ycombinator.com/item?id=12469797
2008 (24 points, 12 comments) https://news.ycombinator.com/item?id=408226
In the Beginning Was the Command Line (1999) - https://news.ycombinator.com/item?id=41084795 - July 2024 (260 comments)
In the Beginning was the Command Line (1999) - https://news.ycombinator.com/item?id=37314225 - Aug 2023 (2 comments)
In the Beginning Was the Command Line - https://news.ycombinator.com/item?id=29373944 - Nov 2021 (4 comments)
In the Beginning was the Command Line (1999) - https://news.ycombinator.com/item?id=24998305 - Nov 2020 (64 comments)
In the beginning was the command line (1999) [pdf] - https://news.ycombinator.com/item?id=20684764 - Aug 2019 (50 comments)
In the Beginning Was the Command Line (1999) - https://news.ycombinator.com/item?id=16843739 - April 2018 (13 comments)
In the Beginning Was the Command Line (1999) - https://news.ycombinator.com/item?id=12469797 - Sept 2016 (54 comments)
In the beginning was the command line - https://news.ycombinator.com/item?id=11385647 - March 2016 (1 comment)
In the Beginning was the Command Line, by Neal Stephenson - https://news.ycombinator.com/item?id=408226 - Dec 2008 (12 comments)
In the beginning was the command line by Neil Stephenson - https://news.ycombinator.com/item?id=95912 - Jan 2008 (5 comments)
In the Beginning Was the Command Line - https://news.ycombinator.com/item?id=47566 - Aug 2007 (2 comments)
(Reposts are fine after a year or so, and in the case of perennials like this one, it's good to have a thread every once in a while so newer user cohorts learn what the classics are.)
"Now THAT is a cool operating system, and if such a thing were actually made available on the Internet (for free, of course) every hacker in the world would download it right away and then stay up all night long messing with it, spitting out universes right and left."
At the same time, there will be uncool operating systems designed for data collection, surveillance and ad services
It runs on Assembler.
s/runs on/written in/
If it's assembler, then what's the processor architecture?
What's its endianness?
Its hardware requirements are modest, even overlapping with BeOS on the low end. I have personally run Haiku beta5 on a 666 MHz Pentium III with 256MB of RAM (normally, I run BeOS on that machine, with 512MB of RAM). I'm not sure what I'm trying to say here, besides a general call to give Haiku a try on that old ThinkPad, in a VM[0], or anywhere else really.
[0] If you're using VirtualBox, don't give it more than 1 CPU; VirtualBox has a bug which makes Haiku slow with multiple CPUs.
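(If you'd rather set that from the command line than the VirtualBox GUI, a minimal sketch; the VM name "Haiku" here is just a placeholder:)

    # Hypothetical example: limit an existing VirtualBox VM to a single CPU.
    # The VM must be powered off before running modifyvm.
    VBoxManage modifyvm "Haiku" --cpus 1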
Linux won almost all of that, and now there is no competition; that's why it seems quiet.
Is this… right?
I thought some of the earliest mechanical computers (as opposed to human computers) that had much real uptake were "fire control computers," for things like naval guns. You move dials and cranks around to put the measurements in. I'd call this essentially graphical… it isn't a series of text-based commands that you issue, but a collection of intuitive UI elements, each of which is used to communicate a particular piece of data to the computer. Of course the GUI of the past was made of gauges and levers instead of pixels, but that's just an implementation detail.
I much prefer the command line to a GUI, but I think we should call it what it is: an improvement. A much more precise and repeatable way of talking to the computer, in comparison to cranking cranks and poking dials. And a general, endlessly flexible channel that can represent basically any type of information, at the cost of not necessarily being intuitive or glance-able.
The teletype system was invented shortly after 1900. It was in widespread commercial use by the 1920s, for sending text over telegraph wires.
The US government began using punch card machines to do the census in 1890. They were named Hollerith machines. Hollerith is one of the companies that later became IBM. When they entered into electronic computers, their prime market was their own customers who were already using their punch card machines for things like accounting and payroll. For backwards compatibility, they kept the format the same!
Punch cards themselves date back to the very early 1800s, when they were introduced for the Jacquard loom, with the cards providing programmable instructions for fabric design.
It is worth noting that Hollerith's was not the first attempt to repurpose punch cards for computation. That honor goes to Babbage's Analytical Engine (which admittedly was never actually completed).
Basically everything in technology has a far longer and richer history than people realize. I could go on for a while about this...
Gun fire control might be more interactive, and it predates ENIAC (which was, of course, initially used to calculate artillery trajectories). ENIAC's user interface was plugging wires from outputs to inputs, the same way telephone operators connected calls. Hardly interactive.
I don't think we get to REPL/TUI-like features until the 1960s. You've got Sutherland's Sketchpad with a CRT and light pen representing the GUI, and the LISP REPL via Teletype just before it, around 1959. (Actually, I'm trying to find an old video I saw demoing LISP or APL being used interactively by teletype; it's the earliest kind of terminal I've seen.)
Interestingly, it is also the original description of the science citation index. When this was later combined with hypertext, the result was Google's PageRank system...
Now how could someone in 1945 be that visionary about how computers could be used some day? Well you see, he'd been in computers for nearly 20 years, and had been thinking about this system off and on for around a decade, in between real jobs like being in charge of R&D for the USA during WW 2...
If you fast forward to the 1960s, everyone should watch the Mother of all Demos. That was possible in 1968. In some ways it was better integrated than what we put up with today...
In some sense, early player pianos (IIRC with holes in paper that controlled key presses) weren't computers, but were a related precursor technology / infrastructure.
The metaphors for Windows and MacOS are swapped. Windows' technical underpinnings were - from the start - way better than Apple's. Microsoft actually bothered to copy everything from XEROX PARC, albeit poorly, while Apple saw the fancy windows-and-desktop UI and ignored the object system underpinning it. This isn't me making a jab at Apple - Jobs himself said it when he was at NeXT. Windows 95 and NT also both brought memory protection to the existing Windows API. Apple had spent several years trying and failing to build a memory protected Mac OS before just giving up and buying NeXT.
The correct metaphors are:
- Someone working at the phone company secretly designs a tram (UNIX). They're actually prohibited from selling vehicles, but they license the design under the table to a bunch of universities. A bunch of tram manufacturers make trams based off the phone company design.
- A wheel factory (Microsoft) sells wheels for bicycles. Bicycle dealerships crop up everywhere using their wheels. Even the railroads (IBM) want to get in on it, and they ship a terrible bike that everyone copies because it's the railway bike.
- Phone company designed trams are really popular and every city has like five of them. Except they keep breaking down and all the control cabs are just slightly different, so it pisses off the operators. Some kids at Berkeley try to make their own standard tram design (BSD) but they get sued by the phone company and nobody uses it.
- A car dealership (XEROX) moves in. They sell SUVs (Xerox STAR). They cost $100k each, and they only sell them in huge fleets to big corporations because XEROX wants to compete with the railroad. Nobody buys them and they leave town, but not before giving a demo of their tech to the last bicycle dealer (Apple) not using the railway design.
- The bicycle dealership decides to build their own SUV (Lisa) and a moped (Mac). The SUV is a huge flop while the moped is a minor success. Their CEO gets fired by the board and starts a trucking company (NeXT).
- A homeless man that lives on public transit and thinks vehicles should be free starts working on his own tram (GNU), but he overengineers the engine (Hurd) and it doesn't work at all. Still, he's not being sued by the phone company, so people start putting his parts into their trams anyway.
- The wheel factory learned how to make a moped from selling wheels to the moped dealer. So they sell their own moped upgrade kit (Windows). It works with any bicycle, but it looks like shit, even though it has the same power as an SUV engine.
- The wheel factory also starts work on a joint venture with the local railroad to produce their own trucks (OS/2). They can't agree on anything and divorce after a few years.
- Turns out mopeds suck! They break down constantly and need an oil change every 400 miles. The moped dealer starts work on a station wagon (Copland). A prototype is produced that's about as elegant as The Homer. It is unceremoniously cancelled.
- The wheel factory also has problems with their moped kits breaking down, but since they sold a lot more of them, they're the ones getting the reputation of selling an unreliable vehicle. They decide to design a truck of their own (Windows NT) and a car made out of moped parts (Windows 95) and sell the design to all the bicycle (now car) dealers.
- The moped company is ridiculed by the car dealers and nobody buys their elegantly designed mopeds. They wind up buying the trucking company.
- Someone in Finland designs an electric motor (Linux) that happens to fit in the homeless guy's tram. People hail this as a revolution in public transport, even though cities are full of NIMBYs who tore down the tramways and put in buses that ride worse and get delayed in traffic.
[0] https://news.ycombinator.com/item?id=24998305
> Someone in Finland designs an electric motor (Linux) that happens to fit in the homeless guy's tram. People hail this as a revolution in public transport, even though cities are full of NIMBYs who tore down the tramways and put in buses that ride worse and get delayed in traffic.
It's so easy to forget just how strange the actual history seems when zoomed out a bit. It sounds so absurd that this is the story of the OS I'm writing this comment on, but yet here I am doing it!
[0] Funnily enough, James Gosling also made Java, which is why I suspect the FSF was so ready to pounce on "the Java trap".
It has what I guess are American references that are meaningless to me. What is or was The Homer? In what universe are mopeds some sort of unsuccessful trial? Much of Asia has travelled by mopeds for ~75 years now; the Honda C90 is the best-selling motor vehicle of all time, and it's not even close.
As a super-extended metaphor for computing, I don't think the timeline fits together: it has Xerox, Apple, and IBM in the wrong order, but I'd find that hard to nail down. There was overlap, obviously.
It feels to me like the big influences are squeezed in, but not the smaller ones -- possibly because they mostly aren't American and don't show up on American radar. Wirth and Pascal/Modula-2/Oberon, the Lilith and Ceres; Psion; Acorn; other Apple efforts notably the Newton and things it inspired like Palm; Symbolics and InterLisp.
Nice effort. I respect the work that went into it, but it doesn't fix Stephenson's effort -- it over-extends it until it snaps, then tapes the bits together and tries again.
It's a reference to a Simpsons episode where Homer Simpson designs a car, and it's supremely hideous: https://simpsonswiki.com/wiki/The_Homer
Ok, I also swapped out Be for NeXT, mainly because NeXT was the one that actually got bought by Apple and ultimately had a lot more influence.
Xerox, Apple, and IBM were all releasing products concurrently to one another, so I kinda just had to pick a (wrong) order and stick with it.
I wasn't trying to make a ding at mopeds, I was trying to make a ding at the classic Mac OS. I guess if you want to fix that metaphor, the classic Mac OS was like a nice moped that had a bunch of shit added onto it until it became a really unstable but nice-looking car, while Microsoft just made a real car that looks like dogwater. If that still feels too American, well, I'm sorry, but Neal started with a car metaphor, and I've already exhausted my permitted number of dings at American car-centric urban design with the Linux bit.
The Homer is a Simpsons reference. The joke is that Homer Simpson designed a car in almost the same way that managers decided what features shipped in Copland.
[0] If this was a mobile OS discussion, I'd be dropping IBM, UNIX, and XEROX from the discussion to make way for Psion, Newton, and Palm. Microsoft would be pared down to "Well around the same time they were shipping real desktop OSes they also shipped Windows CE and Windows Mobile".
But even then, I almost feel like mentioning the actual inventors of the PDA is overindulgence, because absolutely none of those companies survived the iPhone. Microsoft didn't survive iPhone. Nobody survived iPhone, except Android, and that's only because Android had enough Google money backing them to pivot to an iPhone-like design. Even flipphones run Android now (or KaiOS). It's way more stark and bleak a landscape for innovation than desktop was in 1999 when Windows was king.
[1] OK, yes, both early Mac OS and early Windows were built in Pascal, not C. But neither of those are operating systems, and normal users would not be able to tell if their software was written in one or the other unless it crashes.
Well, I mean, I can -- e.g. I loved classic MacOS. But that's a personal judgement call.
I think I've seen Homer's Car in meme format, now you come to mention it.
As a programmer, I can point out all the many, many flaws with its technical architecture. Or how Apple's managerial incompetence let Microsoft leapfrog them technologically. Or even how Microsoft eventually figured out how to give Windows its own visual identity[0].
But at the end of the day, people were buying Macs despite the company making them. Apple had built an OS that made everything else look like a copycat, by worrying about the little details that almost no one else cared about. It's the only reason Apple survived where literally every other non-Wintel PC company died. Atari STs and Amigas might be fondly remembered, but their fanbases all jumped ship for PC the moment DooM came out, and the companies in question all got sold off for peanuts.
[0] My personal opinion regarding Windows visual design:
- Windows 1.x-3.x (and also OS/2 1.x): Really clunky and piss-poor attempt at cloning the Mac. It has the "programmer art" feel all over it. 3.x is slightly better in that they actually figured out how to pick a good default color scheme, but it still doesn't even have a proper desktop, instead using the root window as minimized window storage.
- Windows 9x/NT/2000: Not only does Windows finally get a real desktop, but it also gets a unique visual design, and a good one. Hell, they actually leapfrogged Apple on this one; as Mac OS 8 would take a few more years to ship its Platinum appearance.
- Windows XP: Cheap. Toylike. Microsoft saw OSX's Aqua and realized they needed something for Whistler, but they didn't seem to know what, and this is what we got. Media Center Edition would ship a slightly less toylike Windows visual theme.
- Windows Vista / 7: The absolute pinnacle of Microsoft's visual design chops. Aero is the thing that Liquid Glass wishes it could be. The glass effects were a perfect way to show off the power of GPU compositing, and Microsoft managed to do it without sacrificing readability or usability.
- Windows 8/10/11: Flatslop.
However, the one thing I'd take issue with:
> As a programmer, I can point out all the many, many flaws with its technical architecture.
I think, since we started out on history here, we must consider the history and its context.
1. Apple does the Lisa: a cheaper Xerox Alto, minus the networking and the programming language. Multitasking, hard-disk based, new app paradigm. The Future, but at 1/4 the price of the original.
It's not cheap enough. It flops, badly.
2. Jobs repurposes the parallel information-appliance project into a cheaper Lisa. Remove the hard disk and the slots and all expansion, seal it up, floppy only, remove the fancy new app format & keep it simple: apps and documents. Smaller screen but square pixels. Keeps most of the Lisa good stuff.
It's still expensive but it's cheap enough. It sells. It gets Pagemaker. It changes the course of the industry.
But to get a GUI OS into 128kB of RAM, they had to cut it brutally.
It worked but the result is significantly crippled, and Apple spent the next decade trying to put much of that stuff back in again.
Remarkably enough, they succeeded.
By MacOS 7.6 it had networking, network-transparent symlinks, TCP/IP, a HiColour GUI, usable multitasking, virtual memory, and more. It was actually a bloody good OS.
Yes, it was very unstable, but then, remember so was DOS, so was Windows 3.
The snag is, that time was 1997, and by then MS had already surpassed Windows NT 3.x and Windows 95 with NT 4.
NT 4 had no PnP, no power management, no working 3D except vastly expensive OpenGL cards, it lost a lot of NT 3.x's stability because of the frantic desperate bodge of putting the GDI in the kernel, but it was good enough, and it made Apple look bad.
Apple was ploughing its own lonely furrow and it made a remarkably good job of it. It was just too slow.
When Jobs came back, he made a lot of good decisions.
Junk most of the models. Junk all the peripherals. Make a few models of computer and nothing else.
Junk Copland, Pink, Taligent, all that.
Meanwhile, like Win9x + NT, 2 parallel streams:
[a] Win9x parallel: salvage anything good that can be stripped out of Copland, bolt it onto MacOS 7.x, call it 8.x and kill off the clones.
[b] NT parallel: for the new project, just FFS get something out the door ASAP: Rhapsody, then Mac OS X Server. All the weird bits of NeXTstep that were to avoid Apple lawsuits (vertical menus, scrollbars on the left, no desktop icons, columnar file browser, etc.): remove them, switch 'em back to the Apple way.
Meantime, work on a snazzy facelift for the end-user version. Make the hardware colourful and see-through, and do that to the OS too.
I think, looking at the timeline and the context, all the moves make sense.
And I used MacOS 6, 7, 8 and 9. All were great. Just such a pleasure to use, and felt great. I didn't care that NT was more solid: that was a boring reliable bit of office equipment and it felt as exciting as a stapler. NT 3.51 was fugly but it worked and that's what mattered.
B) With the current enshittification, your comment is the obsolete one. A hundred times over. Why? Enjoy your crappy iOS'ified OS with a maze of dependencies (Python 3, for instance), SIP, and updates breaking everything.
C) If anything, the newbies are the doomed ones, as they don't know anything about computers. Solaris SMF commands (or AIX ones) blindly applied to RHEL systems? Why not?
I know Neal said the essay was quickly obsolete, especially in regards to Mac, but I'll always remember this reference about hermetically sealed Apple products. To this day, Apple doesn't want anyone to know how their products work, or how to fix them, to the point where upgrading or expanding internal hardware is mostly impossible (no M-series Mac Pro discrete GPUs?). Even after 25 years, some things never change.
> Bullhorn: "You don't know how to maintain a station wagon either!"
Literally every conversation I have had with people where I have tried to get them to use Linux :').
Still true 25 years later!