The PC Was Never a True 'IBMer'
Posted 4 months ago · Active 4 months ago
thechipletter.substack.com · Tech story
Sentiment: calm, mixed · Debate: 70/100
Key topics: IBM PC History, Open Architecture, Industry Disruption
The article discusses how IBM's PC was never fully integrated into IBM's culture and business practices, leading to its eventual decline, with commenters debating the reasons behind this and the impact on the industry.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion · First comment: 28m · Peak period: 56 comments (12-24h) · Avg / period: 25.2
Comment distribution: 126 data points (based on 126 loaded comments)
Key moments
- Story posted: Sep 14, 2025 at 5:13 AM EDT (4 months ago)
- First comment: Sep 14, 2025 at 5:41 AM EDT (28m after posting)
- Peak activity: 56 comments in 12-24h, the hottest window of the conversation
- Latest activity: Sep 19, 2025 at 3:33 AM EDT (4 months ago)
Remember IBM had gone through a very painful antitrust case and was still subject to the consent decree. I’m not sure right now of the terms, but it certainly limited the leverage IBM could apply against third parties profiting from the PC.
If it is not "it is," it's "its."
-----
Or to be clear (lol),
1) The possessive of "it" is "its."
2) "It's" is a contraction of "it is."
... or "it has".
--
EDIT: I find it much easier to remember that "its" is (only) the possessive pronoun.
The open nature of the PC allowed truly free/open-source software to exist that can be functional without big corporate lockdown. I can fully assemble a PC from parts I buy individually, and as long as they are compatible (which is stated on the box, no hidden knowledge here) I can expect it to work within the stated warranty.
My PC-based computers can be booted and made fully functional with Debian, Fedora, and (put your favorite Linux or BSD distro here; mine is openSUSE Tumbleweed). There is no parallel ecosystem yet that rivals the PC in terms of open specs and fully tinkerable hardware and software.
MacBooks are locked down by Apple, and forget about using your own hardware.
Android seemed like a competitor, but the closed nature of its development and the lack of commodity hardware around ARM-based phones mean that the FOSS layer exists only in user-space apps. We have custom ROMs, but they require boot blobs from vendors, and they're unreliable and break often.
Put something with the power of an M series or a Graviton on these and you have the start of a great ARM PC market.
There's nothing inherently not-open about ARM, or at least it's no less open by nature than x86. The fact that most ARM devices are locked down is a secondary effect from most of them being phones.
RISC-V would be more open than either of these but it still lags on performance. I have a RISC-V board but it's kind of slow. Not terrible but wouldn't make a good PC for anything but basic uses.
I'd argue the lack of something like ACPI to discover the device tree and memory map is why this impression exists. That, and ARM CPUs not being socketed.
UEFI ?
https://en.wikipedia.org/wiki/Secure_boot#Secure_Boot_critic...
"x86-based systems certified for Windows 8 must allow Secure Boot to enter custom mode or be disabled, but not on systems using the ARM architecture"
I'm waiting. A PC (ATX) with ARM or RISC-V or Mx or Power would be very nice.
Haven't seen any though. The Raspberry Pi is a joke from a PC extensibility point of view.
Not completely. Asahi Linux boots on bare metal and runs great on Apple silicon machines prior to the M3.
Interestingly enough, Apple helped to develop a version of Linux running on the Mach microkernel, and handed out thousands of MkLinux CDs at WWDC and MacWorld Boston in 1996. Macs have been running Linux in various ways ever since.
http://www.mklinux.org
https://en.wikipedia.org/wiki/MkLinux
http://gate.crashing.org/doc/ppc/doc003.htm
Windows has also been running on Intel Macs since Boot Camp in 2007. It remains to be seen whether ARM Windows will ever run natively on Apple Silicon, however.
We still had plenty of issues with Intel and Microsoft being able to play out their monopoly.
So I think it could have been a lot worse, but it could also have been a lot better.
They did! I kind of want one of these PDP-11 based PCs:
1977: https://en.wikipedia.org/wiki/Heathkit_H11
1982: https://en.wikipedia.org/wiki/DEC_Professional
The Heathkit H11 was an opportunity, but of course it was never followed up on.
The Apple II was an open system and IBM clearly took a lot of inspiration from the Apple II line. Look at the 5150 motherboard in the picture in the article and compare it to the motherboard from an Apple II+
And if an updated system were to break any published app, Apple would be blamed. There were apps, albeit only a few, that would not run on an Apple IIe, and I think, a few more that wouldn't run on a IIc.
There were some notable violations of published entry points in MS-DOS software, most notably direct use of the page locations of display memory, which led to the famous "640k barrier." But they weren't enough to dissuade developers from treating the PC as an "open enough" platform.
I doubt that developers felt a particular sense of morality about the DOS interface that they didn't also feel about the Apple II; it's simply that the interface was good enough to use as-is.
The really important thing here was the openly published interface, and the mutual agreement among devs to respect that interface. I mean "open enough" and "mostly respect," of course.
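As a concrete illustration of the kind of "violation" being described: a minimal sketch, assuming a 16-bit DOS compiler such as Turbo C with its <dos.h> helpers (the function names here are illustrative, not taken from any historical program), contrasting the published BIOS route for putting a character on screen with the direct poke into the PC's text-mode video buffer that so much DOS software relied on.

    /* Sketch only: assumes a 16-bit DOS compiler (e.g. Turbo C) and <dos.h>. */
    #include <dos.h>

    /* The published interface: BIOS video service INT 10h, AH=0Eh
       (teletype output). Slow, but independent of the video hardware. */
    void putc_bios(char c)
    {
        union REGS r;
        r.h.ah = 0x0E;        /* teletype output function */
        r.h.al = c;           /* character to print */
        r.h.bh = 0;           /* display page 0 */
        int86(0x10, &r, &r);
    }

    /* The "violation": write the character and its attribute byte straight
       into the color text buffer at segment B800h. Fast, but it only works
       on machines whose memory map matches the IBM PC's. */
    void putc_direct(int row, int col, char c)
    {
        unsigned char far *video = (unsigned char far *) MK_FP(0xB800, 0);
        unsigned offset = (row * 80 + col) * 2;   /* 80x25 text, 2 bytes/cell */
        video[offset]     = c;                    /* character */
        video[offset + 1] = 0x07;                 /* light grey on black */
    }

Software that took the second route ran dramatically faster, which is a big part of why the PC's memory map, display buffer included, hardened into something clones had to reproduce exactly.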
All computers which could be bought by individuals at that time were 'open systems'; they usually came with a full set of hardware schematics and programming documentation, and sometimes even ROM listings. The Apple II was nothing special in that regard.
[1] https://www.allaboutcircuits.com/news/how-compaqs-clone-comp...
But this view neglects the fact that an organic ecosystem of interoperable open hardware converging to de facto standards and running a common OS already existed prior to IBM designing their PC. By 1980, there were already many independent vendors implementing their own variation on the 8080/S-100 design pioneered by MITS, all running CP/M from Digital Research.
When IBM released the PC, the CP/M world was still going strong. The fact that it was an easily cloneable architecture based on the 16-bit 8086 caused a lot of disruption, and led to the market dynamics that were already present in the 8080-S100-CP/M world pivoting over to x86-ISA-DOS.
If IBM had kept their PC proprietary, it might have led to a bit more fragmentation in the short-term market for business microcomputing, but at the same time, the CP/M world would have continued on without that disruption, and something else would have ultimately catalyzed the move to a common 16-bit architecture. DR was already working on CP/M-86 at the time IBM was developing the PC, after all.
Eventually, the same forces that led to the collapse of vertically integrated, proprietary platforms and the dominance of open-standards system builders would have asserted themselves, and IBM itself would still have been subdued by them. Modern computing would likely be in a similar position with or without IBM. The PC was a major ripple, but didn't really change the current.
That was true everywhere. I worked at a mini company at the time when the PC came out. People in that company looked at the PC as a cool thing, but not a real computer.
In 10 or so years, the PC killed off almost all minicomputer companies. Some even speculated that was the main reason for IBM to create the PC :)
Nowadays not only do they own one of the few proper UNIXes left standing, they also own everything Red Hat contributes.
Everyone else, including other IBM offerings, was all about vertical integration.
It is no coincidence that nowadays, with PC desktops largely left to enthusiasts and gamers, OEMs are all doubling down on vertical integration across laptops and mobile devices as a means to recoup the thin margins that have come to be.
And also cloud applications, which are useless without the harder-to-clone data center part.
But Microsoft and the companies that made PC clones did everything to keep this "mistake" alive.
In fact, the openness of the PC platform is a historical accident. Other proprietary personal computer manufacturers (like Apple, Commodore and Atari) never planned to create an open platform either. The closest thing was the 8-bit MSX platform, which was a Microsoft thing for the Japanese market, and it was very soon outdated.
Companies like Compaq, and later Phoenix and AMI, were able to get around the proprietary nature of the BIOS by building clean-room BIOS clones that withstood IBM's legal challenges.
However, given the willingness of Microsoft (apparently with little IBM could do about it) to sell MS-DOS variants to others like Compaq, and later the emergence of MS-DOS clones like DR-DOS, it's not obvious that clones wouldn't still have taken off even without the unintentional assist of the standardized BIOS interface.
Exactly the point of having a BIOS. CP/M had one as well.
However, if there were no BIOS, the thin hardware abstraction layer that the BIOS provided would be part of MS-DOS. I see only two historical alternatives from that:
1. Microsoft would have had an even greater upper hand in controlling the PC market.
2. IBM could have kept the BIOS proprietary (even if as a part of MS-DOS), which would have prevented Microsoft from selling MS-DOS independently with an IBM PC abstraction layer.
However, even if option 2 prevailed, Microsoft could have created its own BIOS to ensure that software written for MS-DOS would be compatible across the PC clone market.
Had IBM made clones impossible they could likely have captured far more of the market.
It certainly wasn't IBM's ability to produce PCs that prevented them from selling more.
Likely eventually they would have licensed the architecture to AT&T and the like.
Vertical integration could have worked, we are just lucky it didn't.
In 1990, everyone at my high school who had access to computers was split between the Spectrum, Atari ST, Amiga, and PCs, with PCs being the minority.
Personally I only moved into PCs in 1991, with a 386SX.
Until then, I only used the PC1512 ones in the school lab.
I became part of the PC minority.
Because the Iberian Peninsula was full of green-phosphor terminals into timesharing systems, and random 16-bit computers from all brands on the more creative side.
In 1990 it wasn't certain that the PC would really take over; everyone was mostly on MS-DOS, and not everyone was buying into Windows, which only got 3.0 released in the middle of that year and demanded hardware too expensive for most businesses.
Doesn't change the fact that on a worldwide basis, IBM PC and clone sales were through the roof for business customers from about '84 or so on. Not a bubble at all.
There was a period where offices and schools etc. would have more heterogeneous systems -- especially schools, where Apple offered incredibly aggressive educational discounts, and so you'd find a lot of Apple IIs and even the odd Macintosh. But this ended here in Canada at least by about '88, '89. From then on you'd find offices with PCs running DOS -- sometimes Windows -- often running Novell, and using boring business applications like Lotus, WordPerfect or Word, etc.
By the time I was in high school in 90, 91, everything was PC. So much so that I ended up ditching my well-loved Atari ST and getting a 486 myself, because the writing was on the wall and the party was over for 68k machines.
Macs in Spain were for people from the Humanities branch. Journalists, editors, writers, media creators, graphic designers, audio producers...
Even if you take all DEC Terminals sold in the 80s, you only get a fraction of the early sales of the PC by the 1990s. Of course timeshare systems had a huge history and install base, and the same for terminals, but by the late 80s PC sales dwarfed it by ridiculous numbers.
Europe generally was more delayed and more fragmented, but the economics of the situation were totally clear and were driven by US home and business demand.
https://cdn.arstechnica.net/wp-content/uploads/archive/artic...
By 1990 everything other than the PC was a rounding error. Before that, some 8-bit systems were relevant.
You mean like publishing the system board schematics and a full source listing for the BIOS?
That seems to have been surprisingly normal for PCs in the late 1970s.
Apple also published schematics and listings, and had to deal with clones, but Apple 2 clones weren't particularly useful without a copy of (or compatible replacement for) Apple's ROM, which Apple did not license.
I am a '70s child, so I'm kind of aware of how common that used to be in those days.
But it was too late, and they didn't have the power they thought they had.
TL;DR: big incumbents (e.g. IBM) get out-innovated and replaced by scrappy startups even when the incumbent sees it coming and tries to react. The incumbent's business processes, sales metrics (NPII in this story), internal culture and established customer base make it impossible for an innovative product to succeed within the company.
The incumbent produces an innovative gadget. It may even be good, but its Sales Dept earn their quarterly bonus from the existing product line sold to the existing customers. They haven't got time to go chasing small orders of the new gadget from new customers who they don't have a relationship with, and the existing customers don't see the point of the new gadget. So orders for the gadget stagnate.
Across town is the small scrappy start-up making a similar gadget. It lives on those small orders and has a highly motivated sales person who chases those orders full time. So their orders grow, their product improves from the market feedback, and one day the new gadget is actually better than the incumbent's main product. At that point the incumbent goes out of business.
IBM created a rather generic machine using off the shelf components, and someone else's operating system.
Innovation factor was almost zero.
The only advantage it had was it had IBM's name on it, and IBM was still a Really Big Deal then. It brought "respectability" to a thing that before was still a weird subculture.
In a rare feat, Apple managed to do just that with the iPhone, which ate the iPod’s lunch. This at a time when the iPod was a core product, directly responsible for their revival and success, that could have been milked for years to come.
The most impressive thing about the iPhone, I think, has nothing to do with the technology and everything to do with timing the release of a mobile device to hit the sweet spot between the cost of the hardware and its capability.
“One of Jobs's business rules was to never be afraid of cannibalizing yourself. 'If you don't cannibalize yourself, someone else will,' he said. So even though an iPhone might cannibalize sales of an iPod, or an iPad might cannibalize sales of a laptop, that did not deter him.” — Walter Isaacson
However, Jobs also believed in product differentiation and thought that having too many products in the same space was confusing. Arguably by making iPadOS more macOS-like Apple is reducing that differentiation and increasing confusion.
They tried, in the form of the previously mentioned PS/2. They just squeezed a little too hard. There was also the PCjr, which was riddled with enough technical flaws at a blistering price point for it to also end up a flop (Charlie Chaplin was also not exactly a great choice to sell to a market already trending younger). IBM might have eventually gotten it right, they just lost the will to keep trying. Their business model depended on landing corporate whales buying high-margin products and services; mere commodities were a plebeian concern beneath them.
https://www.youtube.com/watch?v=1LR1Xvvch18
I would argue things like the 1984 ad from Apple were bizarre, and while it made its mark, it wasn't pivotal in terms of actually being effective. It appealed to Apple's core, but wasn't effective in terms of ad dollars.
What was mind-blowing is when Jobs came back to Apple, and Chiat/Day launched "Think Different." This was not grind-it-out. It was not "weird Apple" stuff.
It was awe-inspiring branding that changed the nature of technology marketing. It was beautiful and emotive. I think it holds up well today, and may well hold together for many generations to come.
The subsequent "Get a Mac / PC vs Mac" ads were beyond brilliant in being able to pivot away from just emotion to an informed sense of humor.
I like the iPod ads, but we started to lose the edge.
I see none of the raw brilliance today that was a part of the previous years. However, I think they still do a great job of grind it out marketing, and they have continued to understand their brand. Maybe this is okay for where they are at.
Yes the iMac had big success, but mainly because it was an attractive piece of furniture.
iPad and then especially iPhone is what changed the formula completely. And with it they switched from focusing on a kind of "Think Different" model where they emphasized their oddity and uniqueness to being a luxury lifestyle brand instead.
They even dropped their historical "user friendly" mottos ("It Just Works" and "Computer for the rest of us") because those definitely didn't sound like slogans that high end luxury (or luxury wannabe) consumers would be attracted to.
Apple was dying. Somehow Steve got his foot in the door at the beginning of '97, and kicked out Gil in around 7 months. It was insane. Steve then immediately launched Think Different, and personally narrated the ads. He then killed around 70% of the SKUs, killed the Newton, and got the iMac out in '98.
The reason it was brilliant is that Steve said we had "lost our core." If there was a criticism of the Think Different campaign, it was that Steve was retreating to "the core groupies," therefore indicating that Apple was about the niche and not the mainstream. To Steve this was an idiotic observation. You build from the devoted first. The campaign was brilliant because it wasn't about market growth; it was about stemming the blood loss and reclaiming the core. Then the MSFT deal was booed when announced. But it was classic Steve at his best.
To call the iMac "a piece of furniture" and the reason for its success is to not understand how to deliver products and how to drive sales. Steve had an incredible aesthetic sense, and understood that he needed to segment his market and move away from the beige box. If you were in the market, you admired and were shocked at what he did.
But do you understand why suddenly there was an "i" in front? This wasn't about a piece of furniture, it was about portraying a device as your access to the outside world.
Then somehow you believe that the iPad and iPhone changed the formula, but I don't think you understand the iMac was the original example of how to apply the formula of being a product company that delivers a brand promise.
While Steve started as a PC person with Woz, he came back as a product person that had lived the glorious life of NeXT and Pixar. I think his banishment was helpful not hurtful. He understood that iMac was a product and not a PC.
He started asking "what products and what markets will Apple play in?" The change was not the iPhone or the iPad. If you want to ask "when did the formula get copied beyond the iMac?":
It was the iPod. It was iTunes. It was owning your own music. However, it was seen as a natural product extension from the iMac. Local storage of music was key.
Once they had a culture of products, they were able to build toward the iPhone (the iPad is a simple extension and has a much smaller impact on revenue). However, it wasn't a pivot. It was a build.
I was born in '74, lived through the whole period, and my wife worked in marketing at Apple from 2002 to 2010.
I have a weird career where I was inside the door at Apple supplying tech in the early 90s, then worked in the PC industry, then returned to spend time inside Apple as a supplier. I never had personal interactions with Steve, but the people I would interact with would say things like "Steve made the decision."
In this time, I had responsibility for both marketing communications and engineering. (I know, weird.) In my Marcom role, we looked at market data on advertising, and worked with figures like Larry Light -- who is well known specifically for branding -- and we asked him to help us with branding for the PC industry.
I am not going to say that all my observations are right, but I will tell you that this is a field that I specifically worked in, and I have time inside the walls of One Infinite Loop.
This makes perfect sense. In the early 2010s I worked with what remained of IBM development and was surprised at the dysfunction, complete lack of manufacturing culture and engineering approaches. I couldn't believe that this culture could produce a successful product. Guess what, it actually didn't.
I'm surprised Db2 still exists as a product they actively develop, to be honest. Maybe because their services branch and mainframe customers still use it heavily.
Ambra?
They had very unusual mice but I never saw one in the wild.
The sale to Lenovo went very well, when compared to how most mergers, acquisitions and consolidations went in the period. I can't remember Lenovo from before the acquisition and, again, I can't remember seeing any pre-Thinkpad Lenovo machines.
1. Get out from under the blue tax
2. Have an alternative procurement path
3. Set up a channel where we might not cannibalize ourselves
4. Free ourselves from some of our rigorous engineering processes
It was basically a fail-fast experiment, which is popular today. It was set up with the thought process that we wanted it more virtual and not disruptive to the core business. It became obvious pretty quickly that it brought its own set of risks, and so we moved on to Aptiva. It is good to try and fail and get out.
Actually, the Lenovo acquisition was a bit of a war. There was some visionary leadership at the most senior level of Lenovo that saw the core USA team as extremely valuable, which allowed them to win arguments. While their long-term goal was to move the core to China, they were careful to make sure they kept a lot of the USA team engaged, and many key USA individuals did move or travel constantly to China.
However, I wasn't a part of the company when it was sold, so most of this is top level feedback from my friends that did go.
Where were the Ambra machines sourced? Were they special clones like Compaq (where the BIOS was different), decent commodity clones like Dell or were they generic clones like everything off-brand?
I never understood what the value proposition was. Was it a bit like a supermarket own brand, where the customer kind of guesses that the brand leader makes them, much like how Americans know Costco Kirkland diapers are made by Huggies?
This being 30 years ago, I can't remember the exact matrix for something that wasn't my core product. But they had a strategy.
This was early in my career, but I happened to be in a pivotal position that got me access above my pay grade. (I have this weird background in both marketing and engineering, and as somebody who can speak both, I turned into basically a language translator in many meetings; then I was sent out as the PR person to the magazines.) I did not work for the Ambra team, but they had an impact on my work, so I got to be involved enough to see the edges.
I'm not going to have the exact numbers, but I remember that we stated that we were going to have no more than 8 IBMers involved with the thing. The Taiwanese clone market was just starting to take off, and we were starting to outsource to Taiwan. If I remember correctly, it was the Phoenix BIOS, which we had already done a deal with for our consumer PC line. (Actually, it was co-development.)
As I already wrote, the final bit is that the guys had done some anonymous bid work, and had gotten some very aggressive bids--better than what we were getting. So, they had the impression that they could take a lot of cost out of the system. Also, we wanted to take out Dell and Gateway, but not impact the core IBM brand. Compaq was considered the real comp. HP second. Dell was this annoying "can't stop them because they always win the bids" company. Gateway was on the fringe, and more of a threat to our consumer brand, which was small at the time. But it was free TAM.
So, there was an impression that if we followed the Dell/Gateway model -- leading tech, very competitive pricing, and full-page ads, with some systems that lived in the space -- we could start to cannibalize their TAM.
Now, you don't want to read back into history. The Dell then is not the Dell of now. But, buying behavior was stronger with Dell than the group anticipated. It just was tough to get the velocity growth they wanted. I think we launched in EMEA first, maybe because that is where the VP that ran the thing was from, and then it was rolled out in the USA. However, it just did not see the growth, and I remember there were some quality issues that the small group couldn't handle, but this is pretty foggy.
I will also state that the Round Rock team (and even Gateway) was incredibly tough competition in this arena. I would say that the team did not appreciate this.
However, it was never a massive corporate push for RTP -- the home of the PC and PC Server. It was a "let's try this and see if we can learn something." I do remember most of us in the core PC team did NOT get involved, as it wasn't pitched as being our core business. If I remember correctly, we did help share some information on parts that we procured to help them.
It's not Costco, as the models are so different. I have run a distribution business before, and Costco is really a marvel to me. My disti business was to the VAR channel, but my sister group used Costco and Walmart. Costco is absolutely maniacal about delivering value to their customers with quality. Really, it blows me away. They had a bunch of brands, and Sinegal said they could combine them all under Kirkland and turn it into a quality brand to drag up the entire Costco brand. He is so freaking brilliant, and I would argue unique for a company that had a distribution business that wanted to position itself in the consumer's mind. I would argue that Costco never used their branding to indicate a "secret way" to get a better brand. I think that Costco is keen on making sure that Kirkland IS the brand, which is different from Ambra.
Thanks for asking. I don't think we ever did this type of post mortem at the time, and thinking through events always seems to generate learning for myself as I type it down.
https://en.wikipedia.org/wiki/Double_Irish_arrangement
At the time, customers had no idea that this was why these tech companies were operating out of Ireland. All they knew was that you got a lot for your money and, more importantly, the latest tech. However, you might have to wait 28 days for delivery. It was important to the business model that Dell had not one single employee in the UK or other EU countries apart from Ireland, so it was a call centre for everything before the web came along.
There may have been more to the Dell business model of customisation. In the UK, if you order a bespoke product, then you don't have the normal rights to just return it if you don't like it, you are stuck with it, and at the mercy of customer service.
I don't know if Dell played this card, because they always had that refurbished gig going, where you could get good kit with a few dents and scratches. Nonetheless, there was very little manufacturing going on; it was just a screwdriver operation, final assembly of what amounted to knock-down kits. You did get the latest and greatest though.
Regardless, compare with the original IBM model where they made PCs in Greenock, Scotland. Undoubtedly you know more about this than I do. However, as I understand it, the original Scottish factory made typewriters before the PC came along, and IBM were incentivised to choose Scotland in the post war years, when the British government were quite serious about bringing industry to Scotland. Shipbuilding had gone on the Clyde, and with it steel and the outfitting and everything else that goes with shipbuilding. There was also an emotional reason for Greenock, Watson had Scottish ancestry.
The product that came out of Greenock was really good. To this day people want those keyboards that came out of there. The Trinitron monitors came from Sony's plant in Bridgend Wales, which they originally opened in the 1970s to make TVs, again with the usual government incentives. Sony also supplied Dell and Gateway 2000. Now all that has gone, RIP Trinitron, we loved you...
I am sure there was more to the supply chain, since, in that period, semiconductors were made in Silicon Glen. However, these tended to be things like DRAM chips, where vast fortunes would be spent building a fab only for it to be pretty much stillborn.
I am curious as to how 'vertically integrated' the IBM operation was, since hard disks were also made by IBM in Scotland. The IBM PC story is told as 'using commodity off the shelf parts', but IBM PCs were not a product of a screwdriver operation.
It is shocking that the UK has done so badly at tech. However, how was IBM supposed to compete against those tax fiddlers operating out of Dublin? Why did the EU allow Ireland to undercut the UK when they were not taxing the big corporations? The Irish shot themselves in the foot with this, as they ended up with house-price inflation and very high personal taxation.
The UK also had a lot of tech in the M4 corridor, this being the motorway out of London that goes all the way to (drumroll...) Bridgend. Reading was the prime spot with Compaq, SGI, Microsoft, Oracle, Sybase and plenty of others setting up shop there.
In Reading there was an industrial estate that housed most of them, with its own private motorway. If you were on the train going past, you imagined this as being a full-on mini Silicon Valley; however, not a lot was going on in those impressive headquarters buildings. I am sure Sybase was just a couple of guys hassling the few customers they had for whatever license fees they could get; as for Microsoft, there was nobody writing code there, you just had product managers for things such as Microsoft Golf. Same with SGI, just a very big building with nobody in there. It all looked impressive from the train; however, it was just a Potemkin village.
I am in genuine disbelief regarding how the UK messed up with tech given the advantages of a reasonably educated population, a reasonably high standard of English, access to the EU market, access to the former colonies, an army of 8-bit coders (from the BBC Micro project) and access to capital (London).
I've done both marketing and engineering. I managed a group at RTP responsible for a part of the engineering for PCs, ThinkPads, and the Server group. While I had people go to Greenock, I never went personally. So, in some sense, I don't feel like I can give an adequate impression and background.
I will tell you that our team out of Greenock had a big impact on manufacturing, and there were some real live wires. I remember being at home on the weekend and having a VP of manufacturing hunt me down to say words that an IBMer normally wouldn't say, because I was bringing down his line. It really was the stereotypical Scottish-soccer-fan type of interaction. In this case, it turned out that it wasn't my group. After I left IBM, I ended up supplying tech to all my IBM competitors. I observed what I thought was basically all of IBM's processes in their processes -- similar terminology that I know started at IBM. While we did see some people move, a big part of this was the supplier base.
We did do a lot of work with outside resources. For example, we "qualified" power supplies, but we basically did co-engineering. Part of our problem was that we would significantly change everybody's product, and then they would sell it to Dell or Gateway. This was a very quick path to get your processes into the comp, as the suppliers would "suggest" things that they had learned at IBM.
The hard drive issue is a bit more subtle. IBM was not the original supplier to our own PCs -- and they had a meltdown with a massive recall. I don't remember us making hard drives in Scotland. I remember Mainz, Rochester MN, San Jose, and Fujisawa. The PC client group took almost exclusively Fujisawa products. The server group worked to take San Jose/Rochester server products. Rochester was the 5.25" and then 3.5" lines for the AS/400, so they were at first leading with the PC server group. I will relate that I think HDDs were always the #1 reason for Quality No Ship (QNS), mainly due to shock specs, which got better over time.
As to the delivery of tech in the UK: I would state that it is similar inside the USA. Try as many might, tech seems to be focused in New York, the Bay Area, and Puget Sound. While I was in RTP, and this state killed itself to get tech (including 3 incredible universities right next door), for some reason it never turned into a tech hub. Without research, I would state that there is also an element of chance or luck that gets a region going and can't be planned for. However, maybe you're just pointing out that the UK never even gave itself any chance, which I am just not close enough to understand.
No it is a "personal", but not "computer".
As Patrick Lencioni has often said, we have things reported as strategy, when it turns out to be people issues. A lot of what happened at IBM only makes sense if you were there.
I'll list some things here, though since I'm late to the conversation, I’m not sure how much it will be observed. However, perhaps an IBMer who was with the PC Company will come across this and add a few more alternatives or supporting facts.
1. IBM was a wildly diverse place culturally. We had almost half a million employees worldwide. As with any large corporation, you could find divergent views—anyone could find a person or two to support anything they wanted to claim about the company. However, the PC division was generally well regarded. Sure, you can find somebody who said something about "antibodies", but you can find a lot more who would say that’s ridiculous. I tend more toward the latter than the former.
2. Don Estridge was a bit of a cowboy. He did love being down in Florida, which gave him the ability to move quickly. Still, I would say IBM allowed for its “wild ducks”, and while the PC group was one of the more obvious successes, it was not IBM’s only success. Estridge died in a well-publicized airplane accident at Dallas Fort Worth. I don't think most people understand how much cultural impact this had on the group. Although it could be debated, I do believe we could say it was as if Bill Gates or Steve Jobs had been taken out of their company. The amazing thing about the PC group is that it didn’t collapse after his death.
3. The single most destructive thing IBM did was thinking they could take the PC group out of Boca and transport it to Research Triangle Park. After it moved to RTP, I got to work there with many of the group’s core members. They consistently described how the move was traumatic to virtually every aspect of a team that was truly world-class. (Another issue: There was also a development decision in Boca that some decried—some forward-thinking was shelved—but I wasn’t heavily involved in that, and it wasn’t so universal.)
4. I was in the midst of the turmoil as IBM reached the midlife of the PC in RTP. By that stage, we had given up on the idea of clear, proprietary closed systems. Yet at the same time, we were doing some excellent engineering and marketing—we were finally winning awards from PC magazines for the desktops, and people already loved our laptops. But we were clearly hamstrung. Without going deep into details, it’s clear in today’s economy that certain business units serve different purposes. The PC Division was expected to make a lot of money while paying what was internally called “the blue tax.” In other words, corporate hit us with effective tax rates and metrics that basically made it impossible to compete with Compaq or Dell. What most people don’t realize is that one of the biggest impacts of selling the group to Lenovo was the removal of the blue tax. Many key U.S. development team members stayed with the company, and though Lenovo was committed to eventually moving true development headquarters to China, it would have collapsed without an incredibly dedicated group of IBMers who were unfailingly unselfish. There was something about the culture—dedication to the team was one of the most important things you could do, even after the group had been sold to Lenovo.
On reflection, if Estridge had never died, and if the division had never moved from Boca, the computing industry would be very different. I am glad to have been a part of what happened.
(-:
I was moved every two years, and it was a full M&L.
I will treasure my time there as remarkably rich.
Thank you for confirming the precise point that I made in the post!
This is pretty well understood about defining your business model, and is not an antibody issue.
The real threat was Microsoft, which we understood at the time. Our biggest problem was not the OS; it was the apps.
I have written part of a fable on this here: https://theologic.substack.com/p/the-fable-continues-microso...
Someone might also have read the BIOS source code listing that IBM provided:
https://archive.org/details/IBMPCIBM5150TechnicalReference63...
> Typically, a clean-room design is done by having someone examine the system to be reimplemented and having this person write a specification. This specification is then reviewed by a lawyer to ensure that no copyrighted material is included. The specification is then implemented by a team with no connection to the original examiners.
https://en.wikipedia.org/wiki/Clean-room_design
The article above makes some interesting points about whether clean-room design is actually necessary.
A quick search showed it isn't easy to find online versions of these patents (because IBM has so many that even knowing these are from 1984 didn't help), but I remember that one was related to being able to split a screen into a graphics part and a text part on their EGA board (though the Apple II previously did this too, but with a fixed split); one was about detecting 360KB vs 1.2MB floppy drives by seeking to track 60 and then stepping back 59 tracks and checking whether we were now at track 0 (not unlike how the Apple II handled the lack of a track-0 signal, but for a different purpose); one was for the "bus master" signal in the PC AT (later ISA) bus; and I can't seem to remember the other four, but they were all similar in style.
So in the late 1990s you had to pay IBM if you wanted to make a PC clone (AT and up, but 8088 clones had died out in the early 1990s).
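For what it's worth, here is a rough C sketch of the drive-detection trick recalled above, purely to illustrate the logic: the fdc_* helpers are hypothetical stand-ins for the real floppy-controller commands, and which track-0 outcome maps to which drive type is an inference from the description, not taken from the patent text.

    /* Hypothetical low-level helpers, standing in for real floppy-controller
       (FDC) commands; declarations only, for illustration. */
    void fdc_seek_to(int track);    /* SEEK to an absolute track number     */
    void fdc_step_out(int tracks);  /* step the head back toward track 0    */
    int  fdc_at_track0(void);       /* 1 if the drive's TRK0 sensor is set  */

    /* A 360KB drive has only 40 physical tracks, so a command to seek to
       track 60 runs the head into its stop early, while an 80-track 1.2MB
       drive actually gets there. Stepping back 59 tracks then leaves the
       40-track drive pressed against track 0, while the 80-track drive stops
       one track short of it; the TRK0 sensor tells the two apart. (Which
       result means which drive is my reading of the recollection above.) */
    int drive_is_1200k(void)
    {
        fdc_seek_to(60);
        fdc_step_out(59);
        return !fdc_at_track0();   /* not at track 0 => head really reached 60 */
    }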
The blog on Don Estridge covers IBM's place in PC history in fascinating and extensive detail.
Mr. Edwards also reminded me what a debt Linux users owe to Rod Canion for making the Gang of Nine and open hardware a reality.