Born in the wake of the Apple ]['s success, the IBM Personal Computer (dubbed the "5150" in IBM's internal numbering system) was the International Business Machines Corporation's official entry into the desktop computer market, and by far its most successful. Earlier attempts, like the 5100 desktop APL machine and the DisplayWriter word processor, hadn't taken off, and IBM needed something fast to compete with Apple. Bypassing the usual IBM bureaucracy, in 1980 the company tasked a team of engineers at an IBM office in Boca Raton, Florida with developing the new machine, and gave them an unusual amount of freedom in designing the system.
What appeared in August 1981 was nothing like any IBM machine built before. Like the Apple II, the IBM PC was built almost entirely out of off-the-shelf parts and had a generous amount of expansion capability. As for the system design, the Boca Raton team considered several processors, including IBM's own ROMP CPU (an early RISC chip whose design was ancestral to the POWER architecture) and the Motorola 68000, before settling on Intel's 16-bit 8088. The 8088 was chosen mainly for cost and time-to-market reasons: the ROMP was still experimental, and IBM was concerned that the 68000 wouldn't be available in quantity. The 8088 could also reuse many of the support chips Intel had designed for the 8085, simplifying the motherboard design. To ensure a steady supply of 8088s, IBM and Intel recruited Advanced Micro Devices (AMD) to act as a second source, a decision that would have some importance later.
The other big influence on the IBM PC's design was the world of S-100 machines, which were based around the Intel 8080 (or, later, the Zilog Z80) and the "S-100" bus introduced in the pioneering Altair 8800. These machines ran an OS called CP/M, invented by a programmer named Gary Kildall in 1974 and based indirectly on Digital Equipment Corporation's various operating systems for their PDP series of minicomputers. While they weren't nearly as slick as the Apple ][, S-100 machines were popular with hobbyists and businesses alike, and several CP/M business applications, like WordStar and dBASE, were making inroads.
S-100 machines were large, server-style boxes with a large number of slots inside, plugged into a central backplane carrying power and data signals. The cards themselves were large and nearly square. To save space, IBM decided against using the S-100 backplane system and instead went with Apple II-style cards: long and rectangular, with a 62-pin edge connector near the back end of the card. IBM also added a sheet-metal bracket to the back of the card for structural stability. Since the PC used a regulated switching power supply, the hot-running secondary regulators that S-100 cards needed were also no longer necessary. (Secondary regulators reappeared in the PCI era, as more and more chips required 3.3 volts or less, but by then they were much smaller and ran a lot cooler than the huge metal-can regulators of the 1970s.)
In a burst of brilliance, the engineers of the PC made it possible to install both MDA and CGA cards in the same machine, creating the earliest instance of a multi-monitor PC setup. The setup was pricey (it needed two monitors as well as both the MDA and CGA cards), tricky to configure (the ANSI.SYS driver had to be loaded into memory, then the MODE command invoked to select the desired display), and took up more desk space than a typical setup of the time, but many professional-grade software packages (mostly desktop publishing, engineering and development software) could take advantage of it. Unfortunately, many consumer-grade programs, especially games, assumed only a CGA card was installed and did arcane things in the memory space normally used by the MDA card. The result ranged from graphical glitches to outright system crashes when such software was run on a PC with both cards installed, relegating dual-display configurations to the offices of professionals, and then typically only on workstation-class machines, until Windows 98 revived the idea by offering official support for multiple GPUs and spreading the desktop across multiple monitors.
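In practice, switching between the two cards looked something like this (a rough sketch of a typical DOS setup; the driver path shown is an assumption, and exact MODE syntax varied a little between DOS versions):

```
REM --- CONFIG.SYS: load the ANSI console driver at boot ---
DEVICE=C:\DOS\ANSI.SYS

REM --- Then, at the DOS prompt, switch output to the MDA card ---
MODE MONO

REM --- ...or back to the CGA card (80-column color text) ---
MODE CO80
```

Software that understood both cards could then write to each adapter's memory region directly, which is why a single misbehaving program could corrupt the "other" display.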
Also, those living in PAL regions never got to experience composite CGA the way it was meant to be experienced. PAL's superior color separation, often touted as its strongest selling point, was its downfall here: PAL was widely advertised as being immune to artifacting, and indeed composite output in PAL territories showed the same limited palette as an RGB monitor, with none of the extra artifact colors. (In truth, PAL artifacting simply needed a different trick to exploit; Sinclair Research worked out such a method for the ZX Spectrum, a PAL machine, but IBM chose not to implement anything similar on CGA cards sold in PAL regions.) Many PC clones sold in these regions, and even some in the US, lacked composite video output entirely. Compounding the issue, earlier batches of the PC sold in PAL countries still shipped with CGA cards that output 60 Hz NTSC, at a time when NTSC playback and multi-system TVs were almost non-existent, so composite was barely if ever used in the region. (Multi-system monitors did exist, but they were expensive pieces of kit meant for professional use and usually not sold to consumers.)
IBM followed up the PC with the XT in 1983, which removed the original PC's cassette interface, added more expansion slots (along with an optional expansion chassis), and made a hard drive option available. 1983 also saw the introduction of the PCjr, a severely crippled version of the XT intended for home use; its main claims to fame were the addition of a 16-color, 320×200 graphics mode and an internal 4-voice PSG (the same Texas Instruments model used in their own TI-99 series and, more famously, in the ColecoVision), both of which inspired one of the most famous clone families, the Tandy 1000. Next was the PC/AT in 1984, which introduced the 80286 processor and a fully 16-bit architecture, along with the Enhanced Graphics Adapter (EGA), which finally made 16-color graphics possible on a regular PC in resolutions all the way up to 640×350 (although 320×200 remained the most popular for games). And with that, with the capacity for attractive applications and especially entertainment software, the march of history began...
At first, the IBM PC didn't have much to offer home users and gamers. It was new, expensive, not as good with graphics as the Apple ][ or the Atari 800, and was aimed squarely at business users. However, IBM's name on the machine made it a safe buy for businesses that already used IBM hardware, and they ended up buying the machines in droves. The machine's open design sparked a huge third-party expansion market, with dozens of vendors selling memory expansion boards, hard drive upgrades and more. It wasn't long until other computer makers started examining the PC's design and figuring out how to make clones that could run PC software without issues. The one thing stopping them was the ROM: IBM held a copyright on what they called the "ROM BIOS", and while cloning the hardware was easy, cloning the ROM was much harder, with few vendors able to get it completely right (and the few that tried too hard, such as Eagle, getting sued into oblivion). It wasn't until Compaq introduced the Portable in 1983 that a truly 100% IBM-compatible PC was available. After that, software houses such as Phoenix (who provided the BIOS and much of the DOS for the Olivetti M series and their American AT&T counterparts, as well as BIOS code for Samsung, Packard Bell and others), Award (who provided BIOS code for a few now-obscure American brands such as AST, plus a huge number of clone boards from the late 1990s until the introduction of UEFI in the late 2000s, before eventually merging with Phoenix) and American Megatrends (used in many clone boards as well as high-end boards they built themselves; most Intel-made motherboards also used a customized AMI BIOS, and while AMI was less common than Award on clones in the pre-UEFI era, its Aptio UEFI firmware is now almost ubiquitous) followed suit, opening the floodgates to an entire industry of low-priced PC compatibles.
IBM also had another problem to deal with: Microsoft. When the PC was first being developed, IBM decided to license an outside OS rather than attempt to write their own, and their first choice would have been CP/M. However, when they tried to meet with Gary Kildall to license it, he wasn't around to sign the papers; the full details are unclear and have become something of a legend, but in the end, IBM didn't get CP/M. What they did get was the product of another little-known Seattle software developer's frustration with CP/M: MS-DOS, which began life as an admittedly "quick and dirty" clone of CP/M written by a developer named Tim Paterson at Seattle Computer Products.
While Microsoft intended MS-DOS to be a universal operating system where applications could be written once and run anywhere (similar to UNIX), its programming interface was so poor that many software developers bypassed it and directly accessed the hardware. Even Microsoft itself was guilty of this, with early versions of Microsoft Flight Simulator often used as a compatibility benchmark. A number of manufacturers did introduce MS-DOS-based computers, but they all failed in the marketplace because they weren't fully IBM compatible.
Lotus 1-2-3, a spreadsheet that was the IBM PC's Killer App, was praised for its speed because it had been tightly coded in assembler and directly accessed the hardware of the PC. This meant that any clone would need to be as compatible with the hardware as possible in order to run Lotus 1-2-3, which became a litmus test of PC compatibility. This would have consequences for the evolution of the platform's hardware, particularly the "640k barrier."