Born in the wake of the Apple II's success, the IBM Personal Computer (dubbed the "5150" in IBM's internal numbering system) was IBM's official entry into the desktop computer market, and by far its most successful. Earlier attempts, like the 5100 desktop APL machine and the Displaywriter word-processing machine, hadn't taken off, and IBM needed something fast to compete with Apple. Bypassing the usual IBM bureaucracy, in 1980 the company tasked a team of engineers at an IBM office in Boca Raton, Florida, with developing the new machine, and gave them an unusual amount of freedom in designing it.
What appeared in August 1981 was nothing like any IBM machine built before. Like the Apple II, the IBM PC was built almost completely out of off-the-shelf parts and had a generous amount of expansion capability. As for the system design, the Boca Raton team considered several processors (including IBM's own ROMP CPU [1] and the Motorola 68000) before settling on Intel's 16-bit 8088. The 8088 was chosen mainly for cost and time-to-market reasons; the ROMP was still experimental, and IBM was concerned that the 68000 wouldn't be available in quantity. The 8088 could also reuse many of the support chips Intel had designed for the 8085, which simplified the motherboard design. To ensure a steady supply of 8088s, IBM and Intel recruited Advanced Micro Devices (AMD) to act as a second source, a decision that would have some importance later.
The other big influence on the IBM PC's design was the world of S-100 machines, which were based around the Intel 8080 (or, later, the Zilog Z80) and the "S-100" bus that had been introduced in the pioneering Altair 8800. These machines ran an OS called CP/M, written by a programmer named Gary Kildall in 1974 and modeled loosely on Digital Equipment Corp.'s operating systems, such as TOPS-10. While they weren't nearly as slick as the Apple II, S-100 machines were popular with hobbyists and businesses alike, and several CP/M business applications, like WordStar and dBASE, were making inroads.
S-100 machines were large, server-style boxes with a large number of slots inside, plugged into a central backplane carrying power and data signals. The cards themselves were large and nearly square. To save space, IBM decided against using the S-100 backplane system and instead went with Apple II-style cards that were long and rectangular, with a 62-pin edge connector near the back end of the card. IBM also added a sheet-metal bracket to the back of the card for structural stability. And since the PC used a regulated switching power supply, the hot-running secondary regulators that S-100 cards needed were no longer necessary.
In one break with the Apple II's precedent, and as an improvement on the serial consoles S-100 machines used, IBM decided to leave the graphics system off the motherboard and provide two add-on cards: a text-only Monochrome Display Adapter (MDA)[2], intended for business users, and a Color/Graphics Adapter (CGA), for games, education and emulating other color-capable IBM hardware. This was done to give buyers a choice in video hardware, as well as to save space on the motherboard. MDA was widely praised for its outstanding clarity and readability, especially when combined with IBM's original 5151 monitor, which showed off MDA's effective 720×350 resolution; CGA, with its barely adequate 320×200 mode [3] and seriously weird 640×200 mode, was almost universally panned. Still, it wasn't until the advent of EGA in 1984 that anything more adequate appeared, so everybody used CGA anyway, unless they sprang for a third-party Hercules monochrome card, which, unlike the MDA, could address individual pixels, but was much more expensive.
The base system came with as little as 16K of RAM (64K on later revisions of the motherboard), but could be expanded to a then-breathtaking 640K thanks to the Intel 8088 processor inside, which had a 1 MB address space (huge for a desktop machine in 1981). It even had BASIC in ROM, just like the Apple II.
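That 1 MB ceiling falls straight out of the 8088's segmented "real mode" addressing: a 16-bit segment value is shifted left four bits and added to a 16-bit offset, producing a 20-bit physical address. Here is a minimal sketch of the arithmetic in modern C (purely illustrative; the function name is ours, not period code):

```c
#include <stdio.h>
#include <stdint.h>

/* Real-mode address arithmetic on the 8088: physical = segment * 16 + offset.
   Two 16-bit values combine into a 20-bit address, hence the 1 MB space. */
static uint32_t physical(uint16_t segment, uint16_t offset)
{
    return ((uint32_t)segment << 4) + offset;
}

int main(void)
{
    /* Video memory traditionally began at segment 0xA000, which is why
       "conventional" RAM available to programs topped out at 640K. */
    unsigned long top = physical(0xA000, 0x0000);
    printf("0xA000:0000 -> 0x%05lX (%lu bytes = 640K)\n", top, top);
    return 0;
}
```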
IBM followed up the PC with the XT in 1983, which removed the original PC's cassette interface and added a hard drive option. 1983 also saw the introduction of the PCjr, a severely crippled version of the XT intended for home use; its main claims to fame were the addition of a 16-color, 320×200 graphics mode and an internal 4-voice PSG (the same TI model used in the ColecoVision). Next was the PC/AT in 1984, which introduced the 80286 processor and a fully 16-bit architecture, along with the Enhanced Graphics Adapter (EGA), which finally made 16-color graphics (in resolutions all the way up to 640×350) possible on a regular PC.
The Rise Of The Clones
At first, the IBM PC didn't have much to offer home users and gamers. It was new, expensive, not as good with graphics as the Apple II or the Atari 800, and aimed squarely at business users. However, IBM's name on the machine made it a safe buy for businesses that already used IBM hardware, and they ended up buying the machines in droves. The machine's open design sparked a huge third-party expansion market, with dozens of vendors selling memory expansion boards, hard drive upgrades and more. It wasn't long until other computer makers started examining the PC's design and figuring out how to make clones of the machine that could run PC software without issues. The one thing stopping them, however, was the ROM. IBM had a copyright on what they called the "ROM BIOS", and while cloning the hardware was easy, cloning the ROM was much harder, with few vendors able to get it completely right. It wasn't until Compaq introduced the Portable in 1983 that a truly 100% IBM-compatible PC was available; after that, software houses such as Phoenix and AMI followed suit with clean-room BIOSes of their own, opening the floodgates to an entire industry of low-priced PC compatibles.
IBM also had another problem to deal with: Microsoft. When the PC was first being developed, IBM decided they wanted to license an outside OS rather than attempt to write their own, and their first choice was CP/M. However, when they tried to meet with Gary Kildall to license it, he wasn't around to sign the papers; the full details are unclear and have become something of a legend, but in the end, IBM didn't get CP/M. What they did get was the product of another little-known Seattle software developer's own frustration with CP/M: MS-DOS. MS-DOS began life as an admittedly "quick and dirty" clone of CP/M, written by a developer named Tim Paterson at Seattle Computer Products.
SCP was mostly a hardware outfit, whose business was in memory upgrades and other add-ons for the aforementioned S-100 machines. When the 8086 appeared on the market, they wanted to use it and quickly threw together a prototype machine. Digital Research had long promised an 8086/8088 port of CP/M but didn't deliver until it was too late, leading Paterson to write his own and name it "86-DOS" or "QDOS" (depending on who you ask). Microsoft, who had already basically lied to IBM and said they had something ready (they did; it was called Xenix, but it was a UNIX OS, and IBM wouldn't accept that; Xenix was later sold to SCO), paid SCP $50,000 for the full rights to 86-DOS, did some quick editing and released it as MS-DOS/PC-DOS 1.0. Microsoft also put language in their contract with IBM stating that they had the right to license MS-DOS to whomever they wanted without first seeking IBM's approval. This had serious implications for the PC clone business: once the clone makers and BIOS houses like AMI and Phoenix had opened the floodgates on the hardware side, Microsoft could sell those same hardware makers MS-DOS, creating a complete package — and a huge pain for IBM.
IBM Tries To Win Back The Crowd
With all of the pieces in place, the clone market took off like a shot after 1984. New companies building PCs based on cheap, mass-market "motherboards" made in factories in East Asia were popping up everywhere, and Compaq became a Fortune 500 company on the success of its Portable and Deskpro ranges. In 1986 Compaq beat IBM to the punch with the first PC to use the new, 32-bit 80386 processor. Between the clone armies and Compaq's meteoric rise, IBM decided that if it couldn't compete on price, it would compete on features, and (it was hoped) introduce a new standard that they alone would control.
The result was the Personal System/2 (or PS/2 for short), a line of new PC-based machines that were deliberately very different from the prevailing PC standards. The new machines used a new, IBM-proprietary expansion bus called "Micro Channel", which was faster than the AT's bus (by now referred to as "ISA", for "Industry Standard Architecture") but completely incompatible with it and protected by IBM patents, requiring anyone who wanted to use it to go through a lengthy licensing process and pay royalties. The other major feature the PS/2 line introduced was a new video subsystem called the Video Graphics Array (VGA), a substantial upgrade to EGA that added a new 640×480 high-resolution mode (familiar now as the mode Windows 2000 and XP use for their splash screens), analog RGB video with an 18-bit palette (over 262,000 colors), and up to 256 colors on-screen at once. The rest of the industry adopted VGA enthusiastically, with 100% VGA compatibility becoming a must for video-card makers.
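The 18-bit figure works out to six bits per color channel (2^18 = 262,144 combinations), programmed through the VGA's DAC registers at I/O ports 0x3C8 and 0x3C9. A minimal sketch of setting one palette entry, assuming a DOS-era compiler in the Turbo/Borland mold, whose dos.h provides outportb():

```c
#include <dos.h>   /* outportb(); assumes a Turbo/Borland-style DOS compiler */

/* Program one entry of the VGA DAC palette. Each channel is 6 bits wide
   (0-63), which is where the 18-bit, 262,144-color figure comes from. */
void set_palette_entry(unsigned char index,
                       unsigned char r, unsigned char g, unsigned char b)
{
    outportb(0x3C8, index);     /* DAC write-index register */
    outportb(0x3C9, r & 0x3F);  /* followed by three data writes: R, G, B */
    outportb(0x3C9, g & 0x3F);
    outportb(0x3C9, b & 0x3F);
}
```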
The VGA proved very popular with game developers. What it lacked in tricks like sprites, blitting and scanline DMA, it made up for by being tweakable (hacked 256-color modes were very popular, providing resolutions up to 360×480) and by having fast, easy-to-use video memory. The base 320×200×256 mode, however, was the easiest and fastest of all[4], and many groundbreaking games of the late 1980s and early 1990s were written with this mode in mind, including the Sierra and Lucasfilm point-and-click adventures, Wolfenstein 3D and Doom. The 640×480×16 mode, on the other hand, was extremely popular with early graphical OSes and GUI-based DOS software, and it survives as a bare-bones compatibility video mode in many modern OSes. Later IBM innovations, like the 1024×768 XGA, added a few more-or-less standard modes to the swirling chaos generally known as "Super VGA".
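Part of what made the base mode so easy was that the whole screen sat at segment 0xA000 as one flat, byte-per-pixel array, with no planes or bank switching to manage. A minimal sketch, again assuming a Turbo/Borland-style real-mode DOS compiler (MK_FP(), int86() and far pointers come from that environment):

```c
#include <dos.h>   /* MK_FP(), int86(), union REGS; Turbo/Borland DOS C assumed */

/* 320x200x256 is BIOS mode 0x13: one byte per pixel, linearly mapped at
   segment 0xA000. 320 * 200 = 64,000 bytes, so the entire screen fits
   inside a single 64K real-mode segment (see footnote 4). */
static unsigned char far *vga = (unsigned char far *)MK_FP(0xA000, 0);

void set_video_mode(unsigned char mode)
{
    union REGS r;
    r.h.ah = 0x00;              /* BIOS int 10h, AH=00h: set video mode */
    r.h.al = mode;              /* 0x13 for 320x200x256, 0x03 for text */
    int86(0x10, &r, &r);
}

void put_pixel(unsigned int x, unsigned int y, unsigned char color)
{
    vga[y * 320u + x] = color;  /* no banks, no planes: just one offset */
}
```

A game would call set_video_mode(0x13) once at startup, draw by poking bytes, and call set_video_mode(0x03) on exit to restore the text screen.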
Local Bus Wars
Micro Channel didn't fare so well, though. Only a handful of other PC makers (notably Tandy) adopted the bus, and while a few outside peripheral makers made MCA-compatible devices, the vast majority were designed and built by IBM as build-to-order add-ons. Miffed at IBM's rather shameless attempt to corner the high-end PC market, Compaq and several other PC makers introduced a competing standard called "Extended ISA", or EISA for short. EISA expanded the bus to 32 bits and added "bus mastering" support (which let expansion cards move data themselves, freeing the CPU to do other things while a transfer from, say, a disk was happening) and MCA-style, semi-automatic configuration, while maintaining compatibility with regular ISA cards. EISA was popular mainly in servers and high-end PCs; desktops didn't need that kind of bandwidth yet. Its auto-configuration system was later backported to ISA as "Plug and Play" as part of the development effort leading up to Windows 95.
In the early 1990s there was another competing bus, the VESA Local Bus (or VLB for short), which added a 32-bit wide, full-speed side channel to the ISA bus. VLB was originally designed to give bandwidth-hungry video cards a faster connection to the system bus, but it was something of a stopgap measure and didn't last long. Its biggest problem was that it was too deeply tied to the internals of the 486 processor, for which it was developed; the Pentium used a completely different memory bus setup, and converting between the two was notoriously difficult. Also, VLB's specification was not very rigid, and almost all manufacturers tweaked it a bit. This lack of precision made running anything other than video on VLB a potentially dangerous proposition, especially where mass storage was involved: most VLB IDE controllers of the day generated their timing signals directly from the bus, and if it ran faster than the controller expected, bad things (like data corruption) would happen.
VLB also had severe mechanical problems. Physically, it was an extra edge connector placed in line behind a 16-bit ISA slot, so it could only be used with full-length cards (giving it a Fan Nickname of Very Long Bus), and due to the large number of pins and the friction in them, installing or removing a card required enormous force, dangerously flexing the motherboard. The cards' extreme length made this worse: many cases (in which such cards barely fit) made it impossible to angle the card into the slot, so it had to be forced straight in. In the end, Intel's new "Peripheral Component Interconnect" (PCI) spec won the "local-bus wars" thanks to its platform-agnostic nature and cleaner architecture. PCI was first announced in 1992 and became popular with the first PCI 2.0 machines in 1995, making VLB obsolete almost immediately.
Wintel Comes And Wins
After years of being confined to what were basically fleet sales, IBM discontinued the PS/2 line and MCA in the mid-1990s, preferring instead to concentrate on the revived "IBM PC" brand (new, ISA/PCI-based machines sold as business desktops) and the highly successful ThinkPad line of notebooks, introduced in 1992. This marked the end of IBM's dominance of the PC clone market, with the balance of power now shifted to Microsoft, Intel and the clone makers.
The introduction of Windows 3.0 in 1990 finally made Windows a legitimate platform after several years of false starts; it placed higher demands on both graphics hardware and mass storage, and it was this need for better hardware that drove PC development. With the introduction of PCI between 1993 and 1995, along with improved video, sound and storage hardware, the PC started to look less like a classic 8-bit computer with bolted-on upgrades and more like a high-end RISC workstation. The introduction of the second-wave Pentium in 1995 and the Pentium II and AMD K6 in 1997, along with the ACPI power-management spec in 1998, blurred the distinctions even further and convinced people that a cheap desktop could perform as well as an expensive UNIX workstation. AMD sweetened the deal further in 1999 with the announcement of the "x86-64" instruction set, which added 64-bit capability to the PC while fixing some of the 80x86's long-standing quirks; the first CPU to feature it was the Opteron, released in 2003.
This caught Intel in a bad spot. Intel had been working with Hewlett-Packard on a new 64-bit processor called "Itanium", and had pretty much bet the company on it, with grandiose plans to move the entire PC platform over by the mid-2000s. The Itanium was not a regular CPU. It used a new architecture based on "very long instruction words": bundles of instructions that drive the CPU's execution units more-or-less directly, instead of relying on a decoder to schedule everything. Between the chip being late to market and the trouble developers had writing compilers and other tools for it, most people simply ignored it; it was often referred to as "Itanic", since it seemed to be sinking fast. Despite all this, Intel stood by Itanium right up until the Opteron came out — and then changed course quickly upon seeing how popular it had become, drawing up plans to add 64-bit support to the Pentium 4 line as well as the then-upcoming Core 2 processors. Since then, the Itanium has found a niche in replacing older RISC systems like DEC's Alpha and HP's PA-RISC, and the Itanium 2 has found use in high-performance computing, but it never had the mass-market success Intel was hoping for.
Today, in early 2011, the PC's various implementations are collectively the most popular desktop computer in the world, and have even made inroads into scientific and high-performance computing thanks to huge leaps in processing capability and an emphasis on power savings. Several attempts to update the PC using newer parts have come and gone; most of them failed after 1995, as the PC's hardware ended up absorbing most of the features that a switch to MIPS or PowerPC would have brought, including (eventually) the RISC philosophy itself. The Pentium Pro and its descendants are actually RISC-like processors internally, translating the rather haphazard x86 instruction set into internal micro-operations instead of executing it directly, as all processors up to the Pentium had.
On top of all this, the rise of the clones meant that pretty much everyone was selling nearly identical systems with little to differentiate them; this made margins even tighter and made PC makers more reliant on advertising, system aesthetics (the aforementioned ThinkPad was one of the first PCs to buck the trend of the generic beige box), gimmicks such as "100x CD-ROM" software and other pack-ins, and, if all else failed, price. This environment made it much harder for a new player to enter, as any new system would almost certainly be more expensive than a regular PC and would carry the additional hassle of porting all the software. Apple was able to stick it out thanks to clever advertising and innovative system design, but they, too, eventually switched to x86 in 2006.
So far, x86 has seen off all attempts to replace it, with most of its challengers either long obsolete (6502, 68000) or relegated to embedded systems (MIPS, PowerPC). However, ARM CPUs have seen massive increases in computing power since the mid-2000s, and may end up being the first real challenge to x86's dominance on the desktop in decades. Chips based on the ARM architecture have dominated consumer electronics for years because of their clean system design and low power usage (which means better battery life), whereas Intel has had to cut features and performance from the Atom just to get it cool enough for netbooks. Chips from NVIDIA, Texas Instruments, Samsung and Apple have shown what ARM is capable of, and with many people switching from desktops and laptops to tablets, even Microsoft has gotten concerned: Windows 8 will be the first version of Windows to run natively on ARM. There are also rumors that Apple may be considering switching their desktop machines to ARM.
As a footnote, IBM themselves left the personal computer business for good in 2005, selling their PC division to a Chinese company named Lenovo (hence Lenovo now sells the ThinkPad).
Specifications:
IBM PC:
- Intel 8088 processor running at 4.77 MHz
- 16K RAM on the earliest models (64K later), expandable to 640K; some expansion systems could backfill up to 736K if you were using an MDA
- Eight 8-bit expansion slots
- Keyboard and cassette ports
- Optional single or dual 5¼" floppy drives
- Could be ordered with either an MDA (80×25 text only with blink, bold, underline and reverse-video effects) or a CGA (80×25 or 40×25 text in 16 colors, 320×200 in four colors, 640×200 monochrome; a hacked 160×100 16-color mode was also available)
- CGA's 80×25 text mode was a joke. Unlike MDA's similar mode, it used an 8×8 matrix for each character, which, accounting for the separation between letters and rows, left only 7×7 pixels for the symbol itself, at best. MDA's character tile was 9×14, a feat unmatched at least until VGA, allowing then-unprecedented text clarity and readability — no small selling point for the early models, which were used mostly as office machines. CGA characters, in contrast, looked ugly and grainy and were nigh-unreadable. Several clone vendors (most notably Compaq and AT&T/Olivetti) remedied this by providing a double-scan text mode, which ran at 640×400 and used a much more legible 8×16 character matrix; the arithmetic behind these figures is sketched below.
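The cell sizes map directly onto the resolutions quoted earlier: 80 columns by 25 rows of 9×14 cells is exactly MDA's 720×350, while 8×16 cells give the clones' 640×400. A quick, purely illustrative check of the arithmetic in C:

```c
#include <stdio.h>

/* Text-mode pixel resolution is just columns * cell_width by rows * cell_height. */
static void text_resolution(const char *name, int cols, int rows, int cw, int ch)
{
    printf("%-12s %dx%d cells of %dx%-2d -> %dx%d pixels\n",
           name, cols, rows, cw, ch, cols * cw, rows * ch);
}

int main(void)
{
    text_resolution("MDA",         80, 25, 9, 14);  /* 720x350 */
    text_resolution("CGA",         80, 25, 8, 8);   /* 640x200 */
    text_resolution("double-scan", 80, 25, 8, 16);  /* 640x400 */
    return 0;
}
```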
IBM PC/XT:
- Same as the PC, except with more memory, standard floppy drives and a hard disk option
- Cassette port removed
IBM PC/AT:
- Intel 80286 processor running at 6 or 8 MHz, depending on age
- Up to 16 MB "extended" memory using add-on cards
- Eight expansion slots — six 16-bit, two 8-bit
- 20 or 40 MB high-performance hard drive
- Single or dual "high density" floppy drives
- Optional MDA, CGA, EGA or the rare and expensive Professional Graphics Controller, an early GPU meant for CAD use
- When it became obvious that the MCA bus wasn't taking off, IBM released an ISA version of the VGA card, which became the video card of choice for years to come.
Games:
Exclusive titles and Multi Platform games that started here:
- Bandits Phoenix Rising
- Battlefield
- Blaz Blue Calamity Trigger
- Codename Iceman
- Commander Keen
- Command and Conquer
- Dark Seed
- Darwinia
- DEFCON
- Descent
- Deus Ex
- Doom
- Duke Nukem
- Dune II
- EcoQuest
- The Elder Scrolls
- Elvira: Mistress of the Dark
- Elvira II: The Jaws of Cerberus
- Excelsior Phase One Lysandia
- Formula One Grand Prix aka World Circuit
- Gabriel Knight
- Gubble
- Half-Life series
- Havoc
- Homeworld
- Indianapolis 500: The Simulation
- Iron Seed
- King's Quest
- Laura Bow
- Leisure Suit Larry
- Monkey Island
- OpenArena
- Police Quest
- Populous
- Quake
- Quest for Glory
- Recettear an Item Shops Tale
- Rise of the Robots
- Sam and Max Freelance Police
- The Seventh Guest
- Shadow President
- Simon the Sorcerer
- Space Quest
- Star Control
- Starcraft
- Stunts aka 4D Sports Driving
- Super Tux Kart
- Syndicate
- Test Drive
- Unreal
- Uplink
- Warcraft
- Wing Commander
- Wolfenstein 3D
- You Don't Know Jack
Ports:
- Burger Time
- Cobra Mission
- Elite
- Lemmings
- Maniac Mansion
- Mega Man X
- Revolution X
- Sim City
- Sonic the Hedgehog CD
- Space Harrier
- Star Trek Text Game (as Video Trek 88, later EGATrek)
- Street Fighter IV
- Street Fighter X Tekken
- Thexder
- Fire Hawk: Thexder the Second Contact
- Wax Works
- X Men Children of the Atom
- Ys
- Zak McKracken and the Alien Mindbenders
- Zaxxon
- Zeliard
- ↑ An early RISC chip, descended from IBM's 801 research project, whose lineage later led to the POWER architecture
- ↑ Also often called the MDPA, for "Monochrome Display and Printer Adapter", because it also carried a parallel port for connecting a printer
- ↑ With just four colors and two hideously ugly palettes; the red-green-yellow one didn't even have a true white, which is why the cyan-magenta-white palette was much more widely used, even though it looked even worse
- ↑ Mainly because, at 64,000 bytes, the whole screen fit neatly into a single 64K segment, the unit by which x86 CPUs addressed memory in "real mode" (the usual mode back in the DOS era), freeing the programmer from fussing with the segment registers.