One of the first PCs from IBM: the IBM PC
IBM PC compatible refers to the class of computers that makes up the vast majority of smaller computers (microcomputers) on the market today. They are based (without IBM's participation) on the original IBM PC, use the Intel x86 architecture (or one made to emulate it), and can use interchangeable commodity hardware. These computers used to be referred to as PC clones, and nowadays simply as PCs.
The platform originated in 1981 with IBM's decision to market a personal computer as quickly as possible in response to Apple Computer's rapid success in the burgeoning PC market. In August 1981, the first IBM PC went on sale. In licensing an operating system from Microsoft, IBM's agreements allowed Microsoft to sell MS-DOS for non-IBM platforms (the IBM version was called PC-DOS). Moreover, in creating the platform, IBM used only one proprietary component: the BIOS.
Columbia Data Products produced the first IBM PC compatible in 1982. Compaq Computer Corp. produced an early IBM PC compatible a few months later in 1982: the Compaq Portable, which was also the first sewing-machine-sized portable PC. Compaq could not directly copy the BIOS as a result of the court decision in Apple v. Franklin, but it could reverse-engineer the IBM BIOS and then write its own using clean room design. Compaq became a very successful PC manufacturer and was bought by Hewlett-Packard in 2002.
Simultaneously, many manufacturers such as Xerox, Digital, and Sanyo introduced PCs that were, although x86- and MS-DOS-based, not completely hardware-compatible with the IBM PC. While such decisions seem foolish in retrospect, it is not always appreciated just how fast the rise of the IBM clone market was, and the degree to which it took the industry by surprise. Later, in 1987, IBM itself would launch the PS/2 line of personal computers, which was only software-compatible with the PC architecture; it was also hugely unsuccessful.
Microsoft's intention, and the mindset of the industry from 1981 until as late as the mid-1980s, was that application writers would write to the APIs in MS-DOS, and in some cases to the firmware BIOS, and that these components would form what would now be called a hardware abstraction layer. Each computer would have its own OEM version of MS-DOS, customized to its hardware. Any piece of software written for MS-DOS would then run on any MS-DOS computer, regardless of variations in hardware design.
During this time MS-DOS was sold only as an OEM product. There was no Microsoft-branded MS-DOS, MS-DOS could not be purchased directly from Microsoft, and the manual's cover had the corporate color and logo of the PC vendor. Bugs were to be reported to the OEM, not to Microsoft. However, in the case of the clones, it soon became clear that the OEM versions of MS-DOS were virtually identical, except perhaps for the provision of a few utility programs.
MS-DOS provided adequate support for character-oriented applications, such as those that could have been implemented on a minicomputer and a Digital VT100 terminal. Had the bulk of commercially important software fallen within these bounds, hardware compatibility might not have mattered. However, from the very beginning, many significant pieces of popular commercial software wrote directly to the hardware, for a variety of reasons:
- Communications software directly accessed the UART chip, because the MS-DOS API and the BIOS did not provide full support for the chip's capabilities.
- Graphics capability was not taken seriously. It was considered to be an exotic or novelty function. MS-DOS didn't have an API for graphics, and the BIOS only included the most rudimentary of graphics functions (such as changing screen modes and plotting single points); having to make a BIOS call for every point drawn or modified also increased overhead considerably, making the BIOS interface notoriously slow. Because of this, line-drawing, arc-drawing, and blitting had to be performed by the application, and this was usually done by bypassing the BIOS and accessing video memory directly. Games, of course, used graphics. They also performed any machine-dependent trick the programmers could think of in order to gain speed. Thus, games were machine-dependent—and games turned out to be important in driving PC purchases.
- Even for staid business applications, speed of execution was a significant competitive advantage. This was shown dramatically by Lotus 1-2-3's competitive knockout of rival spreadsheet Context MBA. The latter, now almost forgotten, preceded Lotus to market, included more functions, was written in Pascal, and was highly portable. It was also too slow to be really usable on a PC. Lotus was written in pure assembly language and performed some machine-dependent tricks. It was so much faster that Context MBA was dead as soon as Lotus arrived.
- Disk copy-protection schemes, popular at the time, accessed the disk drive hardware directly precisely in order to write nonstandard data patterns; these patterns were illegal from the point of view of the OS and therefore could not be produced by standard OS calls.
- The microcomputer programming culture at the time was hacker-like, and enjoyed discovering and exploiting undocumented properties of the system.
At first, other than Compaq's models, few "compatibles" really lived up to their claim. "95% compatibility" was seen as excellent. Gradually vendors discovered not only how to emulate the IBM BIOS, but also the places where they needed to use identical hardware chips to perform key functions within the system. Reviewers and users developed suites of programs to test compatibility, generally including Lotus 1-2-3 and Microsoft Flight Simulator, the two most popular "stress tests." Meanwhile, IBM damaged its own franchise by failing to appreciate the importance of "IBM compatibility" when it introduced products such as the IBM Portable (essentially a Compaq Portable knockoff) and later the PCjr, which had significant incompatibilities with the mainline PCs. Eventually, the Phoenix BIOS and similar commercially available products permitted computer makers to build essentially 100%-compatible clones without having to reverse-engineer the IBM PC BIOS themselves.
By the mid-to-late 1980s buyers began to regard PCs as commodity items, and became skeptical as to whether the security blanket of the IBM name warranted the price differential. Meanwhile the incompatible Xeroxes and Digitals and Wangs were left in the dust. Nobody cared that they ran MS-DOS; the issue was that they did not run off-the-shelf software written for IBM compatibles.
The declining influence of IBM
Since 1982, IBM PC compatibles have conquered both the home and business markets of commodity computers so that the only notable remaining competition comes from Apple Macintosh computers with a market share of only a few per cent. Meanwhile, IBM has long since lost its leadership role in the market for IBM PC compatibles (this may have had to do with the failure of other manufacturers to adopt the new features of the IBM PS/2); currently the leading players include Dell and Hewlett-Packard. Despite advances in computer technology, all current IBM PC compatibles remain very much compatible with the original IBM PC computers, although most of the components implement the compatibility in special backward compatibility modes used only during a system boot.
One of the strengths of the PC compatible platform is its modular design. If a component became obsolete, only that component had to be upgraded, not the whole computer, as was the case with many of the microcomputers of the time. As long as applications used operating system calls and did not write to the hardware directly, existing applications would keep working. However, MS-DOS (the dominant operating system of the time) lacked calls for much multimedia hardware, and the BIOS was also inadequate. Various attempts to standardise the interfaces were made, but in practice many of these attempts were either flawed or ignored. Even so, there were many expansion options, and the PC compatible platform advanced much faster than competing platforms of the time.
"IBM PC Compatible" becomes "Wintel"
In the 1990s, IBM's influence on PC architecture became increasingly irrelevant. Instead of focusing on staying compatible with the IBM-PC, vendors began to focus on compatibility with the evolution of Microsoft Windows. No vendor dares to be incompatible with the latest version of Windows, and Microsoft's annual WinHEC conferences provide a setting in which Microsoft can lobby for and in some cases dictate the pace and direction of the hardware side of the PC industry. The term "IBM PC Compatible" is on the wane. Ordinary consumers simply refer to the machines as "PCs," while programmers and industry writers are increasingly using the term "Wintel architecture" ("Wintel" being a contraction of "Windows" and "Intel") to refer to the combined hardware-software platform.
The breakthrough in entertainment software
The original IBM PC was not designed with games in mind. The monochrome graphics and very simple sound made it unsuitable for multimedia applications. That, and the fact that it was priced out of the entertainment market, made it seem unlikely that the PC platform would be used for games.
As the technology of the PC advanced, games started to appear for it. At first, these were inferior to the games for other platforms, but thanks to the modular design, the technology behind the PC advanced rapidly, and what PC games lacked in multimedia capabilities they made up for in raw speed. A few years later, VGA cards appeared, offering 256-colour graphics from a palette of 262,144. At around the same time, sound cards appeared, replacing the beeps of the PC speaker with much richer sound.
Even once the PC had superior hardware to the competing platforms of the time, it still was not taken seriously as a games machine. This may have been due to its higher price, to the fact that video game consoles rather than personal computers were now starting to attract gamers, or to hardware that was very awkward to program for and required different drivers for every piece of multimedia hardware.
The PC platform did not manage to create a cult following as the other platforms had done. There was a demo scene on the PC, but it was small, did not appear until many years after the original IBM PC, and produced demos only occasionally. Consequently, there were few programmers with the knowledge required to squeeze the full performance out of the machine.
One thing PCs did have in their favour was raw processing power, which made them suitable for 3D games. The PC made its breakthrough as a games machine when Doom was released in 1993, thanks to its outstanding graphics and gameplay. Because networking hardware was widespread on PCs, Doom also offered multiplayer support across a network, something few games offered at the time. Doom finally established the PC as a games machine.
Design flaws and more compatibility issues
Although the original IBM PC was designed for expandability, its designers could not anticipate the hardware developments of the 1980s. By the late '80s, IBM, the creator of the IBM PC, had little say in the platform's direction, and many other companies were trying to push their own standards.
To make things worse, IBM, Intel and Microsoft introduced several design flaws which created hurdles for the development of the PC compatible platform. One example was the DOS 640k barrier (memory below 640k is known as conventional memory). This stemmed partly from the way IBM mapped the PC's memory, and the memory management of DOS (the most widely used operating system) made things worse. To give programs access to more memory, EMS (expanded memory) was devised: extra memory was bank-switched into the address space below 1 megabyte. Once Intel released the 80286 processor, which could address memory above 1 megabyte directly, an alternative memory management scheme was introduced: XMS (extended memory). EMS and XMS were originally incompatible, so anyone writing software that needed more than conventional memory had to support both systems.
Graphics cards suffered from their own incompatibilities. Once graphics cards advanced to SVGA level, there was no longer a clear standard for accessing them. At the time, PC programming used a memory model with 64KB memory segments. The standard VGA graphics modes used screen memory that fitted into a single memory segment, but SVGA modes required more memory, so accessing the full screen memory was tricky. Each manufacturer developed its own way of accessing the screen memory, and even of numbering the new graphics modes. Manufacturers therefore had to supply software device drivers so that the SVGA modes could be used by programs accessing the graphics card at the driver level. Unfortunately, there was no driver standard that all manufacturers followed. VESA made an attempt at a standard (the VESA BIOS Extensions), but not all manufacturers adhered to it. To make things worse, the manufacturers' drivers often had bugs, and to work around them application developers had to write their own drivers for the cards with buggy drivers.
Programming the PC was a nightmare. It put many hobbyists off, and may have been responsible for the slow take-off of the PC as a multimedia platform. Developing for the PC required a large test suite of hardware combinations to make sure the software ran on as many PC configurations as possible. Eventually, DPMI (the DOS Protected Mode Interface) was devised, giving programs a flat memory model in protected mode and making programmers' lives easier.
Meanwhile, consumers were overwhelmed by the many different combinations of hardware on offer. To give the consumer some idea of what sort of PC would be needed to run a given piece of software, the Multimedia PC (MPC) standard was set in 1990: a PC meeting the minimum requirements could be marketed as an MPC, and software that ran on a minimal MPC-compliant PC was guaranteed to run on any MPC. The MPC Level 2 and MPC Level 3 standards were set later, but the term "MPC compliant" never caught on, and after MPC Level 3 in 1996 no further MPC standards were set.
The rise of Windows
Windows 3.0, released in 1990, was the first version of Microsoft Windows seen as a massive change in the way most users interacted with the IBM PC. IBM had already released OS/2 in 1987, which seemed to be the superior software product. However, much as VHS prevailed over Betamax, Microsoft's dominance with MS-DOS, preinstalled on almost every system, made Windows 3.0 the GUI of choice for most people, and OS/2 failed to catch on. Despite initially sitting on top of the MS-DOS environment and being slow and awkward to use, Windows 3.0 was hailed as the way forward for the majority of home users who owned a PC or compatible.
Although Windows 3.0 drew heavily on the Apple Macintosh interface and on IBM's OS/2, it revolutionised the way the PC was operated. In the past, users had typed commands into the MS-DOS interface; now they could perform operations intuitively through a GUI and its icons. Windows 3.0 was followed by Windows 3.1, and eventually Microsoft, realising that users wanted to network their PCs, included standard network protocols in the newer version 3.11.
Adding more features and standardised protocols and building on hardware support, Microsoft produced Windows 95. Before Windows 95, games and gaming were a totally MS-DOS experience: users had to tolerate rebooting into DOS, fiddling with memory (see the 640k barrier) and reconfiguring their PC every time they wanted to load a game. Windows 95 introduced DirectX, which gave programmers a standard API for video and sound card calls from within Windows, revolutionising the games arena. For the first time, a PC programmer could benefit from Windows 95's memory management and extended functionality while having API-level access to the graphics and sound cards, of which there were many versions and drivers. 3D graphics became possible from within Windows (for those with 3Dfx cards), and networked multiplayer 3D games came within reach of almost every programmer.
Windows 95 was soon succeeded by Windows 98, then by Windows Me (often regarded as a poor release because of its memory problems and instability), and finally by Windows XP. A branch of Windows meant for servers and workstations, Windows NT, and its successors Windows 2000 (Workstation and Server versions) and XP, also proved popular. Windows has dominated the desktop PC market, and almost every PC distributed by major manufacturers comes with Windows.
Challenges to Wintel domination
The success of Windows drove nearly all rival commercial operating systems into near-extinction and ensured that the PC was the dominant computing platform. A developer targeting only the Wintel platform could still reach the vast majority of computer users. By the mid-to-late 1990s, introducing a rival operating system had become too risky a commercial venture; experience had shown that even an operating system superior to Windows would likely fail.
However, a free operating system, Linux, was being developed by enthusiasts. Because they were doing it for fun, they were not concerned with commercial risk. Although Microsoft's programmers worked for a living while Linux contributors worked in their spare time, Linux became a first-class product: the sheer number of contributors made the development effort comparable to Microsoft's. After a couple of years, Linux had become a very powerful operating system and, because it was free, it spread widely.
By the late 1990s, Linux was being taken seriously and was seen as an example of what the open source movement could achieve. While initially lacking software and incompatible with Windows, Linux addressed one of Windows' main weaknesses: stability. Despite this, Windows remains the dominant operating system.
On the hardware front, Intel licensed its technology so that other manufacturers could make x86-compatible CPUs; in other cases, companies such as AMD and Cyrix produced compatible CPUs independently. Towards the end of the 1990s, AMD was taking a sizeable share of the PC CPU market, and it even played a significant role in directing the evolution of the x86 platform when its Athlon processors were released in 1999, well over a year before Intel's comparable Pentium 4 architecture.
DirectX, while solving many of the problems of programming the PC, was available only on Windows. OpenGL, which existed on several platforms, was ported to Windows and offered a means of rapidly developing cross-platform 3D applications.
The PC today
- main article is at personal computer
A modern PC. Its case is fancier than the traditional beige box cases used throughout the late '80s and '90s.
The original IBM PC is long forgotten, and the term "PC compatible" is rarely used. Processor speed and memory capacity are many orders of magnitude greater than on the original IBM PC, yet any well-behaved program for the original IBM PC that does not call the hardware directly can still run on a modern PC. Some say the desire for backward compatibility has hindered the PC's development, but many believe the ability to run legacy software is what has kept the platform alive.
The modular design makes it possible to choose every component of a PC from a variety of different manufacturers and to buy only what is needed for the tasks the computer is intended to carry out. Upgrades are easy. It is also possible to choose the operating system to run on the PC, and what software to run.
Software and hardware compatibility between different PCs is no longer a major issue. Other platforms exist today (most notably the Apple Macintosh), but they hold only a minority share.
Thanks to intuitive user interfaces and the information-gathering and communications capabilities of the Internet, the computer has finally escaped the domain of computer professionals and hobbyists and become mainstream.
The design of computer cases has become more elaborate, and users can modify the cases themselves (known as case modding), but even so, the plain beige box design that has been around since the '80s is still common.
There is a thriving demo scene, and a huge community of people willing to write free software.
A PC can come in one of the following configurations:
Desktop: A computer that sits on top of a desk. Portability is not part of the design, so desktop computers tend to be too heavy and too large to carry. The advantage is that the components do not need to be miniaturised and are therefore cheaper.
Portable: Not long after the first IBM PC came out, Compaq produced the Compaq Portable, one of the first portable PC compatible computers. Weighing in at 28 pounds, it was more of a "luggable" than a "portable". The portable computer evolved into the laptop, but unlike laptops, portable computers usually do not run on batteries.
Laptop: A laptop (also known as a notebook) is a PC miniaturised so that it is easy to carry and fits into a small space. It uses a flat LCD screen that folds onto the keyboard to form a slab-shaped object. This increased portability comes at a cost: to reduce size and mass, a special design with smaller, more expensive components is used, and the design is more integrated, making it less expandable, although the RAM and the hard drive can be upgraded. Laptops are also battery powered, so as well as being smaller, the components need to have low power usage.
The Libretto 1100, one of many available "sub-notebooks"
Palmtops and Sub-notebooks:
In 1996, Toshiba produced the Libretto range of sub-notebooks (mini-notebooks). The first model, the Libretto 20, had a volume of 821.1 cm³ and weighed just 840 g. Unlike PDAs, Librettos were fully PC compatible. Several models were produced in the range, but Librettos are no longer made.
Over the years, there have been several operating systems for the PC: DOS, Windows, Linux, OS/2, BSD, BeOS, and others.
Buying a PC
Building a PC
- Build your own PC (http://www.pcmech.com/byopc/)
- Build Your Own Computer (http://www.buildyourowncomputer.net/) – Learn to build your own computer.
- My Super PC (http://mysuperpc.com/) – How To Build A PC - A Computer Building Guide.