
The Later 16-Bit Era: The GUI Becomes the Standard

1984 is memorable for two advances. In that year, IBM brought out a larger model of their personal computer, the IBM Personal Computer AT, which used the more powerful 80286 processor.

The introduction by Apple of the Macintosh computer was even more significant, as it brought about a major fundamental change in the computer industry, by making the graphical user interface (GUI) a serious option for business and home computer users, rather than a novel laboratory curiosity.

The Xerox 8010 Star was announced as a commercial product in April, 1981. Since it predated even the Apple Lisa, and was implemented with a custom bit-slice processor, it was simply too expensive to be successful, despite the potential attractiveness of a GUI - which, of course, was still a very novel concept at that time, the value of which was yet to be generally appreciated.

In the image at right, the Star is shown at the bottom right along with other components of the Xerox 8010 Information System.

The Xerox Star was preceded by the Xerox Alto. This computer was built from TTL chips, and was an internal Xerox project, not sold as a commercial product. But it was not kept secret; for example, the September, 1977 issue of Scientific American had an article, titled Microelectronics and the Personal Computer which illustrated the features of the GUI used in that computer.

On the left is a later image, from a 1986 advertisement, of a display terminal connected to the Xerox Document System, in which the GUI is more plainly visible.


Another influential computer that provided a graphical computing environment was the LISP Machine from Symbolics. Their first commercial product was the LM-2; it had always been intended that the LM-2 would be superseded by the 3600, and this did happen by 1983, although later than hoped because of delays in bringing the newer model into production. Pictured at right is an image of the terminal from a 3600 system, taken from a 1983 advertisement, showing the face the system presented to the user.

From a 1986 advertisement, here are three models of the 3600 then available:

This image, from the cover of a 1982 manual for the Symbolics 3600, shows the original "Space Cadet" keyboard, as was used on the LM-2, but which apparently was not used, except possibly on some early deliveries, with the 3600:

Although from a technical standpoint, not having the extra symbols on the keys reduces the power and versatility of the system, from a marketing standpoint, it is clear that to a large proportion of potential customers, such a keyboard would be distinctly off-putting and intimidating.


The Apple Lisa computer had been introduced in January, 1983. Like the Macintosh, it had a graphical user interface, so when the Macintosh came out, it did not come as a total shock. And, as noted above, not only was the Lisa itself preceded by the Xerox 8010 Star, but as far back as 1977, information about the Alto project within Xerox had been made public.

The Lisa is pictured below, first in its original form, and then as it appeared after it was changed to use standard 3 1/2" disks.


Originally, it used modified 5 1/4" disks, which had two openings through which the magnetic surface could be accessed, located at the sides, instead of a single opening at the end that is inserted into the drive.

The Macintosh, pictured at left, however, was far less expensive, and was thus something home users could consider. Both of these computers were based on the Motorola 68000 processor, which was also used in early Apollo workstations, the Fortune Systems 32:16, and the laboratory data collection computers of IBM's System 9000 series, discussed above - all of which were quite expensive systems.

While the original Macintosh was quite limited by the fact that it only had 128 kilobytes of memory, this was soon remedied by the Macintosh 512K, the "Fat Mac", which had 512K of RAM instead.


Motorola eventually made the 68008, a version of the 68000 with an 8-bit external data bus, but it became available too late to have been used in the IBM PC.

However, that was used in the Sinclair QL computer, announced on January 12, 1984.

The Sinclair QL, shown above, unlike the Macintosh, did not have a graphical user interface, although it did interact with the user by means of menus. As an inexpensive computer providing access to the power of the 68000 architecture, it was a very exciting product at the time, but its success was fatally hampered by two instances of one basic flaw. Instead of coming with floppy disk drives, it used a non-standard form of tape cartridge, and instead of offering a hard disk as an option, the option for larger storage was to be a solid-state disk based on wafer-scale technology... which never materialized. Prospective purchasers, especially those outside of the United Kingdom, were naturally skeptical about the prospects of there being the needed support infrastructure to make the computer genuinely useful.

The Macintosh was famously announced with a television commercial that aired during the Super Bowl, which is the culminating game of American football, on January 22, 1984.


Another very important event happened in 1984: the first Phoenix BIOS became available. This enabled many companies that lacked the resources to develop their own legal and compatible BIOS to start making computers fully compatible with the IBM PC. Unlike the Compaq, which was sold at a premium price, these were often sold at lower prices, reflecting their new status as generic commodity products.

And soon after, the Award BIOS was available as a competitor to the Phoenix BIOS, and then came the American Megatrends BIOS, later versions of which are found on today's motherboards.

The government of the Republic of China (better known by the name of the island of Taiwan, on which it is located) even arranged to have a BIOS written for the benefit of that country's PC manufacturers, the ERSO BIOS, which is sometimes referred to as the DTK/ERSO BIOS since manufacturers of computers using it customized it with their own names, and DTK was one of the largest such manufacturers.

Thus, the floodgates were opened at this point, and many more people had the opportunity to own a computer that could run the same software as the IBM PC. The cover of the October 14, 1986 issue of PC Magazine, shown at right, illustrates the situation that eventually arose, if with some exaggeration for effect.


Another notable computer introduction from 1984 was that of the IBM PCjr. This was an attempt by IBM to make a lower-priced computer that would address the home market. It came with PC-DOS 2.1 which included some modifications to support this computer.

It had two cartridge ports below the standard 5 1/4" floppy disk drive in the case.

Naturally, it had some limitations of expandability compared to the original IBM PC. This would soon become a very good reason not to buy one, when the inexpensive imitations of the full IBM PC started to come out. But one reason advanced for not considering this computer was specious.

The photo shows the original keyboard of the IBM PCjr, with push-button keys. This was condemned as not being a "real" keyboard, and the example of the original Commodore PET was often cited.

The outcry was so loud that IBM eventually provided all the original purchasers of the computer with a revised keyboard with full size keys.

Computer columnists praised IBM for this remedial action, and praised the new keyboard for its improved feel.

I remember comparing the two keyboards in an IBM storefront at the time, and in my opinion, the tactile characteristics of the two keyboards were identical. Unlike the keyboard of the original Commodore PET, the original IBM PCjr had keys which had the same spacing and position as those of any other normal typewriter-style keyboard.

So, yes, you could touch-type on the original IBM PCjr keyboard.

Even so, why did IBM risk what ended up happening by not making the keyboard of the IBM PCjr conventional in every way? Was it just to keep it out of offices, so that it wouldn't compete with the more expensive real thing?

No. There was a reason behind the keyboard of the IBM PCjr, a reason which drew inspiration from a much more expensive IBM product, the IBM 2250 graphics terminal. The photograph at left of one example of that item came originally from an advertisement for Sikorsky Aviation, a maker of transport helicopters.

This is a type of graphics terminal that allowed moving graphic displays to be provided with relatively modest amounts of computer power. It had a cathode ray tube as its display element, but that tube worked like the CRT in an oscilloscope rather than the CRT in a television set. That is, the electron beam in the tube did not trace out a raster - a pattern of lines covering the whole screen, with images placed on the screen by varying the intensity of the beam, and hence the brightness of the current point, over time. Instead, the beam was directed to move over the screen along the lines of the image to be drawn, just as a pen moves on a sheet of paper when drawing an outline picture.
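
To make the contrast concrete, here is a small illustrative sketch in Python (purely hypothetical - it is not tied to the 2250's actual programming interface, and the picture and coordinates are made up) showing that a vector display spends beam time only on the lines of the picture, while a raster display must sweep every point of the screen regardless of what is drawn:

    # Hypothetical illustration only: trace a display list the way a vector CRT does,
    # versus scanning every point of the screen the way a raster CRT does.

    def trace_vector_picture(segments, steps_per_segment=100):
        """Move the 'beam' only along the lines of the drawing, like a pen on paper."""
        beam_positions = []
        for (x0, y0), (x1, y1) in segments:
            for i in range(steps_per_segment + 1):
                t = i / steps_per_segment
                beam_positions.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
        return beam_positions

    def scan_raster(width, height):
        """Visit every point of the screen in order, regardless of the picture."""
        return [(x, y) for y in range(height) for x in range(width)]

    # A triangle takes about 300 beam positions on the vector display;
    # a 640 x 480 raster takes 307,200 positions per frame no matter what is drawn.
    triangle = [((0, 0), (100, 0)), ((100, 0), (50, 80)), ((50, 80), (0, 0))]
    print(len(trace_vector_picture(triangle)), len(scan_raster(640, 480)))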

Terminals of this general type were usually provided with light pens. This input device served to indicate a point on the screen; it contained a lens and a photodiode or other device sensitive to light, so that a point on the screen was indicated by an electrical pulse coming from the light pen at the moment the electron beam passed that point on the screen.

Light pens could also be used with raster displays. In the case of a vector display, using a light pen to indicate a point in a blank part of the screen required the terminal to draw a cursor that followed the light pen, starting from some point on which something was always drawn.

Note, in the lower left corner of the image, to the left of a conventional terminal keyboard in the typewriter style, a square keyboard of round buttons, with 32 buttons in a 6 by 6 array with the corners omitted.

There was plenty of space between the buttons for a reason: so that a sheet of cardboard or plastic with holes cut out for the keys could have legends written or printed on it, so that the keyboard overlay could be used with programs that made use of the display. The top of such a sheet could have notches to indicate which sheet was placed on the keyboard, thus allowing it to be used to choose between many more than 32 possibilities.

While the PCjr didn't do the trick with the notches, overlays for its keyboard were made, so that PCjr software could make use of customized keyboard arrangements easily.



The IBM 2250 came in several forms.

The Model 1 was intended to be used with a dedicated IBM System/360 computer; it had the basic operator control panel, from which that System/360 could be started up or shut down, built right into it.

The Model 2 and Model 3 were controlled by an IBM 2840 Display Control - Model 1 of that display control for the Model 2 of the 2250, and Model 2 of that display control for the Model 3 of the 2250.

The Model 4 of the 2250 display was attached to an IBM 1130 computer, and the IBM 1130 could be used by itself for computing tasks involving the display, or it could be used to give the terminal more local processing power while it was connected to an IBM System/360 computer.

Control Data, the Digital Equipment Corporation, and Xerox also made terminals of the same general type as the IBM 2250. Much later, the Vectrex home video game, pictured at right, made use of the same principle.


Pictured below is Control Data's 240 series graphics terminal subsystem, also known as GRID, for Graphic Remote Integrated Display.

This image is from an advertising brochure by Control Data for the terminal. There is a high-quality color image on the Internet of what appears to be the same model, working with a light pen on an image of a globe on the screen. That picture even appears on the front cover of the book The Power of Go Tools for some reason; presumably, the language Go is well suited to working with graphics, even if this isn't the hardware it would run on.

The internal processor in this terminal, in the cabinet on the woman's right, is a processor with the same instruction set as the Control Data 160 computer.

The GT40 interactive display from the Digital Equipment Corporation used a smaller model of the PDP-11 to provide it with processing power; an image is shown below.

The interactive graphics terminals, and the one video game, shown above all benefit from the versatility of the CRT: it need not be used only as a raster display; instead, the electron beam can be deflected along whatever path is required to trace out outlines directly. There is no reason this cannot be done inexpensively with a conventional magnetic deflection yoke, as found in a TV set, instead of the electrostatic deflection plates typically used for higher precision in oscilloscope displays.

But that is only one aspect of the versatility that the CRT offered. Another thing one can do with a CRT is choose different phosphors to use.

Thus, pictured above is a Tektronix 4006 graphics terminal, between a tape drive and a hardcopy unit.

This was a less expensive successor to the very popular Tektronix 4010 graphics terminal, and worked on the same principle.

Here, not only did the electron beam directly trace out the outlines of what was to be drawn, but it only needed to do so once. That's because the CRT was a storage tube, somewhat similar to the kind once used as a computer memory (the Williams Tube). In fact, it is because it is possible to read out which areas of a storage tube are lit - the very property that makes it usable as a computer memory - that the hardcopy unit on the right of the image could function.

A modern design, of course, would simply use memory in the terminal to replay the characters instructing the terminal to draw an image, and the printer would then draw the image again; and, indeed, many raster graphics terminals emulated the Tektronix 4010 terminal, accepting the same commands to draw an image, but presenting the image at a somewhat lower resolution.

Other choices are possible. A high-resolution color display, for use in such applications as air traffic control, might use a beam penetration CRT; this would provide a limited gamut of colors by using two phosphors, applied in layers without a shadow mask or other similar provision within the CRT. Instead, the voltage difference between the cathode and the screen would be varied to determine if the green phosphor, or the red phosphor, or both for the display of yellow lines, would be excited by the electron beam.


Although the introduction of the Macintosh brought about a more fundamental change to the computing scene than that of the IBM Personal Computer AT - computer chips had, after all, been getting steadily more powerful from the start of the microcomputer era onwards - the AT, pictured at right, was still a very important milestone in microcomputer history.

The original IBM PC was based on the Intel 8088 chip, which, although it was a 16-bit chip internally, had an 8-bit external data bus. The 80286, however, like the 8086, had a full 16-bit data bus. Therefore, the slots in the IBM PC AT were designed for a new bus, but this bus was a compatible superset of the bus in the original IBM PC, so that older peripheral cards could still work in the AT.

Also, the keyboard of the AT was revised. As there were complaints about the placement of the | and \ key between the Z key and the left-hand shift key, this was changed on the AT; the arrangement was still not what I would consider ideal, as now there was a key between the + and = key and the back space key.

Another innovation of the IBM Personal Computer AT was the introduction of new, higher-density 5¼-inch floppy disks that could store 1.2 megabytes of data on them rather than only 360 kilobytes of data.

Later computers, based on the Intel 80386 chip, were able to use the same bus as the IBM Personal Computer AT, so its basic design remained an industry standard for some time.


The May and June 1984 issues of BYTE Magazine contained a series of articles by Steve Ciarcia on the construction of an add-in card for the IBM PC which was itself a microcomputer based on the Zilog Z8000 chip. This was another of the very few ways in which one could get a Zilog Z8000 processor to use for computing.

The parts for assembling the board, or assembled Trump Cards, were available from Sweet Micro Systems, most famous for another of their products, the Mockingboard, a sound card for the Apple II computer with speech synthesis capabilities.

I haven't shown too many images of computers belonging to another very important category, the graphics workstation. On the right is an image of an Apollo DN300 computer; it used the DOMAIN operating system and, like Apollo's other early workstations, was based on a processor from the Motorola 68000 family; only later Apollo machines moved to a RISC chip of Apollo's own design.


Also, 1984 saw one of the last exciting developments in the field of 8-bit computing.

Spectravideo came out with a pair of computers with both upper and lower case on their keyboards, still something of a rarity; below is pictured the more expensive model with a keyboard made of regular keys instead of buttons, the SV-328:

The keyboard has a very nice arrangement, and this computer inspired a Microsoft standard for 8-bit computers called MSX. (The Spectravideo, however, wasn't fully conformant with the standard as it was established.)

The MSX standard was most significant in the Japanese market, but some MSX systems were sold worldwide. One MSX computer that was memorable was the Yamaha CX5M, since it included a sound chip with capabilities similar to those of its popular FM-based synthesizers; it is pictured below:

January 1985, though, brought another exciting development in the world of 8-bit computing. That was when Commodore announced the Commodore 128, shown below. This was a compatible successor to the Commodore 64 which included a Z80 processor in addition to having 128k instead of 64k of memory. A version of CP/M for this computer could be purchased separately. And, if one used a monitor, the computer did have an 80-column text output, so there wasn't a compatibility issue similar to that of the 52-column Osborne I. However, the use of a disk drive derived from the 1541 disk drive design would have impaired performance compared with more typical CP/M systems.

The fall of 1984 saw the introduction of the Mindset computer. This computer was an IBM PC-compatible computer which used the 80186 chip from Intel. In addition, it used custom graphic circuitry similar to that which would later make its appearance in the Amiga.

This computer was not a success, and quickly disappeared from the marketplace. The reason for that, of course, is not hard to guess. Although it was innovative, prospective customers would have seen little reason to pay extra for graphics capabilities that were unlikely to be used by the software available to them, since programs written for the standard set by the IBM PC could be expected to vastly outnumber those written specifically to take advantage of the Mindset's additional features.

At least the later Amiga, based on a 68000 processor, but completely incompatible with the Macintosh as well as the IBM PC, was assured of having some software written specifically for it, partly because, while it cost more than an Atari ST, it still was less expensive than the Macintosh, and so software writers had a reason to expect that enough people would own this computer to create a market for software for it.

Thanks to Microsoft providing the DirectX specification for video drivers to video card manufacturers (and we can also give some credit to SGI for OpenGL), graphics cards for IBM PC compatible computers also have a market: people who wish to play games on their IBM PC compatible computers. The Mindset didn't have the benefit of a situation like that either, in which a software layer would have allowed people who extended the graphics capabilities of their IBM PC compatible computers in other ways to run software that would also work on the Mindset, using its graphics capabilities.


The Final Stand Against the Clones

The Atari 520ST was introduced in June, 1985, although there was a delay of a month before it became widely available; it was based on the 68000 microprocessor, and offered a graphical user environment by licensing GEM Desktop from Digital Research. It was considered to be an inexpensive alternative to the Macintosh, and it was also significantly less expensive than the Amiga, although that was partly because it lacked the special graphics chips that distinguished that computer.

However, unlike the Sinclair QL, the Atari ST finally put the 68000 within the reach of the ordinary computer enthusiast in an unproblematic manner, thus putting an end to the 8-bit era, although, of course, its success was still relatively modest, given the explosion of PC clones that would come in the next year.

The Atari 520ST had 512 kilobytes of RAM.

Later models in the series included:

The Atari 1040ST, introduced in 1986, with one megabyte of RAM.

The Atari Falcon030, introduced in 1992, which used a 68030 processor.

The Commodore Amiga was introduced on July 23, 1985: it had a 68000 as its processor, but as that processor, powerful as it was by the standards of the time, was augmented by special graphics and sound chips, the Amiga had multimedia capabilities which were not available on the x86 and Windows platform until years later. Although it was less expensive than a Macintosh, its success in the market was limited; still, it lasted until Motorola stopped making chips with the 680x0 architecture. In fact, after its apparent demise, a German company acquired the rights to the system, and successors to the Amiga are still made by that company and others to this very day, but these are mainly of interest to enthusiasts, however much the machine might deserve to be a mainstream computing alternative on its intrinsic merit.

Of course, "amigo" is the Spanish word for "friend", and amiga is the feminine of that. While the intent behind the name was to present the computer as a friendly computer in as positive a way as possible, one could easily imagine a wag noting that its name designated what a typical computer nerd really wanted and didn't have.

Although the IBM PC included a socket for an 8087 floating-point coprocessor, neither the original Atari ST nor the original Amiga included a socket for the 68881, Motorola's floating-point coprocessor for the 68000.

Subsequent models in the series included:

The Amiga 2000, announced in January 1987, a version of the Amiga 1000 with internal expansion slots, for which the Video Toaster was made, famously employed for special effects in the Babylon 5 television series.

The Amiga 500, also announced in January 1987, a less expensive version of the Amiga in a compact case.

The Amiga 3000, introduced in June 1990, which used a 68030 processor.

The Amiga 4000/040, introduced in October 1992, which used a 68040 processor. The Amiga 4000, in both of its versions, included an improved graphics chipset. Also, it used ATA hard disk drives instead of SCSI hard disk drives.

The Amiga 4000/030, introduced in April 1993, which used a 68EC030 processor.


The image at left is a publicity shot of the Commodore 800, from an item in a magazine published by Commodore itself noting its announcement. This announcement took place after the Amiga was available, although work had begun on this computer before the Amiga was conceived.

It included a "blitter" chip of its own, which perhaps served as the prototype that inspired the design of the later one which was included with the Amiga.

Earlier rumors about this computer, designed in Germany, had claimed that it would be available at first only in Europe, although this was not mentioned in the official announcement from which this image is taken.

This computer would have used the Coherent operating system. Coherent was a single-user operating system which was designed to resemble UNIX. However, there was to be a multi-user server version of the Commodore 800 available, in addition to the single-user workstation pictured here. Whether the makers of Coherent had also developed a multi-user version of their operating system by the time the Commodore 800 was to be introduced, or some other consideration was involved, I do not know.

Had the Commodore 800 reached the market, it would have been another of the very few computer systems made available which used the Zilog Z8000 16-bit microprocessor. As it is, only 50 prototypes were made; of the few that survive, some are in the hands of lucky YouTube creators, who have made videos about this mysterious computer system.


Since we're discussing the Amiga and the Atari ST, an important historical note is in order:

Commodore Business Machines was founded in 1953 by Jack Tramiel (a Holocaust survivor from Poland) as a typewriter repair shop; it then began manufacturing typewriters from parts made in Czechoslovakia, and then gradually moved to calculators and eventually to computers.

It competed intensely on price, but it still made conventional products. This was unlike what Clive Sinclair was doing over at his company: making radically low-priced items - computers, calculators, and portable televisions - by paring away frills, sometimes to the point of making products that failed because they were regarded as unusable. (Although Commodore strayed into that territory at least once, since the VIC-20 displayed lines of text which had only 22 characters.)

But in January 1984, Jack Tramiel resigned from Commodore, and formed a new company with the aim of making a new computer system; soon after, that company bought the assets of Atari from Warner Communications.

That is why the Atari ST was less expensive and less ambitious than the Commodore Amiga, instead of the other way around.

Incidentally, work on the Commodore 800 was begun when Jack Tramiel was still at Commodore.



In September 1986, Compaq brought out the Compaq Deskpro 386, which used Intel's new 386 microprocessor. An image of the original 16 MHz Compaq Deskpro 386 is shown at right; it is from an advertisement, and two earlier Compaq computers are shown in the background to illustrate that company's record of advancing the state of PC compatible technology. This computer was considerably faster and more powerful than an IBM Personal Computer AT, and, with appropriate software, it could make use of more memory, which, of course, was a natural consequence of using Intel's new 80386 chip. This was the first commercially available system to make use of that chip. Of course, it was expensive, but as the years went by, prices of systems based on the 80386 chip came down; as well, a compatible chip with a 16-bit external bus, the 80386SX, was offered by Intel starting in 1988, which allowed even more affordable systems to use the capabilities that the 80386 offered over the 80286.

The 80286 had offered a protected mode which allowed the use of 24-bit addresses instead of 20-bit addresses, but at the cost of giving up compatibility with older programs while in that mode. Thus, while there was an operating system resembling UNIX for the 80286 processor, it did not become popular.

The 80386 offered 32-bit addressing, and a Virtual 8086 mode which enabled computers to offer advanced multi-user operating systems without giving up the ability to run older programs for the IBM PC. This led to the versions of Microsoft Windows that required a 386 becoming the standard, and it made Linux for the IBM PC a possibility as well.
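
As a rough back-of-the-envelope illustration of the address arithmetic involved (a Python sketch; the figures follow directly from the address widths just mentioned, and masking the real-mode result to 20 bits is a simplification):

    # In real mode on the 8086/8088, a 16-bit segment and a 16-bit offset combine
    # into a 20-bit physical address, so only 1 megabyte of memory is reachable.
    def real_mode_address(segment, offset):
        return ((segment << 4) + offset) & 0xFFFFF   # keep 20 bits

    print(hex(real_mode_address(0xF000, 0xFFF0)))    # 0xffff0, near the top of 1 MB

    # The amount of memory directly addressable with each address width:
    for bits, mode in [(20, "8086/8088 real mode"),
                       (24, "80286 protected mode"),
                       (32, "80386 protected mode")]:
        print(mode, "-", 2 ** bits // 2 ** 20, "MB")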


One memorable clone of the IBM PC, from 1985, was the MacCharlie from Dayna Communications. This device consisted of a PC clone computer that included a base extending to the left on which a Macintosh computer would sit, and a keyboard extension that fitted over the keyboard of the original Macintosh. The two computers were connected, so that they could exchange data, and so that PC programs running on the MacCharlie could display on the screen of the Macintosh computer to which it was attached.

I must apologize for the poor quality of my efforts to make an image of the computer out of a two-page ad, with a slice missing between the two pages. There was a single-page ad with a smaller image I had also found, but the image in that one seemed to me to be too small to use.


Another group of computers from 1985 that I would like to mention is the Tektronix 6000 family of intelligent graphic workstations. One of its members (I am not sure which) is pictured below:

The Tektronix 6120 Intelligent Graphics Workstation is the one I have seen most often referenced, but the series also included the 6130 and other models. These computers are notable because they were among the few computers to use the 32016 and 32032 processors from National Semiconductor.

The 32016 was originally called the 16032; initial production of that chip was beset by bugs which were eventually corrected; that chip, and its successor, the 32032, were not successful, as they were outshone by contemporary chips such as the 68000 and 68020 respectively. A later chip, the 32532, was much more competitive, but by that time, the market had turned to RISC chips, and as well, interest in the architecture had been lost. However, the handful of systems which used the 16032 and its successors was larger than that which used the Zilog Z8000.

A Sun workstation from 1985, the Sun 2/50, is shown at right; this workstation was powered by a Motorola 68010 microprocessor, making it more powerful than the home computers of that time which used the 68000, a chip formerly used in workstations. This predates the introduction of the SPARC series of RISC microprocessors designed by Sun and used in their later workstations, starting in 1987.


1985 was also the year in which DELL, founded by one Michael Dell, produced and sold its first PC compatible computer. That company was highly successful, and produced and sold servers as well as personal computers.

The IBM PC Convertible was announced on April 2, 1986.

Because it was from IBM, it was possible to fit it with 3 1/2-inch disk drives, despite the fact that the standard for the IBM PC at the time was the 5 1/4-inch disk drive. This style of 3 1/2-inch disk drive, with the sloping segment of the front, would appear again a year later, with the introduction of the IBM Personal System/2, and at that point, the 3 1/2-inch drive would become the new standard for computers derived from the IBM Personal Computer.

Its name came from the fact that the liquid crystal display could be removed, so as to allow desk space to be better organized, when the computer was being used with an external monitor.


On September 15, 1986, Apple announced the Apple IIgs; this computer used the WDC 65C816 chip, which had been introduced in 1983. It was a chip that was compatible with the very popular 6502 processor, but unlike that 8-bit chip, it was a 16-bit processor that could switch from operating as a 6502 to operating in its own native 16-bit mode.

The "gs" in Apple IIgs was presumed to stand for the Granny Smith variety of apples, although it was not identified with that officially by Apple; instead, it was said to stand for Graphics and Sound, as the Apple IIgs also included custom chips on its motherboard to give it extra graphics and sound capabilities.

The Apple IIgs included software in ROM that applications could call to handle graphics functions. Applications specifically for the Apple IIgs could and did include features such as drop-down menus, normally associated with computers having graphical user interfaces. The computer came with a mouse. And it came with the program MouseDesk which gave it a graphical user interface not unlike that of the Macintosh.

Yet what I remember from the time of its introduction is that Apple specifically avoided allowing the Apple IIgs to compete with the Macintosh - or at least this was widely claimed - which would seem to be disproven by the inclusion of MouseDesk with the IIgs.

Further searching allowed me to learn that MouseDesk was originally developed by a French company, and it worked on the Apple IIc and Apple IIe computers. Apple purchased the software, and came out with new versions of it. It was renamed the Apple II DeskTop when used as the initial graphical interface for the Apple IIgs, even though I saw an early Australian brochure for the IIgs that still called it MouseDesk.

This explains why other sources refer to the GUI for the Apple IIgs as Finder; this apparently replaced Apple II DeskTop. However, it was the Toolbox, included with the machine in ROM, that was the Application Programmer's Interface (API) used for writing programs for the Apple IIgs that fit in with the GUI, and so replacing Apple II DeskTop with Finder didn't mean that the authors of software for that computer had to start over with new versions of their programs. Had that been the case, the IIgs would have failed resoundingly, and the cause would not have been a mystery.

Regardless of what Apple may have intended, the Apple IIgs did not become a long-lasting presence on the computer scene, with the market segmented so that home and educational users bought the less expensive Apple IIgs while businesses used the Macintosh. No successors to the IIgs were made by Apple, as it focused instead on the Macintosh.

Did the Apple IIgs fail in the marketplace? Windows 3.1 wasn't around yet (see below), but PC clones, with all their software, even without a GUI, and the Atari ST and the Amiga, which had both a GUI and a 68000 processor, competed with it. Also, while an Apple IIgs could run Apple II software much faster than the Apple II, apparently it was a common complaint that software written specifically for the Apple IIgs, which tried to offer nearly the same features as equivalent Macintosh software, was annoyingly slow.

Focusing on the Macintosh was really the only response to this situation open to Apple. The only way to remedy the issue would have been to prevail on the Western Design Center (a company with roots in MOS Technology, the Commodore-owned maker of the original 6502) to design a faster compatible successor to the 65C816; and since, for example, Commodore never tried to make a successor to the Commodore 64 and Commodore 128 that used this chip, the mass market required to make that possible didn't exist.


1986 was the year that the first Packard-Bell personal computers reached the market. The company that made the Packard-Bell 250 and Packard-Bell 440 computers discussed earlier in this history was acquired by Teledyne back in 1968, and only the name was purchased for the new company that made those computers.


The image above is from Wikimedia Commons, licensed under the Creative Commons Attribution 3.0 Unported License, and is thus available for your use under the same terms.

Its author is Bilby.

In January 1987, Apple made a revised version of the Apple IIe available for those who were still faithful to that computer, no doubt because of the large amount of software available for it. This revised Apple IIe had a numeric keypad, with the keyboard arrangement matching that of the detached keyboard of the Apple IIgs. It is pictured at left.

It is usually known as the Platinum Apple IIe, because its color scheme was also changed, but this name was applied informally by users, rather than having been given to it by Apple.


On March 2, 1987, Apple announced the Macintosh II. This computer used the 68020 processor, and came in a conventional desktop computer form factor, as can be seen from the image at right, with the CPU and floppy drives in a box, with a separate monitor. The keyboard was separate also, as in the original Macintosh, but added a numeric keypad; the standard design for its keyboard is shown below:

The image at right shows a unit with an optional version of the keyboard which closely resembled that of the IBM PC in layout.

The Macintosh SE, in the form factor of the original Macintosh, but with somewhat different styling, announced on the same date, came with a 68000 microprocessor, but it could be ordered with an optional card installed in it which upgraded it to a 68020 microprocessor like that used in the Macintosh II.



March 1987 was also the month that the Amiga 2000, pictured at left, was introduced. This computer still used the Motorola 68000 processor, but it did have improved expansion capabilities compared to those of the original Amiga 1000.

The most notable contribution of the Amiga 2000 to the reputation of the Amiga was due to the ability to use it with a third-party product, the famous Video Toaster, from NewTek.

I was able to find the image above of the Video Toaster itself in an advertisement; I have retouched this image, and the shadow of the card as depicted in the bottom right of the image may not be accurate.

Although it was announced at the 1987 World of Commodore exposition, it was not until December 1990 that it was released as a commercial product. It gave the Amiga the ability to do video editing, including superimposing graphics generated on the computer on a video stream.

Lightwave 3D software for generating three-dimensional graphics and animation was included in the package with the Video Toaster.

The science-fiction series Babylon 5 made use of the Video Toaster for its special effects; however, even initially, they used several Amigas networked together to generate their digital animations, and later they upgraded to a more sizable render farm built around Pentium PCs and DEC Alpha computers.

It was in 1987 that Gateway 2000 entered the field of PC compatible manufacturing; for a time, the firm was quite successful, but eventually, after a period of decline, it was acquired by Acer in 2007.

On September 19, 1988, Apple announced the Macintosh IIx, similar to the Macintosh II in appearance, which used the 68030 processor.

Note how this compares with the Atari TT, introduced in 1990, the Amiga 3000, introduced in June 1990, and the Sharp X68030, introduced in March, 1993.


The IBM PC was a spectacular success for IBM, even if, ultimately, the popularity of less expensive compatible computers meant that Microsoft and Intel ended up making more money from its success than IBM did. Before it left the personal computer business on May 1st, 2005, selling that business to Lenovo, IBM had attempted to deal with the issue by introducing the Personal System/2, announced on April 2, 1987. In addition to moving from the 5 1/4" floppy disk to the 3 1/2" floppy disk (previously introduced for the IBM PC Convertible), and introducing a smaller keyboard connector as well as a similar mouse connector (previously, mice had to be connected via a serial port), the PS/2 introduced the Micro Channel Architecture bus, for which a higher rate of royalties would be charged, as it embodied a considerable amount of new technology.

A Personal System/2 Model 50 is pictured at right, and a Personal System/2 Model 80 is pictured at left. The Model 80 was included in the initial line-up of PS/2 systems, and was the first 386-based computer in IBM's personal computer line. Note that, as befits an expensive and powerful machine, it is not only in a tower case, but it has extensions on its sides so as to keep it from tipping over.

The initial PS/2 line-up consisted of the following systems: the Model 30, based on the 8086; the Models 50 and 60, based on the 80286; and the Model 80, based on the 80386.

The market, however, saw little reason to subscribe to a more proprietary standard which did not offer obvious advantages. OS/2 was announced shortly before the PS/2, but Windows was already in existence, and IBM had allowed Microsoft to be involved with OS/2, which Microsoft later abandoned in favor of Windows NT. Had IBM, instead of Microsoft, been the company that developed the standard GUI for the IBM PC, of course, it would have succeeded in diverting the gravy train its way. However, to develop Windows, Microsoft had to license key interface elements from Apple, and either Apple or IBM might have balked at a similar relationship between those two companies instead.

Thus, for a time, some competing PC-compatible computers included slots for the EISA bus. This stood for Extended Industry Standard Architecture, the ISA bus being the 16-bit bus of the IBM Personal Computer AT. The EISA bus offered similar functionality to IBM's MCA bus. Eventually, Intel devised the PCI bus, which became the standard still in use to the present day.


The extent to which IBM regarded the success of the Personal System/2 as vital might be considered to be evidenced by the fact that it spared no expense on the advertising campaign for it. Thus, an early advertisement for the PS/2 series showed this photo of excited staff members at some office opening up their new PS/2 computers... but rather than being portrayed by anonymous models, they were portrayed by well-known actors who had appeared in a very popular television series.

From left to right, in this image we see William Christopher, Harry Morgan, Jamie Farr, Gary Burghoff, Wayne Rogers, Loretta Swit, and Larry Linville. These were, of course, all actors who starred in the very popular television series M. A. S. H., which ran for eleven consecutive seasons on CBS, from the fall of 1972 to just before the spring of 1983. It is considered to have been, in many respects, the most popular and successful television series of all time.

Incidentally, Alan Alda did also appear in photographs in IBM advertisements of the time; he appeared by himself in an advertisement for the AS/400 where he is shown watering a plant - the plant grows larger as a company grows larger, and changes from a small AS/400 computer to a larger one.

In case you're wondering, there was also a three-page version of this advertisement, in addition to the two-page version shown above. In the two versions of the ad, a different portion of the larger potted plant after watering is lost in the gutter between pages. So, no, I did not have a mole inside IBM's advertising agency.

Incidentally, from the accounts I've read of the production of the television show M. A. S. H., it has been claimed that there was some friction between Gary Burghoff and several other cast members, but not anything similar concerning Alan Alda. Also, Jamie Farr appeared on his own as well, in an advertisement for ASCII terminals made by IBM. So I would presume Alan Alda's absence from the large group advertisement for the PS/2 is due to decisions made by IBM or its advertising agency.



June, 1987 was a historic month for computing. This was the month in which the Acorn Archimedes computer made its debut. Shown below is an Acorn Archimedes A410/1 computer.

This image from the Wikimedia Commons was graciously put into the public domain by Paul Vernon.

After their success as the supplier of the BBC Micro, Acorn Computers ran into problems due to the lack of a 16-bit computer in their lineup. They decided on a radical strategy to leapfrog the competition, designing a RISC processor with an architecture of their own. In an early demonstration, it outperformed an 80386-based computer.

Of course, this architecture became very successful later on, as the one used almost universally in smartphones and tablets, even if, at the time, bringing out a computer based on a unique processor architecture was a very bold step.

In the fall of 1987, the SHARP Corporation of Japan brought out their X68000 computer, which was based on a Motorola 68000 CPU (actually, they used chips second-sourced by Hitachi) and which had additional graphics hardware, pictured at right. This computer could be thought of as their (belated) answer to the Atari ST and the Commodore Amiga.

A number of models of this computer were available over the next few years with improvements and advancements, including the X68030 which, as its name indicates, used a 68030 CPU.

These computers were not available outside Japan.

One important reason why this computer is of special significance is that on April 1, 2000, the Sharp Products Users' Forum was successful in making arrangements for Sharp, Hudson (the makers of Human68k, the operating system for the X68000), and several other contributing companies to release the BIOS ROMs of the SHARP X68000, its operating system and windowing environments, and its SX-Window C compiler suite into the public domain. Nothing similar has happened in the case of the Amiga or the Atari ST, which are seen to still be of commercial value despite being obsolete.

Another computer introduced in 1987, which received a lot of favorable publicity at the time of its introduction, was the Canon Cat. It used a Motorola 68000 as its processor, but its speed was at least somewhat reduced by the use of the interpreted language FORTH for its operating system.

Its designer was Jef Raskin, who had worked on the original Macintosh at Apple. His ambition with the Cat was to produce a computer that was very easy to use, but which achieved this goal without going to the elaborate length of providing a graphical user interface. It had the unusual feature of treating all the user's documents as parts of one single large text file, on the theory that it would be easier for the user to understand searching for text than a directory structure.

It was not successful, and to me, the reason is obvious. It may have been much simpler than the Macintosh, but that simplicity was not reflected in a correspondingly low price, and prospective customers demanded more power and versatility from a computer in its price range than it offered. However, those who admire the Canon Cat have other possible reasons for its failure to point to: possible pressure from Apple, which was a major Canon customer at the time, and infighting between divisions of Canon over which one should have responsibility for this system.



In the year 1988, Ardent brought their Titan supercomputer to market. This system is pictured at right. It was built from MIPS R2000 microprocessor chips, which gave it a conventional instruction set, plus custom circuitry that performed all the floating-point arithmetic and also allowed it to function as a vector supercomputer, resembling the Cray-1 and its successors, at a much lower price.

On the left, from a 1988 advertisement, is an IBM RT workstation. This appears to have used the multi-chip processor which preceded the PowerPC line of single-chip microprocessors; there was a choice of a 170 nanosecond or 100 nanosecond cycle time (5.88 or 10 MHz) and floating-point chores were handled by a Motorola 68881 math coprocessor.


In 1989, Apple responded to popular demand by bringing out the Mac Portable computer. This computer was somewhat large for a laptop computer. It used sealed lead-acid batteries from Gates to operate when not plugged in, rather than the more common nickel-cadmium batteries. Presumably, this was because it had heavier current demands from being largely based on the design of the desktop Macintosh.

As it was expensive, it had limited success in the market.


Below is illustrated a workstation from DEC, the DECstation 2100, from 1989. This RISC workstation was based on the MIPS chip, as DEC's own RISC chip, the Alpha, would only be introduced later, in 1991, and DEC would introduce a line of Alpha-based workstations in 1993.

In 1990, IBM came out with the Personal System/1 line of computers. One member of that line was an all-in-one computer which had a configuration reminiscent of the original Macintosh, even if it looked quite different with IBM styling. Pictured at left is another member of that lineup, with a conventional desktop form factor, along with the software that came with it. Note that the keyboard is not a Model M, but is designed to save space around the edges.

As the PS/1 was aimed at the home market, it is sometimes referred to as the successor to the PCjr. The PCjr, however, while essentially compatible with the IBM PC, was still limited in a number of ways compared to the IBM PC, being considerably stripped down to permit it to be made at a lower cost, while the systems in the PS/1 lineup were genuine PC-compatible computers.


In April, 1992, Microsoft offered version 3.1 of their Microsoft Windows software to the world. This allowed people to use their existing 80386-based computers compatible with the standard set by the IBM Personal Computer to enjoy a graphical user interface similar to that of the Macintosh, if not quite as elegant, at a far lower price. One major advance over version 3.0 was the addition of support for the TrueType font standard, licensed from Apple.

There had been a Microsoft Windows 1.0, and a Microsoft Windows 2.0, and a 3.0 as well, of course. The first version of Microsoft Windows required all the windows that were open to be tiled on the screen, rather than allowing overlapping windows as on the Macintosh and the early Xerox machines that pioneered the GUI, and this was generally seen as a serious limitation by reviewers at the time. Windows 3.0 was promoted by an arrangement that allowed Logitech to include a free copy with every mouse that they sold. (When Windows 3.1 came out, Logitech actually sued Microsoft, because Microsoft had decided that Windows 3.1 was good enough that it no longer needed to be promoted in that way. Needless to say, their suit was unsuccessful in the courts.)

It was Windows 3.1, however, that enjoyed the major success that led to Windows continuing the dominance previously enjoyed by MS-DOS. The major factor usually credited for this is that Windows 3.1 was the first version to include TrueType, a technology licensed from Apple, thus allowing it to be used for preparing attractive documents on laser printers in a convenient fashion, with the ability to see the fonts being used on the computer's screen, just as had been possible on the Macintosh.

Except for TrueType, Windows 3.0 (May 22, 1990) already offered most of the features of Windows 3.1 that made it reasonably useful. And it included a Reversi game, which was dropped from Windows 3.1. It was with Windows 2.1 (May 27, 1988), which was distributed both as Windows/286 and Windows/386, that, in its Windows/386 form, some of the important features of Microsoft Windows that the Intel 80386 architecture made possible were first introduced.


A brief note on digital vector fonts might be in order here.

Apple developed the TrueType format, which allowed the curved portions of character outlines to be represented by quadratic spline curves, as an alternative to licensing a digital font format that was already in existence and wide use at the time, Adobe's Type 1 fonts, which used cubic Bézier curves.
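
To illustrate the difference between the two kinds of curve, here is a small Python sketch; the control points are made up for the example and are not taken from any actual font:

    # TrueType outlines are built from quadratic curves: each curved piece has one
    # off-curve control point. Type 1 outlines use cubic Bezier curves, with two.
    def quadratic_point(p0, p1, p2, t):
        return tuple((1 - t) ** 2 * a + 2 * (1 - t) * t * b + t ** 2 * c
                     for a, b, c in zip(p0, p1, p2))

    def cubic_point(p0, p1, p2, p3, t):
        return tuple((1 - t) ** 3 * a + 3 * (1 - t) ** 2 * t * b
                     + 3 * (1 - t) * t ** 2 * c + t ** 3 * d
                     for a, b, c, d in zip(p0, p1, p2, p3))

    # Evaluate the midpoint of each kind of curve for arbitrary control points.
    print(quadratic_point((0, 0), (50, 100), (100, 0), 0.5))        # (50.0, 50.0)
    print(cubic_point((0, 0), (30, 100), (70, 100), (100, 0), 0.5)) # (50.0, 75.0)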

The Adobe Type 1 font format, however, was not the first digital font format in existence. One which preceded it was the Ikarus font format, developed by Peter Karow. This format is still supported by the program IkarusMaster within the FontMaster font utilities package from Dutch Type Library (DTL).

And Donald Knuth devised METAFONT, which instead of describing characters in terms of outlines, described a center line to be drawn with an imaginary pen nib which was also described. This accompanied his TeX typesetting project.

But the granddaddy of all the electronic outline font formats was devised by Peter Purdy and Ronald McIntosh back in the 1960s for the Linofilm electronic CRT typesetter. This format used Archimedean spirals instead of Bézier curves, and thus was less sophisticated than what would come later. This is because the Archimedean spiral was an obvious and mathematically simple line of varying curvature that could substitute for the draftsman's French curve.
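
For the curious, the Archimedean spiral is simply the curve whose radius grows in direct proportion to the angle, r = a + b * theta; here is a short Python sketch (with arbitrary constants, purely for illustration) that generates points along such a curve:

    import math

    # The Archimedean spiral: the radius grows in direct proportion to the angle,
    # so its curvature changes smoothly along its length, much like a French curve.
    def archimedean_spiral(a, b, turns, samples=24):
        points = []
        for i in range(samples + 1):
            theta = turns * 2 * math.pi * i / samples
            r = a + b * theta
            points.append((r * math.cos(theta), r * math.sin(theta)))
        return points

    # The first few points of a two-turn spiral with arbitrary constants a and b.
    print(archimedean_spiral(a=1.0, b=0.5, turns=2)[:3])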


On this page, I've focused on the dramatic early days of the microcomputer, when many different companies made their own incompatible computers. Particularly after Windows 3.1 made an adequate GUI available for users of PC-compatible computers (with a video card, and a processor, more advanced, of course, than what the original PC offered), so that the Macintosh wasn't the only alternative if you wanted a GUI, the market largely settled down to multiple makers of similar "clone" PCs, with the only real competition being between Intel and the few other chipmakers licensed to make x86 processors, such as Cyrix at one time, and AMD today.

But the Macintosh still retained a presence. For a time, Steve Jobs parted ways with Apple, during which he offered his own new computer, the NeXT, which was built around BSD Unix. That computer had a monochrome high-resolution display, on which four gray-scale levels were available, and is pictured below.

Initially, the NeXTcube shipped with a copy of Mathematica, which definitely tempted me to run out and buy one if I could have afforded it!

It might be noted that the NeXT was introduced in 1988. It was not until 1992 that Windows 3.1 was introduced, and not even until 1990 that Windows 3.0 was introduced; at the time, the version of Windows available was Windows/386, which had not yet achieved massive popularity. The Amiga was still a popular and viable platform. So, while the computers descended from the IBM PC were definitely dominant, this dominance had not yet reached the absolute nature that it has today.

Below is an image of two powerful computers offered in 1992 by Silicon Graphics: the Indigo 2 workstation, and the Challenge server.

The Indigo 2 was available with a range of MIPS processors, three speeds of the R4000, or an R4400. Later related models used the R8000 or the R10000. These processors were used in the Challenge servers (or supercomputers) as well.

These computers ran IRIX, Silicon Graphics' licensed version of UNIX.

It wasn't just specialized companies like Sun and Apollo that made powerful workstation computers. Major computer manufacturers also entered this market; for example, IBM made workstations based on PowerPC chips.

The Model 735 Apollo workstation pictured at left, also from 1992, used the PA-RISC microprocessor, Apollo having been acquired by Hewlett-Packard.


In September, 1992, IBM announced the ValuePoint series of computers. A model from that series is pictured at right.

In 1993 (at least in the United States domestic market; it began a year earlier in Europe) a line of computers was introduced by a company named Ambra, billed as An IBM Corporation. This was IBM's way of making a line of computers that would attempt to compete with the low prices of clone systems while distancing that effort from their brand.

These computers encompassed a range of performance; at left is pictured the top-of-the-line model, which could optionally come with two Pentium processors.



On March 14, 1994, Apple announced the first Macintosh computers which used microprocessors based on the IBM PowerPC architecture instead of 680x0 microprocessors.

Another important event in Macintosh history was the short-lived Mac OS licensing program. From early 1995 to mid-1997, other companies were permitted to license the ROMs of the Power Macintosh and Mac OS 7, shipping them with the compatible systems they manufactured.

Two of the companies which took advantage of this offer were Power Computing and UMAX. An early system by Power Computing, possibly the Power 100, is shown at left, and the SuperMac S900 (the SuperMac brand was obtained from Radius, a maker of Macintosh-compatible peripherals, for use in the United States; in other countries, UMAX used the brand name Pulsar for this series of computers) is shown at right.



In September 1994, IBM announced the Aptiva; this line of computers replaced the PS/1, but included models with a wide range of performance levels despite being aimed at the home market. One model is pictured at left.


In 1988, 2.88 megabyte floppy disks became available. Not many computers used them, but the magnetic media they used had the capacity to store much more than 2.88 megabytes of data; the limiting factor was the accuracy with which the read and write heads of a floppy drive could be positioned.

As a result, a class of disk drive known as the "floptical" drive was developed, where one surface of a disk was coated in the same way as the disk in a 2.88 megabyte disk, and the other surface had a printed pattern on it for an optical sensor which would guide the magnetic read and write heads for the other side of the disk.

The most well-known product in this category was the Zip drive by Iomega, pictured at left, which was introduced in late 1994.

A year later, Iomega brought out a removable hard drive, called the Jaz, pictured at right.



The last version of the Commodore Amiga to be released was the Amiga 4000. Initially, it was a desktop machine with a 68040 CPU; later, a version with a 68030 CPU became an option, and a tower version with the 68040 CPU also became available, pictured at right in an image graciously released into the public domain by Kaiiv through Wikipedia (which I have slightly retouched). The tower version had extra ports as well as one extra expansion slot.

As the A4000T was the most capable Amiga ever offered, and even though it continued to be manufactured by Escom after the demise of Commodore, it is now a rare collector's item sold at premium prices.


June 1998 was the month in which the storied Digital Equipment Corporation was acquired by Compaq. Compaq itself would later be acquired by Hewlett-Packard in May, 2002.

At the time Steve Jobs returned to Apple, its market share had sunk to a perilously low level. While he brought many substantive improvements to the Macintosh over time, something was needed quickly to revive interest, so he began with something that many would regard as trivial: a new Macintosh with different visual styling.

Of course, it is the original iMac to which I refer. The initial model, pictured at left, was in a color called "Bondi Blue", after the waters off of Bondi Beach in Australia. It was released on August 15, 1998.

Shortly afterward, the iMac became available in five colors: Orange, Lime, Strawberry, Blueberry, and Grape, shown in the image at right.

This bought Apple time, and saved it from the brink, but it invited a degree of derision from the PC camp.

The very first iMac used a 233 MHz PowerPC 750 processor, manufactured by both Motorola and IBM, and was thus designated a G3 system by Apple; later versions of the CRT-based iMac used faster versions of the processor. Apple continued to use the iMac brand name for LCD all-in-one computers using Intel x86 processors, and at the present day uses it for computers using their own ARM-based chips. It had been necessary for Apple to switch from the Motorola 68000 architecture to the PowerPC when Motorola discontinued development of that architecture in 1994.

The last version of the Atari ST was released in 1992, and discontinued very shortly after its release; and Commodore went bankrupt in 1994, following the failure of their CD32, a game console that was a more compact machine based on the Amiga. Thus, the demise of the 68000, although it complicated (but did not completely prevent) attempts by enthusiasts to revive the Amiga architecture, was not to blame for the disappearance of these alternatives from the market.

Of course, the Macintosh was never a computer that you could upgrade yourself outside of very narrow limits, and it sold at a significant price premium; this left typical PC users scratching their heads. That was true as far back as the days of the original Fat Mac, when you couldn't simply buy a 128K Mac cheaply and later add memory to bring it up to 512K, and as recently as 2019, when the only Macintosh you could open up and put peripheral boards into was the top-of-the-line Mac Pro, at a price of $5,999.

The closed and restrictive nature of the App Store for the iPhone and iPad, and the appearance that Apple was emphasizing those products, and moving away from the Macintosh, also did not help matters.

There were and are people who are devoted to Apple products, and find PC-derived computers and Android smartphones to be far inferior. But Apple's products seem to be niche products, rather than being for everyone; budget-conscious consumers on the one hand, and technically-oriented enthusiasts who want to have control and freedom on the other both have reason to be less than enthused over Apple's products.

And, yet, how can one offer a premium-quality computing experience without doing much of what Apple is doing?

The fundamental problem, that the availability of third-party software is critical to the value of a computer, which leads to nearly everyone jumping on the bandwagon of the most popular machine, hence eliminating alternatives, doesn't seem to be solvable.

Of course, one other alternative survives in addition to the Macintosh. Linux.

When the Yggdrasil Linux CD became available at the end of 1992, the same year that Windows 3.1 came out, it became possible for ordinary people to actually try Linux for themselves.

Nowadays, of course, distributions come on CDs attached to the covers of magazines, and Linux can easily be downloaded from the Internet. But back then, downloading a large operating system like Linux over a dial-up modem did not bear consideration.

Even more so than the Macintosh, however, Linux is too large a topic for me to adequately discuss here.



At left is pictured an Internet appliance, the Netpliance I-Opener, which was offered for sale at $99 in late 1999. This was less than its cost of manufacture, but the loss was expected to be recouped, as the device could only be used with an Internet service provided by its maker for $21 a month.

This did not work out as planned; hackers found a way to transform the device into a full-fledged computer, and the company mismanaged its response, leading to problems with the Federal Trade Commission. Selling hardware below cost is not a unique practice, though; video game consoles that can only use their manufacturer's game cartridges, and even inkjet printers that can only use ink supplied by their manufacturer, are also often sold at a price involving an initial loss.

Be that as it may, this device is noteworthy for having on its keyboard a key with the ultimate special function, long sought after by programmers burning the midnight oil in intense coding sessions.

This is not an April Fool's joke. It really did bear, on its keyboard, a "Pizza" key!!

Of course, the key did not cause a pizza simply to materialize, nor did it directly initiate the preparation or cooking of one from stocks in one's refrigerator. Instead, it did something that was entirely technically feasible: it brought up the web page for Papa John's pizza, that company having paid a sponsorship fee that made an additional contribution toward subsidizing the device's low price.

None the less, I am inclined to view this as a legendary moment (at least, of a sort) in the history of computing.


It was at the 2005 WWDC, which began on June 6, 2005, that Apple announced that it would switch from using the PowerPC chip in Macintosh computers to Intel chips with the x86 architecture. As both the 68000 and the PowerPC were big-endian, while the x86 is little-endian, this led to some concern about a need for changes to the formats of files containing data.
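To make the concern concrete, here is a minimal sketch in C (written for this page; the value and layout are invented for illustration, not taken from any actual Apple file format) of why byte order matters when the same data file is read on machines of different endianness: a multi-byte value has to be reassembled from individual bytes in the order the file format defines, rather than simply overlaid onto memory.

    /* Minimal illustration: reading a 32-bit value that a big-endian
       machine (such as a 68000 or PowerPC system) wrote to a file,
       on a little-endian (x86) machine. */
    #include <stdio.h>
    #include <string.h>
    #include <stdint.h>

    /* Reassemble four bytes stored most-significant byte first. */
    static uint32_t read_big_endian_u32(const unsigned char b[4])
    {
        return ((uint32_t)b[0] << 24) | ((uint32_t)b[1] << 16) |
               ((uint32_t)b[2] << 8)  |  (uint32_t)b[3];
    }

    int main(void)
    {
        /* The value 0x12345678 as a big-endian machine would store it. */
        unsigned char bytes[4] = { 0x12, 0x34, 0x56, 0x78 };

        uint32_t portable = read_big_endian_u32(bytes);

        /* Naively overlaying the bytes onto an integer gives a
           byte-swapped result on a little-endian machine. */
        uint32_t naive;
        memcpy(&naive, bytes, sizeof naive);

        printf("explicit big-endian read: 0x%08X\n", (unsigned)portable);
        printf("naive in-memory overlay:  0x%08X\n", (unsigned)naive);
        return 0;
    }

On an x86 machine, the explicit read yields 0x12345678 while the naive overlay yields 0x78563412; a file format must therefore specify a byte order, and programs reading it must honor that order regardless of the machine they run on.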

The eventful year of 2005 was also the one in which IBM withdrew from the personal computer business, selling their business in that area to Lenovo.

It was on November 10, 2020 that Apple announced the M1 processor, first in a line of chips based on the ARM architecture and designed by Apple itself, and the transition of the Macintosh from Intel processors to these new chips.


The history of the microcomputer up to this point has been mostly about individual microcomputers. The microchips within them, of course, played an important role as well, such as the Intel 8086, the Motorola 68000, and so on, but it was natural to discuss those chips in the context of the computer systems that used them.

While it would still be possible to continue a discussion of individual models of computer systems from Apple beyond this point, and some individual models from other makers are also of note, such as the Sony VAIO UX to be discussed on a later page, in general, in a world dominated by generic computers built with motherboards designed to specifications supplied by Intel and AMD, a new chip would no longer be associated with any one particular computer system the way the 80286 was associated with the IBM Personal Computer AT.

Thus, the history of the microcomputer continues on from this point as primarily a history of chips instead of a history of systems. Of course, particularly when one thinks of the experience faced by the user, it is also a history of successive releases of Microsoft Windows and other operating systems.

Another Exciting Development

The IBM PC, from 1981, was mentioned above. It had a socket on the motherboard for the Intel 8087 floating-point coprocessor. The floating-point format that was ultimately embodied in the IEEE 754-1985 standard was, of course, originally devised by Intel for the 8087, its coprocessor for the 8086 microprocessor; the 8087 itself was announced in 1980.
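To show what that format looks like in its single-precision form as ultimately standardized (1 sign bit, 8 exponent bits biased by 127, and 23 fraction bits with an implicit leading 1), here is a minimal C sketch written for this page; it assumes, as is true on virtually all current machines, that the C float type uses the IEEE 754 single-precision format.

    /* Decompose an IEEE 754 single-precision value into its sign,
       biased-exponent, and fraction fields. */
    #include <stdio.h>
    #include <string.h>
    #include <stdint.h>

    int main(void)
    {
        float x = -6.25f;                 /* -6.25 = -1.5625 * 2^2 */
        uint32_t bits;
        memcpy(&bits, &x, sizeof bits);   /* view the raw bit pattern */

        uint32_t sign     = bits >> 31;            /* 1 bit            */
        uint32_t exponent = (bits >> 23) & 0xFFu;  /* 8 bits, bias 127 */
        uint32_t fraction = bits & 0x7FFFFFu;      /* 23 bits          */

        printf("bits     = 0x%08X\n", (unsigned)bits);
        printf("sign     = %u\n", (unsigned)sign);
        printf("exponent = %u (unbiased %d)\n",
               (unsigned)exponent, (int)exponent - 127);
        printf("fraction = 0x%06X\n", (unsigned)fraction);
        return 0;
    }

For -6.25, the program prints a sign of 1, a biased exponent of 129 (that is, an unbiased exponent of 2), and a fraction field of 0x480000, which is 0.5625 in binary; putting the implicit leading 1 back gives 1.5625 × 2², negated by the sign bit.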

Since it looked like that format was going to become the standard, however, other manufacturers implemented it in their computers even before the 8087 came out, let alone before the IEEE 754 standard was finalized and officially adopted. One example of this was the IBM System/38 computer, announced on October 24, 1978, and pictured below.

The IBM System/38 was the predecessor of the IBM AS/400 line of computers, and, ultimately, of IBM's current IBM i operating system for its POWER servers (formerly, IBM sold POWER servers designed for direct use of the PowerPC instruction set separately from those intended as "System i" machines).

It was based on the technology developed for IBM's Future Systems (FS) project. This project became known to the public in May 1973, as a consequence of disclosures in the Telex vs. IBM antitrust trial. Originally, it was expected that this project would replace the IBM System/370 and change the direction of the industry; notably, industry pundit James Martin gave seminars on how to plan for this large change.

IBM abandoned the plan to switch to FS in 1975, but continued developing one of the machines in that series, and this was what became the System/38.

Conclusions

The story of aids to computation, from its earliest beginnings, isn't the kind of story that would lead to conclusions; rather, it is a saga of human achievement.

But the story of the microcomputer, being a time of intense competition, with the rise and fall of many computer products and companies, would seem to be a testbed for what works and what doesn't. Are there conclusions to be drawn from that period?

In some ways, to me, the microcomputer era illustrated principles that were well known from the mainframe era that preceded it. Home users buying a computer for personal use, strange to relate, sought the same things as the managers who oversaw contracts for mainframe systems costing hundreds of thousands of dollars: to avoid vendor lock-in, to have an upgrade path, and to have software availability. Joe Consumer was no fool, and where the same basic principles applied to small computers as to large ones, they were recognized.

Of course, though, while there were some basic commonalities, there were also huge differences between the two markets.

And there were major transitions during the microcomputer era.

There was the technical shift from 8-bit computers to computers that were 16 bits and larger. There were two instances where an upgrade path of a sort was offered across this huge technical gulf: the IBM PC, where the 16-bit 8086 architecture had so much in common with the popular 8080 of the 8-bit era that many applications were ported from CP/M to the IBM PC, and the Apple IIgs, which used the WDC 65C816 chip to add a 16-bit mode to the 8-bit 6502-based Apple II.

There was a gradual shift from computers that came with a BASIC interpreter, for which users wrote a lot of their own programs, to computers that mostly relied on the use of purchased software; this shift was then made complete by the transition to computers with a GUI, like the Macintosh, or to using Microsoft Windows on the PC platform.


From my point of view, one of the saddest things that happened was the abandonment of the 68000 architecture. At the time, it was thought that it would be easier to keep up with the technical development of the successors of the 8086 by building around a simple RISC architecture, as the PowerPC was, instead of the 68000, which, like the 8086, was CISC. But this broke software compatibility, and so while the Macintosh survived the transition, the Atari ST and the Amiga did not.

And the PowerPC ended up as only one RISC architecture among many; unlike the 68000, which for a time was the one obvious alternative to the 8086.

So there is no longer a real battle for the desktop; only AMD can compete against Intel - at least, as far as most of us are concerned. Now that Apple has abandoned the Intel Mac for Apple Silicon, based on the ARM architecture, indeed, there is another ISA in use on the desktop, but the Macintosh is a product with a premium price. ARM chips are available from other suppliers... for use in smartphones. While Apple has proven that the ARM architecture is also suitable for more heavyweight CPUs, there is no third-party off-the-shelf supplier of that kind of ARM chip because there aren't really any prospective purchasers of such a chip out there.

The Sony PlayStation 3 video game console used the Cell processor, which included a main CPU with the PowerPC architecture, and so video game consoles are an obvious potential market for a powerful processor with an alternative ISA.


Lessons can be learned from the individual stories of computers that succeeded for a time, compared to those that failed quickly. Several computers designed to give their makers the additional profits that come from controlling the market for their software failed, because that control reduced the value of those computers to their purchasers.

The IBM PS/2, which offered no real benefit to justify transitioning away from an existing standard to one that was more proprietary, did not work out well for IBM, even if it did lead to the transition of the PC in general to the 3.5" floppy, to a smaller keyboard connector, and to the use of a similar connector for the mouse.

The Macintosh and the HP 9845, each in their own way, showed that a product certainly could be a success on the basis of sheer innovation. And the Commodore 64 demonstrated that offering a reasonably good product at an excellent price was still a formula for success as well. But the biggest success, that of what is often termed the "Wintel platform", taught the most dramatic lesson: the importance of having the value of a computer multiplied by a large pool of available software.

Since tablets and smartphones, unlike desktop personal computers, were a new market that initially wasn't saturated, that was where the excitement was, with many pundits viewing it as foolish to expect any more real change or progress in the desktop PC beyond the gradual improvement coming from technical progress. In this view, there is too much software for Windows for any new computer to come along and generate interest, so the war for the desktop is over.

While that may not necessarily be certain, I definitely have to admit that I know of no obvious way for anyone to start a business making a new, incompatible, desktop personal computer that can somehow overcome that obstacle. The one thing that might have some hope of success would be to address a niche market with something well suited to it.


When I recently added a mention of the HP 9845 computer to this site, since some referred to it as the first workstation computer, I asked myself if I had overlooked any other computers that were very influential.

But which computers were the most influential?

During the microcomputer era, the most obvious computers in that category would be the IBM PC, for setting the standards still followed today, and the Macintosh, for establishing the importance of the GUI.

Prior to the microcomputer era, the most influential computers would appear to have been the IBM 704, the IBM System/360, and the DEC PDP-11.

Some other computers can be identified as runners-up as well.

The IBM 305 RAMAC introduced the hard disk drive.

The PDP-8 established that a great many individuals would want to purchase a computer if it were at all possible, which meant that later, when it became possible to make a computer like the Altair 8800, it was realized that there would be a market for it. And, with DECtape, the PDP-8 showed that even a lesser substitute for a hard disk drive as a random-access storage device would be useful and desirable.

The floppy disk drive was invented by IBM to load the System/370 Model 135 and Model 145 with microcode whenever they were turned on, prior to booting up (Initial Program Load, or IPL). System/370 was a revision of the System/360 that was announced on June 30, 1970; the Models 155 and 165 were included in that initial announcement. The Model 145 was announced on September 23, 1970, and the Model 135 was announced on March 8, 1971.

The Altair 8800 launched the microcomputer revolution. Microsoft BASIC was developed for it, and once floppy disk drives were offered for the Altair 8800 as a peripheral, Digital Research's CP/M came into use on it as well. As Microsoft MS-DOS was closely modeled on CP/M, the Altair 8800 had a very big influence on the later IBM PC.

And CP/M, in turn, was very similar to OS/8, an operating system for larger PDP-8 systems that included a hard disk drive.

So the most influential computer systems were well-known, as one might expect, rather than obscure or forgotten.


[Next] [Up] [Previous]