
External economics and economic progress: the case of the microcomputer industry.

In a passage that mocked the neoclassical theory of competition as much as the anti-business sentiments of non-economists, Joseph Schumpeter singled out the large business enterprise as the dominant source of economic progress in modern times.

As soon as we go into the details and inquire into the individual items in which progress was most conspicuous, the trail leads not to the doors of those firms that work under conditions of comparatively free competition but precisely to the doors of the large concerns--which, as in the case of agricultural machinery, also account for much of the progress in the competitive sector--and a shocking suspicion dawns upon us that big business may have had more to do with creating that standard of life than with keeping it down.(1)

The reason for the dominance of the large-scale enterprise lies, in Schumpeter's view, in the superior ability of large firms to generate technological and organizational innovation.

It is not surprising that business historians should be sympathetic to Schumpeter's argument. Indeed, Alfred D. Chandler, Jr., the dean of present-day business historians, paints a similar picture of the large firm as an engine of progress. In Chandler's story, however, the large enterprise comes across less as a generator of innovation than as an "institutional response" to innovation and growth whose superiority lies in its ability to create massive internal economies of high-volume production. "The visible hand of management," Chandler writes, "replaced the invisible hand of market forces where and when new technology and expanded markets permitted a historically unprecedented high volume and speed of materials through the process of production and distribution." More recently, Chandler has extended his analysis to British and German industry, concluding that it was the "large enterprises that were most responsible for the economic growth of the world's three largest industrial nations [and that] have provided a fundamental dynamic or force for change in capitalist economies since the 1880s." And William Lazonick has proposed a theory of the entrepreneurial firm that connects Chandler and Schumpeter. In this view, the large collective enterprise supplants a decentralized market system in an act of innovation; but the innovation necessarily consists in the creation of capabilities within the organization that yield the potential for massive economies of large-scale production and distribution.(2)

Schumpeter, Chandler, and Lazonick are all arguably working within an approach that is coming to be called the dynamic capabilities theory of business organization.(3) Unlike the neoclassical theory of industrial organization, this approach is concerned not with the efficient allocation of known resources but with the ways in which social institutions and organizational forms generate (and sometimes fail to generate) economic growth. The centerpiece of the analysis is the concept of economic capabilities--embodied in human and organizational knowledge that enables business institutions to produce goods and services. The issues are comparative ones. Which forms of business organization are most effective in creating economic capabilities and, withal, economic growth? Chandler's historical cases focus on the creation of internal capabilities.(4) Building such organizational capabilities required an investment in the capital equipment necessary for high-volume production. It meant investing in regional, national, or international networks of marketing and distribution. And it also meant turning over the reins of management to a hierarchy of salaried professionals. This, essentially, is the model that Lazonick projects into the future. He believes that to prosper nations must take advantage of substantial economies of scale in major industries, and that this requires a high degree of centralized coordination to overcome market deficiencies.(5)

Yet there is another important tradition in economics that sees the sources of economic growth in a slightly different light. Though never denying the importance to economic progress of internal economies, Alfred Marshall and his followers also highlighted the systemic interactions among a large number of competing and cooperating firms. For Marshall, such interaction could yield "external economies" that play an important role in economic progress quite in addition to that played by the economies internal to particular business organizations.(6) As Lazonick suggests, economic progress requires the development of economic capabilities. But all such capabilities need not reside within the boundaries of the organization, however generously defined. Some important economic capabilities--including perhaps the capability of generating certain kinds of technological and organizational innovation--can reside within a network of interacting firms whose primary--if by no means exclusive--nexus of coordination is the price system.

I do not wish to argue Marshall against Chandler. Indeed, to see any one set of business institutions as universally superior under all circumstances is a peculiarly ahistorical--not to say historically false--view.(7) It is clear, as Marshall certainly understood, that some types of innovation take place more readily within the organizational structure of a firm. I myself have argued elsewhere for "dynamic" transaction-cost explanations of vertical integration, in which the difficulties of coordinating some types of innovative activity across market boundaries can make internal organization a cheaper alternative. And Paul Robertson and I have explored one important case in which innovation--and rapid declines in product price--took place within the framework of internal economies and large-scale production: the moving assembly line and the Ford Model T.(8) But we also found episodes in the history of the automobile industry in which the existence of a variety of competing firms spurred innovation and even forced some vertical disintegration on the large firms. Moreover, a number of other cases come to mind in which rapid progress--rapid declines in product price and improvements in product quality--took place within a highly disintegrated structure.

One of the most striking examples of this phenomenon is the microcomputer industry.(9) By looking in some detail at the history of this industry, I hope to be able to shed some light on the circumstances under which economic growth proceeds through the generation of external rather than internal capabilities.(10) For it is certainly clear that this industry did not--and does not--fit the Chandlerian model.

The Microcomputer: A History

Antecedents and Sources * The microcomputer is a product that came out of nowhere, at least in the sense that established firms initially misunderstood its uses and underappreciated its importance. But no product springs Athena-like, full-blown from the head of Zeus. The microcomputer, in hindsight, is the child of two technological traditions: the mainframe and minicomputer industries and the integrated circuit industry.

Although the history of computers dates back at least to the mechanical tinkerings of Charles Babbage in the nineteenth century, the electronic digital computer was the product of the Second World War.(11) In November 1945, J. Presper Eckert and John W. Mauchly of the Moore School at the University of Pennsylvania produced the ENIAC (Electronic Numerical Integrator and Computer), the first all-electronic digital computer, under contract with the U.S. Army. The machine took up 1,800 square feet, boasted 18,000 tubes, and consumed 174 kilowatts. Collaboration with the mathematician John von Neumann led a few years later to the idea of a stored-program--that is, a programmable rather than a special-purpose--computer, an approach called the von Neumann architecture and used almost universally today in computers of all sizes. By 1951, Eckert and Mauchly had joined Remington Rand, where they produced the UNIVAC (Universal Automatic Computer), the first commercial computer using von Neumann architecture. By 1956, the lead in computer sales had passed from Remington Rand to the International Business Machines Corporation (IBM). Unlike its erstwhile competitors, which included electronics firms like General Electric (GE) and Radio Corporation of America (RCA), IBM had strengths in the production of mechanical office equipment. These capabilities proved useful in the manufacture of all-important computer peripherals like printers, tape drives, and magnetic drums.(12) IBM cemented its dominance with a bold move in the 1960s. Betting on a high non-military demand for computers, and pushing its advantage in production costs, IBM introduced the 360 system. This was a family of computers and peripherals from which buyers could tailor a configuration suited to their needs. All pieces of the system, including software, were internally compatible but were proprietary with respect to the systems of other manufacturers. The idea of a proprietary system became a hallmark of the industry, and IBM's attempts to prevent third parties from selling so-called plug-compatible peripherals led to a famous antitrust case.(13)

By the 1960s, computers had become fully solid state--that is, they used transistors rather than tubes. Nonetheless, a computer with significant power remained physically imposing. More important, computers were imposing in their ways. They required large trained staffs of operators and programmers, and access to the machines was typically guarded closely. As late as the early 1970s, most users communicated with their mainframe via punch cards laboriously typed out and fed in. But the technology for smaller, easier-to-use computers was at hand, and a new industry seized on it. In December 1959, a two-year-old company called Digital Equipment Corporation (DEC) unveiled the prototype of the PDP (Programmed Data Processor)-1.(14) A commercial extension of the early interactive solid-state computers on which DEC's founders, Ken Olsen and Harlan Anderson, had worked at MIT, the machine sold for $120,000, contained 4K bytes of memory, was the size of a refrigerator, and included a cathode ray tube (that is, a television-like video display) built into the console. This was the first commercial minicomputer. In 1964, DEC introduced the PDP-6, the first commercial product designed to support a network of interactive users on time-sharing terminals. Like IBM, DEC built its strategy around a proprietary family of machines--the PDP and later the VAX (Virtual Address Extension) lines--with allied peripherals and software. Also like IBM, DEC became highly integrated vertically and not only assembled equipment, but also manufactured many of its own inputs, from semiconductors to equipment cases, and handled its own sales. Among other firms that entered the minicomputer market were Scientific Data Systems, Data General (founded in 1968 by defectors from DEC), Prime Computer, Hewlett-Packard (HP), Wang, and Tandem.(15)

For reasons explored later in this article, however, the minicomputer did not lead directly to the microcomputer. Nevertheless, technological advance did of course pave the way for that development. In particular, the invention of the integrated circuit created a trajectory of miniaturization, culminating in high-density memory chips and microprocessors--the heart of the microcomputer.

In 1958, a decade after three scientists at Bell Labs developed the transistor, Fairchild Semiconductor developed the planar process, a way of making transistors cheaply.(16) Initially, this process was used to make single, or discrete, semiconductors. But within a few months, Jack Kilby at Texas Instruments (TI) and Robert Noyce at Fairchild had taken the next logical step, etching several transistors--an entire circuit--into silicon.(17) The early integrated circuits set the paradigm for the development of the semiconductor industry, and technological change has taken the form primarily of process improvements leading to increased miniaturization and lower production costs.

There are two types of integrated circuit crucial for the microcomputer. One is the random-access memory (RAM) chip, which allows the computer temporarily to remember programs and other information. With each new stage in miniaturization, the price of memory (in dollars per K) has declined steadily, a result not only of the miniaturization itself but also of learning-by-doing economies in production.(18) The microcomputer industry has thus certainly benefited from one important external economy that is internal to a related industry.
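
Learning-by-doing of this kind is conventionally modeled as an experience curve, in which unit cost falls by a constant fraction with every doubling of cumulative output. The sketch below is purely illustrative: the 30 percent learning rate and the $100 first-unit cost are assumed numbers for exposition, not historical figures for memory chips.

```python
import math

def unit_cost(cumulative_units, first_unit_cost=100.0, learning_rate=0.30):
    """Classic experience curve: each doubling of cumulative output
    cuts unit cost by learning_rate (here an assumed 30 percent)."""
    b = math.log(1.0 - learning_rate, 2)  # progress exponent (negative)
    return first_unit_cost * cumulative_units ** b

for n in (1, 2, 4, 8, 16):
    print(f"cumulative unit {n:2d}: ${unit_cost(n):6.2f}")
# cumulative unit  1: $100.00
# cumulative unit  2: $ 70.00
# cumulative unit  4: $ 49.00
# cumulative unit  8: $ 34.30
# cumulative unit 16: $ 24.01
```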

The other type of integrated circuit is, of course, the microprocessor. In 1969, a Japanese calculator manufacturer asked Intel, a Fairchild spin-off firm, to design the chips for a new electronic calculator.(19) Marcian E. Hoff, Jr., the engineer in charge of the project, thought that the Japanese design was too complicated to produce. Influenced by the von Neumann architecture of minicomputers, he reasoned that he could simplify the design enormously by creating a programmable chip rather than the single-purpose device that the Japanese had sought. The result was the Intel 4004, the first microprocessor. One-sixth of an inch long and one-eighth of an inch wide, the 4004 was roughly equivalent in computational power to the ENIAC. It also matched the power of a 1960s IBM computer whose central processing unit (CPU) was about the size of a desk.(20) The 4004 processed information in 4-bit words, that is, four bits at a time. In 1972, Intel introduced the 8008, the first 8-bit microprocessor. This design was later improved and simplified to create the Intel 8080 in 1974. Capable of addressing 64K bytes of memory, the 8080 became the standard 8-bit microprocessor and was widely produced by second sources. It was also the device at the center of the earliest commercial microcomputers. Other important early microprocessors included the Zilog Z80, an improved but compatible version of the 8080 built by an Intel spin-off, and the Motorola 6800, an 8-bit device of a different--and many argue superior--design.
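
The memory figures quoted here follow directly from address-bus width: a chip that emits an n-bit address can distinguish 2^n byte locations, which is why the 8080, with its 16-bit address bus, tops out at 64K bytes. A quick arithmetic check:

```python
# Addressable memory is fixed by the width of the address bus:
# an n-bit address selects one of 2**n byte locations.
for chip, address_bits in [("Intel 8080", 16), ("Zilog Z80", 16)]:
    locations = 2 ** address_bits
    print(f"{chip}: 2**{address_bits} = {locations:,} bytes = {locations // 1024}K")
# Intel 8080: 2**16 = 65,536 bytes = 64K
# Zilog Z80: 2**16 = 65,536 bytes = 64K
```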

The Hobbyists, 1975-1976 * It is conventional to date the beginning of the microcomputer at January 1975, when that month's issue of Popular Electronics carried a cover story on the MITS/Altair computer.(21) MITS (Micro Instrumentation Telemetry Systems), run out of an Albuquerque, New Mexico, storefront by one Ed Roberts, began life making remote-control devices for model airplanes and then entered the electronic calculator business in time to be engulfed by the price wars of the early 1970s. An electronics tinkerer familiar with the hobbyist market, Roberts decided to build a kit computer as a way to save his beleaguered enterprise. He persuaded his bank to loan him an additional $65,000 on the strength of the promised Popular Electronics cover, and he negotiated a volume deal for Intel 8080 microprocessors--$75 apiece instead of the usual $360. The machine he and his coworkers put together was little more than a box with a microprocessor in it. Its only input/output devices were lights and toggle switches on the front panel, and its memory was a minuscule 256 bytes (not kilobytes). But the Altair was, at least potentially, a fully capable computer. Like a minicomputer, it possessed a number of "slots" that allowed for expansion--for additional memory, various kinds of input/output devices, and so forth. These slots hooked into the microprocessor by a system of wires called a "bus"; the Altair bus, which came to be known as the S-100 bus because of its one-hundred-line structure, was the early industry standard of compatibility.(22)

Roberts sold the Altair for $397 stripped down and in kit form. Even though it could do almost nothing without add-ons--none of which was yet available--the machine sold beyond all expectation. MITS was besieged with orders after the Popular Electronics article appeared and found itself unable to ship in any volume until the summer of 1975. The company concentrated on getting the base model out the door, postponing development of add-ons.(23) MITS managed to ship about two thousand machines that year.(24) Although the Altair's impoverished capabilities did not deter buyers, they did give rise to two phenomena: third-party suppliers of add-ons and "user groups," organizations of hobbyists who shared information and software. The most famous group, the Homebrew Computer Club in northern California, actually began meeting before the Altair appeared. The third-party suppliers were also typically enterprising hobbyists: Processor Technology set up shop in a garage in Oakland, and Cromemco took its name from its founders' tenure in Crothers Memorial Hall, the graduate engineering dormitory at Stanford. The products that these firms supplied--such as memory boards--filled the gap left by MITS's tardy and low-quality add-ons.

In a sense, then, the Altair was quickly captured by the hobbyist community, and it became a modular technological system rather than a self-contained product. To accomplish anything, one needed not just the box itself, but also the know-how, add-on boards, and software provided by a large network of external sources. The network character of the microcomputer was fostered by Roberts's design decisions, themselves a reflection of hobbyist attitudes toward information-sharing. More important, however, the capabilities of MITS were tiny compared to those of the larger community; those larger capabilities were necessary to take full advantage of a product with such high demand and so many diverse and unforeseen uses.

The inability of MITS to meet demand led to the emergence not only of complementary activities but also of competitive ones. Within a few months of the introduction of the Altair, the microcomputer industry had its first clone, the IMSAI 8080.(25) An automobile dealer named Phillip Reed approached entrepreneur Bill Millard with the idea of computerizing automobile dealerships. Millard looked into using a minicomputer, but that proved too expensive. Alerted to the MITS/Altair, he approached Ed Roberts, who was unable to fill existing orders, let alone envisage volume sales (and a volume discount) to Millard. So, with the help of engineers Joseph Killian and Bruce Van Natta, Millard set about building an 8080 machine of his own.(26) Soon the computer eclipsed the automobile project that inspired it, and Reed was enlisted to help fund IMSAI Manufacturing. The company sold kits for what was essentially an improved version of the Altair and by 1976 began shipping assembled machines. IMSAI quickly outpaced MITS and became for a time the world's leading microcomputer manufacturer, selling 13,000 machines between 1975 and 1978.(27)

Unlike Roberts and most other figures in the early industry, Millard was primarily interested in business, not technology. He went where the winds of profit took him. To help market IMSAI computers, he and others got involved in the franchising of computer stores. This business--ComputerLand--soon came to dwarf the manufacturing operation and made Millard a millionaire many times over. IMSAI fared less well. Because of Millard's early unwillingness to take on financial partners, IMSAI collapsed in a cash-flow crisis, was bought out of bankruptcy by a couple of employees, and faded into oblivion. MITS did no better. Acquired by Pertec, a maker of peripherals for larger computers, the company withered and disappeared.(28)

Nonetheless, the early success of MITS and IMSAI cemented the popularity of the S-100 standard, especially among hobbyists, who were still the primary buying group. Indeed, proponents of the S-100 and the 8080--including Lee Felsenstein, ad hoc leader of the Homebrew Computer Club--felt that their standard had reached "critical mass" and that competing chips and buses were doomed.(29) Part of the reason was the availability of software. Before the Altair even appeared, Gary Kildall, founder of Digital Research, had written CP/M (Control Program for Microcomputers), an operating system for 8080/Z80 microcomputers.(30) An operating system is a master-of-ceremonies program that is especially important for controlling the computer's disk drives. The earliest machines typically used paper-tape readers and ordinary cassette recorders to retrieve and store programs. In 1972, IBM invented the floppy disk drive, and by 1973 Shugart was offering a relatively inexpensive model for 8-inch disks. Using a larger computer to simulate the 8080, Kildall wrote CP/M as a way to control such drives. He began selling it through advertisements in the Homebrew Club's newsletter and eventually licensed it to IMSAI for inclusion with all their machines.(31)
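
Kildall's technique of simulating the 8080 on a larger machine is ordinary cross-development: the host computer runs a fetch-decode-execute loop that interprets the target chip's instructions. A minimal sketch of the idea, using an invented two-instruction machine rather than the real 8080 opcode set:

```python
# Toy instruction-set simulator illustrating cross-development: a host
# machine interprets a target machine's program one opcode at a time.
# The two opcodes below are invented for illustration; they are not
# real 8080 instructions.

def run(program):
    acc, pc = 0, 0                # accumulator and program counter
    while pc < len(program):
        op, arg = program[pc]
        if op == "ADD":           # add an immediate value to the accumulator
            acc += arg
        elif op == "OUT":         # print the accumulator
            print(acc)
        pc += 1
    return acc

run([("ADD", 2), ("ADD", 3), ("OUT", None)])  # prints 5
```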

In short order, CP/M became the dominant operating system for microcomputers. Also using a simulation rather than the real thing, William Gates and Paul Allen, founders of Microsoft, wrote a version of the programming language BASIC (Beginner's All-Purpose Symbolic Instruction Code) for the Altair.(32) When Roberts tried to tie the sale of MBASIC--as the Gates and Allen version came to be called--to the purchase of inferior MITS memory boards, software pirates raised the Jolly Roger for the first time by copying one another's paper tapes.(33) This prompted a now-famous angry letter from Gates in the Homebrew newsletter. But Gates was bucking the tide. Free software, like other kinds of information-sharing, was part of the hobbyist ethic.(34) And software for CP/M machines proliferated.

The Industry Begins, 1977 * The predicted dominance of CP/M and the S-100 never materialized, however. In 1977, a little more than two years after the Altair's debut, three important new machines entered the market, each with its own incompatible operating system, and two of them built around a different microprocessor. The almost simultaneous introduction of the Apple II, the Commodore PET, and the Tandy TRS-80 Model I began a new regime of technological competition, moving the industry away from the hobbyists into an enormously larger and more diverse market.

In early 1976, Stephen Wozniak was working as an engineer for Hewlett-Packard, and Steven Jobs was doing contract work for Atari.(35) The two were college dropouts and electronics tinkerers whose previous major collaboration had been the fabrication and sale of "blue boxes" for making long-distance calls without charge (and illegally). Of the two, Wozniak was the gifted engineer. Like most members of the Homebrew Computer Club, he wanted a computer of his own, so he set about designing what became the Apple I. Because the Intel 8080 and its variants were too expensive, Wozniak turned to the 6502, a clone of the Motorola 6800, which he could get for $25 rather than about $175 for a 6800 or an 8080. (The 6502 was designed by Chuck Peddle, who had helped design the 6800 at Motorola, and was produced by Peddle's company, MOS Technology.) Wozniak wrote a version of BASIC for the 6502, then designed a computer. Instead of lights and toggles on the front panel, the machine had a keyboard and loaded from information stored on chips. It had 4K bytes of memory and could drive a black-and-white television. None of these capabilities was significant enough to draw much interest from fellow Homebrew members.(36) But friends asked for schematics, and Jobs became convinced that he and Wozniak could make money selling the device. They scrounged together $1,300 and set about assembling circuit boards in--yes--the garage at Jobs's parents' house.

Seeing a commercial future for the microcomputer, the pair went to their employers--Atari and HP--with the idea. Both were rebuffed. "HP doesn't want to be in that kind of a market," Wozniak was told.(37) So Apple Computer formed as a partnership on 1 April 1976. As Wozniak worked to refine the design, Jobs looked beyond sales to hobbyist friends. He persuaded Paul Terrell, owner of the Byte Shop--perhaps the first computer store and the progenitor of a chain--to order fifty Apples. Soon the pair acquired funding and a new partner in Mike Markkula, a former Intel executive. Apple Computer Corporation supplanted the partnership in early 1977. Meanwhile, Jobs enlisted the Regis McKenna advertising agency to represent Apple for a share of the sales revenue.

The Apple II made its debut at the First West Coast Computer Faire in spring 1977. The machine came in a plastic case with a built-in keyboard, could be expanded from 4K to 48K of memory, drove a color monitor, connected to a cassette recorder, and had a version of BASIC stored in a chip. The Apple also had eight expansion slots, the result of the hobbyist Wozniak's winning an argument with Jobs, who tended to see the computer as a narrowly focused product rather than as an open-ended system.(38) Although the Apple II was not necessarily the hit of the Faire, Apple kept a high profile and a professional appearance quite distinct from the hobbyist firms displaying their wares. Almost immediately, sales began to take off. The company took in $750,000 in revenues by the end of fiscal 1977; almost $8 million in 1978; $48 million in 1979; $117 million in 1980 (when the firm went public); $335 million in 1981; $583 million in 1982; and $983 million in 1983.(39) The lion's share of these revenues, especially in the early years, reflects sales of the Apple II.

What accounts for the Apple II's phenomenal success? Industry guru Adam Osborne believes that the machine was in fact "technologically inferior." People bought it, he wrote, "because they were not inconvenienced by its limitations. Technology had nothing to do with Apple Computer Corporation's success; nor was the company an aggressive price leader. Rather, this company was the first to offer real customer support and to behave like a genuine business back in 1976 when other manufacturers were amateur shoe-string operations."(40) Certainly Jobs's drive to create a successful company--not a technologically successful computer--had much to do with the firm's success. Moreover, Apple received good business advice from the venture capitalists who also helped bankroll the company.

In one area, however, technological superiority did help Apple out-distance its competitors. In 1977, tape cassette decks were still the standard for data storage. Floppy drives were available, but they required expensive controller circuits. In what all regard as his most brilliant piece of engineering, Wozniak designed a wholly novel approach to encoding data on a disk and a vastly simplified controller circuit. The design not only won him belated kudos from the Homebrew Club but, more important, it helped Apple beat Commodore and Tandy to market. "It absolutely changed the market," said Chuck Peddle, designer of the rival Commodore PET.(41) Another event that changed the industry--and helped Apple--was software. Daniel Bricklin, a Harvard MBA and former DEC employee, wanted to buy a microcomputer-like DEC intelligent terminal in order to develop a programming idea: the spreadsheet. Rebuffed by salespeople interested only in volume sales to businesses, he acquired an Apple II instead and created VisiCalc.(42) For a full year, the program was available only in an Apple version, allowing the company to make early inroads into the business market.

In the end, what made the Apple II so successful was its compromise between technology and marketing. Under Jobs's influence, the machine was compact, attractive, and professional in appearance. Under Wozniak's influence, it was elegantly designed, easy both to use and to manufacture. Compared with earlier hobbyist machines like the Altair or the IMSAI, the Apple II was an integrated and understandable product. Yet, thanks to Wozniak's slots, it was also still a system, able to draw on the large crop of external suppliers of software and add-ons that quickly sprang up. Indeed, Apple relied heavily on external suppliers for almost everything. Like the IBM PC a few years later, the Apple II was almost completely "outsourced." Apple president Mike Scott, who was in charge of production, did not believe in automated manufacturing and expensive test equipment. "Our business was designing, educating, and marketing. I thought that Apple should do the least amount of work that it could and that it should let everyone else grow faster. Let the subcontractors have the problems."(43) The company handled board-stuffing (attaching the chips to the circuit boards) on a putting-out system before turning to a contract board-stuffing firm in San Jose; Scott even used a contractor for the firm's payroll.(44)

The Commodore PET, also introduced at the 1977 West Coast Computer Faire, shared with Apple the 6502 microprocessor. This is not entirely surprising, as the designer of the PET was Chuck Peddle, who was also the designer of the 6502. Commodore, a maker of calculators, had acquired Peddle's company, MOS Technology. Jack Tramiel, Commodore's aggressive founder, believed strongly in internal sourcing, and he wanted his own chip-making capability.(45) In a corridor one day in 1976, Peddle suggested to Tramiel that Commodore get into the computer business. Tramiel agreed, and Peddle readied a design based on the KIM-1, a 6502 machine he had designed to train microprocessor engineers. The result was the Commodore PET, which had much the same capabilities as the Apple, including a keyboard built into the case. Commodore never rivaled Apple, however, in part because the company was slow in shipping reliable product. Tramiel's insistence that Commodore use only internal MOS Technology chips caused delays and put him at odds with Peddle, who finally prevailed. More important, Tramiel chose to aim at the low or "home computer" end of the market. Commodore was, however, the most successful protagonist in that market, besting the likes of Texas Instruments, Atari, and Timex.(46)

The third important entrant in 1977 was Tandy Corporation. In 1963 Charles Tandy took the retail chain started by his father in a new direction: he purchased a small Boston-based chain of electronics stores called Radio Shack. By the early 1970s, Radio Shack had come to dominate the market for retail electronics. Although the company had begun some production of its own in 1966, Tandy was initially wary of making microcomputers.(47) A Radio Shack buyer named Don French became enamored of the Altair and began developing the idea of a computer for Radio Shack. In December 1976, he received a tentative go-ahead, with the injunction to "do it as cheaply as possible." With the help of Steve Leininger, an engineer who had worked for National Semiconductor, French designed the TRS-80 Model I, which appeared in August 1977. The machine used a Z80 microprocessor, but it ran a proprietary operating system rather than CP/M. It was also slow, lacked lower-case letters, and was genuinely cut-rate in other ways. Nonetheless, Radio Shack stores sold 10,000 of the $399 machines in little more than a month.(48) In 1979, Tandy introduced the TRS-80 Model II, which overcame many of the limitations of the original; the TRS-80 Model III followed in 1980.

Growth and Technological Competition, 1978-1981 * For the next few years, Apple, Radio Shack, and Commodore were the top three makers of microcomputers. CP/M was not dead, however; smaller companies like Processor Technology and North Star continued to sell S-100 computers, and add-on boards soon became available to allow users of Apples, PETs, and TRS-80s to run CP/M software on their machines. This period was thus one of strong technological competition, with four major incompatible operating systems vying for position in the market.

The major arena of competition was software. VisiCalc, the first spreadsheet program, gave Apple a boost when it appeared in 1979 in an Apple version. The word processor was another important innovation, giving the microcomputer powers nearly equal to those of expensive stand-alone word processors. By the end of 1976, Michael Shrayer had created Electric Pencil, one of the very earliest word processors. It ran first on S-100 machines, but versions for the other major machines appeared quickly. In 1979, a group of former IMSAI employees at a company called MicroPro introduced WordStar. Unlike Electric Pencil, WordStar was a WYSIWYG (what you see is what you get) word processor. It quickly became a bestseller and the industry standard.(49) In January 1981, Ashton-Tate introduced dBase II, a database management program that National Aeronautics and Space Administration engineer Wayne Ratliff had written in his spare time. It also soon became the industry leader.(50)

By mid-1981, then, the uses of the microcomputer were becoming clearer than they had been only a few years earlier, even if the full extent of the product space lay largely unmapped. A microcomputer was a system comprising a number of more-or-less standard elements: a microprocessor unit with 64K bytes of RAM; a keyboard, usually built into the system unit; one or two disk drives; a monitor; and a printer. The machine ran operating-system software and applications programs like word processors, spreadsheets, and database managers. The market was no longer primarily hobbyists but increasingly comprised businesses and professionals. Total sales were growing rapidly. CP/M, once the presumptive standard, was embattled, but no one operating system reigned supreme.

The Paradigm Emerges, 1981 * The emerging outline of a paradigmatic microcomputer gave Adam Osborne an idea: no-frills computing.(51) Rather than pushing the technological frontier, he would create a package that was technologically adequate but also inexpensive. In this way, he could seek out, as it were, a local maximum in the product space. "The philosophy was that if 90 percent of users' needs were adequately covered, the remaining ten percent of the market could be sacrificed." Osborne wanted a machine integrated into one package that users could simply plug into the wall, "as they might a toaster."(52) Moreover, he wanted the machine to be portable and small enough to fit under an airline seat. And he wanted to ship a package of basic software "bundled" with the computer so that novice users would have most of what they needed immediately available.

Osborne set about attracting venture capital, and he engaged Lee Felsenstein, the organizer of the Homebrew Club and a consultant for the recently defunct Processor Technology, to design the hardware. He acquired an unlimited license for CP/M from Digital Research for $50,000, swapped stock with Microsoft for MBASIC, negotiated a deal with MicroPro for WordStar, and arranged for the development of a VisiCalc clone called SuperCalc. When the Osborne I appeared at the West Coast Computer Faire in March 1981, it had a five-inch screen, two low-density disk drives, a Z80 processor with 64K of memory, and bundled software. It sold for $1,795.

The Osborne I was the hit of the show. The company went from $6 million in sales in its first year to $70 million in its second and to $93 million in 1983. Osborne sold 10,000 units a month at the peak, and some 100,000 people owned one of its machines by 1984. This fantastic growth was matched, however, by an equally dramatic collapse and bankruptcy in 1984, the result of poor management decisions. A contributing factor was the quick rise of competitors. Morrow and Cromemco introduced Z80 machines for about the same price. And Non-Linear Systems, a southern California maker of sensing devices, introduced the Kaycomp II (later the Kaypro II) at the 1982 West Coast Faire. Conceived independently by Andrew Kay, Non-Linear's founder, the Kaypro, like the Osborne, was a Z80 portable with bundled software. Kay also shared Osborne's philosophy. "We don't sell half a computer and call it a computer and then ask a person to come back and buy the rest of it later," he is quoted as saying. "It's like selling an automobile without wheels or seats and saying, 'Those are options.' IBM, Apple, and Tandy play that kind of game. But we don't." Unlike the Osborne, however, the Kaypro had a sturdy metal case and a nine-inch screen with a full eighty-column display. Kaypro Corporation--as the company was rechristened--stepped in when Osborne stumbled and became the fourth largest seller of intermediate-price computers in 1984, reaching sales of $150-175 million that year.(53)

But the signal event of 1981 was not the advent of the cheap bundled portable. On 12 August 1981, IBM introduced the computer that would become the paradigm for most of the 1980s. Like the Osborne and the Kaypro, it was not technologically sophisticated, and it incorporated most of the basic features users expected. But, unlike the bundled portables, the IBM PC was a system, not an appliance: it was an incomplete package, an open box ready for expansion, reconfiguration, and continual upgrading.

In July 1980, William Lowe met with IBM's Corporate Management Committee (CMC). John Opel, soon to become IBM's president, had charged Lowe with getting IBM into the market for desktop computers. Lowe's conclusion was a challenge to IBM's top management. "The only way we can get into the personal computer business," he told the CMC, "is to go out and buy part of a computer company, or buy both the CPU and software from people like Apple or Atari--because we can't do this within the culture of IBM."(54) The CMC knew that Lowe was right, but they were unwilling to put the IBM name on someone else's computer. So they gave Lowe an unprecedented mandate: go out and build an IBM personal computer with complete autonomy and no interference from the IBM bureaucracy. Lowe hand-picked a dozen engineers, and within a month they had produced a prototype. The committee approved and gave Lowe a deadline of one year to market.

The timing was critical. IBM sensed that Apple and its competitors were vulnerable: they were failing to capitalize on the developing business market for personal computers. But to get a machine to market quickly meant bypassing IBM's cumbersome system of bureaucratic checks and its heavy dependence on internal sourcing. Philip Donald Estridge, who succeeded Lowe as director of the project, put it this way: "We were allowed to develop like a startup company. IBM acted as a venture capitalist. It gave us management guidance, money, and allowed us to operate on our own."(55)

Estridge knew that, to meet the deadline, he would have to design a machine that was not at the cutting edge of technology. Moreover, IBM would have to make heavy use of outside vendors for parts and software. The owner of an Apple II, Estridge was also impressed by the importance of expandability and an open architecture.(56) He insisted that his designers use a modular bus system that would allow expandability, and he resisted all suggestions that the IBM team design any of its own add-ons.

The only decision implying anything like a technical advance in the IBM PC (personal computer) was the choice of the Intel 8088 microprocessor. Although touted as a 16-bit chip--and thus an advance over the 8-bit 8080, Z80, and 6502--the 8088 processed data internally in 16-bit words but used 8-bit external buses.(57) The IBM team decided against the 8086, a full 16-bit chip, because they feared its power would raise the hackles of turf-protectors elsewhere in the company.(58) Moreover, the 8088 was perhaps the only 16-bit microprocessor for which there already existed a full complement of support chips.(59)
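
The cost of the 8088's hybrid design is easy to quantify: with an 8-bit external data bus, every 16-bit word moved to or from memory takes two bus transfers where the full 16-bit 8086 needs one. The sketch below counts only raw bus transfers and ignores prefetch queues and wait states, so treat it as an illustration of the word-size mismatch rather than a timing model.

```python
def bus_transfers(total_bytes, data_bus_bits):
    """Bus transfers needed to move total_bytes over a data bus of the
    given width (ignores prefetch queues and wait states)."""
    bytes_per_transfer = data_bus_bits // 8
    return -(-total_bytes // bytes_per_transfer)  # ceiling division

payload = 2 * 1000  # 1,000 16-bit words = 2,000 bytes
print("8086 (16-bit external bus):", bus_transfers(payload, 16), "transfers")  # 1000
print("8088 (8-bit external bus):", bus_transfers(payload, 8), "transfers")    # 2000
```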

Choosing the 8088 microprocessor meant that the IBM PC could not use existing operating systems designed for 8-bit chips. Here again, IBM chose not to write its own proprietary system but to go to the market for one. Estridge's group approached Digital Research, where Gary Kildall was working on a 16-bit version of CP/M. But IBM and Digital were unable to come to terms, probably because of Digital's initial unwillingness to sign the nondisclosure agreements on which IBM insisted.(60) So IBM turned instead to Microsoft, which it was already soliciting to supply a new version of MBASIC. Gates jumped at the chance. He bought an operating system for the 8088 created by a local Seattle software house, put the finishing touches on it, and sold it to IBM as MS-DOS (disk operating system).(61) IBM called its version PC-DOS, but it allowed Microsoft to license MS-DOS to other computer makers. Although this allowed competitors to enter, it also helped IBM to force its operating system on the industry as a standard.

Another radical departure from IBM tradition was the marketing of the PC. Shunning IBM's staff of commission sales agents, the PC group turned to retail outlets to handle the new machine: Sears Business Centers and ComputerLand. Here again, the project philosophy was to do things in keeping with the way they were done in the microcomputer industry--not with the way they were done at IBM. The PC group even solicited input from ComputerLand dealers, flying a few to group headquarters in Boca Raton, Florida, for top-secret consultations.(62)

Perhaps the most striking way in which IBM relied on external capabilities, however, was in the actual fabrication of the PC. All parts were put up for competitive bids from outside suppliers. When internal IBM divisions complained, Estridge told them to their astonishment that they could submit bids like anyone else. With a little prodding, some IBM divisions did win contracts. The Charlotte, North Carolina, plant won a contract for board assembly, and the Lexington, Kentucky, plant made the keyboard. But an IBM plant in Colorado could not make quality disk drives, so Estridge turned to Tandon as the principal supplier. Zenith made the PC's power supply, SCI Systems stuffed the circuit boards, and Epson made the printer.(63) The machine was assembled from these components on an automated line at Boca Raton that by 1983 could churn out a PC every 45 seconds.(64)

The IBM PC was an instant success, exceeding sales forecasts by some 500 percent. The company shipped a mere 13,533 machines in the last four months of 1981, a number far short of demand. Order backlogs became intolerable.(65) By 1983, the PC had captured 26 percent of the market, and an estimated 750,000 machines were installed by the end of that year.

The First Era of the Clones, 1982-1987 * The IBM PC called forth a legion of software developers and producers of add-on peripherals. Its early phenomenal success also generated competitors producing compatible machines. The first era of the clones falls into two distinct periods. The early manufacturers of clones fed on the excess demand for PCs, and, with one brilliant exception, these manufacturers disappeared when IBM began catching up with demand and lowered prices in 1983 and 1984. A second wave of clones began a few years later, when IBM abandoned 8088 technology in favor of the PC AT, which was built around the faster Intel 80286 chip.

IBM did have one trick up its sleeve to try to ward off cloners, but it turned out not to be a very powerful trick. The PC-DOS that Microsoft designed for the IBM PC differed slightly in its memory architecture from the generic MS-DOS that IBM allowed Microsoft to license to others.(66) IBM chose to write some of the BIOS (basic input/output system, a part of DOS) into a chip and to leave some of it in software. They then published the design of the chip in a technical report, which, under the copyright laws, placed part of the PC-DOS BIOS under copyright. To make matters more difficult for cloners, many software developers, especially those using graphics, chose to bypass DOS completely and to access the PC's hardware directly. IBM sued Corona, Eagle, and a Taiwanese firm for infringing the BIOS copyright in their earliest models. These companies and all later cloners responded, however, with an end run. They contracted with software houses like Phoenix to create a software emulation (or sometimes a combination of hardware and software emulation) that does what the IBM BIOS does, but in a different way. The emulation is also able to intercept the hardware calls and to process them through the BIOS. This removed the last proprietary hurdle to copying the original PC.
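
The cloners' end run amounts to reimplementing a published interface: accept the same calls and produce the same observable results, but write the internals independently. The Python sketch below is purely schematic (a real BIOS is machine code reached through interrupts, and every name here is hypothetical); it shows only the structural idea of interface-compatible reimplementation.

```python
class CompatibleBIOS:
    """Implements the documented entry points of a reference BIOS
    without copying its code: same inputs, same visible behavior.
    All names here are hypothetical, for illustration only."""

    def write_char(self, char, column, row):
        # Documented behavior: display char at (column, row).
        # The internal route to that result is written independently
        # and need not resemble the original vendor's code at all.
        self._poke_video_memory(row * 80 + column, char)

    def _poke_video_memory(self, offset, char):
        ...  # independent implementation detail

# Software that calls only the documented interface, write_char(),
# behaves identically on the original BIOS and on the clean-room clone.
```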

Among the early cloners were Compaq, Corona, and Eagle. Compaq, which by the early 1990s had grown to become the third largest American seller of microcomputers, started out by combining the idea of Adam Osborne and Andrew Kay--transportability--with strict IBM compatibility and good quality control.(67) Started by three former Texas Instruments engineers, the company achieved $100 million in sales in its first year, a level that Apple had taken four years to reach. Corona and Eagle fared less well, and they failed to survive the relative downturn in the computer market after 1984.(68)

Although IBM had become the dominant force in the personal computer industry, its dominance was by no means complete. In fact, by 1986 more than half of the IBM-compatible computers sold did not have IBM logos on them.(69) By 1988, IBM's worldwide market share of IBM-compatible computers was only 24.5 percent.(70) The clones had returned. Part of the reason was price. IBM never practiced learning-curve pricing, preferring instead to take some of the premium that its name could command. It thus sacrificed potential market share for revenues. Some have also argued that IBM abandoned the 8088 technology prematurely.(71) But others contend that even IBM's moves upscale from the original PC--the PC XT in 1983 and the 80286 PC AT in 1984--represented no technological breakthroughs and left IBM a "sitting duck."(72) It is surely the case that IBM's choice of an open modular system was a two-edged sword that gave the company a majority stake in a standard that had grown well beyond its control.

What is especially interesting is the diversity of sources of these compatible machines. Many come from American manufacturers who sell under their own brand names. These include Compaq, Zenith, Tandy, and Kaypro, the latter two having dumped their incompatible lines in favor of complete IBM compatibility.(73) Another group consists of foreign manufacturers selling under their own brand names; the largest sellers among these are Epson and NEC of Japan and Hyundai of Korea.(74) But there is also a large OEM (original-equipment manufacturer) market, in which firms--typically Taiwanese or Korean, but sometimes American or European--manufacture PCs for resale under another brand name. The popular Leading Edge computer, for example, is made by Daewoo of Korea, which bought its American distributor out of bankruptcy.(75) Until recently, the AT&T PC was manufactured by Olivetti in Italy; the contract has now gone to Intel's Systems Division, which maintains a lively OEM business.(76)

But perhaps the most interesting phenomenon is the no-name clone--the PC assembled from an international cornucopia of standard parts and resold, typically through mail orders or storefronts. Because of the openness and modularity of the IBM PC and the dominance of its bus and software standards, a huge industry has emerged to manufacture compatible parts. The resulting competition has driven down prices in almost all areas. Most manufacturers, even the large branded ones, are really assemblers, and they draw heavily on the wealth of available vendors. But the parts are also available directly, and it is in fact quite easy to put together one's own PC from parts ordered from the back of a computer magazine. By one 1986 estimate, the stage of final assembly added only $10 to the cost of the finished machine--two hours' work for one person earning about $5 per hour.(77) Because the final product could be assembled this way for far less than the going price of name brands--especially IBM--a wealth of backroom operations sprang up. One such operation, begun from Michael Dell's dormitory room at the University of Texas, has grown into a business with revenues in the tens of millions of dollars.(78)

The parts list is truly international. Most boards come from Taiwan, stuffed with chips made in the United States (especially microprocessors and ROM BIOS), Japan, or Korea (especially memory chips). Hard disk drives come from the United States, but floppy drives come increasingly from Japan. A power supply might come from Taiwan or Hong Kong. The monitor might be Japanese, Taiwanese, or Korean. Keyboards might come from the United States, Taiwan, Japan, or even Thailand. Although Japan has been an ever-present fear in the American microcomputer industry, that country's success in PCs has not paralleled its well-known success in some other areas of electronics. Apart from laptop computers, the biggest Japanese sellers are NEC at the high end, with a 5.1 percent market share in 1988, and low-end Epson, with a 4.1 percent share.(79) The biggest foreign players are in fact Taiwan and Korea, countries with dramatically different industry structures.

Taiwan entered the computer business much earlier than Korea. By the late 1970s, the island had become the manufacturing epicenter of a network of low-cost--and illegal--clones.(80) The principal target was the Apple II; but, fed with Taiwanese and other parts, the computer bazaars of Hong Kong could also supply S-100 machines and, in the early 1980s, IBM PC clones. Although there were attempts to export these clones outside the Far East--attempts that met with swift legal action from American firms--most of the clone buyers were local amateurs and business people who could not have afforded the real thing. A side-effect of this industry, then, was to open up the Far East, and especially Taiwan, to the microcomputer and to microcomputer technology. In effect, the years of piracy built up capabilities in the Taiwanese economy that soon could be--and were--directed toward making legal IBM clones and related equipment for export to the Far East, Europe, and the United States. As is apparently traditional in Taiwan, firms like Acer (formerly Multitech) and Mitac are closely held companies that draw for engineering talent on American-trained Taiwanese. Although frequently heavy-handed, the Taiwanese government in this case restricted itself to offering a few generic research and development and coordination services.(81) The leading Taiwanese firm is Acer, which uses the same automated assembly techniques as American firms like Compaq to produce OEM clones for the likes of Unisys, Canon, Siemens, and Philips. Acer also sells under its own brand name; in fact, Texas Instruments makes Acer machines under license in the United States. The company pours about 5 percent of revenues back into R&D, is making a move into high-end work stations, and has begun investing in American firms.(82)

In sharp contrast to the decentralized and entrepreneurial Taiwanese industry, Korea has entered the microcomputer business--as it has so many other businesses--within the context of a handful of large, vertically integrated firms. Impressed by the model of Japan's industrialization, Korea has attempted to build capabilities in a conscious, directed way within large organizations. The large conglomerates--Hyundai, Daewoo, Samsung, and Gold Star--have all invested heavily in technology to fabricate their own DRAMs and some other chips, with considerable help from the Korean government.(83) The firms have waited until the technology matured and have gone after the low end in microcomputers. The most successful has probably been Daewoo, which provided the OEM machines for Michael Shane's Leading Edge Hardware Products. The combination of Shane's marketing with the Koreans' low-cost manufacture and competent design made the Leading Edge the single most popular clone of the original IBM PC for most of the 1980s. Recently, however, Hyundai, which is the only Korean firm to sell under its own name rather than on an OEM basis, has edged Daewoo in the American market. Korean success has not been universal, however. In some areas Korean firms have been hampered by the lack of (external) technical capabilities. In printers, for example, the inability to manufacture critical parts (notably the printhead) and certain key chips has forced the firms to import most of the printer's components and robbed them of any cost advantage. Another area is disk drives. Because of its strategy of developing internal capabilities, "unlike Taiwan, which has a large number of small specialized firms, Korea lacks an industrial base necessary to make electromechanical parts, such as disk drive mechanisms."(84)

The Victory of the Clones * The era of the clones is by no means over. Clone makers quickly followed IBM upscale to copy machines using Intel's 80286 and 80386 chips. Indeed, manufacturers of compatibles have begun beating IBM to the punch in introducing new technology. It was Compaq, for example, that introduced the first machine based around the Intel 80386 chip; it also produced a machine using the 80486 chip before IBM did.(85) IBM's response, after a record with more failures than successes, was to begin making the PC proprietary again.(86)

In April 1987, IBM announced its PS/2 line of computers. These machines offered a streamlined design, smaller high-density disk drives, and integrated functions, like high-resolution graphics, that had previously required add-on boards. The higher-end machines, based around the 80286 and the 32-bit 80386 chips, used a new proprietary bus called the Micro Channel Architecture (MCA). The original IBM PC had established an 8-bit industry bus standard, and the PC AT had established a new standard 16-bit bus called the Industry Standard Architecture (ISA). Though still serviceable for most uses, the AT standard was no longer optimal for high-speed 386 and 486 machines. In announcing the Micro Channel, IBM was attempting not only to set the standard, but also to prevent others from taking advantage of it. Nine of the major clone makers, with some nudges from Intel and Microsoft, quickly banded together to announce development of a competing 32-bit bus to be called the Extended Industry Standard Architecture (EISA) bus.(87)

This development led many to expect a protracted, and perhaps fierce, battle of the standards. Essentially, however, no such battle has emerged. The ISA (and to a lesser extent the EISA) standard is perhaps even stronger today than in its early years. IBM continues to sell MCA machines almost exclusively, but has lost market share in doing so. Very few MCA clones appeared, although most manufacturers had the capabilities to make them. IBM's attempt to take the PC proprietary was at best a mixed success. The company was able to trade on its name and reputation to take rents for a while, selling high-priced machines to corporate customers and others seeking reliability over price and performance. But the passage of time, along with an increasing price-performance gap between IBM and its competitors, soon persuaded more and more corporate customers to buy clones. Newcomers like Dell, AST, Packard Bell, CompuAdd, Gateway 2000, Northgate, and Zeos have gained sales through aggressive pricing--increasingly including sales to corporate buyers.(88) The thickness and competitiveness of the market for components not only drove down PC prices, but also increased reliability. Now a mail-order clone house could credibly offer better service than a big-name manufacturer simply by shipping overnight replacement parts to be swapped by the PC owner. This reduced the rents to a name brand. Moreover, experience with clones diffused through the user population, further reducing the advantages of a brand.(89)

IBM's response has been to cut prices. IBM is undergoing a massive restructuring in response to problems that go well beyond its PC business. The company is selling off divisions and turning others into autonomous units. In effect, IBM is attempting to respond to a changing market with large-scale vertical disintegration.(90)

Ironically, one of the major victims in the victory of the low-cost clones has been Compaq, one of the first clone makers. Compaq had developed a strategy of competing against IBM on price and performance. Like IBM, Compaq targeted the corporate customer who was willing to pay a premium for reputation, the latest technology, and a hand-holding sales force of exclusive dealers. By 1991, Compaq had been badly wounded by its lower-priced competitors. What had been a small start-up firm found itself weighed down by high-cost internal production and bureaucratic purchasing.(91) Compaq's response has been to reorient itself toward the external economies of the market--in a way, indeed, that bears a remarkable similarity to IBM's strategy for the original PC. Compaq's chairman, Benjamin Rosen, sent executives incognito to Comdex, the computer trade fair, where they discovered that the company could buy parts in the market and assemble a PC for far less than Compaq's own internal production costs. Taiwanese circuit boards, for example, cost some 30 percent less than Compaq's production cost. The company set up an independent business unit--called the Ruby project--to make a low-cost machine and gave the unit autonomy and the right to bypass internal purchasing procedures. Compaq's internal divisions were not guaranteed contracts and won the rights to produce some components only after lowering their costs substantially--drawing in part on lessons learned by Compaq executives while touring the facilities of outside suppliers.(92)

Apple, with its incompatible line of computers, has also felt price pressure recently.(93) In order to appreciate Apple's position today, we need to back up in time.

Thriving on continued strong sales of the Apple II, Apple survived some disastrous product development efforts in the late 1970s and early 1980s. I will treat these, especially the Apple III, in more detail later. In the late 1970s, Steven Jobs, who had left the Apple III project and was spearheading development of another new machine, paid a reluctant visit to the Xerox Palo Alto Research Center (PARC), where researchers had been developing advanced new ideas for microcomputers.(94) There Jobs saw bit-mapped graphics, overlapping windows, and a pointing control device called a mouse. He went back to Apple and incorporated much of what he saw into the Lisa, which appeared in January 1983. The new machine was expensive ($10,000) and slow, and it lacked software; in fact, it was not much more of a success commercially than the disastrous Apple III. But it set Apple on a new strategic course: the technological high road. In January 1984, the Macintosh appeared, containing most of the Lisa's features in a less expensive package. The Mac was still not much of a business machine, lacking software and the large memory that IBM users expected. But its novelty won it a following, and John Sculley, who became president of Apple in 1983 and eventually ousted Jobs in 1985, worked to upgrade the Mac into a capable business machine.(95) The company introduced the Macintosh Plus in January 1986 and the Macintosh SE and a high-end machine called the Macintosh II in 1987. By pushing the Mac's desktop publishing capabilities, Apple insinuated itself into the large and medium-sized corporations it had previously neglected. Employees impressed with the Mac's ease of use soon began using the machine for tasks beyond desktop publishing, in competition with IBM-compatible machines.

Sculley's reorientation of the Mac line toward business had the effect of spurring a partial convergence toward the IBM standard. The Mac can now read and write IBM-formatted diskettes, and the Mac II is a modular machine that can use some of the same kinds of parts--such as high-resolution color monitors--as the IBM-compatibles. From their side, IBM and its cohorts have been moving in Macintosh directions as well. Newer IBM-compatibles increasingly sport a mouse, and software writers routinely use overlapping windows and icons. In the summer of 1990, Microsoft introduced version 3.0 of its Windows software, a "graphical user interface" that gives IBM-compatibles some of the "look and feel" of a Mac. And dual-version software, like the word-processing program Microsoft Word, makes for "portability" between, for example, an IBM at work and a Mac at home.

One effect of this partial convergence of the Macintosh and IBM standards is that the decreasing cost and increasing power and reliability of ISA clones have drawn Apple away from its niche strategy and into mainstream price competition. In late 1990, Apple introduced a low-priced version of the Mac called the Classic, priced at under $1,000. On the whole, Apple has fared well, holding market share over the years and occasionally eclipsing IBM's share.(96)

Table 1 presents a portrait of the microcomputer industry from 1980 through 1991. Market shares are based on units sold rather than on dollar value, and the table includes makers of low-end "home" computers as well as business-oriented PCs. Nonetheless, the table suggests both the rapid change in the industry and the declining market share of the leading producers. Table 2 presents the top PC manufacturers for various years, with their (declining) share of the market.

[TABULAR DATA FOR TABLES 1 AND 2 OMITTED]

Toward a Standard, Modular Future? * Underlying the partial convergence of the Macintosh and the IBM standard is the trend toward greater power--faster execution and more RAM and disk memory. A typical high-end configuration today might be an IBM-compatible machine using an Intel 80486 microprocessor running at 33 or 50 megahertz.(97) The machine might have eight megabytes of internal memory and a 150-megabyte hard drive. Such computers are beginning to challenge the power of scientific work stations made by the likes of Sun Microsystems and Hewlett-Packard, which have themselves greatly eroded the business of traditional minicomputer makers like DEC and Data General.(98) It seems clear that the future of the personal computer lies in higher-power chips, probably 64-bit microprocessors using the reduced instruction-set computing (RISC) architecture, which is incompatible with the Intel and Motorola chips now in use. The standard for the future is thus up for grabs, and coalitions of players are already positioning themselves for battle.

Perhaps the most significant development is the alliance between IBM and Apple signed on 3 October 1991.(99) In the short run, these two former adversaries will develop both a common hardware platform and an operating system. The hardware will revolve around an inexpensive version of IBM's RS/6000 chip, to be produced by Motorola in a joint venture. These chips will power the next generation of Macintosh computers. The operating system will be a version of IBM's AIX operating system (a variant of the UNIX system developed by AT&T for minicomputers and work stations), adapted to the Macintosh user interface. Thus both the Mac and IBM's RS/6000 work stations could run the same software. For the longer run, IBM and Apple have formed a joint venture called Taligent to develop an object-oriented operating system based on an Apple project called Pink. The goal is to write an operating system to be called Power Open that will allow the RS/6000 PC--as it is now being touted--to run software written for any operating system, including the Mac, Windows, or IBM's OS/2.(100)

The IBM-Apple alliance is in many ways a response to the dominance of Intel chips and the Microsoft standard in the current PC market. In most minds, it is Microsoft, not IBM, that is the dominant force in microcomputers today.(101) Microsoft banded together with DEC, Compaq, and a large number of smaller PC makers in the Advanced Computing Environment (ACE) consortium to develop next-generation hardware and software standards. The hardware was to be built around RISC chips from MIPS Computer Systems, and the operating system was to allow "interoperability" between Intel-based PCs and MIPS RISC work stations. The consortium has felt the difficulties of collective action, however, and its ability to forge a standard is not clear. DEC has developed its own RISC chip, the Alpha, and Microsoft has endorsed it. Other consortium members are attracted to Intel's P5 chip.(102)

Whatever happens, however, it is clear that all players have abandoned a proprietary strategy. All recognize that an open standard, for both hardware and software, is inevitable, and the strategic issues are ones of placement within a mostly nonproprietary world.

External Economies and the Microcomputer Industry

Although dependent on, and in many ways driven by, economies internal to the semiconductor industry, the rapid growth and development of the microcomputer industry is largely a story of external economies. It is a story of the development of capabilities within the context of a decentralized market rather than within large vertically integrated firms. Indeed, the microcomputer industry represents, in many ways, a case almost exactly opposite to the picture of economic progress that Chandler paints in Scale and Scope.

In Chandler's story, a small number of prime movers--who are not necessarily the industry pioneers--seize a dominant, oligopolistic position by making large investments in high-volume production and marketing and in a managerial hierarchy. But it is not clear that there were any such Chandlerian prime movers in the microcomputer industry.(103)