From the outside, the world headquarters of Microsoft Corp. is deceptively modest. Located in Redmond, Wash., 20 km northeast of Seattle, the headquarters’ 22 grey low-rise buildings nestle in immaculately landscaped grounds. Inside, however, offices are cluttered with everything from empty beer bottles to bicycles, as the complex’s 4,000 employees strive to transform the latest microchip developments into salable products. Some of Microsoft’s programmers have as many as seven computers working at once in their semi-detached cubicles. Others routinely work past midnight in the drive to perfect new technologies. Said John Lazarus, director of systems marketing: “My job as a manager is to throw them out of the offices to sleep.”
The feverish pace is partly driven by a host of competitors eager to emulate Microsoft’s $2.1 billion in annual sales. But underlying the frantic activity is a shared vision of a future in which computers will be as ubiquitous as electricity. The industry is in an all-out campaign to
make the machines ever faster, cheaper and easier to use. Indeed, the ultimate goal is to make computers seem not like computers at all. To that end, users may eventually be able to speak directly to their personal computers, while other machines invite them to enter imaginary, computer-generated parallel realities. A glimpse of the future as the wizards of the microchip envision it:
Multimedia: The goal of multimedia technology is to enable personal-computer users to achieve special effects on their own small screens. One product already on the market permits a user to hear Beethoven’s Ninth Symphony—while following the score on the screen. However, such products are available only on so-called CD-ROM (compact disc read-only memory), so users cannot edit or add to the audio and video components.
There are other limitations. For one thing, multimedia requires massive amounts of computer memory: a single color photograph may occupy 150,000 bytes of memory—as much as 100 pages of text. Still, some of the limitations
on multimedia are disappearing. In December, Apple Computer Inc. introduced a software program that compresses a chunk of data when it is stored and decompresses it when the user calls it up again—vastly increasing the effective storage capacity of computers that use the program.
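The store-compressed, restore-on-demand cycle that Apple’s program performs can be illustrated with a short sketch using a modern general-purpose compressor (zlib here; the article does not say what algorithm Apple used, and the sample data is purely illustrative):

```python
import zlib

# Highly repetitive data standing in for the redundant pixels of a
# colour photograph -- 150,000 bytes, like the example above.
# (Illustrative only; not Apple's actual scheme or data.)
original = b"blue sky pixel " * 10_000

compressed = zlib.compress(original)    # shrink the chunk before storing it
restored = zlib.decompress(compressed)  # expand it when the user calls it up

assert restored == original             # nothing is lost in the round trip
print(f"{len(original)} bytes stored as {len(compressed)}")
```

Because the compressed copy is what sits on disk, the same drive effectively holds far more data, which is why such software vastly increases a computer’s usable storage capacity.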
Voice-activated computers: A handful of computers can already recognize spoken words—in limited applications. Some programs enable the PC user to perform simple functions by uttering such one-word commands as “delete” or “save.” And some experts predict that before the end of the decade, people will be able to call home and tell their computer to turn on the VCR to record a television show. “You won’t even have to say what time or what channel the show is on,” said Andrew Toller, an industry analyst for Montreal-based DMR Group Inc., a computer consulting company. “It will get to know your habits.” Apple has taken an important first step in that direction. It has developed a prototype continuous-speech recognition system, which has a vocabulary of about 200 English-language words and can understand some short sentences. “You are not carrying on a dialogue,” acknowledged Rick Parfitt, a senior research scientist with Apple’s advanced technology group in Cupertino, Calif. “But it’s still a whole new way of interacting with a computer.”
Virtual reality: Participants wear helmets containing screens that show full-color, three-dimensional computer-generated images. The user also wears a glove that is electronically linked to the computer, enabling him to manipulate objects in the make-believe world. One company has started to make the technology an actual reality: W Industries of Leicester, England, manufactures a system for use in video arcades. In one game, Dactyl Nightmare, four players try to find and shoot one another while also trying to avoid being seized by a pterodactyl.
Still, most analysts say that virtual reality will prove most valuable in the professional world. Doctors, for example, will be able to practise rare surgical procedures on a virtual-reality patient before attempting the operation on a live one. The main drawback so far is that the images are too grainy and cartoon-like. But making the images more realistic requires so much computing power that it slows the reaction time between what the user tells the computer to do with the glove and the corresponding action appearing on the screen.
Still, even if the timetable for the introduction of the most ambitious of the new technologies slips, what appears certain is that the computer revolution continues—and with it, the changes that the microchip is producing at so many levels of everyday life.