The Cuckoo’s Egg

Cliff Stoll’s “The Cuckoo’s Egg” spent more than four months on the best-seller list in 1989. It chronicles the story of his pursuit of a German computer spy. A systems manager at Lawrence Berkeley Labs, Stoll had detected anomalies in his network and discovered that an intruder was using a stolen account and granting himself the powers of a systems manager.

For nearly a year Stoll tracked the hacker’s movements through his system, and what began as a curiosity turned into a law-enforcement investigation as the culprit accessed sensitive military systems. The hacker turned out to be a German computer whiz selling information to Soviet intelligence. In 1989 Stoll published the account, in which he made public several major security weaknesses in widely used systems.

Neuromancer

When Neuromancer by William Gibson was first published, it created a sensation. Or perhaps it would be more precise to say that it was used to create a sensation, for Bruce Sterling and other Gibson associates declared that a new kind of science fiction had appeared which rendered merely ordinary SF obsolete. Informed by the amoral urban rage of the punk subculture, depicting the developing human-machine interface created by the widespread use of computers and computer networks, and set in a near future of decayed city landscapes like those portrayed in the film Blade Runner, it claimed to be the voice of a new generation. (Interestingly, Gibson himself has said he had finished much of what was to be his body of early cyberpunk fiction before ever seeing Blade Runner.)

Eventually it was seized on by hip “postmodern” academics looking to ride the wave of the latest trend. Dubbed “cyberpunk,” the stuff was being talked about everywhere in SF. Of course, by the time symposia were being held on the subject, writers declared cyberpunk dead, yet it kept being published, and it continues to be published today by writers like K. W. Jeter and Rudy Rucker. Perhaps the best and most representative anthology of cyberpunk writers is Mirrorshades, edited by Sterling, the genre’s most outspoken advocate.

Time Magazine’s Machine Of The Year

From Time Magazine, 1982:
By The Millions, It Is Beeping Its Way Into Offices, Schools And Homes

By Otto Friedrich. Reported by Michael Moritz, San Francisco; J. Madeleine Nash, Chicago; and Peter Stoler, New York

WILL SOMEONE PLEASE TELL ME, the bright red advertisement asks in mock irritation, WHAT A PERSONAL COMPUTER CAN DO? The ad provides not merely an answer, but 100 of them. A personal computer, it says, can send letters at the speed of light, diagnose a sick poodle, custom-tailor an insurance program in minutes, test recipes for beer. Testimonials abound. Michael Lamb of Tucson figured out how a personal computer could monitor anesthesia during surgery; the rock group Earth, Wind and Fire uses one to explode smoke bombs onstage during concerts; the Rev. Ron Jaenisch of Sunnyvale, Calif., programmed his machine so it can recite an entire wedding ceremony.

In the cavernous Las Vegas Convention Center a month ago, more than 1,000 computer companies large and small were showing off their wares, their floppy discs and disc drives, joy sticks and modems, to a mob of some 50,000 buyers, middlemen and assorted technology buffs. Look! Here is Hewlett Packard’s HP9000, on which you can sketch a new airplane, say, and immediately see the results in 3D through holograph imaging; here is how the Votan can answer and act on a telephone call in the middle of the night from a salesman on the other side of the country; here is the Olivetti M20 that entertains bystanders by drawing garishly colored pictures of Marilyn Monroe; here is a program designed by The Alien Group that enables an Atari computer to say aloud anything typed on its keyboard in any language. It also sings, in a buzzing humanoid voice, Amazing Grace and When I’m 64 or anything else that anyone wants to teach it.

As both the Apple Computer advertisement and the Las Vegas circus indicate, the enduring American love affairs with the automobile and the television set are now being transformed into a giddy passion for the personal computer. This passion is partly fad, partly a sense of how life could be made better, partly a gigantic sales campaign. Above all, it is the end result of a technological revolution that has been in the making for four decades and is now, quite literally, hitting home.

Americans are receptive to the revolution and optimistic about its impact. A new poll for TIME by Yankelovich, Skelly and White indicates that nearly 80 percent of Americans expect that in the fairly near future, home computers will be as commonplace as television sets or dishwashers. Although they see dangers of unemployment and dehumanization, solid majorities feel that the computer revolution will ultimately raise production and therefore living standards (67 percent), and that it will improve the quality of their children’s education (68 percent).

The sales figures are awesome and will become more so. In 1980 some two dozen firms sold 724,000 personal computers for $1.8 billion. The following year 20 more companies joined the stampede, including giant IBM, and sales doubled to 1.4 million units at just under $3 billion. When the final figures are in for 1982, according to Dataquest, a California research firm, more than 100 companies will probably have sold 2.8 million units for $4.9 billion.

To be sure, the big, complex, costly “mainframe” computer has been playing an increasingly important role in practically everyone’s life for the past quarter century. It predicts the weather, processes checks, scrutinizes tax returns, guides intercontinental missiles and performs innumerable other operations for governments and corporations. The computer has made possible the exploration of space. It has changed the way wars are fought, as the Exocet missile proved in the South Atlantic and Israel’s electronically sophisticated forces did in Lebanon.

Despite its size, however, the mainframe does its work all but invisibly, behind the closed doors of a special, climate controlled room. Now, thanks to the transistor and the silicon chip, the computer has been reduced so dramatically in both bulk and price that it is accessible to millions. In 1982 a cascade of computers beeped and blipped their way into the American office, the American school, the American home. The “information revolution” that futurists have long predicted has arrived, bringing with it the promise of dramatic changes in the way people live and work, perhaps even in the way they think. America will never be the same.

In a larger perspective, the entire world will never be the same. The industrialized nations of the West are already scrambling to computerize (1982 sales 435,000 in Japan, 392,000 in Western Europe). The effect of the machines on the Third World is more uncertain. Some experts argue that computers will, if anything, widen the gap between haves and have-nots. But the prophets of high technology believe the computer is so cheap and so powerful that it could enable underdeveloped nations to bypass the whole industrial revolution. While robot factories could fill the need for manufactured goods, the microprocessor would create myriad new industries, and an international computer network could bring important agricultural and medical information to even the most remote villages. “What networks of railroads, highways and canals were in another age, networks of telecommunications, information and computerization…are today,” says Austrian Chancellor Bruno Kreisky. Says French Editor Jean Jacques Servan Schreiber, who believes that the computer’s teaching capability can conquer the Third World’s illiteracy and even its tradition of high birth rates: “It is the source of new life that has been delivered to us.”

The year 1982 was filled with notable events around the globe. It was a year in which death finally pried loose Leonid Brezhnev’s frozen grip on the Soviet Union, and Yuri Andropov, the cold-eyed ex-chief of the KGB, took command. It was a year in which Israel’s truculent Prime Minister Menachem Begin completely redrew the power map of the Middle East by invading neighboring Lebanon and smashing the Palestinian guerrilla forces there. The military campaign was a success, but all the world looked with dismay at the thunder of Israeli bombs on Beirut’s civilians and at the massacres in the Palestinian refugee camps. It was a year in which Argentina tested the decline of European power by seizing the Falkland Islands, only to see Britain, led by doughty Margaret Thatcher, meet the test by taking them back again.

Nor did all of the year’s major news derive from wars or the threat of international violence. Even as Ronald Reagan cheered the sharpest decline in the U.S. inflation rate in ten years, 1982 brought the worst unemployment since the Great Depression (12 million jobless) as well as budget deficits that may reach an unprecedented $180 billion in fiscal 1982. High unemployment plagued Western Europe as well, and the multibillion dollar debts of more than two dozen nations gave international financiers a severe fright. It was also a year in which the first artificial heart began pumping life inside a dying man’s chest, a year in which millions cheered the birth of cherubic Prince William Arthur Philip Louis of Britain, and millions more rooted for a wrinkled, turtle-like figure struggling to find its way home to outer space.

There are some occasions, though, when the most significant force in a year’s news is not a single individual but a process, and a widespread recognition by a whole society that this process is changing the course of all other processes. That is why, after weighing the ebb and flow of events around the world, TIME has decided that 1982 is the year of the computer. It would have been possible to single out as Man of the Year one of the engineers or entrepreneurs who masterminded this technological revolution, but no one person has clearly dominated those turbulent events. More important, such a selection would obscure the main point. TIME’s Man of the Year for 1982, the greatest influence for good or evil, is not a man at all. It is a machine, the computer.

Byte Magazine

Byte was a hugely influential computer magazine that began in 1975 and was published throughout the 1980s. It covered developments across the entire field of software and computers, was published monthly, and sold for a yearly subscription of $10. The first issue was printed in September 1975 and featured ads from many companies that would become corporate giants in the near future. Early articles included do-it-yourself hardware projects and software you could write to improve your own machine. Significant articles appeared in its pages, including source code for Tiny C and for BASIC and a first look at CP/M, and the magazine also ran one of Microsoft’s first ads.

In 1979 Byte was sold to McGraw-Hill and, after the emergence of the IBM PC, moved away from the do-it-yourself theme of the early issues to become one of the first computer magazines to do product reviews. By 1990 it had grown to an inch thick and boasted a yearly subscription price of $56, making it the must-have computer periodical. The magazine launched a website in 1993, but it was then bought by CMP Media, and publication ceased in July 1998 to the shock of readers and subscribers. The following year it was revived as a subscription web publication and has been thriving ever since.

When Byte was originally published by Wayne and Virginia Green, there was a fight over the magazine following a court case against their previous publishing company. Virginia Green retained control of the company, however, and Wayne later paid damages in several lawsuits for saying negative things about Byte.

On Distributed Communications Networks

Computer engineer Paul Baran of the RAND Corporation writes a paper, “On Distributed Communications Networks”, describing what later becomes known as packet switching, in which digital data are sent over a distributed network in small units and reassembled into a whole message at the receiving end. Packet switching will be an integral part of the ARPANET a few years later.

Baran’s paper introduced two ideas that were radical at the time. The first proposed changing the structure of the existing communication networks. Instead of a conventional decentralized network, in which several interconnected main centers were each linked to nearby locations, he proposed a distributed network, where each point was connected only to its nearest neighbors. This way, messages had multiple pathways to their destinations.
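To make the contrast concrete, here is a minimal Python sketch, entirely my own illustration rather than anything from Baran’s paper, comparing a centralized star network, where every outpost depends on a single hub, with a small distributed mesh, where each node is linked only to its neighbors. A breadth-first search checks whether two endpoints can still reach each other after a node fails; the node names and topologies are invented for the example.

```python
# Sketch: survivability of a centralized star vs. a distributed mesh.
from collections import deque

def reachable(adj, src, dst, failed=frozenset()):
    """Return True if dst can be reached from src without using failed nodes."""
    if src in failed or dst in failed:
        return False
    seen, queue = {src}, deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            return True
        for nxt in adj.get(node, []):
            if nxt not in seen and nxt not in failed:
                seen.add(nxt)
                queue.append(nxt)
    return False

# Centralized: outposts A-D can talk to each other only through the hub H.
star = {"H": ["A", "B", "C", "D"],
        "A": ["H"], "B": ["H"], "C": ["H"], "D": ["H"]}

# Distributed: a small ring in which each node knows only its nearest neighbors.
mesh = {"A": ["B", "D"], "B": ["A", "C"],
        "C": ["B", "D"], "D": ["C", "A"]}

print(reachable(star, "A", "C", failed={"H"}))  # False: losing the hub cuts A off
print(reachable(mesh, "A", "C", failed={"B"}))  # True: traffic reroutes via D
```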

The second idea was that messages could be chopped up into smaller blocks of data and then reassembled at the receiving end. Each block could travel to its destination by a different route, avoiding the inefficiency of the traditional method of streaming an entire message over a single path. These ideas proved so unconventional that none of the telecom giants were interested, and even Baran moved on to other projects. It wasn’t until later that Larry Roberts, who ran ARPA’s computer networking effort, adopted the approach for the agency’s new network.
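As a rough sketch of this second idea, and again my own toy code rather than Baran’s design, the snippet below splits a message into numbered blocks, shuffles them to mimic arrival over different routes, and reassembles the original at the receiver by sorting on the sequence numbers. The packet size and message text are arbitrary choices for the demonstration.

```python
# Sketch: splitting a message into blocks and reassembling it out of order.
import random

PACKET_SIZE = 8  # bytes of payload per block; an arbitrary choice for the demo

def packetize(message: bytes, size: int = PACKET_SIZE):
    """Split a message into (sequence_number, payload) blocks."""
    chunks = [message[i:i + size] for i in range(0, len(message), size)]
    return list(enumerate(chunks))

def reassemble(packets):
    """Sort the blocks by sequence number and stitch the payloads together."""
    return b"".join(payload for _, payload in sorted(packets))

message = b"On Distributed Communications Networks"
packets = packetize(message)
random.shuffle(packets)                  # blocks arrive over different routes, out of order
assert reassemble(packets) == message    # the receiver still rebuilds the original
print(reassemble(packets).decode())
```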

On Computable Numbers

Written by the mathematician Alan Turing and published in 1936, this paper demonstrates that there are problems to which no mechanically computable solution exists by detailing the design of a theoretical digital computer. The German mathematician David Hilbert had proposed in 1928 that every well-posed mathematical problem could be decided, and that a mechanical procedure could do the deciding. Turing set out to prove Hilbert wrong, describing what is now known as the Turing machine. The machine scans a tape marked with symbols such as 1s and 0s (an idea echoed by the punched tape later used in the first calculating computers) and follows a table of instructions written by a person to work through a problem. The results are recorded back on the tape, the outcome is delivered in binary, and the whole process runs without human intervention. Turing then showed that some questions about these machines themselves, such as whether a given machine will ever finish its computation, cannot be answered by any machine at all, which demonstrates that there are math and logic problems that cannot be solved with algorithms. The paper immortalizes Turing in the annals of computing history, introducing the basic concepts of digital computing upon which modern computer science is based.
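To show the idea in miniature, here is a tiny simulator of a Turing-style machine, offered as an illustrative sketch under simplifying assumptions rather than a transcription of Turing’s own formalism. The transition table maps a (state, symbol) pair to a symbol to write, a head movement, and a next state; the sample rules simply flip every bit on the tape and halt at the first blank cell.

```python
# Sketch: a minimal Turing-style machine with a one-way tape, sufficient
# for this sample machine (it only ever moves right).
def run(tape, rules, state="start", blank=" "):
    tape = list(tape)
    head = 0
    while state != "halt":
        symbol = tape[head] if head < len(tape) else blank
        write, move, state = rules[(state, symbol)]
        if head < len(tape):
            tape[head] = write
        else:
            tape.append(write)
        head += 1 if move == "R" else -1
    return "".join(tape).strip()

# Rules: in state 'start', flip the scanned bit and move right; halt on a blank.
rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", " "): (" ", "R", "halt"),
}

print(run("1011", rules))  # prints 0100
```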