Linus Torvalds was born in 1969 in Helsinki, Finland. He attended the University of Helsinki, graduating in 1996 with a degree in computer science. In the early 1990s he discovered Minix, a Unix-like operating system, and was inspired to create his own. While at the university, he wrote it on his own time using his own resources, and he released it to a newsgroup in September 1991 after a friend made it available on an FTP server. The result was an open-source operating system project that has spawned many different flavors of the system and its code. Today, Linux is gaining ground in the desktop market and is also a front runner in high-end enterprise systems. Torvalds’ contribution of the kernel is often credited with completing the GNU project’s goal of a free, Unix-like operating system.
A graduate of Oxford University, England, Tim now holds the 3Com Founders chair at the Laboratory for Computer Science (LCS) at the Massachusetts Institute of Technology. He directs the World Wide Web Consortium, an open forum of companies and organizations whose mission is to lead the Web to its full potential.
With a background in system design for real-time communications and text-processing software development, in 1989 he invented the World Wide Web, an internet-based hypermedia initiative for global information sharing, while working at CERN, the European Particle Physics Laboratory. He wrote the first web client (a browser-editor) and the first web server in 1990.
Before coming to CERN, Tim worked with Image Computer Systems of Ferndown, Dorset, England, and before that was a principal engineer with Plessey Telecommunications in Poole, England.
In 1981 Kevin Mitnick was convicted of destroying data over a computer network and of stealing operator’s manuals from the telephone company. As his escapades continued, he became the first high-profile hacker.
Born in 1964 in Los Angeles, Mitnick began hacking in the 1970s by changing friends’ grades on his high school’s computer system. He was convicted again in 1983 for breaking into a Pentagon computer over the ARPANET. Later he was accused of tampering with a TRW credit computer and went into hiding; a warrant issued for his arrest promptly disappeared from police records. He was then convicted of stealing software in 1987. In 1988, for attempting to break into Easynet, Digital’s computer network, he pled guilty to one count of computer fraud and one count of possessing illegal long-distance access codes. As part of a plea bargain he was sentenced to one year in prison and six months of counseling.
After serving his sentence he went to work for a detective agency, where it was soon discovered that someone was illegally using a commercial database at the company. When a warrant was issued and authorities went to arrest him, Mitnick went underground, hiding from the police and FBI for two years. In 1994 he broke into the computers of scientist and computer security expert Tsutomu Shimomura; he was tracked down within a few months and arrested yet again in 1995. Rumors swirled that Mitnick could launch a nuclear strike by whistling into a phone and punching in some numbers; as a result, jail officials placed him in solitary confinement for nearly eight months.
After his release from prison in January 2000, he was banned from using the internet until January 21, 2003. He is now CEO of a computer security consulting company and has published two books.
Lee Felsenstein is an electronic design engineer who was a participant in the early development of personal computers. Two of his designs (the Sol-20 and the Osborne-1) are on display in the Smithsonian, as is the story of the Homebrew Computer Club, which he chaired and where open architecture was developed. Most recently, Lee was a senior researcher at Interval Research Corporation in Palo Alto, participating in long-range projects to re-invent the information infrastructure. Mr. Felsenstein lives in Palo Alto, CA. He holds several patents and in 1994 received a Pioneer of the Electronic Frontier Award from the Electronic Frontier Foundation.
Robert (Bob) Metcalfe was a pioneer in computing, developing the theory behind Ethernet networking, which has become the industry standard. Born in 1946 in Brooklyn, NY, by age 10 he was sure he wanted to be an electrical engineer and go to MIT. He enrolled in 1964, leaving five years later with his degree, and subsequently earned a Ph.D. in computer science from Harvard.
While working at Xerox in 1973, he invented Ethernet. In 1979 he founded 3Com, familiar to everyone today, and thanks to his management and marketing skills he propelled Ethernet to the de facto standard and built 3Com into a Fortune 500 company. After leaving in 1990 he began to voice his concerns about problems in computing and networking, and in 1993 he became vice president of International Data Group. In 1995 he stirred debate by declaring that the internet would collapse by 1996. Although a few ISPs suffered setbacks during that period, they were minor and did not bring the web to its knees. Metcalfe responded by eating his words, literally: he blended a copy of his magazine column and drank the resulting beverage at the 1997 Sixth International World Wide Web Conference.
Larry Roberts was an integral part of the formation of the ARPANET as one of its chief designers, and he was also the first to connect two computers to each other via dedicated phone lines, proving that such connections were possible. After receiving a Ph.D. from MIT, he heard the inspiring words of J.C.R. Licklider at a conference and was enamored with the idea of computer networks.
At the time he was working at MIT’s Lincoln Laboratory, and in 1965 a connection was proposed from Lincoln’s TX-2 computer to System Development Corporation’s Q-32 in Santa Monica. ARPA agreed, and Lincoln appointed Roberts supervisor. The link was established over a four-wire Western Union telephone line, allowing the machines to send messages to one another. ARPA later recruited Roberts to head its computer networking program, and he became one of the ARPANET’s greatest advocates and primary architects, leading to the first multiple-computer network.
Born in 1915 in St. Louis, Missouri, J.C.R. Licklider (“Lick”) studied physics, chemistry, fine arts, and psychology, eventually earning undergraduate degrees and a Ph.D. He was a professor at Harvard University in the 1940s before moving on to MIT. There he was in charge of a human engineering group at Lincoln Lab, MIT’s air defense laboratory, where he worked extensively with computers. He believed that technology had the power to save humanity, and in 1960 he published “Man-Computer Symbiosis”, which put forth the idea that computers would eventually help humans make decisions. It was an unorthodox view that computers would become more than just calculating tools, but he believed they would augment the human intellect.
He eventually accepted a position as head of ARPA’s Information Processing Techniques Office (IPTO). Licklider formed alliances with the most advanced academic computer centers across the country, a community he called the Intergalactic Computer Network. During his two-year tenure, he made important contributions to computer science. He wrote a memo, “To Members and Affiliates of the Intergalactic Computer Network”, in which he put forth the idea of an interactive network linking people via computer. He planted the seeds of the World Wide Web and saw its birth before his death in 1990.
As the eldest son of the president of IBM, Thomas Watson Jr. grew up tortured by self-doubt. He suffered bouts of depression and once burst into tears over the thought that his formidable father wanted him to join IBM and eventually run what was already a significant company. “I can’t do it,” he wailed to his mother. “I can’t go to work for IBM.”
Yet 26 years later, Watson not only succeeded his father but would eventually surpass him. IBM is now synonymous with computers, even though the company did not invent the device that would change our lives, nor had it shipped a single computer before Tom Jr. took over.
But he boldly took IBM–and the world–into the computer age, and in the process developed a company whose awesome sales and service savvy and dark-suited culture stood for everything good and bad about corporate America. No wonder the Justice Department sought (unsuccessfully) to break it up.
A bureaucracy and a factory are automated machines in Wiener’s view. The whole world — even the universe — could be seen as one big feedback system subject to the relentless advance of entropy, which subverts the exchange of messages that is essential to continued existence (Wiener, 1954). This concept of interdependent communications systems, coupled with Wiener’s assertion that a machine that changes its responses based on feedback is a machine that learns, indicates the distinction between media and cybermedia.
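Wiener’s claim that a machine which changes its responses based on feedback is a machine that learns can be illustrated with a toy regulator. The following sketch is illustrative only; the gain value and the one-line “room model” are assumptions chosen for the example, not anything from Wiener’s work.

```python
# A thermostat-like controller in Wiener's spirit: its response at each
# step is corrected by feedback from the very quantity it regulates.
# The gain and the simple additive room model are illustrative assumptions.

def regulate(target, temp, steps=50, gain=0.3):
    """Proportional feedback: each step, nudge the temperature toward
    the target by a fraction of the observed error."""
    for _ in range(steps):
        error = target - temp      # feedback: measure the deviation
        temp += gain * error       # adjust the response in proportion
    return temp

print(round(regulate(20.0, 5.0), 3))  # converges near 20.0
```

Because each pass shrinks the error by a constant factor, the system settles at the target; remove the feedback term and it never corrects itself, which is the distinction Wiener drew.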
Since Wiener’s time, cybernetics as a discipline experienced a rapid rise (in the 1960s) and a swift decline, but it appears to be on the upswing again because of a broadened perspective. The original foundation of cybernetics was limited to the observation of the states of a system, with the drawback being that the states observed — and defined — were wholly dependent on an observer who was construed as impartial and having no effect on the observed system.
Shannon was a graduate of the University of Michigan, being awarded degrees in mathematics and electrical engineering in 1936. He then went to the Massachusetts Institute of Technology, where he obtained a master’s degree in electrical engineering in 1937 and his Ph.D. in mathematics in 1940. Shannon’s master’s thesis, “A Symbolic Analysis of Relay and Switching Circuits”, showed how Boolean algebra could be used to analyze and optimize relay switching circuits.
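The core idea of that thesis can be sketched in a few lines: switches wired in series behave like Boolean AND, switches in parallel like OR, so the laws of Boolean algebra let an engineer prove two circuit layouts equivalent (and pick the cheaper one) without building either. The circuit below is a made-up example, not one from Shannon’s thesis.

```python
from itertools import product

def series(a, b):      # switches in series: current flows only if both close
    return a and b

def parallel(a, b):    # switches in parallel: current flows if either closes
    return a or b

# A small circuit: switch x in series with (y parallel z).
def circuit(x, y, z):
    return series(x, parallel(y, z))

# The distributive law rewrites it as (x AND y) OR (x AND z).
def rewritten(x, y, z):
    return parallel(series(x, y), series(x, z))

# Checking all 8 input combinations confirms the two layouts are equivalent.
assert all(circuit(*v) == rewritten(*v)
           for v in product([False, True], repeat=3))
```

Exhaustive truth-table checks like this are exactly what the algebraic laws spare a designer from doing by hand on larger circuits.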
At the Massachusetts Institute of Technology he also worked on the differential analyzer, an early type of mechanical computer developed by Vannevar Bush for obtaining numerical solutions to ordinary differential equations; Shannon published “Mathematical Theory of the Differential Analyzer” in 1941. In 1948 he published “A Mathematical Theory of Communication”, formulating the link between computers and communication. He introduced the bit as the fundamental unit of information, and the paper’s analysis establishes several theorems on ways to encode messages so that information can be exchanged reliably. These ideas would influence electronic, computer, and communications design for decades to come.
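The bit as a unit falls out of Shannon’s entropy formula, which measures the average information per symbol of a source and sets the floor on how compactly its messages can be encoded. A minimal sketch of that computation:

```python
from math import log2

# Shannon entropy H = -sum(p * log2(p)) over symbol probabilities p,
# measured in bits per symbol.
def entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit per toss...
print(entropy([0.5, 0.5]))   # 1.0
# ...while a biased coin carries less, so its tosses compress better.
print(entropy([0.9, 0.1]))   # about 0.469
```

The second figure is why a long record of biased coin flips can be stored in roughly half the space of fair ones: the encoding theorems in the 1948 paper guarantee codes approaching that bound.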