The Selective Sequence Electronic Calculator (SSEC) was first introduced to the public on January 27, 1948. It was a hybrid machine, combining electromechanical relays and vacuum tubes. It was put on display behind a large window on the ground floor of IBM’s New York headquarters, where the public nicknamed it “Poppa” for its noisy operation.
Although not fully electronic, it was the first machine to treat programmed instructions as data, and it had many features of the stored-program machines that came later. Like most computers of its day it was large, occupying a 60-by-30-foot room.
The all-electronic ENIAC had been introduced before the SSEC was completed, so the SSEC was quickly outdated. It served mainly as a public relations prop, a role it filled well until it ceased operation in 1952 and the new IBM 701 took its place on display.
A bureaucracy and a factory are automated machines in Wiener’s view. The whole world — even the universe — could be seen as one big feedback system subject to the relentless advance of entropy, which subverts the exchange of messages that is essential to continued existence (Wiener, 1954). This concept of interdependent communications systems, coupled with Wiener’s assertion that a machine that changes its responses based on feedback is a machine that learns, indicates the distinction between media and cybermedia.
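Wiener’s claim that a machine changing its responses based on feedback is a machine that learns can be sketched in a few lines. This is an illustrative toy of my own, not anything from Wiener: a loop that repeatedly corrects its output from the error it observes, so the “response” of the system is shaped by the messages fed back into it.

```python
# Illustrative sketch (my own example, not Wiener's): a system that
# adjusts its response based on feedback until the observed error
# shrinks toward zero.

def run_feedback_loop(target, steps=50, learning_rate=0.1):
    """Drive an output toward `target` by correcting from error feedback."""
    output = 0.0
    for _ in range(steps):
        error = target - output          # feedback: compare output to goal
        output += learning_rate * error  # adjust the response accordingly
    return output

# After enough iterations the output converges on the target.
print(run_feedback_loop(10.0))
```

The design choice here is the simplest possible negative-feedback rule: each step cancels a fixed fraction of the remaining error, so the system settles rather than oscillating.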
Since Wiener’s time, cybernetics as a discipline experienced a rapid rise in the 1960s and a swift decline, but it appears to be on the upswing again thanks to a broadened perspective. The original foundation of cybernetics was limited to observing the states of a system; the drawback was that the states observed, and defined, depended wholly on an observer who was construed as impartial and as having no effect on the observed system.
A bit (abbreviated b) is the most basic information unit used in computing and information theory. A single bit (short for binary digit) is a zero or a one, or a true or a false, or for that matter any two mutually exclusive states. Claude E. Shannon first used the word bit in a 1948 paper.
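Because a bit is just one of two mutually exclusive states, a sequence of bits can represent any value. A minimal sketch (the variable names are my own) packs eight bits into a single number and reads them back out:

```python
# Illustrative sketch: eight bits packed into one byte and unpacked again.
bits = [1, 0, 1, 1, 0, 0, 1, 0]

# Pack: each bit shifts the accumulated value left and adds itself.
value = 0
for b in bits:
    value = (value << 1) | b

# Unpack: test each bit position with a shift and a mask.
unpacked = [(value >> i) & 1 for i in reversed(range(8))]

print(value)             # 178, i.e. binary 10110010
print(unpacked == bits)  # True
```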
Shannon graduated from the University of Michigan in 1936 with degrees in mathematics and electrical engineering. He then went to the Massachusetts Institute of Technology, where he earned a master’s degree in electrical engineering and, in 1940, a Ph.D. in mathematics. His master’s thesis, “A Symbolic Analysis of Relay and Switching Circuits,” applied Boolean algebra to the analysis and optimization of relay switching circuits.
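The core idea of that thesis can be illustrated with a small sketch (the example circuit is mine, not one from Shannon’s thesis): a relay network is written as a Boolean expression, and algebraic simplification yields an equivalent circuit with fewer switches, which can be checked exhaustively over the truth table.

```python
# Illustrative sketch of Boolean analysis of a relay circuit:
# (A AND B) OR (A AND NOT B) simplifies algebraically to just A,
# replacing two series pairs of relays with a single switch.
from itertools import product

def original(a, b):
    return (a and b) or (a and not b)  # two series pairs in parallel

def simplified(a, b):
    return a                           # a single switch

# Verify equivalence over every combination of switch states.
assert all(original(a, b) == simplified(a, b)
           for a, b in product([False, True], repeat=2))
print("equivalent for all inputs")
```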
At MIT he also worked on the differential analyzer, an early type of mechanical computer developed by Vannevar Bush for obtaining numerical solutions to ordinary differential equations; Shannon published “Mathematical Theory of the Differential Analyzer” in 1941. In 1948 he published “A Mathematical Theory of Communication”, formulating the link between computers and communication. He coined the term bit for the fundamental unit of information, and his analysis established several theorems on encoding messages so that information can be exchanged reliably. These ideas would influence electronic, computer, and communications design for decades to come.
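Shannon’s measure of information makes the bit a quantitative unit: an outcome of probability p carries −log₂(p) bits, so a fair coin flip carries exactly one bit and a biased source carries less. A minimal sketch of that entropy calculation (the function name is my own):

```python
# Illustrative sketch of Shannon entropy: the average information,
# in bits, produced by a source with the given outcome probabilities.
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # fair coin: exactly 1.0 bit
print(entropy([0.9, 0.1]))  # biased coin: less than 1 bit
```

The `if p > 0` guard follows the convention that zero-probability outcomes contribute no information, avoiding log(0).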