Printed circuit boards were invented in 1936 by the Austrian engineer Paul Eisler while he was working on a radio set. The PCB, as it later became known, would revolutionize electronic circuit design and assembly in the decades that followed. By World War II it was secret technology, used by the U.S. military in proximity fuzes for bombs, mines, and other munitions, resulting in a patent that remained classified for nearly forty years.
The basic formula is simple: electrical components are connected by paths etched into a copper sheet, which is then laminated onto a non-conductive substrate. That formula was arrived at through the combined work of several inventors applying the idea in different ways in the early 20th century.
The proximity-fuze technology was released to the public after the war, but it was not until the 1950s that the U.S. Army devised the Auto-Sembly process, which eliminated the need for multiple wire leads and reduced the number of connections to the board. Later advances in lamination and etching enabled the consumer-electronics explosion that began in the 1980s. It was a crucial advance, placing the PCB in the pantheon of technology that changed the world.
Written by the mathematician Alan Turing and published in 1936, this paper demonstrates that there are problems for which no mechanically computable solution exists, by detailing the design of a theoretical computing machine. In 1928 the German mathematician David Hilbert had posed the question of whether every well-defined mathematical problem could be decided by a mechanical procedure. Turing set out to answer him, describing what is now known as the Turing machine. The machine scans a tape marked with symbols such as 1s and 0s (punched tape was later used in early computing machines) and follows a fixed table of instructions, reading, writing, and moving along the tape, delivering its results in binary and carrying out the whole process without human intervention. Turing then showed that certain well-defined problems cannot be solved by any such machine, which means there are problems of mathematics and logic that no algorithm can solve. This paper immortalizes Turing in the annals of computing history, introducing the basic concepts of digital computing upon which modern computer science is based.
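The tape-and-instruction-table mechanism described above can be sketched in a few lines of code. The simulator below is an illustrative toy, not Turing's own 1936 formalism (his machines differ in several details); the rule table shown, which appends a 1 to a block of 1s on the tape, is a hypothetical example chosen for simplicity.

```python
def run_turing_machine(tape, rules, state="scan", head=0, max_steps=1000):
    """Run a simple Turing machine until it reaches the 'halt' state.

    rules maps (state, symbol) -> (new_state, symbol_to_write, head_move),
    where head_move is +1 (right), -1 (left), or 0 (stay). The tape is a
    dict keyed by position, so it is unbounded; blank cells read as "0".
    """
    tape = dict(tape)  # copy so the caller's tape is untouched
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, "0")
        state, tape[head], move = rules[(state, symbol)]
        head += move
    return tape, state

# Example rule table: skip right over a run of 1s, then write one more 1.
increment_rules = {
    ("scan", "1"): ("scan", "1", +1),  # keep moving right over 1s
    ("scan", "0"): ("halt", "1", 0),   # first blank: write 1 and stop
}

tape, state = run_turing_machine({0: "1", 1: "1", 2: "1"}, increment_rules)
# The run of three 1s has become four: the machine "incremented" in unary.
```

The key point mirrors the paper: the machine's behavior is entirely determined by the finite rule table and the tape contents, with no human intervention once it starts.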
Born June 23rd, 1912 in London, England, Turing displayed a penchant for math at an early age. He attended King's College, Cambridge, and then Princeton, where he earned a Ph.D. in mathematics. He published "On Computable Numbers" in 1936, which established the theoretical foundations of digital, stored-program computing. During WWII he worked for the British government at Bletchley Park, playing a central role in cracking the German Enigma cipher; his codebreaking colleagues there also built the Colossus, the first operational electronic computer.
Eccentric and colorful, he rode a bicycle everywhere (even when it rained) and often wore a gas mask to control hay fever. He was also a long-distance runner and would have been a contender in the 1948 Olympics had it not been for a serious injury. After the war he turned his attention to artificial intelligence, devising what is now called the Turing test, proposed as a way to judge whether a machine can be considered intelligent. He was also openly gay at a time when homosexual acts were criminalized in Britain, which led to his prosecution in 1952. Turing went on to study the chemical and mathematical basis for the formation of patterns in biology, the field now known as morphogenesis. He died on June 7th, 1954 at his home, the victim of an apparent suicide.