Monday, June 20, 2022

The first computers and the birth of the digital age. From Z1 to PC

In the twenty-first century, we cannot imagine our lives without computers. From daily communication to spaceflight, these universal machines power our world. Thanks to them, we can navigate reality more efficiently or escape from it completely in VR games. How did mere counting machines become the universal tool of civilization? Here is the story of this groundbreaking invention.

Any machine that can read and execute coded instructions can be called a computer. The main difference between a computer and a calculator is programmability. Even the 19th-century Analytical Engine (which I will write about later) was, in theory, able to execute a program. The architecture of computers was described by two eminent mathematicians in the 1930s and 1940s - Alan Turing (Ritchie D., 1986: p. 77) and John von Neumann. For a machine to function as a computer in this sense, it must meet certain conditions (a small illustrative sketch of such a machine follows the list below):

  • I/O devices take instructions and return results
  • memory stores variables and instructions
  • the arithmetic logic unit processes information
  • the control unit has access to memory and directs the execution of the accepted instructions.
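To see how these four elements fit together, here is a minimal, purely illustrative Python sketch of a stored-program machine with memory, a control unit, an "ALU", and I/O. The opcodes and the tiny program are invented for this example and do not correspond to any historical machine.

```python
# A toy stored-program machine (not any historical design): memory holds both
# instructions and data, the control unit fetches and decodes, the "ALU" does
# the arithmetic, and I/O takes inputs and returns results.

def run(program, inputs):
    memory = list(program)       # memory: instructions and data in one store
    acc = 0                      # a single accumulator register
    pc = 0                       # program counter, driven by the control unit
    outputs = []
    while True:
        op, arg = memory[pc]     # control unit: fetch and decode an instruction
        pc += 1
        if op == "IN":           # I/O: read the next input value
            acc = inputs.pop(0)
        elif op == "OUT":        # I/O: return a result
            outputs.append(acc)
        elif op == "ADD":        # ALU: add a value from memory to the accumulator
            acc += memory[arg][1]
        elif op == "STORE":      # write the accumulator back into memory
            memory[arg] = ("DATA", acc)
        elif op == "HALT":
            return outputs

# Read two numbers from the "input device", add them, and output the sum.
program = [
    ("IN", None), ("STORE", 7),
    ("IN", None), ("ADD", 7),
    ("OUT", None), ("HALT", None),
    ("DATA", 0), ("DATA", 0),
]
print(run(program, [2, 3]))      # -> [5]
```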

The computer also needs its own energy source. Unlike slide rules, mechanical calculators, and arithmometers, it is fully automatic. This last condition meant that building a computer became feasible only in the 19th century. For centuries, people had used numerical codes and machines of all kinds, including calculating machines. However, the work was done in the head, on paper, or, at best, by means of levers, springs, gears, or weights. The situation changed when the improved steam engine came into widespread use.

The first programmable computer in history was to be powered by a steam engine. In 1822, the English inventor Charles Babbage built his first Difference Engine. It was not yet a computer, but rather a huge mechanical, automated calculator. In the next decade, he began working on the Analytical Engine, which had all the basic features of a modern computer. It was programmable (using punched cards, an idea borrowed from Jacquard) and had a separate memory unit and a processing unit. Power from a steam engine was to allow fully automated operation. (Petzold Ch., 2002: pp. 267-268) The project was never finished. In Babbage's time it was simply too expensive: the cost and complexity of the device were out of proportion to the demand for such equipment. After all, complex calculations could still be performed with abacuses, small hand-operated arithmometers, and slide rules.

Babbage's associate and the author of the first programs was Ada Byron-King, Countess of Lovelace, daughter of the well-known Romantic poet Lord Byron and the well-educated (including in mathematics) Anne Isabella (Annabella) Byron.

The German engineer Konrad Zuse is considered the first computer constructor in the modern sense. In 1936 he patented a memory that could be read by a computer's control unit; it was based on mechanically shifted metal plates. In 1938 a prototype of the V1 computer was created, later renamed the Z1. Zuse's most innovative idea was to use binary numbers, that is, numbers represented by ones and zeros. Because even the largest numbers can be written with only ones and zeros, they can be represented by suitably long sequences of simple electrical pulses. This has another advantage: it is easy to move from working with numbers to logical operations, in which the 0s and 1s stand for "false" and "true".
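A tiny Python illustration of this idea - the number below is chosen arbitrarily for the example:

```python
# Any integer can be written as a sequence of 0s and 1s ("pulse off / pulse on"),
# and the same two symbols double as the logical values False and True.

n = 1941                      # an arbitrary example value
bits = bin(n)[2:]             # "11110010101" - the pulse sequence for 1941
print(n, "->", bits)

# Reinterpreting 0/1 as false/true turns arithmetic hardware into logic hardware:
a, b = 1, 0
print(a & b)                  # AND -> 0 ("false")
print(a | b)                  # OR  -> 1 ("true")
```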

The Z1 was a community-funded project - more precisely, it was financed by the family and friends of the young engineer. The prototype never became fully functional because of the unreliability of the plates, and it was finally destroyed during the bombing of Berlin in 1943.

The first fully programmable version of the computer was the Z3 model, which, unfortunately, was used by German aviation during World War II. (Lee J.A.N., 1995: pp. 765-766) Zuse was not ideologically associated with the Third Reich, but neither was he in opposition; he was one of those scientists and engineers who simply did their job. Regardless, his invention was groundbreaking. Other Zuse machines were used for scientific and commercial computation. After the war he ran the company Zuse KG, which was later absorbed by the Siemens corporation.

World War II was a turning point in the history of computers. The ability to communicate remotely was invaluable for war purposes, but it had one drawback: the enemy could intercept orders by tapping the cable network or tuning in to the right radio frequency. Encryption techniques therefore began to develop quickly. At the end of World War I, Arthur Scherbius constructed an encryption machine, which he patented in 1918 and sold under the commercial name Enigma from 1923. In the 1930s it was adopted by the armed forces of the Third Reich. Another device used by the German command was the Lorenz machine.

Deciphering the instructions encoded with the Enigma and the Lorenz machine was crucial for predicting the opponent's movements, and speed and faultlessness mattered. It was therefore decided to use analytical machines. The Colossus, constructed in 1943, is considered the first digital, binary, and (manually) programmable electronic computer. Its task was to decode the information transmitted by teleprinter using the Lorenz machine. An electromechanical device called the Bomba was used to break Enigma; the first such machines were constructed by Polish cryptologists before the war. The British Bombe was an adaptation to the new encryption procedures that the Germans introduced after the outbreak of the war. Among those who helped build the Colossus and the Bombe was Alan Turing, author of a hypothetical model of a computer known as the Turing machine.
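What such a "hypothetical model computer" amounts to can be shown in a few lines of Python. The sketch below is a minimal Turing-machine simulator; the rule table, which simply inverts a string of bits, is invented purely to demonstrate the mechanism.

```python
# A Turing machine: a tape, a read/write head, and a table of rules of the form
# (state, symbol) -> (new symbol, head move, new state).

def turing_machine(tape, rules, state="start"):
    cells = dict(enumerate(tape))        # sparse tape, blank cells read as "_"
    head = 0
    while state != "halt":
        symbol = cells.get(head, "_")
        new_symbol, move, state = rules[(state, symbol)]
        cells[head] = new_symbol
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells) if cells[i] != "_")

rules = {
    ("start", "0"): ("1", "R", "start"),   # flip 0 to 1, move right
    ("start", "1"): ("0", "R", "start"),   # flip 1 to 0, move right
    ("start", "_"): ("_", "R", "halt"),    # blank cell: stop
}
print(turing_machine("10110", rules))      # -> "01001"
```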

The most dynamic development of digitization took place between 1945 and 1989, that is, during the Cold War. The conditions favouring the development of computers had several sources, including the economic boom in the US and the availability of raw materials and technologies. The world war had forced both sides to accelerate the development of previously known technologies, and the demand for further refinement was driven by the arms race and the space race, but also by growing commercial and social needs. Generations born after World War II were more and more interested in entertainment, communication, and making everyday life easier with the help of computers. Pre-war dreams returned: even then, machines were expected to take over human work. The entry of computers into everyday use made these dreams closer and more real.

In 1946, the first computers already had all the basic components I mentioned earlier. Their architecture did not differ significantly from the modern one. Nevertheless, they took up a lot of space and required qualified staff to operate.

The transistor is one of the most important inventions that made computers work better and made it possible to shrink them and speed them up.

A transistor is a component of an electronic circuit capable of amplifying an electrical signal. Before its creation, vacuum tubes were used for this purpose. Their size limited the miniaturization of the first computers.

The key component that enables a transistor to function is semiconductor material. The most commonly used is silicon, hence the name of the electronic El Dorado, "Silicon Valley". The first working transistor was made in Bell's laboratories in 1947. (cf. Ritchie D., 1986: p. 120) During the 1950s the design was improved, and in 1958, Jack Kilby and, shortly afterwards, Robert Noyce independently built the first integrated circuits from transistors.

Immediately after a working transistor appeared on the market, experiments with its use in computers began. However, it was only the creation of the integrated circuit that became the basis for the real development of computers (and other electronic devices). Development was so rapid that around 1965 Gordon Moore, the future co-founder of Intel, observed that the number of transistors in an integrated circuit was doubling every year. This observation, known as Moore's law, was modified in the following years and began to be applied to other computer parameters, such as memory capacity or processor clock speed.
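In its simplest form, Moore's observation is just repeated doubling: a chip that starts with n0 transistors holds roughly n0 * 2^t after t doubling periods. The short Python snippet below illustrates this; the starting count and the periods are illustrative values, not historical data.

```python
# Repeated doubling: n0 transistors become n0 * 2**t after t doubling periods.

def moores_law(n0, periods):
    return n0 * 2 ** periods

n0 = 2_300                     # an illustrative baseline of a few thousand transistors
for t in (0, 5, 10, 20):
    print(t, "doubling periods ->", moores_law(n0, t), "transistors")
```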

Today, we might be surprised to see what was called a minicomputer in the late 1960s. One of the first models, the PDP-8, looked like a medium-sized cabinet - not counting the peripherals, of course. Huge, multi-unit mainframes were used by large organizations and were priced beyond the reach of most small businesses, not to mention home users. Minicomputers were adapted to more common use and gave rise to personal computers. Not only did they take up less space, it was also easier to enter data into them using switches.

Among the precursors of the PC, historians of computing consistently single out the Altair 8800, based on the Intel 8080 processor. Introduced by MITS, it was the first computer that a private user could afford. It was the most minimalist version of a computer, with no external memory, keyboard, or other input devices. It was also impressively small for those times - the size of a large suitcase.

The availability of the Altair 8800 made it possible to create computers that really could be used by everyone. Its usefulness to non-professionals increased when, in 1975, Bill Gates and Paul Allen wrote Altair BASIC (later known as Microsoft BASIC), an implementation of the previously developed educational BASIC language. (Lee J.A.N., 1995: p. 413) With further refinements, the fairly crude machine that was the "bare" Altair 8800 turned into a real personal computer, with programs, games, and educational software. Connecting a monitor and a keyboard allowed it to be used like a normal computer.

The second half of the 1970s and the beginning of the 1980s brought further improvements in personal computers: the miniaturization of devices was combined with increasingly efficient operation.

It is arguable which event started the digital age, but it was certainly sealed by the spread of the internet. Personal computers were not only widely available and easy to use but also provided access to a global store of information.

Bibliography:

  • Charles Petzold, Code: The Hidden Language of Computer Hardware and Software, Warsaw 2002
  • David Ritchie, The Computer Pioneers: The Making of the Modern Computer, New York 1986
  • John A.N. Lee, International Biographical Dictionary of Computer Pioneers, Chicago 1995
