History of computer science


The history of computer science began long before the modern discipline of computer science that emerged in the 20th century, and was hinted at in the centuries prior. The progression, from mechanical inventions and mathematical theories towards modern computer concepts and machines, led to a major academic field and the basis of a massive worldwide industry.[1]

The earliest known tool for use in computation was the abacus, developed in the period between 2700–2300 BCE in Sumer. The Sumerians' abacus consisted of a table of successive columns which delimited the successive orders of magnitude of their sexagesimal number system.[2]:11 Originally it was used by drawing lines in sand with pebbles. Abaci of a more modern design are still used as calculation tools today.[3]

The Antikythera mechanism is believed to be the earliest known mechanical analog computer.[4] It was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to c. 100 BCE. Technological artifacts of similar complexity did not reappear until the 14th century, when mechanical astronomical clocks appeared in Europe.[5]

When John Napier discovered logarithms for computational purposes in the early 17th century, there followed a period of considerable progress by inventors and scientists in making calculating tools. In 1623 Wilhelm Schickard designed a calculating machine, but abandoned the project when the prototype he had started building was destroyed by fire in 1624. Around 1640, Blaise Pascal, a leading French mathematician, constructed a mechanical adding device based on a design described by the Greek mathematician Hero of Alexandria.[6] Then in 1672 Gottfried Wilhelm Leibniz invented the Stepped Reckoner, which he completed in 1694.[7]

In 1837 Charles Babbage first described his Analytical Engine, which is accepted as the first design for a modern computer. The Analytical Engine had expandable memory, an arithmetic unit, and logic processing capabilities able to interpret a programming language with loops and conditional branching. Although never built, the design has been studied extensively and is understood to be Turing equivalent. The Analytical Engine would have had a memory capacity of less than 1 kilobyte and a clock speed of less than 10 hertz.

Considerable advancement in mathematics and electronics theory was required before the first modern computers could be designed.

Binary logic

In 1702, Gottfried Wilhelm Leibniz developed logic in a formal, mathematical sense with his writings on the binary numeral system. In his system, the ones and zeros also represent true and false values or on and off states. But it took more than a century before George Boole published his Boolean algebra in 1854 with a complete system that allowed computational processes to be mathematically modeled.[8]
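
As a modern illustration (not a historical artifact), Boole's system can be written directly in terms of Leibniz's binary digits: with true and false taken as 1 and 0, AND behaves like multiplication and NOT like subtraction from 1. The short Python sketch below shows the correspondence; the function names are illustrative only.

```python
# Boole's algebra written in Leibniz's binary digits: with truth values
# taken as 1 and 0, AND behaves like multiplication, NOT like subtraction
# from 1, and OR can be derived from the two (De Morgan's law).
def AND(x, y):
    return x * y

def NOT(x):
    return 1 - x

def OR(x, y):
    return NOT(AND(NOT(x), NOT(y)))

for x in (0, 1):
    for y in (0, 1):
        print(f"x={x} y={y}  AND={AND(x, y)}  OR={OR(x, y)}")
```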

By this time, the first mechanical devices driven by a binary pattern had been invented. The industrial revolution had driven forward the mechanization of many tasks, and this included weaving. Punched cards controlled Joseph Marie Jacquard's loom in 1801, where a hole punched in the card indicated a binary one and an unpunched spot indicated a binary zero. Jacquard's loom was far from being a computer, but it did illustrate that machines could be driven by binary systems.[8]

Creation of the computer

Before the 1920s, computers (sometimes computors) were human clerks who performed computations. They usually worked under the direction of a physicist. Many thousands of computers were employed in commerce, government, and research establishments. Most of these computers were women. Some performed astronomical calculations for calendars, others ballistic tables for the military.

After the 1920s, the expression computing machine referred to any machine that performed the work of a human computer, especially one that worked in accordance with the effective methods of the Church–Turing thesis. The thesis states that a mathematical method is effective if it can be set out as a list of instructions that a human clerk with paper and pencil can follow, for as long as necessary, and without ingenuity or insight.

Machines that computed with continuous values became known as the analog kind. They used machinery that represented continuous numeric quantities, like the angle of a shaft rotation or difference in electrical potential.

Digital machinery, in contrast to analog, was able to represent a numeric value as discrete states and to store each individual digit. Digital machinery used difference engines or relays before the invention of faster memory devices.

After the late 1940s, as electronic digital machinery became common, the phrase computing machine gradually gave way to simply computer. These computers were able to perform the calculations that had previously been performed by human clerks.

Since the values stored by digital machines were not bound to physical properties in the way analog devices were, a logical computer based on digital equipment was able to do anything that could be described as "purely mechanical". The Turing machine, created by Alan Turing, is a hypothetical device conceived in order to study the properties of such hardware.

Emergence of a discipline

Charles Babbage and Ada Lovelace

Main articles: Charles Babbage and Ada Lovelace

Charles Babbage is often regarded as one of the first pioneers of computing. Beginning in the 1810s, Babbage had a vision of mechanically computing numbers and tables. Putting this into practice, he designed a calculator that could compute numbers up to 8 decimal places long. Building on the success of this idea, Babbage worked to develop a machine that could compute numbers with up to 20 decimal places. By the 1830s, Babbage had devised a plan to develop a machine that could use punched cards to perform arithmetical operations. The machine would store numbers in memory units, and it would have a form of sequential control, meaning that operations would be carried out one after another so that the machine would produce an answer and not fail. This machine was to be known as the “Analytical Engine”, the first true representation of what is now the modern computer.[9]

Ada Lovelace (Augusta Ada Byron) is credited as the pioneer of computer programming and is regarded as a mathematical genius, a reputation fostered by the mathematically heavy tutoring regimen her mother set for her as a young girl. Lovelace began working with Charles Babbage as an assistant while Babbage was working on his “Analytical Engine”, the first mechanical computer. During her work with Babbage, Ada Lovelace designed the first computer algorithm, which could compute Bernoulli numbers. Her work with Babbage also led her to predict that future computers would not only perform mathematical calculations but also manipulate symbols, mathematical or not. Although she never saw the results of her work, as the “Analytical Engine” was not built in her lifetime, her efforts from the 1840s onward did not go unnoticed.[10]

Alan Turing and the Turing machine

Main articles: Alan Turing and Turing machine

The mathematical foundations of modern computer science began to be laid by Kurt Gödel with his incompleteness theorem (1931). In this theorem, he showed that there were limits to what could be proved and disproved within a formal system. This led to work by Gödel and others to define and describe these formal systems, including concepts such as mu-recursive functions and lambda-definable functions.

In 1936 Alan Turing and Alonzo Church independently, and also together, introduced the formalization of an algorithm, with limits on what can be computed, and a "purely mechanical" model for computing. This became the Church–Turing thesis, a hypothesis about the nature of mechanical calculation devices, such as electronic computers. The thesis claims that any calculation that is possible can be performed by an algorithm running on a computer, provided that sufficient time and storage space are available.

In 1936, Alan Turing also published his seminal work on Turing machines, an abstract digital computing machine now simply referred to as the Universal Turing machine. This machine embodied the principle of the modern computer and was the birthplace of the stored-program concept that almost all modern-day computers use.[11] These hypothetical machines were designed to determine formally and mathematically what can be computed, taking into account limitations on computing ability. If a Turing machine can carry out a task, the task is said to be Turing computable, while a machine that can simulate any Turing machine is said to be Turing complete.[12]
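
As a modern illustration, a single-tape Turing machine can be simulated in a few lines of Python. The sketch below is not drawn from Turing's 1936 paper; the transition table shown, a unary increment machine, is a hypothetical example.

```python
# Minimal sketch of a deterministic single-tape Turing machine simulator.
# The transition table maps (state, symbol) -> (new_symbol, move, new_state).
# The example program (an illustrative assumption) appends a '1' to a
# unary-encoded number, i.e. it computes n -> n + 1.

def run_turing_machine(transitions, tape, start_state, accept_state, blank="_"):
    """Simulate the machine until it reaches accept_state."""
    tape = dict(enumerate(tape))    # sparse tape: position -> symbol
    head, state = 0, start_state
    while state != accept_state:
        symbol = tape.get(head, blank)
        new_symbol, move, state = transitions[(state, symbol)]
        tape[head] = new_symbol
        head += 1 if move == "R" else -1
    # Read back the non-blank portion of the tape in order.
    return "".join(tape[i] for i in sorted(tape) if tape[i] != blank)

# Unary increment machine: scan right over the 1s, write a 1 on the
# first blank square, then halt.
increment = {
    ("scan", "1"): ("1", "R", "scan"),
    ("scan", "_"): ("1", "R", "done"),
}

print(run_turing_machine(increment, "111", "scan", "done"))  # prints 1111
```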

The Los Alamos physicist Stanley Frankel described John von Neumann's view of the fundamental importance of Turing's 1936 paper in a letter:[11]

I know that in or about 1943 or ‘44 von Neumann was well aware of the fundamental importance of Turing's paper of 1936… Von Neumann introduced me to that paper and at his urging I studied it with care. Many people have acclaimed von Neumann as the "father of the computer" (in a modern sense of the term) but I am sure that he would never have made that mistake himself. He might well be called the midwife, perhaps, but he firmly emphasized to me, and to others I am sure, that the fundamental conception is owing to Turing...

Early computer hardware

In 1941, Konrad Zuse developed the world's first functional program-controlled computer, the Z3. In 1998, it was shown to be Turing-complete in principle.[13][14] Zuse also developed the S2 computing machine, considered the first process control computer. He founded one of the earliest computer businesses in 1941, producing the Z4, which became the world's first commercial computer. In 1946, he designed the first high-level programming language, Plankalkül.[15]

In 1948, the Manchester Baby was completed; it was the world's first general-purpose electronic digital computer that ran stored programs, like almost all modern computers.[11] Turing's seminal 1936 paper on Turing machines, through its influence on Max Newman, and Turing's own logico-mathematical contributions to the project were both crucial to the successful development of the Manchester SSEM.[11]

In 1950, Britain's National Physical Laboratory completed the Pilot ACE, a small-scale programmable computer based on Turing's philosophy. With an operating speed of 1 MHz, the Pilot Model ACE was for some time the fastest computer in the world.[11][16] Turing's design for the ACE had much in common with today's RISC architectures, and it called for a high-speed memory of roughly the same capacity as an early Macintosh computer, which was enormous by the standards of his day.[11] Had Turing's ACE been built as planned and in full, it would have been in a different league from the other early computers.[11]

Shannon and information theory

Up to and during the 1930s, electrical engineers were able to build electronic circuits to solve mathematical and logic problems, but most did so in an ad hoc manner, lacking any theoretical rigor. This changed with Claude Elwood Shannon's publication of his 1937 master's thesis, A Symbolic Analysis of Relay and Switching Circuits. While taking an undergraduate philosophy class, Shannon had been exposed to Boole's work, and recognized that it could be used to arrange electromechanical relays (then used in telephone routing switches) to solve logic problems. This concept, of utilizing the properties of electrical switches to do logic, is the basic concept that underlies all electronic digital computers, and his thesis became the foundation of practical digital circuit design when it became widely known among the electrical engineering community during and after World War II.
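
Shannon's observation can be illustrated with a small modern sketch: switches wired in series behave like AND, switches wired in parallel like OR, and a normally closed contact like NOT. The Python example below, including the circuit it evaluates, is illustrative and is not taken from Shannon's thesis.

```python
# Relay circuits obey Boolean algebra: series wiring acts as AND,
# parallel wiring as OR, and a normally closed contact as NOT.

def series(*switches):        # current flows only if every switch is closed
    return all(switches)

def parallel(*switches):      # current flows if any branch is closed
    return any(switches)

def normally_closed(switch):  # contact that opens when the relay energizes
    return not switch

# Hypothetical example circuit: lamp = (a AND b) OR (NOT c)
for a in (False, True):
    for b in (False, True):
        for c in (False, True):
            lamp = parallel(series(a, b), normally_closed(c))
            print(f"a={a:d} b={b:d} c={c:d} -> lamp={lamp:d}")
```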

Shannon went on to found the field of information theory with his 1948 paper titled A Mathematical Theory of Communication, which applied probability theory to the problem of how to best encode the information a sender wants to transmit. This work is one of the theoretical foundations for many areas of study, including data compression and cryptography.
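
The central quantity of that paper, the entropy of a source, measures the average number of bits needed per symbol and sets a lower bound on lossless compression. The following Python sketch computes it for an illustrative, assumed probability distribution.

```python
# Shannon entropy H = -sum(p * log2(p)): the average number of bits per
# symbol for a source with the given symbol probabilities.
import math

def shannon_entropy(probabilities):
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

uniform = [0.25, 0.25, 0.25, 0.25]   # four equally likely symbols
skewed  = [0.70, 0.15, 0.10, 0.05]   # a biased source compresses better

print(shannon_entropy(uniform))  # 2.0 bits per symbol
print(shannon_entropy(skewed))   # about 1.32 bits per symbol
```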

Wiener and cybernetics

From experiments with anti-aircraft systems that interpreted radar images to detect enemy planes, Norbert Wiener coined the term cybernetics from the Greek word for "steersman." He published "Cybernetics" in 1948, which influenced artificial intelligence. Wiener also compared computation, computing machinery, and memory devices with his analysis of brain waves, noting the cognitive similarities.

The first actual computer bug was a moth. It was stuck in between the relays on the Harvard Mark II.[17] The invention of the term 'bug' is often, but erroneously, attributed to Grace Hopper, a future rear admiral in the U.S. Navy, who supposedly logged the "bug" on September 9, 1945; most other accounts conflict at least with these details. According to these accounts, the actual date was September 9, 1947, when operators filed this 'incident' along with the insect and the notation "First actual case of bug being found" (see software bug for details).[17]

John von Neumann and the von Neumann architecture

In 1946, a model for computer architecture was introduced that became known as the von Neumann architecture. From 1950 onward, the von Neumann model provided uniformity in subsequent computer designs. The von Neumann architecture was considered innovative as it introduced the idea of allowing machine instructions and data to share memory space. The von Neumann model is composed of three major parts: the arithmetic logic unit (ALU), the memory, and the instruction processing unit (IPU). In the von Neumann machine design, the IPU passes addresses to memory, and memory, in turn, is routed either back to the IPU if an instruction is being fetched or to the ALU if data is being fetched.[18]

Von Neumann’s machine design uses a RISC (reduced instruction set computing) architecture, meaning the instruction set uses a total of 21 instructions to perform all tasks. (This is in contrast to CISC, complex instruction set computing, whose instruction sets offer more instructions from which to choose.) In the von Neumann architecture, main memory and the accumulator (the register that holds the result of logical operations)[19] are the two memories that are addressed. Operations can be carried out as simple arithmetic (performed by the ALU: addition, subtraction, multiplication, and division), conditional branches (now more commonly seen as if statements or while loops; the branches serve as go-to statements), and logical moves between the different components of the machine, i.e., a move from the accumulator to memory or vice versa. The von Neumann architecture accepts fractions and instructions as data types. Finally, as the von Neumann architecture is a simple one, its register management is also simple. The architecture uses a set of seven registers to manipulate and interpret fetched data and instructions. These registers include the "IR" (instruction register), "IBR" (instruction buffer register), "MQ" (multiplier quotient register), "MAR" (memory address register), and "MDR" (memory data register).[18] The architecture also uses a program counter ("PC") to keep track of where in the program the machine is.[18]
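
As a modern illustration of the fetch-decode-execute cycle described above, the Python sketch below runs a tiny program in a single memory shared by instructions and data. Its three-instruction set and sample program are hypothetical simplifications, not the architecture's actual 21-instruction set.

```python
# Minimal sketch of a von Neumann-style fetch-decode-execute loop:
# instructions and data share one memory, a program counter (PC) tracks
# the next instruction, and an accumulator (AC) holds intermediate results.

def run(memory):
    pc, ac = 0, 0                       # program counter and accumulator
    while True:
        op, addr = memory[pc]           # fetch the instruction at PC
        pc += 1
        if op == "LOAD":                # AC <- memory[addr]
            ac = memory[addr]
        elif op == "ADD":               # AC <- AC + memory[addr]
            ac += memory[addr]
        elif op == "STORE":             # memory[addr] <- AC
            memory[addr] = ac
        elif op == "HALT":
            return memory

# Program (cells 0-3) and data (cells 4-6) live in the same memory.
memory = [
    ("LOAD", 4),    # 0: AC <- 7
    ("ADD", 5),     # 1: AC <- 7 + 5
    ("STORE", 6),   # 2: memory[6] <- 12
    ("HALT", 0),    # 3: stop
    7, 5, 0,        # 4, 5, 6: data
]
print(run(memory)[6])   # prints 12
```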

References

  1. "History of Computer Science". uwaterloo.ca.
  2. Ifrah, Georges (2001). The Universal History of Computing: From the Abacus to the Quantum Computer. John Wiley & Sons. ISBN 0-471-39671-0.
  3. Bellos, Alex (2012-10-25). "Abacus adds up to number joy in Japan". The Guardian. London. Retrieved 2013-06-25.
  4. The Antikythera Mechanism Research Project. Retrieved 2007-07-01.
  5. In search of lost time, Jo Marchant, Nature 444, #7119 (November 30, 2006), pp. 534–538, doi:10.1038/444534a PMID 17136067.
  6. "History of Computing Science: The First Mechanical Calculator". eingang.org.
  7. Kidwell, Peggy Aldritch; Williams, Michael R. (1992). The Calculating Machines: Their history and development (PDF). Massachusetts Institute of Technology and Tomash Publishers, pp. 38–42; translated and edited from Martin, Ernst (1925). Die Rechenmaschinen und ihre Entwicklungsgeschichte. Germany: Pappenheim.
  8. Tedre, Matti (2014). The Science of Computing: Shaping a Discipline. CRC Press.
  9. "Charles Babbage". Encyclopedia Britannica Online Academic Edition. Encyclopedia Britannica Inc. Retrieved 2013-02-20.
  10. Isaacson, Betsy (2012-12-10). "Ada Lovelace, World's First Computer Programmer, Celebrated With Google Doodle". The Huffington Post. http://www.huffingtonpost.com/2012/12/10/google-doodle-ada-lovelace_n_2270668.html. Retrieved 2013-02-20.
  11. "The Modern History of Computing". stanford.edu.
  12. Barker-Plummer, David. "Turing Machines". The Stanford Encyclopedia of Philosophy. http://plato.stanford.edu/archives/win2012/entries/turing-machine/. Retrieved 2013-02-20.
  13. Rojas, R. (1998). "How to make Zuse's Z3 a universal computer". IEEE Annals of the History of Computing. 20 (3): 51–54. doi:10.1109/85.707574.
  14. Rojas, Raúl. "How to Make Zuse's Z3 a Universal Computer".
  15. Talk given by Horst Zuse to the Computer Conservation Society at the Science Museum (London) on 18 November 2010
  16. "BBC News – How Alan Turing's Pilot ACE changed computing". BBC News. May 15, 2010.
  17. 1 2 "The First "Computer Bug"" (PDF). CHIPS. United States Navy. 30 (1): 18. January–March 2012.
  18. 1 2 3 Cragon, Harvey G. (2000). Computer Architecture and Implementation. Cambridge: Cambridge University Press. pp. 1–13. ISBN 0-521-65168-9.
  19. "Accumlator" Def. 3. Oxford Dictionaries.
