Timeline of computer science before 1950

This article presents a detailed timeline of concepts in the history of computer science, from prehistory until 1949. For narratives explaining the overall developments, see History of computer science.

Date Event
c. 1770 BC First known use of zero by ancient Egyptians in accounting texts.
c. 500 BC Indian grammarian Pāṇini formulated the grammar of Sanskrit (in 3,959 rules), known as the Ashtadhyayi, which was highly systematised and technical. Pāṇini used metarules, transformations, and recursion with such sophistication that his grammar had computing power equivalent to that of a Turing machine.[citation needed] Pāṇini's work was a forerunner of modern formal language theory and a precursor to its use in modern computing. The Panini–Backus form used to describe most modern programming languages is also significantly similar to Pāṇini's grammar rules (an illustrative grammar sketch follows the timeline below).[citation needed]
c. 200 BC Indian mathematician Pingala gave the first known description of a binary number system, the system now used in the design of essentially all modern computing equipment. He also conceived the notion of a binary code similar to Morse code (see the binary sketch following the timeline).[1][2]
c. AD 9 Chinese mathematicians first used negative numbers.
c. AD 60 Hero of Alexandria made numerous inventions, including "sequence control", in which the operator set a machine running and it then followed a series of instructions in a deterministic fashion. This was, essentially, the first program. He also made many innovations in the field of automata, which were important steps in the development of robotics.
c. 628 Indian mathematician Brahmagupta, in his Brāhmasphuṭasiddhānta, was the first to describe the modern place-value numeral system (Hindu numeral system).
c. 820 Persian mathematician Muḥammad ibn Mūsā al-Khwārizmī described the rudiments of modern algebra, whose name is derived from the title of his book Al-Kitāb al-muḫtaṣar fī ḥisāb al-ğabr wa-l-muqābala. The word algorithm is derived from al-Khwārizmī's Latinized name, Algoritmi.
c. 850 Arab mathematician Al-Kindi (Alkindus) was a pioneer of cryptography. He gave the first known recorded explanation of cryptanalysis in A Manuscript on Deciphering Cryptographic Messages. In particular, he is credited with developing frequency analysis, the method by which variations in the frequency of occurrence of letters can be analyzed and exploited to break ciphers (a minimal frequency-counting sketch follows the timeline below).[3] The text also covers methods of cryptanalysis, encipherments, the cryptanalysis of certain encipherments, and statistical analysis of letters and letter combinations in Arabic.[citation needed]
1412 Ahmad al-Qalqashandi gave a list of ciphers in his Subh al-a'sha, which included both substitution and transposition and, for the first time, a cipher with multiple substitutions for each plaintext letter (what is now called a homophonic substitution cipher; see the sketch after the timeline). He also gave an exposition of, and a worked example of, cryptanalysis, including the use of tables of letter frequencies and of sets of letters that cannot occur together in one word.
c. 1450 The Kerala school of astronomy and mathematics in South India invented the floating-point number system.[4]
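
The Panini–Backus form mentioned in the Pāṇini entry expresses a language as a small set of rewrite rules. As a minimal, purely illustrative sketch (the toy grammar and function names below are invented for this example, not taken from the Ashtadhyayi), the following Python snippet states a grammar for signed integers in BNF style and recognizes strings against it through the kind of recursive rule application such grammars allow:

    # Toy grammar in BNF (Panini–Backus) style -- hypothetical example:
    #   <integer> ::= <sign> <digits> | <digits>
    #   <sign>    ::= "+" | "-"
    #   <digits>  ::= <digit> <digits> | <digit>
    #   <digit>   ::= "0" | "1" | ... | "9"

    def is_integer(s: str) -> bool:
        """Recognize the toy <integer> rule: an optional sign, then digits."""
        if s and s[0] in "+-":      # consume the optional <sign>
            s = s[1:]
        return is_digits(s)

    def is_digits(s: str) -> bool:
        """<digits> ::= <digit> <digits> | <digit> -- a recursive rule."""
        if len(s) == 1:
            return s in "0123456789"
        return len(s) > 1 and s[0] in "0123456789" and is_digits(s[1:])

    print(is_integer("-42"))   # True
    print(is_integer("4a2"))   # False

The point of the sketch is only that a handful of recursive rewrite rules can define an infinite set of valid strings, which is the connection usually drawn between Pāṇini's rule system and modern formal grammars.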
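
To make the Pingala entry concrete: in a binary (base-2) system every non-negative integer is written with only the symbols 0 and 1, which is exactly the representation modern digital hardware uses. The short Python sketch below is illustrative only (it uses modern positional notation, not Pingala's syllable-based scheme):

    def to_binary(n: int) -> str:
        """Base-2 representation of a non-negative integer, built by
        repeated division by 2 -- the place-value idea in base 2."""
        if n == 0:
            return "0"
        bits = []
        while n > 0:
            bits.append(str(n % 2))   # remainder is the next binary digit
            n //= 2
        return "".join(reversed(bits))

    print(to_binary(13))     # '1101'  (8 + 4 + 0 + 1)
    print(format(13, "b"))   # '1101'  -- Python's built-in equivalent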
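
Al-Kindi's frequency analysis rests on the observation that, in a simple substitution cipher, the letter frequencies of the plaintext language survive in the ciphertext. The Python sketch below is a minimal illustration (the sample ciphertext is invented for this example; it is English shifted by a fixed amount, not Arabic as in Al-Kindi's text):

    from collections import Counter

    def letter_frequencies(ciphertext):
        """Count alphabetic symbols in a ciphertext, most common first.
        Under a simple substitution these counts mirror the plaintext
        language's letter frequencies, which is what Al-Kindi exploited."""
        letters = [c for c in ciphertext.upper() if c.isalpha()]
        return Counter(letters).most_common()

    # Hypothetical ciphertext: an English sentence with each letter shifted by 3.
    sample = "WKH TXLFN EURZQ IRA MXPSV RYHU WKH ODCB GRJ"
    for symbol, count in letter_frequencies(sample)[:5]:
        print(symbol, count)

A cryptanalyst would then guess that the most frequent ciphertext symbols stand for the most frequent letters of the underlying language (for English, letters such as E and T), and refine the guesses until readable text emerges.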
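
The cipher "with multiple substitutions for each plaintext letter" in the al-Qalqashandi entry is what is now called a homophonic substitution cipher: frequent letters are given several ciphertext symbols so that straightforward frequency counting is blunted. The Python sketch below is purely illustrative; the tiny homophone table and the example word are invented:

    import random

    # Hypothetical homophone table: more frequent letters get more symbols.
    HOMOPHONES = {
        "E": ["12", "47", "63", "88"],
        "T": ["05", "29", "71"],
        "A": ["16", "54"],
        "N": ["33"],
    }

    def encrypt(plaintext: str) -> str:
        """Replace each letter with a randomly chosen one of its homophones.
        Letters outside the toy table are simply skipped in this sketch."""
        out = []
        for ch in plaintext.upper():
            if ch in HOMOPHONES:
                out.append(random.choice(HOMOPHONES[ch]))
        return " ".join(out)

    print(encrypt("ATTENTAT"))   # e.g. '54 29 05 12 33 71 16 05'

Because a repeated plaintext letter need not produce a repeated ciphertext symbol, the frequency-analysis attack described in the Al-Kindi entry becomes much harder against such a cipher.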

Notes

  1. ^ The History of the Binomial Coefficients in India, California State University, East Bay. Archived March 16, 2008, at the Wayback Machine.
  2. ^ Morse code. ActewAGL.
  3. ^ Simon Singh. The Code Book. pp. 14–20.
  4. ^ Sriraman, Bharath; Ernest, Paul; Greer, Brian (2009-06-01). Critical Issues in Mathematics Education. IAP. pp. 175, 200. ISBN 9781607522188.