Timeline of computer science before 1950
This article presents a detailed timeline of concepts in the history of computer science: from prehistory until 1949. For narratives explaining the overall developments, see History of computer science.
Date | Event |
---|---|
c. 1770 BC | Earliest known use of a zero-like symbol by the ancient Egyptians: the sign nefer marked zero balances in accounting texts, though it was not used in calculations.[citation needed] |
c. 500 BC | Indian grammarian Pāṇini formulated the grammar of Sanskrit (in 3,959 rules) known as the Ashtadhyayi, which was highly systematised and technical. Pāṇini used metarules, transformations, and recursion with such sophistication that his grammar has been argued to have generative power equivalent to that of a Turing machine.[citation needed] Pāṇini's work is regarded as a forerunner of modern formal language theory and a precursor to its use in computing. The Panini–Backus form used to describe most modern programming languages closely resembles Pāṇini's grammar rules.[citation needed] |
c. 200 BC | Indian mathematician Pingala gave the first known description of a binary number system, the system now used in the design of essentially all modern computing equipment. He also conceived a binary code similar to Morse code.[1][2] |
c. AD 9 | Chinese mathematicians first used negative numbers. |
c. AD 60 | Hero of Alexandria made numerous inventions, including "sequence control", in which the operator set a machine running and it then followed a series of instructions in a deterministic fashion: essentially, the first program. He also made numerous innovations in the field of automata, important steps in the development of robotics. |
c. 639 | Indian mathematician Brahmagupta was among the first to describe the modern place-value numeral system (Hindu numeral system).[citation needed] |
c. 820 | Persian mathematician Muḥammad ibn Mūsā al-Khwārizmī described the rudiments of modern algebra, whose name derives from the title of his book Al-Kitāb al-muḫtaṣar fī ḥisāb al-ğabr wa-l-muqābala. The word algorithm derives from al-Khwārizmī's Latinized name, Algoritmi. |
c. 850 | Arab mathematician Al-Kindi (Alkindus) was a pioneer of cryptography. He gave the first known recorded explanation of cryptanalysis in A Manuscript on Deciphering Cryptographic Messages. In particular, he is credited with developing frequency analysis, in which variations in the frequency of occurrence of letters are analyzed and exploited to break ciphers.[3] The text also covers encipherments, the cryptanalysis of certain encipherments, and statistical analysis of letters and letter combinations in Arabic.[citation needed] |
1412 | Ahmad al-Qalqashandi gave a list of ciphers in his Subh al-a'sha that includes both substitution and transposition and, for the first time, a cipher with multiple substitutions for each plaintext letter. He also gave an exposition of, and a worked example of, cryptanalysis, including the use of tables of letter frequencies and of sets of letters that cannot occur together in one word. |
c. 1450 | The Kerala school of astronomy and mathematics in South India developed a floating-point number system.[4] |
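The grammar-rule notation mentioned in the Pāṇini entry can be illustrated with a toy example. The BNF-style rules below are hypothetical and use modern Backus–Naur notation, not Pāṇini's actual sūtras; the small Python recognizer is only a sketch of how such rewrite rules define a language.

```python
# A toy grammar in BNF-style notation (hypothetical, for illustration):
#   <expr> ::= <term> | <term> "+" <expr>
#   <term> ::= "a" | "(" <expr> ")"
# A minimal recursive-descent recognizer for this grammar.

def parse_expr(s, i=0):
    """Try to match <expr> starting at index i; return end index or None."""
    j = parse_term(s, i)
    if j is None:
        return None
    if j < len(s) and s[j] == "+":
        return parse_expr(s, j + 1)
    return j

def parse_term(s, i):
    """Try to match <term> starting at index i; return end index or None."""
    if i < len(s) and s[i] == "a":
        return i + 1
    if i < len(s) and s[i] == "(":
        j = parse_expr(s, i + 1)
        if j is not None and j < len(s) and s[j] == ")":
            return j + 1
    return None

def recognize(s):
    """True if the whole string is derivable from <expr>."""
    return parse_expr(s) == len(s)
```

Note how the recursion in the rules (`<expr>` refers to itself) maps directly onto recursive function calls, the structural feature that the Panini–Backus comparison points to.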
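Pingala's enumeration of poetic metres as sequences of short (laghu) and long (guru) syllables corresponds to counting in binary. The sketch below illustrates that correspondence in modern terms; the mapping L→0, G→1 and the ordering are modern conventions, and Pingala's own procedure differed in detail.

```python
# Enumerate all patterns of short (laghu, "L") and long (guru, "G")
# syllables of a given length -- the same enumeration as counting in
# binary.  The L->0, G->1 mapping is a modern convention chosen here
# for illustration only.

def meters(n):
    """All length-n syllable patterns, in binary counting order."""
    patterns = []
    for k in range(2 ** n):
        bits = format(k, f"0{n}b")  # n-bit binary string, zero-padded
        patterns.append(bits.replace("0", "L").replace("1", "G"))
    return patterns
```

For length 2 this yields LL, LG, GL, GG: the 2^n patterns of an n-syllable metre are exactly the n-bit binary numbers.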
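The frequency-analysis method credited to Al-Kindi can be sketched as follows. This is a deliberately simplified modern illustration, not his procedure: it attacks a plain Caesar shift cipher using the heuristic that the most common letter of English plaintext is "e".

```python
# Minimal sketch of cryptanalysis by frequency analysis, assuming a
# simple Caesar (shift) cipher and English plaintext.
from collections import Counter
import string

def caesar_encrypt(text, shift):
    """Shift each lowercase letter by `shift`; leave other characters alone."""
    out = []
    for ch in text.lower():
        if ch in string.ascii_lowercase:
            out.append(chr((ord(ch) - 97 + shift) % 26 + 97))
        else:
            out.append(ch)
    return "".join(out)

def guess_shift(ciphertext):
    """Frequency analysis: assume the most common ciphertext letter is 'e'."""
    counts = Counter(c for c in ciphertext if c in string.ascii_lowercase)
    most_common = counts.most_common(1)[0][0]
    return (ord(most_common) - ord("e")) % 26
```

The heuristic works because a shift cipher preserves letter frequencies; it merely relabels them, which is exactly the weakness frequency analysis exploits.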
Notes
- ^ The History of the Binomial Coefficients in India, California State University, East Bay. Archived March 16, 2008, at the Wayback Machine
- ^ Morse code. ActewAGL.
- ^ Simon Singh. The Code Book. pp. 14–20
- ^ Sriraman, Bharath; Ernest, Paul; Greer, Brian (2009-06-01). Critical Issues in Mathematics Education. IAP. pp. 175, 200. ISBN 9781607522188.