Computer Science

Computer science (sometimes called computation science or computing science, but not to be confused with computational science or software engineering) is the study of processes that interact with data and that can be represented as data in the form of programs. It enables the use of algorithms to manipulate, store, and communicate digital information. A computer scientist studies the theory of computation and the practice of designing software systems.[5]

Its fields can be divided into theoretical and practical disciplines. Computational complexity theory is highly abstract, while computer graphics emphasizes real-world applications. Programming language theory considers approaches to the description of computational processes, while computer programming itself involves the use of programming languages and complex systems. Human–computer interaction considers the challenges in making computers useful, usable, and accessible.

History

Charles Babbage, sometimes referred to as the "father of computing".[6]

Ada Lovelace is often credited with publishing the first algorithm intended for processing on a computer.[7]

The earliest foundations of what would become computer science predate the invention of the modern digital computer. Machines for calculating fixed numerical tasks such as the abacus have existed since antiquity, aiding in computations such as multiplication and division. Algorithms for performing computations have existed since antiquity, even before the development of sophisticated computing equipment.

Wilhelm Schickard designed and constructed the first working mechanical calculator in 1623.[8] In 1673, Gottfried Leibniz demonstrated a digital mechanical calculator, called the Stepped Reckoner.[9] He may be considered the first computer scientist and information theorist, for, among other reasons, documenting the binary number system. In 1820, Thomas de Colmar launched the mechanical calculator industry[1] when he released his simplified arithmometer, which was the first calculating machine strong enough and reliable enough to be used daily in an office environment. Charles Babbage started the design of the first automatic mechanical calculator, his Difference Engine, in 1822, which eventually gave him the idea of the first programmable mechanical calculator, his Analytical Engine.[10] He started developing this machine in 1834, and "in less than two years, he had sketched out many of the salient features of the modern computer".[11] "A crucial step was the adoption of a punched card system derived from the Jacquard loom",[2] making it infinitely programmable. In 1843, during the translation of a French article on the Analytical Engine, Ada Lovelace wrote, in one of the many notes she included, an algorithm to compute the Bernoulli numbers, which is considered to be the first published algorithm ever specifically tailored for implementation on a computer.[12] Around 1885, Herman Hollerith invented the tabulator, which used punched cards to process statistical information; eventually his company became part of IBM. Following Babbage, although unaware of his earlier work, Percy Ludgate in 1909 published[13] the second of the only two designs for mechanical analytical engines in history. In 1937, one hundred years after Babbage's impossible dream, Howard Aiken convinced IBM, which was making all kinds of punched card equipment and was also in the calculator business,[14] to develop his giant programmable calculator, the ASCC/Harvard Mark I, based on Babbage's Analytical Engine, which itself used cards and a central computing unit. When the machine was finished, some hailed it as "Babbage's dream come true".[15]

During the 1940s, as new and more powerful computing machines such as the Atanasoff–Berry computer and ENIAC were developed, the term computer came to refer to the machines rather than their human predecessors.[16] As it became clear that computers could be used for more than just mathematical calculations, the field of computer science broadened to study computation in general. In 1945, IBM founded the Watson Scientific Computing Laboratory at Columbia University in New York City. The renovated fraternity house on Manhattan's West Side was IBM's first laboratory devoted to pure science. The lab is the forerunner of IBM's Research Division, which today operates research facilities around the world.[17] Ultimately, the close relationship between IBM and the university was instrumental in the emergence of a new scientific discipline, with Columbia offering one of the first academic-credit courses in computer science in 1946.[18] Computer science began to be established as a distinct academic discipline in the 1950s and early 1960s.[19][20] The world's first computer science degree program, the Cambridge Diploma in Computer Science, began at the University of Cambridge Computer Laboratory in 1953. The first computer science department in the United States was formed at Purdue University in 1962.[21] Since practical computers became available, many applications of computing have become distinct areas of study in their own rights.

Although many initially believed it was impossible that computers themselves could actually be a scientific field of study, in the late fifties it gradually became accepted among the greater academic population.[22] IBM (short for International Business Machines) released the IBM 704[24] and later the IBM 709[25] computers, which were widely used during the exploration period of such devices. "Still, working with the IBM [computer] was frustrating […] if you had misplaced as much as one letter in one instruction, the program would crash, and you would have to start the whole process over again".[22] During the late 1950s, the computer science discipline was very much in its developmental stages, and such issues were commonplace.

Time has seen significant improvements in the usability and effectiveness of computing technology.[26] Modern society has seen a significant shift in the users of computer technology, from usage only by experts and professionals to a near-ubiquitous user base. Initially, computers were quite costly, and some degree of human aid was needed for efficient use—in part from professional computer operators. As computer adoption became more widespread and affordable, less human assistance was needed for common usage.

Contributions

The German military used the Enigma machine (shown here) during World War II for communications they wanted kept secret. The large-scale decryption of Enigma traffic at Bletchley Park was an important factor that contributed to Allied victory in WWII.[27]

Despite its short history as a formal academic discipline, computer science has made a number of fundamental contributions to science and society—in fact, along with electronics, it is a founding science of the current epoch of human history called the Information Age and a driver of the information revolution, seen as the third major leap in human technological progress after the Industrial Revolution (1750–1850 CE) and the Agricultural Revolution (8000–5000 BCE).

These contributions include:

  • The start of the "Digital Revolution", which includes the current Information Age and the Internet.[28]

  • A formal definition of computation and computability, and proof that there are computationally unsolvable and intractable problems.[29]

  • The concept of a programming language, a tool for the precise expression of methodological information at various levels of abstraction.[30]

  • In cryptography, breaking the Enigma code was an important factor contributing to the Allied victory in World War II.[27]

  • Scientific computing enabled practical evaluation of processes and situations of great complexity, as well as experimentation entirely by software. It also enabled advanced study of the mind, and mapping of the human genome became possible with the Human Genome Project.[28] Distributed computing projects such as Folding@home explore protein folding.

  • Algorithmic trading has increased the efficiency and liquidity of financial markets by using artificial intelligence, machine learning, and other statistical and numerical techniques on a large scale.[31] High frequency algorithmic trading can also exacerbate volatility.[32]

  • Computer graphics and computer-generated imagery have become ubiquitous in modern entertainment, particularly in television, cinema, advertising, animation and video games. Even films that feature no explicit CGI are usually "filmed" now on digital cameras, or edited or post-processed using a digital video editor.[33][34]

  • Simulation of various processes, including computational fluid dynamics, physical, electrical, and electronic systems and circuits, as well as societies and social situations (notably war games) along with their habitats, among many others. Modern computers enable optimization of such designs as complete aircraft. Notable in electrical and electronic circuit design are SPICE,[35] as well as software for physical realization of new (or modified) designs. The latter includes essential design software for integrated circuits.

  • Artificial intelligence is becoming increasingly important as it gets more efficient and complex. There are many applications of AI, some of which can be seen at home, such as robotic vacuum cleaners. It is also present in video games and on the modern battlefield in drones, anti-missile systems, and squad support robots.[36]

  • Human–computer interaction combines novel algorithms with design strategies that enable rapid human performance, low error rates, ease in learning, and high satisfaction. Researchers use ethnographic observation and automated data collection to understand user needs, then conduct usability tests to refine designs. Key innovations include direct manipulation, selectable web links, touchscreen designs, mobile applications, and virtual reality.

Etymology

Although first proposed in 1956,[23] the term "computer science" appears in a 1959 article in Communications of the ACM,[37] in which Louis Fein argues for the creation of a Graduate School in Computer Sciences analogous to the creation of Harvard Business School in 1921,[38] justifying the name by arguing that, like management science, the subject is applied and interdisciplinary in nature, while having the characteristics typical of an academic discipline.[37] His efforts, and those of others such as numerical analyst George Forsythe, were rewarded: universities went on to create such departments, starting with Purdue in 1962.[39] Despite its name, a significant amount of computer science does not involve the study of computers themselves. Because of this, several alternative names have been proposed.[40] Certain departments of major universities prefer the term computing science, to emphasize precisely that difference. Danish scientist Peter Naur suggested the term datalogy,[41] to reflect the fact that the scientific discipline revolves around data and data treatment, while not necessarily involving computers. The first scientific institution to use the term was the Department of Datalogy at the University of Copenhagen, founded in 1969, with Peter Naur being the first professor in datalogy. The term is used mainly in the Scandinavian countries. An alternative term, also proposed by Naur, is data science; this is now used for a multi-disciplinary field of data analysis, including statistics and databases.

Also, in the early days of computing, a number of terms for the practitioners of the field of computing were suggested in the Communications of the ACM: turingineer, turologist, flow-charts-man, applied meta-mathematician, and applied epistemologist.[42] Three months later in the same journal, comptologist was suggested, followed next year by hypologist.[43] The term computics has also been suggested.[44] In Europe, terms derived from contracted translations of the expression "automatic information" (e.g. "informazione automatica" in Italian) or "information and mathematics" are often used, e.g. informatique (French), Informatik (German), informatica (Italian, Dutch), informática (Spanish, Portuguese), informatika (Slavic languages and Hungarian) or pliroforiki (πληροφορική, which means informatics) in Greek. Similar words have also been adopted in the UK (as in the School of Informatics of the University of Edinburgh).[45] "In the U.S., however, informatics is linked with applied computing, or computing in the context of another domain."[46]

A folkloric quotation, often attributed to—but almost certainly not first formulated by—Edsger Dijkstra, states that "computer science is no more about computers than astronomy is about telescopes."[3] The design and deployment of computers and computer systems is generally considered the province of disciplines other than computer science. For example, the study of computer hardware is usually considered part of computer engineering, while the study of commercial computer systems and their deployment is often called information technology or information systems. However, there has been much cross-fertilization of ideas between the various computer-related disciplines. Computer science research also often intersects other disciplines, such as philosophy, cognitive science, linguistics, mathematics, physics, biology, statistics, and logic.

Computer science is considered by some to have a much closer relationship with mathematics than many scientific disciplines, with some observers saying that computing is a mathematical science.[19] Early computer science was strongly influenced by the work of mathematicians such as Kurt Gödel, Alan Turing, Rózsa Péter and Alonzo Church and there continues to be a useful interchange of ideas between the two fields in areas such as mathematical logic, category theory, domain theory, and algebra.[23]

The relationship between Computer Science and Software Engineering is a contentious issue, which is further muddied by disputes over what the term "Software Engineering" means, and how computer science is defined.[47] David Parnas, taking a cue from the relationship between other engineering and science disciplines, has claimed that the principal focus of computer science is studying the properties of computation in general, while the principal focus of software engineering is the design of specific computations to achieve practical goals, making the two separate but complementary disciplines.[48]

The academic, political, and funding aspects of computer science tend to depend on whether a department formed with a mathematical emphasis or with an engineering emphasis.

Computer science departments with a mathematics emphasis and with a numerical orientation consider alignment with computational science. Both types of departments tend to make efforts to bridge the field educationally if not across all research.

Philosophy

A number of computer scientists have argued for the distinction of three separate paradigms in computer science.

Peter Wegner argued that those paradigms are science, technology, and mathematics.[49] Peter Denning's working group argued that they are theory, abstraction (modeling), and design.[50] Amnon H. Eden described them as the "rationalist paradigm" (which treats computer science as a branch of mathematics, which is prevalent in theoretical computer science, and mainly employs deductive reasoning), the "technocratic paradigm" (which might be found in engineering approaches, most prominently in software engineering), and the "scientific paradigm" (which approaches computer-related artifacts from the empirical perspective of natural sciences, identifiable in some branches of artificial intelligence).[51]

Fields

Computer science is no more about computers than astronomy is about telescopes. — Michael Fellows

As a discipline, computer science spans a range of topics from theoretical studies of algorithms and the limits of computation to the practical issues of implementing computing systems in hardware and software.[52][53] CSAB, formerly called Computing Sciences Accreditation Board—which is made up of representatives of the Association for Computing Machinery (ACM), and the IEEE Computer Society (IEEE CS)[54]—identifies four areas that it considers crucial to the discipline of computer science: theory of computation, algorithms and data structures, programming methodology and languages, and computer elements and architecture. In addition to these four areas, CSAB also identifies fields such as software engineering, artificial intelligence, computer networking and communication, database systems, parallel computation, distributed computation, human–computer interaction, computer graphics, operating systems, and numerical and symbolic computation as being important areas of computer science.[52]

Theoretical computer science

Theoretical computer science is mathematical and abstract in spirit, but it derives its motivation from practical and everyday computation. Its aim is to understand the nature of computation and, as a consequence of this understanding, to provide more efficient methodologies. All studies related to mathematical, logical, and formal concepts and methods could be considered as theoretical computer science, provided that the motivation is clearly drawn from the field of computing.

Data structures and algorithms

Data structures and algorithms are the study of commonly used computational methods and their computational efficiency.

Analysis of algorithms · Algorithms · Data structures · Combinatorial optimization · Computational geometry
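
For a concrete taste of this kind of analysis, consider binary search: each comparison halves the remaining interval, so it locates an item in a sorted array in O(log n) steps, versus O(n) for a linear scan. A minimal Python sketch (function and variable names are illustrative):

    def binary_search(sorted_items, target):
        """Return the index of target in sorted_items, or -1 if absent.

        Runs in O(log n) time because each comparison halves the
        remaining search interval.
        """
        lo, hi = 0, len(sorted_items) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if sorted_items[mid] == target:
                return mid
            elif sorted_items[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return -1

    print(binary_search([2, 3, 5, 7, 11, 13], 11))  # 4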

Theory of computation

According to Peter Denning, the fundamental question underlying computer science is, "What can be (efficiently) automated?"[19] Theory of computation is focused on answering fundamental questions about what can be computed and what amount of resources are required to perform those computations. In an effort to answer the first question, computability theory examines which computational problems are solvable on various theoretical models of computation. The second question is addressed by computational complexity theory, which studies the time and space costs associated with different approaches to solving a multitude of computational problems.

The famous P = NP? problem, one of the Millennium Prize Problems,[55] is an open problem in the theory of computation.

Automata theory · Computability theory · Computational complexity theory · Cryptography · Quantum computing theory
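
As a small illustration from automata theory, one of the simplest models of computation is the deterministic finite automaton (DFA). A minimal Python sketch of a DFA that accepts exactly the binary strings containing an even number of 1s (names are illustrative):

    # States track the parity of 1s seen so far; the transition table
    # maps (state, input symbol) to the next state.
    TRANSITIONS = {("even", "0"): "even", ("even", "1"): "odd",
                   ("odd", "0"): "odd", ("odd", "1"): "even"}

    def accepts(string):
        state = "even"                      # start state
        for symbol in string:
            state = TRANSITIONS[(state, symbol)]
        return state == "even"              # accepting state

    print(accepts("1011"))  # False (three 1s)
    print(accepts("1001"))  # True  (two 1s)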

Information and coding theory

Information theory is related to the quantification of information.

This was developed by Claude Shannon to find fundamental limits on signal processing operations such as compressing data and on reliably storing and communicating data.[56] Coding theory is the study of the properties of codes (systems for converting information from one form to another) and their fitness for a specific application. Codes are used for data compression, cryptography, error detection and correction, and more recently also for network coding. Codes are studied for the purpose of designing efficient and reliable data transmission methods.[57]
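
For illustration, Shannon entropy H = -sum(p * log2(p)) gives the minimum average number of bits per symbol needed to encode a message. A small Python sketch, assuming symbol probabilities are estimated from frequencies in the message:

    import math
    from collections import Counter

    def shannon_entropy(message):
        """Entropy in bits per symbol, with probabilities taken from
        symbol frequencies in the message itself."""
        counts = Counter(message)
        total = len(message)
        return -sum((n / total) * math.log2(n / total)
                    for n in counts.values())

    print(shannon_entropy("abcd"))  # 2.0 -- four equally likely symbols
    print(shannon_entropy("aabb"))  # 1.0 -- two equally likely symbols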

Programming language theory

Programming language theory is a branch of computer science that deals with the design, implementation, analysis, characterization, and classification of programming languages and their individual features. It falls within the discipline of computer science, both depending on and affecting mathematics, software engineering, and linguistics. It is an active research area, with numerous dedicated academic journals.

Type theory · Compiler design · Programming languages
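
As an illustrative sketch of the field's subject matter, the following toy interpreter evaluates a minimal arithmetic language whose abstract syntax is a nested tuple; evaluation is defined by structural recursion (the representation and names are invented for the example):

    def evaluate(expr):
        """Evaluate an expression: either a numeric literal, or a
        tuple ("op", lhs, rhs) with op in {+, -, *}."""
        if isinstance(expr, (int, float)):      # literal
            return expr
        op, left, right = expr
        a, b = evaluate(left), evaluate(right)
        return {"+": a + b, "-": a - b, "*": a * b}[op]

    # (1 + 2) * 4
    print(evaluate(("*", ("+", 1, 2), 4)))  # 12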

Formal methods

Formal methods are a particular kind of mathematically based technique for the specification, development and verification of software and hardware systems.[58] The use of formal methods for software and hardware design is motivated by the expectation that, as in other engineering disciplines, performing appropriate mathematical analysis can contribute to the reliability and robustness of a design. They form an important theoretical underpinning for software engineering, especially where safety or security is involved. Formal methods are a useful adjunct to software testing since they help avoid errors and can also give a framework for testing. For industrial use, tool support is required. However, the high cost of using formal methods means that they are usually only used in the development of high-integrity and life-critical systems, where safety or security is of utmost importance. Formal methods are best described as the application of a fairly broad variety of theoretical computer science fundamentals, in particular logic calculi, formal languages, automata theory, and program semantics, but also type systems and algebraic data types to problems in software and hardware specification and verification.
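
The flavor of such verification can be sketched in miniature: exhaustively checking that an implementation satisfies its stated postcondition over a bounded domain, the kind of work bounded model checkers automate at far larger scale. A small illustrative Python sketch:

    def int_abs(x):
        """Implementation under test."""
        return x if x >= 0 else -x

    def spec_holds(x):
        # Postcondition: the result is non-negative and equals x or -x.
        r = int_abs(x)
        return r >= 0 and (r == x or r == -x)

    # Exhaustive check over a bounded domain of inputs.
    assert all(spec_holds(x) for x in range(-10_000, 10_001))
    print("specification verified on the bounded domain")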

Computer systems

Computer architecture and computer engineering

Computer architecture, or digital computer organization, is the conceptual design and fundamental operational structure of a computer system.

It focuses largely on the way by which the central processing unit performs internally and accesses addresses in memory.[59] The field often involves disciplines of computer engineering and electrical engineering, selecting and interconnecting hardware components to create computers that meet functional, performance, and cost goals.

Digital logic · Microarchitecture · Multiprocessing
Ubiquitous computing · Systems architecture · Operating systems

Computer performance analysis

Computer performance analysis is the study of work flowing through computers with the general goals of improving throughput, controlling response time, using resources efficiently, eliminating bottlenecks, and predicting performance under anticipated peak loads.[60] Benchmarks are used to compare the performance of systems carrying different chips and/or system architectures.[61]
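
A back-of-the-envelope illustration using Python's standard timeit module, comparing the throughput of two ways of building the same list (the workloads are arbitrary examples):

    import timeit

    # Two implementations of the same task: build a 1000-element list.
    append_loop = "xs = []\nfor i in range(1000): xs.append(i)"
    comprehension = "xs = [i for i in range(1000)]"

    # Each statement is run 1000 times; lower total time is better.
    print("append loop:  ", timeit.timeit(append_loop, number=1000), "s")
    print("comprehension:", timeit.timeit(comprehension, number=1000), "s")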

Concurrent, parallel and distributed systems

Concurrency is a property of systems in which several computations are executing simultaneously, and potentially interacting with each other.[62] A number of mathematical models have been developed for general concurrent computation including Petri nets, process calculi and the Parallel Random Access Machine model.[63] When multiple computers are connected in a network while using concurrency, this is known as a distributed system. Computers within that distributed system have their own private memory, and information can be exchanged to achieve common goals.[64]
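
A small illustrative sketch of concurrent execution using Python's standard library: a sum is split into chunks, computed by a pool of worker threads, and the partial results are combined (in CPython, CPU-bound threads interleave rather than run in parallel, but the computations are still concurrent):

    from concurrent.futures import ThreadPoolExecutor

    def partial_sum(bounds):
        lo, hi = bounds
        return sum(range(lo, hi))

    # Disjoint chunks that together cover range(0, 1_000_000).
    chunks = [(0, 250_000), (250_000, 500_000),
              (500_000, 750_000), (750_000, 1_000_000)]
    with ThreadPoolExecutor(max_workers=4) as pool:
        total = sum(pool.map(partial_sum, chunks))
    print(total == sum(range(1_000_000)))  # True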

Computer networks

This branch of computer science studies how data is exchanged between computers, with the aim of designing and managing the networks that connect computers worldwide.
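
As a minimal illustration of what such networks make possible at the application layer, the following fetches a page over HTTP using Python's standard library (example.com is a placeholder host, and the call requires network access):

    import urllib.request

    # Open a TCP connection, send an HTTP request, and read the reply.
    with urllib.request.urlopen("http://example.com") as response:
        print(response.getcode())              # 200 on success
        print(len(response.read()), "bytes received")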

Computer security and cryptography

Computer security is a branch of computer technology with an objective of protecting information from unauthorized access, disruption, or modification while maintaining the accessibility and usability of the system for its intended users.

Cryptography is the practice and study of concealing (encryption) and recovering (decryption) information. Modern cryptography is closely tied to computer science, as the security of many encryption and decryption algorithms rests on the computational complexity of the problems an attacker would have to solve.
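
A textbook illustration is the one-time pad, which encrypts by XORing the plaintext with a random key of equal length and decrypts by repeating the same operation; production systems rely on vetted cryptographic libraries rather than hand-rolled primitives. A minimal Python sketch:

    import secrets

    def xor_bytes(data, key):
        # XOR is its own inverse, so the same function encrypts and decrypts.
        return bytes(d ^ k for d, k in zip(data, key))

    plaintext = b"attack at dawn"
    key = secrets.token_bytes(len(plaintext))   # random, used only once
    ciphertext = xor_bytes(plaintext, key)
    print(xor_bytes(ciphertext, key))           # b'attack at dawn'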

Databases

A database is intended to organize, store, and retrieve large amounts of data easily.

Digital databases are managed using database management systems to store, create, maintain, and search data, through database models and query languages.
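
A minimal illustration using SQLite, a database management system bundled with Python's standard library, with SQL as the query language (the table and rows are invented for the example):

    import sqlite3

    # Create an in-memory database, store rows, and query them with SQL.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE books (title TEXT, year INTEGER)")
    conn.executemany("INSERT INTO books VALUES (?, ?)",
                     [("SICP", 1985), ("TAOCP", 1968)])
    for row in conn.execute("SELECT title FROM books WHERE year < 1980"):
        print(row[0])                           # TAOCP
    conn.close()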

Computer applications

Computer graphics and visualization

Computer graphics is the study of digital visual content and involves the synthesis and manipulation of image data.

The study is connected to many other fields in computer science, including computer vision, image processing, and computational geometry, and is heavily applied in the fields of special effects and video games.
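
A minimal illustration of image synthesis: writing a color gradient as a plain-text PPM image with no graphics library at all (the output file name is arbitrary):

    # Generate a 256x256 image where red varies with x and green with y,
    # in the ASCII PPM (P3) format readable by most image viewers.
    WIDTH, HEIGHT = 256, 256
    with open("gradient.ppm", "w") as f:
        f.write(f"P3\n{WIDTH} {HEIGHT}\n255\n")
        for y in range(HEIGHT):
            for x in range(WIDTH):
                f.write(f"{x} {y} 128 ")   # one R G B triple per pixel
            f.write("\n")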

Human–computer interaction

Human–computer interaction research develops theories, principles, and guidelines for user interface designers, so they can create satisfactory user experiences with desktop, laptop, and mobile devices.

Scientific computing

Scientific computing (or computational science) is the field of study concerned with constructing mathematical models and quantitative analysis techniques and using computers to analyze and solve scientific problems. In practical use, it is typically the application of computer simulation and other forms of computation to problems in various scientific disciplines.

Numerical analysis · Computational physics · Computational chemistry · Bioinformatics
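
A small illustrative sketch from numerical analysis: approximating a definite integral with the trapezoidal rule (the step count is chosen arbitrarily for the example):

    import math

    def trapezoid(f, a, b, n=10_000):
        """Approximate the integral of f over [a, b] with n trapezoids."""
        h = (b - a) / n
        total = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
        return total * h

    # The integral of sin(x) from 0 to pi is exactly 2.
    print(trapezoid(math.sin, 0.0, math.pi))  # ~2.0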

Artificial intelligence

Artificial intelligence (AI) aims to synthesize goal-oriented processes such as problem-solving, decision-making, environmental adaptation, learning, and communication, as found in humans and animals.

From its origins in cybernetics and in the Dartmouth Conference (1956), artificial intelligence research has been necessarily cross-disciplinary, drawing on areas of expertise such as applied mathematics, symbolic logic, semiotics, electrical engineering, philosophy of mind, neurophysiology, and social intelligence. AI is associated in the popular mind with robotic development, but the main field of practical application has been as an embedded component in areas of software development, which require computational understanding. The starting point in the late 1940s was Alan Turing's question "Can computers think?", and the question remains effectively unanswered, although the Turing test is still used to assess computer output on the scale of human intelligence. But the automation of evaluative and predictive tasks has been increasingly successful as a substitute for human monitoring and intervention in domains of computer application involving complex real-world data.

Machine learning · Computer vision · Image processing
Pattern recognition · Data mining · Evolutionary computation
Knowledge representation and reasoning · Natural language processing · Robotics
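
A minimal illustration of machine learning: a 1-nearest-neighbor classifier, which predicts the label of a query point from the most similar stored example (the data points are invented):

    def nearest_neighbor(train, query):
        """train: list of (feature_vector, label); returns the label of
        the training example closest to the query point."""
        def sq_dist(p, q):
            return sum((a - b) ** 2 for a, b in zip(p, q))
        return min(train, key=lambda ex: sq_dist(ex[0], query))[1]

    examples = [((1.0, 1.0), "small"), ((1.2, 0.8), "small"),
                ((8.0, 9.0), "large"), ((9.5, 8.5), "large")]
    print(nearest_neighbor(examples, (8.8, 9.2)))  # large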

Software engineering

Software engineering is the study of designing, implementing, and modifying software in order to ensure it is of high quality, affordable, maintainable, and fast to build.

It is a systematic approach to software design, involving the application of engineering practices to software.

Software engineering deals with the organizing and analyzing of software—it does not deal only with the creation of new software, but also with its internal maintenance and arrangement.

Discoveries

The philosopher of computing Bill Rapaport noted three Great Insights of Computer Science:[65]

  • Gottfried Wilhelm Leibniz, George Boole, Alan Turing, Claude Shannon, and Samuel Morse's insight: there are only two objects that a computer has to deal with in order to represent "anything".[66]

  • Alan Turing's insight: there are only five actions that a computer has to perform in order to do "anything".[4]

  • Corrado Böhm and Giuseppe Jacopini's insight: there are only three ways of combining these actions (into more complex ones) that are needed in order for a computer to do "anything".[67]
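
Those three combining forms are sequence, selection, and repetition. As an illustrative sketch, the following computes a greatest common divisor using nothing but those three structures:

    def gcd(a, b):
        """Greatest common divisor of two positive integers, built from
        repetition (while), selection (if/else), and sequences of
        assignments -- nothing more."""
        while b != 0:          # repetition
            if a > b:          # selection
                a = a - b
            else:
                b = b - a
        return a

    print(gcd(48, 18))  # 6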

Programming paradigms

Programming languages can be used to accomplish different tasks in different ways.

Common programming paradigms include:

  • Functional programming, a style of building the structure and elements of computer programs that treats computation as the evaluation of mathematical functions and avoids state and mutable data. It is a declarative programming paradigm, which means programming is done with expressions or declarations instead of statements.[68]

  • Imperative programming, a programming paradigm that uses statements that change a program's state.[69] In much the same way that the imperative mood in natural languages expresses commands, an imperative program consists of commands for the computer to perform. Imperative programming focuses on describing how a program operates.

  • Object-oriented programming, a programming paradigm based on the concept of "objects", which may contain data, in the form of fields, often known as attributes; and code, in the form of procedures, often known as methods. A feature of objects is that an object's procedures can access and often modify the data fields of the object with which they are associated. Thus, object-oriented computer programs are made out of objects that interact with one another.[70]

Many languages offer support for multiple paradigms, making the distinction more a matter of style than of technical capabilities.[71]
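
As an illustrative sketch, the same task, summing the squares of a list, can be expressed in each of the three paradigms above:

    numbers = [1, 2, 3, 4]

    # Imperative: statements that mutate state step by step.
    total = 0
    for n in numbers:
        total += n * n

    # Functional: an expression built from function application, no mutation.
    functional_total = sum(map(lambda n: n * n, numbers))

    # Object-oriented: data and behavior bundled into an object.
    class SquareSummer:
        def __init__(self, values):
            self.values = values

        def total(self):
            return sum(v * v for v in self.values)

    print(total, functional_total, SquareSummer(numbers).total())  # 30 30 30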

Academia

Conferences are important events for computer science research.

During these conferences, researchers from the public and private sectors present their recent work and meet.

Unlike in most other academic fields, in computer science, the prestige of conference papers is greater than that of journal publications.[72][73] One proposed explanation for this is that the quick development of this relatively new field requires rapid review and distribution of results, a task better handled by conferences than by journals.[74]

Education

Computer Science, known by its near synonyms Computing, Computer Studies, Information Technology (IT) and Information and Computing Technology (ICT), has been taught in UK schools since the days of batch processing, mark sensitive cards, and paper tape, but usually to a select few students.[75] In 1981, the BBC produced a micro-computer and classroom network, and Computer Studies became common for GCE O level students (ages 11–16), with Computer Science offered to A level students. Its importance was recognised, and it became a compulsory part of the National Curriculum for Key Stages 3 and 4. In September 2014 it became an entitlement for all 7,000,000 pupils over the age of 4.[76]

In the US, with 14,000 school districts deciding the curriculum, provision was fractured.[77] According to a 2010 report by the Association for Computing Machinery (ACM) and Computer Science Teachers Association (CSTA), only 14 out of 50 states have adopted significant education standards for high school computer science.[78]

Israel, New Zealand, and South Korea have included computer science in their national secondary education curricula,[79][80] and several others are following.[81]

Challenges

In many countries, there is a significant gender gap in computer science education.

In 2012, only 20 percent of computer science degrees in the United States were awarded to women.[82] The gender gap is also a problem in other western countries.[83] The gap is smaller, or nonexistent, in some parts of the world. In 2011, women earned half of the computer science degrees in Malaysia.[84] In 2001, 55 percent of computer science graduates in Guyana were women.[83]

See also

  • Information technology

  • List of computer scientists

  • List of important publications in computer science

  • List of pioneers in computer science

  • List of unsolved problems in computer science

  • List of terms relating to algorithms and data structures

  • Software engineering

References

[1] In 1851
[2] "The introduction of punched cards into the new engine was important not only as a more convenient form of control than the drums, or because programs could now be of unlimited extent, and could be stored and repeated without the danger of introducing errors in setting the machine by hand; it was important also because it served to crystallize Babbage's feeling that he had invented something really new, something much more than a sophisticated calculating machine." Collier, Bruce (1990). The Little Engine That Could've: The Calculating Machines of Charles Babbage. Garland Publishing Inc. ISBN 978-0-8240-0043-1.
[3] See the entry "Computer science" on Wikiquote for the history of this quotation.
[4] The word "anything" is written in quotation marks because there are things that computers cannot do. One example is answering the question of whether an arbitrary given computer program will eventually finish or run forever (the Halting problem).
[5] "WordNet Search—3.1". wordnetweb.princeton.edu. Retrieved May 14, 2012.
[6] "Charles Babbage Institute: Who Was Charles Babbage?". cbi.umn.edu. Retrieved December 28, 2016.
[7] "Ada Lovelace | Babbage Engine | Computer History Museum". www.computerhistory.org. Retrieved December 28, 2016.
[8] "Wilhelm Schickard – Ein Computerpionier" [Wilhelm Schickard – A Computer Pioneer] (PDF) (in German).
[9] Keates, Fiona (June 25, 2012). "A Brief History of Computing". The Repository. The Royal Society.
[10] "Science Museum—Introduction to Babbage". Archived from the original on September 8, 2006. Retrieved September 24, 2006.
[11] Hyman, Anthony (1982). Charles Babbage, Pioneer of the Computer.
[12] Toole, Betty Alexandra. "A Selection and Adaptation From Ada's Notes found in Ada, The Enchantress of Numbers". Strawberry Press, Mill Valley, CA. Archived from the original on February 10, 2006. Retrieved May 4, 2006.
[13] The John Gabriel Byrne Computer Science Collection.
[14] "In this sense Aiken needed IBM, whose technology included the use of punched cards, the accumulation of numerical data, and the transfer of numerical data from one register to another", p. 44 (2000).
[15] p. 187, 1975.
[16] The Association for Computing Machinery (ACM) was founded in 1947.
[17] "IBM Archives: 1945". ibm.com. Retrieved March 19, 2019.
[18] "IBM100 – The Origins of Computer Science". ibm.com. September 15, 1995. Retrieved March 19, 2019.
[19] Denning, Peter J. (2000). "Computer Science: The Discipline" (PDF). Encyclopedia of Computer Science. Archived from the original (PDF) on May 25, 2006.
[20] "Some EDSAC statistics". University of Cambridge. Retrieved November 19, 2011.