History of computers: A brief timeline

The history of computers began with primitive designs in the early 19th century and went on to change the world during the 20th century.


The history of computers goes back more than 200 years. First theorized by mathematicians and entrepreneurs, mechanical calculating machines were designed and built during the 19th century to solve increasingly complex number-crunching challenges. As technology advanced in the early 20th century, computers grew larger and more powerful.

Today, computers are almost unrecognizable from 19th-century designs such as Charles Babbage's Analytical Engine, or even from the huge machines of the 20th century that occupied whole rooms, such as the Electronic Numerical Integrator and Computer (ENIAC).

Here's a brief history of computers, from their primitive number-crunching origins to the powerful modern-day machines that surf the Internet, run games and stream multimedia. 

19th century

1801: Joseph Marie Jacquard, a French merchant and inventor, invents a loom that uses punched wooden cards to automatically weave fabric designs. Early computers would use similar punch cards.

1821: English mathematician Charles Babbage conceives of a steam-driven calculating machine that would be able to compute tables of numbers. Funded by the British government, the project, called the "Difference Engine," fails because the technology of the time could not meet the design's demands, according to the University of Minnesota.

1843: Ada Lovelace, an English mathematician and the daughter of poet Lord Byron, writes the world's first computer program. According to Anna Siffert, a professor of theoretical mathematics at the University of Münster in Germany, Lovelace writes the first program while translating a paper on Babbage's Analytical Engine from French into English. "She also provides her own comments on the text. Her annotations, simply called 'notes,' turn out to be three times as long as the actual transcript," Siffert wrote in an article for the Max Planck Society. "Lovelace also adds a step-by-step description for computation of Bernoulli numbers with Babbage's machine — basically an algorithm — which, in effect, makes her the world's first computer programmer." Bernoulli numbers are a sequence of rational numbers often used in computation.
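To give a sense of what such an algorithm looks like in modern form, here is a minimal Python sketch (not Lovelace's actual program) that computes the first few Bernoulli numbers from the standard recurrence, using exact rational arithmetic:

```python
# A minimal sketch, not Lovelace's program: compute Bernoulli numbers B_0..B_n
# from the recurrence sum_{j=0}^{m} C(m+1, j) * B_j = 0 (m >= 1), with B_0 = 1.
from fractions import Fraction
from math import comb

def bernoulli(n):
    B = [Fraction(1)]                                   # B_0 = 1
    for m in range(1, n + 1):
        s = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(-s / (m + 1))                          # solve the recurrence for B_m
    return B

print([str(b) for b in bernoulli(6)])
# ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42']
```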

Babbage's Analytical Engine

1853: Swedish inventor Per Georg Scheutz and his son Edvard design the world's first printing calculator. The machine is significant for being the first to "compute tabular differences and print the results," according to Uta C. Merzbach's book "Georg Scheutz and the First Printing Calculator" (Smithsonian Institution Press, 1977).

1890: Herman Hollerith designs a punch-card system to help calculate the 1890 U.S. Census. The machine saves the government several years of calculations, and the U.S. taxpayer approximately $5 million, according to Columbia University. Hollerith later establishes a company that will eventually become International Business Machines Corporation (IBM).

Early 20th century

1931: At the Massachusetts Institute of Technology (MIT), Vannevar Bush invents and builds the Differential Analyzer, the first large-scale automatic general-purpose mechanical analog computer, according to Stanford University . 

1936: Alan Turing , a British scientist and mathematician, presents the principle of a universal machine, later called the Turing machine, in a paper called "On Computable Numbers…" according to Chris Bernhardt's book " Turing's Vision " (The MIT Press, 2017). Turing machines are capable of computing anything that is computable. The central concept of the modern computer is based on his ideas. Turing is later involved in the development of the Turing-Welchman Bombe, an electro-mechanical device designed to decipher Nazi codes during World War II, according to the UK's National Museum of Computing . 

1937: John Vincent Atanasoff, a professor of physics and mathematics at Iowa State University, submits a grant proposal to build the first electric-only computer, without using gears, cams, belts or shafts.

original garage where Bill Hewlett and Dave Packard started their business

1939: David Packard and Bill Hewlett found the Hewlett Packard Company in Palo Alto, California. The pair decide the name of their new company by the toss of a coin, and Hewlett-Packard's first headquarters are in Packard's garage, according to MIT . 

1941: German inventor and engineer Konrad Zuse completes his Z3 machine, the world's earliest digital computer, according to Gerard O'Regan's book " A Brief History of Computing " (Springer, 2021). The machine was destroyed during a bombing raid on Berlin during World War II. Zuse fled the German capital after the defeat of Nazi Germany and later released the world's first commercial digital computer, the Z4, in 1950, according to O'Regan. 

1941: Atanasoff and his graduate student, Clifford Berry, design the first digital electronic computer in the U.S., called the Atanasoff-Berry Computer (ABC). This marks the first time a computer is able to store information in its main memory; the machine is capable of performing one operation every 15 seconds, according to the book "Birthing the Computer" (Cambridge Scholars Publishing, 2016).

1945: Two professors at the University of Pennsylvania, John Mauchly and J. Presper Eckert, design and build the Electronic Numerical Integrator and Computer (ENIAC). The machine is the first "automatic, general-purpose, electronic, decimal, digital computer," according to Edwin D. Reilly's book "Milestones in Computer Science and Information Technology" (Greenwood Press, 2003).

Computer technicians operating the ENIAC

1946: Mauchly and Eckert leave the University of Pennsylvania and receive funding from the Census Bureau to build the UNIVAC, the first commercial computer for business and government applications.

1947: William Shockley, John Bardeen and Walter Brattain of Bell Laboratories invent the transistor . They discover how to make an electric switch with solid materials and without the need for a vacuum.

1949: A team at the University of Cambridge develops the Electronic Delay Storage Automatic Calculator (EDSAC), "the first practical stored-program computer," according to O'Regan. "EDSAC ran its first program in May 1949 when it calculated a table of squares and a list of prime numbers ," O'Regan wrote. In November 1949, scientists with the Council of Scientific and Industrial Research (CSIR), now called CSIRO, build Australia's first digital computer called the Council for Scientific and Industrial Research Automatic Computer (CSIRAC). CSIRAC is the first digital computer in the world to play music, according to O'Regan.

Late 20th century

1953: Grace Hopper develops the first computer language, which eventually becomes known as COBOL, an acronym for COmmon Business-Oriented Language, according to the National Museum of American History. Hopper is later dubbed the "First Lady of Software" in her posthumous Presidential Medal of Freedom citation. Thomas Johnson Watson Jr., son of IBM CEO Thomas Johnson Watson Sr., conceives the IBM 701 EDPM to help the United Nations keep tabs on Korea during the Korean War.

1954: John Backus and his team of programmers at IBM publish a paper describing their newly created FORTRAN programming language, an acronym for FORmula TRANslation, according to MIT .

1958: Jack Kilby and Robert Noyce unveil the integrated circuit, known as the computer chip. Kilby is later awarded the Nobel Prize in Physics for his work.

1968: Douglas Engelbart reveals a prototype of the modern computer at the Fall Joint Computer Conference in San Francisco. His presentation, called "A Research Center for Augmenting Human Intellect," includes a live demonstration of his computer, including a mouse and a graphical user interface (GUI), according to the Doug Engelbart Institute. This marks the evolution of the computer from a specialized machine for academics to a technology that is more accessible to the general public.

The first computer mouse, invented in 1963 by Douglas C. Engelbart

1969: Ken Thompson, Dennis Ritchie and a group of other developers at Bell Labs produce UNIX, an operating system that made "large-scale networking of diverse computing systems — and the internet — practical," according to Bell Labs. The team behind UNIX continued to develop the operating system using the C programming language, which they also developed.

1970: The newly formed Intel unveils the Intel 1103, the first Dynamic Random Access Memory (DRAM) chip.

1971: A team of IBM engineers led by Alan Shugart invents the "floppy disk," enabling data to be shared among different computers.

1972: Ralph Baer, a German-American engineer, releases Magnavox Odyssey, the world's first home game console, in September 1972 , according to the Computer Museum of America . Months later, entrepreneur Nolan Bushnell and engineer Al Alcorn with Atari release Pong, the world's first commercially successful video game. 

1973: Robert Metcalfe, a member of the research staff for Xerox, develops Ethernet for connecting multiple computers and other hardware.

1975: The cover of the January issue of "Popular Electronics" highlights the Altair 8800 as the "world's first minicomputer kit to rival commercial models." After seeing the magazine issue, two "computer geeks," Paul Allen and Bill Gates, offer to write software for the Altair in the BASIC language. On April 4, after the success of this first endeavor, the two childhood friends form their own software company, Microsoft.

1976: Steve Jobs and Steve Wozniak co-found Apple Computer on April Fool's Day. They unveil the Apple I, the first computer with a single-circuit board and ROM (Read Only Memory), according to MIT.

Apple I computer 1976

1977: The Commodore Personal Electronic Transactor (PET) is released onto the home computer market, featuring an MOS Technology 8-bit 6502 microprocessor, which controls the screen, keyboard and cassette player. The PET is especially successful in the education market, according to O'Regan.

1977: Radio Shack begins its initial production run of 3,000 TRS-80 Model 1 computers — disparagingly known as the "Trash 80" — priced at $599, according to the National Museum of American History. Within a year, the company takes 250,000 orders for the computer, according to the book "How TRS-80 Enthusiasts Helped Spark the PC Revolution" (The Seeker Books, 2007).

1977: The first West Coast Computer Faire is held in San Francisco. Jobs and Wozniak present the Apple II computer at the Faire; it includes color graphics and features an audio cassette drive for storage.

1978: VisiCalc, the first computerized spreadsheet program, is introduced.

1979: MicroPro International, founded by software engineer Seymour Rubenstein, releases WordStar, the world's first commercially successful word processor. WordStar is programmed by Rob Barnaby, and includes 137,000 lines of code, according to Matthew G. Kirschenbaum's book " Track Changes: A Literary History of Word Processing " (Harvard University Press, 2016).

1981: "Acorn," IBM's first personal computer, is released onto the market at a price point of $1,565, according to IBM. Acorn uses the MS-DOS operating system from Windows. Optional features include a display, printer, two diskette drives, extra memory, a game adapter and more.

A worker using an Acorn computer by IBM, 1981

1983: The Apple Lisa, standing for "Local Integrated Software Architecture" but also the name of Steve Jobs' daughter, according to the National Museum of American History ( NMAH ), is the first personal computer to feature a GUI. The machine also includes a drop-down menu and icons. Also this year, the Gavilan SC is released and is the first portable computer with a flip-form design and the very first to be sold as a "laptop."

1984: The Apple Macintosh is announced to the world during a Super Bowl advertisement. The Macintosh is launched with a retail price of $2,500, according to the NMAH.

1985: As a response to the Apple Lisa's GUI, Microsoft releases Windows in November 1985, the Guardian reported. Meanwhile, Commodore announces the Amiga 1000.

1989: Tim Berners-Lee, a British researcher at the European Organization for Nuclear Research ( CERN ), submits his proposal for what would become the World Wide Web. His paper details his ideas for Hyper Text Markup Language (HTML), the building blocks of the Web. 

1993: The Pentium microprocessor advances the use of graphics and music on PCs.

1996: Sergey Brin and Larry Page develop the Google search engine at Stanford University.

1997: Microsoft invests $150 million in Apple, which at the time is struggling financially.  This investment ends an ongoing court case in which Apple accused Microsoft of copying its operating system. 

1999: Wi-Fi, the abbreviated term for "wireless fidelity," is developed, initially covering a distance of up to 300 feet (91 meters), Wired reported.

21st century

2001: Mac OS X, later renamed OS X then simply macOS, is released by Apple as the successor to its standard Mac Operating System. OS X goes through 16 different versions, each with "10" as its title, and the first nine iterations are nicknamed after big cats, with the first being codenamed "Cheetah," TechRadar reported.  

2003: AMD's Athlon 64, the first 64-bit processor for personal computers, is released to customers. 

2004: The Mozilla Corporation launches Mozilla Firefox 1.0. The Web browser is one of the first major challenges to Internet Explorer, owned by Microsoft. During its first five years, Firefox exceeded a billion downloads by users, according to the Web Design Museum . 

2005: Google buys Android, a Linux-based mobile phone operating system.

2006: The MacBook Pro from Apple hits the shelves. The Pro is the company's first Intel-based, dual-core mobile computer. 

2009: Microsoft launches Windows 7 on July 22. The new operating system lets users pin applications to the taskbar, minimize other windows by shaking the active one, access jump lists, preview open windows from the taskbar and more, TechRadar reported.

Apple CEO Steve Jobs holds the iPad during the launch of Apple's new tablet computing device in San Francisco

2010: The iPad, Apple's flagship handheld tablet, is unveiled.

2011: Google releases the Chromebook, which runs on Google Chrome OS.

2015: Apple releases the Apple Watch. Microsoft releases Windows 10.

2016: The first reprogrammable quantum computer was created. "Until now, there hasn't been any quantum-computing platform that had the capability to program new algorithms into their system. They're usually each tailored to attack a particular algorithm," said study lead author Shantanu Debnath, a quantum physicist and optical engineer at the University of Maryland, College Park.

2017: The Defense Advanced Research Projects Agency (DARPA) is developing a new "Molecular Informatics" program that uses molecules as computers. "Chemistry offers a rich set of properties that we may be able to harness for rapid, scalable information storage and processing," Anne Fischer, program manager in DARPA's Defense Sciences Office, said in a statement. "Millions of molecules exist, and each molecule has a unique three-dimensional atomic structure as well as variables such as shape, size, or even color. This richness provides a vast design space for exploring novel and multi-value ways to encode and process data beyond the 0s and 1s of current logic-based, digital architectures."

2019: A team at Google became the first to demonstrate quantum supremacy, creating a quantum computer that could feasibly outperform the most powerful classical computer, albeit for a very specific problem with no practical real-world application. The team described the computer, dubbed "Sycamore," in a paper published that year in the journal Nature. Achieving quantum advantage, in which a quantum computer solves a problem with real-world applications faster than the most powerful classical computer, is still a ways off.

2022: The first exascale supercomputer, and the world's fastest, Frontier, went online at the Oak Ridge Leadership Computing Facility (OLCF) in Tennessee. Built by Hewlett Packard Enterprise (HPE) at a cost of $600 million, Frontier uses nearly 10,000 AMD EPYC 7453 64-core CPUs alongside nearly 40,000 AMD Instinct MI250X GPUs. The machine ushered in the era of exascale computing, which refers to systems that can perform more than one exaFLOP, or a quintillion floating-point operations per second. Frontier is currently the only machine capable of reaching such levels of performance, and it is being used as a tool to aid scientific discovery.

What is the first computer in history?

Charles Babbage's Difference Engine, designed in the 1820s, is considered the first "mechanical" computer in history, according to the Science Museum in the U.K. Operated by turning a hand crank, the machine calculated a series of values and printed the results in a table.

What are the five generations of computing?

The "five generations of computing" is a framework for assessing the entire history of computing and the key technological advancements throughout it. 

The first generation, spanning the 1940s to the 1950s, covered vacuum tube-based machines. The second incorporated transistor-based computing between the 1950s and the 1960s. In the 1960s and 1970s, the third generation gave rise to integrated circuit-based computing. We are now between the fourth and fifth generations of computing, which are based on microprocessors and artificial intelligence, respectively.

What is the most powerful computer in the world?

As of November 2023, the most powerful computer in the world is the Frontier supercomputer. The machine, which can reach a performance level of up to 1.102 exaFLOPS, ushered in the age of exascale computing in 2022 when it went online at Tennessee's Oak Ridge Leadership Computing Facility (OLCF).

There is, however, a potentially more powerful supercomputer waiting in the wings in the form of the Aurora supercomputer, which is housed at the Argonne National Laboratory (ANL) outside of Chicago.  Aurora went online in November 2023. Right now, it lags far behind Frontier, with performance levels of just 585.34 petaFLOPS (roughly half the performance of Frontier), although it's still not finished. When work is completed, the supercomputer is expected to reach performance levels higher than 2 exaFLOPS.

What was the first killer app?

Killer apps are widely understood to be applications so essential or desirable that they drive adoption of the technology they run on. There have been many through the years, from Word for Windows in 1989, to iTunes in 2001, to social media apps like WhatsApp in more recent years.

Several pieces of software may stake a claim to be the first killer app, but there is a broad consensus that VisiCalc, a spreadsheet program created by VisiCorp and originally released for the Apple II in 1979, holds that title. Steve Jobs even credited the app with propelling the Apple II to the success it became, according to co-creator Dan Bricklin.

Additional resources

  • Fortune: A Look Back At 40 Years of Apple
  • The New Yorker: The First Windows
  • "A Brief History of Computing" by Gerard O'Regan (Springer, 2021)



What is a computer?


Technical insides of a desktop computer


A laptop computer

A computer is a machine that can store and process information . Most computers rely on a binary system , which uses two variables, 0 and 1, to complete tasks such as storing data, calculating algorithms, and displaying information. Computers come in many different shapes and sizes, from handheld smartphones to supercomputers weighing more than 300 tons.

Who invented the computer?

Many people throughout history are credited with developing early prototypes that led to the modern computer. During World War II, physicist John Mauchly, engineer J. Presper Eckert, Jr., and their colleagues at the University of Pennsylvania designed the first programmable general-purpose electronic digital computer, the Electronic Numerical Integrator and Computer (ENIAC).

What is the most powerful computer in the world?

As of November 2021, the most powerful computer in the world is the Japanese supercomputer Fugaku, developed by RIKEN and Fujitsu. It has been used to model COVID-19 simulations.

How do programming languages work?

Popular modern programming languages , such as JavaScript and Python, work through multiple forms of programming paradigms. Functional programming, which uses mathematical functions to give outputs based on data input, is one of the more common ways code is used to provide instructions for a computer.
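As a rough, minimal illustration of the functional style mentioned above (the function and data below are invented for this example), a pure function can be mapped over input data to produce outputs without modifying any shared state:

```python
# A minimal sketch of the functional style: a pure function mapped over
# input data to produce outputs, with no shared state modified.
def fahrenheit_to_celsius(f):
    return (f - 32) * 5 / 9

readings_f = [32, 50, 212]                              # example inputs
readings_c = list(map(fahrenheit_to_celsius, readings_f))
print(readings_c)                                       # [0.0, 10.0, 100.0]
```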

What can computers do?

The most powerful computers can perform extremely complex tasks, such as simulating nuclear weapon experiments and predicting the development of climate change. Quantum computers, machines that can handle a large number of calculations through quantum parallelism (derived from superposition), would be able to perform even more complex tasks.

Are computers conscious?

A computer's ability to gain consciousness is a widely debated topic. Some argue that consciousness depends on self-awareness and the ability to think, which means that computers are conscious because they recognize their environment and can process data. Others believe that human consciousness can never be replicated by physical processes.

What is the impact of computer artificial intelligence (AI) on society?

Computer artificial intelligence's impact on society is widely debated. Many argue that AI improves the quality of everyday life by doing routine and even complicated tasks better than humans can, making life simpler, safer, and more efficient. Others argue that AI poses dangerous privacy risks, exacerbates racism by standardizing people, and costs workers their jobs, leading to greater unemployment. For more on the debate over artificial intelligence, visit ProCon.org.


computer, device for processing, storing, and displaying information.

Computer once meant a person who did computations, but now the term almost universally refers to automated electronic machinery . The first section of this article focuses on modern digital electronic computers and their design, constituent parts, and applications. The second section covers the history of computing. For details on computer architecture , software , and theory, see computer science .

Computing basics

The first computers were used primarily for numerical calculations. However, as any information can be numerically encoded, people soon realized that computers are capable of general-purpose information processing . Their capacity to handle large amounts of data has extended the range and accuracy of weather forecasting . Their speed has allowed them to make decisions about routing telephone connections through a network and to control mechanical systems such as automobiles, nuclear reactors, and robotic surgical tools. They are also cheap enough to be embedded in everyday appliances and to make clothes dryers and rice cookers “smart.” Computers have allowed us to pose and answer questions that were difficult to pursue in the past. These questions might be about DNA sequences in genes, patterns of activity in a consumer market, or all the uses of a word in texts that have been stored in a database . Increasingly, computers can also learn and adapt as they operate by using processes such as machine learning .
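As a minimal sketch of the claim that any information can be numerically encoded (the example text here is arbitrary), a short string can be turned into numbers and back:

```python
# A minimal sketch: information (here, text) encoded as numbers and decoded again.
text = "DNA"
codes = [ord(ch) for ch in text]          # characters -> numbers
print(codes)                              # [68, 78, 65]
print("".join(chr(n) for n in codes))     # numbers -> back to 'DNA'
```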

A hand holding a computer chip (central processing unit).

Computers also have limitations, some of which are theoretical. For example, there are undecidable propositions whose truth cannot be determined within a given set of rules, such as the logical structure of a computer. Because no universal algorithmic method can exist to identify such propositions, a computer asked to obtain the truth of such a proposition will (unless forcibly interrupted) continue indefinitely—a condition known as the "halting problem." (See Turing machine.) Other limitations reflect current technology. For example, although computers have progressed greatly in terms of processing data and using artificial intelligence algorithms, they are limited by their incapacity to think in a more holistic fashion. Computers may imitate humans—quite effectively, even—but imitation may not replace the human element in social interaction. Ethical concerns also limit computers, because computers rely on data, rather than a moral compass or human conscience, to make decisions.
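The halting problem concerns the impossibility of a general procedure for deciding whether an arbitrary program ever finishes. As a small, runnable illustration of how hard that question can be even for tiny programs (the example is the well-known Collatz iteration, not something discussed in this article), the loop below halts for every starting value anyone has tried, yet whether it halts for all inputs remains an unproven conjecture:

```python
# Whether this loop terminates for every positive integer n is the open
# Collatz conjecture; no simple inspection of the code settles the question.
def collatz_steps(n):
    steps = 0
    while n != 1:
        n = 3 * n + 1 if n % 2 else n // 2
        steps += 1
    return steps

print(collatz_steps(27))   # 111 steps before the loop finally halts
```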

Analog computers

Analog computers use continuous physical magnitudes to represent quantitative information. At first they represented quantities with mechanical components (see differential analyzer and integrator), but after World War II voltages were used; by the 1960s digital computers had largely replaced them. Nonetheless, analog computers, and some hybrid digital-analog systems, continued in use through the 1960s in tasks such as aircraft and spaceflight simulation.


One advantage of analog computation is that it may be relatively simple to design and build an analog computer to solve a single problem. Another advantage is that analog computers can frequently represent and solve a problem in “real time”; that is, the computation proceeds at the same rate as the system being modeled by it. Their main disadvantages are that analog representations are limited in precision—typically a few decimal places but fewer in complex mechanisms—and general-purpose devices are expensive and not easily programmed.

Digital computers

In contrast to analog computers, digital computers represent information in discrete form, generally as sequences of 0s and 1s ( binary digits, or bits). The modern era of digital computers began in the late 1930s and early 1940s in the United States , Britain, and Germany . The first devices used switches operated by electromagnets (relays). Their programs were stored on punched paper tape or cards, and they had limited internal data storage. For historical developments, see the section Invention of the modern computer .

During the 1950s and ’60s, Unisys (maker of the UNIVAC computer), International Business Machines Corporation (IBM), and other companies made large, expensive computers of increasing power . They were used by major corporations and government research laboratories, typically as the sole computer in the organization. In 1959 the IBM 1401 computer rented for $8,000 per month (early IBM machines were almost always leased rather than sold), and in 1964 the largest IBM S/360 computer cost several million dollars.

These computers came to be called mainframes, though the term did not become common until smaller computers were built. Mainframe computers were characterized by having (for their time) large storage capabilities, fast components, and powerful computational abilities. They were highly reliable, and, because they frequently served vital needs in an organization, they were sometimes designed with redundant components that let them survive partial failures. Because they were complex systems, they were operated by a staff of systems programmers, who alone had access to the computer. Other users submitted “batch jobs” to be run one at a time on the mainframe.

Such systems remain important today, though they are no longer the sole, or even primary, central computing resource of an organization, which will typically have hundreds or thousands of personal computers (PCs). Mainframes now provide high-capacity data storage for Internet servers, or, through time-sharing techniques, they allow hundreds or thousands of users to run programs simultaneously. Because of their current roles, these computers are now called servers rather than mainframes.


History of Computers

Before computers were developed, people used sticks, stones, and bones as counting tools. As technology advanced and human understanding improved over time, more computing devices were developed, such as the abacus and Napier's bones. These devices were used as computers for performing mathematical computations, though not very complex ones.

Some of the popular computing devices are described below, starting from the oldest to the latest or most advanced technology developed:

Abacus

Around 4,000 years ago, the abacus was invented (its origin is variously attributed to Mesopotamia and China), and it is believed to be the first computer. The history of computers begins with the birth of the abacus.

Structure: The abacus is basically a wooden rack holding metal rods with beads mounted on them.

Working of the abacus: The beads are moved by the abacus operator according to certain rules to perform arithmetic calculations. In some countries, such as China, Russia, and Japan, the abacus is still in use.

Napier’s Bones

Napier's Bones was a manually operated calculating device and, as the name indicates, it was invented by John Napier. In this device, he used nine different ivory strips (bones) marked with numbers to multiply and divide. It was also the first machine to use the decimal point system for calculation.

Pascaline

The Pascaline is also called an Arithmetic Machine or Adding Machine. The French mathematician-philosopher Blaise Pascal invented it between 1642 and 1644, and it was the first mechanical and automatic calculator. Pascal invented it to help his father, a tax accountant, with his calculations. It could perform addition and subtraction quickly. The device was basically a wooden box with a series of gears and wheels: when one wheel completed a full revolution, it advanced the neighbouring wheel, and a series of windows on top of the wheels displayed the totals.

Stepped Reckoner or Leibniz wheel

In 1673, the German mathematician-philosopher Gottfried Wilhelm Leibniz improved on Pascal's invention to develop this machine. It was basically a digital mechanical calculator, and it was called the stepped reckoner because it used fluted drums instead of the gears used in the earlier Pascaline.

Difference Engine

Charles Babbage, who is also known as the "Father of the Modern Computer," designed the Difference Engine in the early 1820s. The Difference Engine was a mechanical computer capable of performing simple calculations. It was a steam-driven calculating machine designed to compute tables of numbers, such as logarithm tables.

Analytical Engine

In 1830, Charles Babbage designed another calculating machine, the Analytical Engine. The Analytical Engine was a mechanical computer that used punch cards as input. It was capable of solving any mathematical problem and storing information as permanent memory (storage).

Tabulating Machine

Herman Hollerith, an American statistician, invented this machine in 1890. The Tabulating Machine was a mechanical tabulator based on punch cards. It was capable of tabulating statistics and recording or sorting data, and it was used in the 1890 U.S. Census. Hollerith started the Tabulating Machine Company, which later became International Business Machines (IBM) in 1924.

Differential Analyzer

The Differential Analyzer, introduced around 1930 in the United States, was an analog mechanical computer invented by Vannevar Bush. Rather than electronic switching, it used mechanical integrating units to perform its computations, and it was capable of doing 25 calculations in a matter of minutes.

Mark I

In 1937, major changes began in the history of computers when Howard Aiken planned to develop a machine that could perform calculations involving large numbers. In 1944, the Mark I computer was built as a partnership between IBM and Harvard. It was also the first programmable digital computer, marking a new era in the computer world.

Generations of Computers

First Generation Computers

The period from 1940 to 1956 is referred to as the first generation of computers. These machines were slow, huge, and expensive. In this generation, vacuum tubes were used as the basic components of the CPU and memory, and the computers were mainly dependent on batch operating systems and punch cards. Magnetic tape and paper tape were used as input and output devices. Examples include the ENIAC, UNIVAC I, and EDVAC.

Second Generation Computers

The period from 1957 to 1963 is referred to as the second generation of computers. It was the time of transistor computers: transistors, which were cheap, compact, and consumed less power, replaced vacuum tubes, making second-generation computers faster than first-generation ones. Magnetic cores were used for primary memory, and magnetic discs and tapes were used for secondary storage. Assembly language and high-level programming languages such as COBOL and FORTRAN were used, and batch processing and multiprogramming operating systems ran on these computers.

Examples include the IBM 1620, IBM 7094, CDC 1604, and CDC 3600.

Third Generation Computers

In the third generation of computers, integrated circuits (ICs) were used instead of the transistors of the second generation. A single IC contains many transistors, which increased the power of a computer while reducing its cost. Third-generation computers were more reliable, efficient, and smaller in size. They used remote processing, time-sharing, and multiprogramming operating systems, along with high-level programming languages such as FORTRAN (II through IV), COBOL, PASCAL, and PL/I.

Examples include the IBM 360 series, Honeywell 6000 series, and IBM 370/168.

Fourth Generation Computers

The period from 1971 to 1980 was mainly the time of fourth-generation computers, which used VLSI (Very Large Scale Integration) circuits. A VLSI chip contains millions of transistors and other circuit elements, and because of these chips the computers of this generation were more compact, powerful, fast, and affordable. Real-time, time-sharing, and distributed operating systems were used by these computers, and C and C++ were used as programming languages.

Examples include the STAR 1000, PDP-11, CRAY-1, and CRAY X-MP.

Fifth Generation Computers

Computers built from 1980 to the present day are considered fifth generation. They use ULSI (Ultra Large Scale Integration) technology instead of the VLSI technology of the fourth generation, with microprocessor chips containing ten million or more electronic components. Parallel processing hardware and AI (artificial intelligence) software are also used in fifth-generation computers, along with programming languages such as C, C++, Java, and .NET.

Examples include desktops, laptops, notebooks, and ultrabooks.

Sample Questions

Let us now see some sample questions on the History of computers:

Question 1: The Arithmetic Machine or Adding Machine was invented between ___________.

a. 1642 and 1644

b. Around 4000 years ago

c. 1946 – 1956

d. None of the above

Solution:  

a. 1642 and 1644. Explanation: The Pascaline is also called the Arithmetic Machine or Adding Machine. The French mathematician-philosopher Blaise Pascal invented it between 1642 and 1644.

Question 2: Who designed the Difference Engine?

a. Blaise Pascal

b. Gottfried Wilhelm Leibniz 

c. Vannevar Bush

d. Charles Babbage 

Solution: 

d. Charles Babbage. Explanation: Charles Babbage, who is also known as the "Father of the Modern Computer," designed the Difference Engine in the early 1820s.

Question 3: In second-generation computers, _______________ were used as high-level programming languages.

a. C and C++.

b. COBOL and FORTRAN 

c. C and .NET

d. None of the above.

b. COBOL and FORTRAN. Explanation: In second-generation computers, COBOL and FORTRAN were used as high-level programming languages alongside assembly language, and batch processing and multiprogramming operating systems were used.

Question 4: ENIAC and UNIVAC-1 are examples of which generation of computers?

a. First generation of computers.

b. Second generation of computers. 

c. Third generation of computers. 

d. Fourth generation of computers.  

a. First-generation of computers. Explanation: ENIAC, UNIVAC-1, EDVAC, etc. are examples of the first generation of computers.

Question 5: The ______________ technology is used in fifth-generation computers.

a. ULSI (Ultra Large Scale Integration)

b. VLSI (Very Large Scale Integration)

c. vacuum tubes

d. All of the above

a. ULSI (Ultra Large Scale Integration). Explanation: Computers built from 1980 to the present day are fifth generation, and they use ULSI (Ultra Large Scale Integration) technology.


History Of Computers With Timeline [2023 Update]

History Of Computers And Computer Science

It’s important to know the history of computers in order to have a good understanding of the field. Computers are one of the most important inventions in human history. Given how fast technology is evolving, you might not expect the history of computers to go back thousands of years. However, that’s exactly the case. But before we go back that far, let’s first understand what a computer actually is.

The First Computers In History

What Is A Computer?

A computer is simply a machine that follows a set of instructions in order to execute sequences of logical or arithmetic functions. However, when we think of modern computers , we don’t see them as just calculators performing functions. Yet, that’s exactly what they are at their core.

Every time you make a purchase on Amazon or post a picture on Instagram, your computer is executing instructions and processing a massive amount of binary data. However, when we consider the definition of a computer, we realize that the history of computers goes far back.

When Was The First Computer Invented?

The history of computers goes back thousands of years, with the first one being the abacus. In fact, the earliest abacus, referred to as the Sumerian abacus, dates back to roughly 2700 B.C. from the Mesopotamia region. However, Charles Babbage, the English mathematician and inventor, is known as the "Father of Computers." He designed a steam-powered computer known as the Analytical Engine in 1837, which kickstarted modern computer history.

Digital Vs. Analog Computers

The very first computer, the abacus, is a digital computer because it deals in digits. Today’s computers are also digital because they compute everything using binary: 0’s and 1’s. However, most of the computers between the time of the abacus and modern transistor-based computers were in fact analog computers.

Analog computers, rather than calculating single digits, deal with more complex mathematics and functions. Instead of 1s and 0s, analog computers represent values with continuously varying quantities. The earliest analog computer, the Antikythera mechanism, is over 2,000 years old. These ancient computers paved the way for modern transistor-based computers.

Brief History of Computers

The history of computers goes back as far as roughly 2700 B.C. with the abacus. However, the modern history of computers begins with the Analytical Engine, a steam-powered computer designed in 1837 by English mathematician and "Father of Computers" Charles Babbage. Yet, the invention of the transistor in 1947, the integrated circuit in 1958, and the microprocessor in 1971 are what made computers much smaller and faster.

In fact, the first personal computer was invented in 1971, the same year as the microprocessor. The first laptop, the Osborne 1, was created a decade later in 1981. Apple and IBM joined the personal computer industry shortly thereafter, popularizing the home PC. Then, in 1989, the World Wide Web came online, which would eventually serve to connect nearly the whole world.

The 1990s was a booming decade for computer history. IBM produced the first smartphone in 1992 and the first smartwatch was released in 1998. Also, the first-ever quantum computer in history was up and functioning in 1998, if only for a few nanoseconds.

Turn of the Century Computers

The 2000s were the years of social media, with the rise and fall of MySpace at the forefront. Facebook took off shortly after and would become one of the most popular apps on the iPhone, which Steve Jobs first presented in 2007. The iPhone was a pocket-sized computer capable of greater computation than the computer that brought mankind to the Moon. The iPad would be released three years later in 2010.

The 2010s seem to have been the decade of artificial intelligence and quantum computing. Tesla's AI-powered self-driving vehicles have made incredible progress toward full autonomy. An AI robot named Sophia was created in 2016 and even gained citizenship in Saudi Arabia in 2017. The world's first reprogrammable quantum computer was also created in 2016, bringing us closer to quantum supremacy.

Timeline Of Computer History

The First Digital Computer

2700 B.C.: The first digital computer, the abacus, is invented and used around the area of Mesopotamia. Later iterations of the abacus appear in Egypt, Greece, and China, where they're continually used for hundreds of years. The first abaci were likely used for addition and subtraction, which must have been revolutionary for the time, while the following iterations allowed for more complex calculations.

The First Analog Computer

200 B.C.: The first analog computer, the Antikythera mechanism, is created. The Antikythera mechanism was found in a shipwreck off the coast of the Greek island of Antikythera, from which the device received its name. The find baffled scientists because a computer this advanced wasn't thought to have existed that long ago. This mechanical analog computer is believed to have been used to predict astronomical positions, which ancient sailors could use to help determine their position at sea.

Binary Number System

1703: Gottfried Wilhelm Leibniz publishes his description of the binary number system, which is at the heart of modern computing. The binary number system represents values using only two digits, 0 and 1; computers encode numbers, letters, and other characters as sequences of these binary digits. Everything we see on screen and interact with on our computers is converted into binary before the computer can process it. The magic of present-day computers is that they process binary extremely quickly.
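To make this concrete, here is a minimal Python sketch (standard library only) showing how a number and a short piece of text map to the 0s and 1s described above:

```python
# A minimal sketch: numbers and text written out as binary digits.
number = 42
print(bin(number))             # '0b101010'  -> 42 in base 2
print(int("101010", 2))        # 42          -> converted back from binary

text = "Hi"
bits = " ".join(f"{byte:08b}" for byte in text.encode("ascii"))
print(bits)                    # '01001000 01101001' -> the bytes for 'H' and 'i'
```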

First Programmable Loom

1801: Joseph Jacquard creates a punch-card programmable loom which greatly simplified the weaving process. This allowed those with fewer skills to weave more complicated patterns. However, many didn’t like the idea of simplifying and automating the process as it would displace weaving jobs at the time. Yet, technology persisted and the textile industry would eventually change for the better because of it.

First Steam-Driven Computer

1837: Charles Babbage designed the groundbreaking Analytical Engine . The analytical engine was the first major step toward modern computers. Although it was never actually built, its design embodied the major characteristics of modern computers. This included memory, a central processing unit, and the ability for input and output. Charles Babbage is commonly referred to as the “Father of Computers” for his work.

First Computer Algorithm

1843: Ada Lovelace, the daughter of Lord Byron, worked with Charles Babbage on the design of the Analytical Engine, and in the course of that work she developed the first-ever computer algorithm. She carefully considered what computers were capable of when developing her algorithm. The result was a method for computing Bernoulli numbers, a significant mathematical advancement.

First U.S. Census Calculator

1890: Herman Hollerith created a tabulating machine to help calculate the U.S. census. The previous decade’s census took eight years to calculate but with the help of Hollerith’s tabulating machine, it took only six years. With the success of his tabulator, Hollerith then began his own company, the Hollerith Electrical Tabulating System. He applied this same technology to the areas of accounting and inventory.

The Turing Machine

1936: Alan Turing invented the Turing machine, a theoretical model that pushed the limits of what a computer could do at the time. A Turing machine consists of a tape divided into squares, each of which can contain a single symbol (often a binary digit) or nothing at all, together with a head that can read the symbol on each square and change it. This might not sound like much, but computers to this day emulate this functionality of reading simple binary input and computing a logical output. This relatively simple machine enables the computation of any algorithm.

Turing set the standard for computers: a system is regarded as "Turing complete" if it can simulate a Turing machine. Today's computers are Turing complete because they provide the same functionality as Turing machines, though with far greater processing ability.
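Here is a minimal Python sketch of the idea (the transition rules below are an invented example, not from any historical machine): a table of rules tells the head what to write, which way to move, and which state to enter next.

```python
# A minimal sketch of a Turing machine with invented rules: flip every
# bit on the tape, moving right until a blank square is reached.
def run_turing_machine(tape):
    rules = {
        # (state, symbol) -> (symbol to write, head movement, next state)
        ("flip", "0"): ("1", +1, "flip"),
        ("flip", "1"): ("0", +1, "flip"),
        ("flip", " "): (" ", 0, "halt"),
    }
    cells = list(tape) + [" "]           # a blank marks the end of the input
    head, state = 0, "flip"
    while state != "halt":
        symbol = cells[head]
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += move
    return "".join(cells).strip()

print(run_turing_machine("10110"))       # -> 01001
```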

The Complex Number Calculator

1940: George Stibitz created the Complex Number Calculator for Bell Labs. It consisted of relays that could recognize the difference between ‘0’ and ‘1’ and therefore, could use binary as the base number system. The final version of the Complex Number Calculator used more than 400 relays and took about two years to create.

First Automatic Computer

1941: Konrad Zuse, a German computer scientist, invented the Z3 computer. Zuse's Z3 was the first programmable, fully automatic computer in history. It was much larger than the Complex Number Calculator and contained more than 2,500 relays. Since the Z3 didn't demonstrate any advantage to the Germans during World War II, the government didn't provide any funding for it, and it was eventually destroyed in the war.

First Electric Digital Computer

1942: Professor John Vincent Atanasoff and his graduate student Clifford Berry completed the Atanasoff-Berry Computer (ABC). The ABC was the first automatic electric digital computer in history. It contained over 300 vacuum tubes and solved linear equations, but it was not programmable or Turing complete. Nonetheless, the Atanasoff-Berry Computer will forever hold a place in computer history.

First Programmable Electronic Digital Computer

1944: British engineer Tommy Flowers and his assistants completed the code-breaking Colossus, which assisted in decrypting German messages during World War II. It is held to be the first programmable electronic digital computer in history. The Colossus contained more than 1,600 vacuum tubes (thermionic valves) in the prototype and 2,400 in the second version, the Mark 2 Colossus.

First General-Purpose Digital Computer

1945: ENIAC (Electronic Numerical Integrator and Computer) is completed by professors John Mauchly and J. Presper Eckert. ENIAC was absolutely massive, consisting of more than 17,000 vacuum tubes, 70,000 resistors, and 10,000 capacitors, filling a 30′ x 50′ room and weighing around 60,000 pounds. It was the first general-purpose digital computer in history and was extremely capable for a computer of its time. It's said that in its first decade of operation, ENIAC completed more calculations than had been performed in all of history previously.

First Computer Transistor

1947: William Shockley, John Bardeen, and Walter Brattain of Bell Labs invented the first transistor and drastically changed the course of computing history. The transistor replaced the common vacuum tube, which allowed computers to be much more efficient while greatly reducing their size and energy requirements.

First General-Purpose Commercial Computer

1951: Professors John Mauchly and J. Presper Eckert built UNIVAC (Universal Automatic Computer), the first general-purpose commercial computer in history. The early UNIVAC models utilized 5,000 vacuum tubes, but later models in the series adopted transistors. It was a massive computer weighing around 16,000 pounds, yet it could perform more than 1,000 computations per second.

First Computer Programming Language

1954: A team at IBM led by John Backus created the first commercially available general-purpose computer programming language, FORTRAN. FORTRAN stands for Formula Translation and is still used today. When the language first appeared, however, there were bugs and inefficiencies, which led people to speculate about the commercial usability of FORTRAN. Yet the bugs were worked out, and many of the programming languages that came after were inspired by FORTRAN.

First Computer Operating System

1956: The first computer operating system in history, the GM-NAA I/O, was produced by General Motors and North American Aviation. It was created by Robert L. Patrick and allowed for direct input and output, hence the name. It also allowed for batch processing: the ability to execute a new program automatically after the current one finishes.

First Integrated Circuit

1958: Jack Kilby and Robert Noyce create the first integrated circuit , commonly known as a microchip. An integrated circuit consists of electronic circuits mounted onto a semiconductor. The most common semiconductor medium is silicon, which is where the name ‘ Silicon Valley ‘ comes from. If not for the integrated circuit, computers would still be the size of a refrigerator, rather than the size of a credit card.

First Supercomputer

1964: History’s first supercomputer, known as the CDC 6600 , was developed by Control Data Corp. It consisted of 400,000 transistors, 100 miles of wiring, and used Freon for internal cooling. Thus, the CDC 6600 was able to reach a processing speed of up to 3 million floating-point operations per second (3 megaFLOPS). Amazingly, this supercomputer was ten times faster than the fastest computer at the time and cost a whopping $8 million.

First Computer Mouse

1964: Douglas Engelbart invented the first computer mouse in history but it wouldn’t accompany the first Apple Macintosh until 1984. The computer mouse allowed for additional control of the computer in conjunction with the keyboard. These two input devices have been the primary source of user input ever since. However, voice commands from present-day smart devices are increasingly becoming the norm.

First Wide Area Computer Network

1969: DARPA created the first wide area network in the history of computers, ARPAnet, which was a precursor to the internet. It allowed distant computers to connect and exchange data in nearly real time. The term "internet" wouldn't come around until 1973, when computers in Norway and England connected to ARPAnet. Although the internet has continued to advance through the decades, many of the same protocols from ARPAnet are still standards today.

First Personal Computer

1971: The first personal computer in history, the Kenbak-1 , is created by John Blankenbaker, and sold for only $750. However, only around 40 of these computers were ever sold. As small as it was, it was able to execute hundreds of calculations in a single second. Blankenbaker had the idea for the personal computer for more than two decades before completing his first one.

First Computer Microprocessor

1971: Intel releases the first microprocessor in the history of computers, the Intel 4004. This tiny microprocessor had roughly the same computing power as the ENIAC, a computer that had filled an entire room. Even by today's standards, the Intel 4004 is a small microprocessor, produced on 2-inch wafers as opposed to today's 12-inch wafers. That said, the initial model had only 2,300 transistors, while it's not uncommon for today's microprocessors to have several hundred million transistors.

First Apple Computer

1976: Apple takes the stage and releases its first computer, the Apple-1. The Apple-1 was different from other computers at the time: it came as a fully assembled single circuit board. It sold for $666.66 and had only 4 KB of memory, which sounds laughable by today's standards but was plenty for the applications of the time.

First IBM Personal Computer

1981: IBM launches its first personal computer, the IBM Model 5150. It took only a year to develop and cost roughly $1,600, a steep drop from earlier IBM computers that had sold for several million dollars. The Model 5150 shipped with as little as 16 KB of RAM, expandable to a maximum of 640 KB.

First Laptop Computer

1981: The first laptop, the Osborne 1, was released by the Osborne Computer Corporation. It had a tiny 5-inch display, a bulky fold-out keyboard, 64 KB of main memory, and weighed 24 pounds. Despite its bulk, the Osborne 1 was very popular, selling more than 125,000 units in 1982 alone at a going rate of $1,795.

First Windows Operating System

1985: Microsoft released the first version of its Windows operating system, Windows 1.0. What made Windows 1.0 remarkable was its reliance on the computer mouse, which was not yet a standard input device; it even included a game, Reversi, to help users get accustomed to the mouse. Love it or hate it, Windows and its subsequent versions have been commonplace on computers ever since. Development of the original Windows OS was led by none other than Bill Gates himself.

World Wide Web Is Created

1989: The World Wide Web is created by Sir Tim Berners-Lee at CERN. It was not originally intended to grow into a massive platform connecting the average person; rather, it was designed to make it easy for scientists and universities to share information. Fittingly, the first website was simply a guide to using the World Wide Web.

First Flash-Based Solid State Drive

1991: The first flash-based solid-state drive was created by SanDisk (then called SunDisk). These drives offered an alternative to hard drives and would prove very useful in computers, cell phones, and similar devices. This first flash-based SSD held 20 MB and sold for approximately $1,000.

First Smartphone Is Created

1992: IBM created the first smartphone, the IBM Simon, which was released two years later in 1994. It was a far cry from the smartphones we are used to today, but at the time the Simon was a game-changer. It sold for $1,100 at launch and featured a touchscreen and several applications, including mail, a calendar, and a to-do list.

First Platform Independent Language

1995: Sun Microsystems releases the first version of the Java programming language. Java was among the first widely used platform-independent languages, popularizing the phrase "Write once, run anywhere." Unlike most programming languages of the time, a compiled Java program could run on any device with a Java Virtual Machine (JVM).

First Smartwatch Is Released

1998: The first smartwatch, the Ruputer, was released by the watch company Seiko. Apart from its smaller display and styling, the original Ruputer does not look much different from present-day smartwatches. As it was not a touchscreen, a small joystick was used to navigate the watch's various features.

First Quantum Computer

1998: After decades of theory, the first working quantum computer is built by a team of computer scientists. It had only 2 qubits, compared with the 16-qubit reprogrammable quantum computers of recent years. This first quantum computer did not solve any significant problem and ran only briefly, but it was a proof of concept that paved the way for today's quantum computers.
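
To give a rough sense of why qubit counts matter, here is a small illustrative Python aside (not part of the original timeline): describing an n-qubit state classically requires on the order of 2^n complex amplitudes, which is why a 2-qubit machine is a proof of concept while machines with dozens or hundreds of qubits quickly become impossible to simulate exactly.

```python
# Illustrative only: the classical cost of describing an n-qubit quantum state.
# An n-qubit state is a vector of 2**n complex amplitudes; at 16 bytes per
# complex number, the memory needed to store it grows exponentially.
for n in (2, 16, 127):
    amplitudes = 2 ** n
    print(f"{n:>3} qubits -> {amplitudes:,} amplitudes "
          f"(~{amplitudes * 16:,} bytes to store classically)")
```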

First USB Flash Drive

2000: The first USB flash drive, the ThumbDrive, is released by Trek, a company out of Singapore. Other flash drives hit the market almost immediately after, such as the DiskOnKey sold by IBM, which led to some debate over who was actually first. However, Trek's 1999 patent application and the fact that its ThumbDrive reached the market first settled the debate fairly quickly.

DARPA Centibots Project

2002: DARPA launched the Centibots project, which developed 100 identical robots that could work together and communicate with one another. The Centibots could survey an area and build a map of it in real time, identify objects, including people and their companion robots, and distinguish between the two. The maps they produced were remarkably accurate. In total, the Centibots project cost around $2.2 million.

MySpace Comes And Goes

2004: MySpace gained over 1 million users within the first month of its official launch and 22 million a year later. In 2005 it was purchased by News Corp for $580 million, but the years after the sale brought one scandal after another. MySpace helped popularize social media, with Facebook trailing right behind it and passing MySpace in users in 2008. MySpace eventually laid off about 50% of its workforce in 2011.

Arduino Is Released

2005: Italian designer Massimo Banzi and his collaborators released the Arduino, a credit card-sized development board. The Arduino was intended to help design students who had no previous exposure to programming and electronics, but it eventually became a beloved tool for tech hobbyists worldwide. To this day, Arduino boards are an increasingly common part of electronics education, including self-education.

iPhone Generation-1 Released

2007: Apple, led by Steve Jobs, released the first iPhone, revolutionizing the smartphone industry. The screen was 50% bigger than those of popular smartphones of the time, such as the beloved BlackBerry and Treo, and the battery lasted much longer. The iPhone also normalized web browsing and video playback on phones, setting a new standard across the industry. The price was what you would now expect from an iPhone: around $600, more than twice as much as its competitors.

Apple’s iPad Is Released

2010: Only three years after the iPhone's debut, Steve Jobs announces the iPad, Apple's first tablet computer. It came with a 9.7-inch touchscreen and a choice of 16, 32, or 64 GB of storage. The beauty of the iPad was that it was essentially a large iPhone: it ran the same iOS and offered much of the same functionality. The original iPad started at $499, with the 64 GB Wi-Fi + 3G version selling for $829.

Nest Thermostat Is Released

2011: The Nest Thermostat, a smart thermostat created by Nest Labs, is released as a growing number of household devices begin to make up the "internet of things." When it first appeared, the Nest Thermostat not only made thermostats smart, it made them attractive: for $250 you could buy a thermostat that lowered your energy bill, improved the look of your home, and could be controlled from your phone.

First Raspberry Pi Computer

2012: The first Raspberry Pi computer is released, opening up a world of possibilities for creative coders. These small yet capable computers cost around $25 to $35 at launch and were roughly the size of a credit card. Raspberry Pis were similar to the Arduino in size but differed greatly in capability: a Raspberry Pi is many times faster than an Arduino board and has vastly more memory.

Tesla Introduces Autopilot

2014: Tesla, led by Elon Musk, introduces Autopilot, the first self-driving features in its fleet of automobiles, bringing the prospect of cars chauffeuring their passengers without driver input within sight. The first version of Autopilot relied not only on cameras but also on radar and ultrasonic sensors to detect the car's surroundings, and it included a self-parking feature and even a Summon feature that calls the vehicle to you. The computers and technology within Tesla vehicles have essentially turned them into advanced personal transportation robots.

Sophia The Robot Is Created

2016: Sophia, the artificially intelligent humanoid robot, was created by former Disney Imagineer David Hanson. A year after her creation, Sophia was granted citizenship in Saudi Arabia, becoming the first robot in history to receive citizenship. Since then, Sophia has taken part in many interviews and even debates. She's quite a wonder to watch!

First Reprogrammable Quantum Computer

2016: Quantum computers have made considerable progress, and the first reprogrammable quantum computer is completed. It is built from five individual atoms that act as qubits, with laser pulses controlling their states. This leap brought researchers a step closer to quantum supremacy.

First Brain-Computer Interface

2019: Elon Musk announces progress on Neuralink's brain-machine interface, which aims to give humans some of the information-processing abilities of computers by linking the brain to artificial intelligence. In the announcement, Neuralink revealed that it had already tested its technology on laboratory animals.

Tesla Nears Fully Autonomous Vehicles

2020: In July, Elon Musk declared that a Tesla Autopilot update coming later that year would bring the company's vehicles one step closer to complete "level-5" autonomy, in which passengers reach their destination without any human intervention. The long-awaited software update was expected to increase the company's value massively, and Musk's net worth along with it.

Elon Musk announces Tesla Bot, becomes Time’s Person of the Year

2021: Musk continues to innovate, announcing in August that Tesla is developing a near-life-size humanoid robot. Many were skeptical of the robot's viability, while others argued it was yet another invention that science fiction has warned against, much like the brain-computer interface.

Regardless of any opinions, Musk still had a stellar year: Starship made progress, Tesla sales rose, and Musk earned the title of Time's Person of the Year, all while becoming the wealthiest person on the planet with a net worth exceeding $250 billion.

Facebook changes name to Meta, Zuck announces Metaverse

2021: In October, Mark Zuckerberg made the bold, controversial, yet possibly visionary announcement that Facebook would change its name to Meta. He also described the immersive virtual reality world the company is building on top of its existing social network, dubbed the Metaverse. Zuckerberg explained that most of the technology for the immersive experience he envisions already exists, but that mainstream adoption is still 5 to 10 years out. When that time comes, your imagination will be your only limitation within the confines of the Metaverse.

IBM’s “Eagle” Quantum Computer Chip (127 Qubits)

2021: IBM continues to lead the charge in quantum computer development, and in November it showcased its new "Eagle" chip. At 127 qubits, it was the most advanced quantum chip of its time and the first to pass 100 qubits. IBM planned to follow it in 2022 with a chip packing more than three times as many qubits.

OpenAI Releases DALL-E 2

2022: DALL-E, developed by OpenAI, is capable of generating high-quality images from textual descriptions, using deep learning techniques and training on a very large collection of images. DALL-E 2, launched in April 2022, is the second generation of this image-generation model; trained on hundreds of millions of images, it can produce strikingly high-quality pictures in a matter of seconds.

IBM’s “Osprey” Quantum Computer Chip (433 Qubits)

2022: In only a year, IBM more than tripled the qubit count of its previous "Eagle" chip. The new 433-qubit "Osprey" quantum chip, announced in November, greatly surpasses its predecessor and represents a major advance in quantum computing technology, paving the way for even more powerful quantum computers.

ChatGPT Released Upon The World

2022: ChatGPT launches on November 30 and takes the world by storm, amassing over 1 million users in only 5 days! At launch, ChatGPT was powered by the GPT-3.5 family of models. Its release was a significant milestone in natural language processing, representing a major improvement in the ability of machines to understand and generate human language.

The Much Hyped GPT4 Is Finally Released

2023: ChatGPT and many other AI applications run on GPT models. As powerful as GPT-3.5 was, GPT-4 is trained on a much larger dataset, is far more accurate, and is better at understanding the intent behind users' prompts. It is a giant leap forward from the previous version, prompting both excitement and concern from the general public as well as from tech figures such as Elon Musk, who has called for slowing the advancement of AI.

Tim Statler

Tim Statler is a Computer Science student at Governors State University and the creator of Comp Sci Central. He lives in Crete, IL with his wife, Stefanie, and their cats, Beyoncé and Monte. When he's not studying or writing for Comp Sci Central, he's probably just hanging out or making some delicious food.


History Of Computers: Timeline, I/O Devices and Networking


Can you imagine your life without a computer?

Think about all of the things you wouldn't be able to do: send an email, shop online, or find an answer to a question instantly. And that's just the tip of the iceberg.

We’ve come a long way from the very first computer and even the first smartphone. But how much do you really know about their history and evolution? From floppy discs to artificial intelligence (AI) software , the Acorn to the Macintosh, let’s explore how far we’ve come.

While today we use computers for both work and play, they were actually created for an entirely different purpose.

In 1880, the population of the United States had grown so large that it took seven years to tabulate the results of the U.S. Census. So the government looked for a faster way to get the job done, which is why punch-card tabulating machines, which took up entire rooms, were invented. Mechanical calculation has even older roots: Blaise Pascal's Pascaline, a 17th-century mechanical calculator, preceded Charles Babbage's Difference Engine by nearly two centuries. From punch cards to artificial intelligence, we have come a long way.

While that’s how the story starts, it’s certainly not where it ends. Let’s explore the history of computers.

History of Computers: 1600s to Early 1990s

In the 1600s, the idea of a calculating device began to take hold. At the time, "computer" referred to anyone or anything that performed mathematical and statistical calculations, using decimal arithmetic, statistical methods, and logarithmic tables. Women were often tasked with such calculations and computations.

1617: John Napier invented Napier's Bones, a manual calculating apparatus. It was a handheld device of numbered rods used to aid multiplication and division, and it was also known as Napier's Rods.

1642: Blaise Pascal, a French mathematician, invented the Pascaline, widely regarded as the world's first automatic mechanical calculator.

1673: Gottfried Wilhelm Leibniz invented the Stepped Reckoner, or Leibniz wheel, an improved version of the Pascaline. This mechanical calculator could perform basic arithmetic and was built around stepped drums.


1801: In France, weaver and merchant Joseph Marie Jacquard creates a loom that uses punched wooden cards to automate the design of woven fabrics. Early computers would use similar punch cards.

1822: With funding from the British government, mathematician Charles Babbage begins designing the Difference Engine, a steam-driven calculating machine intended to compute tables of numbers, from basic mathematical tables to astronomical ones.

1843: Lady Ada Lovelace publishes what is considered the first computer program, a sequence of instructions for Babbage's Analytical Engine.

1890: Inventor Herman Hollerith designs a punch-card system to tabulate the 1890 U.S. census. It took him three years to create, and it saved the government $5 million. He would eventually go on to establish the company that became IBM.

Hollerith Machine

A census clerk tabulates data using the Hollerith machine. Source: Census.gov

1936: Alan Turing developed an idea for a universal machine, which he would call the Turing machine, that would be able to compute anything that is computable. The concept of modern computers was based on his idea.

1937: A professor of physics and mathematics at Iowa State University, J.V. Atanasoff, attempts to build the first computer without cams, belts, gears, or shafts.

1939: Bill Hewlett and David Packard found Hewlett-Packard in a garage in Palo Alto, California. Their first project, the HP 200A Audio Oscillator, would rapidly become a popular piece of test equipment for engineers.

In fact, Walt Disney Pictures would order eight to test recording equipment and speaker systems for 12 specially equipped theaters that showed Fantasia in 1940.

David Packard and Bill Hewlett

David Packard and Bill Hewlett in 1964. Source: PA Daily Post

Also in 1939, Bell Telephone Laboratories completed The Complex Number Calculator, designed by George Stibitz.

1941: J.V. Atanasoff, a professor of physics and mathematics at Iowa State University, and graduate student Clifford Berry design a computer that can solve 29 equations simultaneously. This is the first time a computer is able to store data in its own memory.

That same year, German engineer Konrad Zuse created the Z3 computer, which used 2,300 relays, performed floating-point binary arithmetic, and had a 22-bit word length. This computer was eventually destroyed in a bombing raid in Berlin in 1943.

Additionally, in 1941, Alan Turing and Harold Keen built the British Bombe, which decrypted Nazi ENIGMA-based military communications during World War II.

1943: John Mauchly and J. Presper Eckert of the University of Pennsylvania begin building the Electronic Numerical Integrator and Calculator (ENIAC). Considered the grandfather of digital computers, it was made up of 18,000 vacuum tubes and filled a 20-foot by 40-foot room.

1944: J. Presper Eckert and John Mauchly begin designing the Electronic Discrete Variable Automatic Computer (EDVAC), the successor to ENIAC and one of the earliest electronic computers designed to store both programs and data in memory.

1951: UNIVAC, the first commercially produced digital computer, is launched as a general-purpose machine for U.S. business and government use.

Tip: A vacuum tube was a device that controlled the flow of electric current.

ENIAC Technician

Source: Science Photo Library

Also in 1943, the U.S. Army asked Bell Laboratories to design a machine to assist in testing their M-9 gun director, a type of computer that aimed large guns at their targets. George Stibitz recommended a relay-based calculator for the project, which resulted in the Relay Interpolator, later known as the Bell Labs Model II.

1944: British engineer Tommy Flowers designed the Colossus, created to break the complex codes used by the Nazis in World War II. A total of ten were delivered, each using roughly 2,500 vacuum tubes. These machines cut the time needed to break German messages from weeks to hours, and historians believe they greatly shortened the war by revealing the enemy's intentions and plans.

That same year, Harvard physics professor Howard Aiken designed and built the Harvard Mark 1, a room-sized, relay-based calculator.

1945: Mathematician John von Neumann writes the First Draft of a Report on the EDVAC, a paper that laid out the architecture of a stored-program computer.

1946: Mauchly and Eckert left the University of Pennsylvania and obtained funding from the Census Bureau to build the Universal Automatic Computer (UNIVAC). This would become the first commercial computer for business and government use.

That same year, Will F. Jenkins published the science fiction short story A Logic Named Joe , which detailed a world where computers, called Logics, interconnect into a worldwide network. When a Logic malfunctions, it gives out secret information about forbidden topics.

1947: Walter Brattain, William Shockley, and John Bardeen of Bell Laboratories invented the transistor, a way to make an electric switch out of solid materials rather than vacuum tubes.

1947: Kathleen Booth devised assembly language, a human-readable notation that an assembler translates into machine language, bridging the gap between software and hardware.

1948: Frederick Williams, Geoff Toothill, and Tom Kilburn, researchers at the University of Manchester, developed the Small-Scale Experimental Machine. It was built to test new memory technology that became the first high-speed electronic random-access memory for computers, and it ran the first program ever executed on a digital, electronic, stored-program computer.

1950: The Standards Eastern Automatic Computer (SEAC) was built in Washington, DC, becoming the first stored-program computer completed in the United States. It served as a test bed for evaluating components and systems, in addition to setting computer standards.

1953: Computer scientist Grace Hopper develops early English-like programming languages; her work leads to the Common Business-Oriented Language (COBOL), which allowed a computer user to give instructions in English-like words instead of numbers. In 1997, a study estimated that over 200 billion lines of COBOL code were still in existence.


That same year, under Thomas Johnson Watson Jr., IBM created the IBM 701 EDPM, which was used to help the United Nations keep tabs on Korea during the war.

1954: The Formula Translation (FORTRAN) programming language is developed by John Backus and a team of programmers at IBM.

Additionally, IBM created the 650, which was the first mass-produced computer, selling 450 in just one year.

1958: Jack Kilby and Robert Noyce invented the integrated circuit, which is what we now call the computer chip. Kilby was awarded the Nobel Prize in Physics in 2000 for his work.

1962: IBM announces the 1311 Disk Storage Drive, the first disk drive made with a removable disk pack. Each pack weighed 10 pounds, held six disks, and had a capacity of 2 million characters.

Also in 1962, the Atlas computer made its debut, thanks to Manchester University, Ferranti Computers, and Plessy. At the time, it was the fastest computer in the world and introduced the idea of “virtual memory”.

1963: The American Standard Code for Information Interchange (ASCII) is adopted as a standard character encoding for computers. ASCII assigns each letter, digit, and symbol a 7-bit numeric code, usually stored in an 8-bit byte.
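
As a quick modern illustration of what a character encoding like ASCII does (this snippet is an aside, not part of the 1963 standard), each character simply maps to a small integer code:

```python
# Illustrative only: ASCII maps each character to an integer in the range 0-127
# (7 bits), conventionally stored in an 8-bit byte.
for ch in "Hi!":
    code = ord(ch)                        # character -> numeric code
    print(ch, code, format(code, "07b"))  # e.g. H 72 1001000

print("Hi!".encode("ascii"))              # the same text as raw ASCII bytes: b'Hi!'
```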

1964: Douglas Engelbart introduces a prototype for the modern computer that includes a mouse and a graphical user interface (GUI). This begins the evolution of the computer from a tool exclusively for scientists and mathematicians to one accessible to the general public.

Additionally, IBM introduced SABRE, the airline reservation system it built with American Airlines. The system officially launched four years later, using telephone lines to link 2,000 terminals in 65 cities and delivering data on any flight in under three seconds; the company behind it later came to own Travelocity.

1968: Stanley Kubrick's 2001: A Space Odyssey hits theaters. This cult classic tells the story of the HAL 9000 computer as it malfunctions during a spaceship's trip to Jupiter to investigate a mysterious signal. The HAL 9000, which controlled the ship, went rogue, killed the crew, and had to be shut down by the only surviving crew member. The film depicted voice and visual recognition, human-computer interaction, speech synthesis, and other advanced technologies.

1968: Japanese company OKI launched dot matrix printers that supported a 128-character set with a 7x5 print matrix.

1969: Developers at Bell Labs unveil UNIX, an operating system that was later rewritten in the C programming language and addressed compatibility issues across different machines.

UNIX at Bell Labs

Source: Nokia Bell Labs

1970: Intel introduces the world to the Intel 1103, the first Dynamic Random-Access Memory (DRAM) chip.

1970: Pascal, a high-level, procedure-oriented language, was created by Niklaus Wirth in Switzerland to teach structured programming using loops and conditionals.

1971: Alan Shugart and a team of IBM engineers invented the floppy disk, allowing data to be shared among computers.

That same year, Xerox introduced the world to the first laser printer, which not only generated billions of dollars but also launched a new era in computer printing.

Also, email begins to grow in popularity as it expands to computer networks.

1973: Robert Metcalfe, a researcher at Xerox, develops Ethernet, a way of connecting multiple computers and other hardware.

1974: Personal computers are officially on the market! Early models included the Scelbi, the Mark-8, the Altair, the IBM 5100, and Radio Shack's TRS-80.

1975: In January, Popular Electronics magazine featured the Altair 8800 as the world's first minicomputer kit. Paul Allen and Bill Gates offered to write software for the Altair using the BASIC language. The software was a success, and later that year the pair founded their own software company, Microsoft.

1976: Steve Jobs and Steve Wozniak start Apple Computer and introduce the world to the Apple I, the first computer sold as a fully assembled single circuit board.

first apple computer

Source: MacRumors

Also, in 1976, Queen Elizabeth II sent out her first email from the Royal Signals and Radar Establishment to demonstrate networking technology.

Queen Elizabeth II Sends Emails

Source: Wired

1977: Jobs and Wozniak unveiled the Apple II at the first West Coast Computer Faire. It boasted color graphics and an audio cassette drive for storage. Millions were sold between 1977 and 1993, making it one of the longest-lived lines of personal computers.

1978: The first computers were installed in the White House during the Carter administration. The White House staff was given terminals to access the shared Hewlett-Packard HP3000.

Also, the first computerized spreadsheet program, VisiCalc, is introduced.

Additionally, LaserDisc is introduced by MCA and Philips. The first title sold in North America was the movie Jaws.

1979: MicroPro International unveils WordStar, a word processing program.

WordStar certainly wasn't the last of its kind, as there is a plethora of other document creation software on the market today.

1981: Not to be outdone by Apple, IBM releases its first personal computer, code-named "Acorn," with an Intel chip, two floppy disk drives, and an optional color monitor.

IBM Acorn

Source: Florida History Network

1982: Instead of going with its annual tradition of naming a "Man of the Year," Time Magazine does something a little different and names the computer its "Machine of the Year." A senior writer noted in the article, "Computers were once regarded as distant, ominous abstractions, like Big Brother. In 1982, they truly became personalized, brought down to scale, so that people could hold, prod and play with them."

Time Magazine Machine of the Year

Source: Time

1983: The CD-ROM hit the market, able to hold 550 megabytes of pre-recorded data. That same year, many computer companies worked to set a standard for these disks, making them able to be used freely to access a wide variety of information.

Later that year, Microsoft introduced Word, which was originally called Multi-Tool Word.

1984: Apple launches the Macintosh, which was introduced during a Super Bowl XVIII commercial. The Macintosh was the first successful mouse-driven computer with a graphical user interface. It sold for $2,500.

1985: Microsoft announces Windows, which allowed for multi-tasking with a graphical user interface.

That same year, a small Massachusetts computer manufacturer registered the first dot com domain name, Symbolics.com.

Also, the programming language C++ is published and is said to make programming “more enjoyable” for the serious programmer.

1986: Pixar has its beginnings as the computer graphics division of Lucasfilm, which created computer-animated sequences for films such as Star Trek II: The Wrath of Khan. Steve Jobs purchased the division in 1986 for $10 million and renamed it Pixar; it was bought by Disney in 2006.

1990: English programmer and physicist Tim Berners-Lee develops HyperText Markup Language, also known as HTML. He also built the first browser, which he called WorldWideWeb. His system featured a server, HTML, URLs, and that first browser.

1991: Apple released the PowerBook series of laptops, which included a built-in trackball, internal floppy disk drive, and palm rests. The line was discontinued in 2006.

1993: In an attempt to enter the handheld computer market, Apple releases the Newton. Marketed as a "personal digital assistant," it never performed the way Apple CEO John Sculley had hoped, and it was discontinued in 1998.

Also that year, Steven Spielberg’s Jurassic Park hits theaters, showcasing cutting-edge computer animation in addition to animatronics and puppetry.

History of Computers: 1995 to Present

1995: IBM released the ThinkPad 701C, whose expanding full-sized "butterfly" keyboard (officially called the TrackWrite) was made up of three interlocking pieces.

Additionally, the format for the Digital Video Disc (DVD) is introduced, featuring a huge increase in storage space over the compact disc (CD).

Also that year, Microsoft’s Windows 95 operating system was launched. To spread the word, a $300 million promotional campaign was rolled out, featuring TV commercials that used “Start Me Up” by the Rolling Stones and a 30-minute video starring Matthew Perry and Jennifer Aniston. It was installed on more computers than any other operating system.

And in the world of code, Java 1.0 is introduced by Sun Microsystems, followed by JavaScript at Netscape Communications.

1996: Sergey Brin and Larry Page develop the Google search engine at Stanford University.

Sergey Brin and Larry Page

Source: CNBC

That same year, Palm Inc., founded by Ed Colligan, Donna Dubinsky, and Jeff Hawkins, created the personal digital assistant called the PalmPilot.

Also in 1996 came the introduction of the Sony VAIO series. These desktop computers featured a 3D interface on top of the Windows 95 operating system as a way to attract new users. The line was discontinued in 2014.

1997: Microsoft invests $150 million into Apple, which settled Apple's lawsuit claiming that Microsoft had copied the "look and feel" of its operating system.

1998: Apple releases the iMac, a range of all-in-one Macintosh desktop computers. Selling for $1,300, these computers included a 4 GB hard drive, 32 MB of RAM, a CD-ROM drive, and a 15-inch monitor.

Apple's iMac Computers

Source: Start Ups Venture Capital

1998: Isaac Chuang of Los Alamos National Laboratory, Neil Gershenfeld of the Massachusetts Institute of Technology (MIT), and Mark Kubinec of the University of California demonstrated the first 2-qubit quantum computer, which could be loaded with data and produce a solution.

1999: The term Wi-Fi becomes part of the computing language as users begin connecting without wires. Without missing a beat, Apple creates its "AirPort" Wi-Fi router and builds connectivity into Macs.

2000: In Japan, the carrier J-Phone (later part of SoftBank) introduced the first camera phone, the J-SH04, built by Sharp. The camera had a maximum resolution of 0.11 megapixels, a 256-color display, and photos could be shared wirelessly. It was such a hit that a flip-phone version was released just a month later.

Also, in 2000, the USB flash drive is introduced. Used for data storage, they were faster and had a greater amount of storage space than other storage media options. Plus, they couldn’t be scratched like CDs.

2001: Apple introduces the Mac OS X operating system. Not to be outdone, Microsoft unveiled Windows XP soon after.

Also, the first Apple stores open in Tysons Corner, Virginia, and Glendale, California. Apple also released iTunes, which allowed users to import music from their CDs, mix it with other songs, and burn custom CDs.

2003: Apple releases the iTunes Music Store, giving users the ability to purchase songs within the program. In less than a week after its debut, over 1 million songs were downloaded.

Also in 2003, the Blu-ray optical disc is introduced as the successor to the DVD.

And who can forget the popular social networking site MySpace, which was founded in 2003. By 2006, it had more than 100 million users.

2004: A new challenger to Microsoft's Internet Explorer arrived in the form of Mozilla's Firefox 1.0. That same year, Facebook launched as a social networking site.

Original Facebook Homepage

Source: Business Insider

2005: YouTube, the popular video-sharing service, is founded by Jawed Karim, Steve Chen, and Chad Hurley. Later that year, Google acquired the mobile phone operating system Android.

2006: Apple unveiled the MacBook Pro, making it their first Intel-based, dual-core mobile computer.

That same year at the World Economic Forum in Davos, Switzerland, the United Nations Development Program announced they were creating a program to deliver technology and resources to schools in under-developed countries. The project became the One Laptop per Child Consortium, which was founded by Nicholas Negroponte, the founder of MIT's Media Lab. By 2011, over 2.4 million laptops had been shipped.

And we can't forget to mention the launch of Amazon Web Services, including Amazon Elastic Compute Cloud (EC2) and Amazon Simple Storage Service (S3). EC2 made it possible for users to use the cloud to scale server capacity quickly and efficiently. S3 was a cloud-based file hosting service that charged users monthly for the amount of data they stored.

2007: Apple released the first iPhone, bringing many computer functions to the palm of our hands. It featured a combination of a web browser, a music player, and a cell phone -- all in one. Users could also download additional functionality in the form of "apps". The full-touchscreen smartphone allowed for GPS navigation, texting, a built-in calendar, a built-in camera, and weather reports.

Steve Jobs with Original iPhone

Also, in 2007, Amazon released the Kindle, one of the first electronic reading systems to gain a large following among consumers.

And Dropbox was founded by Arash Ferdowsi and Drew Houston as a way for users to have convenient storage and access to their files on a cloud-based service.

2008: Apple releases the MacBook Air, a thin and lightweight ultraportable laptop with a high-capacity battery. To shrink its size, Apple offered a solid-state drive in place of the traditional hard drive, making it one of the first mass-marketed computers to do so.

2009: Microsoft launched Windows 7.

2010: Apple released the iPad, officially breaking into the dormant tablet computer category. The new gadget came with many features the iPhone had, plus a 9.7-inch screen and minus the phone.

2011: Google releases the Chromebook, a laptop that runs Google Chrome OS.

Also in 2011, the Nest Learning Thermostat emerges as one of the first Internet of Things devices, allowing a user's home thermostat to be accessed remotely from a smartphone or tablet. It also sent monthly power consumption reports to help customers save on energy bills.

In Apple news, co-founder Steve Jobs passed away on October 5. The company also announced that the iPhone 4S would feature Siri, a voice-activated personal assistant.

2012: On October 4, Facebook hits 1 billion users; earlier that year, it acquired the image-sharing social networking application Instagram.

Also in 2012, the Raspberry Pi, a credit-card-sized single-board computer, is released, weighing only 45 grams.

2014: The University of Michigan Micro Mote (M3), at the time the smallest computer in the world, is created. Three types were made available: two that measured either temperature or pressure, and one that could take images.

Additionally, the Apple Pay mobile payment system is introduced.

2015: Apple releases the Apple Watch, which ran watchOS (derived from Apple's iOS) and incorporated sensors for environmental and health monitoring. Almost a million units were sold on the day of its release.

This release was followed closely by Microsoft announcing Windows 10.

2016: The first reprogrammable quantum computer is created.

2019 : Apple announces iPadOS, the iPad's very own operating system, to support the device better as it becomes more like a computer and less like a mobile device. 

2022: Frontier becomes the first exascale supercomputer, surpassing one exaFLOPS. Developed by HPE with AMD EPYC CPUs and AMD Instinct GPUs, it cost about $600 million and is housed at the Oak Ridge Leadership Computing Facility (OLCF) in Tennessee, where it supports scientific research.

2023: Microsoft releases a ChatGPT-powered Bing, offering a generative search experience that answers users' queries conversationally.

2023: OpenAI launched GPT-4 on March 14, 2023, making it available through the ChatGPT Plus subscription.

2023: The "AI PC" arrived in December 2023 with the launch of Intel's Core Ultra processors.

So, what’s next?

I don't have the answer to what awaits us in regard to computers. One thing is for sure -- in order to keep up with the world of tech, the growing need for cybersecurity and data security, and our constant need for the next big thing, computers aren't going anywhere. If anything, they're only going to become a bigger part of our daily lives.


Mara Calvello

Mara Calvello is a Content Marketing Manager at G2. She received her Bachelor of Arts degree from Elmhurst College (now Elmhurst University). Mara works on our G2 Tea newsletter, while also writing content to support categories on artificial intelligence, natural language understanding (NLU), AI code generation, synthetic data, and more. In her spare time, she's out exploring with her rescue dog Zeke or enjoying a good book.


The History of Computing: Assignments

(From an MIT OpenCourseWare course taught by Dr. Slava Gerovitch in MIT's Science, Technology, and Society program.)

This section includes reading response paper assignments in the unstructured and structured formats and a final paper assignment.

Weekly Questions

Reading Response Paper Assignment (Sessions 2-6: The Unstructured Format)

Write a 1-2 page reading response paper addressing the issues raised in the readings. You may choose from the provided list of tentative questions, but you are encouraged to raise your own questions. Your paper must touch upon all the readings assigned for the upcoming session.

Strategies for Writing a Good Reading Response Paper

  • Define your personal stance towards the issues raised in the readings.
  • Avoid generalities, be specific.
  • Focus on the points where you disagree, or where you can push the argument further.
  • Cite examples from your personal experience or from other literature.
  • Ask provocative questions, even if you do not know the answers.

Your paper will be made accessible to other members of the class after the deadline. It will be part of discussion in class.

Papers must be submitted in the morning before each class. No late papers are accepted.

Be creative and imaginative! Good luck!

Session 2: Issues in the History of Computing. Student sample: Daniel Roy.
Session 3: Computers in Nuclear Physics: ENIAC and the Hydrogen Bomb. Student samples: Anthony Grue; Steven Stern (courtesy of Steven Stern, used with permission).
Session 4: Computers in Meteorology: Simulating the World. Student samples: Jason Ruchelsman (courtesy of Jason Ruchelsman, used with permission); Katherine A. Franco (courtesy of Katherine Franco, used with permission).
Session 5: Computers in Mathematics: The Logic Theorist and the Automation of Proof. Student samples: Joshua Tauber; Patrick Griffin (courtesy of Patrick Griffin, used with permission).
Session 6: Computers in Cognitive Psychology: GPS and Psychological Theory. Student samples: Aaron Bell (courtesy of Aaron Bell, used with permission); Steven Stern (courtesy of Steven Stern, used with permission).

Reading Response Paper Assignment (Sessions 7-13: The Structured Format)

Write a 1-2 page structured paper in response to your readings. The paper must focus on a single question; you may choose from the provided list, but you are encouraged to formulate your own question. Your paper must have the following format:

  • Introduction: State your question; explain its significance; formulate your thesis.
  • Background: Briefly give relevant historical information about the computing developments that you will analyze.
  • Survey of literature: State the existing perspectives (more than one) on the subject of your analysis; these can be gauged from your readings or simply hypothesized (one could argue that…).
  • Analysis: Give your own perspective and supporting argument.
  • Conclusion: What is the lesson here? What are further lines of inquiry, new questions to ask?
  • References: Use the format from the syllabus.

Devote no more than 1-2 paragraphs to each section. You may combine sections 2 and 3, if necessary. I realize that information in your readings may not be sufficient to fill all the sections; do the best you can. Your paper does not have to cite all the readings for the week, but you must read all of them. Spell-check and proof-read your paper before submission.

Session 7: Computers in Biochemistry: DENDRAL and Knowledge Engineering. Student samples: Aaron Bell (courtesy of Aaron Bell, used with permission); Jason Ruchelsman (courtesy of Jason Ruchelsman, used with permission).
Session 8: Computers in Aerospace: The Apollo Guidance Computer. Student samples: Anthony Grue; Patrick Griffin (courtesy of Patrick Griffin, used with permission).
Session 10: Computers in Medicine: MYCIN and the Formalization of Expertise. Student samples: Aaron Bell (courtesy of Aaron Bell, used with permission); Daniel Roy.
Session 11: Supercomputing at Home: A Social Experiment in Distributed Computing. Student samples: Antoinne Machal-Cajigas; Patrick Griffin (courtesy of Patrick Griffin, used with permission).
Session 12: Computers in Linguistics: Lost in Machine Translation. Student samples: Aaron Bell (courtesy of Aaron Bell, used with permission); Steven Stern (courtesy of Steven Stern, used with permission).
Session 13: Computers in the Humanities: Hype, Text, and Hypertext. Student samples: Aaron Bell (courtesy of Aaron Bell, used with permission); Patrick Griffin (courtesy of Patrick Griffin, used with permission).


Final Paper Guide

Proposal for a Final Paper

Write a 1-2 page proposal for your final paper. The proposal should include: (1) the central question the final paper will address; (2) the historical significance of this question and how it relates to discussions in class; (3) a brief outline; and (4) a tentative bibliography, including both primary and secondary sources. Your proposal will receive the instructor’s feedback the following week. The proposal is due in class on Session 9.

Final Paper Guidelines

Write a 10-15 page paper (double-spaced, 1.25" margins, 12 pt font). You may choose any topic that addresses the use of the computer as a scientific instrument. You may choose something close to your own area of expertise, or something completely different. You can focus on one specific computer system and analyze its uses from different perspectives (designers’, users’, scientists’, humanists’, etc.), or you can address a larger issue that involves a certain category of computer systems (for example, expert systems) and perhaps a range of scientific disciplines. You may choose one of the topics we discussed in class, but you must significantly broaden the range of your sources. Your final paper must analyze both primary sources (participants’ accounts) and secondary sources (works by historians, sociologists, anthropologists, or other commentators). Choose an issue over which there has been (or should have been) some debate, and take a stand on that issue. Provide ample argumentation for your position and explain your objections to the alternative position(s). The final paper should follow the same structured format that is required for the Session 7-13 reading responses. The final paper is due in class on Session 14.

Sample Final Paper

Anthony Ronald Grue (PDF)


History of Computers

When we study the many aspects of computing and computers, it is important to know about their history. It helps us understand the growth and progress of technology through the times, and it is also an important topic for competitive and banking exams. Along the way we will meet landmark designs such as Charles Babbage's Analytical Engine, an early general-purpose computer.


What is a Computer?

A computer is an electronic machine that collects information, stores it, processes it according to user instructions, and then returns the result.

A computer is a programmable electronic device that performs arithmetic and logical operations automatically using a set of instructions provided by the user.

Early Computing Devices

People used sticks, stones, and bones as counting tools before computers were invented. More computing devices were produced as technology advanced and the human intellect improved over time. Let us look at a few of the early-age computing devices used by mankind.

  • Abacus

The abacus was invented by the Chinese around 4,000 years ago. It is a wooden rack holding metal rods on which beads slide; the operator moves the beads according to certain rules to perform arithmetic computations.

  • Napier's Bones

John Napier devised Napier's Bones, a manually operated calculating apparatus. The instrument used nine separate ivory strips (bones) marked with numerals to aid multiplication and division. It was also among the first devices to make use of the decimal point in calculation.

  • Pascaline

The Pascaline was invented in 1642 by Blaise Pascal, a French mathematician and philosopher. It is considered the first mechanical and automatic calculator, housed in a wooden box with gears and wheels inside.

  • Stepped Reckoner or Leibniz wheel

In 1673, a German mathematician-philosopher named Gottfried Wilhelm Leibniz improved on Pascal’s invention to create this apparatus. It was a digital mechanical calculator known as the stepped reckoner because it used fluted drums instead of gears.

  • Difference Engine

In the early 1820s, Charles Babbage began work on the Difference Engine, a mechanical, steam-powered calculating machine designed to compute numerical tables such as logarithmic tables.

  • Analytical Engine 

Charles Babbage designed another calculating machine, the Analytical Engine, in the 1830s. It was a mechanical, general-purpose computer that took input from punch cards and could store data and intermediate results in its own memory.

  • Tabulating machine 

An American statistician, Herman Hollerith, invented this machine in 1890. The Tabulating Machine was a punch card-based mechanical tabulator that could compute statistics and record or sort data. Hollerith began manufacturing these machines through his own company, which ultimately became International Business Machines (IBM) in 1924.

  • Differential Analyzer 

Vannevar Bush introduced the Differential Analyzer in 1930, a large-scale analog computer that used motor-driven mechanical integrators to solve differential equations. It could work through complex calculations in a matter of minutes.

  • Mark I

Howard Aiken planned a machine in 1937 that could perform calculations involving very large numbers. The resulting Mark I computer was built in 1944 as a collaboration between IBM and Harvard.

History of the Word "Computer"

The word ‘computer’ has a very interesting origin. It was first used in the 16th century for a person who used to compute, i.e. do calculations. The word was used in the same sense as a noun until the 20th century. Women were hired as human computers to carry out all forms of calculations and computations.

By the last part of the 19th century, the word was also used to describe machines that did calculations. The modern-day use of the word is generally to describe programmable digital devices that run on electricity.

Early History of Computers

Humans have used devices to aid calculation for thousands of years; one of the earliest and most well-known was the abacus. Then, in 1822, Charles Babbage, often called the father of computers, began developing what would be the first mechanical computer. In the 1830s he went on to design the Analytical Engine, a general-purpose computer containing an ALU (arithmetic logic unit), basic flow control, and integrated memory.

Then, more than a century later, we got the first general-purpose electronic computer: the ENIAC, which stands for Electronic Numerical Integrator and Computer. It was invented by John W. Mauchly and J. Presper Eckert.

As technology developed, computers got smaller and processing got faster. The first portable computers arrived in 1981, introduced by Adam Osborne (the Osborne 1) and, soon after, Epson (the HX-20).

Generations of Computers

In the history of computers, we often refer to the advancements of modern computers as the generation of computers . We are currently on the fifth generation of computers. So let us look at the important features of these five generations of computers.

  • 1st Generation (1940-1955): This was when machine language was developed for programming computers. These machines used vacuum tubes for their circuitry and magnetic drums for memory. They were complicated, large, and expensive, relied mostly on batch operating systems and punch cards, and used magnetic tape and paper tape for input and output. Examples include ENIAC, UNIVAC I, and EDVAC.
  • 2nd Generation (1957-1963): Computers advanced from vacuum tubes to transistors, which made them smaller, faster, and more energy-efficient. Programmers moved from binary machine code to assembly language, and early high-level languages such as COBOL and FORTRAN came into use. Examples include the IBM 1620, IBM 7094, CDC 1604, and CDC 3600.
  • 3rd Generation (1964-1971): The hallmark of this period was the integrated circuit. A single integrated circuit (IC) contains many transistors, which increased the power of computers while lowering their cost. These computers were quicker, smaller, more reliable, and less expensive than their predecessors. High-level programming languages such as FORTRAN II through IV, COBOL, and PL/I were used. Examples include the IBM 360 series, the Honeywell 6000 series, and the IBM 370/168.
  • 4th Generation (1971-1980): The invention of the microprocessor brought about the fourth generation of computers. Languages such as C, and later C++ and Java, were used on machines of this era, including the STAR 1000, PDP-11, CRAY-1, CRAY X-MP, and Apple II. This is when computers for home use began to be produced.
  • 5th Generation (1980-present): This is the present and the future of the computer world, and its defining aspect is artificial intelligence. Parallel processing, superconductors, and ULSI (Ultra Large Scale Integration) technology are making this a reality and provide a lot of scope for the future. Programming languages such as C, C++, Java, and .NET are used. Examples include modern desktops, laptops, notebooks, and ultrabooks built on processors such as the Intel Pentium and its successors.

Brief History of Computers

The naive understanding of computation had to be overcome before the true power of computing could be realized. The inventors who worked tirelessly to bring the computer into the world had to realize that what they were creating was more than just a number cruncher or a calculator. They had to address all of the difficulties associated with inventing such a machine, implementing the design, and actually building the thing. The history of the computer is the history of these difficulties being solved.

19th Century

1801 – Joseph Marie Jacquard, a weaver and businessman from France, devised a loom that employed punched wooden cards to automatically weave cloth designs.

1822 – Charles Babbage, a mathematician, designed a steam-powered calculating machine capable of computing tables of numbers. The "Difference Engine" project failed owing to a lack of suitable technology at the time.

1848 – The world's first computer program was written by Ada Lovelace, an English mathematician. She also included a step-by-step description of how to compute Bernoulli numbers using Babbage's Analytical Engine.
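Lovelace's description is, in effect, the first published algorithm. As a purely illustrative aside, the short Python sketch below computes Bernoulli numbers from their standard recurrence; it is a modern reconstruction for readers (the function name and layout are just for this example), not Lovelace's actual program, which was laid out as a table of operations for the never-built Analytical Engine.

```python
from fractions import Fraction
from math import comb  # requires Python 3.8+

def bernoulli_numbers(n):
    """Return the Bernoulli numbers B_0 .. B_n as exact fractions, using the
    classical recurrence sum_{k=0}^{m} C(m+1, k) * B_k = 0 for m >= 1."""
    B = [Fraction(1)]                       # B_0 = 1
    for m in range(1, n + 1):
        s = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-s / (m + 1))              # solve the recurrence for B_m
    return B

print(bernoulli_numbers(8))
# [1, -1/2, 1/6, 0, -1/30, 0, 1/42, 0, -1/30] (printed as Fraction objects)
```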

1890 – Herman Hollerith, an inventor, creates the punch-card system used to tabulate the 1890 U.S. Census. He would go on to start the company that would become IBM.

Early 20th Century

1930 – The Differential Analyzer, the first large-scale automatic general-purpose mechanical analog computer, was invented and built by Vannevar Bush.

1936 – Alan Turing had an idea for a universal machine, which he called the Turing machine, that could compute anything that could be computed.

1939 – Hewlett-Packard was founded in a garage in Palo Alto, California by Bill Hewlett and David Packard.

1941 – Konrad Zuse, a German inventor and engineer, completed his Z3 machine, the world's first working programmable, fully automatic digital computer. However, the machine was destroyed during a World War II bombing raid on Berlin.

1941 – J.V. Atanasoff and graduate student Clifford Berry devise a computer capable of solving 29 equations at the same time. It was the first computer able to store information in its main memory.

1945 – University of Pennsylvania academics John Mauchly and J. Presper Eckert create an Electronic Numerical Integrator and Calculator (ENIAC). It was Turing-complete and capable of solving “a vast class of numerical problems” by reprogramming, earning it the title of “Grandfather of computers.”

1946 – Work began on the UNIVAC I (Universal Automatic Computer), the first general-purpose electronic digital computer designed in the United States for commercial applications.

1949 – The Electronic Delay Storage Automatic Calculator (EDSAC), developed by a team at the University of Cambridge, is the “first practical stored-program computer.”

1950 – The Standards Eastern Automatic Computer (SEAC) was built in Washington, DC, and it was the first stored-program computer completed in the United States.

Late 20th Century

1953 – Computer scientist Grace Hopper develops the first computer language to use English-like words, work that leads to COBOL (COmmon Business-Oriented Language). It allowed a computer user to give the computer instructions in English-like words rather than numbers.

1954 – John Backus and a team of IBM programmers created the FORTRAN programming language, an acronym for FORmula TRANslation. In addition, IBM developed the 650.

1958 – The integrated circuit, sometimes known as the computer chip, was created by Jack Kilby and Robert Noyce.

1962 – The Atlas computer makes its appearance. It was the fastest computer in the world at the time, and it pioneered the concept of "virtual memory."

1964 – Douglas Engelbart develops a prototype of the modern computer that combines a mouse and a graphical user interface (GUI).

1969 – Bell Labs developers, led by Ken Thompson and Dennis Ritchie, unveil UNIX, an operating system that was later rewritten in the C programming language and addressed program compatibility difficulties across machines.

1970 – The Intel 1103, the first Dynamic Random Access Memory (DRAM) chip, is unveiled by Intel.

1971 – The floppy disc was invented by Alan Shugart and a team of IBM engineers. In the same year, Xerox developed the first laser printer, which not only went on to generate billions of dollars in revenue but also heralded the beginning of a new age in computer printing.

1973 – Robert Metcalfe, a member of Xerox's research staff, created Ethernet, which is used to connect multiple computers and other hardware.

1974 – Personal computers were introduced to the market. Among the first were the Altair, the Scelbi, the Mark-8, the IBM 5100, and Radio Shack's TRS-80.

1975 – Popular Electronics magazine touted the Altair 8800 as the world’s first minicomputer kit in January. Paul Allen and Bill Gates offer to build software in the BASIC language for the Altair.

1976 – Apple Computer is founded by Steve Jobs and Steve Wozniak, who introduce the world to the Apple I, the first computer with a single circuit board.

1977 – At the first West Coast Computer Faire, Jobs and Wozniak announce the Apple II. It has colour graphics and a cassette drive for storing data.

1978 – The first computerized spreadsheet program, VisiCalc, is introduced.

1979 – WordStar, a word processing tool from MicroPro International, is released.

1981 – IBM unveils the Acorn, its first personal computer, which has an Intel CPU, two floppy drives, and a colour display. The Acorn uses Microsoft's MS-DOS operating system.

1983 – The CD-ROM, which could carry 550 megabytes of pre-recorded data, hit the market. This year also saw the release of the Gavilan SC, the first portable computer with a flip-form design and the first to be offered as a “laptop.”

1984 – Apple launched the Macintosh, announcing it with a commercial during Super Bowl XVIII. It was priced at $2,500.

1985 – Microsoft introduces Windows, which enables multitasking via a graphical user interface. In the same year, the programming language C++ was released.

1990 – Tim Berners-Lee, an English programmer and scientist, creates HyperText Markup Language (HTML) and coins the term "WorldWideWeb." His system includes the first browser, a web server, HTML, and URLs.

1993 – The Pentium CPU improves the usage of graphics and music on personal computers.

1995 – Microsoft’s Windows 95 operating system was released. A $300 million promotional campaign was launched to get the news out. Sun Microsystems introduces Java 1.0, followed by Netscape Communications’ JavaScript.

1996 – At Stanford University, Sergey Brin and Larry Page created the Google search engine.

1998 – Apple introduces the iMac, an all-in-one Macintosh desktop computer. These PCs cost $1,300 and came with a 4GB hard drive, 32MB RAM, a CD-ROM, and a 15-inch monitor.

1999 – Wi-Fi, an abbreviation for “wireless fidelity,” is created, originally covering a range of up to 300 feet.

21st Century

2000 – The USB flash drive is first introduced. It was faster and offered more storage space than other storage media options of the time.

2001 – Apple releases Mac OS X, later renamed OS X and eventually simply macOS, as the successor to its conventional Mac Operating System.

2003 – Customers could purchase AMD’s Athlon 64, the first 64-bit CPU for consumer computers.

2004 – Facebook began as a social networking website.

2005 – Google acquires Android, a mobile phone OS based on Linux.

2006 – Apple's MacBook Pro became available. The Pro was the company's first dual-core, Intel-based mobile computer.

Amazon Web Services, including Amazon Elastic Compute Cloud (EC2) and Amazon Simple Storage Service (S3), also launched in 2006.

2007 – The first iPhone was produced by Apple, bringing many computer operations into the palm of our hands. Amazon also released the Kindle, one of the first electronic reading systems, in 2007.

2009 – Microsoft released Windows 7.

2011 – Google introduces the Chromebook, which runs Google Chrome OS.

2014 – The University of Michigan Micro Mote (M3), the world’s smallest computer, was constructed.

2015 – Apple introduces the Apple Watch. Windows 10 was also released by Microsoft.

2016 – The world’s first reprogrammable quantum computer is built.

Types of Computers

  • Analog Computers –  Analog computers are built with various components such as gears and levers, with no electrical components. One advantage of analog computation is that designing and building an analog computer to tackle a specific problem can be quite straightforward.
  • Mainframe computers –  Mainframes are computers generally utilized by large enterprises for mission-critical activities such as massive data processing. They were distinguished by large storage capacities, fast components, and powerful computational capabilities. Because they were complicated systems, they were managed by a team of systems programmers who had sole access to the computer. These machines are now often referred to as servers rather than mainframes.
  • Supercomputers –  The most powerful computers to date are commonly referred to as supercomputers. Supercomputers are enormous systems that are purpose-built to solve complicated scientific and industrial problems. Quantum mechanics, weather forecasting, oil and gas exploration, molecular modelling, physical simulations, aerodynamics, nuclear fusion research, and cryptanalysis are all run on supercomputers.
  • Minicomputers –  A minicomputer is a type of computer that has many of the same features and capabilities as a larger computer but is smaller in size. Minicomputers, which were relatively small and affordable, were often employed in a single department of an organization and were often dedicated to a specific task or shared by a small group.
  • Microcomputers –  A microcomputer is a small computer that is based on a microprocessor integrated circuit, also known as a chip. A microcomputer incorporates, at a minimum, a microprocessor, program memory, data memory, and an input-output (I/O) system; a toy sketch of these components working together is shown after this list. A microcomputer is now commonly referred to as a personal computer (PC).
  • Embedded processors –  These are miniature computers that control electrical and mechanical processes with basic microprocessors. Embedded processors are often simple in design, have limited processing capability and I/O capabilities, and need little power. Ordinary microprocessors and microcontrollers are the two primary types of embedded processors. Embedded processors are employed in systems that do not require the computing capability of traditional devices such as desktop computers, laptop computers, or workstations.
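To make the microcomputer entry above concrete, here is a purely illustrative Python sketch that strings together the minimum components that entry lists: a processor (a fetch-decode-execute loop with a single accumulator register), program memory, data memory, and simple input/output. The instruction set and the example program are invented for this sketch and do not correspond to any real chip.

```python
def run(program, data, inputs=()):
    """A toy 'microcomputer': executes (opcode, operand) pairs held in
    program memory against a small data memory, with console-style I/O."""
    mem = list(data)           # data memory
    acc = 0                    # single accumulator register
    pc = 0                     # program counter
    inputs = iter(inputs)
    outputs = []
    while pc < len(program):
        op, arg = program[pc]  # fetch
        pc += 1
        if op == "LOAD":       # decode + execute
            acc = mem[arg]
        elif op == "STORE":
            mem[arg] = acc
        elif op == "ADD":
            acc += mem[arg]
        elif op == "IN":
            acc = next(inputs)
        elif op == "OUT":
            outputs.append(acc)
        elif op == "JNZ":      # jump to arg if the accumulator is non-zero
            if acc != 0:
                pc = arg
        elif op == "HALT":
            break
    return outputs

# Example program: read two numbers, add them, output the sum.
prog = [("IN", None), ("STORE", 0), ("IN", None), ("ADD", 0), ("OUT", None), ("HALT", None)]
print(run(prog, data=[0], inputs=[2, 3]))   # -> [5]
```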

FAQs on History of Computers

Q: The principle of modern computers was proposed by ____

  • Adam Osborne
  • Alan Turing
  • Charles Babbage

Ans: The correct answer is Alan Turing, whose universal machine provided the principle on which modern computers are based.

Q: Who introduced the first computer for home use in 1981?

  • IBM
  • Sun Technology

Ans: The correct answer is IBM, which introduced its first home-use personal computer in 1981.

Q: Which kind of programming language did third generation computers use?

  • Machine language
  • Assembly language
  • High-level languages

Ans: The correct answer is high-level languages; third generation computers used languages such as FORTRAN, COBOL, and PASCAL.


Computer History Lesson Plan: Science, Technology, and Society


This lesson accompanies the BrainPOP topic Computer History , and supports the standard of evaluating the impact of technology on society. Students demonstrate understanding through a variety of projects.

Step 1: ACTIVATE PRIOR KNOWLEDGE

Ask students:

  • What do you use computers for in your everyday life? What would our lives be like without computers?
  • How do you think computers have changed over time? 

Step 2: BUILD KNOWLEDGE

  • Read the description on the Computer History topic page .
  • Play the Movie , pausing to check for understanding.
  • Assign Related Reading . Have students read one of the following articles: “Quirky Stuff” or “Famous Faces.” Partner them with someone who read a different article to share what they learned with each other.

Step 3: APPLY and ASSESS 

Assign Computer History Challenge and Quiz , prompting students to apply essential literacy skills while demonstrating what they learned about this topic.

Step 4: DEEPEN and EXTEND

Students express what they learned about computer history while practicing essential literacy skills with one or more of the following activities. Differentiate by assigning ones that meet individual student needs.

  • Make-a-Movie : Create a documentary tracing the evolution of computer technology. 
  • Make-a-Map : Make a concept map identifying significant events in computer history, and their impacts. 
  • Creative Coding : Code a meme that makes a statement about how computers have changed our lives over time.
  • Primary Source Activity : Examine the first advertisement for the Apple computer and cite evidence to answer the accompanying questions.

More to Explore

Related BrainPOP Topics : Deepen understanding of computer history with these topics: Grace Hopper, Ada Lovelace, and Alan Turing.

Time Zone X: Computer History : Place historical events in chronological order in this interactive timeline game.

Teacher Support Resources:

  • Learning Activities Modifications : Strategies to meet ELL and other instructional and student needs.
  • Learning Activities Support : Resources for best practices using BrainPOP.  


  • What is Computer
History of Computer
  • Types of Computer

The first counting devices were used by primitive people, who used sticks, stones and bones as counting tools. As the human mind and technology improved over time, more advanced computing devices were developed. Some of the notable computing devices, from the earliest to more recent ones, are described below.

The history of the computer begins with the birth of the abacus, which is believed to be the first computer. The Chinese are said to have invented the abacus around 4,000 years ago.

It was a wooden rack holding metal rods with beads mounted on them. The beads were moved by the abacus operator according to certain rules to perform arithmetic calculations. The abacus is still used in some countries, such as China, Russia and Japan.

Napier's Bones was a manually operated calculating device invented by John Napier (1550-1617) of Merchiston. In this calculating tool, he used nine different ivory strips, or bones, marked with numbers to multiply and divide, so the tool became known as "Napier's Bones." It was also the first machine to use the decimal point.

The Pascaline, also known as the Arithmetic Machine or Adding Machine, was invented between 1642 and 1644 by the French mathematician-philosopher Blaise Pascal. It is believed to be the first mechanical and automatic calculator.

Pascal invented this machine to help his father, a tax accountant, with his calculations. It could only perform addition and subtraction. It was a wooden box with a series of gears and wheels: when a wheel was rotated through one full revolution, it advanced the neighbouring wheel by one step, and a series of windows on top of the wheels displayed the totals.
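The carrying behaviour described above can be illustrated with a small Python sketch. This is only a modern analogy for the idea of decimal wheels that pass a carry to their neighbour (the function name is invented for the example), not a model of Pascal's actual mechanism.

```python
def pascaline_add(wheels, amount):
    """Add a non-negative integer `amount` to a register modelled as a list of
    decimal wheels (least-significant wheel first), carrying a 1 to the next
    wheel whenever a wheel passes 9 -- the idea behind the Pascaline's carry."""
    digits = [int(d) for d in str(amount)[::-1]]   # amount as little-endian digits
    carry = 0
    for i in range(len(wheels)):
        total = wheels[i] + (digits[i] if i < len(digits) else 0) + carry
        wheels[i] = total % 10                     # what the window above this wheel shows
        carry = total // 10                        # passed on to the neighbouring wheel
    return wheels                                  # overflow past the last wheel is lost

# Example: a six-wheel register showing 000792, plus 458 -> 001250
print(pascaline_add([2, 9, 7, 0, 0, 0], 458))      # [0, 5, 2, 1, 0, 0]
```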

The stepped reckoner was developed by the German mathematician-philosopher Gottfried Wilhelm Leibniz in 1673. He improved on Pascal's invention to produce this machine, a digital mechanical calculator that used fluted drums instead of simple gears.

The Difference Engine was designed in the early 1820s by Charles Babbage, who is known as the "Father of the Modern Computer." It was a mechanical, steam-driven calculating machine designed to compute tables of numbers, such as logarithm tables.

The Analytical Engine was another calculating machine developed by Charles Babbage, around 1830. It was a mechanical computer that used punched cards as input and was designed to be capable of solving any mathematical problem and storing information in a permanent memory.

The tabulating machine was invented in 1890 by Herman Hollerith, an American statistician. It was a mechanical tabulator based on punched cards that could tabulate statistics and record or sort data, and it was used in the 1890 U.S. Census. Hollerith also founded the Tabulating Machine Company, which later became International Business Machines (IBM) in 1924.

The Differential Analyzer, introduced in the United States in 1930, was an analog computing device invented by Vannevar Bush. This large mechanical machine used rotating shafts, gears and wheel-and-disc integrators, rather than digital switching, to perform calculations, and it could carry out about 25 calculations in a few minutes.

The next major development in the history of computers began in 1937, when Howard Aiken planned to build a machine that could perform calculations involving very large numbers. The resulting Mark I computer was built in 1944 as a partnership between IBM and Harvard, and it was one of the first programmable digital computers.

A generation of computers refers to a specific stage of improvement in computer technology over time. In 1946, electronic pathways called circuits were developed to perform counting, replacing the gears and other mechanical parts used for counting in previous computing machines.

In each new generation, the circuits became smaller and more advanced than those of the previous generation. This miniaturization helped increase the speed, memory and power of computers. The five generations of computers are described below.

The first generation (1946-1959) of computers were slow, huge and expensive. In these computers, vacuum tubes were used as the basic components of the CPU and memory. They depended mainly on batch operating systems and punch cards, and magnetic tape and paper tape were used as input and output devices.

Some of the popular first generation computers were:

  • ENIAC (Electronic Numerical Integrator and Computer)
  • EDVAC (Electronic Discrete Variable Automatic Computer)
  • UNIVAC (Universal Automatic Computer)

The second generation (1959-1965) was the era of transistor computers. These computers used transistors, which were cheap, compact and consumed less power, making transistor computers faster than the first generation machines.

In this generation, magnetic cores were used as the primary memory, and magnetic discs and tapes were used as secondary storage. Assembly language and programming languages like COBOL and FORTRAN were used, along with batch processing and multiprogramming operating systems.

Some of the popular second generation computers were the IBM 1620, IBM 7094, CDC 1604 and CDC 3600.

The third generation computers used integrated circuits (ICs) instead of transistors. A single IC can pack a huge number of transistors, which increased the power of a computer and reduced the cost. The computers also became more reliable, efficient and smaller in size. This generation used remote processing, time-sharing and multiprogramming operating systems, and high-level programming languages like FORTRAN (II to IV), COBOL, PASCAL, PL/1 and ALGOL-68.

Some of the popular third generation computers were the IBM-360 series, the Honeywell-6000 series and the IBM-370/168.

The fourth generation (1971-1980) computers used very large scale integration (VLSI) circuits: chips containing millions of transistors and other circuit elements. These chips made this generation of computers more compact, powerful, fast and affordable. These computers used real-time, time-sharing and distributed operating systems. Programming languages such as C and, later, C++ and dBASE were used on these machines.

Some of the popular fourth generation computers were the STAR 1000, PDP 11, CRAY-1, CRAY X-MP and Apple II.

In fifth generation (1980-present) computers, VLSI technology was succeeded by ULSI (Ultra Large Scale Integration), making possible the production of microprocessor chips with ten million electronic components. This generation of computers uses parallel processing hardware and AI (Artificial Intelligence) software. The programming languages used in this generation include C, C++, Java and .NET.

Popular fifth generation computers include modern desktops, laptops, notebooks and ultrabooks.






The Evolution of Computers: Key Resources (July 2013): General Histories and Reference Resources


Numerous titles offer broad accounts of the fascinating history of computing, and more recent publications take the story up to the present.  Ian Watson’s comprehensive history published in 2012, The Universal Machine: From the Dawn of Computing to Digital Consciousness , will be particularly appealing to general readers and undergraduate students for its accessible, engaging writing style and many illustrations.  Two other notable works published in 2012 are Computing: A Concise History by Paul Ceruzzi (also author of the useful 2003 title, A History of Modern Computing ) and A Brief History of Computing by Gerard O’Regan.  Ceruzzi, curator at the National Air and Space Museum, Smithsonian Institution, provides a readable and concise 155-page overview in his book, which is part of the “MIT Press Essential Knowledge” series; this work also contains ample references to the literature in a further reading section and a bibliography.  O’Regan’s work offers an encompassing chronological survey, but also devotes chapters to the history of programming languages and software engineering.  Also published in 2012 is Peter Bentley’s Digitized: The Science of Computers and How It Shapes Our World , which provides valuable historical coverage and in later chapters reports on the revolutionary developments in artificial intelligence and their impact on society.

Other informative, accessible general histories include Computer: A History of the Information Machine by Martin Campbell-Kelly and William Aspray; Computers: The Life Story of a Technology by Eric Swedin and David Ferro; and Histories of Computing by Michael Sean Mahoney.  Mike Hally’s Electronic Brains: Stories from the Dawn of the Computer Age focuses on post-World War II developments, tracing the signal contributions of scientists from the United Kingdom, United States, Australia, and Russia.  An excellent pictorial collection of computers is John Alderman and Mark Richards’s Core Memory: A Visual Survey of Vintage Computers Featuring Machines from the Computer History Museum .

The static nature of print reference materials is not the perfect format for the topic of computer innovation; these publications may show their age not just in technical information and jargon but also in a lack of coverage of more contemporary individuals and groups.  Nevertheless, several works continue to have lasting value for their excellent and unique coverage.  The two-volume Encyclopedia of Computers and Computer History , edited by Raúl Rojas, which was published in 2001, offers comprehensive coverage of historical topics in a convenient format, enhanced with useful bibliographic aids.  More serious researchers will find Jeffrey Yost’s A Bibliographic Guide to Resources in Scientific Computing, 1945-1975 valuable for its annotations of earlier important titles and its special focus on the sciences; the volume’s four major parts cover the physical, cognitive, biological, and medical sciences.  The Second Bibliographic Guide to the History of Computing, Computers, and the Information Processing Industry , compiled by James Cortada, published in 1996, will also be of value to researchers.  For biographical coverage, Computer Pioneers by J. A. N. Lee features entries on well-known and lesser-known individuals, primarily those from the United States and the United Kingdom; however, coverage of female pioneers is limited.  Lee also edited the International Biographical Dictionary of Computer Pioneers , which provides broader geographical coverage.

Related and more recent information may be found in several online resources such as the IEEE Global History Network: Computers and Information Processing .  Sites featuring interactive time lines and interesting exhibits include the IBM Archives , and Revolution: The First 2000 Years of Computing by the Computer History Museum.

Focusing on women's contributions to the field is "Famous Women in Computer Science," available on the Anita Borg Institute website.  This site includes nearly eighty short biographies with links to university and other organizational and related websites.  A Pinterest board version of the awardees is also available.  Also notable is "The Ada Project," named in honor of Ada Lovelace (1815-52), who wrote what is considered to be "the first 'computer program.'"  This site is largely based on the Famous Women in Computer Science website but also includes a time line.

In contrast to J. A. N. Lee’s International Biographical Dictionary of Computer Pioneers mentioned previously, the highly recommended Milestones in Computer Science and Information Technology by Edwin Reilly focuses more on technological aspects than individuals.  However, this author did not find a more comprehensive one-volume reference resource than Reilly’s.  Appendixes include a listing of cited references, classification of entries, “The Top Ten Consolidated Milestones,” and personal name, chronological, and general indexes.

Works Cited

  • Cortada, James W., ed. Second Bibliographic Guide to the History of Computing, Computers, and the Information Processing Industry. 1996. ISBN 9780313295423.



Assignment on "Evolution and History of Computer"

by Abdullah Nadeem

Related Papers


Tahir Siddique

In this paper, emphasis has been given to the gradual and continuous advancement of the computer from before 300 BC to 2012 and beyond. During this very long period of time, a simple device like the computer has witnessed many significant changes in its manufacturing and development. By and large, the changes have been conceptual, in manufacturing, and in ever-increasing applications.


Fernando A G Alcoforado

This article aims to present how the computer, humanity's greatest invention, evolved and what its most likely future will be. The computer can be considered humanity's greatest invention because the worldwide computer network made possible the Internet, the technology that most changed the world with the advent of the information society. IBM developed the mainframe computer starting in 1952. In the 1970s, the dominance of mainframes began to be challenged by the emergence of microprocessors, whose innovations greatly facilitated the task of developing and manufacturing smaller computers, then called minicomputers. In 1976, the first microcomputers appeared, whose costs represented only a fraction of those charged by manufacturers of mainframes and minicomputers. The existence of the computer provided the conditions for the advent of the Internet, undoubtedly one of the greatest inventions of the 20th century, whose development began in 1965. At the beginning of the 21st century, cloud computing emerged, symbolizing the tendency to place all infrastructure and information digitally on the Internet. Current computers are electronic, built from transistors in electronic chips, and they face a limitation: there will come a time when it is no longer possible to reduce the size of the transistor, one of the basic components of processors. Quantum computers have emerged as the newest answer in physics and computing to the limited capacity of electronic computers, and the Canadian company D-Wave claims to have produced the first commercial quantum computer. In addition to the quantum computer, artificial intelligence (AI) may reinvent the computer.



