History of Computing
The Origins of Computing. Sources: Combrink, Thomas Cortina, Fergus Toolan.
What is a Computer?
Originally, “one who computes”: a person employed to make calculations in an observatory, in surveying, etc.
“a programmable machine that can execute a list of instructions in a well-defined manner”
What is a modern computer?
A machine which can execute billions of instructions per second.
Uses a “stored program” to execute instructions in a specific order to “solve a problem”
Modern Computers are assemblies of components
Six logical units of a computer system:
Input unit: mouse, keyboard
Output unit: printer, monitor, audio speakers
Memory unit: retains input and processed information
Arithmetic and logic unit (ALU): performs calculations
Central processing unit (CPU): supervises operation of other devices
Secondary storage unit: hard drives, floppy drives
CPU (Microprocessor Chip)
Brain of the computer
Made of integrated circuits (ICs), which contain millions of tiny transistors and other components
Performs all calculations and executes all instructions
Example chips for PCs: Intel (Celeron, Pentium), AMD (K-6 and Athlon)
Inside the Chip
What’s a Gigahertz (GHz)?
A unit of measurement for CPU speed (clock speed)
G (giga) means 1 billion; M (mega) would be 1 million
Hz means cycles per second
GHz means 1 billion clock cycles per second
CPUs may execute multiple operations each clock cycle
So what does a 2.8 GHz CPU mean?
2,800,000,000 clock cycles per second
Performs at least 2,800,000,000 operations per second
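A minimal sketch of that arithmetic in Python (the one-operation-per-cycle floor comes from the slide; any other figure would simply scale the result):

```python
# Clock speed to operations per second, assuming at least one operation
# per clock cycle as stated above.
clock_speed_hz = 2.8e9        # 2.8 GHz = 2,800,000,000 cycles per second
ops_per_cycle = 1             # modern CPUs often complete more than one
ops_per_second = clock_speed_hz * ops_per_cycle
print(f"{ops_per_second:,.0f} operations per second")  # 2,800,000,000
```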
Main Memory (RAM)
Stores data for programs currently running
Temporary: emptied when power is turned off
Fast access for the CPU
What’s a Gigabyte (GB)?
GB measures the amount of data it can store
G (giga) for 1 billion; M (mega) for 1 million
Data quantities are measured in bytes:
1 bit stores a single on/off piece of information
1 byte = 8 bits
1 kilobyte = 2^10 bytes (~1,000 bytes)
1 megabyte = 2^20 bytes (~1,000,000 bytes)
1 gigabyte = 2^30 bytes (~1,000,000,000 bytes)
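Those powers of two are easy to check with a quick Python sketch:

```python
# The powers of two behind the units above.
KILOBYTE = 2 ** 10   # 1,024 bytes           (~1 thousand)
MEGABYTE = 2 ** 20   # 1,048,576 bytes       (~1 million)
GIGABYTE = 2 ** 30   # 1,073,741,824 bytes   (~1 billion)

print(KILOBYTE, MEGABYTE, GIGABYTE)   # 1024 1048576 1073741824
```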
Hard Drive
Stores data and programs
Permanent storage (theoretically): when you turn off the computer, it is not emptied
Motherboard
Connects all the components together
How did we get here?
In studying the history of computers, where do we start?
We could go back thousands of years:
Mathematical developments
Manufacturing developments
Engineering innovations
The wheel?
Counting
What number system do you use?
Decimal (base-10)
Has been in use for thousands of years
Guesses: first China, then India, then the Middle East, then Europe (introduced as late as 1200)
Primitive Calculators
The Abacus
Early Computational Devices
(Chinese) Abacus: used for performing arithmetic operations
Al’Khowarizmi and the algorithm
9th-century scholar from Khwarazm (his name gives us the word “algorithm”)
Developed the concept of a written, step-by-step process for doing something
Published a book on such processes
The basis of software
Early Computational Devices
Napier’s Bones, 1617: for performing multiplication & division
John Napier, 1550-1617
Philosopher Forefathers of Modern Computing, 1600-1700
Blaise Pascal – developed the Pascaline.
A desktop calculator that worked like an odometer.
Von Leibniz developed binary arithmetic and a hand-cranked calculator.
The calculator was able to add, subtract, multiply and divide.
Blaise Pascal
Pascal (1623-62)
was the son of a tax collector and a mathematical genius. He designed the first mechanical calculator (Pascaline) based on gears. It performed addition and subtraction.
Early Computational Devices
Pascaline mechanical calculator
Blaise Pascal, 1623-1662
Early Computational Devices
Slide calculators
William Oughtred, 1574-1660
Gottfried Wilhelm von Leibniz
Leibniz (1646-1716)
was a German mathematician who built the first calculator to do multiplication and division. It was not reliable due to the limited accuracy of contemporary parts.
He also documented the binary number system which is used in all modern computers.
Count to 8 in binary
0001 0010 0011 0100 0101 0110 0111 1000
Modern Computers use Binary
Why?
Much simpler circuits needed for performing arithmetic
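A short Python sketch (not from the slides) of both points: the bit patterns above, and how little logic a one-bit addition needs:

```python
# Counting to 8 in binary, matching the bit patterns listed earlier.
for n in range(1, 9):
    print(f"{n}: {n:04b}")

# Binary arithmetic needs only a couple of rules per bit, which is why the
# circuits are simple: a one-bit adder is XOR for the sum, AND for the carry.
def add_bits(a, b):
    return a ^ b, a & b   # (sum bit, carry bit)

print(add_bits(1, 1))     # (0, 1), i.e. 1 + 1 = 10 in binary
```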
Early Computational Devices
Leibniz’s calculating machine, 1674 Gottfried Wilhelm von Leibniz 1646-1716
George Boole (1815-1864)
Invented Boolean algebra
A system of logic using Boolean values
Used to establish inequalities: symbolic use of <, >, or <>
Used in computer switching
Modern use in library searches
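As a minimal illustration (the search terms below are hypothetical), the three basic Boolean operations in Python:

```python
# AND, OR and NOT on Boolean values, as used in switching circuits and
# in library-style searches.
matches_babbage = True
matches_lovelace = False

print(matches_babbage and matches_lovelace)  # AND: both terms -> False
print(matches_babbage or matches_lovelace)   # OR: either term -> True
print(not matches_lovelace)                  # NOT: exclude a term -> True
```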
Charles Babbage
Babbage (1792-1872)
was a British inventor who designed two important machines: the Difference Engine and the Analytical Engine. He saw a need to replace the human computers used to calculate numerical tables, which were prone to error, with a more accurate machine.
Charles Babbage
Difference Engine
Designed to compute values of polynomial functions automatically
No multiplication was needed because he used the method of finite differences
He never built one
It was built from 1989-1991 for the London Science Museum
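The method of finite differences is easy to sketch in Python: once the initial differences of a polynomial are known, every further value comes from additions alone. The example polynomial (x^2) and the helper name are illustrative, not Babbage's own tables:

```python
def tabulate(initial_differences, steps):
    """Tabulate a polynomial using repeated addition only.

    initial_differences = [f(0), first difference, second difference, ...];
    for a polynomial, the highest-order difference is constant.
    """
    diffs = list(initial_differences)
    values = []
    for _ in range(steps):
        values.append(diffs[0])
        # Update each running value by adding the difference below it.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return values

# f(x) = x^2: f(0) = 0, first difference = 1, second difference = 2.
print(tabulate([0, 1, 2], 6))   # [0, 1, 4, 9, 16, 25]
```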
Charles Babbage Difference Engine
Charles Babbage The Next Leap Forward 1800’s
Charles Babbage
Analytical Engine
Could be programmed using punch cards – totally revolutionary idea
Sequential control, branching, looping
Turing complete
The analytical engine of Charles Babbage
Lady Ada Byron – World’s first programmer
Countess of Lovelace, daughter of Lord Byron.
One of the first women mathematicians in England
Documented Babbage’s work.
Wrote an account of the Analytical Engine.
Wrote a program for the Analytical Engine for computing Bernoulli numbers
Herman Hollerith
Hollerith
developed an electromechanical punched card tabulator to tabulate the data for the 1890 U.S. census. Data was entered on punched cards and could be sorted according to the census requirements. The machine was powered by electricity. He formed the Tabulating Machine Company, which became International Business Machines (IBM). IBM is currently the largest computer manufacturer, employing in excess of 300,000 people.
Herman Hollerith’s punch card tabulating machine, 1890 Census
Hollerith Tables and the Census
Improved the speed of the census
Reduced cost by $5 million
Greater accuracy of data collected
Hollerith – unemployed after the census
Konrad Zuse - First Calculator 1938
The War Years 1939-1945 Two Primary Uses
Artillery tables: hand calculation replaced by machine calculation
Department of the Navy cryptologists: cryptography, the art or process of writing in or deciphering secret writing
Bletchley Park: the Enigma codes – U23
The British Effort
History of Computers
Alan Turing
was a British mathematician who also made significant contributions to the early development of computing, especially to the theory of computation.
He developed an abstract theoretical model of a computer called a Turing machine which is used to capture the notion of computable i.e. what problems can and what problems cannot be computed. Not all problems can be solved on a computer. Note: A Turing machine is an abstract model and not a physical computer
Alan Turing – misunderstood genius, 1936
Published a paper, “On Computable Numbers”
Turing’s machine: a hypothetical computer that could perform any computation or logical operation a human could devise.
Turing’s Heritage
Code breaking was Turing’s strength.
Colossus: a computer built to break German ciphers – 100 billion alternatives.
Ran at a rate of 25,000 characters per second
The United States Effort
The World War II Years, 1939-1945
Calculate artillery tables.
Used to break codes, like Colossus.
Used to model future events – atomic and hydrogen bombs.
Cmdr. Grace Hopper
Howard Aiken (1900 – 73)
Aiken,
a Harvard professor, with the backing of IBM, built the Harvard Mark I computer (51 ft long) in 1944. It was based on relays (which operate in milliseconds) as opposed to the use of gears. It required 3 seconds for a multiplication.
Aiken’s Mark I (1944), based on Babbage’s original design – built at IBM labs, electromechanical, weighed 5 tons. Admiral Grace Hopper worked as a programmer on this computer, and coined the term 'bug' for a computer fault.
HARVARD MARK - 1, 1944
The Mark I - a dinosaur
51 feet long
3,304 electromechanical switches
Add or subtract 23-digit numbers in 3/10 of a second.
Instructions (software) loaded by paper tape.
The infamous “Bug”
ENIAC - The Next Jump Forward - 1946
1st electronic digital computer
Operated with vacuum tubes rather than electromechanical switches
1000 times faster than the Mark I
No program storage – wired into circuitry.
This was still based on the decimal numbering system.
“programmed” by switches and cords
ENIAC
The Advent of the Semiconductor - 1947
Developed at Bell Labs by Shockley, Bardeen and Brattain – Nobel Prize
The point-contact transistor replaced power-hungry, hot and short-lived vacuum tubes
History of Computers
Von Neumann
was a scientific genius and was a consultant on the ENIAC project. He formulated plans with Mauchly and Eckert for a new computer (EDVAC) which was to store programs as well as data. This is called the stored program concept and Von Neumann is credited with it. Almost all modern computers are based on this idea and are referred to as Von Neumann machines.
He also concluded that the binary system was more suitable for computers since switches have only two values. He went on to design his own computer at Princeton which was a general purpose machine.
First Generation Computers (1951-58)
These machines were used in business for accounting and payroll applications.
Valves were unreliable components generating a lot of heat (still a problem in computers). They had very limited memory capacity.
Magnetic drums were developed to store information, and tapes were also developed for secondary storage.
They were initially programmed in machine language (binary). A major breakthrough was the development of assemblers and assembly language.
EDVAC - Electronic Discrete Variable Automatic Computer, 1951
Data stored internally on a magnetic drum, a random access magnetic storage device
First stored program computer
The 50’s the Era of Advances
Second Generation (1959-64)
The development of the transistor revolutionised the development of computers. Invented at Bell Labs in 1948, transistors were much smaller, more rugged, cheaper to make and far more reliable than valves.
Core memory (non-volatile) was introduced and disk storage was also used. The hardware became smaller and more reliable, a trend that still continues.
Another major feature of the second generation was the use of high-level programming languages such as Fortran and Cobol. These revolutionised the development of software for computers.
The computer industry experienced explosive growth.
Technical Advances in the 60’s
John McCarthy coins the term “Artificial Intelligence”
1960 - Removable disks appear
1964 - BASIC: Beginner’s All-purpose Symbolic Instruction Code
Texas Instruments offers the first solid-state hand-held calculator
1967 - 1st issue of Computerworld published
Third Generation (1965-71)
ICs (integrated circuits) were again smaller, cheaper, faster and more reliable than transistors. Speeds went from the microsecond to the nanosecond (billionth) to the picosecond (trillionth) range. ICs were used for main memory despite the disadvantage of being volatile. Minicomputers were developed at this time.
Terminals replaced punched cards for data entry, and disk packs became popular for secondary storage. IBM introduced the idea of a compatible family of computers, the 360 family, easing the problem of upgrading to a more powerful machine.
Third Generation (1965-71)
Substantial operating systems were developed to manage and share the computing resources, and time-sharing operating systems were developed. These greatly improved the efficiency of computers. Computers had by now pervaded most areas of business and administration.
The number of transistors that can be fabricated on a chip is referred to as the scale of integration (SI). Early chips had SSI (small SI) of tens to a few hundreds. Later chips were MSI (medium SI): hundreds to a few thousands. Then came LSI chips (large SI) in the thousands range.
Moore’s Law
In 1965 Gordon Moore graphed data about growth in memory chip performance. He realized that each new chip had roughly twice the capacity of its predecessor and was released within ~2 years of it, meaning computing power would rise exponentially over relatively brief periods of time. Still fairly accurate. In 30 years, the number of transistors on a chip has increased ~20,000 times, from 2,300 on the 4004 in 1971 to 42 million on the Pentium 4.
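A back-of-the-envelope check of that doubling, as a Python sketch (the two-year doubling period is the usual statement of the law; the projection lands in the same ballpark as the 42 million quoted for the Pentium 4):

```python
# Double the 4004's 2,300 transistors (1971) every two years.
transistors, year = 2300, 1971
while year < 2000:
    transistors *= 2
    year += 2
print(year, f"~{transistors:,} transistors")   # 2001 ~75,366,400 transistors
```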
The 1970’s - The Microprocessor Revolution
A single chip containing all the elements of a computer’s central processing unit.
Small, integrated, relatively cheap to manufacture.
The Super Computers - 1972
The Cray
Parallel processing power
Speed: 100 million arithmetical functions per second
Sensitive to heat - cooled with liquid nitrogen
Very expensive
Fourth Generation
VLSI allowed the equivalent of tens of thousands of transistors to be incorporated on a single chip. This led to the development of the microprocessor: a processor on a chip.
Intel produced the 4004, which was followed by the 8008, 8080, 8088 and 8086, etc. Other companies developing microprocessors included Motorola (6800, 68000), Texas Instruments and Zilog.
Fourth Generation
Personal computers were developed and IBM launched the IBM PC based on the 8088 and 8086 microprocessors.
Mainframe computers have grown in power. Memory chips are in the megabit range. VLSI chips had enough transistors to build 20 ENIACs.
Secondary storage has also evolved at fantastic rates, with storage devices holding gigabytes (1000 MB = 1 GB) of data.
Fourth Generation
On the software side, more powerful operating systems are available, such as Unix. Applications software has become cheaper and easier to use. Software development techniques have vastly improved. Fourth generation languages (4GLs) make the development process much easier and faster.
Fourth Generation
Languages are also classified according to generations from machine language (1GL), assembly language (2GL), high level languages (3GL) to 4GLs.
Software is often developed as application packages. VisiCalc, a spreadsheet program, was the pioneering application package and the original killer application.
Killer application: a piece of software that is so useful that people will buy a computer to use that application.
The Altair – named after a voyage to Altair in Star Trek – 1975
The Birth of the Micro Computer 1975
Jobs and Wozniak develop the Apple II
Commodore PET, programs stored on a cassette
Tandy-Radio Shack TRS-80
5 1/4 inch floppy disk becomes the standard for software
Finally, The Computer as Man of the Year - 1982
Revenge of the nerds
Bill Gates, Microsoft, 1978
Steve Jobs
Steve Wozniak
Alan Turing