CSE 101 Computer Engineering Concepts & Algorithms
Lecture 1: Historical Thoughts

This lecture provides an introduction to the historical thoughts and concepts of computer engineering, covering topics such as the definition of a computer, calculation and computation, algorithms, and a timeline of computer technology.
Introduction
• What is a computer?
• The computer as a useful tool:
  • Wide application area: companies, schools, airports, hospitals, banks, the military...
  • Quite new: a product of the information age.
• Industrial age: electricity, telephones, radio, automobiles, planes.
• Information age: computers, the Internet, mobile communication.
Calculation and Computation
• Calculation:
  • Determining something by mathematical or logical methods.
  • Transforming one or more inputs into one or more results.
  • Example: multiply 7 and 8.
• Computation:
  • Can be defined as finding a solution to a problem from given inputs by means of an algorithm.
  • Denotes a more general process involving data and algorithms.
• Algorithm: a well-defined set of instructions to perform a certain task.
• Example of a computation: a program that keeps records of students in a school and answers queries about the data it keeps (a small sketch follows below).
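As a small illustration added here (not part of the original slides; the student names and grades are made up), the following Python sketch contrasts a calculation with a computation that combines stored data and an algorithm:

```python
# Calculation: transforming inputs (7 and 8) into a result.
print(7 * 8)  # 56

# Computation: an algorithm working over stored data to answer queries.
records = {"Alice": 85, "Bob": 92}   # hypothetical student records

def query_grade(name):
    """Return the stored grade for a student, or None if unknown."""
    return records.get(name)

print(query_grade("Alice"))  # 85
print(query_grade("Carol"))  # None
```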
Timeline of Computer Technology
1900 – 1800 BC: The first use of a place-value number system (e.g. the decimal system, in which the value of a number depends both on the digit itself and on the position of the digit; see the sketch after this timeline).
1000 – 500 BC: The invention of the abacus, the first actual calculating mechanism known to man.
300 – 600 AD: The first use of the number 0 and of negative numbers (first appeared in India).
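To make the place-value idea concrete, here is a small illustrative Python sketch (added here, not part of the original timeline) showing how the value of the decimal numeral 352 is built from its digits and their positions:

```python
# Value of the decimal numeral "352": each digit is weighted by its position.
digits = [3, 5, 2]
value = 0
for d in digits:
    value = value * 10 + d   # shift earlier digits one place left, then add
print(value)                 # 352, i.e. 3*100 + 5*10 + 2*1
```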
Blaise Pascal
In 1640 Pascal started developing a device to help his father add sums of money. The Arithmetic Machine (1642) could only add and subtract; multiplication and division were implemented by performing a series of additions or subtractions (see the sketch below).

Gottfried von Leibniz
Leibniz developed Pascal's ideas and, in 1671, introduced the Step Reckoner, a device which, as well as performing additions and subtractions, could multiply, divide, and evaluate square roots by series of stepped additions.

Pascal's and Leibniz's devices were the forebears of today's desktop computers, and derivations of these machines continued to be produced until their electronic equivalents finally became readily available and affordable in the early 1970s.
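The following Python sketch (an illustration added here, not from the slides) mimics how operators of the Arithmetic Machine reduced multiplication and division to repeated addition and subtraction; it assumes non-negative integers and a positive divisor:

```python
def multiply(a, b):
    """Multiply two non-negative integers by repeated addition."""
    total = 0
    for _ in range(b):
        total += a
    return total

def divide(a, b):
    """Integer division by repeated subtraction (b > 0)."""
    quotient = 0
    while a >= b:
        a -= b
        quotient += 1
    return quotient

print(multiply(7, 8))  # 56
print(divide(56, 8))   # 7
```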
In the early 1800s, a French silk weaver called Joseph-Marie Jacquard invented a way of automatically controlling the silk loom by recording patterns of holes in a string of cards. In the years to come, variations on Jacquard's punched cards would find a variety of uses, including representing the music to be played by automated pianos and storing programs for computers.

IBM 80-column punched card format
Charles Babbage
The first device that might be considered to be a computer in the modern sense of the word was conceived in 1822 by the eccentric British mathematician and inventor Charles Babbage. The Difference Engine, a working version of which was reconstructed in the 1990s from cast iron, bronze and steel, consisted of 4,000 components, weighed three tons, and was 10 feet wide and 6½ feet tall.
In Babbage's time, mathematical tables, such as those of logarithmic and trigonometric functions, were generated by teams of mathematicians working day and night on primitive calculators. Because these people performed computations, they were referred to as "computers." In fact, the term "computer" was used as a job description (rather than referring to the machines themselves) well into the 1940s. The term only later became associated with machines that could perform the computations on their own.
The Difference Engine was actually only partially completed when Babbage conceived the idea of another, more sophisticated machine called the Analytical Engine (around 1830). The Analytical Engine was intended to use loops of Jacquard's punched cards to control an automatic calculator, which could make decisions based on the results of previous computations. This machine was also intended to employ several features subsequently used in modern computers, including sequential control, branching, and looping.
Working with Babbage was Augusta Ada Lovelace, the daughter of the English poet Lord Byron. Ada, who was a splendid mathematician and one of the few people who fully understood Babbage's vision, created a program for the Analytical Engine. Had the Analytical Engine ever actually worked, Ada's program would have been able to compute a mathematical sequence known as the Bernoulli numbers (a short sketch follows below). Based on this work, Ada is now credited as being the first computer programmer, and in 1979 a modern programming language was named Ada in her honor.
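As an aside, here is a minimal Python sketch, added for illustration, of how the Bernoulli numbers Ada's program targeted can be computed from the standard recurrence; it is of course not Ada's actual program, which was expressed as a table of Analytical Engine operations:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return B_0 .. B_n using the recurrence
    sum_{k=0}^{m} C(m+1, k) * B_k = 0  for m >= 1."""
    B = [Fraction(1)]                                  # B_0 = 1
    for m in range(1, n + 1):
        s = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-s / (m + 1))                         # solve for B_m
    return B

print(bernoulli(6))  # B_0..B_6 = 1, -1/2, 1/6, 0, -1/30, 0, 1/42
```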
George Boole
Boole made significant contributions in several areas of mathematics, but was immortalized for two works, in 1847 and 1854, in which he represented logical expressions in a mathematical form now known as Boolean algebra. Boole's work was all the more impressive because, with the exception of elementary school and a short time in a commercial school, he was almost completely self-educated.

Claude Shannon, creator of information theory
Boole's work was studied only by philosophy and logic students until 1938, when Claude E. Shannon published an article based on his master's thesis at MIT, in which he showed how Boole's concepts of TRUE and FALSE could be used to represent the functions of switches in electronic circuits (a small illustration follows below).
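A tiny Python illustration (added here, not from the lecture) of Shannon's observation: switches wired in series behave like a Boolean AND, while switches wired in parallel behave like a Boolean OR:

```python
# Two switches in series pass current only if both are closed (AND);
# two switches in parallel pass current if either one is closed (OR).
def series(switch_a, switch_b):
    return switch_a and switch_b

def parallel(switch_a, switch_b):
    return switch_a or switch_b

for a in (False, True):
    for b in (False, True):
        print(a, b, "series:", series(a, b), "parallel:", parallel(a, b))
```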
Alan Turing
In 1937 Turing invented a theoretical computer as an abstract "paper exercise." This theoretical model, which became known as a Turing Machine, was both simple and elegant, and subsequently inspired many "thought experiments" (a toy simulator follows below).
During World War II, Alan Turing worked as a cryptographer, breaking codes and ciphers at one of the British government's top-secret establishments. Turing was a key player in the breaking of the Germans' now-famous Enigma code.
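For illustration only (not part of the lecture), here is a minimal Python sketch of a Turing machine simulator; the example transition table is a made-up machine that flips the bits of a binary string and halts at the first blank:

```python
# A tiny Turing-machine simulator on a one-way-infinite tape.
# The transition table maps (state, symbol) -> (new_symbol, move, new_state).
def run(tape, table, state="start", blank="_"):
    pos = 0
    while state != "halt":
        symbol = tape[pos] if pos < len(tape) else blank
        new_symbol, move, state = table[(state, symbol)]
        if pos < len(tape):
            tape[pos] = new_symbol
        else:
            tape.append(new_symbol)
        pos += 1 if move == "R" else -1
    return tape

# Example machine (hypothetical): flip every bit, halt at the blank.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run(list("1011"), flip))  # ['0', '1', '0', '0', '_']
```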
1943 – 1947 ENIAC – Electronic Numerical Integrator and Computer:
• It could do nuclear physics calculations in two hours that would have taken 100 engineers a year to do by hand.
• The system's program could be changed by rewiring a panel.
Johann von Neumann
In June 1944, the Hungarian-American mathematician Johann (John) von Neumann first became aware of ENIAC. Von Neumann, who was a consultant on the Manhattan Project, immediately recognized the role that could be played by a computer like ENIAC in solving the vast arrays of complex equations involved in designing atomic weapons.
In 1945, he published a paper titled "First Draft of a Report on the EDVAC", which called for:
• A memory containing both data and instructions, allowing both data and instruction memory locations to be read from, and written to, in any desired order.
• A calculating unit capable of performing both arithmetic and logical operations on the data.
• A control unit, which could interpret an instruction retrieved from the memory and select alternative courses of action based on the results of previous operations.
The computer structure resulting from the criteria presented in the "First Draft" is popularly known as a von Neumann machine, and virtually all digital computers from that time forward have been based on this architecture (a small sketch follows below).
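To illustrate the idea (the toy instruction set below is invented for the example and is not from the report), here is a minimal Python sketch of a stored-program machine in which instructions and data share one memory and a control loop fetches, decodes, and executes:

```python
# A toy stored-program (von Neumann style) machine: instructions and data
# live in the same memory; the control loop fetches, decodes and executes.
def run(memory):
    acc, pc = 0, 0                      # accumulator and program counter
    while True:
        op, arg = memory[pc]            # fetch and decode
        pc += 1
        if op == "LOAD":                # acc = memory[arg]
            acc = memory[arg]
        elif op == "ADD":               # acc = acc + memory[arg]
            acc += memory[arg]
        elif op == "STORE":             # memory[arg] = acc
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Program: memory[6] = memory[4] + memory[5]; cells 4..6 hold data.
memory = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0), 7, 8, 0]
print(run(memory)[6])  # 15
```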
SMIL, one of the first Swedish computers, built at Lund University in the mid-fifties. The original SMIL consisted of about 2000 vacuum tubes. SMIL was the main university computer for more than 15 years and wasn't decommissioned until 1972. This picture shows SMIL as it looked in 1956.
Two of the greatest inventions of the 20th century: transistors and integrated circuits
Transistors are formed from materials known as semiconductors, which were not well understood until the 1950s. They were much smaller and lighter, and required far less power, than the vacuum tubes that had been in use until that time.
The world's first transistor, invented at Bell Labs in 1947
Dr. John Bardeen, Dr. William Shockley, and Dr. Walter Brattain, inventors
Transistors, ranging in number from two to more than 100,000, are integrated together on pieces of silicon to produce integrated circuits that perform more complex functions. An integrated circuit contains transistors, capacitors, resistors and other parts packed at high density on one chip, all formed very small on a foundation of silicon.
Microprocessors
The first microprocessor, developed by Ted Hoff and his colleagues at Intel, contained approximately 2,300 transistors and could execute 60,000 operations per second. In 1973 Intel presented the first true general-purpose microprocessor, which contained around 4,500 transistors, could perform 200,000 operations per second, and was destined to be the central processor of many of the early home computers.
1973: The Intel Corporation delivers the first integrated circuit capable of executing a fully usable program, the Intel 8080. The microprocessor is born.
1977: The Apple Computer Company is started in a garage by two college dropouts, Steve Jobs and Steve Wozniak. The machine uses inexpensive parts and a home color television. A version of the BASIC programming language for such machines is supplied by Bill Gates of Microsoft.
1981: Microsoft provides the Disk Operating System (DOS) for the IBM Personal Computer.
Late 1980s: The Windows operating shell produced by Microsoft provides a Graphical User Interface (GUI) for users.
Computers can be generally classified by size and power, though there is considerable overlap:
• Personal computer: A small, single-user computer based on a microprocessor. In addition to the microprocessor, a personal computer has a keyboard for entering data, a monitor for displaying information, and a storage device for saving data.
• Workstation: A powerful, single-user computer. A workstation is like a personal computer, but it has a more powerful microprocessor and a higher-quality monitor.
• Minicomputer: A multi-user computer capable of supporting from 10 to hundreds of users simultaneously.
• Mainframe: A powerful multi-user computer capable of supporting many hundreds or thousands of users simultaneously.
• Supercomputer: An extremely fast computer that can perform hundreds of millions of instructions per second.
The actual machinery (wires, transistors, and circuits) is called the hardware; the instructions and data are called software. All general-purpose computers require the following hardware components:
• Memory: Enables a computer to store, at least temporarily, data and programs.
• Mass storage device: Allows a computer to permanently retain large amounts of data. Common mass storage devices include disk drives.
• Input device: Usually a keyboard and mouse, the input device is the conduit through which data and instructions enter a computer.
• Output device: A display screen, printer, or other device that lets you see what the computer has accomplished.
• Central processing unit (CPU): The heart of the computer, this is the component that actually executes instructions. (If the CPU is built around a microprocessor device, it is also referred to as a Microprocessor Unit, MPU.)
Computer Generations
• FIRST GENERATION (1945-1956)
  • operating instructions for a specific task
  • different machine languages for each computer
  • vacuum tubes
• SECOND GENERATION (1956-1963)
  • general purpose computers
  • transistors
  • printers, memory, disk, operating systems, programs
Computer Generations
• THIRD GENERATION (1965-1971)
  • integrated circuits
  • smaller
  • multi-programming
• FOURTH GENERATION (1971-Present)
  • 1981 – IBM's Personal Computer
  • wide area of usage
  • ten years later, 65 million PCs were being used
  • laptops, palmtops
  • Macintosh
Computer Generations
• FIFTH GENERATION (Future)
  • parallel processing
  • expert systems
  • superconductor technology