History of Computing
Some interesting videos • IBM http://www.youtube.com/watch?v=39jtNUGgmd4 • WATSON http://www.youtube.com/watch?v=6m9RzsBiobA • DEEP BLUE http://www.youtube.com/watch?v=NJarxpYyoFI
Pre 1642 • 3000 BC – The abacus is invented in Babylonia, extending short-term memory beyond the 4 to 7 items people can hold in mind • 800 AD – The Chinese begin to use zero. • From 800 to 1641, major advances happen in mathematics – algebra, geometry, trigonometry, logarithms, etc.
ANNOUNCEMENTS • Class is canceled for Friday 4/22 • Final Exam is Friday 4/29 • Wed 4/27 will be REVIEW
Why was math important? • Money – to keep track of the transfer and payment of goods • Predictability – rising and falling costs, expected delivery of goods, life expectancy, when to plant and harvest, when animals would migrate (for hunting), planning your life • Location – finding your position at sea • Time – knowing when as well as where. Early math: the calculation of the Earth's circumference using sticks, shadows, and distance http://www.youtube.com/watch?v=4q3Efu8fGWQ
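A minimal sketch (Python, not part of the original slides) of that shadow-and-distance idea: the shadow angle at one city equals the arc of the Earth between it and a city where the sun is directly overhead. The stick height, shadow length, and city spacing below are illustrative modern approximations of the figures attributed to Eratosthenes.

```python
import math

def circumference_from_shadow(stick_height_m, shadow_length_m, distance_between_cities_km):
    """Estimate Earth's circumference: the shadow angle at one city equals
    the arc between that city and a city where the sun casts no shadow."""
    angle_deg = math.degrees(math.atan(shadow_length_m / stick_height_m))
    return distance_between_cities_km * 360.0 / angle_deg

# Illustrative numbers close to those attributed to Eratosthenes (Alexandria/Syene):
# a 1 m stick casting a ~0.126 m shadow (about 7.2 degrees), cities ~800 km apart.
print(round(circumference_from_shadow(1.0, 0.126, 800.0)))  # roughly 40,000 km
```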
Why was zero important? When anyone thinks of one hundred, two hundred, or seven thousand, the image in his or her mind is of a digit followed by a few zeros. The zero functions as a placeholder; that is, three zeros denote that there are seven thousands, rather than only seven hundreds. Removing or adding a single zero drastically changes the amount. Just imagine having one zero erased from (or added to) your salary! Yet the number system we use today - Arabic, though it in fact came originally from India - is relatively new. For centuries people marked quantities with a variety of symbols and figures, and it was awkward to perform even the simplest arithmetic calculations with those number systems.
6000 years ago… The Sumerians (4000 BC) were the first to develop a counting system to keep an account of their stock of goods - cattle, horses, and donkeys, for example. The Sumerian system was positional; that is, the placement of a particular symbol relative to others denoted its value, so it implicitly needed a "place" for zero. The system was handed down to the Akkadians around 2500 BC and then to the Babylonians in 2000 BC. It was the Babylonians who first used a mark to signify that a number was absent from a column, just as the 0 in 1025 signifies that there are no hundreds in that number. Although zero's Babylonian ancestor was a good start, it would still be centuries before the symbol as we know it appeared, carrying two meanings: a placeholder within a number, and a number in its own right denoting the absence of quantity.
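A minimal sketch (Python, not from the slides) of why a positional system needs a placeholder: each place is worth the base times the place to its right, so an empty place must still be marked.

```python
def positional_value(digits, base=10):
    """Value of a digit sequence in a positional system: each place is worth
    'base' times the place to its right, so a zero keeps a place open."""
    value = 0
    for d in digits:
        value = value * base + d
    return value

# The 0 in 1025 signifies "no hundreds"; erase it and the value collapses.
print(positional_value([1, 0, 2, 5]))  # 1025
print(positional_value([1, 2, 5]))     # 125
```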
Tabulation – the importance of keeping records. Tycho Brahe http://www.nada.kth.se/~fred/tycho/index.html • No telescope • Recorded hundreds of thousands of observations of stars and planets, accurate to within about half a degree (a degree is 1/360 of a full circle), over decades • Discovered new stars, comets, and a supernova (an exploding star) • Again, no telescope • Showed that the heavens were not unchanging • Thought this work would aid timekeeping and navigation
Calculating latitude: where the sun sits above the horizon tells you how far north or south you are. Calculating longitude: you cannot tell how far east or west you are just by looking at the sun, because that depends on the time of day. However, if you knew the time in London at the moment the sun was directly overhead where you were, you would know how many hours of the Earth's rotation separated you from London. And since the Earth rotates once every 24 hours, you would know where on the Earth you were located. Hence the need for a precision clock - and clocks at sea could not keep accurate time, because their metal parts changed with temperature, moisture, and salt.
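The arithmetic behind that longitude trick, as a minimal sketch (Python, not from the slides; the function name and example time are illustrative):

```python
def longitude_from_noon(london_time_at_local_noon_hours, london_noon_hours=12.0):
    """If you note the London time at the moment the sun is highest where you are,
    each hour of difference is 360/24 = 15 degrees of longitude.
    A positive result means you are west of London."""
    return (london_time_at_local_noon_hours - london_noon_hours) * 15.0

# The sun is directly overhead while your London-set clock reads 16:00,
# so you are 4 hours behind London: about 60 degrees west.
print(longitude_from_noon(16.0))  # 60.0
```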
1642 - 1800 • 1642-1643 – Blaise Pascal invents the first gear-driven adding machine, the "Pascaline" Photo courtesy of Wikipedia • 1666 – Mechanical calculator that can add and subtract. • 1777 – Multiplying calculator invented. WHY? Not for ease of use… but because people couldn't be trusted, so a machine verified the calculations as "proof".
Industrial Revolution • 1801 – Jacquard Loom • A boom in the development of machinery led to the growth of factories and the mass production of goods. What was it about a loom? Patterns could be "remembered" as recipes punched into paper cards as holes.
Paper with holes? http://www.youtube.com/watch?v=MjywQADj4WY Early on, programmability just meant "repeatability".
Mechanization – machines began to do people's work. Granted, they multiplied people's strength (mills, plows), but what about people's finer abilities? The first computer scientists were WATCHMAKERS. Automatons: http://www.youtube.com/watch?v=MQMOPmpZ45Y http://www.youtube.com/watch?v=_zCYhDx8bes http://www.lovelyfineantiques.com/automatasuk.htm http://www.youtube.com/watch?v=y9RZz8QwkQM&NR=1&feature=fvwp
One of the greatest mechanizations in history: the marine chronometer, which managed to keep nearly perfect time regardless of temperature and moisture.
Charles Babbage • 1821 – Difference Engine • Used numbered wheels and worked with gears like a clock. The user set up an equation's starting values and turned a crank until the answer appeared. Only a prototype was built. • Image from: http://www.tcf.ua.edu/AZ/ITHistoryOutline.htm
Charles Babbage • Charles Babbage's greatest claim to fame is that he didn't build the world's first computer - although he sure tried hard enough. • Babbage was something of a zealot in the cause of mathematical accuracy - this was a man who once wrote to poet Alfred Lord Tennyson and demanded he change the lines: "Every moment dies a man, Every moment one is born" to "Every moment dies a man, Every moment one and one-sixteenth is born". • But in spite of his eccentricities (Babbage also nurtured an almost pathological hatred of organ grinders), he almost, almost, became the first man to invent the modern computer. Babbage's big mistake was being born in an age which had the basic knowledge to design such a machine, but no technology with which to build it.
In 1822, fed up with the "intolerable labour and fatiguing monotony" involved in the endless calculation required for scientific tables (a task he believed ranked "among the lowest occupations of the human intellect"), Babbage came up with a preliminary model of his Difference Engine - a machine, he was sure, that would remove this drudgery forever. • It could tabulate a polynomial such as y = ax² + bx + c. Column 1 would be set from the values of a, b, c, and the initial value of x. One crank of the handle would place y in column 2. Another turn would place y in column 3 and the value of y for (x+1) in column 2. Another turn would place y in column 4, the value of y for (x+1) in column 3, and the value of y for (x+2) in column 2. …etc….
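The trick behind those repeated crank turns is the method of finite differences: for a quadratic the second difference is a constant, so every new table entry can be produced by additions alone. A minimal sketch in Python (not from the original slides; the function name and sample coefficients are illustrative):

```python
def difference_engine_quadratic(a, b, c, x0, steps):
    """Tabulate y = a*x^2 + b*x + c using only additions: for a quadratic,
    the second difference is the constant 2a, so each new value needs
    just two additions (one "turn of the crank")."""
    y = a * x0 * x0 + b * x0 + c   # initial value y(x0)
    d1 = a * (2 * x0 + 1) + b      # first difference y(x0+1) - y(x0)
    d2 = 2 * a                     # second difference (constant)
    table = [y]
    for _ in range(steps):
        y += d1        # advance the table by one entry
        d1 += d2       # update the first difference
        table.append(y)
    return table

# Tabulate y = 2x^2 + 3x + 1 for x = 0..5 without a single multiplication:
print(difference_engine_quadratic(2, 3, 1, 0, 5))  # [1, 6, 15, 28, 45, 66]
```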
A sucker for punishment, by 1833 Babbage had embarked on an even more ambitious project - his Analytical Engine, "a machine of the most general nature". Babbage's Analytical Engine was to be the world's first general-purpose programmable computer, a machine designed not just to solve one particular problem, but to carry out a range of calculations ordered by its operator. • Designed to include both a 'store' and a 'mill', the Analytical Engine's 'store' would retain up to 100 forty-digit numbers awaiting their turn at the 'mill'. Once operated on, results would be returned to the store, to be held until needed for further use or printed out. Babbage had quite accurately (if somewhat vaguely) described the modern computer's memory and processor.
Ada Byron, Countess of Lovelace, Babbage's confidant and cohort - and one of the few people to actually understand the potential of his machine - compared the Analytical Engine with the Jacquard loom: "We may most aptly say that the Analytical Engine weaves algebraical patterns just as the Jacquard-loom weaves flowers and leaves." • Even if building the Analytical Engine had been possible, the finished machine would have been a frightening - and dangerous - device. • Driven by steam, the Analytical Engine would have been roughly the size of a train engine, comprising an incredibly complex intermeshing of thousands of clockwork parts - the smallest imbalance in any of which would have caused the machine, at the very best, to shake itself to pieces.
Summary So Far • The exchange of money required numbers, a placeholder for zero, and calculations – the Sumerians • Charting natural occurrences required extensive tabulations and calculations – Tycho Brahe • Extensive repeated calculations required tabulating machinery – Charles Babbage • Repeated operations within machinery required programmability – the Jacquard Loom • Navigating the sea required precision instrumentation – watchmakers
1836 – 1890s – Information at a distance • 1836 – Samuel Morse invents Morse code, which uses two symbols (dashes and dots) to convey information • 1838 – The telegraph is demonstrated: true information at a distance • 1854 – George Boole introduces a system for logic and reasoning using ones and zeros. The presence or absence of an electrical signal is now a "language": Morse used two symbols to transmit information, and Boole used two values to reason with it.
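A minimal sketch (Python, not from the slides) of the point that two symbols are enough to carry text; the dot-to-1 / dash-to-0 mapping is an arbitrary illustrative choice:

```python
# A tiny subset of the Morse alphabet, just enough to show two symbols suffice.
MORSE = {"S": "...", "O": "---"}

def encode(text):
    """Encode text as dots and dashes, then as an on/off (1/0) pattern."""
    dots_and_dashes = " ".join(MORSE[ch] for ch in text)
    ones_and_zeros = dots_and_dashes.replace(".", "1").replace("-", "0")
    return dots_and_dashes, ones_and_zeros

print(encode("SOS"))  # ('... --- ...', '111 000 111')
```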
1890 Census: good governance led to data processing needs • Herman Hollerith's Electric Tabulating System wins the competition to compute the 1890 census. • 1893 – The first four-function calculator is introduced. • 1896 – Hollerith establishes the Tabulating Machine Company. Photo: IBM
1900 – 1931 – People become comfortable with electricity and communications in their homes • 1907 – First regular radio broadcasts from New York. • 1927 – First demonstration of TV in the US, with sound transmitted over telephone lines. • 1929 – Color television signals transmitted. • 1931 – Scan-tron is created
1931 - 1940 • 1936-1938 – Konrad Zuse builds the Z1, the first program-controlled binary (mechanical) calculator. Photo: http://www.bnv-gz.de/ • 1937 - Turing's paper presents ideas on computability. • 1940 - Zuse builds the Z2, the first fully functioning electromechanical computer. • 1940 – First color television broadcast.
1941 – 1944 – The survival of civilization • December 5, 1941 – Zuse's Z3, a complete, fully functional, program-controlled electromechanical computer. It has a 64-word memory and takes 3 seconds per multiplication. • 1944 – Colossus helps break the German Lorenz cipher (Enigma had been attacked since 1940 with the electromechanical Bombe) http://www.youtube.com/watch?v=VaPK2JwYIuA&feature=player_embedded#at=85 Photo: http://www.picotech.com/applications/colossus.html
Alan Turing • Had all the tools, and a real need (WWII) - http://www.youtube.com/watch?v=XGCs3Q9Friw&feature=related • It can truthfully be said that Alan Turing's codebreaking work saved millions of lives
1943 • 1943 – ENIAC contracted for use in calculating ballistic tables • Weighed 30 tons • 30’ x 50’ space • 18,000 vacuum tubes • 360 multiplications performed per second Photo: Wikipedia
1944 • 1944 – Mark I – uses punched paper tape for its program and electromechanical relays and counters to solve problems • Had everything but miniaturization Photo from Wikipedia
1945 - 1950 • June 1945 – The stored-program computer idea is introduced by John von Neumann. • September 9, 1947 – First computer "bug" found at 15:45: a moth. http://tafkac.org/faq2k/compute_86.html • 1950 – The Turing test for machine intelligence is published.
1951 • March 31, 1951 – UNIVAC I delivered to the Census Bureau • 16,000 pounds • 5,000 vacuum tubes • 6,000 calculations per second • $159,000 • June 16, 1951 – First programming error occurs at the Census Bureau. Photo of UNIVAC courtesy of Wikipedia • 1951 – First American computer to implement the stored-program concept is completed.
1952 - 1956 • 1952 – UNIVAC I predicts the outcome of the presidential election with only 5% of the votes in. • 1954 – First line printer developed. • 1956 – First keyboard used to input information directly into a computer. • 1956 – First high-level programming language, FORTRAN, is invented.
1957 - 1962 • 1957 – First photograph is scanned, processed, and redisplayed by a computer. • 1958 – Second generation of computers (transistor computers) is produced; the first generation were those made with vacuum tubes. • 1959 – First commercial copy machine introduced by Xerox. • 1962 – First department of Computer Science established, at Purdue (Stanford follows a few years later). • 1962 – First video game (Spacewar!) invented at MIT.
1963 - 1969 • 1963 – Joseph Weizenbaum (MIT) develops Eliza. • 1964 – The mouse is invented. • 1964 – Third generation of computers emerges – the integrated circuit • 1968 – Federal Information Processing Standard encourages use of the six-digit date format YYMMDD. • 1969 – ARPANET – a network of researchers at UCLA, UC Santa Barbara, SRI, and the U. of Utah goes live – the Internet is born • 1969 – UNIX is created at AT&T Bell Labs.
1970 - 1972 • 1970 – Floppy disks and daisy-wheel printers debut. • 1970 – Fourth generation of computers – large-scale integrated circuits: 15,000 circuits on a chip as opposed to 1,000 in the third generation. • 1972 • Handheld calculators are introduced and make the slide rule obsolete. • Atari is founded. • An email program is created and sends messages across ARPANET.
1973 - 1976 • 1973 – Xerox PARC and Alan Kay develop a PC that uses icons, graphics, and a mouse. • 1974 – Xerox PARC develops the first WYSIWYG application, Bravo. • 1975 – IBM introduces the laser printer. • 1975 – Xerox PARC – Ethernet and the first LAN developed. • 1976 – IBM develops the ink jet printer. • 1976 – Steve Jobs & Steve Wozniak build the Apple I.
1977 - 1980 • 1977 – Apple II announced and produced with 16K RAM, 16K ROM, $1298. • 1978 – WordStar, an early and widely used word processor. • 1979 – First spreadsheet, VisiCalc • 1979 – Work begins on the early PC database program dBASE II. • 1979 – Cell phones tested in Japan and Chicago. • 1979-80 – Pac-Man and other early video games appear. • 1980 – IBM selects PC-DOS from a then-unknown company, Microsoft, as the OS for its new PC.
1981 - 1982 • 1981 – First commercially successful portable computer, the Osborne I: 23 pounds, 64K RAM. • 1982 – Commodore 64: 64K RAM, 20K ROM, $595. • 1982 – Sony announces CD technology. • 1982 – Commercial email across 25 cities begins. • 1982 – Term Internet first assigned to a group of networked computers. • 1982 – 3,275,000 PCs sold, up from 300,000 in 1981.
1983 - 1987 • 1983 – Lotus 1-2-3 combines spreadsheet, graphics, and database functions in one package. • 1984 – Apple's Macintosh unveiled with a Graphical User Interface. • 1985 – Speeds reach 1 billion operations per second. • 1985 – Windows jumps on the GUI bandwagon. • 1986 – The Wall Street Journal writes about Computer-Aided Software Engineering.
1988 - 1990 • 1988 – Internet Relay Chat debuts. • 1989 – 1,000 hospital computer systems fail 2^15 (32,768) days after Jan. 1, 1900, when their 16-bit day counters overflow • 1990 - • 54 million computers in the US. • MS Windows 3.0 introduced. • First commercially available dial-up Internet access. • ARPANET decommissioned.
1991 - 1992 • 1991 • Mobile, pen-based computers that can read handwriting are introduced. • World Wide Web standards are released that describe a framework for linking documents on different computers. • 1992 • Microsoft sells 3 million copies of Windows 3.1 in the first two months. • Prodigy serves 1 million subscribers. • A PDA is announced by Apple.
1993 - 1997 • September 1994 – The Netscape web browser becomes available. • 1995 – Toy Story, the first feature film fully generated by computer. • August 1995 – Windows 95 is launched. • 1995 – eBay sells its first item • 1996 – Heavy traffic on the Internet causes the "World Wide Wait". • 1997 – Worldwide: • 50 million World Wide Web users • 15 million Internet host computers • 150 countries on the Internet.
1997 – 2000 • 1997 – Deep Blue beats world chess champion Garry Kasparov in a six-game match. • 1998 – Google Inc. is founded • 1999 – Y2K worries cause companies to spend an estimated $500 billion worldwide. • 2000 – Dot com goes dot bust – the growth of Internet companies comes to a screeching halt.
2001 • 2001 – Microsoft introduces XP as its newest operating system. • 2001 – Mac OS X, the first Unix-based Mac OS, is released. • 2001 – Wikipedia is launched
2002 - 2010 • 2004 – Facebook debuts its first website • 2004 – Mozilla releases Firefox • 2005 – Apple announces the Mac will switch to Intel processors. • 2007 – Microsoft releases Windows Vista (Windows 7 follows in 2009) • 2007 – Apple introduces the iPhone • 2010 – Apple introduces the iPad
Lots of history, but these are trends that YOU can watch now: • Moore's Law: the number of transistors on a chip (and, roughly, computing power) doubles about every 18 months to two years, a trend that has held for decades. • Miniaturization: some circuit layers are now only a few atoms thick, and features will keep shrinking. • Wires are becoming unnecessary; more and more connectivity is wireless. • Speed: switching a logic state and detecting the change now takes mere picoseconds, with signals limited only by the speed of light. • Implantable computers may overcome the huge time scale of biological evolution. • The Turing Test may soon be passed by a computer: a conversation indistinguishable from one with a human. Soon they will enslave us. Kidding.
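A quick back-of-the-envelope look at what a fixed doubling period implies, as a minimal sketch (Python, not from the slides; the 18-month figure is the slide's own):

```python
def growth_factor(years, doubling_period_years=1.5):
    """Growth implied by a fixed doubling period (1.5 years = the 18-month figure)."""
    return 2 ** (years / doubling_period_years)

# Twenty years of doubling every 18 months is roughly a ten-thousand-fold increase.
print(f"{growth_factor(20):,.0f}")  # about 10,000
```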
Additional Resources and Readings • http://www.comphist.org/ifip_report.php • http://www.computerhistory.org/timeline • http://plato.stanford.edu/entries/computing-history • Interesting people • http://www.computerhistory.org/fellowawards/hall • http://ei.cs.vt.edu/~history/people.html