AND THE INTERNET By: Katie M. 4th Period
The Creation of the First Computer The Z1, created by Germany's Konrad Zuse in his parents' living room between 1936 and 1938, is considered to be the first electrically driven binary programmable computer.
History of the Term "Debug" In 1947, Grace Murray Hopper was working on the Harvard University Mark II Aiken Relay Calculator (a primitive computer). On September 9, 1947, when the machine was experiencing problems, an investigation showed that a moth was trapped between the points of Relay #70 in Panel F. The operators removed the moth and affixed it to the log. The entry reads: "First actual case of bug being found." The word went out that they had "debugged" the machine, and the term "debugging a computer program" was born. Although Grace Hopper was always careful to admit that she was not there when it actually happened, it was one of her favorite stories.
History of the Internet In August 1968, after Lawrence Roberts and the DARPA-funded community had refined the overall structure and specifications for the ARPANET, DARPA released an RFQ for the development of one of the key components, the packet switches. Because of Leonard Kleinrock's early development of packet-switching theory and his focus on analysis, design, and measurement, his Network Measurement Center at UCLA was selected to be the first node on the ARPANET. All this came together in September 1969, when BBN installed the first IMP at UCLA and the first host computer was connected. Two more nodes were added at UC Santa Barbara and the University of Utah. Thus, by the end of 1969, four host computers were connected together into the initial ARPANET, and the budding Internet was off the ground. Computers were added quickly to the ARPANET during the following years, and work proceeded on completing a functionally complete host-to-host protocol and other network software. In October 1972, Robert Kahn organized a large, very successful demonstration of the ARPANET at the International Computer Communication Conference, the first public demonstration of this new network technology. From there, email took off as the largest network application for over a decade. This was a harbinger of the kind of activity we see on the World Wide Web today, namely, the enormous growth of all kinds of "people-to-people" traffic.
Babbage's Analytical Engine Charles Babbage (1791-1871), computer pioneer, designed two classes of engine: Difference Engines and Analytical Engines. Difference Engines are so called because of the mathematical principle on which they are based, the method of finite differences. The beauty of the method is that it uses only arithmetical addition and removes the need for multiplication and division, which are more difficult to implement mechanically. Difference Engines are strictly calculators: they crunch numbers the only way they know how, by repeated addition according to the method of finite differences, and they cannot be used for general arithmetical calculation. The Analytical Engine is much more than a calculator and marks the progression from the mechanized arithmetic of calculation to fully fledged general-purpose computation. There were at least three designs at different stages of the evolution of his ideas, so it is strictly correct to refer to the Analytical Engines in the plural.
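To see the principle in action, here is a short Python sketch (my own illustration, not Babbage's mechanism or notation) that tabulates the polynomial f(x) = x^2 + x + 1 using nothing but repeated addition, just as the method of finite differences promises:

    def difference_engine(initial_values, steps):
        # initial_values holds f(0) and its finite differences.
        # For f(x) = x^2 + x + 1: f(0) = 1, first difference = 2,
        # second difference = 2 (constant for a quadratic).
        registers = list(initial_values)
        table = []
        for _ in range(steps):
            table.append(registers[0])
            # Each register absorbs the one below it: addition only,
            # no multiplication or division anywhere.
            for i in range(len(registers) - 1):
                registers[i] += registers[i + 1]
        return table

    print(difference_engine([1, 2, 2], 6))  # [1, 3, 7, 13, 21, 31]

Each pass produces the next value of the polynomial from the running differences, which is why a purely mechanical adder could tabulate it.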
Herman Hollerith's Tabulating Machine After receiving his Engineer of Mines degree at age 19, Hollerith worked on the 1880 US census, a laborious and error-prone operation that cried out for mechanization. After some initial trials with paper tape, he settled on punched cards (pioneered in the Jacquard loom) to record information, and designed special equipment, a tabulator and sorter, to tally the results. His designs won the competition for the 1890 US census, chosen for their ability to count combined facts. These machines reduced a ten-year job to three months, saved the 1890 taxpayers five million dollars, and earned him an 1890 Columbia PhD. This was the first wholly successful information-processing system to replace pen and paper. Hollerith's machines were also used for censuses in Russia, Austria, Canada, France, Norway, Puerto Rico, Cuba, and the Philippines, and again in the US census of 1900.
ENIAC ENIAC, short for Electronic Numerical Integrator And Computer, was the first general-purpose electronic computer. It was a Turing-complete digital computer capable of being reprogrammed to solve a full range of computing problems. ENIAC was designed to calculate artillery firing tables for the United States Army's Ballistic Research Laboratory, but its first use was in calculations for the hydrogen bomb. It boasted speeds one thousand times faster than those of electro-mechanical machines, a leap in computing power that no single machine has since matched. This mathematical power, coupled with general-purpose programmability, excited scientists and industrialists. ENIAC's design and construction were financed by the United States Army during World War II. The construction contract was signed on June 5, 1943, and work on the computer began in secret at the University of Pennsylvania's Moore School of Electrical Engineering the following month under the code name "Project PX". The completed machine was unveiled on February 14, 1946 at the University of Pennsylvania, having cost almost $500,000. ENIAC was shut down on November 9, 1946 for a refurbishment and a memory upgrade, and was transferred to Aberdeen Proving Ground, Maryland, in 1947. There, on July 29, 1947, it was turned on, and it remained in continuous operation until 11:45 p.m. on October 2, 1955.
Programs
• A plan for the programming of a mechanism (as a computer)
• A sequence of coded instructions that can be inserted into a mechanism (as a computer)
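To make the second definition concrete, here is a tiny Python sketch; the circle-area task and the variable names are invented purely for illustration:

    import math

    # A "sequence of coded instructions that can be inserted into a
    # mechanism": three instructions carried out in order.
    radius = 2.0                   # instruction 1: store the input
    area = math.pi * radius ** 2   # instruction 2: compute the area
    print(f"Area: {area:.2f}")     # instruction 3: report the result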
Benefits of the Transistor over the Vacuum Tube
• Smaller size (despite continuing miniaturization of vacuum tubes)
• Highly automated manufacture
• Lower cost (in volume production)
• Lower possible operating voltages (but vacuum tubes can operate at higher voltages)
• No warm-up period (most vacuum tubes need 10 to 60 seconds to function correctly)
• Lower power dissipation (no heater power, very low saturation voltage)
• Higher reliability and greater physical ruggedness (although vacuum tubes are electrically more rugged; the vacuum tube is also much more resistant to nuclear electromagnetic pulses (NEMP) and electrostatic discharge (ESD))
• Much longer life (vacuum tube cathodes are eventually exhausted and the vacuum can become contaminated)
• Complementary devices available (allowing circuits with complementary symmetry: vacuum tubes with a polarity equivalent to PNP BJTs or P-type FETs are not available)
• Ability to control large currents (power transistors are available to control hundreds of amperes; vacuum tubes able to control even one ampere are large and costly)
• Much less microphonic (vibration can modulate vacuum tube characteristics, though this may contribute to the sound of guitar amplifiers)
"Nature abhors a vacuum tube." Myron Glass (see John R. Pierce), Bell Telephone Laboratories, circa 1948
The Micro-Computer A microcomputer is a computer with a microprocessor as its central processing unit. Microcomputers are physically small compared to mainframes and minicomputers. The term "microcomputer" came into popular use after the introduction of the minicomputer, although Isaac Asimov had used it in his short story "The Dying Night" as early as 1956 (published in The Magazine of Fantasy and Science Fiction in July of that year). Most notably, the microcomputer replaced the many separate components that made up the minicomputer's CPU with a single integrated microprocessor chip. The earliest models, such as the Altair 8800, were often sold as kits to be assembled by the user and came with as little as 256 bytes of RAM and no input/output devices other than indicator lights and switches, but they were useful as a proof of concept to demonstrate what such a simple device could do.
The Binary System The word "binary" describes a system that has only two possible digits. To understand this, let's first compare it to a system you're probably more familiar with, the decimal system. The word "decimal" describes a system that has ten possible digits, the digits 0 through 9. Every number expressed in the decimal system is a combination of these ten digits. You use the decimal system every day; it comes naturally, since we all have 10 fingers and 10 toes (unless your family tree doesn't fork, but let's not go there), and some of us use those 10 fingers and toes extensively to help with everyday addition and subtraction. The binary system works essentially the same way, the only difference being that it has just two digits, visually expressed as 0 and 1. Every number expressed in the binary system is a combination of these two digits.
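As a quick illustration (this helper function is my own sketch, not from the slide's source), the same string of digits names different numbers depending on the base it is read in:

    def to_decimal(digits, base):
        # Interpret a digit string in the given base, leftmost digit first.
        value = 0
        for d in digits:
            value = value * base + int(d)
        return value

    print(to_decimal("1011", 2))   # 11   -- the binary reading
    print(to_decimal("1011", 10))  # 1011 -- the decimal reading
    print(bin(11))                 # '0b1011', Python's built-in check

Reading "1011" in binary means 1*8 + 0*4 + 1*2 + 1*1 = 11, while the decimal reading means 1*1000 + 0*100 + 1*10 + 1*1 = 1011.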
Resources • http://www.computerhope.com/issues/ch000984.htm • http://www.google.com/imgres?imgurl=http://augmentedrealitywiki.com/images/c/c6/Konrad-Zuse.jpg&imgrefurl • http://www.google.com/imgres?imgurl=http://www.german-way.com/imagesGW/Konrad_Zuse.gif&imgrefurl • http://www.google.com/imgres?imgurl=http://www.rtd-net.de/Zuse.GIF&imgrefurl • http://www.google.com/imgres?imgurl=http://www.installaware.com/installaware/debug.gif&imgrefurl • http://www.jamesshuggins.com/h/tek1/first_computer_bug.htm • http://www.jamesshuggins.com/h/bas1/credits.htm#firstcomputerbug • http://www.computerhistory.org/babbage/engines/ • http://www.columbia.edu/acis/history/hollerith.html • http://en.wikipedia.org/wiki/ENIAC • http://www.merriam-webster.com/dictionary/program • http://en.wikipedia.org/wiki/Microcomputer • http://www.edaboard.com/thread81894.html • http://www.pcnineoneone.com/howto/binary1.html