History of Computers Tori Harbin 3rd period
Who invented the first computer? • In 1837 Charles Babbage, a British professor of mathematics, conceived the Analytical Engine, the first design for a stored-program mechanical computer. • Charles Babbage is known as the "Father of Computing". • In 1939 John V. Atanasoff and Clifford Berry developed the Atanasoff-Berry Computer (ABC). • The ABC computer was built by hand. • In 1943 the ENIAC (Electronic Numerical Integrator and Computer) was built. • The UNIVAC I (Universal Automatic Computer), manufactured by Remington Rand in the USA, was the first commercially available, "mass-produced" electronic computer; the first unit was delivered to the US Census Bureau in June 1951. • By the 1990s, the microcomputer or Personal Computer (PC) had become a common household appliance, and it became even more widespread with the advent of the Internet.
What were computers made of? • First Generation (c1950): These were built from electronic valve (vacuum tube) technology. Consequently they were large, consumed a great deal of electrical power, and were not reliable enough for continuous operation for days at a stretch. They were only suitable for enthusiasts and people who had to have them. • Second Generation (c1960): These were built from transistors, which were so much smaller, consumed so much less power, and were so much longer-lived than valves that these were the first computers that could be successfully marketed. • Third Generation (c1970): These were built with Large Scale Integration (LSI), which placed many transistors onto a single chip. This split the computer market into two distinct areas. The first used the technology to build much more powerful computers at the same size and price as before, still called "mainframes". The second used it to build small, cheap computers that offered the power of the previous generation of mainframes at a fraction of the size and cost. These were called minicomputers. • Fourth Generation (c1980): These were built with Very Large Scale Integration (VLSI), which placed many thousands of transistors onto a single chip. Again the market split in two: "mainframes" and "minicomputers" became much more powerful at the same size and price as before, while small, cheap machines offered the power of the previous generation of minicomputers at a fraction of the size and cost. These were called microcomputers. They were personal computers, since they were small and cheap enough that sitting idle when their owners weren't using them didn't matter.
• Fifth Generation (c1990): The Japanese caused some alarm in the US and UK by announcing their intention to develop a Fifth Generation based on ideas from Artificial Intelligence research. Unlike previous generations, this one was not marked by a clear hardware discontinuity, and the project in the end partly fell short of its original goals and was partly overtaken by events. No really good candidate for the Fifth Generation label has emerged yet, although one possibility is the linking of computers via the Internet and the World Wide Web, which has made them participants in a very large and rapidly growing worldwide information store. While the ideas and early implementations are decades old (the Internet began as the DARPA defence research network in the US and as JANET, the Joint Academic Network, in the UK), it was only in the 1990s that large numbers of the general public acquired personal computers with Internet and Web capability as a standard feature.
Prices of computers • As you can see, the price of computers has dropped drastically over the years. From 1999 to 2003 the index dropped over 20% every year. Lately the rate of decline has slowed, but prices are still falling: the index has decreased at a rate of 11-12% annually for the last 3 years. Remember, this is an index of prices, so it's not directly tied to a specific dollar amount for a specific item. It covers personal computers and peripheral equipment as a whole, so it combines everything in that category: very expensive computers, cheap computers, monitors, printers, etc. So while the overall index may have fallen from a 700 level to a 100 level, that doesn't mean any individual computer part you buy today will cost 1/7th of what it cost 10 years ago. For example, the cheapest Dell system today costs about $400, but the cheapest Dell 10 years ago was not 7 times that much. Overall, this kind of price trend matches reality pretty well. I remember that in the late 1980s an IBM PC would cost in the ballpark of $3,000, and I spent about $1,500 on my first real PC back around 1994. Today you can buy a decent Dell system for around $400. A printer today can be bought for as little as $25-$50 on sale, but 10 years ago they'd easily run $150-$200 minimum. From the index and from direct observation, it's clear that personal computer prices have been steadily decreasing over the past decade.
Uses of computers • Like other computers, personal computers can be instructed to perform a variety of functions. A set of instructions that tells a computer what to do is called a program. Today, more than 10,000 application programs are available for use on personal computers, including such popular categories as word processing, spreadsheet, database, and communication programs. • Word processing programs are used to type, correct, rearrange, or delete text in letters, memos, reports, and school assignments. • Spreadsheet programs enable individuals to prepare tables easily. The user establishes rules for handling large groups of numbers; for example, a person can enter some numbers into a table and the program will calculate and fill in the rest. When the user changes one number in the table, the other numbers change according to the rules that user established. Spreadsheets may be used for preparing budgets and financial plans, balancing a chequebook, or keeping track of personal investments. • Database programs allow a computer to store large amounts of data (information) in a systematic way. Such data might include the name, address, telephone number, salary, and starting date of every employee in a company; the computer could then be asked to produce a list of all employees who receive a certain salary. • Communication programs connect a personal computer to other computers, so people can exchange information with one another via their personal computers. They also enable people to link their personal computers with databanks: huge collections of information stored in large centralized computers. News, financial and travel information, and other data of interest to many users can be obtained from a databank.
Other programs include recreational and educational programs for playing games, composing and hearing music, and learning a variety of subjects. Programs have also been written that turn household appliances on and off. Some people develop their own programs to meet needs not covered by commercially prepared programs. Others buy personal computers mainly to learn about computers and how to program them.
How much electricity does the computer use? • A typical desktop computer uses about 65 to 250 watts. To find the figure for your particular computer you can contact the manufacturer (not me), or see my section on measuring electrical use. • Add another 17-72 watts for an LCD monitor, or about 80 watts if you have an old-school 17" CRT. Don't forget related devices: my cable modem uses 7 watts, my D-Link DI-604 router uses 4.5 watts, and my Motorola phone box for use with Vonage uses 2 watts while idle (3 when I'm on the phone). • Most laptop computers use about 15-45 watts, far less than desktops. • With most devices you can look at the label to see how much energy they use, but that doesn't work so well with computers, because the label gives the theoretical maximum, not the typical amount used. A computer whose label or power supply says 300 watts might use only about 70 watts when it's actually running, and only 100 even at peak times with serious number-crunching and all the drives spinning. • As long as your computer goes into sleep/standby when you're not using it, it doesn't use much electricity at all compared to the rest of your household. You'll save a lot more energy by addressing your heating, cooling, and lighting use than by obsessing over your computer. For most people, their computers' energy use is not a significant portion of their total use, even if they use their computers a lot. Of course, you should absolutely make sure your computer is set to sleep automatically when you're not using it, because it's silly to waste energy, but your computer likely isn't even close to being the biggest energy-waster in your home.
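As a rough sketch of the arithmetic above, you can total the typical wattages quoted for one setup. The specific values are the illustrative figures from the text, not measurements of any particular machine:

```python
# Rough estimate of a desktop setup's total draw, using the
# typical figures quoted above (all values are illustrative).
device_watts = {
    "desktop_pc": 70,   # a "300 W" power supply often draws ~70 W in practice
    "lcd_monitor": 45,  # LCDs range roughly 17-72 W
    "cable_modem": 7,
    "router": 4.5,
}

total_watts = sum(device_watts.values())
print(f"Estimated total draw: {total_watts} W")
# → Estimated total draw: 126.5 W
```

Swap in your own measured figures from a watt-hour meter to make the estimate meaningful for your setup.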
How much does it cost to run your computer? • To calculate your costs, use this formula: (Watts × Hours Used × Cost per kilowatt-hour) ÷ 1000 = Total Cost. • For example, let's say you have a big high-end computer with a gaming-level graphics card and an old CRT monitor, and you leave them on 24/7. That's about 330 watts × 24 hours × 365 days/yr = 2,890,800 watt-hours, or 2,891 kilowatt-hours. If you're paying $0.20 per kWh, you're paying $578 a year to run your computer. • Let's try a different example: You have a computer that's less of an energy hog, like an iMac G5 20", which uses about 105 watts, and you're smart enough to turn it off when you're not using it. You use it for two hours a day, five days a week. That's ten hours a week, or 520 hours a year. So your 105 watts times 520 hours = 54,600 watt-hours. Divide by 1000 and you have 55 kilowatt-hours (kWh). If you're paying 10¢ per kilowatt-hour, then you're paying $5.50 a year to run your computer. • That's quite a range, $5.50 to $578 a year. It really depends on what kind of computer it is, how much you use it, and your local rate for electricity, and especially whether you turn off the computer when you're not using it (or at least sleep it). Both of the examples above are extremes. I used to have only one example somewhere in the middle, but then I'd see people on blogs and forums misquoting it by writing, "Mr. Electricity says a computer costs about $150/yr. to run." No, that is not what I said. I said that was just an example. Your situation is almost certainly different, and you need to consider all the variables, like what kind of computer it is, how much you use it, and most especially whether you leave it running all the time or sleep it when you're not using it.
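The formula above is easy to sketch as a small function. The numbers below are the two worked examples from the text (the second comes out to $5.46 before the text's rounding of 54.6 kWh up to 55 kWh):

```python
def annual_cost(watts, hours_per_year, dollars_per_kwh):
    """(Watts x Hours Used x Cost per kWh) / 1000 = Total Cost."""
    kilowatt_hours = watts * hours_per_year / 1000
    return kilowatt_hours * dollars_per_kwh

# Example 1: 330 W gaming rig + CRT, on 24/7, at $0.20/kWh
always_on = annual_cost(330, 24 * 365, 0.20)
print(f"${always_on:.0f}/yr")  # → $578/yr

# Example 2: 105 W iMac G5, 10 hours/week (520 hrs/yr), at $0.10/kWh
light_use = annual_cost(105, 520, 0.10)
print(f"${light_use:.2f}/yr")  # → $5.46/yr
```

Plug in your own wattage, hours, and local rate; the spread between the two examples shows why no single "average" figure is meaningful.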
Energy required to make a computer • One paper estimated that it took about 6,400 megajoules of electricity to make a desktop computer and a 17" CRT monitor in 2000, which would be about 1,778 kWh, or as much electricity as the typical household uses in two months. Yet another reason to buy used.
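The unit conversion behind that figure is simple: one kilowatt-hour is 3.6 megajoules, so the paper's estimate converts as follows:

```python
# Converting the manufacturing-energy estimate from MJ to kWh.
# 1 kWh = 3,600,000 J = 3.6 MJ, so kWh = MJ / 3.6.
megajoules = 6400
kilowatt_hours = megajoules / 3.6
print(round(kilowatt_hours))  # → 1778
```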
Sleep and screensavers • When your computer sleeps (aka "standby" or "hibernate"), it uses 0-6 watts. (So does the monitor.) You can set your computer to sleep automatically after a certain amount of idle time. Setting your computer to auto-sleep is the best and easiest way to save on computer energy use! Here's how to do it: in Windows XP, go to Start > Control Panel > Power Options; on a Mac, go to System Preferences > Energy Saver. • It's not terribly important to understand the difference between Sleep, Standby, and Hibernate. In a nutshell, hibernate saves your workspace (all the open windows) and then turns your computer off, so it saves more energy than standby, but a hibernating computer takes longer to wake up. For the curious, I have a separate article about the differences between Sleep, Standby, and Hibernate. • A screensaver that shows any image on the screen doesn't save any energy at all; you save energy only if the monitor goes dark by going to sleep. If you turn the monitor off at the switch, it will use 0 to 10 watts. (Some electronic equipment draws a small amount of energy even when it's switched off.)
Specific Models • Macs have generally used less energy than PCs, though I haven't tested any Macs since Apple made the switch to Intel processors. If any Mac users would like to send me their readings, tested with a watt-hour meter, I'll be happy to publish the numbers here. • Below is how much energy Dell computers use, according to their website in 2006, and how much my Apple iMac G5 uses based on my measurements. I would have included some HP desktops in the table, but HP hasn't bothered to update its website with the energy consumption for current models in over four years. For Dell models not listed below, try a Google search using this format: site:dell.com b110 watts (replace the "b110" with the model you're searching for, of course). Don't write to me to ask how much your particular computer uses, because I didn't make your computer and, unlike you, I don't have access to it. Contact the manufacturer or buy a watt-hour meter.

Dell Desktops (watts): Maximum / Minimum / Sleep / Off
Dimension B110 (Pentium 4 520): 112 / 60 / 3.0 / 1.7
Optiplex GX620 (Pentium 4 630): 127 / 72 / 1.3 / 2.5
Dimension E310 (Pentium 4, 2.8GHz): 132 / 71 / 1.7 / 1.7
Optiplex 170L (Pentium 4, 3.2GHz): 163 / 80 / 3.7 / 2.2
Dimension E510 (Pentium 4 551): 165 / 106 / 1.3 / 0.7
Dimension XPS 600: 200 / 142 / 5.5 / 4.5
Dimension XPS 400 (Pentium 4 551): 258 / 149 / 2.0 / 1.0

Apple iMac G5 w/built-in 20" LCD screen (watts)
Doing nothing: 97
Monitor dimmed: 84
Monitor sleep: 62
Copying files: 110
Watching a DVD: 110
Opening a bunch of pictures: 120
Computer sleep: 3.5

• Yes, it doesn't make sense that the GX620 is listed as using more power when it's off than when it's sleeping, but I'm just reprinting the numbers from Dell's specs. The University of Pennsylvania has a somewhat more recent list of Mac / PC wattage.
Whether to use a laptop to save energy • Some people think it's a bad idea to replace desktops with laptops, even though laptops use less energy, because laptops are more likely to require repair, those repairs are more expensive than desktop repairs, many users therefore choose to replace a broken laptop rather than getting it fixed, and laptops require disposal of chemical-laden batteries when they wear out. While these things are true, I think the average person (and the environment) will still come out ahead overall by using a laptop instead of a desktop, because only a fraction of laptops will actually break and get replaced. If every laptop failed like this during its life (or even if most of them did), we could easily say that it would be better to stick with desktops. But since only a fraction of laptops fail, I still think there's a net savings from using laptops.
Websites • http://www.bigsiteofamazingfacts.com/who-invented-the-first-computer • http://www.mi-is.be/be_en/05/index.html • http://www.catonmat.net/blog/donald-knuths-first-computer/ • http://www.dai.ed.ac.uk/homes/cam/fcomp.shtml • http://pc-desktop-computers.com/images/float/computer-parts-wholesale.jpg • http://www.freeby50.com/2009/04/cost-of-computers-over-time.html • http://i.ehow.com/images/a06/04/f4/create-price-tag-200X200.jpg • http://library.thinkquest.org/C007091/uses.htm • http://www.vforteachers.com/images/person%20at%20computer%20could%20be%20teacher.JPG • http://michaelbluejay.com/electricity/computers.html