The American University in Cairo Computer Science and Engineering Department CSCE 106 Fundamentals of Computer Science Assisting Slides
Origins of Computing Machines • Early computing devices • Abacus: positions of beads represent numbers • Gear-based machines (1600s-1800s) • Positions of gears represent numbers • Blaise Pascal, Wilhelm Leibniz, Charles Babbage
Early Data Storage • Punched cards • First used in the Jacquard Loom (1801) to store patterns for weaving cloth • Storage of programs in Babbage’s Analytical Engine • Popular through the 1970s • Gear positions
Early Computers • Based on mechanical relays • 1940: Stibitz at Bell Laboratories • 1944: Mark I: Howard Aiken and IBM at Harvard • Based on vacuum tubes • 1937-1941: Atanasoff-Berry at Iowa State • 1940s: Colossus: secret British machine for breaking German codes • 1940s: ENIAC: Mauchly & Eckert at U. of Penn.
Personal Computers • First used by hobbyists • IBM introduced the PC in 1981. • Accepted by business • Became the standard hardware design for most desktop computers • Most PCs use software from Microsoft
Terminology • Hardware: Equipment • Software: Programs and algorithms
Computer Architecture • Central Processing Unit (CPU) or processor • Arithmetic/Logic unit versus Control unit • Registers • General purpose • Special purpose • Bus • Motherboard
Memory Terminology • Random Access Memory (RAM): Memory in which individual cells can be easily accessed in any order • Dynamic Memory (DRAM): Volatile RAM whose cells must be refreshed periodically to retain their contents
Main Memory Addresses • Address: A “name” that uniquely identifies one cell in the computer’s main memory • The names are actually numbers. • These numbers are assigned consecutively starting at zero. • Numbering the cells in this manner associates an order with the memory cells.
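As a rough sketch of this idea (Python is used here purely for illustration, not as part of the slides), a list indexed from zero behaves much like byte-addressable memory:

    # Model main memory as a row of byte-sized cells; the list index acts as
    # the address, assigned consecutively starting at zero.
    memory = [0] * 8            # eight cells with addresses 0 through 7
    memory[3] = 0b01010101      # store a bit pattern in the cell at address 3
    print(memory[3])            # read it back by address -> 85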
Bits and Bit Patterns • Bit: Binary Digit (0 or 1) • Bit Patterns are used to represent information. • Numbers • Text characters • Images • Sound • And others
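A small illustrative example (again Python, chosen only for readability): the same 8-bit pattern can be read as a number or as a text character, depending on the convention applied to it.

    pattern = 0b01000001        # one 8-bit pattern
    print(pattern)              # interpreted as a number        -> 65
    print(chr(pattern))         # interpreted as an ASCII letter -> 'A'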
Main Memory Cells • Cell: A unit of main memory (typically 8 bits which is one byte) • Most significant bit: the bit at the left (high-order) end of the conceptual row of bits in a memory cell • Least significant bit: the bit at the right (low-order) end of the conceptual row of bits in a memory cell
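A short sketch of picking out the two ends of an 8-bit cell (Python, illustrative only):

    cell = 0b10110100           # the contents of one 8-bit memory cell
    msb = (cell >> 7) & 1       # most significant bit, at the left end   -> 1
    lsb = cell & 1              # least significant bit, at the right end -> 0
    print(msb, lsb)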
Measuring Memory Capacity • Kilobyte: 2^10 bytes = 1,024 bytes • Example: 3 KB = 3 times 1,024 bytes • Sometimes “kibi” rather than “kilo” • Megabyte: 2^20 bytes = 1,048,576 bytes • Example: 3 MB = 3 times 1,048,576 bytes • Sometimes “mebi” rather than “mega” • Gigabyte: 2^30 bytes = 1,073,741,824 bytes • Example: 3 GB = 3 times 1,073,741,824 bytes • Sometimes “gibi” rather than “giga”
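The arithmetic above can be checked directly; this small Python snippet (illustration only) evaluates the three examples:

    KB, MB, GB = 2**10, 2**20, 2**30
    print(3 * KB)               # 3 KB = 3,072 bytes
    print(3 * MB)               # 3 MB = 3,145,728 bytes
    print(3 * GB)               # 3 GB = 3,221,225,472 bytes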
Mass Storage • On-line versus off-line • Typically larger than main memory • Typically less volatile than main memory • Typically slower than main memory
Mass Storage Systems • Magnetic Systems • Disk • Tape • Optical Systems • CD • DVD • Flash Drives
Representing Text • Each character (letter, punctuation, etc.) is assigned a unique bit pattern. • ASCII: Uses 7-bit patterns to represent most symbols used in written English text • Unicode: Uses 16-bit patterns to represent the major symbols used in languages worldwide • ISO standard: Uses 32-bit patterns to represent most symbols used in languages worldwide
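A brief illustration (Python, chosen only for convenience) of characters mapping to code values and bit patterns:

    text = "Hi"
    print([ord(c) for c in text])                  # code values -> [72, 105]
    print([format(ord(c), "07b") for c in text])   # the 7-bit ASCII patterns
    print(ord("é"))                                # 233: outside ASCII, covered by Unicode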
Representing Numeric Values • Binary notation: Uses bits to represent a number in base two • Limitations of computer representations of numeric values • Overflow – occurs when a value is too big to fit in the available bits • Truncation – occurs when part of a value is lost because it cannot be represented exactly in the available bits
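A minimal sketch of both limitations (Python; the 8-bit cell size is an assumption made for the example):

    # Overflow: an 8-bit unsigned cell can only hold values 0..255
    a, b = 200, 100
    print((a + b) % 256)        # the true sum 300 does not fit; only 44 remains

    # Truncation: 0.1 and 0.2 have no exact finite binary representation
    print(0.1 + 0.2)            # 0.30000000000000004, not exactly 0.3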
The Binary System The traditional decimal system is based on powers of ten. The binary system is based on powers of two.
Figure 1.17 An algorithm for finding the binary representation of a positive integer
Figure 1.18 Applying the algorithm in Figure 1.17 to obtain the binary representation of thirteen
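A sketch of the repeated-division algorithm the figure describes, written in Python for illustration (the function name to_binary is not from the text):

    def to_binary(n):
        """Binary representation of a positive integer, built by repeatedly
        dividing by two and recording the remainders from right to left."""
        bits = ""
        while n > 0:
            bits = str(n % 2) + bits    # the remainder becomes the next bit
            n //= 2                     # the quotient is divided again
        return bits

    print(to_binary(13))                # -> '1101'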
Boolean Operations • Boolean Operation: An operation that manipulates one or more true/false values • Specific operations • AND • OR • NOT
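A quick demonstration of the three operations (Python’s and, or, and not, used only to mirror the Boolean definitions):

    p, q = True, False
    print(p and q)              # AND: true only when both inputs are true  -> False
    print(p or q)               # OR: true when at least one input is true  -> True
    print(not p)                # NOT: reverses its single input            -> False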
Stored Program Concept A program can be encoded as bit patterns and stored in main memory. From there, the CPU can extract the instructions and execute them. As a result, the program to be executed can be changed easily by loading different bit patterns into memory.
Terminology • Algorithm: A set of steps that defines how a task is performed • Program: A representation of an algorithm • Programming: The process of developing a program
Low-Level Languages (First-Generation Languages) • Machine instruction: An instruction (or command) encoded as a bit pattern recognizable by the CPU • Machine language: The set of all instructions recognized by a machine
Machine Instruction Types • Data Transfer: copy data from one location to another • Arithmetic/Logic: use existing bit patterns to compute a new bit pattern • Control: direct the execution of the program
Program Execution • Controlled by two special-purpose registers • Program counter: address of next instruction • Instruction register: current instruction • Machine Cycle • Fetch • Decode • Execute
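A toy sketch of the machine cycle (Python; the opcodes LOAD, ADD, and HALT and the single accumulator are invented for this example, not a real instruction set):

    # The program sits in main memory as bit patterns (stored program concept)
    memory = [("LOAD", 5), ("ADD", 3), ("HALT", 0)]
    accumulator = 0
    program_counter = 0                                   # address of the next instruction

    while True:
        instruction_register = memory[program_counter]    # fetch
        program_counter += 1
        opcode, operand = instruction_register            # decode
        if opcode == "LOAD":                              # execute
            accumulator = operand
        elif opcode == "ADD":
            accumulator += operand
        else:                                             # HALT
            break

    print(accumulator)                                    # -> 8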
Ethical Theories • Consequence based: What leads to the greatest benefit? • Duty based: What are my intrinsic obligations? • Contract based: What contracts must I honor? • Character based: Who do I want to be?