
Advanced Computer Architecture: Fundamental Concepts

Discover key concepts in computing models, roles of a computer architect, abstraction layers, and more. Dive into designing systems with processing-in-memory technologies and explore the evolving landscape of computer architecture.


Presentation Transcript


  1. ADVANCED COMPUTER ARCHITECTURE Fundamental Concepts: Computing Models Samira Khan University of Virginia Jan 16, 2019 The content and concept of this course are adapted from CMU ECE 740

  2. AGENDA • Review from last lecture • Why study computer architecture? • Fundamental concepts • Computing models

  3. LAST LECTURE RECAP • What it means/takes to be a good (computer) architect • Roles of a computer architect (look everywhere!) • Levels of transformation • Abstraction layers, their benefits, and the benefits of comfortably crossing them • An example problem and solution ideas • Designing a system with processing-in-memory technologies • Course Logistics • Assignments: HW (today), Review Set 1 (Next Wednesday)

  4. REVIEW: KEY TAKEAWAY • Breaking the abstraction layers (between components and transformation hierarchy levels) and knowing what is underneath enables you to solve problems and design better future systems • Cooperation between multiple components and layers can enable more effective solutions and systems

  5. HOW TO DO THE PAPER REVIEWS • 1: Brief summary • What is the problem the paper is trying to solve? • What are the key ideas of the paper? Key insights? • What is the key contribution to literature at the time it was written? • What are the most important things you take out from it? • 2: Strengths (most important ones) • Does the paper solve the problem well? • 3: Weaknesses (most important ones) • This is where you should think critically. Every paper/idea has a weakness. This does not mean the paper is necessarily bad. It means there is room for improvement and future research can accomplish this. • 4: Can you do (much) better? Present your thoughts/ideas. • 5: What have you learned/enjoyed/disliked in the paper? Why? • Review should be short and concise (~half a page to a page)

  6. AGENDA • Review from last lecture • Why study computer architecture? • Fundamental concepts • Computing models

  7. AN ENABLER: MOORE’S LAW Moore, “Cramming more components onto integrated circuits,” Electronics Magazine, 1965. Component counts double every other year Image source: Intel

  8. Number of transistors on an integrated circuit doubles ~ every two years Image source: Wikipedia

  9. RECOMMENDED READING • Moore, “Cramming more components onto integrated circuits,” Electronics Magazine, 1965. • Only 3 pages • A quote: “With unit cost falling as the number of components per circuit rises, by 1975 economics may dictate squeezing as many as 65 000 components on a single silicon chip.” • Another quote: “Will it be possible to remove the heat generated by tens of thousands of components in a single silicon chip?”
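
To make the doubling rate above concrete, here is a minimal Python sketch (not part of the slides) that projects transistor counts under an assumed two-year doubling period; the 1971 starting point of roughly 2,300 transistors (about the Intel 4004) is chosen purely for illustration.

```python
# Exponential growth of transistor counts, assuming the ~2-year doubling
# period quoted on the slide. Starting count and years are illustrative.

def transistor_count(start_count, start_year, year, doubling_period=2.0):
    """Project transistor count forward, doubling every `doubling_period` years."""
    return start_count * 2 ** ((year - start_year) / doubling_period)

# Example: ~2,300 transistors in 1971 (roughly the Intel 4004),
# projected to 2019, the year of this lecture.
projected = transistor_count(2_300, 1971, 2019)
print(f"{projected:,.0f}")  # on the order of 10^10, i.e. tens of billions
```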

  10. WHAT DO WE USE THESE TRANSISTORS FOR?

  11. WHY STUDY COMPUTER ARCHITECTURE? • Enable better systems: make computers faster, cheaper, smaller, more reliable, … • By exploiting advances and changes in underlying technology/circuits • Enable new applications • Life-like 3D visualization 20 years ago? • Virtual reality? • Personalized genomics? Personalized medicine? • Enable better solutions to problems • Software innovation is built into trends and changes in computer architecture • > 50% performance improvement per year has enabled this innovation • Understand why computers work the way they do

  12. COMPUTER ARCHITECTURE TODAY (I) • Today is a very exciting time to study computer architecture • Industry is in a large paradigm shift (to multi-core and beyond: accelerators, FPGAs, processing-in-memory) – many different potential system designs possible • Many difficult problems motivating and caused by the shift • Power/energy constraints → multi-core? • Complexity of design → multi-core? • Difficulties in technology scaling → new technologies? • Memory wall/gap • Reliability wall/issues • Programmability wall/problem • Huge hunger for data and new data-intensive applications • No clear, definitive answers to these problems

  13. COMPUTER ARCHITECTURE TODAY (II) • These problems affect all parts of the computing stack – if we do not change the way we design systems • No clear, definitive answers to these problems • The levels-of-transformation stack (Problem, Algorithm, Program/Language, Runtime System (VM, OS, MM), ISA, Microarchitecture, Logic, Circuits, Electrons) is squeezed from both ends: many new demands and fast-changing demands and personalities of users from the top (Look Up), and many new issues at the bottom (Look Down)

  14. COMPUTER ARCHITECTURE TODAY (III) • Computing landscape is very different from 10-20 years ago • Both UP (software and humanity trends) and DOWN (technologies and their issues), FORWARD and BACKWARD, and the resulting requirements and constraints • Examples pictured: hybrid main memory, persistent memory/storage, Microsoft Catapult (FPGA), heterogeneous processors, general-purpose GPUs • Every component and its interfaces, as well as entire system designs, are being re-examined

  15. COMPUTER ARCHITECTURE TODAY (IV) • You can revolutionize the way computers are built, if you understand both the hardware and the software (and change each accordingly) • You can invent new paradigms for computation, communication, and storage • Recommended book: Thomas Kuhn, “The Structure of Scientific Revolutions” (1962) • Pre-paradigm science: no clear consensus in the field • Normal science: dominant theory used to explain/improve things (business as usual); exceptions considered anomalies • Revolutionary science: underlying assumptions re-examined

  16. Thomas S. Kuhn • PhD in Physics from Harvard in 1949 • During his PhD, switched from physics to the History and Philosophy of Science • Joined the University of California, Berkeley as a professor of the History of Science in 1961 • Wrote the book “The Structure of Scientific Revolutions” in 1962

  17. COMPUTER ARCHITECTURE TODAY (IV) • You can revolutionize the way computers are built, if you understand both the hardware and the software (and change each accordingly) • You can invent new paradigms for computation, communication, and storage • Recommended book: Thomas Kuhn, “The Structure of Scientific Revolutions” (1962) • Pre-paradigm science: no clear consensus in the field • Normal science: dominant theory used to explain/improve things (business as usual); exceptions considered anomalies • Revolutionary science: underlying assumptions re-examined

  18. So What is the Structure of Scientific Revolutions? • The history of science cycles through: Step 1: Pre-paradigm → Step 2: Normal Science → Step 3: Anomaly → Step 4: Crisis and Emergence of Scientific Theory → Step 5: Scientific Revolution

  19. COMPUTER ARCHITECTURE TODAY (IV) • Thomas Kuhn, “The Structure of Scientific Revolutions” (1962)

  20. … BUT, FIRST … • Let’s understand the fundamentals… • You can change the world only if you understand it well enough… • Especially the past and present dominant paradigms • And, their advantages and shortcomings – tradeoffs • And, what remains fundamental across generations • And, what techniques you can use and develop to solve problems

  21. AGENDA • Review from last lecture • Why study computer architecture? • Fundamental concepts • Computing models

  22. WHAT IS A COMPUTER? • Three key components • Computation • Communication • Storage (memory)

  23. WHAT IS A COMPUTER? • Block diagram: Processing (control/sequencing and datapath), Memory (program and data), and I/O

  24. THE VON NEUMANN MODEL/ARCHITECTURE • Also called stored program computer (instructions in memory). Two key properties: • Stored program • Instructions stored in a linear memory array • Memory is unified between instructions and data • The interpretation of a stored value depends on the control signals • Sequential instruction processing • One instruction processed (fetched, executed, and completed) at a time • Program counter (instruction pointer) identifies the current instr. • Program counter is advanced sequentially except for control transfer instructions When is a value interpreted as an instruction?

  25. THE VON NEUMANN MODEL/ARCHITECTURE • Recommended reading • Burks, Goldstine, von Neumann, “Preliminary discussion of the logical design of an electronic computing instrument,” 1946. • Stored program • Sequential instruction processing

  26. THE VON NEUMANN MODEL (OF A COMPUTER) • Block diagram: MEMORY (Memory Address Register, Memory Data Register), PROCESSING UNIT (ALU, TEMP registers), CONTROL UNIT (IP, Instruction Register), INPUT, OUTPUT
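
As a companion to the block diagram above, here is a minimal Python sketch (not from the slides) of the sequential fetch-execute cycle over a unified memory; the toy LOAD/ADD/STORE/JUMP/HALT instruction set and the single accumulator register are assumptions made purely for illustration.

```python
# A minimal sketch of the von Neumann cycle: one memory holds both
# instructions and data, and an instruction pointer (IP) advances
# sequentially except on control-transfer instructions.

memory = [
    ("LOAD", 8),     # 0: ACC <- memory[8]
    ("ADD", 9),      # 1: ACC <- ACC + memory[9]
    ("STORE", 10),   # 2: memory[10] <- ACC
    ("JUMP", 5),     # 3: control transfer: IP <- 5
    ("HALT", None),  # 4: (skipped by the jump)
    ("HALT", None),  # 5: stop
    None, None,      # 6-7: unused
    40, 2, 0,        # 8-10: data lives in the same memory as instructions
]

ip, acc = 0, 0                    # instruction pointer and accumulator
while True:
    opcode, operand = memory[ip]  # fetch: the stored value is interpreted as
    ip += 1                       # an instruction because IP points at it
    if opcode == "LOAD":
        acc = memory[operand]
    elif opcode == "ADD":
        acc += memory[operand]
    elif opcode == "STORE":
        memory[operand] = acc
    elif opcode == "JUMP":
        ip = operand              # only control transfers break sequential flow
    elif opcode == "HALT":
        break

print(memory[10])  # 42
```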

  27. THE VON NEUMANN MODEL (OF A COMPUTER) • Q: Is this the only way that a computer can operate? • A: No. • Qualified Answer: But, it has been the dominant way • i.e., the dominant paradigm for computing • for N decades

  28. THE DATA FLOW MODEL (OF A COMPUTER) • Von Neumann model: An instruction is fetched and executed in control flow order • As specified by the instruction pointer • Sequential unless explicit control flow instruction • Dataflow model: An instruction is fetched and executed in data flow order • i.e., when its operands are ready • i.e., there is no instruction pointer • Instruction ordering specified by data flow dependence • Each instruction specifies “who” should receive the result • An instruction can “fire” whenever all operands are received • Potentially many instructions can execute at the same time • Inherently more parallel

  29. VON NEUMANN VS DATAFLOW • Consider a Von Neumann program • What is the significance of the program order? • What is the significance of the storage locations? • Which model is more natural to you as a programmer? • Sequential program: v <= a + b; w <= b * 2; x <= v - w; y <= v + w; z <= x * y • Dataflow graph: inputs a and b feed +, *2, -, + and * nodes, producing z

  30. MORE ON DATA FLOW • In a data flow machine, a program consists of data flow nodes • A data flow node fires (fetched and executed) when all its inputs are ready • i.e., when all inputs have tokens • Data flow node and its ISA representation
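
To illustrate the firing rule with the program from slide 29, here is a minimal Python sketch (not from the slides); the token dictionary, the example input values, and the sequential Python loop are all simplifications, since a real dataflow machine would fire ready nodes concurrently rather than scanning them in order.

```python
# Dataflow-order execution of the slide-29 program: a node fires as soon as
# all of its input operands ("tokens") have arrived; there is no instruction
# pointer and no program order beyond the data dependences.
import operator

# node name -> (operation, names of its input operands)
nodes = {
    "v": (operator.add, ("a", "b")),    # v <= a + b
    "w": (operator.mul, ("b", "two")),  # w <= b * 2
    "x": (operator.sub, ("v", "w")),    # x <= v - w
    "y": (operator.add, ("v", "w")),    # y <= v + w
    "z": (operator.mul, ("x", "y")),    # z <= x * y
}

tokens = {"a": 1, "b": 2, "two": 2}  # initial input tokens (example values)

# Repeatedly fire any node whose operands have all arrived. Note that v and w
# (and later x and y) become ready together and could fire in parallel.
while any(name not in tokens for name in nodes):
    for name, (op, inputs) in nodes.items():
        if name not in tokens and all(i in tokens for i in inputs):
            tokens[name] = op(*(tokens[i] for i in inputs))
            print(f"fired {name} -> {tokens[name]}")

print("z =", tokens["z"])
```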

  31. ADVANCED COMPUTER ARCHITECTURE Fundamental Concepts: Computing Models Samira Khan University of Virginia Jan 16, 2019 The content and concept of this course are adapted from CMU ECE 740
