Connecting the Physical World with Pervasive Networks ECET 581 Reading Report Due: 9/12/06 By Douglas A. Schultz
Summary • This article discusses instrumenting the physical world with pervasive networks of sensor-rich, embedded computation. • Authors: • Deborah Estrin • University of California, Los Angeles • David Culler and Kris Pister • University of California, Berkeley • Gaurav Sukhatme • University of Southern California
Mark Weiser • (July 23, 1952 - April 27, 1999) was a chief scientist at Xerox PARC and is widely considered the father of ubiquitous computing (also known as ubicomp) and calm technology. • Weiser's vision: a world in which computing is so pervasive that everyday devices can sense their relationship to us and to each other. • Technology has improved to the point where Weiser's vision can become reality, in both computing and social applications.
Pervasive Networks • Systems should fulfill two of Weiser's visions: • Ubiquity, by injecting computation into the physical world with high spatial density. • Invisibility, by having the nodes, individually and collectively, operate autonomously. • Systems must provide reusable building blocks for sensing, computing, and manipulating the physical world. • Systems must not rely on specialized, one-off instrumentation.
Opportunity Ahead • We need the ability to easily deploy sensors, computation, and actuation into our world. • The devices must be general purpose, able to adapt and organize themselves to support different applications and environments. • The article offers a taxonomy of emerging system types for the next decade.
Challenges • The most serious are system challenges. • The many distributed system elements, limited access to those elements, and extreme environmental dynamics combine to force a rethinking of: • Layers of abstraction • Kinds of hardware acceleration used • Algorithmic techniques
Immense Scale • These new systems will contain very many small devices. • In 5 to 10 years, device size could shrink to a cubic millimeter. • Measurement fidelity and availability will come from the quantity of redundant measurements and their correlation, as sketched below.
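A quick illustration of that last point (not from the article; the readings and function names are invented): if each of n co-located nodes reports the same stimulus with independent, zero-mean noise, averaging the redundant readings cuts the noise standard deviation roughly by a factor of sqrt(n).

```c
/* Illustrative only: averaging redundant readings of the same stimulus.
 * If each reading carries independent zero-mean noise of std dev s,
 * the mean of n readings has noise std dev about s / sqrt(n). */
#include <stdio.h>

static double fused_estimate(const double *readings, int n)
{
    double sum = 0.0;
    for (int i = 0; i < n; i++)
        sum += readings[i];
    return sum / n;
}

int main(void)
{
    /* made-up readings from four nearby nodes observing one event */
    double readings[] = { 21.3, 20.8, 21.1, 20.9 };
    printf("fused estimate: %.2f\n", fused_estimate(readings, 4));
    return 0;
}
```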
Limited Access • Devices will be embedded where a wired connection is impossible or too expensive. • Communications will have to be wireless, and nodes will have to rely on renewable or harvested energy. • The scale could be so large that no one person could ever touch all the devices. • Energy sources will limit the amount of activity, such as sensor measurements, per unit time.
Extreme Dynamics • Because the system is built from many nodes and tied to a constantly changing physical world, it exhibits extreme dynamics. • Reaction to environmental changes directly affects each device's performance. • Most of the time a device senses no change and uses little power. • When an event occurs, high- and low-level data flows from sensors and actuators must be managed effectively (a duty-cycling sketch follows below). • These systems must continuously adapt to resource and event activity.
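As a hedged sketch of the low-power, event-driven behavior described above (the hardware calls here are desktop stand-ins, not any real platform's API), a node might adapt its sampling duty cycle to activity like this:

```c
/* Hypothetical sketch of event-triggered duty cycling on a sensor node.
 * The "hardware" functions below are stand-ins so the sketch compiles;
 * a real node would use its platform's ADC, sleep, and radio calls. */
#include <stdio.h>
#include <stdint.h>
#include <stdbool.h>

#define THRESHOLD   50u     /* counts above baseline that count as an event  */
#define IDLE_PERIOD 1000u   /* ms between samples while nothing is happening */
#define BUSY_PERIOD 10u     /* ms between samples while tracking an event    */

static uint16_t read_sensor(void)        { return 100; }   /* fake ADC   */
static void     sleep_ms(uint32_t ms)    { (void)ms; }     /* fake sleep */
static void     report_event(uint16_t s) { printf("event %u\n", (unsigned)s); }

static void sampling_step(uint16_t baseline)
{
    uint16_t sample = read_sensor();
    bool event = (sample > baseline + THRESHOLD);

    if (event)
        report_event(sample);   /* spend radio energy only on events */

    /* Adapt the duty cycle to activity: sample fast during an event,
     * slowly (and cheaply) otherwise. */
    sleep_ms(event ? BUSY_PERIOD : IDLE_PERIOD);
}

int main(void)
{
    for (int i = 0; i < 3; i++)
        sampling_step(40);      /* baseline is a made-up calibration value */
    return 0;
}
```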
A taxonomy of systems • New system designs must come from hands-on experience with the new technology. • Many small, low-power, connected devices will be applied to problems where individual devices are impractical. • System reuse and evolution are key to pervasiveness.
Physically embedded systems • Scale, variability, and autonomy relate to the system elements and data being captured.
Scale • The 1st characteristic of system design. • Space and time factors affect the sampling interval, overall system coverage, and the total number of sensor nodes. • Sampling: what you are trying to measure determines the sampling scale, as does the application: event detection needs lower resolution than event or signal reconstruction. • Extent: also affects the scale; an environmental system could span 10 kilometers versus a single building or room. • Density: the number of sensor nodes per footprint of the input stimuli (see the sketch below). High-density systems can extend the life of sensor nodes and reduce noise through redundant measurements.
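A minimal illustration of the density metric as the slide defines it, nodes per footprint of the input stimulus; the numbers here are made up.

```c
/* Illustrative only: density as sensor nodes per footprint of the
 * input stimulus.  All values are invented for the example. */
#include <stdio.h>

static double node_density(int nodes_covering_stimulus, double footprint_m2)
{
    return (double)nodes_covering_stimulus / footprint_m2;
}

int main(void)
{
    /* e.g., 12 nodes whose sensing range covers a 30 m^2 stimulus footprint */
    printf("density = %.2f nodes per m^2\n", node_density(12, 30.0));
    return 0;
}
```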
Variability • The 2nd characteristic of system design. • Static systems use design-time optimization; dynamic systems use run-time optimization. • Structure: ad hoc vs. engineered (e.g., structural monitoring vs. biocomplexity monitoring), and combinations of both. • Task: variability in task is how much the system can be tuned for single-mode operation. • Space: variability in space equals mobility, applying both to the nodes and to what you are trying to measure.
Autonomy • The 3rd aspect of system design. • Higher system autonomy means less human involvement, which requires more complex internal processing. • Modalities: autonomous systems depend on multiple sensory modalities; this lowers system noise and helps identify measurement anomalies. • Complexity: autonomy makes the system model more complex. A system that just delivers data for a human to process is less complex; a system that acts based on its state and on inputs over time, executing a program, is much more complex.
Where are we now? (1) • Weiser suggested the need for devices of different sizes, from the size of a pin to a whole building. • Small packages in the physical world. • PDAs have had wireless LAN capability, but it requires a large battery pack; Bluetooth (short-range wireless networking) has more recently been added to PDAs. • PDAs now have cell-phone capabilities, and both support GPS.
Where are we now? (2) • Sensors have been further reduced in size thanks to advances in MEMS (microelectromechanical systems). • Small, low-power CMOS radios are also being developed; for example, the mote developed at UC Berkeley with DARPA support and commercialized by Crossbow. (See picture of the UC Berkeley mote.)
Where are we now? (3) • As device size decreases and complexity increases, several new operating systems have been developed: • VxWorks (www.windriver.com) • Geoworks (www.geoworks.com) • Chorus (www.sun.com/chorusos) • These offer small footprints, and TCP/IP capabilities have been added. • Windows CE brings a subset of Windows to PDAs. • Unix variants (e.g., Linux) provide real-time multitasking support.
Where are we now? (4) • An effective networked node must have a runtime environment that supports: • Scheduling • Device interfaces • Networking • Resource management • Concurrent data flows from sensors to the network to controllers.
TinyOS • TinyOS (the tiny operating system environment) is component based. • Traditional scheduling loops are replaced by fine-grained multi-threading. • TinyOS provides fine-grained power management and extensive concurrency with limited processing resources. • It is an open-source OS (a scheduling sketch follows below).
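TinyOS itself is written in nesC, and its real scheduler and API are not reproduced in the article; the plain-C sketch below only illustrates the general idea of posting short run-to-completion tasks instead of running one blocking scheduling loop. All names here are hypothetical.

```c
/* Hypothetical sketch of a run-to-completion task queue, in the spirit of
 * TinyOS-style scheduling; this is NOT TinyOS code or its API. */
#include <stdio.h>
#include <stddef.h>

#define MAX_TASKS 8

typedef void (*task_fn)(void);

static task_fn queue[MAX_TASKS];
static size_t  head, count;

/* post a task; returns 0 if the queue is full */
static int post_task(task_fn t)
{
    if (count == MAX_TASKS) return 0;
    queue[(head + count) % MAX_TASKS] = t;
    count++;
    return 1;
}

/* run every posted task to completion, then (on real hardware) sleep */
static void run_tasks(void)
{
    while (count > 0) {
        task_fn t = queue[head];
        head = (head + 1) % MAX_TASKS;
        count--;
        t();
    }
}

static void sample_done(void) { puts("process sample"); }
static void send_packet(void) { puts("send packet"); }

int main(void)
{
    post_task(sample_done);   /* e.g., posted from an ADC interrupt handler */
    post_task(send_packet);
    run_tasks();
    return 0;
}
```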
Sensing and actuation • Interacting with the real world involves energy exchange in two forms: sensing and actuation. • Sensing: a sensor converts a physical quantity (temperature, light intensity, etc.) into information. • Actuation lets a node convert information into action by moving part of itself, relocating itself, or moving other items in the environment; its main role is to improve sensing. • To deal with uncertainty in sensing and actuation, filtering is applied at each node and measurement areas are overlapped (see the filter sketch below). • A newer approach is hybrid control: a planner and a reactive system communicate via a third software layer.
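The article does not name a particular per-node filter; as one hedged example, a node could smooth each reading with an exponential moving average before deciding whether to act. The names and the alpha value below are invented.

```c
/* Illustrative only: exponential moving average as a cheap per-node filter
 * to damp sensing noise before the node decides whether to actuate. */
#include <stdio.h>

typedef struct {
    double alpha;     /* weight of the newest reading, 0 < alpha <= 1 */
    double estimate;  /* current filtered value */
    int    primed;    /* 0 until the first reading arrives */
} ewma_t;

static double ewma_update(ewma_t *f, double reading)
{
    if (!f->primed) {
        f->estimate = reading;
        f->primed = 1;
    } else {
        f->estimate = f->alpha * reading + (1.0 - f->alpha) * f->estimate;
    }
    return f->estimate;
}

int main(void)
{
    ewma_t f = { 0.2, 0.0, 0 };
    double noisy[] = { 20.0, 25.0, 19.0, 21.0, 40.0 };  /* last value is a spike */
    for (int i = 0; i < 5; i++)
        printf("raw %.1f -> filtered %.2f\n", noisy[i], ewma_update(&f, noisy[i]));
    return 0;
}
```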
Localization • Nodes must know their location: Where am I? • Relative to a map, to other nodes, or to a global coordinate system. • Scale and autonomy play a large role in location computation; recent trends use algorithms that localize large networks autonomously. • Localization can be seen as a sensor-fusion problem. • One recent example of coarse localization lets the nodes build a map of their environment. • Another algorithm builds a mesh in which nodes are point masses and springs represent the measured relationships between nodes (a relaxation sketch follows below).
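The mass-and-spring algorithm itself is not given in the report; the sketch below only shows the general relaxation idea in 2-D, treating each measured inter-node range as a spring's rest length and nudging position estimates until estimated distances match the measurements. All values are made up.

```c
/* Hypothetical 2-D sketch of the mass-and-spring idea: each measured
 * inter-node range acts like a spring's rest length, and node position
 * estimates are nudged until estimated distances match the measurements.
 * Not the algorithm cited in the article, just the general idea. */
#include <stdio.h>
#include <math.h>

#define N_NODES 3

typedef struct { double x, y; } pos_t;

/* measured ranges between node pairs (spring rest lengths), invented */
static const double range[N_NODES][N_NODES] = {
    { 0.0, 1.0, 1.0 },
    { 1.0, 0.0, 1.0 },
    { 1.0, 1.0, 0.0 },
};

static void relax(pos_t p[N_NODES], int iterations, double step)
{
    for (int it = 0; it < iterations; it++) {
        for (int i = 0; i < N_NODES; i++) {
            for (int j = 0; j < N_NODES; j++) {
                if (i == j) continue;
                double dx = p[j].x - p[i].x, dy = p[j].y - p[i].y;
                double d  = sqrt(dx * dx + dy * dy);
                if (d < 1e-9) continue;
                /* spring force proportional to (current - rest) length */
                double err = d - range[i][j];
                p[i].x += step * err * dx / d;
                p[i].y += step * err * dy / d;
            }
        }
    }
}

int main(void)
{
    pos_t p[N_NODES] = { {0.0, 0.0}, {2.0, 0.0}, {0.5, 2.0} };  /* rough guesses */
    relax(p, 200, 0.1);
    for (int i = 0; i < N_NODES; i++)
        printf("node %d: (%.2f, %.2f)\n", i, p[i].x, p[i].y);
    return 0;
}
```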
Distributed system architecture • Constraints imposed by battery power will make or break these systems. • Systems will not be able to constantly stream data out to a central computer for analysis. • Computation must sit alongside the sensors so that data is processed locally. • Self-configuring systems will turn off redundant nodes to conserve energy. • Data-centric rather than address-centric network architecture, using directed diffusion.
Distributed system architecture (2) • Directed diffusion separates data from node identity. It runs on Linux systems, and a subset runs on TinyOS (a data-structure sketch follows below). • Higher-tier resources will be necessary to complement the lower-level data nodes; an example is a roving robot that replaces or recharges the nodes' batteries.
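Directed diffusion's actual Linux and TinyOS implementations are not shown in this report; the struct below is only an invented sketch of the data-centric idea, naming data by attributes ("interests") and keeping per-neighbor gradients rather than destination addresses.

```c
/* Illustrative only: a minimal, invented representation of directed
 * diffusion's data-centric state; real implementations differ. */
#include <stdio.h>

#define MAX_GRADIENTS 4

/* a gradient remembers which neighbor asked and how often to report */
typedef struct {
    int      neighbor_id;     /* link-level neighbor, not a global address */
    unsigned interval_ms;     /* requested reporting rate */
} gradient_t;

/* an interest names the data ("what"), not a destination node ("who") */
typedef struct {
    char       attribute[16]; /* e.g. "temperature" */
    double     min_value;     /* report only readings above this */
    gradient_t gradients[MAX_GRADIENTS];
    int        n_gradients;
} interest_t;

static void add_gradient(interest_t *in, int neighbor_id, unsigned interval_ms)
{
    if (in->n_gradients < MAX_GRADIENTS) {
        in->gradients[in->n_gradients].neighbor_id = neighbor_id;
        in->gradients[in->n_gradients].interval_ms = interval_ms;
        in->n_gradients++;
    }
}

int main(void)
{
    interest_t in = { "temperature", 30.0, {{0}}, 0 };
    add_gradient(&in, 7, 500);   /* neighbor 7 wants matching data every 500 ms */
    printf("interest '%s' > %.1f, %d gradient(s)\n",
           in.attribute, in.min_value, in.n_gradients);
    return 0;
}
```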
Where are we headed? • Thousands of devices embedded in buildings, bridges, waterways, highways, and protected areas to monitor health and detect critical events. • Advances in miniaturization mean we can now put the instruments in the experiment, instead of conducting the experiment inside an instrument. • System architecture will have to support interrogating, programming, and manipulating the real world. • Embedded systems will need to self-organize; spatial reconfiguration will be needed.
Conclusion • Interdisciplinary approaches will be needed to solve system problems spanning: • Networking • Operating systems • Databases • Artificial intelligence • Virtual reality • Signal processing • Training will have to be provided at both the graduate and undergraduate levels.
About the authors (1) Deborah Estrin is a professor of computer science at the University of California, Los Angeles. Her research interests include design of network and routing protocols for large global networks, sensor networks, and applications for environmental monitoring. She has a PhD in computer science from the Massachusetts Institute of Technology. She has served on several program committees and editorial boards. Contact her at the Computer Science Dept., UCLA, 3713 Boelter Hall, 420 Westwood Plaza, Los Angeles, CA 90095; destrin@cs.ucla.edu.
About the authors (2) David Culler is a professor of computer science at the University of California, Berkeley, and director of Intel Research at UCB. His research interests include embedded networks of small wireless devices, parallel computer architecture, parallel programming languages, and high-performance communication. He has a PhD from MIT. Contact him at Computer Science Division #1776, 627 Soda Hall, UCB, Berkeley, CA 94720-1776; culler@cs.berkeley.edu.
About the authors (3) Kris Pister is an associate professor in the Electrical Engineering and Computer Sciences Department at UCB. His research interests include the development and use of standard MEMS fabrication technologies, micro robotics, and CAD for MEMS. He has a BA in applied physics from the University of California, San Diego, and an MS and PhD in electrical engineering from UCB. Contact him at the Electrical Eng. and Computer Science Dept., UCB, 497 Cory Hall, Berkeley, CA 94720-1770; pister@eecs.berkeley.edu.
About the authors (4) Gaurav Sukhatme is an assistant professor in the Computer Science Department at the University of Southern California and the associate director of its Robotics Research Laboratory. His research interests include embedded systems, mobile robot coordination, sensor fusion for robot fault tolerance, and human–robot interfaces. He has an MS and PhD in computer science from USC. He is a member of the IEEE, AAAI, and ACM and has served on several conference program committees. Contact him at the Computer Science Dept., MC0781, USC, 941 West 37th Place, Los Angeles, CA 90089-0781; gaurav@usc.edu.