Multi-Paradigm Evaluation of Embedded Networked Wireless Systems. Rajive L. Bagrodia, Professor, Computer Science Dept, UCLA, rajive@cs.ucla.edu. DAWN PI Meeting, October 14, 2008. Joint work with Yi-Tao Wang and M. Varshney.
Next Generation DOD Networks Network Characteristics: Heterogeneous, Scalable, Mobile
The Multi-Paradigm Evaluation Framework • Multi-paradigm testbed for heterogeneous wireless networks • In-situ evaluation of applications, protocols, or sub-networks ... in a high-fidelity simulation environment • Simulated components: radio, channel, or complete sub-networks • Combines physical emulation and simulation for application-centric evaluation of heterogeneous networks
Embedded Networked Systems • A large class of embedded systems are resource-constrained devices (e.g., sensors) • Important to capture the interactions of applications and protocols with • Operating systems • Hardware • Other resources such as memory, CPU, clock drift, etc. • Typically part of heterogeneous networks • Emulation of embedded systems: • Should capture the execution environment (OS & H/W) • Model environmental resources with high fidelity • Support heterogeneous networks (e.g., UAVs with UGS)
Current Tools • Simulators (TOSSIM, EmTOS, etc.) • Validate basic application behavior • Lack detailed simulation models • Restrict the accuracy and expressiveness of their simulations • Cannot evaluate applications under deployment conditions • Physical testbeds • Accurate • Lack spatial and temporal scalability • Difficult to run experiments under complex conditions • Experiments cannot be repeated
Our Target Heterogeneous Network • Deploy TinyOS motes to replace aging SOS motes • Must ensure: • TinyOS motes can co-exist with SOS motes • Maintain the network as SOS motes die off • Impossible to model using existing tools • Too complex for current simulators • Too large for physical testbeds
Outline • Motivation • Related approaches • OS emulation • SOS emulation via SenQ • TinyOS emulation via TinyQ • Heterogeneous network evaluation
Need for Operating System Emulation • Case study: S-MAC MAC protocol • [Timing diagram: sender and destination share a schedule with a 1024 ms sleep duration and a 300 ms wakeup duration; the sender sets Timer 1 = backoff = rand(0, 63 ms) and Timer 2 = (300 - backoff) ms] • Protocol implemented as: a) pure simulation model, b) emulation with the SOS operating system
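As a concrete illustration of the timing structure above, here is a minimal, self-contained C sketch that prints the intended schedule for one sender duty cycle. The constants follow the slide (300 ms wakeup, 1024 ms sleep, rand(0, 63) ms backoff), but the code is only an illustration, not the S-MAC or SOS implementation.

```c
/* Illustrative schedule for one S-MAC sender duty cycle as described above:
 * a random backoff inside the 300 ms wakeup window (timer 1), a transmit,
 * the remainder of the window (timer 2 = 300 - backoff), then 1024 ms of
 * sleep before the next cycle. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define WAKEUP_MS 300
#define SLEEP_MS  1024

int main(void)
{
    srand((unsigned)time(NULL));
    int backoff = rand() % 64;           /* timer 1: rand(0, 63) ms   */
    int rest    = WAKEUP_MS - backoff;   /* timer 2: 300 - backoff ms */

    printf("t + %4d ms : backoff expires, transmit packet\n", backoff);
    printf("t + %4d ms : wakeup window ends, radio sleeps\n", backoff + rest);
    printf("t + %4d ms : sleep timer expires, next cycle starts\n",
           backoff + rest + SLEEP_MS);
    return 0;
}
```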
Emulation vs. Simulation • Diagnosis: the sleep schedules fall out of synchronization • Backoff timers < 10 ms expire at 10 ms • Timeouts > 250 ms are broken into chunks of 250 ms
Culprit: OS Timer Management • What should have happened: a timer set for 7 ms at time t expires at t+7; the packet is transmitted and a 293 ms timer is set, expiring at t+300; a 1024 ms timer is set and the node sleeps • What actually happened: the SOS minimum timer latency is 10 ms, so the 7 ms backoff timer fires at t+10 and the window closes at t+303 (skew = 3 ms); the SOS maximum timer interval is 250 ms, so a 256 ms remainder (after a 44 ms backoff) is broken into a 250 ms timer plus a 6 ms timer, which is again rounded up to 10 ms, closing the window at t+304 instead of t+300 (skew = 4 ms)
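The skews in the timeline follow directly from the two constraints above; a small C sketch, under the simplifying assumption that every requested timeout is split into 250 ms chunks and each short chunk is rounded up to the 10 ms minimum, reproduces the 3 ms and 4 ms figures. This is an illustration of the diagnosis, not the actual SOS timer code.

```c
/* How the SOS timer constraints described above skew requested timeouts:
 * requests below the 10 ms minimum latency are rounded up, and requests
 * above the 250 ms maximum interval are split into 250 ms chunks, with the
 * final short chunk paying the rounding cost. */
#include <stdio.h>

#define MIN_LATENCY_MS  10
#define MAX_INTERVAL_MS 250

/* Time that actually elapses when an application asks for `requested` ms. */
static int actual_delay(int requested)
{
    int elapsed = 0;
    while (requested > MAX_INTERVAL_MS) {   /* split long timeouts */
        elapsed += MAX_INTERVAL_MS;
        requested -= MAX_INTERVAL_MS;
    }
    if (requested < MIN_LATENCY_MS)         /* enforce minimum latency */
        requested = MIN_LATENCY_MS;
    return elapsed + requested;
}

static void report(int requested)
{
    int actual = actual_delay(requested);
    printf("requested %3d ms -> fires after %3d ms (skew %d ms)\n",
           requested, actual, actual - requested);
}

int main(void)
{
    report(7);     /* the 7 ms backoff timer: fires at 10 ms, skew 3 ms   */
    report(256);   /* the 256 ms remainder: 250 + 6 -> 250 + 10, skew 4   */
    return 0;
}
```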
OS Emulation Approaches • Operating system emulation • Model the applications and protocol stack in the context of the real operating system • SenQ [msl.cs.ucla.edu/projects/senq] • For TinyOS and SOS • Built on the QualNet network simulator • Supports multi-tiered heterogeneous networks • High-fidelity models for sensing channels, clock drift, battery, power consumption, CPU power • TOSSIM [www.cs.berkeley.edu/~pal/research/tossim.html] • For TinyOS • Custom network simulator • Logical connectivity only; lacks multi-tiered networks and environmental models
Emulation Approaches (2) • Hardware emulation • Avrora [compilers.cs.ucla.edu/avrora] • Atemu [www.hynet.umd.edu/research/atemu] • Instruction-cycle-level emulation • Hardware resources modeled • (+) Highest fidelity for protocol emulation • (-) Slow execution (much slower than real time) • (-) Lacks scalability • (-) Lacks detailed models for channel and environment • Good for small-scale T&E (2-5 nodes), to be followed by OS-level emulation (10-10,000 nodes)
State-of-the-art • Network simulators with sensor models • SensorSim (2001), SWAN (2001) • (+) No need to migrate away from familiar platforms • (-) No emulation support • Emulators with networking support • TOSSIM (2003), EmStar (2004), EmTOS (2004) • (+) Easy development-debugging-deployment cycle • (-) Discrete event simulation engine and channel models are not accurate • (-) Specific to a given OS platform • (-) Do not support heterogeneous networks (IP, WiMAX, etc.) • Instruction-cycle emulation • Atemu (2004), Avrora (2005), Worldsens (2007) • (+) Greatest measure of software modeling fidelity • (-) Intractable for even moderately sized networks
SenQ Objectives • Ability to emulate sensor applications & OS using real implementations of TinyOS and SOS • Independent of the underlying sensor operating system • Integrate multiple sensor OSs in a single execution • Scalable to 10,000+ radios • Real-time or faster execution • Support heterogeneous networks • Flexible scenario configuration: clock drift, battery, power consumption, sensing channel
SenQ Approach • [Architecture diagram, Steps 1-5: each sensor node's applications, protocols, and OS run over virtual hardware drivers (sensing, mobility, clock, battery, processor) exposed by a Virtual Sensor Layer; the layer plugs into the QualNet network simulator, which provides the radio and channel models, IP and WiMAX stacks, process-oriented simulation, and parallel simulation]
SenQ Emulation: Key Features • Sensor nodes appear as a "layer" in the discrete event network simulator (QualNet) • The network simulator masquerades as the "hardware platform" to the sensor node • Architecture supports any OS, including multiple OSs in one run • Efficient implementation: ~10,000 nodes • Supports parallel emulation • Supports modeling heterogeneous networks
Clock Drift Models • Clock = oscillator + counter + zero-time • Error in the oscillator ⇒ clock drift (a different definition of a second) • f(t) = f_nom + Δf_0 + a·(t - t_0) + Δf_n(t) + Δf_e(t), where f_nom is the nominal frequency, Δf_0 the initial frequency offset, a the aging rate, Δf_n(t) short-term variations (noise), and Δf_e(t) long-term variations from environmental factors (temperature, etc.) [White paper, Symmetricom]
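A minimal sketch, assuming illustrative constants, of how such a frequency model might drive a per-node clock in a simulator: the fractional frequency error is integrated over simulation time to obtain the node's drifted local clock. The parameter values and the noise/environment terms below are placeholders, not SenQ's actual model or API.

```c
/* Illustrative clock-drift model following the frequency equation above:
 * f(t) = f_nom + df_0 + a*(t - t_0) + df_n(t) + df_e(t).
 * Integrating f/f_nom over true (simulation) time yields the local clock. */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

#define F_NOM  32768.0   /* nominal oscillator frequency (Hz), illustrative */
#define DF0    0.5       /* initial frequency offset (Hz), illustrative     */
#define AGING  1e-6      /* aging rate (Hz per second), illustrative        */

/* Placeholder short-term (noise) and long-term (environmental) variations. */
static double df_noise(void)   { return 0.05 * ((double)rand() / RAND_MAX - 0.5); }
static double df_env(double t) { return 0.2 * sin(2.0 * M_PI * t / 3600.0); }

/* Advance a node's local clock by dt seconds of true simulation time. */
static double local_clock_step(double local, double t, double dt)
{
    double f = F_NOM + DF0 + AGING * t + df_noise() + df_env(t);
    return local + dt * (f / F_NOM);   /* local seconds per true second */
}

int main(void)
{
    double t = 0.0, local = 0.0, dt = 1.0;
    for (int i = 0; i < 600; i++) {    /* simulate 10 minutes of true time */
        local = local_clock_step(local, t, dt);
        t += dt;
    }
    printf("true time %.0f s, local clock %.6f s, drift %.1f us\n",
           t, local, (local - t) * 1e6);
    return 0;
}
```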
Case Study: Acoustic Sensing • [Diagram: an acoustic event at time t reaches three sensors at times t+Δt1, t+Δt2, and t+Δt3; the arrival times are fed to an algorithm that estimates the angle of arrival ϕ]
SenQ Simulation Study • Benefits of SenQ: easy configuration, repeatable execution, ability to study tradeoffs • Setup: clock drift rand(0, 5) μs/sec; synchronization protocol: RATS; sync beacon period: 2 sec vs. 60 sec • Result: variance of 0.64° vs. 2.64°, respectively
Power Consumption Model • Inaccurate model of battery: a reservoir of charge (U mA-sec); subtract I (mA) × t (sec) from U after each event (Tx, Rx, etc.) • Accurate model of battery: captures non-linear discharge and recovery; model of (Rakhmatov, 2002) • [Plot: ideal vs. observed discharge behavior]
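For reference, the "reservoir" model criticized above amounts to a per-event subtraction; a minimal C sketch with illustrative (not measured) current and capacity values is shown below, mainly to make clear what the non-linear model has to improve on: it ignores both rate-dependent discharge and charge recovery during idle periods.

```c
/* Sketch of the simple linear battery model described above: the battery is
 * a reservoir of U mA-sec, and each event subtracts current * duration.
 * It captures neither non-linear discharge nor recovery. Values are
 * illustrative only. */
#include <stdio.h>

static double capacity_mAs = 2500.0 * 3.6;   /* e.g. a 2500 mAh battery */

/* Drain the reservoir for one event: I (mA) drawn for t (sec). */
static void drain(double current_mA, double duration_s)
{
    capacity_mAs -= current_mA * duration_s;
    if (capacity_mAs < 0.0)
        capacity_mAs = 0.0;
}

int main(void)
{
    drain(17.4, 0.002);   /* e.g. one packet transmission */
    drain(19.7, 0.010);   /* e.g. listening for 10 ms     */
    printf("remaining charge: %.1f mA-sec\n", capacity_mAs);
    return 0;
}
```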
Model Optimizations • Why the accurate model is impractical as-is: computationally expensive; large memory requirement • Optimizations: • For small load magnitudes and intervals, found invariants that simplify the model • Precompute the function in a lookup table (saves execution time) • Merge multiple entries into one with a correction term (saves space) • Loss in accuracy < 0.1%
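Of these optimizations, the lookup-table step is the most generic; the sketch below precomputes a stand-in function on a fixed grid and answers later queries by linear interpolation. The function, grid size, and range are placeholders rather than the actual Rakhmatov battery equations, and the entry-merging/correction-term step is not shown.

```c
/* Generic lookup-table optimization: precompute an expensive function once,
 * then answer queries by interpolating between neighbouring table entries. */
#include <stdio.h>
#include <math.h>

#define TABLE_SIZE 256
#define X_MAX      10.0

static double table[TABLE_SIZE];

/* Stand-in for the expensive model function being tabulated. */
static double expensive(double x) { return exp(-x) * (1.0 + 0.5 * x); }

static void precompute(void)
{
    for (int i = 0; i < TABLE_SIZE; i++)
        table[i] = expensive(i * X_MAX / (TABLE_SIZE - 1));
}

/* Cheap lookup with linear interpolation. */
static double lookup(double x)
{
    double pos = x / X_MAX * (TABLE_SIZE - 1);
    int i = (int)pos;
    if (i >= TABLE_SIZE - 1)
        return table[TABLE_SIZE - 1];
    double frac = pos - i;
    return (1.0 - frac) * table[i] + frac * table[i + 1];
}

int main(void)
{
    precompute();
    printf("exact %.6f, table %.6f\n", expensive(3.7), lookup(3.7));
    return 0;
}
```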
Impact of Processor Power Consumption • Knowledge of the power consumed by each node in the network is essential for avoiding hot-spots and for comparing performance • Claim: the power consumed by the processor is a substantial portion of the total power consumed • The processor's power consumption is not a constant overhead: it is state- and context-dependent, and depends on what action is taken in response to each event • Ignoring this component can predict incorrect trends, or even invert results, if only the radio is considered
Processor Power Consumption • Simulation setup: 49-node grid topology; 802.11b at 1 Mbps; 600 mA and 400 mA for Tx and Rx, respectively; SA-1100 processor at 133 MHz (190.4 mA/instruction); 3 CBR sessions between random pairs; pre-computed routes • [Plot: % of power consumed by the processor at each node; the ratio is not constant and depends on the role the node plays in the simulation]
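A minimal sketch of energy accounting that charges both the radio and the processor, to illustrate why the processor's share of total power varies with a node's role. The radio currents follow the slide; the SA-1100 figure is treated here simply as an active-CPU current, and the per-node workloads are made up, so the resulting percentages are purely illustrative.

```c
/* Per-node charge accounting for radio and CPU, as argued above. */
#include <stdio.h>

struct node_energy {
    double radio_mAs;   /* charge consumed by the radio     */
    double cpu_mAs;     /* charge consumed by the processor */
};

static void account_tx(struct node_energy *e, double sec)  { e->radio_mAs += 600.0 * sec; }
static void account_rx(struct node_energy *e, double sec)  { e->radio_mAs += 400.0 * sec; }
static void account_cpu(struct node_energy *e, double sec) { e->cpu_mAs   += 190.4 * sec; }

static void report(const char *name, const struct node_energy *e)
{
    printf("%s: CPU is %.0f%% of total consumption\n",
           name, 100.0 * e->cpu_mAs / (e->cpu_mAs + e->radio_mAs));
}

int main(void)
{
    /* A busy forwarding node vs. a mostly idle edge node (made-up workloads). */
    struct node_energy fwd = {0.0, 0.0}, edge = {0.0, 0.0};

    account_rx(&fwd, 0.50);  account_tx(&fwd, 0.50);  account_cpu(&fwd, 0.40);
    account_rx(&edge, 0.10); account_cpu(&edge, 0.05);

    report("forwarder", &fwd);
    report("edge node", &edge);
    return 0;
}
```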
Incorrect Simulation Results • [Plot: white bars show power consumption by the radio only (current simulators); black bars show power consumption by both radio and processor (detailed model)] • Hot-spots are missed (cases 1 and 2) or mis-predicted (case 5) • Relative percentages are incorrectly predicted (case 4: 6% vs. 21%) • Results are inverted relative to the simple model (cases 1, 3, 5) • Summary • Processor power consumption is a significant contributor • This overhead is not a constant that can be easily modeled • Ignoring this component can produce incorrect results
Emulation of TinyOS • Override drivers to communicate with QualNet • TinyOS applications think QualNet is just another hardware platform • Hardware interactions (e.g., sending a packet) create events in QualNet • Each mote runs in a separate thread
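A hedged C sketch of that driver-override idea is shown below: the mote-side "radio driver" hands outgoing packets to the simulator instead of hardware, and the simulator delivers incoming packets through a registered callback. All names here (vdriver_*, qualnet_schedule_tx) are hypothetical placeholders, not the real TinyQ or QualNet interfaces, and the transmit stub simply loops the packet back so the sketch runs standalone.

```c
/* Overriding the mote's radio driver so "hardware" operations become
 * simulator events. A real implementation would enqueue a QualNet event and
 * apply the channel model; the stub below delivers the packet immediately. */
#include <stdio.h>
#include <stddef.h>
#include <stdint.h>

typedef void (*vdriver_rx_cb)(const uint8_t *payload, size_t len);
static vdriver_rx_cb rx_callback;

static void vdriver_radio_deliver(const uint8_t *payload, size_t len);

/* Stand-in for scheduling a transmit event in the discrete-event engine. */
static void qualnet_schedule_tx(int node_id, const uint8_t *payload, size_t len)
{
    (void)node_id;
    vdriver_radio_deliver(payload, len);   /* loop back for this sketch */
}

/* What the TinyOS application believes is the radio send path. */
static int vdriver_radio_send(int node_id, const uint8_t *payload, size_t len)
{
    qualnet_schedule_tx(node_id, payload, len);   /* becomes a simulator event */
    return 0;
}

/* The mote registers a receive callback with the virtual driver. */
static void vdriver_register_rx(vdriver_rx_cb cb) { rx_callback = cb; }

/* Called by the simulator when the channel model delivers a packet. */
static void vdriver_radio_deliver(const uint8_t *payload, size_t len)
{
    if (rx_callback)
        rx_callback(payload, len);
}

/* Example "application": receives its own looped-back packet. */
static void on_packet(const uint8_t *payload, size_t len)
{
    printf("mote received %zu bytes, first byte 0x%02x\n", len, payload[0]);
}

int main(void)
{
    uint8_t msg[4] = { 0xAB, 1, 2, 3 };
    vdriver_register_rx(on_packet);
    vdriver_radio_send(7 /* node id */, msg, sizeof msg);
    return 0;
}
```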
Accuracy of TinyQ • Compared results from TinyQ to those from Harvard’s MoteLab • Application routes periodic packets from one sender to one receiver • Accurate wireless model provides identical results
Impact of Accurate Channel Models • Multihop Oscilloscope: application distributed with TinyOS 2.x • Routes sensor readings to a root mote using tree routing and a CSMA MAC • As node density increases, MAC-layer interference should decrease the packet delivery ratio (PDR)
Performance of TinyQ • Compared against TOSSIM using Blink (no radio)
Performance of TinyQ (cont.) • Compared against TOSSIM using RadioCountToLeds (uses radio)
Performance of TinyQ (cont.) • TinyQ was able to execute most applications that TOSSIM could • Performs worse than TOSSIM on applications that do not use the radio, due to the extra emulation overhead • Performs 10x better on applications that use the radio • TOSSIM uses a connectivity graph, which leads to thrashing when the network gets large
Heterogeneous Networks • Emulation of heterogeneous networks
Case Study 1 • WiFi tier (true emulation): OS: Linux/Windows; stack: IP/AODV/802.11; radio: 802.11a/b • Sensor tier (OS-level emulation): OS: TinyOS/SOS; stack: Surge/tree routing/B-MAC; radio: 100 kbps, range of tens of meters • The two tiers are interoperable
Heterogeneous Networks in SenQ • Sensor subnet (1000 nodes): SOS emulation, CC1100 radio model • IP subnet (50 nodes): QualNet simulation with IP, AODV, UDP/TCP, 802.11 radio and MAC • Gateway nodes (1-10 nodes): each gateway batches K packets from the sensors and then transmits (see the sketch below) • SenQ supports interfacing such diverse networks
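A minimal sketch of the gateway batching behavior just described: collect K sensor-side packets, then hand them to the IP-side interface as one burst. The batch size and the ip_forward_batch hook are illustrative placeholders, not SenQ's actual gateway model.

```c
/* Gateway batching: buffer K packets from the sensor subnet, then forward
 * them in one burst onto the IP subnet. */
#include <stdio.h>

#define K 8                      /* batch size (illustrative) */

static int batch[K];
static int batch_len;

/* Placeholder for handing a batch to the gateway's IP interface. */
static void ip_forward_batch(const int *pkts, int n)
{
    printf("gateway forwards %d packets starting with seq %d\n", n, pkts[0]);
}

/* Called once per packet received on the sensor-side interface. */
static void gateway_on_sensor_packet(int seq)
{
    batch[batch_len++] = seq;
    if (batch_len == K) {        /* batch full: push it to the IP subnet */
        ip_forward_batch(batch, batch_len);
        batch_len = 0;
    }
}

int main(void)
{
    for (int seq = 0; seq < 20; seq++)   /* 20 sensor packets arrive */
        gateway_on_sensor_packet(seq);
    return 0;
}
```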
Heterogeneous Network: Case Study 2 • Sensor network is emulated • IP network is simulated • Gateways are modeled as nodes with two network interfaces • 500 SOS nodes, 500 TinyOS nodes, 50 IP nodes spread randomly over 400m x 400m terrain
Results: Heterogeneous Network • TinyOS motes boot up at 30 simulation seconds • SOS motes die between 40 and 60 simulation seconds
Results: Heterogeneous Network • TinyOS application behaves correctly • Also shows that 500 motes is not the optimal number of motes to cover the area • Many motes are isolated and cannot route to a gateway
Future Work • Hybrid sensor network modeling (Yi-Tao Wang) • Integration of transport and vehicular communication network simulators (Yi Yang) • Cross-layer interactions between routing protocols and the MAC interface using the Multi-Paradigm Framework (Shrinivas Mohan)