A Real-World Test-bed for Mobile Ad hoc Networks: Methodology, Experimentations, Simulation and Results. Per Gunningberg, Erik Nordström, Christian Rohner, Oskar Wibling Uppsala University
Background and problem • IETF is standardizing MANET (Mobile Ad hoc NETwork) routing protocols: • One proactive protocol - knowledge about all nodes • One reactive protocol - builds paths on demand • Based on experience with three protocols: • AODV - Ad hoc On-Demand Distance Vector (reactive) • DSR - Dynamic Source Routing (reactive) • OLSR - Optimized Link State Routing (proactive) • Problem: the majority of research is done through simulations...
Part One • A test-bed for evaluating ad hoc routing protocols. • Close to reality • What to measure and how to analyze • Repeatable experiments • Grey Zone Phenomena • Conclusion
The Uppsala Ad hoc Protocol Evaluation Testbed (APE) • People carrying laptops with 802.11b • Suitable for indoor experiments that are hard to model in simulation
The Ad hoc Protocol Evaluation Testbed (APE) • Execution environment on top of an existing OS. • Runs on Windows and Linux. • Scenarios with movement choreography. • Emphasizes easy management for scaling. • 800+ downloads.
Laptop instructions (choreography)
node.11.action.0.msg=Test is starting...
node.11.action.0.command=start_spyd
node.11.action.0.duration=1
node.11.action.1.command=my_iperf -c 2 -t 330
node.11.action.1.msg=Stay at this location.
node.11.action.1.duration=30
node.11.action.2.msg=Start moving! Go to Point A, the end of the building.
node.11.action.2.duration=75
node.11.action.3.msg=You should have arrived at Point A. Please stay.
node.11.action.3.duration=30
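The node.N.action.M.key format above maps naturally onto a per-node action timeline. A minimal sketch of a parser for such a file (the parser is illustrative, not part of APE; the key shape is taken from the snippet above):

# Illustrative parser for APE-style choreography properties (sketch, not APE code).
from collections import defaultdict

def parse_choreography(lines):
    # nodes[node_id][action_seq] -> {"msg": ..., "command": ..., "duration": ...}
    nodes = defaultdict(lambda: defaultdict(dict))
    for line in lines:
        line = line.strip()
        if not line or "=" not in line:
            continue
        key, value = line.split("=", 1)
        parts = key.split(".")
        # Expected key shape: node.<id>.action.<seq>.<field>
        if len(parts) != 5 or parts[0] != "node":
            continue
        _, node_id, _, seq, field = parts
        nodes[int(node_id)][int(seq)][field] = value
    return nodes

# Print node 11's timeline in order.
for seq, action in sorted(parse_choreography(open("scenario.props"))[11].items()):
    print(seq, action.get("duration"), action.get("msg", action.get("command")))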
Measurement procedures • Every node collects SNR from every other node it can hear during the test session • Every event is time stamped • Received Packets/Application results are collected at all nodes • Routing state snapshots are collected • Analysis is done after the test session.
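Because every event carries a timestamp and logs are gathered from all nodes, post-session analysis can merge the per-node logs into one global timeline. A minimal sketch of that merge step (the record layout here is an assumption, not APE's actual log format):

# Merge time-sorted per-node event logs into a single global timeline (sketch).
import heapq

def merge_logs(per_node_events):
    # per_node_events: {node_id: [(timestamp, event), ...]}, each list time-sorted
    streams = [[(ts, node, ev) for ts, ev in events]
               for node, events in per_node_events.items()]
    return list(heapq.merge(*streams))

timeline = merge_logs({
    0: [(0.0, "snr 1 34"), (1.0, "snr 1 33")],
    1: [(0.5, "snr 0 35")],
})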
Replaying a scenario • SNR mapped to virtual distance • Each time interval corresponds to a topological map [Figure: replayed topological map; time axis T with marks at 25, 50, 75, 100, 125, 150; waypoints Point A and Point D]
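The slides do not spell out the SNR-to-virtual-distance mapping; one common choice is to invert a log-distance path-loss model, which the sketch below assumes (all constants are illustrative, not values from the APE work):

# Map an SNR reading to a "virtual distance" by inverting a log-distance
# path-loss model: SNR(d) = SNR(d0) - 10*n*log10(d/d0).
# Constants are illustrative assumptions.
def virtual_distance(snr_db, snr_at_d0=40.0, d0=1.0, n=3.0):
    return d0 * 10 ** ((snr_at_d0 - snr_db) / (10.0 * n))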
APE is a Testbed for… • Relative protocol performance comparisons • Radio channel effects on ad hoc routing protocols • Interactions between hardware, software, protocol, mobility and radio environment Example: Grey Zone Phenomena • Validation of simulation models • Generation of traces
[Figure: 802.11 Grey Zone Phenomena - chain of nodes 0-3; broadcast frames reach farther than unicast frames]
Challenge • Results should be reproducible and comparable between tests • It follows that experiments must be repeatable... • ...and therefore stochastic factors need to be dealt with • So – what can we achieve?
Stochastic Factors in Real World Experiments • Node mobility adds frequent changes in the network topology. • We use choreography and “measure topology differences” • Variations in hardware and software configuration. • We use identical hardware and software. • Time-varying radio environment affects link quality and error rates.
Topology differences - visual check [Figure: mobility traces for Experiment 1 and Experiment 2; RED = average mobility, GREEN = 25% with lowest mobility, BLUE = 25% with highest mobility]
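Beyond the visual check, the per-interval virtual-distance maps allow a simple quantitative comparison between two experiment runs. One hedged way to score the "topology difference" (an assumed metric for illustration, not necessarily the one used in the paper):

# Mean absolute difference between two experiments' virtual-distance maps,
# averaged over common node pairs and time intervals (assumed metric, sketch).
def topology_difference(run_a, run_b):
    # run_x: {interval: {(i, j): virtual_distance}}
    diffs = []
    for t in run_a.keys() & run_b.keys():
        common = run_a[t].keys() & run_b[t].keys()
        diffs += [abs(run_a[t][p] - run_b[t][p]) for p in common]
    return sum(diffs) / len(diffs) if diffs else 0.0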
Part Two • Evaluating MANET protocols with the APE testbed, simulation and emulation. • Scenarios • UDP, Ping and TCP • Side-by-side comparison • Faulty protocol constructs • Conclusion
Routing protocols’ ability to adapt - how they react to connectivity changes • OLSR - proactive link-state routing. Monitors neighbors and exchanges link-state info. • AODV - broadcasts to set up a path. HELLO messages or link-layer feedback to detect link failure. • DSR - broadcasts with a source route. Listens to other traffic to find shorter routes. RTT measurements and network-layer ACKs.
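AODV's HELLO-based failure detection, for example, boils down to a timeout on the last HELLO heard from each neighbor. A minimal sketch (parameter values follow RFC 3561 defaults; the code is illustrative, not AODV-UU code):

# HELLO-based neighbor failure detection in the style of AODV (RFC 3561):
# a link is declared broken if no HELLO arrives within
# ALLOWED_HELLO_LOSS * HELLO_INTERVAL.
HELLO_INTERVAL = 1.0        # seconds (RFC 3561 default: 1000 ms)
ALLOWED_HELLO_LOSS = 2

last_hello = {}             # neighbor -> time of last HELLO heard

def on_hello(neighbor, now):
    last_hello[neighbor] = now

def broken_links(now):
    timeout = ALLOWED_HELLO_LOSS * HELLO_INTERVAL
    return [n for n, t in last_hello.items() if now - t > timeout]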
Emulation • Same configuration as the real-world tests • Table-top emulation • MAC filters force connectivity changes • Reduces radio and mobility factors • Interference reduces bandwidth
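A table-top emulation of this kind can be driven by a time-indexed connectivity schedule, with each step translated into MAC-address filter rules on the nodes. A hedged sketch of the schedule side (the format and the swap at t=60 s are assumptions for illustration; the actual filter mechanism is not shown):

# Time-indexed connectivity schedule for a table-top emulation (sketch).
# Each step lists which node pairs may hear each other; everything else
# would be blocked by MAC-address filters on the receivers.
SCHEDULE = [
    (0,  {(0, 1), (1, 2), (2, 3)}),   # t=0 s: chain 0-1-2-3
    (60, {(0, 1), (1, 3), (3, 2)}),   # t=60 s: relay nodes swapped
]

def allowed_neighbors(node, t):
    # Pick the most recent schedule step that started at or before t.
    links = max((step for step in SCHEDULE if step[0] <= t),
                key=lambda step: step[0])[1]
    return ({b for a, b in links if a == node} |
            {a for a, b in links if b == node})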
Simulation • Scenarios recreated in an ns-2 simulation using “default” models: • Transmission range tuned to better match indoors • Mobility with jitter modeled after real-world measurements • Results averaged over 10 runs • Results provide a baseline • Can simulations using default (simple) models be used to predict routing protocol performance in complex real-world environments?
Multidimensional Comparison • 3 x 3 x 3 x (10 runs) = 270 runs • Three MANET routing protocol implementations: • OOLSR, AODV-UU, DSR-UU • Three traffic types: • UDP (20 pkts/s CBR) • Ping (20 pkts/s CBR) • TCP (File transfer) • Three mobility scenarios: • End node swap, Relay node swap, Roaming node • Three environments (dimensions): • Simulation, Emulation, Real world
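The 3 x 3 x 3 design with 10 repetitions per cell gives the 270 runs; enumerating the run matrix is straightforward (a sketch, with the names taken from the list above):

# Enumerate the full experiment matrix:
# 3 protocols x 3 traffic types x 3 scenarios x 10 repetitions = 270 runs.
from itertools import product

protocols = ["OOLSR", "AODV-UU", "DSR-UU"]
traffic   = ["UDP", "Ping", "TCP"]
scenarios = ["end-node-swap", "relay-node-swap", "roaming-node"]

runs = [(p, tr, s, rep)
        for p, tr, s in product(protocols, traffic, scenarios)
        for rep in range(10)]
assert len(runs) == 270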
Experimental Test Environment • Indoors with offices and corridors • Four nodes (0, 1, 2, 3) • Four waypoints (A, B, C, D) • One data stream from node 3 to node 0
[Figure: Relay node swap - waypoints A-D, nodes 0-3]
Scenarios – Relay Node Swap • End nodes stationary • Intermediate nodes change positions • Hop count never smaller than 2
[Figure: End node swap - waypoints A-D, nodes 0-3]
Scenarios – End Node Swap • End nodes change positions • Intermediary nodes stationary • Hop count changes from 3, via (2), to 1 and back
[Figure: Roaming node - waypoints A-D, nodes 0-3]
Scenarios – Roaming Node • Roaming node is source node • All other nodes stationary
Observations • Simulation and emulation are similar in absolute CBR performance but not in relative protocol ranking • Real-world CBR performance is significantly lower • The discrepancy grows with traffic complexity and scenario • TCP performance is orders of magnitude lower in the real world than in simulation • Periods of no-progress time in the real world
Observations (continued) • OLSR tries less hard to re-route and therefore achieves more even performance • Radio factors account for most of the discrepancy between simulation and the real world... • ...but secondary effects, such as protocol-specific cross-layer interactions, dominate, e.g.: • Lost HELLOs (AODV) • Excessive buffering (DSR)
Protocol comparison conclusion • If one protocol performs better than another in simulation, is it possible to assume the same for the real world? • NO
Flip-Flop Routing [Figure: DSR route flip-flopping - real world vs. simulation]
Conclusions • APE aims to address the lack of real-world ad hoc experimental research test-beds • Repeatability addressed at a level that allows relative protocol comparisons • The value of cross-environment evaluation • Reveals sensing problems that lead to instabilities and poor performance • Not visible in simulations
The End • Paper: • http://www.it.uu.se/research/group/core/publications/GC_technical_report.pdf • APE testbed: • http://apetestbed.sourceforge.net/ • The Research group: • http://www.it.uu.se/research/group/core/
Extra Slides • More details…