Developing Metrics for Cognitive Networking
A deceptively simple exercise in research programming
J Christopher Ramming, SRI International
5/11/2003
DARPA’s role in science and technology
[Figure; source: DARPA’s strategic plan, 2003]
Perfecting a program objective
Desirable properties:
• Describes a new capability
• Can’t get there incrementally
• Possible now because of new ideas, approaches, breakthroughs
• Enables new “concepts of operations” (CONOPS)
• Quantitatively expressed
• Objectively measurable
Examples:
• Create a system that can be used to pinpoint, within X meters, the location of an object anywhere on the surface, given an observation window of less than Y seconds
• Given a 0-day worm that infects ~X% of vulnerable machines in (0, t), create a system to quarantine peak infection proportion to Y%, while maintaining Z% mission system availability in (0, t)
• Develop a fighter jet with radar return of <X and electronic emission of <Y
• Develop body armor weighing less than X pounds and having an areal density of >Y ounces/ft²
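To see what “quantitatively expressed” and “objectively measurable” buy you, here is a minimal sketch of how the worm-quarantine example above could be scored against simulation output. It is not from the original slides; the function name, thresholds, and data are all hypothetical.

```python
# Hypothetical scoring of the worm-quarantine objective: hold peak infection
# proportion to Y% while maintaining Z% mission system availability in (0, t).

def meets_quarantine_objective(infected_fraction, available_fraction,
                               peak_target=0.05, availability_target=0.99):
    """infected_fraction, available_fraction: per-timestep series over (0, t).

    peak_target is the slide's Y% and availability_target its Z%, both
    chosen arbitrarily here.
    """
    peak_infection = max(infected_fraction)
    mean_availability = sum(available_fraction) / len(available_fraction)
    return (peak_infection <= peak_target
            and mean_availability >= availability_target)

# Made-up simulation output: infection peaks at 4%, availability averages
# 99.2%, so this run meets the (Y=5%, Z=99%) objective.
infected = [0.001, 0.010, 0.040, 0.020, 0.005]
available = [1.000, 0.995, 0.980, 0.990, 0.995]
print(meets_quarantine_objective(infected, available))  # True
```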
Recipe
• Identify a new capability we can create
• Argue that there is no incremental path
• Identify technical foundations
• Specify a new CONOP based on the capability
• Determine metrics
• Understand testbed/simulation requirements and experimental design needed to evaluate progress
Cognitive networking: broadly identifying a capability to focus on
• Fault management
• Network configuration
• Traffic engineering
• Protocol selection
• Tactical monitoring
• Signal intelligence
[Callout in the original slide: “Today’s explorations”]
Metrics for cognitive networking
[Figure: metrics arranged along two axes, cognition-specific vs. networking-specific and domain-independent]
• Cognition-specific, domain-independent: asymptotic performance*, time & space complexity
• Cognition-specific and networking-specific (the region of great interest): performance* in distributed environments, performance* in nonstationary environments
• Networking-specific: user availability, packet stats (latency, jitter, loss)
* “performance” is used here in the ML sense of the word
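The networking-specific corner of this chart is the easiest to pin down: latency, jitter, and loss are directly computable from packet traces. A minimal hypothetical sketch, assuming matched send/receive timestamps with None for lost packets, and taking jitter as the mean absolute difference of consecutive one-way delays (one common definition; RFC 3550 specifies a smoothed variant):

```python
# Hypothetical sketch of the networking-specific metrics: latency, jitter,
# and loss computed from matched send/receive timestamps (None = lost).

def packet_stats(send_times, receive_times):
    delays = [r - s for s, r in zip(send_times, receive_times) if r is not None]
    loss = 1.0 - len(delays) / len(send_times)
    latency = sum(delays) / len(delays)
    # Mean absolute difference of consecutive one-way delays; assumes at
    # least two packets arrived.
    jitter = sum(abs(b - a) for a, b in zip(delays, delays[1:])) / (len(delays) - 1)
    return {"latency": latency, "jitter": jitter, "loss": loss}

send = [0.00, 0.02, 0.04, 0.06, 0.08]
recv = [0.03, 0.05, None, 0.11, 0.12]   # third packet lost
print(packet_stats(send, recv))
# latency ~0.0375 s, jitter ~0.01 s, loss 0.2
```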
Performance in non-stationary environments
Basic metrics: asymptotic limit; epochs and/or time needed to approach asymptote
[Figure: learning curve, performance vs. experience, with a phase change that invalidates historical knowledge and models]
• How long does it take to recognize that the current model is broken?
• If the change can be recognized, obsolete assumptions can be eliminated to recalibrate the system
• How much prior learning can be retained?
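The first question on this slide, how long it takes to recognize that the model is broken, is itself a measurable quantity: run a change detector over the model's per-sample error stream and report the detection delay. The sketch below uses a Page-Hinkley test, a standard drift detector chosen here purely for illustration (the slides do not prescribe one):

```python
# Hypothetical sketch: detection delay as a metric for non-stationary
# environments, using a Page-Hinkley test over a model's error stream.

def page_hinkley_detect(errors, delta=0.005, threshold=1.0):
    """Return the index at which a sustained rise in error is flagged,
    or None if no change is ever detected."""
    mean = cumulative = minimum = 0.0
    for t, e in enumerate(errors, start=1):
        mean += (e - mean) / t           # running mean of the error
        cumulative += e - mean - delta   # drift statistic
        minimum = min(minimum, cumulative)
        if cumulative - minimum > threshold:
            return t - 1                 # 0-based index of the alarm
    return None

# Synthetic error stream: a phase change at t = 100 raises the model's
# error from 0.1 to 0.5; the detector fires two samples later.
errors = [0.1] * 100 + [0.5] * 100
alarm = page_hinkley_detect(errors)
print("change at t=100, detected at t=%d, delay=%d" % (alarm, alarm - 100))
```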
Performance in distributed environments
[Figure: network diagram of distributed state, showing host addresses 192.0.2.10 / 192.0.2.11 (one link marked “????”), Stanford (SUNet ID), a DHCP server (leases), DNS, the Stanford Kerberos server (AS/TGS), and Stanford NetDB (MAC address registration)]
Evaluating metrics
• Experimental design
  - If the metric can’t be properly evaluated, it may not be an appropriate choice
• Testbeds and simulations
  - A potentially heavy tax! Let’s work with our colleagues who already have testbeds and simulators
Why the focus on metrics?
• Knowing what metrics are meaningful demonstrates grasp of the subject
• Metrics can [sometimes] be used to evaluate interim progress
• Metrics are a communications tool that helps focus group activity
• A clearly focused objective helps the taxpayer and DARPA’s clients understand what they’re going to get for their money
Performance in non-stationary environments
Basic metrics: asymptotic limit; epochs and/or time needed to approach asymptote
[Figure: learning curve, performance vs. experience, with a phase change that invalidates historical knowledge and models]
A naive learning system may or may not ever recover on its own
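That point can be made concrete: a learner that keeps averaging over its entire history dilutes the phase change across all past data, so its post-change error decays only as O(1/t) and it never re-approaches its old asymptote within any reasonable horizon. A hypothetical sketch:

```python
# Hypothetical sketch of a naive learner after a phase change: a running
# mean over all history recovers only at rate O(1/t).

def cumulative_mean_errors(stream, true_after):
    """Absolute error of a running-mean predictor against the new regime."""
    mean, errs = 0.0, []
    for t, x in enumerate(stream, start=1):
        mean += (x - mean) / t
        errs.append(abs(mean - true_after))
    return errs

# The quantity being predicted jumps from 0.0 to 1.0 at t = 500.
stream = [0.0] * 500 + [1.0] * 500
errs = cumulative_mean_errors(stream, true_after=1.0)
# 100 samples after the change the naive learner is still ~0.83 off, and
# even 500 samples later it remains 0.5 off.
print(errs[599], errs[999])
```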
Recognizing disruption: basic recovery in non-stationary environments
Basic metrics: asymptotic limit; epochs and/or time needed to approach asymptote
[Figure: learning curve, performance vs. experience, with a phase change that invalidates historical knowledge and models]
• How long does it take to recognize that the current model is broken?
• If the change can be recognized, obsolete assumptions can be eliminated to recalibrate the system
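Following this slide's recipe, recovery itself becomes measurable: restart the learner at the detected change point (e.g., from a detector like the Page-Hinkley sketch above) and count the samples it needs to re-approach the new asymptote. A hypothetical sketch; the tolerance and noise level are arbitrary:

```python
# Hypothetical sketch: once obsolete assumptions are eliminated (here, by
# discarding all pre-change samples), recovery time is the number of fresh
# samples the recalibrated learner needs to get near the new regime.

import random

def samples_to_recover(stream, change_point, true_after, tol=0.05):
    mean = 0.0
    for n, x in enumerate(stream[change_point:], start=1):
        mean += (x - mean) / n   # learner restarted on post-change data only
        # First time within tolerance (a simplification; a stricter metric
        # would require the learner to stay there).
        if abs(mean - true_after) <= tol:
            return n
    return None

random.seed(0)
stream = ([random.gauss(0.0, 0.3) for _ in range(500)] +
          [random.gauss(1.0, 0.3) for _ in range(500)])
print(samples_to_recover(stream, change_point=500, true_after=1.0))
```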