BU/NSF Workshop on Internet Measurement, Instrumentation, and Characterization
http://www.cs.bu.edu/pub/imic
Boston University, August 30, 1999
IMIC Workshop Panel: 17:00 - 18:20
IMIC Challenges, Opportunities & Initiatives
• Coordinator: Azer Bestavros
• Panelists:
  • Sugih Jamin, University of Michigan
  • Walter Willinger, AT&T Research
  • Mostafa Ammar, Georgia Tech
  • Henning Schulzrinne, Columbia University
  • Karen Sollins, NSF/CISE ANIR Program Director
IMIC Workshop Themes
• IMIC Challenges
  • Common pitfalls in IMIC research
  • Dealing with issues of Internet scale and heterogeneity
• IMIC Opportunities
  • Uncharted research questions and issues
  • Inter- and intra-disciplinary collaborations to be fostered
• IMIC Initiatives
  • Needed infrastructure? Funding initiatives?
  • What is next? Future IMIC meetings
IMIC Workshop Panel: Recap and Questions
Session I: "What is steady-state in the Internet?"
• How important are topological measurements (e.g., hop distances)? How correlated are dynamic metrics with Internet topology? What are good metrics (end-to-end, application-oriented vs. network-oriented)?
• What is the impact of the measurement infrastructure itself on network behavior (e.g., stability)?
• Is network measurement enough? What about server measurement? What is the impact of server performance on the network (and vice versa)?
• Experience from Surveyor suggests that variability can be as high as five-fold over short periods of time. How susceptible are IDMaps to such variability? Could variability itself be a metric in IDMaps? How do we compose various metrics and measurements (e.g., additive vs. min/max)? A sketch of such composition appears below.
• What initiatives could facilitate the collection of Internet data and the deployment of instrumentation infrastructure?
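As a concrete illustration of the metric-composition question above, here is a minimal sketch (not from the workshop; the hop values are invented) showing how end-to-end path metrics compose differently from per-hop measurements: delay adds, bandwidth takes the bottleneck minimum, and delivery probability multiplies.

```python
# Minimal sketch of composing end-to-end path metrics from per-hop
# measurements: delay is additive, bandwidth is a min (bottleneck),
# delivery probability is multiplicative. Hop values are made up.

hops = [
    {"delay_ms": 2.1,  "bandwidth_mbps": 100.0, "delivery_prob": 0.999},
    {"delay_ms": 9.7,  "bandwidth_mbps": 45.0,  "delivery_prob": 0.995},
    {"delay_ms": 31.4, "bandwidth_mbps": 155.0, "delivery_prob": 0.990},
]

def compose_path(hops):
    """Compose per-hop metrics into end-to-end path metrics."""
    e2e_delay = sum(h["delay_ms"] for h in hops)             # additive
    e2e_bandwidth = min(h["bandwidth_mbps"] for h in hops)   # min / bottleneck
    e2e_delivery = 1.0
    for h in hops:
        e2e_delivery *= h["delivery_prob"]                   # multiplicative
    return e2e_delay, e2e_bandwidth, e2e_delivery

delay, bw, p = compose_path(hops)
print(f"e2e delay ~ {delay:.1f} ms, bottleneck bw ~ {bw:.1f} Mb/s, delivery ~ {p:.3f}")
```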
IMIC Workshop Panel: Recap and Questions
Session II: "The Good, the Bad, and the Ugly"
• Characterization: techniques for dealing with heterogeneity and large scale. Success stories w.r.t. self-similarity, heavy-tailed distributions, and multifractals. Spatio-temporal characterization is imminent given the trend in Internet topology characterization. (A sketch of tail-index estimation appears below.)
• Modeling: reduced to a data-fitting exercise! No need for new "black boxes". The cure is to focus on "invariants" across large data sets and turn Internet modeling into a physical science.
• Performance evaluation: abstracting away the interesting. Variability is prevalent. Where is the feedback?
• Unique opportunities: high-quality measurements; a first-rate experimental environment; exciting mathematics that ties measurement to the underlying structure. Need "qualitative insights"!
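As an illustration of the heavy-tail characterization work mentioned above, here is a minimal sketch of the classical Hill estimator for the tail index; the synthetic Pareto data and the choice of k are assumptions for the example, not workshop results.

```python
import math
import random

# Minimal sketch of the Hill estimator for the tail index alpha of a
# heavy-tailed sample (e.g., file sizes or transfer durations).
# Synthetic Pareto data stands in for real measurements.

def hill_estimator(sample, k):
    """Estimate the tail index alpha from the k largest order statistics."""
    xs = sorted(sample, reverse=True)          # descending order statistics
    x_k1 = xs[k]                               # (k+1)-th largest value
    logs = [math.log(xs[i] / x_k1) for i in range(k)]
    return k / sum(logs)                       # Hill estimate of alpha

random.seed(1)
alpha_true = 1.2
data = [random.paretovariate(alpha_true) for _ in range(10_000)]
print("alpha estimate:", round(hill_estimator(data, k=500), 2))  # should be near 1.2
```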
IMIC Workshop Panel: Recap and Questions
Session II: "The Good, the Bad, and the Ugly" (continued)
• End-to-end views for getting an accurate picture of the Internet assuming NO help from the network (infrastructure, cross-sections, and composition of cross-sections).
• Lack of correlation forces the use of multicast (vs. unicast). Can we infer network characteristics from weaker correlations than multicast provides? (A sketch of multicast-based loss inference appears below.)
• Validation through simulation versus validation through deployment and measurement. How important is the "controlled environment" requirement? Does a "controlled environment" make sense in the context of the Internet?
• Opportunities: relating multicast to unicast; composing views; topology compression; integrating end-to-end techniques with network support.
• Web versus Internet: adding the "human factor" to an already complex system. Need for more datasets to validate models. Emerging trends at the application layer may invalidate state-of-the-art results and conclusions.
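The multicast-inference point above can be made concrete with a small sketch in the spirit of MINC-style loss inference; the two-receiver tree and the link probabilities are invented for illustration. Correlated end-to-end receipt patterns at two receivers suffice to estimate the pass probability of the shared link.

```python
import random

# Minimal sketch of multicast-based loss inference for a two-receiver
# tree: one shared link feeding two leaf links. Only end-to-end receipt
# patterns are observed; the shared-link pass probability is inferred
# from their correlation. True link probabilities are made up.

random.seed(2)
a_shared, a_left, a_right = 0.95, 0.90, 0.85   # per-link pass probabilities
N = 100_000

n1 = n2 = n12 = 0
for _ in range(N):
    shared = random.random() < a_shared
    r1 = shared and random.random() < a_left    # receiver 1 got the probe
    r2 = shared and random.random() < a_right   # receiver 2 got the probe
    n1 += r1
    n2 += r2
    n12 += r1 and r2

p1, p2, p12 = n1 / N, n2 / N, n12 / N
# Moment estimators: p1 = a0*a1, p2 = a0*a2, p12 = a0*a1*a2, so a0 = p1*p2/p12.
a0_hat = p1 * p2 / p12
print("estimated shared-link pass prob:", round(a0_hat, 3), "(true 0.95)")
```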
IMIC Workshop Panel: Recap and Questions
Session III: "The Application Knows Better"
• The Anycast architecture as an end-to-end service that enables server performance to be taken into consideration. Interesting questions include the effect on stability and the positive impact of anycasting clients on non-anycasting clients. Challenges: How do we take network characteristics into account? How does anycasting scale?
• Leveraging network mechanisms to improve end-to-end performance (e.g., using enhanced ECN to coordinate multiple streams to the same client that share a network bottleneck, and using DiffServ to prioritize packets based on context).
• Need for parametric end-to-end transports that allow applications to pick and choose their QoS along reliability, priority, deadlines, and dependencies (e.g., HPF); see the sketch below.
• Need to differentiate between policy (algorithmic) and transport protocols.
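To make the "pick and choose" idea concrete, here is a hypothetical sketch of per-message QoS parameters an application might hand to a parametric end-to-end transport; the names and fields are illustrative and do not reflect HPF's actual interface.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical sketch of application-chosen QoS knobs for a parametric
# end-to-end transport: reliability, priority, deadline, dependencies.

@dataclass
class QoSSpec:
    reliable: bool = True                 # retransmit until delivered?
    priority: int = 0                     # higher value = more urgent
    deadline_ms: Optional[int] = None     # drop if not delivered in time
    depends_on: List[int] = field(default_factory=list)  # message IDs that must arrive first

def send(msg_id: int, payload: bytes, qos: QoSSpec) -> None:
    """Placeholder send call: a real transport would schedule the message
    according to its QoS spec; here we just report the chosen parameters."""
    print(f"msg {msg_id}: {len(payload)} bytes, qos={qos}")

# A video key frame: must arrive, high priority, no hard deadline.
send(1, b"\x00" * 1200, QoSSpec(reliable=True, priority=10))
# A delta frame: may be dropped after 150 ms, depends on the key frame.
send(2, b"\x00" * 300, QoSSpec(reliable=False, priority=5, deadline_ms=150, depends_on=[1]))
```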
IMIC Workshop Panel: Recap and Questions
Session IV: "The Network World Order"
• Need to provide a few service classes that match common applications (e.g., multimedia). What if my application is a "misfit"?
• For multimedia traffic, TCP-friendliness is not good enough; RSVP is too complex, unscalable, and not incrementally deployable (as opposed to YESSIR).
• Reservation techniques for QoS require resource allocation; SLAs are insufficient. Need a mapping from SLAs to traffic-provisioning mechanisms.
• Network solutions require an "economic incentive to throttle": a longer commitment implies a higher price (e.g., RNAP pricing); a sketch appears below.
• Tradeoffs are possible once resource awareness is available, but it need not be hard. Can we capture such awareness in middleware to make application development a snap? :)
• Bottlenecks are a moving target (storage and CPU are moving to clients; bandwidth bottlenecks are not in the backbone).
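As a rough illustration of the "economic incentive to throttle" point, here is a hypothetical pricing function in which holding more bandwidth for a longer commitment costs disproportionately more; the constants and the formula are invented for illustration and are not RNAP's actual pricing scheme.

```python
# Hypothetical pricing sketch: per-unit price grows with both the bandwidth
# reserved and the length of the commitment, so holding more resources for
# longer costs disproportionately more. All constants are made up.

BASE_PRICE = 0.01            # $ per Mb/s per minute (illustrative)
COMMITMENT_SURCHARGE = 0.10  # extra fraction per additional 10-minute block

def reservation_price(bandwidth_mbps: float, commitment_min: float) -> float:
    """Price of reserving `bandwidth_mbps` for `commitment_min` minutes."""
    surcharge = 1.0 + COMMITMENT_SURCHARGE * (commitment_min / 10.0)
    return BASE_PRICE * bandwidth_mbps * commitment_min * surcharge

for minutes in (10, 30, 60):
    print(f"5 Mb/s for {minutes:>2} min -> ${reservation_price(5.0, minutes):.2f}")
```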
IMIC Workshop Panel: Issues
Teasers:
• Is measurement of network performance enough? What is the impact of server performance on the network (and vice versa)?
• Validation through simulation versus validation through deployment and measurement. How important is the "controlled environment" requirement? Do network testbeds make sense in the context of IMIC?
• Which network mechanisms are the "right" ones to support end-to-end application requirements (e.g., enhanced ECN, anycasting, DiffServ, …)?
• Is pricing an option for imposing a network world order? How does one reconcile pricing with variability (pricing requires consistency)?
• What does it take to get "neat" solutions adopted?
BU/NSF Workshop on Internet Measurement, Instrumentation, and Characterization
http://www.cs.bu.edu/pub/imic
Boston University, August 30, 1999