Report on NREDS at SIGCOMM 2003
Karen Sollins
ICNP 2003, November 7, 2003
Details
• NREDS = Network Research: Exploration of Dimensions and Scope
• Date: August 25, 2003
• Location: SIGCOMM 2003, Karlsruhe, Germany
• Funding: none (intentionally)
• Report will be in CCR
• More details will be at http://www.acm.org/sigcomm
Objective
• Take up from the report Looking Over the Fence at Networks: A Neighbor's View of Networking Research
• Start discussions about (for example):
  • the nature of, and choices about, what to do in research
  • how to value it
  • the impact of those valuations on the research topics people choose
• Do this outside the agenda of any particular agency or other funding source
• Include people from a broad range of "network researchers"
• Begin the search for what to do beyond this
Organization
• Committee:
  • Mark Allman (ICIR, prev. BBN Technologies)
  • Balaji Prabhakar (Stanford)
  • Stefan Savage (UCSD)
  • Karen Sollins (MIT, chair)
• Student scribes:
  • Steve Bauer (MIT)
  • Mayank Sharma (Stanford)
  • Renata Teixeira (UCSD)
Participation
• Solicited position papers: accepted six short papers, and also invited some submitters whose papers were not accepted
• Supplemented with further invitations to broaden participation, aiming to include people who might not otherwise attend SIGCOMM (partially successful)
• By invitation only, limited to about 30 people
Structure of the discussions
• Not a presentation of position papers
• Each session had a brief speaker and a briefer respondent (less than half an hour of presentation, with over an hour of discussion)
• Four major sessions and a conclusion:
  • Do we have a shared meaning of "network research"?
  • Where is the science in network research?
  • Where is the research beyond the current tipping point?
  • How do we value and evaluate research? How does/should our field evolve?
  • Where do we go from here? (conclusion)
Meaning of "network research"
• Clear disagreement about the actual topics; everyone has their own favorites
• Drivers for a definition:
  • Effective impact on industry
  • Curiosity
  • Education
  • Definition of underlying axioms
Science of "network research"
• No science in "fitting" (curves, graph theory, or whatever): there is no possibility of failure, since one can always fit a curve to data (see the sketch below)
• Little science used in protocol design, and its value there is unclear:
  • TCP
  • BGP
• The challenge is understanding complex systems
• There is science in applying control, coding, and information theory; still at an early stage, both for protocol design and for more architectural questions (multilayer design)
• Discussed the importance (role and impact) of measurement, data cleansing, and archiving data: repeatable studies, time studies, etc.
• Idea of "pockets" of science in the field
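To make the "no possibility of failure" point concrete, here is a small Python sketch of my own (not from the workshop): a degree-7 polynomial passes exactly through eight points of pure noise, so the fit "succeeds" no matter what the data are, which is why fitting alone is not a falsifiable scientific claim.

    # Illustrative sketch (not from the workshop): a degree-(n-1) polynomial
    # passes exactly through any n points, even pure noise, so "fitting"
    # cannot fail and therefore proves nothing by itself.
    import numpy as np

    rng = np.random.default_rng(0)
    x = np.arange(8, dtype=float)
    y = rng.normal(size=8)              # pure noise, no underlying law

    coeffs = np.polyfit(x, y, deg=7)    # exactly determined: 8 points, degree 7
    max_residual = np.abs(np.polyval(coeffs, x) - y).max()
    print(f"max residual: {max_residual:.2e}")   # essentially zero: a "perfect" fit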
The Tipping Point
• The point at which the economic choice not to change outweighs the economic choice to change; can be graphed as a curve with an inflection point (see the sketch below)
• What does it mean to influence the inflection point?
• Can we move ourselves to an alternative curve with a different inflection point?
• Questions of which dimensions innovation happens in, and when (e.g. process vs. product innovation, other externalities that have an impact)
• Exploration of a model of the evolution of the network
• How do we evolve our model of evolution?
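A minimal sketch of this framing, with hypothetical cost curves of my own (not from the workshop): compare a rising cost of changing against the steady cost of living with the status quo, and locate the point past which change is no longer economically rational; shifting either curve shifts that point, which is one reading of "moving to an alternative curve" above.

    # Illustrative sketch with hypothetical cost curves (not from the workshop):
    # the "tipping point" here is the moment the cost of changing first
    # outweighs the cost of living with the status quo.
    import numpy as np

    t = np.linspace(0, 20, 2001)                  # time, arbitrary units
    cost_of_changing = 1.0 + 0.5 * t              # switching cost grows with the installed base
    cost_of_staying = np.full_like(t, 3.0)        # steady cost of living with the old design

    tipped = cost_of_changing > cost_of_staying
    tipping_point = t[np.argmax(tipped)]          # first time change is no longer worth it
    print(f"tipping point at t = {tipping_point:.1f}")

    # Flattening the switching-cost curve (say, 0.3 * t instead of 0.5 * t)
    # pushes the tipping point later -- one way research could "influence" it.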
Valuing and evaluating research
• How research is valued depends on the nature of the organization: academic vs. non-profit lab vs. government lab vs. industrial lab
• Funding models: not only the amounts, but also the risk of being stuck on a treadmill of incremental projects in order to keep the flow going
• Nature of the people (senior leadership and junior PhDs, faculty as small entrepreneurs, etc.)
• Kinds of support from the organization
• Degree of impact of the organization's mission on its research
• Issues of recognition and motivation:
  • Peer acceptance (conferences, publications, etc.)
  • Organizational acceptance (promotions)
  • Funding (how much, from whom)
Random collection of further ideas
• More workshops:
  • Small is good for conversation
  • Large is good for more inclusion
  • Longer is good for working through issues rather than just raising them
  • Suggestion: multiple several-day workshops on the same topic in parallel
• Identification of "fundamental" questions of underlying theory (as in mathematics)
• Democratization of valuation by eliminating anonymous and perhaps limited reviewing: try signed reviewing, open to anyone who wants to offer an opinion
• Encourage both broader participation and more churn on program committees; do a certain amount of tracking across committees to spread the load more broadly
• Make commitments to cross-disciplinary, high-risk, disruptive ideas