GEC17: GENI Instrumentation and Measurement Sessions. Sun. July 21 and Mon. July 22, 2013. Marshall Brinn, Jeanne Ohren, GPO
Outline • We will be holding two sessions at this GENI Engineering Conference (GEC) to focus on the I&M efforts • [1] I&M: Topics and Status [Sunday 7/21 1030-1200] • [2] I&M: Looking Forward [Monday 7/22 1330-1530] • The goal of these sessions is to take both a short-term and a long-term view of the I&M project • [1] Where have we gotten to? What are our current status and issues? What are our natural next steps? • [2] Where should we try to steer the I&M effort in the remaining project scope?
Focus on the Experimenters • As we take stock of the I&M efforts and try to plot the path forward, keep the needs of experimenters at the front of your minds • The ultimate metric of our success is the degree to which we’ve helped experimenters use GENI to pursue their research • Remember that we’re trying to support two populations: • “Power Experimenters”: Knowledgeable researchers who want to use the full power of GENI and the I&M tools to perform large-scale, highly customized, complex experiments. • “Beginners”: Grad students or even undergraduates who are relatively new to network research and for whom we want to provide an easy path to do reasonably interesting/enlightening things.
Session 1 (Sunday 7/21/2013, 1030-1200): I&M: Topics and Status
Outline • For both GEMINI and GIMI [15 minutes each] • Discussion of current status and issues • Where are we at? • What’s been good or hard in developing or integrating your capability? • What might you suggest we do differently, technically or administratively? • Sneak peek at tonight’s demo sessions: What will you be highlighting?
Outline [2] • [30 min.] Looking for opportunities for value-added contributions: • What functionality (from your own I&M effort) do you think could provide significant value to other I&M efforts or GENI services (Clearinghouse, tools, aggregates)? • What functionality (from other I&M efforts or other GENI services) would you like to leverage to strengthen your offering to experimenters?
Session 2 (Monday 7/22/2013, 1300-1530): I&M: Looking Forward
Outline • Try to imagine what your demo at GEC18 should or could look like • Or perhaps GEC19 or GEC20… • What additional kinds of interoperability and collaboration between I&M and other GENI services could be developed to enhance experimenter experience? • Specifically, how can GEMINI and GIMI leverage and integrate each other’s capabilities to provide uniform and value-added services?
Motivation This second session is to encourage out-of-the-box thinking about the I&M effort: where we are now, and where we might or should go from here. • We hope we can all step back and rethink a little of our approach and direction, as appropriate. • Obviously we don’t want to change for change’s sake • Nor do we want to derail the good progress being made on all fronts. • But if there is “low-hanging fruit” that would provide good value to experimenters, we should try to identify and pursue it.
Some “provocative” thoughts • GIMI has a strong orchestration story through OMF • Could GEMINI benefit from taking advantage of the GIMI OMF framework? • GEMINI has a strong set of topology and measurement services • Could GIMI benefit from integrating some of these services into the GIMI portal or services?
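To ground the orchestration question for discussion, below is a minimal, hypothetical Python sketch of the pattern that controller-driven orchestration provides: one script defines groups of resources, starts the measurement applications, and tears them down on schedule. The Node class and its method names are invented purely for illustration; OMF’s actual experiment descriptions are written in its own OEDL language, not Python.

```python
# Hypothetical illustration of controller-driven orchestration (not OMF/OEDL).
# All class and method names here are invented for the sketch.
import time

class Node:
    """Stand-in for a provisioned GENI resource."""
    def __init__(self, name):
        self.name = name
    def start(self, app, args):
        print(f"[{self.name}] starting {app} {args}")
    def stop(self, app):
        print(f"[{self.name}] stopping {app}")

def run_experiment(senders, receivers, duration_s=30):
    """One controller script drives every node: start, wait, stop, done."""
    for node in receivers:
        node.start("iperf", "-s")                     # measurement sinks first
    for node in senders:
        node.start("iperf", f"-c sink -t {duration_s}")
    time.sleep(duration_s)                            # the experiment runs
    for node in senders + receivers:
        node.stop("iperf")

run_experiment(senders=[Node("sender1"), Node("sender2")],
               receivers=[Node("sink")])
```

The point of the pattern, for the GEMINI/GIMI discussion, is that the experimenter writes one description of the whole run and the framework handles the per-node sequencing.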
Some “provocative” thoughts [2] • GEMINI Desktop uses ssh to configure resources, while GIMI uses OMF (over XMPP). • Can we discuss the relative strengths/weaknesses of these two approaches? • From both a developer and particularly an experimenter perspective? • Both GIMI and GEMINI use iRODS as their core experiment result store. • Is there something to be gained from leveraging this common feature (e.g., some way to communicate between systems)? • Could we benefit from re-focusing on common data descriptions?
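To make the ssh-versus-OMF comparison concrete, here is a minimal sketch of the ssh-style configuration pattern, written in Python with the paramiko library. This is not GEMINI Desktop’s actual code, and the host, login, key path, and commands are placeholder assumptions.

```python
# Hypothetical sketch of the ssh-based configuration style (not GEMINI's code).
# Assumes the paramiko library is installed and the node allows key-based login.
import os
import paramiko

def configure_node(host, user, key_file, commands):
    """Open an ssh session to one sliver and run a list of setup commands."""
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(hostname=host, username=user, key_filename=key_file)
    try:
        for cmd in commands:
            _, stdout, _ = client.exec_command(cmd)
            status = stdout.channel.recv_exit_status()  # wait for completion
            print(f"{host}: '{cmd}' exited with status {status}")
    finally:
        client.close()

# Example usage; the host, login, and key path are placeholders.
configure_node(
    host="pc1.example-aggregate.net",
    user="experimenter",
    key_file=os.path.expanduser("~/.ssh/geni_key"),
    commands=["sudo apt-get -y install iperf",
              "nohup iperf -s > /tmp/iperf.log 2>&1 &"],
)
```

The OMF/XMPP approach inverts this: rather than a tool logging into each node, an experiment description is handed to the framework and carried out by resource controllers on the nodes, which is part of why the trade-off looks different to developers and to experimenters.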
Some “provocative” thoughts [3] • What is the right level of explicit ‘programming’ to require for managing different experiments? • Should simple experiments require simple programs or no (visible) programs at all? • Should complex experiments be driven by programs or something more like a user interface? • Is there more we can or should be doing to improve reproducibility of experiments? • Capturing the running environment configuration, run-scripts, control parameters, etc.
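As one purely illustrative take on the reproducibility point above, the following Python sketch captures the running environment configuration, run-script, and control parameters of an experiment into a JSON manifest. The file names, fields, and manifest layout are assumptions for discussion, not an existing GENI or I&M tool.

```python
# Illustrative sketch: snapshot the context of an experiment run so it can be
# rerun later. File names, fields, and the manifest layout are hypothetical.
import json
import os
import platform
import shutil
import subprocess
import time

def snapshot_run(params, run_script="run_experiment.sh",
                 out_file="run_manifest.json"):
    """Record control parameters plus enough environment detail to rerun."""
    packages = []
    if shutil.which("dpkg-query"):  # installed-package list; Debian/Ubuntu only
        packages = subprocess.run(
            ["dpkg-query", "-W", "-f=${Package} ${Version}\n"],
            capture_output=True, text=True).stdout.splitlines()
    manifest = {
        "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "control_parameters": params,  # e.g. traffic rates, durations
        "run_script": (open(run_script).read()
                       if os.path.exists(run_script) else None),
        "environment": {
            "hostname": platform.node(),
            "os": platform.platform(),
            "kernel": platform.release(),
            "python": platform.python_version(),
        },
        "packages": packages,
    }
    with open(out_file, "w") as f:
        json.dump(manifest, f, indent=2)
    return out_file

# Example usage with placeholder control parameters.
snapshot_run({"iperf_rate_mbps": 100, "duration_s": 60, "num_flows": 4})
```

A manifest like this, archived alongside measurement results (e.g., in iRODS), is the kind of low-effort step that could make a run easier to repeat or audit later.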