
Analysis Environment Challenges

Lassi A. Tuura, Northeastern University, Boston


Presentation Transcript


  1. Analysis Environment Challenges. Lassi A. Tuura, Northeastern University, Boston

  2. What Is An Analysis Environment? Physics analysis is to a large degree an iterative process of reducing data samples to more interesting subsets and distilling the sample into information at a higher abstraction level, by summarising lower-level information and by calculating statistical entities from the samples. (Diagram: Experiment, Reduce, Distill, Interpret cycle.) • A large part of the work can be done on very high-level entities in an interactive analysis and presentation tool • Hence focus on tools that work on simple summary information (DSTs, N-tuples, tag databases, ...) • Additional tools for detector and event visualisation
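
  As an illustration of this reduce-and-distill cycle on summary data, here is a minimal sketch using ROOT; the file name, tree name, branch names and cut are hypothetical placeholders, not any experiment's real layout.

      // Minimal sketch of the reduce/distill cycle on summary data using ROOT.
      // "summary.root", "Events", "pt" and "nJets" are hypothetical names.
      #include "TFile.h"
      #include "TTree.h"
      #include "TH1F.h"

      void reduce_and_distill()
      {
        TFile *file = TFile::Open("summary.root");      // N-tuple / tag-level summary data
        TTree *events = (TTree *) file->Get("Events");

        // Reduce: select a more interesting subset with a cut.
        // Distill: summarise it into a higher-level statistical entity.
        TH1F *hpt = new TH1F("hpt", "p_{T} of selected events", 100, 0., 200.);
        events->Draw("pt >> hpt", "nJets >= 2");

        hpt->Fit("gaus");                               // calculate statistical entities
        hpt->Draw();                                    // interpret and present
      }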

  3. So What Is An Analysis Environment? • Analysis involves a lot more than just the interactive tool • Learn from the “PAW revolution” • N-tuples provided new, more powerful ways to work with the data • New user interface • Move towards closer integration with data continues • We can do much more and better than just an N-tuple today • Examples: ROOT added trees, CMS uses a full-blown object model • Experiments are making big jumps in data accessibility • Exploiting widely used, very powerful object models—not just data • New levels of automation and integration are becoming available for networks, distributed computing and mass-storage systems • User interfaces to these new data models need to catch up! • The analysis environments will need considerable links with the rest of the experiment’s computing and software infrastructure

  4. The Challenge • Beyond the interactive analysis tool • Data analysis & presentation: N-tuples, histograms, fitting, plotting, … • A great range of other user activities with fuzzy boundaries • Batch • Interactive from “pointy-clicky” to Emacs-like power tool to scripting • Setting up configuration management tools, application frameworks and reconstruction packages • Data store operations: Replicating entire data stores; Copying runs, events, event parts between stores; Not just copying but also doing something more complicated—filtering, reconstruction, analysis, … • Browsing data stores down to object detail level • 2D and 3D visualisation • Moving code across final analysis, reconstruction and triggers. Today this involves (too) many tools

  5. Example: Distributing Your Data Store • Problem: replicating and sharing your experiment’s data in full or in part for various analysis tasks and GRID • Tools exist but... • Do I understand my experiment’s world-wide configurations well enough to use the tools confidently? • How do I find out the data store nearest me in the first place? • If I want a private working store that shares the experiment data at the same time, what should I do? • What if I do not want just a plain file copy, but want only a copy of the reconstructed data for the calorimeter from a certain sample that includes events in tens of files? • What if I want to share my analysis settings and results with my colleague for a verification? • Enquiring minds want to know!

  6. What Do We Need? One size never fits all—the tools need to adapt! • A uniform integrated interface to the whole task range (within reasonable limits)? A tool suite or a work bench? • Wizards for common tasks to guide us through the choices, to give sensible defaults and to explain the terminology? • Some ideas that might prove helpful • Showing the data store or parts of it as a directory • Conceptual “home directory” in the data store • Make it easy to put stuff related to your analyses under your “home directory” (framework and reconstruction setups, parameters etc.) • Make it easy to access analysis setups and results of different groups • Keep track of configurations, input and output data selections, … • A “desktop” where you can have shortcuts/links • Standard shortcuts for common stuff
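
  One way to picture the “directory” and “home directory” ideas is the following C++ interface sketch; every class and method name here is invented for illustration and is not taken from any experiment framework.

      // Hypothetical sketch: exposing a data store as a directory hierarchy
      // with a per-user "home directory". All names are illustrative only.
      #include <memory>
      #include <string>
      #include <vector>

      class StoreEntry {                     // anything visible in the store: runs, events,
      public:                                // analysis setups, parameters, results, ...
        virtual ~StoreEntry() = default;
        virtual std::string name() const = 0;
      };

      class StoreDirectory : public StoreEntry {
      public:
        virtual std::vector<std::shared_ptr<StoreEntry>> list() const = 0;
        virtual std::shared_ptr<StoreDirectory> subdir(const std::string &name) = 0;
      };

      class DataStore {
      public:
        virtual ~DataStore() = default;
        // Conceptual "home directory": a place for one's own framework and
        // reconstruction setups, parameters and results.
        virtual std::shared_ptr<StoreDirectory> home(const std::string &user) = 0;
        // Shared areas holding the analysis setups and results of other groups.
        virtual std::shared_ptr<StoreDirectory> group(const std::string &name) = 0;
      };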

  7. Concepts In Today’s Apps. Extrapolate these to a data store… (Screenshot of IGUANA prototype.)

  8. Concepts In Today’s Apps… (Screenshot annotations: visualisation window; command-line interface that reflects actions in other windows.) Plus of course batch mode without pointy-clicky!

  9. How To Get There? • Few can afford to develop a new interactive analysis tool, let alone coherent tools for the entire range of analysis tasks! • Divide, conquer and co-operate • Divide the problem into categories, such as GUI, event and detector visualisation, and data analysis and presentation • We need to share: use existing modules in each category where possible—write your own only where nothing suitable exists (and don’t get attached to code, ditch it when something better is available!) • Integrate the lot into a user-friendly and productive environment • Make applications by choosing from the module pool—experiments could construct their own specific environments with customisation • For this to work, the pool should be truly modular • Need to take into account all dependencies, not just the obvious ones • Need to think what it would take to test all the features provided by each component—those form its immediate dependencies

  10. What Kind of an Architecture? • Modular where it matters • Model-View-Controller and the like work to partition the domain • Layer to keep front-ends and back-ends separate • Ensure a standard for visual components to facilitate integration • Interfaces for data access • Narrow interfaces to link the analysis and visualisation sub-framework to the core framework • Not everything needs an abstract interface! • It may be better to make a strategic choice to use a particular product if it can be contained and completely replaced in 6-9 months • Example: Use OpenInventor instead of inventing your own 3D API • We need to assess and bound the risks, not aim for total safety!
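
  A hedged sketch of what such a narrow interface between the core framework and the visualisation sub-framework might look like follows; the names are invented for illustration and do not correspond to actual IGUANA or CMS classes.

      // Illustrative sketch of a narrow interface between the core framework
      // and the visualisation sub-framework. Everything behind it (for example
      // an OpenInventor-based 3D view) can be contained and replaced wholesale.
      // Class names are hypothetical.
      class Drawable {
      public:
        virtual ~Drawable() = default;
        virtual void draw() = 0;                 // render the object in the current view
      };

      class VisualisationService {
      public:
        virtual ~VisualisationService() = default;
        virtual void add(Drawable &object) = 0;  // register an object to be shown
        virtual void update() = 0;               // refresh all open views
      };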

  11. More About Interfaces • Example: selecting events using high-level summary data • Pick your favourite name for the same concept: Tags, N-tuples, DSTs, B-tree indices… • N-tuple was both an access paradigm and a storage method • Historical emphasis was on storage format • Shift the emphasis to an access and query interface • Can provide the look and feel for a proven access method (N-tuple) with natural modern extensions • Implementation behind the interface may vary • Data may already be cached or accessed from deep in the event • May exploit advanced indexing and retrieval • May involve computation on demand • May even be necessary to read from tape • Other interfaces can provide access to underlying features
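
  A minimal sketch of such an access-and-query interface is given below, assuming hypothetical class names; the point is that the N-tuple-like cut expression hides whether the data behind it is cached, indexed, computed on demand, or staged from tape.

      // Sketch of an access/query interface over summary data (tags, N-tuples).
      // All names are hypothetical; the implementation behind the interface may
      // use a cached N-tuple, an index, on-demand computation, or tape access.
      #include <cstddef>
      #include <cstdint>
      #include <memory>
      #include <string>
      #include <vector>

      class EventSelection {
      public:
        virtual ~EventSelection() = default;
        virtual std::size_t size() const = 0;
        virtual std::vector<std::uint64_t> eventIds() const = 0;  // events passing the query
      };

      class TagQuery {
      public:
        virtual ~TagQuery() = default;
        // N-tuple-like look and feel: select events with a cut expression over
        // named summary columns, e.g. "nMuons >= 2 && missingEt > 30".
        virtual std::unique_ptr<EventSelection>
        select(const std::string &cutExpression) = 0;
      };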

  12. Summary • Analysis environment includes a lot more than just the interactive data analysis and presentation tools • As experiment complexity grows we need • To be able to drill down to and interact with data in many new ways • A good solid user interface for the whole range of tasks all the way from batch mode operation to the quick pointy-clicky jobs • Building all this from scratch is neither affordable nor wise • Exploit existing components—HEP, open source or commercial • Components need clearly defined responsibilities: a mission statement • Abstract interfaces are useful means to • Help people co-operate and not disturb each other too much • Provide hooks for all the cool new stuff we will see • Layer and partition the problem domain • Bound risks should a technology or a component fail


  14. Some Architecture Ideas • Three-tier architecture • Application model (framework, reconstruction, simulation …) • Specific ways of looking at objects (3D, 2D, hierarchical browser, object inspector, fitter…) • Representation tier to tie the above two together • Dynamically load and integrate required bits together • (MV)²C: Representation is the view from the application model, but the model to the visualiser • Possible interesting result: scripting becomes “yet another view” and does not require special treatment or privilege • A host of wizards • Coherent, good human interface • Easily adapted and expanded to new tasks • Should be able to leave behind scripts or other batch mode food
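
  A rough sketch of the (MV)²C idea follows, with invented names: the representation is the view of the application model but the model for every attached visualiser, and a scripting front-end attaches as just another view.

      // Rough sketch of the (MV)^2C idea. TrackRep is a view of the application
      // model (Track) but the model for every attached visualiser; a scripting
      // front-end attaches as just another view. All names are invented.
      #include <memory>
      #include <utility>
      #include <vector>

      class Track { /* application-model object from reconstruction */ };

      class View {                               // 3D view, 2D view, browser, script shell, ...
      public:
        virtual ~View() = default;
        virtual void refresh() = 0;
      };

      class TrackRep {
      public:
        explicit TrackRep(const Track &track) : track_(track) {}
        void attach(std::shared_ptr<View> view) { views_.push_back(std::move(view)); }
        void modelChanged() {                        // propagate a change to every view,
          for (auto &view : views_) view->refresh(); // the scripting view included
        }
      private:
        const Track &track_;
        std::vector<std::shared_ptr<View>> views_;
      };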

  15. Interface Pros and Cons • Modularity and good interfaces make a big difference • When one particular component fails, it doesn’t take others down • Easier to add new features—without disturbing existing ones • Easier to adapt to new, sometimes radically different contexts • Testing is manageable and actually gets done • Easier to manage the project and for people to co-operate (often much more of the work is in communication, not coding) • …but they come at a price • Costlier to develop up front • Bad interface can make life really awkward • Hard to justify if you have only one implementation • A good interface needs one clearly defined mission—coming up with it may require considerable work, but usually is more than worth it as doing so usually clarifies problem understanding and project strategy

  16. Do Languages Matter? • No—Great concepts will survive in almost any language • Especially within a common paradigm like object oriented languages • It is the paradigm changes that hurt; changing from objects to components is a more difficult change than from C++ to Java… • Will we see extern “Java” { class XYZ { … }; }? • Yes—Consider this scenario • Someone in the collaboration comes up with a new analysis cut • … and that cut proves very interesting • … so the analysis needs to get into the trigger express line. If the analysis was done by C++ code that writes out an N-tuple that was then processed with a few-thousand lines of PAW KUMACs and FORTRAN, you’ll have a hard time finding volunteers to re-code it for the trigger, let alone someone willing to double-check it. It is not (just) the languages that hurt...
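
  To make the scenario concrete, the sketch below shows a cut written as a small, framework-neutral C++ function; the struct fields and thresholds are invented purely for illustration.

      // Illustration only: an analysis cut expressed as a small, framework-neutral
      // function that both the offline analysis and a trigger express line can
      // call, instead of being buried in N-tuple post-processing macros.
      // The struct fields and thresholds are hypothetical.
      struct EventSummary {
        int    nMuons;
        double missingEt;   // GeV
      };

      inline bool interestingEvent(const EventSummary &event)
      {
        return event.nMuons >= 2 && event.missingEt > 30.0;   // the shared cut
      }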

  17. CMS Analysis Architecture At a Glance. (Architecture diagram; components: Data Store (Objectivity) and its files, Data Browser, Federation wizards, Analysis job wizards, IGUANA, ORCA, CARF, Cmscan, OSCAR, GRID tools, Objy tools, Tony’s scripts, and other non-IGUANA tools.)

  18. Modularity Example: IgAPDlab. Could pick only a subset for some related task.

  19. Current IGUANA Tools (By Origin). (Chart grouping the tools by origin: LHC++ or HEP, public domain, IGUANA, commercial.)

  20. Current IGUANA Tools (By Purpose)
