Preparation and Planning for ICSE 2006 (28th International Conference on Software Engineering) • Leon J. Osterweil (ljo@cs.umass.edu) • University of Massachusetts, Amherst, MA 01003 USA • SinoSoft 2003, Beijing, China, 3 December 2003
Who am I? • Professor of Computer Science • University of Massachusetts, Amherst • Software Engineering Researcher • Software Process Definition • Software Analysis • Dean, College of Natural Sciences & Mathematics • General Chair, ICSE 2006 • 28th Int’l. Conference on Software Engineering
Why Are We (with Mary Lou Soffa) Here? • Plan for ICSE 2006 • Meet Research colleagues • Plan events leading up to ICSE 2006 • Support Chinese initiatives in software engineering
What is ICSE? • Largest and most important meeting on software engineering research • Also covers practice, history, future • Actually a set of co-located meetings • Many accompanying workshops • Many tutorials • Usually 1000-1500 attendees • Held annually around the world • In US in odd-numbered years • In Europe every four years • Previous Asian meetings: Tokyo, Osaka, Singapore
Past ICSEs • US: Washington, San Francisco, Atlanta, San Diego, Orlando, Monterey (CA), Pittsburgh, Austin, Baltimore, Seattle, Boston, Los Angeles, Portland, St. Louis (2005) • International: Munich, Germany; Tokyo, Japan; London, England; Singapore; Nice, France; Melbourne, Australia; Sorrento, Italy; Berlin, Germany; Osaka, Japan; Limerick, Ireland; Toronto, Canada; Edinburgh, Scotland (2004)
ICSE 2006 • In Shanghai • First time on Asian continental mainland • General Chair • Leon J. Osterweil • Program CoChairs • Mary Lou Soffa • Dieter Rombach • Organizer • Prof. Dehua Ju
Component Structure of ICSE • Main Conference • Three days, 4-5 parallel tracks • Satellite conferences • 2-3 smaller ones, both before and after ICSE • Many workshops • 12-15, mostly one or two days before ICSE • 30-60 attendees • Many tutorials • As many as 25 • Both before and after ICSE • Tools/trade exposition (?) • If there is interest
Possible Events Prior to 2006 • In 2005 • One large (200 attendees?) conference in Shanghai • One or more workshops around China • In 2004 • Research seminars in Beijing and Shanghai • Research tutorial series around China • Workshop somewhere in China
What are our research interests? • Leon Osterweil • Software Process • Software Analysis • Mary Lou Soffa • Software Analysis, Testing, and Debugging • Compilers and Optimizers • Dieter Rombach • Empirical Methods • Reviews and Walkthroughs
What do I mean by “process”? • Activities like development, verification, evolution, etc. of (e.g.) software products • High-level processes: develop requirements, do object-oriented design, formally verify • Low-level processes: archive test results, verify a lemma • Often concurrent • Often coordinate people and automated systems • Often resource sensitive • Usually (regrettably) informal or undefined
My interest in process: Reasoning to assure quality products and services • Superior quality products from superior processes • Build quality in, don’t “test in” quality (a manufacturing principle) • Many observed “process errors” • Real processes are intricate • Automation can help/support • Reasoning to assure quality in processes • Simulations support reasoning • Other approaches too (e.g., static analysis)
Other Reasons for Interest in Process • Communication • Coordination • Intuitive understanding • Prediction/projection • Verification • Training • Automation • Deep understanding • Etc.
Appropriate Modeling Notation is Key • Different formalism approaches support different goals • Formalisms vary in • Rigor • Precision (semantic detail) • Semantic scope • Clarity Which formalisms are effective in demonstrably supporting which kinds of reasoning?
Processes are software They should be Engineered Using appropriate languages
Process Definition Approaches • Natural language • Structured Natural Language • Pictorial representations • DFDs • FSAs • Petri Nets • Object Technologies • Programming languages Directly analogous to product definition approaches Different approaches for different Phases Purposes
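As a toy illustration of the pictorial notations listed above, the sketch below renders a simple development process as a finite-state automaton in Python. The states, events, and rework loop are invented for illustration; they are not taken from Little-JIL or any actual ICSE process.

```python
# Hypothetical sketch: a high-level development process modeled as a
# finite-state automaton (one of the pictorial notations listed above).
# All state and event names are illustrative.

TRANSITIONS = {
    ("requirements", "approved"): "design",
    ("design", "reviewed"): "implementation",
    ("implementation", "tests_pass"): "done",
    ("implementation", "tests_fail"): "design",  # rework loop
}

def run(events, start="requirements"):
    """Replay a sequence of events and return the states visited."""
    state, trace = start, [start]
    for event in events:
        state = TRANSITIONS.get((state, event), state)  # ignore invalid events
        trace.append(state)
    return trace

print(run(["approved", "reviewed", "tests_fail", "reviewed", "tests_pass"]))
```

Even this tiny model shows why notation matters: the rework loop is explicit here, whereas in natural-language process descriptions it is usually left implicit.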
Process definition language issues • Blending proactive and reactive control • Coordinating human and automated agents • Without favoring either • Specification of resources • Exception management • Real time specification
The Little-JIL Process Language • Vehicle for exploring language abstractions for • Reasoning (rigorously defined) • Automation (execution semantics) • Understandability (visual) • Supported by • Visual-JIL graphical editor • Juliette interpreter • Evaluation by application to broad domains • A third-generation process language • A “work in progress”
Little-JIL Example: “Smart” Regression Test Process • [Diagram: a RegressionTest step decomposed into GetArtifacts, SelectTests, PerformTest, and ReportResults; PerformTest expands into ExecuteTest (GetExecutable, GetTestCases, GetInputData, RunTest, GetExpectedOutputData) and CompareResults, with NoteFailure, ReportFailure, and Stop steps handling failures]
The “Step” is the central Little-JIL abstraction • [Diagram of a step icon: the step name bar carries an interface badge (which includes resource specifications), a prerequisite badge, and a postrequisite badge; attached below are the substep sequencing bar, exception handlers, and reactions]
Requirements Rework • Same exception thrown • Invocation of step originally defined as substep of Requirements • Different invocation context → different response
What does this tell us? • Abstraction/reinstantiation is necessary • For an adequately articulate language • For clear understanding of “rework” • Other language features are similarly motivated by specific examples and experiences
Little-JIL Proactive Flow Specified by Four Sequencing Kinds • Sequential: in order, left to right • Parallel: any order (or parallel) • Choice: choose from agenda; only one choice allowed • Try: in order, left to right • Iteration usually through recursion • Alternation using pre/post requisites
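The four sequencing kinds can be sketched in a few lines of Python. This is only a caricature of their control-flow intent; real Little-JIL semantics also involve agents, agendas, and resources, and the substep functions here are invented.

```python
# Hypothetical sketch of the four proactive sequencing kinds.
# Substeps are modeled as functions returning True (success) or False.

def sequential(substeps):
    """In order, left to right; every substep must succeed."""
    return all(s() for s in substeps)

def parallel(substeps):
    """Any order (or parallel) is acceptable; here we simply run them all."""
    return all(s() for s in substeps)

def choice(substeps):
    """The agent picks exactly one substep; here: the first one offered."""
    return substeps[0]()

def try_step(substeps):
    """In order, left to right, until one substep succeeds."""
    return any(s() for s in substeps)

ok = lambda: True
fail = lambda: False

print(try_step([fail, fail, ok]))  # earlier alternatives fail, last one works
```

Note how `try` and `sequential` differ only in whether a substep failure ends the step or moves on to the next alternative.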
Example of Choice and Try Step Kinds • [Diagram: an Implement step choosing between Reuse_Implementation and Custom_Implementation; the reuse path tries, in order, Look_for_Inheritance, Look_for_Parameterized_Class, Look_for_Objects_to_Delegate_to] • Main goal: support human flexibility
Reactive Control through Scoped Exception Handling • Steps may have one or more exception handlers • React to exceptions thrown in descendent steps • Handlers are steps themselves • [Diagram: a DevelopInterfaceFiles step with an InterfaceFilesCompile check and an InterfaceFilesDon’tCompile exception]
Four different continuations on exception handlers • Complete • Handler was a “fixup” and now it is OK to go back • Continue • Handler brought step to an acceptable postcondition state and it is OK to go on • Restart • SNAFU. Handler cleaned up mess, now OK to redo • Rethrow • Go up to parent and hope the parent knows what to do
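The four continuations above can be sketched as a small Python driver. The mapping of exception names to continuation actions and the step bodies are invented; this only mimics the complete/continue/restart/rethrow distinction, not Little-JIL's actual scoped-handler semantics.

```python
# Hypothetical sketch of the four handler continuations:
# complete, continue, restart, rethrow. Step bodies are illustrative.

def execute(step, handlers, max_restarts=3):
    """Run `step`; on exception, consult `handlers` for a continuation."""
    for _ in range(max_restarts):
        try:
            return step()
        except Exception as exc:
            action = handlers.get(type(exc).__name__, "rethrow")
            if action == "complete":
                return None          # fixup done; treat the step as finished
            if action == "continue":
                return "partial"     # acceptable post-state; go on from here
            if action == "restart":
                continue             # mess cleaned up; redo the step
            raise                    # rethrow to the parent scope
    raise RuntimeError("too many restarts")

attempts = []
def flaky():
    attempts.append(1)
    if len(attempts) < 2:
        raise ValueError("transient failure")
    return "done"

print(execute(flaky, {"ValueError": "restart"}))
```

The key point survives even in this caricature: the handler's continuation, not the handler body, decides where control resumes.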
Examples of Resources • Input artifacts: requirements document, locks on key artifacts • People: designers with varying skills • Tools: ROSE • Agents: Each step has a distinctly identified unique resource responsible for execution of the step (and all of its substeps)
Resource Model: Requires and Whole-Part Relationships • [Diagram: a resource hierarchy in which a Design Team of human designers (Bob, Carol, Ted, Alice, with specialties such as hardware, software, and data management) and machines (PC, Sparc) are related to a Designer resource by requires and whole-part relationships]
Resource Request Example • Agent: OODDesigner; expert • Tool: ClassDiagramEditor • Artifact: DiagramReposLock • [Diagram: steps IdentifyRelationships, SpecifyRelationships, RefineRelationships sharing this request] • A resource request is a query on the resource specification repository
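Treating a resource request as a query, as this slide does, can be sketched as matching constraints against a repository. The repository contents and the query function below are invented; they only echo the agent/tool/artifact request shown on the slide.

```python
# Hypothetical sketch: a resource request as a query on a resource
# specification repository. Entries and the API are illustrative.

RESOURCES = [
    {"kind": "agent", "type": "OODDesigner", "skill": "expert", "name": "Carol"},
    {"kind": "agent", "type": "OODDesigner", "skill": "novice", "name": "Ted"},
    {"kind": "tool", "type": "ClassDiagramEditor", "name": "ROSE"},
    {"kind": "artifact", "type": "DiagramReposLock", "name": "lock-17"},
]

def request(**constraints):
    """Return every repository entry matching all constraints in the request."""
    return [r for r in RESOURCES
            if all(r.get(k) == v for k, v in constraints.items())]

# The slide's request: an expert OOD designer
print(request(kind="agent", type="OODDesigner", skill="expert"))
```

Because the request is a query rather than a fixed binding, the same step specification can be satisfied by different concrete resources in different executions.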
Juliette: The Little-JIL Interpreter • Juliette is distributed • Every step has its own interpreter • Interpreter executed on agent’s platform • Communication via Agendas • One for each agent and service • Services include: • Object Management • Resource Management • Step sequence Management • Agenda Management
Achieving Product Quality Through Quality Processes • Through reasoning about process characteristics • Analogous to software product measurement and evaluation • Dynamic monitoring of process execution • Like interactive debugging and tracing • Simulations can be predictive • Tracing provides audit trails • Need static analysis of processes too • Prove absence of pathologies
Process Reasoning Examples • Is the process correct (eg. consistent with rqts.)? • How fast will the process run? • How to be sure that humans perform their jobs? • Train them, monitor their participation • Are resources adequate, efficiently used? • How to improve the process • And be sure that changes are improvements? • Simulations can spot problems • Static analysis can verify for all executions
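One of the static checks alluded to above, verifying a property for all executions, can be illustrated by a reachability analysis over a toy process graph. The graph below is invented (its step names loosely echo the regression-test example), and real process analysis must of course handle concurrency, exceptions, and resources.

```python
# Hypothetical sketch: one static check ("prove absence of pathologies")
# over a toy process graph -- detecting steps no execution can ever reach.
# The graph and the deliberately planted pathology are illustrative.

PROCESS = {
    "start": ["get_artifacts", "select_tests"],
    "get_artifacts": ["perform_test"],
    "select_tests": ["perform_test"],
    "perform_test": ["report_results"],
    "report_results": [],
    "note_failure": ["report_results"],  # pathology: nothing leads here
}

def unreachable(graph, start="start"):
    """Return steps that no path from `start` can reach."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(graph.get(node, []))
    return sorted(set(graph) - seen)

print(unreachable(PROCESS))
```

Unlike a simulation, which spots a problem only on the runs it happens to explore, this kind of analysis covers every execution of the model at once.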
The Capability Maturity Model (CMM) is a Specific Approach to Software Process Improvement • It is a test plan for black box testing of processes • Can’t test quality into software process products either
Current Evaluation Projects • Software development: • Perpetual testing: programming flexibly evolvable integrated testing and analysis • Configuration management • Collaborative object-oriented design • Performing data flow analysis processes • Robot coordination • Distributed scientific statistical data processing • Medical and nursing processes • E-commerce processes such as auctions • E-government processes
Scientific Statistical Data Processing • How do scientists do their work? • Reproducing results is core of all science • Should help in reproducing results • Evidence that this has been done (dynamic) • Determine if there are any statistical processing pathologies • Avoid false “findings”
Produce a 3-D Forest Model • [Diagram: multiple versions of Fly-Over Data feed a Mosaic step that produces the 3D Model] • Maybe plan a fly-over, maybe just get a different dataset… • Creates “new” versions of the fly-over data
Medical/Nursing Processes • Defining procedures, protocols, formally, rigorously, completely • Complicated by exceptions • Traces provide audit trails • Analysis can find flaws, omissions, etc.