The Past, Present and Future of Search-Based Software Engineering for the Next 30 Minutes
"There's plenty of room at the top"
John A Clark
Dept of Computer Science, University of York
jac@cs.york.ac.uk
SBSE Workshop, Cumberland Lodge, Windsor, 15-16 September 2003
Talk
• Take a step back and see where there has been action in SBSE.
• Provide some categories to think about problems.
• Ask some questions about the subject:
  • What is software? (Not as ridiculous as it sounds)
  • What gaps are there?
[Sidebar diagram: traditional stages – Requirements, Specification, Architecture, Design, Code, Object Code]
Types of Activity
[Diagram: Level n and Level (n+1) descriptions. Inter-level activities: refinement (downwards), reverse engineering (upwards). Intra-level activities: verification, validation, improvement. Plus cross-lifecycle activities.]
How low can you go?
• Object code.
• Intra-level. People have investigated heuristic search as a means of code optimisation (a toy sketch follows below):
  • Timing performance.
  • Register usage.
  • Space.
• Various 'meaning-preserving' transformations (OK, usually ignoring precision).
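A minimal sketch of what such search-based code optimisation might look like, assuming a toy instruction list: `apply_random_transform` and `cost` are illustrative placeholders, not a real optimiser's rewrites or cost model.

```python
import random

def apply_random_transform(code):
    # Hypothetical meaning-preserving rewrite: drop a redundant NOP, or swap
    # two neighbouring instructions assumed to be independent.
    variant = list(code)
    nops = [i for i, ins in enumerate(variant) if ins == "NOP"]
    if nops and random.random() < 0.5:
        del variant[random.choice(nops)]
    elif len(variant) > 1:
        i = random.randrange(len(variant) - 1)
        variant[i], variant[i + 1] = variant[i + 1], variant[i]
    return variant

def cost(code):
    # Placeholder fitness: code size; real work would target timing,
    # register usage or a weighted combination.
    return len(code)

def optimise(code, iterations=1000):
    best = list(code)
    for _ in range(iterations):
        candidate = apply_random_transform(best)
        if cost(candidate) <= cost(best):   # accept non-worsening moves
            best = candidate
    return best

print(optimise(["LOAD r1", "NOP", "ADD r1,r2", "NOP", "STORE r1"]))
```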
Going Down, Around, Up
• Refine. Code to object code – this refinement step is generally referred to as compilation – fully automated.
• Similarly for Field Programmable Gate Arrays, netlists can be produced automatically from VHDL or Handel-C etc.
  • But there is room for optimisation here: place and route is a hard task.
  • Related work in hardware proper – but the HW/SW distinction is increasingly blurred.
• Reverse. Have seen attempts to map object code graphs onto source code graphs (but no use of heuristic search).
Code Level
• Testing – a very considerable amount of automated test data generation (a toy sketch follows below):
  • Significant amount in the UK (various);
  • Germany (Daimler Chrysler);
  • And various instances around the world;
  • Also note – some parallel work from the EC community.
• Code transformation – testability transformations.
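A minimal sketch of search-based test data generation of the kind referred to above, using a branch-distance style fitness. The program under test and its target branch (`x == 2 * y`) are invented for illustration; real tools instrument the code and handle many branches.

```python
import random

def branch_distance(x, y):
    # Distance to satisfying the target branch 'if x == 2 * y' in the
    # hypothetical program under test: zero when the branch is taken.
    return abs(x - 2 * y)

def neighbour(x, y):
    # Small random move in the input space.
    dx, dy = random.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
    return x + dx, y + dy

def search_for_test_input(budget=100_000):
    x, y = random.randint(-1000, 1000), random.randint(-1000, 1000)
    for _ in range(budget):
        if branch_distance(x, y) == 0:
            return x, y                                  # covering input found
        nx, ny = neighbour(x, y)
        if branch_distance(nx, ny) <= branch_distance(x, y):
            x, y = nx, ny                                # climb down the branch distance
    return None

print(search_for_test_input())
```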
Architectural Level
• Components and how they work together.
• Clustering/modularity work is considered architectural level (a toy sketch follows below).
• What are the criteria for a good architecture?
  • Not an easy question.
• Model-based testing is now becoming high profile.
• Automated testing from architectures is an opportunity to repeat code-based successes at this level?
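A minimal sketch of the clustering/modularity idea mentioned above: hill-climb a module-to-cluster assignment so that dependencies fall inside clusters rather than between them. The module names, dependency graph and quality measure are invented placeholders, not any particular tool's modularisation quality metric.

```python
import random

MODULES = ['parser', 'lexer', 'ast', 'codegen', 'regalloc', 'emitter']
DEPS = [('parser', 'lexer'), ('parser', 'ast'), ('codegen', 'ast'),
        ('codegen', 'regalloc'), ('regalloc', 'emitter')]
N_CLUSTERS = 2

def quality(assignment):
    # Reward intra-cluster dependencies (cohesion), penalise inter-cluster ones (coupling).
    intra = sum(1 for a, b in DEPS if assignment[a] == assignment[b])
    inter = len(DEPS) - intra
    return intra - inter

def cluster(iterations=2000):
    assignment = {m: random.randrange(N_CLUSTERS) for m in MODULES}
    for _ in range(iterations):
        candidate = dict(assignment)
        candidate[random.choice(MODULES)] = random.randrange(N_CLUSTERS)
        if quality(candidate) >= quality(assignment):
            assignment = candidate
    return assignment

print(cluster())
```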
Specification to Code
• Genetic Programming!
  • When the actual solution is a program you want to run.
• Suspect most of the SBSE community's perception of computation and software functional correctness is a traditional one.
• See little reason why the above should not count as automatic 'refinement'. (A stripped-down sketch follows below.)
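A stripped-down sketch of the genetic programming idea: evolve an expression that satisfies a "specification" given as input/output examples. The function set, terminal set and target (x² + 1) are illustrative only, and the loop uses simple truncation selection with subtree-replacement mutation rather than a full GP system.

```python
import random

FUNCS = ['+', '*']                                 # binary function set
TERMS = ['x', 1, 2]                                # terminal set
SPEC  = [(x, x * x + 1) for x in range(-5, 6)]     # "specification" as I/O examples

def random_tree(depth=3):
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMS)
    op = random.choice(FUNCS)
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    if tree == 'x':
        return x
    if isinstance(tree, int):
        return tree
    op, a, b = tree
    va, vb = evaluate(a, x), evaluate(b, x)
    return va + vb if op == '+' else va * vb

def fitness(tree):
    # Lower is better: total error against the specification's examples.
    return sum(abs(evaluate(tree, x) - y) for x, y in SPEC)

def mutate(tree):
    if random.random() < 0.2:
        return random_tree(depth=2)                # replace this subtree
    if isinstance(tree, tuple):
        return (tree[0], mutate(tree[1]), mutate(tree[2]))
    return tree

def evolve(generations=300, pop_size=60):
    pop = [random_tree() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        if fitness(pop[0]) == 0:
            break                                  # program satisfies every example
        # Truncation selection: keep the best half, refill with mutated copies.
        pop = pop[:pop_size // 2] + [mutate(t) for t in pop[:pop_size // 2]]
    return min(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```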
Specification to Architecture
• Actually some architectural problems can only be solved practically using heuristic search – e.g. allocating tasks to processors in real-time distributed systems (multiple criteria; a toy sketch follows below).
  • May need some task replication on different processors.
  • Timing deadlines to be met.
  • Communications overheads.
[Diagram: example allocations of tasks to processors, e.g. {T1,T5,T7}, {T2,T8,T9}, {T3,T4,T8}, {T1,T2,T5,T7}]
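A minimal sketch of searching for a task-to-processor allocation under timing and communication criteria. The task costs, communication matrix and capacity bound are invented, replication is ignored for brevity, and the capacity check is a crude stand-in for real schedulability analysis against deadlines.

```python
import random

TASKS = {'T1': 4, 'T2': 3, 'T3': 5, 'T4': 2, 'T5': 6}   # execution costs
COMMS = {('T1', 'T3'): 2, ('T2', 'T5'): 1}              # cost when the pair is split across processors
PROCESSORS = ['P1', 'P2', 'P3']
CAPACITY = 10                                           # crude stand-in for a deadline/utilisation bound

def penalty(allocation):
    load = {p: 0 for p in PROCESSORS}
    for task, proc in allocation.items():
        load[proc] += TASKS[task]
    overload = sum(max(0, l - CAPACITY) for l in load.values())
    comms = sum(c for (a, b), c in COMMS.items() if allocation[a] != allocation[b])
    return 10 * overload + comms            # weight timing violations heavily

def search(iterations=5000):
    alloc = {t: random.choice(PROCESSORS) for t in TASKS}
    for _ in range(iterations):
        candidate = dict(alloc)
        candidate[random.choice(list(TASKS))] = random.choice(PROCESSORS)
        if penalty(candidate) <= penalty(alloc):
            alloc = candidate
    return alloc

print(search())
```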
Specification to Design
• Very few applications here.
• Security protocols:
  • Goals of the protocols formalised as statements of beliefs.
  • Messages containing beliefs evolved as a (provably correct) refinement.
• Can we do automated refinement of process algebraic descriptions?
• Are there possibilities for automated refinement in, say, Z or B?
Specification to Specification
• Are there possibilities for transforming specifications automatically?
  • What would the language for transformation look like?
  • What would 'fit' specs look like?
  • Needs further software engineering work?
• Other specification matters:
  • Some work on using genetic algorithms to explore large state spaces (deadlock detection and security examples).
  • Small exploratory tests on trying to break synthesised specs (Michael Ernst's technique to generate specifications, and optimisation-based code-level tests to break them).
Requirements
• Not a great deal of work here.
• Next release problem addressed (how to prioritise incorporation of requirements into releases); a toy sketch follows below.
• And???????
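A minimal sketch of the next release problem as a search over subsets of requirements: maximise stakeholder value subject to a cost budget. The requirement costs, values and budget are invented, and a simple bit-flip hill climb stands in for the richer multi-objective formulations used in practice.

```python
import random

REQS = [  # (cost, value), invented for illustration
    (10, 60), (40, 100), (30, 120), (50, 70), (20, 90),
]
BUDGET = 80

def score(selection):
    cost = sum(c for (c, _), picked in zip(REQS, selection) if picked)
    value = sum(v for (_, v), picked in zip(REQS, selection) if picked)
    return value if cost <= BUDGET else -1          # infeasible release plans score badly

def next_release(iterations=2000):
    best = [random.random() < 0.5 for _ in REQS]
    for _ in range(iterations):
        candidate = list(best)
        i = random.randrange(len(REQS))
        candidate[i] = not candidate[i]             # flip one requirement in/out of the release
        if score(candidate) >= score(best):
            best = candidate
    return best

print(next_release())
```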
Management Tools
• Some work on effort/cost prediction.
• More general data description of more general application.
Non-functional Properties
• Software executes on some 'processor'.
• Timing is an issue of course:
  • Worst case, average, jitter.
• Memory usage is an issue.
• Power?
  • Devices may be highly resource constrained; the power to carry out a task may actually be a crucial factor (pervasive environments) and trade-offs may be in order.
• Precision??????
Generalised Diagnostics
• Testing – showing there is a fault.
• Debugging – showing where there is a fault.
• Ideas generalise:
  • In huge systems how can you track down what a problem is?
  • What data needs to be collected?
  • What are characteristic signatures of failures?
• Software engineering? Why not?
• We may be heading for self-diagnosing, self-healing, … systems. Autonomic computing.
• Run-time 'diagnostics' clearly of significant importance.
Generalised Stress Testing
• Testing – you are trying to 'break the system'.
• Lots of work at the code level falls into this category: falsification testing, exception generation, safety testing, reuse precondition breaking, timing. (A toy sketch follows below.)
• Little or no work has been carried out for system-level properties.
• Yet in many ways big systems are where the action is these days.
• Can we attack quality of service claims?
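A minimal sketch of code-level stress testing by search: evolve inputs that maximise measured execution time of a system under test. The `process` function here is a placeholder with a deliberately hidden slow path; real work would measure a genuine component and cope with timing noise far more carefully.

```python
import random
import time

def process(data):
    # Placeholder system under test with a hidden slow path.
    if sum(data) % 7 == 0:
        time.sleep(0.001)
    return sorted(data)

def runtime(data):
    start = time.perf_counter()
    process(data)
    return time.perf_counter() - start

def stress_search(size=50, iterations=300):
    best = [random.randint(0, 1000) for _ in range(size)]
    best_time = runtime(best)
    for _ in range(iterations):
        candidate = list(best)
        candidate[random.randrange(size)] = random.randint(0, 1000)   # mutate one element
        t = runtime(candidate)
        if t >= best_time:                  # keep inputs that take longer to process
            best, best_time = candidate, t
    return best, best_time

worst_input, seconds = stress_search()
print(f"slowest input found takes about {seconds * 1000:.2f} ms")
```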
Generalised Robustness
• m-out-of-n schemes are a standard model in the safety domain:
  • Protect against hardware failure.
• N-version programming also used:
  • Get separate teams to develop versions and then run them all, taking a vote on results.
  • Protect against implementation error.
• Diversity is a key concept – but N-version programming is controversial:
  • Lack of independence.
Generalised Robustness
• But diversity seems too good to ditch.
• Embrace notions of populations (Xin) giving solutions:
  • Can 100 poor solutions combine to give good results? (POOR)^100 = GOOD? (A toy sketch follows below.)
• How can we enforce diversity?
  • Severely limit program size.
  • Enforce different function symbol sets, etc.
  • More standard ways too.
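A minimal sketch of the (POOR)^100 = GOOD? question: 100 weak "solutions", each right only slightly more often than chance, are combined by majority vote. Note the sketch assumes the weak solutions fail independently, which is exactly the point the previous slide flags as controversial; the numbers are invented for illustration.

```python
import random

def weak_solution(truth):
    # A simulated poor solution: correct only 55% of the time, independently of the others.
    return truth if random.random() < 0.55 else 1 - truth

def ensemble_vote(truth, n=100):
    votes = sum(weak_solution(truth) for _ in range(n))
    return 1 if votes > n / 2 else 0

# Rough check: the majority vote is right noticeably more often than any individual.
truth = 1
trials = 1000
accuracy = sum(ensemble_vote(truth) == truth for _ in range(trials)) / trials
print(f"ensemble accuracy over {trials} trials: {accuracy:.2f} (individuals: 0.55)")
```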
General Tools Improvement
• Software is developed in an environment.
• SBSE includes the derivation of better tools:
  • Verification tools:
    • 'Testing' tools obviously.
    • Counter-example generators.
    • Algebraic simplifiers.
    • Variable re-orderings for model checkers etc.
  • Better prediction methods and tools.
On the Horizon
• Pervasive computing networks.
• Late-binding service provision:
  • Service provision negotiated at run-time (current SBSE work seems static/off-line).
• Very large scale IT.
• Meanwhile, in a (several) universe(s) far far away….
Quantum and Nanotech
• "There's plenty of room at the bottom." Richard Feynman
• There are many worlds…
• Quantum algorithms: we have only two major high-level procedures:
  • Grover's search.
  • Shor's Quantum Discrete Fourier Transform.
• Can we discover others by simulation and GP?
  • A few researchers have published in this area.
• Real nano-tech involves computer scientists! Programming nanites is software engineering!
• General point: there are alternative models of computation and other architectures that we should not wholly ignore (even if more standard computing platforms are the principal target).
Conclusions
• Lots of good work.
• But there are gaps:
  • As we go large.
  • As we go small.
  • As we give up!
  • As we go dynamic.
  • As we go higher.
• "There's plenty of room at the top."