Project Final Presentation Liana Lisboa – PM Project: Starship
Summary • SIMPLE Factory • Domain Analysis Phase • Architecture Design Phase • Implementation Phase • Testing • Project Metrics • Project Strong and Weak Points • Project Learned Lessons • Improvement
SIMPLE Factory • The factory started with 7 members and a collaborator, and ended with one member fewer.
Domain Analysis Phase • Artifacts • Feature Model • Feature Matrix • Domain Requirements • Domain Use Cases
Domain Analysis Phase • Feature Model • Changes occurred during the design and implementation phases. • These changes reflected redundancy in the model (the implication relationships) and improvement of the relationships.
Domain Analysis Phase • Feature model before design and implementation, made in RequiLine
Domain Analysis Phase • Feature model after design and implementation, made in RequiLine
Domain Analysis • Process Evidence
Domain Analysis Phase • Feature matrix – no changes • Domain requirements – no changes • Domain use cases – further detailed with the inclusion of alternative flows during the design phase
Domain Analysis Phase • Learned Lessons • Necessity of a domain expert for the identification of the features; • Including domain requirement documentation improved the documentation of the domain; • Feature documentation can become extremely complex as the domain grows, although it is helpful for identifying whether a feature is really part of the domain;
Project Metrics • Domain Analysis Phase
Design Phase • Artifacts • Domain Architecture • Feature Dependency Matrix • Component Specification • Evidence Document
Design Phase • Domain architecture definition • GRASP Patterns • Gamma Patterns • Component Grouping
Design Phase • Components identified before the implementation phase • Game Presentation • Game Core • Audio • Display • End Game • High Score • Configuration • Collision • Screen Control
Design Phase • Problems identified in the component division • The "component grouping" put together the majority of the domain variability • The components were too tightly coupled
Design Phase • Component division – second round • Audio • Actions • Entity • Event • Graphic • Movement
Design Phase • Modifications • Game Core divided into: • Entity • Movement • Actions • Event • Audio and Configuration merged into: • Audio • Game Presentation and Display became: • Graphic • Collision is now part of Event
Design Phase • Component division – third round • Movement was further divided into: • Movement • Strategy -> the movement strategy for the enemies
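The Strategy split above can be sketched as a small Java example. This is an illustrative sketch only: the interface and class names (MovementStrategy, LinearDescent, ZigZag, Enemy) are assumptions, not the project's actual API.

```java
// Hypothetical sketch of the Strategy pattern for enemy movement.
// Each concrete strategy encapsulates one movement variability.
interface MovementStrategy {
    int[] nextPosition(int x, int y); // returns {newX, newY}
}

class LinearDescent implements MovementStrategy {
    public int[] nextPosition(int x, int y) {
        return new int[] { x, y + 1 }; // straight down, one row per step
    }
}

class ZigZag implements MovementStrategy {
    private int direction = 1;
    public int[] nextPosition(int x, int y) {
        direction = -direction;                // alternate left/right each step
        return new int[] { x + direction, y + 1 };
    }
}

class Enemy {
    private final MovementStrategy strategy;
    int x, y;
    Enemy(MovementStrategy strategy, int x, int y) {
        this.strategy = strategy; this.x = x; this.y = y;
    }
    void move() {
        int[] p = strategy.nextPosition(x, y);
        x = p[0]; y = p[1];
    }
}

public class Main {
    public static void main(String[] args) {
        Enemy e = new Enemy(new LinearDescent(), 5, 0);
        e.move();
        System.out.println(e.x + "," + e.y); // prints "5,1"
    }
}
```

Swapping the strategy object changes an enemy's behavior without touching the Enemy class, which is exactly what makes this pattern suitable for an or-feature.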
Design Phase • Component Specification -> updated according to the new components • Feature Dependency Matrix -> set aside in the new component division • Evidence Document -> no changes
Design Phase • Process Evidence
Design Phase • Learned Lessons • Each component has to encapsulate a single kind of variability; • The components have to be "self-contained"
Project Metrics • Design Phase
Implementation Phase • Components were implemented first in plain Java and later with OSGi • The components implemented with and without OSGi: • Audio • Graphic • Movement • Strategies • Core -> application that can be extended
Implementation Phase • DEMONSTRATION!!
Implementation Phase • How was variability implemented? • Optional features -> with the Builder Pattern • Or-features -> with the Strategy Pattern
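The Builder approach to optional features can be sketched as follows. The names here (GameConfig, withAudio, withHighScore) are hypothetical; the point is only that optional features default to "off" and are switched on per product through the builder.

```java
// Illustrative sketch: a Builder assembling a product configuration in
// which optional features (audio, high score) may or may not be present.
class GameConfig {
    final boolean audio;
    final boolean highScore;

    private GameConfig(Builder b) {
        this.audio = b.audio;
        this.highScore = b.highScore;
    }

    static class Builder {
        private boolean audio = false;      // optional features start disabled
        private boolean highScore = false;
        Builder withAudio()     { audio = true; return this; }
        Builder withHighScore() { highScore = true; return this; }
        GameConfig build()      { return new GameConfig(this); }
    }
}

public class Main {
    public static void main(String[] args) {
        // A product variant with audio but no high-score feature.
        GameConfig g = new GameConfig.Builder().withAudio().build();
        System.out.println(g.audio + " " + g.highScore); // prints "true false"
    }
}
```

Each product of the line then corresponds to one builder call chain, keeping the optional-feature selection in a single place.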
Implementation Phase - Testing • Issues: 16 issues reported on the issue tracker; • Some issues: • No documentation (Javadoc, component description); • Components assigned High and Medium priority were not implemented; • No exception handling; • It is not hard to validate a component contract if you have: • Proper component specification; • A process to validate them; • Tools to support the activity.
Implementation Phase - Testing • Conclusion • It is hard to create automated tests for components that have basically user interaction (i.e. audio, image quality...) • Ideally, the component must come with a test bed for anyone who wants to see if that component fulfills its own acceptance criteria. • It is possible to test components!
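The "test bed shipped with the component" idea above can be sketched in plain Java. The Movement contract and its acceptance criterion below are hypothetical examples, not the project's real specification; in practice a framework such as JUnit would likely be used instead of hand-rolled checks.

```java
// Minimal sketch of a component test bed: a contract check distributed
// with the component so any consumer can verify its acceptance criteria.
interface Movement {
    // Contract (assumed for illustration): a non-negative speed
    // must never move the entity backwards.
    int advance(int position, int speed);
}

class SimpleMovement implements Movement {
    public int advance(int position, int speed) {
        return position + Math.max(0, speed);
    }
}

class MovementTestBed {
    // Returns true if the implementation satisfies the stated criterion.
    static boolean passes(Movement m) {
        for (int speed = 0; speed <= 10; speed++) {
            if (m.advance(100, speed) < 100) return false;
        }
        return true;
    }
}

public class Main {
    public static void main(String[] args) {
        System.out.println(MovementTestBed.passes(new SimpleMovement())); // prints "true"
    }
}
```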
Implementation Phase • Learned Lessons • To create the OSGi services from the self-contained components, a minimum of documentation is necessary: • Main test classes • Component usage documentation • The presence of the component developer in the first phases of OSGi creation is strongly recommended • The presence of a technical leader with a wide-angle view of the implementation activities is strongly recommended • Avoid necklaces and dependencies • Improve the communication among distant developers
Implementation Phase • Learned Lessons (cont.) • The developers must be physically together (or at least virtually together) during the initial and critical phases of development • Distributed implementation in these phases is a risk to the project schedule
Project Metrics • Implementation Phase – Construction #1
Project Metrics • Implementation Phase – Construction #2
Project Metrics • Implementation Phase – Construction #3
Project Metrics • Closing Phase
Project Metrics • General View
Project Strong and Weak Points • Strong Points • Coherent and well-defined division of the activities in the Domain Analysis process; • Software architect's experience in the games domain; • Reuse of information from other teams and areas; • Team commitment at the beginning of the project; • Weak Points • The process for the design phase was not very easy to understand; • It is strange that one of the most important documents for the design phase is the use case document, since it is not part of the proposed domain analysis process; • Lack of team commitment in the final phase; • The team's lack of experience with SPL;
Learned Lessons • The domain expert is necessary especially in the domain analysis and design phases; • The requirements and use cases should describe the commonalities (how they are common to the domain) and an overview of the variability, and should not describe specific functions (in our case, "play game" should not exist); • The architecture must be planned according to the technology in which it will be developed.
Improvements • Include the domain analysis documents in the process and define their relations; • Define the metrics for the DA phase, and which evaluation functions should be considered to define the domain scope and how it influences the DA; • Clarify that features besides the ones extracted from the analyzed documentation can be included in the domain; • The phases should be more connected: for example, the DA process makes no mention of the use case document that is necessary for the design phase, and the design phase does not consider the technology to be used in the implementation phase;
Improvements • The architecture classes should be defined after the component identification, so that each component's architecture can be more isolated; • With OSGi as the implementation technology, it would be interesting for each variability to be implemented as a service, with its component verifying whether the service is available or not. That way, variability changes in the application would be dynamic.
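The dynamic-variability idea above can be sketched without a full OSGi framework. In real OSGi the lookup would go through BundleContext or a ServiceTracker; the map-based registry below is only a simplified stand-in to show the shape of the check, and all names in it are illustrative.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Simplified stand-in for OSGi's dynamic service lookup: the component
// asks whether a variability's service is currently registered and
// degrades gracefully when it is not.
class ServiceRegistry {
    private final Map<String, Object> services = new HashMap<>();
    void register(String name, Object service) { services.put(name, service); }
    void unregister(String name)               { services.remove(name); }
    Optional<Object> lookup(String name)       { return Optional.ofNullable(services.get(name)); }
}

public class Main {
    // The audio variability is used only if its service is available.
    static String render(ServiceRegistry registry) {
        return registry.lookup("audio").isPresent() ? "game+audio" : "game";
    }

    public static void main(String[] args) {
        ServiceRegistry reg = new ServiceRegistry();
        System.out.println(render(reg));   // prints "game"
        reg.register("audio", new Object());
        System.out.println(render(reg));   // prints "game+audio"
        reg.unregister("audio");
        System.out.println(render(reg));   // prints "game"
    }
}
```

Because the availability check happens at each use rather than at build time, registering or unregistering a service changes the running application's behavior, which is the dynamic variability the slide proposes.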