
EA/Methodology Training



Presentation Transcript


  1. EA/Methodology Training 7 July 2010

  2. SRS-FDS Methodology • Why? • The TRT deliverables • How? • Process • References • Standards • EA

  3. References https://subversion.infotechfl.com/svn/CoMa/SRS-FDS/ • TRT_Docs/References/SRS_FDS_Artifact_Review_Process_Orientation/ • Great overview of the deliverables • EA and Methodology/References/ • Methodology Standard

  4. References The Tracker • http://tinyurl.com/comasrsfds

  5. Software Tools • FitNesse • GoogleDocs • Enterprise Architect • With Iconix and ITI add-ins • Balsamiq • Greenshot

  6. Today’s Stories • G.UA.REM: Remarks • M.MD.MTMR: Material to Material Relationship

  7. Agenda • User Stories • Domain Model • Requirements • Use Cases and User Interfaces • Robustness Diagrams

  8. User Stories • Available in FitNesse and in the repository • Imported into EA • Tracked at the sentence level
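Sentence-level tracking can be sketched in Python. The splitting rule and the "story ID-n" numbering below are illustrative assumptions, not the project's actual scheme:

```python
import re

def split_story_sentences(story_id, text):
    """Split a user story into sentences and give each one a sequential
    ID, so individual sentences can be traced to requirements.
    The "<story_id>-<n>" format is an assumption for illustration."""
    parts = re.split(r'(?<=[.!?])\s+', text.strip())
    sentences = [p.strip() for p in parts if p.strip()]
    return [(f"{story_id}-{i}", s) for i, s in enumerate(sentences, start=1)]

# One of today's stories (G.UA.REM: Remarks), with invented sentence text.
rows = split_story_sentences(
    "G.UA.REM",
    "The user adds a remark to a contract. The system records who entered it.")
```

Each `(id, sentence)` pair is what the RTM would later trace to a user requirement.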

  9. Domain Model • Created by SuperSMEs • SMEs use terms, add issues

  10. Understanding the Review Process: How the artifacts work together • Domain Model Artifact • Primary Purpose - provide a diagram of all objects referenced in the use cases and their relationships to each other, and define the business terms for the glossary • Note: This is NOT a database design • Domain Class – a group or collection of common business terms • Examples: Contract, Reference Item, Contract Item • Domain Class Attribute – a business term • “Contract” Examples: Awarded Date, Description, and Spec Book • “Reference Item” Examples: ID, Spec Book, Description, and Unit • “Contract Item” Examples: everything from Reference Item (via Generalization) and Unit Price, Line Number, and Quantity Paid to Date

  11. Understanding the Review Process: How the artifacts work together • Domain Model Artifact (continued) • Includes domain diagram - the relationship between the domain classes • 3 types of links (for details see Construction_Materials_SRS_FDS_Domain_Links_Overview) • Aggregation/Composition - diamond shape • "Has A" - implies ownership • Contract has-a Subcontract • Generalization - arrowhead shape • "Is A" - implies a "kind of" relationship • Subcontract item is-a Contract item • Association - no line ends • All other relationships • Will include a descriptive verb • Subcontract References Subcontract Agency Option
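The three link types map naturally onto code. As a hedged sketch: the class and attribute names come from the slide examples above, but the constructor shapes are our own assumptions, not the domain model's actual definitions:

```python
class ReferenceItem:
    """Domain class: business terms shared by all reference items."""
    def __init__(self, item_id, spec_book, description, unit):
        self.item_id = item_id
        self.spec_book = spec_book
        self.description = description
        self.unit = unit

class ContractItem(ReferenceItem):
    """Generalization ("is-a"): a Contract Item is a kind of Reference
    Item, inheriting its attributes and adding its own."""
    def __init__(self, item_id, spec_book, description, unit,
                 unit_price, line_number, quantity_paid_to_date):
        super().__init__(item_id, spec_book, description, unit)
        self.unit_price = unit_price
        self.line_number = line_number
        self.quantity_paid_to_date = quantity_paid_to_date

class SubcontractAgencyOption:
    """Placeholder domain class for the association example."""

class Subcontract:
    """Association ("references"): a descriptive-verb link to another
    class, without ownership."""
    def __init__(self, agency_option=None):
        self.references = agency_option

class Contract:
    """Aggregation ("has-a"): a Contract owns its Subcontracts."""
    def __init__(self, description):
        self.description = description
        self.subcontracts = []

    def add_subcontract(self, sub):
        self.subcontracts.append(sub)
```

Inheritance mirrors the arrowhead (generalization), an owned collection mirrors the diamond (aggregation), and a plain attribute reference mirrors a verb-labeled association.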

  12. Understanding the Review Process: How the artifacts work together • Domain Model Artifact (continued)

  13. Understanding the Review Process: How the artifacts work together • Domain Model Artifact (continued) • Includes domain class details • NAME [NEW - Status] (Aliases) • Name - short but descriptive business name • NEW (blank if existing) - designates whether the class exists in current web Trns•port • Status - designates whether the class is ready for review or not (note that classes completed in Iteration 1 may relate to classes completed in later iterations - the relationship, but not the details, will be documented and reviewed). Status values are “Draft”, “Proposed”, and “Approved”. • Any aliases (other business names for the class) • Notes - including definitions • “[NEW – Proposed]" classes are to be reviewed
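The "NAME [NEW - Status] (Aliases)" convention can be checked mechanically. A small parser as a sketch, assuming the bracketed part uses a hyphen or en dash and the status is one of the three values above:

```python
import re

# NAME [NEW - Status] (Aliases); the NEW flag and both trailing parts
# are optional, matching the slide's "blank if existing" rule.
_LABEL = re.compile(
    r'^(?P<name>.+?)'
    r'(?:\s*\[(?P<new>NEW)?\s*[-–]\s*(?P<status>Draft|Proposed|Approved)\])?'
    r'(?:\s*\((?P<aliases>[^)]*)\))?$')

def parse_class_label(label):
    """Split a domain class label into name, NEW flag, status, and aliases."""
    m = _LABEL.match(label.strip())
    if not m:
        raise ValueError(f"unrecognized label: {label!r}")
    return {
        "name": m.group("name").strip(),
        "new": m.group("new") is not None,
        "status": m.group("status"),
        "aliases": [a.strip() for a in (m.group("aliases") or "").split(",")
                    if a.strip()],
    }
```

A reviewer script could then flag exactly the `new and status == "Proposed"` classes for review.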

  14. Understanding the Review Process: How the artifacts work together • Domain Model Artifact (continued) • Includes domain attributes • Name - business name of attribute • New - designates whether the attribute exists in current web Trns•port • Initial - documents the initial value if always the same • Notes - includes definitions, examples, and logic • "New" attributes are to be reviewed • Any missing attributes are to be identified • Useful for commenting and revising system requirements (with "Track Changes" on)

  15. Understanding the Review Process: How the artifacts work together • Domain Model Artifact (continued)

  16. Requirements • User requirements are prefaced with UR. • Possible types: • Agency Option • Logic • NR-Example • NR-Info • Requirement

  17. Requirements • For Requirement stereotypes, begin with either: • “The system will…” • “Ability to…”
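A minimal check for this phrasing convention (the function name and error wording are ours, not part of the methodology standard):

```python
# Approved openings for Requirement-stereotype statements, per the slide.
APPROVED_OPENINGS = ("The system will", "Ability to")

def check_requirement_opening(text):
    """Return True if a Requirement-stereotype statement begins with
    one of the approved phrasings."""
    return text.strip().startswith(APPROVED_OPENINGS)
```

Such a check could run over exported requirement text during peer review to catch off-pattern statements early.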

  18. Understanding the Review Process: How the artifacts work together • Requirements Artifact • Primary Purpose - clarify requirements • The system requirements describe, in the language of the software developer and integrator, what the proposed product must do to implement the user requirements. The URS should be reviewed and analyzed and used as the basis for developing the system requirements. The system requirements must include functional, non-functional, technical architecture, preliminary data, and interface requirements for the proposed system in sufficient detail to support system design. Security, accessibility, user interface, and performance requirements must always be included in the non-functional requirements.

  19. Understanding the Review Process: How the artifacts work together • Requirements Artifact (continued) • Includes user requirements • New content to be reviewed • Includes all original user story sentences • Old content not to be reviewed - reference only • Includes trace (relationship) between user story sentence and requirement • Trace is to be reviewed • Useful for commenting and revising requirements (with "Track Changes" on)

  20. Understanding the Review Process: How the artifacts work together • Requirements Artifact (continued)

  21. Requirements In EA…

  22. Use Cases and User Interfaces • Typically done in parallel • Use Balsamiq for UI mock-ups

  23. Balsamiq • Many standard UI components are mocked. • See Tracker for lists of existing UI mocks. • Can capture existing system if only a few changes will be made to it. • See Illustrated UI Glossary for terms.

  24. Understanding the Review Process: How the artifacts work together • Storyboard Artifact (all new content to be reviewed) • Primary Purpose - validate the user and application interaction and the requirements, and to provide a visual of how the requirements are satisfied by the application. • Not to represent the final user interface or capture the placement of every field; field-level detail will be included only when necessary to document business rules or processes, such as interaction with the user (for example, key identifying fields used for selection from a list). • Field-level detail is documented in the Domain Model. • Field-level screen detail will be added during Design and Construction Phase. • Any missing user interactions are to be identified • Useful for commenting and revising system requirements (with "Track Changes" on)

  25. Understanding the Review Process: How the artifacts work together • Storyboard Artifact (continued)

  26. Use Cases • The description of the interaction between user and system • Selecting a user • Some standard actions • Access • Invokes

  27. Understanding the Review Process: How the artifacts work together • Use Case Artifact (all new content to be reviewed) • Primary Purpose - describe the interaction between the application and the user and serve as a communication tool between Info Tech and TRT members to document the expected application responses to user actions based on the stated requirements. The use cases are documented in English text for ease of reading. A use case diagram is also developed to graphically depict this interaction between the user and the application. • Any missing user interactions are to be identified • Useful for commenting and revising system requirements (with "Track Changes" on)

  28. Understanding the Review Process: How the artifacts work together • Use Case Artifact (continued) • Actor • The most likely (or, if indeterminate, the least proficient) user for each use case. Someone or something outside the system that either acts on the system – a primary actor – or is acted on by the system – a secondary actor. An actor may be a person, a device, another system or sub-system, or time.

  29. Understanding the Review Process: How the artifacts work together • Use Case Artifact (continued) • Preconditions • Defines all the conditions that must be true (i.e., describes the state of the system) before the use case can be started. • Basic Path • The primary scenario, or typical course of events. Given the great impact of security on Trns•port, the path requiring the lowest level of access is typically modeled as the basic path (you view before you create, modify, or delete). • Alternate Path(s) • Optional secondary paths or alternative scenarios, which are variations on the main theme. Each business rule may lead to an alternative path. • Exceptions at the system level are already met by existing web Trns•port. • The description of an exception will indicate how the system will respond to, or (if possible) recover from, the error condition.
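The use case sections above can be modeled as a simple record. The field names and the sample use case below are illustrative assumptions, not the EA template's exact property names:

```python
from dataclasses import dataclass, field

@dataclass
class UseCase:
    """Skeleton of a use case: actor, preconditions, a basic path,
    and named alternate paths, as described in the slides above."""
    name: str
    actor: str
    preconditions: list = field(default_factory=list)
    basic_path: list = field(default_factory=list)
    alternate_paths: dict = field(default_factory=dict)

# Hypothetical example: the lowest-access path (view) as the basic path.
view_remarks = UseCase(
    name="View Remarks",
    actor="Contract Clerk",  # invented actor for illustration
    preconditions=["User has view access to the contract"],
    basic_path=["User selects a contract",
                "System displays the contract's remarks"],
    alternate_paths={"A1": ["Contract has no remarks; "
                            "system displays an empty list"]},
)
```

Modeling view before create/modify/delete follows the slide's security rationale: the basic path requires the lowest level of access.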

  30. Understanding the Review Process: How the artifacts work together • Use Case Artifact (continued)

  31. Robustness Diagram • Not delivered • Used to disambiguate use cases • Used to communicate with developers • May be done in combination with use cases and user interfaces

  32. RTM • Created automatically from the links created in the traceability diagram. • Some SMEs find it very useful for peer reviews.

  33. Understanding the Review Process: How the artifacts work together • RTM Artifact (traceability to be reviewed) • Primary Purpose • RTM - user story package level: provide traceability from user story sentences to user requirements and use cases within a user story package (FitNesse page) • RTM - global: provide backwards traceability for all system requirements to a source user requirement (now), and (later) forward traceability to design objects and test procedures • Any incorrect or missing traces are to be identified • Useful for filtering by user story, system requirement, and/or use case

  34. Understanding the Review Process: How the artifacts work together • RTM Artifact (continued)

  35. Understanding the Review Process: How the artifacts work together • RTM Artifact (continued) • I = Iteration (useful in global RTM) • # = a sequential number (useful for reference) • Source = ID of User Story Sentence (useful for traceability) • User Story Sentence = the text from FitNesse

  36. Understanding the Review Process: How the artifacts work together • RTM Artifact (continued) • URS ID = ID of User Requirement (useful for traceability) • User Requirement Description = the text of Requirement • Type = type of requirement (NR-Info, NR-Example, CC, Logic, or Requirement) • SRS ID = ID of Use Case (useful for traceability) • Status = status of Use Case

  37. Understanding the Review Process: How the artifacts work together • RTM Artifact (continued)

  38. Understanding the Review Process: How the artifacts work together • RTM Artifact (continued) • MS Excel filtering

  39. Understanding Traceability • RTM Artifact Advantages • Review - useful for filtering by user story, user requirement, and/or use case • Requirement Artifact Advantages • Revision - useful for commenting and revising user requirements (with "Track Changes" on) • RTM Artifact Disadvantages • Source (user story sentence) repeats for each URS (user requirement) • URS (user requirement) repeats for each SRS (use case) • Requirement Artifact Disadvantages • Every traced user story sentence is listed below each user requirement
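The RTM's Excel-style filtering can be mimicked over plain rows. Column names follow the RTM slides; the sample IDs and values are invented for illustration:

```python
# A tiny RTM: one row per (user story sentence, user requirement, use case)
# trace. Note how the Source repeats for each URS ID, as the slide warns.
rtm = [
    {"Iteration": 1, "#": 1, "Source": "G.UA.REM-1",
     "URS ID": "UR-001", "Type": "Requirement",
     "SRS ID": "UC-010", "Status": "Proposed"},
    {"Iteration": 1, "#": 2, "Source": "G.UA.REM-1",
     "URS ID": "UR-002", "Type": "Logic",
     "SRS ID": "UC-010", "Status": "Proposed"},
    {"Iteration": 2, "#": 3, "Source": "M.MD.MTMR-1",
     "URS ID": "UR-050", "Type": "Requirement",
     "SRS ID": "UC-042", "Status": "Draft"},
]

def filter_rtm(rows, **criteria):
    """Keep only rows whose columns match every criterion,
    mirroring a column filter in MS Excel."""
    return [row for row in rows
            if all(row.get(col) == val for col, val in criteria.items())]

# All traces for one user story sentence:
remark_rows = filter_rtm(rtm, Source="G.UA.REM-1")
```

Filtering by `Source` gives a user-story view; filtering by `Type` or `Iteration` gives the global-RTM views the slides describe.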

  40. Issues Report • Not delivered. • Keep track of discussions. • See the Methodology Standards for more details.

  41. Reviews • Peer • Developer • QA
