SYST699 – SPEC Innovations
Innoslate™ Systems Engineering Management Software Tool Test & Analysis
Aron Ceely • Justin Mathews • Kate Stevenson • Bruck Woldie
Agenda • Introduction • Project Approach • Test and Evaluation Plan • Analysis • Recommendations • Questions • Acknowledgments
Background – SPEC Innovations • SPEC Innovations: Systems and Proposals Engineering Company. • Established in 1993. • Provides a wide variety of technical and proposal services to government and commercial customers. • Sponsor – Dr. Steven H. Dam; PhD in Nuclear Physics from the University of South Carolina, BS in Physics from GMU. • Innoslate™ is the company's first foray into commercial product development.
Background – Innoslate™ Innoslate™ Systems Engineering Tool • Developed as an aggregation of several systems engineering tools and concepts that have been utilized over the past 25 years. • Cloud-based web application developed specifically for systems engineers to encompass activities across the full systems lifecycle. • Includes capabilities for requirements management, collaboration, system modeling, reporting, and discrete event simulation.
Project Problem Definition • Software product development is very new to SPEC Innovations, necessitating a critical review and analysis of the Innoslate™ tool to provide independent feedback and recommendations. • SPEC Innovations' primary need is an evaluation of the tool's usability and functionality to measure its ability to assist systems engineers in doing their job.
Project Objectives • Test and critically evaluate the capabilities offered by Innoslate™. • Compile and detail test results and recommendations as part of a report to the sponsor. • Deliver the final project report. • Present the project and findings to the sponsor and GMU faculty.
Approach Overview • Identified a sample project for developing specific deliverables pertaining to systems engineering life cycle stages. • Scoped Innoslate™ capabilities for testing and evaluation. • Developed and executed a test plan outlining the test's scope, strategy, evaluation criteria, and test cases. • Conducted analysis of test results based on the developed evaluation criteria.
Meeting and Event Planning Assistant System • An existing, completed project selected from an MSSE course. • The project's purpose was to design an efficient mechanism for requesting, planning, executing, and monitoring group meetings (both face-to-face and at-a-distance). • Decomposed into seven components that span all phases of the Systems Engineering Life Cycle (SELC). • Focus was on the Concept Development and Engineering Development phases.
Innoslate™ Capabilities • Isolated capabilities within Innoslate™ that can be (1) used to develop the scoped deliverables from the project and (2) evaluated within the given timeframe. • Core components of Innoslate™ that were tested include: • Database View • Document Analyzer • Collaboration • Requirements • Report Generation
Test Scope • Focused on functionality and usability of the tool. • Capabilities of Innoslate™ were scoped to: • Database View, including storing of assets, artifacts, etc. • Requirements Gathering, including capturing requirements for the MEEPAS project. • Document Analyzer, including automated versus manual parsing of imported documents. • Report Generation, which includes inputting objects into the database and generating a report.
Functionality Testing (1) • Examined how the tool functions by inputting data and observing the tool's behavior in generating outputs. • Outcomes of the test were used to compare the expected outputs with the actual outputs. • Functional evaluation criteria were established to evaluate each capability's functionality.
Functionality Testing (2) • Example criteria include: • Whether assets/entities can be created, deleted, and stored within the tool. • Whether reports can be generated. • Whether projects, completed and in progress, can be shared between team members.
Usability Testing • Focused on measuring the tool's capability to meet its intended purpose as well as its ease of use. • Example usability evaluation criteria include: • How many SELC artifacts can be fully completed using the tool? • Is the tool intuitive? How familiar does the user have to be with the tool's User Guide? • After extended use, how familiar does the user become with the different parts and functions of the tool?
Test Cases • Test cases were written for each scoped Innoslate™ capability to verify that the system worked correctly per the User Guide. • Results were recorded as a Pass or Fail; a sketch of the record format such a case might take follows below.
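For illustration, here is a minimal sketch of the shape a pass/fail test-case record could take. The TestCase interface, its field names, and the sample case are invented for this sketch; they are not drawn from the project's actual test plan template.

```typescript
// Illustrative test-case record; fields are assumptions, not the
// project's real template.
type Verdict = "Pass" | "Fail";

interface TestCase {
  id: string;           // e.g. "TC-DB-01"
  capability: string;   // scoped Innoslate capability under test
  steps: string[];      // procedure, derived from the User Guide
  expected: string;     // expected behavior per the User Guide
  verdict?: Verdict;    // recorded after execution
}

const example: TestCase = {
  id: "TC-DB-01",
  capability: "Database View",
  steps: [
    "Create a new Asset entity",
    "Save the entity",
    "Search for the entity by name",
  ],
  expected: "Entity is created, stored, and returned by search",
  verdict: "Pass",
};
```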
Analysis Scope • Scope was specific to providing user-experience-related results and recommendations pertaining to Innoslate™. • Intended to assist SPEC Innovations with feedback for marketing the tool while continuing to mature its weak points. • Testing focused on the Functionality, Usability, Performance, and Documentation of Innoslate™. • Test case results were analyzed from Google Chrome and Mozilla Firefox users.
Analysis Results – Database View • Database Activities – Performed most database-related activities with no issues: creating, saving, viewing, and searching class entities; filtering and sorting the database's view; exporting entities from the tool. • Importing Documents – Importing documents associated with particular entities works as expected, except for a few user-friendliness glitches noted under the Recommendations section. • Deleting Entities/Documents – Deleting small to medium numbers of already-created entities works as expected, with occasional hang-ups when deleting a large number of entities at once.
Analysis Results – Requirements • Requirement Database Activities – Creating, viewing, editing, and deleting requirements performed correctly in accordance with the User Guide. • Validating – Scoring was found to be somewhat inconsistent (two requirements worded very similarly received very different scores), and the validation process occasionally caused the system to hang. • Requirements Deletion – Created requirements can be deleted successfully, although deleting a large number of requirements at once takes a long time. • Requirements Report Generation – Generating the requirements document initializes the document sections; however, it is not intuitive how to get the specific entered requirements into the document itself.
Analysis Results – Document Analyzer • Importing Documents – Uploaded both Word and PDF files; small to medium-sized files import through the Analyzer with no issues; large files show problems (the progress bar occasionally freezes). • Analyzer – PDFs were not parsed completely, forcing the user to manually enter the remaining portions of the document; Word documents parsed successfully; results of parsing could not be viewed in the Database View due to a quota error, and the tool froze.
Analysis Results – Report Generation • Class-Based Reports (e.g., Requirements) – The tool successfully generated a class summary report displaying a list of artifact entities. • CONOPS Report – The wizard establishes a template for creating the report, but selecting tasks from the wizard's list led to error messages. It was also unclear how to incorporate existing CONOPS entities into the auto-generated template. • DoDAF Reports – Report output failed to generate the expected document; DoDAF views are not yet implemented in the current version of Innoslate™. • JCIDS Reports – The tool successfully generated a class summary report displaying a list of artifact entities.
Recommendations on Existing Features (1) • Database Feature • Warn the user about data loss when navigating away from an entity's creation/editing screen. • Provide a graphic indicator that an entity contains one or more files. • Add the capability to delete already-uploaded files. • Allow sorting by all attributes from the Database View.
Recommendations on Existing Features (2) • Requirements Gathering • Automatic numbering of requirements, or at least a visual display noting the current numbering scheme. • A progress bar when running the requirements validation function. • The capability to add multiple requirements at once without having to exit to the Database View after creating each requirement.
Recommendations on Existing Features (3) • Document Analyzer • When uploading files, provide explicit messages informing the user of the status and outcome of the upload. • When a file fails to upload, provide more specific error messages explaining why the upload failed. • Report Generation • Since DoDAF views and reports are essential tools for systems engineers, add that capability. • It would be useful if the 'Report Completed' dialog had a 'View Report' button that opened the generated report.
Future Recommendations (1) • Implement a Project Scheduling capability • Functions for building project schedules and tasks, assigning the tasks to users collaborating on the same project, and generating project outputs to assist in project tracking. • Gantt chart, Critical Path Analysis (see the sketch after this slide), PERT chart • Develop or integrate a Project Cost Estimation tool • Generate estimates based on requirements, use cases, etc.
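To make the Critical Path Analysis recommendation concrete, here is a minimal sketch of the computation such a scheduling capability could perform. The task names, durations, and dependency graph are invented for this toy example; Innoslate™ offered no such API at the time of testing.

```typescript
// Toy critical-path computation over an invented task graph.
interface Task {
  id: string;
  duration: number;    // in days
  dependsOn: string[]; // ids of prerequisite tasks
}

const tasks: Task[] = [
  { id: "reqs",     duration: 5,  dependsOn: [] },
  { id: "design",   duration: 8,  dependsOn: ["reqs"] },
  { id: "build",    duration: 10, dependsOn: ["design"] },
  { id: "testplan", duration: 3,  dependsOn: ["design"] },
  { id: "test",     duration: 4,  dependsOn: ["build", "testplan"] },
];

// Earliest finish time per task, computed recursively with memoization.
const finish = new Map<string, number>();
function earliestFinish(id: string): number {
  if (finish.has(id)) return finish.get(id)!;
  const task = tasks.find(t => t.id === id)!;
  const start = Math.max(0, ...task.dependsOn.map(earliestFinish));
  const ef = start + task.duration;
  finish.set(id, ef);
  return ef;
}

const projectLength = Math.max(...tasks.map(t => earliestFinish(t.id)));
console.log(`Project length: ${projectLength} days`); // 27 days

// Walk back from the last-finishing task, always taking the
// prerequisite with the largest earliest finish (zero slack).
function criticalPath(): string[] {
  let current = tasks.reduce((a, b) =>
    earliestFinish(a.id) >= earliestFinish(b.id) ? a : b);
  const path = [current.id];
  while (current.dependsOn.length > 0) {
    const next = current.dependsOn
      .map(id => tasks.find(t => t.id === id)!)
      .reduce((a, b) => earliestFinish(a.id) >= earliestFinish(b.id) ? a : b);
    path.unshift(next.id);
    current = next;
  }
  return path;
}
console.log(criticalPath().join(" -> ")); // reqs -> design -> build -> test
```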
Future Recommendations (2) • Publish the Document Object Model (DOM) • Expose the DOM through an API (Application Programming Interface) scriptable in a language such as JavaScript; a sketch of what that could look like follows below. • Enables creating and manipulating systems engineering projects and entities programmatically. • Boosts the productivity of systems engineers.
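A hedged sketch, in TypeScript for clarity, of what a scriptable API over a published DOM could look like. The InnoslateProject interface, its methods, and the entity fields are all hypothetical; no such API existed in the version under test.

```typescript
// Hypothetical entity and project API; names are assumptions for
// illustration, not Innoslate's actual interface.
interface Entity {
  id: string;
  entityClass: "Requirement" | "Asset" | "Action";
  name: string;
  description?: string;
}

interface InnoslateProject {
  create(e: Omit<Entity, "id">): Promise<Entity>;
  find(query: string): Promise<Entity[]>;
  relate(parentId: string, childId: string, relation: string): Promise<void>;
}

// With such an API, a systems engineer could bulk-load requirements
// instead of entering them one at a time through the Database View.
async function importRequirements(project: InnoslateProject, texts: string[]) {
  for (const text of texts) {
    await project.create({
      entityClass: "Requirement",
      name: text.slice(0, 60),
      description: text,
    });
  }
}
```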
Acknowledgments • Thank you to our sponsors at SPEC Innovations: • Steven H. Dam, PhD • Chris Ritter • Thank you to our advisor, Dr. Hoffman • Thank you to GMU's Systems Engineering and Operations Research Department