MSE Presentation 3 By Padmaja Havaldar – Graduate Student Under the guidance of Dr. Daniel Andresen – Major Advisor Dr. Scott DeLoach – Committee Member Dr. William Hankley – Committee Member
Introduction • Overview • Revised Artifacts • Testing Evaluation • Project Evaluation • Problems Encountered • Lessons Learned • User Manual • Conclusion • Demonstration
Overview • Objective: to develop a web-based Statistical Analysis Tool (SAT) based on information about statistics alumni. Four types of tests are used for the analysis: regression analysis, correlation analysis, hypothesis testing, and the chi-square test. A sketch of one such computation follows.
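For illustration, here is a minimal Java sketch of the least-squares computation behind the first of these tests, simple linear regression. The class and method names are hypothetical and not taken from the project's code.

```java
// Hypothetical sketch: ordinary least-squares fit of y = a + b*x.
// Not the project's actual code; names are illustrative.
public final class SimpleRegression {

    /** Returns {intercept, slope} fitted by least squares. */
    public static double[] fit(double[] x, double[] y) {
        int n = x.length;
        double sx = 0, sy = 0, sxx = 0, sxy = 0;
        for (int i = 0; i < n; i++) {
            sx += x[i];
            sy += y[i];
            sxx += x[i] * x[i];
            sxy += x[i] * y[i];
        }
        double slope = (n * sxy - sx * sy) / (n * sxx - sx * sx);
        double intercept = (sy - slope * sx) / n;
        return new double[] { intercept, slope };
    }
}
```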
Revised Artifacts • Object Model
Revised Artifacts • Formal Requirement Specification • The USE model was refined with the changes suggested during Presentation 2
Components • J2EE Application Server • Enterprise Java Beans • Java Servlets • XML • HTML • Java Applets
Component Design • Servlet
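The servlet design diagram is not reproduced here, so the following is a hedged sketch of what a controller servlet in this architecture might look like. AnalysisServlet and the testType parameter are illustrative assumptions, not the project's actual identifiers.

```java
// Hypothetical controller servlet; in the real application it would look up
// a session bean via JNDI and delegate the computation to it.
import java.io.IOException;
import java.io.PrintWriter;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class AnalysisServlet extends HttpServlet {
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        // e.g. "regression", "correlation", "hypothesis", "chisquare" (assumed values)
        String testType = req.getParameter("testType");
        resp.setContentType("text/html");
        PrintWriter out = resp.getWriter();
        out.println("<html><body>Selected test: " + testType + "</body></html>");
    }
}
```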
Component Design • Entity Bean
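Likewise, as the entity-bean diagram is not shown, here is a minimal sketch of a CMP 2.x entity bean holding an alumni record. The bean name and its fields (userId, degree) are assumptions for illustration only.

```java
// Hypothetical CMP 2.x entity bean for an alumni record.
import javax.ejb.CreateException;
import javax.ejb.EntityBean;
import javax.ejb.EntityContext;

public abstract class AlumniBean implements EntityBean {
    // Container-managed persistent fields are abstract accessors in CMP 2.x.
    public abstract String getUserId();
    public abstract void setUserId(String id);
    public abstract String getDegree();          // e.g. "MS" or "PhD" (assumed)
    public abstract void setDegree(String degree);

    public String ejbCreate(String id, String degree) throws CreateException {
        setUserId(id);
        setDegree(degree);
        return null;  // for CMP the container supplies the primary key
    }
    public void ejbPostCreate(String id, String degree) {}

    // Required EntityBean callbacks (no-ops in this sketch).
    public void setEntityContext(EntityContext ctx) {}
    public void unsetEntityContext() {}
    public void ejbActivate() {}
    public void ejbPassivate() {}
    public void ejbLoad() {}
    public void ejbStore() {}
    public void ejbRemove() {}
}
```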
Component Design • Session Beans
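And a hedged sketch of a stateless session bean exposing one of the statistical computations (Pearson correlation here); StatisticsBean is a hypothetical name, not the project's.

```java
// Hypothetical stateless session bean wrapping a statistical calculation.
import javax.ejb.SessionBean;
import javax.ejb.SessionContext;

public class StatisticsBean implements SessionBean {

    /** Pearson correlation coefficient of two equal-length samples. */
    public double correlation(double[] x, double[] y) {
        int n = x.length;
        double sx = 0, sy = 0, sxx = 0, syy = 0, sxy = 0;
        for (int i = 0; i < n; i++) {
            sx += x[i];
            sy += y[i];
            sxx += x[i] * x[i];
            syy += y[i] * y[i];
            sxy += x[i] * y[i];
        }
        return (n * sxy - sx * sy)
                / Math.sqrt((n * sxx - sx * sx) * (n * syy - sy * sy));
    }

    // Required SessionBean callbacks (no-ops for a stateless bean).
    public void ejbCreate() {}
    public void ejbActivate() {}
    public void ejbPassivate() {}
    public void ejbRemove() {}
    public void setSessionContext(SessionContext ctx) {}
}
```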
Testing Evaluation • Registration form • All inputs to the fields in the form were tested. • Functionality of tests: each test was exercised with test cases, and the tool's output was checked against the same computations in Excel. Some of the test cases are listed below; a unit-test sketch of one case follows the list. • Regression test • Less than 3 members • No MS members • No PhD members
Testing Evaluation • Chi-Square • No Citizens • No International students • No person with a job in 3 months of graduation • No person without a job in 3 months of graduation • Hypothesis test • No MS alumni • No PhD alumni • Correlation • No members
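Below is a hedged sketch of how one boundary case ("less than 3 members") might be written as a JUnit test against the hypothetical SimpleRegression class sketched earlier. The slides do not state the tool's expected behavior for this case, so the test simply pins down the exact two-point fit.

```java
// Hypothetical boundary-case test (JUnit 3 style); assumes the
// SimpleRegression sketch above is on the classpath.
import junit.framework.TestCase;

public class RegressionBoundaryTest extends TestCase {
    public void testFewerThanThreeMembers() {
        double[] x = { 1.0, 2.0 };   // only two alumni records
        double[] y = { 3.0, 5.0 };
        double[] fit = SimpleRegression.fit(x, y);
        // With exactly two points the fitted line passes through both.
        assertEquals(3.0, fit[0] + fit[1] * 1.0, 1e-9);
        assertEquals(5.0, fit[0] + fit[1] * 2.0, 1e-9);
    }
}
```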
Testing Evaluation • Testing using JMeter • A stress/performance test was conducted with JMeter, based on the number of simultaneous users accessing the site; the results were plotted as graphs. • Throughput depends on many factors, such as network bandwidth, network congestion, and the amount of data passed. • The deviation measures how much individual response times vary; it should be as small as possible for best results. • The average is the mean time required to access the questions page. A small sketch of how these metrics are computed follows.
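To make these metrics concrete, here is a small self-contained Java sketch of how average, deviation, and throughput are derived from sampled response times. The sample values are invented, and JMeter actually computes throughput over wall-clock time; this sketch approximates it with the sum of response times.

```java
// Illustrative computation of JMeter-style metrics from assumed samples.
public final class JMeterStyleMetrics {
    public static void main(String[] args) {
        long[] responseMs = { 480, 510, 620, 455, 700 }; // invented sample times
        double sum = 0;
        for (long t : responseMs) sum += t;
        double avg = sum / responseMs.length;                   // "average"
        double var = 0;
        for (long t : responseMs) var += (t - avg) * (t - avg);
        double deviation = Math.sqrt(var / responseMs.length);  // "deviation"
        // Throughput = requests per second; wall-clock time is approximated
        // here by the summed response times.
        double throughput = responseMs.length / (sum / 1000.0);
        System.out.printf("avg=%.1f ms, dev=%.1f ms, throughput=%.2f req/s%n",
                avg, deviation, throughput);
    }
}
```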
Testing Evaluation • The values appear high because the data is passed to the bean and many calculations are performed on it. • The servlet uses the result to display graphs as applets, along with some tabular representations.
Testing Evaluation • After careful consideration, it was judged close to impossible for more than 30 simultaneous users to access the site with no lag between them, so tests were made with 15, 30, and 45 users. • The time required is higher than for normal text-only web sites; overall performance is best with a low number of simultaneous users and deteriorates as the number of users grows.
Testing Evaluation • Testing using Microsoft Application Center Test • Test type: Dynamic • Test duration: 00:00:05:00 • Test iterations: 227 • Total number of requests: 4,093 • Total number of connections: 4,093 • Average requests per second: 13.64 • Average time to last byte (msecs): 72.39 • Number of unique requests made in test: 12 • Number of unique response codes: 2 • Error counts • HTTP: 0 • DNS: 0 • Socket: 0 • Average bandwidth (bytes/sec): 134,048.33 • Number of bytes sent (bytes): 1,434,357 • Number of bytes received (bytes): 38,780,141 • Average rate of sent bytes (bytes/sec): 4,781.19 • Average rate of received bytes (bytes/sec): 129,267.14
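As a consistency check on these figures: the 5-minute run is 300 seconds, so 4,093 requests / 300 s ≈ 13.64 requests per second; likewise 38,780,141 bytes received / 300 s ≈ 129,267.14 bytes/sec, and (1,434,357 + 38,780,141) bytes / 300 s ≈ 134,048.33 bytes/sec, all matching the reported averages.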
Testing Evaluation • Scalability • Database: the Oracle database is highly scalable. Because the application keeps its users in a single table, the number of stored users does not noticeably affect database performance. The database is used to retrieve users from that table. • Application: tests with 200 simultaneous users also produced reasonable results; the average time for each user to access the questions page was 5 seconds, with a deviation of 2 seconds. • Portability • Because the J2EE architecture is built on the Java platform, the application can be used across many enterprise platforms. • Robustness • Using client-side scripting and error checking within the middle tier, the application is robust against invalid data (a sketch of such middle-tier validation follows this list). • The application underwent many iterations of unit testing, culminating in a robust application. • The worst-case JMeter tests also produced reasonable results, indicating that the application is highly robust.
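Here is a hedged sketch of the kind of middle-tier error checking described above; the field names and range limits are assumptions, not the project's actual validation rules.

```java
// Hypothetical middle-tier validation applied before data reaches the beans.
public final class InputValidator {

    /** Rejects blank names and non-numeric or implausible graduation years. */
    public static void validate(String name, String gradYear) {
        if (name == null || name.trim().length() == 0)
            throw new IllegalArgumentException("name is required");
        if (gradYear == null)
            throw new IllegalArgumentException("graduation year is required");
        try {
            int year = Integer.parseInt(gradYear.trim());
            if (year < 1950 || year > 2100)   // assumed plausible range
                throw new IllegalArgumentException("graduation year out of range");
        } catch (NumberFormatException e) {
            throw new IllegalArgumentException("graduation year must be numeric");
        }
    }
}
```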
Formal Technical Inspection • Inspection of the SRS was conducted by Laksmikanth Ghanti and Divyajyana Nanjaiah • The inspection results rated the SRS 99% satisfactory. The minor issues were corrected by adding a section on anticipated future changes in version 2.0 of the SRS and by making provision for additional error messages in the SRS
User Manual • An installation guide and a detailed walkthrough of the project are provided in the user manual • User manual
Project Evaluation • Project Duration
Project Evaluation • Lines of Code • Estimate in first phase = 4636 • Actual lines of code • Entity Java Beans = 1869 • Servlet = 1040 • XML = 120 • Total = 3029 lines of code
Problems Encountered • Learning curve • J2EE and deploytool • Does not update files automatically • Not well suited to unit testing or iterative development practices • EJB packaging errors • Alumni data
Lessons Learned • Methodology • Usefulness of methodologies • Reviews • The feedback during reviews was very helpful • Technology • J2EE architecture and deploy tool
Conclusion • SAT was implemented using the J2EE architecture • JMeter and Microsoft ACT were used to stress-test the application, and its performance was found to be satisfactory • The SAT is extensible