This project aims to design and develop an Online Music Store for public users, with functionalities for browsing, searching, buying products, and managing accounts. The implementation phase includes coding, testing, and analysis using technologies like ASP.NET 2.0 and C#.
Online Music Store
MSE Project Presentation III
Presented by: Reshma Sawant
Major Professor: Dr. Daniel Andresen
03/11/08
Phase III Presentation Outline
• Project Overview
• Brief Review of Phases
• Action Items from Phase II
• Implementation/Demo
• Assessment Evaluation
• Project Evaluation
• Lessons Learned
Project Overview
The objective of this project is to design and develop an Online Music Store.
• Target: Public Users
• Product: Media for Music
• User Types: User, Administrator
• Functionalities for Users: Browsing, searching, buying products, getting song recommendations, managing a personal account
• Functionalities for Administrator: Manage Catalog Details, Manage Orders, Manage Shopping Cart
Review of Phases
• Phase I:
  • Requirement Specifications
• Phase II:
  • Designed Web Pages
  • Created Test Plan
• Phase III (Current):
  • Coding
  • Testing and Analysis
Action Items from Phase II
1) Correct multiplicities in Class Diagram
  • Multiplicity between the ShoppingCart class and the CartItem class should be 1..*
Action Items from Phase II
2) Revise SLOC count and Project Duration
  • Included in Project Evaluation
Implementation & Demo
• Technologies Used:
  • IDE – Microsoft Visual Studio 2005
  • Technology – ASP.NET 2.0
  • Language – C#
  • Database – SQL Server 2005
Assessment Evaluation
• Manual Testing – to ensure the correctness of various parts of the code
Assessment Evaluation
• E.g. Register Web Page for User
• E.g. Edit Shopping Cart
Assessment Evaluation
• Performance Testing
  • Goal:
    • Determine load in terms of concurrent users and requests
    • Determine Response Time – the time from when a request for a Web Page is initiated until the page is completely displayed in the user's browser
  • Tool Used – JMeter (http://jakarta.apache.org)
  • Inputs to JMeter:
    • Number of Users
    • Ramp-up period – time (sec) over which the full number of users chosen is started
    • Loop Count – how many times to repeat the test
  • E.g. Users = 10, Loop Count = 20, Ramp-up period = 5 sec => 10 users will be started in 5 sec with total requests = 200 (10 * 20)
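The relationship between these JMeter inputs can be sketched as a small calculation (a hypothetical helper for illustration, not part of JMeter itself):

```python
def jmeter_load(users, loop_count, ramp_up_sec):
    """Derive total request count and user start rate from JMeter's inputs."""
    total_requests = users * loop_count   # each user repeats the test plan loop_count times
    users_per_sec = users / ramp_up_sec   # users are started evenly across the ramp-up period
    return total_requests, users_per_sec

# The slide's example: 10 users, loop count 20, ramp-up period 5 sec
total, rate = jmeter_load(10, 20, 5)
print(total, rate)  # → 200 2.0
```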
Assessment Evaluation: Performance Testing Factors
• Load Type
  • Peak Load – maximum number of users and requests loaded in a short duration (e.g. 5 sec)
  • Sustained Load – maximum users and requests loaded over a longer period (e.g. 5 min)
• Connection
  • Wireless Connection at 54.0 Mbps
  • LAN Connection at 100.0 Mbps
• Web Pages Tested
  • HTML Page (Login Web Page)
  • Database-Intensive Page (Home Page)
  • Business Logic Page (Shopping Cart Page)
Assessment Evaluation: Performance Testing Environmental Set-up
• Machine Configuration
  • Operating System – Windows XP Professional
  • Memory – 1 GB RAM
  • Hard Disk – 100 GB
  • Processor – Intel Pentium M, 1.7 GHz
Assessment Evaluation: Home Page [http://localhost:2416/CDShop/Default.aspx]
• Peak Load at Wireless (54 Mbps) vs. LAN Connection (100 Mbps)
• Note
  • Loop Count constant at 20,000
  • Ramp-up period of 5 sec
  • Users – 200, 600, 800, 1000
• Observations
  • Response Time increases linearly with the number of users for both Wireless and LAN
  • Max no. of users handled by the system before it becomes saturated = 1000
  • Response Time is lower for LAN due to its higher bandwidth
Assessment Evaluation: Home Page [http://localhost:2416/CDShop/Default.aspx]
• Constant Users vs. Constant Loop Count for Wireless Connection
  • Users constant at 200, Loop Count increased up to 20,000
  • Loop Count constant at 20,000, Users – 200, 600, 800, 1000
Assessment Evaluation: Home Page [http://localhost:2416/CDShop/Default.aspx]
• Observations
  • Response Time increases rapidly with the number of users, but only slightly when the users are kept constant and only the loop count is increased
• Reason:
  • If the number of users is kept constant and only the loop count is increased, the number of requests/sec handled by the server remains constant for every increase in the loop count
  • If the users are increased and the loop count is kept constant, the requests/sec handled by the server increases with the number of users while the number of executions per user remains constant, hence the longer response time
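This behaviour is consistent with Little's law (throughput = concurrency / response time): the loop count only extends how long the test runs, while the sustained request rate is set by the number of concurrent users. A sketch with illustrative numbers (none of these figures are from the actual measurements):

```python
def steady_state_rps(concurrent_users, avg_response_time_s):
    """Little's law: requests/sec the server must sustain for a given concurrency."""
    return concurrent_users / avg_response_time_s

# Raising the loop count leaves concurrency, and thus requests/sec, unchanged
print(steady_state_rps(200, 0.5))   # → 400.0
# Raising users at the same response time raises the required requests/sec
print(steady_state_rps(1000, 0.5))  # → 2000.0
```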
Assessment Evaluation
• Comparison of Response Times of all 3 Web Pages at a Wireless Connection of 54.0 Mbps
• Note
  • Loop Count constant at 20,000
  • Ramp-up period of 5 sec
  • Users – 200, 600, 800, 1000
• Observations
  • Response Time increases more for the Home Page than for the Login and Shopping Cart Pages
  • Lowest Response Time for the Login Page, as no database requests are submitted by the user
  • Moderate Response Time for the Shopping Cart Page because there are more computations
  • Response Time for the Shopping Cart Page is approx. 28% higher on average than for the Login Page
  • Response Time for the Home Page is approx. 246% higher on average than for the Login Page
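The averaged percentages above follow the usual relative-difference formula; the response times below are hypothetical placeholders chosen to reproduce the reported ratios, not the measured values:

```python
def percent_increase(baseline, measured):
    """Relative increase of measured over baseline, in percent."""
    return (measured - baseline) / baseline * 100

# Illustrative averages only: Login as baseline, Cart ~28% and Home ~246% higher
login_ms, cart_ms, home_ms = 100.0, 128.0, 346.0
print(round(percent_increase(login_ms, cart_ms)))  # → 28
print(round(percent_increase(login_ms, home_ms)))  # → 246
```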
Assessment Evaluation: Home Page [http://localhost:2416/CDShop/Default.aspx]
• External Factors Affecting Response Time
  • Varying network bandwidth
  • Limited system hardware resources (CPU, RAM, disks) and configuration
  • JMeter tests and the server running on the same machine
Assessment Evaluation: Summary
• For Peak Load
  • Users – 200, 600, 800, 1000
  • Loop Count constant at 20,000
  • Ramp-up period = 5 sec
  • Response Time increases rapidly with the number of users, but only slightly when the users are kept constant and only the loop count is increased
  • Response Time is highest for the Home Page, intermediate for the Shopping Cart Page, and lowest for the Login Page
Assessment Evaluation: Login Page [http://localhost:2416/CDShop/Login.aspx]
• For Sustained Load at Wireless Connection
Project Evaluation
• Project Duration (actual)
  • Phase I = 86 hours
  • Phase II = 140.5 hours
  • Phase III = 304.5 hours
  • Total = 531 hours
• Project Duration (in Months)
  • Estimated at the end of Phase II = 6.5 months
  • Actual = 7.5 months
Project Evaluation
• Category Breakdown
  • Research = 38.5 hours
  • Design = 37 hours
  • Coding = 305.5 hours
  • Testing = 32 hours
  • Documentation = 118 hours
  • Total = 531 hours
Project Evaluation
• SLOC Count (Actual) – LocMetrics Tool (http://www.locmetrics.com)
  • C# Code (including C# auto-generated code) = 2757
  • SQL Code = 540
  • XML Code = 86
  • CSS Code = 412
  • Total = 3795
• SLOC Count (Estimated)
  • At the end of Phase II – 3200 (based on the prototype design in Phase I)
Project Experience
• Lessons Learned:
  • New technology
  • Use of various tools for designing and testing – Visual Studio 2005, JMeter, LocMetrics
  • Working with UML and Class Diagrams
  • Entire life cycle of the project – requirement gathering, design, coding, testing, and documentation
  • Testing applications at different levels