
Page 1: MSE Presentation 3

MSE Presentation 3

By

Padmaja Havaldar - Graduate Student

Under the guidance of

Dr. Daniel Andresen - Major Advisor
Dr. Scott Deloach - Committee Member

Dr. William Hankley - Committee Member

Page 2: MSE Presentation 3

Introduction

Overview
Revised Artifacts
Testing Evaluation
Project Evaluation
Problems Encountered
Lessons Learned
User Manual
Conclusion
Demonstration

Page 3: MSE Presentation 3

Overview

Objective: to develop a web-based statistical analysis tool based on the statistics alumni information. The four types of analysis used were regression analysis, correlation analysis, a hypothesis test, and a chi-square test.
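As a minimal sketch of the first of these analyses, a simple least-squares line fit can be written in plain Java (class and method names here are illustrative, not taken from the project's source):

```java
// Illustrative least-squares fit; not the project's LinearRegressionBean.
public class LinearFit {
    // Returns {slope, intercept} of the best-fit line y = slope*x + intercept.
    public static double[] fit(double[] x, double[] y) {
        int n = x.length;
        double sx = 0, sy = 0, sxx = 0, sxy = 0;
        for (int i = 0; i < n; i++) {
            sx += x[i];
            sy += y[i];
            sxx += x[i] * x[i];
            sxy += x[i] * y[i];
        }
        double slope = (n * sxy - sx * sy) / (n * sxx - sx * sx);
        double intercept = (sy - slope * sx) / n;
        return new double[] { slope, intercept };
    }
}
```

A regression over the alumni data would, for example, pass two columns such as GPA and salary as the x and y arrays.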

Page 4: MSE Presentation 3

Overview

[Architecture diagram of the Statistical Analysis Tool: Servlet, XML, EJB (Session Beans and Entity Bean), Database]

Page 5: MSE Presentation 3

Revised Artifacts

Object Model

Page 6: MSE Presentation 3

Revised Artifacts

Object Model

Page 7: MSE Presentation 3

Revised Artifacts

Formal Requirement Specification

The USE model was refined with the changes suggested during Presentation 2.

Page 8: MSE Presentation 3

Components

J2EE Application Server
Enterprise Java Beans
Java Servlets
XML
HTML
Java Applets

Page 9: MSE Presentation 3

Component Design

Servlet

CreateRegression() : double
CreateCorrelation() : double
CreateHypothesis() : double
CreateChiSquare() : double
RegisterUser() : double

RegistrationBean

LinearRegressionBean
CorrelationBean
HypothesisBean

ChisquareBean
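The servlet delegates each computation to the corresponding bean. As an illustrative sketch of what the correlation analysis computes (plain Java with made-up names; the project's CorrelationBean is an EJB), the Pearson coefficient is:

```java
// Illustrative Pearson correlation; not the project's CorrelationBean.
public class CorrelationSketch {
    public static double pearson(double[] x, double[] y) {
        int n = x.length;
        double mx = 0, my = 0;
        for (int i = 0; i < n; i++) { mx += x[i]; my += y[i]; }
        mx /= n;
        my /= n;
        double sxy = 0, sxx = 0, syy = 0;
        for (int i = 0; i < n; i++) {
            sxy += (x[i] - mx) * (y[i] - my);
            sxx += (x[i] - mx) * (x[i] - mx);
            syy += (y[i] - my) * (y[i] - my);
        }
        return sxy / Math.sqrt(sxx * syy); // result lies in [-1, 1]
    }
}
```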

Page 10: MSE Presentation 3

Component Design

Entity Bean

LinearRegression Bean

RegistrationHome

RegistrationBean

Chi Square Bean

Hypothesis Bean

Correlation Bean

RegistrationRemote

name : String
Loginid : String
Gpa : double
Salary : double
passwd : String
Degree : Boolean
Citizen : Boolean
joinIn3Mths : Boolean

getAllUsers() : void
getLoginId() : String
getGpa() : Double
getSalary() : Double
getPassword() : String
getDegree() : Char
getCitizenship() : Boolean
getJoinIn3Mths() : Boolean
FindAll() : Collection
FindUser(String LiD, String Pwd) : Collection
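A plain-Java sketch of the user record these accessors expose (an illustrative POJO with a subset of the fields; the real class is a container-managed entity bean):

```java
// Illustrative POJO mirroring some of the Registration entity bean's fields.
public class UserRecord {
    private final String loginId;
    private final double gpa;
    private final double salary;
    private final boolean citizen;
    private final boolean joinIn3Mths;

    public UserRecord(String loginId, double gpa, double salary,
                      boolean citizen, boolean joinIn3Mths) {
        this.loginId = loginId;
        this.gpa = gpa;
        this.salary = salary;
        this.citizen = citizen;
        this.joinIn3Mths = joinIn3Mths;
    }

    public String getLoginId() { return loginId; }
    public double getGpa() { return gpa; }
    public double getSalary() { return salary; }
    public boolean getCitizenship() { return citizen; }
    public boolean getJoinIn3Mths() { return joinIn3Mths; }
}
```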

Page 11: MSE Presentation 3

Component Design

Session Beans

HypothesisRemote

home : RegistrationHome
Coll : Collection
nMS, nPhD : int
meanMS, meanPhD : double
VarianceMS : double
VariancePhD : double
degrees : int
pooledSD : double
tValue : double
CDF : double
pValue : double

getUsers() : void
getMeanSalaries() : void
getVariance() : void
getDistribution(double t) : void
getPValue(double cdf) : void
getnMS() : int
getnPhD() : int
getDegrees() : int
getPValue() : double
getMeanMS() : double
getMeanPhD() : double
getVarianceMS() : double
getVariancePhd() : double
getpooledSd() : double
getTValue() : double
getCDF() : double
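The bean's fields (pooledSD, tValue, degrees) suggest a pooled-variance two-sample t test comparing the MS and PhD salary samples. Those two steps can be sketched in plain Java (illustrative names and code, not the bean's implementation):

```java
// Illustrative pooled-variance two-sample t statistic (e.g. MS vs. PhD salaries).
public class TTestSketch {
    // Pooled standard deviation of two samples with given sizes and variances.
    public static double pooledSD(int n1, double var1, int n2, double var2) {
        return Math.sqrt(((n1 - 1) * var1 + (n2 - 1) * var2) / (n1 + n2 - 2));
    }

    // t value for the difference of the two sample means.
    public static double tValue(double mean1, int n1, double var1,
                                double mean2, int n2, double var2) {
        double sp = pooledSD(n1, var1, n2, var2);
        return (mean1 - mean2) / (sp * Math.sqrt(1.0 / n1 + 1.0 / n2));
    }
}
```

The p value then comes from the t distribution's CDF with n1 + n2 - 2 degrees of freedom, which matches the bean's degrees, CDF and pValue fields.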

Servlet

RegistrationBean

Page 12: MSE Presentation 3

Testing Evaluation

Registration form

All the inputs to the fields in the form were tested.

Functionality of tests: each test was exercised with test cases, and the tool's output was checked against Excel's results.

Some of the test cases are listed below.

Regression test
Fewer than 3 members
No MS members
No PhD members

Page 13: MSE Presentation 3

Testing Evaluation

Chi-Square test
No citizens
No international students
No person with a job within 3 months of graduation
No person without a job within 3 months of graduation

Hypothesis test
No MS alumni
No PhD alumni

Correlation
No members
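These chi-square edge cases all produce an empty row or column in the contingency table, which makes an expected count zero. A minimal plain-Java sketch of the 2x2 chi-square statistic (illustrative, not the project's ChisquareBean) shows why they must be guarded:

```java
// Illustrative 2x2 chi-square statistic (e.g. citizenship vs. job-in-3-months).
public class ChiSquareSketch {
    public static double statistic(double[][] obs) {
        double[] rows = { obs[0][0] + obs[0][1], obs[1][0] + obs[1][1] };
        double[] cols = { obs[0][0] + obs[1][0], obs[0][1] + obs[1][1] };
        double total = rows[0] + rows[1];
        double chi2 = 0;
        for (int i = 0; i < 2; i++) {
            for (int j = 0; j < 2; j++) {
                double expected = rows[i] * cols[j] / total;
                if (expected == 0) { // empty row or column: the edge cases above
                    throw new IllegalArgumentException("empty row or column");
                }
                chi2 += (obs[i][j] - expected) * (obs[i][j] - expected) / expected;
            }
        }
        return chi2;
    }
}
```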

Page 14: MSE Presentation 3

Testing Evaluation

Testing using JMeter

A stress/performance test was conducted with JMeter, varying the number of simultaneous users accessing the site.

The JMeter results were examined as plotted graphs.

Throughput depends on many factors, such as network bandwidth, network congestion, and the amount of data transferred.

The deviation measures how much the response times vary from their average; for best results it should be as small as possible.

The average is the mean time required to access the questions page.
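These two metrics can be sketched in plain Java (illustrative code; JMeter's reported "Deviation" is, roughly, the standard deviation of the sample times around this average):

```java
// Illustrative average and deviation of response-time samples in milliseconds.
public class LoadMetrics {
    public static double average(double[] samplesMs) {
        double sum = 0;
        for (double s : samplesMs) sum += s;
        return sum / samplesMs.length;
    }

    public static double deviation(double[] samplesMs) {
        double mean = average(samplesMs);
        double ss = 0;
        for (double s : samplesMs) ss += (s - mean) * (s - mean);
        return Math.sqrt(ss / samplesMs.length); // population standard deviation
    }
}
```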

Page 15: MSE Presentation 3

Testing Evaluation

The values seem high because the data is passed to the bean, which performs many calculations on it.

The servlet uses the results to display graphs as applets, along with tabular representations.

Page 16: MSE Presentation 3

Testing Evaluation

After careful consideration, it was judged close to impossible to have more than 30 simultaneous users with no lag between them, so tests were made with 15, 30 and 45 users.

The times required are higher than for normal text-based web sites; overall performance is best with few simultaneous users and deteriorates as the number of users grows.

            10 Users/second  30 Users/second  45 Users/second
            (optimal)        (average)        (worst)
Deviation   248 ms           559 ms           1542 ms
Throughput  755/min          981/min          824/min
Average     709 ms           1619 ms          2998 ms

Page 17: MSE Presentation 3

Testing Evaluation

Testing using Microsoft Application Center Test

Test type: Dynamic
Test duration: 00:00:05:00
Test iterations: 227
Total number of requests: 4,093
Total number of connections: 4,093
Average requests per second: 13.64
Average time to last byte (msecs): 72.39
Number of unique requests made in test: 12
Number of unique response codes: 2

Error counts: HTTP: 0, DNS: 0, Socket: 0
Average bandwidth (bytes/sec): 134,048.33
Number of bytes sent (bytes): 1,434,357
Number of bytes received (bytes): 38,780,141
Average rate of sent bytes (bytes/sec): 4,781.19
Average rate of received bytes (bytes/sec): 129,267.14

Page 18: MSE Presentation 3

Testing Evaluation

Scalability

Database: The Oracle database is highly scalable. The number of users stored does not affect database performance, because the database has only one table of users, which is queried to retrieve them.

Application: Tests with 200 simultaneous users also gave reasonable results: the average time for each user to access the questions page was 5 seconds, with a deviation of 2 seconds.

Portability: Since the J2EE architecture is based on the Java framework, the application can be used across many enterprise platforms.

Robustness: Through client-side scripting and error checking in the middle tier, the application is largely robust against invalid data. It has undergone many iterations of unit testing, culminating in a robust application. Even the worst-case JMeter tests gave reasonable results, showing that the application is highly robust.

Page 19: MSE Presentation 3

Formal Technical Inspection

Inspection of the SRS was conducted by Laksmikanth Ghanti and Divyajyana Nanjaiah.

The inspection found the SRS 99% satisfactory. The minor issues were addressed by adding a section on anticipated future changes to version 2.0 of the SRS and by making provision for additional error messages in the SRS.


Page 20: MSE Presentation 3

User Manual

An installation guide and a detailed walkthrough of the project are provided in the user manual.


Page 21: MSE Presentation 3

Project Evaluation

Project Duration

            Start Time   Finish Time
                         (Expected)   (Actual)
Phase I     03/15/03     -            06/30/03
Phase II    07/01/03     07/28/03     10/27/03
Phase III   10/28/03     11/27/03     12/09/03

Page 22: MSE Presentation 3

Project Evaluation

Phase I Duration (time in minutes)

Research: 990 (26%)
Design: 240 (6%)
Coding: 700 (18%)
Deploying & Testing: 1110 (28%)
Documentation: 840 (22%)

Page 23: MSE Presentation 3

Project Evaluation

Phase II Duration (time in minutes)

Research: 390 (10%)
Design: 1120 (29%)
Coding: 900 (23%)
Deploying & Testing: 810 (21%)
Documentation: 640 (17%)

Page 24: MSE Presentation 3

Project Evaluation

Phase III Duration (time in minutes)

Research: 120 (3%)
Design: 80 (2%)
Coding: 1050 (26%)
Deploying & Testing: 1800 (46%)
Documentation: 920 (23%)

Page 25: MSE Presentation 3

Project Evaluation

Lines of Code

Estimate in first phase: 4636

Actual lines of code:
Entity Java Beans: 1869
Servlet: 1040
XML: 120
Total: 3029

Page 26: MSE Presentation 3

Problems Encountered

Learning curve: J2EE and the deploy tool

The deploy tool does not update files automatically
Not best suited for unit testing or development practices
EJB packaging errors

Alumni data

Page 27: MSE Presentation 3

Lessons Learned

Methodology: the usefulness of methodologies

Reviews: the feedback during reviews was very helpful

Technology: J2EE architecture and deploy tool

Page 28: MSE Presentation 3

Conclusion

The SAT was implemented using the J2EE architecture.

JMeter and Microsoft ACT were used to stress-test the application, and its performance was found to be satisfactory.

The SAT is extensible.

Page 29: MSE Presentation 3

Demonstration