Quality Assessment II
Peter Dolog
dolog [at] cs [dot] aau [dot] dk
2.2.05
Intelligent Web and Information Systems
November 2, 2010
Outline
Design models and testing
Usage analysis
Statistical testing
Peter Dolog, Web Engineering, Quality Assessment
Design Models and Quality Assessment
Metrics for navigation
Navigation maps: size, depth, breadth, compactness
Navigation context: coupling of nodes, intrinsic complexity
Can be checked in navigation design models
For example, if the model is represented in XML, the analysis can be done with XSLT
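The same analysis the slides suggest doing with XSLT can be sketched in a few lines of Python. The `<node>`/`<link>` XML schema below is hypothetical, invented only for illustration; the traversal computes three of the metrics named above (size, depth, breadth) for a navigation map.

```python
# Sketch: computing navigation-map metrics (size, depth, breadth) from a
# navigation design model serialized in XML. The <node>/<link> schema is
# hypothetical; the slides only state that an XML representation can be
# analyzed (e.g. with XSLT) -- the traversal logic is the same either way.
import xml.etree.ElementTree as ET
from collections import deque

MODEL = """<navigation>
  <node id="home"/><node id="products"/><node id="detail"/><node id="cart"/>
  <link from="home" to="products"/><link from="products" to="detail"/>
  <link from="home" to="cart"/><link from="detail" to="cart"/>
</navigation>"""

def navigation_metrics(xml_text, root_id="home"):
    root = ET.fromstring(xml_text)
    nodes = {n.get("id") for n in root.iter("node")}
    links = {}
    for l in root.iter("link"):
        links.setdefault(l.get("from"), []).append(l.get("to"))
    # BFS from the home page: depth = longest shortest path,
    # breadth = widest BFS level.
    depth_of = {root_id: 0}
    queue = deque([root_id])
    while queue:
        cur = queue.popleft()
        for nxt in links.get(cur, []):
            if nxt not in depth_of:
                depth_of[nxt] = depth_of[cur] + 1
                queue.append(nxt)
    levels = {}
    for d in depth_of.values():
        levels[d] = levels.get(d, 0) + 1
    return {"size": len(nodes),
            "depth": max(depth_of.values()),
            "breadth": max(levels.values())}

print(navigation_metrics(MODEL))  # {'size': 4, 'depth': 2, 'breadth': 2}
```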
Correctness of navigation
Syntactic: all the constructs are correct and consistent with respect to a given syntax
Semantic: non-determinism, race conditions, deadlocks
Usability
Consistency: consistent use of patterns in content composition, navigation, and function invocation
Ease of navigation: availability of links, from associations between data, plus links to home and other related upper-level pages to avoid dead ends
Low page density: number of content units on a page
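The "ease of navigation" criterion above can be checked mechanically: a page with no outgoing links is a dead end, and a page no path reaches from home is effectively invisible. A minimal sketch, with a made-up site map:

```python
# Minimal sketch of an "ease of navigation" check: flag dead ends (pages
# with no outgoing links) and pages unreachable from the home page.
# The site map below is an invented example.
def navigation_issues(links, home="home"):
    pages = set(links) | {p for tgts in links.values() for p in tgts}
    dead_ends = sorted(p for p in pages if not links.get(p))
    reachable, stack = {home}, [home]
    while stack:  # depth-first reachability from the home page
        for nxt in links.get(stack.pop(), []):
            if nxt not in reachable:
                reachable.add(nxt)
                stack.append(nxt)
    unreachable = sorted(pages - reachable)
    return dead_ends, unreachable

site = {"home": ["news", "about"], "news": ["home"],
        "about": [], "archive": ["home"]}
print(navigation_issues(site))  # (['about'], ['archive'])
```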
Modify Pattern
© Springer, Web Applications Engineering 2009
Termination Variant
Other design based techniques
See the last lecture:
White box: data flow based
Black box
From last lecture: performance testing
Testing Loop
© Mauro Andreolini, Michele Colajanni, and Riccardo Lancellotti: Web System Reliability and Performance. In Web Engineering, 2006, Springer, pages: 181-218.
Results
Additional response time testing and stressing
White box testing (2)
White Box Testing
Web Usage Analysis
Typical Web Log
Client IP
Timestamp
Method of transaction (GET, POST, …)
URL requested
Protocol version (e.g. HTTP/1.1)
HTTP return code
Size of response returned to the client
Cookie at the client browser
URL of the referrer
Client used to access (e.g. Firefox, Mozilla, IE, …)
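Most of these fields appear in one line of an Apache-style "combined" access log, which can be picked apart with a regular expression. A sketch with a fabricated log line (cookies need a custom LogFormat and are omitted here):

```python
# Sketch: parsing one line of an Apache-style "combined" access log into
# the fields listed above. The sample line is fabricated; cookie fields
# require a custom LogFormat and are not part of the combined format.
import re

LINE = ('192.0.2.1 - - [02/Nov/2010:10:15:32 +0100] '
        '"GET /courses/we/index.html HTTP/1.1" 200 5120 '
        '"http://example.org/start" "Mozilla/5.0 (X11; Linux) Firefox/3.6"')

PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '      # client IP, timestamp
    r'"(?P<method>\S+) (?P<url>\S+) (?P<proto>[^"]+)" '  # request line
    r'(?P<status>\d{3}) (?P<size>\d+|-) '             # return code, size
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"')      # referrer, user agent

entry = PATTERN.match(LINE).groupdict()
print(entry["method"], entry["url"], entry["status"])
# GET /courses/we/index.html 200
```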
Usage Analysis
We can mine the requests for related pages
A typical query would be:
Give me all URLs accessed in one session by users
Rank them according to the number of occurrences of the pattern in the log

Well suited for web sites with less complex, static pages.
If the patterns correlate with the links, then we are fine.
However, for complex applications, a more fine-grained log structure is necessary to assess the usage.
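The "typical query" above can be sketched directly: group requests into per-user sessions (approximated here by IP address and a 30-minute inactivity gap, a common heuristic) and rank the URL sets accessed together by how often they occur. The request tuples are made up.

```python
# Sketch of the query described above: group requests into sessions and
# rank co-accessed URL sets by occurrence count. Sessionization by
# (IP, 30-minute gap) is a heuristic; the input data is invented.
from collections import Counter

requests = [  # (ip, timestamp in minutes, url)
    ("10.0.0.1", 0, "/home"), ("10.0.0.1", 2, "/products"),
    ("10.0.0.1", 90, "/home"), ("10.0.0.1", 95, "/cart"),
    ("10.0.0.2", 5, "/home"), ("10.0.0.2", 7, "/products"),
]

def sessions(reqs, gap=30):
    out, last = {}, {}
    for ip, t, url in sorted(reqs, key=lambda r: (r[0], r[1])):
        if ip not in last or t - last[ip] > gap:
            out.setdefault(ip, []).append([])  # start a new session
        out[ip][-1].append(url)
        last[ip] = t
    return [s for per_ip in out.values() for s in per_ip]

# Rank URL sets accessed together in one session by occurrence count.
counts = Counter(tuple(sorted(set(s))) for s in sessions(requests))
print(counts.most_common(1))  # [(('/home', '/products'), 2)]
```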
Goals
You typically want to find out:
whether your designed links are followed or not
whether your pages are reachable
which of the pages are not used, and why
which data items are used
whether there are any anomalies, e.g. invoking functions after a strange navigation sequence
and so on
Achieving the goals

You are able to do this if you know:
the composition of pages
which data entities and records are in them
how they are processed
which operations and functions are invoked
whether data fulfil their role (central role, access role, interconnection)
how the links are related to the items, and so on
Design Model
Data schema (in MVC: Model)
Hypertext schema (in MVC: Model and partly Controller)
Presentation model (in MVC: View and partly Controller)
…

You can extend the web log with this information
WebML QA approach
© P. Fraternali, P.L. Lanzi, M. Matera, A. Maurino. "Model-Driven Web Usage Analysis for the Evaluation of Web Application Quality". Submitted for publication to JWE, April 2004.
Design Schema Analysis
Performed at design time
Checks for some quality attributes (see the last lectures)
There are various metrics to check for those attributes (see P. Fraternali et al.: WQA: An XSL Framework for Analyzing the Quality of Web Applications. IWWOST 2002 workshop proceedings.)
Web Usage Analysis
Data Access Analysis
Hypertext Access Analysis
Navigation Analysis
Web Usage Mining
Discovers patterns of navigation connections which were not designed

Results in rules:
• XML association rule X => Y
• XML sequential pattern, where rule head and body are bound to a position in the log statement (indicating a temporal relation)
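An association rule X => Y can be mined from sessions by counting support (how often X and Y co-occur) and confidence (an estimate of P(Y | X)). A minimal sketch with invented sessions; real miners (e.g. Apriori) scale the same idea to large itemsets.

```python
# Minimal sketch of mining association rules X => Y from logged sessions:
# pair counts give support, confidence estimates P(Y | X).
# The sessions are invented for illustration.
from itertools import combinations
from collections import Counter

sessions = [{"/home", "/news"}, {"/home", "/news", "/archive"},
            {"/home", "/archive"}, {"/news", "/archive"}]

def rules(sessions, min_conf=0.6):
    item_count, pair_count = Counter(), Counter()
    for s in sessions:
        item_count.update(s)
        pair_count.update(combinations(sorted(s), 2))
    found = []
    for (a, b), n in pair_count.items():
        for head, tail in ((a, b), (b, a)):   # try both rule directions
            conf = n / item_count[head]
            if conf >= min_conf:
                found.append((head, tail, round(conf, 2)))
    return sorted(found)

for head, tail, conf in rules(sessions):
    print(f"{head} => {tail}  (confidence {conf})")
```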
Information Extraction
You can extend your web application logs as well

http://logging.apache.org/log4j/1.2/index.html

To insert log entries about events from the application runtime
Such as a user interacting with a data unit, or part of a page, … when populating it and sending it as a result of a request
Conceptual Log
WebML QA approach
Consistency Analysis
Most Accessed Entities
Missing Links
Statistical Testing
Statistical Testing
Web:
• Huge user population
• Complex architecture
• Renders traditional coverage-based testing less appropriate

Statistical testing:
• Generation of test cases from usage data
• Targeting high-risk areas
• Focuses on a probability distribution from the application domain
Outcome
Test cases are generated randomly
The outcome of the test is used to predict reliability given the usage profile
The amount of resources required is reduced significantly

Usual predictions focus on:
• Reliability
• Time to failure
• Mean time between failures
• All are of a probabilistic nature
Process of Statistical Testing
Construct the statistical testing models, or usage models, based on actual usage scenarios and frequencies.
Use these models for test case construction, selection and execution.
Analyze the test results for reliability assessment and prediction, and for decision-making.
Usage Model
Characterizes the application operation
Graph: nodes are the different states of the usage model, arcs are the transitions between them
A probability distribution may be assigned to the arcs
Operational profiles are used to represent usage models
Helps to guide the allocation of test sequences
Unified Markov Model
Based on operational profiles
States and transitions
Probabilities on transitions: best estimate of actual use

Web applications: hits and goes
Visited pages or collections of related pages: states
Usage patterns/navigation patterns and their likelihood: transitions

Can be structured hierarchically
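A unified Markov model of this kind can be sketched as a transition table whose rows are pages and whose weights estimate actual use; test cases are then random walks through the chain. The pages and probabilities below are illustrative, not taken from any real profile.

```python
# Sketch of a unified Markov model: states are pages, transition
# probabilities estimate actual use, and a test case is a random walk
# through the chain. The states and probabilities are invented.
import random

umm = {  # state -> [(next_state, probability), ...]
    "home":    [("search", 0.7), ("exit", 0.3)],
    "search":  [("results", 0.9), ("exit", 0.1)],
    "results": [("home", 0.4), ("exit", 0.6)],
}

def generate_test_case(start="home", max_len=10, rng=random):
    path = [start]
    while path[-1] != "exit" and len(path) < max_len:
        states, probs = zip(*umm[path[-1]])
        path.append(rng.choices(states, weights=probs)[0])
    return path

random.seed(1)
print(generate_test_case())  # e.g. ['home', 'search', 'results', 'exit']
```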
Example
Hao, J. and Mendes, E. 2006. Usage-based statistical testing of web applications. In Proceedings of the 6th international Conference on Web Engineering (Palo Alto, California, USA, July 11 - 14, 2006). ICWE '06, vol. 263. ACM, New York, NY, 17-24. DOI= http://doi.acm.org/10.1145/1145581.1145585
Testing
Perform exhaustive, complete coverage testing for the top-level model.
Perform selective testing for some lower-level models, which covers frequently visited sub-parts.
Perform more selective testing for the remaining low-level UMMs.
Test case generation
A threshold is set up
Transitions above the threshold contribute to the critical navigation path
Start with a higher threshold; later decrease it
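The threshold scheme can be sketched by pruning the usage model's transitions to those above the threshold and enumerating the paths that survive; lowering the threshold then widens coverage. The model and probabilities are illustrative.

```python
# Sketch of threshold-based test case generation: keep only transitions
# whose usage probability exceeds the threshold, then enumerate the
# resulting critical navigation paths. The model is invented.
umm = {
    "home":    [("search", 0.7), ("help", 0.05), ("exit", 0.25)],
    "search":  [("results", 0.9), ("exit", 0.1)],
    "results": [("exit", 0.8), ("home", 0.2)],
    "help":    [("exit", 1.0)],
}

def critical_paths(state, threshold, path=()):
    path = path + (state,)
    if state == "exit" or state not in umm:
        return [path]
    found = []
    for nxt, p in umm[state]:
        if p > threshold and nxt not in path:  # prune low-probability arcs
            found.extend(critical_paths(nxt, threshold, path))
    return found or [path]  # dead branch: report the partial path

print(critical_paths("home", 0.5))   # high threshold: few, likely paths
print(critical_paths("home", 0.15))  # lower threshold: more paths
```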
Building a UMM
Web logs: access log, error log
Calculating Reliability
One possibility (the Nelson model) is to look at:
the number of errors discovered during testing
the amount of time needed to discover them

Reliability = 1 − f/n

f is the number of failures
n is the number of hits

Mean time between failures = n/f
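The two formulas above reduce to simple arithmetic over log counts. A worked example with hypothetical numbers (10,000 hits, 25 failures):

```python
# The Nelson-style estimate above, computed from hypothetical log counts:
# n total hits (requests) and f observed failures.
def reliability(hits, failures):
    return 1 - failures / hits

def mtbf(hits, failures):
    # mean number of hits between failures
    return hits / failures

n, f = 10000, 25
print(f"R = {reliability(n, f):.4f}, MTBF = {mtbf(n, f):.0f} hits")
# R = 0.9975, MTBF = 400 hits
```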