Appendix: SMART-T Questionnaire
This appendix lists the questions used in the SMART-T process to guide the migration of software testing to the cloud. The questions are structured similarly to the SMIG questionnaire from SMART, but are geared towards software testing and cloud computing. They focus on the following areas: (1) business drivers and technical factors; (2) stakeholders; (3) the legacy test environment and the target test environment; (4) defining candidate test cases for migration; (5) describing the existing test environment; and (6) describing the target test environment in the cloud.
Business Drivers and Technical Factors
Goals and Expectations of Migration:
- What are the business drivers for the migration effort?
- What are the technical drivers for the migration effort?
- Have any prior pilot studies on migrating software testing been done within the organization?
- What are the expectations for the migration effort?
- What are the short-term goals?
- What are the long-term goals?
- What are the perceived advantages and disadvantages of migrating testing to the cloud?

Budget and Schedule:
- What is the timeframe for the migration?
- What is the budget for the migration?
- Who is paying for the migration?
- Is there any potential for additional resources?
- Are there any constraints related to the budget?

Other Efforts:
- Have any other migration efforts been attempted?
- What was the outcome?
- Why did they fail or succeed?
- What are the lessons learned?
Stakeholders
Legacy Test Environment End Users:
- Who are the end users of the legacy testing environment?
- Do all of them belong to the same organization, or to the same group within an organization?

Legacy System Owners:
- Who owns the legacy testing environment?
- If there is more than one owner, are these separate organizations?
- Will the owners be directly involved with the migration?

Legacy System Developers and Maintainers:
- Who is the developer of the legacy test harness?
- Are developers available to support the migration process?
- Is the maintenance group separate from the development group?
- If so, are maintainers available to support the migration process?

People Conducting the Migration:
- Will the current developers or maintainers perform the migration?
- If not, what organization will perform the migration?
- What is the process for bringing them up to speed on the legacy system?
- Will this organization be available during the SMART-T engagement?

Target Environment Owners:
- Is the target cloud environment owned and maintained by a separate organization?
- If so, will representatives be available during the SMART-T engagement and during the migration process?
Legacy Test Environment and Target Test Environment
High-Level Understanding of the Legacy Test Environment:
- What is the main functionality provided by the legacy test environment?
- What is the history of the legacy test environment?
- What is the high-level architecture of the system?
- What portion of the system is envisioned for migration?
- What is the current user interface to the system?
- How complex is the user interface?

Candidate Test Cases for Migration:
- Have potential test cases for migration been identified?
- What was the process used to identify these test cases?
- What applications will be using these tests?
- How do the selected tests/test suites represent the goals of the migration?
- What types of tests are currently in use in the system?
- What types of tests will be migrated?
- What programming language(s) are the tests written in?
- What dependencies exist between the tests/test suites?
- Are the dependencies easy to identify?
- Are the tests readable and easy to understand?
- Are the tests documented?
- Are the test results clear and easy to understand?
- How difficult is test maintenance in the current system?
- What difficulties exist in test maintenance?

High-Level Understanding of the Target Environment:
- What is the main driver for choosing the cloud?
- What are the main components of the target test environment?
- Is it a standard or proprietary environment?
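The questions above about dependencies between tests and test suites matter because hidden dependencies limit how freely tests can be redistributed across cloud execution nodes. The following sketch, written against JUnit 4 with hypothetical class and method names, shows the kind of implicit ordering dependency such an assessment tries to surface:

import static org.junit.Assert.assertEquals;
import org.junit.Test;

// Hypothetical example of a hidden test dependency: the second test only
// passes if the first one has already run in the same JVM. JUnit makes no
// ordering guarantee, and such tests cannot be freely split across nodes.
public class OrderDependentTest {

    // Shared mutable state: an implicit dependency between test cases.
    static int counter = 0;

    @Test
    public void firstIncrementsCounter() {
        counter++;
        assertEquals(1, counter);
    }

    @Test
    public void secondAssumesFirstAlreadyRan() {
        // Fails if executed alone or on a different node than the first test.
        assertEquals(1, counter);
    }
}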
Define Candidate Test Cases for Migration
Characteristics of Test Cases:
- Which application do the test cases belong to?
- How many test cases are there?
- What language are the test cases written in?
- What is the high-level flow of execution for the test cases?
- What tools are needed to execute the test cases?
- Are these tools commercial or home-grown?
- How are the test results presented?

Test Case Creation:
- Who creates the test cases?
- What process is followed in creating the test cases?
- Are manual test cases written before the test cases are automated?
- What portion of the test cases can be converted to automated test cases?

Test Code Analysis:
- What programming languages were used in the development of the system?
- What code documentation is available?
- What coding standards are followed?
- Is the coding standards document available?
- What is the code structure?
- What is the mapping between this structure and the module view of the system?
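As a concrete illustration of the characteristics probed above (language, tooling, readability, and suitability for automation), a minimal sketch of a candidate automated test case follows. It assumes JUnit 4; the names are hypothetical, and the class under test is inlined only so the example stands on its own:

import static org.junit.Assert.assertEquals;
import org.junit.Before;
import org.junit.Test;

public class InvoiceCalculatorTest {

    // Minimal class under test, included here only so the sketch compiles.
    static class InvoiceCalculator {
        private final double taxRate;
        InvoiceCalculator(double taxRate) { this.taxRate = taxRate; }
        double total(double amount) { return amount * (1.0 + taxRate); }
    }

    private InvoiceCalculator calculator;

    @Before
    public void setUp() {
        // No database, file system, or network dependencies: the test is
        // self-contained, which simplifies relocating it to a cloud node.
        calculator = new InvoiceCalculator(0.07); // assumed 7% tax rate
    }

    @Test
    public void totalIncludesTax() {
        assertEquals(107.0, calculator.total(100.0), 0.001);
    }

    @Test
    public void zeroAmountYieldsZeroTotal() {
        assertEquals(0.0, calculator.total(0.0), 0.001);
    }
}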
Describe Existing Test Environment
Functionality:
- What is the main functionality provided by the system?

History:
- What is the history of the system?
- How old is it?
- Who built it?
- Who maintains it?

Platform:
- What is the execution platform?

Test Documentation:
- What system documentation is available?
- How old is the documentation?
- What parts of the system are not documented or have outdated documentation?

Development Environment:
- What is the development environment?

Architecture:
- Is there architecture documentation for the system?
- How old is the architecture documentation?
- Is there support available for all commercial components?
- How will the commercial components adapt to the cloud environment?
- Are there any proprietary interfaces or connectors?
- What are the major modules of the system?
- What is the runtime view of the system?
- How are dependencies handled?

Code:
- What programming languages were used in the development of the system?
- What code documentation is available?
Describe Target Test Environment
Constraints:
- What are the constraints on using the cloud for testing applications in this organization?
- What are the security concerns?
- What potential problems are caused by these constraints (e.g., direct calls to the operating system)?
- Are there constraints on the use of commercial products?
- If there are problems, are there potential replacements?

Ownership:
- Who owns the target cloud environment?
- If it is an external organization, what communication and collaboration currently exist?

Test Execution Platform:
- Where will the tests be executed?
- Will they be hosted/stored in the cloud, or will they be delivered as needed?
- Are there requirements to write deployment procedures?

Cloud Infrastructure:
- What are the major components of the cloud infrastructure to be used for testing?
- Which components are commercial and which will be developed internally?
- Is there documentation available?
- How well specified is the infrastructure?
- Does the cloud environment provide infrastructure services (e.g., communication, security, data storage)?
- How will these services be handled?
- What is the communication model?
- Are there tools to help in this area?
- Are there available libraries and tools for the legacy platform to connect to the infrastructure?
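The questions about deployment procedures and about connecting the legacy platform to the cloud infrastructure are often answered in practice with a small wrapper that each cloud node uses to execute its assigned test classes and report results back to the coordinating harness. The sketch below is one hypothetical form such a wrapper could take, using the standard JUnit 4 JUnitCore API; the class name and reporting format are assumptions, not part of SMART-T:

import org.junit.runner.JUnitCore;
import org.junit.runner.Result;
import org.junit.runner.notification.Failure;

// Hypothetical cloud-side runner: each node receives one or more test class
// names (e.g., from a work queue or the command line), runs them with
// JUnitCore, and prints a summary for the coordinating harness to collect.
public class RemoteTestRunner {

    public static void main(String[] args) throws ClassNotFoundException {
        for (String className : args) {
            Class<?> testClass = Class.forName(className);
            Result result = JUnitCore.runClasses(testClass);

            System.out.printf("%s: %d run, %d failed, %d ms%n",
                    className,
                    result.getRunCount(),
                    result.getFailureCount(),
                    result.getRunTime());

            for (Failure failure : result.getFailures()) {
                System.out.println("  FAILED: " + failure.getTestHeader());
            }
        }
    }
}

A node would then be invoked with something like "java -cp tests.jar:junit.jar RemoteTestRunner com.example.SomeTest"; how the test classes are packaged, shipped to nodes, and how results are transported back depends on the cloud infrastructure chosen.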
About the Authors
Scott Tilley is a faculty member at the Florida Institute of Technology, where he is a Professor of Software Engineering in the Department of Computer Sciences, a Professor of Information Systems in the College of Business, and an Associate Member of the Harris Institute for Assured Information. He is also a Visiting Scientist at Carnegie Mellon University's Software Engineering Institute. His current research is in software testing, cloud computing, and system migration. He is the lead editor of the book Software Testing in the Cloud: Perspectives on an Emerging Discipline (IGI Global, 2012). He writes the weekly "Technology Today" column for the Florida Today newspaper (Gannett). Scott holds a PhD in Computer Science from the University of Victoria.
Tauhida Parveen is an independent consultant and trainer with an emphasis on cloud computing and software testing. She has worked in quality assurance with organizations such as the WikiMedia Foundation, Millennium Engineering & Integration, Yahoo!, Sabre, and Progressive Auto Insurance. She has presented at numerous trade conferences, published in several academic journals, and organized workshops at international events. She is an ISTQB Foundation Level Certified Software Tester (CTFL). She is the co-editor of the book Software Testing in the Cloud: Perspectives on an Emerging Discipline (IGI Global, 2012). Tauhida holds a PhD in Computer Science from the Florida Institute of Technology.