APPENDIX G




Agile Test Summary Report Template


AUT Title: the definitive title of the application under test
Date: the date this document was completed



Distribution
Copy Number Recipient
1. recipient 1
2. recipient 2

3. recipient 3
etc.

N. Project File




Contents
1. Introduction
2. Overview
3. Variances
4. Assessment
5. Results
6. Evaluation
7. Summary of Activities


1 Introduction

1.1 Background
This document provides the Test Summary for the testing activities within iteration Iteration ID for the AUT Title.
This section should contain a brief background and history of this testing project, including a brief description of the purpose of the AUT, definitive version information for the AUT, and any other relevant supporting information.


1.2 Structure of the Report
This report is structured in the following manner:

Section 2, Overview, provides an overview of the significant events and activities documented within the Iteration ID testing of the AUT Title.


Section 3, Variances, records any variances of the artifacts from those agreed on previously, especially in areas that may cause concern to the group accepting the test results, including any references to supporting documentation that covers the reasons for the deviations.
Section 4, Assessment, provides a brief assessment of the comprehensiveness of the testing process for the Iteration ID testing of the AUT Title.
Section 5, Results, provides a summary of the results of the Iteration ID testing of the AUT Title.
Section 6, Evaluation, provides an overall evaluation of the testing process, including any observed problems and/or limitations.
Section 7, Summary of Activities, provides a summary of the major testing activities and events for the Iteration ID testing of the AUT Title.


1.3 References
A list of documents referenced within this test summary report document.


2 Overview
This section provides a high-level overview of the significant events and activities documented during the Iteration ID testing of the AUT Title.
This section also specifies the scope of the testing (what was and what was not tested) and the test environment details (including the hardware, software, and data used in the testing).

For example:
The Iteration 02-Team-Org-Chart-01 Test for the CTS Wiki (v1.0, 2008) was
begun on July 7, 2008, and completed on July 8, 2008. During this testing only the
CTS Org Chart section of the Wiki was considered.
The testing was conducted on an IBM T60 laptop with 4 Gbytes of memory running Microsoft Windows XP Professional V2002 SP2.


3 Variances
This section is used to record any variances of the artifacts from those agreed on previously, especially in areas that may cause concern to the group accepting the test results, including any references to supporting documentation that covers the reasons for the deviations.

For example:
Conditions observed during the course of testing resulted in the design of additional test cases to explore concerns regarding the CTS Wiki Org Chart legibility. A number of additional defects were identified as a result, which were subsequently corrected and successfully retested.


4 Assessment
This section provides a brief assessment of the comprehensiveness of the testing process for the completed testing phase against the test objectives and constraints specified in the test plan document.
Where code coverage measurements have been made, the results should also be included in this section.
This section also identifies any aspects of the AUT that were not tested as thoroughly as planned (due to insufficient time or resources).
For example:
All test cases were executed with the exception of Test Cases 02-Team-Org-
Chart-04 and 02-Team-Org-Chart-05 (testing the CTS Wiki Org Chart Navigation),
which were omitted due to challenging testing timescales combined with the need
to further investigate concerns associated with the CTS Wiki Org Chart legibility.


5 Results
This section provides a summary of the results of the Iteration ID testing of the AUT Title, identifies all resolved issues, summarizes the details of their resolution, and lists any outstanding issues.
For example:
Test Case 02-Team-Org-Chart-03 revealed problems with the legibility of the CTS Wiki Org Chart (and, in particular, potential issues for red–green perceptually challenged users). The developers corrected the observed defects and the amended
code passed the retest. However, it was not possible within the timescales of the
testing task to regression-test the associated navigation requirements. This issue
remains outstanding and should be observed following installation and use in the
live environment.


6 Evaluation
This section provides an overall evaluation of the testing process including problems
and limitations.
For example:
The Org Chart section of the CTS Wiki (v1.0, 2008) underwent comprehensive
testing, with only two defects being observed.
Additional test cases were designed and executed to explore the CTS Wiki Org
Chart legibility, and following correction and retesting it is believed that legibility
will be acceptable in use.
Since a number of problems were observed with navigation, and only perfunctory
retesting was possible due to challenging testing timescales, it is thought that there
will be a medium likelihood of navigation failure in use.


7 Summary of Activities
This section provides a summary of the major testing activities and events for the Iteration ID testing of the AUT Title. This section also summarizes testing resource information, such as total staffing levels and total testing time.

For example:
Test Start Date: July 7, 2008
Test End Date: July 8, 2008

Item                             Planned   Actual

Staff Levels                     3         2
Test Design Effort               1.5       2.0
Test Execution Effort            2         2
Re-Test Effort                   0.5       0.75
Test Management and Reporting    0.5       0.5
Etc.
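Where teams record planned-versus-actual effort figures like those above, a percentage-variance column for the summary table can be derived mechanically. The following is a minimal illustrative sketch (the activity names and figures are taken from the example above; the `variance_report` helper is hypothetical and not part of the template):

```python
# Hypothetical helper: compute planned-vs-actual variance rows for the
# Summary of Activities table. Names and figures are illustrative only.
activities = {
    "Staff Levels": (3, 2),
    "Test Design Effort": (1.5, 2.0),
    "Test Execution Effort": (2, 2),
    "Re-Test Effort": (0.5, 0.75),
    "Test Management and Reporting": (0.5, 0.5),
}

def variance_report(items):
    """Return (item, planned, actual, % variance) rows for the report."""
    rows = []
    for item, (planned, actual) in items.items():
        pct = (actual - planned) / planned * 100  # positive = overspend
        rows.append((item, planned, actual, round(pct, 1)))
    return rows

for item, planned, actual, pct in variance_report(activities):
    print(f"{item:32} {planned:>7} {actual:>7} {pct:>+8.1f}%")
```

A table generated this way makes overruns (such as the test design and retest effort in the example) immediately visible to the group accepting the test results.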
APPENDIX H




My Agile Process Checklist




H.1 Introduction
To assist you in the selection of agile practices to populate your own agile process,
this appendix contains a checklist of agile practices that show graphically which
practices are particularly appropriate for different styles and sizes of agile project. This
information can be used as a visual summary and aide memoire of the information
provided in Chapter 25.
To help you understand the format and contents of the checklist, refer to Fig-
ure H.1.



Figure H.1 Explanation of the format and content of the agile practices checklist. The figure shows a sample extract of the checklist (the Iteration, Pair Testing, and Early Involvement of Test Resources practices) with columns for the Foundation, Small, Medium, Large, and Offshore agile project styles, annotated as follows:

Foundation: this practice is not applicable as a foundation practice
Small: use of this practice should be considered with care on a small project
Medium: use of this practice should be strongly considered on a medium project
Large: this practice should be very strongly considered for use on a large project
Offshore: this practice has particular challenges on offshore projects


Foundation Small Medium Large Offshore

Agile Development and Testing Practices

Iteration

Pair Testing

Early Involvement of Test Resources

Test-Driven Design

All Defects Fixed Immediately

Continuous Integration

Test Refactoring

Identify Targets of Test

Code Coverage Metrics

Rapid Prototyping

Collective Code & Test Ownership

Every Day Is Test Day

Agile Exploratory Testing

Agile Process and Project Management Practices

Co-Location of Project Stakeholders

Be Sensitive to the Reuse of Traditional Test Techniques

Well-Bounded Iterations

Progress Measurement

Progress Visualization

Availability of Agile Process Guidance

Metrics

Process Improvement

Everyone Is a Software Engineer

Agile Estimation

Agile Requirements Management Practices

Use Cases / User Stories


Reduce Requirements Documentation Overload

Look Beyond Requirements to Customer Needs

Ensure All Requirements Have Tests

Agile Meetings and Communications Practices
Improve Interpersonal Communications

Agile Project Start-up Meeting

Agile Iteration Start-up Meeting

Daily Stand-up Meeting

Interim Iteration Meeting

Agile Retrospectives

Agile Workshop Meetings

Agile Closedown Meeting

Agile Automation Practices
Tool Selection Process

Static Analysis Tools

Automated Unit Test

Test Harness Tools

Functional Test Tools

Requirements Management Tool

Build Management Tools

Automated Configuration Management

Change Management Tools

Defect Tracking Tools

Process Enactment Tools
References




1. National Institute of Standards and Technology (NIST), U.S. Dept. of Commerce, The Economic Impacts of Inadequate Infrastructure for Software Testing, May 2002.
2. Miller, P., An SEI Process Improvement Path to Software Quality, Presented at Quality of Information and Communications Technology, 2007.
3. Object Management Group, Software Process Engineering Metamodel (SPEM) 2.0 Request for Proposal, May 2005.
4. Watkins, J., Testing IT: An Off-the-Shelf Software Testing Process, Cambridge University Press, 2001.
5. Royce, W., Managing the Development of Large Software Systems, Proceedings of IEEE WESCON, August 1970.
6. Boehm, B., A Spiral Model of Software Development and Enhancement, IEEE Computer, May 1988.
7. Kruchten, P., The Rational Unified Process, 2nd ed., Reading, MA: Addison-Wesley, 2000.
8. Jacobson, I., et al., Object-Oriented Software Engineering: A Use Case Driven Approach, Wokingham, UK: Addison-Wesley, 1992.
9. Rentsch, T., Object-Oriented Programming, SIGPLAN Notices, Vol. 17, No. 18, p. 9, September 1982.
10. Carroll, J., and Long, D., Theory of Finite Automata with an Introduction to Formal Languages, Englewood Cliffs, NJ: Prentice Hall, 1989.
11. Booch, G., Object-Oriented Analysis and Design with Applications, 2nd ed., Redwood City, CA, 1994.
12. Rumbaugh, J., et al., Object-Oriented Modelling and Design, Englewood Cliffs, NJ: Prentice Hall, 1991.
13. http://www.stickyminds.com/sitewide.asp?Function=edetail&ObjectType=ART&ObjectId=3572&tth=DYN&tt=siteemail&iDyn=2
14. IBM Rational Purify Plus Documentation, International Business Machines, 2008.
15. Graham, D., and Fewster, M., Automating Software Testing, Reading, MA: Addison-Wesley, 1999.
16. Martin, J., RAD, Rapid Application Development, New York: MacMillan, 1990.
17. http://blogs.msdn.com/jmeier/archive/2007/10/21/how-to-use-time-boxing-for-getting-results.aspx





18. Benyon-Davies, P., et al., Rapid Application Development (RAD): An Empirical Review, European Journal of Information Systems, Vol. 8, 1999.
19. Crinnion, J., Exploitation of the 4GL, Software Development '92 – Management Track, Blenheim Online Publications, London, 1992.
20. Beck, K., Extreme Programming Explained: Embrace Change, Addison-Wesley, 2000.
21. DSDM Consortium, The DSDM Framework, v1.0, 1995.
22. CCTA, PRINCE 2: Project Management for Business, 4th ed., 1996.
23. Takeuchi, H., and Nonaka, I., The New New Product Development Game, Harvard Business Review, Jan.–Feb. 1986.
24. DeGrace, P., and Stahl, L. H., Wicked Problems, Righteous Solutions: A Catalogue of Modern Software Engineering Paradigms, Prentice-Hall, 1990.
25. Sutherland, J., and Schwaber, K., Agile Development Lessons Learnt from the First Scrum, OOPSLA '96, Austin, TX, 1996.
26. Schwaber, K., and Beedle, M., Agile Software Development with SCRUM, Pearson Education, 2008.
27. http://agile.csc.ncsu.edu/
28. Larman, C., Agile and Iterative Development: A Manager's Guide, Addison-Wesley Professional, 2003.
29. Fernandez, O., The Rails Way, Addison-Wesley, 2007.
30. Thomas, D., et al., Agile Web Development with Rails, 2nd ed., Pragmatic Bookshelf, 2006.
31. Greenfield, J., et al., Software Factories: Assembling Applications with Patterns, Models, Frameworks, and Tools, Wiley, 2004.
32. Gilb, K., Evolutionary Project Management and Product Development, prepublication draft, May 2007. http://www.gilb.com/
33. Booch, G., et al., UML Users Guide, Addison Wesley Longman, 1998.
34. www.ivarjacobson.com/products/essup.cfm
35. Garcia, S., and Turner, R., CMMI(R) Survival Guide: Just Enough Process Improvement (The SEI Series in Software Engineering), Addison-Wesley Professional, 2006.
36. http://www.wikipedia.org/
37. Paulk, M. C., et al., Key Practices of the Capability Maturity Model – V1.1, Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.
38. Bittner, K., and Spence, I., Use Case Modelling, Addison-Wesley, 2002.
39. Reeser, T., et al., Citrix XenApp Platinum Edition for Windows: The Official Guide, McGraw-Hill Osborne, 2009.
40. Lowe, W., VMware Infrastructure 3 for Dummies, John Wiley & Sons, 2008.
41. Lipp, P., et al., Trusted Computing Challenges and Applications, Springer, 2008.
42. Graham, D., Evans, I., and Van Veenendaal, E., Foundations of Software Testing: ISTQB Certification, Cengage, 2008.
43. http://www.nunit.org
44. http://www.artinsoft.com/
45. http://en.wikipedia.org/wiki/Adapter_pattern
46. http://en.wikipedia.org/wiki/Sed
47. http://nant.sourceforge.net/
48. Morrison, M., Sams Teach Yourself XML in 24 Hours: Complete Starter Kit (Sams Teach Yourself), SAMS, 2005.
49. http://xjb.codehaus.org/


50. Beck, K., JUnit Pocket Guide, O'Reilly Media, Inc., 2004; http://www.easymock.org/
51. http://www.sevenmock.org/
52. Buckley, C., et al., Implementing IBM Rational ClearQuest, IBM Press, 2006.
53. Bach, J., Exploratory Testing, http://www.satisfice.com/articles.shtml
54. Gloger, B., The Six Step Approach to Perform Heartbeat Retrospectives, http://www.glogerconsulting.de/downloads/Gloger-heartbeat-retros-V11.pdf, 2006.
55. http://www.panopticode.org
56. Fowler, M., et al., Refactoring: Improving the Design of Existing Code, Addison-Wesley Professional, June 1999.
57. Ambler, S. W., Vermeulen, A., Bumgardner, G., and Metz, E., The Elements of Java Style, Cambridge University Press, 2000.
58. Holzner, S., Eclipse: A Java Developer's Guide, O'Reilly Media, 2004.
59. Dick, J., Hull, E., and Jackson, K., Requirements Engineering, 2nd ed., London: Springer-Verlag, 2004.
60. Wiegand, J., and Gamma, E., The Eclipse Way: Processes That Adapt, Presented at EclipseCon, 2005.
61. Zielczynski, P., Requirements Management Using IBM Rational RequisitePro, IBM Press, 2007.
62. IBM Corporation, IBM Rational Functional Tester Documentation v7.0, 2007.
63. Gamma, E., Collaborative Software Development – Developing Software Like a Band Plays Jazz, Presented at EclipseCon, 2008.
64. Robbins, S., and Bell, M., Second Life for Dummies, John Wiley & Sons, 2008.
65. Meatzel, K.-U., What If Your Tools Knew Your Team? – Bringing Processes to Life, Presented at EclipseCon, 2008.
66. http://www.iconixsw.com
67. http://www.bloomberg.com
68. Nissen, C. F., Passing Your ITIL Foundation Exam, TSO – The Stationery Office, 2007.
69. http://emma.sourceforge.net/
70. http://portal.acm.org/citation.cfm?id=1158776
71. Wagner, A., The Transactional Manager: How to Solve People Problems with Transactional Analysis, Denver: T.A. Communications, 1981.
72. Evans, I., Achieving Software Quality Through Teamwork, Artech, 2004.
73. Brooks, F. P., The Mythical Man-Month and Other Essays on Software Engineering, 2nd ed., Addison-Wesley, 1995.
74. Busco, C., et al., When Crisis Arises and the Need for Change Confronts Individuals: Trust for Accounting and Accounting for Trust, http://www.cimaglobal.com/downloads/research_enroac_busco.pdf, November 2003.
75. de Bono, E., Six Thinking Hats, London: Penguin Books, 1999.
76. Pas, J., Emotional Intelligence as the Key to Software Quality, EuroSTAR, Stockholm, 2001.
77. de Bono, E., Edward de Bono's Web, http://www.edwdebono.com/, October 2003.
78. TQMI, Problem Solving – Tools and Techniques, TQMI, 2001.
79. Robson, M., Problem Solving in Groups, Gower, 1995.
80. Belbin, R. M., Management Teams – Why They Succeed or Fail, Butterworth Heinemann, 1981.
81. Belbin, R. M., Team Roles at Work, Butterworth Heinemann, 1995.
82. Belbin Associates, Belbin Team Roles, http://www.belbin.com/belbin-team-roles.htm, October 2003.
83. Kroeger, O., Thuesen, J. M., and Rutledge, H., Type Talk at Work: How the 16 Personality Types Determine Your Success on the Job, Bantam Doubleday Dell Publishing Group, 2002.
84. Team Technology Web site, Working Out Your Myers Briggs Type, http://www.teamtechnology.co.uk/tt/t-articl/mb-simpl.htm, October 2003.
85. Team Technology Web site, The Mother of Strategic Systems Issues: Personality, http://www.teamtechnology.co.uk/tt/t-articl/news1.htm, October 2003.
86. Kirton, M. J., Adaptors and Innovators Defined, see KAI Web site, http://www.kaicentre.com/, July 2003.
87. Honey, P., Learning Styles, http://www.peterhoney.co.uk/product/learningstyles, October 2003.
88. White, B., Software Configuration Management Strategies and Rational ClearCase: A Practical Introduction (Object Technology), Addison-Wesley, 2000.
89. Bell, M., Service-Oriented Modeling (SOA): Service Analysis, Design, and Architecture, John Wiley & Sons, 2008.
90. http://en.wikipedia.org/wiki/System_of_systems
91. From the RUP for Large Projects Rational Method Composer Documentation, version 7.2, IBM Rational.
92. Young, R., Effective Requirements Practices, Boston: Addison-Wesley, 2001.
93. http://www.ibm.com/developerworks/rational/library/05/510_svc/
94. http://www.ambysoft.com/surveys/agileFebruary2008.html
95. Lee, K. A., IBM Rational ClearCase, Ant, and CruiseControl: The Java Developer's Guide to Accelerating and Automating the Build Process, IBM Press, 2006.
96. Preece, J., et al., Human Computer Interaction, Addison-Wesley, 1994.
97. Schein, E., Organizational Psychology, 2nd ed., Prentice-Hall, 1970.
98. Standish Group, Chaos Report, Boston: The Standish Group, 1995.
99. Derby, E., et al., Agile Retrospectives: Making Good Teams Great, Pragmatic Bookshelf, 2006.
100. MacDonah, M., Implementing EssUP within Rational Team Concert, Proceedings of the Rational Software Developer Conference, U.K., 2008.
101. IBM Corporation, IBM Rational Team Concert Documentation v7.0, 2008.
102. http://www.pmcomplete.com/BPM/HTML/bpm2f1v.asp
103. Watt, D., E-Business Implementation, Butterworth-Heinemann, 2002.
104. http://www.scrumalliance.org/articles/46-advice-on-conducting-the-scrum-of-scrums-meeting
105. Andersen, P. B., A Theory of Computer Semiotics: Semiotic Approaches to Construction and Assessment of Computer Systems, Cambridge University Press, 1997.
106. Belin, D., and Simone, S. S., The CRC Card Book, Addison-Wesley, 1997.
107. http://www.extremeprogramming.org/index.html
(Old Ref 29) Object Management Group, Software & Systems Process Engineering Meta-Model, v2.0, April 2008.
(Old Ref 80) Ahern, D. M., CMMI Distilled: A Practical Introduction to Integrated Process Improvement (SEI Series in Software Engineering), Addison-Wesley, 2008.
(Old Ref 107) Beck, K., Test-Driven Development by Example, Pearson, 2004.
Index




Alternative flows, 168
.NET, 88
Ambassador user, in DSDM, 276
15-minute rule, 236
Analysis model, in Objectory, 12
7irene, 162
ArtinSoft, 69
80/20 rule, 236
Automated build, 197, 217, 218, 233, 239, 240, 246
Acceptance testing, 14, 107, 154, 167, 265
Automated build and testing, 217
Acceptance testing scenarios, 107
Automated build process, 233
Acceptance testing, in XP, 264, 265
Automated build tool, 197
Actors, 170
Automated continuous build, 197
Adoption of your agile testing process, 251
Automated development tools, in RAD, 261
Advisor user, in DSDM, 276
Automated functional test, 73, 246
Agile champion, 253, 254, 255
Automated integration testing, 73
Agile closedown meeting, 236
Automated integration tests, 75
Agile distributed development tool, 112
Automated performance testing, 73
Agile estimation, 227
Automated regression testing, 75
Agile exploratory testing, 227
Automated regression tests, 90
Agile iteration start-up meetings, 229
Automated software testing, 159
Agile meetings, 106, 211, 212
Automated test, 70
Agile pilot project, 251, 253, 254
Automated test script, 245
Agile process guidance, 234, 242, 243, 253
Agile project closedown meetings, 216, 232, 236, 244
Automated testing, 67, 72, 130, 197, 237
Automated testing tools, 37, 42, 154, 230
Automated unit test(s), 72, 75, 133, 218, 230, 246
Agile project start-up meetings, 229
Automated unit test coverage, 136
Agile proof of concept, 251, 253
Automated unit test tools, 218
Agile requirements management, 194, 207, 225
Automated unit testing, 45, 73, 97, 107, 197
Agile retrospectives, 130, 156, 215, 228, 230, 231, 232, 234, 255
Automated unit test coverage, 136
Automation infrastructure, 84
Agile special interest group, 255
Agile specifications, 164
Basecamp, 24
Agile templates, 52, 204
BCS, 86, 193
Agile test result record form template, 292
Beck, Kent, 20, 161
Agile test summary report template, 300
Beedle, Mike, 23, 24
Agile testing method, 180
Belbin Team Roles, 146
Agile training and mentoring, 253, 256
Best-of-breed life cycle, 97
Agile workshop meetings, 235, 244
Black box style tests, 264
AgileSOA, 161, 166
Black box, testing, 14
AIX, 82, 133
Block interfaces, 12
Allott, Stephen K., 86



Client Advisory Board, 128
Block models, 12
CMM, 19, 118, 200, 206
Block speci¬cations, 12
CMMI, 228
Bloomberg, 116
Code analysis tools, 99, 250
Boehm, Barry, 9, 19
Code complexity, 98, 197, 203, 233
Booch, Grady, 12, 26
Code complexity metrics, 228
Boundary analysis, 15
Code complexity number (CCN), 98
BPEL, 165
Code complexity tools, 250
British Computer Society, 86, 193
Brooks, Fred, 142
Bug fixes, in DSDM, 276
Code coverage, 75, 98, 125, 133, 134, 136, 138, 181, 197, 199, 203, 209, 228, 233, 239, 240, 242, 243, 246, 303
Build management, 226
Code coverage metrics, 199, 233, 240
Build management process, 250
Build management tools, 218, 221, 237, 246, 250
Code coverage tools, 138
Code integration, in XP, 269
Code refactoring, 154, 197
Build process, 221
Coding standards, 182
Build Server, 126
Coding standards, in XP, 268
Build Verification Test, 82, 134
Coding, in XP, 268
Burn rates, 99
Collective code ownership, 123, 125, 227
Burndown, 83, 283
Collective code ownership, in XP, 264, 269
Burndown chart, 116, 186, 201, 241
Collective test ownership, 227
Burndown chart, in Scrum, 283
Commercial off-the-shelf, 73
Bus-count, 266
Company coding standards, 217, 245
Business architects, 277
Completer-¬nishers, 146
Business area definition, in DSDM, 275
Complexity metrics, 246
Business domain model, 174
Component testing, 13
Business driven, analysis, 74
Computer-aided design, 115
Business Process Execution Language, 165
Computer-aided software testing, 72
Business process model, 170, 174
Business risk, 117
Business study, in DSDM, 275
Configuration management tools, 197, 221, 227, 230
Configuration management, in DSDM, 278
BVT, 82
Confirmit AS, 31
Confirmit, 31
CAD, 115
Consulting experts, in Scrum, 281
Capability Maturity Model, 31, 119, 200, 206
Continual integration, 132
Capability Maturity Model Integration,
Continuous build, 221
Capture–replay testing tools, 56, 73, 108, 195, 219, 226, 245
Continuous build server, 75
Cassidy, Colin, 72
CAST, 72
Continuous integration, 34, 45, 46, 47, 68, 69, 70, 124, 125, 130, 133, 154, 158, 197, 204, 209, 218, 221, 226, 228, 230, 231, 233, 237, 239, 246, 250
Cause-and-effect diagrams, 144
Continuous integration build, 93, 95
CCN, 98
Continuous integration tool, 126
Chana, Dass, 151
Continuous integration, and test, 107
Change management solutions, 12, 74, 80, 85, 221, 238, 246, 247, 256, 262, 271, 278
COTS, 73
CRC, 267
Change management tools, 221, 238, 247
CRC cards for design sessions, in XP, 267
Change request tool, 208
Cross-geography operation, 250
Checklist of agile practices, 305
Cumulative Test Analysis, 135
Chicken roles, in Scrum, 280
CIB, 93
Daily stand-up meeting, 230
CIB tool dashboard, 98
Daily stand-up meetings, in XP, 266
Citrix, 49, 50, 88
Dashboard, in Scrum, 283
Class, Responsibilities, and Collaboration cards (CRC), 267
de Bono, Edward, 143, 211
Defect count, 178
Clean room environment, 260


Evo, 31, 215
Defect detection rates, 228, 231
Evo's Key Process Principles, 206
Defect distribution rates, 228
Evolutionary Development, 31
Defect tracking solution, 238, 246, 247
Evolutionary Project Management, 25
Defect tracking tools, 238, 247
Executive sponsor, in DSDM, 276
Definition of an agile offshored approach, 105
Experimentus Ltd, 62
Definition of an agile testing process, 33, 39, 45, 50, 55, 63, 68, 82, 94, 124, 133, 166, 185
Exploratory testing, 83, 90, 127
Extensible Markup Language, 73
Definition of the agile development and testing process, 152
External systems model, 170, 175
DeGrace, 23
Extreme Programming, 20, 44, 89, 93, 107, 115, 122, 124, 161, 263
Deliverable, in Scrum, 281
Delivering effective and efficient agile testing, 251
Facilitator, in DSDM, 277
Deloitte, 184
FAST methodology, 87
Denning, Nick, 53, 251
Fat client, 67
Dependency injection, 78
Fat server, 67
Derivatives trading system, 53
Feasibility prototype, in DSDM, 274
Design prototypes, in DSDM, 275
Feasibility report, in DSDM, 274
Design team, in AgileSOA, 166, 172
Feasibility study, in DSDM, 274
Developer testing, 13, 132, 133
Financial risk management, 53
Developer, in DSDM, 276
FIT, 127
Development plan, in DSDM, 275
Fitness for business purpose, in DSDM, 270
Diegesis Limited, 53, 251
Fitnesse, 127
Distributed build management tool, 250
Formal tool selection, 236
Distributed electronic data, 228
Foundation agile best practices, 225
Distributed process enactment tools, 250
Fowler, Martin, 267
Documented business requirements, in RAD, 260
Fragile, 88, 89
Domain model, 170
Framework for Integrated Testing, 127
Domain model, in Objectory, 11
FTSE, 62
DSDM, 216, 270
Functional extensions, 164
DSDM phases and stages, 274
Functional model, in DSDM, 275
Dynamic information, 94
Functional prototypes, in DSDM, 275
Dynamic Systems Development Method, 21
Functional testing tools, 134, 233, 237, 245, 247, 250
Early and frequent testing, in XP, 263
Functional tests, 107, 195
EasyMock, 76, 78, 219
Eclipse, 82, 104, 107, 181
Gallagher, Brian, 19
Eclipse test and performance tools project, 49
Gamma, Erich, 129
Eclipse Way, 107
Geographically distributed sites, 113
EJB, 75
Gilb, Kai, 25
ElectroMind, 86
Gilb, Tom, xiii, 25, 31, 32, 191, 200
Electronic REPO Market, 116
Globalization Verification Testing, 83
EMMA, 134, 181, 199
GVT, 83
Emotional intelligence, 143
End of iteration retrospective meeting, 235, 236, 244
Hammurapi, 181
Hansson, David Heinemeier, 24
Enhancements, in DSDM, 276
Harmonize offshore time zones, 249
Enterprise JavaBean, 75
HBOS, 93
Enterprise Service Bus, 132
Heartbeat retrospectives, 94, 228, 230
Equivalence partitioning, 15
Heat-maps, 61
Error guessing, 227
Hendrickson, Elisabeth, 150
Essential Unified Process, 27, 39, 204
Hierarchical power structures, 140
EssUP, 27, 220, 222
Hodgkinson, Greg, 160
Evans, David, 122
House coding style, 217
Evans, Isabel, 139, 212


ITIL, 118
HP-UX, 133
Ivar Jacobson Consulting Ltd, 37
HTML, 127
Hursley, 81
Jacobson, Ivar, 11, 26, 43
Hyades, 49
JAD Workshops, 260, 261
Java, 72, 82, 104, 107, 133, 180
IBM, 132
Java Enterprise Edition Application Server, 133
IBM i series, 133
Java mock object frameworks, 72
IBM Software Group, 81
Joint Application Design (JAD), 259, 260
ICONIX, 115
Joint Application Design Workshops, 213, 216, 235, 244
IDE, 77
Identify targets of test, 70, 195, 198, 233, 239
Joint Application Development Workshops, 260
Identify targets of test practice, 246
Joint code ownership, 197
Idiomatic phrases, 211
JRun, 65
Implementation plan, in DSDM, 275
JUnit, 76, 107, 110, 119, 138, 181, 264
Implementation team, in AgileSOA, 166, 172
Improvix Ltd, 66
Kingston, Peter, 103
Increment of shippable software, 281
Knowles, Howard, 66
Increment of shippable software, in Scrum, 283
Increment review document, in DSDM, 275
Large-sized agile projects, 225
Incremental prototypes, in RAD, 260, 262
Laser-guided testing, 132, 198
Incremental Prototyping, RAD, 259
Lava lamps, 126
Information Age award, 128
Legacy systems, 74
Information Systems Examination Board, 87
Life assurance company, 62
Information Technology Infrastructure Library, the, 118
Linux, 82, 88, 133
Lotus Notes, 153
Ingres, 53
Lunch and learn sessions, 255
Instant messaging, 112, 151, 153, 157, 202, 211, 243, 249
Maintenance and update of your testing process, 251
Integrated development environment, 77
Integration test, 14, 185
Maintenance, in DSDM, 276
Intel, 133
Management sponsor, 254
Intellect Testing Special Interest Group, 193
Maner, Walter, 259
Intensive RAD project, 260
Manual exploratory testing, 199
Interim iteration meetings, 231, 235, 244
Manual tester packs, 153
International Software Testing Qualifications Board, 62, 87
Manual testing, 70, 71, 114, 124, 219, 220, 237, 246
Internet telephony, 112, 202, 243, 249
Martin, James, 19
ISEB, 87
May, Peter, 184
Ishikawa diagrams, 144
Medium-sized agile projects, 225
Ishikawa fishbones, 143, 144, 212
Metrics, 99, 231, 234, 248
ISTQB, 87, 103
Metrics collection, 247
Iteration, 195
Metrics program, 205, 220, 243
Iteration plan, 107
Metrics scheme, 231, 234, 242
Iteration planning, 95
Middleware, 81
Iteration planning meeting, in XP, 265
Mills, Simon, 7
Iteration planning meetings, 213
Minor releases, 94
Iteration planning, in XP, 266
Mitigation strategies, in Scrum, 279
Iteration start-up meetings, 234
Mock framework, 218
Iteration velocity, 99
Mock objects, 76, 125, 218
Iterative approach, 105
Module testing, 14
Iterative development, 94, 164, 225
MoSCoW Principle, in DSDM, 277
Iterative development, in XP, 266
MQSeries, 132
Iterative essentials, 39
Myers Briggs Type Indicator, 146
Iterative models, 10


Portal solutions, 242
nAnt, 70
Portlet engine, 165
NASA, 93
Portlets, 165
National Institute of Standards and Technology, 1
Power Builder, 104
Nine fundamental principles, of DSDM, 272
Practice summary sheets, 222
NIST, 309
Preproject phase, in DSDM, 274
Non-co-located teams, 211
PRINCE, 241, 259
Nonaka, 23
PRINCE2, 22, 59, 200, 213, 234
Nonfunctional requirements list, in DSDM, 275
Procedural programming, 8
Normalizing estimation data, 201
Process, 2, 7
Norman, Michael G., 49
Process advice, 247
Nowakowska, Joanna, 180
Process enactment tools, 113, 203, 205, 211, 217, 222, 238, 242, 247, 248, 250
NUnit, 69, 125
Process engineer, 161
Object Management Group, the, 27
Process framework, 161
Object modeling technique, 12
Process guidance, 221, 247, 248
Object orientation, 8
Process guidance online, 234
Objectory method, 10
Process guidance tool, 250
Obtaining customer feedback, 199
Process improvement, 2, 26, 27, 32, 36, 62, 86, 90, 107, 200, 205, 206, 216, 224, 228, 234, 235, 242, 256, 282
offshore, 247
offshore assets, 248
offshore representative, 248
Process improvement program, 205
offshore team, 248
Process summary cards, 204, 234
offshored, 225, 247, 250
Product backlog, 186, 202, 281
offshored agile project, 202, 215
Product backlog, in Scrum, 282
offshored projects, 246, 248, 250, 255
Product burndown chart, in Scrum, 283
offshored staff, 220
Product owner, in Scrum, 280
offshoring, 103
Progress visualization, 203, 228, 241, 248
offsite, 225, 247, 248, 250, 255
Project burndown charts, 203, 228
OMG, 27
Project closedown meeting, 244
OMT
Project communications infrastructure, 249
Online process guidance, 222
Project life-cycle phase, in DSDM, 274
OOAD method, 12
Project manager, in DSDM, 271, 276
Open source, 93
Project portal, 241, 242, 243, 248
Open University, 193
Project risk, 9
Options, financial instrument, 54
Project risks, in Scrum, 279
Oracle, 104
Project steering committee, 200
Order management system, 73
Project velocity, 99
Outline plan, in DSDM, 274
Project Velocity, in XP, 266
PRojects IN Controlled Environments, 22
Pair developers and testers, 42
Prolifics, 72, 160
Pair development, 153
Prototypes, 9, 174, 199, 234, 240
Pair programming, 42, 45, 47, 74, 107, 123, 155,
Prototyping, in DSDM, 277
156, 195, 203, 207
Pair programming, in XP, 266, 269
Quality assurance manager, 256
Pair testing, 108, 109, 153, 156, 196, 204, 207,
Quality assurance police, 195
227, 231, 233, 239
Quality assurance, in XP, 263
Panopticode scan, 97, 98
Quality management metrics, 227
Performance tests, 167
Quality managers, 277
Phased RAD project, 260
Quality metrics dashboard, 123
Phillips, Martin, 81
Pig and chicken roles, 23, 214, 279
RAD, 19, 86, 216, 235, 244
Pig roles, in Scrum, 280
Rails, 24
Planning, in XP, 265
Rapid Application Development, 18, 19, 86, 259
Plants, 146
Index
318
Rapid prototyping, 153, 158, 199, 231, 234, 240
Rapid prototyping, in RAD, 259
Rapidity of development, RAD, 259
Rational functional tester, 154
Rational Objectory Process, 12
Rational Unified Process, 12, 26, 62, 74, 152, 161, 204
RDBMS, 53
Ready reference sheets, 242
Real-time visualization, 226
Record/playback, testing, 73
Red and green lava lamps, 197
Refactoring, 47, 77, 94, 165, 269
Refactoring, in XP, 267
Regression testing, 14, 53, 57, 58, 59, 60, 66, 71, 73, 107, 132, 133, 154, 157, 159, 185, 195, 198, 219, 237, 239, 240, 303
Regression testing tools, 57, 233, 239, 247, 250
Regression testing, in RAD, 262
Relational Database Management System, 53
Release candidate, 185
Release plan, in XP, 265
Release planning meeting, in XP, 269
Release, in Scrum, 283
Remote desktop, 49
Requirements, 9, 105, 163, 186, 207, 228
Requirements analysis, 207
Requirements churn, 23
Requirements creep, 152
Requirements elicitation, 207, 262
Requirements list, in DSDM, 275
Requirements management, 106, 107, 110, 112, 114, 155, 208, 209, 217, 223, 234, 237, 246, 250
Requirements management tools, 110, 153, 157, 229, 232, 234, 237, 243, 246, 250
Requirements specifications, 163, 164, 168
Requirements team, in AgileSOA, 166, 172
Retail chain, 73
Retrospective, 211
Reuse packs, 195
Rhino Mocks, 76
Risk assessment, in DSDM, 273
Risk log, in DSDM, 274
Risk log, updated, in DSDM, 275
Risk management, 120
Risk mitigation, 120
Risk register, 120
Risk-based approach to testing, 117
Rodan Systems S.A., 180
Roles and responsibilities, 200
Rolling out your agile process, 253
Roll-out and adoption of agile best practices, 251
Roll-out and adoption of your agile process, 256
RoR, 19
Royce, Winston W., 8
Ruby on Rails, 24
Rules and practices of Scrum, 279
Rumbaugh, Jim, 12, 26
RUP, 12, 19, 26, 27, 74, 152, 161, 166, 220
SameTime, 153
Scalable vector graphic, 98
Scapa Technologies Ltd, 49
Schults, Scott, 19
Schwaber, Ken, 23
SCM, 93
Scribe, in DSDM, 277
Scrum, 23, 38, 90, 93, 106, 115, 116, 161, 184, 200, 201, 213, 230, 235, 236, 244, 282
Scrum artifacts, 282
Scrum board, 186
ScrumMaster, 99, 187, 280
Scrum meetings, 281
Scrum of Scrum of Scrums, 244
Scrum of Scrums, 23, 184, 185, 187, 235, 236, 244, 282
Scrum team, 280
Second Life, 112, 156, 159, 202, 203, 211, 243, 249, 311
Secure token authentication, 50
Service consumers, 170, 173
Service contracts, 170
Service interaction specifications, 173
Service locator, 78
Service model, 170
Service-Oriented Architecture, 160
Service-oriented solutions, 173
Service provider(s), 170, 173
SevenMock, 72, 80, 219
Sewell, Nick, 37
Show-casing increments, 174
SIGiST, 86, 193
Simplicity, in XP, 267
Six Hats in a Meeting, 143, 211
Slippage, 116
Small agile projects, 224
SOA, 160, 162, 185
SOA design patterns, 175
Soak testing, 51
Social networking technologies, 112
Software Configuration Management, 93
Software development life cycles, 94
Software engineering, 160
Software Futures Ltd, 115
Software Process Engineering Meta-Model, 27
Solaris, 82, 133
Solution architect, 189
Sourceforge.net, 84
Specialist Group in Software Testing, 86, 193
Specialist roles, in DSDM, 277
Specifications, 164
SPEM, 19, 27
Spike development, 94
Spike solution, 267
Spike solutions, in XP, 267
Spiral model, 9
Spiral process, 94
Sprint, 90, 185, 228
Sprint backlog, 281
Sprint backlog, in Scrum, 283
Sprint burndown chart, in Scrum, 283
Sprint cycle, 120
Sprint goal, 281
Sprint planning, 186
Sprint planning meeting, 185, 186, 230, 281
Sprint retrospectives, 186, 206, 282
Sprint review, 186
Sprint review meeting, 281
Sprint start-up meetings, 213
Sprint velocity, 186
Sprints, 94, 119, 202, 222, 226, 280, 281
SQL Server, 88
SQS, 122
STAF, 84
Stahl, 23
Stakeholder meetings, in Scrum, 280
Stakeholders, in Scrum, 281
Stand-up briefings, 186
Stand-up meetings, 23, 106, 208, 213, 214, 222, 228, 232, 234, 241, 243
Stapp, Lucjan, 180
State transition analysis, 15
Static analysis, 45, 82
Static analysis tools, 232, 237, 245
Static models, in DSDM, 275
STAX, 84
Story cards, 163
Strategic Thought Limited, 53
Subsystem descriptions, 12
Sutherland, Jeff, 23
SVG, 98
SVT, 83
Swiss Exchange, 115, 116
System architecture definition, in DSDM, 275
System integrators, 277
System metaphor, in XP, 267
System test, 14, 64, 132, 134, 187
System verification tests, 83
System-of-systems, 161
Takeuchi, 23
Targeted test execution, 195
Targets of test, 233
TDD, 110, 180, 182, 196
Team Foundation Server, 88
Team leader, in DSDM, 271, 276
Technical coordinator, in DSDM, 271, 276
Technology-driven, analysis, 74
Teleconferencing, 151
Test automation, 73, 110, 132, 133, 195, 198
Test automation, in XP, 263, 264
Test automation, tools, 90
Test case coverage, 228
Test case design, 210, 227
Test cases, 11, 38, 41, 42, 52, 64, 68, 74, 107, 196, 198, 207, 226, 229, 277, 302
Test coverage, 107, 197, 229
Test-driven design, 196, 226
Test-driven development (TDD), 74, 91, 107, 110, 123, 180, 182, 196, 268
Test-driven methodology, 180
Test early and often, 70
Test harness, 56, 74, 218, 230, 245
Test harness tools, 218, 232, 237, 245
Test maintenance, 198
Test management solutions, 246
Test management techniques, 195
Test management tools, 90, 205, 238, 247
Test packs, 237
Test plan document, 15
Test plan, in DSDM, 271
Test plan, in RAD, 261
Test procedure, in DSDM, 271
Test process, 7
Test process improvement, chart, 88
Test record, in DSDM, 271
Test refactoring, 154, 198, 227, 233, 239
Test result record form template, 292
Test results, in DSDM, 271
Test script, in DSDM, 271
Test scripts, 11
Test specification document, 15
Test stories, 83
Test summary report template, 300
Test team, in AgileSOA, 167, 172
Testability, 196
Tested and delivered system, in DSDM, 275
Tested software plus test records, in DSDM, 275
Tester, in DSDM, 271, 277
Test-infected, 122, 129
Testing Solutions Group, 139
Testing specific retrospective, 215
Testing time box, 226
Testing tools, 229
Testing, in DSDM, 270
Thomas, Graham, 44, 150
Thompson, Geoff, 62
Tilt, Jon, 132, 193
Time-boxed iterations, 133
Time boxing, 106, 111, 226, 261
Time box, 167, 176, 199, 201
Time-box plans, in DSDM, 275
Time-boxed, 20, 133, 282
Time boxing, in DSDM,
TPTP, 49
Trading systems, 115, 116
Trained users, operators, and support staff, in DSDM, 275
Transactional manager, 141
Transparency in planning, in Scrum, 279
Treasury dealing markets, 44
Tree-maps, 61
Trinem, 93
Truck count, 266
UAT, 14, 89, 187
UCD,
UML, 278
Unattended testing, 219, 226, 246
Unattended-testing techniques, 198
Unified Modelling Language, 26, 115, 278
Unit test coverage, 98
Unit test tools, 250
Unit testing, 13, 70, 107, 125, 183, 210, 233
Unit testing tools, 226
Unit tests, 75, 98, 186
Unit tests, in XP, 263
UNIX, 88, 133
Us and them, tester–developer issue, 206
Use case essentials, 39
Use case flows, 168, 175
Use case model, in objectory, 11
Use case play, 169
Use cases, 11, 12, 27, 39, 41, 105, 107, 108, 120, 163, 167, 168, 170, 174, 178, 196, 207, 210, 220, 226, 229, 278
Use cases specifications, 170
User acceptance testing, 14, 89, 116, 178, 187
User-centered design, 82
User documentation, in DSDM, 275
User stories, 94, 95, 186, 187, 189, 201, 202, 207, 213, 222, 226, 229, 264, 265, 266, 268, 280
User stories, in Scrum, 282
User stories, in XP, 265
Users, in Scrum, 280
VB.Net, 66
VB6, 66
Velocity, 99, 116, 117, 120, 178, 201, 202, 227, 266
Velocity metrics, 201
Verifying the requirements, 199
Videoconferencing, 46, 112, 202, 203, 211, 243, 249
Virtual desktop, 49
Virtual meeting tools, 211
Virtual meetings, 112, 202, 203, 249
Virtual project meetings, 243
Virtual team meetings, 249
Visionary, in DSDM, 276
Visual modeling, 11
Visual modeling, in DSDM, 278
Visual Studio Team System, 88
V-model, 13, 14, 15, 62, 64, 65, 152, 200, 234, 240
VMware, 49
Wagner, Abe, 141
Warden, Richard, 115
Warsaw University of Technology, 180
Waterfall, 8, 189
Waterfall-style projects, 259
Weaver, Jayne, 147
Weaver Triangle, 147, 208, 212
WebSphere, 132
Websphere application server, 73
Well-bounded iterations, 226
White-box style tests, 264
White-box, testing, 14
Wicked problems, 121
Wiki, 151, 215
Wilson, James, 93
Windows, 133
Workflow tooling, in AgileSOA, 173
Workshops, in DSDM, 278
XBreed, 24
XJB test platform, 78
XML, 73, 78
XP, 20, 161, 213, 240, 263
xUnit, 125
zOS, 133
zOS, mainframe, 82