Component tests
  automating, 282
  definition, 501
  supporting function of, 5
Conditions of satisfaction
  business-facing tests and, 142–143
  definition, 501–502
Context-driven testing
  definition, 502
  quadrants and, 106–107
Continuous build process
  failure notification and, 112
  feedback and, 119
  FitNesse tests and, 357
  implementing, 114
  integrating tools with, 175, 311
  source code control and, 124
  what testers can do, 121
Continuous feedback principle, 22
Continuous improvement principle, 27–28
Continuous integration (CI)
  automating, 280–282
  as core practice, 486–487
  installability and, 231–232
  Remote Data Monitoring system example, 244
  running tests and, 111–112
Conversion, data migration and, 460–461
Core practices
  coding and testing as one process, 488–489
  continuous integration, 486–487
  incremental approach, 488
  overview of, 486
  synergy between practices, 489
  technical debt management, 487–488
  test environments, 487
Courage principle, 25–26, 71
Credibility, building, 57
Critiquing the product
  business-facing tests. See Business-facing tests, critiquing the product (Quadrant 3)
  technology-facing tests. See Technology-facing tests, critiquing the product (Quadrant 4)
CrossCheck, testing Web Services, 170
CruiseControl, 126, 244, 291
Cultural change, 37. See also Organizations
Cunningham, Ward, 106, 168, 506
Customer expectations
  business impact and, 475–476
  production support, 475
Customer-facing tests. See Business-facing tests
Customer support, DTS (Defect Tracking System) and, 82
Customer team
  definition, 502
  interaction between customer and developer teams, 8
  overview of, 7
Customer testing
  Alpha/Beta testing, 466–467
  definition, 502
  overview of, 464
  UAT (user acceptance testing), 464–466
Customers
  collaborating with, 396–397, 489–490
  considering all viewpoints during iteration planning, 388–389
  delivering value to, 22–23
  importance of communicating with, 140, 414–415, 444
  iteration demo, 191–192, 443–444
  participation in iteration planning, 384–385
  relationship with, 41–42
  reviewing high-level tests with, 400
  speaking with one voice, 373–374
CVS, source code control and, 124
D
Data
  automating creation or setup, 284–285
  cleanup, 461
  conversion, 459–461
  release planning and, 348
  writing task cards and, 392
Data-driven tests, 182–183
Data feeds, testing, 249
Data generation tools, 304–305
Data migration, automating, 310, 460
Databases
  avoiding access when running tests, 306–310
  canonical data and automation, 308–309
  maintainability and, 228
  product delivery and updates, 459–461
  production-like data and automation, 309–310
  setting up/tearing down data for each automated test, 307–308
  testing data migration, 310
De Souza, Ken, 223
Deadlines, scope and, 340–341
Defect metrics
  overview of, 437–440
  release metrics, 364–366
Defect tracking, 79–86
  DTS (Defect Tracking System), 79–83
  keeping focus and, 85–86
  overview of, 79
  reasons for, 79
  tools for, 83–85
Defect Tracking System. See DTS (Defect Tracking System)
Defects
  alternatives for dealing with bugs, 424–428
  choosing when to fix bugs, 421–423
  dealing with bugs, 416–419
  deciding which bugs to log, 420–421
  media for logging bugs, 423–424
  metrics and, 79
  TDD (test-driven development) and, 490
  writing task cards and, 391–392
  zero bug tolerance, 79, 418–419
Deliverables
  “fit and finish” deliverables, 454
  nonsoftware, 470
  overview of, 468–470
Delivering product
  Alpha/Beta testing, 466–467
  business impact and, 475–476
  communication and, 462–463
  customer expectations, 475
  customer testing, 464
  data conversion and database updates, 459–461
  deliverables, 468–470
  end game, 456–457
  installation testing, 461–462
  integration with external applications, 459
  nonfunctional testing and, 458–459
  overview of, 453
  packaging, 474–475
  planning time for testing, 455–456
  post-development testing cycles, 467–468
  production support, 475
  release acceptance criteria, 470–473
  release management, 470, 474
  releasing product, 470
  staging environment and, 458
  testing release candidates, 458
  UAT (user acceptance testing), 464–466
  what if it is not ready, 463–464
  what makes a product, 453–455
Demos/demonstrations
  of an iteration, 443–444
  value to customers, 191–192
Deployment, automating, 280–282
Design
  automation strategy and, 292–294
  designing with testing in mind, 115–118
Detailed test cases
  art and science of writing, 178
  big picture approach and, 148–149
  designing with, 401
Developer team
  interaction between customer and developer teams, 8
  overview of, 7–8
Development
  agile development, 3–4, 6
  automated tests driving, 262–263
  business-facing tests driving, 129–132
  coding driving, 406
  post-development testing cycles, 467–468
Development spikes, 381
Development team, 502
diff tool, 283
Distributed teams, 431–432
  defect tracking systems and, 82
  online high-level tests for, 399
  online story board for, 357
  physical logistics, 66
  responding to change, 29
  software-based tools to elicit examples and requirements, 163–164
Documentation
  automated tests as source of, 263–264
  problems and fixes, 417
  reports, 208–210
  of test code, 251
  tests as, 402
  user documentation, 207–208
Doneness
  knowing when a story is done, 104–105
  multitiered, 471–472
Driving development with tests. See TDD (test-driven development)
DTS (Defect Tracking System), 80–83
  benefits of, 80–82
  choosing media for logging bugs, 424
  documenting problems and fixes, 417
  logging bugs and, 420
  reason for not using, 82–83
Dymond, Robin, xxx
Dynamic analysis, security testing tools, 225
E
easyb behavior-driven development tool, 165–168
EasyMock, 127
Eclipse, 125, 316
Edge cases
  identifying variations, 410
  not having time for, 112
  starting simple and then adding complexity,