
The Certified Software Quality Engineer Handbook

Also available from ASQ Quality Press:

The Software Audit Guide, John Helgeson
Fundamental Concepts for the Software Quality Engineer, Volume 2, Sue Carroll and Taz Daughtrey, editors
Safe and Sound Software: Creating an Efficient and Effective Quality System for Software Medical Device Organizations, Thomas H. Faris
Quality Audits for Improved Performance, Third Edition, Dennis R. Arter
The ASQ Auditing Handbook, Third Edition, J.P. Russell, editing director
The Internal Auditing Pocket Guide: Preparing, Performing, Reporting, and Follow-Up, Second Edition, J.P. Russell
Root Cause Analysis: Simplified Tools and Techniques, Second Edition, Bjørn Andersen and Tom Fagerhaug
The Certified Manager of Quality/Organizational Excellence Handbook, Third Edition, Russell T. Westcott, editor
The Certified Quality Engineer Handbook, Third Edition, Connie M. Borror, editor
Six Sigma for the New Millennium: A CSSBB Guidebook, Second Edition, Kim H. Pries
The Certified Quality Process Analyst Handbook, Eldon H. Christensen, Kathleen M. Coombes-Betz, and Marilyn S. Stein
Enabling Excellence: The Seven Elements Essential to Achieving Competitive Advantage, Timothy A. Pine

To request a complimentary catalog of ASQ Quality Press publications, call 800-248-1946, or visit our Web site at http://www.asq.org/quality-press.

The Certified Software Quality Engineer Handbook


Linda Westfall

ASQ Quality Press Milwaukee, Wisconsin

American Society for Quality, Quality Press, Milwaukee 53203
© 2010 by Linda Westfall
All rights reserved. Published 2009
Printed in the United States of America
15 14 13 12 11 10 09    5 4 3 2 1

Library of Congress Cataloging-in-Publication Data
Westfall, Linda, 1954–
The certified software quality engineer handbook / Linda Westfall.
p. cm.
Includes bibliographical references and index.
ISBN 978-0-87389-730-3 (hard cover : alk. paper)
1. Electronic data processing personnel—Certification. 2. Computer software—Examinations—Study guides. 3. Computer software—Quality control. I. Title.
QA76.3.W466 2009
005.1'4—dc22    2009030360

ISBN: 978-0-87389-730-3

No part of this book may be reproduced in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without the prior written permission of the publisher.

Publisher: William A. Tony
Acquisitions Editor: Matt T. Meinholz
Project Editor: Paul O'Mara
Production Administrator: Randall Benson

ASQ Mission: The American Society for Quality advances individual, organizational, and community excellence worldwide through learning, quality improvement, and knowledge exchange.

Attention Bookstores, Wholesalers, Schools, and Corporations: ASQ Quality Press books, videotapes, audiotapes, and software are available at quantity discounts with bulk purchases for business, educational, or instructional use. For information, please contact ASQ Quality Press at 800-248-1946, or write to ASQ Quality Press, P.O. Box 3005, Milwaukee, WI 53201-3005.

To place orders or to request a free copy of the ASQ Quality Press Publications Catalog, including ASQ membership information, call 800-248-1946. Visit our Web site at www.asq.org or http://www.asq.org/quality-press.

Printed on acid-free paper

CMM, CMMI, and Capability Maturity Model are registered in the US Patent & Trademark Office by Carnegie Mellon University.

For Robert Westfall, my husband, my partner, my best friend, and my playmate. Thank you for all of your support and patience while I wrote this book and as I volunteered countless hours to ASQ and other organizations over the years. Thank you for sharing your life with me, making me laugh out loud, cooking all of those fantastic meals, and sharing your passion for fireworks with me. Life with you continues to be a blast!!!


Table of Contents

CD-ROM Contents
List of Figures and Tables
Preface
Acknowledgments

Part I  General Knowledge

Chapter 1  A. Quality Principles
    1. Benefits of Software Quality
    2. Organizational and Process Benchmarking
Chapter 2  B. Ethical and Legal Compliance
    1. ASQ Code of Ethics
    2. Legal and Regulatory Issues
Chapter 3  C. Standards and Models
    ISO 9000 Standards
    IEEE Software Engineering Standards
    SEI Capability Maturity Model Integration (CMMI)
Chapter 4  D. Leadership Skills
    1. Organizational Leadership
    2. Facilitation Skills
    3. Communication Skills
Chapter 5  E. Team Skills
    1. Team Management
    2. Team Tools

Part II  Software Quality Management

Chapter 6  A. Quality Management System
    1. Quality Goals and Objectives
    2. Customers and Other Stakeholders
    3. Planning
    4. Outsourcing
Chapter 7  B. Methodologies (for Quality Management)
    1. Cost of Quality (COQ)
    2. Process Improvement Models
    3. Corrective Action Procedures
    4. Defect Prevention
Chapter 8  C. Audits
    1. Audit Types
    2. Audit Roles and Responsibilities
    3. Audit Process

Part III  Systems and Software Engineering Processes

Chapter 9  A. Life Cycles and Process Models
    Waterfall Model
    V-Model
    W-Model
    Spiral Model
    Iterative Model
    Test-Driven Development
    Feature-Driven Development
    Incremental Development
    Rapid Application Development
    Evolutionary Development
Chapter 10  B. Systems Architecture
    Embedded System
    n-Tier
    Client–Server
    Web
    Wireless
    Messaging
    Collaboration Platforms
Chapter 11  C. Requirements Engineering
    1. Requirements Types
    2. Requirements Elicitation
    3. Requirements Analysis
Chapter 12  D. Requirements Management
    1. Participants
    2. Requirements Evaluation
    3. Requirements Change Management
    4. Bidirectional Traceability
Chapter 13  E. Software Analysis, Design, and Development
    1. Design Methods
    2. Quality Attributes and Design
    3. Software Reuse
    4. Software Development Tools
    5. Software Development Methods
Chapter 14  F. Maintenance Management
    1. Maintenance Types
    2. Maintenance Strategy

Part IV  Project Management

Chapter 15  A. Planning, Scheduling, and Deployment
    1. Project Planning
    2. Project Scheduling
    3. Project Deployment
Chapter 16  B. Tracking and Controlling
    1. Phase Transition Control
    2. Tracking Methods
    3. Project Reviews
    4. Program Reviews
Chapter 17  C. Risk Management
    1. Risk Management Methods
    2. Software Security Risks
    3. Safety and Hazard Analysis

Part V  Software Metrics and Analysis

Chapter 18  A. Metrics and Measurement Theory
    1. Terminology
    2. Basic Measurement Theory and Statistics
    3. Psychology of Metrics
Chapter 19  B. Process and Product Measurement
    1. Software Metrics
    2. Process Metrics
    3. Metrics Reporting Tools
Chapter 20  C. Analytical Techniques
    1. Sampling
    2. Data Collection and Integrity
    3. Quality Analysis Tools

Part VI  Software Verification and Validation (V&V)

Chapter 21  A. Theory (of V&V)
    1. V&V Methods
    2. Software Product Evaluation
Chapter 22  B. Test Planning and Design
    1. Test Strategies
    2. Test Plans
    3. Test Designs
    4. Software Tests
    5. Tests of Supplier Components and Products
    6. Test Coverage Specifications
    7. Code Coverage Techniques
    8. Test Environments
    9. Test Tools
Chapter 23  C. Reviews and Inspections
    Objectives of Peer Reviews
    Benefits of Peer Reviews
    What to Peer Review
    Selecting Peer Reviewers
    Informal versus Formal Peer Reviews
    Types of Peer Reviews
    Risk-Based Peer Reviews
    Desk Checks
    Walk-Throughs
    Inspections—The Process and Roles
    Inspections—Planning Step
    Inspections—Kickoff Meeting Step
    Inspections—Preparation Step
    Inspections—Inspection Meeting Step
    Inspections—Post-Meeting Steps
    Soft Skills for Peer Reviews and Inspections
    Technical Reviews
Chapter 24  D. Test Execution Documentation
    Test Execution
    Test Case
    Test Procedure
    Test Log
    Problem Report
    Test Results Data and Metrics
    Test Report
Chapter 25  E. Customer Deliverables
    Peer Reviews
    Development Testing
    Development Audits
    Pilots
    Ordering, Manufacturing, Shipping
    Installation Testing
    Customer/User Testing

Part VII  Software Configuration Management

Chapter 26  A. Configuration Infrastructure
    1. Configuration Management Team
    2. Configuration Management Tools
    3. Library Processes
Chapter 27  B. Configuration Identification
    1. Configuration Items
    2. Software Builds
Chapter 28  C. Configuration Control and Status Accounting
    1. Item, Baseline, and Version Control
    2. Configuration Control Board (CCB)
    3. Concurrent Development
    4. Status Accounting
Chapter 29  D. Configuration Audits
    Functional Configuration Audit (FCA)
    Physical Configuration Audit (PCA)
    Standardized Checklists
Chapter 30  E. Product Release and Distribution
    1. Product Release
    2. Archival Processes

Appendix A  Software Quality Engineer Certification Body of Knowledge
Glossary
References
Index


CD-ROM Contents

PracticeExam1.pdf
PracticeExamAnswers1.pdf
PracticeExam2.pdf
PracticeExamAnswers2.pdf
PracticeExam3.pdf
PracticeExamAnswers3.pdf


List of Figures and Tables

Figure 1.1  Cost of fixing defects.
Figure 1.2  Kano model.
Figure 1.3  Steps in the benchmarking process.
Figure 3.1  ISO 9001:2008 major elements.
Figure 3.2  SEI CMMI for Development staged representation.
Table 3.1  SEI CMMI for Development staged representation.
Figure 3.3  SEI CMMI for Development continuous representation.
Figure 4.1  Change force field analysis.
Figure 4.2  Satir change model.
Figure 4.3  Knowledge transfer.
Figure 4.4  Maslow's hierarchy of needs.
Figure 4.5  Situational leadership styles.
Figure 4.6  Formal negotiation process.
Figure 4.7  One-way communication model.
Figure 4.8  Two-way communication model.
Table 4.1  Oral communication techniques.
Table 4.2  Written communication techniques.
Table 4.3  Open-ended versus closed-ended questions—examples.
Table 4.4  Context-free questions—examples.
Figure 5.1  Stages of team development.
Table 5.1  Team problems and potential solutions.
Figure 5.2  Affinity diagram technique—example.
Table 5.2  Prioritization matrix—example.
Figure 5.3  Prioritization graph—example.
Figure 5.4  Force field analysis template.
Figure 6.1  QMS documentation hierarchy.
Table 6.1  ETVX method—example.
Figure 6.2  Roles in a process flow diagram—example.
Figure 6.3  Process flow diagram—example.
Figure 6.4  High-level process architecture—example.
Figure 6.5  Categories of product stakeholders.
Table 6.2  Business needs and motives—example.
Figure 6.6  Acquisition process.
Figure 6.7  Supplier score card—calculated scoring examples.
Figure 6.8  Supplier score card—weighted scoring examples.
Figure 6.9  Supplier score card—indexed scoring examples.
Figure 6.10  Integrated product team.
Figure 7.1  Classic model of optimal cost-of-quality balance.
Figure 7.2  Modern model of optimal cost of quality.
Figure 7.3  Plan–do–check–act model.
Figure 7.4  Standard deviations versus area under a normal distribution curve.
Figure 7.5  DMAIC versus DMADV.
Figure 7.6  Product problem resolution process—example.
Figure 7.7  Corrective action process—example.
Figure 8.1  Internal first-party audit.
Figure 8.2  External second-party audit.
Figure 8.3  External third-party audit.
Figure 8.4  Audit process.
Figure 8.5  Audit execution process.
Figure 9.1  Waterfall model—example.
Figure 9.2  V-model—example.
Figure 9.3  W-model—example.
Figure 9.4  Spiral model steps.
Figure 9.5  Spiral model—example.
Figure 9.6  Iterative model—example.
Figure 9.7  Test–code–refactor rhythm.
Figure 9.8  Test-driven development.
Figure 9.9  Feature-driven development process.
Figure 9.10  Incremental development process—example.
Figure 9.11  Incremental development over time—example.
Figure 9.12  Combination of iterative and incremental models—example.
Figure 9.13  Rapid application development—example.
Figure 9.14  Evolutionary development over time—example.
Figure 9.15  Evolutionary development process—example.
Figure 10.1  Five-tier architecture—example.
Figure 10.2  Client–server architecture—example.
Figure 10.3  Web architecture—examples.
Figure 11.1  Requirements engineering process.
Figure 11.2  Incremental requirements development.
Figure 11.3  Levels and types of requirements.
Figure 11.4  Use case diagram—example.
Table 11.1  Use case—example.
Figure 11.5  Storyboard—example.
Figure 11.6  Data flow diagram (DFD) symbols.
Figure 11.7  Data flow diagram (DFD)—example.
Figure 11.8a  Entity relationship diagram (ERD)—example.
Figure 11.8b  Entity relationship diagram (ERD) cardinality—example.
Figure 11.8c  Entity relationship diagram (ERD) modality—example.
Figure 11.8d  Other cardinality and modality symbol—examples.
Figure 11.9  State transition diagram graphic—example.
Table 11.2  State transition table—example.
Figure 11.10  Class diagram—example.
Figure 11.11  Sequence diagram—example.
Figure 11.12  Activity diagram—example.
Table 11.3  Event/response table—example.
Figure 12.1  Requirements participants.
Figure 12.2  Organizational context.
Figure 12.3  Iterative requirements evaluation.
Table 12.1  Requirements prioritization matrix.
Figure 12.4  Bidirectional (forward and backward) traceability.
Table 12.2  Traceability matrix—example.
Figure 12.5  Trace tagging—example.
Figure 13.1  Levels of cohesion.
Figure 13.2  Levels of coupling.
Figure 13.3  Methodology triangles.
Figure 13.4  Agile manifesto.
Figure 13.5  Pair programming.
Figure 15.1  Project management processes.
Figure 15.2  Life cycle phase project management processes.
Figure 15.3  Cost/schedule/scope trilogy.
Figure 15.4  Project planning is the road map for the project journey.
Figure 15.5  Project planning.
Figure 15.6  Long-term versus near-term planning.
Figure 15.7  PMI project planning process group.
Figure 15.8  Project estimates and forecasts.
Figure 15.9  Program evaluation and review technique (PERT) method.
Table 15.1  Weighted scale factors for the COCOMO II model.
Table 15.2  Cost drivers for COCOMO II model.
Figure 15.10  Rayleigh staffing curve.
Figure 15.11  Product-type work breakdown structure—example.
Figure 15.12  Process-type work breakdown structure—example.
Figure 15.13  Hybrid work breakdown structure—example.
Figure 15.14  Activity on the line network—example.
Figure 15.15  Activity on the node network—example.
Figure 15.16  Types of activity network relationships.
Figure 15.17  Critical path analysis—example.
Figure 15.18  Scrum process.
Figure 15.19  Burn-up chart—example.
Figure 15.20  Burn-down chart—example.
Figure 15.21  PMI executing process group.
Figure 16.1  Actual project journey.
Figure 16.2  PMI monitoring and controlling process group.
Figure 16.3  Scheduling Gantt chart—example.
Figure 16.4  Tracking Gantt chart—example.
Table 16.1  Earned value—example.
Table 16.2  Interpreting earned value.
Figure 16.5  Interpreting earned value.
Figure 16.6  Design deliverable status—example.
Figure 16.7  Code deliverable status—example.
Figure 16.8  Story point deliverable status—example.
Table 16.3  Productivity metric—examples.
Figure 16.9  Productivity metric—example.
Figure 16.10  Resource utilization metric—example.
Figure 16.11  Staff turnover metric—example.
Figure 17.1  Risk/opportunity balance.
Figure 17.2  Risk duration.
Table 17.1  Project management versus risk management.
Figure 17.3  Risk management process.
Figure 17.4  Risk statement examples.
Table 17.2  Risk exposure prioritization matrix—example.
Table 17.3  Risk exposure scores—example.
Figure 17.5  Risk-handling options.
Table 17.4  Obtain additional information action—examples.
Table 17.5  Risk avoidance action—examples.
Table 17.6  Risk-transfer action.
Table 17.7  Risk mitigation plans—examples.
Table 17.8  Risk reduction leverage—example.
Table 17.9  Risk contingency plan—examples.
Figure 18.1  Metrics defined.
Table 18.1  Internal and external attributes of a code inspection.
Figure 18.2  Reliability and validity.
Figure 18.3  Metric function—examples.
Figure 18.4  Area under the normal curve.
Figure 19.1  Goal/question/metric paradigm.
Figure 19.2  Function points.
Figure 19.3  Cyclomatic complexity—examples.
Figure 19.4  Structural complexity—examples.
Table 19.1  Defect density example inputs.
Figure 19.5  Post-release defect density—example.
Figure 19.6  Problem report arrival rate—examples.
Figure 19.7  Cumulative problem reports by status—example.
Figure 19.8  Amount of test coverage needed—example.
Figure 19.9  Changes to requirements size—example.
Figure 19.10  Requirements volatility—example.
Figure 19.11  Availability example.
Figure 19.12  Customer satisfaction summary report—example.
Figure 19.13  Customer satisfaction detailed reports—example.
Figure 19.14  Customer satisfaction trending report—example.
Figure 19.15  Responsiveness to customer problems—example.
Figure 19.16  Defect backlog aging—example.
Figure 19.17a  Measuring escapes—requirements example.
Figure 19.17b  Measuring escapes—design example.
Figure 19.17c  Measuring escapes—coding example.
Figure 19.17d  Measuring escapes—testing example.
Figure 19.17e  Measuring escapes—operations example.
Figure 19.18  Defect containment effectiveness—example.
Figure 19.19a  Defect removal efficiency—design review example.
Figure 19.19b  Defect removal efficiency—code review example.
Figure 19.20  Data table—example.
Figure 19.21  Pie chart—example.
Figure 19.22  Line graph—example.
Figure 19.23  Simple bar chart—example.
Figure 19.24  Horizontal bar chart—example.
Figure 19.25  Grouped bar chart—example.
Figure 19.26  Stacked bar chart—example.
Figure 19.27  Area graph—example.
Figure 19.28  Box chart—example.
Figure 19.29  Box chart components.
Figure 19.30  Stoplight chart—examples.
Figure 19.31  Dashboard—example.
Figure 19.32  Kiviat chart—example.
Figure 19.33  Kiviat chart comparison—example.
Table 20.1  Data ownership—examples.
Figure 20.1  Basic flowchart symbols and flowchart—examples.
Figure 20.2  Pareto chart—example.
Figure 20.3  Cause-and-effect diagram—example.
Figure 20.4  Process-type cause-and-effect diagram—example.
Figure 20.5  Check sheet—example.
Figure 20.6  Scatter diagram—examples.
Figure 20.7  Run chart—example.
Figure 20.8  S-curve run chart—example.
Figure 20.9  Histogram—examples.
Figure 20.10  Creating a control chart—example.
Figure 20.11  Statistically improbable patterns—example.
Figure 20.12  Affinity diagram—example.
Figure 20.13  Tree diagram—example.
Figure 20.14  Matrix diagram—example.
Figure 20.15  Interrelationship digraph—example.
Figure 21.1  Verification and validation.
Figure 21.2  V&V techniques identify defects.
Figure 21.3  Verification and validation sufficiency.
Figure 21.4  Probability indicators.
Figure 22.1  When testing happens in the life cycle.
Figure 22.2  Test activities throughout the life cycle.
Figure 22.3  White-box/gray-box testing.
Figure 22.4  Top-down testing strategy.
Figure 22.5  Bottom-up testing strategy.
Figure 22.6  Black-box testing.
Figure 22.7  Input field boundary—example.
Table 22.1  Causes and effects—example.
Table 22.2  Cause–effect graphs—example.
Figure 22.8  Cause-and-effect graph with constraints—example.
Table 22.3  Cause–effect graph constraint symbols—example.
Table 22.4  Limited-entry decision table—example.
Table 22.5  Test cases from cause–effect graphing—example.
Figure 22.9  Levels of testing.
Figure 22.10  Function and subfunction list—example.
Figure 22.11  Environment in which the function operates.
Figure 22.12  Decision tree example.
Figure 22.13  Load, stress, and volume testing.
Figure 22.14  Test what changed.
Table 22.6  Test matrix—example.
Table 22.7  Configuration test matrix.
Table 22.8  Statement coverage—example.
Table 22.9  Decision coverage—example.
Table 22.10  Condition coverage—example.
Figure 22.15  Code example.
Table 22.11  Decision/condition coverage—example.
Table 22.12  Multiple condition coverage—example.
Figure 22.16  Control flow graph—example.
Table 22.13  Stub—example.
Table 22.14  Driver—example.
Figure 23.1  Selecting peer reviewers.
Figure 23.2  Informal versus formal peer reviews.
Figure 23.3  Types of peer reviews.
Figure 23.4  Risk-based selection of peer review types.
Table 23.1  Work product predecessor—examples.
Figure 23.5  Formal walk-through process.
Figure 23.6  Walk-throughs versus inspection.
Figure 23.7  Inspection process.
Figure 23.8  Inspection planning step process.
Figure 23.9  Inspection meeting step process.
Figure 24.1  Types of testing documentation.
Figure 24.2  Test case—examples.
Figure VII.1  Software configuration management activities.
Figure 26.1  Software build.
Figure 26.2  Creating a new software work product.
Figure 26.3  Creating a software build.
Figure 26.4  Testing a software build.
Figure 26.5  Modifying a controlled work product—check-out process.
Figure 26.6  Modifying a controlled work product—modification process.
Figure 26.7  Modifying a controlled work product—check-in process.
Figure 26.8  Main codeline—example.
Figure 26.9  Branching—example.
Figure 26.10  Merging—example.
Figure 27.1  Levels of software work product control.
Figure 27.2  Configuration identification hierarchy.
Figure 27.3  Configuration item acquisition.
Figure 27.4  Configuration unit identification scheme—example.
Figure 27.5  Labeling—example.
Figure 27.6  Branching identification scheme—example.
Figure 27.7  Build identification scheme—example.
Figure 27.8  Build identification scheme—internal build status example.
Figure 27.9  Document identification scheme—example.
Figure 27.10  Types of baselines.
Figure 28.1  Types of software configuration control.
Figure 28.2  Change control process.
Figure 28.3  Document control process.
Figure 28.4  Manual tracking of item changes.
Figure 28.5  Configuration item dependencies.
Figure 28.6  Configuration control board processes for change control.
Figure 28.7  Configuration control board processes for document control.
Figure 28.8  Using backward traceability for defect analysis—example.
Figure 28.9  Using forward traceability for defect analysis—example.
Figure 28.10  Multiple levels of CCBs—code example.
Figure 28.11  Membership of multiple levels of CCBs.
Figure 28.12  Multiple levels of CCBs—software requirements specification (SRS) example.
Figure 28.13  Software product evolution graph—example.
Figure 28.14  Software unit versioning—example 1.
Figure 28.15  Software unit versioning—example 2.
Figure 28.16  Software unit versioning—example 3.
Figure 28.17  Impact analysis and concurrent development.
Table 29.1  Example FCA checklist items and evidence-gathering techniques.
Table 29.2  Example PCA checklist items and evidence-gathering techniques.
Figure 30.1  Corrective release.
Figure 30.2  Feature release.
Figure 30.3  Packaging of releases over time.
Figure 30.4  The problem with patching.

Preface

Continuous improvement is a mantra implicit to the quality profession. So, as software quality engineers, we should not be surprised that our own discipline has continued to evolve and change. By practicing what we preach in our own field, adopting lessons learned from implementing software quality principles and practices, and proactively staying involved in managerial, procedural, and technological advances in software engineering and the quality arena, software quality engineers have learned to increase the value they add to the end software products.

One of the primary roles of a software quality engineer is to act as a management information source that keeps software quality as visible to software management as cost and schedule are when business plans and decisions need to be made. In order to fulfill this role, software quality engineers must continuously improve their skill and knowledge sets. The software quality profession has moved beyond the limits of using only testing or auditing as the primary tools of our trade. Software quality has emerged into a multifaceted discipline that requires us, as software quality engineers, to be able to understand and apply knowledge that encompasses:

Software quality management. The processes and activities involved in setting the organization's strategic quality goals and objectives, establishing organizational, project, and product quality planning, and providing the oversight necessary to ensure the effectiveness and efficiency of the organization's quality management system. Software quality management provides leadership and establishes an integrated, cross-functional culture where producing high-quality software is "just the way we do things around here."

Software quality engineering. The processes and activities needed to define, plan, and implement the quality management system for software-related processes, projects, and products. This includes defining, establishing, and continuously improving software-related systems, policies, processes, and work instructions that help prevent defects and build quality into the software.

Software quality assurance. The planned and systematic set of all actions and activities needed to provide adequate confidence that the:
• Software work products conform to their standards of workmanship and that quality is being built into the products
• Organization's quality management system (or each individual process) is adequate to meet the organization's quality goals and objectives, is appropriately planned, is being followed, and is effective and efficient

Software quality control. The planned and systematic set of all actions and activities needed to monitor and measure software projects, processes, and products to ensure that special causes have not introduced unwanted variation into those projects, processes, and products.

Software verification and validation. The processes and activities used to ensure that software products meet their specified requirements and intended use. It helps ensure that the software was "built right" and that "the right software was built."

Soft skills. A software quality engineer also needs what are referred to as "soft skills" to be effective in influencing others toward quality. Examples of soft skills include leadership, team building, facilitation, communication, motivation, conflict resolution, negotiation, and more.

The ASQ Certified Software Quality Engineer (CSQE) Body of Knowledge (BoK) is a comprehensive guide to the common knowledge software quality engineers should possess about these knowledge areas. To keep the CSQE BoK current with industry and practitioner needs, a modernized version of the CSQE BoK is released every five years. This handbook contains information and guidance that supports all of the topics of the 2008 version of the CSQE BoK (included in Appendix A), upon which the CSQE exam is based. Armed with the knowledge presented in this handbook to complement the required years of actual work experience, qualified software quality practitioners may feel confident they have taken appropriate steps in preparation for the ASQ CSQE exam.

However, my goals for this handbook go well beyond it being a CSQE exam preparation guide. I designed this handbook not only to help software quality engineers but also to serve as a resource for software development practitioners, project managers, organizational managers, other quality practitioners, and other professionals who need to understand the aspects of software quality that impact their
work. It can also be used to benchmark their (or their organization's) understanding and application of software quality principles and practices against what is considered a cross-industry good-practice baseline. After all, by taking stock of our strengths and weaknesses, we can develop proactive strategies to leverage software quality as a competitive advantage.

New software quality engineers can use this handbook to gain an understanding of their chosen profession. Experienced software quality engineers can use it as a reference source when performing their daily work. I also hope that trainers and educators will use this handbook to help propagate software quality engineering knowledge to future software practitioners and managers. Finally, this handbook strives to establish a common vocabulary that software quality engineers and others in their organizations can use to communicate about software and quality, thereby increasing the professionalism of our industry and eliminating the waste that can result from ambiguity and misunderstandings.

For me, personally, obtaining my CSQE certification, participating in the development of the ASQ CSQE program, and even the writing of this book were more about the journey than the destination. I have learned many lessons from my colleagues, clients, and students over the years since I first became involved with the ASQ CSQE effort in 1992, as well as during my 30-plus-year career in software. I hope that you will find value in these lessons learned as they are embodied in this handbook.

Best wishes for success in your software quality endeavors!

Linda Westfall
lwestfall@westfallteam.com


Acknowledgments

I would like to thank all of the people who helped review this book as I was writing it: Zigmund Bluvband, Dan Campo, Sue Carroll, Carolee Cosgrove-Rigsbee, Margery Cox, Ruth Domingos, Robin Dudash, Scott Duncan, Eva Freund, Tom Gilchrist, Steven Hodlin, Theresa Hunt, James Hutchins, Yvonne Kish, Matthew Maio, Patricia McQuaid, Vic Nanda, Geree Streun, Ponmurugarajan Thiyagarajan, Bill Trest, Rufus Turpin, and Cathy Vogelsong. I would also like to thank Jay Vogelsong for the cartoons and character clip art used in this book.

I would like to express my appreciation to the people at ASQ Quality Press, especially Matt Meinholz and Paul O'Mara, for helping turn this book into reality. I would also like to thank the staff of New Paradigm Prepress and Graphics for their copyediting skills, for creating the table of contents, list of figures, and index, and for turning my manuscript into a format worthy of being published.

Finally, I would like to thank all of the people who volunteered their time, energy, and knowledge to work with ASQ and the Software Division to turn the Certified Software Quality Engineer (CSQE) exam into reality and who continue to support the ongoing body of knowledge and exam development activities.



Part I
General Knowledge
Chapter 1 A. Quality Principles
Chapter 2 B. Ethical and Legal Compliance
Chapter 3 C. Standards and Models
Chapter 4 D. Leadership Skills
Chapter 5 E. Team Skills

Chapter 1

A. Quality Principles

Since this is a book about software quality engineering, it would be appropriate to start with a definition of quality. However, the industry has not, and may never, come to a single definition of the term "quality." For example, the ISO/IEC Systems and Software Engineering–Vocabulary (ISO/IEC 2009) has the following set of definitions for quality:

1. The degree to which a system, component, or process meets specified requirements
2. Ability of a product, service, system, component, or process to meet customer or user needs, expectations, or requirements
3. The totality of characteristics of an entity that bear on its ability to satisfy stated and implied needs
4. Conformity to user expectations, conformity to user requirements, customer satisfaction, reliability, and level of defects present
5. The degree to which a set of inherent characteristics fulfills requirements

Based on his studies of how quality is perceived in various domains (for example, philosophy, economics, marketing, operations management), Garvin (Schulmeyer 1998) concluded that "quality is a complex and multifaceted concept." Garvin describes quality from five different perspectives:

Transcendental perspective. Quality is something that can be recognized but not defined. As stated by Kan (2003), to many people quality is similar to what a federal judge once said about obscenity: "I know it when I see it." This perspective of quality takes the viewpoint of the individual into consideration. What is obscenity to one person may be art to another. What one customer considers good software quality may not be high enough quality for another customer. Tom Peters cites the customer's reaction as the only appropriate measure for the quality of a product. This requires that product developers keep in touch with their customers to ensure that their specifications accurately reflect the customers' real (and possibly changing) needs.


Manufacturing perspective. Philip Crosby defines quality in terms of conformance to the specification. His point is that an organization does not want a variety of people throughout the development of a product trying to make judgments about what the customer needs or wants. A well-written specification is the cornerstone for creating a quality product. For software, however, this perspective of quality may not be sufficient since, according to Wiegers (2003), errors made during the requirements stage account for 40 percent to 60 percent of all defects found in a software project. From another viewpoint, this perspective refers to the ability to manufacture (replicate) a product to that specification over and over within accepted tolerances. W. Edwards Deming talks about quality needing "precision of effort." Before an organization adjusts its processes to improve them, it must let them run long enough to understand what is really being produced. Then it needs to design its specifications to reflect the real process capabilities. While the primary focus of software quality is on the design and development activities, this manufacturing and precision-of-effort quality perspective reminds software organizations that the replication process cannot be completely ignored.

User perspective. Joseph M. Juran cites "fitness for use" as the appropriate measure for quality. Software practitioners can all probably relate stories of software products that conformed to their specifications but did not function adequately when deployed into operations. This perspective of quality considers not only the viewpoints of the individual users but also their context of use. For example, what a novice user might consider a quality user interface might drive a power user to distraction with pop-up help and warning messages that require responses. What is a secure-enough interface for a software database used for personal information at home might be woefully inadequate in a business environment.

Product perspective. Quality is tied to inherent characteristics of the product. These characteristics are the quality attributes, also called the "ilities" of the software product. Examples include reliability, usability, availability, flexibility, maintainability, portability, installability, and adaptability. Of course, they don't all end in "ility": functionality, correctness, fault tolerance, integrity, efficiency, security, and safety are also examples of quality attributes. The more the software exhibits high levels of these characteristics, the higher its quality is considered to be. The ISO/IEC 25000 Software Engineering–Software Product Quality Requirements and Evaluation (SQuaRE) standard series (a transition from the previous ISO/IEC 9126 and 14598 series of standards) provides a reference model and definitions for external and internal quality attributes and quality-in-use attributes. This standards series also provides guidance for specifying requirements, planning and managing, measuring, and evaluating quality attributes.


Value-based perspective. Quality is dependent on the amount a customer is willing to pay for it. This perspective leads to considerations of "good enough" software quality. Are people willing to pay as much for high-quality video game software as they are for high-quality software in biomedical devices or for the high-quality software in airplane navigation systems?

1. Benefits of Software Quality

Describe the benefits that software quality engineering can have at the organizational level. (Understand)
Body of Knowledge I.A.1

At its most basic, increasing the quality of the software typically means reducing the number of defects in the software. Defects can result from mistakes made during the development process that introduce faults into the software work products. Defects can also be missing, ambiguous, or incorrect requirements that result in the development of software that does not match the needs of its stakeholders.

The most cost-effective way of handling a defect is to prevent it. In this case, software quality is accomplished through process improvement, increasing staff knowledge and skill, and other defect prevention techniques that keep defects out of the software. Every defect that is prevented eliminates the rework needed to correct that defect and the effort associated with that rework. If a defect does get interjected into the software, the shorter the period of time between when that defect is introduced and when it is identified and corrected, the less rework effort is typically required to correct it. Eliminating the waste of rework allows organizations to use the saved effort hours to produce additional value-added software. In this case, software quality is accomplished through techniques that improve defect detection and find the defects earlier. Both defect prevention and detection help keep software defects from being delivered into operations.

The elimination or reduction of rework can also be used to shorten the cycle time required to produce a software product, and it can directly translate into reductions in costs. For example, as illustrated in Figure 1.1, when using traditional software development methods, if a requirements defect is found during the requirements phase and it costs one unit to fix (for example, three engineering hours of effort, or $500), that same defect will typically increase exponentially in cost to fix as it is found later and later in the life cycle. In fact, studies show that it can cost 100-plus times more to fix a requirements defect if it is not found until after the software is released into operations.

Figure 1.1 Cost of fixing defects: the relative cost of fixing a defect rises from one unit when it is found in the requirements phase to 100-plus units when it is not found until the software is in operations (requirements, design, code, test, and operations life cycle phases).
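To make the escalation pattern concrete, the relative cost can be treated as a base cost multiplied by a phase-dependent factor. The following Python sketch is illustrative only: the multipliers are assumed values chosen to mirror the one-unit-to-100-plus-unit pattern described above, not figures taken from this handbook.

```python
# Illustrative only: assumed relative cost-to-fix multipliers, keyed by the
# life cycle phase in which a requirements defect is finally found and fixed.
PHASE_MULTIPLIERS = {
    "requirements": 1,
    "design": 5,
    "code": 10,
    "test": 50,
    "operations": 100,
}

def cost_to_fix(base_cost_dollars, phase_found):
    """Estimate the cost of fixing a requirements defect found in a given phase."""
    return base_cost_dollars * PHASE_MULTIPLIERS[phase_found]

if __name__ == "__main__":
    base = 500  # for example, three engineering hours of effort
    for phase in PHASE_MULTIPLIERS:
        print(f"Defect found in {phase:<12s}: ~${cost_to_fix(base, phase):,}")
```

Running the sketch simply prints the assumed escalation from roughly $500 in the requirements phase to roughly $50,000 after release, which is the point of the figure.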

The main point here is that the development of software is a series of dependencies, where each subsequent step builds on and expands the products of the previous step. For example, a single requirement could result in four design elements that expand into seven code units. All of these have supporting documentation and/or tests. A defect that is prevented or found early keeps the entire tree from having to be backtracked, investigated, and potentially reworked. Agile methods specifically attack these costs by significantly shortening the development cycle through incremental development and through other techniques that shorten the defect interjection-to-correction cycle time.

If fewer software defects are delivered into operations, there is a higher probability of failure-free operation. Unlike hardware, software does not wear out with time; if a defect is not encountered during operations, the software performs reliably. Reliable software can increase the effectiveness and efficiency of the work being done with that software. Reliable software also reduces both failure and maintenance costs to the software's customers and thus reduces the overall cost of ownership of the software product.

Taking a broader view, high-quality software is software that has been specified correctly and that meets its specification. If the software meets the stakeholders' needs and expectations and is value-added, it is more likely to be used instead of ending up as shelfware. If customers and users receive software that has fewer defects, that is more reliable, and that performs to their needs and expectations, then those customers and users will be more satisfied with the software. This is illustrated in Figure 1.2, which depicts Noriaki Kano's model of the relationship between customer satisfaction and quality.

Figure 1.2 Kano model: basic, expected, and exciting quality plotted against the customer satisfaction and dissatisfaction regions.

Basic quality. There is a basic level of quality that a customer expects the product to have. These are quality requirements that are assumed by the customer and are typically not explicitly stated or requested. For example,
customers expect a car to have four tires, a windshield, windshield wipers, and a steering wheel. They will not ask for these items when purchasing a new car, but they expect them to be there. This level of quality does not satisfy the customer. (Note that the entire basic quality line is in the dissatisfaction region.) However, the absence of quality at this level will quickly increase a customer's dissatisfaction.

Expected quality. The expected quality line on the graph in Figure 1.2 represents those quality requirements that the customer explicitly considers and requests. For example, customers will state their preferences for the make, model, and options when shopping for a car. The customer will be dissatisfied if this level of quality is not met and increasingly satisfied as this quality level increases.

Exciting quality. This is the innovative quality level and represents unexpected quality items. These are items that the customer doesn't even know they want, but they will love them when they see them, for example, when cup holders were introduced in cars. Note that the entire exciting quality line is in the satisfaction region. It should be remembered, however, that today's innovations are tomorrow's expectations. In fact, most customers now consider a cup holder as part of the basic requirements for a car.

An increase in the quality of the software can also increase the satisfaction of the software practitioners. For most software engineers, their favorite activity is not burning the midnight oil trying to debug critical problems reported from operations. By producing a high-quality product, engineers can take pride in what they are doing, which increases their satisfaction.


2. Organizational and Process Benchmarking

Use benchmarking at the organizational, process, and project levels to identify and implement best practices. (Apply)
Body of Knowledge I.A.2

Benchmarking is the process used by an organization to identify, understand, adapt, and adopt outstanding practices and processes from others, anywhere in the world, to help that organization improve the performance of its processes, projects, products, and/or services. Benchmarking can provide management with the assurance that quality and improvement goals and objectives are aligned with the best-in-class practices of other organizations. At the same time, it helps ensure that those goals and objectives are obtainable because others have obtained them. The use of benchmarking can help an organization "think outside the box" and can result in breakthrough improvements.

Figure 1.3 illustrates the steps in the benchmarking process. The first step is to determine what to benchmark, that is, which processes, projects, products, or services the organization wants to analyze and improve. This step involves assessing the effectiveness, efficiency, strengths, and weaknesses of the organization's current practices, identifying areas that require improvement, prioritizing those areas, and selecting the area to benchmark first (a simple weighted-scoring sketch for this prioritization appears below). The Certified Manager of Quality/Organizational Excellence Handbook (Westcott 2006) says that examples of how to select what to benchmark include systems, processes, or practices that:
• Incur the highest costs
• Have a major impact on customer satisfaction, quality, or cycle time
• Strategically impact the business
• Have the potential of high impact on competitive position in the marketplace
• Present the most significant area for improvement
• Have the highest probability of support and resources if selected for improvement

The second step in the benchmarking process is to establish the infrastructure for doing the benchmarking study. This includes identifying a sponsor to provide the necessary resources and to champion the benchmarking activities within the organization. It also includes identifying the members of the benchmarking team who will actually perform the benchmarking activities. Members of this team should include individuals who are knowledgeable about and involved in the area being benchmarked, and others who are familiar with benchmarking practices.
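One common way to handle the prioritization in step 1 is a simple weighted-scoring exercise. The Python sketch below is illustrative only; the candidate areas, criteria, weights, and scores are hypothetical assumptions, not recommendations from this handbook.

```python
# Hypothetical candidate areas scored 1-5 against assumed selection criteria;
# the criteria weights are also assumptions chosen only for illustration.
criteria_weights = {"cost incurred": 0.40, "customer impact": 0.35, "strategic impact": 0.25}

candidate_scores = {
    "peer review process":   {"cost incurred": 3, "customer impact": 4, "strategic impact": 3},
    "release/build process": {"cost incurred": 4, "customer impact": 3, "strategic impact": 4},
    "requirements process":  {"cost incurred": 5, "customer impact": 5, "strategic impact": 4},
}

def weighted_score(scores, weights):
    """Combine per-criterion scores into a single weighted priority score."""
    return sum(scores[criterion] * weight for criterion, weight in weights.items())

# Rank candidate areas from highest to lowest weighted score.
ranked = sorted(candidate_scores.items(),
                key=lambda item: weighted_score(item[1], criteria_weights),
                reverse=True)

for area, scores in ranked:
    print(f"{area}: {weighted_score(scores, criteria_weights):.2f}")
```

The highest-scoring area becomes the first benchmarking candidate, and the remaining prioritized list is kept for later iterations of the process.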

Figure 1.3 Steps in the benchmarking process: (1) determine what to benchmark, (2) establish benchmarking infrastructure, (3) understand and measure current practice, (4) determine source of benchmarking information, (5) analyze best practices, and (6) plan and implement improvement; the steps repeat as appropriate.

In order to do an accurate comparison, the benchmarking team must obtain a thorough, in-depth understanding of the current practice in the selected area. Key performance factors for the current practice are identified, and the current values of those key factors are measured during this third step in the benchmarking process. Current practices are studied, mapped as necessary, and analyzed.

The fourth step is to determine the source of benchmarking best-practice information. A search and analysis is performed to determine the best-practice leaders in the selected area of study. There are several choices that can be considered:

Internal benchmarking. Looks at other teams, projects, functional areas, or departments within the organization for best-practice information.

Competitive benchmarking. Looks at direct competitors, either locally or internationally, for best-practice information. This information may be harder to obtain than internal information, but industry standards, trade journals, competitors' marketing materials, and other sources can provide useful data.

Functional benchmarking. Looks at other organizations performing the same functions or practices but outside the organization's own industry. For example, an information technology (IT) team might look for best practices in IT organizations in other industries. IEEE and ISO standards and the Software Engineering Institute's Capability Maturity Model Integration (CMMI) for Development are likely sources of information, in addition to talking directly to individual organizations.

Generic benchmarking. Looks outside the box. For example, an organization that wants to improve on-time delivery practices might look to FedEx, or an organization that wants to improve just-in-time, lean inventory practices might look to Wal-Mart or Toyota, even if the organization is not in the shipping, retail, or automotive arena.


In the fifth step of the benchmarking process, the best-practice information is gathered and analyzed. There are many mechanisms for performing this step, including site visits to targeted benchmark organizations, partnerships where the benchmark organization provides coaching and mentoring, research studies of industry standards or literature, evaluations of best-practice databases, Internet searches, attending trade shows, hiring consultants, customer surveys, and other activities. The objectives of this study are to:
• Collect information and data on the performance of the identified benchmark leader and/or on best practices
• Evaluate and compare the organization's current practices with the benchmark information and data
• Identify performance gaps between the organization's current practices and the benchmark information and data in order to identify areas for potential improvement and lessons learned

This comparative analysis is used to determine where the benchmark is better and by how much (a minimal gap-calculation sketch appears at the end of this section). The analysis then determines why the benchmark is better: what specific practices, actions, or methods result in the superior performance?

For benchmarking to be useful, the lessons learned from the best-practice analysis must be used to improve the organization's current practices. This final step in the benchmarking process involves:
• Obtaining management buy-in and acceptance of the findings from the benchmarking study
• Incorporating the benchmarking findings into business analysis and decision making
• Creating a plan of specific actions and assignments to adapt (tailor) and adopt the identified best practices to fill the performance gaps
• Piloting those improvement actions and measuring the results against the initial values of the identified key factors to monitor the effectiveness of the improvement activities
• For successful improvement activities, propagating those improvements throughout the organization
• For unsuccessful pilots or propagations, taking appropriate corrective action

Once the selected area has been successfully improved, lessons learned from the benchmarking activities should be leveraged into the improvement of future benchmarking activities. The benchmarking process can also be repeated to consider improvements for other areas from the prioritized list created in the first step, and of course this prioritized list should be updated as additional information is obtained over time. Benchmarking must be a continuous process that not only looks at current performance but also continues to monitor key performance indicators into the future as industry practices change and improve.
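The gap analysis in this comparative step amounts to comparing the current values of the key performance factors measured in step 3 against the values observed for the benchmark. A minimal Python sketch follows; the metric names, values, and "higher is better" flags are hypothetical illustrations, not data from this handbook.

```python
# Hypothetical key performance factors: current practice versus benchmark values.
current = {"defect removal efficiency (%)": 82.0, "release cycle time (weeks)": 26.0}
benchmark = {"defect removal efficiency (%)": 95.0, "release cycle time (weeks)": 12.0}
higher_is_better = {"defect removal efficiency (%)": True, "release cycle time (weeks)": False}

def performance_gaps(current, benchmark, higher_is_better):
    """Return, per metric, the gap between benchmark and current performance.

    A positive gap means the benchmark outperforms the current practice,
    flagging that metric as a candidate improvement area.
    """
    gaps = {}
    for metric, current_value in current.items():
        benchmark_value = benchmark[metric]
        if higher_is_better[metric]:
            gaps[metric] = benchmark_value - current_value
        else:
            gaps[metric] = current_value - benchmark_value
    return gaps

for metric, gap in performance_gaps(current, benchmark, higher_is_better).items():
    status = "improvement candidate" if gap > 0 else "at or above benchmark"
    print(f"{metric}: gap = {gap:+.1f} ({status})")
```

The same current-value baseline can be reused after the pilot in the final step to check whether the adopted practices actually closed the gaps.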


Index

A
acceptance criteria, 87, 26263, 416, 426, 506 acceptance testing, 416, 477 access control, 200, 300 accessibility, 150, 421 accessor operations, in object-oriented analysis and design, 199 accident, IEEE definition, 302 accountability, 15, 26, 140, 300 accuracy of data, 36667 metric, 339 quality attribute, 158 acquisition points, in software configuration management identification, 5056 acquisition process, in outsourcing, 8087 activity diagrams, 172, 178 activity network, 24445, 382 actor, versus user, 16667 ACWP (actual value), 26769 adaptive software maintenance, 214 adjourning, 52 affinity diagrams, 5758, 37879 Agile Alliance, 207 Agile Manifesto, 2078 agile project metrics, 25152 agile software development methods, 2068 allocated baseline, in software configuration management identification, 512 alpha testing, 477 analysis, in requirements coverage, 428 analytical techniques, 36084 anomaly report, 47172 appraisal cost of quality, 90 architectural design, 135, 146, 191, 195, 196, 203, 226, 270, 386, 513, 523 architectural design review, 275 archival processes, 55860 area graphs, 355

arrival rates, metric, 332 artistic testing, 42324 ASQ Code of Ethics, 1012 assemblers, 487 assumption, 81, 171, 286, 288, 33738, 408 attack, on software, 157, 200201, 299301 attribute, of entity, 308 audit, definition, 107 audit checklists, 121 audit corrective action, 12627 audit criteria, 11920 audit execution, 12225 audit follow-up, 12627 audit initiation, 11720 audit objectives, 1078 audit plan, 12021 audit planning, 12021 audit preparation, 12122 audit process, 11627 steps, 11617 audit program, 108 audit purpose statement, 11819 audit report, 125 audit reporting, 12526 audit scope, 119 audit team lead, 11415 audit team meetings, 12425 auditee management, 115 auditees, 116 auditor, lead, 11415 auditor management, 11314 auditors, 115 audits, 10727 consequences of, 1089 roles and responsibilities, 11316 types, 10913 author (inspection team member), 456 author assignment, 504 authority and responsibility, in software configuration management identification, 504



automation of software builds, 514 test, 4023 availability metric, 338 quality attribute, 157

B
backup library, 491 backups, 55960 backward traceability, 192 in impact analysis, 530 bar charts, 35254 base measures, 31213 baseline metadata, 541 baselines, in configuration identification, 51113 basic quality, in Kano model, 56 basis path testing, 43637 BCWP (earned value), 26869 BCWS (planned value), 26769 behavioral patterns, 200 benchmarking, organizational and process, 79 best practices, identifying in audits, 126 beta testing, 47778 bidirectional traceability, 19094 in impact analysis, 530 big design up front (BDUF), 210 black-box testing, 400401 in regression testing, 424 Body of Knowledge, Software Quality Engineer Certification (Appendix A), 56171 bottom-up gray-box testing, 399400 boundary value analysis, 407 boundary value testing, 436 box charts, 355 box-and-whisker diagrams, 355 boxes, in cleanroom engineering, 212 brainstorming, 55 branch coverage, 434 branching, software configuration management library process, 49597 budget in project planning, 240 in project tracking and controlling, 265 build metadata, 541 build scripts, 488

builds. See software build burn charts, in crystal, 252 burn-down chart, 252 burn-up chart, 252 business analysts, 182 business objectives, in project deployment, 255 business requirements, 155 business requirements document (BRD), 160 business rules, in requirements engineering, 157

C
calendar time, 233 Capability Maturity Model Integration (CMMI) for Development, SEI, 2124 in corrective action process, 101 capacity, metric, 339 capital costs, in project planning, 240 cardinality, in entity relationship diagrams, 174 cause-and-effect diagram, 37071 causeeffect graphing (CEG), 40913 CD keys, 477 centerline, in control charts, 376 central limit theorem, 31415 certification testing, 416 change in software configuration management identification, 504 types of, 26 change agent, 27 change authority board (CAB), 52433 change control, 51820, 52627 change control board (CCB), 52433 in requirements change management, 190 in requirements management, 183 change management organizational, 2628 in requirements management, 183 tools, 48889 change request, 47172 charts, for metrics reporting, 35155 check sheets, 37172 check-in, 492, 534 checklist, 6971 audit, 121 standardized, 550 check-out, 494, 534 checksums, 476


chief programmer work packages, in feature-driven development, 139 class, in object-oriented analysis and design, 198 class diagrams, 172, 17778 cleanroom engineering, 212 clear-box testing, 39697 client, audit, 113 clientserver architecture, 14849 closing meeting, of an audit, 114, 115, 122, 125 cluster sampling, 362 coaching, 3132 COBOL programming language, 145 COCOMO. See constructive cost model code and tests, 211 code coverage techniques, 43237 codeline, 495498, 53536 cohesion, 196, 201 collaboration platforms, 15051 colors, use of in graphs and charts, 355, 356 common cause variation, 378 commercial off-the-shelf (COTS) software in software configuration management identification, 505 testing of, 42526 communication in requirements elicitation, 162 types of, 4041 communication plans, 232 communication skills, 4047 competitive benchmarking, 8 compilers, 487 complex metrics, 31314 complexity, 137, 139, 168, 199, 2013, 238, 290, 310, 321, 323, 529 concept of operations document, 160 concurrent development, 53439 condition coverage, 43435 configuration audits, 54450 configuration components, 5012 configuration control, 51639 configuration control board (CCB), 100, 51920, 52433 member roles and responsibilities, 52526 multiple levels of, 53033 processes, 52628 configuration identification, 500515 baselines, 51113 managerial factors, 5045 methods (schema), 50611 technical factors in, 5034 configuration infrastructure, 48299

configuration items, 500513 dependencies, 523, 541 interrelationship data, 541 metadata, 541 configuration management deviations and waivers data, 541 configuration management team, 48286 configuration management tools, 48689 configuration status accounting, 53943 configuration status reporting, 54243 configuration units, 5012 conflict, management and resolution, 3437 conflicts of interest, 1012 conformance, 3, 112, 119, 544 consistency, in measurement validity, 310 constructive cost model (COCOMO), 236 constructive cost model version 2 (COCOMO II), 23639 effort multiplier cost drivers, 23839 constructor operations, in object-oriented analysis and design, 199 context-free questions, 46 contingency plans, in risk management, 297 continuous integration, in extreme programming, 210 contract objectives, in project deployment, 255 contract requirements, defining, in outsourcing, 83 contracts, 12 negotiating and awarding, in outsourcing, 8485 contractual risks, 290 control charts, 37678 control limits, 376 controlled library, 49091 controlled test environments, 44041 controlled work product, modifying, 49394 conversion (tort), 13 copyrights, 13 corrective action audit, 12627 procedures, 99104 process, 1014 project, 26566 request, 1012 corrective release, 551 corrective software maintenance, 21314 correctness, 3, 184, 212, 339, 388, 431, 480 correlation, 37273 in measurement validity, 310


cost process metric, 340 project, 233, 240 cost/benefit analysis, 83, 134, 295, 306, 312 cost drivers, in the constructive cost model, 23839 cost objectives, in project deployment, 255 cost of quality (COQ), 8993 cost performance index (CPI), 268 cost variance, 268, 269 COTS. See commercial off-the-shelf (COTS) software counting criteria, 3089, 31213 coupling, 196 creational patterns, 200 critical design review, 276 critical path method (CPM), 24648 critical-to-quality (CTQ) characteristics, 1056 Crosby, Philip, 3 cross-functional teams, 48 crystal (agile methodology), 25152 customer, audit, 113 customer deliverables, 47478 customer impact metrics, 34145 customer satisfaction, metrics, 34143 customer/user testing, 47778 customers in requirements management, 181 as stakeholders, 7276 cycle time, process metric, 34041 cycle time objectives, in project deployment, 255 cyclic redundancy checks (CRC), 476 cyclomatic complexity, 310, 32829, 334 in basis path testing, 43637

D
daily deployment, in extreme programming, 211 dashboards, metrics, 35657 data high-quality, ensuring, 366 how to collect, 365 software configuration management, types, 541 test results, 472 who should collect, 36365 data accuracy, 36667 data availability time, 36768

data collection, 36368 reasons for, 306 and recording, in configuration status accounting, 53942 data collection time, 367 data completeness, 367 data domain coverage, 429 data extraction time, 368 data flow diagram (DFD), 172, 173 data flow modeling, in structured analysis and design, 197 data integrity, 36368 data modeling, in structured analysis and design, 197 data owner, role in data collection, 36365 data privacy, 14 data reporting time, 368 data requirements, 159 data tables, 35152 data timeliness, 36768 data-driven testing, 400401 date and time domain coverage, 429 debug, 90, 91, 100, 216, 332, 333, 441, 442, 445, 490, 514, 540 decision coverage, 434 decision criteria, in metric, 356 decision trees, 173 decomposition, in risk identification, 287 decoupling, 201 defect containment, 34849 defect density, 33031 defect detection, 34548 defect detection effectiveness, 34951 defect detection efficiency, 34951 defect prevention, 1046 defect removal efficiency (DRE), 34951 defect report, 47172 defects, and software quality, 4 deliverables, 6869 customer, 47478 tracking, 27072 delivery, software, 55557 Delphi technique, 23334 Deming, W. Edwards, 3 Deming circle, 93 demonstration, in requirements coverage, 42728 denial of service, 299 departmental audit strategy, 120 dependencies configuration item, 523 hardware, 557


software, 55758 in software configuration management identification, 504 dependent variable, 372 depth metric, 32930 derived measures, 31314 design, quality attributes and, 200202 design constraints, 159 design for Six Sigma, 9697 design methods, 195200 design patterns, 199200 designers, in requirements management, 182 desk audits, 113 desk checks, 45254 versus inspections and walk-throughs, 45051 destructor operations, in object-oriented analysis and design, 199 detailed design, 97, 124, 135, 146, 182, 195, 196, 27071, 276, 386, 523 detailed design review, 276 developers, in requirements management, 182 development audits, of customer deliverables, 475 development library, 490 development testing, of customer deliverables, 475 developmental baselines, in software configuration management identification, 513 deviation, 26265, 269, 541 direct metrics, 31213 direct-access media, 557 disaster recovery plan, 559 discovery method audit strategy, 120 discriminative power, in measurement validity, 310 distributed work environments, working in, 54 distributors, as stakeholders, 73 DMADV (define, measure, analyze, design, verify) model, 9697 DMAIC (define, measure, analyze, improve, control) model, 9496 document audit, 113 document control, 52021, 52728 document studies, in requirements elicitation, 164 documentation quality management system, 63 test execution, 46678

documentation hierarchy, in software configuration management identification, 504 documentation plan, 229 documentation writers, in requirements management, 18283 domain testing, 43536 drivers, test, 43940 duration, 233, 24648 dynamic analysis, 389 dynamic cycle time, 341 dynamic library, 490 dynamic reliability models, 337

E
early design, in COCOMO II model, 236 earned value, tracking, 26769 earned value management (EVM), 267 analysis, 232 effective listening, techniques, 4345 efficiency (quality attribute), 157 effort, project, 23233 effort multiplier cost drivers, in COCOMO II model, 23839 electronic transfer, in software delivery, 556, 557 element method audit strategy, 120 embedded system, 14647 energized work, in extreme programming, 208 engineering change board (ECB), 52433 engineering process group (EPG), 49, 103 engineering processes, systems and software, 129217 enhancement request, 215, 488, 526 entity, in measurement, 308 entity behavior modeling, in structured analysis and design, 197 entity relationship diagram (ERD), 172, 17476 entry criteria, 262 environmental factors, in project planning, 228 environmental load testing, 420 equity theory, of motivation, 31 equivalence class partitioning, 4067 equivalence class testing, 43536 error guessing, 408 error tolerance (quality attribute), 158 escapes (defects), 345


escort, 116 estimation and planning, in software configuration management identification, 504 estimation techniques, model-based, 23640 ethical compliance, 1012 ETVX method, of process mapping, 65 event, 178, 180 event/response tables, 173, 17880 evidence, objective definition, 12223 techniques for gathering, 12324 evolutionary change, 26 evolutionary development, 14345 evolutionary prototypes, 165 exception handling, 4089 exciting quality, in Kano model, 6 executable, 394, 474, 487, 488, 508, 513, 555 execution in requirements coverage, 428 in testing, 46768 exit criteria, 262 expectancy theory, of motivation, 31 expected quality, in Kano model, 6 expert-judgment estimation techniques, 23335 explicit knowledge, 29 explicit measures, 31213 exploratory audit, 120 exploratory testing, 42324 external attributes, of entity, 308 external audits, 11011 external failure cost of quality, 9091 external input, function type, 327 external inquiry, function type, 327 external interface file, 327 external interface requirements, 159, 160 external output, function type, 327 external validity, of metric, 310 extranet, 150 extreme programming (XP), 136, 20811 extrinsic motivation, 30

F
facilitated requirements workshops, 16364 facilitation skills, 3440 facilitator, 34 failure mode effects and criticality analysis (FMECA), 201, 302

failure report, 47172 fan-in metric, 330, 334 fan-out metric, 330, 334 fat client, in clientserver architecture, 14849 fault insertion, 408 fault seeding, 408 fault tolerance (quality attribute), 158 fault-error handling, 4089 feasibility, 165, 182, 287 feature release, 552 feature-driven development (FDD), 13840 field-testing, 47778 file systems, as users in functional testing, 41718 financial risks, 290 finding, 108, 109, 12526 firewall, 200, 300 firmware, 14647 first office verification, 47778 first-party audits, 10910 first-pass yield, process metric, 340 fishbone diagram, 37071 five whys method, 38384 flexibility (quality attribute), 158 flowcharts, 36870 focus groups, in requirements elicitation, 16263 follow-up, audit, 12627 follow-up audits, 11213 force field analysis, 60 forecasts, 23233 formal methods, 211 forming, phase of team development, 51 forward traceability, 191 in impact analysis, 530 fraud (tort), 14 full release, 554 function points, 32628 functional baseline, in software configuration management identification, 512 functional benchmarking, 8 functional characteristics, of configuration items, 505 functional configuration audit (FCA), 544, 545 functional coverage, 428 functional requirement, 152, 15556, 158, 160, 409, 416 functional testing, 400401, 41620


G
Gantt charts, 26364 generic benchmarking, 8 Gilbs risk principle, 282 glass-box testing, 39697 goal/question/metric paradigm, 32425 gold-plating, 161 good-enough software, 402 graphs, for metrics reporting, 35155 gray-box testing, 397400 group dynamics, 52 grouped bar charts, 35254 groups, diverse, working with, 5254 guidelines, 17, 69 guides, 17

H
hackers, 74, 155, 300 haphazard sampling, 362 hardware dependencies, 557 in functional testing, 417 in test beds, 438 harnesses, test, 440 hash sums, 476 Hawthorne effect, 319 hazard analysis, 3013 high-level design review, 275 histograms, 37576 horizontal prototypes, 165 human factors, in metrics and measurement, 322 human factors studies, 17172 humans, as users in functional testing, 417

I
IDEAL model, SEI, 101 IEEE Computer Society, 19 IEEE software engineering standards, 1921 impact analysis, in configuration control, 52830 incremental change, 26 incremental deployment, in extreme programming, 211 incremental design, in extreme programming, 210

incremental development, 14042, 144 incremental requirements development, 154 independent test team, 400 independent variable, 372 information hiding, 147, 199 information radiators, in crystal, 25152 informative workspaces, in extreme programming, 208, 251 infrastructure plans, 231 inheritance, in object-oriented analysis and design, 198 initiator, audit, 113 in-process software configuration management audits, 54445 input, 65 input/output-driven testing, 400401 inspection, in requirements coverage, 427 inspection leader (audit team member), 45657 inspection package, 45960 inspections versus desk checks and walk-throughs, 45051 inspection meeting step, 46163 kickoff meeting step, 45859, 460 planning step, 45860 post-meeting steps, 464 preparation step, 46061 process and roles, 45658 and reviews, 44465 soft skills for, 46465 inspectors (inspection team members), 458 installability (quality attribute), 158 installation, software, 55557 installation testing, 477 instrumentation, in test beds, 438 integrated master schedules, 26465 integrated product team (IPT), 8687 integration complexity, 334 integration testing, 415 integrity (quality attribute), 157 intellectual property rights, 1213 interaction diagram, 178 interface, 3, 82, 146, 147, 159 interface coverage, 430 intermediate installation media, 55657 internal attributes, of entity, 308 internal audits, 10910 internal benchmarking, 8 internal failure cost of quality, 90


internal logical file, function type, 327 internal validity, of metric, 310 International Function Point Users Group (IFPUG), 327 International Organization for Standardization (ISO), 17 internationalization, 555 internationalization configuration coverage, 43032 internationalization testing, 430 Internet, 14950 interoperability (quality attribute), 157 interrelationship digraph, 38182 interval scale measurement, 318 interviews, 4547 in requirements elicitation, 162 intranet, 15 intrinsic motivation, 30 Ishikawa diagram, 37071 ISO 9000 standards, 1718 ISO/IEC 9126 standard series, 3 ISO/IEC 14598 standard series, 3 ISO/IEC 25000 Software Engineering Software Product Quality Requirements and Evaluation (SQuaRE) standard series, 3, 158 item changes, tracking, 52122 iterative model, of software development, 136, 142

J
Jelinski-Moranda (JM) model, 33738 joint application design/development (JAD), 163 judgmental sampling, 36263 Juran, Joseph M., 3

labor costs, in project planning, 240 lead auditor, 11415 leaders, effective, qualities of, 2526 leadership definitions, 25 organizational, 2634 situational, 3234 styles of, 3334 leadership skills, 2547 lean client, in clientserver architecture, 148 lean techniques, 9799 legal compliance and issues, 1215 legal risks, 290 lessons learned in benchmarking, 9 in corrective action process, 101 in document studies, 164 engineering process groups and, 49 in inspections, 462463 as process assets, 228 in projects, 279280 and software configuration managment, 483, 485, 550 and standard processes, 64 in the test report, 473 and V&V techniques, 221 level of effort work activities, 220 library processes, 48999 license keys, 477 licenses, software, 13 life cycles, software development, 13045 line coverage, 433 line graphs, 352 lines of code (LOC), 326 linkers, 488 load/volume/stress testing, 420 loaders, 488 localization, 555 localization testing, 430

K
Kano model, of quality, 56 Kiviat chart, 35759 knowledge, types of, 2829 knowledge transfer, 2829

M
magnitude, in software configuration management identification, 504 maintainability metric, 339 quality attribute, 158, 202 maintenance, software management, 21317 strategy, 21517

L
labeling, 5089


types, 21315 major milestone reviews, 27577 major nonconformance, in audit, 125 malicious code, 300 malpractice (tort), 14 management, and peer reviews, 44849 management reviews, 27879 management risks, 290 management system objectives, in project deployment, 255 managers, and peer reviews, 44647 manufacturing, verification and validation activities for customer deliverables, 47677 manufacturing perspective, of quality, 3 mapping system, for measurement, 308, 312 marketing requirements document (MRD), 160 Maslows hierarchy of needs, 3031 master library, 49091 mathematical proofs, 388 matrix diagrams, 37981 McCabes cyclomatic complexity. See cyclomatic complexity mean calculating the statistical mean, 315 time to change , 339 time to failure, 212, 337 time to fix, 310, 339 measurement definition, 3089 dos and donts, 32021 measurement error, 366 measurement method, 3089 measurement theory basic, 31418 and metrics, 30622 terminology, 30714 median, 315 meeting management, 3840 meetings, audit team, 12425 mental execution, in desk checks, 45253 mentoring, 3132 merging, software configuration management library process, 49899 messaging system architectures, 150 method, in object-oriented analysis and design, 199 methodologies, for quality management, 89106

metrics customers of, 32324 definition, 3078 dos and donts, 32021 and measurement theory, 30622 psychology of, 31922 software, 32539 terminology, 30714 test results, 472 metrics primitives, 31213 metrics providers, role in data collection, 36465 metrics reporting tools, 35159 migration, software, 216 milestone objectives, in project deployment, 255 minor nonconformance, in audit, 125 mock-ups, 165 modality, in entity relationship diagrams, 17576 mode, 315 model-based estimation techniques, 23640 models, 1624 definition, 17 moderator (inspection team member), 45657 modifier operations, in object-oriented analysis and design, 199 motivation, 2931 theories of, 31 types of, 30 multicultural environments, working in, 47 multiple condition coverage, 435 multitier architecture, 14748 multivoting, 58

N
negative correlation, 372 negative risks, 282 negligence (tort), 14 negotiate scope, 21011 negotiation, techniques, 3738 Nichols, Dr. Ralph, 43 nominal group technique (NGT), 56 nominal scale measurement, 31617 nonconformances, in audits, 125 nonfunctional requirements, 15859 nonverbal listening, 45 norming, phase of team development, 52 n-tier architecture, 14748


O
Object Management Group (OMG), 17, 172 objective evidence definition, 12223 techniques for gathering, 12324 objectivity, 108 object-oriented analysis and design (OOAD), 19899 objects, in object-oriented analysis and design, 198 observation, in audit, 12526 offshore, 79 Open Web Application Security Project (OWASP), security principles, 301 open-ended questions, 4546 opening meeting, of an audit, 122 operating system, as user in functional testing, 417 operational profile testing, 41920 operational testing, 478 operations, in object-oriented analysis and design, 19899 operator documentation, 18283, 276, 404, 523 opportunities, in risk management, 282 oral communication, 42 ordering, verification and validation activities for customer deliverables, 47677 ordinal scale measurement, 31718 organizational change management, 2628 organizational leadership, 2634 organizational objectives, in project deployment, 255 organizational process assets, in project planning, 228 output, 6869 outsourcing, 7988 acquisition process, 8087

P
packaging, 553, 554 pair programming, in extreme programming, 20910 Pareto analysis, 369 Pareto chart, 36970 Pareto principle, 369 partial release, 554 patch release, 551, 554

patents, 12 path, 433 peer reviewers, selecting, 44649 peer reviews benefits of, 44546 of customer deliverables, 475 definition, 444 formal versus informal, 44950 objectives of, 44445 risk-based, 45152 soft skills for, 46465 types of, 45051 what to review, 446 perfective software maintenance, 214 performance (quality attribute), 157 performance testing, 420 performing, phase of team development, 52 personnel objectives, in project deployment, 255 personnel risks, 290 Peters, Tom, 2 phase containment effectiveness, 348 phase gate reviews, 27577 phase transition control, 26166 phase transition reviews, 27577 phase-end reviews, 27577 phases of a life cycle, 130, 13435 physical characteristics, of configuration item, 505 physical configuration audit (PCA), 544, 54748 pie charts, 352 piloting, dynamic analysis technique, 389 pilots, of customer deliverables, 475 plandocheckact (PDCA) model, 93 planning project, 22540 in quality management system, 7678 planning and estimation, in software configuration management identification, 504 planning poker technique, 234 platform configuration coverage and testing, 430 polar chart, 35759 polymorphism, 199 portability (quality attribute), 158 positive correlation, 372 positive risks, 282 post-architecture, in COCOMO II model, 236 post-conditions, in use cases, 168


post-mortems, 27980 post-project reviews, 27980 preconditions, in use cases, 168 predictability, in measurement validity, 310 predictive validity, of metric, 310 preferred supplier relationship, 8788 preliminary design review, 275 presenter (inspection team member), 457 prevention cost of quality, 90 preventive action, 104 preventive software maintenance, 21415 prioritization graph, 5960 prioritization matrix, 5859 problem report backlogs, 33334 problem reports responsiveness metrics, 34345 as test execution documentation, 47172 process, definition, 64 process architecture, 69 process assets, organizational, in project planning, 228 process audits, 112 process capability metrics, 351 process change control board (PCCB), 103 process entities, 33940 process flow diagram, 6569, 173 process improvement models, 9399 process improvement opportunities, in audits, 126 process improvement plan, 229 process mapping, 6569 process measurement, 32359 process metrics, 33951 process models, software development, 13045 process owner teams, 49 process stakeholders, 74 processes project-specific, 7172 standardized, and quality management system, 6469 tailored, 7172 process-type cause-and-effect diagram, 370 procurement objectives, in project deployment, 255 product acceptance, 87, 426 product acceptance plan, 229 product architecture, in software configuration management identification, 503 product attributes, 15859

product audits, 112, 389 product backlog, in scrum, 249 product baselines, in software configuration management identification, 513 product distribution, and release, 55158 product evaluation, software, 39093 product functional requirements, 15657 product keys, 477 product liability (tort), 14 product limitations, in requirements elicitation, 161 product measurement, 32359 product owner, in scrum, 248 product perspective, of quality, 3 product problem resolution, 99101 product quality and reliability objectives, in project deployment, 255 product release, 55158 and distribution, 55160 product releases, types of, 55152 product scope, in requirements elicitation, 161 product stakeholders, 7273 product technical objectives, in project deployment, 255 product vision, in requirements elicitation, 161 production baseline, in software configuration management identification, 513 productivity, tracking, 272 program, definition, 28081 program evaluation and review technique (PERT), 23435 program reviews, 28081 programmers library, 490 project, definition, 220 project audits, 112 project charter, 221, 22728 project closure, 222 project closure plan, 229 project communications, 25657 project controlling, 25881 project corrective action, 26566 project deployment, 25357 project drivers, 22324 project estimation, 23233 project execution, 222 project forecasting, 23233 project initiation, 221 project management, 219303


project manager, in requirements management, 183
project metrics
  agile, 251–52
  selecting, 274
project monitoring and controlling, 222
project objectives, 254–55
project organization plans, 230–31
project planning, 221, 225–40
  goals for, 224
project plans, 225–26
project reviews, 274–80
project scheduling, 240–52
project stakeholders, 73
project team reviews, 277–78
project tracking, 258–81
  methods, 267–74
project vision and scope document, 160
proof of correctness, 388
proofs of concept, 165
prototypes, in requirements elicitation, 165
prototyping, in COCOMO II model, 236
Public Company Accounting Reform and Investor Protection Act of 2002, 14–15

Q
quality
  definitions of, 2
  perceptions of, 2–4
quality action teams, 48–49
quality analysis tools, 368–84
quality assurance, xxi–xxii
quality attributes, 157–58
  and design, 200–202
quality circles, 48–49
quality control, xxii
quality council, 48
quality engineering, xxi, 97, 98
quality function, in requirements management, 182
quality gates, 262–63
quality goals, 62–72
quality improvement teams, 48–49
quality management, methodologies for, 89–106
quality management system (QMS), 62–88
  documentation, 63
quality management teams, 48
quality objectives, 62–72
quality plans, product-level, 71
quality policies, 64
quality principles, 29
quality records, 69, 560
quarterly cycle, in extreme programming, 208–9
questions, types of, 45–46

R
race condition, 400, 409
radar chart, 357–59
random audit, 120
random sampling, 361
range, 315
ranked-choice method, 58
rapid application development (RAD), 142
ratio scale measurement, 318
Rayleigh curve, 239–40
Raytheon Electronic Systems (RES), cost of quality example, 91–92
reader (inspection team member), 457
ready-to-release review, 276–77
ready-to-ship review, 276–77
ready-to-test review, 510
real customer involvement, in extreme programming, 210
recognition, in motivation, 30
recorder
  configuration control board role, 525
  inspection team member, 457
  team, 50–51
recovery, 300, 559, 483, 485
reengineering, 204
refactoring, in software development, 136–38
reflection, in extreme programming, 280
regression analysis, 424
regression testing, 424–25
regulations, 17
regulatory compliance and issues, 12–15
reinforcement theory, of motivation, 31
release backlog, in scrum, 250–51
release management data, 541
release notes, 523, 554
release packaging, 554–55
release planning and scheduling, 552–54
release support, 558
reliability
  in measurement, 309–12
  quality attribute, 157, 201


reliability metric, 337–38
replication, software, 555–56
reproducibility, of software builds, 514–15
request for proposal (RFP), 82
requirement, definition, 152
requirements
  not testable, in requirements coverage, 428
  prioritizing, 188–89
  types, 155–60
requirements allocation and traceability, in software configuration management identification, 504
requirements analysis, 154, 172–80
requirements analysts, 182
requirements baseline, 154–55
requirements change management, 190
requirements churn, 161, 334–37
requirements coverage, 427
requirements development, 153–54
requirements elicitation, 153–54, 160–72
  miscellaneous techniques, 164–65
requirements engineering, 152–80
  process, 152
requirements evaluation, 184–89
requirements management, 155, 181–94
  participants, 181–83
requirements management plan, 229
requirements review, 275
requirements specification, 154, 160
requirements validation, 154
requirements volatility, 334–37
resource plans, 230
resource risks, 290
resource utilization, metric, 339
resource utilization testing, 421
resources, tracking, 273
response time, metric, 339
responsibility and authority, in software configuration management identification, 504
retirement, software, 217
retrospectives, 279–80
return on investment (ROI), 249, 290, 295, 323
reusability (quality attribute), 158, 202
reuse
  software, 203–5
  in software configuration management identification, 503–4
reverse engineering, 204–5
review, definition, 444

reviews, and inspections, 444–65
revision, definition, 523
rewards, in motivation, 30
rework, reduction of, in software quality, 4–5
rich client, in client-server architecture, 148–49
risk, definition, 283
risk analysis, 290–92
risk exposure (RE), 291
risk identification, 286–90
risk loss, 291
risk management, 282–303
  methods, 285–99
  planning, 293–97
  process, 285–86
risk management plan, 229, 285–86
risk mitigation action, 285, 295, 297
risk probability, 291
risk reduction leverage (RRL), 295–97
risk response strategies, 285
risk statement, 289
risk taxonomy, 288
risk tracking, 298–99
risk-based peer reviews, 451–52
risk-based testing, 401
risk-based verification and validation, 391–93
risk-handling options, 285
risks
  in risk management, 282
  software, types of, 290
robustness (quality attribute), 158
root cause analysis, 382–84
run charts, 373–75

S
safety (quality attribute), 157, 201
safety analysis, 301–3
safety-critical software, 302
sampling, 360–63
Sarbanes-Oxley Act (SOX), 14–15
Satir change model, 27–28
scatter diagrams, 372–73
schedule objectives, in project deployment, 255
schedule performance index (SPI), 268–69
schedule variance, 268–69
scheduling, project, 240–52
scope, in software configuration management identification, 504


scope creep, 334–37
screener, configuration control board role, 525
scribe. See recorder
scrum, 248–51, 278
scrum master, 248–49
scrum team, 248–49
S-curve run chart, 374–75
second-party audits, 110–11
security
  designing for, 200–201, 201–2, 300
  quality attribute, 157, 200–201
security coverage, 430
security hole, 300
security principles, OWASP, 301
security risks, software, 299–301
security testing, 430
SEI. See Software Engineering Institute
sequence diagrams, 172, 178
service level agreement, 343, 558
service mark, 13
service pack, 551
set-based development, 98
shared code, in extreme programming, 211
Shewhart cycle, 93
shipping, verification and validation activities for customer deliverables, 476–77
simulation testing, 402
simulators, in test beds, 438
single code base, in extreme programming, 211
site installation history data, 541
situational leadership, 32–34
Six Sigma, process improvement model, 94–97, 101
Six Sigma teams, 48–49
size metrics, 326–28
slack
  in activity networks, 244
  in extreme programming, 209
SLIM. See software life cycle management model
slim client, in client-server architecture, 148
soft skills, for peer reviews and inspections, 464–65
software analysis, 195–212
Software and Systems Engineering Standards Committee, IEEE, 19
software applications, as users in functional testing, 418

software architecture. See architectural design
software build, 487, 513–15
  creating, 492
  testing, 493
software configuration auditing, 481
software configuration control, 480, 516–43
software configuration identification, 480
software configuration management (SCM), 479–560
  in audits, 119
  data, types, 541
  definition, 479–80
  organizational-level group roles and responsibilities, 483
  project data, 541
  project-level group roles and responsibilities, 483–85
  risks associated with lack of good SCM practices, 481
  special roles and responsibilities, 485–86
software configuration management audits, 544–50
software configuration management build tools, 487–88
software configuration management change management tools, 488–89
software configuration management librarians, 485
software configuration management libraries, 489–90
software configuration management library processes, 489–99
software configuration management library tools, 486
software configuration management managers, 485
software configuration management plan, 229
software configuration management planning and management, 481
software configuration management status accounting tools, 489
software configuration management toolsmiths, 485
software configuration management version control tools, 486–87
software configuration status accounting, 480
software delivery, 555–57
software dependencies, 557–58


software design, 195–212
  steps in, 196
software design specification (SDS), 194
software development, 195–212
  methods, 206–12
  tools, 205
software development life cycles, 130–45, 228–29
software development process models, 130–45
software development processes, 228–29
Software Engineering Institute (SEI), 21
  Capability Maturity Model Integration (CMMI), 21–24
software engineering processes, 129–217
software installation, 555–57
software license, 13
software life cycle management (SLIM) model, 239–40
software maintenance
  management, 213–17
  types, 213–15
software maintenance strategy, 215–17
software metrics, 325–39
  and analysis, 305–84
software migration, 216
software practitioners, in software configuration management, 486
software product baseline library, 491
software product entities, 325–26
software product evaluation, 390–93
software product partitioning, 501–5
software quality, benefits of, 4–6
software quality assurance plan, 229
Software Quality Engineer Certification, Body of Knowledge (Appendix A), 561–71
software quality management, 61–127
software replication, 555–56
software repository, 491
software requirements, versus system requirements, 159–60
software requirements specification (SRS), 160
software retirement, 217
software reuse, 203–5
software risks, 290
software security risks, 299–301
software testing, levels of, 414–16
software tests, 414–25
software tools, in test beds, 438

software verification and validation (V&V), 385–478
  methods, 388–90
  risk-based, 391–93
  sufficiency, 389
  task iteration, 389–90
  theory, 386–93
software work product, creating new, 491–92
source code, 87, 195, 203, 204–5, 341, 436–37, 487, 506, 508, 513, 530–33, 555
  V&V techniques for, 391
special cause variation, 378
spider chart, 357–59
spiral model, of software development, 133–35
sponsor, 27, 50, 81
sprint, in scrum, 249–51
sprint backlog, in scrum, 250–51
sprint planning meeting, in scrum, 249–50
sprint retrospective, in scrum, 251
sprint retrospective meeting, 280
sprint review, in scrum, 251
sprint review meeting, 277
stacked bar charts, 354
staff turnover, 273
staffing, tracking, 273
staffing plans, 230
stakeholder functional requirements, 155–56
stakeholder participation plans, 74–76
stakeholders, 72–76
  benefits of identifying and involving, 74
  identifying, in project initiation, 221
  needs and motivations, 76
  in requirements elicitation, 161
  in requirements management, miscellaneous, 183
standard deviation, 94, 315–16
standardized checklists, 550
standardized processes, and quality management system, 64–69
standardized work instructions, 69–71
standards, 16–24
  definition, 16
state coverage, 428–29
state transition, 176
state transition diagrams and tables, 172, 176
statement coverage, 433
static analysis, 388
static cycle time, 340–41
static library, 491
static reliability models, 337


statistics, basic, 314–18
status accounting, 539–43
status reporting, 542–43
stoplight charts, 356
stories, in extreme programming, 208
storming, phase of team development, 51–52
storyboards, in requirements elicitation, 171
stratified sampling, 361–62
stress testing, 420
strict product liability (tort), 14
structural complexity, 329–30
structural patterns, 200
structural testing, 396–97
structured analysis and design (SAD), 196–97
stubs, 438–39
subsidiary project plans, 229–30
supplier audits, 110–11
supplier components, tests of, 425–26
supplier management, in outsourcing, 85–87
supplier management plan, 229
supplier products, tests of, 425–26
supplier qualification audit, 111
supplier surveillance audit, 111
suppliers
  identifying and evaluating potential, in outsourcing, 82–83
  selecting, in outsourcing, 83–84
  as stakeholders, 73
support, software release, 558
supportability (quality attribute), 158
system analysts, 182
system architecture, 146, 159, 196
system audits, 112
system library, 490–91
system performance metrics, 338–39
system requirements, versus software requirements, 159–60
system testing, 415
system verification matrix, 427
systematic sampling, 361
systems architecture, 146–51
systems engineering processes, 129–217

T
tacit knowledge, 28–29
tailoring, 67, 71–72
task, in process definitions, 65
team champion, 49–50

team colocation, in extreme programming, 208
team continuity, 210
team facilitator, 50
team leader, 50
team management, 49–54
team skills, 48–60
team tools, 55–60
teams
  in extreme programming, 208
  roles and responsibilities, 49–51
  stages of development, 51–52
  types of, 48–49
technical reviews, 465
technical risks, 290
technical support, in requirements management, 183
technical writers, in requirements management, 182–83
templates, 69, 105, 443
10-minute build, in extreme programming, 210
test automation, 402–3
test beds, 438
test case review technique, in desk checks, 453
test cases, 468–69
test coverage, 426
  of code, 334
test coverage specifications, 426–32
test design, and planning, 394–443
test designs, 406–13
test diagnostic tools, 442
test drivers, 439–40
test environments, 438–41
  controlled, 440–41
test execution, 467–68
  documentation, 466–78
test harnesses, 440
test libraries, 440
test log, 470–71
test management tools, 443
test matrix, 427
test planning, and design, 394–443
test plans, 403–5
test procedures, 469–70
test readiness review, 276
test report, 472–73
test results, data and metrics, 472
test scenarios, 469–70
test scripts, 469–70


test strategies, 396–403
test summary report, 472–73
test tools, 441–43
test utilities, 441–42
testability, 445, 453
test-driven design (TDD), 136–38, 401
test-driven development (TDD), 136–38, 401
testers, in requirements management, 182
test-first programming, in extreme programming, 210
testing
  customer/user, 477–78
  definition, 394
  installation, 477
thick client, in client-server architecture, 148–49
thin client, in client-server architecture, 148
third-party audits, 111
thread, 419–20
  versus path, 433
throughput, metric, 338
throwaway prototypes, 165
tiger team, 49
time box, definition, 401
time-box testing, 401–2
tool administrators, in software configuration management, 485
tools
  configuration management, 486–89
  metrics reporting, 351–59
  quality analysis, 368–84
  software development, 205
  team, 55–60
  test, 441–43
  test diagnostic, 442
  test management, 443
  virus detection and removal, 488
top-down gray-box testing, 398–99, 400
tort, 13–14
total defect containment effectiveness (TDCE), 349
trace tagging, 194
traceability
  definition, 190
  in impact analysis, 530
traceability matrix, 192
tracing, audit technique, 124
tracking
  of item changes, 521–22
  in measurement validity, 310
trademarks, 13–14

training plans, 231
transcendental perspective, of quality, 2
tree diagrams, 379
trouble report, 471–72

U
Unified Modeling Language (UML), 172
Unified Modeling Standards Language, 17
unit test specification, 194
unit testing, 414
usability
  characteristics of, 421–22
  quality attribute, 157, 202
usability testing, 421–22
use case diagram, 167
use cases, in requirements elicitation, 166–71
user, versus actor, 166–67
user/customer testing, 477–78
user documentation, 182–83, 276, 404, 523
user perspective, of quality, 3
user requirements specification (URS) document, 160
user stories, in requirements elicitation, 166
users
  four classes, in functional testing, 417–18
  in requirements management, 181
  as stakeholders, 72–73

V
validation
  definition, 386
  of requirements, 154
validity, in measurement, 309–12
value adjustment factors, 327–28
value stream mapping, 98
value-based perspective, of quality, 4
variance, 316
verbal listening, 45
verification, definition, 386
verification and validation (V&V), software, 385–478
  of customer deliverables, 476–77
  methods, 388–90
  risk-based, 391–93
  sufficiency, 389
  task iteration, 389–90


  of test tools, 441
  theory, 386–93
verification and validation plan, 229, 393
verification step, in process definition, 65, 67–69
version, definition, 522
version control, 522–23
version description document (VDD), 523, 554
vertical prototypes, 165
virus detection and removal tools, 488
visibility, in software configuration management identification, 505
V-model, of software development, 132
  and testing, 394
volume testing, 420

W
walk-throughs, 455–56
  versus inspections and desk checks, 450–51
waste, in software development, 97–98
waterfall model, of software development, 130–32
web architecture, 149–50

weekly cycle, in extreme programming, 209
white-box testing, 396–97
  in regression testing, 424
wideband Delphi technique, 234
width metric, 330
wireless network access, 150
W-model, of software development, 132–33
work breakdown structure (WBS), 241–42
work instructions
  project-specific, 72
  standardized, 69–71
  tailored, 72
work product
  controlled, modifying, 493–94
  externally delivered, 502
  software, creating new, 491–92
  supplier, in software configuration management identification, 505
work product dispositions, 463
working library, 490
worst-case testing, 422–23
written communication, 43

Y
Yourdon/DeMarco symbols, for data flow diagrams, 173
