
ABC Co. Testing Plan

ATM Project Test Plan

Authors: Matthew Heusser, Tabrez Sait, Patrick M. Bailey

April 17, 2002


Test Plan Table Of Contents


Purpose
Assumptions
  Budgetary Assumptions
  .Net Technology Considerations
Budget
Organization
  Organizational Structure Chart
  Roles and Responsibilities
Methodology Approach
  Process Overview
    General Testing Guidelines
    Early Development of Test Cases
    Extreme Programming
  Definitions
Risk Analysis
Unit Test/Integration Test Plan
  Approach
    Unit Testing
    Interface Testing
  Setup and Configuration Requirements
  Participants
  Defect Handling
  Estimated Cost/Percentage of Budget
  Approvals
  Migration Requirement
Function Test Plan
  General
  Build Environment
  Function Test Environment
  Participants
  Estimated Cost/Percentage of Budget
System Test Plan
  Approach
    General
    Prioritization
    Actual Steps
    Note on Security Testing
    Manual Testing
    Application Automated Testing
  Setup and Configuration Requirements
  Migration of Code
  Participants
  Estimated Cost/Percentage of Budget
  Approvals
Acceptance Testing
Error Management
  Defect Prioritization
Regression Testing
  General
  Approach
Appendix A - System Test Budget Priority Analysis
Appendix B - Test Case Template
Appendix C - Defect Tracking Document
Appendix D - Works Cited


Purpose
This document provides the test plan for the ATM .Net project. It discusses the assumptions that went into the plan's formulation, the basic philosophy of testing, budgetary constraints, the roles and responsibilities that support testing, the phases of testing, and the approach taken in each phase.

Assumptions
The test plan is based on the ATM project being developed with Microsoft .Net technology. Accordingly, this section outlines budgetary assumptions and considerations for the .Net technology itself. The company performing the development is The ABC Co.
Budgetary Assumptions

Based on the requirements, the budget is derived from 140 function points at a cost of $2,500 each, for a total project budget of $350,000. This figure represents the total staffing costs for the project; it does not include other costs, such as hardware, which are billed directly to the customer. The specific breakdown of the testing budget appears later in the plan. It is further assumed that the company carrying out the test plan is a mature organization that has already invested in automated tools. This is a critical assumption: automated test tools require a high up-front investment that does eventually pay for itself, but considerable time and resources are spent training testers on the tool [4].
.Net Technology Considerations

Research was conducted on existing .Net projects to review the considerations in their test plans. Because .Net is a new technology, projects identified by Microsoft as examples were used as models for determining the configuration setup.

Budget
Based on the overall project bill of $350,000, the total budget for testing will be $87,500. In his book Software Project Survival Guide, Steve McConnell recommends that teams spend 20% of budget on system testing [19]. Because of the financial, mission-critical nature of the ATM project, ABC Co. plans to spend 25% of budget on testing/QA resources ($350,000 x 0.25 = $87,500). Assuming an average QA resource salary of $50,000 with a 25% markup for benefits, the testing phase totals roughly 16.8 man-months ($87,500 / ($50,000 x 1.25) x 12 months). This includes 6 man-months of maintenance/debugging (two programmers at three months each), 3 man-months for the QA project lead, 3 man-months for the configuration manager, and 3 man-months for the testers.

Controlling expenses: ABC Co. uses part-time interns as testers at $6 to $10/hour with no benefits, so those 3 man-months actually cover 5 part-time (20 hours/week) testers for three months. The expense of these five testers totals three months' salary of a single employee at $45K/year, which is less than the $50K allocated. Notice that the total man-month cost of the QA cycle as defined above is 15 man-months against the 16.8 budgeted; this allows for the possibility of cost or time over-runs.
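The arithmetic behind these figures can be restated as a short sketch (the constants are the ones given above; the program is illustrative only):

#include <cstdio>

// Reproduce the budget figures stated in this section.
int main() {
    const double functionPoints = 140;            // from the requirements
    const double costPerPoint   = 2500.0;         // dollars per function point
    const double projectBudget  = functionPoints * costPerPoint;  // $350,000

    const double testingShare   = 0.25;                           // 25% to testing/QA
    const double testingBudget  = projectBudget * testingShare;   // $87,500

    const double salary       = 50000.0;          // average QA salary
    const double loadedSalary = salary * 1.25;    // with 25% benefits markup
    const double manMonths    = testingBudget / loadedSalary * 12.0;  // ~16.8

    std::printf("Project budget: $%.0f\n", projectBudget);
    std::printf("Testing budget: $%.0f\n", testingBudget);
    std::printf("QA man-months:  %.1f\n", manMonths);
    return 0;
}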

Page 6

ABC Co. Testing Plan

Organization
Organizational Structure Chart

The chart below reflects the organizational structure of the development team assigned to this project. While it appears hierarchical in nature, the customer maintains a direct working relationship with the developers, per the XP paradigm [14].

Figure 1 - Organizational Chart

[Chart nodes: Project Manager; Full-Time Customer Liaison; Testing/QA Manager; Configuration Mgmt Manager; Development Manager; Developer; Tester]


Roles and Responsibilities

Testing requires support from everyone involved in the development process. This section outlines the roles necessary to ensure effective testing [3]. Each role's specific responsibilities in support of testing are outlined in the table below.

Table 1 - Roles and Responsibilities

Role: Project Manager
Description: Has overall responsibility for the management of the project.
Testing Responsibilities: Guides the implementation of the methodology on the project. Appoints the testing manager. Ensures a communication infrastructure exists to facilitate development and testing coordination.

Role: Testing Manager
Description: Provides testing planning and management.
Testing Responsibilities: Formulates testing policies. Oversees the development of testing plans. Oversees the definition of the test environment.

Role: Configuration Manager
Description: Manages the test environments and the migration of code between them.
Testing Responsibilities: Ensures application of the defined architecture to testing platforms. Coordinates the migration process to the testing and production environments. Oversees the implementation of the test environment.

Role: Development Manager
Description: Determines the development approach.
Testing Responsibilities: Assigns resources to support design and coding. Ensures that testing policy is followed by developers. Coordinates testing cycles with the testing manager.

Role: Tester
Description: Provides direct testing support.
Testing Responsibilities: Coordinates with developers to write test cases. Implements test cases. Records defects and reports back to developers. Approves or disapproves migration of code to the next phase in testing. Coordinates with the configuration management area to migrate code.

Role: Developer
Description: Analyst, designer, or coder who produces the product.
Testing Responsibilities: Develops the use cases and stories that support design and test cases. Prepares code for delivery to the function test environment. Reviews test results with the tester.


Methodology Approach
Process Overview
General Testing Guidelines

The steps in this testing plan are designed to ensure a smooth transition of code from the development environment to production use. The general approaches of most recommended testing plans have been considered [2][7][8][10][12]. Combined, they emphasize a standard approach to testing with the following steps: usability testing, unit testing, integration testing, function testing, system testing, acceptance testing, and regression testing. Testing starts with the developer and the customer verifying the interface early in development (usability testing). The developer writes the code to support the requirements and exercises it (unit testing). After this, the interfaces between modules are tested (integration testing), and then independent testing against requirements (function testing) is conducted. Exercising all components in a production-like environment (system testing), with final testing by the customer (acceptance testing), completes most of the process. The process also accounts for the need to retest software after ongoing fixes (regression testing).
Early Development of Test Cases

Test cases are developed early in the process [2][12]. To complement the Extreme Programming paradigm, discussed below, use cases will also be developed early in development. These describe the use of the system by an actor and form the basis for both design and test cases.
Extreme Programming

The most common method of software development is the waterfall method, which specifies a phased approach including plan, design, code, test, and release cycles. If a waterfall project is canceled during coding or earlier, it is probable that no working code was developed, and the entire project will yield no tangible results. Instead of the waterfall model, ABC Company prefers the Extreme Programming (XP) approach for its projects. Extreme Programming is a software method designed to create working software that solves a business need as soon as possible [1]. Under Extreme Programming, the developers sit with the customer and create lists of functionality on 3x5 cards, called stories. The customer prioritizes these stories; coders then create test cases for the software before they design and code it. This is one reason why the presence of the customer during testing is critical. XP processes also include paired design and coding sessions, which ensure that all code created is viewed by two people. After the software is created, it may be refactored, and it is then tested on a single continuous-integration machine. After copying the code, developers run the full suite of test-first unit tests. If the code passes all tests, the developers can check in the code and the unit tests. XP unit tests and builds are covered in more detail under unit testing, below.

Essentially, stories allow coders to begin development early, and an on-site customer ensures that the software actually meets the customer's need. Getting to code early transforms expenses into working software; because of the prioritization, it is possible for the customer to cancel the project at 25% of schedule and still get working software that fulfills the most important 25% of the business need. Extreme Programming also uses the concept of project velocity to measure the cost, time, and relative lateness or earliness of the project. In addition to a project-level function point analysis, ABC also asked its developers to measure the effort involved in creating the software in Extreme Programming units (XPU) [1]. By dividing the XPUs by the typical development speed (XPUs developed per week on previous projects), the team produced a man-month estimate for development of the ATM project (a sketch of this arithmetic appears at the end of this subsection); this estimate, along with function points and risk-adjusted estimates, was used to create the $350,000 bid for the ATM project. (Note: The deviation from book XP of working for a fixed price was a concession made to win the contract. The customer insisted on it to minimize uncertainty.)

The XP method practiced by ABC Co. feeds directly into the testing phase of the project. Although book XP does not include a testing phase [1], that concept has not exactly been embraced by the business community. As a result, ABC Co. typically includes a testing phase in its software projects similar to the one advocated by Donaldson and Siegel in their book Successful Software Development [22].
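As a minimal sketch of the velocity arithmetic described above (the XPU total and weekly velocity below are invented for illustration; the plan does not state the actual values behind the bid):

#include <cstdio>

// Hypothetical XPU-based schedule estimate in the style described above.
int main() {
    const double totalXpu      = 84.0;   // assumed effort across all stories
    const double xpuPerWeek    = 3.0;    // assumed velocity from past projects
    const double weeksPerMonth = 4.33;   // average working weeks per month

    const double weeks     = totalXpu / xpuPerWeek;   // 28 weeks
    const double manMonths = weeks / weeksPerMonth;   // about 6.5 man-months

    std::printf("Estimate: %.0f weeks (about %.1f man-months)\n", weeks, manMonths);
    return 0;
}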

Page 10

ABC Co. Testing Plan

Definitions

To provide a common language for the process, the table below defines critical terms as they apply to this process.

Table 2 - Common Terminology

Acceptance Testing: Formalized testing by the customer to determine completion of the product prior to delivery and payment for services. [2]
Base Line: The version of the source code currently in production; sometimes referred to as the "gold copy" of the software. [2]
Black Box Testing: Testing software for its basic functionality without specifically exercising known pieces of code.
Defect: An instance where the software does not meet the requirement.
Function Testing: Formalized testing of modules for their functionality based on the requirements document.
Glass Box Testing / White Box Testing: Both terms denote a form of testing where internal knowledge of the code is used to determine the tests.
Integration Testing: Testing modules together to ensure the interfaces are compatible and meet requirements. [2]
System Testing: Formalized testing where the system is exercised under conditions similar to a regular production environment.
Test Case: A specific set of steps to be followed to exercise a specific aspect of the software, including how to set up, what input to provide, and the expected output. [2]
Unit Testing: Testing performed on a specific module by the developer.
Use Case: A story about the use of the system. [15]
XP: Extreme Programming.


Risk Analysis
While every effort has been made to identify priorities and ensure a quality product, we recognize that risks remain that could impact testing. They are outlined in the table below.

Table 3 - Risk Analysis

Risk: Budget is reduced.
Description: The budget could be reduced if other portions of development incur cost overruns.
Mitigation:
1. Review the prioritization of the system testing areas based on the analysis and adjust accordingly.
2. Review the situation with the customer and request additional funding.
3. The ABC Co. absorbs the cost.

Risk: Time is reduced.
Description: Available time for testing could be reduced if other portions of development run over their estimated time.
Mitigation:
1. The budget and timeline are not set until after the functional specification and function point analysis are complete.
2. The development team uses XP and modern methods (see http://www.joelonsoftware.com/articles/fog0000000245.html) to ensure the development schedule does not slip.
3. The schedule committed to the customer allows additional buffer time for QA beyond that predicted by the function-point estimate.
4. The contract provides four factors that the customer can use to control the project: time, quality, features, and resources. If the QA phase runs over the committed completion date then, to some extent, it is the customer's choice.
5. The on-site customer will get daily feedback and has options to steer the project to meet the completion date. Again, if the project is late then, to some extent, it is by customer choice.
6. If needed, ABC Co. can absorb the cost of QA over-runs.

Risk: Product is below quality on ship date.
Description: The defect database has several high-priority defects on the ship date; the customer refuses to sign until all high-priority (or acceptance-test-found) defects are removed. The customer may insist that ABC Co. fix the defects without reimbursement and, in fact, insist on a discount because the software is late.
Mitigation:
1. The testing group has one (1) additional man-month of budget.
2. Beyond that month, ABC Co. will absorb the cost.
3. The Testing Manager will use FogBugz [20], internet-enabled bug-tracking software, to evaluate the state of the software and request additional development resources as needed.
4. The customer will state explicit acceptance-test requirements up front so the test manager can prioritize defects and ensure that customer-required needs are met.
5. The QA phase as predicted by previous experience and function point analysis is shorter than the date presented to the customer; this adds a buffer zone.

Risk: Turnover causes the department to lose organizational knowledge.
Description: Key players on the ATM project are recruited by outside agencies, leave of their own free will, or are recruited by the customer, forcing the organization to bring new members on staff and play catch-up.
Mitigation:
1. ABC Co. practices the techniques of PeopleWare and has very low turnover.
2. Pair programming according to the XP methodology ensures that two developers have organizational knowledge of each line of code.
3. An explicit development program for testers minimizes the effect of a lost tester.
4. Contracts with the customer, vendors, and contract companies forbid those agencies from recruiting until after the project is complete.

Risk: Customer changes requirements during development, unit test, or system test.
Description: After coding begins, the customer realizes that his organization needs additional features, or re-interprets the functional specification to add features not included in the function point analysis.
Mitigation:
1. An explicit contract forces the customer to pay a nearly prohibitive hourly fee for changes made after the specification is complete. This provides the budget for testing to continue after the original budget is expended.
2. The XP methodology makes re-interpretation of stories extremely rare.
3. The XP methodology allows additional stories, which add Extreme Programming Units and function points, which increases the cost.

Risk: Code shipped to the customer fails to meet quality standards.
Description: Although the code is passed by developers, approved by the test manager, and approved by the customer, the tests are minimal and the software has serious defects. The customer insists on free fixes, and ABC has to either fix them for free or at a hefty discount, or risk loss of professional reputation.
Mitigation:
1. Development, testing, and the customer each perform an independent audit to uncover defects: unit tests, functional tests, system tests, and the acceptance test. This provides a layer of safety: while some defects may get past one tester, this form of testing ensures that three, four, or more people will test each piece of functionality.
2. The test-first and use-case-based testing performed during the development cycle is claimed by XP enthusiasts to eliminate the need for a QA group [1]. While this may not be the case, having a QA group does in fact add a layer of redundancy and additional confidence.
3. Because the customer performs their own acceptance test, the software will not pass unless the customer is willing to either skip the test or make a choice to skimp on quality. Either way shows a lack of commitment to quality and a desire to get the product out the door. Under those conditions, it would be very hard for the customer to argue that ABC must make uncompensated bug fixes: as described above, quality is a factor under the control of the customer.


Unit Test/Integration Test Plan


Approach

ABC Company defines unit and interface testing as part of the development cycle; these tasks are performed before the Code Complete date, and responsibility for them lies with the software developers. Although these tests are part of development, they are testing activities and are therefore described below.
Unit Testing:

All developers at ABC Co. sit through a one-hour presentation on unit-testing techniques before beginning to code. First, all developers ensure that fully dressed use cases are created. Although use cases are associated more often with the Unified Modeling Language and the Unified Process, they complement the concept of stories within XP. The use case establishes the first point of traceability for further testing [15]. Before developers create a logical unit (or module) of code, they create code to execute and test the unit/module/object [1]. This code creates the object, performs operations on it, and then checks the object's state to ensure that it is correct; this is part of the Extreme Programming methodology practiced at ABC Company [1]. For example, if the programmer were creating an object that simulates a cashier's drawer, the code might look something like this in C++:
//-----------------------------------------------------------------
Cdrawer drawer;
drawer.addCash(10000);                  // cash in PENNIES ($100.00)

int  iChangeTendered;
BOOL bEnoughCash;

// Sell a $1.00 item against $10.00 tendered; $9.00 change is due.
bEnoughCash = drawer.makesale(100, 1000, iChangeTendered);
ASSERT(bEnoughCash == TRUE);
ASSERT(iChangeTendered == 900);

// The drawer gains the sale price: $100.00 + $10.00 - $9.00 = $101.00.
LONG lCashLeft = drawer.getCashLeft();
ASSERT(lCashLeft == 10100);
//-----------------------------------------------------------------

This script would go on to test every public method of the class, with special attention paid to strange possibilities and bounds testing. Of course, some supporting classes may have to be created and instantiated to build a logical test, such as a money structure (dollars, quarters, etc.) or class; these classes as a whole compose a unit. When this script is devised, the object has not yet been coded, so the script will not compile. This provides an additional sanity check and forces coders to design before coding; otherwise, writing the script would be impossible. In keeping with the XP methodology, all test scripts must pass 100% before the code can be checked in.
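As a sketch of the bounds cases such a script would add, continuing the hypothetical Cdrawer example above (the behavior asserted for underpayment is an assumption, not part of the original example):

//-----------------------------------------------------------------
Cdrawer drawer;
drawer.addCash(10000);                   // $100.00 in pennies

// Boundary: exact tender, so zero change is due.
int  iChangeTendered = -1;
BOOL bEnoughCash = drawer.makesale(500, 500, iChangeTendered);
ASSERT(bEnoughCash == TRUE);
ASSERT(iChangeTendered == 0);
ASSERT(drawer.getCashLeft() == 10500);   // drawer gains the sale price

// Bounds case: tender below the sale price. We assume makesale()
// rejects the sale by returning FALSE and leaves the drawer unchanged.
bEnoughCash = drawer.makesale(500, 100, iChangeTendered);
ASSERT(bEnoughCash == FALSE);
ASSERT(drawer.getCashLeft() == 10500);
//-----------------------------------------------------------------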


Interface Testing:

After unit testing is complete, developers must consider every other class that their object could interface with. After writing up what the various interfaces are, the developer meets with the main coder of the other objects, and the two write up use-case scenarios for the interfaces. The developer then codes and runs interface tests in the same style as the unit tests above (a sketch appears at the end of this section).

Setup and Configuration Requirements

In keeping with the XP concept of continuous integration [1], all development machines must have MS Visual SourceSafe (for version control), MS Visual Basic, and the .Net Framework and integrated development environment. Continuous integration testing is performed on a separate build machine, as described in Extreme Programming Installed [1]. Developers run the entire suite of checked-in unit tests overnight. In addition, the developers will need a machine to simulate the mainframe from which transactions are downloaded and performed; the customer will provide this machine.

Participants

Unit and integration tests are performed solely by the development team.

Defect Handling

In the event that a unit test fails overnight, the coder who wrote that section will examine the test, refactor the code if needed, check in the code, and run a new test the next day. The project manager reports defects on the server side to the customer, as that application is outside the scope of this effort.

Estimated Cost/Percentage of Budget

As described previously, unit and integration testing are performed as part of development and are included in the cost of development (coding).

Approvals

Since XP depends heavily on pair-programming practices, both partners in a module effort must agree that the code is ready for the next phase of testing before requesting a migration to function testing. After the first turnover of any module to function testing, the code is considered base lined [2, pg. 243]. From that point on, the developers must use the revision control system to make further changes to the code. See the migration requirements for function testing below.

Migration Requirement

Only source code is provided to the configuration management team, by properly checking it into the version control software. The configuration management team is required to do a full build of all source code before placing it into the function test environment.
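As promised above, a sketch of the interface-test style (the CatmSession class and its methods are hypothetical names invented for this example; they are not part of the ATM design):

//-----------------------------------------------------------------
// Hypothetical interface test: CatmSession drives Cdrawer through
// its public interface, and the assertions check that state agrees
// on both sides of the interface.
Cdrawer drawer;
drawer.addCash(10000);                    // $100.00 in pennies

CatmSession session;
session.attachDrawer(&drawer);            // the interface under test

BOOL bOk = session.dispenseCash(2500);    // withdraw $25.00
ASSERT(bOk == TRUE);
ASSERT(drawer.getCashLeft() == 7500);     // state agrees across objects

bOk = session.dispenseCash(20000);        // more than the drawer holds
ASSERT(bOk == FALSE);                     // refused at the interface
ASSERT(drawer.getCashLeft() == 7500);     // no partial dispense
//-----------------------------------------------------------------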


Function Test Plan


General

The purpose of the functional verification test is to check that the ATM conforms fully to its functional specification. The functional test will conform to the key dimensions of extreme functional tests [22]:

- Customer-owned: To gain confidence from the tests, the customers must understand them and should provide the values tested.
- Comprehensive: The values to be tested will be derived using boundary value analysis (BVA) [12] (see the sketch following the build environment description below).
- Repeatable: The functional tests will be documented and chosen so that they are repeatable.
- Automatic: ABC Co. will use the existing automated tools available in-house to perform the functional tests.

Build Environment

The functional test process requires a dedicated environment in which to build the application, to ensure the integrity of the build process. Except for this build, no other development is done on this workstation; this keeps the environment free of extraneous code the developer may have used in unit testing and ensures the build is done in an environment monitored to match the current architecture. Since this application is being developed on .Net technology, the recommended development configuration for preparing production releases is the following [13]:

Processor: Pentium III, 650 MHz
Operating System: Windows 2000 Server
Memory: 256 MB
Hard Drive: At least 2.5 GB free on the installation drive
Software: Full .Net Software Development Kit (SDK), with which the code is to be compiled
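As an illustration of the boundary value analysis mentioned above (the withdrawal limits are assumed for the example and are not taken from the ATM specification): for a permitted withdrawal range of $20 to $500 in $20 increments, BVA selects values at, just inside, and just outside each boundary.

//-----------------------------------------------------------------
// Hypothetical BVA-derived inputs for a withdrawal amount, assuming
// a valid range of $20..$500 in $20 steps (amounts in cents).
struct BvaCase { long lAmountCents; BOOL bShouldAccept; };

static const BvaCase WITHDRAWAL_CASES[] = {
    {     0, FALSE },   // below the minimum
    {  1900, FALSE },   // just under the $20 minimum
    {  2000, TRUE  },   // at the minimum boundary
    {  2100, FALSE },   // inside the range but not a $20 multiple
    { 48000, TRUE  },   // nominal mid-range value
    { 50000, TRUE  },   // at the maximum boundary
    { 50100, FALSE },   // just over the maximum
};
//-----------------------------------------------------------------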


Function Test Environment

The function test environment, which is separate from the build environment, has a single workstation and one database server. Their configurations, per a Microsoft recommendation [5], are as follows:

Table 4 - Function Test Environment

Server/device type: Client application and database server
Machine name: MSDNFTEST01
Hardware: 750 MHz, 256 MB of RAM, 19-GB HD, Intel EtherPro 10/100 NIC
Software: Windows 2000 Advanced Server; Windows 2000 SP1 Server; SQL Server 2000; SOAP Toolkit

Server/device type: End-user machines
Machine name: Various
Hardware: 700 MHz, 64 MB RAM, 10-GB HD
Software: Windows 2000 Professional; Internet Explorer 5.0 or later / Netscape browsers

Participants

Under the Extreme Programming (XP) paradigm, function testing involves the customer and the developer. The primary difference from earlier testing is that a compile in a clean environment has been done to support the testing.

Estimated Cost/Percentage of Budget

Since the XP approach includes this activity under development, it is not part of the testing budget. Further, the build aspect is accomplished by the configuration management (CM) area, and all labor associated with it comes under the CM budget.


System Test Plan


Approach

General

System testing is intended to exercise the system within an environment that resembles a production environment as closely as is reasonably possible [2]. It is impossible to test for all conditions.

Prioritization

As implied above, system testing is a high-cost activity [2]. It is therefore necessary to prioritize the areas of testing to reduce risk [6][11]. This prioritization was based on the areas of testing recommended by Myers, as presented in Humphrey's book Managing the Software Process [2]. The eleven areas of testing were prioritized based on the function point analysis of the areas of influence. A full discussion of the prioritization scheme is in Appendix A, whose tables also discuss the purpose of each test and the setup considerations in more detail. It should be emphasized that the priority is meant as a guide for budgeting the tests; it does not necessarily imply that any test will be dropped. If the budget becomes a severe constraint, the list will be reviewed by the testing manager, the development manager, and the customer. The eleven testing areas are listed below in priority order.

Table 5 - System Test Prioritization Summary

Priority 1 - Reliability/Availability: Verify reliability under the typical ongoing workload.
Priority 2 - Recovery: Determine the behavior of the system after an ABEND.
Priority 3 - Serviceability: Determine if the application is easily supportable.
Priority 4 - Load/Stress: Identify the peak-load failure point.
Priority 5 - Volume: Determine system capacity.
Priority 6 - Performance: Test actual performance against the service level agreement.
Priority 7 - Security: Test security provisions (see the security testing section below).
Priority 8 - Human Factors: Make sure the system is understandable by users.
Priority 9 - Configuration: Ensure the defined configuration works under run-time conditions.
Priority 10 - Compatibility: Verify any conversion needs.
Priority 11 - Installability: Verify the system can be installed.


Actual Steps

As indicated, the priority does not necessarily mean that any portion will not be tested; it is an indicator of how much of the budget will be applied to each area of testing. Some areas must be tested regardless of priority due to the physical nature of the setup. Given that analysis, there will be eleven testing cycles administered under a spiral method [2]. At the end of each cycle, the testing team will stop, meet with the developers and the customer, and determine the risks of going forward. A cycle may be repeated if, in the opinion of the developers, the number of defects warrants it. If this should happen, the customer will be consulted and the budget will be adjusted during the risk-analysis portion of the cycle. The order of testing for the cycles is listed in the table below.

Table 6 - Actual Order of Testing and Budget Breakout

Cycle 1 - Configuration (2%): The configuration must exist for setup. Most of the expense is hardware-related and also comes under the budget for configuration management.
Cycle 2 - Installability (5%): Testing requires an installation, so even though this has a low priority budget-wise, it is necessary to do it first.
Cycle 3 - Security (5%): This cycle tests for minimum security and is repeated on every cycle, since the majority of the tests require access. Essentially, security is tested throughout the other cycles, and its funding is indirectly supported in those areas.
Cycle 4 - Compatibility (10%): Initial downloads of data require conversion and must be done to move forward with testing.
Cycle 5 - Reliability/Availability (18%): This has a higher portion of the budget due to the priority analysis.
Cycle 6 - Recovery (15%): This is closely related to the reliability and stress testing.
Cycle 7 - Serviceability (10%): Budgeted according to the priority analysis.
Cycle 8 - Load/Stress (10%): Budgeted according to the priority analysis.
Cycle 9 - Volume (10%): Part of volume testing is covered under reliability and availability.
Cycle 10 - Performance (10%): Budgeted according to the priority analysis.
Cycle 11 - Human Factors (5%): Human factors should have been heavily scrutinized early in development.

Note on Security Testing

It is strongly advised that additional funding be considered to procure third-party support for security testing. Testing of this nature is highly specialized and should be done by a company with expertise in security [2].

Manual Testing

Manual testing will be based on the test plans provided for function testing as well as any use cases for the external systems. Use cases must be developed for external systems as well, since they are considered actors [15]. The use cases will be developed into test cases by the testers when first exercised in the system testing environment.

Application Automated Testing

Although setting up an automated test takes ten times longer than setting up a manual test [4], automated testing is essential for volume testing because of the time it saves in executing a large number of tests. Due to their nature [2], the first five testing areas will depend heavily on robots or automated scripts to provide the continuous activity or data volume necessary. Specific products recommended for automated testing include Rational Robot [16] and File-AID from Compuware [17]. (A sketch of such a driver follows the environment table below.)

Setup and Configuration Requirements

The recommended environment for system testing is based on that recommended by Microsoft [5] for a project similar in scope and configuration requirements. (The table below is copied from that Web page.)
Server/device type: Web server
Machine name: MSDNIIS01
Hardware: 933 MHz, 256 MB of RAM, 70-GB HD, Intel EtherPro 10/100 NIC
Software: Windows 2000 Advanced Server

Server/device type: Web server
Machine name: MSDNIIS02
Hardware: 933 MHz, 256 MB of RAM, 70-GB HD, Intel EtherPro 10/100 NIC
Software: Windows 2000 Advanced Server

Server/device type: Database server
Machine name: MSDNSQL01
Hardware: 700 MHz, 256 MB of RAM, 29-GB HD, Intel EtherPro 10/100 NIC
Software: Windows 2000 SP1 Server; SQL Server 2000; SOAP Toolkit

Server/device type: Client application server
Machine name: MSDNCLIENTAP01
Hardware: 750 MHz, 256 MB of RAM, 19-GB HD, Intel EtherPro 10/100 NIC
Software: Windows 2000 Advanced Server

Server/device type: End-user machines
Machine name: Various
Hardware: 700 MHz, 64 MB RAM, 10-GB HD
Software: Windows 2000 Professional; Internet Explorer 5.0 or later / Netscape browsers

Migration of Code

Code moved into system test should be considered production-ready from the standpoint of configuration management. That is, it should not be recompiled when moved from the function test environment. This establishes a base line for the Microsoft Intermediate Language (IL) code [2][13].

Participants

This section focuses on who will actually perform and set up the testing. External relationships for reporting defects are discussed below in the section on error management. Developers will never conduct tests that involve components they themselves developed. The table below outlines specific roles in the system testing phase.

Table 7 - System Test Participants

Position: Configuration Management Team
Involvement: All migrations will be done through the configuration management team.

Position: Testing Manager
Involvement: Establishes policy and is the primary approval authority for exceptions to the migration process. Also schedules the defect review meeting.

Position: Testers
Involvement: Ensure test cases are prepared. Initiate the request to move code into system testing; this ensures that those trained in testing maintain the integrity of the process [12].

Position: Project Manager and all Development Team Members
Involvement: Participate in the defect review, which is scheduled by the testing manager.

Position: Customer Liaison
Involvement: Reviews defects with testers to assign severity ratings.
Estimated Cost/Percentage of Budget

According to Steve McConnell's Software Project Survival Guide, 20% of the development budget is typically spent on system testing [19]. On this basis, the total budget for testing against the project budget is estimated at $70,000.

Approvals

Since system testing is crucial to ensuring a quality product for the customer, all managers must review the final status of defects to determine whether the package should be released to the customer for acceptance testing. The testing manager and the development manager shall create an approval form in conjunction with the customer liaison.


Acceptance Testing
General

The acceptance test verifies that all aspects of the Detail Specification document have been implemented. All test scenarios must correlate to requirements previously agreed to in the Functional Requirements and Detail Specification documents. When all the acceptance tests pass for a given user story, that story is considered complete. Working with the customer, a comprehensive acceptance test is designed to ensure that the final deliverable meets the business needs established in the specification document and agreed to in writing by both the customer and ABC Co.

Approach

Acceptance testing addresses the broadest scope of requirements, while lower levels of verification, such as unit tests, address requirements that are fully contained within the builds that make up releases. The acceptance-testing objective is to verify that end-to-end [21] ATM operations satisfy ATM requirements in the following categories: operational, functional, performance, and interface requirements. This is similar to other areas of testing; however, it is conducted by the customer to ensure a completed product is accepted.

- Operational requirements: ensure that the ATM operates in accordance with the business requirements set by the customer.
- Functional requirements: ensure that the required tasks are accomplished and that the needs and objectives of users are met.
- Performance requirements: ensure that performance objectives for throughput, delay, and the number of simultaneous transactions in progress are satisfied. Requirements include speed, accuracy, frequency, reliability, maintainability, and availability.
- Interface requirements: ensure that external and internal systems pass information or control to one another in accordance with specifications.

Test Execution

Acceptance tests are conducted under the direction of an ABC Co. Testing/QA Manager, who has authority regarding the execution of the acceptance test. This authority includes assigning priority to defects and determining their impact on ongoing testing. The tests are performed by customer-appointed testers.

Approvals

The acceptance test is the final step in customer testing. Once the customer executes the acceptance test and the application passes, the customer acknowledges system acceptance in writing. During execution of the acceptance test, the customer may document operational procedures and receive additional operational training from the ABC project engineers in order to train operational staff before going into production.

Error Management
This section applies only to function, system, and acceptance testing, since those activities are conducted under the testing management team. It prescribes the approach for providing feedback to the developers and outlines the priorities assigned to defects.
Defect Prioritization

To ensure resources are properly focused, defects will be categorized based on their criticality and urgency [2]. For this project, four categories of defects are defined:

- Critical (C): High priority. There is no workaround; the system is completely non-functional until the defect is corrected. This defect cannot be allowed to pass to the customer.
- Major (M): High priority. Essential functionality of the system is defective, but a workaround that allows other testing exists. This defect cannot go to the customer.
- Average (A): Important. An obvious deviation from requirements that neither hampers usage of the system nor results in bad data. Release with these defects is to be negotiated with the customer.
- Minor (R): Primarily cosmetic in nature. If time does not allow a fix, the customer will be told of the defect at shipment.

The communication model for testing is a continuous one in which testing provides feedback to development, forming a feedback loop as illustrated below [11]:

[Figure: feedback loop between the Development Process and the Testing Process]

Each defect will be recorded on the form established in Appendix C (Defect Tracking Document). A copy of this document shall be provided to the developers during the daily defect review meetings. Again, this section only defines the communication of defects; while testers are encouraged to provide detailed information and analysis, defect resolution is the responsibility of the developer [12].
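A sketch of the defect record implied by these categories and the Appendix C worksheet (field names are illustrative, not prescribed by the plan):

#include <string>

// Illustrative in-memory form of the Appendix C defect record.
enum DefectRating {
    RATING_CRITICAL = 'C',  // no workaround; system non-functional
    RATING_MAJOR    = 'M',  // essential function broken; workaround exists
    RATING_AVERAGE  = 'A',  // clear deviation; release negotiable
    RATING_MINOR    = 'R'   // cosmetic; may ship with notification
};

struct DefectRecord {
    std::string  defectId;
    DefectRating rating;
    std::string  testCaseId;   // traceability back to the test case
    std::string  description;
    std::string  owner;        // developer responsible for resolution
    std::string  status;
};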


Regression Testing
General

Regression testing is done any time changes have been made to existing code. The goal is to ensure that code that previously worked has not been adversely affected by the change [2].

Approach

Due to the expense of testing [2], retesting the entire system is not desirable. Instead, existing test cases that reflect the areas of the system most likely to be impacted will be rerun. Those test cases will be selected as defects and corrections occur; the testing manager, developers, and testers will identify the appropriate test cases during the regular defect review meetings. Because our configuration management process is not yet fully mature, no assumptions about previous builds will be made in the case of multiple regression tests.
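A sketch of the selection idea, assuming a module-to-test-case mapping is maintained during the defect review meetings (the mapping and names are hypothetical):

#include <map>
#include <set>
#include <string>
#include <vector>

using TestCaseId = std::string;
using ModuleName = std::string;

// Re-run only the cases tied to modules touched by a fix, rather
// than the full suite.
std::set<TestCaseId> SelectRegressionCases(
    const std::map<ModuleName, std::vector<TestCaseId>>& coverage,
    const std::vector<ModuleName>& changedModules)
{
    std::set<TestCaseId> selected;
    for (const ModuleName& mod : changedModules) {
        auto it = coverage.find(mod);
        if (it != coverage.end())
            selected.insert(it->second.begin(), it->second.end());
    }
    return selected;
}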


Appendix A - System Test Budget Priority Analysis


Analysis to Support Budget Priority of Testing Areas

Watts Humphrey outlines G. J. Myers' eleven areas of system testing in [2]. Humphrey recommends that each of these areas be considered to determine the application's probability of success in a production environment. While they may seem excessively expensive at first, Humphrey points out that as applications become more critical, behavior under stressful or anomalous conditions becomes more important: it is at the point of highest system stress that system behavior is generally most critical and there is the least tolerance for error.

Since the budget is limited, an analysis was conducted to determine the priority of the system testing areas. The analysis took into consideration the eleven areas recommended by Myers and their associated areas of function point influence. The first table outlines the eleven categories along with the areas of influence each test relates to; a final column provides the number of influences associated with the testing category and the aggregate score of those influence areas. This table is essentially the raw analysis data and is used to determine which tests will be the most cost-effective.

The second table lists all eleven areas in order of the number of influences they are associated with. The third table lists the areas in order of the total score derived from all their associated influences. The sort order in each of these two tables is assigned a weighted score. The fourth table then lists the system test categories based on the average influence score. These tables provide different viewpoints from which to evaluate the budgeted priority of the testing: the table based on total points ensures that test categories covering the heaviest influence scores are weighted appropriately; the table based on the number of individual influences indicates the test areas that are the most far-reaching; and the table based on averages indicates the categories with a high concentration of very significant influences.

The weights are added up for each category, and the fifth table presents the categories in descending order of total weight. Budget priorities for testing will be based on the fifth table. The analysis is a form of grid analysis and addresses some of the areas raised in a white paper by Hans Schaefer [6]; the points of emphasis by Schaefer were examined in the context of the areas of influence in the project function point analysis. McGregor supports this same concept as well [18].
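The weighting arithmetic can be restated as a short sketch; the three weights per category are the ones assigned in Tables A.2 through A.4 below, and the sums reproduce Table A.5 (only a few of the eleven categories are shown):

#include <cstdio>

// Recompute Table A.5 totals from the weights in Tables A.2-A.4.
struct CategoryWeights {
    const char* name;
    int byCount;    // weight from Table A.2 (number of influences)
    int byTotal;    // weight from Table A.3 (total influence score)
    int byAverage;  // weight from Table A.4 (average influence)
};

int main() {
    const CategoryWeights rows[] = {
        { "Reliability/Availability", 5, 9, 3 },  // total 17
        { "Recovery",                 5, 9, 3 },  // total 17
        { "Serviceability",           2, 6, 7 },  // total 15
        { "Load/Stress",              4, 8, 2 },  // total 14
        { "Installability",           1, 1, 1 },  // total 3
    };
    for (const CategoryWeights& r : rows)
        std::printf("%-26s %2d\n", r.name, r.byCount + r.byTotal + r.byAverage);
    return 0;
}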


Table A.1 - Mapping of 11 Critical System Test Areas to Function Point Influences

Myers 1 - Load/Stress
Objective: Identify the peak-load failure point.
Conditions: The system is subjected to peak rates for the key operational parameters, including transaction volume, operator/user load, file activity, error rates, and key combinations. Note: Essentially, this test exercises all areas of input and processing.
Influences (ID. Description - Factor): 1. Data Communication - 4; 2. Distributed Processing - 4; 3. Performance - 5; 4. Heavily used config - 1; 5. Transaction Rates - 4; 6. Online data entry - 5; 8. On-line update - 5; 9. Complex processing - 2; 13. Multiple sites - 1
Count of Areas: 9. Total Score: 31. Average: 3.1

Myers 2 - Volume
Objective: Identify the level of continuous heavy load of inputs.
Conditions: The system undergoes a continuous period in which the expected maximum inputs are provided.
Influences: 1. Data Communication - 4; 2. Distributed Processing - 4; 3. Performance - 5; 5. Transaction Rate - 4; 6. On-line data entry - 5; 8. On-line update - 5; 13. Multiple sites - 1
Count of Areas: 7. Total: 28. Average: 4

Myers 3 - Configuration
Objective: Find where legal configurations of the system will not operate.
Conditions: All HW/SW types tested at least once. Note: It is acknowledged that some hardware may not be available due to conditions that allow it to exist only in a production environment (i.e., cost).
Influences: 1. Data communication - 4; 2. Distributed Data Processing - 4; 3. Heavily used config - 1
Count of Areas: 2. Total: 9. Average: 4.5

Myers 4 - Compatibility
Objective: Expose areas of incompatibility.
Conditions: Test HW/SW interfaces (internal and external) and system conversion. Note: Examples include downloading host data to the local ATM, which is a form of data conversion.
Influences: 1. Data communication (updates imply potential conversion) - 4; 2. Distributed Data Processing (exchange of data with the main system) - 4
Count of Areas: 2. Total: 8. Average: 4

Myers 5 - Security
Objective: Find ways to break security.
Conditions: Tests should include facilities, procedures, HW, system services, communications, and software. Note: Generally, there is limited testing of this nature by testing staff; third parties specialize in this area (Humphrey, 1989).
Influences: 1. Data Communications - 4; 6. On-line data entry - 5; 8. On-line update - 5
Count of Areas: 3. Total: 14. Average: 4.6


Myers 6 - Performance
Objective: Test performance under both peak and normal conditions, including against the service level agreements (SLAs).
Conditions: Typical transaction rates and responses. Note: Do not confuse this with stress testing, although it is related; this test determines whether average performance meets established standards.
Influences: 1. Data Communication - 4; 2. Distributed Processing - 4; 3. Performance - 5; 4. Heavily used config - 1; 5. Transaction Rates - 4; 6. Online data entry - 5; 8. On-line update - 5; 9. Complex processing - 2; 13. Multiple sites - 1
Count of Areas: 9. Total: 31. Average: 3.1

Myers 7 - Installability
Objective: Determine where the defined procedure for installation can fail.
Conditions: Follow the setup instructions and conduct testing. Note: Generally, this will include scripts and programs to check the configuration.
Influences: 4. Heavily used config - 1; 11. Installation ease - 5
Count of Areas: 2. Total: 6. Average: 3

Myers 8 - Reliability/Availability
Objective: Test for these attributes under a typical workload.
Conditions: Generally involves allowing the system to run for long periods. Note: Examples include long periods of regularly opening/closing files or points of memory allocation to check for memory leaks, and long periods of concurrency. Further, consideration is given to the full life cycle of data, including purging, since all systems must deal with waste (Yourdon, 1980).
Influences: 1. Data Communication - 4; 2. Distributed Processing - 4; 3. Performance - 5; 4. Heavily used config - 1; 5. Transaction Rates - 4; 6. Online data entry - 5; 8. On-line update - 5; 9. Complex processing - 2; 12. Operational ease - 5; 13. Multiple sites - 1
Count of Areas: 10. Total: 36. Average: 3.6

Myers 9 - Recovery
Objective: Determine behavior after error conditions have occurred.
Conditions: Conducted with other tests or with forced errors. Note: Examples include divide-by-zero errors, core dumps, abends, and dropped network connections.
Influences: 1. Data Communication - 4; 2. Distributed Processing - 4; 3. Performance - 5; 4. Heavily used config - 1; 5. Transaction Rates - 4; 6. Online data entry - 5; 8. On-line update - 5; 9. Complex processing - 2; 12. Operational ease - 5; 13. Multiple sites - 1
Count of Areas: 10. Total: 36. Average: 3.6


Myers 10 - Serviceability
Objective: This might be called supportability; test the handling of error messages.
Conditions: Induce the most likely errors and determine whether resolution information is sufficiently available. Notes: Verify the indexing of error messages in the support documents a support desk would use; the infrastructure to capture resolution issues as the system grows; network warning messages (OpenView); and clearly explained dialogue boxes.
Influences: 7. End user efficiency - 5; 11. Installation ease - 5; 12. Operational ease - 5
Count of Areas: 3. Total: 15. Average: 5

Myers 11 - Human Factors
Objective: Identify operations that are inconvenient for users.
Conditions: Exercise the use of manuals and procedures where possible. Note: Usability testing should have identified issues early.
Influences: 7. End user efficiency - 5; 12. Operational ease - 5
Count of Areas: 2. Total: 10. Average: 5


Table A.2 - Test Areas According to Number of Influences

Weight 5: 8. Reliability (10 influences)
Weight 5: 9. Recovery (10)
Weight 4: 1. Load/Stress (9)
Weight 4: 6. Performance (9)
Weight 3: 2. Volume (7)
Weight 2: 5. Security (3)
Weight 2: 10. Serviceability (3)
Weight 1: 3. Configuration (2)
Weight 1: 4. Compatibility (2)
Weight 1: 7. Installability (2)
Weight 1: 11. Human factors (2)

Table A.3 - Test Areas According to Total Influence Score

Weight 9: 8. Reliability (score 36)
Weight 9: 9. Recovery (36)
Weight 8: 1. Load/Stress (31)
Weight 8: 6. Performance (31)
Weight 7: 2. Volume (28)
Weight 6: 10. Serviceability (15)
Weight 5: 5. Security (14)
Weight 4: 11. Human factors (10)
Weight 3: 3. Configuration (9)
Weight 2: 4. Compatibility (8)
Weight 1: 7. Installability (6)


Table A.4 - Test Areas According to Average Influence Score

Weight 7: 10. Serviceability (average 5.0)
Weight 7: 11. Human Factors (5.0)
Weight 6: 5. Security (4.6)
Weight 5: 3. Configuration (4.5)
Weight 4: 2. Volume (4.0)
Weight 4: 4. Compatibility (4.0)
Weight 3: 8. Reliability (3.6)
Weight 3: 9. Recovery (3.6)
Weight 2: 1. Load/Stress (3.1)
Weight 2: 6. Performance (3.1)
Weight 1: 7. Installability (3.0)

Table A.5 - Test Areas in Order of Total Weight

8. Reliability: 17
9. Recovery: 17
10. Serviceability: 15
1. Load/Stress: 14
2. Volume: 14
6. Performance: 14
5. Security: 13
11. Human Factors: 12
3. Configuration: 9
4. Compatibility: 7
7. Installability: 3


Appendix B - Test Case Template


Worksheet columns: Test Case ID; Setup; Steps; Results; Pass/Fail


Appendix C - Defect Tracking Document


Permanent defect tracking shall be accomplished with a tool from Fog Creek [23]. However, to ensure that defects are captured, the following worksheet should be supplied to the testers. A report in this format shall also be provided during the defect review meeting.
Worksheet columns: Defect ID; Rating (C, M, A, or R); Test Case ID; Description; Owner; Status


Appendix D - Works Cited


[1] Jeffries, Ron; Anderson, Ann; Hendrickson, Chet, Extreme Programming Installed, 2001, Addison-Wesley
[2] Humphrey, Watts, Managing the Software Process, 1989, Addison-Wesley
[3] Humphrey, Watts, The Team Software Process, 2000, Addison-Wesley
[4] http://itmanagement.earthweb.com/columns/quaquest/article/0,,2761_762221,00.html
[5] http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dncold/html/Ssf1test.asp?frame=true
[6] http://home.c2i.net/schaefer/testing/risktest.doc
[7] http://msdn.microsoft.com/vstudio/techinfo/articles/XMLwebservices/webdefault.asp
[8] http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dncold/html/ssf2testplan.asp
[9] http://www.google.com/search?hl=en&q=%22test+plan%22&btnG=Google+Search
[10] http://members.tripod.com/~bazman/frame.html
[11] McGregor, John D.; Sykes, David A., A Practical Guide to Testing Object-Oriented Software, 2001, Addison-Wesley
[12] Pfleeger, Shari Lawrence, Software Engineering, 2001, Prentice Hall
[13] Brill, Gregory, Code Notes for .NET, 2001, Random House
[14] http://www.extremeprogramming.org
[15] Larman, Craig, Applying UML and Patterns, 2001, Prentice Hall
[16] http://www.rational.com/products/robot/index.jsp
[17] http://www.compuware.com/products/fileaid/express/
[18] http://www.korson-mcgregor.com/publications/mcgregor/column2.htm
[19] McConnell, Steve, Software Project Survival Guide, 1997, Microsoft Press
[20] http://www.fogcreek.com/
[21] fpd.gsfc.nasa.gov/documents/EOSDIS-systemtestplan.pdf
[22] Donaldson, Scott; Siegel, Stanley, Successful Software Development, Second Edition, 2001, Addison-Wesley
[23] http://www.fogcreek.com/fogBugz/
