16. Test Plan Template


XYZ Application

Test Plan
-------------------------------------------------

TABLE OF CONTENTS



1.   Introduction
1.1     Purpose
1.2     Scope
1.3     References
2.   Version History
3.   System Overview
4.   Testing Synopsis
4.1     System Requirements
4.1.1     Hardware
4.1.2     Software
4.1.3     Miscellaneous tools and information
4.2     Features to be tested
4.3     Features not to be tested
5.   Types of Testing
5.1     QA Acceptance Test
5.2     Feature Level Testing
5.2.1     Task-Oriented Functional Tests
5.2.2     Forced-Error Tests
5.2.3     Boundary Tests
5.2.4     System Integration Tests
5.2.5     User Acceptance Test
5.2.6     Load Tests
5.2.7     Stress Tests
5.2.8     Performance Tests
6.   Regression Testing
7.   Configuration and Compatibility Testing
8.   Documentation Testing/Online Help Testing
9.   Install/Uninstall Tests
10.    Test Schedule and Resources
11.    Test Phases and Completion Criteria
12.    Unresolved Issues and Risks
13.    Test Plan Review







1.          Introduction


1.1            Purpose

This test plan describes the strategy for verifying the new features implemented in the XYZ application for the JKL client.

1.2            Scope

This test plan outlines the test activities required for the XYZ application, including functional testing, system integration testing, hardware testing, user acceptance testing, and user manual validation.

This document also specifies regression tests to be done as a result of new feature implementation.

1.3            References


Doc No. | Version | Date         | Document Type | Title                      | Author
        | 1.0     | June 5, 2007 | CR            | Change Request_20070522001 | ABC














2.          Version History


Version | Date          | Details       | Author
0.1     | June 18, 2007 | Initial draft | BCD














3.          System Overview



[Figure 1. Overview of system under test]

4.          Testing Synopsis


4.1  System Requirements

4.1.1     Hardware

·        Sensors
·        Wireless routers
·        AQM box

4.1.2     Software

·        XYZ application version 1.00.80
·        Workstation with Windows XP and Internet Explorer version 7
o   Back office application supports IE version 5.5 and above
·        Connection to the Configuration UI system.

4.1.3     Miscellaneous tools and information

·        Valid user ID and password

4.2  Features to be tested

Refer to the functional requirements that specify the features and functions to be tested.  The description of the change need not be excessively detailed when there is a complete description to refer to in some other document.  On the other hand, if there is no reasonable specification available, more detail is called for here.

4.3  Features not to be tested

List the features and functions that will not be covered in this test plan. Identify briefly the reasons for leaving them out.

5.          Types of Testing



5.1  QA Acceptance Test



·        Detail a set of acceptance criteria: conditions that must be met before testing can begin.  A smoke test should represent the bare minimum of acceptance testing; a minimal sketch of one follows this list.

·        As noted above, the ideal is to create a separate document for acceptance criteria that can be reused and referred to here.  If any particular, specialized test cases not listed in that document will be used, refer to them here.
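
By way of illustration only, the Python sketch below shows what such a bare-minimum smoke test might look like; the server URL, the /login path, and the "Login" page marker are hypothetical placeholders rather than details of the actual XYZ application.

```python
# Illustrative smoke test (pytest style). The URL and the "Login" marker
# are placeholders; substitute the real XYZ test-environment values.
import urllib.request

BASE_URL = "http://xyz-test-server/app"  # hypothetical test environment

def test_application_responds():
    # Bare minimum: the application answers HTTP requests at all.
    with urllib.request.urlopen(BASE_URL, timeout=10) as response:
        assert response.status == 200

def test_login_page_served():
    # The login page is the entry point for every later test.
    with urllib.request.urlopen(BASE_URL + "/login", timeout=10) as response:
        body = response.read().decode("utf-8", errors="replace")
        assert "Login" in body  # placeholder marker for the login form
```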
5.2  Feature Level Testing
This is the real meat of the test plan.  The sections below itemize the categories of tests to be run, along with references to the test library or catalog.  Individual test cases should not be listed here, and test requirements generally should not be either; the details should exist elsewhere and can be cross-referenced.

5.2.1     Task-Oriented Functional Tests


This is a detailed section, listing test requirements for program features against functional specifications, user guides, or other design-related documents.  If there are test matrices available listing these features and their interdependence (and there should be), refer to them.

5.2.2     Forced-Error Tests


Provide or refer to a list of all error conditions and messages.  Identify the tests that will be run to force the program into error conditions.
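
For instance, a forced-error test deliberately feeds the program invalid input and asserts that the documented error condition is raised.  The sketch below is illustrative only: validate_sensor_reading and its error messages are hypothetical stand-ins for real XYZ validation routines and their documented messages.

```python
# Illustrative forced-error tests. The validation routine and its error
# text are hypothetical; map each real test to a documented error condition.
import pytest

def validate_sensor_reading(value):
    # Stand-in for an XYZ validation routine under test.
    if not isinstance(value, (int, float)):
        raise TypeError("Sensor reading must be numeric")
    if value < 0:
        raise ValueError("Sensor reading cannot be negative")
    return value

def test_rejects_negative_reading():
    with pytest.raises(ValueError, match="cannot be negative"):
        validate_sensor_reading(-1)

def test_rejects_non_numeric_reading():
    with pytest.raises(TypeError, match="must be numeric"):
        validate_sensor_reading("n/a")
```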

5.2.3     Boundary Tests


Boundary tests—tests carried out at the lines between valid and invalid input, acceptable and unacceptable system requirements (such as memory, disk space, or timing), and other tests at the limits of performance—are the keys to eliminating duplication of effort.  Identify the types of boundary tests that will be carried out.  Note that such tests can also fall into the categories outlined below, so this section may be removed, or made a sub-section of those categories.
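
A typical pattern, sketched below, is to probe each limit from both sides.  The sensor-ID range of 1 to 9999 is an assumed example, not a real XYZ constraint.

```python
# Illustrative boundary tests around a hypothetical limit: suppose the
# application accepts sensor IDs from 1 to 9999 inclusive.
import pytest

MIN_ID, MAX_ID = 1, 9999  # assumed limits, for illustration only

def is_valid_sensor_id(sensor_id):
    # Stand-in for the validation logic under test.
    return MIN_ID <= sensor_id <= MAX_ID

@pytest.mark.parametrize("value,expected", [
    (MIN_ID - 1, False),  # just below the lower bound
    (MIN_ID,     True),   # exactly on the lower bound
    (MIN_ID + 1, True),   # just above the lower bound
    (MAX_ID - 1, True),   # just below the upper bound
    (MAX_ID,     True),   # exactly on the upper bound
    (MAX_ID + 1, False),  # just above the upper bound
])
def test_sensor_id_boundaries(value, expected):
    assert is_valid_sensor_id(value) is expected
```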

5.2.4     System Integration Tests


Identify components or modules that can be combined and tested independently to reduce dependence on system testing.  Identify any test harnesses or drivers that need to be developed.

[System Level Tests] Specify the tests that will be carried out to fully exercise the program as a whole, ensuring that all elements of the integrated system function properly.  Note that when unit and integration testing have been properly performed, the dependence upon system testing can be reduced.
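
The fragment below sketches the kind of small test driver this section asks for, assuming a hypothetical seam between a parsing module and a persistence module; an in-memory store doubles for the real back end so the two pieces can be exercised together without the full system.

```python
# Illustrative integration driver: exercises two hypothetical modules
# (a reading parser and a store) together, in isolation from the system.
class ReadingParser:
    # Stand-in for a parsing module.
    def parse(self, line):
        sensor_id, value = line.split(",")
        return {"sensor": sensor_id.strip(), "value": float(value)}

class InMemoryStore:
    # Test double standing in for the real persistence layer.
    def __init__(self):
        self.rows = []

    def save(self, record):
        self.rows.append(record)

def test_parser_feeds_store():
    parser, store = ReadingParser(), InMemoryStore()
    for line in ["S1, 20.5", "S2, 21.0"]:
        store.save(parser.parse(line))
    assert [r["sensor"] for r in store.rows] == ["S1", "S2"]
```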

5.2.5     User Acceptance Test


[The user should provide a list of test cases they are interested in; the QA analyst runs these in advance to clear any defects.]

In contrast to types of testing designed to find defects, identify tests that will demonstrate the successful functioning of the program as you expect the customer to use it.  What type of workflow tests will be run?  What type of “real work” will be carried out using the program?

5.2.6     Load Tests


Indicate the types of tests that will be carried out to see how the program deals with very large amounts of data, or with a large demand on timely processing.  Note that these tests can rarely be performed without automation; identify the automation tools, test harnesses, or scripts that will be used.  Ensure that the programs developed for the test automation effort are accompanied by their own sets of requirements, specifications, and development processes.
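
As a shape-of-the-idea sketch only, the script below fires a batch of concurrent requests at a hypothetical endpoint and counts failures; a real load test would normally be built on a dedicated tool, and the URL and user count here are assumptions.

```python
# Illustrative load driver: fires many concurrent requests at a
# hypothetical endpoint and counts failures.
import concurrent.futures
import urllib.request

URL = "http://xyz-test-server/app/readings"  # placeholder endpoint
CONCURRENT_USERS = 50  # assumed load level, for illustration

def fetch_once(_):
    try:
        with urllib.request.urlopen(URL, timeout=30) as response:
            return response.status
    except OSError:  # URLError subclasses OSError
        return None  # treat connection errors as failures

def run_load():
    with concurrent.futures.ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
        statuses = list(pool.map(fetch_once, range(CONCURRENT_USERS)))
    failures = [s for s in statuses if s != 200]
    print(f"{len(statuses)} requests, {len(failures)} failures")

if __name__ == "__main__":
    run_load()
```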

5.2.7     Stress Tests


·        Identify the limits under which the program is expected to perform.  These may include number of transactions per unit time, timeouts, memory constraints, disk space constraints, and so on.  Volume tests and stress tests are closely related; you may consider wrapping both into the same category.

·        How will the product be tested to push the upper functional limits of the program?  Will specific tools or test suites be used to carry out stress tests?  Ensure that these are reusable.  (A minimal stress loop is sketched after this list.)
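
The loop below is a minimal sketch of that idea: it doubles a batch size until a stand-in routine fails, recording the largest volume handled.  process_batch is hypothetical and would be replaced by a call into the system under test.

```python
# Illustrative stress loop: grows the input volume until the (stand-in)
# routine fails, recording the largest size handled.
def process_batch(readings):
    # Stand-in for the routine under stress.
    return sum(readings) / len(readings)

def find_breaking_point(max_doublings=20):
    last_good = 0
    for exp in range(max_doublings):
        size = 2 ** exp
        try:
            process_batch([1.0] * size)
            last_good = size
        except Exception:  # note where the system breaks, don't crash
            break
    return last_good

if __name__ == "__main__":
    print("Largest batch handled:", find_breaking_point())
```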

5.2.8     Performance Tests


Refer to the functional requirements that specify acceptable performance.  Identify the functions that need to be measured and the tests needed to show conformance to the requirements.
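
A measurement of that kind can be as simple as the sketch below; the two-second threshold and the query_readings stand-in are assumptions for illustration, with the real figures to be taken from the functional requirements.

```python
# Illustrative performance check. The 2-second threshold is an assumed
# figure; the real number must come from the functional requirements.
import time

REQUIREMENT_SECONDS = 2.0  # assumed, for illustration only

def query_readings():
    # Stand-in for the operation whose performance is specified.
    time.sleep(0.1)  # placeholder work

def test_query_meets_requirement():
    start = time.perf_counter()
    query_readings()
    elapsed = time.perf_counter() - start
    assert elapsed <= REQUIREMENT_SECONDS, f"query took {elapsed:.2f}s"
```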

6.          Regression Testing



·        At each stage of new development or maintenance, a subset of the regression test library should be run, focusing on the feature or function that has changed from the previous version.  Unit, integration, and system tests are all viable places for regression testing.  For small maintenance fixes, identify this subset (one way of tagging it is sketched after this list).  A good version control system can allow the building of older versions of the software for comparative purposes.
·        In the final phase of a complete development cycle, a full regression test cycle is run.  Identify the test case libraries and suites that will be run.
·        Whether a subset or a full regression test run, existing test scripts, matrices and test cases should be used, whether automation is available or not.  Identify the documents that describe the details.  Emphasize regression tests for functions that are new or that have changed, for components that have had a history of vulnerability, for high-risk defects, and for previously-fixed severe defects.
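
Assuming pytest as the test runner, one way to make such a subset selectable is to tag cases by feature area, as in the hypothetical sketch below.

```python
# Illustrative regression tagging with pytest markers. The marker names
# are hypothetical and would be registered in pytest.ini; the asserted
# behaviors stand in for previously verified results.
import pytest

def poll_interval_seconds():
    # Stand-in for existing behavior that must not regress.
    return 60

@pytest.mark.sensors          # hypothetical feature-area marker
def test_sensor_polling_unchanged():
    assert poll_interval_seconds() == 60

@pytest.mark.reporting        # another hypothetical area
def test_report_totals_unchanged():
    assert sum([1, 2, 3]) == 6  # placeholder for a fixed-defect check
```

A maintenance fix to the sensor code would then run only the tagged subset with `pytest -m sensors`, while the final full regression cycle runs the entire library with a plain `pytest`.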

7.          Configuration and Compatibility Testing


·        If applicable, identify the types of software and hardware compatibility tests that will be carried out.
·        List operating systems, software applications, device drivers etc. that the product will be tested with or against.
·        List hardware environments required for in-house testing.

8.          Documentation Testing/Online Help Testing


·        Documentation and online help testing will be carried out to verify the technical accuracy of the documented material.
·        If a license agreement is included in or displayed by the product, or by the portion of it to which this test plan refers, ensure that the correct one is being used.

9.          Install/Uninstall Tests


·        How will deployment and installation be tested?

·        How will the uninstallation or rollback process be tested?

·        Since some form of deployment is required for all software products, what generic installation and uninstallation test catalogs will be used or adapted for these tests?  (A post-install check is sketched after this list.)
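
The script below sketches one generic post-install/post-uninstall check under the assumption of a conventional Windows install layout; the paths are placeholders for whatever the XYZ installer actually writes.

```python
# Illustrative deployment check: verifies that the artifacts an installer
# should create actually exist, and that an uninstall removes them.
import os

EXPECTED_FILES = [
    r"C:\Program Files\XYZ\xyz.exe",     # hypothetical install layout
    r"C:\Program Files\XYZ\config.ini",
]

def missing_after_install():
    return [p for p in EXPECTED_FILES if not os.path.exists(p)]

def left_over_after_uninstall():
    return [p for p in EXPECTED_FILES if os.path.exists(p)]

if __name__ == "__main__":
    print("Missing after install:", missing_after_install())
    print("Left over after uninstall:", left_over_after_uninstall())
```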

10.     Test Schedule and Resources


Task                     | Staff | Completion date
Feature Test             |       |
Regression Test          |       |
System Integration Test  |       |
User Acceptance Test     |       |
User Manual Validation   |       |





·        Project Manager:
·        QA staff:
·        Development staff:
·        Business Analyst:

11.     Test Phases and Completion Criteria


·        Detail the planned test cycles and phases; these should be linked to the development plan for the project.  Specify the type of testing being done in each phase.  Typically unit testing will be done by the developer of the code, and need not be covered in detail in the test plan.  Integration and system testing phases should be detailed here.

·        Outline the criteria for assessing the severity of found defects.  List expectations for setting the priorities on resolving them.  Collaborate with the developer(s), project managers, and the customer representatives on this.

·        Identify in advance the criteria that must be fulfilled before each stage of testing can be considered complete.  Make these specific, measurable, and decidable; otherwise, expectations will differ and time will be wasted on discussion and debate.

·        If there are to be staged releases of system testing (typically alpha for internal releases, beta for limited releases to external test sites, and a final release, sometimes called the “gold master”), define them.  Define acceptance standards for each phase.  Ideally these should be in a separate document that can be referred to here.

·        Bear in mind that the standards set here may be overruled by some authority or another; for example, a product may ship with a higher than satisfactory number of minor defects at the behest of a marketing department or CFO that treats time to market as the most important consideration.  Be prepared to accept such decisions dispassionately, but also be prepared to record them as failures to fulfill the standards set and agreed upon in advance.  Companies and individuals easily forget and repeat mistakes when there is no record of breached agreements and their consequences; people learn and improve more readily when records of successes and failures are available.

12.     Unresolved Issues and Risks


·        Identify issues that have yet to be decided as of this draft of the plan.  Note these as risks to the schedule, scope, or quality of the test effort.

·        Identify other risks that may have an impact on the success of the plan.  Use the risks outlined in the course book and the attached speaker notes as a guideline to identifying common risks.  Refer also to the Software Project Survival Guide (Steve McConnell), which includes a good list of risks for every phase of development.  When assessing risk, don’t be optimistic; the quality of the test plan and of the risk assessment is weakened by a failure to assess risk realistically.

13.     Test Plan Review


·        Include plans for review of this test plan.  Identify the parties to review and approve the document, either within the test group or with another set of developers or test engineers.  Look at sample test plan checklists, such as the one on pp. 3.45–3.47 of the course book, or those in the Software Project Survival Guide.  Use ideas from these checklists to develop your own, appropriate to the size and scope of the product.  Identify here the checklist(s) that will be used.

·        Meet with developers and customers or customer representatives to ensure that the test plan meets their requirements.

[Example: Reviewers of this test plan:

·        Project Manager:
·        Technical Lead:
·        Architect:     
·        QA Manager: ]
