Test Plan Template
1. Testing Overview
1.2. To be Tested
What functions (applications and subsystems) will be verified during the execution of this test plan?
2. Product Changes
3. Test Environment
Unit Testing
● SQL Server name: xxx, version: SQL Server 2005
● Web Server name: ttt, version: Windows 2008
System/Integration Testing
● SQL Server name: YYY, version: SQL Server 2005
● Web Server name: ZZZ, version: Windows 2008
● .NET Framework 3.5
Defect Tracking
● As problems are found during system and acceptance testing, they will be recorded and tracked so
that the list of problems found and their status (open, failed, resolved) can be provided to the
project team.
● If a problem is found during testing in another system (e.g., ASTRA), it will be logged and reported to
that system's owners.
● Priorities are assigned to defects:
1. High: These errors cause a "stop work" situation, where there is no functional work-around and
testing progress is halted. Since this type of problem is a critical error, problem resolution must be
immediate.
2. Medium: These errors restrict operation of an important function by impacting business function,
procedures, controls or public image. Testing may be interrupted for the particular function, but
other testing can be continued. The resolution of this level of error is a high priority.
3. Low: These errors represent minor problems that can be worked around with a minimal effort, or
cosmetic errors, and do not affect the overall operation of the application or the usability of the
system.
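As an illustration only, the priorities and statuses described above could be modeled as a simple record in a tracking script. All names here (classes, IDs, the sample defect) are hypothetical and not tied to any particular defect-tracking tool:

```python
from dataclasses import dataclass
from enum import Enum

class Priority(Enum):
    HIGH = 1    # "stop work": no work-around, testing halted, fix immediately
    MEDIUM = 2  # important function blocked; other testing can continue
    LOW = 3     # minor or cosmetic; easy work-around, low impact

class Status(Enum):
    OPEN = "open"
    FAILED = "failed"
    RESOLVED = "resolved"

@dataclass
class Defect:
    defect_id: str
    summary: str
    priority: Priority
    status: Status = Status.OPEN  # new defects start open

# Example: record a high-priority defect found during system testing
d = Defect("DEF-001", "Login page crashes on empty password", Priority.HIGH)
print(d.priority.name, d.status.value)  # HIGH open
```

Keeping priority and status as enumerations (rather than free text) makes the defect list easy to filter when reporting to the project team.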
Test Tracking
● Team members will function as testers, running test cases, evaluating and recording test results,
and generating problem reports.
● A test execution log (tests that have been run, passed, failed) will be kept to produce a test results
summary.
● If needed, supporting documentation (e.g., screen prints and reports) will be retained for highly
critical test runs.
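The execution log described above can be sketched as a list of (test, result) entries rolled up into a results summary. This is a minimal illustration, assuming the latest run of each test is the one that counts toward the summary:

```python
from collections import Counter

# Hypothetical execution log: (test_id, result) pairs recorded by testers
execution_log = [
    ("TC-01", "passed"),
    ("TC-02", "failed"),
    ("TC-03", "passed"),
    ("TC-02", "passed"),  # rerun after the defect was resolved
]

def summarize(log):
    """Keep only the latest result for each test, then tally pass/fail."""
    latest = {}
    for test_id, result in log:
        latest[test_id] = result
    return Counter(latest.values())

print(dict(summarize(execution_log)))  # {'passed': 3}
```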
Completion Criteria
● All high-priority defects have been corrected and the associated tests rerun successfully
● All outstanding (unresolved) defects have been documented, with workarounds included where
required
● Requirements coverage is 100% (all test cases addressing requirements have been executed), or any
discrepancies are documented and acceptable
● Code coverage (the percentage of code tested) is at least 95%
● The success rate (test cases passed) is at least 95%; the failure rate is documented and acceptable
● Acceptor has signed off on test results and outstanding issues.
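The thresholds above (100% requirement coverage, at least 95% code coverage and success rate, no unresolved high-priority defects) can be checked mechanically. The function below is a hypothetical sketch of such a check, not part of the template itself:

```python
def completion_criteria_met(executed, total_required, passed,
                            code_coverage_pct, open_high_defects):
    """Return True when the exit thresholds from the plan are satisfied:
    100% requirement coverage, >= 95% code coverage, >= 95% success
    rate, and no unresolved high-priority defects."""
    requirement_coverage = executed / total_required
    success_rate = passed / executed if executed else 0.0
    return (requirement_coverage >= 1.0
            and code_coverage_pct >= 95.0
            and success_rate >= 0.95
            and open_high_defects == 0)

# Example: 116 of 120 required tests passed, 96.2% code coverage
print(completion_criteria_met(executed=120, total_required=120,
                              passed=116, code_coverage_pct=96.2,
                              open_high_defects=0))  # True
```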
Developers
● Perform unit and initial integration tests
● Assist with preparation and execution of system and acceptance test cases
● Provide new or modified components for testing
6. Work Plan
Prepare a testing work plan with major testing milestones. Document specific functions to be tested
either here or in separate test plan documents. Perform these activities in the sequence identified as
much as possible. For example:
1. Installation/Conversion tests
2. Basic Configuration Acceptance test
3. Regression tests
4. Functional tests (e.g., business logic scenarios, data handling tests)
5. Security tests
6. Report tests
7. User interface and usability tests
8. System interface tests
9. Performance tests
10. Volume and stress tests
11. Error recovery tests
12. Documentation/help verification
13. User acceptance tests (see User Acceptance Testing on the Software Testing site.)
14. Implementation verification tests
7. Issues
Track issues here or provide a link to testing issues with their status.
8. Tests
Describe the sets of tests to be run. List the specific test cases here, or provide an overview of the
tests, plus a link to the document(s) describing the test cases.
Possible test sets are listed in the Work Plan section.
The level of detail included in the test cases will vary with the project; possible test case elements
include:
● TestID
● Short description
● Requirement(s) verified
● Preconditions
● Test data
● Steps
● Expected Results
● Actual results
At a minimum, each test case should include a unique ID and a short description.
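As a sketch of the test case elements listed above, a test case record might look like the following. Field names and the sample case are illustrative only; only the ID and description are mandatory, matching the minimum stated in this section:

```python
from dataclasses import dataclass, field

@dataclass
class TestCase:
    # Required at a minimum: unique ID and short description
    test_id: str
    description: str
    # Optional elements; projects include these as their detail level requires
    requirements: list = field(default_factory=list)
    preconditions: str = ""
    test_data: str = ""
    steps: list = field(default_factory=list)
    expected_result: str = ""
    actual_result: str = ""

tc = TestCase(
    test_id="TC-SEC-01",
    description="Reject login after three failed password attempts",
    requirements=["REQ-SEC-4"],
    steps=["Enter a wrong password three times", "Attempt a fourth login"],
    expected_result="Account is locked and a warning is displayed",
)
print(tc.test_id, len(tc.steps))  # TC-SEC-01 2
```

Recording expected and actual results in the same structure makes it straightforward to generate the execution log and results summary called for in the Test Tracking section.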