Q1. What Is Verification?
A: Verification ensures the product is designed to deliver all functionality to the customer. It
typically involves reviews and meetings to evaluate documents, plans, code, requirements and
specifications; this can be done with checklists, issues lists, walkthroughs and inspection
meetings.
1. Requirements are poorly written when they are unclear, incomplete, too general, or not
testable; as a result, there will be problems.
2. The schedule is unrealistic if too much work is crammed into too little time.
3. Software testing is inadequate if no one knows whether or not the software is any good
until customers complain or the system crashes.
4. It's extremely common that new features are added after development is underway.
5. Miscommunication means either that the developers don't know what is needed or that
customers have unrealistic expectations; either way, problems are guaranteed.
1. Ensure the requirements are solid, clear, complete, detailed, cohesive, attainable and
testable. All players should agree to the requirements. Use prototypes to help nail down
requirements.
2. Have realistic schedules. Allow adequate time for planning, design, testing, bug
fixing, re-testing, changes and documentation. Personnel should be able to complete the
project without burning out.
3. Do adequate testing. Start testing early, re-test after fixes or changes, and plan
for sufficient time for both testing and bug fixing.
4. Avoid new features. Stick to the initial requirements as much as possible. Be prepared to
defend the design against changes and additions once development has begun, and be
prepared to explain the consequences. If changes are necessary, ensure they're adequately
reflected in related schedule changes. Use prototypes early on so customers'
expectations are clarified and customers can see what to expect; this will minimize
changes later on.
5. Communicate. Require walkthroughs and inspections when appropriate; make extensive
use of e-mail, networked bug-tracking tools, and change management tools. Ensure
documentation is available and up to date. Use electronic documentation, not
paper. Promote teamwork and cooperation.
Good test engineers have a "test to break" attitude. They take the point of view of the customer,
have a strong desire for quality, and pay attention to detail. Tact and diplomacy help maintain a
cooperative relationship with developers, as does an ability to communicate with both
technical and non-technical people. Previous software development experience is also helpful as
it provides a deeper understanding of the software development process, gives the test engineer
an appreciation for the developers' point of view, and reduces the learning curve in automated test
tool programming.
Please note that the process of developing test cases can help find problems in the requirements or
design of an application, since it requires you to think completely through the operation of the
application. For this reason, it is useful to prepare test cases early in the development cycle, if
possible.
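For illustration, here is a minimal sketch (in Python, using pytest) of what writing a test case early might look like; the authenticate function is a hypothetical stand-in, not part of any particular application. Simply stating the expected result forces questions the requirements must answer, such as how an empty password should be handled.

```python
# Hypothetical example: writing test cases early forces the requirements
# to be pinned down. The function under test (authenticate) is a stand-in
# stub here; in a real project it would come from the application.

def authenticate(username: str, password: str) -> bool:
    """Stub standing in for the application's real login routine."""
    return username == "alice" and password == "s3cret"

def test_valid_credentials_are_accepted():
    assert authenticate("alice", "s3cret") is True

def test_wrong_password_is_rejected():
    assert authenticate("alice", "wrong") is False

def test_empty_password_is_rejected():
    # Writing this case raises a requirements question: should an empty
    # password be rejected outright, or reported as a validation error?
    assert authenticate("alice", "") is False
```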
• Ensure the code is well commented and well documented; this makes changes easier for
the developers.
• Use rapid prototyping whenever possible; this will help customers feel sure of their
requirements and minimize changes.
• In the project's initial schedule, allow some extra time commensurate with probable
changes.
• Move new requirements to a 'Phase 2' version of an application and use the original
requirements for the 'Phase 1' version.
• Negotiate to allow only easily implemented new requirements into the project; move more
difficult new requirements into future versions of the application.
• Ensure customers and management understand scheduling impacts, inherent risks and
costs of significant requirements changes. Then let management or the customers decide
if the changes are warranted; after all, that's their job.
• Balance the effort put into setting up automated tests against the expected effort required
to redo them to deal with changes.
• Design some flexibility into automated test scripts (see the sketch after this list).
• Focus initial automated testing on application aspects that are most likely to remain
unchanged.
• Devote appropriate effort to risk analysis of changes, in order to minimize regression-
testing needs.
• Design some flexibility into test cases; this is not easily done, and the best bet is to minimize
the detail in the test cases or set up only higher-level, generic test plans.
• Focus less on detailed test plans and test cases and more on ad hoc testing, with an
understanding of the added risk this entails.
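As an illustration of the flexibility points above, here is a sketch of a data-driven automated test in Python with pytest; calculate_discount and the discount rule are hypothetical stand-ins, not part of any real project. Keeping the input/expected-output pairs in one table means that many requirement changes only touch the data, not the script logic.

```python
# Hypothetical sketch of a data-driven ("table-driven") automated test.
# Keeping the expected inputs/outputs in one table means a requirements
# change usually only touches the table, not the test logic.
import pytest

def calculate_discount(order_total: float) -> float:
    """Stub standing in for the application code under test."""
    if order_total >= 100:
        return round(order_total * 0.10, 2)
    return 0.0

# Test data kept separate from test logic; edit here when the rules change.
DISCOUNT_CASES = [
    (50.00, 0.0),      # below threshold: no discount
    (100.00, 10.00),   # at threshold: 10% discount
    (250.00, 25.00),   # above threshold: 10% discount
]

@pytest.mark.parametrize("order_total, expected", DISCOUNT_CASES)
def test_discount_rules(order_total, expected):
    assert calculate_discount(order_total) == expected
```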
This methodology can be used and molded to your organization's needs. Rob Davis believes that
using this methodology is important in the development and ongoing maintenance of his
customers' applications.
• A description of the required hardware and software components, including test tools.
This information comes from the test environment, including test tool data.
• A description of roles and responsibilities of the resources required for the test, and
schedule constraints. This information comes from man-hours and schedules.
• Testing methodology. This is based on known standards.
• Functional and technical requirements of the application. This information comes from
requirements, change requests, and technical and functional design documents.
• Requirements that the system cannot provide, e.g. system limitations.
• An approved and signed-off test strategy document and test plan, including test cases.
• Testing issues requiring resolution. Usually this requires additional negotiation at the
project management level.
• Test cases and scenarios are designed to represent both typical and unusual situations
that may occur in the application.
• Test engineers define unit test requirements and unit test cases. Test engineers also
execute unit test cases.
• It is the test team that, with the assistance of developers and clients, develops test cases
and scenarios for integration and system testing.
• Test scenarios are executed through the use of test procedures or scripts.
• Test procedures or scripts define a series of steps necessary to perform one or more test
scenarios.
• Test procedures or scripts include the specific data that will be used for testing the
process or transaction.
• Test procedures or scripts may cover multiple test scenarios.
• Test scripts are mapped back to the requirements and traceability matrices are used to
ensure each test is within scope.
• Test data is captured and baselined prior to testing. This data serves as the foundation
for unit and system testing and is used to exercise system functionality in a controlled
environment (see the sketch below).
• Some output data is also baselined for future comparison. Baselined data is used to
support future application maintenance via regression testing.
• A pre-test meeting is held to assess the readiness of the application and the environment
and data to be tested. A test readiness document is created to indicate the status of the
entrance criteria of the release.
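The sketch below illustrates, under assumed names, how a test procedure might exercise a transaction with baselined test data, compare the output against a baselined expected result for regression purposes, and map back to requirement IDs for traceability. process_orders, the file name and the requirement IDs are hypothetical, not taken from any particular project.

```python
# Hypothetical sketch of a test procedure that exercises a transaction with
# baselined test data and compares the output against a baselined result,
# as used for regression testing. Names and requirement IDs are stand-ins.
import json
from pathlib import Path

# Simple traceability: the test procedure is mapped back to requirement IDs
# so a traceability matrix can confirm the test is within scope.
TRACEABILITY = {"test_order_totals_match_baseline": ["REQ-012", "REQ-017"]}

def process_orders(orders):
    """Stub standing in for the application transaction under test."""
    return [{"id": o["id"], "total": round(o["qty"] * o["price"], 2)} for o in orders]

def test_order_totals_match_baseline(tmp_path: Path):
    # Baselined input data, captured prior to testing.
    orders = [{"id": 1, "qty": 3, "price": 9.99}, {"id": 2, "qty": 1, "price": 250.0}]

    # Baselined expected output; in practice this would live under version
    # control next to the input data rather than being written here.
    baseline_file = tmp_path / "orders_expected.json"
    baseline_file.write_text(json.dumps([{"id": 1, "total": 29.97}, {"id": 2, "total": 250.0}]))

    actual = process_orders(orders)
    expected = json.loads(baseline_file.read_text())

    # Any discrepancy is a candidate defect (or a deliberate change that
    # requires re-baselining the expected output).
    assert actual == expected
```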
• Approved documents of test scenarios, test cases, test conditions and test data.
• Reports of software design issues, given to software developers for correction.
• The output from the execution of test procedures is known as test results. Test results
are evaluated by test engineers to determine whether the expected results have been
obtained. All discrepancies/anomalies are logged, discussed with the software team
lead, hardware test lead, programmers and software engineers, and documented for further
investigation and resolution. Every company has a different process for logging and
reporting bugs/defects uncovered during testing.
• Pass/fail criteria are used to determine the severity of a problem, and results are
recorded in a test summary report. The severity of a problem found during system
testing is defined in accordance with the customer's risk assessment and recorded in their
selected tracking tool.
• Proposed fixes are delivered to the testing environment, based on the severity of the
problem. Fixes are regression tested and flawless fixes are migrated to a new baseline.
Following completion of the test, members of the test team prepare a summary report.
The summary report is reviewed by the Project Manager, Software QA (SWQA) Manager
and/or Test Team Lead.
• After a particular level of testing has been certified, it is the responsibility of the
Configuration Manager to coordinate the migration of the release software components to
the next test level, as documented in the Configuration Management Plan. The software
is only migrated to the production environment after the Project Manager's formal
acceptance.
• The test team reviews test document problems identified during testing and updates
documents where appropriate.
• Approved test documents, e.g. Test Plan, Test Cases, Test Procedures.
• Test tools, including automated test tools, if applicable.
• Developed scripts.
• Changes to the design, i.e. Change Request Documents.
• Test data.
• Availability of the test team and project team.
• General and Detailed Design Documents, i.e. Requirements Document, Software Design
Document.
• Software that has been migrated to the test environment, i.e. unit tested code, via the
Configuration/Build Manager.
• Test Readiness Document.
• Document Updates.
• Log and summary of the test results. Usually this is part of the Test Report. This needs to
be approved and signed off, with revised testing deliverables.
• Changes to the code, also known as test fixes.
• Test document problems uncovered as a result of testing. Examples are Requirements
document and Design Document problems.
• Reports on software design issues, given to software developers for correction.
Examples are bug reports on code issues.
• Formal record of test incidents, usually part of problem tracking.
• Baselined package, also known as tested source and object code, ready for migration to
the next level.