Software Testing: Introduction
TESTING TECHNIQUES
CSR1105 – UNIT-I
Course Detail (lab contents, with hours)
1. Project to understand automated test case generation. (3 hours)
2. Using any Integrated Development Environment (IDE) (like Selenium), write a test suite containing a minimum of 4 to 5 test cases. (3 hours)
3. Project for the performance analysis and regression testing of any program. (4 hours)
4. A project to illustrate the use of object-oriented testing. (4 hours)
Error, Mistake, Bug, Fault and Failure
■ Ariane 5
– It took the European Space Agency 10 years and $7 billion to produce Ariane 5, a giant rocket capable of hurling a pair of three-ton satellites into orbit with each launch, intended to give Europe overwhelming supremacy in the commercial space business.
– The rocket was destroyed 39 seconds after launch, at an altitude of two and a half miles, along with its payload of four expensive and uninsured scientific satellites.
Some Software Failures
■ Y2K problem
■ Experience of Windows XP
Testing Definition
■ “Testing is the process of executing a program with the intent of finding faults.”
– Why should we test?
– Who should do the testing?
– What should we test?
– What do we mean by "complete testing"?
– And many more questions…
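The definition above can be made concrete with a minimal sketch: a program is executed with a test input chosen deliberately to expose a fault, not to show that the program "works". The function and its fault below are illustrative, not from the slides.

```python
# A minimal sketch of "executing a program with the intent of finding faults".
# The function contains a deliberate off-by-one fault; the second test input
# is chosen specifically to expose it.

def is_adult(age):
    """Intended specification: return True for age >= 18."""
    return age > 18          # fault: the boundary value 18 is misclassified

def run_test(func, test_input, expected):
    actual = func(test_input)
    if actual == expected:
        return "PASS"
    return f"FAIL: {test_input!r} -> {actual!r}, expected {expected!r}"

# A nominal value passes and hides the fault; the boundary value reveals it.
print(run_test(is_adult, 30, True))   # PASS
print(run_test(is_adult, 18, True))   # FAIL — the fault is found
```

A passing test therefore proves very little; a test is valuable precisely when it can fail.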
What is this?
■ A failure
■ An error
■ A fault
How do we deal with the errors?
■ Verification?
■ Patching?
■ Testing?
Seven Testing Principles
1. Testing shows the presence of defects, not their
absence
2. Exhaustive testing is impossible
3. Early testing saves time and money
4. Defects cluster together
5. Beware of the pesticide paradox
6. Testing is context dependent
7. Absence-of-errors is a fallacy
How do you test this?
The system shall operate at an input voltage range of nominal 100–250 VAC.
■ Poor Answer
Motivating the Bug Fix
■ Vary the options and settings of the program (change
conditions by changing something about the program
under test).
[Diagram: input test data → system under test → output test data]
EQUIVALENCE CLASSES AND
BOUNDARY VALUES
■ Robustness testing
■ Worst case testing
■ Robust worst case testing
Functional Testing
The general functional testing process is:
■ The requirements or specifications are analyzed.
■ Valid inputs are chosen based on the
specification to determine that the SUT
processes them correctly. Invalid inputs must
also be chosen to verify that the SUT detects
them and handles them properly.
■ Expected outputs for those inputs are
determined.
■ Tests are constructed with the selected inputs.
■ The tests are run. Actual outputs are compared
with the expected outputs.
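The five steps of the functional testing process can be sketched end to end. The SUT here (a score-grading function, its name and spec invented for illustration) stands in for any specified program; the point is the flow: choose valid and invalid inputs from the spec, record expected outputs, run, and compare.

```python
# Steps of the functional testing process, for a hypothetical SUT specified
# to accept scores 0-100 and reject anything else.

def grade(score):                        # the system under test (SUT)
    if not (0 <= score <= 100):
        raise ValueError("score out of range")
    return "PASS" if score >= 40 else "FAIL"

# Valid and invalid inputs chosen from the spec, paired with expected outputs.
cases = [
    (50, "PASS"),        # valid input, processed correctly
    (39, "FAIL"),        # valid input on the other side of the spec boundary
    (-1, ValueError),    # invalid input: the SUT must detect and reject it
    (101, ValueError),
]

# Run the tests and compare actual outputs with expected outputs.
def run(case):
    inp, expected = case
    try:
        return grade(inp) == expected
    except Exception as e:
        return isinstance(expected, type) and isinstance(e, expected)

results = [run(c) for c in cases]
print(results)
```

Note that invalid inputs have an expected *behaviour* (rejection) rather than an expected value, so the comparison step must handle both.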
Functional Testing
Applicability
Boundary Value Analysis
a≤x≤b
c≤y≤d
Boundary Value Analysis
[Figure: boundary-value test points for a function of two variables, x in [a, b], y in [c, d]]
Functional Testing
[Figure: boundary value analysis of test cases for a function of two variables, x in [a, b], y in [c, d]]
Functional Testing
100 ≤ x ≤ 300
100 ≤ y ≤ 300
Boundary Value Analysis
Test case |  x  |  y  | Expected output
    1     | 200 | 100 |      20000
    2     | 200 | 101 |      20200
    3     | 200 | 200 |      40000
    4     | 200 | 299 |      59800
    5     | 200 | 300 |      60000
    6     | 100 | 200 |      20000
    7     | 101 | 200 |      20200
    8     | 299 | 200 |      59800
    9     | 300 | 200 |      60000
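The 4n + 1 boundary-value cases above can be generated mechanically. The expected outputs in the table are consistent with f(x, y) = x · y; that function is an assumption here, since the slides give only the table.

```python
# Generate the 4n+1 single-fault boundary-value test cases for variables
# with ranges 100 <= x <= 300 and 100 <= y <= 300.

def bva_cases(ranges):
    """ranges: dict var -> (min, max). Yields the 4n+1 BVA test points."""
    nominal = {v: (lo + hi) // 2 for v, (lo, hi) in ranges.items()}
    yield dict(nominal)                              # the all-nominal case
    for v, (lo, hi) in ranges.items():
        for val in (lo, lo + 1, hi - 1, hi):         # min, min+, max-, max
            case = dict(nominal)                     # one variable varies,
            case[v] = val                            # the rest stay nominal
            yield case

cases = list(bva_cases({"x": (100, 300), "y": (100, 300)}))
print(len(cases))                                    # 4*2 + 1 = 9, as in the table
print([(c["x"], c["y"], c["x"] * c["y"]) for c in cases[:3]])
```

The generator makes the single-fault assumption explicit: only one variable leaves its nominal value in any test case.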
Robustness Testing
Robustness Testing
[Figure: robustness test cases for a function of two variables, including points just outside the valid region]
Robustness Testing
o Just below minimum
o Minimum
o Just above minimum
o Nominal value
o Just below maximum
o Maximum
o Just above maximum
Robustness Testing
100 ≤ x ≤ 300
100 ≤ y ≤ 300
The robust test cases are given in Table 2
Robustness Testing
Each variable takes the values min, min+, nom, max−, and max while the others stay nominal (the 4n + 1 BVA cases); robustness testing adds the out-of-range values min− and max+, giving 6n + 1 test cases in all.
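The 6n + 1 robustness cases can be generated the same way as the BVA cases, extending the per-variable value set with the two out-of-range points:

```python
# Robustness testing extends BVA with the out-of-range values min- and max+,
# giving 6n+1 test cases (13 for the two-variable example above).

def robustness_cases(ranges):
    nominal = {v: (lo + hi) // 2 for v, (lo, hi) in ranges.items()}
    yield dict(nominal)
    for v, (lo, hi) in ranges.items():
        # min-, min, min+, max-, max, max+ — one variable varies at a time
        for val in (lo - 1, lo, lo + 1, hi - 1, hi, hi + 1):
            case = dict(nominal)
            case[v] = val
            yield case

cases = list(robustness_cases({"x": (100, 300), "y": (100, 300)}))
print(len(cases))   # 6*2 + 1 = 13
```

The out-of-range points (99 and 301 here) are the ones whose expected "output" is an error message rather than a computed value.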
Worst Case Testing
[Figure: worst-case test cases for a function of two variables]
Worst Case Testing
Table 3: Worst-case test inputs for the two-variable example
Test case inputs: all 25 combinations of x, y ∈ {min, min+, nom, max−, max} = {100, 101, 200, 299, 300}.
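Worst-case testing drops the single-fault assumption: every variable takes each of its five BVA values simultaneously, so the inputs are the full Cartesian product, 5^n cases.

```python
import itertools

# Worst-case testing: the Cartesian product of the five BVA values of
# every variable, 5^n test cases in total.

def worst_cases(ranges):
    names = list(ranges)
    value_sets = []
    for lo, hi in ranges.values():
        nom = (lo + hi) // 2
        value_sets.append((lo, lo + 1, nom, hi - 1, hi))   # 5 values per variable
    return [dict(zip(names, combo)) for combo in itertools.product(*value_sets)]

cases = worst_cases({"x": (100, 300), "y": (100, 300)})
print(len(cases))   # 5^2 = 25 test cases
```

This is why worst-case testing is reserved for critical variables: the case count grows exponentially with n.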
Robust Worst Case Testing
[Figure: robust worst-case test cases for a function of two variables, x in [a, b], y in [c, d]]
Robust Worst Case Testing
100 ≤ x ≤ 300
100 ≤ y ≤ 300
The 7² = 49 robust worst-case test cases combine all seven robustness values of each variable.
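Robust worst-case testing takes the Cartesian product of all seven robustness values per variable, 7^n cases:

```python
import itertools

# Robust worst-case testing: every combination of the seven robustness
# values of each variable, 7^n test cases — 7^2 = 49 for this example.

def robust_worst_cases(ranges):
    names = list(ranges)
    value_sets = []
    for lo, hi in ranges.values():
        nom = (lo + hi) // 2
        # min-, min, min+, nom, max-, max, max+
        value_sets.append((lo - 1, lo, lo + 1, nom, hi - 1, hi, hi + 1))
    return [dict(zip(names, combo)) for combo in itertools.product(*value_sets)]

cases = robust_worst_cases({"x": (100, 300), "y": (100, 300)})
print(len(cases))   # 49
```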
Limitations - BVA
■ Does not work well for Boolean variables
Example: Equivalence Class Testing
Consider a simple program to classify a triangle. Its input is a triple of positive integers (say x, y, z), and the data type for the input parameters ensures that these will be integers greater than 0 and less than or equal to 100. The program output may be one of the following words:
[Scalene; Isosceles; Equilateral; Not a triangle]
Design the equivalence class test cases.
Example: Equivalence Class Testing — Solution
Output domain equivalence classes are:
O1={<x,y,z>: Equilateral triangle with sides x,y,z}
O2={<x,y,z>: Isosceles triangle with sides x,y,z}
O3={<x,y,z>: Scalene triangle with sides x,y,z}
O4={<x,y,z>: Not a triangle with sides x,y,z}
The test cases are:
Test case |  x  |  y  |  z  | Expected output
    1     |  50 |  50 |  50 | Equilateral
    2     |  50 |  50 |  99 | Isosceles
    3     | 100 |  99 |  50 | Scalene
    4     |  50 | 100 |  50 | Not a triangle
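The slides specify only the four outputs, so the classifier below is a sketch of an implementation one might test, run against the four output-domain cases:

```python
# A hypothetical triangle classifier (the slides give only the expected
# outputs, so this implementation is an assumption, not the course's program).

def classify(x, y, z):
    if not all(1 <= s <= 100 for s in (x, y, z)):
        return "Invalid input"
    # Triangle inequality: each side must be less than the sum of the others.
    if x + y <= z or y + z <= x or x + z <= y:
        return "Not a triangle"
    if x == y == z:
        return "Equilateral"
    if x == y or y == z or x == z:
        return "Isosceles"
    return "Scalene"

# The four output-domain equivalence class test cases from the table.
cases = [
    (50, 50, 50, "Equilateral"),
    (50, 50, 99, "Isosceles"),
    (100, 99, 50, "Scalene"),
    (50, 100, 50, "Not a triangle"),
]
print(all(classify(x, y, z) == exp for x, y, z, exp in cases))
```

Note that (50, 100, 50) is "Not a triangle" because 50 + 50 is not greater than 100: the degenerate case where one side equals the sum of the others.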
Example: Equivalence Class Testing
Input domain based classes are:
I1={x: x < 1}
I2={x: x > 100}
I3={x: 1 ≤ x ≤ 100}
I4={y: y < 1}
I5={y: y > 100}
I6={y: 1 ≤ y ≤ 100}
I7={z: z < 1}
I8={z: z > 100}
I9={z: 1 ≤ z ≤ 100}
Example: Equivalence Class Testing
Some inputs domain test cases can be obtained using the relationship
amongst x,y and z.
I10={<x,y,z>: x = y = z}
I11={<x,y,z>: x = y, x ≠ z}
I12={<x,y,z>: x = z, x ≠ y}
I13={<x,y,z>: y = z, x ≠ y}
I14={<x,y,z>: x ≠ y, x ≠ z, y ≠ z}
I15={<x,y,z>: x = y + z}
I16={<x,y,z>: x > y + z}
I17={<x,y,z>: y = x + z}
I18={<x,y,z>: y > x + z}
I19={<x,y,z>: z = x + y}
I20={<x,y,z>: z > x + y}
Example: Equivalence Class Testing
Test cases derived from input domain are:
Test case |  x  |  y  |  z  | Expected output
    1     |   0 |  50 |  50 | Invalid input
    2     | 101 |  50 |  50 | Invalid input
    3     |  50 |  50 |  50 | Equilateral
    4     |  50 |   0 |  50 | Invalid input
    5     |  50 | 101 |  50 | Invalid input
    6     |  50 |  50 |  50 | Equilateral
    7     |  50 |  50 |   0 | Invalid input
    8     |  50 |  50 | 101 | Invalid input
    9     |  50 |  50 |  50 | Equilateral
   10     |  60 |  60 |  60 | Equilateral
   11     |  50 |  50 |  60 | Isosceles
   12     |  50 |  60 |  50 | Isosceles
   13     |  60 |  50 |  50 | Isosceles
   14     | 100 |  99 |  50 | Scalene
   15     | 100 |  50 |  50 | Not a triangle
   16     | 100 |  50 |  25 | Not a triangle
   17     |  50 | 100 |  50 | Not a triangle
   18     |  50 | 100 |  25 | Not a triangle
   19     |  50 |  50 | 100 | Not a triangle
   20     |  25 |  50 | 100 | Not a triangle
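The input-domain cases add range checking on top of classification. The sketch below (the same hypothetical classifier as above, repeated so the block is self-contained) runs a sample of rows from the table, one per interesting class:

```python
# Hypothetical classifier with input validation (an assumption; the slides
# give only the expected outputs).

def classify(x, y, z):
    if not all(1 <= s <= 100 for s in (x, y, z)):   # I1, I2, I4, I5, I7, I8
        return "Invalid input"
    if x + y <= z or y + z <= x or x + z <= y:       # I15-I20
        return "Not a triangle"
    if x == y == z:                                  # I10
        return "Equilateral"
    if x == y or y == z or x == z:                   # I11-I13
        return "Isosceles"
    return "Scalene"                                 # I14

# A sample of rows from the input-domain table.
rows = [
    (0, 50, 50, "Invalid input"),      # below range
    (50, 101, 50, "Invalid input"),    # above range
    (100, 50, 50, "Not a triangle"),   # I15: x = y + z (degenerate)
    (100, 50, 25, "Not a triangle"),   # I16: x > y + z
    (50, 50, 60, "Isosceles"),         # I13: x = y, differs from z
]
print(all(classify(x, y, z) == exp for x, y, z, exp in rows))
```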
EQUIVALENCE CLASSES
■ Don't forget equivalence classes for invalid inputs.
■ Organize your classifications into a table or an outline.
■ Look for ranges of numbers.
■ Analyze responses to lists and menus.
■ Look for variables that must be equal.
■ Create time-determined equivalence classes.
■ Look for variable groups that must calculate to a
certain value or range.
■ Look for equivalent output events.
■ Look for equivalent operating environments.
BVA and ECP
■ Boundary Value Testing derives test cases with
– Serious gaps
– Massive redundancy
Decision Table-Based Testing

Conditions       Rule 1  Rule 2  Rule 3  Rule 4
No discount         T       F       F       F
Extra discount      F       T       T       F
Loyalty points      F       F       F       T
■ What benefit do we get from using decision tables in
place of equivalence classes?
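A decision table can be treated directly as test data: each column (rule) maps a combination of condition truth values to one test case. The rule labels below are illustrative; the condition names come from the table above.

```python
# The decision table as data: one tuple of condition values per column.

conditions = ["No discount", "Extra discount", "Loyalty points"]
rules = [
    (True,  False, False),
    (False, True,  False),
    (False, True,  False),   # columns 2 and 3 are identical in the table
    (False, False, True),
]

# Decision-table testing derives one test case per distinct rule, so the
# two identical columns collapse into a single test case.
distinct = sorted(set(rules))
for rule in distinct:
    active = [c for c, v in zip(conditions, rule) if v]
    print("test case:", active)
print(len(distinct), "test cases from", len(rules), "columns")
```

This hints at one answer to the question above: unlike plain equivalence classes, the table makes combinations of conditions (and redundant columns) explicit and checkable.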
■ Node i
– def(i) = set of variables defined at node i that can be used after i
– c-use(i) = set of variables used in a computation at node i, not defined at i
■ Edge (i, j)
– p-use(i, j) = set of variables appearing in the predicate that conditions the transfer of control from i to j
Data Objects
■ (d) Defined, created, initialized
■ (k) Killed, undefined, released
■ (u) Used:
– (c) used in a computation
– (p) used in a predicate
Data Flow Analysis
■ All-defs
– For each variable definition, select a subpath to some use (either a computation use or a predicate use)
■ All c-uses / some p-uses (resp. all p-uses / some c-uses)
– Use in computation (resp. in predicate) is favored
■ All-uses
– Select a subpath for each use
■ All du-paths
– Select all possible definition-clear subpaths, without iteration, between each definition and each of its uses
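The def/c-use/p-use sets and the du-pairs they induce can be sketched on a tiny program with one branch. The node numbering and the program itself are illustrative:

```python
# Program under analysis (node numbers are illustrative):
#   1: x = input()      def(1) = {x}
#   2: y = x * 2        def(2) = {y}, c-use(2) = {x}
#   3: if y > 10:       p-use(3,4) = p-use(3,5) = {y}
#   4:     print(y)     c-use(4) = {y}
#   5:     print(x)     c-use(5) = {x}

defs  = {1: {"x"}, 2: {"y"}}
cuses = {2: {"x"}, 4: {"y"}, 5: {"x"}}
puses = {(3, 4): {"y"}, (3, 5): {"y"}}

# All-uses: pair each definition with every later use of the same variable.
du_pairs = []
for node, vars_defined in defs.items():
    for v in vars_defined:
        for use_node, used in cuses.items():
            if v in used and use_node > node:
                du_pairs.append((v, node, use_node))
        for (i, j), used in puses.items():
            if v in used and i > node:
                du_pairs.append((v, node, (i, j)))

print(sorted(du_pairs, key=str))
```

Each du-pair then requires a definition-clear subpath in the test suite; the "later node means reachable" shortcut used here holds only for this straight-line example, not for general control-flow graphs.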
Ordering of criteria
Web Application Testing
■ Content evaluated at both syntactic and semantic levels
■ Function tested to uncover lack of conformance to
requirements
■ Structure is assessed to ensure proper content and
function are delivered
■ Usability is tested to ensure that each category of user can
be supported as new content or functionality is added
■ Navigability is tested to ensure that all navigation syntax
and semantics are exercised
■ Performance is tested under a variety of operating
conditions, configurations, and loading to ensure a
reasonable level of user response
■ Compatibility tested by executing WebApp using a variety
of client and server configurations
■ Security is tested by assessing potential vulnerabilities and
trying to exploit each of them
Performance Testing
■ Tests user response time of web applications
■ Load testing
– Tests the database: the largest load the database can handle at one time
■ Stress testing
– Tests the server: peak volume over a short span of time
Examples
■ Web applications (online systems)
■ Database systems
■ File exchange
■ Disk space
– Multiple users logging into the system at one time (e.g., 100 users log in at once)
– Users logging into the system very rapidly (e.g., 1 user every second)
– Extended concurrency times (e.g., 25 users remain logged in, running heavy transactions for an extended period)
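The "100 users log in at one time" scenario can be sketched with threads against a hypothetical in-memory login function (no real server involved; the names are illustrative):

```python
import threading
import time

# Simulate 100 users logging in concurrently against an in-memory stub.

lock = threading.Lock()
sessions = []

def login(user_id):
    time.sleep(0.001)            # stand-in for server-side work
    with lock:                   # the shared session store must be protected
        sessions.append(user_id)

threads = [threading.Thread(target=login, args=(i,)) for i in range(100)]
start = time.time()
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.time() - start

# Checkpoint (see below): are really ALL data processed and stored?
print(len(sessions), "logins completed in", round(elapsed, 2), "s")
```

Removing the lock is an instructive experiment: under enough concurrency, lost updates appear, which is exactly the kind of defect concurrency scenarios are designed to flush out.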
Stress Testing – Checkpoints
■ Warnings and error messages, do they come at all and
are they helpful and understandable?
■ Are data lost or corrupted?
■ Is the system too slow?
■ Do timeouts happen?
■ Does the system crash?
■ Are really ALL data processed and stored? Even the
end of files, messages or data elements?
■ Are data overwritten without warning?
■ Does the system prioritize in a reasonable way at
overload situations? Are prioritized services still
available?
Stress Testing
■ Tests the breaking point of one system by
overwhelming resources.
■ Ensures that failures are followed by graceful recovery
■ Features
– Flexible scenario and dynamic data generation
– Real world load simulation
– Distributed stress testing
– Reports
Load Testing
■ Ensures application can handle load
conditions
■ Tests to make sure that speed is not
sacrificed should multiple users access
application
■ End user experience testing
◦ Run full-scale web load test while a subset of users logs
into system to conduct normal work activity
◦ Have a subset of end users log into the system in the
middle of load test to gauge performance
■ Transaction response testing
◦ Order entry must complete within 8 seconds
◦ AP query must return results within 5 seconds
◦ PDF attachment must upload within 5 seconds
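Transaction-response testing can be sketched as a timing harness: each transaction gets a budget taken from the requirements, and the test fails if the budget is exceeded. The transaction functions here are stubs; the thresholds are the ones listed above.

```python
import time

# Response-time budgets from the requirements (seconds).
BUDGETS = {"order_entry": 8.0, "ap_query": 5.0, "pdf_upload": 5.0}

def timed(name, func):
    """Run one transaction and check it against its budget."""
    start = time.perf_counter()
    func()
    elapsed = time.perf_counter() - start
    ok = elapsed <= BUDGETS[name]
    print(f"{name}: {elapsed:.3f}s  {'OK' if ok else 'TOO SLOW'}")
    return ok

# Stub transactions; in a real harness each would drive the application.
results = {name: timed(name, lambda: time.sleep(0.01)) for name in BUDGETS}
print(all(results.values()))
```

In practice the same harness is run while the system is under load, so that the budgets are checked under realistic conditions rather than on an idle server.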