SWVV_L17b_Integration_system_validation_testing
Istvan Majzik
majzik@mit.bme.hu
Testing and test design in the V-model
[Figure: the V-model — specification/design branch: System specification, Architecture design, Component design, Component implementation; integration branch: System integration (• Integration testing), System delivery (• System testing, • Validation testing), Operation and maintenance]
Integration testing
Software integration testing
[Figure: document flow of software integration testing — the Software architecture design, Software construction design, Software integration test plan, and Software quality assurance plan are inputs to Software integration testing, which produces the Software integration test report]
Goals, methods and approaches
Goal and motivation:
o Testing the interactions of components
o The (system-level) interaction of components may be incorrect even if all individual components are correct
Method: Cover the interaction scenarios with tests
o Sometimes the scenarios are part of the specification
o Systematic testing: Covering all / representative scenarios
o The concept of equivalence partitions and boundary values is applied to interactions (at the scenario / input data level)
Approaches:
o “Big bang”: integration of all components before testing
o Incremental testing: stepwise integration + testing
“Big bang” testing
Integration of all components, then testing through the external interfaces of the integrated system
External test driver
Based on the functional specification of the system
To be applied only to small systems
[Figure: an external tester drives the integrated components (B, C, D, …) through the system interfaces; if the error is in an inner component, debugging is difficult]
Incremental integration and testing
Applied in the case of complex systems
Adapted to the component hierarchy (calling levels)
Basis: Isolation testing of components
Components or system parts are tested in isolation
Test drivers and test doubles (used to substitute dependencies)
Dependency: anything collaborating with the CUT that does not belong to it
[Figure: component hierarchy A → A1, A2, A3 → A311, A312, A313; the component under test (CUT) is exercised by a test driver, and its dependencies are replaced by test doubles]
General problem: Handling dependencies
Several approaches for substituting dependencies
o Test double: Generic name of the substitute
o Isolation frameworks (e.g., Mockito, JMock, …)
Stub
o Predefined replies to calls
o Checking the state of the CUT
Mock
o Expected and checked behavior
o Checking the interactions of the CUT
(number of calls, parameters, …; see the sketch after this list)
Dummy
o Component that is not used (just a “filler”)
Fake
o Working component, but not the real one
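As an illustration of the stub vs. mock distinction, here is a minimal sketch using Mockito with JUnit 5 (one of the isolation frameworks named above). The Thermostat, TemperatureSensor, and Heater types are hypothetical examples introduced only for this sketch, not part of the lecture material.

```java
import static org.mockito.Mockito.*;
import static org.junit.jupiter.api.Assertions.*;
import org.junit.jupiter.api.Test;

interface TemperatureSensor { double read(); }           // dependency to be stubbed
interface Heater { void switchOn(); void switchOff(); }  // dependency to be mocked

// Component under test (CUT): switches the heater on below the limit.
class Thermostat {
    private final TemperatureSensor sensor;
    private final Heater heater;
    private boolean heating = false;
    Thermostat(TemperatureSensor s, Heater h) { sensor = s; heater = h; }
    void control(double limit) {
        heating = sensor.read() < limit;
        if (heating) heater.switchOn(); else heater.switchOff();
    }
    boolean isHeating() { return heating; }
}

class ThermostatTest {
    @Test
    void stubDrivesStateCheck() {
        TemperatureSensor sensor = mock(TemperatureSensor.class);
        Heater heater = mock(Heater.class);
        when(sensor.read()).thenReturn(15.0);  // stub: predefined reply to the call
        Thermostat cut = new Thermostat(sensor, heater);
        cut.control(20.0);
        assertTrue(cut.isHeating());           // checking the state of the CUT
    }

    @Test
    void mockChecksInteractions() {
        TemperatureSensor sensor = mock(TemperatureSensor.class);
        Heater heater = mock(Heater.class);    // mock: expected, checked behavior
        when(sensor.read()).thenReturn(25.0);
        new Thermostat(sensor, heater).control(20.0);
        verify(heater, times(1)).switchOff();  // checking interactions of the CUT
        verify(heater, never()).switchOn();
    }
}
```

The first test uses the sensor as a stub (predefined reply, then the CUT's state is checked); the second uses the heater as a mock (the CUT's interactions are verified).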
Top-down integration testing
Components are tested from their callers (which are already tested)
Stubs replace the lower-level components that are called
Requirement-oriented testing
Component modification: it affects the testing of all lower levels
[Figure: top-down integration — the already tested component A acts as the test driver for A1, A2, and A3]
Functional integration
Motivation:
o There are several system-level functions
o Functions of different criticality → testing can be prioritized
Basic idea:
o Integration on the basis of adding system functions
o Each function is integrated and tested in a top-down way
Specific case of top-down integration testing:
o Requirement-oriented (w.r.t. the given function)
o Test doubles (stubs) are needed
o The top level is tested with more and more functions
o Component modification: it affects the testing of lower levels
Integration with the runtime environment
Motivation:
o It is hard to construct stubs for the runtime
environment
o See e.g., platform services, RT OS, task scheduler, …
Strategy:
1. Top-down integration of the application components
down to the level of the runtime environment
2. Bottom-up testing of the runtime environment
• Isolation testing of functions (if necessary)
• Testing with the lowest level of the application hierarchy
3. Integration of the application with the runtime
environment, finishing top-down integration
Coverage metrics: State-based approach
Goal: Coverage of interactions among components
o Basic case: Coverage of interface functions (by calls)
State-based coverage metrics:
o Coverage of the interface functions in all relevant states (or transitions) of the caller and the called component
o Extension: with all triggers and conditions of the call
Example: an opB2() call can be served by two different transitions of component B, so both have to be covered (see the sketch below).
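A minimal sketch of what this means in test code, assuming a hypothetical two-state component CompB (JUnit 5): opB2() is exercised once in each relevant state of B, so both transitions that can serve the call are covered.

```java
import static org.junit.jupiter.api.Assertions.*;
import org.junit.jupiter.api.Test;

class CompB {
    enum State { IDLE, BUSY }
    State state = State.IDLE;
    // opB2() behaves differently depending on the current state,
    // i.e., the same call can be served by two different transitions.
    int opB2() {
        if (state == State.IDLE) { state = State.BUSY; return 1; }
        else                     { state = State.IDLE; return 2; }
    }
}

class StateCoverageTest {
    @Test void opB2CoveredInIdleState() {
        CompB b = new CompB();      // initial state: IDLE
        assertEquals(1, b.opB2());  // covers the IDLE -> BUSY transition
    }
    @Test void opB2CoveredInBusyState() {
        CompB b = new CompB();
        b.opB2();                   // drive B into the BUSY state first
        assertEquals(2, b.opB2());  // covers the BUSY -> IDLE transition
    }
}
```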
Coverage metrics: Data-flow-based approach
Data-flow-based metrics (covering def-use paths):
o Coverage extended to coupling paths
(across function calls and returns)
Specific def-use labels:
• Last-def-before-call – First-use-in-callee
• Last-def-before-return – First-use-after-call
o Related coverage metrics:
• All-coupling-defs (similar to all-defs)
• All-coupling-uses (similar to all-uses)
• All-coupling-paths (similar to all-paths)
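The sketch below, using a hypothetical Caller/Callee pair, marks these coupling def-use labels as comments; the metrics above are then defined over such def-use pairs.

```java
// Hypothetical Caller/Callee pair; comments mark the coupling def-use labels.
class Callee {
    int check(int limit) {
        int doubled = limit * 2;  // first-use-in-callee of 'limit'
        return doubled;           // last-def-before-return ('doubled' flows back)
    }
}

class Caller {
    int run(Callee callee) {
        int limit = 10;                    // last-def-before-call of 'limit'
        int result = callee.check(limit);  // value passed over the interface
        return result + 1;                 // first-use-after-call of the result
    }
}
// All-coupling-defs: each last-def must reach at least one matching first-use.
// All-coupling-uses: each last-def must reach every matching first-use.
// All-coupling-paths: all (loop-free) paths between such pairs are covered.
```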
Testing robustness of interfaces
o Extreme and boundary values of call parameters
o Mutating calls in scenarios (omission, duplication, change of ordering, extreme parameters, etc.)
System testing
System testing
Testing on the basis of the system specification
Characteristics:
o Performed after hardware-software integration
o Testing the functional specification + extra-functional properties
Testing aspects (depending on the specification):
o User workload (according to user profile)
o Checking application conditions of the system
(resource usage, saturation)
o Testing fault handling
o Data integrity
o … (depending on the system specification)
Types of system tests (examples)
Performance testing:
• Real workload
• Response times
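Below is a minimal sketch of such a performance test, assuming a hypothetical Service interface as a stand-in for the system under test: a pool of concurrent clients generates the workload, and the 95th-percentile response time is reported.

```java
import java.util.*;
import java.util.concurrent.*;

interface Service { void handle(String request); }  // assumed stand-in for the SUT

class PerformanceTest {
    public static void main(String[] args) throws Exception {
        Service sut = req -> { /* placeholder for the real system call */ };
        int clients = 20, requestsPerClient = 100;
        ExecutorService pool = Executors.newFixedThreadPool(clients);
        List<Long> latencies = Collections.synchronizedList(new ArrayList<>());

        // Each client issues a stream of requests and records response times.
        for (int c = 0; c < clients; c++) {
            pool.submit(() -> {
                for (int i = 0; i < requestsPerClient; i++) {
                    long start = System.nanoTime();
                    sut.handle("request");
                    latencies.add(System.nanoTime() - start);
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);

        // Report the 95th-percentile response time.
        latencies.sort(Long::compareTo);
        long p95 = latencies.get((int) (latencies.size() * 0.95) - 1);
        System.out.printf("p95 response time: %.3f ms%n", p95 / 1e6);
    }
}
```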
Software validation
[Figure: document flow of software validation — the System requirements specification, Software requirements specification, Software requirements test specification, and Software validation plan are inputs to Software validation, which produces the Software validation test report and the Software validation report]
Validation testing
Goal: Testing in the real environment
o User requirements and expectations are taken into account
o Non-specified expectations may come up
o Reaction to unexpected inputs/conditions is checked
o Events of low probability may appear
Timing aspects
o Constraints and conditions of the real environment
o Real-time testing and monitoring is needed
Environment simulation
o If given situations cannot be tested in a real environment
(e.g., protection systems)
o Simulators shall be validated somehow
Summary: Testing levels
1. Component (module, unit) testing
o Isolation testing
2. Integration testing: focus on interactions
o (“Big bang” testing)
o Top-down testing
o Bottom-up testing
o Functional integration
o Integration with the runtime environment
3. System testing: focus on system level properties
o Testing the integrated system
4. Validation testing: focus on users + real context
o Testing user expectations in the real environment
o Environment simulation
Documentation of testing
Standard test documentation (IEEE 829:1998)
Standard for Software Test Documentation
Test planning:
Test Plan: What is tested, by whom, how, in what time frame, to what quality
SPACEDIRT: Scope, People, Approach, Criteria, Environment, Deliverables, Incidentals,
Risks, Tasks
Test specification:
Test Design Specifications: Test conditions, expected outcomes, what counts as a successful test
Test Case Specifications: The specific test data (test suites)
Test Procedure Specifications: What kind of physical set-up is required, how the tester
runs the test, what steps need to be followed
Test reporting:
Test Item Transmittal Report: When specific tested items are passed from one stage of testing to another
Test Log: What test cases were run, by whom, in what order, and whether individual tests passed or failed
Test Incident Report: Details of test failures (when, why)
Test Summary Report: Assessment of the quality of the system
Standard test documentation (IEEE 829:2008)
Standard for Software and System Test Documentation
Test planning (covering multiple testing levels):
Master Test Plan (MTP): Overall test planning for multiple levels
Level Test Plans (LTP): Scope, approach, resources, and schedule of the testing
Test design:
Level Test Design (LTD): Test cases, the expected results, the test pass criteria
Level Test Case (LTC): Specifying the test data for use in running the test cases
Level Test Procedure (LTPr): How to run each test (preconditions and the steps)
Test reporting:
Level Test Log (LTL): Record of relevant details about the execution
Anomaly Report (AR): Events that occur during testing and require investigation
Level Interim Test Status Report (LITSR): Summarize/evaluate interim results
Level Test Report (LTR): Summarize/evaluate the results after test execution has finished
for the specific test level
Master Test Report (MTR): Summarize/evaluate the results of the levels
U2TP: UML 2 Testing Profile (OMG, 2004)
Able to capture all information needed for functional black-box testing (specification of test artifacts)
o With mapping rules to JUnit and TTCN-3 (Testing and Test Control Notation)
A language (notation), not a method (it does not prescribe how to test)
U2TP Test Architecture package
Identification of main components:
SUT: System Under Test
o Can be: System, subsystem, component, object
o Characterized by interfaces for control and observation
Test Component: Part of the test system (e.g., a simulator)
o Realizes the behavior of a test case
(Test Stimulus, Test Observation, Validation Action, Log Action)
Test Context: Collaboration of test architecture elements
o Initial test configuration (test components)
o Test control (decision on execution, e.g., if a test fails)
Scheduler: Controls the execution of test components
o Creation and destruction of test components
Arbiter: Calculation of final test results
o E.g., threshold on the basis of test component verdicts
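As a loose Java analogy (not prescribed by U2TP itself), an Arbiter could compute the final result as the worst verdict reported by any test component; the Verdict values come from the Test Behavior package below.

```java
import java.util.*;

enum Verdict { PASS, INCONCLUSIVE, FAIL, ERROR }  // ordered from best to worst

class Arbiter {
    // Final verdict = the worst verdict any test component reported
    // (one possible arbitration rule; a threshold-based rule is another).
    Verdict finalVerdict(List<Verdict> componentVerdicts) {
        return componentVerdicts.stream()
                .max(Comparator.comparingInt(Verdict::ordinal))
                .orElse(Verdict.INCONCLUSIVE);
    }
}

class ArbiterDemo {
    public static void main(String[] args) {
        Arbiter arbiter = new Arbiter();
        System.out.println(arbiter.finalVerdict(
                List.of(Verdict.PASS, Verdict.FAIL, Verdict.PASS)));  // prints FAIL
    }
}
```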
U2TP Test Architecture example
[Figure: example U2TP test architecture diagram]
U2TP Test Data package
Identification of types and values for testing,
e.g., sent and received data (with wildcards *, ?)
o Test Parameter
• Stimulus and observation
o Argument
• Concrete physical value
o Data Partition: Equivalence class for a given type
• Class of physical values, e.g., valid names
o Data Selector: Retrieving data out of a data pool
• Operating on contained values or value sets
o Templates
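A hypothetical sketch of how data partitions and a data selector could look in code (the class names are illustrative, not defined by U2TP):

```java
import java.util.*;

// Data Partition: an equivalence class of concrete physical values for a type.
class DataPartition {
    final String name;           // e.g., "valid names"
    final List<String> values;   // the class of concrete values
    DataPartition(String name, List<String> values) {
        this.name = name; this.values = values;
    }
}

// Data pool with a Data Selector: retrieves a concrete value (an "argument")
// out of the pool for a given partition.
class DataPool {
    final List<DataPartition> partitions = new ArrayList<>();
    String select(String partitionName, Random rnd) {
        for (DataPartition p : partitions)
            if (p.name.equals(partitionName))
                return p.values.get(rnd.nextInt(p.values.size()));
        throw new NoSuchElementException(partitionName);
    }
}

class DataPoolDemo {
    public static void main(String[] args) {
        DataPool pool = new DataPool();
        pool.partitions.add(new DataPartition("valid names", List.of("Alice", "Bob")));
        pool.partitions.add(new DataPartition("invalid names", List.of("", "42!")));
        System.out.println(pool.select("valid names", new Random(1)));
    }
}
```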
U2TP Test Data example
[Figure: example U2TP test data definitions]
U2TP Test Behavior package
Specification of default / expected behavior
Identification of behavioral elements:
o Test Stimulus: Test data sent to SUT
o Test Observation: Reactions from the SUT
o Verdict: Pass, fail, error, or inconclusive
o Actions: Validation Action (inform Arbiter), Log Action
Test Case: Specifies one case to test the SUT
o Test Objective: Named element
o Test Trace: Result of test execution
• Messages exchanged
o Verdict
U2TP Test Behavior example
[Figure: example U2TP test behavior specification]
Example: Bluetooth roaming
System under test: [Figure: architecture of the Bluetooth roaming system]
Test objective: Slave Roaming Layer functionality
o Monitoring link quality
o Connecting to a different master
Example: Components
[Figure: component structure of the Bluetooth roaming example]
Example: Test scenario
Test case implementation (see BlueToothSuite as the test context), using:
• References
• Timers
• Defaults
Test scenarios (details)
[Figure: detailed test scenarios]