DLMS/COSEM
Companion Specification for Energy Metering
Conformance Testing Process
Table of Contents
1. Foreword
2. Scope
3. Introduction
3.1 Referenced documents
3.2 Terms and definitions
3.3 Abbreviations
4. Conformance testing - overview
4.1 OSI conformance testing
4.2 DLMS/COSEM conformance testing
4.3 Main features of DLMS/COSEM conformance testing
5. The conformance test plans
5.1 Scope of testing
5.2 Device testing
5.3 Structure of the conformance test plans
5.4 Abstract test cases
5.5 Test outcomes and verdicts
5.6 Conformance test plan for the Data link layer using HDLC protocol
5.6.1 Version 2.0
5.6.2 Version 4.1
5.7 Conformance test plan for the COSEM application layer
5.7.1 Version 2.0
5.7.2 Version 4.1
5.8 Conformance test plan for COSEM interface objects
5.8.1 Version 2.0
5.8.2 Version 4.2
6. The DLMS/COSEM conformance test tool
6.1 Introduction
6.2 Licensing the CTT
7. The conformance assessment process
7.1 Overview
7.2 Preparation for testing
7.2.1 Preparation of the IUT
7.2.2 Production of the CTI
7.3 Test operations
7.3.1 Review of the CTI
7.3.2 Test selection and parameterization
7.3.3 Test campaigns
7.4 Documents generated by the CTT
7.4.1 Conformance test report
7.4.2 Conformance log
7.4.3 Line traffic
7.4.4 Viewing the conformance test plans
7.4.5 Viewing the test scripts
7.4.6 The “Create snapshot” feature
7.4.7 The “Create certification report” feature
7.5 Repeatability of results
7.6 Requirements for test laboratories
8. The certification process
8.1 General
1. Foreword
Copyright
This document is confidential. It may not be copied, nor handed over to persons outside the
standardisation environment.
The copyright is enforced by national and international law. The "Berne Convention for the
Protection of Literary and Artistic Works", which has been signed by 121 countries worldwide,
and other treaties apply.
Acknowledgement
The present document has been established by the WG CT of the DLMS UA.
2. Scope
This document specifies the methods and processes for conformance testing and certification
of metering equipment implementing the DLMS/COSEM specification for meter data
exchange.
This document only focuses on testing and certifying the implementation of the
DLMS/COSEM specification. Other functional and performance tests are outside the scope of
this document.
This Edition 3 cancels and replaces Edition 2, published in 2003, and its Amendment 1,
published in 2003.
3. Introduction
3.1 Referenced documents
[1] DLMS UA 1000-1:2007, COSEM Identification System and Interface Classes "Blue Book", Eighth Edition
[2] DLMS UA 1000-2:2007, Architecture and Protocols "Green Book", Sixth Edition
[3] DLMS UA 1001-1:2002, DLMS/COSEM Conformance Test Process "Yellow Book", Second Edition
[4] DLMS UA 1000-1 Ed. 1, Amd. 1:2003, Amendment 1
[5] DLMS UA 1001-3:2002, V2.0 Data link layer test plan, Edition 2
[6] DLMS UA 1001-3:2007, V4.1 DLMS/COSEM conformance testing – Conformance test plans – Data link layer using HDLC protocol
[7] DLMS UA 1001-4:2002, V2.0 COSEM Application layer test plan
[8] DLMS UA 1001-4:2007, V4.1 DLMS/COSEM conformance testing – Conformance test plans – COSEM Application layer
[9] DLMS UA 1001-5:2002, V2.0 Interface objects test plan
[10] DLMS UA 1001-5:2007, V4.2 DLMS/COSEM conformance testing – Conformance test plans – Interface objects
[11] DLMS UA 1001-6:2005, V2.3 COSEM conformance testing – List of standardised OBIS codes
[12] DLMS UA 1001-7:2007, V1.4 COSEM conformance testing – Object definition tables
[13] DLMS UA 1002:2003, Glossary of terms, First Edition
[14] ITU-T Recommendation X.290 (1995), OSI conformance testing methodology and framework for protocol Recommendations for ITU-T applications – General concepts
[15] ITU-T Recommendation X.291 (1995), OSI conformance testing methodology and framework for protocol Recommendations for ITU-T applications – Abstract test suite specification
[16] ITU-T Recommendation X.293 (1995), OSI conformance testing methodology and framework for protocol Recommendations for ITU-T applications – Test realization
NOTE 1 For the current versions of the DLMS UA documents, check the DLMS UA website at http://www.dlms.com/
NOTE 2 X.290 is technically aligned with ISO/IEC 9646-1.
NOTE 3 For other relevant standards see the Bibliography.
3.2 Terms and definitions
3.2.1
abnormal (test case) termination
The term used to describe the result of execution of an abstract test case when it has been
prematurely terminated by the test system. [X.290 3.3.1]
3.2.2
abstract test case
A complete and independent specification of the actions required to achieve a specific test
purpose, defined at the level of abstraction of a particular Abstract Test Method, starting in a
stable testing state and ending in a stable testing state. This specification may involve one or
more consecutive or concurrent connections.
NOTES
1 The specification should be complete in the sense that it is sufficient to enable a test verdict to be assigned
unambiguously to each potentially observable test outcome (i.e. sequence of test events).
2 The specification should be independent in the sense that it should be possible to execute the derived
executable test case in isolation from other such test cases (i.e. the specification should always include the
possibility of starting and finishing in the “idle” state).
[X.290 3.3.3]
3.2.3
abstract test case error
A test case error resulting from an error in the abstract test case. [X.290 3.3.4]
3.2.4
(abstract) test method (ATM)
The description of how an IUT is to be tested, given at an appropriate level of abstraction to
make the description independent of any particular realization of a Means of Testing, but with
enough detail to enable abstract test cases to be specified for this test method. [X.290 3.3.5]
3.2.5
abstract test suite (ATS)
A test suite composed of abstract test cases. [X.290 3.3.6]
3.2.6
abstract test suite (ATS) specification
A specification that contains a standardized ATS together with related information. [X.290
3.3.7]
3.2.7
base specification
A specification of a protocol, abstract syntax, encoding rules, or information object. [X.290
3.3.10]
3.2.8
basic interconnection test (BIT)
A test of an IUT, which has limited scope to determine whether or not there is sufficient
conformance to the relevant protocol(s) for interconnection to be possible, without trying to
perform thorough testing. [X.290 3.3.11]
3.2.9
behaviour test
A test to determine the extent to which one or more dynamic conformance requirements are
met by the IUT. [X.290 3.3.12]
3.2.10
capability (of an implementation)
A set of functions in the relevant protocol(s), which is supported by the implementation.
[X.290 3.3.13]
3.2.11
capability test
A test to verify the existence of one or more claimed capabilities of an IUT.
NOTE Capability testing involves checking all mandatory capabilities and those optional ones that are stated in
the ICS as supported, but not checking those optional ones which are stated in the ICS as not supported by the
IUT.
[X.290 3.3.14]
3.2.12
conformance assessment process
The complete process of accomplishing all conformance testing activities necessary to
assess the conformance of an implementation or a system to one or more OSI specifications.
[X.290 3.3.19]
3.2.13
conformance log
A human-readable record of information produced as a result of a test campaign, which is
sufficient to record the observed test outcomes and verify the assignment of test results
(including test verdicts). [X.290 3.3.20]
3.2.14
conformance test information (CTI)
A statement made by the supplier or implementer of an IUT integrating the PICS, the PIXIT,
the information object ICS and the information object ICT into a single document.
3.2.15
(conformance) test suite
A complete set of test cases possibly combined into nested test groups, that is needed to
perform dynamic conformance testing for one or more OSI protocols.
NOTE It should cover both capability testing and behaviour testing. It may be qualified by the adjectives:
abstract or executable, as appropriate. Unless stated otherwise, an “abstract test suite” is meant.
[X.290 3.3.22]
3.2.16
conformance testing
Testing the extent to which an IUT is a conforming implementation. [X.290 3.3.23]
3.2.17
conforming implementation
An IUT, which satisfies both static and dynamic conformance requirements, consistent with
the capabilities stated in the ICS(s).
NOTE In case of DLMS/COSEM, the capabilities are partly declared in the CTI and they are partly provided by
the IUT.
3.2.18
executable test case
A realization of an abstract test case. [X.290 3.3.31]
3.2.19
executable test case error
A test case error in the realization of an abstract test case. [X.290 3.3.32]
3.2.20
executable test suite (ETS)
A test suite composed of executable test cases. [X.290 3.3.33]
3.2.21
fail (verdict)
A test verdict given when the observed test outcome either demonstrates non-conformance
with respect to (at least one of) the conformance requirement(s) on which the test purpose of
the test case is focused, or contains at least one invalid test event, with respect to the
relevant specification(s). [X.290 3.3.34]
3.2.22
foreseen test outcome
An observed test outcome identified in the abstract test case. [X.290 3.3.35]
NOTE A foreseen test outcome may include an unidentified test event.
3.2.23
idle testing state
A stable testing state in which there is no established connection of the relevant protocol(s)
and in which the state of the IUT is independent of any previously executed test cases.
[X.290 3.3.38]
3.2.24
implementation conformance statement (ICS)
A statement made by the supplier of an implementation or system claimed to conform to a
given specification, stating which capabilities have been implemented. The ICS can take
several forms: protocol ICS, profile ICS, profile specific ICS, and information object ICS.
[X.290 3.3.39]
3.2.25
implementation extra information for testing (IXIT)
A statement made by a supplier or implementer of an IUT which contains or references all of
the information (in addition to that given in the ICS) related to the IUT and its testing
environment, which will enable the test laboratory to run an appropriate test suite against the
IUT. An IXIT can take several forms: protocol IXIT, profile IXIT, profile specific IXIT, and
information object IXIT. [X.290 3.3.41 modified]
3.2.26
implementation under test (IUT)
An implementation of one or more OSI protocols in an adjacent user/provider relationship,
being that part of a real open system which is to be studied by testing. [X.290 3.3.43]
3.2.27
inapplicable test
A test case, which cannot be performed because the necessary conditions are not available.
3.2.28
inconclusive (verdict)
A test verdict given when the observed test outcome is such that neither a pass nor a fail
verdict can be given. [X.290 3.3.44]
3.2.29
information object implementation conformance statement; information object ICS
An ICS for an implementation or system claimed to conform to a given information object
specification. [X.290 3.3.45]
3.2.30
information object implementation extra information for testing; information object IXIT
An IXIT for an implementation or system claimed to conform to a given information object
specification. [X.290 3.3.46]
3.2.31
initial testing state
The testing state in which a test body starts.
NOTE This may be either a stable testing state or a transient state.
[X.290 3.3.47]
3.2.32
inopportune test event
A test event which occurs when not allowed to do so by the relevant specification(s) to which
conformance is being tested. [X.290 3.3.48]
3.2.33
invalid test event
A test event that violates at least one conformance requirement of the relevant
specification(s) to which conformance is being tested. [X.290 3.3.49]
3.2.34
means of testing (MOT) (IUTs)
The combination of equipment and procedures that can perform the derivation, selection,
parameterization and execution of test cases, in conformance with a reference standardized
ATS, and can produce a conformance log. [X.290 3.3.54]
3.2.35
negative test
A test to verify the correct response of the IUT to:
• DLMS/COSEM conformant information and services that are not implemented;
• non-conformant communication traffic.
3.2.36
(observed) test outcome
The sequence of test events, together with associated data and/or parameter values, which
occurred during test execution of a specific parameterized executable test case. [X.290
3.3.58]
3.2.37
parameterized executable test case
An executable test case, in which all appropriate parameters have been supplied with values
in accordance with specific ICS(s) and IXIT(s), as appropriate, and corresponding to a
parameterized abstract test case. [X.290 3.3.61]
3.2.38
pass (verdict)
A test verdict given when the observed test outcome gives evidence of conformance to the
conformance requirement(s) on which the test purpose of the test case is focused, and when
no invalid test event has been detected. [X.290 3.3.63]
3.2.39
positive test
A test to ensure the correct implementation of the capabilities of the IUT as defined by the
supplier. A positive test has a described and defined response.
3.2.40
preliminary result
Information to be recorded in the conformance log and to be used in determining the test
verdict. [X.290 3.3.65]
3.2.41
protocol conformance test report (PCTR)
A document produced at the end of a conformance assessment process, giving the details of
the testing carried out using a particular ATS. It lists all of the abstract test cases and
identifies those for which corresponding executable test cases were run, together with the
verdicts assigned. [X.290 3.3.79]
3.2.42
protocol implementation conformance statement (PICS)
An ICS for an implementation or system claimed to conform to a given protocol specification.
[X.290 3.3.80]
3.2.43
protocol implementation extra information for testing (PIXIT)
An IXIT related to testing for conformance to a given protocol specification. [X.290 3.3.81]
3.2.44
reference (standardized) abstract test suite; reference (standardized) ATS
The standardized ATS for which a Means of Testing is realized. [X.290 3.3.84]
3.2.45
repeatability (of results)
Characteristic of a test case, such that repeated executions on the same IUT under the same
conditions lead to the same test verdict, and by extension a characteristic of a test suite.
[X.290 3.3.86]
3.2.46
semantically invalid test event
A test event which is neither inopportune nor syntactically invalid, but which contains a
semantic error with respect to the relevant protocol specification (e.g. a PDU containing a
parameter value outside the negotiated range for that parameter). [X.290 3.3.90]
3.2.47
stable testing state
A testing state which can be maintained, without prescribed Lower Tester behaviour,
sufficiently long to span the gap between one test case and the next in a test campaign.
[X.290 3.3.93]
3.2.48
standardized abstract test suite; standardized ATS
An ATS specified within an ITU-T or ISO/IEC published specification or, in the absence of
such a specification, within a publicly available specification which is in the process of being
standardized within ITU-T or ISO/IEC, and which has the highest standardization status
available, and which has the status of at least a Committee Draft or equivalent. [X.290 3.3.94]
3.2.49
static conformance review
A review of the extent to which the static conformance requirements are claimed to be
supported by the IUT. [X.290 3.3.96 modified]
3.2.50
syntactically invalid test event
A test event which is not allowed syntactically by the relevant specification(s) to which
conformance is claimed. [X.290 3.3.99]
3.2.51
test body
The sequences of test events that achieve the test purpose. [X.290 3.3.105]
3.2.52
test campaign
The process of executing the Parameterized Executable Test Suite for a particular IUT and
producing the conformance log. [X.290 3.3.106]
3.2.53
test case
An abstract or executable test case.
NOTE In general the use of the word “test” will imply its normal English meaning. Sometimes it may be used as
an abbreviation for abstract test case or executable test case. The context should make the meaning clear.
3.2.54
test case error
The term used to describe the result of execution of a test case when an error is detected in
the test case itself. [X.290 3.3.108]
3.2.55
test event
An indivisible unit of test specification at the level of abstraction of the specification (e.g.
sending or receiving a single PDU). [X.290 3.3.110]
3.2.56
test group
A named set of related test cases. [X.290 3.3.111]
3.2.57
test group objective
A prose description of the common objective which the test purposes within a specific test
group are designed to achieve. [X.290 3.3.112]
3.2.58
test laboratory
An organization that carries out conformance testing. This can be a third party, a user
organization, a telecommunications administration or recognized private operating agency, or
an identifiable part of a supplier organization. [X.290 3.3.113]
3.2.59
(test) postamble
The sequences of test events from the end of the test body up to the finishing stable testing
state(s) for the test case. [X.290 3.3.116]
3.2.60
(test) preamble
The sequences of test events from the starting stable testing state of the test case up to the
initial testing state from which the test body will start. [X.290 3.3.117]
3.2.61
test purpose
A prose description of a well defined objective of testing, focusing on a single conformance
requirement or a set of related conformance requirements as specified in the appropriate OSI
specification (e.g. verifying the support of a specific value of a specific parameter). [X.290
3.3.118]
3.2.62
test step (sub-test)
A named subdivision of a test case, constructed from test events and/or other test steps.
[X.290 3.3.122]
3.2.63
(test) verdict
A statement of “pass”, “fail” or “inconclusive”, as specified in an abstract test case,
concerning conformance of an IUT with respect to that test case when it is executed. [X.290
3.3.124]
3.2.64
unforeseen test outcome
An observed test outcome not specified in the abstract test case.
NOTE An unforeseen test outcome can only lead to a test case error or an abnormal test case termination.
[X.290 3.3.127]
3.2.65
valid test event
A test event which is allowed by the protocol specification, being both syntactically and
semantically correct, and occurring when allowed to do so by the protocol specification.
[X.290 3.3.130]
3.3 Abbreviations
Abbreviation Explanation
AA Application Association
AARE Application Association Response
AARQ Application Association ReQuest
ACSE Association Control Service Element
AL Application layer
ANSI American National Standards Institute
APDU Application Protocol Data Unit
ASE Application Service Element
ATS Abstract Test Suite
A-XDR Adapted Extended Data Representation
base_name The short_name corresponding to the first attribute (“logical_name”) of a
COSEM object
CHAP Challenge Handshake Authentication Protocol
Class_id Interface class identification code
COSEM Companion Specification for Energy Metering
COSEM object An instance of an interface class
CTI Conformance Test Information
4. Conformance testing - overview
4.1 OSI conformance testing
The objective of conformance testing is to establish whether the Implementation Under Test
(IUT) conforms to the relevant specification(s).
The primary purpose of conformance testing is to increase the probability that different
implementations are able to interwork. While conformance is a necessary condition, it is not
on its own a sufficient condition to guarantee interworking capability. Even if two
implementations conform to the same protocol specification, they may fail to interwork fully.
What conformance testing does do is give confidence that an implementation has the
required capabilities and that its behaviour conforms consistently in representative instances
of communication.
4.2 DLMS/COSEM conformance testing
The DLMS/COSEM specification is published in three books:
• the Blue Book, specifying the COSEM interface object model, see [1];
• the Green Book, specifying communication profiles, see [2]; and
• the Yellow book – this document – specifying the conformance testing process.
NOTE The contents of the Blue Book and the Green Book are internationally standardized, see the Bibliography.
[Figure: The DLMS/COSEM conformance testing scheme – the DLMS UA WG Maintenance derives the conformance test plans (Abstract Test Suites) from the DLMS/COSEM specification; the DLMS/COSEM CTT implements them as Executable Test Suites, which are parameterized with test parameters from the CTI and the IUT; the conformance assessment follows; defects found are corrected and testing is repeated, otherwise a Certificate is granted.]
The conformance test plans - Abstract Test Suites (ATS) - describe, at an abstract level, the
tests to be performed. See clause 5.
The Conformance Test Tool (CTT) implements the Abstract Test Suites in the form of
Executable Test Suites. See clause 6.
The conformance assessment process consists of the phases of preparation for testing, test
operations and conformance test report production. See clause 7.
The certification process consists of examining conformance test reports and publication of
Certificates. See clause 8.
The quality program includes handling comments and questions and initiating the
maintenance of the specification, the conformance test plans and/or the CTT as appropriate.
See clause 9.
4.3 Main features of DLMS/COSEM conformance testing
The main features of DLMS/COSEM conformance testing are the following:
• it covers servers implementing the COSEM interface object model and one or more DLMS based
communication profiles;
• it is limited to the server’s functionality as presented at the communication interface; other
functions of the server are out of the scope of conformance testing;
• the conformance test plans and the CTT are provided by the DLMS UA; they are available to all
members of the DLMS UA;
• the CTT can be used for self-testing and third-party testing, under the conditions published on the
homepage of the DLMS UA, at www.dlms.com;
• the certification process can be initiated by any member of the DLMS UA;
• to obtain a Certificate, the manufacturer of the IUT shall possess a registered three-letter
manufacturer ID; see http://dlms.com/flag/index.htm;
• the CTT automatically generates the documents necessary for the Certification;
• the Certificate is issued by the DLMS UA;
• the DLMS UA operates a Quality program to maintain the test plans and the CTT.
5. The conformance test plans
5.1 Scope of testing
[Figure: DLMS/COSEM communication profiles – the 3-layer, connection-oriented, HDLC based communication profile and the TCP-UDP/IP based communication profile, both carrying COSEM Application layer messaging (ACSE and xDLMS) over their respective transporting layers.]
The COSEM Interface object model, specified in [1], and the COSEM Application layer,
specified in Clause 9 of [2], are used in all implementations. The supporting lower layers
depend on the communication profile:
• in the 3-layer, connection-oriented, HDLC based communication profile, the COSEM Application
layer is supported by the data link layer using HDLC protocol, specified in Clause 8 of [2] and the
physical layer specified in clause 5 of [2];
• in the TCP-UDP/IP based communication profile the COSEM Application layer is supported by the
COSEM transport layer specified in clause 7 of [2], and this is supported by a set of lower layers
appropriate for the communication media.
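For illustration only, the two profile stacks can be written down as a small data structure. The following Python sketch uses identifiers of our own choosing; the layer names paraphrase Clauses 5 to 9 of the Green Book [2] and are not part of any DLMS UA artefact:

# Illustrative sketch only; identifiers are hypothetical, not normative.
COMMUNICATION_PROFILES = {
    "3-layer, connection-oriented, HDLC based": [
        "COSEM Application layer",              # Clause 9 of [2]
        "Data link layer using HDLC protocol",  # Clause 8 of [2]
        "Physical layer",                       # Clause 5 of [2]
    ],
    "TCP-UDP/IP based": [
        "COSEM Application layer",              # Clause 9 of [2]
        "COSEM transport layer",                # Clause 7 of [2]
        "Lower layers appropriate for the communication media",
    ],
}

for profile, layers in COMMUNICATION_PROFILES.items():
    print(profile + ":", " / ".join(layers))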
The conformance test plans cover:
• the Data link layer using HDLC protocol;
• the COSEM Application layer;
• the COSEM interface objects.
5.2 Device testing
For the purposes of testing, the IUT is considered as a black box. The test consists of
sending messages to the IUT and observing the responses.
As access to layer boundaries is not available, the interface object model and the protocol
stack are tested in combination. Therefore, the following assumptions are made:
• for testing the data link layer using HDLC protocol, it is assumed that the physical layer works
correctly;
• for testing the COSEM Application layer, it is assumed that the supporting layers work correctly;
• for testing the COSEM Interface object model, it is assumed that the protocol stack works
correctly.
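Since the IUT is treated as a black box, every executable test reduces to sending messages and observing the responses. A minimal sketch in Python follows; the transport object and its methods are hypothetical wrappers around the physical connection to the IUT, not the CTT's API:

# Black-box exchange sketch; nothing here is CTT API.
def exchange(transport, request: bytes, timeout_s: float = 5.0) -> bytes:
    """Send one message to the IUT and return the observed response."""
    transport.send(request)
    return transport.receive(timeout=timeout_s)

def check(transport, request: bytes, predicate) -> bool:
    """Send a request and evaluate the response against a foreseen outcome."""
    return predicate(exchange(transport, request))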
5.3 Structure of the conformance test plans
Test suites have a hierarchical structure (see Figure 3), in which an important level is the test
case.
Each test case has a specified test purpose, such as that of verifying that the IUT has a
certain required capability (e.g. the ability to support certain packet sizes) or exhibits a certain
required behaviour (e.g. behaves as required when a particular event occurs in a particular
state).
Within a test suite, nested test groups are used to provide a logical ordering of the test
cases.
Test events are indivisible units of specification within a test step (e.g. the transfer of a single
PDU to or from the IUT).
[Figure 3 – Hierarchical structure of test suites: a test suite contains test groups, which contain test cases built from test steps and test events.]
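The hierarchy of Figure 3 can be mirrored by nested record types. The following sketch uses hypothetical Python classes; the test plans themselves define no such types:

from dataclasses import dataclass, field
from typing import List

@dataclass
class TestEvent:
    description: str  # indivisible unit, e.g. "send a single PDU to the IUT"

@dataclass
class TestStep:
    name: str
    events: List[TestEvent] = field(default_factory=list)

@dataclass
class TestCase:
    name: str
    purpose: str
    steps: List[TestStep] = field(default_factory=list)

@dataclass
class TestGroup:
    name: str  # e.g. "HDLC_FRAME"
    cases: List[TestCase] = field(default_factory=list)
    subgroups: List["TestGroup"] = field(default_factory=list)  # groups may nest

@dataclass
class TestSuite:
    name: str  # e.g. "Data link layer using HDLC protocol"
    groups: List[TestGroup] = field(default_factory=list)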
Test suites include test cases falling in the following categories (the list is not exhaustive):
• capability tests;
• behaviour tests of valid behaviour (positive tests);
• behaviour tests of syntactically invalid or inopportune behaviour (negative tests);
• tests focusing on PDUs sent to and received from the IUT;
• tests related to each protocol phase;
• timing;
• PDU encoding variations;
• variations in values of individual parameters and/or combination of parameters.
5.4 Abstract test cases
Each abstract test case:
• has a Test case name, used as a reference and relating the test case to the test group and the
test suite;
• gives the References pointing to the relevant clauses of the Blue Book [1] and/or the Green Book
[2], constituting the base specification, the test case is related to and derived from;
• specifies the Test purpose;
• specifies the expected behaviour of the IUT; this comprises the Expected result;
• specifies, if the initial testing state required by the test body is not the desired starting stable state
of the test case, the sequence of events to put the IUT to the initial testing state for the test body;
this test sequence comprises the Preamble;
• specifies the sequences of foreseen test events necessary in order to achieve the test purpose.
These sequences comprise the Test body. It may consist of one or more subtests;
• specifies, if the test body can end without the IUT being returned to the desired stable testing
state, the sequence of events to return the IUT to the desired stable testing state; this test
sequence comprises the Postamble;
• specifies the verdict to be assigned to each foreseen test outcome.
The abstract test cases are formatted using the template shown in Table 1.
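As an illustration, the template fields listed above map onto a simple record. The field names below paraphrase the bullet list and are hypothetical; Table 1 itself is not reproduced here:

from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class AbstractTestCase:
    name: str              # Test case name
    references: List[str]  # clauses of the Blue Book [1] and/or Green Book [2]
    purpose: str           # Test purpose
    expected_result: str   # Expected result
    preamble: List[str] = field(default_factory=list)   # events reaching the initial testing state
    body: List[str] = field(default_factory=list)       # foreseen test events (one or more subtests)
    postamble: List[str] = field(default_factory=list)  # events returning to the stable testing state
    verdicts: Dict[str, str] = field(default_factory=dict)  # foreseen outcome -> verdict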
5.5 Test outcomes and verdicts
A foreseen test outcome is one which has been defined by the abstract test case, i.e. the
events which occurred during execution of the test case matched a sequence of test events
defined in the abstract test case. A foreseen test outcome always results in the assignment of
one of the following test verdicts to the test case:
• PASSED – Means that the observed test outcome gives evidence of conformance to the
conformance requirement(s) on which the test purpose of the test case is focused, and is valid
with respect to the relevant specification(s);
• FAILED – Means that the observed test outcome either demonstrates non-conformance with
respect to (at least one of) the conformance requirement(s) on which the test purpose of the test
case is focused, or contains at least one invalid test event, with respect to the relevant
specification(s);
• INCONCLUSIVE – Means that the observed test outcome is such that neither a pass nor a fail
verdict can be given.
An unforeseen test outcome is one which has not been identified by the abstract test case,
i.e. the events which occurred during execution of the test case did not match any sequence
of test events defined in the abstract test case. An unforeseen test outcome always results in
the recording of a test case error or an abnormal test case termination for the test case.
A test case error is recorded if an error is detected either in the abstract test case itself (i.e.
an abstract test case error) or in its realization (i.e. an executable test case error).
An abnormal test case termination is recorded if the execution of the test case is prematurely
terminated by the test system for reasons other than test case error.
The results of executing the relevant individual test cases will be recorded in the
conformance test report.
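The rules of this subclause can be condensed into a few lines of logic. The sketch below is a simplification with hypothetical names, not the CTT's implementation; the mapping from foreseen outcomes to verdicts is taken from the abstract test case:

# Sketch of the outcome-to-result rules of clause 5.5 (simplified).
def assign_result(observed: str, foreseen: dict, aborted_by_test_system: bool = False) -> str:
    if observed in foreseen:
        # A foreseen outcome always yields a verdict.
        return foreseen[observed]  # "PASSED", "FAILED" or "INCONCLUSIVE"
    if aborted_by_test_system:
        # Premature termination for reasons other than a test case error.
        return "ABNORMAL TEST CASE TERMINATION"
    # An unforeseen outcome otherwise indicates an error in the test case
    # itself or in its realization.
    return "TEST CASE ERROR"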
5.6 Conformance test plan for the Data link layer using HDLC protocol
[Figure 4 – Structure of the conformance test plan for the Data link layer using HDLC protocol: a test suite with the test groups HDLC_FRAME, HDLC_ADDRESS, HDLC_NDM2NRM, HDLC_INFO and HDLC_NDMOP, each containing test cases.]
5.7 Conformance test plan for the COSEM application layer
[Figure 5 – Structure of the conformance test plan for the COSEM Application layer: a test suite with the test groups APPL_IDLE, APPL_OPEN, APPL_DATA_LN, APPL_DATA_SN and APPL_REL, each containing test cases.]
5.8 Conformance test plan for COSEM interface objects
5.8.1 Version 2.0
This conformance test plan is specified in [9]. It is implemented in the Conformance Test Tool
V1.x.
[Figure 6 – Structure of the conformance test plan for the COSEM interface objects: a test suite with test groups of the form COSEM_X_Y (with multiple references) and a group covering the mandatory objects.]
6. The DLMS/COSEM conformance test tool
6.1 Introduction
The DLMS/COSEM conformance test tool (CTT) is an implementation of the abstract test
suites (ATS) in the form of executable test suites (ETS). It performs, in particular, the
selection, parameterization and execution of the test cases and produces the test
documentation (see clause 7).
7. The conformance assessment process
7.1 Overview
The conformance assessment process is the complete process of accomplishing all
conformance testing activities necessary to enable the conformance of the IUT to be
assessed.
[Figure: The conformance assessment process – from preparation for testing through test operations, which comprise test campaigns, one for each communication profile and application context.]
In the following, the elements of the conformance test process are described and the use of
the CTT is explained.
7.2 Preparation for testing
7.2.1 Preparation of the IUT
When preparing the IUT for testing, the following requirements apply (an illustrative pre-check
sketch follows this list):
• if the IUT supports more than one logical device, then at least two logical devices should be
configured;
• if it is claimed that the IUT supports more than one authentication context, then at least one AA
should be present for each authentication context. These can be in the same logical device or
spread across the logical devices;
• if the purpose of the conformance assessment process is to obtain a Certificate, then the
mandatory Management Logical Device shall be present and shall contain a Public AA;
• instances of each standard interface class supported shall be present. The set of interface objects
available should be representative for the intended application;
• the AAs shall provide access to the objects and attributes to be tested, with appropriate access
rights and authentication;
• it is the responsibility of the manufacturer to restrict access rights to attributes that must not be
modified during the test; this can be done by providing extra information in the CTI;
• if load profiles with selective access are to be tested, then a sufficient amount of data should be
present; the conditions are specified in [10];
• the set of xDLMS services should be representative for the intended application.
See also Clause 8.5, Scope and validity of the certification.
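Some of these preparation rules lend themselves to a simple automated pre-check. The sketch below operates on a hypothetical description of the IUT configuration; this is not a CTT data format:

from typing import Dict, List

# Illustrative pre-test checklist derived from the requirements above.
def check_iut_configuration(iut: Dict, for_certificate: bool) -> List[str]:
    """Return a list of preparation problems found (empty list = ready)."""
    problems = []
    if iut.get("logical_devices_supported", 1) > 1 and len(iut.get("logical_devices", [])) < 2:
        problems.append("configure at least two logical devices")
    contexts = iut.get("authentication_contexts", [])
    if len(contexts) > 1 and any(c not in iut.get("aa_per_context", {}) for c in contexts):
        problems.append("provide at least one AA for each authentication context")
    if for_certificate and "Management" not in iut.get("logical_devices", []):
        problems.append("the mandatory Management Logical Device with a Public AA is missing")
    return problems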
7.2.2 Production of the CTI
It is also necessary to have information relating to the IUT and its testing environment, such
as addresses, timeouts, baud rates and passwords. This information is known as
Implementation Extra Information for testing (IXIT).
The Conformance Test Information (CTI) file, in addition to the identification of the
manufacturer and the IUT, includes the ICS and the IXIT. It can be prepared using the
template provided by the CTT. The template can be loaded from the View menu. A Help
explaining the contents and the syntax of the CTI is provided. The CTI editor has a built-in
syntax checker.
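Conceptually, then, the CTI bundles identification, ICS and IXIT into a single document (see 3.2.14). The sketch below shows one possible shape with hypothetical field names; it is not the CTT's CTI file syntax:

from dataclasses import dataclass
from typing import Dict

@dataclass
class ConformanceTestInformation:
    manufacturer: str        # as declared by the supplier
    iut_identification: str
    ics: Dict[str, bool]     # capabilities claimed (PICS and information object ICS)
    ixit: Dict[str, str]     # e.g. addresses, timeouts, baud rates, passwords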
7.3 Test operations
7.3.2 Test selection and parameterization
Based on the information taken from the IUT and the CTI, the CTT executes only the
applicable tests, and the test cases are automatically parameterized.
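This selection and parameterization rule can be paraphrased as a filter over the test suite. The sketch below uses hypothetical data shapes and helper names, not CTT internals:

from typing import Dict, Iterable, Set

# Only test cases whose required capabilities are claimed in the CTI (or
# read from the IUT) are run, each parameterized from the IXIT.
def is_applicable(case: Dict, claimed: Set[str]) -> bool:
    return set(case["requires"]).issubset(claimed)

def parameterize(case: Dict, ixit: Dict) -> Dict:
    return {**case, "params": {name: ixit[name] for name in case["needs"]}}

def select_and_parameterize(suite: Iterable[Dict], cti: Dict, iut_capabilities: Set[str]):
    claimed = set(cti["ics"]) | iut_capabilities
    for case in suite:
        if is_applicable(case, claimed):
            yield parameterize(case, cti["ixit"])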
7.3.3 Test campaigns
If the IUT supports more than one communication interface, then a test campaign may be run
on each interface.
If the IUT supports more than one application context, then a test campaign is needed for
each application context for which compliance is claimed.
From the View menu, the line traffic shall be selected, and then the test campaign shall be
started from the Run menu. If necessary, the test can be stopped using Abort.
The progress of the test can be followed on the Conformance test report, Conformance log
and Line traffic windows.
It may happen that, during the test campaign, an exception occurs, terminating the test.
The possible causes and measures are explained in the Help menu.
7.4 Documents generated by the CTT
7.4.1 Conformance test report
The conformance test report includes:
• date of testing;
• identification of the test tool and license owner;
• identification of the manufacturer as declared in the CTI file;
• identification of the IUT as declared in the CTI file;
• a summary of results for each test suite;
• the result of each test case;
• a copy of the CTI file;
• a digital signature, allowing the authenticity of the conformance test report to be checked.
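Taken together, these items form a signed record. The sketch below shows one possible shape with hypothetical field names; the actual report format and signature scheme belong to the CTT and are not specified here:

from dataclasses import dataclass
from typing import Dict

@dataclass
class ConformanceTestReport:
    date_of_testing: str
    test_tool_and_license_owner: str
    manufacturer: str                # as declared in the CTI file
    iut_identification: str          # as declared in the CTI file
    suite_summaries: Dict[str, str]  # summary of results per test suite
    case_results: Dict[str, str]     # result of each test case
    cti_file_copy: bytes             # a copy of the CTI file
    signature: bytes                 # allows authenticity to be checked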
7.5 Repeatability of results
At the test case level, every effort has been made to minimize the possibility
that a test case produces different test outcomes on different occasions.
7.6 Requirements for test laboratories
If the test is done by the vendor (which is the preferred method), the test laboratory shall be an
identifiable part of the vendor’s organisation.
8. The certification process
8.1 General
The purpose of the certification process is to obtain a “DLMS/COSEM compliant” Certificate.
This clause describes the necessary steps.
The manufacturer of the device to be certified shall also be a member of the DLMS UA and
shall possess a three-letter manufacturer ID.
The DLMS UA verifies the contents and the authenticity of the conformance test report(s).
The documents submitted are used as supporting documentation for the Certificate.
The DLMS UA maintains the right to discuss the contents of the conformance test report with
the organization having initiated the certification process.
Information on the configuration and functions tested is available in the test report.
The Certificate entitles the manufacturer to place the “DLMS/COSEM compliant mark” on its
products and documentation.
The supporting evidence is the conformance test report with the conformance log and line
traffic.
8.5 Scope and validity of the certification
The DLMS UA does not check whether the meters manufactured are identical to the meter tested.
The Certification remains valid as long as the manufacturer guarantees that no design or
manufacturing changes with an essential influence on the implementation have been made to
the communication hardware and software of the certified device. If changes have been
made, a re-test is necessary.
The DLMS UA takes no position concerning the validity of the guarantee mentioned above.
8.6 Disclaimer
The DLMS UA makes every effort to ensure that the conformance test plans and the
conformance test tool are in line with the DLMS/COSEM specification and provide a
reasonable depth of testing.
However, the Certificate does not constitute an absolute proof of conformance.
9. The quality program
9.1 General
An important element of the DLMS/COSEM conformance testing process is the quality
program. Quality is ensured in several ways:
1. the test plans have been written by experts from different manufacturers – members of
the DLMS UA WG Maintenance – based on the Blue Book [1] and the Green Book [2];
2. all test scripts implementing the abstract test cases have been validated by running them
against several implementations.
9.4 Maintenance
The DLMS UA maintains the conformance testing process to eliminate problems with the tool found
during testing, to enhance tests and to accommodate changes in the standards.
The process is as follows:
1. a proposal, together with a justification, is made to add or modify a test; this can be
initiated by any member of the DLMS UA or by the DLMS UA itself;
2. the request is investigated by the DLMS UA;
3. if the request is accepted, the conformance test plans are amended by the DLMS UA;
4. the new test cases are implemented by the tool developer;
5. the new test cases are validated by the DLMS UA;
6. the amended conformance test plans are published;
7. a new version of the CTT is made available to the licensed tool users.
The proposal is submitted to the DLMS UA. The DLMS UA checks if the proposal is in line
with the standards. If approved, the COSEM Object definition tables [12] are amended and a
new .dat file is made available for download.
The vendor submits the proposed test plan to the DLMS UA and the process described above
is followed.
The conformance requirements and the test plans are prepared together with the standard,
or at the latest upon acceptance of the new standard.
Annex A
Certificate template