Avionics Software Engineering : LN 5
DO 178 B/C : A Brief Note
Prof M S Prasad
This brief note on DO 178 B/C standards is for Grad/Post Grad students of AVIONICS. This LN has been prepared based on available open literature for instructional purposes only.
Level   Failure condition   Objectives associated
A       Catastrophic        66
B       Hazardous           65
C       Major               57
D       Minor               28
E       No effect           None
Fig: DO-178B software life cycle processes and the associated life cycle data. The processes shown are Planning, SW Requirements, Design, Coding, Integration, Configuration Management, Verification, Quality Assurance and Certification Liaison. The objective noted for Verification is to detect and report errors/bugs that may have been introduced during the development processes. The life cycle data named in the figure include the SW Quality Assurance Plan; the SW Requirements, Design and Code Standards (SRS, SDS, SCS); SW Requirements Data (SRD); Source Code; Executable Object Code (EOC); SCM Records; Problem Reports; Review Records; SW Verification Cases & Procedures (SVCP); SW Verification Results (SVR); Structural Coverage Analysis (SCA); SW Configuration Index (SCI); SW Accomplishment Summary (SAS); SQA Records; and the SW Conformity Review.
Four high-level activities are identified in the DO-178B Software Development Processes section: the software requirements process, the software design process, the software coding process and the integration process. In addition, there is a section on requirements traceability (Section 5.5) that embodies the evolution and traceability of requirements from system-level requirements to source code.
As part of Section 5.3, the software development process, DO-178B specifies
that software must meet certain software coding process requirements. These
include adherence to a set of software coding standards and traceability from
low level design requirements to the source code and object code.
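As an illustration of the traceability expected from low-level requirements down to source code, a project might tag each function with the identifiers of the requirements it implements. DO-178B does not prescribe any particular notation; the file name, requirement IDs and limits below are invented for this sketch.

/* File: wheel_brake.c
 * Traces to low-level requirements (hypothetical IDs):
 *   LLR-BRK-012: the brake command shall be limited to the range 0..100 %.
 *   LLR-BRK-013: an out-of-range command shall be reported as a fault.
 */
#include <stdint.h>
#include <stdbool.h>

#define BRAKE_CMD_MAX 100U            /* percent, per LLR-BRK-012 */

/* Implements LLR-BRK-012 and LLR-BRK-013. */
uint8_t limit_brake_command(uint8_t requested, bool *fault)
{
    if (requested > BRAKE_CMD_MAX) {  /* LLR-BRK-013 */
        *fault = true;
        return BRAKE_CMD_MAX;         /* LLR-BRK-012 */
    }
    *fault = false;
    return requested;
}

Tags of this kind are what a traceability tool (or a manual trace matrix) links back to the low-level requirements and forward to the associated test cases.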
Further definition of the software coding standards is provided in Section 11.8 of DO-178B:
(a) Programming language(s) to be used and/or defined subset(s). For a programming language, reference the data that unambiguously defines the syntax, the control behaviour, the data behaviour and side-effects of the language. This may require limiting the use of some features of a language.
(b) Source code presentation standards, for example, line length restriction, indentation, and blank line usage.
(c) Source code documentation standards, for example, name of author, revision history, inputs and outputs, and affected global data.
(d) Naming conventions for components, subprograms, variables and constants.
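A minimal sketch of what conformance to such coding standards might look like in C. The header fields, prefixes and the naming convention shown are assumptions chosen for illustration, not requirements of DO-178B itself.

/*-----------------------------------------------------------------
 * Module   : fuel_qty.c
 * Author   : <name>
 * Revision : 1.2 (full revision history kept in the SCM system)
 * Inputs   : raw sensor count from the fuel quantity probe
 * Outputs  : fuel quantity in kilograms
 * Globals  : g_fuel_qty_kg (written); no other global data affected
 *----------------------------------------------------------------*/
#include <stdint.h>

#define FUEL_SCALE_KG_PER_COUNT  0.5f   /* named constant, upper case   */

static float g_fuel_qty_kg = 0.0f;      /* module-level data, g_ prefix */

/* Subprogram names are lower case with a module prefix. */
float fuel_qty_from_counts(uint16_t raw_counts)
{
    g_fuel_qty_kg = (float)raw_counts * FUEL_SCALE_KG_PER_COUNT;
    return g_fuel_qty_kg;
}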
Structural coverage requirements across DO-178 versions (NR = not required):

No.  Description                                                          DO-178   DO-178A   DO-178B/C reference
1    Test coverage of software structure (MC/DC is achieved)              NR       NR        6.4.4.2
2    Test coverage of software structure (decision coverage is satisfied) NR       NR        6.4.4.2.a and 6.4.4.2.b
3    Test coverage of software structure (statement coverage is satisfied) NR      NR        6.4.4.2.a and 6.4.4.2.b
4    Data coupling and control coupling is satisfied                      NR       NR        6.4.4.2.c

Note: items 1-4 are manual procedures.
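The difference between the coverage criteria in the table can be seen on a single compound condition. The function below is only an illustration; the test-case counts follow from the usual definitions of statement, decision and modified condition/decision coverage (MC/DC).

#include <stdbool.h>

/* Deploy the speed brake only on the ground with at least one wheel spinning up. */
bool speed_brake_ok(bool weight_on_wheels, bool left_spin, bool right_spin)
{
    if (weight_on_wheels && (left_spin || right_spin)) {
        return true;
    }
    return false;
}

/* Statement coverage : 2 tests (one reaching each return statement).
 * Decision coverage  : 2 tests (whole condition once true, once false).
 * MC/DC              : at least 4 tests, showing each condition independently
 *                      affects the outcome, e.g. with (weight, left, right):
 *                        (T,T,F) true  vs (F,T,F) false  -> weight_on_wheels
 *                        (T,T,F) true  vs (T,F,F) false  -> left_spin
 *                        (T,F,T) true  vs (T,F,F) false  -> right_spin
 */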
Control Coupling
Defined by DO-178B as "the manner or degree by which one software component influences the execution of another software component."
Data Coupling
Defined by DO-178B as "the dependence of a software component on data not exclusively under the control of that software component."
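A small, hypothetical illustration of the two kinds of coupling in C: the enable flag owned by a monitor component influences whether the controller executes (control coupling), and the controller also depends on sensor data it does not own (data coupling). All names are invented for the example.

#include <stdbool.h>

extern float air_speed_kts;        /* owned by the sensor component: data coupling     */
extern bool  controller_enabled;   /* owned by the monitor component: control coupling */

void controller_step(void)
{
    if (!controller_enabled) {     /* execution influenced by another component */
        return;
    }
    if (air_speed_kts > 250.0f) {  /* behaviour depends on data it does not control */
        /* ... reduce the commanded pitch rate ... */
    }
}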
Software Configuration Management
Verification of the various outputs discussed in DO-178B is only credible when there is
clear definition of what has been verified. This definition or configuration is the intent of
the DO-178B objectives for configuration management. The six objectives in this area
are unique, in that they must be met for all software levels. This includes identification of
what is to be configured, how baselines and traceability are established, how problem
reports are dealt with, how the software is archived and loaded, and how the
development environment is controlled.
While configuration management is a fairly well-understood concept within the software engineering community (as well as the aviation industry as a whole), DO-178B does introduce some unique terminology that has proven to be problematic. The concept of control categories is often misunderstood in ways that increase overall development costs, sometimes dramatically. DO-178B defines two control categories
(CC1 and CC2) for data items produced throughout the development.
The authors of DO-178B intended the two levels as a way of controlling the overhead
costs of creating and maintaining the various data items. Items controlled as CC2 have fewer requirements to meet in the areas of problem reporting, baselining, change control,
and storage. The easiest way to understand this is to provide an example. Problem
reports are treated as a CC2 item. If problem reports were a CC1 item and a problem
was found with one of the entries on the problem report itself, a second problem report
would need to be written to correct the first one.
A second nuance of control categories is that the user of DO-178B may define what
CC1 and CC2 are within their own CM system as long as the DO-178B objectives are
met. One example of how this might be beneficial is in defining different retention
periods for the two levels of data. Given the long life of airborne systems, these costs
can be quite sizeable. Another consideration for archival systems selected for data
retention is technology obsolescence of the archival medium as well as means of
retrieval.
Software Quality Assurance
Software quality assurance (SQA) objectives provide oversight of the entire DO-178B
process and require independence at all levels. It is recognized that it is prudent to have
an independent assessment of quality.
SQA is active from the beginning of the development process. SQA assures that any
deviations during the development process from plans and standards are detected,
recorded, evaluated, tracked, and resolved. For levels A and B, SQA is required to
assure transition criteria are adhered to throughout the development process.
SQA works with the CM process to assure that proper controls are in place and applied
to life cycle data. This last task culminates in the conduct of a software conformity
review. SQA is responsible for assuring that the as-delivered product matches the as-built and as-verified product. The common term used for this conformity review in the commercial aviation industry is First Article Inspection.
The Software Testing Process in general is shown below.
Previously Developed Software (PDS)
When previously developed software (PDS) is proposed for reuse, its existing life cycle data must be assessed against the DO-178B objectives and a way found to close any gaps. Alternate sources of development data, service history, additional testing, reverse engineering, and wrappers* are all ways of ensuring the use of PDS is safe in the new application.
In all cases, usage of PDS must be considered in the safety assessment process and
may require that the process be repeated if the decision to use a PDS component
occurs after the approval of the PSAC (Plan for Software Aspects of Certification). A special instance of PDS usage occurs when
software is used in a system to be installed on an aircraft other than the one for which it
was originally designed. Although the function may be the same, interfaces with other
aircraft systems may behave differently. As before, the system safety assessment
process must be repeated to assure that the new installation operates and behaves as
intended.
If service history is employed in making the argument that a PDS component is safe for
use, the relevance and sufficiency of the service history must be assessed. Two tests
must be satisfied for the service history approach to work. First, the application for
which history exists must be shown to be similar to the intended new use of the PDS.
Second, there should be data, typically problem reports, showing how the software has
performed over the period for which credit is sought. The authors of DO-178B intended
that any use of PDS be shown to meet the same objectives required of newly developed
code.
Prior to identifying PDS as part of a new system, it is prudent to investigate and truly
understand the costs of proving that the PDS satisfies the DO-178B objectives.
Sometimes, it is easier and cheaper to develop the code again!
*A wrapper is a generic term used to refer to a hardware or software component that isolates and filters inputs to and from the PDS for the purpose of protecting the system from erroneous PDS behavior.
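A rough sketch of the wrapper idea in C: a thin layer that range-checks the inputs to, and clamps the outputs from, a PDS routine so that erroneous PDS behaviour cannot propagate into the rest of the system. The PDS function name and the numeric limits are assumptions made for this example.

#include <stdbool.h>

extern float pds_compute_trim(float airspeed_kts);   /* PDS, treated as a black box */

#define AIRSPEED_MIN_KTS   40.0f
#define AIRSPEED_MAX_KTS  400.0f
#define TRIM_LIMIT_DEG      5.0f

/* Wrapper: filters inputs to and outputs from the PDS component. */
bool trim_wrapper(float airspeed_kts, float *trim_deg)
{
    if (airspeed_kts < AIRSPEED_MIN_KTS || airspeed_kts > AIRSPEED_MAX_KTS) {
        return false;                                    /* reject out-of-range input */
    }
    float trim = pds_compute_trim(airspeed_kts);
    if (trim >  TRIM_LIMIT_DEG) trim =  TRIM_LIMIT_DEG;
    if (trim < -TRIM_LIMIT_DEG) trim = -TRIM_LIMIT_DEG;  /* clamp the PDS output */
    *trim_deg = trim;
    return true;
}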
DO 178 Documents
Fig: DO-178B life cycle documents (e.g., Problem Reports).
Tool Selection
The use of traceability and analysis tools for an avionics project that must meet the DO-178B certification requirements offers significant productivity and cost benefits. Tools
make compliance checking easier, less error prone and more cost effective. In addition,
they make the creation, management, maintenance and documentation of requirements
traceability straightforward and cost effective. When selecting a tool to assist in
achieving DO-178B acceptance the following criteria should be considered:
Does the tool provide a complete end-to-end Requirements Traceability capability to
enable linkage and documentation from all levels to the source code and associated
test cases?
Does the tool enable analysis for all Structural Coverage Analysis requirements as laid
out in section 6.4.4.2 of the standard?
Can the tool perform MC/DC analysis in assignment statements as well as conditional statements (see the short example after this list)?
Is there tool availability for all the languages required in your project?
Has the tool been utilised in this manner successfully already?
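The assignment-statement question above matters because a compound Boolean expression needs MC/DC analysis wherever it appears, not only inside an if statement. A tiny illustrative case in C (names invented):

#include <stdbool.h>

extern bool door_locked, engines_running, on_ground;
extern void raise_warning(void);

void check_door(void)
{
    /* Decision in a conditional statement: clearly subject to MC/DC. */
    if (engines_running && !on_ground) {
        raise_warning();
    }
}

bool door_release_allowed(void)
{
    /* The same kind of decision in an assignment: a tool claiming full
     * MC/DC support should analyse this expression as well.           */
    bool allowed = !engines_running && (on_ground || !door_locked);
    return allowed;
}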
Appendix A
Guidelines for Software Development
(a) A detailed description of how the software satisfies the specified software high-level
requirements, including algorithms, data-structures and how software requirements are
allocated to processors and tasks.
(b) The description of the software architecture defining the software structure to
implement the requirements.
(c) The input/output description, for example, a data dictionary, both internally and
externally throughout the software architecture.
(d) The data flow and control flow of the design.
(e) Resource limitations, the strategy for managing each resource and its limitations, the
margins and the method for measuring those margins, for example timing and memory.
(f) Scheduling procedures and interprocessor/intertask communication mechanisms, including time-rigid sequencing, pre-emptive scheduling, and interrupts.
(g) Design methods and details for their implementation, for example, software data
loading, user modifiable software, or multiple-version dissimilar software.
(h) Partitioning methods and means of preventing partitioning breaches.
(i) Descriptions of the software components, whether they are new or previously
developed, with reference to the baseline from which they were taken.
(j) Derived requirements from the software design process.
(k) If the system contains deactivated code, a description of the means to ensure that the code cannot be enabled in the target computer (a small illustration follows this list).
(l) Rationale for those design decisions that are traceable to safety-related system
requirements.
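For item (k), a hypothetical sketch of deactivated code and one means of ensuring it cannot be enabled in the target computer: the cargo-variant path exists in the executable object code but is gated by an aircraft-configuration strap that is wired to the passenger value in this installation. The names and the strapping scheme are invented for illustration.

#include <stdbool.h>

typedef enum { VARIANT_PASSENGER = 0, VARIANT_CARGO = 1 } variant_t;

extern variant_t read_variant_strap(void);   /* hardware configuration pins      */
extern void cargo_door_logic(void);          /* deactivated code in this variant */

void variant_dispatch(void)
{
    /* In the passenger installation the strap is hard-wired to
     * VARIANT_PASSENGER, so cargo_door_logic() can never be invoked. */
    if (read_variant_strap() == VARIANT_CARGO) {
        cargo_door_logic();
    }
}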
----------------------------------------------------------------------------------------------------------------------