SD/STD/012
SVO/810616

SOFTWARE VERIFICATION STANDARD

GENERAL
T̲A̲B̲L̲E̲ ̲O̲F̲ ̲C̲O̲N̲T̲E̲N̲T̲S̲
1 PURPOSE
2 VERIFICATION PRINCIPLES
  2.1 DEFINITION
  2.2 VERIFICATION PHASES
  2.3 VERIFICATION METHODS
3 VERIFICATION DOCUMENTATION
4 TEST BED
5 DETAILED VERIFICATION REQUIREMENTS
  5.1 UNIT TEST
    5.1.1 Capability Testing
    5.1.2 Structural Testing
      5.1.2.1 Control Path Testing
      5.1.2.2 Data Access Testing
      5.1.2.3 Error Processing
  5.2 INTEGRATION TESTING
    5.2.1 SW/SW Integration
    5.2.2 SW/HW Integration
    5.2.3 Environmental Simulation
  5.3 QUALIFICATION TEST
6 PROJECT DEPENDENCIES
1 P̲U̲R̲P̲O̲S̲E̲
The purpose of this standard is to define the principles
to be applied during verification of software products,
and to define the minimum extent to which such verification
shall be carried out.
2̲ ̲ ̲V̲E̲R̲I̲F̲I̲C̲A̲T̲I̲O̲N̲ ̲P̲R̲I̲N̲C̲I̲P̲L̲E̲S̲
2.1 D̲E̲F̲I̲N̲I̲T̲I̲O̲N̲
Software verification is defined as the process of
ensuring and demonstrating that software design, programming,
tests and documentation meet the approved specifications
and standards.
2.2 V̲E̲R̲I̲F̲I̲C̲A̲T̲I̲O̲N̲ ̲P̲H̲A̲S̲E̲S̲
The verification activities will normally be distributed
throughout the development phases in the following
way:
a) Inspections (Reviews), which will take place after
system design, detailed design, coding, and test.
Inspections shall check that design, programming
and verification in each stage conform to agreed
specifications and standards and contain as few
errors as possible. Inspections are defined in
a separate standard:
Software Inspection Standard SD/STD/TBD
b) Unit Testing takes place as part of the development
of each software unit (see UDF Standard SD/STD/006
for definition of unit).
Unit tests shall establish that a piece of code
conforms to the functional, performance, and interface
specifications placed on it in the agreed design.
This usually means testing against the detailed design.
c) Integration Testing takes place during the integration
phase.
The tests shall establish that a unit tested piece
of software interfaces properly with other units,
and that the units function properly together within
a functional unit of a higher level of integration.
This is mainly a verification of the consistency
of detailed design with architectural design.
d) Qualification and acceptance testing takes place
after integration and/or after installation of the
end product at the purchaser's site.
These tests shall demonstrate systematically that
the fully integrated system complies with all functional,
performance, and interface specifications.
This amounts to testing the end product against
the requirement specification.
2.3 V̲E̲R̲I̲F̲I̲C̲A̲T̲I̲O̲N̲ ̲M̲E̲T̲H̲O̲D̲S̲
The verification requirements will be met by one of
the following methods:
a) T̲e̲s̲t̲
This method is a functional verification, such
as actual operation wherein the element of verification
is instrumented, measured, or displayed directly.
All software products shall be completely tested
to demonstrate their proper functioning, their interfaces
and their performance.
Test is the preferred method of verification.
b) A̲n̲a̲l̲y̲s̲i̲s̲. This method is a non-functional verification,
such as deduction or translation of data, review
of analytical data, or performance of a detailed
analysis.
The aim of the analysis is to discover:
1) Misinterpretations of the approved specification
or the design.
2) Errors, omissions or inconsistencies in the
above or in the programs, tests or documentation.
3) Nonconformance to the agreed standards.
Analysis is performed in all cases where test is
not possible and examination is not sufficient.
c) E̲x̲a̲m̲i̲n̲a̲t̲i̲o̲n̲/̲I̲n̲s̲p̲e̲c̲t̲i̲o̲n̲
This method is a non-functional verification, such
as visual inspection of the physical characteristics
of the item or of the documentation associated
with the item.
Examination is performed in all cases where Test
or Analysis is considered an unnecessary effort.
3̲ ̲V̲E̲R̲I̲F̲I̲C̲A̲T̲I̲O̲N̲ ̲D̲O̲C̲U̲M̲E̲N̲T̲A̲T̲I̲O̲N̲
The verification activities shall be executed in accordance
with a set of documentation which shall be prepared
in accordance with the appropriate standards and shall
be available before the activities are started.
The test documentation shall be approved by at least
one person other than the originator of the software.
This person would preferably belong to a verification
group which works independently from the production
groups.
The following documents shall be prepared:
a) V̲e̲r̲i̲f̲i̲c̲a̲t̲i̲o̲n̲ ̲P̲l̲a̲n̲
A management document which addresses all aspects
related to the verification. It should include
the test schedule and define the necessary support
tools.
b) T̲e̲s̲t̲ ̲S̲p̲e̲c̲i̲f̲i̲c̲a̲t̲i̲o̲n̲
Describes the test criteria and the methods to
be used in a specific test to assure that the performance
and design specifications have been satisfied.
The test specification identifies the capabilities
or program functions to be tested and identifies
the test environment.
c) T̲e̲s̲t̲ ̲P̲r̲o̲c̲e̲d̲u̲r̲e̲
A test procedure is a document that delineates
each step necessary to conduct a test. The steps
shall be in sequence with all inputs and outputs
defined.
d) V̲e̲r̲i̲f̲i̲c̲a̲t̲i̲o̲n̲ ̲R̲e̲p̲o̲r̲t̲
Collects, either directly or by reference, all
verification results and contains a summary and
a conclusion of the verification.
e) V̲e̲r̲i̲f̲i̲c̲a̲t̲i̲o̲n̲ ̲C̲o̲n̲t̲r̲o̲l̲ ̲D̲o̲c̲u̲m̲e̲n̲t̲
Contains a cross reference between the requirement
specification and the verification documents.
For each separate requirement an entry shall exist
showing how it is verified.
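As an illustration only (the identifiers are hypothetical),
an entry in the Verification Control Document might take
the following form:

    Requirement   Method        Verified by
    REQ 3.2.1     Test          Test Procedure TP-xx, step 5
    REQ 3.2.2     Analysis      Verification Report, section 4.1
    REQ 3.2.3     Examination   Inspection record, detailed design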
4̲ ̲ ̲T̲E̲S̲T̲ ̲B̲E̲D̲
In connection with the development of a software unit,
a complete test bed shall be developed and utilized
in unit testing.
The test bed shall be released with the unit and
remain available as a basis for re-tests after
modifications and for integration into the integration
test bed as appropriate.
The Test Bed shall consist of:
a) Test Driver
- If the unit under test is not a self-standing
program the test driver shall contain the necessary
control structure to invoke the functions of
the test item.
- If the test item does not access its input
data directly the test driver shall contain
means for accessing proper input data and conveying
them to the unit.
- If the test item does not produce direct output
the test driver shall contain means for accepting
output from the unit and storing it in a test
output file.
b) Test Data
A Test Data file shall contain all the necessary
data to exercise the test.
c) Test Output File
An output file shall be available for retaining
the results of a test run. A copy of the latest
test results shall always be kept so that they can
be compared with the results of a new run.
d) Test Documentation
A set of documents as described in chapter 3.
e) Stubs
One or more stubs as necessary to simulate the
functions of other units with which the unit under
test normally interfaces.
The simulation may be primitive but shall be sufficient
to support the ongoing test.
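As an illustration only, a minimal test bed might be
sketched in C as follows; the unit name, file names and
data format are assumptions, not prescribed by this
standard:

    /* Minimal test bed sketch (illustrative only). */
    #include <stdio.h>

    /* Stub: simulates a lower level unit with which the unit
     * under test normally interfaces. The simulation is
     * primitive but sufficient to support the test. */
    static int read_sensor(void) { return 42; }  /* fixed, known value */

    /* Hypothetical unit under test. */
    static int classify(int reading) { return reading > read_sensor(); }

    /* Test driver: invokes the unit, conveys input data to it,
     * and stores its output in the test output file. */
    int main(void)
    {
        FILE *in  = fopen("TESTDATA.TXT", "r");   /* test data file   */
        FILE *out = fopen("TESTOUT.TXT", "w");    /* test output file */
        int reading;

        if (in == NULL || out == NULL)
            return 1;
        while (fscanf(in, "%d", &reading) == 1)
            fprintf(out, "%d -> %d\n", reading, classify(reading));
        fclose(in);
        fclose(out);
        return 0;
    }

The test output file of the previous run can then be kept
as a copy and compared, e.g. with a file comparison utility,
with the result of a new run.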
5̲ ̲ ̲D̲E̲T̲A̲I̲L̲E̲D̲ ̲V̲E̲R̲I̲F̲I̲C̲A̲T̲I̲O̲N̲ ̲R̲E̲Q̲U̲I̲R̲E̲M̲E̲N̲T̲S̲
5.1 U̲N̲I̲T̲ ̲T̲E̲S̲T̲
5.1.1 C̲a̲p̲a̲b̲i̲l̲i̲t̲y̲ ̲T̲e̲s̲t̲i̲n̲g̲
The total set of tests developed for Unit Testing shall
demonstrate:
1) that the software unit meets its functional, performance,
and interface specifications.
2) that all code was sufficiently exercised to establish
confidence in its reliability in normal, marginal,
and abnormal operational environments.
Consequently, one subset of the test space must
be designed against the requirements, whereas the
other subset must be designed against the implemented
software, independently of the specifications on
which its design was based.
5.1.2 S̲t̲r̲u̲c̲t̲u̲r̲a̲l̲ ̲T̲e̲s̲t̲i̲n̲g̲
In testing against the implemented code, three aspects
shall be distinguished:
1) Correctness of the control paths executed under
given input conditions as well as executability
of all code.
2) Correctness of data access and determination of
access addresses.
3) Correctness of the calculations performed on that
path.
5.1.2.1 C̲o̲n̲t̲r̲o̲l̲ ̲P̲a̲t̲h̲ ̲T̲e̲s̲t̲i̲n̲g̲
A path shall be defined by an entry point, an exit
or abort point, and a set of loop counts and selection
exits from which the exact sequence of actually executed
instructions can be reconstructed.
Generally, vertical and horizontal structuring of the
design and the program should have partitioned the
complete set of paths of the complete program into
subpaths confined to subunits, such that testing of
all paths within each subunit is practicable (with
respect to the amount of test effort required). Where
this is the case, all paths within each subunit shall
be tested.
If this is not practicable, independent substructures
shall be derived and test cases developed to test the
paths within each of these substructures independently
of the others. Independence should, if possible, be
construed to mean operation on different subsets of
data space.
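As an illustrative sketch (the unit and its inputs are
hypothetical), the paths of the small unit below are
completely identified by the loop count and the selection
exits taken, so three inputs suffice to exercise every path:

    #include <stdio.h>

    /* Hypothetical unit: each executed path is reconstructed
     * from the entry point, the loop count, and the selection
     * exits taken. */
    static int sum_positive(const int *v, int n)
    {
        int sum = 0;
        for (int i = 0; i < n; i++)   /* loop counts: 0, 1, many  */
            if (v[i] > 0)             /* selection: taken/skipped */
                sum += v[i];
        return sum;
    }

    int main(void)
    {
        int none[]  = { 0 };          /* loop count 0             */
        int mixed[] = { 3, -1 };      /* both selection exits     */
        int many[]  = { 1, 2, 3, 4 }; /* loop count "many"        */

        printf("%d\n", sum_positive(none,  0));
        printf("%d\n", sum_positive(mixed, 2));
        printf("%d\n", sum_positive(many,  4));
        return 0;
    }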
5.1.2.2 D̲a̲t̲a̲ ̲A̲c̲c̲e̲s̲s̲ ̲T̲e̲s̲t̲i̲n̲g̲
The test cases shall at least exercise every alternative
of a selection, demonstrate proper termination of variable
count loops for several loop counts, demonstrate proper
data access for several destinations, and exercise
every compound statement in all of its modes. (A compound
statement shall be defined as a macro or procedure
call. The mode of such a statement shall be defined
as the basic type of function performed (e.g. TRIG
may deliver a SINE or COSINE, etc., depending upon a
function indicator provided in the call) or as the
basic format of the data passed (e.g. different record
format interpretation in the receiver depending upon
a function indicator). This definition separates most
of the paths and branches on one level of software
unit from the multitude of paths and branches inducible
in lower level units through the calls.)
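The TRIG example above might be sketched as follows (the
interface is an assumption); exercising the compound
statement in all of its modes means one call per function
indicator, without re-exercising the paths inside the
lower level unit itself:

    #include <math.h>
    #include <stdio.h>

    enum trig_mode { SINE, COSINE };        /* function indicator */

    /* Compound statement: a single call whose mode is selected
     * by the function indicator provided in the call. */
    static double trig(enum trig_mode mode, double x)
    {
        return (mode == SINE) ? sin(x) : cos(x);
    }

    int main(void)
    {
        /* Exercise every mode of the compound statement once;
         * the paths inside trig() belong to its own unit test. */
        printf("SINE(0)   = %f\n", trig(SINE,   0.0));
        printf("COSINE(0) = %f\n", trig(COSINE, 0.0));
        return 0;
    }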
5.1.2.3 E̲r̲r̲o̲r̲ ̲P̲r̲o̲c̲e̲s̲s̲i̲n̲g̲
The effectiveness of the test cases shall be proven
by appropriate monitoring and tracing tools.
Errors in the calculations performed on a given path
are of two kinds:
1) Errors which appear on every execution of the path
(unconditional errors, e.g. a wrong or missing statement,
a wrong address, etc.).
2) Errors depending upon the values of the variables
processed (conditional errors, e.g. overflow, insufficient
precision of results, etc.).
The first class is completely discovered through the
all branches/all modes requirement.
The probability of detecting errors of the second class
shall be made sufficiently high by supplying a test
space which exercises critical sequences with at least
three different sets of off-nominal inputs (e.g. marginal,
extreme, singular, boundary value, wrong type or format,
out of bounds, null, missing or other data and control
parameters which are unusual with respect to singularities
of the function or limits of the computer).
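A sketch of such a test space (the unit and its limit are
hypothetical): the same path is driven with marginal,
boundary and extreme inputs, and a plausibility check
exposes the conditional overflow error:

    #include <stdio.h>
    #include <limits.h>

    /* Hypothetical unit carrying a conditional error: the
     * result overflows for inputs greater than INT_MAX / 2. */
    static int double_count(int n) { return 2 * n; }

    int main(void)
    {
        /* At least three off-nominal inputs for the same path. */
        int cases[] = { INT_MAX / 2,      /* marginal: last safe value */
                        INT_MAX / 2 + 1,  /* boundary: first overflow  */
                        INT_MAX };        /* extreme value             */

        for (int i = 0; i < 3; i++) {
            long long expected = 2LL * cases[i];
            if (expected > INT_MAX)       /* plausibility check        */
                printf("input %d: conditional error (overflow)\n",
                       cases[i]);
            else
                printf("double_count(%d) = %d\n",
                       cases[i], double_count(cases[i]));
        }
        return 0;
    }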
5.2 I̲N̲T̲E̲G̲R̲A̲T̲I̲O̲N̲ ̲T̲E̲S̲T̲I̲N̲G̲
5.2.1 S̲W̲/̲S̲W̲ ̲I̲n̲t̲e̲g̲r̲a̲t̲i̲o̲n̲
a) Software shall be integrated by successively joining
verified units into larger units and again testing
these larger (integrated) units.
Integration shall proceed one level at a time until
the final product is completed.
b) The integrated unit shall be subjected to integration
tests and unit tests.
The integration tests shall demonstrate proper
functioning of the internal interfaces. They shall
exercise the interfaces for each different type
of control and data traffic which they are required
to handle and show that the tested unit uses its
component units correctly.
In contrast, unit testing (as described earlier)
shall show that the correct usage of the component
units by the tested unit produces proper results
in the tested unit. When performing unit tests
on integrated units, the requirements on the thoroughness
of unit tests shall be confined to the highest
level unit, since lower level units shall have undergone
complete unit testing before integration.
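As an illustration (all names hypothetical, continuing the
test bed sketch of chapter 4), integration replaces the stub
used during unit test with the verified real unit, and the
integration test exercises the interface for each type of
traffic it is required to handle:

    #include <stdio.h>

    /* Real, separately unit tested lower level unit, replacing
     * the stub that stood in for it during unit test (body is a
     * placeholder here). */
    static int read_sensor(void) { return 42; }

    /* Higher level unit under integration; interface unchanged. */
    static int classify(int reading) { return reading > read_sensor(); }

    int main(void)
    {
        /* Exercise the interface with traffic below, at and
         * above the threshold delivered by the lower unit. */
        int traffic[] = { -1, 42, 100 };
        for (int i = 0; i < 3; i++)
            printf("classify(%d) = %d\n", traffic[i],
                   classify(traffic[i]));
        return 0;
    }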
5.2.2 S̲W̲/̲H̲W̲ ̲I̲n̲t̲e̲g̲r̲a̲t̲i̲o̲n̲
a) If software development and software/software integration
were performed on a simulator, a hardware/software
integration step must be performed.
Hardware/software integration shall be performed
to successively (layer by layer) replace the device
simulators supporting the software with target hardware
and to test the interaction between software and
target hardware.
b) Hardware shall first be subjected to a complete
set of unit tests (diagnostics). The correct usage
of hardware by software and all types of data traffic
between hardware and software shall be tested.
This shall include the test of all device control
functions and device status and error indicators,
transfer of typical and marginal data and timing
configurations.
5.2.3 E̲n̲v̲i̲r̲o̲n̲m̲e̲n̲t̲a̲l̲ ̲S̲i̲m̲u̲l̲a̲t̲i̲o̲n̲
a) Environmental simulation shall be performed at
least for all software for which this is required
in the specifications.
Whereas testing as discussed above is essentially
concerned with systematically investigating isolated
aspects of the software's behaviour under controlled
and possibly artificial conditions, much like in
a laboratory experiment, environmental simulation
shall imitate, in a random but reproducible manner,
all aspects of "real world" demands (or a sub-
or superset thereof) made on the software system
and log selected results.
b) Environmental simulation shall be performed after
unit and integration testing. It shall be directed
towards detecting the following types of errors:
1) Mismatches between requirements and operational
needs.
2) Mismatches between the product and the requirements
not discovered during test due to omissions
in test space.
3) Conditional errors missed during analysis and
testing.
4) Real time errors.
5) Performance deficiencies.
In order to detect the first three types of errors,
the simulator shall be designed to vary input data
statistically within definable constraints and
to perform plausibility checks on the results.
c) In order to detect the latter two types of errors,
the simulator shall be able to simulate load variations
and different task configurations by selecting
events and their associated input data from a predefined
set of precalculated cases, submit this sequence
of demands to the system, and monitor the results.
The simulator shall be able to select test cases
randomly from the predefined set varying the case
and the time of selection according to predefinable
density distributions. The results shall be compared
with the precalculated ones and errors shall be
logged together with the corresponding system states.
It shall be possible to reproduce a situation by
restarting the system in a certain logged state.
It shall be possible to adjust the level of monitoring.
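A minimal sketch of such a case selection mechanism (names,
case set and distribution are assumptions; a uniform
distribution stands in for the predefinable density
distributions): cases are drawn randomly from a predefined
set, the seed is logged so that a run can be reproduced, and
results are compared with the precalculated ones:

    #include <stdio.h>
    #include <stdlib.h>

    #define CASES 4

    /* Predefined set of precalculated cases (hypothetical). */
    static const int input[CASES]    = { 1, 2, 3, 4 };
    static const int expected[CASES] = { 2, 4, 6, 8 };

    /* Placeholder for the system under test. */
    static int system_under_test(int x) { return 2 * x; }

    int main(void)
    {
        unsigned seed = 1;             /* logged: rerunning with the
                                          same seed reproduces the run */
        srand(seed);
        printf("seed %u\n", seed);

        for (int event = 0; event < 10; event++) {
            int c = rand() % CASES;    /* random but reproducible */
            int result = system_under_test(input[c]);
            if (result != expected[c]) /* compare and log errors  */
                printf("event %d, case %d: ERROR got %d expected %d\n",
                       event, c, result, expected[c]);
        }
        return 0;
    }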
5.3 Q̲U̲A̲L̲I̲F̲I̲C̲A̲T̲I̲O̲N̲ ̲T̲E̲S̲T̲
a) Qualification tests shall be developed and executed
to demonstrate systematically that:
1) the completely integrated product and its components
meet the functional, performance and interface
requirements;
2) all applicable events are properly handled;
3) rejection of improper inputs is adequate.
b) The performance test shall be performed as environmental
simulation of the system under test if the exact
determination of performance under various loads
is critical to the success of the operation. (To
be determined in the specification of the product.)
c) Environmental simulation shall be performed for
all software in an operational environment. Uninterrupted
error-free service time shall be equivalent to
at least ten days of normal operation. The test
case distribution shall mirror actual operation
with respect to input values, timing and event
patterns as well as operator error and hardware
failures.
d) All tests shall be designed by an independent tester.
e) All tests shall be executed at least once on target
hardware before acceptance for operation.
f) A verification quality analysis shall be performed
for each object verified and documented in the
Software Verification Report.
6̲ ̲ ̲P̲R̲O̲J̲E̲C̲T̲ ̲D̲E̲P̲E̲N̲D̲E̲N̲C̲I̲E̲S̲
Depending on the size and complexity of a software
project, the necessary verification effort may
vary considerably.
However, all the points described in the above verification
principles shall be evaluated in all cases, and if any
such point is clearly superfluous, a justification
shall be contained in the verification documentation.
In particular, the size of the necessary documentation
may vary; it is therefore not required that the five
types of documentation be produced as separate documents,
but each may be reduced to a chapter in other documents.