CPS/PLN/012
820510

CAMPS ACCEPTANCE PLAN

ISSUE 1.2          CAMPS
T̲A̲B̲L̲E̲ ̲O̲F̲ ̲C̲O̲N̲T̲E̲N̲T̲S̲
1     GENERAL
1.1   PURPOSE
1.2   PROJECT REFERENCES
1.3   TERMS AND ABBREVIATIONS

2     DEVELOPMENT TEST ACTIVITY
2.1   TEST AND VERIFICATION METHODS
2.2   VERIFICATION ACTIVITIES
2.3   VERIFICATION RESULTS

3     TEST PLAN
3.1   SYSTEM DESCRIPTION
3.2   ACCEPTANCE TEST TASK SUMMARY
3.3   TEST SCHEDULE
3.4   DELIVERABLE ITEMS
3.5   ORGANIZATIONAL RESPONSIBILITIES

4     ACCEPTANCE TEST TASK SPECIFICATION
4.1   ACCEPTANCE TEST SPEC. AND PROC.
4.2   DSMT (PROTOTYPE) SYSTEM TEST
4.3   FACTORY ACCEPTANCE TEST
4.4   COMSEC IN-PLANT VERIFICATION
4.5   AVAILABILITY VERIFICATION
4.6   IN-PLANT SOFTWARE VERIFICATION
4.7   SOFTWARE FUNCTIONAL VERIFICATION (SITE 1)
4.8   SOFTWARE OPERATIONAL VERIFICATION (SITE 1)
4.9   SOFTWARE VERIFICATION, SITE 2-16
4.10  COMSEC ON-SITE VERIFICATION
4.11  SITE PROVISIONAL ACCEPTANCE (SPA)
1̲ ̲ ̲G̲E̲N̲E̲R̲A̲L̲
1.1 P̲U̲R̲P̲O̲S̲E̲
The purpose of the Acceptance Test Plan is:
a) to provide a coherent outline of the activities
required for system acceptance.
b) to define the acceptance criteria for each individual
activity.
c) to outline the test methodology.
d) to outline the test schedule.
e) to define, where applicable, the personnel, equipment,
and software requirements for the individual test
activities.
f) to define the responsibilities for conducting and
coordinating the test activity.
1.2 P̲R̲O̲J̲E̲C̲T̲ ̲R̲E̲F̲E̲R̲E̲N̲C̲E̲S̲
a) Contract No. CE 80-9009-INF.
b) CAMPS SYSTEM REQUIREMENTS SPEC. CPS/210/SYS/0001
c) CAMPS Documentation Plan CPS/PLN/008
d) CAMPS R&M Plan CPS/PLN/004
1.3 T̲E̲R̲M̲S̲ ̲A̲N̲D̲ ̲A̲B̲B̲R̲E̲V̲I̲A̲T̲I̲O̲N̲S̲
CDRL C̲ontract D̲ata R̲equirement L̲ist
CR C̲hristian R̲ovsing A/S.
DSMT D̲evelopment, S̲oftware, M̲aintenance and T̲est
System. This is the first CAMPS prototype.
LI L̲ine I̲tem
M&D M̲aintenance and D̲iagnostics
MDSD M̲eantime between D̲iscovery of S̲oftware D̲efects
R&M R̲eliability and M̲aintainability.
SPA S̲ite P̲rovisional A̲cceptance.
SRS CAMPS S̲ystem R̲equirements S̲pecification
WBS W̲ork B̲reakdown S̲tructure
WP W̲ork P̲ackage
2̲ ̲ ̲D̲E̲V̲E̲L̲O̲P̲M̲E̲N̲T̲ ̲T̲E̲S̲T̲ ̲A̲C̲T̲I̲V̲I̲T̲Y̲
The acceptance test activity is the conclusion of the
tasks contained in the equipment quality assurance
provisions. Prior to the acceptance test activity,
a hierarchy of tests and verifications has been exercised
to prove the equipment's compliance with the contractual
requirements.
2.1 T̲E̲S̲T̲ ̲A̲N̲D̲ ̲V̲E̲R̲I̲F̲I̲C̲A̲T̲I̲O̲N̲ ̲M̲E̲T̲H̲O̲D̲S̲
Compliance with the requirements will be verified by
one of the following methods:
a) E̲x̲a̲m̲i̲n̲a̲t̲i̲o̲n̲: This method is a non-functional verification,
such as visual inspection of the physical characteristics
of the item or of the documentation associated
with the item.
b) A̲n̲a̲l̲y̲s̲i̲s̲: This method is a non-functional verification,
such as deduction or translation of data, review
of analytical data, or performance of a detailed
analysis.
c) T̲e̲s̲t̲ ̲D̲e̲m̲o̲n̲s̲t̲r̲a̲t̲i̲o̲n̲: This method is a functional
verification, such as actual operation wherein
the element of verification is instrumented, measured,
or displayed directly (test) or where the element
of verification is logically obvious, as the result
of some other verification, but not itself displayed
(demonstration).
2.2 V̲E̲R̲I̲F̲I̲C̲A̲T̲I̲O̲N̲ ̲A̲C̲T̲I̲V̲I̲T̲I̲E̲S̲
a) The total set of verification activities (including
acceptance test) is divided into hardware, software
and system verification.
b) The verification is divided into two classes:
I - internal verification
II - external verification (i.e. Acceptance Tests)
c) Only class II verifications shall be approved by
SHAPE.
d) Class I verifications are approved by CR's internal
QA only, but may be inspected by SHAPE's QAR.
e) The verification efforts are divided into the verification
types defined below.
f) H̲A̲R̲D̲W̲A̲R̲E̲ ̲V̲e̲r̲i̲f̲i̲c̲a̲t̲i̲o̲n̲
C̲l̲a̲s̲s̲   V̲e̲r̲i̲f̲i̲c̲a̲t̲i̲o̲n̲ ̲T̲y̲p̲e̲
I Prototype Verification (Modules)
I Preproduction Verification (Modules)
I Production Verification (Modules)
II DSMT (Prototype) System Verification
(HW System)
II Factory Acceptance Verification (HW System
and System Software)
II COMSEC In-Plant Verification (HW System)
II Availability Verification
g) S̲O̲F̲T̲W̲A̲R̲E̲ ̲V̲e̲r̲i̲f̲i̲c̲a̲t̲i̲o̲n̲
C̲l̲a̲s̲s̲   V̲e̲r̲i̲f̲i̲c̲a̲t̲i̲o̲n̲ ̲T̲y̲p̲e̲
I Software Development Verification (Sub-systems
and packages)
II DSMT (Prototype) System Software Verification
(System software)
II In Plant Software Verification (SW System)
II Software Functional Verification (SW System)
II Software Operational Verification
(SW System)
h) S̲Y̲S̲T̲E̲M̲ ̲V̲e̲r̲i̲f̲i̲c̲a̲t̲i̲o̲n̲
C̲l̲a̲s̲s̲   V̲e̲r̲i̲f̲i̲c̲a̲t̲i̲o̲n̲ ̲T̲y̲p̲e̲
II Comsec On-site Verification
II Site Provisional Acceptance (SPA)
2.3 V̲E̲R̲I̲F̲I̲C̲A̲T̲I̲O̲N̲ ̲R̲E̲S̲U̲L̲T̲S̲
a) Class I verification results will be documented
in a series of internal CR verification reports.
b) Class II verification results will be documented
in the Acceptance Test Reports.
3̲ ̲ ̲T̲E̲S̲T̲ ̲P̲L̲A̲N̲
This section provides an overview of the system to
be tested, the acceptance test task activities and
schedule.
3.1 S̲Y̲S̲T̲E̲M̲ ̲D̲E̲S̲C̲R̲I̲P̲T̲I̲O̲N̲
In the context of the acceptance test task CAMPS can
be described in terms of functional capabilities and
attributes related to:
a) Hardware
b) Software (System-, M&D- and Application-)
c) System
The detailed CAMPS description is given in the SRS.
3.2 A̲C̲C̲E̲P̲T̲A̲N̲C̲E̲ ̲T̲E̲S̲T̲ ̲T̲A̲S̲K̲ ̲S̲U̲M̲M̲A̲R̲Y̲
The CAMPS Acceptance Test tasks are divided into a
number of Work Packages (WP's). The associated Work
Breakdown Structure (WBS) is shown in figure 3-1 and
table 3-1.
FIGURE 3-1

WP No.         WP Name                                         CDRL LI No.
________________________________________________________________________

2.2.1          Acceptance Test Plan                            2.1.2
2.5.2          Acceptance Test Spec. and Proc.                 2.5.2
5.3            DSMT (Prototype) System Verification
5.4.4.1        Factory Acceptance Verification, site 1         5.4
5.5.4          Factory Acceptance Verification, site 2         5.5
5.6.4          Factory Acceptance Verification, site 3         5.6
5.7.4          Factory Acceptance Verification, site 4         5.7
5.8.4          Factory Acceptance Verification, site 5         5.8
5.9.4          Factory Acceptance Verification, site 6         5.9
5.10.4         Factory Acceptance Verification, site 7         5.10
5.11.4         Factory Acceptance Verification, site 8         5.11
5.12.4         Factory Acceptance Verification, site 9         5.12
5.13.4         Factory Acceptance Verification, site 10        5.13
5.14.4         Factory Acceptance Verification, site 11        5.14
5.15.4         Factory Acceptance Verification, site 12        5.15
5.16.4         Factory Acceptance Verification, site 13
5.17.4         Factory Acceptance Verification, site 14
5.18.4         Factory Acceptance Verification, site 15
5.19.4         Factory Acceptance Verification, site 16
5.4.4.2        Comsec In-plant Verification, site 1            3.16
2.7 & 2.4.4    Availability Verification (R and M)             2.4.4
5.4.6          In-plant SW Verification                        5.4.5
7.1.1          SW Functional Verification, site 1              7.1.1
7.1.2          SW Operational Verification, site 1             7.1.2
7.x.1          SW Verification, site x = 2 to 16
7.x.2          Comsec On-site Verification, site x = 1 to 16   7.x
7.x.3          SPA, site x = 1 to 16                           7.x
TABLE 3-1   Cross Reference between Acceptance Test WP No.'s and LI No.'s of the Contract Data Requirement List.
The following sub-sections briefly describe the contents
of each of the WP's.
A detailed WP description is given in section 4.
3.2.1 A̲c̲c̲e̲p̲t̲a̲n̲c̲e̲ ̲T̲e̲s̲t̲ ̲S̲p̲e̲c̲i̲f̲i̲c̲a̲t̲i̲o̲n̲ ̲a̲n̲d̲ ̲P̲r̲o̲c̲e̲d̲u̲r̲e̲
This WP consists of the preparation of the Acceptance
Test Specification and Procedure document.
The Test Specification part of the document will contain
a complete distribution and mapping of all testable
contractual requirements onto the indicated individual
test activities. The document will furthermore contain
a description of the Test Methods and Constraints,
and Test Progression. The Test Specification establishes
a baseline for a detailed description of the Test Procedures.
The Test Procedure part of the document provides a
detailed procedure description of each test to be performed.
The procedure description will be written for test
engineers and will for each test case include test
control, test data, expected test results and test
execution procedure.
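The mapping exercise can be illustrated in miniature. The following sketch (Python, purely illustrative; the requirement identifiers are SRS paragraphs cited elsewhere in this plan and the coverage mapping is invented, not the actual Test Specification content) shows how a requirement-to-test allocation can be checked for uncovered requirements:

```python
# Illustrative sketch: a minimal requirement-to-test traceability check.
# The mapping below is a hypothetical example, not the real allocation.
requirement_to_tests = {
    "SRS 3.4.1.1":  ["In-Plant Software Verification", "SW Operational Verification"],
    "SRS 3.5.2.10": ["COMSEC In-Plant Verification", "COMSEC On-Site Verification"],
    "SRS 3.5.2.1":  ["DSMT (Prototype) System Verification"],
}

def uncovered(requirements, mapping):
    """Return the testable requirements that no test activity covers."""
    return [r for r in requirements if not mapping.get(r)]

all_requirements = ["SRS 3.4.1.1", "SRS 3.5.2.1", "SRS 3.5.2.10", "SRS 3.4.4"]
print(uncovered(all_requirements, requirement_to_tests))  # ['SRS 3.4.4']
```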
3.2.2 H̲a̲r̲d̲w̲a̲r̲e̲ ̲V̲e̲r̲i̲f̲i̲c̲a̲t̲i̲o̲n̲
Hardware verification comprises 4 WP's described in
the following:
3.2.2.1 D̲S̲M̲T̲ ̲(̲P̲r̲o̲t̲o̲t̲y̲p̲e̲)̲ ̲S̲y̲s̲t̲e̲m̲ ̲V̲e̲r̲i̲f̲i̲c̲a̲t̲i̲o̲n̲
a) When the hardware of the DSMT System has been integrated
and the system software completed, a DSMT (Prototype)
Acceptance Verification of the integrated system
will be performed.
b) This is an external verification (class II). Upon
SHAPE approval of the results of the tests, the
manufactured system is baselined.
c) The test shall verify that all hardware fulfils
the functional specification.
d) An environmental test shall be included to verify
the environmental requirements. This test will
be in compliance with the SRS para. 3.5.2.1.
3.2.2.2 F̲a̲c̲t̲o̲r̲y̲ ̲A̲c̲c̲e̲p̲t̲a̲n̲c̲e̲ ̲V̲e̲r̲i̲f̲i̲c̲a̲t̲i̲o̲n̲,̲ ̲S̲i̲t̲e̲ ̲1̲
When the CAMPS hardware, to be installed at the first
site, has been integrated at the factory, a functional
test equivalent to the DSMT system verification test
will be performed at the factory to verify that all
hardware fulfils the functional specification. This
test will not include the environmental test cases.
3.2.2.3 F̲a̲c̲t̲o̲r̲y̲ ̲A̲c̲c̲e̲p̲t̲a̲n̲c̲e̲ ̲V̲e̲r̲i̲f̲i̲c̲a̲t̲i̲o̲n̲,̲ ̲S̲i̲t̲e̲ ̲2̲ ̲-̲ ̲1̲6̲
When subsequent CAMPS have been integrated at the factory,
tests will be carried out to verify that all hardware
fulfils the functional specifications. The test is
a subset of the DSMT (Prototype) System Verification.
The subset of test procedures will be identical to
the performance test procedures used for the first
site (3.2.2.2).
3.2.2.4 C̲O̲M̲S̲E̲C̲ ̲I̲n̲-̲p̲l̲a̲n̲t̲ ̲V̲e̲r̲i̲f̲i̲c̲a̲t̲i̲o̲n̲
a) The CAMPS equipment will be tested by Purchaser
at the factory. The purchaser will liaise directly
with the national COMSEC authorities where appropriate.
b) The test measurements will be performed on a representative
CAMPS configuration.
The measurements shall verify compliance with the
requirements specified in the SRS para. 3.5.2.10.
c) All equipment which processes classified information
in a clear electrical form shall be in compliance
with the SRS para. 3.5.2.10.
3.2.2.5 A̲v̲a̲i̲l̲a̲b̲i̲l̲i̲t̲y̲ ̲V̲e̲r̲i̲f̲i̲c̲a̲t̲i̲o̲n̲
This verification is accomplished by a combination
of different means, such as calculations, observations
and analysis, which together shall provide evidence
of the system's conformity to the availability requirements.
The availability verification is performed in the
Reliability and Maintainability Programme. For a detailed
description see CAMPS R&M Plan.
3.2.3 S̲o̲f̲t̲w̲a̲r̲e̲ ̲V̲e̲r̲i̲f̲i̲c̲a̲t̲i̲o̲n̲
Software verification consists of 4 WP's described
in the following.
3.2.3.1 D̲S̲M̲T̲ ̲(̲P̲r̲o̲t̲o̲t̲y̲p̲e̲)̲ ̲S̲y̲s̲t̲e̲m̲ ̲S̲o̲f̲t̲w̲a̲r̲e̲ ̲V̲e̲r̲i̲f̲i̲c̲a̲t̲i̲o̲n̲
This test is the SW part of the DSMT (Prototype) System
verification.
3.2.3.2 I̲n̲ ̲P̲l̲a̲n̲t̲ ̲S̲o̲f̲t̲w̲a̲r̲e̲ ̲V̲e̲r̲i̲f̲i̲c̲a̲t̲i̲o̲n̲
The In Plant Software Verification will be performed
in the factory in a simulated environment as specified
in SRS para 3.5.11.5.2.b-c. The test criteria require
an accumulated test time of 240 hours.
Upon fulfilment of the acceptance criteria specified
in SRS para 4.2.2.2, the CAMPS software is baselined
for release to the operational sites.
3.2.3.3 S̲o̲f̲t̲w̲a̲r̲e̲ ̲F̲u̲n̲c̲t̲i̲o̲n̲a̲l̲ ̲V̲e̲r̲i̲f̲i̲c̲a̲t̲i̲o̲n̲,̲ ̲S̲i̲t̲e̲ ̲1̲
a) At the first site (SHAPE) CR will conduct a formal
functional test to verify basic system functions
and interfaces. The test will be supervised by
CR test engineers and SHAPE operational staff will
assist in operating the system.
b) NICS-TARE, ACE CCIS, and SCARS II are planned to
be available. However, in the event one or more
of the interfaces are not available, the contractor
will proceed with all of those components of the
system that are independent of the missing interfaces.
c) In this event, SHAPE will give Site Provisional
Acceptance, notwithstanding the inability to test
all interfaces, providing all other aspects of
the test are satisfactorily completed.
3.2.3.4 S̲o̲f̲t̲w̲a̲r̲e̲ ̲O̲p̲e̲r̲a̲t̲i̲o̲n̲a̲l̲ ̲V̲e̲r̲i̲f̲i̲c̲a̲t̲i̲o̲n̲,̲ ̲S̲i̲t̲e̲ ̲1̲
a) Immediately after successful completion of the
Software Functional Verification at the SHAPE site,
an Operational Test shall take place in order to
verify the reliability of the software system.
b) The operational test will take place under CR control,
but SHAPE personnel will operate the system in
a live operational mode in accordance with agreed
test specifications and procedures.
c) The acceptance test criteria shall be that the
CAMPS system shall be able to execute the basic
software functions according to the approved System
Requirements Specification for 3 periods of 20
calendar days in a live operational mode without
any software failure.
d) Any software failure, as defined in para. 4.8,
shall require a complete retest cycle. The duration
of the test or retest may be shortened at SHAPE's
discretion.
3.2.3.5 S̲o̲f̲t̲w̲a̲r̲e̲ ̲V̲e̲r̲i̲f̲i̲c̲a̲t̲i̲o̲n̲,̲ ̲S̲i̲t̲e̲ ̲2̲-̲1̲6̲
Software verification for the CAMPS installations other
than the first site will consist of a subset of the
tests outlined in para. 3.2.3.3.
3.2.4 S̲y̲s̲t̲e̲m̲ ̲V̲e̲r̲i̲f̲i̲c̲a̲t̲i̲o̲n̲
System verification consists of 2 WP's as described
in the following.
3.2.4.1 C̲O̲M̲S̲E̲C̲ ̲O̲n̲-̲s̲i̲t̲e̲ ̲V̲e̲r̲i̲f̲i̲c̲a̲t̲i̲o̲n̲
a) When the CAMPS computer equipment and peripherals
have been installed on-site, the verification will
be performed by ACE COMSEC or national COMSEC authorities
nominated by ACE COMSEC.
b) The purchaser will supervise the verification and
CR test engineers will assist in operating the
system. The COMSEC on-site verification is a formal
verification. The verification shall take place
in conjunction with the SPA.
3.2.4.2 S̲i̲t̲e̲ ̲P̲r̲o̲v̲i̲s̲i̲o̲n̲a̲l̲ ̲A̲c̲c̲e̲p̲t̲a̲n̲c̲e̲ ̲(̲S̲P̲A̲)̲
a) Site provisional acceptance is the act whereby
SHAPE will acknowledge by protocol that CR has
fully demonstrated that a site is complete and
ready for initial operation and will take place
when the following requirements have been met:
1) Completion of the agreed on-site acceptance
test specified in para 4.7, 4.8, and 4.10 for
site 1 and para 4.9 and 4.10 for the remaining
sites.
Conductance of the mentioned acceptance tests
assumes the availability of the connecting
points for the external interfaces (NICS TARE,
ACE CCIS, SCARS II, TRC and Point-to-Point
connections). However, in the event one or
more of the interfaces are not available, the
contractor shall proceed with all of those
components of the system that are independent
of the missing interfaces.
In this event SHAPE will give site provisional
acceptance, notwithstanding the inability
to test all interfaces, providing all other
aspects of the test are satisfactorily completed.
2) Verification of the site inventory.
3) If applicable, availability of a mutually agreed
discrepancy list showing the agreed date for
clearance of each listed discrepancy.
3.3 A̲C̲C̲E̲P̲T̲A̲N̲C̲E̲ ̲T̲E̲S̲T̲ ̲S̲C̲H̲E̲D̲U̲L̲E̲
The time schedule for the acceptance tests is shown
on the enclosed schedule charts.
ACCEPTANCE TEST SCHEDULE (2 - 12 Site Equipment)
CAMPS ACCEPTANCE TEST SCHEDULE (DSMT & FIRST SITE EQUIPMENT)
3.4 D̲E̲L̲I̲V̲E̲R̲A̲B̲L̲E̲ ̲I̲T̲E̲M̲S̲
Each of the above described WP's results in a set of
deliverable items. Together these items constitute
the documentation for the acceptance task.
The deliverable items to be submitted by CR for each
WP are as follows:
1) a Description or Specification
For those WP's that comprise test activity for which
CR is responsible, the following shall apply:
2) a set of Test Procedures derived from the Test
Specification.
3) a Test Report containing the results from the performed
test procedures.
The deliverable items to be submitted by SHAPE are
identified as follows:
4) a COMSEC Test Specification
5) a set of Test Procedures derived from (4).
6) a Site Provisional Acceptance Protocol
3.5 O̲R̲G̲A̲N̲I̲Z̲A̲T̲I̲O̲N̲A̲L̲ ̲R̲E̲S̲P̲O̲N̲S̲I̲B̲I̲L̲I̲T̲I̲E̲S̲
The responsibility for conductance of the CAMPS acceptance
task lies with CR. Within CR's CAMPS program organization,
the responsibility lies with the Integration and Test
(I&T) group which has the lines of reference indicated
in fig. 3.3-1.
SYSTEM ENG
    |
INTEGR. & TEST    o Integration and Test
                  o Acceptance Tests

FIGURE 3.3-1
4̲ ̲ ̲A̲C̲C̲E̲P̲T̲A̲N̲C̲E̲ ̲T̲E̲S̲T̲ ̲W̲P̲ ̲S̲P̲E̲C̲I̲F̲I̲C̲A̲T̲I̲O̲N̲
This section describes the Acceptance Test Work Packages
with emphasis on:
1) Purpose
2) Personnel Requirements
3) HW Requirements
4) SW Requirements
5) Functional Description
6) Acceptance Criteria
7) Test Schedule
4.1 A̲C̲C̲E̲P̲T̲A̲N̲C̲E̲ ̲T̲E̲S̲T̲ ̲S̲P̲E̲C̲I̲F̲I̲C̲A̲T̲I̲O̲N̲ ̲A̲N̲D̲ ̲P̲R̲O̲C̲E̲D̲U̲R̲E̲
4.1.1 P̲u̲r̲p̲o̲s̲e̲
The purpose of this document is to identify all requirements
to be demonstrated by test, and to map these requirements
onto the described individual test activities.
Furthermore, the document shall provide the test procedures
for executing the tests.
4.1.2 T̲e̲s̲t̲ ̲S̲p̲e̲c̲i̲f̲i̲c̲a̲t̲i̲o̲n̲
The Acceptance Test Specification shall include the
following information.
a) T̲e̲s̲t̲ ̲S̲p̲e̲c̲i̲f̲i̲c̲a̲t̲i̲o̲n̲
1) R̲e̲q̲u̲i̲r̲e̲m̲e̲n̲t̲s̲.̲ List the individual requirements
to be demonstrated by the test as derived from
the SRS.
2) S̲y̲s̲t̲e̲m̲ ̲F̲u̲n̲c̲t̲i̲o̲n̲s̲.̲ Provide a detailed list
of the system functions which will be exercised
during overall system testing.
a The list must be derived from the SRS.
b It must be ordered in such a way that the
functions are related to the requirements
given in para. 4.1.2.a.1
3) T̲e̲s̲t̲/̲F̲u̲n̲c̲t̲i̲o̲n̲ ̲R̲e̲l̲a̲t̲i̲o̲n̲s̲h̲i̲p̲s̲
a List the tests which, taken as a whole,
constitute the overall test activity.
b Provide a Test/Function Matrix Chart summarising
the overall allocation of system functions
to the tests.
b) T̲e̲s̲t̲ ̲M̲e̲t̲h̲o̲d̲s̲ ̲a̲n̲d̲ ̲C̲o̲n̲s̲t̲r̲a̲i̲n̲t̲s̲
1) S̲y̲s̲t̲e̲m̲ ̲T̲e̲s̲t̲ ̲C̲o̲n̲d̲i̲t̲i̲o̲n̲s̲.̲ Indicate whether the
system test is to be made using normal system
inputs (type, magnitude or frequency) and data
base or whether a special set of exercise inputs
and exercise data base is to be used.
2) E̲x̲t̲e̲n̲t̲ ̲o̲f̲ ̲S̲y̲s̲t̲e̲m̲ ̲T̲e̲s̲t̲.̲ Indicate:
a The extent of testing to be employed.
b When total testing is not to be employed,
the test requirements either as a percentage
of some well defined total quantity or
a number of samples of discrete operating
conditions or values and the rationale
for adopting limited testing.
3) D̲a̲t̲a̲ ̲R̲e̲c̲o̲r̲d̲i̲n̲g̲.̲ Indicate data recording requirements
including those data types not normally recovered
from the system.
4) S̲y̲s̲t̲e̲m̲ ̲T̲e̲s̲t̲ ̲C̲o̲n̲s̲t̲r̲a̲i̲n̲t̲s̲
a Indicate the anticipated limitations imposed
on the test due to system or test conditions.
b Address limitations on the following:
1̲. Timing
2̲. Interface
3̲. Equipment
4̲. Personnel
5̲. Data base
c) T̲e̲s̲t̲ ̲P̲r̲o̲g̲r̲e̲s̲s̲i̲o̲n̲. In cases of progression or cumulative
tests, explain how progression is made from one
test to another so that the cycle of activity for
each test is completely accomplished.
1) T̲e̲s̲t̲ ̲D̲a̲t̲a̲ ̲C̲r̲i̲t̲e̲r̲i̲a̲. Describe the rules by
which the test results will be evaluated.
For example:
a Tolerance (range over which a data value
output by a system performance parameter
can vary and still be considered acceptable).
b Samples (the minimum number of combinations
or alternatives of input conditions and
output conditions that can be exercised
to constitute an acceptable test of the
parameters involved).
c Counts (the maximum number of interrupts,
halts, or other system breaks which may
occur due to non-test conditions).
2) T̲e̲s̲t̲ ̲D̲a̲t̲a̲ ̲R̲e̲d̲u̲c̲t̲i̲o̲n̲. Describe the technique
to be used for manipulation of the raw test
data into a form suitable for evaluation.
The available techniques would include:
a Manual (manual collection and collation
of system test outputs into test sequence
order followed by visual inspection of
the results).
b Semi-automatic (automatic inspection of
the test results as obtained by data recording
means using a test data reduction program
followed by manual (visual) inspection
of selected test results which do not lend
themselves to complete reduction by automatic
means).
c Automatic (automatic inspection of test
results specially recorded for manipulation
by the test data reduction program. Test
results, as recorded, include all items
of test significance. The test data reduction
program contains an image of correct data
output for an item-by-item comparison of
data and provides a summary of an evaluated
test as output).
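As an illustration of the automatic technique, the following sketch (Python, with hypothetical data items; not part of any deliverable test data reduction program) performs the item-by-item comparison of recorded output against an image of the correct output and summarises the evaluated test:

```python
# Illustrative sketch of "automatic" test data reduction, assuming recorded
# output and the expected-output image are sequences of (item, value) pairs.

def reduce_automatic(recorded, expected):
    """Item-by-item comparison of recorded output against the correct-output
    image; returns a summary suitable for test evaluation."""
    mismatches = [
        (item, got, want)
        for (item, got), (_, want) in zip(recorded, expected)
        if got != want
    ]
    complete = (len(recorded) == len(expected))
    return {
        "items_compared": min(len(recorded), len(expected)),
        "mismatches": mismatches,
        "evaluated": "PASS" if not mismatches and complete else "FAIL",
    }

expected = [("MSG-001", "DELIVERED"), ("MSG-002", "DELIVERED")]
recorded = [("MSG-001", "DELIVERED"), ("MSG-002", "MISSENT")]
print(reduce_automatic(recorded, expected))  # FAIL: MSG-002 missent
```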
4.1.3 T̲e̲s̲t̲ ̲P̲r̲o̲c̲e̲d̲u̲r̲e̲s̲
The Test Procedures shall be written for experienced
test engineers and they shall include the following
information when applicable.
a) T̲e̲s̲t̲ ̲D̲e̲s̲c̲r̲i̲p̲t̲i̲o̲n̲.̲ Provide a description of the
test to be performed.
b) T̲e̲s̲t̲ ̲C̲o̲n̲t̲r̲o̲l̲.̲
1) System Test means of control. Indicate how
the test is to be controlled. For example:
a Manual means; i.e. manual insertion of
necessary inputs and manual control of
test sequence.
b Semi-automatic means; i.e. manual insertion
of necessary inputs and automatic (test
program) control of test sequence.
c Automatic means; i.e. preparation and
use of a special test program to provide
necessary inputs, conduct tests, monitor,
and record test results.
2) T̲e̲s̲t̲ ̲D̲a̲t̲a̲.̲
a I̲n̲p̲u̲t̲ ̲D̲a̲t̲a̲.̲ Describe the manner in which
input data are controlled to:
1̲ Test the system with a minimum of data
types and values.
2̲ Exercise the system with a range of
bona fide data types and values which
test for overload, saturation, and other
"worse case" effects.
3̲ Exercise the system with bogus data
types and values which test for rejection
of irregular inputs.
b I̲n̲p̲u̲t̲ ̲C̲o̲m̲m̲a̲n̲d̲s̲.̲ Describe how input commands
control:
1̲ Test initialization
2̲ Test halt or interrupt
3̲ The repeat of an unsuccessful or incomplete
test.
4̲ Alternate modes of operation as required.
5̲ Test termination.
c O̲u̲t̲p̲u̲t̲ ̲D̲a̲t̲a̲.̲ Describe the controls placed
on output data to:
1̲ Detect occurrence (or ultimate non-occurrence)
of output data (as event) for indication
of test completion.
2̲ Record or identify permanent location
of output data (in entirety) for indication
of test performance.
3̲ Evaluate output based on test sequence.
4̲ Compare test output against required
output to assess the performance of
the test.
d O̲u̲t̲p̲u̲t̲ ̲N̲o̲t̲i̲f̲i̲c̲a̲t̲i̲o̲n̲.̲ Describe the manner
in which output notifications are controlled
in order to:
1̲ Indicate readiness for test (normal
operation condition).
2̲ Provide indications of irregularities
in input test data or test data base
due to normal or erroneous test procedures.
3̲ Provide indications of irregularities
in internal operations on test data
due to normal or erroneous test procedures.
4̲ Provide indications on the control,
status, and results of the test as available
from an auxiliary test supervisor program
(if used).
c) T̲e̲s̲t̲ ̲P̲r̲o̲c̲e̲d̲u̲r̲e̲s̲
1) Test Set-up.
2) Test Initialization.
3) Test Steps to be performed by the CR test engineer.
4) Test Termination.
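The sketch below (Python, illustrative only; the step functions are hypothetical placeholders, not CAMPS test steps) shows one way a procedure of this shape can be driven in the set-up, initialization, steps, and termination order listed above:

```python
# Illustrative sketch of driving one test procedure in the order listed
# above. All callables here are hypothetical placeholders.

def run_procedure(name, setup, initialize, steps, terminate):
    """Execute one test procedure; stop at the first failing step."""
    setup()
    initialize()
    results = []
    for step_no, (action, expected) in enumerate(steps, start=1):
        ok = (action() == expected)
        results.append((step_no, ok))
        if not ok:
            break  # an unsuccessful step may be repeated or the test halted
    terminate()
    return name, results

outcome = run_procedure(
    "sample procedure",
    setup=lambda: None,
    initialize=lambda: None,
    steps=[(lambda: "READY", "READY")],
    terminate=lambda: None,
)
print(outcome)  # ('sample procedure', [(1, True)])
```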
4.1.4 S̲c̲h̲e̲d̲u̲l̲e̲
The Acceptance Test Specification and Procedure
document will be submitted for SHAPE approval within
15 working days after the System Design Review.
4.2 D̲S̲M̲T̲ ̲(̲P̲R̲O̲T̲O̲T̲Y̲P̲E̲)̲ ̲S̲Y̲S̲T̲E̲M̲ ̲T̲E̲S̲T̲
4.2.1 P̲u̲r̲p̲o̲s̲e̲
The purpose of this test is to verify basic hardware
and system software functions.
4.2.2 P̲e̲r̲s̲o̲n̲n̲e̲l̲ ̲R̲e̲q̲u̲i̲r̲e̲m̲e̲n̲t̲s̲
This test will be conducted in the factory by CR test
engineers.
4.2.3 H̲a̲r̲d̲w̲a̲r̲e̲ ̲R̲e̲q̲u̲i̲r̲e̲m̲e̲n̲t̲s̲
The DSMT system shall comprise a hardware configuration
functionally similar to a typical CAMPS configuration.
4.2.4 S̲o̲f̲t̲w̲a̲r̲e̲ ̲R̲e̲q̲u̲i̲r̲e̲m̲e̲n̲t̲s̲
The software required for conducting this test is the
DAMOS software necessary to provide the following facilities:
1) Multiprogramming and scheduling at run time.
2) Store management.
3) Device control and interrupt response.
4) Interprocess interfaces.
5) Fault handling.
6) Maintenance and diagnostics (M&D).
7) General purpose test drive.
Additional special purpose test drive software may
be required.
4.2.5 F̲u̲n̲c̲t̲i̲o̲n̲a̲l̲ ̲D̲e̲s̲c̲r̲i̲p̲t̲i̲o̲n̲
By means of the M&D and the test drive software, all
hardware and interface functions shall be verified.
After a complete verification of hardware functions,
the system software functions mentioned in 1) to 6)
above shall be verified.
4.2.6 A̲c̲c̲e̲p̲t̲a̲n̲c̲e̲ ̲C̲r̲i̲t̲e̲r̲i̲a̲
Verification of all basic hardware and system software
functions in compliance with agreed test procedures
shall establish the acceptance criteria.
4.2.7 S̲c̲h̲e̲d̲u̲l̲e̲
The schedule outline is shown overleaf.
DSMT (PROTOTYPE) TEST
4.3 F̲A̲C̲T̲O̲R̲Y̲ ̲A̲C̲C̲E̲P̲T̲A̲N̲C̲E̲ ̲T̲E̲S̲T̲
4.3.1 P̲u̲r̲p̲o̲s̲e̲
The purpose of this test is to verify that all hardware
fulfils its functional specifications. The test constitutes
customer's acceptance of the hardware prior to shipment
to site.
4.3.2 P̲e̲r̲s̲o̲n̲n̲e̲l̲ ̲R̲e̲q̲u̲i̲r̲e̲m̲e̲n̲t̲s̲
This test will be conducted in the factory by CR test
engineers.
4.3.3 H̲a̲r̲d̲w̲a̲r̲e̲ ̲R̲e̲q̲u̲i̲r̲e̲m̲e̲n̲t̲s̲
For each CAMPS site configuration, the test shall be
performed on all CAMPS equipment to be delivered to
the site.
4.3.4 S̲o̲f̲t̲w̲a̲r̲e̲ ̲R̲e̲q̲u̲i̲r̲e̲m̲e̲n̲t̲s̲
System M&D-software and the relevant special purpose
test drive software developed for the DSMT test.
4.3.5 F̲u̲n̲c̲t̲i̲o̲n̲a̲l̲ ̲D̲e̲s̲c̲r̲i̲p̲t̲i̲o̲n̲
When a CAMPS has been integrated at the factory, the
functional specifications for the hardware system shall
be verified.
4.3.6 A̲c̲c̲e̲p̲t̲a̲n̲c̲e̲ ̲C̲r̲i̲t̲e̲r̲i̲a̲
A verification of all specified hardware requirements
in compliance with agreed test procedures shall establish
the acceptance criteria.
4.3.7 S̲c̲h̲e̲d̲u̲l̲e̲
A period of 3 months is scheduled for the factory acceptance
test of each individual CAMPS Site Configuration.
Of this 3-month period, the first 2 months are planned
for test conductance and system debugging. The last
month is planned for test analysis and report preparation.
The schedule outline is shown overleaf.
FACTORY ACCEPTANCE TEST SCHEDULE
4.4 C̲O̲M̲S̲E̲C̲ ̲I̲N̲-̲P̲L̲A̲N̲T̲ ̲V̲E̲R̲I̲F̲I̲C̲A̲T̲I̲O̲N̲
4.4.1 P̲u̲r̲p̲o̲s̲e̲
This test shall verify that the electromagnetic radiation
from the equipment is in compliance with the requirements
as described in the SRS para. 3.5.2.10.
4.4.2 P̲e̲r̲s̲o̲n̲n̲e̲l̲ ̲R̲e̲q̲u̲i̲r̲e̲m̲e̲n̲t̲s̲
The test shall be conducted by and under the responsibility
of the ACE COMSEC authorities.
CR test engineers will, if necessary, assist during
the test conductance.
4.4.3 H̲a̲r̲d̲w̲a̲r̲e̲ ̲R̲e̲q̲u̲i̲r̲e̲m̲e̲n̲t̲s̲
The test will be a qualification test performed on
a representative CAMPS configuration.
4.4.4 S̲o̲f̲t̲w̲a̲r̲e̲ ̲R̲e̲q̲u̲i̲r̲e̲m̲e̲n̲t̲
System M&D software and the relevant special purpose
test drive software developed for the DSMT-test.
4.4.5 F̲u̲n̲c̲t̲i̲o̲n̲a̲l̲ ̲D̲e̲s̲c̲r̲i̲p̲t̲i̲o̲n̲
To be submitted by the purchaser.
4.4.6 A̲c̲c̲e̲p̲t̲a̲n̲c̲e̲ ̲C̲r̲i̲t̲e̲r̲i̲a̲
Verification of the equipment compliance with the requirements
described in the SRS para. 3.5.2.10.
4.4.7 S̲c̲h̲e̲d̲u̲l̲e̲
The test shall be performed during the factory acceptance
test period. Due to possible interference from electromagnetic
background radiation, the test must be conducted between
0100 and 0600 hours. The test period is scheduled
to be 820401 - 820531.
4.5 A̲V̲A̲I̲L̲A̲B̲I̲L̲I̲T̲Y̲ ̲V̲E̲R̲I̲F̲I̲C̲A̲T̲I̲O̲N̲
4.5.1 P̲u̲r̲p̲o̲s̲e̲
This task shall be performed as part of the R & M programme
in order to demonstrate that the specified availability
requirements are met.
4.5.2 P̲e̲r̲s̲o̲n̲n̲e̲l̲ ̲R̲e̲q̲u̲i̲r̲e̲m̲e̲n̲t̲s̲
This task will be undertaken by CR system and hardware
engineers.
4.5.3 H̲a̲r̲d̲w̲a̲r̲e̲ ̲R̲e̲q̲u̲i̲r̲e̲m̲e̲n̲t̲s̲
The DSMT hardware configuration and all subsequent
integrated and burned-in CAMPS systems will be used
for the tests and observations needed for conducting
this task.
4.5.4 S̲o̲f̲t̲w̲a̲r̲e̲ ̲R̲e̲q̲u̲i̲r̲e̲m̲e̲n̲t̲s̲
Minimum software requirements are the M&D software.
4.5.5 F̲u̲n̲c̲t̲i̲o̲n̲a̲l̲ ̲D̲e̲s̲c̲r̲i̲p̲t̲i̲o̲n̲
a) The availability and reliability performance characteristics
specified in SRS para. 3.4.4 shall be demonstrated
by performing availability measurements/calculations,
in which the values of the module/unit MTBFs and
MTTRs are verified by:
1) Testing/analysis in accordance with SRS para.
4.2.1.7.1, 4.2.1.7.2 and 4.2.1.7.3 or
2) Provision of acceptable evidence of conformity
to requirements, i.e. evidence supported by
prior observation of system or module performance
over a length of time sufficient to have the
same degree of confidence as the test procedure
that the required availability can be achieved.
3) Availability/reliability/maintainability observations
which the contractor wishes to submit in lieu
of testing shall be provided as part of the
R&M Plan for agreement by the purchaser.
b) The set of units and modules, taken together, shall
make up the CAMPS equipment of the R&M model(s)
used for verification of system availability. A
definition of the terms module and unit is given
in the SRS para 3.4.4.7.l) and m).
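For illustration only, the sketch below applies the standard steady-state formula A = MTBF / (MTBF + MTTR) and a simple series model. The actual R&M model is that of the SRS and the R&M Plan (CPS/PLN/004), and the module figures used here are invented:

```python
# Illustrative availability arithmetic, assuming the standard steady-state
# formula A = MTBF / (MTBF + MTTR) and a series model in which all units
# must be up. The (MTBF, MTTR) pairs below are hypothetical.

def availability(mtbf_hours, mttr_hours):
    """Steady-state availability of one unit."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def series_availability(units):
    """All units must be up for a series configuration to be up."""
    a = 1.0
    for mtbf, mttr in units:
        a *= availability(mtbf, mttr)
    return a

units = [(5000.0, 0.5), (12000.0, 1.0)]  # hypothetical (MTBF, MTTR) in hours
print(f"system availability: {series_availability(units):.6f}")
```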
4.5.6 A̲c̲c̲e̲p̲t̲a̲n̲c̲e̲ ̲C̲r̲i̲t̲e̲r̲i̲a̲
Will be specified in the R&M Program Plan (CPS/PLN/004).
4.5.7 S̲c̲h̲e̲d̲u̲l̲e̲
The task schedule is shown overleaf.
AVAILABILITY VERIFICATION
4.6 I̲N̲-̲P̲L̲A̲N̲T̲ ̲S̲O̲F̲T̲W̲A̲R̲E̲ ̲V̲E̲R̲I̲F̲I̲C̲A̲T̲I̲O̲N̲
4.6.1 P̲u̲r̲p̲o̲s̲e̲
The purpose of this test is to give an initial verification
of the functional and operational performance of an
integrated CAMPS. The test will be performed in a
simulated environment.
4.6.2 P̲e̲r̲s̲o̲n̲n̲e̲l̲ ̲R̲e̲q̲u̲i̲r̲e̲d̲
This test will be conducted in the factory by CR test
engineers.
4.6.3 H̲a̲r̲d̲w̲a̲r̲e̲ ̲R̲e̲q̲u̲i̲r̲e̲d̲
The hardware required for conducting this test is:
1) T̲h̲e̲ ̲t̲e̲s̲t̲ ̲o̲b̲j̲e̲c̲t̲, which is the hardware comprised
in the first site (SHAPE) CAMPS installation.
2) T̲h̲e̲ ̲s̲u̲p̲p̲o̲r̲t̲ ̲e̲q̲u̲i̲p̲m̲e̲n̲t̲, which is the necessary hardware
required to simulate the CAMPS environment as specified
in SRS para. 3.5.11.5.2.b-c. In addition to simulating
CAMPS environment, the support equipment will be
able to log and store all CAMPS output.
4.6.4 S̲o̲f̲t̲w̲a̲r̲e̲ ̲R̲e̲q̲u̲i̲r̲e̲d̲
1) T̲h̲e̲ ̲t̲e̲s̲t̲ ̲o̲b̲j̲e̲c̲t̲, which is the complete CAMPS software
system.
2) T̲h̲e̲ ̲s̲u̲p̲p̲o̲r̲t̲ ̲s̲o̲f̲t̲w̲a̲r̲e̲, which is the necessary software
required to give the support equipment the functions
specified in SRS para. 3.5.11.5.2.b-c. In addition
to simulating the CAMPS environment, the support
software shall provide facilities for monitoring
the functional and operational performance of the
CAMPS.
4.6.5 F̲u̲n̲c̲t̲i̲o̲n̲a̲l̲ ̲D̲e̲s̲c̲r̲i̲p̲t̲i̲o̲n̲
The verification of the CAMPS functional and operational
performance is accomplished by loading the CAMPS input
channels with pseudo-random simulated operational traffic
with characteristics, distribution and throughput as
specified in the SRS para. 3.4.1.1 and 3.4.1.2 and
with a load as specified in the SRS para. 3.5.11.5.2.b-c.
The Test Drive System will log and store all output
from CAMPS.
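For illustration, the sketch below generates pseudo-random message traffic with exponential inter-arrival times. The real characteristics, distribution and throughput are those of SRS para. 3.4.1.1 and 3.4.1.2; the rates and sizes used here are invented:

```python
# Illustrative sketch of pseudo-random traffic generation; the rate and
# size figures are invented, not the SRS-specified load.
import random

random.seed(820510)  # fixed seed so a run is reproducible

def generate_traffic(duration_s, msgs_per_hour, mean_size_chars):
    """Yield (time, size) message events with exponential inter-arrival times."""
    rate_per_s = msgs_per_hour / 3600.0
    t = 0.0
    while True:
        t += random.expovariate(rate_per_s)
        if t > duration_s:
            return
        yield t, max(1, int(random.expovariate(1.0 / mean_size_chars)))

for when, size in generate_traffic(duration_s=60.0, msgs_per_hour=600,
                                   mean_size_chars=1500):
    print(f"t={when:7.2f}s  size={size} chars")
```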
Any deviation from the CAMPS's specified functional and
operational behaviour will, according to its nature,
be categorized as follows:
a) C̲a̲t̲e̲g̲o̲r̲y̲ ̲1̲
- Loss of a message or other transaction.
- Corruption of a message or other transaction.
- A message or transaction being missent.
- Misapplication of security rules.
- Failure of accounting procedures.
b) C̲a̲t̲e̲g̲o̲r̲y̲ ̲2̲
1) loss of service to more than 25% of all channels
and user connecting points.
2) loss of service to more than 50% of all operating
positions or system reporting facilities.
3) loss of ability to recover the system.
c) C̲a̲t̲e̲g̲o̲r̲y̲ ̲3̲
1) errors not included in categories 1 and 2 above.
This test together with the preceding test activity
during System Integration and Test will form the basis
for the calculations of the MDSD probability.
4.6.6 A̲c̲c̲e̲p̲t̲a̲n̲c̲e̲ ̲C̲r̲i̲t̲e̲r̲i̲a̲
The acceptance criteria for this test require an accumulated
test time of 240 hours with an allowable total of:
1 failure of category 1 and
2 failures of category 2 and
3 failures of category 3.
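The bookkeeping implied by these criteria can be sketched as follows (Python, illustrative only; the failure-log entries are hypothetical and the categories are those defined in para. 4.6.5):

```python
# Illustrative sketch of the 240-hour acceptance bookkeeping, assuming a
# simple failure log of (accumulated_test_hour, category) entries.

ALLOWED = {1: 1, 2: 2, 3: 3}  # allowable failures per category

def evaluate(hours_accumulated, failure_log):
    """Count failures per category and check the 240-hour criteria."""
    counts = {1: 0, 2: 0, 3: 0}
    for _hour, category in failure_log:
        counts[category] += 1
    criteria_met = (hours_accumulated >= 240
                    and all(counts[c] <= ALLOWED[c] for c in ALLOWED))
    return counts, criteria_met

log = [(35.0, 3), (118.5, 2), (190.0, 3)]  # hypothetical observations
print(evaluate(240.0, log))  # ({1: 0, 2: 1, 3: 2}, True)
```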
4.6.7 S̲c̲h̲e̲d̲u̲l̲e̲
The 240 hour test period will be divided into three
parts.
During the first 120 hours, the individual functions
of the system will be verified by means of manual inputs
and the Test Drive System.
During the subsequent 80 hours, the system will be
load tested in accordance with the specifications of
SRS section 3.4.1.1 and 3.4.1.2.
During the last 40 hours, the system will be load tested
with additional simulation of transmission media failure
and noise. The Test Drive system will be used to exercise
the system.
The schedule for the Test is shown overleaf.
IN-PLANT SOFTWARE TEST
4.7 S̲O̲F̲T̲W̲A̲R̲E̲ ̲F̲U̲N̲C̲T̲I̲O̲N̲A̲L̲ ̲V̲E̲R̲I̲F̲I̲C̲A̲T̲I̲O̲N̲,̲ ̲S̲I̲T̲E̲ ̲1̲
4.7.1 P̲u̲r̲p̲o̲s̲e̲
The purpose of this test is to give a verification
of the basic functional performance of the first CAMPS
site installation.
4.7.2 P̲e̲r̲s̲o̲n̲n̲e̲l̲ ̲R̲e̲q̲u̲i̲r̲e̲m̲e̲n̲t̲s̲
The CAMPS shall be operated by members of SHAPE operational
staff under supervision of CR test engineers.
4.7.3 H̲a̲r̲d̲w̲a̲r̲e̲ ̲R̲e̲q̲u̲i̲r̲e̲m̲e̲n̲t̲s̲
The hardware required for conducting the test is:
1) The hardware comprised in the specified first site
(SHAPE) CAMPS installation.
2) The connecting points for the external interfaces.
(NICS-TARE, ACE CCIS, SCARS II, TRC, and point
to point connections.)
4.7.4 S̲o̲f̲t̲w̲a̲r̲e̲ ̲R̲e̲q̲u̲i̲r̲e̲m̲e̲n̲t̲s̲
The software required for conducting this test is the
basic CAMPS software system.
4.7.5 F̲u̲n̲c̲t̲i̲o̲n̲a̲l̲ ̲D̲e̲s̲c̲r̲i̲p̲t̲i̲o̲n̲
The verification of the CAMPS basic functional performance
is accomplished by conducting a complete exercise of
all specified functions.
The exercise of the system will be based on agreed
test procedures.
4.7.6 A̲c̲c̲e̲p̲t̲a̲n̲c̲e̲ ̲C̲r̲i̲t̲e̲r̲i̲a̲
The acceptance criteria for this test are that the
CAMPS shall be able to execute all basic functions
in compliance with the SRS and the agreed test procedures.
In case one or more of the interfaces (NICS-TARE, ACE
CCIS or SCARS II) are not available, the test shall
be conducted with all of the components of the system
that are independent of the missing interfaces.
4.7.7 S̲c̲h̲e̲d̲u̲l̲e̲
The test will start immediately after installation
and will progress until all the agreed test procedures
have been verified. The duration of the test may be
up to three months. The schedule is shown overleaf.
SOFTWARE FUNCTIONAL TEST
4.8 S̲O̲F̲T̲W̲A̲R̲E̲ ̲O̲P̲E̲R̲A̲T̲I̲O̲N̲A̲L̲ ̲T̲E̲S̲T̲,̲ ̲S̲I̲T̲E̲ ̲1̲
4.8.1 P̲u̲r̲p̲o̲s̲e̲
The purpose of this test is to give a final verification
of the CAMPS operational performance and the reliability
of the software system.
4.8.2 P̲e̲r̲s̲o̲n̲n̲e̲l̲ ̲R̲e̲q̲u̲i̲r̲e̲m̲e̲n̲t̲s̲
Same as under para. 4.7.2.
4.8.3 H̲a̲r̲d̲w̲a̲r̲e̲ ̲R̲e̲q̲u̲i̲r̲e̲m̲e̲n̲t̲s̲
Same as under para. 4.7.3.
4.8.4 S̲o̲f̲t̲w̲a̲r̲e̲ ̲R̲e̲q̲u̲i̲r̲e̲m̲e̲n̲t̲s̲
Same as under para 4.7.4.
4.8.5 F̲u̲n̲c̲t̲i̲o̲n̲a̲l̲ ̲D̲e̲s̲c̲r̲i̲p̲t̲i̲o̲n̲
The verification of the CAMPS operational performance
and software reliability is accomplished by operating
the system in a live operational mode.
The test will take place under CR supervision, but
SHAPE personnel will operate the system in accordance
with agreed test procedures and within the specified
traffic loads and characteristics.
4.8.6 A̲c̲c̲e̲p̲t̲a̲n̲c̲e̲ ̲C̲r̲i̲t̲e̲r̲i̲a̲
In defining the acceptance criteria for this test, reference
shall be made to the definitions of software error
categories in para. 4.6.5.
Software errors of category 3 shall here be further
subdivided into 3 subcategories:
Category 3.1) Software errors which adversely affect
the accomplishment of the CAMPS functional
and operational behaviour, so as to
degrade performance, but for which there
exist reasonable alternative work-around
procedures.
Category 3.2) Software errors which cause user inconvenience
or annoyance, but do not affect the
system's functional performance.
Category 3.3) Errors not included in 3.1 and 3.2.
A software failure is defined as an occurrence of any
of the deviations from CAMPS specified functional and
operational performance which are described under category
1, 2 and 3.3.
The acceptance criteria shall be that the CAMPS system
shall be able to execute the basic software functions
according to the approved SRS for 1 period of 15 days
and 2 periods of 20 calendar days in a live operational
mode without any software failures.
The initial 5 days of the test period shall be used
for training of SHAPE's operational staff.
Any software failure shall require a complete re-test
cycle.
Software errors of category 3.1 and 3.2 shall not require
a restart of the test. However, the errors shall be
corrected and verified before final acceptance.
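As an illustration of the retest-cycle rule (Python; as a simplification this treats the required periods as one run of consecutive failure-free calendar days, and the event dates are hypothetical):

```python
# Illustrative sketch of the retest-cycle rule above: software failures
# (categories 1, 2 and 3.3) restart the sequence of failure-free periods,
# while category 3.1/3.2 errors are logged for correction but do not.

REQUIRED_PERIODS = [15, 20, 20]          # calendar days, in order
RESTART_CATEGORIES = {"1", "2", "3.3"}   # these constitute software failures

def passed(events, total_days):
    """events: sorted (day, category) list over the live operational run."""
    failure_days = [d for d, c in events if c in RESTART_CATEGORIES]
    start = max(failure_days, default=0)  # cycle restarts after last failure
    clean = total_days - start            # failure-free days since then
    return clean >= sum(REQUIRED_PERIODS)

events = [(9, "3.2"), (22, "1")]          # the day-22 failure forces a retest
print(passed(events, total_days=80))      # True: 58 clean days >= 55
```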
4.8.7 S̲c̲h̲e̲d̲u̲l̲e̲
The test schedule is shown overleaf.
SOFTWARE OPERATIONAL TEST
4.9 S̲O̲F̲T̲W̲A̲R̲E̲ ̲V̲E̲R̲I̲F̲I̲C̲A̲T̲I̲O̲N̲,̲ ̲S̲I̲T̲E̲ ̲2̲-̲1̲6̲
4.9.1 P̲u̲r̲p̲o̲s̲e̲
The purpose of this test is to give a site related
verification of the CAMPS functional and operational
performance.
4.9.2 P̲e̲r̲s̲o̲n̲n̲e̲l̲ ̲r̲e̲q̲u̲i̲r̲e̲m̲e̲n̲t̲s̲
Same as under para. 4.7.2
4.9.3 H̲a̲r̲d̲w̲a̲r̲e̲ ̲R̲e̲q̲u̲i̲r̲e̲m̲e̲n̲t̲s̲
Same as under para 4.7.3.
4.9.4 S̲o̲f̲t̲w̲a̲r̲e̲ ̲R̲e̲q̲u̲i̲r̲e̲m̲e̲n̲t̲s̲
Same as under para. 4.7.4.
4.9.5 F̲u̲n̲c̲t̲i̲o̲n̲a̲l̲ ̲D̲e̲s̲c̲r̲i̲p̲t̲i̲o̲n̲
The verification of the CAMPS functional performance
is accomplished by performing a sub-set of the tests
outlined in para. 4.7.
4.9.6 A̲c̲c̲e̲p̲t̲a̲n̲c̲e̲ ̲C̲r̲i̲t̲e̲r̲i̲a̲
The verification of the functional performance in compliance
with agreed test procedures shall establish the acceptance
criteria.
4.9.7 S̲c̲h̲e̲d̲u̲l̲e̲
The software verification is scheduled to be conducted
following conclusion of the installation task and immediately
prior to Site Provisional Acceptance.
The planned time period for this test task is 2 weeks:
1 week for conductance of test procedures.
1 week for test analysis and preparation of the test
report.
The overall task schedule is shown overleaf.
SOFTWARE VERIFICATION, SITE 2-16
4.10 C̲O̲M̲S̲E̲C̲ ̲O̲N̲-̲S̲I̲T̲E̲ ̲V̲E̲R̲I̲F̲I̲C̲A̲T̲I̲O̲N̲
4.10.1 P̲u̲r̲p̲o̲s̲e̲
This test shall verify that the electromagnetic radiation
from all on-site installed CAMPS equipment is in compliance
with the requirements described in the SRS para. 3.5.2.10.
4.10.2 P̲e̲r̲s̲o̲n̲n̲e̲l̲ ̲R̲e̲q̲u̲i̲r̲e̲m̲e̲n̲t̲s̲
The test is to be conducted by and under the responsibility
of ACE COMSEC.
Purchaser will supervise the verification and CR test
engineers will assist in operating the system during
the test.
4.10.3 H̲a̲r̲d̲w̲a̲r̲e̲ ̲R̲e̲q̲u̲i̲r̲e̲m̲e̲n̲t̲s̲
All on-site installed CAMPS equipment.
4.10.4 S̲o̲f̲t̲w̲a̲r̲e̲ ̲R̲e̲q̲u̲i̲r̲e̲m̲e̲n̲t̲s̲
The complete CAMPS software system.
4.10.5 F̲u̲n̲c̲t̲i̲o̲n̲a̲l̲ ̲D̲e̲s̲c̲r̲i̲p̲t̲i̲o̲n̲
Test specification and procedures shall be submitted
by the purchaser.
4.10.6 A̲c̲c̲e̲p̲t̲a̲n̲c̲e̲ ̲C̲r̲i̲t̲e̲r̲i̲a̲
The verification of the equipment compliance with requirements
stated in SRS para. 3.5.2.10 will establish the acceptance
criteria.
4.10.7 S̲c̲h̲e̲d̲u̲l̲e̲
The verification shall take place in conjunction with
the SPA. The schedule is shown under para. 4.11.7.
4.11 S̲I̲T̲E̲ ̲P̲R̲O̲V̲I̲S̲I̲O̲N̲A̲L̲ ̲A̲C̲C̲E̲P̲T̲A̲N̲C̲E̲ ̲(̲S̲P̲A̲)̲
4.11.1 P̲u̲r̲p̲o̲s̲e̲
Site provisional acceptance is the act whereby SHAPE
will acknowledge by protocol that CR has fully demonstrated
that a site is complete and ready for initial operation.
4.11.2 R̲e̲q̲u̲i̲r̲e̲m̲e̲n̲t̲s̲
The SPA will take place when the following requirements
have been met:
1) Completion of the agreed on-site acceptance test
specified in para 4.7, 4.8, and 4.10 for site 1
and para 4.9 and 4.10 for the remaining sites.
Conductance of the mentioned acceptance tests assumes
the availability of the connecting points for the
external interfaces (NICS TARE, ACE CCIS, SCARS
II, TRC and Point-to-Point connections). However,
in the event one or more of the interfaces are
not available, the contractor shall proceed with
all of those components of the system that are independent
of the missing interfaces.
In this event SHAPE will give site provisional
acceptance, notwithstanding the inability to test
all interfaces, providing all other aspects of
the test are satisfactorily completed.
2) Verification of the site inventory.
3) If applicable, availability of a mutually agreed
discrepancy list showing the agreed date for clearance
of each listed discrepancy.
4.11.3 F̲o̲r̲m̲a̲l̲ ̲P̲r̲o̲c̲e̲d̲u̲r̲e̲
SHAPE will either sign a SPA Acceptance Protocol or
deliver to CR in writing a "Notice of Defects" listing
the defects which prevented provisional acceptance
of the site.
The date of signature of the SPA Acceptance Protocol
marks the date of the SPA.
If SHAPE delivers a "Notice of Defects" then the date
on which SHAPE signs a "Correction of Defects Notice"
shall be the date of the SPA.
"Defects", as referred to above, means any non-compliance
with the contract by CR which prevents the normal operation
of the site. "Discrepancies" shall mean a non-compliance
which does not prevent completion of site provisional
acceptance. However, any discrepancies existing at
the time of signing the Site Provisional Acceptance
Protocol will be listed in the protocol and dates for
correction of each will also be included.
The date of the SPA marks the time at which SHAPE takes
over the responsibility of a site and the operation
of the site can commence. Prior to SPA, all operation
by SHAPE of the site shall be approved by CR, except
for such operation referred to in para. 4.8 above.
4.11.4 S̲c̲h̲e̲d̲u̲l̲e̲
The SPA's are scheduled to take place as shown overleaf.
SITE PROVISIONAL ACCEPTANCE.