The title page shall contain the information identified below in the indicated format.

This specification shall contain a table of contents listing the title and page number of each titled paragraph and subparagraph. The table of contents shall then list the title and page number of each figure, table, and appendix, in that order.

This section shall be numbered 1 and shall be divided into the following paragraphs.
This appendix contains requirements for a category and priority classification scheme to be applied to all problems detected in the deliverable software or its documentation that has been placed under contractor configuration control. The requirements specified in this appendix are a mandatory part of this standard.

Problems detected during software operation shall be classified by category as follows:

Software problem. The software does not operate according to supporting documentation, and the documentation is correct.
Documentation problem. The software does not operate according to supporting documentation, but the software operation is correct.

Design problem. The software operates according to supporting documentation, but a design deficiency exists. The design deficiency may not always result in a directly observable operational symptom but possesses the potential for causing further problems.

Problems detected in the software or its documentation shall be classified by priority as follows:

Priority 1. A software problem that prevents the accomplishment of an operational or mission essential capability, or jeopardizes personnel safety.

Priority 2. A software problem that adversely affects the accomplishment of an operational or mission essential capability and for which no alternative work-around solution is known.

Priority 3. A software problem that adversely affects the accomplishment of an operational or mission essential capability but for which an alternative work-around solution is known.

Priority 4. A software problem that is an operator inconvenience or annoyance and which does not affect a required operational or mission essential capability.
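The two-axis classification above (category describing what is wrong, priority describing operational impact) can be sketched as a small data model. The names `Category`, `Priority`, and `ProblemReport` are illustrative choices for this sketch, not identifiers defined by the standard.

```python
from dataclasses import dataclass
from enum import Enum


class Category(Enum):
    """Classification by what is wrong, per the scheme above."""
    SOFTWARE = "software does not match correct documentation"
    DOCUMENTATION = "documentation does not match correct software"
    DESIGN = "software matches documentation, but a design deficiency exists"


class Priority(Enum):
    """Classification by operational impact, 1 (most severe) to 5."""
    P1 = 1  # prevents a mission essential capability or jeopardizes safety
    P2 = 2  # degrades a capability; no work-around known
    P3 = 3  # degrades a capability; a work-around is known
    P4 = 4  # operator inconvenience or annoyance only
    P5 = 5  # all other errors


@dataclass
class ProblemReport:
    """One problem record carrying both classifications."""
    identifier: str
    category: Category
    priority: Priority


# Example: a cosmetic software fault reported during operation.
report = ProblemReport("PR-0042", Category.SOFTWARE, Priority.P4)
```

Keeping category and priority as separate fields mirrors the standard's intent: the same software problem can be anywhere on the severity scale, and severity alone does not say whether the fix belongs in the code or the documentation.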
Priority 5. All other errors.

This appendix contains a default set of definitions for the evaluation criteria appearing in Figures 4 through . These definitions shall be implemented by the contractor if an alternative set has not been proposed in the Software Development Plan and accepted by the contracting agency.
The definitions specified in this appendix are a mandatory part of this standard. The following definitions are listed in the order that the criteria appear in Figures 4 through . For convenience, the definitions use the word "document" for the item being evaluated, even though in some instances the item being evaluated may be other than a document.
Internal consistency, as used in this standard, means that: no two statements in a document contradict one another; a given term, acronym, or abbreviation means the same thing throughout the document; and a given item or concept is referred to by the same name or description throughout the document.

Understandability, as used in this standard, means that the document uses rules of capitalization, punctuation, symbols, and notation consistent with those specified in the U.

Traceability, as used in this standard, means that the document in question is in agreement with a predecessor document to which it has a hierarchical relationship.
Consistency between documents, as used in this standard, means that two or more documents that are not hierarchically related are free from contradictions with one another. Elements of consistency are: no two statements contradict one another, a given term, acronym, or abbreviation means the same thing in the documents, and a given item or concept is referred to by the same name or description in the documents.
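One element of consistency, that a given term means the same thing across documents, lends itself to a mechanical check over document glossaries. This is a sketch under the assumption that each document's terms have been extracted into a term-to-definition mapping; `find_term_conflicts` is an illustrative name, not anything the standard defines.

```python
def find_term_conflicts(doc_glossaries):
    """Return terms whose definitions differ across documents.

    doc_glossaries maps a document name to a {term: definition} dict.
    A term is consistent only if every document that defines it gives
    the same definition (the "same meaning" rule stated above).
    """
    seen = {}       # term -> first definition encountered
    conflicts = {}  # term -> set of all differing definitions seen
    for glossary in doc_glossaries.values():
        for term, definition in glossary.items():
            if term in seen and seen[term] != definition:
                conflicts.setdefault(term, {seen[term]}).add(definition)
            else:
                seen.setdefault(term, definition)
    return conflicts


# Two non-hierarchical documents expanding the same acronym differently:
docs = {
    "SRS": {"CSCI": "Computer Software Configuration Item"},
    "SDD": {"CSCI": "Computer System Configuration Item"},
}
conflicts = find_term_conflicts(docs)
```

A check like this only catches divergent definitions; contradictory statements and inconsistent naming of items still require review of the prose itself.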
The contract may include provisions regarding the requirements, analysis, design, and coding techniques to be used. This criterion consists of compliance with the techniques specified in the contract and SDP.

This criterion, as used in this standard, means that: the amount of memory or time allocated to a given element does not exceed documented constraints applicable to that element, and the sum of the allocated amounts for all subordinate elements is within the overall allocation for an item.
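The two allocation rules just stated are a pair of arithmetic checks: each element against its own constraint, and the sum of all elements against the overall budget. A minimal sketch, in which `within_budget` and the example element names are hypothetical:

```python
def within_budget(allocations, overall):
    """Check the two allocation rules stated above.

    allocations maps an element name to (allocated, constraint):
      1. no element's allocation exceeds its documented constraint, and
      2. the subordinate allocations sum to no more than the overall
         allocation for the item.
    """
    each_ok = all(alloc <= limit for alloc, limit in allocations.values())
    total_ok = sum(alloc for alloc, _ in allocations.values()) <= overall
    return each_ok and total_ok


# Example memory budget (KB) for three subordinate elements of one item.
memory_kb = {
    "nav":   (120, 128),
    "comms": (60, 64),
    "ui":    (40, 64),
}
```

Note that both rules must hold independently: every element can be inside its own constraint while the total still overruns the item's allocation, which is exactly the case the second rule exists to catch.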
This criterion, as used in this standard, means that: every specified requirement is addressed by at least one test; test cases have been selected for both "average" situations and "boundary" situations, such as minimum and maximum values; "stress" cases have been selected, such as out-of-bound values; and test cases that exercise combinations of different functions are included.
This model is based on the premise that a life cycle is a living model with multiple disciplines and multiple, reconfigurable steps which are inherently iterative. This paper examines both standard and iterative software development life cycles, and addresses the compatibility of these life cycles and techniques with DOD-STD-2167A. One criticism of the standard was that it was biased toward the Waterfall Model, under which design flaws often surface only late in development; modifications to the design at that point become extremely expensive and are often deferred until a future release, if at all.