Section 6.1 Risk Assessment

Risk management is a continuous, iterative process used to identify, quantify, and monitor risks during each stage of the ICT acquisition process.  Risk assessment objectives are summarized in Figure 6.1 below.

A number of risk assessment tasks that focus on quality-driven outcomes are strongly recommended, such as:

  • Adherence to Required Format and
  • Conformance to Standards and Contractual Requirements.

In performing Risk Assessment, The IV&V Provider will:

  • identify and assess the overall risk of the ICT project from two (2) perspectives, i.e. Project Risk and Product Risk,
  • verify strategies for reducing all identified risks of high criticality and likelihood, focusing on technical, schedule, and cost risks in order to reduce negative impacts on the project,
  • perform continuous technical and programmatic analysis to track risks over time and ensure they are addressed,
  • perform a formal risk analysis at each stage of the ICT acquisition process, with brainstorming sessions to rank potential risks and note those requiring attention,
  • assist in preparing risk mitigation plans, track progress towards risk reduction, and support technical, process, and program risk resolution as tasked by The Agency.

Figure 6.2 below lists the typical Risk Assessment Evaluation Criteria to be adopted. Evaluation with respect to these criteria will consider whether:

  • Required components are included,
  • Components work and are in the required order,
  • Components contain the required content,
  • Product adheres to requirements regarding formatting and presentation, and
  • The Agency is in adherence to Statements of Work, the terms of work, applicable higher-level specifications, and relevant standards and specifications.

 

Adherence to Required Standards and Requirements
As a standard practice, criticality analysis of risks will be based on the following:

  • The potential consequences associated with a defect in, or failure of, the function, and
  • The likelihood of occurrence, based on the probability that the defect or failure will occur.

The IV&V Provider will document the assessment rationale and rank or weight both the criticality and the likelihood of risk for each issue.  The results of this analysis will be used to:

  • identify catastrophic, critical, and high-risk functions,
  • focus IV&V Provider resources on the most critical aspects of the ICT system design and development, and
  • keep all risks visible.
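
As an illustration only, the criticality and likelihood ranking described above could be recorded along the lines of the following sketch (written in Python; the five-point scales, field names, and scoring rule are assumptions made for illustration, not values prescribed by this section):

    from dataclasses import dataclass

    # Hypothetical five-point scales; the actual scales and any weighting are
    # agreed between The Agency and The IV&V Provider, not mandated here.
    CRITICALITY = {"negligible": 1, "marginal": 2, "serious": 3, "critical": 4, "catastrophic": 5}
    LIKELIHOOD = {"rare": 1, "unlikely": 2, "possible": 3, "likely": 4, "almost certain": 5}

    @dataclass
    class RiskItem:
        identifier: str
        criticality: str   # consequence of a defect in, or failure of, the function
        likelihood: str    # probability that the defect or failure occurs
        rationale: str     # documented assessment rationale

        def score(self) -> int:
            """Combine criticality and likelihood into a single ranking value."""
            return CRITICALITY[self.criticality] * LIKELIHOOD[self.likelihood]

    def rank_register(items: list[RiskItem]) -> list[RiskItem]:
        """Order the register so catastrophic, critical, high-likelihood items surface first."""
        return sorted(items, key=lambda item: item.score(), reverse=True)

Ranked in this way, the register keeps all risks visible while directing attention first to the catastrophic, critical, and high-risk functions.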

 

Internal Consistency
Additional tasks include assessing Internal Consistency, i.e. whether the items being evaluated are free of contradictions in either content or style.  Elements of consistency include:

  • consistent statements,
  • consistently defined terms throughout,
  • defined concepts, and
  • a consistent level of detail and presentation style throughout.

 

Understandability and Technical Adequacy
Understandability is a subjective but critical element of quality, focused on whether:

  • the writing follows generally accepted rules of grammar, capitalization, punctuation, symbols, and notation,
  • non-standard terms, phrases, acronyms, and abbreviations are defined,
  • the material presented can be interpreted in only one way, and
  • illustrations are adequately explained.

Technical adequacy covers reviews of whether:

  • the overall technical approach is sound,
  • the technical information provided does not violate known facts,
  • the technical approach adheres to best practices,
  • the technical approach is well researched or based on proven methods,
  • the technical approach appears well thought out, and
  • the technical approach makes sense both technically and practically.

 

Appropriate Degree of Completeness, Traceability, Consistency and Feasibility

  • Appropriate Degree of Completeness means that:
    • all constituent parts are present and
    • each part is addressed in adequate detail within the context of the overall ICT system.
  • Traceability means that the document or other item in question is in agreement with a predecessor to which it has a hierarchical relationship.
  • Consistency between documents means that two or more documents that are not hierarchically related are free from contradictions with one another.
  • Feasibility is the degree to which the design stated in a document or other item can be implemented given the state of the art, schedule and resource constraints, available tools and techniques, and other factors affecting the target ICT system’s procurement.
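
As a simple illustration of the Traceability criterion above (a sketch only; the identifier scheme and the trace map are assumed for illustration), a check of a document against its predecessor could flag both items that claim a non-existent parent and parent items that nothing traces to:

    def trace_gaps(parent_ids: set[str], trace: dict[str, str]) -> tuple[set[str], set[str]]:
        """Check a child document against its hierarchical predecessor.

        `trace` maps each child item identifier to the parent item it claims to satisfy.
        Returns (orphans, untraced): child items whose claimed parent does not exist,
        and parent items that no child item traces to.
        """
        orphans = {child for child, parent in trace.items() if parent not in parent_ids}
        untraced = parent_ids - set(trace.values())
        return orphans, untraced

    # Illustrative identifiers only.
    parent_ids = {"SRS-1", "SRS-2", "SRS-3"}
    trace = {"SDD-10": "SRS-1", "SDD-11": "SRS-9"}   # SDD-11 cites a parent that does not exist

    print(trace_gaps(parent_ids, trace))
    # orphans: {'SDD-11'}; untraced: {'SRS-2', 'SRS-3'} (set ordering may vary)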

 

Appropriate Requirement Analysis, Design, Coding Techniques Used
Appropriate requirement analysis, design, and coding techniques are assessed by evaluating the prepared items against:

  • industry accepted best practices,
  • the statement of work, and
  • the Development Team’s ICT project plan regarding system development activities.

 

Appropriate Level of Detail
Appropriate Level of Detail is a subjective criterion whose evaluation is based on the intended use of the document.  A document can err in either direction: 

  • a document that is supposed to provide requirements might be so detailed as to contain design data;
  • a document that is supposed to provide detailed design might be too high-level.

 

Adequacy of Test Coverage of Requirements
Adequacy of Test Coverage of Requirements applies to test planning documents prepared by public sector agency staff or the software Development Team.  The following aspects must be considered:

  • Is every requirement addressed by at least one test?
  • Have test suites been selected for an “average” situation as well as for “boundary” situations such as minimum and maximum values?
  • Have negative cases been selected, such as out-of-bounds values?
  • Have meaningful combinations of inputs been selected to ensure adequate coverage?
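
As an illustration of how the first of these questions could be checked mechanically (a minimal sketch; the requirement identifiers and the shape of the test records are assumptions, not a mandated format):

    def untested_requirements(requirements: set[str], tests: dict[str, set[str]]) -> set[str]:
        """Return requirement identifiers not addressed by at least one test.

        `tests` maps a test case name to the set of requirement identifiers it exercises.
        """
        covered: set[str] = set()
        for exercised in tests.values():
            covered |= exercised
        return requirements - covered

    # Illustrative data only; real identifiers come from the requirements specification.
    requirements = {"REQ-001", "REQ-002", "REQ-003"}
    tests = {
        "TC-01 nominal value": {"REQ-001"},
        "TC-02 boundary (maximum)": {"REQ-001", "REQ-002"},
        "TC-03 out-of-bounds input": {"REQ-002"},
    }

    print(untested_requirements(requirements, tests))   # {'REQ-003'} -> a coverage gap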

 

Adequacy of Planned Tools, Facilities, Procedures, Methods and Resources
Adequacy of planned tools, facilities, procedures, methods and resources applies to manuals and planning documents and is judged on whether the planned items will be adequate for their intended purpose. 

 

Appropriate Content for Intended Audience

Each document has an intended audience, and appropriate content for the intended audience is assessed according to how well the document addresses the needs of that audience. 

 

Consistency between Data Definition and Data Use
Consistency between data definition and data use means that the way in which a data element is defined should match the way in which it is used in the ICT software. 
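
As a simple illustration only (using a hypothetical data dictionary rather than any mandated schema), such a consistency check might compare each element's defined type against the type observed at its point of use:

    # Hypothetical data dictionary: element name -> defined type.
    DATA_DICTIONARY = {
        "customer_id": int,
        "order_total": float,
        "order_date": str,
    }

    def inconsistent_uses(observed: dict[str, object]) -> list[str]:
        """Report elements whose observed use does not match their definition."""
        problems = []
        for name, value in observed.items():
            defined = DATA_DICTIONARY.get(name)
            if defined is None:
                problems.append(f"{name}: used but never defined")
            elif not isinstance(value, defined):
                problems.append(f"{name}: defined as {defined.__name__}, used as {type(value).__name__}")
        return problems

    # Illustrative usage: customer_id is defined as an integer but used as a string.
    print(inconsistent_uses({"customer_id": "C-1001", "order_total": 25.0}))
    # ['customer_id: defined as int, used as str']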

 

Completeness of Testing and Adequacy of Retesting 
Completeness of testing assesses whether all test suites and all test procedures have been carried out, and whether all results have been fully recorded, analysed, and reported.

Adequacy of Retesting is assessed when testing consists of repeating a subset of the test suites.

Adequacy of Test Descriptions/Procedures (Test Inputs, Expected Results, and Evaluation Criteria) examines whether test suites and test procedures are sufficiently clear and specific that someone could execute the test and judge unambiguously whether the evaluation criteria have been satisfied.
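
As an illustration of the level of specificity intended (a sketch only; the field names are assumptions rather than a prescribed template), a test procedure might be recorded so that its inputs, expected results, and evaluation criteria can each be checked without interpretation:

    from dataclasses import dataclass, field

    @dataclass
    class TestProcedure:
        """One executable test step with explicit inputs, expected result, and pass/fail rule."""
        identifier: str
        inputs: dict = field(default_factory=dict)   # exact values the tester supplies
        expected_result: str = ""                    # observable outcome, stated unambiguously
        evaluation_criteria: str = ""                # rule for deciding pass or fail

    example = TestProcedure(
        identifier="TP-042",
        inputs={"quantity": 0},                      # boundary / negative case
        expected_result="The order is rejected with the message 'quantity must be at least 1'",
        evaluation_criteria="Pass only if the exact rejection message is shown and no order record is created",
    )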