Overview of RAI Activities Throughout the Product Life Cycle

5. SHIELD TEVV

5.1 EVALUATE: Test System for Robustness, Resilience, and Reliability

  1. Are all parts of the stack subject to testing (sensor, datasets, model, environment, etc.)?
    1. For swarm and distributed effector technologies, have the devices been tested in cooperation with one another?
  2. Have there been unit tests of each component in isolation? Have there been integration tests to understand how the components interact with one another within the overall system?
  3. Have the system and its components been tested for robustness against:
    1. Perturbations (natural and adversarial)
    2. Adversarial attacks
    3. Data/concept/model drift or data poisoning
    4. Human error and unintended or malicious use
  4. How has the system been tested for:
    1. Performance
    2. Safety
    3. Maintainability
    4. Suitability
    5. Security
    6. Understandability of outputs
  5. If applicable, leverage red-teaming techniques or bug bounties to help identify or anticipate system vulnerabilities and robustness gaps. Leverage software tools and approaches such as fuzzing to find vulnerabilities.
  6. If applicable, are end users able to appropriately understand how outputs are produced and what they mean?
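The unit, integration, and perturbation-robustness checks in items 2 and 3 above can be sketched as test functions. Everything below (`Preprocessor`, `Model`, `Pipeline`, the brightness-based "classifier") is a hypothetical toy stand-in for illustration, not a real system component.

```python
# Sketch of unit, integration, and perturbation-robustness tests for a
# hypothetical image-classification pipeline. All component names here
# are illustrative placeholders.
import numpy as np


class Preprocessor:
    """Toy component: scales pixel values into [0, 1]."""
    def run(self, image: np.ndarray) -> np.ndarray:
        return image.astype(np.float32) / 255.0


class Model:
    """Toy component: 'classifies' by mean brightness."""
    def predict(self, image: np.ndarray) -> int:
        return int(image.mean() > 0.5)


class Pipeline:
    """Integration of the two components."""
    def __init__(self) -> None:
        self.pre, self.model = Preprocessor(), Model()

    def classify(self, raw: np.ndarray) -> int:
        return self.model.predict(self.pre.run(raw))


def test_preprocessor_unit():
    # Unit test: one component in isolation (item 2).
    out = Preprocessor().run(np.full((4, 4), 255, dtype=np.uint8))
    assert out.min() >= 0.0 and out.max() <= 1.0


def test_pipeline_integration():
    # Integration test: components interacting within the system (item 2).
    bright = np.full((4, 4), 200, dtype=np.uint8)
    assert Pipeline().classify(bright) == 1


def test_perturbation_robustness():
    # Robustness test: small natural perturbations should not flip
    # the prediction (item 3.1).
    rng = np.random.default_rng(0)
    base = np.full((4, 4), 200, dtype=np.uint8)
    baseline = Pipeline().classify(base)
    for _ in range(20):
        noise = rng.integers(-5, 6, size=base.shape)
        perturbed = np.clip(base.astype(int) + noise, 0, 255).astype(np.uint8)
        assert Pipeline().classify(perturbed) == baseline
```

In practice, each deployed component would get its own suite of such tests, run in CI, with the perturbation ranges derived from the operating environment rather than chosen arbitrarily as here.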
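For the data-drift concern in item 3.3, one common monitoring approach is the Population Stability Index (PSI) between training-time and deployment-time feature distributions. The sketch below is a minimal illustration; the 0.2 threshold is a widely used rule of thumb, not a mandated value.

```python
# Minimal sketch of data-drift detection via the Population Stability
# Index (item 3.3). Bin edges are fixed from the reference (training)
# data; thresholds and data here are illustrative assumptions.
import numpy as np


def psi(reference: np.ndarray, current: np.ndarray, bins: int = 10) -> float:
    """Population Stability Index between two 1-D samples."""
    edges = np.histogram_bin_edges(reference, bins=bins)
    ref_frac = np.histogram(reference, bins=edges)[0] / len(reference)
    cur_frac = np.histogram(current, bins=edges)[0] / len(current)
    # Floor each bin fraction to avoid log(0) on empty bins.
    ref_frac = np.clip(ref_frac, 1e-6, None)
    cur_frac = np.clip(cur_frac, 1e-6, None)
    return float(np.sum((cur_frac - ref_frac) * np.log(cur_frac / ref_frac)))


# Illustrative synthetic data: a reference sample, an in-distribution
# sample, and a shifted (drifted) sample.
rng = np.random.default_rng(0)
train = rng.normal(0.0, 1.0, 5000)
same = rng.normal(0.0, 1.0, 5000)      # no drift expected: low PSI
shifted = rng.normal(1.0, 1.0, 5000)   # mean shift: high PSI
```

A deployed monitor would compute the PSI per feature on a rolling window of live inputs and alert when it exceeds the chosen threshold, triggering re-validation or retraining.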
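The fuzzing approach mentioned in item 5 can be illustrated with a purely random fuzzer. `parse_heading` below is a hypothetical component under test; real programs would typically use a coverage-guided fuzzing tool rather than this uninstrumented loop.

```python
# Sketch of simple random fuzzing (item 5): throw many random inputs at
# a component and count failures outside its documented error contract.
# parse_heading is an illustrative toy target, not a real API.
import random
import string


def parse_heading(line: str) -> tuple[int, str]:
    """Toy parser: '## Title' -> (2, 'Title'). Raises ValueError on
    input that is not a markdown-style heading."""
    level = len(line) - len(line.lstrip("#"))
    if level == 0 or level > 6:
        raise ValueError("not a heading")
    return level, line[level:].strip()


def fuzz(iterations: int = 1000, seed: int = 0) -> int:
    """Feed random printable strings to the parser; return the number of
    crashes other than the documented ValueError. A nonzero count is a
    robustness finding to investigate."""
    rng = random.Random(seed)
    unexpected = 0
    for _ in range(iterations):
        line = "".join(rng.choice(string.printable)
                       for _ in range(rng.randrange(0, 40)))
        try:
            parse_heading(line)
        except ValueError:
            pass             # documented failure mode: acceptable
        except Exception:
            unexpected += 1  # undocumented crash: a finding
    return unexpected
```

The key design point is the error contract: inputs the component rejects cleanly are fine, while any other exception (or hang, in a fuller harness) counts as a vulnerability candidate for triage.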

5.2 Revisit Documentation and Security

  1. Confirm that the way the concepts and constructs have been operationalized makes sense given the use case, the context and potential impacts, and the DoD AI Ethical Principles.
  2. Confirm all relevant elements of the ontology are included for measurement and assessment.
  3. Describe the security review process and the authorization received after its completion.

5.3 Update Documentation

  1. Update SOCs, impact and risk assessments, CONOPS, data/model cards, and DAGR as needed.