CDAO | Responsible AI Division

RAI Toolkit

Overview

The Responsible Artificial Intelligence (RAI) Toolkit provides a centralized process that identifies, tracks, and improves the alignment of AI projects with RAI best practices and the DoD AI Ethical Principles, while capitalizing on opportunities for innovation. The Toolkit guides the user through an intuitive flow of tailorable, modular assessments, tools, and artifacts across the AI product life cycle. This process enables traceability and assurance of responsible AI practice, development, and use.


Getting Started

The RAI Toolkit is a self-assessment built around the AI product life cycle. We recommend completing the self-assessment questions iteratively at each phase of the life cycle as the AI model or application is developed.

While answers cannot currently be saved on the site, users have the option to export a JSON file that can be uploaded later to resume work on the assessment. Once the assessment is complete, users can export a PDF containing all of their answers.
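The export-and-resume round trip described above amounts to saving in-progress answers as JSON and loading them back later. A minimal sketch follows; the field names (`phase`, `responses`, `question_id`) are illustrative assumptions, not the Toolkit's actual export schema:

```python
import json

# Hypothetical in-progress assessment state; the real file exported
# by the RAI Toolkit site may use a different structure.
answers = {
    "phase": "Design",
    "responses": [
        {"question_id": "D-01", "answer": "Yes",
         "notes": "Reviewed against the DoD AI Ethical Principles."},
    ],
}

# Export: write the current answers to a JSON file.
with open("rai_assessment.json", "w") as f:
    json.dump(answers, f, indent=2)

# Resume: load the saved file later to continue the assessment.
with open("rai_assessment.json") as f:
    resumed = json.load(f)

assert resumed == answers  # the round trip preserves all answers
```

Because the export is plain JSON, a saved assessment can also be versioned or diffed alongside other project artifacts.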


AI Product Life Cycle

Overview of RAI Activities throughout the Product Life Cycle


Additional Parts of the Toolkit

DAGR Risk Guide

The Responsible Artificial Intelligence (RAI) Defense AI Guide on Risk (DAGR) provides DoD AI stakeholders with guiding principles, best practices, and references to governing Federal and DoD guidance.

RAI Tools List

A collection of tools and resources to help you design, develop, and deploy responsible AI applications.