# Assurance Tests

## What it is

Assurance Tests define specific compliance checks that validate that your AI systems meet regulatory and business requirements. Each test specifies what evidence to collect and how to validate it against an evidence model.
## How to use
1. Navigate to Assurance → Tests in the sidebar.
2. View all assurance tests across your organization.
3. Click New Test to create an assurance test.
4. Configure the test with an evidence model and validation criteria.
## Test Configuration
- Name: A descriptive identifier for the test
- Evidence model: The evidence model this test validates against
- Validation rules: Criteria for determining pass/fail
- Collection method: How evidence is gathered (automatic or manual)
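As a rough sketch, the configuration fields above can be modeled as a small data structure. The class, field names, and example values below are illustrative assumptions, not the platform's actual schema:

```python
from dataclasses import dataclass, field
from enum import Enum

class CollectionMethod(Enum):
    AUTOMATIC = "automatic"   # evidence gathered by the system
    MANUAL = "manual"         # evidence uploaded or reviewed by a person

@dataclass
class AssuranceTest:
    name: str                               # descriptive identifier for the test
    evidence_model: str                     # evidence model this test validates against
    collection_method: CollectionMethod     # how evidence is gathered
    validation_rules: list[str] = field(default_factory=list)  # pass/fail criteria

# Hypothetical example configuration
test = AssuranceTest(
    name="Training data retention check",
    evidence_model="training-data-lineage",
    collection_method=CollectionMethod.AUTOMATIC,
    validation_rules=["evidence collected within the last 30 days"],
)
```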
## Key Features
- Evidence model linking: Tests validate against defined evidence models
- Automated collection: Configure automatic evidence gathering
- Manual review: Support for human-in-the-loop validation
- Result history: Track validation outcomes over time
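One way automated collection and manual review can fit together is an escalation step: automated checks decide pass/fail where they can, and anything inconclusive is routed to a human. This is a minimal sketch under assumed field names, not the platform's actual logic:

```python
def evaluate(evidence: dict) -> str:
    """Hypothetical decision combining automated checks
    with human-in-the-loop escalation."""
    if not evidence.get("payload"):
        return "fail"           # automated check: required evidence is missing
    if evidence.get("flagged_for_review"):
        return "needs_review"   # escalate to a manual reviewer
    return "pass"               # automated checks satisfied

print(evaluate({"payload": {"doc": "model card"}}))                             # pass
print(evaluate({"payload": {"doc": "model card"}, "flagged_for_review": True})) # needs_review
print(evaluate({}))                                                             # fail
```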
## Validation Approach
Assurance tests can validate:
- Data format and completeness
- Timestamp and freshness requirements
- Content integrity and accuracy
- External system state verification
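The first two categories above can be sketched as simple validator functions: one for completeness of the evidence record, one for its freshness. Field names and thresholds are assumptions for illustration:

```python
from datetime import datetime, timedelta, timezone

def is_complete(evidence: dict, required_fields: set[str]) -> bool:
    # Data format and completeness: every required field is present and non-empty
    return all(evidence.get(f) not in (None, "") for f in required_fields)

def is_fresh(collected_at: datetime, max_age: timedelta) -> bool:
    # Timestamp and freshness: evidence was collected recently enough
    return datetime.now(timezone.utc) - collected_at <= max_age

# Hypothetical evidence record
evidence = {
    "model_id": "demo-model",
    "collected_at": datetime.now(timezone.utc),
}
print(is_complete(evidence, {"model_id", "collected_at"}))      # True
print(is_fresh(evidence["collected_at"], timedelta(days=30)))   # True
```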
## Tips
- Write clear test names that describe the compliance requirement
- Link tests to specific evidence models for traceability
- Consider both automated and manual validation needs
- Group related tests into assurance suites