
Evidence Models

Evidence Models define the structure and validation rules for evidence that AI systems produce during testing.

Overview

Evidence Models help you:

  • Define Evidence Structure: Specify what fields evidence must contain
  • Set Validation Rules: Ensure evidence meets quality standards
  • Track Compliance: Map evidence to compliance requirements

Creating Evidence Models

  1. Navigate to Assurance → Evidence Models
  2. Click New Evidence Model
  3. Define the model:
    • Name: Descriptive name for the model
    • Description: What this evidence proves
    • Fields: Define required and optional fields
    • Validation Rules: Set constraints on field values
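
As a rough illustration, a model defined through these steps could be represented as in the sketch below. The interface and field names are hypothetical and shown only to make the structure concrete; they are not the platform's actual schema or export format.

```typescript
// Hypothetical representation of an evidence model; the interface and
// field names are illustrative, not the platform's actual schema.
interface EvidenceField {
  name: string;
  type: "string" | "number" | "boolean" | "date" | "array" | "object";
  required: boolean;
}

interface EvidenceModel {
  name: string;        // Descriptive name for the model
  description: string; // What this evidence proves
  fields: EvidenceField[];
}

const consentEvidenceModel: EvidenceModel = {
  name: "User Consent Evidence",
  description: "Proves that explicit user consent was obtained before processing",
  fields: [
    { name: "consentMessage", type: "string", required: true },
    { name: "explicitConsent", type: "boolean", required: true },
    { name: "collectedAt", type: "date", required: false },
  ],
};
```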

Field Types

Evidence models support various field types:

  • string: Text content (for example, a user consent message)
  • number: Numeric values (for example, a confidence score)
  • boolean: True/false values (for example, whether explicit consent was obtained)
  • date: Timestamps (for example, when the evidence was collected)
  • array: Lists (for example, a list of verified items)
  • object: Nested structures (for example, detailed metadata)
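
The snippet below shows one illustrative value per field type. The field names and values are assumptions chosen for clarity, not part of the platform's schema.

```typescript
// One illustrative value per supported field type; names and values
// are assumptions, not the platform's schema.
const exampleFieldValues = {
  consentMessage: "I agree to the terms of service",   // string
  confidenceScore: 0.97,                               // number
  explicitConsent: true,                               // boolean
  collectedAt: new Date().toISOString(),               // date (ISO timestamp)
  verifiedItems: ["age check", "region check"],        // array
  metadata: { runner: "assurance-test", attempt: 1 },  // object
};
```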

Validation Rules

Apply rules to ensure evidence quality:

  • Required: Field must be present
  • Min/Max Length: Constrain text length
  • Pattern: Regex validation for formats
  • Enum: Restrict to specific values
  • Custom: Define custom validation logic
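
A minimal sketch of how these rules might be encoded is shown below. The rule names and structure are hypothetical; the platform's actual rule syntax may differ.

```typescript
// Hypothetical encoding of validation rules; the platform's actual rule
// syntax may differ.
type ValidationRule =
  | { kind: "required" }
  | { kind: "minLength"; value: number }
  | { kind: "maxLength"; value: number }
  | { kind: "pattern"; regex: string }
  | { kind: "enum"; values: string[] }
  | { kind: "custom"; validate: (value: unknown) => boolean };

// Example rules for a hypothetical "consentMessage" field.
const consentMessageRules: ValidationRule[] = [
  { kind: "required" },
  { kind: "minLength", value: 10 },
  { kind: "maxLength", value: 500 },
  { kind: "pattern", regex: "consent|agree" },
  { kind: "custom", validate: (value) => typeof value === "string" && value.trim().length > 0 },
];
```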

Using Evidence Models

When running assurance tests, specify which evidence model to use. The test runner will validate collected evidence against the model before storing it in the Evidence Vault.
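
Conceptually, the validation step works like the sketch below, which checks collected evidence for required fields before it would be accepted. The function and field names are hypothetical and do not reflect the test runner's internal API.

```typescript
// Conceptual sketch of the validation step performed before evidence is
// stored; function and field names are hypothetical.
interface ModelField {
  name: string;
  required: boolean;
}

function missingRequiredFields(
  fields: ModelField[],
  evidence: Record<string, unknown>,
): string[] {
  return fields
    .filter((field) => field.required && evidence[field.name] === undefined)
    .map((field) => field.name);
}

const modelFields: ModelField[] = [
  { name: "consentMessage", required: true },
  { name: "explicitConsent", required: true },
];

const collectedEvidence = {
  consentMessage: "I agree to the terms of service",
  explicitConsent: true,
};

const missing = missingRequiredFields(modelFields, collectedEvidence);

// Only evidence that passes validation would be stored in the Evidence Vault.
console.log(missing.length === 0 ? "Evidence is valid" : `Missing fields: ${missing.join(", ")}`);
```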
