
Conformity Assessment

High-risk AI systems under the EU AI Act must undergo a conformity assessment before being placed on the market. AI SENTINEL provides a structured template that walks your team through every requirement — from technical documentation to post-market monitoring — so nothing falls through the cracks.

What is a Conformity Assessment?

A conformity assessment is a formal evaluation process required by Art. 43 of the EU AI Act for high-risk AI systems. It verifies that the AI system meets all applicable requirements before deployment — covering data governance, transparency, human oversight, accuracy, robustness, and cybersecurity.

Depending on the type of system and whether harmonised standards have been applied, the assessment may be conducted internally by the provider (internal control, Annex VI) or by a third-party notified body (Annex VII). In either case, the assessment must follow a structured methodology and produce documented evidence of compliance.

Who needs it?

AI Providers

Organizations that develop high-risk AI systems, or place them on the EU market under their own name or trademark, must complete a conformity assessment before the system is placed on the market or put into service (Art. 16).

AI Deployers

Organizations deploying high-risk AI systems must verify that the provider has completed the conformity assessment and that the CE marking is in place (Art. 26).

Annex III systems

AI systems used in the areas listed in Annex III — biometrics, critical infrastructure, education and vocational training, employment, access to essential services, law enforcement, migration and border control, or the administration of justice — are classified as high-risk (Art. 6(2)).

Internal compliance teams

Even when a notified body is involved, internal teams need a structured process to prepare documentation, gather evidence, and track remediation of gaps.

What the template covers

Risk Management System

Verification that a risk management system is established and maintained throughout the AI system lifecycle (Art. 9).

Data Governance

Assessment of training, validation, and testing data — including relevance, representativeness, and bias examination (Art. 10).

Technical Documentation

Review of technical documentation completeness: system description, design specifications, development process, and performance metrics (Art. 11, Annex IV).

Record-keeping & Logging

Evaluation of automatic logging capabilities for traceability, including input/output records and system events (Art. 12).

Transparency

Assessment of instructions for use, disclosure of AI nature, and interpretability measures for deployers and affected persons (Art. 13).

Human Oversight

Verification that appropriate human oversight measures are designed into the system and documented (Art. 14).

Accuracy, Robustness & Cybersecurity

Testing and validation of accuracy levels, resilience to errors and adversarial attacks, and cybersecurity measures (Art. 15).

Quality Management System

Review of the provider's quality management system covering development processes, testing, and post-market obligations (Art. 17).

Post-market Monitoring

Verification that a post-market monitoring plan is in place to detect and address issues after deployment (Art. 72).
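The nine evaluation areas above can be thought of as a checklist, where each section collects findings and tracks open gaps until remediation. The sketch below models this in Python; the class names and fields are illustrative assumptions, not AI SENTINEL's actual data model.

```python
from dataclasses import dataclass, field

# Hypothetical checklist model for the template's evaluation areas.
# Structure and names are illustrative only.
@dataclass
class Section:
    title: str
    article: str                                  # EU AI Act reference
    findings: list = field(default_factory=list)  # recorded evidence
    gaps: list = field(default_factory=list)      # open non-conformities

    @property
    def complete(self) -> bool:
        # A section is done once evidence exists and no gaps remain.
        return bool(self.findings) and not self.gaps

SECTIONS = [
    Section("Risk Management System", "Art. 9"),
    Section("Data Governance", "Art. 10"),
    Section("Technical Documentation", "Art. 11, Annex IV"),
    Section("Record-keeping & Logging", "Art. 12"),
    Section("Transparency", "Art. 13"),
    Section("Human Oversight", "Art. 14"),
    Section("Accuracy, Robustness & Cybersecurity", "Art. 15"),
    Section("Quality Management System", "Art. 17"),
    Section("Post-market Monitoring", "Art. 72"),
]

def open_gaps(sections):
    """Return (section title, gap) pairs still awaiting remediation."""
    return [(s.title, g) for s in sections for g in s.gaps]
```

A view like `open_gaps` is what drives the remediation tracking mentioned above: the assessment is ready for review only once it returns an empty list.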

How it works

1. Create a new assessment

Select the Conformity Assessment template and link it to a registered AI system. The template pre-populates all required evaluation areas.

2. Work through each section

Each section presents structured questions aligned to the EU AI Act requirements. Record findings, attach evidence, and note any gaps.

3. Submit for review

Once complete, submit the assessment for review by the AI Officer or governance lead. The approval workflow tracks sign-off and any requested changes.

4. Document and maintain

The completed assessment becomes part of your compliance record. Link it to the AI system's compliance mapping and revisit it when material changes occur.
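The four steps above form a linear workflow with one loop: a reviewer can send an assessment back for changes, and an approved assessment reopens when material changes occur. A minimal sketch, assuming simple status names (these states and transitions are illustrative, not AI SENTINEL's actual workflow engine):

```python
# Illustrative status progression for the four-step workflow.
ALLOWED = {
    "draft":       {"in_progress"},               # 1. create a new assessment
    "in_progress": {"in_review"},                 # 2. work through each section
    "in_review":   {"approved", "in_progress"},   # 3. sign-off, or changes requested
    "approved":    {"in_progress"},               # 4. reopen on material change
}

def transition(state: str, new_state: str) -> str:
    """Move to new_state if the workflow allows it, else raise."""
    if new_state not in ALLOWED.get(state, set()):
        raise ValueError(f"cannot move from {state!r} to {new_state!r}")
    return new_state
```

Keeping the transition table explicit makes the review loop auditable: every status change is either in the table or rejected.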