Many organizations building or deploying AI systems in Europe are treating August 2026 like a distant deadline. It is not. The conformity assessment — the formal procedure that determines whether your high-risk AI system can legally enter the EU market — takes months to complete properly. Getting it wrong means you cannot affix the CE mark, cannot deploy, and risk removal from the market by national supervisors.
Here is what the process actually involves, and how to work through it without losing time.
The conformity assessment is the procedure laid out in Articles 16 and 43 of the EU AI Act. It is the mechanism by which a provider — the company that develops or substantially modifies an AI system — demonstrates that the system meets all requirements in Chapter III, Section 2 before placing it on the EU market or putting it into service.
The output is documented evidence, stored in a technical file, that forms the basis for the EU Declaration of Conformity. The CE marking follows from that declaration.
This is not an audit in the informal sense. It is a structured, legally defined process with specific outputs.
Only providers of high-risk AI systems as defined in Article 6 and Annex III must go through conformity assessment. Annex III currently lists eight categories:

- biometric identification and categorisation
- safety components of critical infrastructure
- education and vocational training
- employment, workers management and access to self-employment
- access to essential private and public services and benefits
- law enforcement
- migration, asylum and border control management
- administration of justice and democratic processes
If your system falls into one of these categories and does not qualify for the derogation in Article 6(3) — which exempts systems that pose no significant risk of harm — you are looking at a high-risk classification and must comply with the full requirements of Chapter III, Section 2.
The EU AI Act distinguishes between two conformity routes.
Internal control (Annex VI) applies to the majority of high-risk AI systems. The provider conducts the assessment themselves, without involving a third party. This does not mean it is lightweight. Internal control requires completing the full technical documentation (Annex IV), establishing and maintaining a quality management system (Article 17), conducting a risk management process (Article 9), validating performance on appropriate datasets (Article 10), implementing post-market monitoring (Article 72), and registering in the EU AI Act database before deployment.
Third-party conformity assessment is required for a narrower set of systems: the biometric systems listed in point 1 of Annex III, and only where the provider has not fully applied harmonised standards or common specifications (Article 43(1)). These assessments require the involvement of a notified body — an accredited organization officially designated by an EU member state.
Most enterprise AI systems in HR, compliance, finance, and operations will fall under internal control.
Step 1 — Classify your system. Run the Annex III checklist against your system's actual use case. The classification is based on the intended purpose as documented, not what the system technically could do. A general-purpose LLM integrated into a recruitment workflow becomes a high-risk system under point 4 of Annex III.
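For teams screening a portfolio of systems, this classification step can be captured as a simple, repeatable checklist. The following Python sketch is illustrative only: the category keys and the `screen` helper are our own simplification for internal triage, not an official taxonomy or legal advice.

```python
# Illustrative Annex III screening helper. The keys below loosely mirror
# the eight Annex III areas; the high_risk logic is a simplification.
ANNEX_III_AREAS = {
    "biometrics": "Biometric identification and categorisation",
    "critical_infrastructure": "Safety components of critical infrastructure",
    "education": "Education and vocational training",
    "employment": "Employment, workers management, access to self-employment",
    "essential_services": "Access to essential private and public services",
    "law_enforcement": "Law enforcement",
    "migration": "Migration, asylum and border control management",
    "justice": "Administration of justice and democratic processes",
}

def screen(intended_purpose_areas, materially_influences_outcome=True):
    """Return the Annex III areas the documented intended purpose touches.

    An empty result suggests the system is outside Annex III scope; a
    non-empty result combined with material influence on the outcome
    points to a high-risk classification and the full Chapter III,
    Section 2 obligations. Classification follows the documented
    intended purpose, not the system's technical capabilities.
    """
    hits = [a for a in intended_purpose_areas if a in ANNEX_III_AREAS]
    return {"areas": hits, "high_risk": bool(hits) and materially_influences_outcome}

# An LLM integrated into a recruitment workflow touches "employment":
result = screen(["employment"])
```

The value of even a toy helper like this is that the screening decision and its inputs are recorded, which is exactly what the technical file later has to evidence.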
Step 2 — Establish a quality management system. Article 17 requires a documented QMS covering design, development, testing, and post-market monitoring. In practice, this means written policies, version-controlled documentation, designated responsibilities, and internal review cycles.
Step 3 — Build the technical file. Annex IV lists the required contents. This includes a general description, design specifications, training methodology and dataset description, risk management records, testing results, logs and monitoring mechanisms, cybersecurity measures, and the EU Declaration of Conformity. This file must be kept updated throughout the system's lifecycle.
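Because Annex IV enumerates required contents, the technical file lends itself to an automated completeness check, for example in CI. A minimal sketch follows; the section filenames loosely mirror Annex IV and the flat directory layout is our own assumed convention, not a structure the Act prescribes.

```python
from pathlib import Path

# Sections loosely mirroring Annex IV. The filenames and layout are an
# internal convention for illustration, not mandated by the regulation.
REQUIRED_SECTIONS = [
    "general_description.md",
    "design_specifications.md",
    "training_data_and_methodology.md",
    "risk_management_records.md",
    "testing_results.md",
    "logging_and_monitoring.md",
    "cybersecurity_measures.md",
    "eu_declaration_of_conformity.md",
]

def missing_sections(technical_file_dir):
    """Return the required sections not yet present in the technical file."""
    root = Path(technical_file_dir)
    return [s for s in REQUIRED_SECTIONS if not (root / s).is_file()]
```

Running a check like this on every release makes "keep the file updated throughout the lifecycle" an enforced gate rather than a good intention.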
Step 4 — Conduct the risk management process. Article 9 requires an ongoing risk management system, not a one-time checklist. You need to identify, estimate, evaluate, and mitigate risks throughout the development and deployment lifecycle. This process must be documented.
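The identify-estimate-evaluate-mitigate loop is usually tracked in a risk register. A hedged sketch of one register entry, assuming a 1–5 severity/likelihood scale; the field names and scoring are our own convention, not something Article 9 specifies.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

# Illustrative risk-register entry for the Article 9 loop:
# identify -> estimate -> evaluate -> mitigate. Scale and fields are
# an assumed internal convention.
@dataclass
class Risk:
    description: str
    severity: int                       # estimated: 1 (negligible) .. 5 (critical)
    likelihood: int                     # estimated: 1 (rare) .. 5 (frequent)
    mitigation: str = ""
    residual_severity: Optional[int] = None
    identified_on: date = field(default_factory=date.today)

    def score(self) -> int:
        """Simple severity x likelihood estimate for prioritisation."""
        return self.severity * self.likelihood

    def is_mitigated(self) -> bool:
        """Evaluated as mitigated once a measure is recorded and the
        residual severity is actually lower than the original estimate."""
        residual = self.residual_severity if self.residual_severity is not None else self.severity
        return bool(self.mitigation) and residual < self.severity

# Example entry for a recruitment system:
r = Risk("Biased ranking of candidates from under-represented groups",
         severity=4, likelihood=3)
r.mitigation = "Re-weighting of training data; fairness metrics gated in CI"
r.residual_severity = 2
```

Keeping entries like this under version control gives the documented, ongoing record that Article 9 expects, rather than a one-time checklist.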
Step 5 — Validate the data and model performance. Article 10 requires that training, validation, and testing datasets meet specific quality criteria: relevant, representative, free from errors, and appropriately complete given the system's purpose. Document your data governance practices.
Step 6 — Conduct the conformity assessment. For internal control, this means reviewing all documentation, confirming the system meets each applicable requirement, and formally signing off. The output is the EU Declaration of Conformity (Article 47).
Step 7 — Register in the EU AI Act database. Before market placement, providers must register high-risk AI systems in the EU database maintained by the European Commission (Article 49). The registration requires specific information about the system, provider, intended purpose, and geographic reach.
Step 8 — Affix the CE marking. Once the Declaration of Conformity is signed and registration is complete, the CE mark can be affixed (Article 48). This marking signals compliance and is required for legal market access.
The most common mistake is treating conformity assessment as a documentation sprint near the deployment date. The risk management process, data validation, and QMS cannot be bolted on retroactively. They need to be embedded in the development process from the start.
A second common mistake is underestimating the technical file requirement. Annex IV is specific. Organizations that start compiling documentation late often discover that critical information about data sources, model architecture choices, and testing methodology was never recorded in a usable form.
Third, and often overlooked: the obligation does not end at deployment. Post-market monitoring under Article 72 requires active tracking of system performance in real-world use, incident reporting, and regular updates to the technical file.
The EU AI Act database (Article 71) is set up and maintained by the European Commission and will be publicly searchable for certain categories of systems. Providers must register before placing a high-risk system on the market. The database must be operational by August 2026 at the latest, but early registration is possible and advisable.
National market surveillance authorities — in the Netherlands, that is primarily the Rijksinspectie Digitale Infrastructuur — will use this database as a starting point for compliance checks.
Conformity assessment under the EU AI Act is modeled on the EU's existing CE marking framework for products like medical devices and machinery. Organizations that have been through a Medical Device Regulation conformity process will find it structurally familiar, though AI-specific requirements around explainability, human oversight, and data governance add new dimensions.
The organizations that get this right are building their technical documentation and quality management systems now, while their AI systems are still in development. They are treating Article 4 literacy — ensuring their teams understand these requirements — as a prerequisite for compliance, not an afterthought.
The August 2026 deadline for high-risk AI systems under the EU AI Act is not a far horizon. It is the next production release.