
Conformity Assessment in Plain Language

Conformity assessment is the structured process by which a manufacturer or other responsible economic operator declares that a product, service, system, or process meets the applicable legal requirements. In the European regulatory landscape, it is the bridge between technical specifications and legal presumption of conformity. It is not merely a technical checkbox; it is a governance function that allocates responsibilities, defines evidence, and sets the boundaries of market surveillance and enforcement. For professionals working with AI, robotics, biotech, and data systems, understanding who does what, what evidence matters, and how to prepare is essential to avoid costly delays, legal exposure, and market access barriers.

European Union legislation typically follows the New Legislative Framework (NLF) approach, which harmonizes the rules for placing products on the market. The NLF comprises Regulation (EC) No 765/2008 on accreditation, Decision No 768/2008/EC, which sets out common principles for product legislation, and Regulation (EU) 2019/1020 on market surveillance and compliance. Conformity assessment sits within this framework. It is important to distinguish between self-declaration (the manufacturer issues an EU Declaration of Conformity based on their own assessment) and third-party certification (a notified body evaluates conformity). The choice depends on the risk class and the specific legal act. For example, the lowest-risk classes of devices may be self-declared under the Medical Devices Regulation (MDR) or the In Vitro Diagnostic Medical Devices Regulation (IVDR), whereas higher-risk classes require a notified body. Under the AI Act, most systems are self-assessed, but high-risk AI systems generally require a notified body only if they are safety components of products that themselves require third-party conformity assessment. In practice, this means many providers of high-risk AI systems will self-assess conformity and issue their own declaration, while some will face notified body involvement because of the underlying product regulation.

Legal Foundations and the Role of the NLF

The EU’s NLF establishes a common architecture for conformity assessment across sectors. It defines economic operators (manufacturers, authorized representatives, importers, distributors), their obligations, and the documentation required. It also sets out the rules for notified bodies, market surveillance authorities, and the safeguard mechanism. The NLF is complemented by sector-specific regulations that specify the conformity assessment modules (e.g., Module A for internal production control, Module B for EU-type examination, Module D for production quality assurance). Understanding the module is crucial because it determines who does what. For instance, Module A allows the manufacturer to self-assess and issue an EU Declaration of Conformity, while Module B requires a notified body to examine the design and issue a certificate.

For AI and data-intensive systems, the NLF interacts with the AI Act (Regulation (EU) 2024/1689), the GDPR (Regulation (EU) 2016/679), and the Data Act (Regulation (EU) 2023/2854). Conformity assessment under the AI Act is primarily a self-assessment process for most systems, but high-risk AI systems that are safety components of products subject to other legislation (e.g., machinery, medical devices, lifts) must follow the conformity assessment procedures of those product regulations, which may involve a notified body. The AI Act also provides for notified body involvement in certain other cases, notably some biometric systems listed in Annex III and situations where the provider opts for third-party assessment voluntarily. In practice, organizations should map their AI systems to the relevant product legislation and determine whether the underlying product requires third-party assessment.

Who Does What: Allocation of Responsibilities

Responsibility is distributed across the supply chain. The manufacturer is the natural or legal person who manufactures a product, or has it designed or manufactured, and markets it under their own name or trademark. They are responsible for conformity assessment, drawing up technical documentation, issuing the EU Declaration of Conformity, and ensuring ongoing compliance (including post-market surveillance and corrective actions). If the manufacturer is established outside the EU, they must appoint an authorized representative in the EU to perform specific tasks, such as keeping technical documentation available and cooperating with market surveillance authorities.

The importer places products from third countries on the EU market. They must verify that the manufacturer has carried out the appropriate conformity assessment, that the technical documentation is available, and that the product bears the required markings and labeling. Importers must also identify themselves on the product or its packaging. The distributor makes the product available on the market and must verify the traceability markings and the presence of required documentation, and must not supply products that they know or have reason to believe are non-compliant.

For software and AI systems, the manufacturer is typically the entity that places the system on the market under its name, which may be the developer, the integrator, or the provider of a platform. The AI Act defines the provider as the entity that develops an AI system, or has one developed, and places it on the market or puts it into service under its own name or trademark, regardless of whether it is distributed as a physical product, software, or service. If the system is embedded in a product, the manufacturer of the product may be responsible for the conformity assessment of the AI system as a safety component. In practice, this means that software vendors supplying high-risk AI components to OEMs must ensure their component meets the AI Act’s requirements and provide the necessary evidence to the product manufacturer.

Authorized Representatives and Importers in AI and Data Systems

For non-EU providers of high-risk AI systems, appointing an authorized representative is mandatory under the AI Act. The authorized representative must be established in the EU and accept a written mandate from the provider to perform specific tasks, including keeping the EU Declaration of Conformity and technical documentation available for market surveillance authorities, cooperating with those authorities, and supporting any corrective actions they require. In practice, the authorized representative should be engaged early, and the mandate should clearly define responsibilities, especially regarding updates and incident reporting.

Importers of AI-enabled products must ensure that the provider has completed the appropriate conformity assessment. They should verify that the technical documentation includes the information required by the AI Act, such as the system’s intended purpose, risk management system, data governance measures, and instructions for use. Importers should also ensure that the product bears the CE marking and the identification of the manufacturer or authorized representative. In sectors like medical devices, importers must also register themselves in the EU database (EUDAMED) when it is fully operational.

Integrators and Deployers

Integrators who combine multiple components into a system may become the manufacturer if they place the integrated system on the market under their own name. They must ensure that the components used are compliant and that the integrated system meets the relevant requirements. Deployers (users) are generally not responsible for conformity assessment, but they have obligations under the AI Act to use high-risk AI systems in accordance with the instructions for use, monitor for risks, and ensure human oversight. Deployers may also have obligations under sectoral rules, such as those governing clinical investigations or the use of medical devices, which may trigger additional compliance steps.

Conformity Assessment Procedures and Modules

EU legislation uses a set of harmonized modules to define how conformity is demonstrated. The most common include:

  • Module A (Internal production control): The manufacturer assesses conformity based on their own technical documentation and production controls. This is typical for low-risk products and, by analogy, for many AI systems under the AI Act.
  • Module B (EU-type examination): A notified body examines the design and issues an EU-type examination certificate. This is required for certain higher-risk products and is always combined with a production module (C, D, E, or F).
  • Module C (Conformity to type based on internal production control): The manufacturer ensures production conforms to the approved type.
  • Module D (Conformity to type based on quality assurance of the production process): A notified body approves the manufacturer’s production quality system and audits it periodically.
  • Module E (Conformity to type based on product quality assurance): A notified body approves the quality system for final product inspection and testing.
  • Module F (Conformity to type based on product verification): A notified body verifies each product against the approved type.
  • Module G (Conformity based on unit verification): A notified body verifies each unit individually.
  • Module H (Conformity based on full quality assurance): A notified body assesses the manufacturer’s complete quality system, covering design as well as production.

For AI systems, the AI Act does not prescribe these modules but references the NLF procedures where applicable. If a high-risk AI system is a safety component of a product that requires third-party assessment, the product’s conformity assessment module will determine the involvement of a notified body. For standalone high-risk AI systems that are not safety components of a product, the AI Act generally allows assessment based on internal control (analogous to Module A), but requires the provider to follow a robust conformity assessment process, including risk management, data governance, technical documentation, record-keeping, transparency, and a quality management system. The provider may also opt for third-party assessment voluntarily, which can be useful for market confidence or contractual requirements.
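
To make the routing concrete, the following minimal Python sketch encodes the decision logic described above. The class name, function name, and simplified inputs are illustrative assumptions for orientation only; the actual route depends on the specific legal act, annex, and classification rules.

    from dataclasses import dataclass

    @dataclass
    class AISystem:
        # Simplified, illustrative description of a system for routing purposes.
        is_high_risk: bool                  # e.g., listed in Annex III
        is_safety_component: bool           # embedded in a product under other EU law
        product_needs_third_party: bool     # underlying product regulation requires it
        voluntary_third_party: bool = False

    def assessment_route(system: AISystem) -> str:
        # Returns the route suggested by the text above; a simplification only.
        if not system.is_high_risk:
            return "no conformity assessment obligation (other duties may still apply)"
        if system.is_safety_component and system.product_needs_third_party:
            return "follow the product legislation's module; notified body involved"
        if system.voluntary_third_party:
            return "voluntary third-party assessment by a conformity assessment body"
        return "internal control (self-assessment) and EU Declaration of Conformity"

    # A standalone Annex III system with no underlying product regulation:
    print(assessment_route(AISystem(is_high_risk=True,
                                    is_safety_component=False,
                                    product_needs_third_party=False)))
    # -> internal control (self-assessment) and EU Declaration of Conformity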

Self-Declaration vs Third-Party Certification

Self-declaration is not a lesser standard; it is a legal declaration backed by evidence. The provider must follow the conformity assessment process, compile technical documentation, and issue an EU Declaration of Conformity. The declaration must identify the legal act, the provider’s details, and the notified body (if involved). Self-declaration shifts the burden of proof to the provider. Market surveillance authorities can request evidence at any time, and failure to provide adequate documentation can lead to withdrawal or prohibition of the AI system.

Third-party certification involves a notified body that is independent and accredited. The notified body assesses the provider’s technical documentation and quality management system and issues certificates. This is common for high-risk medical devices, machinery with safety components, and certain industrial products. For AI, third-party involvement is limited to cases where the underlying product regulation requires it or when the provider voluntarily chooses it. The AI Act also introduces the possibility for conformity assessment bodies to assess certain aspects of high-risk AI systems, such as the quality management system, if the provider opts for this route.

Technical Documentation: What Evidence Matters

Technical documentation is the core evidence of conformity. It must be comprehensive, up-to-date, and accessible to market surveillance authorities. The documentation should be structured to demonstrate compliance with each relevant requirement. For AI systems, the AI Act specifies the content of technical documentation, which includes:

  • General description of the AI system, including its intended purpose, provider details, and deployment contexts.
  • Elements of the AI system and its development process, including methodologies and techniques.
  • State of the art and relevant harmonized standards or common specifications.
  • Risk management system and measures implemented to address risks.
  • Data governance and data management measures, including data sources, preprocessing, and bias mitigation.
  • Information to be provided to users, including instructions for use and transparency information.
  • Details on the quality management system, if applicable.
  • Records of post-market monitoring and serious incident reporting.

For products subject to the NLF, technical documentation typically includes design and manufacturing information, risk assessment, test reports, declarations of conformity, and instructions for use. For software and AI, documentation should also include model cards, data sheets, version control, change logs, and evidence of validation and verification activities. The documentation should be organized to allow an auditor or market surveillance authority to trace requirements to evidence.
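
One practical way to keep documentation auditable is a requirement-to-evidence traceability matrix. The sketch below shows the idea in Python; the requirement identifiers, file paths, and field names are hypothetical examples, not a prescribed format.

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class Requirement:
        # One legal requirement traced to its supporting evidence (illustrative).
        ref: str                            # hypothetical internal identifier
        description: str
        standard: Optional[str] = None      # harmonized standard relied on, if any
        evidence: List[str] = field(default_factory=list)  # paths to documents

    matrix = [
        Requirement("AIA-RM-01", "Risk management system established",
                    evidence=["docs/risk_mgmt_plan.pdf", "docs/risk_register.xlsx"]),
        Requirement("AIA-DG-01", "Data governance and bias mitigation measures",
                    evidence=["docs/datasheet_training_data.md"]),
        Requirement("AIA-TD-01", "Instructions for use drafted"),  # no evidence yet
    ]

    # A gap report that an internal auditor can rerun before any submission:
    for req in matrix:
        status = "OK" if req.evidence else "MISSING EVIDENCE"
        print(f"{req.ref}: {status} - {req.description}")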

Standards and Common Specifications

Harmonized standards published by the European standardization organizations (CEN, CENELEC, ETSI) provide a presumption of conformity with the relevant legal requirements. For AI, the European Commission has requested the European standardization bodies to develop harmonized standards in support of the AI Act, including for risk management, data governance, and quality management. Until harmonized standards are cited in the Official Journal, providers can use common specifications adopted by the Commission or international standards, but they must demonstrate that their approach meets the legal requirements. It is important to monitor the Official Journal of the EU for the list of harmonized standards and to maintain a standards mapping document.

Quality Management Systems

For high-risk AI systems, the AI Act requires a quality management system that covers design, development, testing, and post-market activities. This can be integrated with existing ISO 9001 or sector-specific QMS (e.g., ISO 13485 for medical devices). The QMS should include procedures for risk management, data management, design and development controls, change management, supplier management, and post-market surveillance. It should also address documentation control and record retention. For organizations that already have a QMS for medical devices or machinery, the AI QMS can be an extension or module within that system.

Post-Market Surveillance and Vigilance

Post-market surveillance (PMS) is the systematic process of collecting and analyzing experience gained from products on the market. For AI systems, PMS should include monitoring performance, user feedback, drift detection, incident reporting, and periodic reviews. The AI Act requires providers to implement a PMS system and to report serious incidents to market surveillance authorities. For medical devices, vigilance reporting covers serious incidents and field safety corrective actions, and other sectoral acts, such as the Machinery Regulation, impose their own post-market and reporting obligations. The PMS plan should define data sources, metrics, reporting cadence, and responsibilities.
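
Drift detection is one PMS activity that lends itself to automation. As an illustration only, the sketch below computes the population stability index (PSI), a common heuristic for input drift; the 0.2 threshold is a conventional rule of thumb, not a regulatory value, and the function name and data are invented for the example.

    import numpy as np

    def population_stability_index(expected, observed, bins=10):
        # Compare a feature's deployment distribution (observed) against its
        # validation-time distribution (expected). Rule of thumb: < 0.1 stable,
        # 0.1-0.2 moderate shift, > 0.2 investigate. Heuristic only.
        edges = np.histogram_bin_edges(expected, bins=bins)
        e, _ = np.histogram(expected, bins=edges)
        o, _ = np.histogram(observed, bins=edges)
        e = np.clip(e / e.sum(), 1e-6, None)   # proportions, avoiding log(0)
        o = np.clip(o / o.sum(), 1e-6, None)
        return float(np.sum((o - e) * np.log(o / e)))

    rng = np.random.default_rng(0)
    baseline = rng.normal(0.0, 1.0, 10_000)     # validation-time feature values
    production = rng.normal(0.3, 1.0, 10_000)   # shifted values seen in service
    psi = population_stability_index(baseline, production)
    if psi > 0.2:
        print(f"PSI={psi:.3f}: drift detected, escalate per the PMS plan")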

AI Act Specifics: Conformity Assessment in Practice

The AI Act introduces a risk-based approach. Most AI systems fall outside the conformity assessment obligations altogether, but high-risk AI systems are subject to strict requirements. A high-risk AI system is one listed in Annex III (e.g., biometrics, critical infrastructure, employment, education, essential services, law enforcement, migration, administration of justice) or a safety component of a product covered by other EU legislation (e.g., medical devices, machinery, vehicles). For high-risk AI systems that are not safety components of a product, the provider generally follows a conformity assessment procedure based on internal control, i.e., self-assessment, but must meet all requirements, including risk management, data governance, technical documentation, transparency, human oversight, accuracy, robustness, and quality management. A notable exception concerns certain biometric systems, where a notified body must be involved unless the provider has applied the relevant harmonized standards in full. Providers may also choose to involve a conformity assessment body voluntarily.

For high-risk AI systems that are safety components of products subject to other legislation, the conformity assessment of the product (and thus the AI component) follows the procedures of that legislation, which may require a notified body. For example, a high-risk AI system embedded in a medical device will be assessed under the MDR, which may require a notified body for the device and the AI component. The provider of the AI system must provide the necessary evidence to the device manufacturer. In practice, this means AI providers must understand the product ecosystem and coordinate with OEMs to ensure the AI component’s conformity is integrated into the product’s conformity assessment.

Conformity Assessment Bodies and Notified Bodies

Conformity assessment bodies (CABs) are entities that assess conformity for specific sectors. For products under the NLF, bodies notified to the Commission are called notified bodies. The AI Act applies the same model to AI: Member States designate notifying authorities, which assess and notify bodies based on accreditation and competence, and the AI Act itself lays down the requirements and obligations these bodies must meet. In practice, organizations should verify the scope of a notified body’s designation to ensure it covers the relevant legal act and technical area.

CE Marking and EU Declaration of Conformity

The CE marking indicates that the product complies with all applicable EU harmonization legislation that provides for its affixing. For AI systems embedded in physical products, the CE marking is applied to the product in which the AI is embedded; for AI systems provided only digitally, the AI Act allows a digital CE marking. The EU Declaration of Conformity is a formal document that must be signed by the manufacturer or authorized representative and include:

  • Identification of the product and the legal acts it complies with.
  • Name and address of the manufacturer and, if applicable, authorized representative.
  • Statement of conformity.
  • Information on the notified body (if involved).
  • Date and place of issue, and signature.

The declaration must be made available to authorities upon request and, for certain products, accompany the product. For AI systems, the declaration should reference the AI Act and any other applicable legislation (e.g., GDPR for data processing, Data Act for data sharing, Machinery Regulation for safety components).
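
Because the mandatory contents are a fixed list, a simple completeness check can catch omissions before the declaration is issued. The sketch below is a minimal example; the field names and placeholder values are our own, not terms taken from the legislation.

    REQUIRED_FIELDS = [
        "product_identification", "legal_acts", "manufacturer",
        "statement_of_conformity", "place_and_date", "signature",
    ]

    def missing_fields(declaration: dict) -> list:
        # Returns the mandatory fields that are absent or empty (illustrative).
        return [f for f in REQUIRED_FIELDS if not declaration.get(f)]

    draft = {
        "product_identification": "ExampleAI scoring system, v2.1",   # placeholder
        "legal_acts": ["Regulation (EU) 2024/1689 (AI Act)"],
        "manufacturer": "Example GmbH, Berlin",                       # placeholder
        "notified_body": None,   # internal control route: none involved
    }

    print(missing_fields(draft))
    # -> ['statement_of_conformity', 'place_and_date', 'signature']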

National Implementation and Cross-Border Considerations

While EU regulations are directly applicable, national implementation can affect market surveillance, notified body designation, and enforcement practices. Member States designate market surveillance authorities and notified bodies, and their practices may vary. For example, Germany’s Federal Institute for Drugs and Medical Devices (BfArM) and the Central Office for the Supervision of Medical Devices (ZLG) play key roles in medical device oversight, while France’s ANSM handles vigilance and market surveillance. The Netherlands’ Healthcare and Youth Inspectorate (IGJ) focuses on healthcare contexts. For AI, national authorities will enforce the AI Act, with coordination through the European AI Office and the AI Board.

Organizations operating across multiple jurisdictions should establish a single point of contact for EU compliance, maintain a regulatory matrix mapping products to legal acts and national authorities, and ensure that their technical documentation is accessible in the languages required by Member States. For medical devices, EUDAMED will centralize registration, but until full deployment, national databases are used. For machinery and other products, national databases may exist for incident reporting or registration.
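
A regulatory matrix can be as simple as a structured lookup from product to applicable acts, assessment route, and national contacts. The entries below are illustrative placeholders to be verified for each real product, not a complete or authoritative mapping.

    # Hypothetical regulatory matrix; every entry is a placeholder to be
    # verified against the actual acts and authorities for each product.
    regulatory_matrix = {
        "diagnostic-ai-module": {
            "legal_acts": ["MDR (EU) 2017/745", "AI Act (EU) 2024/1689", "GDPR"],
            "assessment": "notified body (via the device manufacturer)",
            "authorities": {"DE": "BfArM", "FR": "ANSM", "NL": "IGJ"},
            "doc_languages": ["de", "fr", "nl", "en"],
        },
        "hr-screening-tool": {
            "legal_acts": ["AI Act (EU) 2024/1689, Annex III", "GDPR"],
            "assessment": "internal control (self-assessment)",
            "authorities": {},   # AI Act surveillance authorities, once designated
            "doc_languages": ["en"],
        },
    }

    def authority_for(product: str, member_state: str) -> str:
        return regulatory_matrix[product]["authorities"].get(member_state, "unknown")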

Comparative Approaches Across Europe

Germany tends to have rigorous enforcement and a strong culture of documentation and quality management, aligned with its industrial base. France emphasizes clinical evidence and post-market surveillance for medical devices and has a proactive approach to AI oversight through national strategies. The Netherlands is pragmatic and risk-based, with a focus on healthcare safety and effective supervision. The UK (post-Brexit) has its own UKCA marking and regulatory regime for products and AI, but for EU market access, compliance with EU regulations remains necessary. For AI, the UK’s approach is similar in risk-based principles but diverges in institutional arrangements and timelines.

Practical Preparation: A Step-by-Step Approach

Organizations should approach conformity assessment as a structured program rather than a one-off project. The following steps provide a practical roadmap:

  1. Map products and systems to legal acts: Identify whether the product or AI system falls under the AI Act, MDR, IVDR, Machinery Regulation, Radio Equipment Directive, Data Act, GDPR, or other legislation. Determine risk classifications and whether third-party assessment is required.
  2. Ident