The rapid integration of artificial intelligence (AI) into public and private sectors has prompted a pressing need for robust frameworks to govern its procurement and deployment.

In the European Union, this need has been met, in part, by the development of the EU AI Model Contractual Clauses (the Clauses), which are a set of standardised terms designed to support public buyers in procuring AI systems that are trustworthy, fair, and secure. 

For UK legal practitioners, understanding the Clauses is increasingly relevant, particularly in the post-Brexit landscape where cross-border procurement and regulatory alignment remain pertinent. This article explores the history of the Clauses, the changes introduced in their latest iteration, and their applicability and value beyond the EU and in the UK.

History and Context

The EU AI Model Contractual Clauses emerged from a collaborative effort rooted in the EU’s broader strategy to regulate AI responsibly, aligning with the European Commission’s proposed Artificial Intelligence Act (AI Act), first tabled in April 2021. The AI Act, designed to establish harmonised rules for AI systems based on a risk-based approach, underscored the need for procurement tools to operationalise its principles.

In April 2023, the initial draft of the Clauses was released by the Public Buyers Community, a platform supported by the European Commission’s Directorate-General for Internal Market, Industry, Entrepreneurship, and SMEs (DG GROW). Over 40 experts from public sector bodies, academia, and industry participated in the review of the Clauses, refining them to address transparency, explainability, auditing rights, and accountability. The first public version was published in September 2023, accompanied by translations into all EU languages by October 2023.

The adoption of the AI Act on 13 June 2024 marked a significant milestone, prompting an update to the Clauses. The revised version of the Clauses, released on 5 March 2025 and accessible via the Public Buyers Community platform, reflects the finalised AI Act’s requirements, ensuring alignment with this landmark legislation.

Changes in the updated Clauses

Following feedback from the 2023 pilot phase, the updated Clauses incorporate several key refinements, including:

  • Alignment with the AI Act: the Clauses now fully reflect the AI Act’s risk-based framework, particularly its classification of AI systems as "high-risk" or "non-high-risk", as defined in Article 6 and Annexes I and III. This supports compliance with the mandatory requirements for high-risk systems, such as risk management and human oversight;
  • Enhanced Flexibility: a "light version" for non-high-risk AI systems has been introduced alongside the comprehensive high-risk version, offering customisable options to suit varying procurement needs;
  • Detailed Commentary: a new explanatory note accompanies the Clauses and enhances usability for public buyers by providing practical guidance on the application of the Clauses and instructions for reporting use cases; and
  • Pending Translations: while the 2023 version was translated into all EU languages, the updated Clauses are still being translated, with multilingual versions expected soon.

The Clauses were issued by the Public Buyers Community, an initiative of DG GROW, in collaboration with the Directorate-General for Communications Networks, Content and Technology (DG CNECT), Living-in.EU, and Pels Rijcken. This multi-stakeholder effort reflects a commitment to pooling expertise from legal, technical, and procurement domains to create a practical tool for public authorities across the EU.

Scope of Coverage

The Clauses focus on AI-specific provisions aligned with the AI Act, deliberately excluding broader contractual elements such as intellectual property, payment terms, delivery schedules, applicable law, or liability. They are designed as a modular schedule to be appended to existing agreements, covering in particular:

  • High-Risk AI Systems: mandatory requirements for systems listed in the AI Act’s Annexes, including risk management, data governance, transparency, and cybersecurity;
  • Non-High-Risk AI Systems: recommended provisions that enhance trustworthiness but are not legally mandated under the AI Act; and
  • Key Principles: trustworthiness, fairness, security, transparency, and accountability, tailored to the procurement of AI from external suppliers.

Notably, the Clauses do not address obligations under other EU laws, such as the General Data Protection Regulation (GDPR), requiring buyers to integrate these separately.

The Clauses are intended for voluntary use by public organisations procuring AI systems from external suppliers. They are particularly relevant in the following circumstances:

  • Post-AI Act Enforcement: with the AI Act adopted on 13 June 2024, the Clauses are immediately applicable for EU public buyers seeking compliance;
  • Pilot and Operational Phases: the Clauses can be used in pilot projects or fully operational deployments, with feedback encouraged via the Public Buyers Community; and
  • Custom Contexts: their modular nature allows for use in diverse procurement scenarios, provided they are tailored to specific needs.

Legal status

The Clauses are not legally binding in and of themselves. They are a voluntary tool, not an official EU legislative document, and their use is discretionary. Public organisations must assess their sufficiency and proportionality on a case-by-case basis, adapting them to comply with the AI Act and other applicable laws (e.g., GDPR).

As the AI Act’s technical standards and guidance evolve, the Clauses may require further updates, but they currently serve as a practical bridge between regulation and procurement practice.

To use the Clauses effectively, public organisations must adapt them to their specific contractual context and integrate them into broader agreements that address standard terms (e.g., payment, liability, termination).

Organisations should conduct a risk assessment to determine whether the AI system is classified as "high-risk" or "non-high-risk" under the AI Act, as this dictates which version of the Clauses to incorporate. Organisations must also ensure compliance with overlapping regulations such as the GDPR and supplement the Clauses as necessary.

Value outside the EU and in the UK

For UK legal practitioners and public authorities, the Clauses still hold significant value despite the UK’s departure from the EU.

Post-Brexit, the UK is developing its own "pro-innovation" AI regulatory framework, but alignment with EU standards remains critical for cross-border trade and procurement. The Clauses offer a benchmark for trustworthy AI procurement that UK entities can adapt. From a procurement perspective, the Clauses’ focus on transparency and accountability aligns with UK public procurement principles and dovetails with the commencement of the Procurement Act 2023, offering a ready-made tool for public authorities procuring AI.

The significance of the Clauses is not limited to public authorities; they are also relevant to the private sector. UK businesses supplying AI to EU public bodies must comply with the AI Act, making familiarity with the Clauses essential. Private organisations procuring AI may also use them as a best-practice guide.

In practice, UK entities might customise the Clauses to reflect domestic laws (e.g., the Data Protection Act 2018) while retaining their AI-specific provisions. The voluntary nature of the Clauses makes them a flexible resource, particularly for UK suppliers navigating EU markets.

Value Add?

The EU AI Model Contractual Clauses represent a pioneering effort to translate the AI Act’s ambitions into actionable procurement tools. They reflect a collaborative, iterative process aimed at encouraging transparency and fostering responsible AI adoption.

For UK legal practitioners, they offer both a window into EU regulatory expectations and a practical framework adaptable to domestic needs. As AI continues to reshape public services, the Clauses provide a critical starting point for ensuring trust and accountability in procurement, bridging the gap between innovation and regulation in an increasingly interconnected world.