Just in time for "go live" day, the Government issued an updated Procurement Policy Note (PPN 017) on "Improving the Transparency of AI Use in Procurement". PPN 017 updates the previous guidance issued in March 2024 and reiterates best practice for identifying and managing the risks and opportunities associated with the use of AI in procurement. The recent updates reflect new terminology in line with the Procurement Act 2023 but do not introduce any policy changes. Nevertheless, this has got us thinking about the interaction between AI and public procurement.

AI-generated documents

Tender documents and submissions are lengthy, technical, and resource-intensive to prepare. AI platforms can significantly streamline the preparation of these documents for contracting authorities and bidders alike.

The risks of AI in generating tender documents and responses are widely known: outputs can be overly generic and not project-specific, bidder responses may overpromise where the tool does not have enough background information, and bidders using the same platform could end up with substantially similar responses.

However, the PPN messaging is clear and realistic. The use of AI by bidders in the procurement process is not prohibited, but it should only be applied where risks can be effectively understood and managed by the contracting authority.

Procurement practitioners will not be able to stop bidders from using AI. Indeed, the question of whether they should be trying to do so is a legitimate one. AI may level the playing field for SMEs that perhaps do not have the funds to engage specialist bid teams to professionalise their responses. Sophisticated AI platforms may go some way to filling this gap.

To mitigate the risk of AI in generating tender submissions, we recommend that contracting authorities require bidders to declare where they have used AI in responses (for example, listing which questions were AI-supported and how – was the AI used as a mere spell checker or did it play a larger role in planning/drafting responses?). This will at least allow contracting authorities to identify where AI is relied upon for the purposes of conducting enhanced due diligence on the responses to AI-generated questions.

The PPN does not give as firm a view on the use of AI by contracting authorities. For already resource-stretched contracting authorities, any time-saving efficiencies are likely a welcome addition. It is, however, essential to maintain human supervision to ensure precision, relevance, and legal compliance.

Decision making and evaluation

The potential efficiency savings of AI are not limited to the generation of tender documents. There is a use case for AI in the evaluation stage. PPN 017, however, is mostly concerned with AI for bid generation and provides little guidance on its use in evaluation.

At this stage in the evolution of AI products, we recommend exercising caution. Typically, AI platforms are generative Large Language Models (LLMs). LLMs are trained to predict a "statistically plausible" response based on the information pool they draw from. To effectively use this type of platform to evaluate qualitative responses, the AI platform would need to be properly trained using relevant data on how to apply the scoring matrix and evaluation methodology. Even with proper training, contracting authorities would need to have in place appropriate checks and balances to ensure the tool is taking into account appropriate considerations and disregarding irrelevant considerations.

In the world of section 50 assessment summaries, contracting authorities are under clear instructions to provide reasons for scores. Those reasons may be difficult to articulate to bidders where practitioners using the AI platform lack the technical know-how to understand how a score was reached, and so cannot provide meaningful reasoning behind it.

Confidentiality and data security

A broader issue with the use of AI in procurement procedures is data security and confidentiality. Most tender documents will include some form of personal data and commercially sensitive information. If sensitive data is entered into an AI system without proper safeguards, there is a danger it could become publicly accessible. This is not only a risk for contracting authorities. Bidders should also be wary of uploading their own tender responses to AI, which often includes personal data in supplier personnel CVs and commercially sensitive pricing information.

PPN 017 does highlight this risk and states that it may be necessary to put in place "proportionate controls to ensure bidders do not use confidential contracting authority information, or information not already in the public domain, as training data for AI systems".

The extent of the risk that AI systems are trained using sensitive data will depend on the type of platform that is used (for example, whether it is a closed offline platform or an open online platform). However, it is unlikely that the technical capabilities of contracting authority and bidder procurement teams extend to being able to easily distinguish whether data has been used as "training data" or not. It may even require close inspection of the terms and conditions of the AI tool's provider to understand how uploaded data will be used.

Contracting authorities can mitigate this risk by clearly setting out to bidders what information is prohibited from being uploaded into AI. If the information is not uploaded, it cannot be used as "training data". This avoids straying too far into the technical detail.

Conclusion

There are risks associated with the use of AI in procurement procedures, but they are not necessarily insurmountable. Practitioners should exercise caution and remain mindful of these key issues in their approach.