Fighting financial crime, meeting consumers’ needs, and bolstering wholesale markets are the regulator’s key priorities for the year ahead.

By Rob Moulton, Nicola Higgs, Becky Critchley, and Charlotte Collins

On 19 March 2024, the FCA published its Business Plan for 2024/25, setting out its priorities for the year ahead. While the Business Plan now takes on less significance than it did historically given other publications in circulation such as the FCA’s 3-year Strategy and the Regulatory

This annual publication explores some of the core focus areas for UK-regulated financial services firms in the year ahead. 2023 saw significant progress on the regulatory reform agenda, and many measures consulted on or reviewed as part of the Edinburgh Reforms will be finalised and/or implemented in the course of 2024.

We also saw the passing of the Financial Services and Markets Act 2023, many provisions of which have already come into effect and have made important changes to the

As regulatory thinking evolves, firms must ensure that any current or planned use of AI complies with regulatory expectations.

By Fiona M. Maclean, Becky Critchley, Gabriel Lakeman, Gary Whitehead, and Charlotte Collins

As financial services firms digest FS2/23, the joint Feedback Statement on Artificial Intelligence and Machine Learning issued by the FCA, Bank of England, and PRA (the regulators), and the UK government hosts the AI Safety Summit, we take stock of the government and the regulators’ thinking on AI to date, discuss what compliance considerations firms should be taking into account now, and look at what is coming next.

The FCA recently highlighted that we are reaching a tipping point at which the UK government and sectoral regulators must decide how to regulate and oversee the use of AI. Financial services firms will need to track developments closely to understand the impact they may have. However, the regulators have already set out how numerous areas of existing regulation are relevant to firms’ use of AI, so firms also need to ensure that any current use of AI is compliant with the existing regulatory framework.

A new publication from the UK’s financial regulator signals to firms that they should take steps to manage risks in the use of AI.

By Stuart Davis, Fiona M. Maclean, Gabriel Lakeman, and Imaan Nazir

The UK’s Financial Conduct Authority (FCA) has published its latest board minutes, which highlight its increasing focus on artificial intelligence (AI) and record that the board “raised the question of how one could ‘foresee harm’ (under the new Consumer Duty), and also give customers appropriate disclosure, in the context of the operation of AI”. This publication indicates that AI continues to be a key area of attention within the FCA. It also demonstrates that the FCA believes its existing powers and rules already impose substantive requirements on regulated firms considering deploying AI in their services.

FCA Chair hints that new regulation addressing data ethics in the FinTech space may be on the horizon.

By Nicola Higgs, Fiona Maclean, and Terese Saplys

Will societies of the future be ruled by algocracy, in which algorithms decide how humans are governed? Charles Randell, Chair of the Financial Conduct Authority (FCA) and Payment Systems Regulator, addressed how to avoid this hypothetical scenario in a broad-ranging speech that he delivered on 11 July 2018 in London.

Randell’s Remarks

Contributing Factors to an Algocracy

According to Randell, the following three conditions could collectively give rise to a future algocracy:

  • A small number of major corporations holding the largest datasets for a significant number of individuals (as is currently the case)
  • Continuing vast and rapid improvements in artificial intelligence and machine learning that allow firms to mine Big Data sets with greater ease and speed
  • Further developments in behavioural science allowing firms to target their sales efforts by exploiting consumers’ decision-making biases