CBDC: an opportunity to re-think AML regulations?


On 6th December 2023, the Digital Pound Foundation hosted an in-person roundtable consisting of a selection of our Privacy and Identity Working Group members and external participants. The purpose of the roundtable was to share our thinking around the implementation of AML regulations in the digital age and gather feedback from the external participants. This paper was produced based upon those discussions.

A recent report from LexisNexis estimated that UK financial institutions spend over £30 billion per year complying with money laundering regulations – that’s 1% of UK GDP. And yet, money laundering appears to be as prevalent as ever with the National Crime Agency stating that it is possible that ‘hundreds of billions of pounds’ might be laundered in the UK each year.

This workshop discussed whether there is a structural problem with the way we attempt to detect and prevent money laundering and criminal finance today, and whether a digital Pound could be introduced with a different approach that would be more effective by design. Central to the discussion is how the identity of people wishing to open and use a digital Pound account would be established whilst safeguarding the privacy of their personal data.  

The workshop focused on a thought experiment: what if Payment Interface Providers (PIPs) were not responsible and liable for implementing compliance? Instead, in order to have a Central Bank Digital Currency (CBDC) account, a customer would need to provide personal data to a new type of organisation – a Compliance Service – whose sole objective would be to detect and prevent illicit finance. The workshop discussed:

  • The data sharing that would be required 
  • The regulations within which the data processing would happen
  • The customer integrity and confidentiality considerations
  • The technologies that could be deployed. 

A brief synopsis of the proposal and the workshop discussions is provided below along with the proposed next steps.

A structurally better way to detect and prevent criminality?

Today’s approach to the prevention of money laundering and criminal finance dates back to an agreement in 1989 through which the Financial Action Task Force was established. Much has changed since then. In the UK there have been multiple revisions to the anti-money laundering (AML) regulations, each one leading to costs for the circa 450 financial institutions and other organisations that must apply them.

But are financial institutions well placed to detect criminality? They cannot see ‘the whole picture’: they can only assess a customer on the data that he or she presents to them and the transactional data to which they have access. They can buy in services from specialist third-party organisations, but data sharing laws mean they will always have a blinkered view. As a result, they must restrain their commercial motivations in order to meet their obligations to wider society. For a financial institution, questioning whether a potentially lucrative new customer is presenting the full facts can be an awkward opening to a relationship, particularly with new and innovative companies wishing to achieve market share.

Financial institutions have long been judged a special case because of the fundamental role they play in our economy: the money we use in the vast majority of transactions is ‘private’ money – essentially liabilities of commercial organisations – not ‘public’ money like cash, where the liability sits with the Bank of England. As a consequence, banking licences are hard to obtain and bank directors carry great responsibilities. But with a CBDC, this logic changes and the approach to regulation should change too.

The bad actors are often way ahead. It takes time for information about new attack vectors to be disseminated. Those who use the payments system to launder money will often exploit ‘the weakest link’ long before legally appropriate data sharing agreements can be put in place to prevent them: there are over 21,500 firms supervised for compliance with money laundering regulations by the Financial Conduct Authority alone. 

Today, financial institutions are responsible for prevention of money laundering.

Under a digital Pound, would it not be better to move responsibility for the prevention of money laundering from the providers of relatively dumb (but secure) digital wallets – the Payment Interface Providers (PIPs) – to objective, independent entities specialised in and focused on conducting the necessary checks? Such entities – let’s call them Compliance Services – would have one purpose only: the detection and prevention of money laundering and criminal finance in a manner that meets regulatory requirements. This purpose can be split into two separate but closely linked functions: ongoing Customer Due Diligence to know who is operating the account, and Transaction Monitoring to monitor how the account is being used.

Prevention and detection of money laundering would be done more effectively.
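
To make the split between the two functions concrete, the sketch below expresses the Compliance Service as a minimal interface. It is purely illustrative – the class, method and field names are our own assumptions, not anything proposed at the workshop.

```python
from dataclasses import dataclass
from enum import Enum


class Decision(Enum):
    APPROVED = "approved"
    REFERRED = "referred"   # needs human review
    BLOCKED = "blocked"


@dataclass
class CustomerProfile:
    customer_id: str   # pseudonymous reference shared with the PIP
    verified: bool     # outcome of Customer Due Diligence
    risk_rating: str   # e.g. "low" / "medium" / "high"


class ComplianceService:
    """Hypothetical interface: one purpose only - detect and prevent illicit finance."""

    def perform_customer_due_diligence(self, identity_evidence: dict) -> CustomerProfile:
        """Verify who is operating the account against trustworthy sources."""
        raise NotImplementedError

    def monitor_transaction(self, customer_id: str, transaction: dict) -> Decision:
        """Assess how the account is being used and return a decision to the PIP."""
        raise NotImplementedError
```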

What might the benefits of this approach be? 

Customer convenience: 

  • Customers would need to provide and verify identity information to the Compliance Service only
  • Only the necessary subset of data items would be shared by the Compliance Service with a PIP
  • The data received by a PIP would have been pre-verified and so could be trusted at face value without asking the customer for more data.

Privacy: 

  • There would be no secondary purpose for the Compliance Service so personal data would not be commercialised in ways that might concern the customer
  • The PIP might not need to know very much at all about the customer.

Effectiveness: 

  • The Compliance Service would have the authority to share data with other organisations in order to fulfil its purpose 
  • Compliance Services would be under the supervision of a distinct regulatory body
  • There would be far fewer organisations for the regulatory body to oversee with regard to money laundering and criminal finance than we have today
  • PIPs would focus on their commercial goals: serving their customers’ needs better than their competitors do.

What data would be shared, by whom and with whom?

The customer would have Data Sharing Agreements with both the Compliance Service and the PIP:

  • The Compliance Service would share data with the PIP at the opening of the account and perhaps relevant updates when necessary. Potentially this data sharing might be quite minimal.
  • The PIP would share transaction level data with the Compliance Service as needed to perform Transaction Monitoring effectively.

It is important to distinguish the different types of data and where they are being processed:

  • Customer data: the identity information, address and contact details that are provided by the customer which the Compliance Service would need to verify against trustworthy sources. In some circumstances ‘source of funds’ information would be required. For businesses, it would be necessary to verify further information to meet regulatory requirements. 
  • Financial transaction data: this information is controlled by the authorising customer and would be created by the PIP when a transaction is conducted. It describes the facts about a payment: payer, payee, date, time, amount, reference, etc.
  • Data about transactions: this type of information is also recorded by the PIP: the location of the device authorising the payment, the method of authentication, etc.
  • Derived data: this is created from analysis of the data about the transactions. The analysis conducted depends upon the objective. A PIP will have interest in using the analysis to improve its service. A Compliance Service will be looking for patterns of payment that might detect illegal activity.
  • Aggregated data: summary data about payments will be used by PIPs to manage liquidity.
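
To make the distinction between these categories concrete, a minimal sketch follows. The field names are assumptions chosen for illustration, not a proposed schema.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional


@dataclass
class CustomerData:
    """Provided by the customer; verified and held by the Compliance Service."""
    full_name: str
    address: str
    contact_email: str
    source_of_funds: Optional[str] = None  # only where circumstances require it


@dataclass
class FinancialTransactionData:
    """Created by the PIP when a payment is conducted; controlled by the authorising customer."""
    payer: str
    payee: str
    timestamp: datetime
    amount_pence: int
    reference: str


@dataclass
class DataAboutTransaction:
    """Also recorded by the PIP."""
    device_location: str
    authentication_method: str  # e.g. "PIN", "biometric"


# Derived data would be produced by analysing the records above: a Compliance
# Service looking for patterns that might indicate illegal activity, a PIP
# looking to improve its service. Aggregated data would be summary totals
# used by PIPs to manage liquidity.
```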

The Bank of England has stated that its Central Bank Core Ledger will not hold any personal details about accounts as its responsibilities are solely for monetary policy and financial stability. Its current plans are to maintain transaction level data for this purpose.  

Analysis is required of which elements of data should be shared between PIPs and Compliance Services – and between the community of Compliance Services themselves. However, differentiation of the objectives of each would help regulators and Data Protection Officers in determining what is appropriate. This in turn would give greater clarity to customers and help safeguard privacy.

How would this model fit within existing privacy law?

There are four main sources of (relevant) UK data privacy law:

  • At the overarching level, there is the Human Rights Act 1998 (HRA), implementing the European Convention on Human Rights. In particular, article 8 (the right to respect for private and family life) constrains the UK’s freedom to legislate in a way that disproportionately infringes privacy.

    (EU law establishes an overarching privacy principle more directly, through the EU Charter of Fundamental Rights, article 8 – right to the protection of personal data. This does not apply in UK law but, in practice, the HRA is likely to have a similar effect.)
  • The article 8 principle is implemented into UK law through the UK General Data Protection Regulation (UK GDPR), the post-Brexit equivalent to the EU GDPR. The UK GDPR is supplemented by the Data Protection Act 2018 (DPA). The UK GDPR and DPA regulate the processing of personal data both within and outside the financial services sector.
  • There is also specific regulation of the processing of personal data collected for AML / CTF / similar purposes – the UK Money Laundering, Terrorist Financing and Transfer of Funds (Information on the Payer) Regulations 2017 (AML Regs), which, as well as requiring data to be collected and retained for these purposes, also:
    • (regulation 41) prohibit processing of these data except for AML / CTF / similar purposes, as specifically permitted by law or with data subject consent; and
    • (regulation 40) require AML / CTF / similar records to be deleted after five years, with limited exceptions. 

These regulations, which pre-date Brexit, implement requirements of the EU anti-money-laundering regime.

Key UK data privacy principles, articulated in the UK GDPR, include:

  • All processing must have a lawful basis (AML processing is typically carried out based on the “required by law” or, in limited cases, the “legitimate interests” lawful basis – the latter requiring a careful balance to be struck between the interest in fighting financial crime and the privacy of the data subjects)
  • There must be transparency about the processing of personal data
  • Data processing, including sharing between organisations, must abide by a purpose limitation principle (here the UK GDPR principle is reinforced by AML Reg 41)
  • Data minimisation: personal data shared or otherwise processed must be minimised, and personal data must be deleted when they are no longer needed
  • Particularly tight restrictions on the processing of sensitive personal data, including data relating to actual or alleged criminal offences
  • Data subjects’ rights to access their personal data, to require them to be corrected, to object to their processing, etc.
  • Personal data security requirements

Ultimately, any AML model can be implemented under UK data protection law if it is consistent with the HRA requirement to respect private (and family) life. However:

  • Careful thought would need to be given to the proportionality of the proposed model, and particularly to the data sharing within the model, from the privacy perspective – in the final analysis, this is likely to be as much a political or public / user opinion issue as a legal one;
  • Since the proposed model involves:
    • new collection, analysis and sharing of very significant personal data, going beyond current AML arrangements, and
    • collection and sharing that would not be optional – i.e. a user would not be able to opt out of the collection and sharing of their data and still take full advantage of the digital pound,
    it would need to be underpinned by new legislation (or new regulatory rules), allowing it to go ahead based on the “required by law” lawful basis; and
  • Detailed consideration would then need to be given to each of the specific (but less fundamental) requirements of the UK GDPR and how they can be addressed within the model.

Customer integrity and confidentiality considerations

What is the customer’s perspective?

Customers would need to understand their separate relationships with their PIP and the Compliance Service, and how each one would share personal data with the other. There are four key elements for consideration:

Outsourcing 

Third-party risk would be an inherent part of the future-state model. In today’s parlance, the PIP is outsourcing functionality for Customer Due Diligence and Transaction Monitoring to the Compliance Service. The customer would need to understand the risks associated with such a data sharing relationship and how to seek redress in the event that a risk were realised.

Compliance Services would need to be certified against a set of clearly defined regulatory standards. As they would be commercial entities, there would need to be a market of different providers so as to give customers proper choice and ensure price competition. In the event that a particular Compliance Service were no longer financially viable, there would need to be procedures that could be enacted to ensure continuity of service. Customers would also need assurance that appropriate contractual arrangements are in place between PIPs and Compliance Services.

Data Sharing & Access

Fraud vectors change rapidly and the methods of detecting fraud need to be sufficiently agile in response. The Data Sharing Agreements between the organisations need to be structured on an appropriate lawful basis so as not to constrain the ability of each to perform its role effectively. Furthermore, these organisations would be bound by applicable privacy laws, mandating that the principles of data protection and relevant obligations are adhered to.

Data Security

Organisations would be obliged to implement appropriate Technical and Organisational Measures commensurate with the risk. As discussed below, a number of technologies can be deployed to protect privacy and prevent data breaches. The privacy principle of data minimisation assists here: the less personally identifiable information (PII) shared, the lower the risks from data breaches. 

User Control

Technology standards for digital identity are moving towards user-centric sharing of data, whereby the individual uses standardised protocols to permit the sharing of data through infrastructures with sufficient integrity that the receiving party can rely on the data at face value without further processing. This model would enable the permissioned sharing of the data needed by the Compliance Service for initial and perhaps ongoing Customer Due Diligence.

Open standards for user-controlled sharing of data, such as Verifiable Credentials, are emerging from the World Wide Web Consortium (W3C) and other sources.
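
For illustration, a verifiable credential under the W3C data model is essentially a signed, machine-readable document that a receiving party can verify cryptographically rather than re-checking the underlying facts. The sketch below shows the general shape only; the issuer, subject and credential type are hypothetical placeholders.

```python
# Illustrative shape of a W3C Verifiable Credential (all values are placeholders).
credential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential", "IdentityCredential"],   # hypothetical credential type
    "issuer": "did:example:compliance-service",               # hypothetical issuer identifier
    "issuanceDate": "2024-01-15T09:00:00Z",
    "credentialSubject": {
        "id": "did:example:customer-123",                     # hypothetical subject identifier
        "name": "A N Other",
        "address": "1 Example Street, London",
    },
    "proof": {
        "type": "Ed25519Signature2020",
        "created": "2024-01-15T09:00:00Z",
        "verificationMethod": "did:example:compliance-service#key-1",
        "proofValue": "...",                                  # issuer's signature (elided)
    },
}
```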

Which privacy enhancing technologies could be employed? 

Technologies support organisations in meeting their obligations to protect personal data under privacy laws. The workshop reviewed the mature and emerging technologies and standards that could be leveraged to enable the proposed model.

Technologies are required that can both prevent and detect illicit activity – and do so in a way that strikes the right balance between privacy and transparency – minimising data shared and maximising insight.

Prevention Technologies

The first and most obvious prevention technology is advanced electronic signatures, performed with a private key to which only the signatory has access. If the private key is held in a secure element on the user’s device then physical access to the device may be needed to perform a transaction. This technology is extremely mature and well proven, being deployed across the payments sector – especially Europay, MasterCard® and Visa® (EMV) contact and contactless payments. It ensures that transactions are strongly authenticated. The effectiveness of this technology is seen in the low fraud rates for EMV. In addition, when combined with strong identity verification, the identity of the person performing a transaction can be known with a high level of assurance.
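
As a minimal sketch of the underlying mechanism (using the Python cryptography library for illustration, not any particular payment scheme’s implementation), signing and verifying a payment instruction looks like this:

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# In practice the private key would be generated and held inside a secure
# element on the user's device; here it is created in software for illustration.
private_key = ec.generate_private_key(ec.SECP256R1())
public_key = private_key.public_key()

# The payment instruction the user authorises.
instruction = b"pay GBP 25.00 to merchant-4711, ref INV-001"

# Sign with the private key (ECDSA over SHA-256).
signature = private_key.sign(instruction, ec.ECDSA(hashes.SHA256()))

# Anyone holding the public key can confirm the instruction is authentic and
# unaltered; verify() raises InvalidSignature if either check fails.
public_key.verify(signature, instruction, ec.ECDSA(hashes.SHA256()))
print("signature verified")
```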

Emerging technologies and techniques such as Zero Knowledge Proofs (for example when performed with Verifiable Credentials) allow transactions that benefit from the same strong authentication and integrity as advanced electronic signatures, but allow the signatory to minimise the amount of personal data that is shared. For example, a customer could provide strong cryptographic evidence that they meet the age criteria for a transaction without giving any additional information away. This technology is relatively new but should be scalable and is being widely developed by the decentralised identity community.
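
A production zero-knowledge proof needs specialist cryptography, but the flavour of data minimisation it enables can be illustrated with a simpler salted-hash selective-disclosure scheme, in the spirit of verifiable credentials. The sketch below shows that simpler technique, not a true zero-knowledge proof: the issuer signs commitments to each attribute, and the holder reveals only the one attribute the transaction needs.

```python
import hashlib
import json
import secrets

from cryptography.hazmat.primitives.asymmetric import ed25519


def commit(name: str, value: str, salt: str) -> str:
    """Salted hash commitment to a single attribute."""
    return hashlib.sha256(f"{name}|{value}|{salt}".encode()).hexdigest()


# --- Issuer: verifies the attributes once and signs only their commitments ---
issuer_key = ed25519.Ed25519PrivateKey.generate()
attributes = {"name": "A N Other", "date_of_birth": "1990-01-01", "age_over_18": "true"}
salts = {k: secrets.token_hex(16) for k in attributes}
commitments = {k: commit(k, v, salts[k]) for k, v in attributes.items()}
signature = issuer_key.sign(json.dumps(commitments, sort_keys=True).encode())

# --- Holder: discloses only the attribute this transaction requires ----------
disclosed = {"age_over_18": (attributes["age_over_18"], salts["age_over_18"])}

# --- Verifier: checks the issuer's signature, then the disclosed attribute ---
issuer_public = issuer_key.public_key()
issuer_public.verify(signature, json.dumps(commitments, sort_keys=True).encode())
for name, (value, salt) in disclosed.items():
    assert commit(name, value, salt) == commitments[name]
print("age_over_18 verified without revealing name or date of birth")
```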

Homomorphic Encryption is an example of a less mature technology that could have some interesting applications in CBDC in the area of AML. The technology uses advanced cryptographic techniques to perform operations on encrypted data without needing to decrypt the data first. For example, an encrypted value (e.g. a passport number) could be compared with encrypted lists of values to determine if the value is present in the list. Techniques like this could allow data to be checked at other financial institutions without breaching privacy controls. Homomorphic encryption is however computationally intensive – so its practical use for AML is still to be proven.
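
To illustrate the principle, the sketch below uses a toy additively homomorphic Paillier scheme with deliberately small keys (illustration only, not production cryptography): a checking institution can test whether an encrypted value matches any entry on a watchlist while only ever handling ciphertexts, and the key holder learns only whether there was a match.

```python
import math
import secrets

# --- Toy Paillier key generation (small primes chosen for illustration only) ---
p, q = 1_000_003, 1_000_033
n, n2 = p * q, (p * q) ** 2
g = n + 1
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)


def encrypt(m: int) -> int:
    r = secrets.randbelow(n - 1) + 1
    return (pow(g, m, n2) * pow(r, n, n2)) % n2


def decrypt(c: int) -> int:
    return ((pow(c, lam, n2) - 1) // n) * mu % n


# --- Key holder encrypts a passport number and shares only the ciphertext ------
enc_passport = encrypt(123456789)

# --- Checking institution holds a watchlist and works on ciphertexts only ------
watchlist = [555000111, 123456789, 999888777]
blinded = []
for entry in watchlist:
    # E(passport - entry), then raised to a random power so the difference
    # itself is never revealed, only whether it is zero.
    diff = (enc_passport * pow(encrypt(entry), -1, n2)) % n2
    blinded.append(pow(diff, secrets.randbelow(n - 2) + 1, n2))

# --- Key holder decrypts: a zero means "on the list"; nothing else leaks -------
matches = [decrypt(c) == 0 for c in blinded]
print("match found:", any(matches))  # True: 123456789 appears on the watchlist
```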

Detection Technologies

Tokenisation is simply the technique of replacing personal data with pseudonymous values. The technique is already used widely in the payments industry to prevent fraud – when a merchant stores a card on file they will store a token rather than the actual card number itself, limiting the impact of any data breach. The technique is also used by some of the largest ecommerce fraud detection platforms. Device and customer data are tokenised (or one-way hashed) before being submitted into a data pool. Within the pool, tokenised “identities” or “devices” may be flagged as high risk where a merchant has previously reported a problem. This allows the platform to detect high risk actors using pseudonymous information only. The platform will know an “identity” is high risk but will have no information on who that “identity” actually is.
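
A minimal sketch of the idea follows; the key, field names and values are illustrative.

```python
import hashlib
import hmac

# Secret key held by the fraud detection platform. The transformation is one-way:
# tokens cannot be reversed back into the original device or customer data.
TOKENISATION_KEY = b"example-secret-key-held-by-the-platform"


def tokenise(value: str) -> str:
    """Keyed, one-way replacement of personal data with a pseudonymous token."""
    return hmac.new(TOKENISATION_KEY, value.encode(), hashlib.sha256).hexdigest()


# Shared risk pool keyed by tokens only - no card numbers, emails or device IDs.
high_risk_tokens = {tokenise("device-fingerprint-abc123")}  # previously reported by a merchant

# A new transaction arrives: the platform can flag the device as high risk
# while knowing nothing about who or what the token actually refers to.
incoming = tokenise("device-fingerprint-abc123")
print("high risk:", incoming in high_risk_tokens)
```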

Differential privacy is a technique whereby noise is injected into data so that it becomes pseudonymous and potentially anonymous. The technique is an approach rather than a single technology per se. The idea is that once noise has been added to the data, data analytics techniques can be used to gain insights from it. This technology is nascent and, whilst its direct application to CBDC would need further analysis, it provides another potential framework for overcoming the regulatory barriers that prevent collaboration on data.
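
For example, a count query over customer data can be protected with the Laplace mechanism, the most common building block of differential privacy. The figures below (the count, sensitivity and epsilon) are illustrative only.

```python
import numpy as np

# Sensitive statistic: number of accounts at an institution matching some pattern.
true_count = 42

# Any one individual can change a count by at most 1, so the sensitivity is 1.
sensitivity = 1
epsilon = 0.5  # privacy budget: smaller epsilon means more noise and stronger privacy

# Laplace mechanism: add noise scaled to sensitivity / epsilon before release.
noisy_count = true_count + np.random.laplace(loc=0.0, scale=sensitivity / epsilon)

print(f"released count: {noisy_count:.1f}")
# Pooling such noisy counts across institutions still reveals broad patterns,
# but no single customer's presence can be confidently inferred from the output.
```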

Last but not least, confidential computing seems to have potential. It leverages the trusted execution environment capabilities built into the chipsets of servers deployed in many data centres. Similar to the hardware security module (HSM) technology that is the mainstay of industrial-strength cryptographic key management (but without the physical tamper resistance), it provides secure, isolated environments for processing sensitive data. Two key advantages over HSMs will be cost and processing capacity. Whilst there will be some performance overhead in using confidential computing, the potential scale is much greater than that of HSMs, providing a rich environment where complex processing – such as AML-related data analytics – can be performed with high degrees of privacy and security. Confidential computing is already supported by mainstream cloud providers with a growing set of support tools.

Discussion and next steps

Identity and privacy are emotive subjects. Popular concerns around the introduction of a CBDC often focus on erosion of privacy rights even though Central Banks provide assurance that the privacy model for a CBDC will be the same as exists today.

The ‘thought experiment’ discussed in this workshop starts from the supposition that we should not simply replicate today’s model for prevention of money laundering and criminal finance into the design of a digital Pound on the basis that it is not achieving what anyone wants: costs are high, effectiveness is low and the privacy model is far from clear to most customers.

But the subject is complex and, as reflected above, much greater analysis is required. Inevitably, the workshop did not have time to do more than skim the surface. However, the feedback was that the idea warranted further investigation in the light of current initiatives such as the new Economic Crime Plan 2023 to 2026.

For example, HM Treasury is currently analysing the feedback on its 2023 public consultation on reform of the AML / CTF supervisory system. The consultation did not consider the fundamental logic of the money laundering regulations themselves, but looked at the complexities of supervising the compliance of 21,500 financial services firms, 265 gambling firms, 33,911 entities in the accountancy sector and 8,462 in the legal sector, as well as the 36,960 firms supervised directly by HMRC.

In 2024, the Digital Pound Foundation’s Identity & Privacy Working Group proposes to investigate this model further and how it could be introduced in the context of other initiatives underway. The unanswered questions raised at the workshop will inform this work:

  • Who has responsibility for making a payment authorisation? If the Compliance Service has an indication that a payment might be money laundering or criminal finance, should this prevent the PIP from fulfilling the payment?
  • What is the business case for the new model? Who pays for the Compliance Service and how much does it cost? How many Compliance Services are needed to make a competitive market and which types of organisations would be able to provide such a service?
  • Would people be able to understand how their data is being used? Would they be comfortable or concerned? 

We would welcome your feedback on this proposal. Please email us: press@digitalpoundfoundation.com.
