Privacy Roundtable – Part 1: Balancing individual and societal needs


The Digital Pound Foundation held the first in a planned series of Roundtable events on Wednesday 21st September to discuss the topic of protecting and balancing privacy within the context of CBDCs and other new forms of digital money. 

The Roundtable took the form of a 90-minute closed, in-person moderated discussion. It was a private session, held under the Chatham House Rule, with a small, select group of regulators and policymakers observing the discussion.

The following is an anonymised summary of the key findings and outputs of the session.

Balancing individual and societal needs 

The issue of privacy within the context of CBDCs (and other new forms of digital money) is particularly pertinent. As well as the need to balance the expectations of individuals and organisations, which are themselves often contradictory, there is the added complexity of ensuring that the associated technologies and platforms meet rigorous data protection and systems control obligations.

Privacy is technology agnostic. Regardless of the platform or protocol, all data controllers and processors must comply with GDPR and data privacy principles. While it is accepted (up to a point) that governments may hold a great deal of personal data about individuals (for example, via the DWP, NHS and HMRC), there are fears that the level of information the government would hold through CBDC activity will be excessive and intrusive. In a climate where giant enterprises like Google and Facebook gather and hold so much personal information, concern with data privacy (and data protection) is certainly increasing. Scandals such as Cambridge Analytica and other recent data attacks only add to these concerns, as worries extend beyond which data is available to governments and corporations to how that data is kept secure.

Younger members of society seem to have fewer qualms about giving away personal identity data – they want to be tracked on TikTok and Instagram; they want to be seen – in return for which they give data away freely. Since privacy seems less of an issue for younger generations, their expectations may be lower than for those who would prefer their transactional information to remain private. That said, for those on the opposite end of the spectrum, who are not so laid back about sharing personal information, the benefits of information sharing for accountability and recourse for [digital] money transactions should not be forgotten.

“There’s always a risk with putting people in charge of their identity; they’ll give it up to Facebook for a Mars Bar, which is insane. It’s like giving people the option of having seat belts in their cars, rather than mandating it because it’s better for society as a whole.”  

Consultations carried out by the ECB (and other bodies) confirm the public’s concerns with data privacy and protection. Ultimately, trust is what will determine and drive public participation in CBDCs. This is why privacy is such an important topic when working through the characteristics of a well-designed digital pound. It is interesting to note that proponents of a digital currency don’t necessarily understand how one might be achieved. We can be certain that data privacy and protection will be fundamental concerns that will have to be addressed to secure user buy-in.

GDPR regulation sets the standard for data protection across Europe and the UK; CBDC may present the opportunity to assess if GDPR is the appropriate standard to strive for or whether, once stakeholder needs and expectations are better understood, a different framework is required. The most obvious challenge with making GDPR the starting point is that it is more about data protection (protecting personal data from unauthorised access/revelation) than privacy (not collecting it at all). 

There’s also the question of the extent to which GDPR puts control of personal data into the hands of individuals (i.e., they can decide what their data is used for) and whether that is sufficiently robust in the context of publicly and privately issued digital money and any commercial intentions for data collection. Does there need to be deeper and broader government policy beyond GDPR?  Should digital currency stakeholders determine where new data protection dials should be set?

Work by Helen Nissenbaum at Cornell Tech highlighted this, in particular the presumption in many circles that it should be sufficient to regulate the use of data, rather than data collection itself. The difficulty with this presumption is that it is practically impossible to verify how data has been used once it has been collected. Data minimisation (preventing people from having to disclose personal data in the first instance) may seem a simpler strategy than working out how to protect data once collected, but realistically there will always be a requirement to collect some data. As such, the need for rigorous data protection must be at the forefront of digital currency development.

“There are lots of different dimensions, different user situations, different issuers. We need to really think about CBDC public policy considerations compared with privately issued digital money. Looking at digitised DLT cash, for example, there will be different privacy considerations, depending on the instrument type, the issuer, users and usage. It’s all about choice.”

What are an individual’s reasonable expectations with respect to data privacy?

This question polarises opinion: at one end of the spectrum are the proponents of complete anonymity, who assert that ANY reduction in anonymity for using digital currency (compared to physical cash) would constitute a public harm (“privacy is a public good”). At the opposite end is the ‘nothing to hide, nothing to fear’ camp, which holds that individuals wishing to use a transactional service should have no qualms about giving up certain personal information to ‘prove’ that they are who they say they are.

“We have to understand what the privacy requirements are before we can design the product, and what we mean by privacy in the context of our broader societal requirement to stop bad actors. That said, there should be no ‘presumption of guilt’ at the point of entering a transaction, nor implied agreement to surveillance simply because technology permits it.”  

A lot has changed in the last 20 years. More and more high street businesses no longer accept hard cash, and many others operate in the online space where cash payment is not an option. Consequently, it is increasingly difficult for transactions to remain fully anonymous or private. Further, “surveillance capitalism” is a new, lucrative and growing business model, with myriad networks exchanging data, new mechanisms for collecting data and packaging and commercialisation of data insights built around individual user profiles. 

The question is to what extent an individual’s personal identity information should be linked to transactions for the purposes of profiling them, whether for commercial and/or government purposes. A system could be designed to facilitate transactions that are completely private for the payer. If certain kinds of payments do not need to be private, rules can be imposed on merchants accepting such payments and around redemption of monies received. There are many ways of imposing rules to provide required information to regulators without requiring consumers to give up privacy rights a priori.

To what extent should privacy by right be enshrined in digital currencies and how do we define appropriate protections?

There is a very important distinction between onboarding data and transactional data. It is hard to avoid comparing physical cash with electronic and (now) digital money with respect to transaction anonymity: physical cash achieves it by default, whereas any organisation issuing non-physical means of monetary value transfer must meet the requisite regulatory obligations with respect to KYC, AML, political sanctions and so on.

“If anonymity is not baked into the design, it will be difficult to add it later. From first principles, we should start from a position where it can be used as a truly anonymous token or transaction mechanism. If AML, and counter terrorism, and all those sorts of very reasonable things are then required, services can be added on top. But you can’t do it the other way round. It’s too late to discover that your government (or other parties) are doing things you don’t trust them to do with your data.”

The relationship between anonymity, accountability and recourse is an interesting one. Where anonymity is achieved, for example when transactions are made in cash and there is no link between the individual and subsequent transactions, the situation is not too dissimilar to an anonymous payment card, which, if lost, does not permit any kind of recourse. This poses an interesting challenge to those seeking anonymous digital currencies: is the resulting absence of accountability actually a good idea?

 “Anonymity is a way of measuring privacy. It’s not the transaction data itself that matters so much, it’s whether and how other aspects of a user’s identity are linked to the transaction, for example, bank accounts or hardware devices. Ideally, links to identity in digital currency transactions would be severed so that people retained the benefits of cash and public payments mechanisms.”  

“The reason we have cryptocurrency, if we go back to David Chaum’s 1982 work on blind signatures for untraceable payments, is that people want digital cash. They want to hold digital assets and be able to transact them in a way that does not link their history to transactions, and without being blocked.”
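
The blind-signature idea referenced in that quote can be illustrated with a toy RSA sketch. This is purely illustrative (the primes are far too small for real use, and the message, key sizes and protocol framing are assumptions of this sketch, not any proposed CBDC design): the issuer signs a blinded value, so it can authenticate a digital coin without ever seeing the serial number it is signing.

```python
# Toy RSA blind signature in the style of Chaum (1982).
# Illustrative only: insecure parameters, no padding or hashing.
import math
import secrets

# Tiny RSA keypair (real systems use 2048+ bit primes).
p, q = 1000003, 1000033
n = p * q
e = 65537
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)  # private exponent (Python 3.8+ modular inverse)

def blind(m, n, e):
    """Payer blinds message m with a random factor r before sending it to the issuer."""
    while True:
        r = secrets.randbelow(n - 2) + 2
        if math.gcd(r, n) == 1:
            return (m * pow(r, e, n)) % n, r

def sign(blinded, d, n):
    """Issuer signs the blinded message without learning m."""
    return pow(blinded, d, n)

def unblind(blinded_sig, r, n):
    """Payer removes the blinding factor, yielding a valid signature on m."""
    return (blinded_sig * pow(r, -1, n)) % n

m = 123456  # stand-in for a coin's serial number
blinded, r = blind(m, n, e)
sig = unblind(sign(blinded, d, n), r, n)
assert pow(sig, e, n) == m  # anyone can verify, yet the issuer never saw m
```

The key property is in the last line: the signature verifies against the original message, but the issuer only ever handled the blinded value, so it cannot later link the coin back to the withdrawal.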

One approach might be to uncouple personal information gathered at onboarding from transactions, while still collecting other transaction data that might be useful, for example demographic and customer choice data (what sort of people are buying what types of things). Some types of metadata collected in the transaction process might be of wider value to society. It should not be impossible in the CBDC space to use structures like data trusts to reach a reasonable compromise, ensuring that data is collected and made available for the public good. A solution that extracts toxic personal information from transactions, leaving only an impersonal record of the transactions themselves, unlinked to the personal identity of spenders, might be more palatable from a regulatory perspective (per-transaction data insight) without compromising individual privacy.
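
As a purely hypothetical sketch of that uncoupling, the record below replaces the payer's identity with a salted one-way pseudonym and keeps only coarse metadata of aggregate value. The field names, banding and salted-hash approach are illustrative assumptions of this sketch, not a specified CBDC design.

```python
# Hypothetical data-minimisation sketch: separate onboarding identity
# from the transaction record that reaches the ledger or analysts.
import hashlib
import secrets

# Salt held by the onboarding entity only; without it, pseudonyms
# cannot be linked back to the onboarding record.
SALT = secrets.token_bytes(16)

def pseudonymise(user_id: str) -> str:
    """One-way token: links a user's transactions to each other,
    but not back to their identity without the salt."""
    return hashlib.sha256(SALT + user_id.encode()).hexdigest()[:16]

def minimise(tx: dict) -> dict:
    """Keep only fields of aggregate/societal value; drop direct identifiers."""
    return {
        "payer": pseudonymise(tx["payer_id"]),
        "amount_band": "low" if tx["amount"] < 50 else "high",
        "merchant_category": tx["merchant_category"],
        "region": tx["region"],
    }

tx = {"payer_id": "alice@example.com", "amount": 12.50,
      "merchant_category": "groceries", "region": "SW"}
record = minimise(tx)
assert "payer_id" not in record  # direct identifier never leaves onboarding
```

The point of the sketch is the separation of concerns: demographic and spending-pattern insight survives in the minimised record, while re-identification requires the salt held by a separate, accountable party (a role a data trust might play).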

Beyond AML and financial crime, there is also the question of other potential uses of individuals’ data to profile behaviours, as major retail brands do with customers’ shopping data. Tesco, for example, collects and uses Clubcard and other shopper data. Because of the huge reputational risk associated with misuse of this data, it has an obligation to process and manage that data securely and responsibly. Nonetheless, in the event of an apparent data breach, it is always very difficult to prove that data has NOT been leaked.

“We need to be practical, to elevate above the landscape and find use cases, whether it’s transport or shopping, and say that it’s cheaper, easier, faster for the general public. They won’t care; they’ll just see their cost of living being reduced by this new technology which is a good thing. They won’t even mind a bit of data being leaked here and there – the horses have already bolted.”

The challenge and complexity of data privacy is particularly relevant with respect to the development of CBDCs. From a technological point of view, a digital currency can be designed to be completely anonymous, or completely transparent, or anywhere in between these two extremes. However, this is not a technology decision; stakeholders must determine between themselves what levels of data privacy and protection are appropriate, and this requires a lot more education around different use cases. 


Quotes in this article were provided by participants at the Privacy Roundtable.

Those that attended as participants included:

  • Jamie Andrew, Clifford Chance
  • Sean Devaney, CGI
  • David Putts, Billon
  • Geoff Goodell, UCL
  • David Rennie, IDemia
  • Dave Birch, International Keynote Speaker
  • David Karney, Worldline

Representatives from the Bank of England, FCA and HMT attended as observers.

Claire Conby, DPF Operations and Governance Lead and CRO at Billon Financial Ltd, moderated the session.
