Tokenisation is a hot topic in financial services today. The Markets in Crypto-Assets (MiCA) regulation was signed into law in May 2023 and will apply from December 2024. In the UK, the Financial Conduct Authority (FCA) recently shared its views on how tokenised funds can be developed, and tokenisation is attracting similar attention outside Europe. It is frequently touted as an exciting innovation; however, there is healthy scepticism over how transformative it will prove to be, as well as debate over what the term fundamentally means. This article aims to level-set on what tokenisation is, its potential use cases, and the considerations for the financial services market.
Untangling tokenisation
Tokenisation is the process of transforming the ownership rights for an asset into a digital token. Tokens can be stored on either centrally governed ledgers or distributed ledgers such as a blockchain.
Tokens are most commonly associated with cryptocurrencies. Fungible tokens such as bitcoin or ether, and non-fungible tokens such as digital artwork, are new assets created natively on chain. However, tokenisation can also represent assets from the real world, including fiat currencies in the form of token-based CBDCs, tokenised deposits, or fiat-backed stablecoins. Alongside securities issued natively on chain, tangible assets such as gold or real estate and intangible assets such as copyright or patents can all be tokenised.
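To make the concept concrete, below is a minimal sketch of what a token record representing ownership rights might hold. The AssetToken class, its field names and the example values are illustrative assumptions only, not the data model of any specific platform or standard.

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class AssetToken:
    """Illustrative record of a tokenised asset held on a ledger (hypothetical model)."""
    token_id: str                 # unique identifier of the token
    asset_type: str               # e.g. "deposit", "bond", "real_estate", "artwork"
    fungible: bool                # True for interchangeable units, False for one-off assets
    total_units: int              # how many units the ownership rights are divided into
    holders: Dict[str, int] = field(default_factory=dict)  # holder -> units owned

    def transfer(self, sender: str, receiver: str, units: int) -> None:
        """Move ownership rights between holders; total supply is unchanged."""
        if self.holders.get(sender, 0) < units:
            raise ValueError("sender does not hold enough units")
        self.holders[sender] -= units
        self.holders[receiver] = self.holders.get(receiver, 0) + units

# A tokenised bond split into 1,000 units, initially held entirely by the issuer.
bond = AssetToken("BOND-2030-001", "bond", fungible=True, total_units=1000,
                  holders={"issuer": 1000})
bond.transfer("issuer", "fund_a", 250)
```

The same structure covers both fungible assets (many interchangeable units) and non-fungible ones (a single unit with one holder), which is why the distinction matters less to the ledger than it does to the asset's economics.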
The potential for tokenisation
The Bernstein Report (2023) estimated that 2% of the global money supply could be tokenised over the next five years via stablecoins and CBDCs, equating to three trillion US dollars. It also highlighted a potential five trillion US dollar opportunity for financial markets, driven by real estate, private market funds and securities.
At a high level, the benefits of tokenisation could include easier access to assets, greater liquidity and, of course, greater efficiency. However, we are not there yet.
When discussing tokenisation with a panel of experts at SIBOS 2023, there was agreement that to unpick the value of tokenisation would mean identifying:
- The use cases where tokenisation can truly add value to end clients
- The business model and associated business case for offering tokenised assets
There are numerous possibilities for tokenisation; however, the following use cases were a focus of the discussion:
Repurchase agreements (repos) – speed is critical for repos and other securities financing trades, and will become even more so as settlement timelines shorten. The benefit of tokenisation could be risk-free execution far faster than existing processes, through digitally managed repos in which a smart contract tokenises the underpinning collateral. Global banks are already exploring this for internal repo trades and intraday lines: J.P. Morgan’s Onyx platform enables the exchange of assets, using the JPM Coin System to represent ownership of the underlying securities, and processed US$300bn in its first two years. Similarly, Broadridge’s Repo Module is a distributed ledger solution processing US$1.5tn of repos a month. Whilst these volumes remain low relative to the overall repo market, they do indicate the beginnings of a new market.
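As an illustration of these mechanics, the sketch below models a digitally managed repo in which the collateral is represented as a token and each leg is booked as a single ledger update. The TokenisedRepo class, the ledger structure and the figures are hypothetical assumptions for illustration only, not a description of Onyx, Broadridge or any other live system.

```python
from dataclasses import dataclass

@dataclass
class TokenisedRepo:
    """Hypothetical intraday repo whose collateral is represented by a ledger token."""
    collateral_token: str   # identifier of the token representing the securities
    cash_amount: float      # cash lent against the collateral
    repo_rate: float        # annualised rate agreed for the term

    def open_leg(self, ledger: dict) -> None:
        """Exchange cash and the collateral token in one ledger update (no settlement gap)."""
        ledger["lender_cash"] -= self.cash_amount
        ledger["borrower_cash"] += self.cash_amount
        ledger[self.collateral_token] = "lender"          # collateral now held by the lender

    def close_leg(self, ledger: dict, term_fraction_of_year: float) -> None:
        """Return the cash plus interest and hand the collateral token back."""
        repayment = self.cash_amount * (1 + self.repo_rate * term_fraction_of_year)
        ledger["borrower_cash"] -= repayment
        ledger["lender_cash"] += repayment
        ledger[self.collateral_token] = "borrower"        # collateral returned

# Illustrative intraday trade: 100m of cash lent against a tokenised government bond.
ledger = {"lender_cash": 500.0, "borrower_cash": 50.0, "GOVT-BOND-TOKEN": "borrower"}
repo = TokenisedRepo("GOVT-BOND-TOKEN", cash_amount=100.0, repo_rate=0.05)
repo.open_leg(ledger)
repo.close_leg(ledger, term_fraction_of_year=1 / 365)
```

Because both legs update cash and collateral in the same step, there is no window in which one party has delivered and the other has not, which is where the reduction in settlement risk comes from.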
International trade – in international shipping, a bill of lading (BoL) is the legal document representing receipt of cargo. This remains a heavily manual process that has proven difficult to digitise, as validating the certificate of ownership is complex. This is where tokenisation could prove a valuable option: tokens created on chain are unique and cannot be copied, which would help unblock the challenge of digitising paper processes.
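A minimal sketch of this idea follows, assuming a simple on-chain registry keyed by a unique token identifier; the function names and identifiers are hypothetical and stand in for whatever registry a real platform would provide.

```python
from typing import Dict

# Hypothetical on-chain registry for bills of lading: one entry per token, so there
# is a single authoritative record of who currently holds the rights to the cargo.
bol_registry: Dict[str, str] = {}

def issue_bol(token_id: str, shipper: str) -> None:
    """Create the unique token; a second issuance with the same id is rejected."""
    if token_id in bol_registry:
        raise ValueError("bill of lading already exists; it cannot be duplicated")
    bol_registry[token_id] = shipper

def endorse_bol(token_id: str, current_holder: str, new_holder: str) -> None:
    """Transfer the cargo rights; validating ownership is a single lookup, not a paper check."""
    if bol_registry.get(token_id) != current_holder:
        raise ValueError("endorsement rejected: holder does not match the registry")
    bol_registry[token_id] = new_holder

issue_bol("BOL-2024-0042", "exporter")
endorse_bol("BOL-2024-0042", "exporter", "importer_bank")
```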
Atomic settlement – settling instantaneously is a target for many players in the industry. For example, Euroclear’s distributed ledger technology (DLT) platform for securities issuance is using real market money to test a primary issuance use case in a controlled way: cash is on chain, allowing atomic settlement without a CBDC or stablecoin. This evidences real DLT innovation bridged to the traditional system, providing options to clients.
In these cases, delivery versus payment (DVP) remains the core transaction and still carries settlement risk. Whilst quick and smooth, the two legs do not settle simultaneously today, although they could in the future. Debate also remains over what atomic settlement really means: DVP is possible today, but with a trade-off between liquidity and risk. Clients would need to pay for immediate finality, which would be much more expensive, whereas settling on a net basis creates liquidity. That cost needs to be weighed up, something that matters little most of the time but matters greatly at points of crisis.
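To illustrate the liquidity-versus-risk trade-off described above, the sketch below compares the funding needed for gross, trade-by-trade (atomic) settlement with settlement on a net basis. The counterparties and figures are invented purely for illustration.

```python
# Invented obligations between two counterparties over one day (amounts in millions).
payments_a_to_b = [120, 80, 200]   # A owes B across three trades
payments_b_to_a = [150, 90]        # B owes A across two trades

# Gross, atomic settlement: every obligation is funded in full, with immediate finality.
gross_liquidity_a = sum(payments_a_to_b)   # 400
gross_liquidity_b = sum(payments_b_to_a)   # 240

# Net settlement: only the net difference moves, but finality waits for the netting cycle.
net_position = sum(payments_a_to_b) - sum(payments_b_to_a)   # 160, payable by A
net_liquidity_a = max(net_position, 0)     # 160
net_liquidity_b = max(-net_position, 0)    # 0

print(f"Gross (atomic) liquidity needed: A={gross_liquidity_a}m, B={gross_liquidity_b}m")
print(f"Net liquidity needed:            A={net_liquidity_a}m, B={net_liquidity_b}m")
```

In calm markets the extra funding required for atomic settlement looks like pure cost; in a crisis, the elimination of the settlement window is precisely what participants would pay for.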
So how do we move forward?
The market is not ready for mass adoption at this point, but it is moving forward. Financial institutions (FIs) should consider where they can get involved through safe experimentation in sandboxes, such as the UK’s Digital Securities Sandbox. Some key questions remain open, such as how interoperability will work and the potential role of bridges between chains. It would be risky to dive into building multiple bridges, as duplication between financial market infrastructures (FMIs) would still exist and the challenges of reconciliation would not be eliminated. The data model and infrastructure need to facilitate seamless communication between applications in a secure, privacy-enabled way, avoiding separate duplicative systems.
Connectivity is at the heart of making progress with tokenisation. The target is for interoperability to deliver safer, faster and cheaper services, but this needs to happen in a way that participants are comfortable to follow and that is supported by a clear business case. Industry dialogue between FMIs, FIs, regulators and others is essential to avoid too many siloed initiatives and to ensure the industry learns early and together. Defined standards will also help to support progress.
A further enabler to realising the benefits of tokenisation is getting cash on ledger, as the depositor liability does not yet exist in tokenised form. This needs to be weighed against the liquidity consequences, for example the fragmentation created by running tokenised and non-tokenised cash side by side.
We need an end-to-end vision for tokenisation, but it is a challenge to decide where to start, and tokenisation can be seen as an additional layer of cost. Sandboxes will help to prove the value, and niche use cases could be a better place to start, evidencing benefits before expanding into the larger, mainstream cases.