Tokenisation - Moving to "autopilot"

July 20, 2023

In previous articles in this series, my colleagues have explored some of the key steps a financial institution (FI) needs to take when it decides to move forward with tokenisation: landing on the right strategy and aligning legal, regulatory and risk operations. In this next article of the series, I examine the technological and operational considerations around tokenisation.

What are the key technology issues around tokenisation?

As banks look to build their technology, interoperability, programmability and the ability to drive innovation should be at its core. These capabilities allow the automation of transactions involving money as well as financial and real assets, reducing the need for manual settlement because contingent transfers of claims and transactions can take place through smart contracts.
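To make the idea of a contingent transfer concrete, the sketch below shows an atomic delivery-versus-payment settlement in which the cash leg and the asset leg move together or not at all. It is written in Python purely for illustration (real smart contracts are written in contract languages), and all names in it are hypothetical.

```python
# Minimal illustrative sketch (not production code): an atomic
# delivery-versus-payment settlement, the kind of contingent transfer a smart
# contract automates. Class and method names are hypothetical.

class Ledger:
    def __init__(self):
        self.cash = {}    # account -> cash balance
        self.tokens = {}  # account -> token units (e.g. a tokenised bond)

    def settle_dvp(self, buyer, seller, cash_amount, token_amount):
        """Transfer cash and tokens together, or not at all."""
        if self.cash.get(buyer, 0) < cash_amount:
            raise ValueError("buyer has insufficient cash")
        if self.tokens.get(seller, 0) < token_amount:
            raise ValueError("seller has insufficient tokens")
        # Both legs are applied in one step, so no reconciliation between
        # separate cash and securities systems is needed.
        self.cash[buyer] -= cash_amount
        self.cash[seller] = self.cash.get(seller, 0) + cash_amount
        self.tokens[seller] -= token_amount
        self.tokens[buyer] = self.tokens.get(buyer, 0) + token_amount


ledger = Ledger()
ledger.cash["buyer"] = 1_000_000
ledger.tokens["seller"] = 100
ledger.settle_dvp("buyer", "seller", cash_amount=1_000_000, token_amount=100)
```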

For technology leaders tasked with helping their organisation meet its tokenisation strategic goals, understanding the detail and the different choices available is essential. They need to build a secure, resilient and scalable platform with several key technical dependencies:

· Security

As tokenisation platforms become more prominent, they will likely face increased fraud and cyber incidents. It is therefore important that organisations develop an information security programme that adequately addresses IT and information security risk with key controls safeguarding sensitive information.

· Interoperability - private vs public blockchain

I have spoken to several FIs who feel they need to use private blockchains for tokenisation and primary issuance: there is no third-party risk, they can impose their own governance and they ultimately have more control. Others I have spoken to believe private chains are irrelevant and that everything should be built on public chains, which are more scalable and now come with security frameworks mature enough to meet the expectations of risk teams and regulators alike.

Whether it's private or public, the key is that any chain needs to be designed so it can interoperate with others, i.e., so public and private chains can interact. We are unlikely to reach a point (at least anytime soon) where there is a single, approved platform everyone agrees to use, so financial institutions need to help ensure their chosen solution, whether private or public, bridges across both types of chain.

This interoperability will allow market participants to connect seamlessly and avoid banks having to run multiple platforms and operate in numerous regulatory environments.
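One way such bridging is commonly achieved is a "lock and mint" pattern: tokens locked on one chain are mirrored by wrapped tokens on the other, and unwinding the wrapped tokens releases the originals. The sketch below illustrates the idea; the class and method names are assumptions for illustration, not any specific vendor's API.

```python
# Illustrative sketch of a "lock and mint" interoperability pattern between a
# private issuance chain and a public chain. All names are hypothetical.

class Chain:
    def __init__(self, name):
        self.name = name
        self.balances = {}   # holder -> freely transferable units
        self.escrowed = {}   # holder -> units locked for bridging

    def credit(self, holder, amount):
        self.balances[holder] = self.balances.get(holder, 0) + amount

    def lock(self, holder, amount):
        if self.balances.get(holder, 0) < amount:
            raise ValueError("insufficient balance to lock")
        self.balances[holder] -= amount
        self.escrowed[holder] = self.escrowed.get(holder, 0) + amount

    def release(self, holder, amount):
        if self.escrowed.get(holder, 0) < amount:
            raise ValueError("nothing to release")
        self.escrowed[holder] -= amount
        self.balances[holder] = self.balances.get(holder, 0) + amount


class Bridge:
    """Moves value between a private issuance chain and a public chain."""

    def __init__(self, private_chain, public_chain):
        self.private_chain = private_chain
        self.public_chain = public_chain

    def to_public(self, holder, amount):
        self.private_chain.lock(holder, amount)     # escrow the original tokens
        self.public_chain.credit(holder, amount)    # mint a wrapped representation

    def to_private(self, holder, amount):
        self.public_chain.lock(holder, amount)      # retire the wrapped tokens
        self.private_chain.release(holder, amount)  # free the originals


private, public = Chain("private-issuance"), Chain("public")
private.credit("investor", 500)
bridge = Bridge(private, public)
bridge.to_public("investor", 200)   # investor now holds 200 wrapped units publicly
```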

· Complexity - choosing the right token and programmability

One of the key technology decisions that will impact programmability is which token standard to use. Currently, several different types of token are in use across different Distributed Ledger Technology (DLT) systems. The table below outlines the differing characteristics of some of the most popular ones.

[Table: characteristics of popular token standards across DLT systems]
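While the table itself is not reproduced here, one distinction worth illustrating is between fungible and non-fungible token standards, modelled loosely on the ERC-20 and ERC-721 families. The Python sketch below is purely illustrative; real token standards are implemented in contract languages such as Solidity.

```python
# Sketch of the core interface difference between two widely used token
# families: fungible tokens (interchangeable units, suited to cash or fund
# shares) and non-fungible tokens (unique items, suited to individual assets).

class FungibleToken:
    """Every unit is identical; only balances matter."""

    def __init__(self, total_supply, issuer):
        self.balances = {issuer: total_supply}

    def transfer(self, sender, recipient, amount):
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[recipient] = self.balances.get(recipient, 0) + amount


class NonFungibleToken:
    """Each token is unique; ownership is tracked per token ID."""

    def __init__(self):
        self.owner_of = {}  # token_id -> owner

    def mint(self, token_id, owner):
        if token_id in self.owner_of:
            raise ValueError("token already exists")
        self.owner_of[token_id] = owner

    def transfer(self, sender, recipient, token_id):
        if self.owner_of.get(token_id) != sender:
            raise ValueError("sender does not own this token")
        self.owner_of[token_id] = recipient
```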

· Infrastructure dependencies - integration with existing systems

Whilst much of the technology focus around tokenisation is on building on DLT, integration with existing systems is equally important. Technology teams need to think about where and how best to integrate with the current technology stack. Some key considerations are around resilience and security, including helping ensure a high level of data privacy protection and planning defences against hacking.

This integration also allows banks to connect tokenisation systems to their internal systems. For example, the finance division should have a direct connection, allowing a live holistic view of the organisation’s positions.
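As a hypothetical illustration of what such an integration layer might do, the sketch below merges positions read from the tokenisation platform with positions from existing back-office systems into one consolidated view for the finance division. All sources, identifiers and field names are assumptions for illustration.

```python
# Hypothetical sketch of an integration layer: positions recorded on the
# tokenisation platform are merged with positions from existing back-office
# systems so finance sees one consolidated, near-live view.

from collections import defaultdict

def consolidated_positions(onchain_positions, legacy_positions):
    """Merge per-instrument positions from both worlds into one view."""
    totals = defaultdict(float)
    for instrument, quantity in onchain_positions:   # e.g. read from a ledger node
        totals[instrument] += quantity
    for instrument, quantity in legacy_positions:    # e.g. read from books and records
        totals[instrument] += quantity
    return dict(totals)


onchain = [("TOKENISED-BOND-XS123", 1_000), ("TOKENISED-MMF-ABC", 250)]
legacy = [("TOKENISED-BOND-XS123", 500), ("EQUITY-DE000XYZ", 10_000)]
print(consolidated_positions(onchain, legacy))
```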

· Scalability and speed – the dangers around execution

As with any complicated technology implementation, there are high risks around execution. A high-profile example of the difficulties is the Australian Securities Exchange (ASX), which had ambitious plans to introduce a new DLT platform that would cut costs by eliminating the need for reconciliation. It was hoped the new platform would create shared data that participants could use to design new business models, such as transparent reporting.

Yet after several years of expensive delays the project was called off. When looking at the reasons for failure, there are some important lessons to learn:

  • Distributed systems can mean slower transaction processing, especially as data has to flow twice, from the client to the ledger and back.
  • Concurrency (letting processes operate simultaneously) can help scale processing, but it becomes a bottleneck when multiple trades involve the same data, such as the same security identifier or the same broker (see the sketch after this list).
  • A review found workflows weren’t being tailored for a DLT environment, designs were haphazard, and adding new functionality would require major tech work and migrations of core systems and APIs.
  • Client/vendor coordination was poor, with different views on success metrics.
  • Holding data (the common data model used across workflows such as settlements) was at risk of outages.

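To illustrate the concurrency point above, the sketch below uses one lock per security identifier: trades in different securities can run in parallel, but trades in the same security are fully serialised, which is exactly where contention builds when activity concentrates in a few instruments or brokers. This is a generic illustration, not the ASX design.

```python
# Generic illustration of contention: trades can be processed in parallel only
# when they touch different data, so a per-key lock (keyed here on the security
# identifier) serialises all trades in the same instrument.

import threading

lock_registry = {}
registry_guard = threading.Lock()
positions = {}

def lock_for(security_id):
    # One lock per security identifier; creating the lock is itself guarded.
    with registry_guard:
        return lock_registry.setdefault(security_id, threading.Lock())

def process_trade(security_id, quantity):
    with lock_for(security_id):   # trades in the same security queue here
        positions[security_id] = positions.get(security_id, 0) + quantity

threads = [threading.Thread(target=process_trade, args=("XS1234567890", 10))
           for _ in range(100)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(positions["XS1234567890"])  # 1000: correct, but fully serialised
```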
Not all of these failings were about the technology itself; it is also important to understand how the various components need to work together. That is why institutions should be ready to carry out significant due diligence before selecting which technology companies to use, and then work hard to help ensure an effective working relationship.

So, what does the technology end state look like?

Now we have covered some of the key considerations and constraints around technology, let’s turn our attention to what the high-level architecture around tokenisation could look like.

The outline below highlights a number of essential components that need to be included and how they could interact with each other. All the elements included are key building blocks for tokenisation. This architecture addresses the technical components needed to make tokenisation work efficiently, whilst also ensuring alignment with internal systems and clean interactions on the front end.

[Diagram: high-level tokenisation architecture and key building blocks]
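The specific components of the diagram are not reproduced here, but as a purely hypothetical illustration, the sketch below lists the kind of building blocks such an architecture often includes and the dependencies between them. The component names are assumptions for illustration, not taken from the diagram above.

```python
# Hypothetical outline of typical tokenisation building blocks and their
# dependencies; names are illustrative assumptions only.

from dataclasses import dataclass, field

@dataclass
class Component:
    name: str
    depends_on: list = field(default_factory=list)

architecture = [
    Component("client front end / distribution channels"),
    Component("token issuance and lifecycle engine", ["ledger connectivity"]),
    Component("custody and key management", ["ledger connectivity"]),
    Component("ledger connectivity (private and/or public chains)"),
    Component("integration layer to existing systems",
              ["token issuance and lifecycle engine", "custody and key management"]),
    Component("risk, compliance and reporting", ["integration layer to existing systems"]),
]

for component in architecture:
    deps = ", ".join(component.depends_on) or "none"
    print(f"{component.name} -> depends on: {deps}")
```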

Each financial institution’s technical teams and technology leaders will need to carefully design a bespoke architecture. This will allow a financial institution to maximise the benefits from tokenisation, for its customers and for itself.

This series of articles aims to show that the journey is not straightforward: from deciding the right strategy, controls and regulatory position, to ultimately designing the right system. Given the benefits, and the mass of competitors looking to explore this further, it is something organisations cannot afford to ignore; many in financial services are excited by the potential upsides of tokenisation. But turning that goodwill and intention into a well-designed, secure system that works well is not easy. Technology teams will need to carefully design, test and implement this new architecture to ensure the potential of tokenisation is realised.

The views reflected in this article are the views of the authors and do not necessarily reflect the views of the global EY organisation or its member firms.

Key contacts: Strategy & Change – Pierre Pourquery, Eric W. and Emanuel Vila; Legal & Regulatory – Monica Gogna and Christopher Woolard CBE; Technology – Muneeb Shah; Risk & Control – Mark Selvarajan Richards, CFA, Prateek Saha, FRM, and Rupal Thakrar; Assurance – Amarjit Singh and Laeeq Shabbir; Digital Assets Insights and Assessment (EY DAIA) – Mely SOMKHIT and Bronwen Bedford.