Ralf Kubli says standardised algorithmic definitions must be used to ensure the entirety of a tokenised asset’s collateral, liabilities, and cash flows are represented.
There’s no denying that interest in the tokenisation of financial assets — a method of digitally representing an asset and storing that data on a blockchain — is on the rise. The field was already a USD 310 billion industry by 2022, estimated to reach as high as USD 16 trillion by 2030.
This is an exciting prospect, but the truth is the industry isn’t quite ready for that level of adoption just yet. In its current form, tokenisation is effectively incomplete. There’s still work to be done to ensure that every single tokenised asset is designed to algorithmically represent the entirety of its collateral, liabilities, and cash flows. Only once we see this done along standardised lines will this space truly begin to take off.
Fortunately, we already have the template.
The Promise of Tokenisation
In recent years, financial institutions have been increasingly exploring the practice of tokenisation, drawn by the numerous advantages associated with it.
Among these are shorter settlement times and improved capital efficiency. Tokenisation can allow for automated securitisation of balance sheets relative to risk appetite, meaning firms can refinance larger parts of their balance sheets easily. It can also make it much more feasible for these companies to offer retail investors exposure to private markets, improving overall liquidity. Combined with lower operating costs and increased revenue, tokenisation stands to make financial markets overall more efficient.
Another critical area that tokenisation stands to address is regulatory compliance. A tokenised asset is essentially code recorded on an immutable blockchain: the rules built into it cannot be changed arbitrarily or in secret, and every asset on-chain must follow them. By using tokenisation to build markets where illegal activity is inherently impossible, regulators can reduce the need for after-the-fact oversight and enforcement. For example, if a specific product is deemed a security only in certain jurisdictions, it can be designed so that it cannot be transferred to exchanges in those regions. No punishment is needed, because the violation cannot occur in the first place.
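To make the idea concrete, here is a minimal sketch in Python of a transfer rule baked into a token's own logic. The class, names, and jurisdiction codes are hypothetical illustrations, not any real token standard; on an actual blockchain this logic would live in a smart contract.

```python
# Illustrative sketch (not a real token standard): a compliance rule that
# blocks transfers of a restricted security to venues in certain regions.
RESTRICTED_JURISDICTIONS = {"US", "CA"}  # hypothetical example regions

class RestrictedToken:
    def __init__(self, holders):
        self.balances = dict(holders)  # address -> units held

    def transfer(self, sender, receiver, amount, receiver_jurisdiction):
        # The rule is part of the token itself, so a non-compliant
        # transfer can never execute -- there is nothing to punish.
        if receiver_jurisdiction in RESTRICTED_JURISDICTIONS:
            raise PermissionError("transfer to restricted jurisdiction")
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount

token = RestrictedToken({"alice": 100})
token.transfer("alice", "bob", 40, receiver_jurisdiction="CH")  # allowed
# token.transfer("alice", "eve", 10, receiver_jurisdiction="US")  # raises PermissionError
```

The point of the sketch is that compliance becomes a structural property of the asset rather than a matter of enforcement.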
The Final Piece of the Puzzle
It’s exciting to see all of the benefits that tokenisation could offer, but the industry isn’t quite there yet. Right now, the majority of tokenisation is being done in a way that doesn’t accurately reflect the entire set of data that defines an asset. Often, these tokenised goods fail to encode critical information about what they represent, such as financial obligations, liabilities, or cash flows over time. It’s not uncommon for firms to simply put the hash of a PDF containing a human-readable financial contract on-chain. A human must still retrieve the underlying document, interpret it, and calculate what needs to be done. This is no more efficient, does not scale, and allows for the same lack of transparency that triggered the 2008 financial crash.
Instead, what is truly needed is a fusion of blockchain’s benefits with open banking standards to create true “smart financial contracts.” These contracts encode not only the information about the tokenised financial asset but also clearly define every payment obligation of the parties to the contract, i.e., the cash flows, and thereby define both the asset and the liability side. Higher information quality, transparency, and efficiency in price discovery and analysis, based on machine-readable and machine-executable term sheets, will enable innovation in trading and securitisation. Company-wide risk management becomes easier, and systemic risk management becomes possible again. The entire system would be transparent and machine-auditable, and could easily be stress-tested for various market conditions.
Fortunately, there are already standardised algorithmic definitions that could be coded into tokens to create such contracts. Referred to as the Algorithmic Contract Types Unified Standards, or ACTUS, these royalty-free standards were adopted in 2011 by the Office of Financial Research, a branch of the US Treasury created to respond to the economic turmoil of over a decade ago.
ACTUS combines a concise data dictionary that defines the contractual terms present in financial contracts with a complete taxonomy of fundamental algorithmic contract type patterns. This allows all cash flow obligations established by the contract to be accurately projected and analysed by all parties over the entire life of the contract. Such a framework is exactly what needs to be implemented in tokenised assets in order to provide complete transparency and interoperability.
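The following Python sketch illustrates the idea in deliberately simplified form: contract terms expressed as a small data dictionary, plus an algorithm that projects every cash flow over the life of the contract. The term names and the function are illustrative assumptions; real ACTUS contract types (such as PAM, “principal at maturity”) cover far more terms, day-count conventions, and edge cases.

```python
# Simplified illustration of the ACTUS idea: machine-readable terms in,
# a complete projection of cash flow events out.
from datetime import date

def project_pam_cash_flows(terms):
    """Project annual interest ("IP") and final principal ("PR") events
    for a bullet loan -- a toy version of an ACTUS-style algorithm."""
    events = []
    year = terms["start"].year + 1
    while year <= terms["maturity"].year:
        pay_date = date(year, terms["start"].month, terms["start"].day)
        events.append((pay_date, "IP", terms["notional"] * terms["rate"]))
        year += 1
    events.append((terms["maturity"], "PR", terms["notional"]))
    return events

terms = {  # hypothetical term sheet
    "notional": 1_000_000.0,
    "rate": 0.05,                # annual fixed rate
    "start": date(2024, 1, 15),
    "maturity": date(2027, 1, 15),
}
for when, kind, amount in project_pam_cash_flows(terms):
    print(when, kind, amount)
```

Because every party runs the same algorithm over the same terms, every counterparty, auditor, and regulator sees the identical schedule of obligations.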
Any firm could easily gather accurate information about its balances, obligations, and all cash flows. It could project how it would fare under various market conditions and prove to both investors and regulators that all of its services are fully backed. Auditing could be automated and performed in real time, leaving no way to obscure the current state of the company and making fraud far harder to commit.
The State of Things Now
Currently, there’s a lot of growing interest across many major firms surrounding tokenisation. For example, in December, BlackRock CEO Larry Fink famously stated, “The next generation for markets, the next generation for securities, will be tokenisation of securities.” It isn’t just general sentiment from one player, though; multiple companies are actively getting involved.
Major payment services like American Express offer tokenisation services that allow merchants and cardholders to use “payment tokens” to settle purchases much more quickly than over traditional rails. Meanwhile, Mastercard has announced the launch of its Multi-Token Network (MTN), which will begin by testing tokenised bank deposits before moving into experiments with stablecoins and CBDCs.
Most recently, the Bank for International Settlements (BIS) unveiled a blueprint for the architecture of a “Unified Ledger” to bring all digital assets under a single network. The focus would be CBDCs, tokenised deposits, and other tokenised claims on financial, tangible, and intangible assets. The benefit here would be the settlement finality that comes from central bank money residing on the same network as other digital assets.
All of these examples are encouraging from the standpoint of the industry beginning to get involved. However, what’s important to remember is that it is unlikely that any single network, even one overseen by the BIS, would ever be the only network interacting with global markets. Fortunately, with standards like ACTUS, this would be just fine.
As long as all relevant blockchains create true smart financial contracts that define all assets and liabilities under a single framework, the entire ecosystem should work together seamlessly. Networks that don’t adopt standardised assets could still exist independently, but without access to the major financial markets their liquidity and utility would be severely limited.
Tokenisation is not going away anytime soon; that is becoming clear. With so many players exploring the possible benefits, it’s only a matter of time before a more comprehensive and interoperable system emerges. This can be achieved through ACTUS, and doing so will unlock the true potential that this technology can offer and safeguard global markets from ever again experiencing profound systemic failures due to lack of transparency.
This change will occur; the only question is how fast the industry will embrace it.
Ralf Kubli is a board member at the Casper Association and an experienced executive with a strong background in blockchain, cryptocurrencies and decentralised technology. Ralf holds an MBA from Cornell and an M.A. in History from the University of Zurich.