How Interoperability and Technical Sophistication Will Shape Tokenization Progress
May 1, 2025
By the end of 2024, the market for tokenization of real-world assets had surpassed $14 billion, an increase of more than 65% since the start of the year.1 A range of traditional financial institutions, DeFi companies and other businesses have continued to build momentum around their pilot tokenization projects, with some having moved, or in the process of moving, into full implementation.2 Still, even amid a surge of activity, there are numerous potential pitfalls that may stall projects or dilute their ability to deliver returns on investment.
Tokenization is the process of establishing tokens to represent real-world items, and the value of units of those items, on a blockchain, so they may be managed, transacted with, tracked and/or shared. Tokenized assets are not the same as cryptocurrencies. Organizations moving in this space must build awareness of the roadblocks and risks across the tokenization landscape, no matter what stage their project is in. This will be critical to remediating potential issues before they escalate and to strengthening the foundation of tokenization programs before moving them forward. Several early-stage considerations were discussed in a recent Bloomberg article, which addressed the importance of prioritizing strategic planning in the face of a race to innovate.3 The article discussed common pitfalls in tokenization pilots, including stakeholder misalignment, project funding and defining objectives.
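To make the definition above concrete, the simplified TypeScript sketch below shows one way a tokenized real-world asset could be modeled: a record linking on-chain units to an underlying item and its custodian, with unit balances that can be transferred between holders. The interface, class and field names are illustrative assumptions, not a production token standard or any particular institution's design.

```typescript
// Hypothetical, minimal model of a tokenized real-world asset.
// All names and fields are assumptions for illustration only.

interface TokenizedAsset {
  assetId: string;     // reference to the real-world item (e.g., a bond or property record)
  tokenSymbol: string; // identifier for the token series
  totalUnits: bigint;  // how many fungible units represent the asset
  custodian: string;   // party attesting that the underlying asset exists
}

// A simple in-memory ledger mapping holders to unit balances.
class TokenLedger {
  private balances = new Map<string, bigint>();

  constructor(readonly asset: TokenizedAsset, issuer: string) {
    // All units start with the issuer and are transferred from there.
    this.balances.set(issuer, asset.totalUnits);
  }

  transfer(from: string, to: string, units: bigint): void {
    const available = this.balances.get(from) ?? 0n;
    if (units <= 0n || available < units) {
      throw new Error("insufficient balance");
    }
    this.balances.set(from, available - units);
    this.balances.set(to, (this.balances.get(to) ?? 0n) + units);
  }

  balanceOf(holder: string): bigint {
    return this.balances.get(holder) ?? 0n;
  }
}
```

In a live deployment, the units and transfers would be recorded on a blockchain rather than in memory; the sketch only illustrates the relationship between the real-world reference and its tokenized units.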
As projects progress, banks and financial institutions are also finding that the reach and impact of their tokenization solutions are limited by the fact that they are siloed within each institution. The industry has yet to solve how the environment within one bank will interact with that of another in a way that ultimately is not disruptive to consumer transactions and services.
Addressing the interoperability and integration problem will require tokenization projects to be designed from a clear understanding of market trends and available technologies. Any given institution's platform will not work in a vacuum. Therefore, project leaders will need to consider what technology is currently available, as well as the new products and emerging solutions on the horizon that may impact future interoperability needs and capabilities. Institutions across the industry will be engaging in complementary work and technology development in parallel, so all participants will benefit from establishing some form of connectivity and cross-industry dialogue, while also maintaining their own platform's independence. One potential format for such collaboration is to identify and coordinate consortiums that establish a sustainable framework for operating with others across the market. A recent strategy project FTI Technology's Blockchain and Digital Assets practice led for a telecommunications company provided a basis for how a consortium could be established to provide a framework of industry standards and technical interoperability. It included a business case for how a decentralized autonomous organization ("DAO"), a collectively owned, blockchain-governed organization working toward a shared mission, could help scale inter-industry collaboration.
Using a similar model to help establish interoperability for tokenization platforms would help facilitate collective, objective decision making about shared focus areas, as well as secure collaboration between peers and third parties. The DAO's interaction rules could be codified into automated smart contracts to support governance, providing mechanisms for funding projects and resolving legal issues between participants. Over time, as the base network extends to use cases beyond those for which it was initially created, this framework could enable new ideas and applications to be built, onboarded and run in a way that is beneficial to, and compatible with, the whole network.
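As a rough illustration of how such interaction rules might be codified, the sketch below models DAO-style governance in plain TypeScript: member institutions submit proposals and vote with assigned weights, and a proposal passes once an agreed fraction of total voting weight approves it. This is a hypothetical, simplified stand-in for the smart contracts that would enforce these rules on-chain; the class, method and threshold names are assumptions for illustration.

```typescript
// Illustrative sketch (not a real smart contract) of consortium governance
// rules: weighted member voting with a quorum threshold for approval.

type Member = { id: string; votingWeight: number };

interface Proposal {
  id: number;
  description: string;    // e.g., "Adopt message format v2 for cross-bank transfers"
  approvals: Set<string>; // member ids that voted in favor
}

class ConsortiumDAO {
  private proposals: Proposal[] = [];
  private nextId = 1;

  constructor(private members: Member[], private quorumFraction: number) {}

  submitProposal(description: string): Proposal {
    const proposal = { id: this.nextId++, description, approvals: new Set<string>() };
    this.proposals.push(proposal);
    return proposal;
  }

  vote(proposalId: number, memberId: string): void {
    const proposal = this.proposals.find((p) => p.id === proposalId);
    const member = this.members.find((m) => m.id === memberId);
    if (!proposal || !member) throw new Error("unknown proposal or member");
    proposal.approvals.add(memberId);
  }

  isApproved(proposalId: number): boolean {
    const proposal = this.proposals.find((p) => p.id === proposalId);
    if (!proposal) return false;
    const totalWeight = this.members.reduce((sum, m) => sum + m.votingWeight, 0);
    const approvedWeight = this.members
      .filter((m) => proposal.approvals.has(m.id))
      .reduce((sum, m) => sum + m.votingWeight, 0);
    return approvedWeight / totalWeight >= this.quorumFraction;
  }
}
```

In an on-chain implementation, the same logic would also govern treasury disbursements and dispute-resolution steps referenced above, with votes recorded immutably rather than in application memory.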
In the absence of a unified DAO or other body for inter-industry collaboration, there are also numerous features organizations can look for when evaluating tokenization interoperability solutions and bridging technologies, which aim to provide connectivity between different blockchain platforms. Because these solutions are highly complex and nuanced, evaluation typically requires the support of experts deeply familiar with blockchain technology and capable of effectively testing transactions in a closed environment to confirm the system is working without having to share proprietary information with other institutions.
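The kind of closed-environment verification described above can be approximated, at a very simplified level, by exercising a mock bridge between two in-memory ledgers and asserting that a lock-and-mint transfer conserves supply, without touching production systems or proprietary data. The sketch below is a hedged illustration of that idea; the ledger and bridge names are hypothetical.

```typescript
// Mock "bridge" between two in-memory ledgers, used to sanity-check that a
// lock-and-mint transfer conserves total supply in a closed test environment.

class MockLedger {
  constructor(public name: string, private balances = new Map<string, bigint>()) {}
  credit(account: string, amount: bigint): void {
    this.balances.set(account, (this.balances.get(account) ?? 0n) + amount);
  }
  debit(account: string, amount: bigint): void {
    const current = this.balances.get(account) ?? 0n;
    if (current < amount) throw new Error(`insufficient funds on ${this.name}`);
    this.balances.set(account, current - amount);
  }
  total(): bigint {
    let sum = 0n;
    for (const value of this.balances.values()) sum += value;
    return sum;
  }
}

// Simulate bridging: lock units on the source ledger, mint wrapped units on the target.
function bridgeTransfer(source: MockLedger, target: MockLedger, account: string, amount: bigint): void {
  source.debit(account, amount);          // "lock" on the source platform
  source.credit("bridge-escrow", amount); // held in escrow, not destroyed
  target.credit(account, amount);         // "mint" wrapped representation on the target platform
}

// Closed-environment check: supply across both ledgers stays consistent.
const bankA = new MockLedger("bankA-platform");
const bankB = new MockLedger("bankB-platform");
bankA.credit("client-1", 1_000n);
bridgeTransfer(bankA, bankB, "client-1", 250n);
console.assert(bankA.total() === 1_000n, "source supply unchanged (locked in escrow)");
console.assert(bankB.total() === 250n, "wrapped units minted on target");
```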
With the support of technical experts, tokenization project teams can also monitor the market dynamics and technology providers impacting their industry. Testing platforms at scale is also essential to understand how a system will perform under pressure and in various scenarios. Integration with existing systems, data flows and processes also requires careful thought, prioritizing a balance between aspirational plans and what is currently practical. For example, whilst decentralized identities and web3 wallets can be used for identity verification, they are not yet commonplace, so architectures need to be designed to accommodate them in the future while also supporting existing identity mechanisms in the present.
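One pragmatic way to accommodate both present and future identity mechanisms is to place them behind a common abstraction, so that a registry-based check used today can later be swapped for a decentralized-identity or wallet-based proof without changing the onboarding flow. The TypeScript sketch below illustrates that design choice under stated assumptions; the interfaces are hypothetical, and the wallet verification is stubbed rather than implemented.

```typescript
// Illustrative identity-verification abstraction: the calling code depends
// only on the interface, so either backend can be swapped in over time.

interface IdentityVerifier {
  verify(subjectId: string, proof: string): Promise<boolean>;
}

// Present-day mechanism: check against an existing, centrally managed registry.
class RegistryVerifier implements IdentityVerifier {
  constructor(private approvedIds: Set<string>) {}
  async verify(subjectId: string): Promise<boolean> {
    return this.approvedIds.has(subjectId);
  }
}

// Future mechanism: accept a wallet-based proof (signature checking is stubbed here).
class WalletVerifier implements IdentityVerifier {
  async verify(subjectId: string, proof: string): Promise<boolean> {
    // A real implementation would recover the signer from `proof` and
    // compare it to `subjectId`; this stub only checks the input shape.
    return subjectId.startsWith("0x") && proof.length > 0;
  }
}

async function onboard(verifier: IdentityVerifier, subjectId: string, proof = ""): Promise<void> {
  if (!(await verifier.verify(subjectId, proof))) {
    throw new Error(`identity check failed for ${subjectId}`);
  }
  console.log(`onboarded ${subjectId}`);
}
```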
In addition to interoperability, many tokenization projects are also faltering in the face of governance and monetization issues.
From a governance perspective, the inherent challenge is how to establish controls and protections, in this case around tokenization offerings, without stifling agility. Ultimately, organizations will need to define responsibilities for ensuring that their platforms and the broader network consistently do what they are meant to do, and only with the appropriate parties. Additionally, within enterprise platforms, monitoring, reporting and controls must be established to ensure that if anything goes wrong technically, legally, financially or otherwise, there are defined channels and processes for recourse.
Monetization and return on investment are still somewhat ambiguous within many tokenization projects. Again, in the race to innovate, the question of how operational efficiency will be achieved to maximize return on investment and optimize capital has been overlooked in many initial projects. Teams must work with experts to evaluate the true cost of their technology platforms and the operational support needed to maintain them. Then, these costs must be balanced against customer experience needs and the expectations for what revenue or new business opportunities will be driven by token-based offerings and programs. The cost, value and ownership of intellectual property created in these initiatives must also be assessed and factored into overall monetization calculations.
One issue is in how the platform is developed. Creating an evenly funded and monetized consortium spreads investment risk, but at the expense of speed and agility. Privacy and confidentiality are also concerns; financial institutions want competitors to see and know only certain things. Consortiums can be effective but also require more transparency than some institutions may want, which may then require working with a neutral third-party expert for confidentiality protection.
Return on investment is another important consideration. Development teams may benefit from combining blockchain and tokenization investments with artificial intelligence initiatives. This can enhance the returns and help projects stay in motion, rather than stalling because they are competing with AI programs for the same technology development budgets. Moreover, the two technologies can be leveraged together to enable more automation, transparency and verifiability across numerous systems.

As financial institutions pursue tokenization as a means to create competitive advantage and drive new business channels, building knowledge of the market and technology landscape will be critical. Companies on the front end of innovation will require a strong team of experts who can anticipate and quickly overcome roadblocks. Laggards may end up needing to invest more, so that they can catch up once the space has matured, or be forced to adopt others' technologies and standards. Those in the middle tier (e.g., companies that aren't comfortable with leading the industry but do not want to lag) will need to maintain a very close watch on the market dynamics, so that they can effectively keep pace with first-mover activities. Across the spectrum of first movers, mid-tier companies and followers, technical sophistication, and the ability to effectively achieve interoperability balanced with independence, will be central success factors.
Footnotes:
1: “Tokenized RWA Market Surpasses $14 Billion,” The Defiant, December 18, 2024.
2: “What Treasury Teams Can Learn From Central Banks’ Tokenization Projects,” PYMNTS, January 28, 2025.
3: Steve McNew, Sam Davies, Serkan Ersanli, “Tokenization’s Success Hinges on Proper Controls and Governance,” Bloomberg, September 23, 2024.