A new joint report from Boston Consulting Group (BCG) and Ripple forecasts that the market for tokenized assets, or real-world assets (RWAs), could soar to $18.9 trillion by 2033, signaling the imminent mainstream adoption of tokenization technology. The projection assumes a compound annual growth rate (CAGR) of roughly 53% and sits between the report’s conservative estimate of $12 trillion and its more optimistic $23.4 trillion scenario.
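To give a feel for the arithmetic behind those scenarios, the short Python sketch below computes the implied annual growth rate needed to reach each 2033 figure from an assumed 2025 starting point. The roughly $0.6 trillion base year value is an illustrative assumption, not a figure quoted in this article; only the 2033 endpoints come from the report.

```python
# Rough arithmetic behind the growth figures: the implied CAGR needed to reach each
# 2033 scenario from an assumed 2025 base. The ~$0.6 trillion base is an illustrative
# assumption, not a number cited in this article.

def implied_cagr(base: float, target: float, years: int) -> float:
    """Annual growth rate that compounds `base` into `target` over `years` years."""
    return (target / base) ** (1 / years) - 1

base_2025 = 0.6  # assumed tokenized-asset market size in 2025, in trillions of USD
scenarios = {"conservative": 12.0, "baseline": 18.9, "optimistic": 23.4}  # 2033 figures from the report

for name, size_2033 in scenarios.items():
    rate = implied_cagr(base_2025, size_2033, years=8)
    print(f"{name}: ${size_2033}T by 2033 implies ~{rate:.0%} CAGR")
```

Under that assumed base, the baseline scenario works out to roughly the low-50s percent annual growth rate the report cites.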
Tokenization, the use of blockchain technology to record ownership and transfer assets like securities, commodities, and real estate, is gaining traction across the financial sector. Many traditional financial institutions are turning to tokenization for greater efficiency, reduced settlement costs, and the ability to enable 24/7 transactions. Noteworthy examples include JPMorgan’s Kinexys platform, which has already processed over $1.5 trillion in tokenized transactions, and BlackRock’s tokenized U.S. dollar money market fund (BUIDL), which is nearing $2 billion in assets under management and increasingly integrated into decentralized finance (DeFi).
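Conceptually, a tokenized asset is a shared ledger entry whose ownership can be divided and transferred atomically. The following minimal Python sketch is a hypothetical, simplified illustration of that idea only; it does not represent Kinexys, BUIDL, or any platform's actual API.

```python
# Minimal sketch of the core idea behind tokenization: a ledger that records
# fractional ownership of a real-world asset and transfers it atomically.
# Hypothetical illustration only -- not any real platform's interface.
from dataclasses import dataclass, field

@dataclass
class TokenizedAsset:
    name: str                                # e.g. a short-term government bond or fund share
    total_units: int                         # total token supply representing the asset
    balances: dict[str, int] = field(default_factory=dict)

    def issue(self, owner: str, units: int) -> None:
        """Mint units to an initial owner (issuance)."""
        if units > self.total_units - sum(self.balances.values()):
            raise ValueError("over-issuance")
        self.balances[owner] = self.balances.get(owner, 0) + units

    def transfer(self, sender: str, receiver: str, units: int) -> None:
        """Move ownership between holders; fails outright if the sender lacks balance."""
        if self.balances.get(sender, 0) < units:
            raise ValueError("insufficient balance")
        self.balances[sender] -= units
        self.balances[receiver] = self.balances.get(receiver, 0) + units

bond = TokenizedAsset("short-term-govt-bond", total_units=1_000_000)
bond.issue("issuer_desk", 1_000_000)
bond.transfer("issuer_desk", "corporate_treasurer", 250_000)
print(bond.balances)
```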
“The technology is ready, the regulatory environment is evolving, and we already see key use cases in action,” said Martijn Siebrand, Digital Assets Program Manager at ABN AMRO, as outlined in the report.
Early Wins and Expanding Use Cases
The report highlights successful early use cases, such as tokenized government bonds, particularly U.S. Treasuries. These allow corporate treasurers to shift idle cash into short-term government debt without intermediaries, supporting real-time liquidity management.
Private credit is another rapidly developing area, enabling access to markets that have traditionally been opaque and illiquid while offering clearer pricing and fractional ownership. Additionally, tokenization in carbon markets is gaining attention, with blockchain-based registries improving transparency and tracking of emissions credits.
Barriers to Widespread Adoption
Despite the promising growth of tokenization, the report points to five primary obstacles to broader adoption: fragmented infrastructure, limited interoperability between platforms, inconsistent regulatory frameworks, unclear custody guidelines, and a lack of standardized smart contracts. Most tokenized assets currently settle in isolation, and the absence of a universal delivery-versus-payment (DvP) standard keeps liquidity from developing in secondary markets.
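For context on what a DvP standard addresses: in delivery-versus-payment, the security leg and the cash leg of a trade settle together or not at all. The toy Python sketch below illustrates that all-or-nothing property under simplified assumptions; real DvP arrangements involve netting, custody, and settlement finality rules far beyond this.

```python
# Toy illustration of delivery-versus-payment (DvP): the security leg and the cash
# leg either both settle or neither does, removing the risk of one-sided delivery.
# Hypothetical example only, not a real settlement system.

def settle_dvp(security_ledger: dict, cash_ledger: dict,
               seller: str, buyer: str, units: int, price: float) -> bool:
    """Atomically swap `units` of a tokenized security for `units * price` in cash tokens."""
    cost = units * price
    # Check both legs before touching either ledger.
    if security_ledger.get(seller, 0) < units or cash_ledger.get(buyer, 0) < cost:
        return False  # neither leg settles
    security_ledger[seller] -= units
    security_ledger[buyer] = security_ledger.get(buyer, 0) + units
    cash_ledger[buyer] -= cost
    cash_ledger[seller] = cash_ledger.get(seller, 0) + cost
    return True  # both legs settled together

securities = {"fund_a": 1_000}
cash = {"buyer_b": 50_000.0}
print(settle_dvp(securities, cash, "fund_a", "buyer_b", units=400, price=100.0))  # True
print(settle_dvp(securities, cash, "fund_a", "buyer_b", units=400, price=100.0))  # False: buyer lacks cash
```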
The regulatory landscape remains a major challenge, with different countries taking varying approaches. While regions like Switzerland, the EU, Singapore, and the UAE have established clear frameworks for tokenized securities, major markets such as India and China still have restrictive or ambiguous regulations. This inconsistency makes cross-border operations difficult, forcing firms to adapt their infrastructure to individual markets.
Tokenization’s Path Forward
The report outlines three distinct phases for the tokenization market: initial adoption of straightforward instruments like bonds and funds, followed by the introduction of more complex products like private credit and real estate, and finally, a full market transformation including traditionally illiquid assets like infrastructure and private equity. Most firms are in the first or second phase, and the scalability of tokenized assets will depend heavily on regulatory clarity and the continued evolution of infrastructure.
Tokenization promises significant cost savings, particularly in processes like bond issuances, collateral management, and real estate fund tokenization. The report notes that tokenization projects can now be launched for under $2 million, while large-scale, end-to-end integrations could cost up to $100 million for major financial institutions.
Despite these gains, the report cautions that without coordinated action from the industry, tokenization risks re-creating the fragmentation it aims to address. Jorgen Ouaknine, Global Head of Innovation and Digital Assets at Euroclear, warned that without a unified effort, siloed solutions could emerge within the tokenization space, undermining its potential.