Tokenization and ETFs: One Repaves the Road for the Other
The asset management market has been grappling with the same question for months, dressed in different disguises: Will blockchain-based tokenization displace ETFs? The short answer is that the binary framing itself is the problem, not the solution. While the industry debates which will win, it is overlooking the structural risk the two actually share.
According to data compiled by Asset Tokenization, the tokenization market could reach $2 trillion by 2035. McKinsey, for its part, projects $1.9 trillion for 2030 and $4 trillion for 2035. These figures justify serious conversations in any boardroom. But before getting excited about the potential market size, it is crucial to understand exactly what problem each instrument is solving and the real cost of not solving it.
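Those headline numbers imply a steep but easily checked growth rate. As a quick illustration using only the McKinsey figures cited above (the function itself is just standard compound-growth arithmetic, not anything from the cited reports):

```python
def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate between two values over `years`."""
    return (end_value / start_value) ** (1 / years) - 1

# McKinsey projections cited above: $1.9T in 2030 -> $4T in 2035
implied = cagr(1.9, 4.0, 2035 - 2030)
print(f"Implied CAGR 2030-2035: {implied:.1%}")  # roughly 16% per year
```

An implied annual growth rate north of 15% sustained for five years is the kind of assumption a boardroom should stress-test, not take at face value.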
ETFs Built the Road, but the Asphalt Is Cracking
The central argument circulating among industry specialists is that ETFs resolved a legitimate structural problem: they made markets more accessible, more liquid, and dramatically cheaper than traditional mutual funds. This is empirically true. But there is a second part of the diagnosis that is mentioned less often: ETFs are still operating on infrastructure designed in the 1990s.
Settlement in T+1. Access conditioned on having a brokerage account. Trades limited to market hours. And a structural barrier that is rarely mentioned in fund manager reports: approximately 7 billion people globally do not have access to a brokerage account. This is not a product problem. It’s a distribution infrastructure issue.
Here is where the road analogy becomes technically precise. ETFs laid the asphalt on what was once a dirt road. But traffic volumes have changed, speed demands have evolved, and the asphalt that served well in 1993 is not designed for what lies ahead. Tokenization doesn’t demolish the highway; it changes the road surface. These are distinct engineering problems, with different costs and implementation timelines.
The risk for established players is not that tokenization will displace them tomorrow. The risk is that they begin to lose margin in the most profitable segments—institutional investors who value 24/7 liquidity and instant settlement—while their fixed cost structures remain calibrated for the previous model.
The Trap of the Substitution Narrative
When an emerging technology appears with market projections in the billions, the industry tends to organize the debate into two trenches: advocates and opponents. Both positions are convenient for selling conferences and funding rounds. Neither is useful for making operational decisions.
The CEO of OpenAssets frames it in a way that, stripped of the usual commercial enthusiasm, withstands analytical scrutiny: tokenization does not fix what is wrong with ETFs; it fixes what is wrong with the roads that ETFs run on. It's a distinction that may seem semantic but has concrete implications for any asset manager deciding where to allocate technological transformation budgets.
There’s a difference between updating the product wrapper and replacing the settlement system, custody protocols, and distribution mechanisms. The latter comes with implementation costs orders of magnitude higher, requires regulatory coordination across multiple jurisdictions, and faces resistance from intermediaries currently capturing the rents of the legacy system.
This creates a predictable pattern: institutions with greater exposure to these intermediaries will move slowly, not out of a lack of vision, but because every month of delay in adoption is a month of protected revenues in the current model. The incentives are aligned with inertia. This is not a value judgment; it is basic organizational mechanics.
Coexistence as a Period of Risk, Not of Equilibrium
The most likely scenario over a five-to-ten-year horizon is neither the replacement of ETFs nor their clean evolution into tokenized versions. The more probable outcome is a coexistence phase in which both infrastructures operate in parallel, capturing distinct market segments with different cost structures.
That sounds reasonable on paper. In practice, coexistence comes with operational costs that optimistic analyses often overlook: maintaining two settlement systems, two regulatory compliance frameworks, and two technology stacks simultaneously is not free. For medium-sized managers, that cost can be prohibitive without a phased strategy defining exactly when and how to migrate capabilities without compromising the core business.
The players likely to successfully survive this transition are not necessarily the first to tokenize everything. They are the ones who can isolate their stable income core while experimenting with tokenized structures in lower regulatory risk segments. A tokenized money market fund is a very different bet, in terms of exposure, than tokenizing an equity ETF with exposure to emerging markets. Granularity matters.
McKinsey and Asset Tokenization projections are useful for sizing market potential. They are not helpful for determining at what point in the transition period a specific structure shifts from being a controlled bet to becoming a capital burden that jeopardizes the main operation. This calculation depends on variables that no market report can generalize: current operational margin, revenue concentration by segment, cost of capital, and internal regulatory capacity.
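The firm-specific calculation the paragraph describes can be sketched in a few lines. Every number below is invented for illustration; nothing here comes from the cited projections. The sketch assumes one simple decision rule: the parallel tokenized stack is a controlled bet only while the core margin, net of the dual-stack cost, stays above a floor the firm sets for itself.

```python
def margin_after_dual_stack(core_revenue: float,
                            operating_margin: float,
                            dual_stack_cost: float) -> float:
    """Core operating margin left after absorbing the annual cost of running
    a second settlement/compliance/technology stack in parallel."""
    return operating_margin - dual_stack_cost / core_revenue

# Hypothetical mid-sized manager: $200M revenue, 25% core margin, $8M/yr
# to run the parallel stack, and a self-imposed 15% margin floor below
# which the experiment stops being a controlled bet.
MARGIN_FLOOR = 0.15
remaining = margin_after_dual_stack(200e6, 0.25, 8e6)
verdict = "controlled bet" if remaining >= MARGIN_FLOOR else "capital burden"
print(f"remaining margin: {remaining:.0%} -> {verdict}")  # 21% -> controlled bet
```

The point of the toy model is not the numbers but the shape of the question: the same $8M stack that is comfortably absorbable at a 25% margin becomes a capital burden for a firm running at 12%, which is why no market report can generalize the answer.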
The Filter Separating Serious Bets from Positioning Statements
There's a quick way to distinguish which organizations are taking tokenization seriously as an infrastructure transformation and which are using the topic to fuel innovation narratives without committing real capital: look at where the risk budget sits.
A firm that announces a tokenization initiative but has not modified its custody architecture or initiated formal regulatory discussions is not making an infrastructure bet. It is engaging in corporate communication. The difference between the two has a name in accounting: intangible assets with no book value versus capital investment with an amortization schedule.
The forthcoming coexistence period between ETFs and tokenized structures is likely to last longer than enthusiasts project and cost more than skeptics anticipate. Organizations that reach the convergence phase with solid balance sheets, variable cost structures, and gradually built regulatory capacity will be the ones able to capture the market when the infrastructure stabilizes. Those that have over-invested in a premature transition, or those that have ignored the signal altogether, will have less room to maneuver when the operational window narrows.