Securities Lending Times
Interview

Bitget


Gracy Chen


Feb 2026

Gracy Chen, CEO at Bitget, sits down with Karl Loomes to discuss why institutional focus has shifted from experimentation to execution, what is holding back large-scale adoption, and how liquidity, custody, and regulatory clarity will shape digital market infrastructure

Tokenisation is increasingly discussed as inevitable rather than experimental. From your perspective, what is driving the shift from ‘if’ to ‘how’ at an institutional level?

For a long time, tokenisation sat in an uncomfortable middle ground: technically impressive, but commercially questionable and unproven in its viability. What changed over the course of 2025 is that institutional participation accelerated. Institutions are no longer evaluating tokenisation as an experiment or a passing stage; what concerns them now is how to implement it at scale, how execution should work, and what the risks are.

This shift, from where we’re looking at it, is being driven by a mix of factors, both macro and micro in nature. On the macro side, institutions are under pressure to modernise their market infrastructure.

Traditional capital markets were built in a different era and still carry legacy inefficiencies: they are fragmented across jurisdictions and often slow to settle. Capital gets tied up in buffers and intermediaries, cross-border transactions add layers of friction, and payments can ultimately take days.

Tokenisation offers a way to address some of these operating issues by enabling faster settlement cycles and boosting capital efficiency. Automated post-trade processes lower counterparty risks and reduce the amount of idle capital just sitting in the system. As a result, institutions can move their operations more dynamically across markets.

On the micro side, we are also seeing real market data that proves demand exists. Just at Bitget alone, in the closing months of 2025, we recorded a 452 per cent month-over-month increase in spot trading volume and a 4,468 per cent surge in futures volume for tokenised US stocks. Over 80 per cent of that activity came from institutional participants, which tells us this is not just retail speculation but professional capital engaging with a new market structure.

What are the main structural or operational barriers that currently prevent tokenised equities, funds, or other RWAs from attracting large-scale institutional capital?

The biggest barrier, I would say, is integration. Institutions typically rely on counterparties, custody networks, compliance systems, and reporting frameworks that have been built over decades. All of those systems and workflows are deeply interconnected, but tokenised assets, for the most part, still sit outside of them, which creates friction.

If these assets cannot be cleanly reconciled with an institution’s internal systems or reported under regulatory and accounting frameworks that are already known and familiar to it, then adoption naturally starts running into obstacles on the practical level.

Another prominent limiting factor is fragmentation. Tokenised RWAs exist across multiple blockchains, legal wrappers, and settlement models.

From an institutional perspective, this increases both operational complexity and legal uncertainty. It’s not always clear which standards apply to a given asset or how ownership is enforced.

What institutional players really want is consistency and clarity. They need interoperable standards between chains, understandable legal frameworks they can rely on, and predictable settlement and custody rules.

Without that level of standardisation, scaling institutional participation becomes difficult, as firms do not feel confident enough to push forward in this space.

Liquidity is often cited as a key constraint for tokenised assets. What role can large exchanges play in supporting deep, resilient liquidity for institutional participants?


It’s important to remember that liquidity doesn’t appear out of nowhere — it’s designed.

Large exchanges actually play a critical role in this process because they can simultaneously bring together three key components: institutional market makers, compliant and resilient trading infrastructure, and aggregation of consistent trading demand in one venue. When all of these factors are put together, liquidity can be properly structured and sustained.

In practice, this means professional market-making programmes, deep order books, transparent pricing, and reliable execution quality, all of which are essential to encourage institutional participation. When liquidity is supported by professional market makers rather than short-term or opportunistic trading activity, markets tend to behave more predictably. That is what creates resilience during periods of volatility and gives institutions confidence that they can enter and exit positions reliably and at scale.

How do you see custody standards needing to evolve for tokenised assets to meet institutional expectations around segregation, control, and risk management?

When it comes to custody, institutions are looking for clarity more than anything else. They want segregation of assets, well-defined control mechanisms, and transparent liability frameworks. They need to know exactly who controls an asset at any given point, how access is governed, and, ultimately, who is responsible for putting things right if and when something goes wrong.

For tokenised assets to reach institutional-grade custody, several systems need to be adopted: multi-layer authorisation for starters, as well as detailed audit trails and integration with internal risk, compliance, and reporting controls. Without that level of control and visibility, and without a safety net, it is very difficult for institutional participants to commit to tokenised assets as a core part of their portfolios and strategies.
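As a purely illustrative sketch of the multi-layer authorisation and audit-trail idea described above (all class and account names below are hypothetical, not any provider's actual implementation), a threshold-approval custody workflow might look like:

```python
# Illustrative sketch: a withdrawal executes only once a threshold of
# independent approvers signs off, and every action is recorded in an
# append-only audit trail. All names here are hypothetical.

class CustodyVault:
    def __init__(self, approvers, threshold):
        self.approvers = set(approvers)   # authorised signing parties
        self.threshold = threshold        # approvals needed to execute
        self.approvals = {}               # request_id -> set of approvers
        self.audit_trail = []             # append-only record of actions

    def request_withdrawal(self, request_id, account, amount):
        self.approvals[request_id] = set()
        self.audit_trail.append(("requested", request_id, account, amount))

    def approve(self, request_id, approver):
        if approver not in self.approvers:
            raise PermissionError(f"{approver} is not an authorised approver")
        self.approvals[request_id].add(approver)
        self.audit_trail.append(("approved", request_id, approver))

    def is_executable(self, request_id):
        # Executable only once enough distinct approvals accumulate.
        return len(self.approvals.get(request_id, set())) >= self.threshold
```

The point of the sketch is the separation of duties: no single desk can move assets unilaterally, and the audit trail gives risk and compliance teams the visibility the interview describes.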

We are also seeing growing demand for hybrid custody models. Institutions increasingly want the benefits of onchain transparency, such as real-time asset status visibility, combined with offchain legal certainty. This includes clearly defined ownership rights, enforceable legal claims, and protection in insolvency scenarios.

I fully expect that custody providers that can bridge blockchain and traditional safeguards will be critical for large-scale adoption.

Regulatory alignment remains uneven across jurisdictions. What minimum level of regulatory clarity do you think institutions need before tokenised assets can be adopted at scale?

Well, as I mentioned, institutions appreciate predictability. It is perhaps unrealistic to expect perfect global harmonisation of rules, but some steps do need to be taken.

At a minimum, market participants need clarity on three points: asset classification, custody obligations, and counterparty responsibilities. If an institution understands how a tokenised asset is treated from a regulatory and accounting perspective, it can better assess the acceptable level of exposure and build the asset into its risk frameworks accordingly. That is a necessary condition if institutions are to participate in this market at scale.

What slows adoption is ambiguity in regulation: when rules and boundaries are unclear, institutions inherently default to cautious behaviour. But on the other hand, once the rules are made understandable and predictable, participation comes down to risk tolerance and not regulatory uncertainty. That’s when adoption can grow.

From your vantage point as CEO of Bitget, what lessons should tokenised markets borrow from established securities finance and post-trade infrastructure?

The biggest lesson is that post-trade matters as much as execution. Traditional markets work because settlement, reconciliation, custody, and risk management are predictable and reliable. Tokenised markets need to earn that same level of trust.

Risk controls are another key area. Established securities finance has decades of experience managing counterparty risk, margining, and stress scenarios. Tokenised markets should not ignore that history; instead, they should encode those lessons directly into smart contracts and operational processes.

Finally, transparency and auditability are essential. One of blockchain’s strengths is real-time visibility, but that only works if data is standardised and verifiable. The goal isn’t to reinvent finance from scratch, but to upgrade proven market infrastructure with better technology, while preserving the safeguards that institutions rely on.

Conversely, are there aspects of crypto market infrastructure — such as programmability or real-time settlement — that traditional markets should be more open to learning from?

Absolutely, and real-time settlement is actually a great example to bring up here. Crypto markets have shown that it is possible to move assets 24/7 with near-instant finality. That has far-reaching implications for capital efficiency, especially for institutions. Among other things, it means that capital is freed up faster and balance sheets can be used more efficiently without being tied up in pending transactions and multi-day settlement cycles.

And yes, programmability is another area where traditional markets stand to learn. Smart contracts make it possible to automate many parts of the market that remain highly manual today, from compliance checks to settlement logic. Instead of relying on layers of manual intervention, rules can be embedded and executed directly as part of the transaction flow. This can significantly reduce operational overhead and lower the risk of human error, making post-trade processes much more efficient and straightforward.
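To make the rule-embedding idea above concrete, here is a minimal, purely hypothetical sketch (the class, whitelist, and checks are illustrative assumptions, not any exchange's smart-contract code) of compliance checks executing inside the transfer itself rather than as a separate post-trade step:

```python
# Illustrative sketch: compliance rules embedded directly in the
# transfer flow, in the spirit of a permissioned tokenised asset.
# TokenisedAsset, the whitelist, and all names are hypothetical.

class ComplianceError(Exception):
    pass

class TokenisedAsset:
    def __init__(self, whitelist):
        self.balances = {}
        self.whitelist = set(whitelist)  # e.g. KYC-approved accounts

    def mint(self, account, amount):
        self._require_whitelisted(account)
        self.balances[account] = self.balances.get(account, 0) + amount

    def transfer(self, sender, receiver, amount):
        # Compliance checks run as part of the transaction itself,
        # not as a later manual reconciliation step.
        self._require_whitelisted(sender)
        self._require_whitelisted(receiver)
        if self.balances.get(sender, 0) < amount:
            raise ComplianceError("insufficient balance")
        # Atomic settlement: both legs update together or not at all.
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount

    def _require_whitelisted(self, account):
        if account not in self.whitelist:
            raise ComplianceError(f"{account} is not KYC-approved")
```

A transfer to a non-approved account simply cannot complete, which is the behavioural difference from a workflow where compliance is checked after the fact.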

Looking ahead three to five years, what would a mature institutional tokenisation market look like in practice — and what would need to have gone right for it to reach that point?

In a mature institutional tokenisation market, tokenised assets shouldn’t feel different at all. They would simply be another format through which capital moves. Institutions would be able to issue, trade, settle, and custody tokenised equities, bonds, funds, or commodities with the same confidence they have today in traditional markets, but with far greater efficiency.

For that to happen, a few things need to go right. First, standards around disclosure, asset backing, custody, and settlement need to be consistent across jurisdictions.

Second, interoperability must improve, so tokenised assets can move seamlessly between platforms, custodians, and settlement layers. Third, regulation needs to provide clarity without over-engineering, allowing institutions to participate while still protecting market integrity.

Most importantly, tokenisation must prove it delivers real benefits: faster settlement, better capital efficiency, and broader access, not just technological novelty. When those advantages are clear and repeatable, institutional adoption follows naturally.