A seminar organized by the ULI Singapore NEXT Committee introduced attendees to real estate "tokenization," or fractional investing/trading, as a potential bridge between private investors and direct property ownership. Although not new, tokenization in real estate is a niche market, particularly in Asia Pacific, with Singapore hosting a small number of specialized digital platforms.
The seminar brought together experts in real estate, tokenization, and capital markets to discuss the potential of this approach to real estate ownership. A straw poll revealed that few attendees considered themselves knowledgeable about the topic. The experts explained that tokenization can be attractive to private real estate investors because it addresses four key challenges: high entry barriers, limited access to vetted investment opportunities, the burden of active asset management, and poor liquidity.
Fraxtor CEO Lee said his company uses blockchain to "tokenize" real estate assets, giving investors direct access to the underlying properties. Tokens avoid stock-market volatility, and minimum investment sizes sit above those of REITs but below those of private equity funds. The panel agreed that more regulatory work and investor education are needed for tokenization to reach a wider retail audience.
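To make the fractional-ownership idea concrete, the sketch below shows how splitting one property into tokens lowers the entry ticket. It is purely illustrative: the asset, figures, and the `TokenizedAsset` class are hypothetical, and real platforms such as Fraxtor record ownership on a blockchain with regulated structures rather than in a Python object.

```python
from dataclasses import dataclass, field

@dataclass
class TokenizedAsset:
    """Toy fractional-ownership ledger for a single property.

    All figures are hypothetical; this does not model any
    platform's actual blockchain implementation.
    """
    name: str
    value: float        # total asset value
    total_tokens: int   # number of fractions the asset is split into
    holdings: dict = field(default_factory=dict)  # investor -> token count

    @property
    def token_price(self) -> float:
        return self.value / self.total_tokens

    def buy(self, investor: str, amount: float) -> int:
        """Convert a cash amount into whole tokens for an investor."""
        tokens = int(amount // self.token_price)
        if tokens == 0:
            raise ValueError("amount is below the price of one token")
        self.holdings[investor] = self.holdings.get(investor, 0) + tokens
        return tokens

# A S$10m property split into 100,000 tokens prices each token at S$100;
# the platform can then set a minimum ticket (say S$20,000) well below a
# private equity fund's typical commitment, though above a single REIT unit.
asset = TokenizedAsset("12 Example Road", value=10_000_000, total_tokens=100_000)
asset.buy("investor_a", 20_000)  # 200 tokens, i.e. 0.2% of the asset
print(f"Token price: S${asset.token_price:,.0f}")
print(asset.holdings)
```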
Knight Frank's Chay emphasized the need for careful evaluation, encouraging potential investors to do their homework. Moderator Seet asked which types of real estate assets can be tokenized and where investors have been most keen to allocate their capital. Lee replied that while assets of all types in many locations can be tokenized, Fraxtor's investors tend to prefer developed markets such as Singapore, Australia, the UK, and Japan.
Tokenization can also be used to invest in a private equity real estate fund or in real estate debt. Chay suggested that investment managers could use tokenization to raise co-investment capital for individual projects or even for the funds themselves. The panelists concurred that further regulatory development is needed to make tokenization more efficient, transparent, and secure.
