How asset managers can avoid being squashed by big data

By Eric Dickinson 

Fund providers’ tender teams are being overburdened, more than ever before, with requests for information (RFIs) and requests for proposal (RFPs).

This highlights the ever-increasing need for readily available due diligence information, held in a central repository, to enable instant assessments of funds and the firms behind them.

Fund buyers are overwhelmed by the sheer volume of data presented to them by way of fact sheets, presentations, prospectuses, research and due diligence reports – all in different formats and therefore inconsistent in how they present information.

Additionally, the information often covers areas we already know about and answers questions ambiguously. Comparisons become impossible to make.

In my last article for Wealth Manager, I discussed the benefits of an industry-standard ‘golden-source’ data repository of fund due diligence information, which included:

  • Information on all funds would be accessible, answering the questions raised in due diligence processes and collated by a broad spectrum of wealth managers and advisers
  • It would enable ease of comparison between funds by presenting information in a consistent and unambiguous manner
  • Information would be maintained by fund managers and providers
  • Technology would do the ‘heavy lifting’, avoiding the need to manually collate the necessary information
  • A documented, time-stamped audit trail would give proof that fund buyers had undertaken full due diligence on a fund prior to transacting

These benefits would lead to a more efficient industry, reducing costs for both fund buyers and fund providers and, ultimately, for the investor.

The regulator’s desire to see greater understanding of, and transparency around, opaque investment products is a further reminder of the need for easily accessible information that can be used as part of a full and thorough due diligence process.

An industry-standard repository helps the market meet this need in a cost-effective and efficient way.

A technology-based repository can further improve on the historical process of issuing bespoke RFP documents by offering a hub of due diligence information that provides consistent data on funds across providers.

Volatile data alerts

Historically, due diligence has meant trawling through the small print of a lengthy prospectus, reviewing multi-page standard documentation from the fund providers, often leading to more questions than answers.

Considerable time is needed on each fund to fully understand how it operates and the risks it takes to achieve its performance. A better level of understanding also helps prepare for fund manager meetings.

The due diligence data set includes both static data, which changes very rarely, and volatile data – often key data that affects risk and changes on a regular basis.

Therefore, the need exists for volatile data alerts, whereby the fund provider keeps the fund buyer updated on risks pertaining to volatile data as it changes – such as changes to fund liquidity and leverage, to the fund management team, or to whether the fund can borrow or go short on cash.

Automated alerts may be configured by fund buyers to notify them of key fund information changes, thereby avoiding missed updates from the provider that may prove critical. They could also keep the buyer informed of important changes affecting fund risk that are not provided as a matter of course by the provider.
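As a purely illustrative sketch – not a description of any existing system – the snippet below shows how such an alert rule might look in code, assuming a fund’s due diligence record is held as a simple set of named fields. Every field name and function here is hypothetical.

    # Illustrative sketch only: a minimal volatile-data alert check (all names hypothetical).
    WATCHED_FIELDS = ["liquidity_terms", "leverage_limit", "management_team", "can_borrow", "can_short_cash"]

    def volatile_data_alerts(previous: dict, latest: dict, watched=WATCHED_FIELDS):
        """Compare two snapshots of a fund's record and flag any watched field that has changed."""
        alerts = []
        for field in watched:
            if previous.get(field) != latest.get(field):
                alerts.append(f"{field} changed: {previous.get(field)!r} -> {latest.get(field)!r}")
        return alerts

    # Example: the buyer is alerted when the leverage limit moves.
    old_snapshot = {"leverage_limit": "100%", "can_borrow": False}
    new_snapshot = {"leverage_limit": "150%", "can_borrow": False}
    for alert in volatile_data_alerts(old_snapshot, new_snapshot):
        print(alert)  # leverage_limit changed: '100%' -> '150%'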

It is clear the industry needs a standard repository of fund due diligence data to ensure that due diligence can be undertaken cost-effectively and efficiently.

Eric Dickinson is an independent consultant to the investment management industry.