Good data pays off with smoother-functioning IT

By Paul Bebber
23 July 2018

The benefits of effective data management are not limited to extracting valuable client insights and honing your investment edge. Good data can also make your IT infrastructure run faster and more smoothly.


Data matters

Data quality and availability have become the most important determinants of investment managers’ success.

That’s no exaggeration. As we’ve reiterated throughout this data management blog series, fingertip access to accurate, timely and far-reaching data flows, on everything from clients’ profiles and objectives to the latest market news and asset movements, is fundamental to enhancing investment performance, client servicing, regulatory compliance and operating efficiencies.

But a collection of legacy systems, tactical patches, manual interventions and inconsistently formatted information won’t get your data and wider operating environment to where they need to be, especially given the growing premium on accuracy and speed in every phase of the trade lifecycle and client relationship.


The catch-22 of outmoded infrastructures and data problems

The lack of an integrated technology infrastructure that allows for easy data collection, normalisation, enrichment, storage and distribution is a common problem across the industry.

Often we see firms wrestling with some combination of:

  • Excel spreadsheets and local data stores—where data is siphoned off into individual end-user desktops, databases and departments and manipulated there. The result is a lack of proper data control, the loss of useful information and multiple versions of the truth.
  • Siloed data—in which data is managed at a point in time for the benefit of that silo, whether it is a system or business unit. Most of the data views are inward-facing and application-specific, based on an IT infrastructure acquired for a particular purpose.
  • Web-like infrastructure—where investment managers have bought and developed a multitude of systems over time. But as data has to be moved around this sprawling, point-to-point infrastructure, there is a high risk of it being changed or corrupted, putting huge pressure on audit and change management.

All of which leads to poor-quality data that, in turn, flows through to downstream systems and affects every process along the way. In response, firms typically have to build a multitude of extensions and rely on manual workarounds to satisfy their data needs, leading to high-cost, non-scalable operating environments.
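To make the “multiple versions of the truth” problem concrete, here is a minimal Python sketch. The client record and its fields are invented for illustration; the point is how two departmental copies of the same record silently drift apart once each is edited locally:

```python
from copy import deepcopy

# Hypothetical "golden" client record held in a shared system.
golden = {"client_id": "C-1001", "risk_profile": "balanced", "email": "jane@example.com"}

# Two departments take local copies (e.g. into spreadsheets or desktop databases).
front_office = deepcopy(golden)
compliance = deepcopy(golden)

# Each side edits its own copy independently...
front_office["risk_profile"] = "growth"       # updated after a client review
compliance["email"] = "jane.doe@example.com"  # corrected during a KYC refresh

# ...and the firm now holds three inconsistent versions of the same client.
def diff(a: dict, b: dict) -> dict:
    """Fields where two copies of a record disagree."""
    return {k: (a[k], b[k]) for k in a if a[k] != b[k]}

print(diff(front_office, compliance))
# {'risk_profile': ('growth', 'balanced'), 'email': ('jane@example.com', 'jane.doe@example.com')}
```

Nothing in this setup detects the divergence until someone runs a reconciliation; until then, each department is making decisions on its own version of the client.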


Creating a seamless, efficient infrastructure

There is no easy solution to these challenges. Many well-intentioned, high-profile data management projects have fallen by the wayside. But there are some key steps on the road to true data quality.

One is to define which data is owned by each department, system and area within the firm, so you can see where your client, market, counterparty and enterprise data is stored and managed.
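In practice, that ownership map can be captured as a simple machine-readable registry. The sketch below is illustrative only; the domain names, systems and teams are hypothetical, not a prescribed taxonomy:

```python
# Hypothetical ownership registry: each data domain mapped to the system
# of record and the team accountable for it.
DATA_OWNERSHIP = {
    "client":       {"system": "CRM",             "owner": "Client Services"},
    "market":       {"system": "Market Data Hub", "owner": "Front Office"},
    "counterparty": {"system": "Reference DB",    "owner": "Operations"},
    "enterprise":   {"system": "Finance Ledger",  "owner": "Finance"},
}

def authoritative_source(domain: str) -> str:
    """Return the system of record for a data domain, or flag a governance gap."""
    try:
        entry = DATA_OWNERSHIP[domain]
    except KeyError:
        raise ValueError(f"No owner defined for domain '{domain}' - a governance gap")
    return f"{entry['system']} (owned by {entry['owner']})"

print(authoritative_source("counterparty"))  # Reference DB (owned by Operations)
```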

The next is to establish an internal data store through which external data feeds flow, one that can serve as a consolidated source for your downstream processes. An automated data hub, set up to interoperate with your other technology solutions and backed by robust policies to maximise the quality and integrity of your data, will give you the centralised store of readily accessible, uncorrupted information that has become so integral to success.
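A toy version of such a hub might look like the Python sketch below. The vendor feeds, field layouts and validation rule are assumptions made for illustration; the point is that every feed is normalised and validated once, on the way into a single store that all downstream processes read:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass(frozen=True)
class Price:
    """Normalised internal representation of a price record."""
    instrument: str
    price: float
    currency: str

# Each external feed gets a normaliser mapping its raw layout into Price.
# The vendor names and field layouts here are hypothetical.
def from_vendor_a(raw: dict) -> Price:
    return Price(raw["ric"], float(raw["last"]), raw["ccy"])

def from_vendor_b(raw: dict) -> Price:
    return Price(raw["symbol"], float(raw["px"]), raw["currency"].upper())

class DataHub:
    """A toy centralised store: validate on the way in, serve many consumers."""
    def __init__(self) -> None:
        self._store = {}

    def ingest(self, raw: dict, normalise: Callable[[dict], Price]) -> None:
        record = normalise(raw)
        if record.price <= 0:  # one quality gate, applied to every feed
            raise ValueError(f"Rejected bad price for {record.instrument}")
        self._store[record.instrument] = record  # single version of the truth

    def get(self, instrument: str) -> Price:
        return self._store[instrument]

hub = DataHub()
hub.ingest({"ric": "VOD.L", "last": "228.4", "ccy": "GBP"}, from_vendor_a)
hub.ingest({"symbol": "AAPL", "px": "191.3", "currency": "usd"}, from_vendor_b)
print(hub.get("AAPL"))  # every downstream process reads the same record
```

Because normalisation and validation happen once at the hub boundary, downstream systems never see vendor-specific formats, and a quality rule changed in one place applies to every feed.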


Taking advantage of the cloud

The benefits of this kind of centralised data service can be magnified further when it is delivered through a hosted cloud environment rather than deployed locally.

Ease of maintenance is a case in point.

Whenever a market data provider, custodian or counterparty upgrades or changes one of its interfaces, every locally installed solution has to be updated separately. In a hosted environment, the service provider need only make the update to the hosted infrastructure once. So capability enhancements, such as upgraded interfaces, new data sources or improved reporting tools, can be rolled out to users and supported much faster and more efficiently, with less operational risk and minimal disruption.
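The maintenance saving comes from isolating each external interface behind a single adapter in the hosted hub. In this hypothetical Python sketch (the custodian’s “v1” and “v2” feed layouts are invented), an interface change means updating one adapter centrally, while downstream consumers are untouched:

```python
# Toy illustration: downstream consumers depend only on the hub's internal
# format, so when a custodian changes its feed layout, only one adapter changes.

def custodian_adapter_v1(raw: dict) -> dict:
    """Adapter for the custodian's original feed layout."""
    return {"account": raw["acct"], "balance": float(raw["bal"])}

def custodian_adapter_v2(raw: dict) -> dict:
    """The custodian renamed its fields; only this adapter needed updating."""
    return {"account": raw["account_id"], "balance": float(raw["balance_minor"]) / 100}

def report_balance(position: dict) -> str:
    """A downstream consumer - unchanged across the interface upgrade."""
    return f"{position['account']}: {position['balance']:.2f}"

# In a hosted hub, swapping v1 for v2 happens once, centrally:
active_adapter = custodian_adapter_v2
print(report_balance(active_adapter({"account_id": "A-77", "balance_minor": "1234567"})))
```

In a locally installed world, every firm would have to make the v1-to-v2 change itself; in the hosted model, the service provider swaps the adapter once for everyone.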

But whether firms favour an internally installed or cloud-based approach, the focus needs to be on smoother data flows. That will mean fewer infrastructure stress points, better system integration, faster processing, less risk and improved operating efficiencies.