
Best Practices in Decision-Making in Volatile Markets


The finance industry has learned many lessons over the last couple of years about business decision-making in a volatile market environment. The ability to rapidly respond to geopolitical events or market crashes can have a significant impact on both reducing risk and boosting profitability, writes Virginie O’Shea, CEO and Founder of Firebrand Research. Doing so, however, requires a good handle on both internal and external data sets, which is challenging in an operational architecture where data sits in silos across the business, Ms. O’Shea explains in this article.

There are many pressures on firms to get a better handle on their data assets. Regulators have been gradually reducing the time in which financial institutions have to answer queries and increasing reporting requirements across all areas of the business. Client expectations around reporting and service response times have increased as transparency and governance have come to the fore. As digitalization increases across sectors, firms are also dealing with a much higher volume of data in a variety of formats, from structured to unstructured.

The typical manual reporting process, in which data is aggregated via spreadsheets and the IT team is relied upon to normalize data from upstream systems, is impractical in today’s markets. The output is also clunky and static, which means it is almost impossible to interrogate or actively analyze the data. Given the number of reports a firm must produce to satisfy business, client and regulatory requirements, it is also a significant internal cost and operational burden in terms of staff hours. Bottlenecks in IT and operations can frequently delay report production, which could result in business opportunities being lost, clients becoming dissatisfied and regulatory compliance obligations being missed.

One large financial institution with numerous legacy systems, which had previously been struggling with its process for producing risk, financial and regulatory reports, opted to deploy a smart data fabric to automate and industrialize the process. Like many other large financial institutions, the firm has silos that reflect its history of acquisitions, and the complexity of its operational architecture made manual report production extremely slow. The firm was producing reports on data that was around 10 to 14 days old and relying on IT to answer queries about the underlying data, which took additional days or even weeks to deliver. Stale data combined with slow response times was negatively impacting a range of internal and external business functions.

The business case for deploying a more real-time and intelligent approach to producing reports was therefore all about reducing costs and risks, improving the productivity of its staff and enabling business users to self-serve. Large financial institutions such as the global bank in question are also under heavy pressure from regulators to enhance their risk and regulatory reporting architectures; hence the reputational risk posed by financial penalties was also a factor for consideration.

The bank’s requirements included a technology platform that could be easily integrated with a variety of different systems and could support a wide range of data types, including high-volume transactional information. Given that many similar firms are undergoing digital transformation programs, the variety of systems with which tools such as these must be integrated is only going to increase over the next few years as the gradual move from legacy platforms to more modern technologies continues.

The self-service aspect of the bank’s requirements reflects its desire to move away from reliance on over-burdened IT and operations staff. By enabling business users to directly query and report on risk and finance data, not only are these activities much quicker, but they also involve far fewer additional staff. However, for these users to be comfortable using such a tool, the interface needed to be simple to use and fit for purpose from a business user perspective.

Another significant benefit of moving from manual processes to an automated system is that the firm has greater transparency when it comes to the provenance of the data that is used within its reports. The data lineage is accessible from a centralized system, which greatly improves the audit trail and reduces data risk. Data quality and integrity issues can be resolved more rapidly as the source systems can be easily identified and any underlying problems addressed, if required. When regulators or clients come knocking to ask questions, the firm is better able to address those queries in the required timeframe.
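To illustrate why centralized lineage shortens those response times, the small sketch below keeps a record of which upstream system and transformation produced each reported figure, so a question about a number in a report can be traced straight back to its source. It is a minimal, self-contained illustration in Python; the field names, source systems and timestamps are hypothetical and are not taken from the bank in question or any particular product.

```python
from dataclasses import dataclass

# Hypothetical lineage records: each reported field keeps a pointer to the
# upstream system and the transformation that produced it.
@dataclass
class LineageRecord:
    report_field: str
    source_system: str
    source_field: str
    transformation: str
    loaded_at: str

LINEAGE = [
    LineageRecord("total_credit_exposure", "loan_book_system", "outstanding_balance",
                  "sum by counterparty", "2022-05-12T06:00:00"),
    LineageRecord("liquidity_buffer", "treasury_system", "hqla_holdings",
                  "haircut-adjusted sum", "2022-05-12T06:05:00"),
]

def trace(report_field: str) -> list[LineageRecord]:
    """Return every source that feeds a given report field."""
    return [r for r in LINEAGE if r.report_field == report_field]

# A regulator or client query about a reported figure resolves to its sources
# directly, without a multi-day round trip through IT.
for record in trace("total_credit_exposure"):
    print(f"{record.report_field} <- {record.source_system}.{record.source_field} "
          f"({record.transformation}, loaded {record.loaded_at})")
```

The point of the sketch is simply that lineage stored alongside the reported data turns an audit question into a lookup rather than an investigation.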

From a purely business perspective, decision-makers are able to access the data they need on demand and make informed decisions based on current data, including live transactional information. Latency is minimized and risk assessments are much more accurate from a market and client standpoint. Meeting service level agreements (SLAs) is much less challenging and reports are available as and when they are required by clients.

Many firms have already invested time and money into their data architectures over the last decade with a business improvement goal in mind. Some firms have built data lakes to pool their data assets from across the organization, but these centralized architectures haven’t quite lived up to the business’s expectations in most instances. Though the data is available for basic historical analysis, it isn’t easily manipulated and interrogated from a more in-depth analytical perspective. Data lakes also often struggle to support real-time data analysis at scale, which necessitates an additional layer on top of the lake to normalize and render the data fit for purpose.

Business users want to be able to make the best use of existing investments rather than starting from scratch, so building on top of the lake is the best option. Think of a data fabric layer as being much like a water treatment plant, filtering the mud out of the data lake and ensuring the data is consumable by downstream functions. If a business user receives a query from a client, they can access the data residing in the data lake via the data fabric and respond to complex queries in real time, regardless of the type of data involved. The output could be for portfolio analysis, risk management, compliance or any other business purpose.
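To make the idea concrete, the toy sketch below shows the kind of question a fabric-style layer answers: a stale lake snapshot of positions is combined with live transactions so a business user sees current exposure on demand rather than data that is days old. It is plain Python using the standard-library sqlite3 module; the table names, instruments and figures are invented for the example and this is not any particular vendor's API.

```python
import sqlite3

# Toy illustration only: the "lake" table holds a stale positions snapshot,
# while "live_trades" stands in for the high-volume transactional feed a
# fabric layer would combine with it.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE lake_positions (account TEXT, instrument TEXT, quantity REAL, as_of TEXT);
CREATE TABLE live_trades   (account TEXT, instrument TEXT, quantity REAL, traded_at TEXT);

INSERT INTO lake_positions VALUES
  ('ACC-1', 'XYZ Corp 2030 bond', 1000, '2022-05-01'),
  ('ACC-1', 'ABC Corp equity',     500, '2022-05-01');

INSERT INTO live_trades VALUES
  ('ACC-1', 'XYZ Corp 2030 bond', -250, '2022-05-12T09:31:00'),
  ('ACC-1', 'ABC Corp equity',     120, '2022-05-12T10:02:00');
""")

# One federated-style query: current exposure = stale snapshot + live activity.
query = """
SELECT p.account,
       p.instrument,
       p.quantity + COALESCE(SUM(t.quantity), 0) AS current_quantity
FROM lake_positions p
LEFT JOIN live_trades t
  ON t.account = p.account AND t.instrument = p.instrument
GROUP BY p.account, p.instrument
"""

for account, instrument, qty in con.execute(query):
    print(f"{account} {instrument}: {qty}")
```

In a real deployment the fabric would push this kind of join down to the lake and the transactional systems rather than copying everything into one database, but the business-level question it answers is the same.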

These challenges are also not limited to the banking sector; asset managers are also keen to better support their portfolio optimization, operational reporting, and client reporting functions with clean, rapidly delivered data. The M&A activity within the asset management community has been significant over the last few years, which means that, much like their banking counterparts, many of these firms are also struggling with unwieldy, siloed operational architectures. Aggregating data from multiple source systems and providing the various business, compliance and risk functions with quick and easy data access, whether directly or via integration into downstream systems, also makes a difference in the buy-side realm.

No technology can solve all of your business problems, but the right tools can certainly help when it comes to dealing with the rapid pace of the current and future market environment.


Virginie O’Shea is CEO and Founder of Firebrand Research. As an analyst and consultant, Ms. O’Shea specializes in capital markets technology, covering asset management, international banking systems, securities services and global financial IT.
This blog post was originally posted on Tabb Forum.
