
Surviving in the New, High-Speed World of Financial Services

Increasing trade volumes and periods of high market volatility can create significant challenges for financial services firms' data management infrastructure.

This is especially true in front- and middle-office applications in capital market firms. Sell-side firms in particular can experience extremely high transaction volumes, since they partition already high volumes of incoming orders into even more orders for execution. At the same time, they must support a high volume of concurrent analytic queries to provide information on order status, risk management, compliance, surveillance, and other key metrics, for internal and external clients. This requirement for multi-workload processing at very high scale — coupled with the need for the highest levels of performance and reliability, and a low total cost of ownership — has been difficult to achieve.

A leading global bank improved data throughput 500%, reduced latency 10-fold, and lowered operating costs by 75% compared with its previous in-memory DBMS, all without a single incident since its initial implementation.

Compounding the challenge is the fact that transaction volumes not only grow incrementally and within expectations, but also often spike dramatically in response to unexpected world events. Recent examples include the 2008 financial crisis, 2010 Flash Crash, devaluation of China’s currency in 2015, Brexit, trade wars, and many other political events.

The data platform underlying a firm’s real-time and near real-time front- and middle-office applications is a critical component of its technology infrastructure. The applications must be extremely reliable and highly available — able to withstand both normal transaction volume growth and the extreme spikes that can occur during periods of market volatility, without incident.

A failure, or even just a slowdown, of the underlying data management infrastructure can have severe consequences for a firm. For example, with in-memory database technologies, it can take minutes or hours to rebuild the database and resume normal operations after a failure. In the meantime, the firm’s ability to process additional trades and provide order status and other critical information is compromised, and financial losses mount.

Even a slight delay or outage can cause significant financial losses and impact a firm’s reputation. One major bank recently reported a loss of $100,000 for each minute that its order management system was down.

Example: Order Management System

An order management system is a critical component of a bank’s technology platform. It must record all orders originating from both clients and internal sources, ensure proper routing and execution of the orders, maintain the state integrity of each order (for example, if an order is only partially filled), record and properly allocate all trade executions, and preserve all data, while concurrently processing analytic workloads on the trade data. It is absolutely mission-critical; it cannot slow down, drop trades, or go dark, regardless of market volume or volatility.
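To make the state-integrity requirement concrete, the following is a minimal, hypothetical Python sketch (the class and field names are illustrative, not part of any real order management system). It shows the invariant such a system must preserve: every execution is recorded, a partial fill leaves the order in a well-defined intermediate state, and no fill can exceed the remaining quantity.

```python
from dataclasses import dataclass, field

@dataclass
class Order:
    """Toy model of an order whose lifecycle state must stay consistent."""
    order_id: str
    quantity: int
    filled: int = 0
    fills: list = field(default_factory=list)  # every execution is preserved

    @property
    def state(self) -> str:
        # The state is derived from recorded fills, so it can never
        # disagree with the execution history.
        if self.filled == 0:
            return "open"
        if self.filled < self.quantity:
            return "partially_filled"
        return "filled"

    def record_fill(self, qty: int) -> None:
        # Reject fills that would over-allocate the order.
        if qty <= 0 or self.filled + qty > self.quantity:
            raise ValueError("fill exceeds remaining quantity")
        self.fills.append(qty)
        self.filled += qty

order = Order("ORD-1", quantity=1000)
order.record_fill(400)
print(order.state)   # partially_filled
order.record_fill(600)
print(order.state)   # filled
```

A production system must maintain this invariant durably and at extreme message rates, which is precisely where the data platform beneath it is tested.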

To handle growth and volatility without performance or availability issues, a data platform must balance transactional workloads with the concurrent analytic demands of downstream applications. Financial services organizations must be able to process millions of incoming messages per second while simultaneously supporting thousands of analytic queries per second from hundreds of downstream systems reporting on order state and other metrics.
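The mixed workload can be illustrated with a deliberately simplified Python sketch (names and data are hypothetical): one writer ingests order messages while an analytic query, e.g. a count of orders by state, runs against the same live data set. Real platforms achieve this with far more sophisticated concurrency control than a single lock, but the shape of the problem is the same.

```python
import threading
from collections import Counter

# Shared live order store; in a real system this is the data platform.
orders = {}
lock = threading.Lock()

def ingest(n: int) -> None:
    """Transactional side: write n order messages into the store."""
    for i in range(n):
        with lock:
            orders[f"ORD-{i}"] = "open" if i % 2 else "filled"

def query_states() -> Counter:
    """Analytic side: report how many orders are in each state."""
    with lock:
        return Counter(orders.values())

writer = threading.Thread(target=ingest, args=(10_000,))
writer.start()
writer.join()
print(query_states())
```

The toy works, but a single global lock is exactly the bottleneck that makes naive designs collapse at millions of messages per second; the point of a purpose-built platform is to serve both sides without that serialization.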

Traditional operational databases are too slow to accommodate the high throughput and data-access rates required. And in-memory databases alone are not sufficient for many applications for a number of reasons:

  • Scale limitations. Because the data in an in-memory database is stored in main memory, the working data set is limited by the available amount of memory. As a result, as data volumes and/or analytic query workloads increase, at some point both the transaction processing and the analytic queries will slow or stall.
  • System downtime. Because the data is stored in memory, if the database server fails, the data resident in memory on that server is lost. Some in-memory database systems offer persistence through mirrored databases, replication, and other approaches, but these techniques can degrade ingest performance, raise costs, and increase maintenance complexity. Where persistence relies on checkpoint files and transaction logs, recovery involves rebuilding the database from the logs, checkpoints, and other backup data. This is a time-consuming process, during which the bank’s ability to process orders is compromised, resulting in lost revenue and other penalties to the business.
  • High costs. Scaling in-memory systems is expensive. And because servers have hard memory limits, scaling in-memory databases beyond these limits requires firms to purchase additional nodes to sustain normal operations and allow headroom for unexpected volatility, which increases costs.
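The recovery problem described above can be sketched in a few lines of Python. This is a hypothetical, stripped-down illustration (file formats and names are invented): after a crash, the in-memory state must be rebuilt by loading the last checkpoint and then replaying every logged transaction since, so recovery time grows with the size of the log.

```python
import json
import os
import tempfile

def recover(checkpoint_path: str, log_path: str) -> dict:
    """Rebuild in-memory state: load the last checkpoint, replay the log."""
    with open(checkpoint_path) as f:
        state = json.load(f)              # last consistent snapshot
    with open(log_path) as f:
        for line in f:                    # replay every transaction since
            entry = json.loads(line)
            state[entry["key"]] = entry["value"]
    return state

# Simulate a checkpoint plus a transaction log written before a crash.
tmp = tempfile.mkdtemp()
cp = os.path.join(tmp, "checkpoint.json")
log = os.path.join(tmp, "txn.log")
with open(cp, "w") as f:
    json.dump({"ORD-1": "open"}, f)
with open(log, "w") as f:
    f.write(json.dumps({"key": "ORD-1", "value": "filled"}) + "\n")
    f.write(json.dumps({"key": "ORD-2", "value": "open"}) + "\n")

print(recover(cp, log))  # {'ORD-1': 'filled', 'ORD-2': 'open'}
```

With millions of logged transactions, this replay loop is what turns a server failure into minutes or hours of downtime.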

A New Approach

Fortunately, there is a new approach that delivers performance equal to or better than that of an in-memory database, but with none of the compromises. InterSystems IRIS® data platform provides the durability and reliability of a traditional operational database, but with better resource efficiency and a lower total cost of ownership. Unlike both in-memory and traditional operational databases, it is optimized for extremely high performance for both transactions and concurrent analytical processing, without incident or performance degradation, even during periods of extreme market volatility.

This data platform delivers fast transactional and analytic performance without sacrificing scalability, reliability, or security. It handles relational, object, document, key-value, and multi-dimensional data in a common, persistent storage tier, without any replication of the data.

Unlike with traditional in-memory databases, the data is always stored on disk in a format optimized for random access, so there is never a need to rebuild the database after a failure.

The data platform offers a unique set of features that makes it highly attractive for mission-critical transactional-analytic applications, including:

  • High performance for transactional workloads, with built-in persistence,
  • High performance for analytic workloads,
  • Consistent high performance for concurrent transactional and analytic workloads at scale, and
  • Lower total cost of ownership compared with in-memory technologies.

Conclusion

The high-speed world of financial services presents some of the most demanding requirements for technology infrastructures.

Fortunately, there is a technology that can meet these seemingly conflicting requirements: processing both transactions and analytic queries concurrently, at very high scale, with the highest levels of reliability even when markets spike, and with a low total cost of ownership.

For more information about InterSystems IRIS data platform, visit InterSystems.com/Financial.

InterSystems is the information engine that powers some of the world’s most important applications. In healthcare, business, government, and other sectors where lives and livelihoods are at stake, InterSystems has been a strategic technology provider since 1978. InterSystems is a privately held company headquartered in Boston, Massachusetts (USA), with offices worldwide, and its software products are used daily by millions of people in more than 80 countries.

 
