Risk management is modernising fast as financial institutions use more varied types of data and seek to implement AI-based innovation.
A discussion among financial-sector risk management and data experts concluded that risk management is becoming more technical while remaining the province of people with advanced mathematical skills. The participants' desire to harness AI, and their demand for near-real-time data from a huge variety of sources, is exposing the need for a new approach, such as a data fabric.
Large language models, for example, are set to extend the information available to risk management. They can rapidly summarise large volumes of publicly available information, from financial reports and social media sentiment to alternative data such as satellite imagery.
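As a rough illustration of that summarisation step, the sketch below condenses a passage of report-style text using an open-source model via the Hugging Face transformers library; the library, default model and sample text are assumptions for illustration, not a statement of how any particular institution deploys LLMs.

```python
# Minimal sketch: summarising public financial text with an open-source model.
# Assumes the Hugging Face `transformers` library and its default
# summarisation model; used purely for illustration.
from transformers import pipeline

summariser = pipeline("summarization")

report_excerpt = (
    "The group reported a 12% rise in net interest income, driven by higher "
    "rates, while loan-loss provisions increased as the outlook for commercial "
    "real estate deteriorated. Management flagged elevated refinancing risk in "
    "its European property portfolio over the next eighteen months."
)

summary = summariser(report_excerpt, max_length=40, min_length=10, do_sample=False)
print(summary[0]["summary_text"])
```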
Financial organisations need to modernise their data architectures so they have clean and dependable data from this ever-broadening variety of sources. They need to ingest and harmonise data far more quickly and reliably, to employ natural language processing and Python, and to analyse large historical datasets. These capabilities must straddle data in the cloud and on-premises, and both modern and legacy systems. Organisations also need to enable a permissioned level of self-service for risk management teams and the quants working with them. These are tough requirements to meet without a new architectural approach.
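To make "harmonise" concrete, the sketch below pulls the same kind of exposure data from a cloud export and an on-premises extract into one canonical schema using pandas; the file layouts, column names and figures are hypothetical.

```python
# Sketch: harmonising exposure data from two sources into one canonical schema.
# File layouts, column names and values are hypothetical.
import io
import pandas as pd

# Source 1: cloud platform export (CSV, ISO dates, USD amounts)
cloud_csv = io.StringIO(
    "counterparty_id,as_of_date,exposure_usd\n"
    "CP001,2024-03-28,1500000\n"
    "CP002,2024-03-28,820000\n"
)
cloud = pd.read_csv(cloud_csv, parse_dates=["as_of_date"])

# Source 2: on-premises extract (different column names and date format)
onprem = pd.DataFrame(
    {
        "cpty": ["CP003", "CP004"],
        "val_date": ["28/03/2024", "28/03/2024"],
        "amount": [430000, 275000],
    }
)
onprem = onprem.rename(
    columns={"cpty": "counterparty_id", "val_date": "as_of_date", "amount": "exposure_usd"}
)
onprem["as_of_date"] = pd.to_datetime(onprem["as_of_date"], format="%d/%m/%Y")

# Harmonised view: one schema, ready for downstream risk analytics
exposures = pd.concat([cloud, onprem], ignore_index=True)
print(exposures)
```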
New regulation is also having an effect in Europe. The EU AI Act, for example, shifts the focus from the mathematical aspects of the technology to ethical constraints, providing a legal framework. Some of the most significant implications for financial risk management are the requirements for logging and documentation of AI models, so that organisations are transparent about what they are doing with data.
Organisations must ensure they do not corrupt the original “golden source” as they slice and dice data for different purposes. These are increasingly important considerations in regulatory compliance functions in many jurisdictions.
Organisations need to address ownership and governance, working on the basis that employees will care more about data quality and standards if they are the people who use the data.
A balance needs to be struck between centralised control and enabling decision-makers to achieve more with data at speed. This is one of the reasons why the temporality of data, knowing what was known and when, is gaining greater importance.
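One way to picture temporality is a point-in-time ("as-of") join: each trade is matched to the credit rating that was actually in force when the trade was booked, not the latest rating on record. The pandas sketch below, with made-up data and column names, illustrates the idea.

```python
# Sketch: point-in-time (as-of) join, illustrating why temporality matters.
# Data and column names are illustrative only.
import pandas as pd

ratings = pd.DataFrame(
    {
        "counterparty_id": ["CP001", "CP001", "CP001"],
        "effective_from": pd.to_datetime(["2024-01-10", "2024-02-20", "2024-03-15"]),
        "rating": ["A", "BBB", "BB"],
    }
).sort_values("effective_from")

trades = pd.DataFrame(
    {
        "counterparty_id": ["CP001", "CP001"],
        "trade_date": pd.to_datetime(["2024-02-25", "2024-03-20"]),
    }
).sort_values("trade_date")

# Each trade picks up the rating in force on its trade date,
# rather than the most recent rating on record.
point_in_time = pd.merge_asof(
    trades,
    ratings,
    left_on="trade_date",
    right_on="effective_from",
    by="counterparty_id",
    direction="backward",
)
print(point_in_time[["counterparty_id", "trade_date", "rating"]])
```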
A smart data fabric approach will fulfil these requirements without disruption and provide clean, trustworthy data from whichever sources an organisation uses. Crucially, the smart data fabric avoids introducing latency into these processes. It works across any set of systems and facilitates the use of analytics.
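The sketch below is a generic illustration of the federated-query idea behind a data fabric, using DuckDB to join a Parquet extract with an in-memory DataFrame without first copying everything into a central store; it is not the InterSystems API, and all names and figures are hypothetical.

```python
# Generic sketch of querying data across sources in place.
# Illustrates the data-fabric idea only; not any vendor's API.
import duckdb
import pandas as pd

# A Parquet extract standing in for a data-lake / warehouse source
duckdb.sql(
    "COPY (SELECT 'CP001' AS counterparty_id, 1500000 AS exposure_usd "
    "UNION ALL SELECT 'CP002', 820000) TO 'positions.parquet' (FORMAT PARQUET)"
)

# An in-memory frame standing in for a live operational feed
limits = pd.DataFrame(
    {"counterparty_id": ["CP001", "CP002"], "limit_usd": [1000000, 2000000]}
)

# Join both sources where they sit, without staging them centrally
breaches = duckdb.sql(
    """
    SELECT p.counterparty_id, p.exposure_usd, l.limit_usd
    FROM read_parquet('positions.parquet') AS p
    JOIN limits AS l USING (counterparty_id)
    WHERE p.exposure_usd > l.limit_usd
    """
).df()
print(breaches)
```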
The smart data fabric also helps address the shortage of talent. Organisations often lack individuals with the ability to set up and implement advances in technology, and the smart data fabric automates many data preparation tasks. But partnering remains very important: partner organisations can build on the domain expertise of those within the financial firm.
AI has opened up many opportunities but requires a strategy and business case to move into production for risk management. The smart data fabric is emerging as the simplest and most efficient way to resolve the challenges of implementation so organisations can use their talents and drive forward innovation with AI. This is fast becoming a competitive and compliance necessity as data and regulation constantly change.
Find out more about InterSystems Data Fabric Studio