
The big business of big data in liquidity risk management


How important is it to have the right technology in place to manage liquidity risk in real time? Are we getting to the point now where the majority of banks and corporates do this?

In short, there are a lot of solutions in the marketplace that claim to help manage liquidity, but the crux of the real-time problem lies in big data. The ability to delve into deeper data granularity is a big concern for a lot of companies. Technology is the key to reporting on all of this, but the challenge is making all that data available for reporting and analytics across all asset classes.

For corporates, the challenge is too many disconnected and fragmented systems, which prevents them from having a real-time global view of liquidity risk and ultimately hinders their ability to stress-test. At the outset it sounds simple, and you might assume that's what everybody is doing, but the challenges involved in reaching that point are very real.

For banks the situation is a little different, as they have a lot more money to invest in technology. Even here, though, each desk – an interest rate swaps or commodities desk, for example – will have its own systems, and while the data is there, it sits in different silos. The challenge for banks is bringing all of that data together to run analytics, and then running further predictive analytics and stress testing on those results.

From a technology point of view, I think what corporates and banks are ultimately looking for is a real-time enterprise view of liquidity, and to be able to perform stress testing and behavioural scenario analyses on that data.

These issues have been building for years, and with changing regulations like Basel III, the answers are evolving along the same path that banks and corporates are on.

Will the advancement of technology help to automate more elements of liquidity management?

The short answer is yes, and again data is at the heart of this. It is critical to be able to source accurate data on your cash flows and collateral positions, along with transactional data from different businesses and geographies, to build a real-time data structure that supports the management of liquidity risk.

Once you have that data, you need to be able to run advanced analytics at a granular level – to 'slice and dice' that data right down to the what, when and where – to discover how a certain transaction is affecting liquidity risk.
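As a minimal sketch of what 'slicing and dicing' transaction data might look like in practice, the snippet below aggregates a flat table of cash flows by asset class and settlement date. The schema, figures and column names are invented for illustration – not any particular vendor's data model:

```python
import pandas as pd

# Hypothetical transaction-level data: negative cash_flow = outflow, in $m.
tx = pd.DataFrame({
    "asset_class": ["FX", "FX", "Rates", "Commodities", "FX"],
    "currency":    ["USD", "EUR", "USD", "USD", "EUR"],
    "settle_date": pd.to_datetime(
        ["2024-06-03", "2024-06-03", "2024-06-04", "2024-06-04", "2024-06-05"]),
    "cash_flow":   [-5.0, 2.5, -1.2, 3.0, -0.8],
})

# The 'what, when and where': net cash flow per asset class, per day.
by_class_day = (tx.groupby(["settle_date", "asset_class"])["cash_flow"]
                  .sum()
                  .unstack(fill_value=0.0))
print(by_class_day)

# Worst single-day net outflow – a simple liquidity indicator.
daily_net = tx.groupby("settle_date")["cash_flow"].sum()
print("Largest daily outflow ($m):", daily_net.min())
```

The same grouping could be cut by currency or counterparty instead; the point is that once the data sits in one structure, each new question is a one-line pivot rather than a reconciliation exercise across silos.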

From a reporting perspective and taking compliance into account, automation can help. Returning to the previous example of having data in different silos, if you're reporting on a particular asset class, like currency, independently of everything else, you start to get into complicated reporting territory, potentially across multiple jurisdictions. The need to aggregate data into a single source of truth is crucial, and that's where technology can help automate things.

The market is always evolving, and so is the regulatory landscape – we started out with Basel I, and now we're up to Basel III. The fact that it's always changing affects existing risk operations and technology platforms. From OpenLink's perspective, and from a technology standpoint, it's very important to be able to adapt to changes almost as rapidly as they happen – it gives our clients a real competitive edge. Companies shouldn't be buying technology that becomes obsolete every time there's a new change, yet many are forced to keep finding workarounds outside their existing technology.

Rather than taking such a piecemeal approach, companies should be more holistic in evaluating those processes and their platforms. Consolidating data and centralising liquidity in a solution that gives you real-time data means you no longer depend on older, disparate data sources that have to be normalised before you can report on them. In turn, this lets you focus on quality and accuracy. Better data quality means more accurate and consistent data, which lends itself to more accurate reporting; it's very much 'junk in, junk out'.

Will liquidity risk management become simpler as technology improves, or will ever-changing regulatory, market and political landscapes mean that banks and corporates will have to remain on their toes?

Yes and no. From my perspective, we already have the technology to address these issues in terms of reporting improvements and data centralisation, but this technology will continue to improve.

The market will continue changing, regulation will continue to ebb and flow, and especially at this point in time, political issues can make an impact.

For example, there's talk of deregulation under the new Trump administration in the US and whether Dodd-Frank will get rolled back. At the moment it's all conjecture, but you're forced to report to the current standards anyway. Talking to our bank customers, not everybody wants to dial back Dodd-Frank entirely, because so much money has been invested in reporting to its standards and complying with it, and it has had its benefits. Contrast this with Europe, where more regulation is incoming. A lot of geopolitical risk has played into where regulation and compliance are growing. You want flexible, adaptable systems for this; adapting shouldn't take months or years.

Real-time monitoring and analytics are considered by most businesses to be an advantage: as well as serving clients and enabling the use of cash across the enterprise, they help manage positions by automating that process.

People are talking a lot about robotics and AI, and while those terms can mean different things to different people, the key word is automation.

You’ve got a lot of people pulling levers, pushing buttons and spending an inordinate amount of time reconciling databases or spreadsheets when there are already technologies that can take over that work. Automation will allow corporates to better focus on the task at hand, and start to use their heads a bit more when it comes to thinking of the future and what they should be looking at.

OpenLink has made some great strides in predictive AI, which again starts with data, to the point where we can ask 'how will it impact me if markets move, or currency moves against me after certain macro events across the world?' We should now be able to predict a likely outcome, proactively assess it from a risk management perspective and hedge in advance of the event. Brexit is an example: clearly a lot of companies didn't anticipate the Leave vote and weren't hedging their currency exposure.

Are current technologies and regulations, like Basel III, enough to ensure that we avoid a repeat of the financial crisis?

If you look at the objective of the Basel framework – promoting safer and more resilient financial systems through its funding and solvency requirements – I think it has done enough, particularly for banks.

Basel III has driven innovation in technology and risk reporting. On the one hand, you're trying to avoid a 'too big to fail' scenario and make sure that companies have enough capital on hand. On the other hand, from a risk management perspective, banks have built analytics around that scenario, and when you analyse across asset classes, interesting things start to happen.

If you have all your asset classes and you're running Monte Carlo risk reporting, the analysis can capture netting effects and correlations between them. By combining all those asset classes into one analysis, you can effectively start to reduce your measured liquidity risk, because the model looks for the netting. If you ran a Monte Carlo analysis on FX alone, without accounting for other asset classes, you wouldn't capture the full reduction in risk – and reducing risk pays you back in working capital.
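The netting effect described above can be sketched with a toy Monte Carlo simulation. The example below is a hedged illustration with invented volatilities and correlations, not a production risk model: it simulates joint daily P&L for three asset classes under a multivariate normal assumption and compares the portfolio's 95% VaR against the sum of standalone VaRs.

```python
import numpy as np

rng = np.random.default_rng(42)
n_sims = 100_000

# Hypothetical daily volatilities ($m) for FX, rates and commodities books.
vols = np.array([1.0, 0.8, 1.2])

# Invented correlations: the FX book is partly offset by commodities.
corr = np.array([
    [ 1.0, 0.3, -0.4],
    [ 0.3, 1.0,  0.1],
    [-0.4, 0.1,  1.0],
])
cov = np.outer(vols, vols) * corr

# Simulate joint daily P&L across the three asset classes.
pnl = rng.multivariate_normal(mean=np.zeros(3), cov=cov, size=n_sims)

# 95% VaR = loss at the 5th percentile of the P&L distribution.
standalone_var = -np.percentile(pnl, 5, axis=0)        # per asset class
portfolio_var = -np.percentile(pnl.sum(axis=1), 5)     # netted portfolio

print(f"Sum of standalone VaRs: {standalone_var.sum():.2f}m")
print(f"Portfolio VaR (netted): {portfolio_var:.2f}m")
```

Because the asset classes are imperfectly (and partly negatively) correlated, the netted portfolio VaR comes out well below the sum of the standalone figures – that gap is the working capital the combined analysis releases.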

If you’re reporting in the right way, the risk department will benefit.

This article was originally published on and can be viewed in full