Data is plentiful. We see evidence of this all the time in one form or another, whether it’s the 13 billion internet-connected devices generating data by the second, or the wealth of data being created and captured within our own organisations (in 2021 alone, ANZ delivered over one billion data items to our customers just to support receivables reconciliation). Data has never been more readily available, nor has the promise of helping us better understand the world around us and make data-driven decisions been more achievable.
The need for such capability has never been more pressing either. The pandemic, natural disasters and geopolitical unrest have created unprecedented levels of uncertainty, rendering traditional ‘gut feel’ decisions unreliable.
With the means and the end being so clearly aligned, why then are so many treasurers still ‘envisioning’ using data rather than actually using it? Why haven’t we all progressed further?
At ANZ, our experience tells us that data analytics is easy to do in pockets, but difficult to industrialise. But there may also be an underappreciation (and consequent underutilisation) of what we have today.
What we have today, and the promise of tomorrow
Like many others, our data journey started by agreeing on what we were aiming to achieve through the use of data. Rather than selling raw transactional data — a low-value, low-margin endeavour — we focused on two things:
- using data internally to improve our operational efficiency and decision-making; and
- enriching and differentiating our products, services and customer experiences through insight reports and bespoke client engagements.
On the former, one of the more obvious and valuable uses of data has been in the area of cyberfraud detection. Combining both transactional and behavioural data, we’ve been able to support customers in identifying and managing fraud through proactive alerting of suspicious activity.
On the latter, we’ve had a great response from customers across a number of use cases.
- We’ve been able to augment a retailer’s sales data with our own anonymised consumer spending data to help them better understand how sales were trending relative to their industry, industries with related characteristics, and the economy as a whole.
- We’ve also used de-identified demographic data to assist retailers with the placement of new stores and optimise the product mix within existing stores based on their target demographic.
- And more recently, and perhaps more topically, we’ve deployed this same anonymised data set to help government agencies understand the economic impact of COVID restrictions and assess the effectiveness of initiatives in responding to community needs.
These examples illustrate the type of value that can be delivered by data today. And with the increasing adoption of advanced tools like machine learning, the ability to shift from retrospectively describing ‘what happened’ to forecasting ‘what might happen’ and recommending ‘what should be done’ is quickly becoming a reality.
What about cashflow forecasting?
Cashflow forecasting — the ability to predictively model inbound and outbound cashflow based on various business parameters and financial data sets — comes up in almost every conversation ANZ has with treasurers. Many ask, ‘What’s preventing data analytics from addressing this problem?’
Our view is that outbound cashflow is typically less of a concern for treasurers, as most have good oversight of their payment obligations.
The complexity lies in inbound cashflows. Specifically, the ability to aggregate data residing across multiple external parties and systems in a timely manner (e.g. multiple banks, debtors and consumers) is both a technical integration challenge (legacy batch systems) and a contractual one (bilateral agreements).
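As a simplified illustration of the aggregation problem, the sketch below merges receivables feeds from multiple external parties into daily totals and produces a naive mean-based forecast. The feeds, figures and forecasting rule are invented for illustration; they are not ANZ data or methodology, and a real forecast would draw on far richer signals than a simple average.

```python
from collections import defaultdict
from datetime import date

# Hypothetical receivables feeds from two external parties; in practice these
# would arrive via bank files, debtor systems or consumer payment platforms.
bank_feed = [(date(2022, 3, 1), 120_000.0), (date(2022, 3, 2), 95_000.0)]
debtor_feed = [(date(2022, 3, 1), 40_000.0), (date(2022, 3, 3), 60_000.0)]

def aggregate_inflows(*feeds):
    """Merge (date, amount) records from multiple sources into daily totals."""
    totals = defaultdict(float)
    for feed in feeds:
        for day, amount in feed:
            totals[day] += amount
    return dict(sorted(totals.items()))

def naive_forecast(daily_totals, horizon_days=3):
    """Forecast each future day as the mean of observed daily inflows."""
    mean_inflow = sum(daily_totals.values()) / len(daily_totals)
    return [mean_inflow] * horizon_days

inflows = aggregate_inflows(bank_feed, debtor_feed)
print(inflows)                   # daily totals keyed by date
print(naive_forecast(inflows))   # flat mean-based projection
```

Even this toy version shows where the difficulty sits: the forecasting step is trivial, while getting every feed into a common, timely shape is the real work.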
APIs (Application Programming Interfaces) are a modern way of integrating systems, and one that enables more standardised delivery of enhanced data in near real-time. APIs could help unlock the data held across your various partners, but universal adoption is some time away — hindered by legacy systems that can’t handle APIs without significant investment. ANZ encourages its customers to ask their partners about their API strategy, not just their data strategy.
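To make the API point concrete, here is a minimal sketch of consuming a transactions payload. The JSON shape, field names and account identifier are assumptions invented for illustration; they do not reflect ANZ’s API or any particular bank’s actual schema.

```python
import json

# Hypothetical payload in the shape a bank transactions API might return;
# the schema here is illustrative only, not a real bank's API contract.
SAMPLE_RESPONSE = """
{
  "account": "AU-001",
  "transactions": [
    {"posted": "2022-03-01T09:15:00Z", "amount": 1250.00, "direction": "credit"},
    {"posted": "2022-03-01T11:42:00Z", "amount": -300.00, "direction": "debit"}
  ]
}
"""

def net_inflow(payload: str) -> float:
    """Sum signed transaction amounts to give net cash movement for the account."""
    data = json.loads(payload)
    return sum(tx["amount"] for tx in data["transactions"])

print(net_inflow(SAMPLE_RESPONSE))  # 950.0
```

The contrast with legacy batch files is the point: an API delivers structured, machine-readable data on demand, so a position like this can be refreshed in near real-time rather than once per overnight run.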
Scaling and industrialising — lessons learnt
While ANZ is proud of what we have today (and how we’re positioned to deliver on the promise of tomorrow), it’s clear to us that the true value has come from scaling pockets of activity into a repeatable and industrialised capability — a challenging journey to say the least, but a valuable one with a number of lessons for those looking to follow a similar path.
Lesson 1: Building a data lake, not a data swamp
Like many organisations, ANZ creates, receives, processes and manages large amounts of data every day. Data lakes provide a central repository to house this data for use across an organisation.
This is particularly useful when addressing a common hurdle faced by many treasury functions — how to draw insights from fragmented data dispersed across several systems and departmental silos. Even the most advanced treasury management systems deliver limited benefits without sufficient integration with other departmental systems. Data lakes can help with the aggregation and integration effort, and materially lift the insight available to treasury.
But despite its value, a data lake can quickly devolve into a ‘data swamp’ if left unchecked. Unclear data origins, impacts of change, duplication and poor data quality all present challenges in meeting regulatory compliance obligations around data assurance, quality and standards. For treasurers, this inevitably leads to questioning the lineage of the data, the drivers behind specific forecasts and deviations, and ultimately the validity of the insights.
Enter metadata management, the practice of properly defining, classifying and relating data. Think of metadata as ‘information about data’ — it answers questions like ‘Where did this data come from? Who created it? How did it change? Is it accurate?’.
Investing in an enterprise metadata management capability will encourage the use and reuse of data assets as a single source of truth, and create a more efficient data environment by eliminating duplicate data sources and rework. It ensures that data can be deployed with certainty, at speed and in a cost-effective manner while also managing risks.
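The questions metadata answers can be captured in a simple record. The sketch below is a toy illustration of the idea, with invented dataset and system names; enterprise metadata tools (data catalogues and lineage platforms) capture far more, but the core fields map directly to the questions above.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DatasetMetadata:
    """Illustrative 'information about data' record for one dataset."""
    name: str
    source_system: str                            # where did this data come from?
    owner: str                                    # who created it?
    created_at: datetime
    lineage: list = field(default_factory=list)   # how did it change?
    quality_checked: bool = False                 # is it accurate?

    def record_transformation(self, step: str):
        """Append a lineage entry so downstream users can trace changes."""
        self.lineage.append((datetime.now(timezone.utc), step))

# Hypothetical dataset and system names, for illustration only.
receivables = DatasetMetadata(
    name="daily_receivables",
    source_system="bank_statement_feed",
    owner="treasury_analytics",
    created_at=datetime(2022, 3, 1, tzinfo=timezone.utc),
)
receivables.record_transformation("deduplicated against debtor ledger")
print(len(receivables.lineage))  # 1
```

When every dataset carries a record like this, a treasurer querying the lineage of a forecast is a lookup rather than an investigation.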
Lesson 2: Winning hearts and minds — building a ‘data-first’ culture
Data enables a new way of problem solving, but embedding this throughout your organisation is as much a cultural shift as it is a technical one.
For whatever reason, many teams are unaccustomed to taking a ‘data-first’ approach to problem solving and asking in the first instance, ‘What data would we need to solve this problem, and how do we go about getting it?’ As a result, we’ve seen, time and again, valuable data insights built only to be underutilised.
Curiosity, awareness and applying a ‘wide-angled lens’ all play a role. As creatures of habit, many of us tend to default to the same internal data sources for answers. However, given the pace at which data is becoming more available and accessible, it’s an increasingly worthwhile exercise to regularly ask, ‘What data is available across our organisation, or from our banking and other partners? What value could be derived from combining this information? Once combined, who else could benefit from it?’ As the traditional managers of banking relationships, treasury and finance teams are the usual recipients of this analysis. But how could marketing, sales, product development and logistics teams also benefit from the data available from your banking partner?
Technical challenges aside, we believe the ability to leverage data to its full extent is a matter of culture.
However, we’ve also seen great examples of teams that have made this shift. Some of our Australia-based clients have told us that they review our Economic Pulse data every week as part of management meetings to better understand trends and drive decisions — something only made possible by both the technology (timely delivery of data) and culture (a ‘data-first’ mindset) working in harmony.
Lesson 3: Just because you can, doesn’t mean you should
Data scientists (at least our colleagues) love chaos. They love sifting through anarchy and exploring chaotic pieces of data to uncover hidden gems. Armed with the right datasets, tools, and understanding of your business, a good data scientist will amaze you with what they can discover.
In contrast, data governance is about production lines. It’s about getting data out on a predictable, timely and reliable basis. It’s about ensuring obligations around data privacy and security are fulfilled. And it’s about ensuring your organisational values are embedded in an ethics framework that puts the ‘human impact’ at the centre of decisions regarding the data we collect, the insights we create, and the parties those insights are shared with. At ANZ, our data ethics principles include fairness, transparency and anonymity — just to name a few.
The tension between data-hungry colleagues and governance frameworks is a healthy one, and managing the balance between the two should be regarded as a key competency and point of differentiation. Our view here — which may seem counterintuitive — is taking the time upfront to get your data governance right will actually allow you to go faster. It provides guardrails to accelerate work, and allows greater peace of mind and assurance that the right things are being considered and risk is being managed along the way.
Lesson 4: Data is a contact sport
While the end result of data analysis may take the form of a report, widget or interactive dashboard, the process of getting there is very much a contact sport.
Uncovering (sometimes hidden) pain points and identifying data-driven solutions is exponentially more productive when customers, data scientists and engineers are engaging on a frequent and regular basis. At ANZ, our most successful engagements have two things in common:
- collaborative sessions between our customers and ANZ’s teams to iteratively co-design data insights; and
- the creation of sandpits and rapid prototypes to facilitate discovery and exploration through a human-centred design process.
Aside from delivering more relevant insights more efficiently, we’ve found that engagements run this way often end up expanding beyond the original requestor (treasury, in many instances) to other departments such as procurement, sales and marketing — all of which generate data for treasury as well as rely on the data generated by treasury.
Multi-department engagements like this illustrate how pain points and solutions can often cut across intra-company boundaries, and can serve as a catalyst for addressing historical data silos.
Where are you on your data journey? What’s your data culture?
The ingredients for creating relevant and actionable insights have never been more readily available. But the ‘secret sauce’ is having a data culture that takes a ‘data-first’ approach to problem solving, and a data governance framework that delivers predictable, timely and reliable data in a manner that fulfils obligations, aligns to values and protects against misuse.
Where are you on your data journey? And what’s your data culture? We’d love to hear from you as your banking partner and explore what we could do as your data partner.
This publication is published by Australia and New Zealand Banking Group Limited ABN 11 005 357 522 (“ANZBGL”) in Australia. This publication is intended as thought-leadership material. It is not published with the intention of providing any direct or indirect recommendations relating to any financial product, asset class or trading strategy. The information in this publication is not intended to influence any person to make a decision in relation to a financial product or class of financial products. It is general in nature and does not take account of the circumstances of any individual or class of individuals. Nothing in this publication constitutes a recommendation, solicitation or offer by ANZBGL or its branches or subsidiaries (collectively “ANZ”) to you to acquire a product or service, or an offer by ANZ to provide you with other products or services. All information contained in this publication is based on information available at the time of publication. While this publication has been prepared in good faith, no representation, warranty, assurance or undertaking is or will be made, and no responsibility or liability is or will be accepted by ANZ in relation to the accuracy or completeness of this publication or the use of information contained in this publication. ANZ does not provide any financial, investment, legal or taxation advice in connection with this publication.
Machine learning refers to a set of statistical models used to extract generalisations from vast quantities of data.