
Can Data Help Stabilise Our Economy?

Given these astronomical advancements in data, can business leaders afford not to prioritise the data scalability and data pace initiatives required - not only to secure the financial system but also - to ensure continued market competitiveness and customer satisfaction?
Photo Credits: Toronto Star

Soon after the global financial crash triggered by the collapse of Lehman Brothers, I was fortunate to have the opportunity to be part of the worldwide response by international regulators - using data as an essential tool for mitigating future failures of our global financial system. But, 10 years on from the disaster The Economist refers to as "Minsky's moment", NBC News has asked the question, "could it happen again?"

What is the World turning into?

We already live in a world where our phone apps (like Uber) talk to taxis, telling them exactly where on the planet we are and when we want them; a world where our smart TVs tell our providers (like Channel 4) which programmes we want to watch and when we would like to watch them; a world where our fridges and kettles, passing messages through our smart meters, can tell our energy providers (like OVO Energy) just how much power they are consuming - and how much cash they are taking out of our pockets; and a world where direct conversations between orbiting satellites and farm tractors could be combined with Defra's open data to help decide when and where to plant our seeds!

Yes - we are indeed increasingly living in a "data" driven world.

None of this, however, answers the pertinent questions: can data help stabilise our economy, and could a crash of the Lehman Brothers ilk "happen again"?

But what does "data" have to do with global financial stability?

In April 2009, the Group of Twenty (G-20) Finance Ministers and Central Bank Governors asked the International Monetary Fund (IMF) and the Financial Stability Board (FSB) to jointly "explore information gaps and provide appropriate proposals for strengthening data collection" - given that such gaps are especially "highlighted when a lack of timely, accurate information" not only hinders "effective surveillance" of global financial systems but also hampers the development of "effective responses" to global financial crises. One of my academic papers quotes a top global banking executive who appears to concur with this viewpoint, asserting that:

while "not the silver bullet", a radically improved data regime is not only "a key step" but "an important step" - for mitigating the risk of global financial systemic failure.

The paper, "The evolving role of data in the mitigation of financial systemic failure" (a title that effectively speaks for itself), goes on to trace the role of data from antecedent financial systemic landscapes to the currently predominant worldview - in which data now underpins pre-emptive resolvability assessment and the facilitation of orderly resolution.

This does indeed seem to affirm, unambiguously, that data is - at a minimum - a critical ingredient for maintaining a stable global economy.

However, a September 2018 paper by the Financial Stability Board - assessing the progress of the G20 Data Gaps Initiative - observed that such data gaps are "an inevitable consequence of the ongoing development of markets and institutions." The paper goes on to suggest that these gaps become most critical during global "financial crises", which can be exacerbated if a lack of data speed or information integrity hinders the ability of "policy makers and market participants" to effectively mitigate, contain or resolve critical financial systemic issues.

It is, as such, clear that:

the stability of the global financial system will increasingly be determined by how quickly data can be made available and how much that data can intrinsically be trusted.

In what could be considered a response to a global shift in perception (and the ensuing dilemma of institutions "too big to rescue" yet "too interconnected to fail"), the "Dodd-Frank" reform Act was signed into law by the President of the United States on 21 July 2010. Its goals were "to promote the financial stability" of the system by "improving accountability and transparency". It is no surprise that some consider this seismic shift towards "transparency" the foundation of today's Open Banking wave - which is set not only to drive the rapid expansion of the FinTech sector but also to lead to radical improvements in our customer experiences.

Additionally, there can be little doubt about the rapidly increasing flow of financial-sector investment in AI. Standout examples include the over $200 million spent within the sector in Canada in 2017 and TD Bank's highly strategic acquisition of Layer 6 AI - for an undisclosed fee. But if the bold comments in Inga Beale's influential article on "how technology could change the rules of the game" are to be believed, then today's investment levels are almost certainly only the tip of the iceberg.

Given this astronomical rise in the importance of data - to business and to the broader economy - and given the significant technological advancements across Big Data, Data Science, Machine Learning and AI, the most pertinent question is likely to be:

can business leaders afford not to prioritise the data scalability and data pace initiatives required - not only to secure the financial system but also - to ensure continued market competitiveness and customer satisfaction?

Another, broader question concerns what the future holds for you and me: what skills, competencies and capabilities do we urgently need to develop, and can we afford the risk of remaining calcified by "the way things have always been done", only to find ourselves displaced by future advances in paradigms such as Artificial Intelligence and Robotics?

These are matters to consider seriously - with plenty of scope for contemplating what additional questions we need to pose, and which critical questions have conspicuously been left unasked.

So, can "it" happen again?

In spite of all the radical advancements in cloud, data and technology, it remains impossible to conclude that another "Lehman-like disaster" could never happen again. With the aftermath of Covid, Brexit and the US-China trade wars, and with fears of looming crises as Russia's invasion of Ukraine shows no credible sign of abatement, this may not be as far-fetched a concern as it once seemed.

However, since we now have access to levels of data speed, availability and integrity never previously achievable, it can certainly be affirmed - given current technologies and trends - that data can indeed help stabilise our economy. This should mean that, with appropriate focus and investment, identification, mitigation and resolution are now possible at capability levels never previously available to humankind.

Some pertinent questions, however, remain unanswered - such as, "has sufficient focus been globally achieved?" and "are the current investment levels adequate for ensuring that an appropriate resilience benchmark is met?"

Thank you

Edosa would love to support your transformation from organisational visions into quantifiable data value.

Let's Talk