In the June 2020 edition of The Network Forum Journal, Simon Shepherd examines why ordered, securely stored and accessible data is so important.
During and after the Great Financial Crisis (GFC), Contractors and Consultancies of all shapes and sizes were excised from large Financial Institutions as one quick and effective way of reducing cost. In the void that was left, and in response to the need for better oversight, a whole new Industry was born: GRC. Governance, Risk and Compliance has evolved into a major earner for Advisers and Software Companies alike, as they have set themselves up to address a yawning gap in the armoury of Banks in particular.
What relevance does this have to the Network Management function? Network Managers have at their fingertips – or should have, depending on the efficacy of their systems – a large volume of data that informs all aspects of GRC. Heavyweight and well-informed Network Management teams sit at Level 1 in the Risk hierarchy, which means that these teams run relationships and execute the risk-taking function at the outermost extremities of the Banking Industry. Network Managers take primary risk on the counterparties with which they choose to warehouse cash and securities. The data derived from this function feeds directly into Level 2 and Level 3 Risk Management and underpins the holistic views that senior, dedicated Risk Managers so desperately crave.
Indeed, this explains why Banks overlooked the Middle Office function when presenting their first draft Recovery and Resolution Plans in the fallout from the GFC; those plans were all rejected for insufficient attention to precisely this detail. Lehman Brothers in Administration is still unwinding positions and situations which would categorically have benefitted from better Middle Office infrastructure, regardless of the poor decisions being taken in the Front Office.
Fundamentally, it is the data that matters: how it is stored, how securely it is stored, how it is organised, how it is accessed and how it is processed. A process, no matter how well engineered, will always be compromised by bad data. A better process will make more of good data, but even that improved process will still be compromised by bad data.
Getting static data into an ordered, coherent framework which is secure, structured, access-controlled and has full audit capability starts to address many, if not all, of the most basic data challenges. If that data has not been scrubbed, reviewed, and possibly cleaned again, then any and all future processes run across it will be compromised.
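As a minimal sketch of what scrubbing static data with an audit trail might look like in practice, consider the fragment below. The record fields (name, BIC, country), the `scrub` helper and its rejection rules are illustrative assumptions, not a description of any particular institution's system; only the BIC shape check follows the published ISO 9362 format.

```python
import re
from dataclasses import dataclass
from typing import List, Optional

# ISO 9362 BIC: 4-letter bank code, 2-letter country, 2 alphanumeric
# location characters, and an optional 3-character branch code.
BIC_PATTERN = re.compile(r"^[A-Z]{4}[A-Z]{2}[A-Z0-9]{2}([A-Z0-9]{3})?$")

@dataclass(frozen=True)
class CounterpartyRecord:
    """Hypothetical static-data record for a cash/securities counterparty."""
    name: str
    bic: str
    country: str

def scrub(record: CounterpartyRecord,
          audit_log: List[str]) -> Optional[CounterpartyRecord]:
    """Normalise a record, validate it, and log every change or rejection.

    Returns the cleaned record, or None if it fails basic checks.
    """
    cleaned = CounterpartyRecord(
        name=record.name.strip(),
        bic=record.bic.strip().upper(),
        country=record.country.strip().upper(),
    )
    if cleaned != record:
        audit_log.append(f"normalised: {record} -> {cleaned}")
    if not cleaned.name:
        audit_log.append(f"rejected (empty name): {record}")
        return None
    if not BIC_PATTERN.match(cleaned.bic):
        audit_log.append(f"rejected (malformed BIC): {record}")
        return None
    return cleaned
```

Run over an incoming file, this kind of routine yields a clean, access-controlled dataset and a complete record of what was changed or rejected, which is the audit capability referred to above.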
Last year a cartoon was published in which Greta Thunberg can be seen striking a line through the word “economy” and replacing it with “planet” in the quote “it’s the economy, stupid”. A Network Manager would be well advised to replace “economy” with “data” instead.
Simon Shepherd, CEO