NEXT-GENERATION NETWORK AND VENDOR MANAGEMENT: MITIGATING OPERATIONAL RISK IN RELATIONSHIP NETWORKS.

Written by

Simon Shepherd


A white paper in association with A-Team.

 

Introduction

Banks are spending enormous amounts of money to combat the risk associated with the complex networks of relationships they have built up with other financial institutions and service providers. A patchwork of technology solutions has been deployed to address these risks, but this approach is financially wasteful, does not fully resolve the underlying issues and may even create additional risk.

Moreover, it means that banks are constrained in their ability to respond to developing market circumstances, such as new regulations or market infrastructure changes, limiting their ability to exploit new business opportunities efficiently.

Just as retail banks have discovered, full digitalisation is the way enterprises will increasingly operate in the modern world. This paper will look at how that approach can be used to build a next-generation network management system that minimises the compliance burden and maximises competitive efficiency by removing the internal barriers that hinder banks from responding effectively to external change.

The greater willingness in recent years of financial institutions to adopt cloud computing architectures, particularly the on-demand software-as-a-service (SaaS) deployment model, means that creating such a system is now both simpler and considerably less costly.

MYRIAD Group Technologies has been addressing the existing and future opportunities discussed in this paper with an integrated, single-platform application that allows institutions to migrate their existing disparate tools to one coherent solution suite, based on a single, globally accessible database.

Current Situation

Over time, financial institutions have built complex relationships with multiple partners in support of their own core activities, including custodians, correspondent banks and clearing houses, as well as non-financial service providers such as IT, legal and HR services. These partners provide services in support of primary functions such as post-trade custody, clearing and cash management.

Often these relationships are managed on a bilateral basis and can date back years or even decades – and in many cases there is a two-way transfer of services. The result is a huge set of interdependencies that are continually changing, introducing significant potential for operational risk. Such is the scale and complexity that, over the past decade or more, network management has grown to become a separate business function, and the cost of overseeing and maintaining these relationship networks is extremely high – for some institutions it is their largest cost after staff and facilities, reaching into multiple millions of dollars. There is also a huge operational burden, resulting in significant operational risk as well as financial and compliance risk.

Among the many complications and constraints this imposes on banks, there is what could be characterised as a self-imposed hamstringing of their ability to manage their regulatory obligations. Banks – with some notable exceptions – have tended to regard regulation as an imposition, taking a short-term tactical response to what is really a long-term strategic issue that long predates the financial crisis of 2007/8.

Financial institutions’ response to new regulatory measures has been characterised by narrowly focused, rapidly deployed solutions, often based on their existing fragmented organisations. Rather than a top-down, holistic approach to regulatory compliance, many banks find themselves struggling to maintain a plethora of systems and solutions that do not communicate with each other, requiring manual intervention and introducing operational risk.

These inefficiencies can also have implications for a firm’s compliance performance, risking financial penalties, regulatory censure and reputational damage. The volume of measures introduced since the 2007/8 financial crisis – from major regulation such as the EU’s MiFID II and GDPR, the US Dodd-Frank Act, and Basel measures including BCBS 239 and FRTB, through to more specific and localised rules such as SFTR and individual market reporting rules from the CFTC and MAS – has challenged banks’ ability to respond. Moreover, the intensity of regulatory scrutiny has shone a light on banks’ operational shortcomings, resulting in a constant stream of penalties and fines.

The operational burden on banks is certainly challenging, with several regulations specifically requiring certain levels of operational resilience, among them the guidelines of the US Office of the Comptroller of the Currency (OCC) and the proposed European Digital Operational Resilience Act (DORA). These are in addition to the many other regulations that affect both specific and general aspects of their business, not forgetting regional and domestic rules on data sovereignty, particularly in relation to cloud-based solutions.

There are technological solutions to these issues, but they have typically been implemented piecemeal – often seen as the responsibility of the compliance or legal functions, or simply as part of an IT project. A holistic view of the institution and its partnership networks is required, which means front-to-back-office issues must be considered: are there hidden factors affecting the cost per trade that would have best-execution implications, for instance?

Taking that holistic view, though a difficult task, can turn regulatory compliance into a routine undertaking (as it is in most other industries, including the IT sector) and allow banks to focus on their primary goals.

What is needed?

Clearly, financial institutions faced with this set of challenges, and under pressure from both regulators and their own business stakeholders to streamline the operational aspects of their activities, need to act.

But what are the key measures needed to address these issues? A good place to start is the guidelines set out by the US Office of the Comptroller of the Currency. Published in 2013 and expanded since, these provide a solid starting point for framing the questions on which a successful network management strategy can be built. The OCC guidelines set out questions that need to be answered by all parts of the enterprise; those answers can help to show what needs to be done and who needs to do it.

In essence, the guidelines state – as do all other regulations in this area – that: “A bank’s use of third parties does not diminish the responsibility of its board of directors and senior management to ensure that the activity is performed in a safe and sound manner and in compliance with applicable laws.”

In other words, you cannot outsource your compliance responsibilities: this has been the mantra of legal and compliance departments for so long now that no one is going to question it.

But the extensive use of third-party services creates its own management challenge, compounded by the inability of internal systems to communicate effectively. Compliance and operations executives are struggling to keep on top of how these service providers are performing and, more importantly, whether they remain in compliance with their regulatory obligations.

For banks seeking to put in place controls around their increasingly complex businesses and third-party relationships, the question really comes down to how firms can develop, implement and maintain effective compliance frameworks for this modern business environment: What are the obstacles to the creation of this next-generation network management framework?

As it turns out, while the wider problem is far from simple, many of the obstacles to solving it have already been removed by developments in IT and data science, and market leaders have for some time been migrating their disparate tools to a single coherent solution suite, based on one globally accessible database. As is so often the case with technology developments, those pioneers paid a premium to gain a competitive edge; the firms that held back are now realising they need to make the same transformation journey, benefiting from the lessons learnt and the lower capital expenditure now required.

Barriers to progress

Fortunately, many of the key issues lie within firms’ own walls and can be dealt with internally, if not easily.

The first group of issues in this category has much to do with the history of technology in financial institutions and the way it has been deployed over decades. This has led to the growth of internal fiefdoms and the creation of silos where data is replicated in several areas of the institution, often inconsistently, and where it is frequently not readily available.

Similar fragmentation exists between the compliance, legal and operations functions, leading to issues such as blurred lines of responsibility, and outdated, sometimes contradictory policies which build in redundancy and allow inefficiency and opacity to persist.

This historical approach to development has also led to fragmented technology implementations, with some vendor solutions here, internal developments there, and hybrid approaches built to straddle the two. This sort of approach, if not destined to fail, is inevitably compromised in terms of quality, time-to-market and cost.

A further complication is the simultaneous acceleration of technological change, driven by the combination of consumer-led device development – smartphones and tablets – and the arrival of ubiquitous connectivity, creating an always-on, real-time environment in which modern financial services will have to operate.

Best practice approaches

Addressing legacy system replacement has been a priority for every CIO and CTO for at least the past three decades. As modern hardware and software have been deployed, many approaches have been taken to improve data quality and consistency: siloed data has been accessed through a data abstraction layer in the architecture, and data has been extracted, transformed and loaded into data warehouses. Yet the underlying data structures have often remained, because their format is specific to an application written in an older programming language.
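
To make the point concrete, the short Python sketch below illustrates the kind of extract-transform-load pass described above: third-party records are pulled out of two hypothetical legacy silos and mapped onto one common model. The silo sources, field names and the PartnerRecord structure are illustrative assumptions only, not a description of any particular bank's systems.

    # Illustrative sketch only: a minimal extract-transform-load (ETL) pass that
    # consolidates third-party records from two hypothetical legacy silos.
    # All source field names and mapping rules are assumptions for illustration.
    from dataclasses import dataclass

    @dataclass
    class PartnerRecord:
        partner_id: str
        name: str
        service: str   # e.g. "custody", "payments"
        region: str

    def extract_from_custody_silo() -> list[dict]:
        # In practice this would query the legacy application's database or files.
        return [{"CUST_ID": "C001", "CUST_NAME": "Alpha Custodian", "REGION": "EMEA"}]

    def extract_from_payments_silo() -> list[dict]:
        return [{"bank_code": "P900", "bank_name": "Beta Correspondent", "ctry": "SG"}]

    def transform(raw: dict, service: str) -> PartnerRecord:
        # Map each silo's local field names onto the common model.
        if service == "custody":
            return PartnerRecord(raw["CUST_ID"], raw["CUST_NAME"], service, raw["REGION"])
        return PartnerRecord(raw["bank_code"], raw["bank_name"], service, raw["ctry"])

    def load(records: list[PartnerRecord]) -> None:
        # Stand-in for an insert into the warehouse or single repository.
        for r in records:
            print(f"loaded {r.partner_id}: {r.name} ({r.service}, {r.region})")

    if __name__ == "__main__":
        records = [transform(r, "custody") for r in extract_from_custody_silo()]
        records += [transform(r, "payments") for r in extract_from_payments_silo()]
        load(records)

The point of the sketch is that the common model can be introduced without first rewriting the legacy applications, which is precisely why the old data structures tend to persist underneath it.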

In some ways, cloud computing offers a major opportunity for firms to square this circle. But it brings its own complications – not least of which is that in most cases it will need the creation of a new set of supplier relationships that will require management.

Security is generally the first concern that financial institutions raise in connection with the use of cloud platforms. Notwithstanding the above-mentioned mantra about not being able to outsource compliance, experience shows that the large cloud platform providers can deliver highly secure environments, so perhaps this question should be addressed from a risk perspective rather than a compliance one.

More problematic is managing the ongoing investment in the evolution and updating of systems. A common approach has been simply to virtualise existing production systems and run them in a cloud environment, but this reproduces the issues noted above in relation to hybrid implementations and risks carrying over existing security flaws.

Across the system, the architecture will have to integrate performance management of internal systems and processes with monitoring of the effectiveness of counterparties, which MYRIAD does through online issue tracking and formalised RFI and SLA functions. This way, service levels can be measured and targeted through automated comparison of issues against SLA requirements.
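
As an illustration of how such an automated comparison might work, the Python sketch below checks a handful of logged issues against hypothetical SLA thresholds for resolution time and open-issue counts. The metrics, thresholds and issue fields are assumptions made for the example; they are not a description of MYRIAD's actual SLA function or data model.

    # Illustrative sketch only: automated comparison of logged issues against SLA targets.
    # The thresholds and issue fields below are assumptions for illustration.
    from datetime import timedelta

    SLA_MAX_RESOLUTION = timedelta(days=2)   # hypothetical: resolve within 2 days
    SLA_MAX_OPEN_ISSUES = 3                  # hypothetical: no more than 3 open issues

    issues = [
        {"provider": "Alpha Custodian", "open": False, "resolution_time": timedelta(hours=30)},
        {"provider": "Alpha Custodian", "open": True,  "resolution_time": None},
        {"provider": "Beta Correspondent", "open": False, "resolution_time": timedelta(days=4)},
    ]

    def sla_breaches(issues: list[dict]) -> list[str]:
        breaches = []
        open_counts: dict[str, int] = {}
        for issue in issues:
            provider = issue["provider"]
            if issue["open"]:
                open_counts[provider] = open_counts.get(provider, 0) + 1
            elif issue["resolution_time"] > SLA_MAX_RESOLUTION:
                breaches.append(f"{provider}: resolution exceeded SLA ({issue['resolution_time']})")
        for provider, count in open_counts.items():
            if count > SLA_MAX_OPEN_ISSUES:
                breaches.append(f"{provider}: too many open issues ({count})")
        return breaches

    print(sla_breaches(issues))  # -> ['Beta Correspondent: resolution exceeded SLA (4 days, 0:00:00)']

In practice, a comparison of this kind would run continuously against the live issue log, with breaches feeding directly into service reviews and RFI follow-up.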

In the end, as the retail banking industry has discovered and the payments industry is discovering, the challenges of creating a coherent, effective compliance framework in an always-on, real-time environment have much in common with the challenges of creating and operating a profitable and efficient business.

The key lies in having a data management policy that can deliver the correct data to the correct users, which means having systems and controls in place to monitor and guarantee data integrity, consistency and completeness.

This is just the approach that the MYRIAD platform enables, by giving back- and middle-office staff access to a single repository of third-party agent, vendor and other supplier details. Users are linked to the front-office, operations and finance functions and are provided with tools for performance measurement, issue tracking and control over related fees and charges. These include network and relationship management tools covering account-specific reporting, document tracking, account details and support for cash, securities or both.
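
By way of illustration, the Python sketch below applies the kind of simple completeness and consistency checks that a data management policy might mandate to a toy version of such a single repository. The record fields and rules are assumptions for the example; they are not MYRIAD's schema.

    # Illustrative sketch only: completeness and consistency checks over a single
    # repository of third-party records. Fields and rules are assumed for illustration.
    REQUIRED_FIELDS = {"partner_id", "name", "service", "region", "sla_id"}

    repository = [
        {"partner_id": "C001", "name": "Alpha Custodian", "service": "custody",
         "region": "EMEA", "sla_id": "SLA-17"},
        {"partner_id": "P900", "name": "Beta Correspondent", "service": "payments",
         "region": "APAC"},                      # missing sla_id -> completeness failure
        {"partner_id": "C001", "name": "Alpha Custodian Ltd", "service": "custody",
         "region": "EMEA", "sla_id": "SLA-17"},  # duplicate id -> consistency failure
    ]

    def check_repository(records: list[dict]) -> list[str]:
        findings, seen_ids = [], set()
        for rec in records:
            missing = REQUIRED_FIELDS - set(rec)
            if missing:
                findings.append(f"{rec.get('partner_id', '?')}: missing fields {sorted(missing)}")
            pid = rec.get("partner_id")
            if pid in seen_ids:
                findings.append(f"{pid}: duplicate record")
            seen_ids.add(pid)
        return findings

    for finding in check_repository(repository):
        print(finding)

Checks of this sort are only worthwhile when they run against one authoritative repository; applied silo by silo, they simply confirm that each silo is internally consistent while the institution as a whole is not.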

What the future looks like

Adopting a coherent and comprehensive data management policy is also a central plank of the full digitalisation of the bank, which will be essential to future competitiveness across a range of markets and sectors. Digitisation almost always precedes digitalisation, and banks that fail to digitise and organise their core, foundational data coherently are bound to compromise their efforts to digitalise successfully.

Numerous examples of banks viewing regulatory change as an opportunity rather than a challenge are available: the challenger banks in the retail/consumer sector are the tip of an iceberg largely made up of the financially much larger payments and corporate banking sectors, with investment banks now recognising the new opportunities available to them.

As examples from those sectors show, there are opportunities to be seized once the underlying challenges of data management are properly addressed. While early adopters of such strategies and their technology providers faced considerable costs and cultural challenges, the industry has reached a point where the competitive and regulatory imperative is no longer hamstrung by financial cost or organisational complexity. In fact, as others have found across all industry sectors, removing complexity and reducing costs go hand-in-hand, and both are best achieved by implementing technology dedicated to the task.