Legacy technology: not a true ‘legacy’ at all.

Written by

Simon Shepherd

For The Network Forum Journal #12, Simon Shepherd discusses the need for integration of proprietary technology with off-the-shelf specialist platforms.

Traditionally, you leave your legacy as a mark of all the good things you have done in your life. In financial terms, it might be the wealth you pass on to your descendants or successors. Reputationally, your legacy may well be a reflection of all the good causes you have supported or your wider philanthropic efforts during your lifetime.

But in banking terms, legacy technology is not a true legacy at all. It is often said that this month’s in-house development is next month’s legacy technology, and that is painfully accurate. The timescales may stretch beyond mere months, but the outcome is the same: the system is out of date, no longer supported and no longer fit for purpose; management has moved on or lost interest; the Development team has been pulled apart; and the budget has quite possibly been pulled, too.

At the moment, one of the buzzwords in the Financial Services Industry is ‘interoperability’. The lack of it, both internally and externally, is a consequence of ill-directed and poorly thought-out proprietary development. Such projects tend to compound the problem, layering in yet more in-house development, because past in-house projects can rarely integrate with anything new.

How best to leverage new technology in the context of creaking IT Estates is a massive challenge for Banks in particular – yet many Banks persist in building proprietary technology that stands little chance of inter-operating with anything outside that Bank’s domain. Proprietary systems are, almost by definition, closed to the outside world – and ‘outside’ is where the innovative development is taking place.

Proprietary technology is deemed to be a differentiator, but in practice it locks in inefficiency and creates massive, costly inertia. Banks worry about the costs of inefficiency, yet in-house development programmes build in the very inefficiency – the lack of interoperability – that they are trying to avoid.

There are three ways this can be tackled: the adoption of standards; the use of non-proprietary software and systems that promote interoperability; and a move away from in-house development towards investment in integration projects, leveraging industry-standard software that will be well supported and will evolve with Industry needs. Put simply: faster to market, more robust in terms of security and, of course, at vastly lower cost.

The recent upgrade of CHAPS, the start of the migration to ISO 20022, and the expectation that serious providers are ISO 27001 certified or SOC 2 accredited are all symptomatic of a steady shift towards a better, standards-driven operating environment.

The need to bring disparate systems together – interoperability – the better to leverage historic data and legacy systems, is a key driver in many new technology projects. These projects naturally focus on underlying functionality, but they also focus on fitness for purpose and ease of integration: in other words, “this is how we make it work, and this is how we leverage what we have got”.

Industry-standard software, by its nature, supports ever-wider uptake, takes care of non-differentiating activity, streamlines operations and delivers all-round ‘better’ outcomes, as defined by lower cost. In short: focused solutions that deliver value quickly and sustainably, locking in gains year after year.