
What do banks need to do to survive in the new risk era?


By Neil Vernon, Chief Technology Officer at Gresham UK

With high-profile internal security breaches, stringent regulation and regulatory fines all on the up, banks are under greater pressure than ever before to both guarantee and evidence the integrity of their data. But with existing banking IT infrastructure – patchworked systems, siloed processes and fragmented organisations – only serving to increase risk, how can banks get control and implement a framework that reflects the realities of the dangers they’re facing?

Many banks know they have holes in their systems that are only getting bigger as the number of datasets they deal with grows and evolves. Beyond the risks inherent in being unable to red-flag exceptions and aggregate data to identify potential external threats, they are leaving themselves open to internal abuse by unscrupulous individuals who have worked out how to exploit these holes for their own ends.

The only way to keep pace with the multi-faceted data integrity challenge is with a flexible, adaptable control framework. One that can onboard new regulation and implement it across multiple data feeds simultaneously, and one that isn’t constrained by complex data and can adapt to new controls without heavy reliance on IT.
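
To make that concrete, here is a minimal sketch of what such a framework might look like (in Python, with hypothetical names and simplified records throughout – an illustration of the idea, not any vendor's actual API). Controls are declared as data, so onboarding a new regulatory check means registering one rule, which then applies across every feed simultaneously with no per-feed code changes:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Control:
    name: str                      # e.g. a regulatory reference
    check: Callable[[dict], bool]  # returns True if the record passes

class ControlFramework:
    def __init__(self) -> None:
        self.controls: list[Control] = []

    def register(self, control: Control) -> None:
        """Onboard a new control; it immediately covers all feeds."""
        self.controls.append(control)

    def run(self, feed_name: str, records: list[dict]) -> list[str]:
        """Apply every registered control to every record in a feed,
        returning a list of exception descriptions."""
        exceptions = []
        for record in records:
            for control in self.controls:
                if not control.check(record):
                    exceptions.append(
                        f"{feed_name}: {control.name} failed for {record.get('id')}"
                    )
        return exceptions

# A new rule covers both feeds the moment it is registered.
framework = ControlFramework()
framework.register(Control("notional-must-be-positive",
                           lambda r: r.get("notional", 0) > 0))

print(framework.run("swaps_feed", [{"id": "T1", "notional": -5}]))
print(framework.run("fx_feed", [{"id": "T2", "notional": 100}]))
```

Because the controls are data rather than code, accommodating a new piece of legislation becomes a registration step, not an IT project – which is precisely the flexibility described above.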

Luckily for banks, implementing a framework like this is no longer a case of wholesale system replacement. Agile plug-in data integrity platforms are specifically designed to adapt to change. Controls are stringent, exceptions are highlighted in real-time, evolving requirements of existing regulations are supported and there is flexibility to accommodate new legislation quickly and efficiently as it is introduced.
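
The real-time exception handling can be pictured with another simplified sketch (again with hypothetical names, not any vendor's API): records are checked as they arrive on the stream, so a break is flagged the moment it occurs rather than in an end-of-day batch run.

```python
from typing import Iterable, Iterator

def flag_exceptions(stream: Iterable[dict],
                    booked: dict[str, float],
                    tolerance: float = 0.01) -> Iterator[str]:
    """Compare each incoming record against the booked position and
    yield an exception immediately on any break outside tolerance."""
    for record in stream:
        trade_id = record["id"]
        reported = record["amount"]
        expected = booked.get(trade_id)
        if expected is None:
            yield f"UNMATCHED: {trade_id} has no booked counterpart"
        elif abs(reported - expected) > tolerance:
            yield f"BREAK: {trade_id} reported {reported}, booked {expected}"

# Exceptions surface as soon as the offending record is consumed.
incoming = [{"id": "T1", "amount": 100.0},
            {"id": "T2", "amount": 250.5},
            {"id": "T9", "amount": 10.0}]
booked_positions = {"T1": 100.0, "T2": 250.0}

for alert in flag_exceptions(incoming, booked_positions):
    print(alert)
```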

Designed to consume complex streams of near real-time data, these platforms can handle data in multiple formats and of any width and structure. Instead of the data having to work with the technology, the technology accommodates the data, meaning it can be fully implemented in a matter of weeks, without the additional cost and time required to relabel and change processes.
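
As a rough illustration of the "technology accommodates the data" idea (a hypothetical helper, assuming just CSV and JSON feeds for brevity), the shape of each feed can be inferred on read rather than fixed up front, so feeds of any width land in the same downstream structure:

```python
import csv
import io
import json

def normalise(payload: str, fmt: str) -> list[dict]:
    """Parse CSV or JSON payloads of arbitrary width into uniform dicts."""
    if fmt == "json":
        data = json.loads(payload)
        return data if isinstance(data, list) else [data]
    if fmt == "csv":
        # DictReader infers the columns from the header row, so feeds
        # of differing widths need no per-feed configuration.
        return list(csv.DictReader(io.StringIO(payload)))
    raise ValueError(f"unsupported format: {fmt}")

# Two feeds with different shapes arrive in the same downstream structure.
print(normalise('[{"id": "T1", "notional": 100}]', "json"))
print(normalise("id,ccy,notional,desk\nT2,USD,250,rates\n", "csv"))
```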

Just a few years ago, implementing a new data control framework would have been a mammoth task, sapping resource from other critical operational and IT functions. As we head into 2016, it can be a simple augmentation project, live and operational within a few weeks. It is a smarter route to data integrity – one that empowers banks to take control without fearing what lies under the hood.

