
Process Re-engineering, Implementation Are Key to Success of EDM, Says Data Warrior Fleming

The appointment of chief data officers (CDOs) has been heralded as a sign that financial institutions’ senior managers have at last recognised the importance of enterprise data management (EDM). While this is undoubtedly true, taking the EDM remit and running with it through to a successful conclusion is no mean feat. Enterprise data projects risk being derailed at a number of turns, and those charged with spearheading them must balance the often conflicting needs of different stakeholders while demonstrating a solid ROI and equipping the firm with the appropriate resources, both human and technological, for the job.

“I do see more firms starting to understand the case for action,” says EDM veteran John Fleming, formerly responsible for enterprise data at Morgan Stanley. “They are getting burnt by trading losses and lawsuits, or they are seeing their operational costs escalating. On the client side, all businesses – trading businesses, sales businesses – are trying to achieve client centricity, and you can’t do that unless your customer data is well managed.” But once the decision has been made to press ahead with a data project, there are certainly numerous pitfalls to watch out for, he agrees. “The biggest one is not getting proper business buy-in. The project really has to be bought into at the highest level of the firm if you ever hope to succeed.”

Success in a data initiative has to be a combination of completing the project and carrying out the necessary business process re-engineering around it, Fleming reckons. Where firms get “the wrong end of the stick”, he says, “is when data projects are perceived as technology projects”. “The key people – the visionaries – are in the IT department, and what ends up getting missed is the business process change that must go along with it. This is akin to having someone build you a fancy new car, and you can’t drive it.” The business process change required is “very traditional process re-engineering”, he says.

Without it, there is a risk that, because people typically only care about their own part of a process, they will be reluctant to give up control of an activity, or fail to appreciate the importance of something that they believe “happens over there”.

“Then you’re only creating downstream problems. The business process re-design component of a data initiative will involve getting people to stop doing something they have always done and start doing something else instead. It may also include implementing a workflow-style solution – putting in place more controls.” If you are going to automate, “don’t look to just automate the manufacturing”, Fleming continues. “Automate the manufacturing, and improve the quality and controls. This must be thought through upfront. You don’t build a car and then think about how quickly it should go.”

The people side of the EDM equation must also be addressed, he says. “Unfortunately, projects often reside with the security master people. This is wrong. It is not about security master data. It’s about process management workflow. You certainly need data expertise, but you also need people with expertise in process re-engineering, user interfaces, field placement, ergonomic screen design… Bring in people with different perspectives – people from different industries. If a piece of security master data is wrong, no-one dies. In the pharmaceuticals industry, if a piece of data is wrong in the manufacturing process, people CAN die. If you really want to understand data quality and controls, manufacturing people can teach you so much.”

On the subject of the technology side of the equation, Fleming is a self-confessed “heretic”. “There’s a solution out there for everybody,” he says. One impediment to successful implementations is that “everybody is running around searching for the golden RFP”. If the emphasis on finding the perfect system is too strong, there is a danger of “spending all your time and money looking for it”. “Just pick a system and get started,” he suggests. “Many times, firms have solutions in their shops already, left over from previous failed attempts. It would be perfectly possible to pick up one of those and keep going. But usually, firms just throw them away and start again. This lack of completion is one of the greatest problems of all, because it just creates one more disconnected component in the infrastructure.”

A common cry from the data management industry is that building a business case and demonstrating an ROI on data projects is difficult and that this hampers firms’ ability to get buy-in for initiatives. But, believes Fleming, “firms can and should come up with business cases”. It is perfectly possible to do this, based on both hard and soft benefits, he says, and indeed it is a vital step to take prior to a project. “If you don’t understand the processes well enough upfront, then you won’t get the process change right. You will just end up putting in place another technology container. And through the whole experience of building the business case, you will create change advocates that you wouldn’t otherwise have had.”

This is an edited version of an article that first appeared in the inaugural edition of Reference Data Review’s sister publication A-Team IQ, available free to subscribers. Go to www.a-teamiq.com or email ateamiq@a-teamgroup.com for details.
