Counting the Costs and Constraints of Building In-House Solutions

Research commissioned by Asset Control highlights the pain points of updating or replacing in-house solutions, including cost, skills and resourcing, and the difficulty of deploying solutions to the cloud.

Headline statistics from the research show that 94% of respondents expect to encounter challenges of some sort when building a solution in-house, and that 52% of senior decision makers in financial services organisations across the US and Europe are looking to update or replace in-house solutions because they have become technologically outdated. Some 49% are pushed to schedule changes by increasing digitalisation within the business, and 48% by the need to keep pace with the competition.

The survey underpinning the research was carried out by One Poll among 100 decision makers in financial firms with over 50 employees. Fifty of the firms are in the UK and 50 in the US.

According to the research, the challenges of building in-house often lead, directly or indirectly, to greater costs. Skills and resourcing are the biggest challenge of building a solution in-house, followed by staying within budget and having to adapt the solution to meet changing regulations or business requirements.

Martijn Groot, vice president of marketing and strategy at Asset Control, says: “The gradual accumulation of additional costs is one of the biggest problems with the in-house approach to technology development in financial services. Internal solutions are often approached as a project, a one-off cost, and are not regarded, and consequently budgeted, as an ongoing concern. This is unrealistic in a fast-changing financial services landscape.”

The ongoing costs of internal solutions tend to come from additional work after implementation and from hiring new developers when previous developers leave the business.

Groot comments: “The one-off approach, if executed well, may look attractive given that the firm is best placed to cater to its own specific requirements. However, the subsequent maintenance costs to keep the lights on and evolve the feature set to cope with emerging requirements are large.

“Also, with costing often done as a project, some operational costs tend to be hidden until an organisation wants to change something. This can be a challenge, particularly if the original developers have moved on, the platform is technologically outdated or does not lend itself well to cloud deployment. Unfortunately, the true costs and constraints of an internally developed solution often only become clear when firms need to change things.”
