The knowledge platform for the financial technology industry

A-Team Insight Blogs

Simplifying the Data Pipeline Through Automation


The increasing complexity of data processes and the surging use cases for that data have made pipeline automation a must for financial institutions, says Stonebranch vice president for solution management Nils Buer.

Not only is there a business case for pipeline automation, but there is also a growing legal case for it as regulators pile ever-greater obligations on firms.

“In Europe, it’s a legal requirement that you monitor your flow end-to-end,” Buer tells Data Management Insight. “It’s important that you have full auditability of your flow but if you must go into each individual subsystem and connect through scripts, it’s very tough to audit. So automation is crucial.”

Regulatory oversight of data has increased as pipelines have lengthened. From ingestion to delivery, the processes through which data passes are multiplying as vendors unveil new products and services to clean, refine and order the information. Further complicating the picture, pipelines are constructed across a variety of frameworks, with modern institutions linking on-premises stacks with multi-cloud setups. Pipelines also shift over time, with resource-heavy processes moved between platforms to take advantage of the best compute rates available at any given time of day.

New Applications

Stonebranch is among the leading providers of pipeline and workflow automation. Its software-as-a-service and on-premises products power insurers and banks alike, as well as manufacturers and retailers. It knits organisations’ data workflows together and provides users with a central hub from which they can oversee and manage those workflows in real time. If a problem arises with a data feed or a mastering system, for example, the company’s data manager can address it from the hub without having to go into the individual application.

It’s customisable, too, and lets users integrate on-premises and cloud-based schedulers. As a self-service product it is configurable via JSON, enabling organisations to shape their pipeline view in a format their engineers already work in.
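To make the idea of a JSON-defined pipeline concrete, the sketch below shows what such a workflow definition might look like. The field names and structure here are hypothetical illustrations of the pattern, not Stonebranch’s actual schema:

```json
{
  "workflow": "nightly-market-data-load",
  "tasks": [
    { "name": "ingest",    "agent": "on-prem-mainframe", "command": "run_extract.sh" },
    { "name": "transform", "agent": "cloud-spark",       "dependsOn": ["ingest"] },
    { "name": "deliver",   "agent": "cloud-warehouse",   "dependsOn": ["transform"] }
  ],
  "onFailure": { "notify": "data-ops-team", "retry": 2 }
}
```

The appeal of this style is that the dependency graph, the execution targets (on-premises or cloud) and the failure policy all live in one declarative document, rather than being scattered across hand-written scripts in each subsystem.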

In this way, Stonebranch’s solution is one that “orchestrates as well as automates”, says Buer, who is responsible for ensuring the multiplicity of new applications that come onto the market can be integrated into the automation tool.

“You have so many products involved in one single data pipeline that it is a must in many use cases,” he explains. He illustrates the point by citing banks, which are more likely to operate a hybrid on-premises and cloud setup. “You need somehow to connect these two worlds, and doing this by scripts is impossible – it’s too complicated. We really simplify the data pipeline.”

New York Summit

The benefits and challenges of taming data pipelines will be a central plank of Buer’s keynote address at A-Team Group’s Data Management Summit New York City later this month. His address, entitled “Future-proof your data pipelines with advanced automation”, will also touch on the expected return on investment from automating a data pipeline.

This latter point is a central one to any investment in new data management processes, and Buer is in no doubt that the benefits of pipeline automation can be realised quickly and in multiple ways. As well as optimising cloud use to get the best price benefits from each platform, and reducing the risk of legal action through poor auditing, Buer says the Stonebranch solution reduces drag in data pipelines. Providing data with a frictionless route through its many processes reduces the need for costly interventions. And should eddies emerge, engineers have a simple and clear route to take remedial action through Stonebranch’s hub structure.

“When a workflow goes wrong, you really want to know where the issue is and why it happened,” he says. “So this is why you can see the whole workflow in an observability dashboard using OpenTelemetry.”
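The idea behind that kind of observability dashboard can be sketched in a few lines: record each pipeline step as a span with a status, so that when a workflow fails the hub can report exactly which step broke and why. This is a stdlib-only illustration of the pattern, with invented names; Stonebranch’s actual product instruments workflows with OpenTelemetry rather than this toy tracer.

```python
# Minimal sketch of span-based workflow observability (names are hypothetical).
from dataclasses import dataclass, field
from typing import Callable, List, Optional

@dataclass
class Span:
    step: str
    status: str = "running"
    error: Optional[str] = None

@dataclass
class WorkflowTrace:
    name: str
    spans: List[Span] = field(default_factory=list)

    def run_step(self, step_name: str, func: Callable):
        """Run one pipeline step, recording its outcome as a span."""
        span = Span(step_name)
        self.spans.append(span)
        try:
            result = func()
            span.status = "ok"
            return result
        except Exception as exc:
            span.status = "failed"
            span.error = str(exc)
            raise

    def first_failure(self) -> Optional[Span]:
        """Return the first failed span, if any - the 'where and why'."""
        return next((s for s in self.spans if s.status == "failed"), None)

trace = WorkflowTrace("nightly-load")
try:
    trace.run_step("ingest", lambda: "raw rows")
    trace.run_step("transform", lambda: 1 / 0)  # simulated failure
    trace.run_step("deliver", lambda: None)     # never reached
except ZeroDivisionError:
    pass

failed = trace.first_failure()
print(failed.step, "-", failed.error)  # → transform - division by zero
```

A real dashboard aggregates these spans across every subsystem in the pipeline, which is what makes end-to-end auditability possible without logging into each system individually.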

Cultural Change

Stonebranch’s technology was recently recognised in Gartner’s 2024 Market Guide for DataOps Tools, which said such tools “enable organisations to continually improve data pipeline orchestration, automation, testing and operations to streamline data delivery”.

Of the challenges to implementation, Buer says the most difficult to overcome is not technological but cultural, with competing business units being expected to cooperate and collaborate in ways in which they may not be accustomed. Bringing an organisation’s various stakeholders together, however, can be made easier by the Stonebranch hub system.

“The key challenge is to connect the different departments,” he says, citing a leading global insurance brand. “They have a mainframe team, they have a cloud team, and they have also a Kubernetes DevOps team.

“Insurance policy data is in the mainframe, but needs to be in the web portal too, so these mainframe people now need to talk with the people from the cloud team. It’s easier when you can orchestrate these teams centrally.”

·      A-Team Group’s 14th annual Data Management Summit New York City will be held on 26th September.

