Simplifying the Data Pipeline Through Automation

The increasing complexity of data processes and the surging number of use cases for that data have made pipeline automation a must for financial institutions, says Stonebranch vice president for solution management Nils Buer.

Not only is there a business case for pipeline automation, but there is also a growing legal case for it as regulators pile ever-greater obligations on firms.

“In Europe, it’s a legal requirement that you monitor your flow end-to-end,” Buer tells Data Management Insight. “It’s important that you have full auditability of your flow but if you must go into each individual subsystem and connect through scripts, it’s very tough to audit. So automation is crucial.”

Regulatory oversight of data has increased as pipelines have lengthened. From ingestion to delivery, the processes through which data passes are multiplying as vendors unveil new products and services to clean, refine and order the information. Further complicating the picture, pipelines are constructed across a variety of frameworks, with modern institutions linking on-premises stacks with multi-cloud setups. Pipelines also shift over time, with resource-heavy processes moved between platforms to take advantage of the best compute rates available at any given time of day.

New Applications

Stonebranch is among the leading providers of pipeline and workflow automation. Its software-as-a-service and on-prem products power insurers and banks alike, as well as manufacturers and retailers. It knits organisations’ data workflows together and provides users with a central hub from which they can oversee and manage them in real time. If a problem arises with a data feed or a mastering system, for example, the company’s data manager can address it from the hub without having to go to the individual application.

It’s customisable, too, and lets users integrate on-prem and cloud-based schedulers. As a self-service product it is configurable via JSON, enabling organisations to shape their pipeline view from whichever programming language they prefer.
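Because the configuration is plain JSON, a pipeline definition can be generated, versioned and audited like any other code artefact. The sketch below is purely illustrative: it assumes a hypothetical schema of task IDs, agents and dependencies rather than Stonebranch’s actual format, but it shows the general shape such a declarative definition might take.

```python
import json

# Hypothetical workflow definition -- illustrative only, not Stonebranch's
# actual schema. Each task names the environment ("agent") it runs on and
# the tasks it depends on, so the whole flow is declared in one place.
workflow = {
    "name": "daily-positions-load",
    "tasks": [
        {"id": "extract", "agent": "mainframe", "command": "run_extract.sh"},
        {
            "id": "transform",
            "agent": "cloud",
            "command": "clean_positions.py",
            "depends_on": ["extract"],
        },
        {
            "id": "load",
            "agent": "cloud",
            "command": "load_warehouse.py",
            "depends_on": ["transform"],
        },
    ],
}

# Serialise to JSON, ready to check into version control or submit to an
# automation platform's API.
print(json.dumps(workflow, indent=2))
```

Declaring dependencies this way, rather than chaining scripts across subsystems, is what makes end-to-end auditability tractable: the full flow is readable in a single document.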

In this way, Stonebranch’s solution is one that “orchestrates as well as automates”, says Buer, who is responsible for ensuring the multiplicity of new applications that come onto the market can be integrated into the automation tool.

“You have so many products involved in one single data pipeline that it is a must in many use cases,” he explains. He illustrates the point citing the example of banks, which are more likely to operate a hybrid on-prem and cloud setup. “You need somehow to connect these two worlds, and doing this by scripts is impossible – it’s too complicated. We really simplify the data pipeline.”

New York Summit

The benefits and challenges of taming data pipelines will be a central plank of Buer’s keynote at A-Team Group’s Data Management Summit New York City later this month. His address, entitled “Future-proof your data pipelines with advanced automation”, will also touch on the expected return on investment from automating a data pipeline.

This latter point is central to any investment in new data management processes, and Buer is in no doubt that the benefits of pipeline automation can be realised quickly and in multiple ways. As well as optimising cloud use to get the best price from each platform, and reducing the risk of legal action arising from poor auditing, Buer says the Stonebranch solution reduces drag in data pipelines. Providing data with a frictionless route through its many processes reduces the need for costly interventions. And should eddies emerge, engineers have a simple and clear route to remedial action through Stonebranch’s hub structure.

“When a workflow goes wrong, you really want to know where the issue is, why this issue happened,” he says. “So this is why you can see the whole workflow in an observability dashboard using OpenTelemetry.”
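OpenTelemetry is an open, vendor-neutral standard for emitting traces from running processes. As a rough illustration of the idea Buer describes, here is a minimal Python sketch using the open-source opentelemetry-sdk package, not anything Stonebranch-specific; the workflow and task names are invented for the example. Each pipeline step is recorded as a span, so a dashboard consuming the telemetry can reconstruct the whole workflow and pinpoint where, and why, a step failed.

```python
# Requires the open-source SDK: pip install opentelemetry-sdk
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor
from opentelemetry.trace import Status, StatusCode

# Wire up a tracer that prints spans to stdout; a real deployment would
# export to a collector feeding the observability dashboard.
provider = TracerProvider()
provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)
tracer = trace.get_tracer("data-pipeline")

# One parent span for the workflow and a child span per step, so a
# dashboard can show where in the flow an error happened and why.
with tracer.start_as_current_span("daily-positions-load"):
    with tracer.start_as_current_span("extract") as span:
        span.set_attribute("pipeline.agent", "mainframe")  # illustrative attribute
    with tracer.start_as_current_span("transform") as span:
        try:
            raise ValueError("malformed record")  # simulated step failure
        except ValueError as exc:
            span.record_exception(exc)  # stack trace captured on the span
            span.set_status(Status(StatusCode.ERROR))
```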

Cultural Change

Stonebranch’s technology was recently recognised in Gartner’s 2024 Market Guide for DataOps Tools, which said they “enable organisations to continually improve data pipeline orchestration, automation, testing and operations to streamline data delivery”.

Of the challenges to implementation, Buer says the most difficult to overcome is not technological but cultural, with competing business units expected to cooperate and collaborate in ways to which they may not be accustomed. Bringing an organisation’s various stakeholders together, however, can be made easier by the Stonebranch hub system.

“The key challenge is to connect the different departments,” he says, citing a leading global insurance brand. “They have a mainframe team, they have a cloud team, and they have also a Kubernetes DevOps team.

“Insurance policy data is in the mainframe, but needs to be in the web portal too, so these mainframe people now need to talk with the people from the cloud team. It’s easier when you can orchestrate these teams centrally.”

A-Team Group’s 14th annual Data Management Summit New York City will be held on 26th September.
