
Stream Financial Releases High-Performance Data Management Platform

Stream Financial has released a data management platform designed to solve the problems of data silos and ageing enterprise data management (EDM) solutions, and to deliver efficient, high-performance data management that can reduce total cost of ownership and help firms improve margins.

The platform challenges traditional data management systems, but keeps existing infrastructure in place, by providing a distributed query engine that sits on top of existing systems, connects data from multiple sources and readies it for processing. By way of example, a query run on the platform with connectors to an Oracle database, Microsoft SQL Server, a bespoke platform or spreadsheets could be answered in about 50 milliseconds.
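The article does not describe the engine's internals, but the general pattern it points to, answering one question from several existing stores without first copying them into a central warehouse, can be sketched in ordinary Python. Everything below, connection strings, table names and columns, is hypothetical and for illustration only; it is not Stream Financial's API.

import sqlalchemy as sa
import pandas as pd

# Hypothetical connections to two existing stores left in place.
oracle = sa.create_engine("oracle+oracledb://user:pass@trades-db/PROD")
sqlserver = sa.create_engine("mssql+pyodbc://user:pass@risk-db-dsn")

# Pull only the columns each source is asked for...
trades = pd.read_sql("SELECT trade_id, counterparty, notional FROM trades", oracle)
limits = pd.read_sql("SELECT counterparty, credit_limit FROM limits", sqlserver)

# ...then join them in memory to answer a single cross-source question.
exposure = (
    trades.merge(limits, on="counterparty")
    .groupby("counterparty")
    .agg(total_notional=("notional", "sum"), credit_limit=("credit_limit", "first"))
)
print(exposure[exposure.total_notional > exposure.credit_limit])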

Stephen Taylor, CEO at Stream Financial, says: “Most financial institutions have a massive regulatory burden and data requirement. They have tried to reduce costs by off-shoring and outsourcing, but they need to do more to compete in an environment of low margins.”

Considering existing data management systems, Taylor notes that problems with data silos have led firms to copy the data from the silos and take a Big Data approach, which is not ideal as it is better to access data at source. EDM systems, he says, translate data efficiently into required formats, but struggle to deliver at scale and can become complex.

The technology underpinning the Stream Financial platform is a compressed columnar database, a technology initially used in astronomy in the 1970s. It supports commonly used programming languages such as Java, Python and .NET, and uses SQL to query data. Multiple queries can be run concurrently using a query app provided by Stream Financial, with scripts written by the company or the user. The app calls processes to gather the desired dataset, which is then sent to an existing system app such as a reporting app.
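Stream Financial's query app is not publicly documented, so the snippet below uses DuckDB, an open-source compressed columnar engine, purely as a stand-in to show the pattern the article describes: SQL issued from Python, with several queries running concurrently. The database file and the ticks table are assumed to exist.

import duckdb
from concurrent.futures import ThreadPoolExecutor

# Stand-in illustration using DuckDB, not Stream Financial's product.
con = duckdb.connect("history.duckdb")  # hypothetical file of compressed historical data

queries = [
    "SELECT instrument, avg(price) FROM ticks GROUP BY instrument",
    "SELECT count(*) FROM ticks WHERE trade_date = DATE '2017-06-30'",
]

def run(sql):
    # Each thread gets its own cursor so queries can execute concurrently.
    return con.cursor().execute(sql).fetchall()

with ThreadPoolExecutor() as pool:
    for result in pool.map(run, queries):
        print(result)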

The company was founded in 2013 and has built the platform from the ground up, in part based on previous experience Taylor gained working on similar problems at Barclays Capital. He describes it as a simple and safe means of answering business needs for fast and efficient data queries.

Its components include DF Vault, DF Performance and DF Virtualisation. DF Vault allows real-time querying of compressed historical data and, through compression, reduces storage requirements and increases performance. DF Performance supports high-performance caching, leverages existing technology and makes data in any system accessible to users in SQL. DF Virtualisation provides targeted, customer-centric views of live operational data and clean data to drive apps, reporting, artificial intelligence and analytics, and supports data lineage.
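DF Vault's internal format is not described in the article, but the kind of storage saving compressed columnar formats typically give over row-oriented files can be shown with generic open-source tools. The figures come from synthetic data and say nothing about the product itself.

import os
import numpy as np
import pandas as pd

# Synthetic price history written two ways: row-wise CSV and compressed columnar Parquet.
rows = 1_000_000
df = pd.DataFrame({
    "instrument": np.random.choice(["AAPL", "VOD.L", "BMW.DE"], rows),
    "trade_date": pd.to_datetime("2017-01-02")
                  + pd.to_timedelta(np.random.randint(0, 180, rows), unit="D"),
    "price": np.round(np.random.uniform(10, 500, rows), 2),
})

df.to_csv("history.csv", index=False)
df.to_parquet("history.parquet", compression="zstd")  # columnar layout plus compression

for f in ("history.csv", "history.parquet"):
    print(f, round(os.path.getsize(f) / 1e6, 1), "MB")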

“It is the speed of processing and data sourcing that differentiates the Stream Financial platform,” says Taylor. “Operating as a virtual database with some aspects of hardware caching, it can handle billions of rows of data and make concurrent queries across multiple data sources fast and efficiently.”

The company is initially promoting its solutions, which can be implemented in-house, in a private cloud or as managed services, to buy-side firms. These firms are significantly smaller than sell-side banks, which are often reluctant to buy solutions from small vendors, and frequently take the option of managed services rather than enterprise solutions. The company has secured one customer and has others in the pipeline. Customers pay to use the platform on a monthly basis, with additions such as more data sources paid for at a consultancy rate and support included, which Taylor describes as ‘a simple offer to buy’.

