About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Data Virtuality Breaks Data Management Mould with Logical Data Warehouse

Data Virtuality has broken the traditional data management mould with Logical Data Warehouse (LDW), a solution that combines the flexibility of a data virtualisation engine with extract, transform and load (ETL) tools. It includes 200 connectors to data sources and consumption tools, allows users to access data in real time, and uses SQL to define data models and access data directly from different systems for use cases such as regulatory reporting.

The company was founded in March 2012 with an initial focus on digital use cases of LDW such as e-commerce and digital marketing. More recently, it has gained interest from the finance sector and is reconsidering its positioning to work specifically within the sector. Clients already on board include Vontobel and Crédit Agricole Consumer Finance, the latter of which uses the platform to aggregate credit risk data, produce regulatory reports, and monitor credit loan applications in real time.

Nick Golovin, founder and CEO of Data Virtuality, explains: “The biggest problems banks face are the challenges of regulation, regulators looking at how they produce reports, and cost. The features of our system fit well here and fulfil use cases such as risk data aggregation, regulatory reporting, digital banking, and real-time processing.”

The Data Virtuality solution provides a single platform to connect, transform, query and join data from multiple data sources immediately, without depending on IT. It can be used across a business, implemented in a day, and its flexibility and scalability allow new queries to be set up in minutes rather than months. The technology is also transparent, making data lineage and an audit trail relatively easy to achieve. Data governance is built into the access layer.

From a user perspective, the solution’s virtualisation engine takes data from any connected source, makes the data look like an SQL database and allows the user to use SQL to define data models and get data directly from different systems to meet particular use cases. Golovin comments: “The virtual layer makes modelling very flexible, the same data is used for different models and output requirements.”
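To illustrate the idea, the sketch below shows what working against such a virtual layer could look like. The schema names (`crm`, `core_banking`), table names, and columns are hypothetical, invented for this example, and the syntax is generic SQL rather than Data Virtuality's actual product syntax:

```sql
-- Hypothetical sketch: a virtual view in the access layer joining
-- two connected sources. "crm" and "core_banking" stand in for
-- illustrative connector schemas, not real Data Virtuality names.
CREATE VIEW risk.loan_exposure AS
SELECT c.customer_id,
       c.country,
       l.loan_id,
       l.outstanding_balance
FROM   crm.customers      AS c
JOIN   core_banking.loans AS l
       ON l.customer_id = c.customer_id;

-- Consumers then query the view with plain SQL, regardless of where
-- the underlying data physically lives:
SELECT country,
       SUM(outstanding_balance) AS total_exposure
FROM   risk.loan_exposure
GROUP  BY country;
```

In this model the same virtual view can feed several outputs, for example a regulatory report and a real-time credit monitoring dashboard, without copying or re-modelling the underlying data.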
