

Jordan & Jordan Unveils Reference Data Testing Kit


Jordan & Jordan (J&J), a New York-headquartered management and information technology consulting firm, has introduced its Reference Data Testing Framework (RDTF) and will offer it to clients as part of its reference data consulting services.

Developed by J&J over the past two years, RDTF is a customizable framework and set of tools that address the need for automated data testing and verification within reference data projects. It allows one-time creation of test data records, each paired with an analysis specification that is used to analyze test results. Among the data products supported are Bloomberg Data License and Reuters DataScope.
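
J&J has not published RDTF's record or specification formats, but the idea of a test record paired with its own analysis specification can be pictured with a short sketch. Everything below (test_record, analysis_spec and the field names) is an illustrative assumption, not the actual schema:

    # Hypothetical sketch of a test data record and its per-record
    # analysis specification; names are assumptions, not RDTF's schema.
    test_record = {
        "record_id": "TR-0001",
        "identifier": "IBM.N",  # instrument requested from the vendor feed
        "fields": {"currency": "USD", "asset_class": "Equity"},
    }

    analysis_spec = {
        "record_id": "TR-0001",
        "expected": {"currency": "USD", "asset_class": "Equity"},
    }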

Although RDTF is primarily used during the development and integration test cycles of a reference data project, it can also help with vendor selection and offshore testing.

Essentially, RDTF takes data at different points in the data-processing lifecycle, compares it with expected results, and reports exceptions. The exceptions can then be analyzed to determine whether problems lie in the systems, the process, or the data itself.
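
The internals of this compare-and-report step are likewise not public. A minimal Python sketch, assuming the hypothetical record and specification shapes above, might look like this:

    def analyze(actual_records, analysis_specs):
        # Compare actual records with their expected values and collect
        # exceptions for later analysis; shapes follow the sketch above.
        exceptions = []
        specs_by_id = {spec["record_id"]: spec for spec in analysis_specs}
        for record in actual_records:
            spec = specs_by_id.get(record["record_id"])
            if spec is None:
                exceptions.append({"record_id": record["record_id"],
                                   "issue": "no analysis specification"})
                continue
            for field, expected in spec["expected"].items():
                actual = record["fields"].get(field)
                if actual != expected:
                    exceptions.append({"record_id": record["record_id"],
                                       "field": field,
                                       "expected": expected,
                                       "actual": actual})
        return exceptions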

“When you want to validate or arbitrate your reference data, you need a test,” says Richard Shriver, managing director, technology services, at J&J. “We recognized that testing involved repeatable generic processing, and so we evolved some of the custom development software into a reusable toolkit. By using the toolkit within our consulting service, we provide clients a more efficient service – better, faster and cheaper – and give clients the confidence that they are dealing with people who understand the business processes enough to ask the right questions.”

Features of RDTF include:
Record count checks on raw records loaded into the system, with reporting of missed records.
User-friendly test data creation, with an accompanying test analysis specification on a per-record basis.
Customizable, implementation-specific analysis of test results.
Excel-like reports on test results, details of exceptions and client-specific report additions.

Components include a data reader/generator, data loader, analyzer and reporter. Test reports include summary pass/fail, exceptions and detail reports, as well as an array of custom reporting options.
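
Taking that component list at face value, the four stages could be wired together as in the toy pipeline below, which also folds in the record count check from the feature list. It reuses the hypothetical analyze() from the previous sketch; the reader and loader are stand-ins, not RDTF's actual design:

    def read_records(raw_lines):
        # Reader/generator: parse a raw vendor file into records
        # (stubbed here as "id,currency" lines).
        records = []
        for line in raw_lines:
            record_id, currency = line.strip().split(",")
            records.append({"record_id": record_id,
                            "fields": {"currency": currency}})
        return records

    def load_into_system(records):
        # Loader: stand-in for loading records into the system under test.
        return list(records)

    def report(parsed, loaded, exceptions):
        # Reporter: record count check, then summary pass/fail and detail.
        if len(loaded) < len(parsed):
            print(f"WARNING: {len(parsed) - len(loaded)} raw records not loaded")
        print(f"{'FAIL' if exceptions else 'PASS'}: {len(exceptions)} exception(s)")
        for exc in exceptions:
            print(exc)

    raw_feed = ["TR-0001,USD", "TR-0002,EUR"]
    specs = [{"record_id": "TR-0001", "expected": {"currency": "USD"}},
             {"record_id": "TR-0002", "expected": {"currency": "GBP"}}]

    parsed = read_records(raw_feed)
    loaded = load_into_system(parsed)
    exceptions = analyze(loaded, specs)  # analyze() from the sketch above
    report(parsed, loaded, exceptions)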

The J&J Reference Data Consulting Service comprises domain experts covering a broad range of asset and data classes; technology experts who have implemented large-scale, multi-asset reference data systems for major investment banks and hedge funds; and a testing strategy that includes the use of RDTF. The firm also offers consulting for institutions looking to manage reference data test and support teams offshore.

According to Shriver, any reference data implementation project, regardless of whether the software is provided by third-party suppliers or developed in-house, involves a large number of business rules and data definitions specific to each client. These all need to be independently verified and tested.

He says: “If you want to change the reference data process, you need to verify data that comes into and out of the process.”

Although “there are lots of people selling software and solutions,” Shriver says each client still has a lot of work to do to ensure the system works as expected. That’s where J&J can help.

Shriver says J&J has a growing reference data consulting business. In the past two years, the company has consulted on three major enterprise reference data projects within North America, as well as smaller projects – such as content or software changes – including vendor evaluations and RFI development.

Although pricing for J&J services varies depending on the project, Shriver says a good rule of thumb is to budget 20% of the development and implementation costs for testing and verification.
