The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Wachovia Expands Use of AC Plus To Boost Risk Data Quality

Wachovia Corp. has once again expanded its use of Asset Control’s AC Plus data management platform to boost the capabilities of its internal risk management function. Wachovia originally implemented AC Plus in its risk management operation in 2004 (Reference Data Review, January 2005) and subsequently extended its use to other areas of the enterprise as part of a three-year project (Reference Data Review, September 2005).

The latest expansion involves the addition of sources of market data used to support Wachovia’s risk management systems. The bank has added several undisclosed “complex data feeds” that it says will help “improve the quality of market data utilized within risk management,” as well as make that data available throughout the bank.

Wachovia’s risk solution makes use of snapshot, end-of-day and time-series pricing information for interest rates, credit spreads, equities, FX and commodities, gathered and managed by the AC Plus platform. Additionally, AC Plus consolidates and validates data from external sources, including Reuters, Bloomberg, FT Interactive Data and Markit Group, to provide consistency and reliability.
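AC Plus’s validation logic is proprietary and not described in detail here, but the cross-source checking the article refers to can be illustrated with a minimal sketch. The vendor names come from the article; the `Quote` class, the 10% tolerance, and the comparison against the previous close are illustrative assumptions, not Asset Control’s actual rules.

```python
from dataclasses import dataclass

@dataclass
class Quote:
    vendor: str   # e.g. "Reuters", "Bloomberg", "Markit Group"
    price: float

def validate(quotes, prev_close, max_move=0.10):
    """Split quotes into accepted and suspect lists using a simple
    cross-day tolerance check: any quote deviating more than
    max_move (here 10%) from the previous close is flagged."""
    ok, suspect = [], []
    for q in quotes:
        if abs(q.price - prev_close) / prev_close <= max_move:
            ok.append(q)
        else:
            suspect.append(q)
    return ok, suspect

quotes = [Quote("Reuters", 101.2), Quote("Bloomberg", 101.4), Quote("Markit Group", 87.0)]
ok, suspect = validate(quotes, prev_close=100.0)
# The 87.0 quote deviates 13% from the prior close and is flagged as suspect.
```

A production system would of course layer many such checks (staleness, cross-vendor spread, time-series continuity) rather than a single threshold.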

Wachovia is also making use of Asset Control’s four-dimensional graphing capabilities, which allow the bank to analyze and track data anomalies and trends over time.

Speaking at the ISIPS conference in London this month, Martijn Groot, head of product management at Asset Control, outlined how Asset Control’s audit and backtracking functions allow clients to standardize and consolidate disparate, often-conflicting price data from multiple sources into a single consolidated price that can be published to internal applications.
The process involves applying client-defined business rules to incoming and internal data sources. These rules reflect the client’s approach to data management, and may range in complexity from a simple average to a sophisticated algorithm, in order to arrive at a figure the institution is comfortable with.
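The rule-driven consolidation described above can be sketched as a pluggable function applied to conflicting vendor prices. This is an illustrative assumption about the pattern, not Asset Control’s actual rule engine; the prices and the two example rules (a simple average and a median) are hypothetical.

```python
import statistics

def consolidate(prices, rule):
    """Apply a client-defined business rule to conflicting vendor
    prices to derive a single consolidated price."""
    return rule(prices)

# Rules range from a simple average ...
simple_average = statistics.mean
# ... to something more elaborate; a median, which resists outliers,
# stands in here for a "sophisticated algorithm".
robust_rule = statistics.median

prices = [101.2, 101.4, 87.0]   # hypothetical quotes from three vendors
consolidate(prices, simple_average)  # arithmetic mean, pulled down by the outlier
consolidate(prices, robust_rule)     # median, ignores the 87.0 outlier
```

Treating the rule as a swappable function mirrors the article’s point: each institution encodes its own approach to data management without changing the surrounding pipeline.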
