The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Data Management Summit: UBS’s Brown on Creating a Data Map of the Financial Enterprise

Rupert Brown, lead architect in the office of the CTO at UBS Investment Bank, opened the A-Team Group Data Management Summit with a keynote presentation on creating a data map for a financial enterprise. Taking a traditional map as his starting point, and later touching on the topological nature of the London Underground map, he posed a series of questions: do we have a map; where are we on the map; where do we want to go; how do we get there; how far do we have to go; and, inevitably, are we nearly there yet?

Outlining the drivers for an enterprise data map, Brown explained: “The need is to look at data across the enterprise, not only reference data, but also pricing and risk data. It is also necessary to look at the segregation of duties around data, systems development and operations. We don’t have permanent employees making up IT teams any more, instead we have consultants, outsourcing arrangements and so on, which means metadata is important in sustaining a single source of the truth.”

Brown went on to discuss the need for positional reference points and categorisation on any data map, and cited the London Underground map, first drawn in the 1930s and looking much the same today, as an example of a map with a notion of latitude and longitude, an orientation with north at the top, and standard fonts and annotation. But he warned: “It is not enough to draw a map, people must believe in it and refer to it.”

Turning to an IT-intensive financial enterprise, Brown described a project he is leading at UBS that maps interconnections as longitudinal bars and applications as latitudinal bars. The data map was generated by feeding the bank’s database into OpenStreetMap and includes nodes and ways, as well as the relations between them that can carry metadata into the map.
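The OpenStreetMap data model Brown alludes to is built from three primitives, each of which can carry free-form key-value tags. A minimal sketch of how applications and interconnections might be expressed in those terms follows; the class shapes are simplified and the system names (TradeStore, MQ and so on) are purely hypothetical:

```python
from dataclasses import dataclass, field

# OpenStreetMap's three core primitives: nodes (points), ways (ordered
# sequences of nodes) and relations (groupings of other elements).
# Each carries free-form tags, which is where metadata lives.
@dataclass
class Node:
    id: int
    lat: float
    lon: float
    tags: dict = field(default_factory=dict)

@dataclass
class Way:
    id: int
    node_ids: list
    tags: dict = field(default_factory=dict)

@dataclass
class Relation:
    id: int
    member_ids: list
    tags: dict = field(default_factory=dict)

# Hypothetical enterprise mapping: an application drawn as a latitudinal
# bar (a way), an interconnection as a longitudinal bar, and a relation
# tying the two together with descriptive metadata in its tags.
app = Way(1, [10, 11], tags={"type": "application", "name": "TradeStore"})
link = Way(2, [11, 12], tags={"type": "interconnect", "protocol": "MQ"})
feed = Relation(1, [app.id, link.id], tags={"role": "market-data-feed"})
```

Because the tags are unconstrained key-value pairs, any metadata the bank wants to surface on the map can be attached without changing the underlying structure.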

The initial mapping concept has been tested using several hundred applications and several thousand connections. The outcome is a good view of data transports and applications, and a highly queryable solution offering many data views without the need to change data and its relationships on the map. Brown commented: “The map is sufficiently simple and consistent to stand the test of zooming in and out.”

Summarising the project to date, he added: “We have learnt that 2D maps are best and can be used as layers or with links, that the OpenStreetMap approach is viable and that the database has to be NoSQL. We have also learnt that we need large format printers.”

As OpenStreetMap is built on crowdsourcing, Brown plans to transfer the data map to a web page, allowing people to click on a node and add or edit data. He explained: “This is a key part of the project. We will ask people to push data into the map as they write software and we will need to map protocols with animation. We also need bitemporality to see what the map looked like and what it will look like. This requires versions of the map and animation over time.”
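The bitemporality Brown mentions is conventionally modelled with two time axes per fact: valid time (when a connection actually existed) and transaction time (when the map learned about it). A minimal sketch, with entirely hypothetical edge names and dates:

```python
from datetime import date

# Bitemporal map entries (illustrative data): each edge carries a
# valid-time interval (when the connection existed in reality) and a
# transaction time (when that fact was recorded in the map).
entries = [
    # (edge, valid_from, valid_to, recorded_on)
    ("AppA->AppB", date(2010, 1, 1), date(2012, 6, 30), date(2010, 2, 1)),
    ("AppA->AppC", date(2012, 7, 1), date(9999, 12, 31), date(2012, 7, 15)),
]

def map_as_of(valid_on, known_on):
    """Reconstruct the map for a real-world date, using only facts
    recorded by a given date - 'what did the map look like then'."""
    return [edge for edge, vf, vt, rec in entries
            if vf <= valid_on <= vt and rec <= known_on]
```

Varying the two dates independently gives both views Brown asks for: what the map looked like at some point in the past, and what the current record says the world looked like then.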

The map concept will be moved to a web server using the open source Mapnik toolkit for rendering maps, while a Java animation tool is being built to provide the ability to add or edit data. UBS is working with University College London to evolve this part of the project and consider aspects such as who should contribute data and how data mapping experts can be identified on the basis of their contributions.

Brown explained the genesis of the data map as a project he undertook in his former role at Merrill Lynch to find out the degrees of separation between reference data and its targets. Looking forward, he said: “Essentially, the map is a data structure that can be used to query dependencies, but it is also part of a bigger idea about how you get people to put data into systems and how you gather knowledge within the enterprise and bring it back into the software development lifecycle. We are building engines that will bring together and map systems and data models, and we are building processes to check the map. The end game is to derisk the enterprise.”
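A dependency query of the kind Brown describes, finding the degrees of separation between reference data and its targets, amounts to a shortest-path search over the map's graph. A sketch using breadth-first search, with a hypothetical adjacency list standing in for the real map:

```python
from collections import deque

# Dependency map as an adjacency list (hypothetical system names):
# each edge points from a data source to a system that consumes it.
deps = {
    "RefDataMaster": ["PricingEngine", "TradeStore"],
    "PricingEngine": ["RiskEngine"],
    "TradeStore": ["Settlement"],
    "RiskEngine": [],
    "Settlement": [],
}

def degrees_of_separation(graph, source, target):
    """Breadth-first search: number of hops from source to target,
    or -1 if the target never consumes the source's data."""
    seen, queue = {source}, deque([(source, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == target:
            return dist
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return -1
```

The same traversal, run in reverse, answers the impact-analysis question behind the derisking goal: which downstream systems are reachable from a given piece of reference data.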
