The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Knowledge Graph Technology – A Game Changer for Data Management in Capital Markets


Early identification of use cases that provide a focal point for the implementation of knowledge graphs is essential to successful projects, according to a panel of experts speaking at a recent A-Team Group webinar.

The webinar discussed the benefits and pitfalls of implementing knowledge graphs, including how to build, sustain and maintain them, use cases, the challenges of deployment in existing data management landscapes, and approaches firms can take to resolve challenges and achieve return on investment.

The webinar speakers were David Newman, senior vice president and head of enterprise knowledge graph solutions, data management and insights at Wells Fargo; Michael Pool, executive director, head of ontology and semantic modeling at Morgan Stanley; and Alex Brown, chief technology officer at Datactics.

Early consensus suggested that use cases of knowledge graph technology must be identified at the outset to provide a focal point for the implementation process.

“There is a tendency to try to create an enterprise-wide ontology, which can be overwhelming,” said Pool. “The key to getting a knowledge graph built is to identify the use cases you need to address and show value – it is easy to modify the graph and the ontology as you go. You don’t need to worry about getting everything right as long as you design and solve those use cases in a way that is scalable.”

Newman added: “Organisations have a great opportunity if they start with the right investment – namely a centre of excellence where groups come together and plan deployment by defining naming conventions, file structure standards, governance, and processes for approval by owners of key data areas and how the concepts presented in those key data models will be aligned with each other.”

Once use cases are agreed, it is important to make it as easy as possible for people to put data sets into the graph and extract the data they need. That said, Pool commented: “If you need a small group of engineers to get data into the graph, the project is not going to work. It is also important to think about what kind of metadata is going to be important at the outset and get it into the graph early so it becomes part of the process. That metadata is what is going to allow people to find the data and help them make links.”
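Pool’s point about capturing metadata at ingestion can be sketched in code. The following is a minimal, illustrative triple store (not any vendor’s API — the class, dataset names, and metadata keys are invented for this example): each dataset goes into the graph together with descriptive metadata, so users can later discover data by querying the metadata rather than needing to know table names.

```python
# Minimal sketch of metadata-first ingestion into a knowledge graph.
# All identifiers ("ds:trades_2024", "owner", etc.) are illustrative.

class TripleStore:
    def __init__(self):
        self.triples = set()

    def add(self, subject, predicate, obj):
        self.triples.add((subject, predicate, obj))

    def query(self, subject=None, predicate=None, obj=None):
        # None acts as a wildcard, like a basic graph pattern in SPARQL
        return [
            t for t in self.triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)
        ]

def ingest_dataset(store, dataset_id, rows, metadata):
    # Metadata goes in at ingestion time, so discovery works from day one
    for key, value in metadata.items():
        store.add(dataset_id, key, value)
    for i, row in enumerate(rows):
        row_id = f"{dataset_id}/row/{i}"
        store.add(dataset_id, "hasRow", row_id)
        for col, val in row.items():
            store.add(row_id, col, val)

store = TripleStore()
ingest_dataset(
    store,
    "ds:trades_2024",
    [{"isin": "GB00B03MLX29", "qty": 100}],
    {"owner": "trading-desk", "domain": "equities"},
)

# Discovery: find every dataset owned by the trading desk
datasets = [s for s, p, o in store.query(predicate="owner", obj="trading-desk")]
```

Because the metadata lives in the same graph as the data, “finding the data and making links”, as Pool puts it, becomes an ordinary graph query rather than a separate cataloguing exercise.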

Use cases

Use cases of knowledge graphs were the subject of an audience poll conducted during the webinar. Some 75% of respondents highlighted data analytics, while business insight and data integration were each cited by two-thirds of respondents.

A number of use cases were explored by the webinar speakers, the first of which was linking structured data with alternative data sets – an application that is becoming increasingly important in capital markets.

Newman noted the opportunity for payment monitoring and how the ability to view a network of payments is helpful in identifying financial crime. “When that layer of knowledge is injected into the network we gain insights that we would normally have many more challenges to identify,” he said.
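The payment-monitoring idea Newman describes can be illustrated with a toy graph traversal. This is a hypothetical sketch, not Wells Fargo’s system: the accounts and amounts are invented, and a real deployment would layer entity and risk knowledge on top. The point is simply that once payments are modelled as a graph, the network around a flagged account falls out of a standard traversal.

```python
# Hypothetical payment network: representing transfers as a graph makes the
# chain of accounts around a flagged account visible. Data is invented.
from collections import deque

payments = [  # (payer, payee, amount)
    ("acct:A", "acct:B", 9_500),
    ("acct:B", "acct:C", 9_400),
    ("acct:C", "acct:D", 9_300),
    ("acct:X", "acct:Y", 50),
]

# Build an undirected adjacency view of the payment graph
graph = {}
for payer, payee, _ in payments:
    graph.setdefault(payer, set()).add(payee)
    graph.setdefault(payee, set()).add(payer)

def network_of(account, max_hops):
    """All accounts reachable from `account` within `max_hops` transfers."""
    seen, frontier = {account}, deque([(account, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_hops:
            continue
        for neighbour in graph.get(node, ()):
            if neighbour not in seen:
                seen.add(neighbour)
                frontier.append((neighbour, depth + 1))
    return seen

# A flagged account's two-hop network surfaces the chain of transfers
suspects = network_of("acct:A", max_hops=2)
```

A relational join could answer the same question, but each extra hop means another self-join; in graph form the number of hops is just a parameter.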

Brown described a project Datactics has been working on to build a knowledge graph and populate a graph database with UK Companies House data and other data sets from the Office for National Statistics. The objective of the project was to link company entities to persons of significant control.

“This is a common challenge for tasks such as KYC due diligence,” Brown said. “We sought to demonstrate the importance of data quality matching by taking data verbatim from these sources, throwing it into graph databases, generating a knowledge graph, and then comparing this with a more refined approach where we ingested, normalised and cleansed the data before discovering links using fuzzy matching and natural language processing.”
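The contrast Brown draws between verbatim and cleansed matching can be shown in a few lines. This is a simplified illustration, not Datactics’ actual pipeline: the company and filing names are invented, and the normalisation rules and similarity threshold are assumptions chosen for the example.

```python
# Verbatim matching vs. normalise-then-fuzzy-match, as a toy comparison.
# Names, suffix list, and the 0.85 threshold are illustrative assumptions.
from difflib import SequenceMatcher

def normalise(name):
    # Cheap cleansing: case-fold, strip punctuation, drop common suffixes
    name = name.lower().replace(".", "").replace(",", "").strip()
    for suffix in (" limited", " ltd", " plc"):
        if name.endswith(suffix):
            name = name[: -len(suffix)]
    return name

def fuzzy_match(a, b, threshold=0.85):
    return SequenceMatcher(None, normalise(a), normalise(b)).ratio() >= threshold

company = "ACME Holdings Ltd."
psc_filing = "Acme Holdings Limited"

verbatim_link = company == psc_filing             # missed: strings differ
cleansed_link = fuzzy_match(company, psc_filing)  # found after cleansing
```

Taken verbatim, the two records never link; after normalisation the fuzzy match resolves them to the same entity — exactly the gap Brown’s comparison was designed to expose.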

Brown said the project proved the importance of carefully preparing data and imposing quality controls whenever a knowledge graph is generated or populated. “If the data is wrong, you can miss the relationship between nodes or fail to identify that two nodes are actually the same thing, so you need to be careful that your knowledge graph is giving you the complete picture if it is feeding into something like a KYC process.”

Data cataloguing can also benefit from the use of knowledge graphs. “For example, an organisation might be asked for its national identification numbers,” explained Newman. “There is typically no column in a tabular database that represents national identification numbers. With a knowledge graph you can organise your data so you can query the broad class of data of national identification numbers.”
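Newman’s cataloguing example can be sketched as follows. This is a hedged illustration, not Wells Fargo’s model: the class names, field types, and catalogued columns are invented. The mechanism is what matters — no single column is called “national identification number”, but if each column is typed against an ontology, a query over the broad class finds them all.

```python
# Illustrative ontology-backed catalogue query. All names are assumptions.

# Ontology: field type -> parent class
subclass_of = {
    "ssn": "national_id",
    "uk_nino": "national_id",
    "passport_no": "national_id",
    "iban": "account_id",
}

# Catalogue: physical column -> declared field type
catalogue = {
    "customers.ssn": "ssn",
    "uk_clients.ni_number": "uk_nino",
    "payments.iban": "iban",
}

def columns_of_class(target):
    """All catalogued columns whose type is, or is a subclass of, `target`."""
    return sorted(
        col for col, ftype in catalogue.items()
        if ftype == target or subclass_of.get(ftype) == target
    )

national_id_columns = columns_of_class("national_id")
# → ['customers.ssn', 'uk_clients.ni_number']
```

The same query keeps working as new columns are catalogued, because new identifier types only need to be declared subclasses of `national_id` — nobody has to update the query itself.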

Challenges of implementation

Newman suggested the challenges of deploying knowledge graphs in existing data management landscapes can be divided into organisational and technological issues that are ‘natural growing pains’ of introducing new technologies and capabilities. “The challenges are about giving different groups in an organisation an opportunity to review the diversity of knowledge graph tools so the right tools can be found and adopted in a technology standards catalogue,” he said. He also noted the need to develop a semantic model to underpin selected tools.

Brown emphasised the importance of automated, reproducible, and auditable pipelines to populate knowledge graphs with high quality data.
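One way to make such a pipeline reproducible and auditable — an assumption on our part, not a description of Datactics’ implementation — is to run fixed, ordered stages and record a content hash of each stage’s output, so any later run can be verified against the audit trail.

```python
# Sketch of an auditable population pipeline: deterministic stages plus a
# per-stage content hash. Stage logic and records are invented examples.
import hashlib
import json

def clean(records):
    # Normalise whitespace and case so duplicates become detectable
    return [{k: v.strip().upper() for k, v in r.items()} for r in records]

def dedupe(records):
    seen, out = set(), []
    for r in records:
        key = json.dumps(r, sort_keys=True)
        if key not in seen:
            seen.add(key)
            out.append(r)
    return out

def run_pipeline(records, stages):
    audit = []
    for stage in stages:
        records = stage(records)
        digest = hashlib.sha256(
            json.dumps(records, sort_keys=True).encode()
        ).hexdigest()
        audit.append((stage.__name__, digest))  # audit trail per stage
    return records, audit

raw = [{"name": " acme ltd "}, {"name": "ACME LTD"}]
result, audit = run_pipeline(raw, [clean, dedupe])
```

Because the stages are pure functions of their input, re-running the pipeline on the same source data reproduces the same hashes — which is what lets an auditor confirm the graph was populated from the data it claims to have been.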


A final audience poll considered the extent of business and operational benefits organisations are gaining, or would expect to gain, from the implementation of knowledge graphs. Some 38% of respondents are gaining, or expect to gain, significant business benefits, and the same percentage some business benefits. From an operational standpoint, 13% said they are gaining, or expect to gain, some operational benefits, while a similar percentage expect neither business nor operational benefits from the implementation of knowledge graphs.

“The biggest factor in return on investment is data utilisation and the way to achieve that is making the knowledge graph the framework from which data utilisation occurs,” said Pool. “The key is to be able to say, ‘I want my front line analyst to be able to figure out the impact on this particular portfolio of something happening in this particular country’.”
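The analyst question Pool poses — the impact on a particular portfolio of something happening in a particular country — is, in graph terms, a traversal. The sketch below is hypothetical (the entities and edge names are invented) but shows the shape of the query: from a country, through issuers domiciled there, to the portfolios holding their instruments.

```python
# Hypothetical traversal from a country event to exposed portfolios.
# Entities and predicates ("domiciles", "issues", "heldBy") are invented.
edges = {
    ("country:BR", "domiciles"): ["issuer:Petro"],
    ("issuer:Petro", "issues"): ["bond:P1"],
    ("bond:P1", "heldBy"): ["portfolio:EM-Credit"],
}

def traverse(start, path):
    """Follow a chain of predicates from `start`, returning the end nodes."""
    nodes = [start]
    for predicate in path:
        nodes = [t for n in nodes for t in edges.get((n, predicate), [])]
    return nodes

exposed = traverse("country:BR", ["domiciles", "issues", "heldBy"])
# → ['portfolio:EM-Credit']
```

A front-line analyst never writes this code, of course — the point is that once the relationships are in the graph, the question reduces to naming a path, which is the kind of query a tool can put in front of a non-engineer.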

He concluded: “Solve the small problems and think about how they fit into the larger framework. Don’t let perfection be the enemy of the good, because if you try to solve the whole thing before you even get started, you will never find a solution.”

