The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Asset Control’s McGranaghan on the Data Vendor’s Partnership with Oracle

Asset Control is one of a number of vendors working with Oracle to benchmark its technology capabilities, with a view to the two companies working closely together to improve the scalability and response times of the EDM solution vendor’s platform. Reference Data Review speaks to Donal McGranaghan, senior vice president of product and development at Asset Control, about how the project is progressing and the risk and regulatory drivers behind the decision to partner with Oracle.

In terms of the ongoing project with Oracle, what stage is it currently at and when is it due to be completed?

We have embarked upon a series of benchmark exercises to look at how the use of Oracle technology can offer choice and cost certainty to members of our client base, and ultimately help clients enhance their businesses. These benchmarks also give clients a level of confidence in comparing different combinations of our respective products to help enhance their ability to reach goals in terms of time to market, scale and performance, and delivery of accurate, accessible and actionable data to business users.

Could you provide more detail about the stages that have been completed thus far?

The first exercise was to get an optimally tuned ‘baseline’ on the latest Oracle 11g platform based on our own test environment. This gives us some figures that we can use in talking to customers and prospects in terms of what is achievable with the core offerings.

The second exercise involves repeating the tests on the Oracle Exadata appliance. This does two things: first, it allows us and our customers to explore the characteristics and strengths of this particular platform; second, it gives us useful ‘compare and contrast’ metrics for helping our mutual clients make their sizing and performance decisions.

The third phase is a full-blown benchmarking exercise with a mutual client, from which we will come out with a series of configuration matrices that will allow that client to make appropriate decisions, but which will also give other clients and prospects a ‘ready reckoner’ with which to start to explore confidently what options are available and their various characteristics. As part of that exercise, we will also be exploring the in-memory database (IMDB) option from Oracle to see what sort of accelerator and capacity options it would give us and our clients.

Have there been any challenges?

The first two exercises ran very smoothly. We have had great help from the Oracle specialists and we are excited to see the initial Exadata results in particular.

What has been the driving force behind the work?

Data volumes are getting higher, time windows are getting smaller, and access demands are getting more complex due to new regulations, risk initiatives and liquidity fragmentation – all of which require new technology, innovation and the ability to keep up with future requirements. Firms are seeking faster time to market, increased revenues, stronger competitive differentiation, enhanced risk management and the ability to keep pace with rapidly changing and complex markets, and data scale and performance are key to achieving these goals.

In addition, the evolution curve in data management is also changing. Where once the focus was on getting to a ‘golden copy’ quickly and concentrating on securities of interest, for trading for example, now clients are looking to monitor more than their particular securities. They are monitoring and requiring information on a wider range of items, and taking advantage of features in our products such as ‘bi-temporal’ data management, or the ability to roll data forward or back to determine an ‘as of’ position of the data in time. They also want to be able to interrogate, view and direct data at all stages of the lifecycle, not just as finished golden copy. All of this demands flexibility, power and an understanding of the levers involved, as well as the options available.
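To make the ‘bi-temporal’ idea concrete: each version of a fact carries two time axes, when it was true in the market (valid time) and when the system learned it (transaction time), so an ‘as of’ query can roll the view back to any past point. The sketch below is a minimal, hypothetical illustration of that general technique in Python; it is not Asset Control’s implementation, and all names in it are invented for this example.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class PriceVersion:
    """One version of an instrument's price, with two time axes."""
    instrument: str
    price: float
    valid_from: date             # when the price applies in the market
    valid_to: Optional[date]     # None = still valid
    recorded_from: date          # when this version entered the system
    recorded_to: Optional[date]  # None = current system knowledge

def as_of(history, valid_on, known_on):
    """Return the price valid on `valid_on`, as the system knew it on
    `known_on` -- i.e. roll the data back to a past point in time."""
    for v in history:
        valid = v.valid_from <= valid_on and (v.valid_to is None or valid_on < v.valid_to)
        known = v.recorded_from <= known_on and (v.recorded_to is None or known_on < v.recorded_to)
        if valid and known:
            return v.price
    return None  # no version was both valid and known at those dates

# A price of 100 recorded on 2 Jan, then corrected to 101 on 5 Jan.
# The old version is closed off in transaction time, not deleted.
history = [
    PriceVersion("XYZ", 100.0, date(2024, 1, 1), None, date(2024, 1, 2), date(2024, 1, 5)),
    PriceVersion("XYZ", 101.0, date(2024, 1, 1), None, date(2024, 1, 5), None),
]

print(as_of(history, date(2024, 1, 3), date(2024, 1, 3)))  # 100.0 (before the correction was known)
print(as_of(history, date(2024, 1, 3), date(2024, 1, 6)))  # 101.0 (after the correction was known)
```

The design point is that corrections never overwrite history: closing a version off in transaction time preserves exactly what the firm knew on any given day, which is what regulators and risk teams ask for.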

What tangible benefits will Asset Control customers experience as a result of the work?

Customers will have enhanced access to accurate, accessible and actionable data to drive their businesses.

By starting with clear baseline metrics, we can establish clear milestones to improve upon – targets that are key to helping clients achieve ambitious business goals. With Oracle’s help, we can demonstrate what optimisation on the current platform can achieve, and then we can show a comprehensive set of progressively sophisticated solutions that will be matched with various factors including number of instruments, type of access, data quality needs, cleansing requirements, rule sophistication, historic and future profiling, and so on. This will help us serve our clients better in terms of getting the optimum environment and configuration to suit their current and growing needs.

Given that Oracle is working on similar projects with other data management solution vendors, will first to market provide an advantage?

I think that all of the data management vendors have characteristics in their products that they use to differentiate and attract clients; therefore, it is good practice always to try and show those characteristics in their best light with partners who form part of the solution for a given client. Not having this information is a disadvantage, and therefore it is good that we keep abreast of our solution partners’ evolving product portfolios too, and they ours.

How will this impact your competitive positioning and how does it sit within Asset Control’s overall plans for the year?

The business implications of data and the needs of decision makers are really what are driving Asset Control’s plans for the year. We’re focusing on scalability, extensibility and lower total cost of ownership while increasing speed, integration and reliability. Streamlined implementations and system throughput and performance are crucial.

This project enhances our ability to show our products’ capabilities to those clients who are currently or potentially Oracle clients too. More generally, it also reveals the multi-dimensional aspect of planning for data management that relies on a series of profiling characteristics that can be answered by a range of sophisticated solutions and partnerships. This is a key strand in our product plans this year as we explore the concept of choice for clients arising from flexible functionality, awareness of capability and a solutions approach. One size does not fit all, but everybody needs what’s right for them. We have embarked this year on exploring options around database structures and configurations, and these exercises are a critical step in achieving this.
