
Choosing an Approach to Analytics: Is a Single Technology Platform the Right Investment?

By Abhishek Bhattacharya, technology practice lead at Sapient Global Markets

Effective analytics provide insight into what happened, why it happened and what is likely to happen in the future, along with the factors that could help shape different outcomes. But when it comes to the ‘how’ of analytics – including which technology platform or platforms will be used to support them – there is far less clarity. Building an analytics capability presents fundamental challenges, among them weighing the pros and cons of investing in an all-encompassing technology platform.

Although technology platforms for analytics are the focus of this article, it would be a disservice to readers not to acknowledge that technology represents only one piece of the puzzle. When it comes to building an analytics capability, the real complexity lies not in the technology, but in the business case and the supporting analytics models.

To be successful, every analytics initiative must start with a clear understanding of appropriate business cases. How and where will the analytics be used? What are the critical performance indicators and/or business questions that must be measured and analysed? From there, sources of data must be identified and the data modelled for the analytics engine. Analytical models, also known as quantitative models, provide sophisticated analytics, including those that can help predict or optimise outcomes. With these models in place, the next challenge is ensuring the quality of the data that enters the analytics engine. The phrase ‘garbage in, garbage out’ applies here: if the data is of sub-par quality, the output will be too, and business stakeholders will lose trust in the analytics.
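To make the ‘garbage in, garbage out’ point concrete, below is a minimal sketch of a quality gate that might sit in front of an analytics engine. It is illustrative only: the column names (trade_id, notional, trade_date), the feed file and the thresholds are hypothetical assumptions, not drawn from any particular platform.

```python
import pandas as pd

# Hypothetical quality rules for an incoming trade feed; the column
# names and thresholds are illustrative assumptions only.
REQUIRED_COLUMNS = ["trade_id", "notional", "trade_date"]
MAX_NULL_FRACTION = 0.01  # reject feeds with more than 1% missing values

def check_quality(df: pd.DataFrame) -> list[str]:
    """Return a list of data quality issues; an empty list means the feed passes."""
    issues = []
    for col in REQUIRED_COLUMNS:
        if col not in df.columns:
            issues.append(f"missing column: {col}")
        elif df[col].isna().mean() > MAX_NULL_FRACTION:
            issues.append(f"too many nulls in {col}")
    if "trade_id" in df.columns and df["trade_id"].duplicated().any():
        issues.append("duplicate trade_id values")
    if "notional" in df.columns and (df["notional"] <= 0).any():
        issues.append("non-positive notionals")
    return issues

# "trades.csv" is a stand-in for whatever source feeds the engine.
feed = pd.read_csv("trades.csv", parse_dates=["trade_date"])
problems = check_quality(feed)
if problems:
    # Quarantine rather than load: sub-par input produces sub-par analytics.
    raise ValueError(f"feed rejected: {problems}")
```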

All of these steps must be executed for every new business problem and all must be technology agnostic. Together, they represent about 80% of the effort within any analytics initiative, with the remaining 20% focused on technology, essentially choosing a platform, creating a production and testing environment, and conducting performance testing and tuning. The core steps of the effort should help inform technology decisions, one of which is whether to build a one-size-fits-all platform or develop a series of platforms, each designed for a specific requirement or set of requirements.

Come One, Come All?

Opting for a single enterprise analytics platform can seem like a logical decision, a way to ensure consistency and cost-effectiveness across all of an organisation’s analytics initiatives. Yet an all-encompassing platform is unlikely to succeed for a number of reasons.

The first reason is the sheer diversity of analytics needs. An organisation may be able to address its current range of needs, but it can be difficult, if not impossible, to anticipate all of the possible types of analytics it will need in future. Building a platform around all of those theoretical needs would come at a very high financial cost.

Second, a one-size-fits-all platform could cost the organisation in terms of opportunity. As the user community becomes more proficient in analytics, it will ask for advanced features and capabilities. In most cases, this involves an evolution from descriptive ‘what happened’ to predictive ‘what is likely to happen’ and prescriptive ‘how can we increase the likelihood of our desired outcomes’ analytics. Building incrementally as these needs arise is a much more palatable solution. An incremental approach also leaves open the opportunity to tap into ongoing innovation. Technology platforms represent a fast-moving, ever-changing landscape, where committing to a single stack can cost an organisation the chance to leverage something newer and better.
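To make that descriptive-to-predictive progression concrete, here is a minimal sketch in Python: the same daily volume series is first summarised (what happened), then extrapolated with a simple least-squares trend (what is likely to happen). The data and the choice of a linear trend are illustrative assumptions only.

```python
import numpy as np

# Hypothetical daily trade volumes; in practice these would come from
# the organisation's data layer.
volumes = np.array([120, 135, 128, 150, 162, 158, 171, 180], dtype=float)

# Descriptive: summarise what happened.
print(f"mean={volumes.mean():.1f}, max={volumes.max():.0f}")

# Predictive: fit a simple linear trend and project the next day.
days = np.arange(len(volumes))
slope, intercept = np.polyfit(days, volumes, deg=1)
forecast = slope * len(volumes) + intercept
print(f"projected next-day volume: {forecast:.0f}")

# Prescriptive analytics would go a step further, e.g. searching over
# controllable inputs for the action that optimises the forecast.
```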

Finally, the advent of cloud computing has revolutionised the way an analytics environment can be set up and a technology platform built. The cloud has significantly reduced the setup work involved, making it possible to have an environment up and running in days, if not hours. This affords real flexibility and provides the ability to grow a platform to support additional users and new types of analytics. It also makes it easy to start small and build credibility and momentum over time.
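As an illustration of how little setup is now required, the sketch below uses the AWS SDK for Python (boto3) to create an S3 bucket as the landing zone for a new analytics environment. The bucket name, region and object key are hypothetical, and a real deployment would add IAM permissions, encryption and lifecycle policies.

```python
import boto3

# Hypothetical names; a real deployment would also configure IAM,
# encryption and lifecycle rules.
BUCKET = "example-analytics-landing-zone"
REGION = "eu-west-2"

s3 = boto3.client("s3", region_name=REGION)
s3.create_bucket(
    Bucket=BUCKET,
    CreateBucketConfiguration={"LocationConstraint": REGION},
)
# Load a first raw file into the landing zone (placeholder content).
s3.put_object(Bucket=BUCKET, Key="raw/trades/day1.csv", Body=b"...")
print(f"landing zone s3://{BUCKET} is ready")
```

A handful of calls like these replace what once meant weeks of procurement and installation, and the same pattern extends to provisioning query engines or managed Hadoop clusters.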

For all these reasons, building a single enterprise platform for analytics is probably ill-advised. However, the opposite approach – building each component individually and then harmonising the components – can be equally expensive and ineffective, with much of the cost consumed by integration and data movement between components.

Striking the Right Balance

If neither an all-encompassing platform nor a conglomeration of platforms is the right approach, how should organisations proceed? The key is to strike a balance between building everything and building the bare minimum. Ideally, such an approach would yield an all-encompassing architecture (see Figure 1) that:

  • Embraces layers: Rather than focusing on the platform specifically, think in terms of the layers of any analytics capability. An effective architecture will include layers for data, data ingestion and business intelligence. Compared with traditional techniques, this approach affords much more flexibility over time; when a traditional star schema is used, everything is driven out of the schema, making it difficult to evolve the platform as analytics needs change.
  • Offers components within each layer: For greater effectiveness and agility, each layer should be built with modular components. The data layer must provide the ability to manage structured, analytical, unstructured and streaming data. The data ingestion layer should have modules for master data management, extract, transform and load (ETL), and data quality management. The business intelligence layer must offer self-service, various types of analytics, visualisation capabilities and support for multiple device types.
  • Is designed for evolution: It is important to build an architecture that can easily accommodate change. As part of that, work to build an understanding of the dimensions of potential change, such as business problems and types of quantitative models. By understanding the aspects likely to change, you can identify appropriate components and technologies at each layer. Fortunately, modern technologies, from columnar databases to the Hadoop open source software framework, are inherently flexible and do not force every part of the solution to be tied to a specific quantitative model or schema. A sketch of this layered, component-based design follows this list.
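The sketch below illustrates the layered, component-based idea in Python: each layer is defined as an interface with swappable implementations, so a component such as the storage engine can be replaced without disturbing the ingestion or business intelligence layers. All class and method names are illustrative assumptions, not a reference design.

```python
from typing import Protocol

class DataLayer(Protocol):
    """Storage component: could be a columnar store, Hadoop, a document DB..."""
    def write(self, dataset: str, rows: list[dict]) -> None: ...
    def read(self, dataset: str) -> list[dict]: ...

class IngestionLayer(Protocol):
    """Ingestion component: ETL, master data management, quality checks."""
    def ingest(self, source: str, target: DataLayer) -> None: ...

class BILayer(Protocol):
    """Business intelligence component: self-service queries, visualisation."""
    def query(self, dataset: str, store: DataLayer) -> list[dict]: ...

class InMemoryStore:
    """Trivial DataLayer implementation, used here only to show the seams."""
    def __init__(self) -> None:
        self._data: dict[str, list[dict]] = {}
    def write(self, dataset: str, rows: list[dict]) -> None:
        self._data.setdefault(dataset, []).extend(rows)
    def read(self, dataset: str) -> list[dict]:
        return self._data.get(dataset, [])

# Because each layer depends only on the interface, swapping InMemoryStore
# for a columnar database changes no code in the other layers.
store: DataLayer = InMemoryStore()
store.write("positions", [{"book": "rates", "pv": 1.2e6}])
print(store.read("positions"))
```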

Think Big, Start Small

Every successful analytics initiative will be built upon a sound framework that includes identifying business value, building models, sourcing data and, ultimately, driving adoption. At every step, technology is a critical enabler, but it should not be the central focus, nor should it be a barrier. With many analytics technologies now available on the cloud, it is possible to get started with very little upfront capital cost. Using Amazon Web Services, Azure and other cloud services, an organisation can begin to build an all-encompassing architecture, starting small, building something and showing value to the business before making substantial investments.
