The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Choosing an Approach to Analytics: Is a Single Technology Platform the Right Investment?

By Abhishek Bhattacharya, technology practice lead at Sapient Global Markets

Effective analytics provide insight into what happened, why it happened and what is likely to happen in the future, and include the factors that could help shape different outcomes. But when it comes to the ‘how’ of analytics – including which technology platform or platforms will support them – there is less clarity. Building an analytics capability presents fundamental challenges, among them weighing the pros and cons of investing in an all-encompassing technology platform.

Although technology platforms for analytics are the focus of this article, it would be a disservice to readers not to acknowledge that technology represents only a piece of the picture. When it comes to building an analytics capability, the real complexity lies not in the technology, but in the business case and the supporting analytics models.

To be successful, every analytics initiative must start with a clear understanding of the appropriate business cases. How and where will the analytics be used? What are the critical performance indicators and/or business questions that must be measured and analysed? From there, sources of data must be identified and the data must be modelled for the analytics engine. Analytical models, also known as quantitative models, provide sophisticated analytics, including analytics that can help predict or optimise outcomes. With these models in place, the next challenge is ensuring the quality of the data that enters the analytics engine. The phrase ‘garbage in, garbage out’ applies here: if the data is of sub-par quality, the output will be too, and business stakeholders will lose trust in the analytics.
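The quality gate described above can be sketched as a validation step that sits in front of the analytics engine. This is a minimal, hypothetical illustration – the field names (`price`, `volume`) and rules are assumptions for the example, not any specific platform's API:

```python
def validate_record(record):
    """Return a list of quality issues for one record; empty means clean."""
    issues = []
    if record.get("price") is None or record["price"] <= 0:
        issues.append("price must be a positive number")
    if record.get("volume") is None or record["volume"] < 0:
        issues.append("volume must be non-negative")
    return issues


def filter_clean(records):
    """Split records into those fit for the analytics engine and rejects."""
    clean, rejected = [], []
    for record in records:
        # Route each record based on whether any quality rule failed
        (clean if not validate_record(record) else rejected).append(record)
    return clean, rejected
```

Rejected records would typically be quarantined and reported back to the data owners, so that trust in the downstream analytics is preserved rather than silently eroded.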

All of these steps must be executed for every new business problem and all must be technology agnostic. Together, they represent about 80% of the effort within any analytics initiative, with the remaining 20% focused on technology, essentially choosing a platform, creating a production and testing environment, and conducting performance testing and tuning. The core steps of the effort should help inform technology decisions, one of which is whether to build a one-size-fits-all platform or develop a series of platforms, each designed for a specific requirement or set of requirements.

Come One, Come All?

Opting for a single enterprise analytics platform can seem like a logical decision, a way to ensure consistency and cost-effectiveness across all of an organisation’s analytics initiatives. Yet an all-encompassing platform is unlikely to succeed for a number of reasons.

The first reason is the sheer diversity of analytics needs. An organisation may be able to address its current range of needs, but it can be difficult, if not impossible, to anticipate all of the types of analytics it will need in the future. Building a platform around all of those theoretical needs would also come at a very high financial cost.

Second, a one-size-fits-all platform could cost the organisation in terms of opportunity. As the user community becomes more proficient in analytics, it will ask for advanced features and capabilities. In most cases, this involves an evolution from descriptive ‘what happened’ to predictive ‘what is likely to happen’ and prescriptive ‘how can we increase the likelihood of our desired outcomes’ analytics. Building incrementally as these needs arise is a much more palatable solution. An incremental approach also leaves open the opportunity to tap into ongoing innovation. Technology platforms represent a fast-moving, ever-changing landscape, where committing to a single stack can cost an organisation the chance to leverage something newer and better.
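The descriptive-to-predictive progression mentioned above can be illustrated with a toy example: a descriptive metric summarises past observations, while a predictive one extrapolates a trend. This sketch uses a simple least-squares line; real platforms use far richer quantitative models:

```python
from statistics import mean


def describe(values):
    """Descriptive: what happened (the average of past observations)."""
    return mean(values)


def predict_next(values):
    """Predictive: what is likely to happen next, via a least-squares trend."""
    n = len(values)
    xs = range(n)
    x_bar, y_bar = mean(xs), mean(values)
    # Slope of the best-fit line through (index, value) pairs
    slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, values))
             / sum((x - x_bar) ** 2 for x in xs))
    # Extrapolate one step beyond the observed series
    return y_bar + slope * (n - x_bar)
```

A prescriptive layer would go one step further, searching over the inputs the business controls to find the settings that make the desired outcome most likely.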

Finally, the advent of cloud computing has revolutionised the way an analytics environment can be set up and a technology platform built. The cloud has significantly reduced the work, making it possible to have an environment up and running in days, if not hours. This affords real flexibility and provides the ability to grow a platform to support additional users and new types of analytics. It also makes it easy to start small and build credibility and momentum over time.

For all these reasons, building an enterprise platform for analytics is probably not well advised. However, the opposite approach, building each component individually and then harmonising the components, can be equally expensive and ineffective; much of the cost is sunk into integration and data movement.

Striking the Right Balance

If neither an all-encompassing platform nor a conglomeration of platforms is the right approach, how should organisations proceed? The key is to strike a balance between building everything and building the bare minimum. Ideally, such an approach would yield an all-encompassing architecture (see Figure 1) that:

  • Embraces layers: Rather than focusing on the platform specifically, think in terms of the layers of any analytics capability. An effective architecture will include layers for data, data ingestion and business intelligence. Compared to traditional techniques, this approach affords much more flexibility over time. In particular, when traditional star schemas are used, everything is driven out of the schema, making it difficult to evolve the platform as the need for analytics changes.
  • Offers components within each layer: For greater effectiveness and agility, each layer should be built with modular components. The data layer must provide the ability to manage structured, analytical, unstructured and streaming data. The data ingestion layer should have modules for master data management, extract, transform and load (ETL), and data quality management. The business intelligence layer must offer self-service, various types of analytics, visualisation capabilities and support for multiple device types.
  • Is designed for evolution: It is important to build an architecture that can easily accommodate change. As part of that, work to build an understanding of the dimensions of potential change, such as business problems and types of quantitative models. By understanding the aspects likely to change, you can identify appropriate components and technologies at each layer. Fortunately, modern technologies from columnar databases to the Hadoop open source software framework are inherently flexible and do not force every part of the solution to be tied to a specific quantitative model or schema.
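The layered, component-based design described above can be sketched in outline. All class and method names here are hypothetical and purely illustrative; the point is that each layer exposes pluggable modules, so a single component (say, a quality check in the ingestion layer) can be swapped without touching the layers above or below it:

```python
class DataLayer:
    """Stores records; a real layer would span structured and streaming stores."""
    def __init__(self):
        self.records = []

    def store(self, record):
        self.records.append(record)


class IngestionLayer:
    """Runs modular steps (quality checks, transforms) before storage."""
    def __init__(self, data_layer, steps=None):
        self.data_layer = data_layer
        self.steps = steps or []  # each step: record -> record, or None to reject

    def ingest(self, record):
        for step in self.steps:
            record = step(record)
            if record is None:  # a step rejected the record
                return False
        self.data_layer.store(record)
        return True


class BILayer:
    """Self-service queries over whatever the data layer currently holds."""
    def __init__(self, data_layer):
        self.data_layer = data_layer

    def count(self, predicate=lambda r: True):
        return sum(1 for r in self.data_layer.records if predicate(r))
```

Because each layer depends only on its neighbour's small interface, adding a new ingestion step or a new type of query is a local change – the kind of evolution the architecture is meant to accommodate.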

Think Big, Start Small

Every successful analytics initiative will be built upon a sound framework that includes identifying business value, building models, sourcing data and, ultimately, driving adoption. At every step, technology is a critical enabler, but it should not be the central focus, nor should it be a barrier. With many analytics technologies now available on the cloud, it is possible to get started with very little upfront capital cost. Using Amazon Web Services, Azure and other cloud services, an organisation can begin to build an all-encompassing architecture, starting small, building something and showing value to the business before making substantial investments.
