The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Standard Life Takes Stealthy Approach to Building a Data Warehouse

A stealthy rather than Big Bang approach to data warehousing can meet business requirements in a timely and cost-conscious way, and lay the foundations for a scalable solution, said Jim Shaw, solutions architect at Standard Life, as he presented a case study of data warehouse development at last week’s Financial Information Management (FIMA) Conference in London.

“Enterprise data warehouse projects are typically large, complex, long and expensive. Significant change is required and there needs to be a high degree of senior management buy-in. That is a hard sell,” he said. “So, we reduced the complexity, time and cost, and decided to deliver an enterprise data warehouse in smaller chunks.”

Shaw’s project to build the data warehouse incrementally had to abide by Standard Life rules requiring each project to have its own business case and to be developed in collaboration with external partners.

In response to business requirements, Shaw and his team rolled out the first element of the data warehouse for accounting in the third quarter of 2008. This was driven by a business requirement to view accounting data across all source systems, maintain consistent data formats and provide access to data that had not previously been available to the accounting function.

A second element of the data warehouse, forward pricing, which reduced risk and allowed the movement of funds between insurance administration and investment systems, was rolled out in the fourth quarter of 2009, followed by policy assets, a core enabler of profit generation, in the fourth quarter of 2011. The next tranche of the build will support Solvency II. It will go live in 2013, delivering required balance sheets and optimising the company’s regulatory capital position.

Standard Life predefined the architecture for the data warehouse using Kimball methodology as part of its overall strategy. It then delivered the database solutions using an Oracle database, Ab Initio extract, transform and load (ETL) tools, and Cognos business intelligence tools for data presentation, basing what it could on reusable components such as data models and frameworks.

“The benefits of an incremental approach are productivity, data consistency and a scalable solution, but it is important to stick to strong architectural governance,” said Shaw. “The IT team worked with the business and we could support current business priorities, align with the business and structure for growth. Going forward, we will give the business control using business rules, not code, and the business will lead the extension of the enterprise data warehouse with new business processes, attributes and propositions.”
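The idea of giving the business control "using business rules, not code" typically means holding transformation logic as data that analysts can change without a software release. A minimal sketch of that pattern in Python, with hypothetical rules and field names (this is an illustration, not Standard Life's actual implementation):

```python
def apply_rules(record, rules):
    """Return a copy of record with each matching rule's action applied.

    Each rule is plain data: a condition plus the field and value to set.
    Changing behaviour means editing the rules list, not the engine.
    """
    result = dict(record)
    for rule in rules:
        if rule["condition"](result):
            result[rule["set_field"]] = rule["value"]
    return result

# Hypothetical rules: classify a policy asset record by balance threshold.
rules = [
    {"condition": lambda r: r["balance"] >= 1_000_000,
     "set_field": "tier", "value": "institutional"},
    {"condition": lambda r: r["balance"] < 1_000_000,
     "set_field": "tier", "value": "retail"},
]

classified = apply_rules({"policy_id": "P123", "balance": 2_500_000}, rules)
print(classified["tier"])  # prints "institutional"
```

In a production warehouse the rules would live in a reference table or configuration store rather than in source code, which is what lets the business extend the warehouse with new attributes and propositions without an IT change cycle.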
