As the enterprise data management (EDM) discipline has matured in recent years, there has been a growing focus on the importance of downstream distribution of data. One element of the solution that has been put forward is integration technology to facilitate linkages between data management systems and data-consuming applications.
In line with this emphasis on finding a technology solution for downstream integration, some EDM vendors have partnered with integration specialists – an example is GoldenSource’s tie-up with PolarLake. Others – such as Netik – have emphasised that, as well as data management solutions, they already have integration capabilities within their own product line-ups.
But according to Peter Serenita, chief data officer at JPMorgan Worldwide Securities Services, “it is important to recognise that the integration of downstream users and systems into the EDM solution is a data problem and not a technical problem”.
Speaking at the FIMA event in New York earlier this month, Serenita told delegates that while downstream integration is an absolutely vital component of EDM, “there are plenty of different technologies today to link two systems together”. The real challenge is “aligning the data so that it means the same thing in both the enterprise source and the legacy downstream system”.
By way of illustration, if the source system holds “A” for a data element and a downstream system requires it to be transformed into a “Z”, the potential for problems becomes clear; the difficulty is compounded if yet another legacy system requires the same value to be transformed into a “T”, he explained. “Ultimately a client may want to bring all of this data together and then you need to reconverge the data back to the original standard, in this case the ‘A’. And this is the true challenge of data integration,” he added.
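To make the mapping problem concrete, here is a minimal sketch of the scenario Serenita describes. The system names, field values and mapping tables are invented for illustration only and are not JPMorgan’s actual systems or methodology; the point is simply that every outbound translation needs an exact inverse if the data is ever to be reconverged to the original standard.

```python
# Illustrative sketch: a golden copy value ("A") is translated into each
# legacy system's local dialect ("Z", "T") and must be mapped back when the
# data is brought together again. All names and values here are hypothetical.

# Outbound maps: golden copy value -> legacy system's local value
OUTBOUND = {
    "legacy_settlement": {"A": "Z"},
    "legacy_risk": {"A": "T"},
}

# Inbound maps must be the exact inverse, or reconvergence silently breaks
INBOUND = {
    system: {local: golden for golden, local in mapping.items()}
    for system, mapping in OUTBOUND.items()
}

def to_legacy(system: str, golden_value: str) -> str:
    """Transform a golden copy value into a legacy system's dialect."""
    return OUTBOUND[system][golden_value]

def to_golden(system: str, local_value: str) -> str:
    """Reconverge a legacy value back to the golden copy standard."""
    return INBOUND[system][local_value]

# The round trip must restore the original standard value ("A")
assert to_golden("legacy_risk", to_legacy("legacy_risk", "A")) == "A"
```

The fragility is obvious even in this toy version: each additional legacy dialect adds another pair of maps that must be kept consistent, which is why convergence to a single standard is the preferred outcome.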
Firms have several options for solving this data integration challenge, Serenita suggested. “The first and optimal solution is that all downstream systems conform to the ‘golden copy’ standard,” he said. This would mean there is no need to translate or transform the data: all users and systems would speak the same language, and consolidation of information would also be consistent.
“This is absolutely obtainable during the development of new systems,” he contended. “At JPMC we have a gating process for new systems development that ensures convergence to our ‘golden copy’ standards. When we initiate an effort, we provide some ‘seed money’ to document the request. Then once the requirements are defined, the team must come back to the governance group to get money to do the design, and then finally come back again to get money for the build and implementation. During each one of those gating processes, we review the plans to ensure they are aligned with our strategic direction and our data management programme. This discipline allows us to ensure that our data and technology efforts are focused on delivering quality and value to our clients.”
The problem, of course, is that making multiple legacy systems conform to golden copy data standards will often be a difficult and expensive undertaking. “The optimal solution would be for the legacy system to conform to the ‘gold standard’,” Serenita told FIMA delegates. However, if that is not possible, there are two options for integrating that system with the gold standard, he suggested.
“First is that the golden copy is sent in a standard format to all downstream consumers, and it is the responsibility of the downstream consumers to translate or transform this standard to their own language. The second option is that the centralised group is responsible for the transformation of the gold standard to the local dialects.”
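A brief, hedged sketch of those two options, again using invented system names and field mappings rather than anything Serenita described in detail: in the first, the central group publishes one standard feed and each consumer maintains its own adapter; in the second, the central group owns every per-consumer transformation and publishes pre-translated feeds.

```python
# Option 1: the golden copy is published once in a standard format and each
# downstream consumer owns the translation into its local dialect.
# Option 2: a central group owns every per-consumer transformation and
# publishes a pre-translated feed to each system.
# System names and field mappings below are illustrative only.

GOLDEN_RECORD = {"instrument_id": "XS1234567890", "asset_class": "A"}

# --- Option 1: consumer-side adapter ---------------------------------------
def settlement_adapter(record: dict) -> dict:
    """Translation logic lives with (and is maintained by) the consumer."""
    local = dict(record)
    local["asset_class"] = {"A": "Z"}[local["asset_class"]]
    return local

# --- Option 2: centrally managed transformations ----------------------------
CENTRAL_TRANSFORMS = {
    "legacy_settlement": {"asset_class": {"A": "Z"}},
    "legacy_risk": {"asset_class": {"A": "T"}},
}

def central_publish(record: dict) -> dict:
    """The central group produces one translated copy per consumer,
    retaining an end-to-end view of how the data is transformed."""
    published = {}
    for system, field_maps in CENTRAL_TRANSFORMS.items():
        local = dict(record)
        for field, mapping in field_maps.items():
            local[field] = mapping[local[field]]
        published[system] = local
    return published

print(settlement_adapter(GOLDEN_RECORD))  # option 1: consumer translates
print(central_publish(GOLDEN_RECORD))     # option 2: centre translates
```

The trade-off is the one Serenita points to: option 1 decentralises the maintenance burden, while option 2 keeps the transformations, and therefore the end-to-end view of the data, under central control.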
Each of these options has its advantages and disadvantages, he said, but the key decision points are around the firm’s ability to centrally manage the transformations and provide an end-to-end view of the data as it is consumed by the different portions of the organisation. “At JPMorgan we have a very active data management practice that is responsible for the integration and quality of data across the Worldwide Securities Services organisation,” he added.
Not only is downstream integration a data activity rather than a purely technical one, but data management does not end at the golden copy, Serenita told FIMA delegates. “Without integrating this data into the business processes and systems downstream in the business, the golden copy is useless,” he said. But there are multiple options for data integration. The best is the convergence of all systems to the standard, which is possible for new system builds but harder to achieve for legacy systems. “Short of convergence to the gold standard, the choice you make will depend on your corporate culture and your ability to centralise or decentralise the responsibility of transformations,” he added.