Data warehousing vendor Teradata has thrown its hat into the ring to act as a key technology partner to the US Office of Financial Research (OFR). Dilip Krishna, vice president of financial services at the vendor, testified during last week’s government-organised roundtable on systemic risk about the benefits of a big data approach to the task. He discussed the unsuccessful attempt by Teradata and a number of other IT firms, during the shaping of Dodd-Frank, to lobby the US government for several amendments to the structure of the OFR related to its IT strategy and direction, and once again reiterated the benefits of those changes.
These proposed amendments to the OFR’s strategic framework are also contained in a recent data bill tabled by chairman Darrell Issa, who is also a big supporter of XBRL. They include fast-tracking the process of establishing an IT framework and setting a definitive timeline for the work to be conducted. At the moment, there is some degree of uncertainty in the market about the order in which the Treasury agency will tackle items beyond legal entity identification, as well as concern about the delayed appointment process for the director of the OFR.
Despite his proposed amendments, Krishna is supportive of the aims of the OFR, unlike fellow speaker at the event, academic and author Nassim Nicholas Taleb, who suggested that such an endeavour could prove damaging to the financial markets as a whole. Krishna, by contrast, reckons the OFR is “a critical component” of making the financial system safer and is keen for the new agency to begin its task by “streamlining federal IT systems and harmonising (the data) procurement processes.”
He suggested that the OFR could learn some lessons from the financial institutions it is seeking to oversee by examining successful implementations of enterprise data management (EDM) platforms (some of which Teradata will have had a hand in, of course). The OFR’s own data centre, which has been tasked with establishing a data foundation on which to build systemic risk monitoring capabilities, should therefore look to the data warehouses of some of the larger financial institutions and Teradata’s own experience in this space, Krishna argued.
“Several financial institutions have, for their own risk management and financial reporting purposes, developed data repositories similar to that envisioned for the OFR,” he said. “The common principle employed by the most successful of such efforts is to ‘think big but start small’. They combine an ambitious long term agenda with a small, well scoped initial phase of the programme that is targeted to deliver a specific need.”
It is this “tightly scoped” vision for the data centre that Krishna is keen to see added to the OFR’s strategic plan, ideally with some involvement of his own firm (given how much he stressed Teradata’s long history and experience in the market, the vendor is clearly keen to get in on the action at the coalface). By setting the agenda at the outset, he suggested, the Treasury agency could “quickly become useful” by setting the reference data standards on which technology capabilities can be built. He suggests that existing reference data standards provided by the vendor community be used to this end in the immediate term, with a longer term goal of establishing a “single authoritative source of reference data.”
Much like Richard Berner, counsellor to US Treasury secretary Tim Geithner, and John Liechty, professor of marketing and statistics at Penn State University’s Smeal College of Business and director of the Centre for the Study of Global Financial Stability (and, for that matter, most of the OFR’s key champions), Krishna is convinced that the OFR’s standards work will reap risk management benefits for the industry as a whole. Moreover, like Liechty, he suggested that gathering detailed position and transaction data on a periodic basis will be a much harder task to tackle than reference data.
Krishna acknowledged that there are “many barriers to perfectly standardising position and transaction data across all the major systemically important financial institutions,” but said these are not insurmountable. He said: “None of these barriers, in our opinion, are formidable enough to prevent the Office from using what is available for gross systemic risk computations. In fact, using position and transaction data for risk analysis will act as a catalyst for improving the quality of such data over time.” It will be interesting to see whether the industry agrees with this optimistic sentiment.
He ended his speech with a reference to the “age of big data,” a theme that Teradata and many other vendors have adopted to describe their data management offerings, which are often focused on risk management. The idea is that these solutions can support vast quantities of data and fast risk analytics calculations. This is a topic that will, no doubt, be discussed at length at A-Team’s upcoming Data Management for Risk, Analytics and Valuations conference in London on 17 October.