Trillium Software has released a new version of its enterprise data quality solution, Trillium Software System, which it says is aimed at accelerating the design, development and deployment of real-time data quality programmes across an organisation. The development of version 12 of the solution was prompted by client requirements for faster and more accurate data to meet governance, risk and compliance legislation, explains Ed Wrazen, vice president of product development and strategy at the enterprise data quality vendor.
“Financial authorities are demanding tighter compliance with governance, risk and compliance (GRC) legislation, such as Basel II. As banks work feverishly to become compliant, they will grapple with a major challenge: how can they ensure that their information and risk calculations are based on verified facts?” says Wrazen. “How can they optimise their transparency and understand the data that is at the core of their key performance indicators? Does data meaning change within different contexts? How can they make sure data is provably correct for its intended purpose? Senior management and board members need to be 100% confident in the data behind the disclosures they make.”
This is where the vendor’s solution comes into play, according to Wrazen, by providing high volume data profiling and discovery, data cleansing, and data quality dashboards and reporting. “They need to have data intelligence and governance programmes in place right across the institution and they need to develop them and deploy them quickly. The Trillium Software System v12 is designed to be the technology component of this process; accelerating design, development and deployment of good data governance regimes,” he elaborates.
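The high-volume profiling and discovery Wrazen describes can be illustrated with a minimal sketch. This is not Trillium's actual engine or API, just a generic example of column profiling: measuring completeness, cardinality and recurring value patterns, the kind of raw statistics a discovery tool surfaces before cleansing begins.

```python
from collections import Counter

def profile_column(values):
    """Summarise one column: completeness, cardinality and common patterns."""
    total = len(values)
    non_null = [v for v in values if v not in (None, "")]

    # Map each value to a crude character-class pattern, e.g. "99-99-99",
    # so differently formatted entries stand out during discovery.
    def pattern(v):
        return "".join("9" if c.isdigit() else "A" if c.isalpha() else c
                       for c in str(v))

    patterns = Counter(pattern(v) for v in non_null)
    return {
        "completeness": len(non_null) / total if total else 0.0,
        "distinct": len(set(non_null)),
        "top_patterns": patterns.most_common(3),
    }

# Profiling a column of UK-style sort codes with one missing entry
stats = profile_column(["20-00-00", "40-47-84", None, "09-01-28"])
print(stats["completeness"])  # 0.75
print(stats["top_patterns"])  # [('99-99-99', 3)]
```

A real profiler runs this kind of analysis across millions of rows and many columns at once; the point here is only the shape of the output a governance team would then act on.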
The vendor has added a new cross-enterprise platform to the solution, which it has dubbed ActiveEnterprise Resources, aimed at accelerating data quality projects. “ActiveEnterprise Resources serves to accelerate the development, deployment and management of enterprise real-time data quality projects and services in high availability production environments,” claims Wrazen. “Users can rapidly configure and deploy real-time data quality processes and integrate them into enterprise applications. Administrators can now control and monitor data quality servers and services across the institution. Administrators and developers can also test their data quality projects and rules for real-time initiatives.”
The platform includes: the project deployment manager tool, for the configuration, deployment and integration of data quality projects; the director system manager, a network-based tool that enables administrators to control and monitor data quality servers; and the cleansing rules analyser, for the testing of quality projects and rules by developers and administrators.
The software also includes new data quality dashboard and reporting capabilities, designed to help data management teams translate data quality results into business terms and metrics for a clearer understanding of their impact on business operations. Scorecards and reports are available for distribution through a browser-based presentation layer in the form of charts, ratings and graphs, says the vendor.
“Users can create even more striking charts, ratings and graphs that visually illustrate issues so that they are easily communicated and understood,” says Wrazen. “V12 allows metrics to be associated with specific business functions, providing improved levels of context for information governance. In other words, these improved data quality metrics – placed in business context – allow institutions to quickly identify where any decline in data quality levels poses a threat to the bank’s performance or threatens to dent the reliability of information behind an executive’s declaration of compliance.”
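The idea of metrics placed in business context can be sketched in a few lines. The code below is a hypothetical illustration, not Trillium's scorecard engine: it converts pass-rates per data quality dimension into the kind of traffic-light ratings a dashboard might display, with thresholds chosen arbitrarily for the example.

```python
def scorecard(results, thresholds=(0.99, 0.95)):
    """Convert pass/total counts per dimension into traffic-light ratings."""
    green, amber = thresholds
    rated = {}
    for dimension, (passed, total) in results.items():
        rate = passed / total if total else 0.0
        rating = "green" if rate >= green else "amber" if rate >= amber else "red"
        rated[dimension] = (round(rate, 4), rating)
    return rated

# Pass/total counts per dimension, e.g. from an overnight profiling run
print(scorecard({
    "completeness": (9980, 10000),  # 99.8% -> green
    "validity":     (9600, 10000),  # 96.0% -> amber
    "consistency":  (9100, 10000),  # 91.0% -> red
}))
```

Associating each dimension with a business function (as v12 reportedly does) is then a matter of labelling: a red "consistency" rating on counterparty data flags a concrete risk to, say, a Basel II exposure calculation rather than an abstract statistic.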
The upgraded solution features a new business rules library focused on product, customer, supplier and financial data. “The business rules library empowers users to easily design, deploy, manage and share user defined business rules across multiple data sources and systems from one location. This library enhances support for data governance and the enforcement of data quality corporate standards,” explains Wrazen.
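A shared rules library of this kind can be sketched as a central registry of named, user-defined checks that any data source can draw on. The rule names, field names and the registry mechanism below are all hypothetical illustrations, not Trillium's implementation.

```python
# A central registry of named, reusable validation rules
RULES = {}

def rule(name):
    """Decorator registering a user-defined rule under a shared name."""
    def register(fn):
        RULES[name] = fn
        return fn
    return register

@rule("iban_length")
def iban_length(record):
    # UK IBANs are 22 characters; "iban" is a hypothetical field name
    return len(record.get("iban", "")) == 22

@rule("positive_exposure")
def positive_exposure(record):
    return record.get("exposure", 0) >= 0

def check(record, rule_names):
    """Apply a chosen subset of the shared library to one record."""
    return {name: RULES[name](record) for name in rule_names}

result = check({"iban": "GB29NWBK60161331926819", "exposure": -5},
               ["iban_length", "positive_exposure"])
print(result)  # {'iban_length': True, 'positive_exposure': False}
```

Because every system references the same named rules, a corporate standard only has to be defined and corrected in one place, which is the governance benefit Wrazen describes.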
The solution's user interface has also been upgraded, adding the ability to create language-specific screens along with more help and context-sensitive display information for users.
The release is part of the vendor’s strategic product roadmap, says Wrazen, which involves the release of a new version of the solution every 12 months and an interim release if required. The development was a collaborative process with the vendor’s customers, he explains. “We did work with some key customers in financial services, particularly in the context of regulatory risk compliance such as Basel II. This fed directly into our product strategy and development plans.”
He continues: “We regularly solicit customer feedback across all industries through a number of channels and also have a strategic customer advisory forum. Customer feedback, support and satisfaction are the primary drivers for the improvements we make to our products. We have also conducted several focus groups with key customers and industry analysts during the course of the year to showcase our plans and development and get further ideas and recommendations for how we can improve our products.”
Wrazen is confident that Trillium’s offering is unique enough to separate it from others in the market. “Organisations are moving away from point data quality solutions that only support a particular process, type of data, technology or requirement. They are moving towards solutions that can scale to, and support, multiple technologies, applications and data domains such as customer, product, supplier, financial and so on. Also, international data support is important, particularly for large global banks, which are increasingly centralising processing of customer data. Trillium Software provides the capability and functionality for managing data quality across the whole enterprise, unlike many point solutions,” he elaborates.
Despite budget cutting within many financial institutions, Wrazen reckons the current market environment has proved beneficial for the vendor. “We are seeing considerable growth in the data quality market for several reasons. Institutions are seeking to increase customer value and improve returns on existing assets – and, in the current climate, this is driving a lot of change within financial services. And regulatory compliance and governance is high on every institution’s strategic agenda too. These aims can be effectively served by senior decision makers only if the information that supports them is consistent, accurate and timely.”
When it comes to regulatory compliance, given the high profile problems of the past, there is now strong awareness that good data quality will not only get you out of trouble but, if data is managed as an asset, maintained and improved, will also generate more value for your business, contends Wrazen. “Even in the current economic climate and these times of uncertainty, it’s absolutely clear that information must be high in quality if organisations are to deliver better value from existing systems and applications. Because data quality management and improvement offers a very high ROI and can be tied into many business goals and initiatives, organisations are getting the necessary funding to support its implementation,” he concludes.