About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Taking the Risk Out of Mutual Fund Compliance By Jeff Levering, Vice President, NewRiver


Selling a mutual fund requires complex fund information to be summarised into data that sellers can understand, a process most often driven by a technology or data warehouse infrastructure. Missing or bad mutual fund data is nothing new, but it has recently garnered the attention of regulators because it was directly responsible for creating the well-known mutual fund “breakpoint” issue. As a result, sellers of mutual funds are now asking what they can do to better understand the fundamentals of mutual fund data and how to identify the risks should these issues occur again.

The breakpoint issue related to investors being overcharged commissions because some brokerage and other firms did not disclose or apply detailed fund policies, such as “rights of accumulation” or “letter of intent”, when selling a fund. While committees were formed and industry groups opined, the net result is that very little has changed: the breakpoint issue is alive and well today.

The problem is, at one level, simple. Mutual fund sales are driven by technology that relies on fund policies, such as “rights of accumulation” rules, being summarised as data points for the technology to work. Firms offering funds may sell billions of dollars of funds every week, and hold considerably more on their books, yet frequently they pay scant attention to the accuracy of their mutual fund reference data. Reference data is the critical link: it electronically represents a number of key attributes including investors, intermediaries, issuers, products and prices. With reference data comprising the majority of the data content in trades, firms are realising that ensuring its accuracy is no longer an issue that can be avoided.

Even when firms believe they have the reference data issue under control, random audits comparing the mutual fund data held by distributors with the funds’ own rules have found that a disturbing accuracy problem still exists. For many firms, discrepancies arise because a fund has one policy as disclosed to the SEC while its distributors use an entirely different one. The best source of mutual fund information – its pricing and other policies – is each fund’s prospectus and its more detailed statement of additional information (SAI). Every single fund is required to file its prospectus and SAI with only one place: the Securities and Exchange Commission.
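To see how a policy like “rights of accumulation” becomes a data point that trading technology depends on, consider a minimal sketch of a breakpoint lookup. The tier amounts and sales-charge rates below are purely illustrative assumptions, not drawn from any real fund’s prospectus:

```python
# Hypothetical breakpoint schedule for a front-end-loaded fund:
# (minimum qualifying amount, sales charge rate). Illustrative only.
BREAKPOINTS = [
    (0, 0.0575),         # under $50,000: 5.75%
    (50_000, 0.0450),
    (100_000, 0.0350),
    (250_000, 0.0250),
    (500_000, 0.0200),
    (1_000_000, 0.0000), # $1m and above: no front-end load
]

def sales_charge(purchase: float, existing_holdings: float = 0.0) -> float:
    """Return the front-end load rate for a purchase.

    Under a "rights of accumulation" policy, the investor's existing
    holdings in the fund family count toward the breakpoint, so the
    rate is chosen by the combined total, not the purchase alone.
    """
    qualifying = purchase + existing_holdings
    rate = BREAKPOINTS[0][1]
    for threshold, tier_rate in BREAKPOINTS:
        if qualifying >= threshold:
            rate = tier_rate
    return rate

# A $30,000 purchase alone pays the top rate...
print(sales_charge(30_000))                            # 0.0575
# ...but with $80,000 already held, rights of accumulation
# qualify the investor for the $100,000 breakpoint.
print(sales_charge(30_000, existing_holdings=80_000))  # 0.035
```

If a distributor’s reference data omits the rights-of-accumulation flag, the second call above is computed like the first, and the investor is overcharged: that, in miniature, is the breakpoint issue.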
Initially required in a paper-based format, these documents have, since the late 1990s, been required by the SEC to be filed electronically to the Electronic Data Gathering, Analysis, and Retrieval (EDGAR) system. However, without an easy way to extract the information from these electronic prospectuses, brokerage firms, industry associations and vendors have been devising their own ways to build repositories of prospectus-like information for their trading and information technologies. Each solution, running the gamut from teams of people in back offices to industry-led repositories, has touted its data quality as the most reliable. Yet because each ignores the inherent power of the EDGAR system as the base resource, many initiatives begin from a flawed model.

For firms to build and manage this mutual fund data themselves, they need to manually gather information from fund companies via phone, email and fax; buy and piece together data feeds from vendors such as Morningstar; and then run separate processes to determine which source is probably best. Frequently, and especially at larger firms, multiple groups may create their own versions of this data, which increases both the cost and the likelihood that the information will not match internally. Firms absorb incremental expense to do this work, and at larger firms the costs can easily exceed $1 million. These same firms also privately suspect the quality of the effort because of its patchwork approach. This leaves the firm with an uncomfortable truth: high-quality, reliable mutual fund data is essential, but it is not available for free, and it is expensive and risky to self-manufacture.

The key industry solution for mutual fund data is a service provided by a well-known industry utility and securities depository.
The concept is fairly straightforward: fund companies, if they choose, may provide information to the service, which is then made available at low cost to participant firms, primarily the brokerage firms that need it. The service was designed in 1999 and remained idle until 2002, when the breakpoint issue was becoming more clearly understood. This first version was revamped and re-launched in the fall of 2007. While the effort was a step in the right direction, the reasons the service won’t work revolve around three key issues: coverage, completeness and cost.

Coverage is the first thing a firm must question. Does the service have the participation of all the funds the firm needs? If not, every time a fund is missing, the information must be found and managed manually, which once again opens the firm up to inaccurate reference data. More specifically, when comparing service data from October 2007 with the funds’ data on the SEC’s EDGAR system, considerable gaps remained, with more than 40 per cent of funds missing from the service. Why are 40 per cent of the funds missing from the service, while 100 per cent are providing information to the SEC as required? Because not all fund companies participate in the industry utility’s service, and of those that do, not all elect to provide data to it.

Completeness refers to how accurate the information is: do the policies the funds populate in the service match the same policies in the SEC’s EDGAR system? There are two types of discrepancy: the first where the data is simply different, and the second where the data is missing even though the EDGAR system shows the fund has a policy. In the case of the breakpoint tables – the core commission pricing table for front-end loaded mutual funds – 68 per cent of the funds in the service either have a discrepancy or are missing.
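The two discrepancy types – “different” versus “missing” – are easy to picture in code. A minimal sketch, using made-up fund names and policy fields, of how an audit might classify a distributor’s policy data against the policies a fund disclosed to the SEC:

```python
# Illustrative policy records: what the fund disclosed to the SEC
# versus what a distributor or service holds. All values are made up.
edgar_policies = {
    "Fund A": {"rights_of_accumulation": "yes", "letter_of_intent": "13 months"},
    "Fund B": {"rights_of_accumulation": "yes"},
}
service_policies = {
    "Fund A": {"rights_of_accumulation": "no", "letter_of_intent": "13 months"},
    "Fund B": {},  # no policies populated in the service at all
}

def classify_discrepancies(edgar, service):
    """Flag each policy as 'different' or 'missing' relative to EDGAR."""
    issues = []
    for fund, policies in edgar.items():
        held = service.get(fund, {})
        for name, value in policies.items():
            if name not in held:
                issues.append((fund, name, "missing"))
            elif held[name] != value:
                issues.append((fund, name, "different"))
    return issues

for issue in classify_discrepancies(edgar_policies, service_policies):
    print(issue)
# ('Fund A', 'rights_of_accumulation', 'different')
# ('Fund B', 'rights_of_accumulation', 'missing')
```

Either outcome is a compliance exposure: a “different” value silently misprices trades, while a “missing” one forces the firm back to manual sourcing.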
Even more detailed mutual fund policies show discrepancies between the service and the SEC’s EDGAR system, and the compound effect of multiple discrepancies across various data elements multiplies the disclosure and transaction errors.

That leaves cost. Firms hoping to rely on the service are left trying to cobble together data to cover the missing funds. This added cost, often buried deep in operations, tends to be hidden from most organisations. Fund companies, and their investors, are similarly left with the expense of managing this data manually.

Avoiding the risks and costs of unreliable data is not as difficult as it seems, provided you follow seven steps to assess the accuracy of the mutual fund data your firm currently uses. Firms should ask themselves the following key questions:

1. What does the mutual fund data do, and how important is it to our enterprise value? Since mutual fund trading is automated to a point, find out what data your firm currently uses to support trading. Do a “back-of-the-napkin” assessment of the risk level to your company if this data is wrong – regulatory, financial, reputational and so on.

2. How is mutual fund data collected and managed now, and what cost/risk assessments have been done? Mutual fund data can come from multiple sources and take significant resources to manage. Firms need to ask how much money they are spending to collect this data now.

3. Does our firm have, or is it considering, a data governance initiative to reduce risk? As securities and financial services firms recognise that their automated systems provide added efficiency and accuracy when leveraging high-quality data, are they creating organisations and procedures for overseeing data management? At one end of the spectrum, firms are forming a complete data governance organisation, including a chief data officer, to manage data acquisition, management and risk assessment. At the other end, firms are opting to adopt outside vendor capabilities to support, create and manage quality data sourcing and oversight. The goal is a “golden copy” of the data from which an entire business can run. Be sure to understand what your firm is doing, and how the organisation is changing to better manage mutual fund data.

4. Have we considered a mutual fund data audit? Organisations should test mutual fund data via a comprehensive audit, in which data from your firm is compared with the current policies sourced from existing SEC documents. The goal is twofold: first, identify discrepancies; second, identify mutual fund policies that are available but currently missing.

5. What data do employees rely on that our company does not manage itself? When the breakpoint issue first occurred, many firms worked to mitigate future risk by creating paper forms and spreadsheets. Chances are this data, which sits outside most organisations’ core collection processes, is old and under-managed. It is critical to understand whether your employees are using data from vendors or analysis firms and disclosing this information to investors – and whether they are taking into account all the requirements for breakpoint disclosure.

6. What level of validation is sufficient? Mutual fund data, because it drives automated accuracy checking for each trade, needs to be reliable – that is, consistent, correct and complete. But data quality standards will vary depending on the type of data, its source and the resources applied. Take the time to understand the validation processes in place today and determine what, if anything, needs to be done in future.

7. Are liability and risk part of the assessment? This kind of assessment frequently happens only after a problem has been found. For instance, the breakpoint issue has most likely already been found and is now assumed fixed at your firm. But what would be the cost to your firm if the issue were rediscovered by regulators? What protections do you have if you are self-sourcing mutual fund data internally?

Investing in exemplary mutual fund data makes sound business sense, especially if it is sourced from the SEC’s EDGAR system. Because of the risk and complexity, this is one of those cases where “buying” a solution makes far more sense than “building” one, from both a cost and a risk-reduction perspective. Before your company spends another day operating under the false assumption that critical data is accurate, dedicate the time to examining your organisation’s use of mutual fund data to ensure efficient, ongoing investor relations and to protect the overall reputation of your firm. By doing so, you may become your firm’s biggest asset, having protected it by making some sound and basic changes in this important yet often hidden and under-valued area.

Jeff Levering is vice president of corporate development and business strategies at NewRiver.
