
Are the outside credit agencies headed for extinction? Why structured data drives improved risk analytics.

The U.S. Senate Banking Committee, chaired by Richard Shelby (R-AL), held a hearing in February on possible conflicts of interest in the credit rating agency industry. What possible conflict of interest could there be for an industry where the subjects of the analysis are the clients? Plenty.

In the House, Rep. Richard H. Baker (R-LA), chairman of the Financial Services Committee's subcommittee on capital markets, insurance, and government-sponsored enterprises, has expressed impatience with the Securities and Exchange Commission, saying that if the federal agency doesn't take action to reform the rating agency system, he may introduce legislation to address problems he sees within it.

If safeguarding the independence of equity analysis is a good thing, the same logic would seem to apply even more strongly to credit analysts serving a bond market ten times the size. During the Shelby hearings, there was no mention of the possibility of investors, banks, and companies generating their own internal ratings for measuring default and restatement risk rather than relying upon the SEC-imposed monopolies of Moody's, S&P, and Fitch. But that is precisely where the industry is headed: a marketplace where all banks and public companies finally grow up and become rating agencies themselves.

The fact is, under both Basel II and Sarbanes-Oxley, risk officers and directors must perform a degree of diligence regarding external threats that obliges them to come up with their own, independent risk ratings for all counterparties, public and private, down to and including retail customers. This means banks and corporations must be able to model these risks in detail and in real time so that they can maintain internal ratings and produce them for regulators and/or auditors on demand. The era of the rating agency as the definitive source of risk quantification is already ending.

THE END OF AN ERA

Two megatrends are sweeping across the analytical landscape and changing the course of quantitative due diligence. The first is regulation that has deputized banks and auditors as watchdogs, raising the bar on their internal risk measurement needs, if nothing else to defend themselves against the consequences of errors and omissions. The second is the technological commoditization of the data needed to support a broader base of high-quality internal analytics by institutions. Neither of these tides has yet fully come in, but the die has been cast.

The transformation of risk analytics from a service provided by outside vendors into an internal process has its epicenter in events such as the collapse of Long-Term Capital Management and the more general expansion of leveraged products in the financial world. The increased use of leverage in the capital markets, along with the collapse of the technology bubble, served as catalysts for regulators and the academic community to look for new tools to measure risk and anticipate events.

The failure to heed the caveat emptor warning that applies to all models was compounded by the loss of predictive accuracy that comes during periods of economic distress. Consider the fact that sub-prime lending is the fastest-growing area of retail bank assets; that alone tells you that the quality of the U.S. market is changing as borrowers as a group come under greater stress.

The people who run the major rating agencies know the limitations of the "contemporary" risk measurement tools in use today. The academic literature describes the broadening of the uncertainty band in rather direct language for anyone who bothers to take the time to read and heed it. The problem is that people who run major Wall Street banks and investment houses rarely have time to read or heed the warnings of theoretical researchers. As my partner Christopher Whalen noted in the previous issue of The International Economy ("Managing Risk: A Skeptic's View of Basel II"), the Basel II framework starts off using risk surveillance tools that clearly missed most of the fraud and accounting manipulation of the past decade.

Using indicators outside the range of their designed utility is a bad thing. You get disconnects, for example in the case of a commercial bank, between predicted and actual loss experience, which is the precise issue at stake with the Basel II process and one that is already rippling through the credit markets. The number of surveillance subjects that have exceeded the design utility of market price-based methodologies has increased in recent years, part of the slide of the U.S. economy into a state where sub-prime credits sit closer and closer to the mean.

We are in a part of the business cycle where an increased fraction of the economy consists of companies that are, for lack of a better word, "in between": neither healthy enough for market-pricing proxies to provide accurate valuation nor distressed enough for default analysis to become the name of the game. These companies morph constantly, coating their noisy financials with extraordinary events. They are what the textbooks would call in need of reformulation-intensive fundamentals modeling and specific scenario analysis. In Street talk, they need to be restructured.

Even as the credit standing of many U.S. companies (and individuals) has come under stress, the rating agencies have cut back the infrastructure that would provide the ability to respond to this challenge. Moody's KMV, for example, only rates the companies it does cover up to a 20 percent default probability, or 2,000 basis points, but the sub-prime world requires visibility out to the 50 percent default probability, or 5,000 basis points, range. None of the tools offered by the outside vendors addresses this shortcoming. Indeed, since the collapse of the technology craze, the rating agencies have stripped down, looking for ways to leverage smaller staffs and eliminate the traditional expenses associated with a very "hand made" business of forensic analysis. As a result, there is a gaping hole in the coverage of the "in-between" borrowers, as illustrated by the chart.
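
To make that coverage gap concrete, here is a minimal Python sketch that maps one-year default probabilities, expressed in basis points, onto the bands described above. The 2,000 and 5,000 basis point cutoffs come from the discussion here; the function name and band labels are illustrative assumptions, not any vendor's actual rating scale.

    def coverage_band(default_prob_bps: float) -> str:
        """Classify a default probability (in basis points) by who covers it."""
        if default_prob_bps < 0 or default_prob_bps > 10_000:
            raise ValueError("default probability must be between 0 and 10,000 bps")
        if default_prob_bps <= 2_000:
            # up to roughly 20 percent: reachable by market price-based vendor models
            return "covered by vendor market-price models"
        if default_prob_bps <= 5_000:
            # 20 to 50 percent: the "in-between" zone described above
            return "in-between: needs internal fundamentals modeling"
        return "deep distress: default and restructuring analysis"

    if __name__ == "__main__":
        for pd_bps in (150, 2_500, 4_800, 7_000):
            print(f"{pd_bps:>5} bps -> {coverage_band(pd_bps)}")

The point of the exercise is simply that the middle band, the one the vendors do not reach, is where an internal model has to take over.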

Unlike healthy companies that gladly pay $10,000 per shot to have a bond issue rated by a major agency, the marginal company cannot play in that league, and thus the rating agencies have largely ignored these companies. The cost-to-profit ratio of serving the "in-betweens" makes for a tough business model, because these are precisely the companies that will hesitate to pay for a credit rating. The problem has become so acute that, in the couple of years since the latest spate of corporate calamities, a cottage industry of alarm-bell services has sprung up to call out and embarrass the struggling "in-betweens."

A "NATIONAL INTEREST" IN THE STRUCTURE OF DATA

To power a broader base of internal risk and ratings analytics that addresses the "in-betweens," probably the most important requirement is reliable structured financial data for public companies and investment funds. The U.S. Securities and Exchange Commission recently issued a release adopting amendments to establish a voluntary program related to eXtensible Business Reporting Language (XBRL). Registrants may voluntarily furnish XBRL data in an exhibit to specified EDGAR filings under the Securities Exchange Act of 1934 and the Investment Company Act of 1940. The program begins with the 2004 calendar year-end reporting season.
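
What tagging buys the analyst is that every reported value carries a machine-readable concept name, period, and unit, so it can flow straight into a model without rekeying. The Python sketch below reads a toy tagged filing; the element and attribute names are illustrative stand-ins, not an actual XBRL taxonomy.

    import xml.etree.ElementTree as ET

    # A made-up tagged filing, standing in for an XBRL exhibit.
    SAMPLE_FILING = """
    <filing entity="Example Corp" period="FY2004">
      <fact concept="Revenues" unit="USD">1250000000</fact>
      <fact concept="NetIncomeLoss" unit="USD">-34000000</fact>
      <fact concept="LongTermDebt" unit="USD">480000000</fact>
    </filing>
    """

    root = ET.fromstring(SAMPLE_FILING)
    print(f"Entity: {root.get('entity')}  Period: {root.get('period')}")
    for fact in root.findall("fact"):
        value = float(fact.text)
        print(f"  {fact.get('concept'):<15} {value:>16,.0f} {fact.get('unit')}")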

If you use the SEC public Web site (www.sec.gov), you will run into a document that describes the Commission's five-year plan. It provides a clear image of a future world where "evidence grade" data on all SEC registrants, not just the traded companies the vendors deem worthy of coverage, is available for download in structured formats and is free for the asking.

This filing-level improvement in corporate transparency also benefits from the Sarbanes-Oxley Section 404 documentation process, particularly for the production of general ledger data for use by institutions providing "private information" grade audit, advisory, and banking services. We look forward to the day when all commercial accounting packages export general ledger data in a universal, analytics-ready format, which is arguably mandated by Basel II. Banks, for example, will require that companies provide periodic reports in a structured format, probably XBRL, and use this privileged data to assemble the internal ratings that Basel II requires. If common sense prevails in the corporate suite, the same structure adopted by the company's accounting firm will spread throughout the organization.
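
As a hedged sketch of that internal-ratings step, the Python below recomputes a coarse grade from two balance-sheet ratios each time a structured filing arrives. The ratios, cutoffs, and grade labels are illustrative assumptions chosen for the example; they are not the Basel II internal-ratings formulas or any bank's actual grading scheme.

    def internal_rating(total_debt: float, ebitda: float,
                        equity: float, total_assets: float) -> str:
        """Map two common credit ratios onto a coarse internal grade."""
        leverage = total_debt / max(ebitda, 1e-9)      # debt-to-EBITDA
        solvency = equity / total_assets               # equity-to-assets
        if leverage < 2.0 and solvency > 0.40:
            return "IG-1 (strong)"
        if leverage < 4.0 and solvency > 0.25:
            return "IG-2 (adequate)"
        if leverage < 6.0 and solvency > 0.10:
            return "SG-1 (watch: the in-between zone)"
        return "SG-2 (substandard)"

    # Example with made-up figures, in millions of dollars.
    print(internal_rating(total_debt=480.0, ebitda=95.0,
                          equity=310.0, total_assets=1_050.0))

The value of the structured feed is not the formula, which any analyst can argue with, but the fact that the bank can rerun it on every counterparty, every period, on demand.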

SEC SHOULD TAKE A LESSON FROM THE FDIC

Institutional-grade analytics is a world where piecemeal treatment is not enough, yet unfortunately the SEC seems content to let the financial industry define the standard for public company reporting. Close examination of the issues suggests that the SEC's present passive approach is contrary to the national interest. Why? Because there are times when the private sector needs some guidance, especially when it comes to public disclosure.

The Best Structured Data Award in the United States goes to the Federal Deposit Insurance Corporation. Designed to support econometrics, the FDIC's electronic filing system puts out the cleanest (all the derived calculations and checksums foot) and most immediately useful bulk financial data in the world, far exceeding anything available from any commercial data vendor. It is evidence-grade, audit-trail-clean information, traceable right down to the line item in the attested filing.

The FDIC has been keeping these records ever since computers were invented, and herein lies the system's technical strength: the format is dictated by the regulator and entirely deterministic, exactly what the SEC should be doing with tagged data from public companies. Think of it: a twenty-year-old system at the FDIC is actually technically superior to the SEC's current EDGAR system. A quarterly bank call report has hundreds of discrete data elements, all tagged and ready for import into a commercial database. And the regulators are making improvements.
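
Here is a minimal sketch of the kind of mechanical check this structure makes possible: recompute each derived total and reconcile it against the filed figure. The field names and figures below are hypothetical stand-ins, not actual call report schedule items.

    # Toy extract of tagged, call-report-style records (figures in $ millions).
    records = [
        {"bank": "First Example Bank",  "loans": 742.1, "securities": 210.4,
         "cash": 55.0, "other_assets": 31.5, "total_assets": 1039.0},
        {"bank": "Second Example Bank", "loans": 310.0, "securities": 95.2,
         "cash": 20.3, "other_assets": 12.5, "total_assets": 440.0},
    ]

    for rec in records:
        derived = rec["loans"] + rec["securities"] + rec["cash"] + rec["other_assets"]
        if abs(derived - rec["total_assets"]) < 0.05:      # rounding tolerance
            status = "foots"
        else:
            status = f"MISMATCH (derived {derived:.1f})"
        print(f'{rec["bank"]:<22} filed total {rec["total_assets"]:>7.1f}  {status}')

When every filer uses the same deterministic format, checks like this run across the whole industry in one pass, which is precisely what makes the data evidence grade.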

The federal banking agencies announced a new implementation plan for the Central Data Repository (CDR), an Internet-based system created to modernize and streamline how the agencies collect, validate, manage, and distribute financial data submitted by banks in quarterly "Call Reports." Call and TFR reports have been accessioned into the electronic repository for decades, and teething pains with the new system are inevitable. But ultimately, mated with the excellence of the wise old COBOL back end, this best-of-breed amalgam of old and new remains the template against which other "national interest" data development should be modeled.

TAKING THE SEC TO THE NEXT STEP

Because company filings have never been as structured as bank reporting, it is taking a little longer to rationalize the front end, hence the SEC's cautious approach. The SEC program, while voluntary, points in the same direction as the FDIC's, but the pace suggests that the final goal is years away. And as we've noted before, because of the requirements of Section 404 of Sarbanes-Oxley, the accountants have been busily structuring client data from the ground up in order to create a comprehensive inventory of all of the information used in a company's financial reports.

The promise of the SEC's voluntary program is that, if made mandatory, it will enable another round of standardization for company information. It provides an environment for managing and enriching the set of tags, or "metadata," describing each item in a 10-K or 10-Q, including even footnote data and other information peculiar to a particular entity or industry. This will enable third-party vendors to improve the transparency of "as filed" company data from regulators and the SEC's EDGAR database. The advent of complete EDGAR filings with data tags will be a revolution for financial analysts.

It seems clear that in the future, rating agency numbers will become crosschecks, not icons. What is still missing in the emerging discussion regarding tagged SEC data is input from players like the Financial Accounting Standards Board, the American Institute of Certified Public Accountants, and others who can help the SEC and the Public Company Accounting Oversight Board define a regulatory regime for translating footnotes into structured data, regardless of whether the end result is delivered in the current tagged format or in XBRL. Either way, the commercial data vendor community will get the job done as and when the SEC exerts some additional leadership and requires all commercial companies to adopt a common format for fully tagging their public filings.

U.S. financial data should be as clean and standardized as it is in Europe, where XBRL has already been widely adopted as a reporting format for public and private companies. Given this template, perhaps Chairman Shelby, Chairman Baker, and the SEC should ask not how to make third-party credit ratings more reliable and less conflicted, but instead how long it will take to enable every bank, corporate treasurer, and investor to generate their own independent ratings for default and restatement risk. The future health of our economy's banks and corporate borrowers, especially the "in-betweens" who need the greatest attention, depends upon whether the SEC gets this issue right.

Dennis Santiago is CEO and co-founder of Institutional Risk Analytics.