Financial Data Standards & Structure
Occasional Paper (Work in Progress)
"A Critical and Empirical Examination of Currently-used Financial Data Collection Processes and Standards"
By Suzanne Morsfield, Center for Excellence in Accounting & Security Analysis--Columbia Business School; Steve Yang, Stevens Institute of Technology; Susan Yount, Workiva (co-authors' note: we have received a small research grant from the SWIFT Institute)
This Occasional Paper is a "study in progress" examining the design, implementation, maintenance, and ongoing effectiveness of six current financial data standards or data collection settings: FIX, FpML, ISO 15022, ISO 20022, XBRL, and the U.S. CFTC's Swaps Data Repositories. In the process of gathering data on these primarily XML-based approaches, we also seek input on other methods or languages that stakeholders may prefer or recommend, with an eye toward answering the question, "What is the future of financial data standards?"
The impetus for this project was a call for research by the SWIFT Institute that highlighted statements made by Scott O’Malia of the U.S. Commodity Futures Trading Commission (CFTC) about the CFTC’s inability to find the infamous London Whale trades in the mounds of swaps trading data that the Commission had been requiring in the wake of the Dodd-Frank Act. Hence, this project seeks to answer questions about how current standards are working, and what lessons can be learned from these specific implementations.
Other key questions include: when and where exactly is a financial data standard necessary, and when might data science expertise be an adequate, or even better, solution? While important, these questions are beyond our current scope and should be addressed by future research in this arena.
We begin the current project by laying out a structure for assessing how each of the six use cases currently performs. The framework we propose consists of understanding how (or if) the following four key areas were developed and currently function for each of the six settings:
Communication standards (not database standards)
Data and information ontologies (relationship modeling)
Next, we analyze each of the standards or settings along the following key aspects on a case-by-case basis:
How does data analysis technology, such as databases and cloud computing, leverage this financial standard?
What are the common obstacles to the financial industry's use of big-data technology in this case?
What role do industry reference data standards play in making this standard more usable?
How does the development of technology, content, organization, and governance make this standard more useful?
This method provides a common, business-level model that applies across all stakeholders and all financial data standards or settings, and that enables the following in current and future analyses:
Evaluate/change existing standards
Decide whether a new standard is necessary, and
Design a new standard that serves all stakeholders.
Finally, we conduct an empirical evaluation of all the cases and provide a cross-analysis of the factors that may have contributed to the success of the standards studied--from their design through their use and updates or maintenance (their entire life cycle where possible). A unique feature of our study is that we conduct our surveys on a 360-degree basis--i.e., we gather views from all possible stakeholders, not just from those who designed the standards.
We believe that the above approaches, taken together, are crucial to understanding how a standard is actually functioning, and that this approach will allow us to provide a rich, data-driven set of recommendations for future financial data standards development.
Since the U.S. Securities and Exchange Commission’s 2009 mandate that portions of 10-Ks and 10-Qs be submitted in a digitized format known as eXtensible Business Reporting Language (XBRL), issuers and others have questioned the usefulness of the resulting data now available. As early promoters of “interactive data,” Columbia Business School’s Center for Excellence in Accounting and Security Analysis (CEASA) undertook a review of the state of XBRL and interactive data with a focus on their utility for security analysis. This project involved interviews with representatives of the various stakeholders (i.e., preparers, regulators, analysts and investors, XBRL developers, data aggregators, and XBRL filing and consumption tool vendors), and an in-depth discussion with and survey of investors and analysts. The survey and interview questions, and our conclusions, were organized around the original vision for interactive data—i.e., that data in this format would provide incrementally more relevant, timely, and reliable information to more end users, who could then manipulate and organize the data according to their own purposes at a lower cost.
This paper investigates the usefulness of the resulting data that were available at the time of publication. In our view, XBRL has succeeded insofar as it has met the objective of providing users with free, interactively available numerical data from portions of published financial statements and footnotes as soon as they are filed with the SEC. Most of the analysts and investors we spoke with were interested in, and tried to use, the XBRL-tagged footnote data. However, this access has not translated into ongoing use by investors and analysts, for many reasons that the report articulates in more detail. With this in mind, we provide our general conclusions and make some recommendations.
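For readers unfamiliar with the tagging the SEC mandate requires, the sketch below shows roughly what a single XBRL-tagged fact looks like inside an instance document. The fragment is simplified and hypothetical: the CIK, the dollar value, and the taxonomy year are invented for illustration, and required schema references and other mandatory plumbing are omitted.

```xml
<!-- Simplified, hypothetical XBRL instance fragment (illustration only) -->
<xbrl xmlns="http://www.xbrl.org/2003/instance"
      xmlns:iso4217="http://www.xbrl.org/2003/iso4217"
      xmlns:us-gaap="http://fasb.org/us-gaap/2023">

  <!-- The context says WHO reported the fact and for WHAT period -->
  <context id="FY2023">
    <entity>
      <identifier scheme="http://www.sec.gov/CIK">0000012345</identifier>
    </entity>
    <period>
      <startDate>2023-01-01</startDate>
      <endDate>2023-12-31</endDate>
    </period>
  </context>

  <!-- The unit says what the number is measured in -->
  <unit id="usd">
    <measure>iso4217:USD</measure>
  </unit>

  <!-- One tagged fact: fiscal-2023 revenues, rounded to millions -->
  <us-gaap:Revenues contextRef="FY2023" unitRef="usd" decimals="-6">1500000000</us-gaap:Revenues>
</xbrl>
```

Because every reported number carries a standardized element name, a context, and a unit, software can in principle pull the same line item across thousands of filings without re-keying the data — which is the "interactive data" promise the survey questions were organized around.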
XBRL and Interactive Data Roundtable
CEASA hosted and moderated a candid, off-the-record roundtable in October 2011 that discussed in detail the state and future of the SEC's XBRL implementation from the viewpoint of a significant intended audience of the data (according to the SEC's own documentation, speeches, and the related rule itself)--i.e., analysts and investors. Representatives from key regulators, standard-setters, investment firms (buy-side and sell-side), SEC registrants (of various sizes), data aggregators, filing agents, and academia were present. This roundtable prompted CEASA to study the issues raised in more detail and led to the White Paper described above.
CEASA has recently received gifts of $125,000 each from Morgan Stanley and General Electric. In 2003, a founding gift from Morgan Stanley and GE was instrumental in creating the center, which has since produced a body of independent research for stakeholders across academia, industry, and government. With Morgan Stanley’s and GE's renewed pledge of support, CEASA is well-positioned to lead the next chapter of dialogue around security analysis and accounting issues and to generate practical policy solutions.
CEASA Director of Research, Suzanne Morsfield, awarded research grant from the SWIFT Institute to report on “The Future of Financial Data Standards” with co-authors Stephen Yang of Stevens Institute, and Susan Yount of Workiva.
CEASA co-founder, Stephen Penman, releases new Occasional Paper, “The Value Trap: Value Buys Risky Growth”, co-authored with Francesco Reggiani of Bocconi University.
Submit a Paper to CEASA
CEASA welcomes submissions for online publication of papers analyzing and proposing solutions to current issues surrounding accounting and/or security analysis policy.