



The Evolution of EDI in the Mortgage Industry


January/February 1999 - Volume 78, Number 1

by James M. Lesher

Introduction

The advent of the Internet and its related technologies is radically changing the way goods and services are advertised and delivered. This has been emphasized in the mortgage industry recently by the entry of non-mortgage companies, namely Microsoft and Intuit, into the mortgage origination, or brokerage, business. Their entry was precipitated by consumer demand for a more efficient and less personally intrusive process for buying a home, and it was facilitated by the adoption and rapid adaptation of leading edge technology.

While at first glance this incursion of outsiders into the mortgage industry may seem a threat, it should actually prove a great benefit, because it should increase the number of mortgage applications that can be processed, resulting in more mortgages being originated. Since neither Microsoft nor Intuit is a lender, and most other Internet-based originators are not lenders either, there is still a need for the processing, underwriting, and funding functions to be performed.

However, even these traditional functions are undergoing radical change from within the mortgage industry. Initiatives by Fannie Mae and Freddie Mac, as well as the consolidation that is taking place, are increasing the rate of change of the policies, procedures, and processes of loan origination.

Background

In the early 1990s, the Mortgage Bankers Association of America (MBAA) began an initiative designed to facilitate the adoption of Electronic Data Interchange (EDI) by the mortgage industry. It had become obvious that the paperwork involved in processing a loan was the primary bottleneck to any increase in the efficiency of the process. It wasn't that automated data processing wasn't being used; it was that the various systems involved in the process couldn't "talk" to each other directly. To transfer data from one system to another, someone on the first system would print the data on paper, and someone on the second system would re-enter it manually. EDI would eliminate this basic bottleneck by allowing the systems to talk directly, without manual intervention.

In order for EDI to work, however, there had to be standards to define the format and content of the data being transferred between systems. Without standards, system B would not be able to understand what system A was trying to say. The MBAA instituted a number of EDI Workgroups to deal with the various types of data that needed to be transferred between systems. It also chose the X12 standards mechanism as the basis for the EDI standards development. At that time, X12 had been successfully used by the transportation, petroleum, and retail industries, and was the primary EDI standard related to electronic commerce. X12 is an American National Standards Institute (ANSI) standard which is governed by the Accredited Standards Committee (ASC) X12, and is administered by its secretariat, the Data Interchange Standards Association (DISA).
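
To make the preceding discussion concrete, the sketch below shows what an X12-style data stream looks like and how a program might split it into segments and elements. It is a minimal illustration in Python; the segment IDs and values are invented for this example and are not drawn from any actual mortgage transaction set.

    # A minimal sketch of X12-style parsing. X12 data is a stream of segments,
    # each ended by a terminator (here "~") and divided into elements by a
    # separator (here "*"). Segment IDs and values below are invented for
    # illustration; they are not from a real mortgage transaction set.

    RAW = "ST*999*0001~NM1*BW*1*SMITH*JOHN~AMT*LN*150000~SE*4*0001~"

    def parse_x12(raw, terminator="~", separator="*"):
        """Split an X12-style stream into (segment_id, elements) pairs."""
        segments = []
        for chunk in raw.split(terminator):
            if not chunk:
                continue  # skip the empty string after the final terminator
            elements = chunk.split(separator)
            segments.append((elements[0], elements[1:]))
        return segments

    for seg_id, elements in parse_x12(RAW):
        print(seg_id, elements)

Note that position alone carries meaning: nothing in the stream itself says that the fourth element of the invented NM1 segment is a first name. That knowledge lives in the published segment and element definitions, which is part of what makes X12 compact for machines but opaque to human readers.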

The State of EDI in the Mortgage Industry

X12 standards have been created for the most widely used transactions in the mortgage industry. Where they have been universally and uniformly adopted, such as in the mortgage insurance segment, significant processing efficiencies have been realized. Even where the standards have not been universally adopted, those companies that have adopted them have realized significant advantages, primarily in the area of data uniformity and the ability to use the data in highly automated systems to obtain competitive advantage.

The situation now faced by the mortgage industry, as mentioned above, is that the "front end," or origination portion of our business, is changing rapidly while the "back end," or servicing portion, is more stable. It is clear to all involved that standards are necessary and desirable. However, it is also becoming evident that the industry needs a standards mechanism that allows the standards used during origination to be modified more efficiently than is currently possible, given the broad, cross-industry involvement in the X12 process. There is also a need for a standard that is more adaptable, or "friendly," to the new technologies being employed for origination, not the least of which is the Internet.

The "front end" and the "back end" of the mortgage industry are closely coupled. This requires that any modification or change in the standards used for one end still be closely related to those used for the other end. There will still be a need to transfer the data collected at the front to the systems in the back. Moreover, it must be ensured that any alternative standards adopted for the front end are able to classify and identify data to a degree equal or better than those that exist at the back end.

The foregoing requirement is not unique to the mortgage industry. Any industry wishing to automate must collect data at the front end in a manner that facilitates the efficient and meaningful delivery of that data to the back end. Everywhere, front-end systems tend to use the latest technology, while back-end systems tend to be the larger, more stable, even "legacy," systems. This requirement exposes a deficiency in the current standards when they are measured against leading edge technology.

The deficiency is related to the complexity of the X12 standards and their lack of flexibility, especially as related to their use in front-end systems. Converting input data to the X12 format, as well as converting X12 data to displayable formats, places a heavy burden on these systems. Moreover, these systems are typically laptops or basic PCs, not well suited to the heavy processing demands imposed by X12.

The existing standards used for front-end systems in the Internet environment are not a solution to the problem either. The HyperText Markup Language (HTML) is the most widely used "interface" standard for conducting commerce over the Internet; it controls the presentation and collection of the data used to conduct that business. As it turns out, HTML is not very good at classifying and identifying the data it collects, nor is it able to adequately format and display that same data. It was designed to work well in static display situations, but it is not very adaptable to dynamic, bi-directional, data-driven applications.

Even More Standards

The mortgage industry is not the first to need its "documents" organized in a format that makes the data contained within them easily accessible to both human and machine readers. In the 1960s, the idea of using descriptive tags to identify specific sections of a document arose in the publishing industry. This led in 1969 to IBM's development of the Generalized Markup Language (GML) as a means of allowing text editing, formatting, and information retrieval subsystems to share documents. Much of GML was implemented in large mainframe publishing systems and achieved widespread industry acceptance. By 1990, IBM produced over 90 percent of its documents using GML.

GML, however, was primarily a commercial product. In the late 1970s, work began on a generalized standard expanding on the success achieved with GML: the Standard Generalized Markup Language (SGML). SGML became a formal standard in 1986 and is widely used by government, military, research, publishing, aerospace, and automotive organizations, to name just a few.

As the Internet came to be more widely used, a group of people decided there was a need to display the data transmitted over it in a graphical format that would make the data easier to view and to find. That need resulted in the creation of the World Wide Web (Web) and the "browser," a program that could graphically view the data available on the Web. For the browser to properly format the data it was asked to present, however, that data had to be structured in a format that told the browser what in the document was what, and how it was to be displayed. The browser designers were familiar with SGML and knew it could convey such information, but they also knew SGML was far too complex to be feasible for use with the Web. What they came up with was HTML, a greatly simplified application of SGML designed specifically to allow documents to be published on the Web.

Over the past few years, HTML has become widely adopted for displaying data from static documents as well as dynamic documents generated from databases or other sources of information. However, as stated above, while HTML is fairly good at displaying information, it is not well suited to situations where the system receiving the HTML document needs to interact with the data contained within that document. Also, since HTML requires the data to be organized according to the manner and order in which it is to be displayed, the structure of and relationships within the original data are lost.
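
A small, hypothetical example illustrates the point. In the HTML fragment below (shown inside a short Python sketch), a loan amount is just the text of a table cell; the receiving program can recover it only by guessing from position, and the guess breaks as soon as the page layout changes. The fragment and the loan data are invented for illustration.

    # Hypothetical fragment of an HTML page presenting loan data. The markup
    # records only how the data should look (a table row), not what it means.
    html_fragment = "<tr><td>John Smith</td><td>150000</td><td>7.25</td></tr>"

    # The only way to recover "the loan amount" is out-of-band knowledge that
    # this particular page happens to put it in the second cell.
    cells = html_fragment.replace("</td>", "").split("<td>")[1:]
    amount = cells[1]  # a fragile, position-based guess
    print(amount)      # 150000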

The need for a robust way of classifying data was obvious to the creators of the Open Financial Exchange (OFX) standard: CheckFree, Intuit, and Microsoft. OFX was designed as a broad-based framework for exchanging financial data and instructions between customers and their financial institutions. The existing Internet standards, such as HTML, were inadequate for conveying the data required for these financial exchanges, so the designers turned to the SGML standard for the data portion of OFX because of its robustness.

It has become evident from situations like this that there needs to be a way of providing the richness of SGML along with the ease of use of HTML. The result is a new proposed standard, the eXtensible Markup Language (XML), an attempt to bridge the gap between the overly complex SGML and the overly simple HTML. It was initially developed in 1996 under the auspices of the World Wide Web Consortium (W3C). The abstract of the W3C Recommendation regarding XML states: "Its goal is to enable generic SGML to be served, received, and processed on the Web in the way that is now possible with HTML. XML has been designed for ease of implementation and for interoperability with both SGML and HTML."

Support for the presentation of XML data is being built into Web browsers. Internet Explorer 4.0 already has add-ons available that enable the display of XML data, and Netscape 5.0 (Mozilla) is slated to support XML as well. This, along with the stated XML design goals that it be easy to read, easy to process, and easy to create, should help ensure the widespread use of XML for electronic commerce.
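
By way of contrast, the sketch below expresses the same hypothetical loan data as XML and reads it back with a standard parser. Because every datum carries a label, a program can ask for it by name rather than by its position on a page. The tag names are invented for illustration and come from no mortgage industry standard.

    import xml.etree.ElementTree as ET

    # The same hypothetical loan data, now self-describing. The tag names are
    # invented for illustration, not taken from any industry standard.
    xml_document = """
    <Loan>
      <Borrower>John Smith</Borrower>
      <Amount>150000</Amount>
      <Rate>7.25</Rate>
    </Loan>
    """

    loan = ET.fromstring(xml_document)
    print(loan.findtext("Borrower"))  # John Smith
    print(loan.findtext("Amount"))    # 150000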

XML and EDI

If, as it appears, XML is evolving into the primary EDI standard for the Internet and related electronic commerce, what will its relationship be to traditional EDI standards such as X12? That question is just now beginning to be explored. In the latter half of 1997, groups were formed expressly to explore the relationship and interoperability between XML and traditional EDI.

CommerceNet is an industry consortium established in 1994 to support organizations using the Internet for business or developing electronic commerce applications. One of CommerceNet’s stated goals is promoting a framework that encompasses interoperability among developing standards. CommerceNet believes that XML may just be the "killer application" needed to open up the Web for electronic commerce. In August 1997 CommerceNet began an initiative to accelerate the adoption of XML by developing examples, demonstrations, and showcasing member applications.

The XML/EDI Group was founded in July 1997 as a grass-roots advocacy group open to anyone with an interest in improving electronic business for end users, specifically through the use of XML and EDI together. The group is very active in exploring possible methods of converting between XML and X12 documents, and considerable preliminary work, along with considerable debate, is taking place in this forum.

Even ASC X12 has recognized the need to be actively involved in the evolution of EDI standards. Early in 1998, ASC X12 entered into a joint project with the CommerceNet Consortium and the XML/EDI Group to investigate how to translate X12 data elements, segments, and transactions into XML.

XML/EDI and the Mortgage Industry

How does the emergence of XML as a standard for electronic commerce affect the mortgage industry, given the significant work it has invested in developing and implementing traditional X12 EDI standards? How does, or should, the mortgage industry Data Model currently under development fit into the XML/EDI framework? These are just two of the many issues that must be resolved soon if the industry is not to be left behind the technology curve and at the mercy of the new, high-technology entrants.

One of the primary issues currently under debate that affects the mortgage industry directly is the methodology to be used to "translate" X12 data to XML and the reverse. A primary goal for XML is that "XML documents should be human-legible and reasonably clear." X12, however, is generally neither human-readable nor clear. One of the proposed translation methods suggests creating XML tags directly from the X12 segment and element definitions. While this would create a mapping that is clear from the standpoint of relating the XML data to the X12 data, it is not at all clear that the resulting XML document would be clear and human-readable from a business standpoint.
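
The sketch below illustrates the tradeoff, reusing the invented NM1 segment from the earlier example. A mechanical mapping derives tag names from the segment ID and element position, which is faithful to X12 but opaque; a business-level mapping uses a hand-built dictionary of readable names, which is clearer but must be agreed upon and maintained. Both mappings are hypothetical, not taken from any proposal before the standards groups.

    # Two hypothetical ways to render one X12-style segment as XML tags.
    segment_id, elements = "NM1", ["BW", "1", "SMITH", "JOHN"]

    # 1. Mechanical mapping: tags derived from segment ID and element
    #    position. Faithful to X12, but meaningless to a business reader.
    mechanical = "".join(
        f"<{segment_id}{i:02d}>{value}</{segment_id}{i:02d}>"
        for i, value in enumerate(elements, start=1)
    )
    print(mechanical)
    # <NM101>BW</NM101><NM102>1</NM102><NM103>SMITH</NM103><NM104>JOHN</NM104>

    # 2. Business-level mapping: a hand-built dictionary names each position.
    names = {3: "LastName", 4: "FirstName"}
    readable = "".join(
        f"<{names[i]}>{value}</{names[i]}>"
        for i, value in enumerate(elements, start=1)
        if i in names
    )
    print(readable)  # <LastName>SMITH</LastName><FirstName>JOHN</FirstName>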

There are also a number of issues with respect to how X12 has been implemented in the mortgage industry. In some instances, there are multiple implementation guides for the same transaction set, each defining how that transaction is used for a different purpose. Moreover, because some companies implemented transaction sets before development was fully completed (which it never seems to be), there are instances of varying code value usage and even varying placement of data within the transaction sets. If X12-to-XML mapping is done using segment and element definitions, this situation would lead to mapping X12 transaction sets to multiple, inconsistent XML documents for the same data.

Conclusions

It is imperative that the mortgage industry take a proactive role in the development of the emerging XML/EDI standards. There are significant threats and opportunities emerging that will be affected by the outcome of the standard development processes. Failure to be involved in the process may well lead to forfeiture of significant portions of the market to those more technically adept and less constrained by the current state of EDI implementation.

Resources

Simon St. Laurent, XML: A Primer (Foster City, CA: MIS:Press, 1998). An excellent introduction to XML, its evolution, and its application.

Charles F. Goldfarb, The SGML Handbook (Oxford: Clarendon Press, 1995). The definitive book on the technical aspects of SGML.

CommerceNet Consortium, http://www.commerce.net  An industry consortium promoting electronic commerce. Expensive to join.

World Wide Web Consortium (W3C), http://www.w3.org An international industry consortium dealing with the standardization and use of the Web. Expensive to join.

Data Interchange Standards Association, http://www.disa.org The premier association for electronic commerce, primarily ASC X12 and UN/EDIFACT. Not so expensive to join.

The XML/EDI Group, http://www.xmledi.com  An ad hoc group dedicated to promoting and guiding the future of XML/EDI standards and products. Membership is open and free.

Peter Flynn, The XML FAQ, http://www.ucc.ie/xml/  A list of Frequently Asked Questions about XML maintained for the World Wide Web Consortium.

The SGML/XML Web Page, http://www.sil.org/sgml/sgm.cfml  A comprehensive online database containing reference information and software pertaining to SGML and its subset, XML.

XML.com, http://www.xml.com  Commercial exploitation of XML by the publishing industry. It shows the high level of interest in XML.

Charles F. Goldfarb’s SGML Source Home Page, http://www.sgmlsource.com  No resource list would be complete without a link to the home page of the creator of SGML, Charles F. Goldfarb.

James M. (Jim) Lesher is President of Per-Centage Corporation, a technology solutions company serving the mortgage industry. He is an active participant in the Mortgage Bankers Association's (MBA) technology initiatives related to the automation of the mortgage lending business. In addition, he has been heavily involved in the development of the ANSI X12 EDI transaction sets related to mortgage banking and is currently co-chair of the MBA's Technology Workgroup for Credit Reporting.

Mr. Lesher is an alumnus of the University of California at Berkeley, where he majored in Electrical Engineering and Computer Science and worked for an ARPA project developing the first multi-microprocessor computing systems. He has significant consulting experience in computer applications development projects, primarily relating to the areas of aerospace, credit, and finance. Mr. Lesher can be reached by email at james@percentage.com, or by phone at 626-744-1212.




