Is “Native” Data Dictionary on the Horizon for MLSs? NAR Proposal, Planning, and Questions

Michael Wurzer of FBS wrote a thoughtful post this week regarding a recent MLS best practices proposal from the National Association of REALTORS. The best practice proposes: 

By July 1, 2022, MLSs should create with their vendors and leadership a written plan with a timeline and cost estimate to establish a native* RESO Data Dictionary compliant MLS for all listing content available to MLS Participants and Subscribers.

“Native” means all of the MLS’s data access services for Participants, Subscribers, vendors, designees, and other authorized recipients must deliver Data Dictionary compliant data without the need to convert it from some other format.

Some background is necessary. There’s been a concept in the real estate industry for a long time of an MLS going “native” Data Dictionary, and some MLSs have already undergone this process. The internal technical details vary across organizations, but the output is the real goal. No matter the database or internal mapping details, can an MLS deliver all of its outbound data services to its customers (broker participants, agent subscribers, and third-party vendors, including core services) in a form that is compliant with RESO Data Dictionary standards? If so, the MLS would be deemed “natively” DD compliant.

“Native”, as intended by those who use the term, means more than providing a RESO Web API service (the primary driver of Data Dictionary compliant data in the industry today) while still maintaining RETS or other non-standard feeds to other applications. Native, as described by the proposal, means ensuring that an implementation at the core level of the MLS (left undefined by the proposal) feeds every service that relies on the MLS’s underlying data set a RESO compliant set of data.
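To make the distinction concrete, here’s a rough sketch in TypeScript of what “native” implies at the service level. It’s purely illustrative: ListingKey, StandardStatus, ListPrice, and City are RESO Data Dictionary field names, while the legacy field names and the service interface are hypothetical.

```typescript
// Illustrative only: one Data Dictionary shaped record type...
interface DataDictionaryListing {
  ListingKey: string;     // RESO DD field names throughout
  StandardStatus: string;
  ListPrice: number;
  City: string;
}

// ...and, in a "native" MLS, every outbound service returns that same shape,
// whether it is the Web API, a replication feed, or a vendor export.
interface OutboundListingService {
  getListings(): Promise<DataDictionaryListing[]>;
}

// In a non-native setup, a legacy feed might still expose local names like
// these (hypothetical), forcing every consumer to do its own mapping.
interface LegacyFeedListing {
  MLNUM: string;  // vs. ListingKey
  LP: number;     // vs. ListPrice
  STATUS: string; // vs. StandardStatus
}
```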

Michael asks some important questions, and I think it’s worth asking his big question first:

Honestly, thinking beyond these kinds of basic questions to the broader prospect of 500 MLS conversions in the years ahead makes me wonder what outcome is being sought by this proposal, and whether this is the best path to achieving that outcome? In terms of a plan, I’d recommend some basic cost-benefit analysis be the first step, because it’s hard to imagine an outcome that’s going to be achieved that justifies the costs, especially when the reality is that the RESO DD actually embeds within it the ability to map something as fundamental as status. And we also know that thousands of data consumers are processing mapped feeds successfully today.

This is exactly the right question, and the reason this best practice is on the table. I’ve been listening to this question be asked, and not answered, for a decade. 

Does a “native” Data Dictionary MLS provide more value? I think the answer is unequivocally “yes” if we’re talking about the outcomes provided to data consumers and not the implementation details of data providers. (As RESO’s CTO Joshua Darnell points out, the focus should be on the interface that provides the standard data, not the implementation at the database level.) But what are the costs in terms of time and resources to create these outcomes for data consumers, and can the technology community support a mass migration?

This is the point where the industry usually says, “It’s too hard,” and throws up its hands without doing the analysis. And that’s why exactly this kind of effort by NAR, motivating MLSs and their vendors to truly document what these costs will be, must be the first step.

Establish the lift. Get all stakeholders on the same page. Evaluate the cost/benefit. If the benefits outweigh the costs, we’ll know (the MLSs who have already decided to go “native” seem to think they do). If they don’t, we’ll explore different approaches.

So back to the details of what MLSs might be asking in this process of establishing a plan and budget for a “native” conversion:

The first thing that comes to mind is that I’m kind of skeptical that any MLSs will actually be able to comply with it as written, which is that MLSs must deliver “Data Dictionary compliant data without the need to convert it from some other format.”

This is a good point, and the reason why proposals go through multiple rounds of public discussion and review. In particular, you can see how “without the need to convert it from some other format” might imply to some that the MLS is not allowed to convert it from some other format. 

Having been in the NAR advisory board meetings, I can tell you that’s not the case. Rather, the intent was that the data consumers (broker participants) would not have to convert the data. Whatever transformations the MLS needs to perform internally are its business, as long as all outputs conform to standards. Clarification is probably warranted here.

Michael gets into the details:

The first thought that pops into my mind is that I can’t think of any MLSs off the top of my head that even follow the RESO standard statuses. Instead, pretty much every MLS provides the “local MLS” status fields and then maps those to provide the RESO standard status fields. So, will all those MLSs have to conform their statuses to the RESO standard statuses as part of this project?

The intent of a RESO compliant feed is not to limit or hide local MLS fields that provide additional details not included in the Data Dictionary. Local, custom MLS fields are critical parts of a data set that complement the DD; they are not opposed to it. All of these fields should be delivered side by side in any MLS data feed. An MLS’s local status, and the RESO standard status that the local status maps to, are both important to communicate together.
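As a rough illustration (the local statuses below are hypothetical, and the RESO StandardStatus values are shown in a simplified form), a mapping like this lets a feed carry both the standard status and the local status it was derived from:

```typescript
// Hypothetical local statuses mapped to simplified RESO StandardStatus values.
type StandardStatus = "Active" | "Active Under Contract" | "Pending" | "Closed";

const localToStandard: Record<string, StandardStatus> = {
  "New": "Active",
  "Back on Market": "Active",
  "Contingent - Inspection": "Active Under Contract",
  "Pending - Taking Backups": "Pending",
  "Sold": "Closed",
};

// The outbound record carries both values side by side: the DD standard
// status and the local status it was mapped from.
function statusFields(localStatus: string) {
  return {
    StandardStatus: localToStandard[localStatus], // RESO DD field
    MlsStatus: localStatus,                       // local value, preserved
  };
}
```

(MlsStatus is used here as the home for the local value; the exact field an MLS uses to expose its local status may differ.)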

The next thing that comes to mind is that there are many, many cases (trust me, we’ve been mapping MLS data to the RESO DD for years) where MLSs have more detailed data than the DD supports. So, will those MLSs either lose that detailed data or will the additional details be added to the DD ahead of or as part of the conversion?

A Data Dictionary compliant feed will include fields that aren’t in the Data Dictionary. As the mantra goes, “If the Dictionary does it, and you do it, then you have to do it the Dictionary way.” But if the local field is not in the DD, it’s still a valuable field to deliver in its custom form.
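In other words, something like this hypothetical outbound record, where the concept the Dictionary covers uses the Dictionary’s name, and the local-only detail keeps its custom form:

```typescript
// ListPrice is a Data Dictionary field, delivered the Dictionary way.
// The second field is a hypothetical local-only detail with no DD equivalent,
// so it is delivered as-is in its custom form.
const outboundRecord = {
  ListPrice: 750000,
  LocalSellerPreferredClosingDate: "2022-09-01",
};
```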

Michael also points out a key area where non-compliance with DD standards begins. When listing input forms don’t use standard fields, agents and brokers get used to entering non-standard data in their MLSs:

One of the key benefits I can see of having all MLSs be “native” (without data mapping) is for third party listing entry and updates to the MLS. If a third party wants to enter a listing, but they’re using the DD and that doesn’t have the same level of detail as the local MLS, that causes data loss or degradation, so having them the same becomes important. However, getting there on every field is a massive undertaking that raises very practical cost-benefit concerns.

It’s true that the DD was never intended to include every field from every MLS: it’s there for the common cases. Listing input modules need to be able to accept both DD fields and local custom fields. Having both side by side is RESO compliant.

But listing input forms should also require standard DD fields like ListPrice to be entered according to the standard, and not create fields like AskPrice or LP. While these are really technical constructs behind the scenes, the very listing input that many brokers use in the MLS often begins the process of non-standardization from the moment a listing is created.
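A listing input module can enforce that at the point of entry. The sketch below is hypothetical, using the non-standard aliases mentioned above purely as examples of what gets folded back into the standard field:

```typescript
// Hypothetical input normalization: accept the standard ListPrice field,
// fold known non-standard aliases into it, and reject anything else.
const PRICE_ALIASES = ["AskPrice", "LP"];

function normalizeListPrice(input: Record<string, unknown>): number {
  // Prefer the standard Data Dictionary field when it is supplied correctly.
  const standard = input["ListPrice"];
  if (typeof standard === "number") {
    return standard;
  }
  // Map known non-standard names into ListPrice rather than storing them.
  for (const alias of PRICE_ALIASES) {
    const value = input[alias];
    if (typeof value === "number") {
      return value;
    }
  }
  throw new Error("Listing input must include a numeric ListPrice");
}
```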

Michael suggests a phased approach to analyze viability:

To this point, instead of starting with some broad mandate about every single field, how about we start with some core categories like property type, status, location fields (e.g., address, city, county, school district, etc.), and media to see if MLSs can get native on those? These narrow categories would bring significant advantages and would be a great test to see if and how MLSs are able to respond to such a change before jumping off the bridge and being forced to blow out the data dictionary to accommodate all the MLS local field details or for MLSs to lose those details to comply with this policy.

In sum, if this proposal moves forward, I’m very hopeful a clear outcome is defined that justifies the scope of the project, and that at least the initial scope is limited to some of the fundamental categories that have proven most challenging so far.

To be clear, a best practice is not a mandate. And no technical changes are even being proposed for implementation yet: merely an exploration of the requirements through a written plan. Michael makes a reasonable suggestion on a phased approach, identifying the most important areas to focus on first. The bigger question to begin with, though, is what an entire conversion would cost, and then how to break that cost down into components.

The proposal at hand is essentially a fact-finding mission. Until the MLSs and their vendors engage in crunching the numbers, the industry will continue to blindly guess at whether these conversions are worth the cost, and we’ll continue the cycle of questions without answers.

My gut says that the MLSs who have gone through “native” Data Dictionary conversions will weigh in soon with the short-term growing pains and the long-term efficiencies gained from the process. But I won’t speak for them.

ADDED: Based on further conversations on Twitter, maybe it’s time we stop trying to cram so much technical meaning into a term like “native” and just start over, describing the intended inputs and outputs. It might remove some confusion between business and technical experts.

In the meantime, it’s time to show our cards and have the conversation. MLSs, let us know your thoughts.

Author

  • All opinions expressed herein are personal opinions and do not constitute the position or views of any organization. Sam DeBord is CEO of Real Estate Standards Organization (RESO). He has two decades of experience in the real estate industry, spanning real estate brokerages, mortgage lending, and technology consulting. He has served as President’s Liaison for MLS and Data Management with the National Association of REALTORS®, a REACH mentor, and on the board of directors for NAR, Second Century Ventures, and California Regional MLS. Sam began his career as a management consultant for PricewaterhouseCoopers. He is a recognized real estate industry writer for publications including REALTOR® Magazine, Inman News, and the Axiom Business Books Award-Winning Swanepoel Trends Report.

Discuss!