This article was originally published on Geek Estate Blog:
Project Upstream aims to become the defender and gatekeeper of brokers’ real estate listings nationwide. The National Broker Public Portal intends to provide a nationwide display of data that puts brokers at the controls. The former is not intended to exclusively fuel the latter (though it seems like a match made in heaven–more on that later).
Upstream is not an MLS and will not replace the MLS’s co-broker, commission, and professional standards roles. Its creators do not intend for it to be public-facing on its own. It’s intended to feed many portals with standardized data.
Upstream and the National Broker Public Portal are projects about advertising online to the real estate consumer. Accurate data and fair display guidelines are important to brokers. Judging by consumer traffic to certain portal sites, though, consumers don't seem to care nearly as much about those issues.
Brokers need to achieve their goals by creating their own vision of perfected data display, while simultaneously building an attractive platform that consumers will not only approve of, but proactively search out.
Differentiation on a level that consumers can easily and quickly understand will be key to gaining a significant market share at a price that’s attainable.
To differentiate in the consumer space, broker projects need unique data layers that are only available to brokers. They can create them by:
1) Reworking the proprietary data they already have, or
2) Generating entirely new layers of data that accompany any new listing that enters the marketplace through their gateway.
Unearthing another layer of current listing data
When we say the portals have “access to MLS listings”, we’re really talking about just a small subset of listing data. Portal advertisers show the basic listing data, and surround it with other data from publicly available sources.
An Upstream-type gateway could require its members to grant broader access to their listing data fields in return for membership in this new, unique consortium of data providers. By allowing more of those fields in this single repository to be publicly displayed, a preferred portal that is permitted to show the data could claim a truly unique identity.
Just one example of data that might be leveraged for unique consumer content: keybox history. The preferred portal might be able to display:
- The number of showings for a property
- The rate of showings
- The days in which the property gets the most activity
- Aggregated showing data across neighborhoods and cities
- Heat maps on Sunday traffic for house hunting
- Which neighborhoods are trending upward
- Which ones are lagging and may offer better potential for price discounts
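As a concrete sketch of what these aggregations might look like, the snippet below computes per-property showing counts, neighborhood activity, and the day-of-week distribution that would feed a "Sunday traffic" heat map. The record shape and field names are assumptions for illustration; real keybox data feeds are not specified in this article.

```python
from collections import Counter
from datetime import date

# Hypothetical keybox showing records: (property_id, neighborhood, showing_date)
showings = [
    ("mls-101", "Ballard", date(2015, 6, 7)),   # a Sunday
    ("mls-101", "Ballard", date(2015, 6, 9)),
    ("mls-102", "Fremont", date(2015, 6, 7)),
    ("mls-101", "Ballard", date(2015, 6, 14)),  # another Sunday
]

# Showings per property (e.g. "12 showings in the last 30 days")
per_property = Counter(pid for pid, _, _ in showings)

# Aggregate showing activity across neighborhoods
per_neighborhood = Counter(hood for _, hood, _ in showings)

# Day-of-week distribution: the raw input for a Sunday-traffic heat map
per_weekday = Counter(d.strftime("%A") for _, _, d in showings)

print(per_property["mls-101"])      # 3
print(per_neighborhood["Ballard"])  # 3
print(per_weekday["Sunday"])        # 3
```

The point is that once the raw showing events exist in one repository, every display in the bullet list above is a simple aggregation away.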
That’s just one facet of listing data that’s currently behind a shroud. We could think of another half dozen in an hour. An initiative to expose this kind of data on a single website would have immediate consumer impact.
Creating a New Layer of Super-MLS Data
If an Upstream-style project really gained ground as the starting point for listing input, why not give the option, or potentially the requirement, that new listings on this data source add new fields that might not have existed at the MLS level before?
Imagine requiring every single listing entered on Upstream to include a legal description, a floorplan, a permit history, or a 3D-model of the home. Think of the value of a Super-MLS layer of data that added the kind of consumer-viewable documents and features that can’t be found anywhere else online and can’t be reproduced through public data sources.
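To make the idea tangible, here is a minimal sketch of what a "Super-MLS" listing record with required new fields might look like, with the gateway rejecting input that lacks them. The class and field names are hypothetical; the article does not prescribe a schema.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class SuperListing:
    """Hypothetical Super-MLS listing: new required fields beyond basic MLS data."""
    mls_number: str
    address: str
    legal_description: str                          # new required field
    floorplan_url: str                              # new required field
    permit_history: List[str] = field(default_factory=list)  # new required field
    model_3d_url: Optional[str] = None              # optional richer media

def validate(listing: SuperListing) -> List[str]:
    """Return a list of problems; an empty list means the listing passes the gate."""
    problems = []
    if not listing.legal_description:
        problems.append("missing legal description")
    if not listing.floorplan_url:
        problems.append("missing floorplan")
    if not listing.permit_history:
        problems.append("missing permit history")
    return problems
```

A gateway enforcing a check like `validate()` at input time is how the new layer becomes universal: no listing enters the marketplace without it.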
Of course these things would have legal and financial ramifications, but the point is not that the examples given are the specific answers. The answer itself might not be apparent today, but the ability to provide a listing with a totally unique display layer, one that isn’t available to any other advertiser, is riveting.
Why would it work? The company inputting the data has 1 million data entry staff members.
Agents are the data creators. Even a well-funded portal’s engineering army is a fraction of the size of the agent masses creating unique content on a property-by-property basis.
When the data gatekeeper becomes the de facto starting point for the industry, whatever new layers of proprietary data have been added to the listing input fields will become the standard of operation. The listings could still be fed to MLSs for B2B/IDX/co-brokerage functionality, while a single consumer portal which is allowed to be the sole national display vehicle for these new layers would be able to hammer its opponents over their heads with this Super-MLS listing data.
Get To The Point Or Go Back To Work
The conversations we’ve had about portals selling our data back to us have been long on what brokers want. They’ve always been short on how to realistically get those things in a consumer advertising market, which is far more important.
If brokers want to create better data, and display it in a superior way, they should focus first on who will pay attention. Ask what new things can be done with the proprietary data they already have. Add layers of new proprietary fields and make them a feature or a bonus for the agents who input them–an opportunity for their listings to stand out, not just more work. Create a reason for consumers to fund the project(s) through traffic going forward.
By unearthing hidden listing data and creating new Super-MLS layers, you might just have the kind of data that consumers can’t get enough of and that a portal could benefit financially from in a big way–and that’s where the data creators start taking their leverage back.
Sam DeBord is a former management consultant and web developer who writes for Inman News and REALTOR® Magazine. He is Managing Broker for Seattle Homes Group with Coldwell Banker Danforth, and a Director for WA REALTORS® and Seattle King County REALTORS®. You can find his team at SeattleHome.com and SeattleCondo.com.