
AI and real estate data: Who’s making the rules? 

As AI becomes more embedded in transactions, MLSs are pursuing a “license, not lawsuit” strategy to create guardrails, protect consumers — and limit legal risk.

February 23, 2026

Key points:

  • After years of commission lawsuits and antitrust scrutiny, MLS leaders are working to modernize data rules before AI creates the next wave of litigation.
  • California has already passed disclosure rules for AI-altered listing photos, while brokerages and MLSs are creating their own compliance strategies.
  • As sensitive transaction data flows into AI systems, the industry must determine who regulates its use — and who is liable when errors occur.

Editor's note: As AI becomes increasingly embedded in real estate tools, transactions and processes, industry leaders are grappling with how to balance the benefits and risks of the technology. In this two-part series, we look at how some brokerages, MLSs and consumer advocates are approaching AI, and the questions the industry should be asking.

Read part one here.


AI's sudden ubiquity has transformed the way that real estate agents and consumers communicate with one another and even approach the transaction itself. But as brokerages continue developing proprietary AI tools and pushing adoption across their agent networks, a quieter conversation is unfolding behind the scenes around licensing agreements, compliance frameworks and data governance.

Who controls how AI can use real estate data? What happens when sensitive client information is entered into large language models like ChatGPT, Gemini or Claude? And in an industry already navigating legal fallout, who ultimately bears the risk if something goes wrong?

Guardrails and disclosures

After years of commission lawsuits, antitrust scrutiny and regulatory pullback from the National Association of Realtors, few industry leaders are eager to test the legal limits of artificial intelligence without guardrails.

California lawmakers recently passed legislation requiring disclosure when listing photos are altered using artificial intelligence, including side-by-side comparisons of original images. The move reflects a growing recognition that AI-enhanced marketing can blur the line between presentation and misrepresentation.

Brokerages have begun responding internally as well. Some firms, like eXp, have updated listing agreements to include AI disclosure language and rolled out agent training around photo enhancement and virtual staging — efforts aimed at reinforcing transparency and consumer trust.

AI doesn't create the risk — it scales it

Disclosure is only one part of the solution.

Matt Fowler, CEO of Doorify MLS in North Carolina, said most MLSs already prohibit edits that change a "material fact" about a property — language that predates generative AI.

Fowler described a recent incident in which an agent digitally removed a gas meter from the side of a home because it was unsightly. In another, a photo was edited to show three exterior steps where the property had four. Both were violations — not because AI was involved, but because the edits altered what a buyer would actually encounter.

"That's material," Fowler said. "You can't just wish that away."

The difference now isn't that manipulation exists — it's that AI makes it faster, cheaper and easier to execute at scale. That scale, Fowler argues, is what creates liability.

"We want a license, not a lawsuit," he said — a phrase that has become Doorify's organizing principle as it looks to modernize how MLS data can be used in AI systems.

Modernizing policies

Fowler said a more pressing issue is how AI interacts with MLS data itself.

Policies governing listing display and data sharing — including long-standing Internet Data Exchange (IDX) and Virtual Office Website (VOW) frameworks — were largely written for a different era. They anticipated broker websites and portals, not brokerages feeding MLS data into large language models, merging it with CRM systems or building AI-powered search and marketing tools.

As those use cases expand, the question becomes what counts as authorized use.

Doorify's new "license, not a lawsuit" digital data agreement updates those rules to account for the current technological environment. Instead of limiting brokers to a narrow list of approved applications, Fowler said the approach focuses on clearly defining prohibited uses — particularly around consumer data and privacy — while leaving room for innovation inside those guardrails.

Who writes — and enforces — the rules?

As AI becomes embedded in brokerage workflows, the question of regulatory authority has become increasingly pressing. Fowler believes state real estate commissions — not national trade groups acting alone — are best positioned to serve as the ultimate authority. In North Carolina, he said, the commission regulates more than 80,000 licensees and plays an active enforcement role in consumer protection.

"We take a similarly stringent look at this data," Fowler said, noting that MLS systems often contain more than listing information. They may include buyer details, showing schedules, home addresses and, in some cases, financial data.

That is why, he argues, MLSs are increasingly functioning less like listing databases and more like secure infrastructure providers.

"We think of ourselves a little bit more like AWS than the form filler stack of brochures that we used to be," Fowler said.

California's legislative approach reflects one path forward, while MLS licensing frameworks — which function as industry-defined guardrails designed to move quickly as technology evolves — represent another.

The privacy question ahead

Consumer advocates warn that the next flashpoint may center on data privacy. Wendy Gilch, a research fellow with the Consumer Policy Center, has raised concerns about consumers and agents entering sensitive documents into commercially available AI tools without fully understanding how that information is stored or used.

In residential real estate, that could include contracts, inspection reports or underwriting paperwork — documents that were never intended to be pasted into a chatbot.

If AI becomes more embedded in the transaction, the liability question will become unavoidable: Who bears the risk if sensitive information is exposed or if an AI-generated error leads to financial harm?

For MLS executives like Fowler, the goal is to answer those questions before a court does.

Because while AI may be the newest layer of the real estate stack, the industry's legal exposure is anything but new — and without clear rules, innovation could once again collide with litigation.
