
AI-Generated Property Photos: The Disclosure Question No One Has Answered Yet
Real estate has always involved presenting a property in its best light. Professional photography, landscaping for curb appeal, a fresh coat of paint before listing — none of this is considered deceptive. The question being debated across MLS boards and legal circles right now is where "best light" ends and misrepresentation begins when AI is doing the enhancing.
The Spectrum of AI Modification
It's worth being precise about what's actually happening, because "AI photo manipulation" covers a wide range of practices with very different ethical weights.
At one end: basic AI enhancements like sky replacement, grass greening, and brightness correction. These have been used in real estate photography for years and are largely accepted as standard practice. The property still looks like the property.
In the middle: virtual staging of vacant rooms, which places AI-generated furniture and decor into photos of empty spaces. This is now widespread and generally disclosed — many portals have "virtually staged" labels that appear with these photos. The property is still accurately represented; it's just shown with hypothetical furnishings.
At the more contested end: AI modifications that change the actual physical condition of the property. Removing a power line from a backyard photo. Filling in a crack in a driveway. Replacing an aging roof in the image. These changes affect what a buyer expects to find when they arrive, and they're harder to justify as mere presentation.
Where the Rules Stand
The National Association of Realtors Code of Ethics requires that advertising be honest and not misleading, and state licensing laws similarly prohibit misrepresentation in marketing. These existing standards apply to AI-modified images, but they were written without AI in mind, and applying them requires judgment calls that aren't always straightforward.
Several MLS systems have introduced specific AI disclosure requirements, but there's no national standard. The most common requirement — a label or notation when virtual staging is used — addresses the middle of the spectrum but says little about the edges.
The Practical Stakes
A buyer who makes an offer based partly on photos that depict the property differently from how it actually exists has a potential misrepresentation claim. Whether that claim goes anywhere depends on the degree of modification and the applicable state law, but the legal exposure is real.
The cleaner standard — one that would protect agents and buyers alike — is simple: if an AI tool changed something that a buyer would care about finding in person, disclose it. The industry will get there eventually. The question is whether it does so proactively or in response to litigation.
- Jason