
What the FTC's AI Guidance Means for Real Estate Marketing
The Federal Trade Commission has spent the past two years building out its AI enforcement posture, and while most of the attention has focused on consumer products and social media, a substantial portion of the agency's guidance applies directly to real estate marketing. Here's what's relevant, and what agents and brokerages should understand.
Endorsements and AI-Generated Reviews
The FTC's revised endorsement guides, which took effect in 2023 and have been actively enforced since, address AI-generated and synthetic testimonials explicitly. Creating fake reviews or testimonials — including those written by AI — is a deceptive practice under FTC rules. For real estate, this means AI-generated "client testimonials" that aren't based on actual client experiences are off-limits, regardless of how plausible they sound.
The FTC has also indicated that material connections between an endorser and the business being promoted must be disclosed. If an agent uses AI to solicit or amplify reviews in exchange for something of value, that's a disclosure issue.
Substantiation for AI-Generated Claims
Any marketing claim — whether written by a human or an AI — needs to be substantiated. An AI tool that generates listing copy claiming "the best value in the neighborhood" or "lowest days-on-market in the zip code" is making a factual claim that the agent publishing it is responsible for. The fact that an AI wrote it doesn't insulate the agent from liability if the claim is false or unsubstantiated.
This seems obvious in principle, but AI-generated marketing content can include claims that sound plausible yet are inaccurate. The agent who reviews and publishes the content is the responsible party.
Impersonation and Synthetic Media
The FTC has taken a strong position against using AI to impersonate real people without their consent — including synthesizing someone's voice or likeness for marketing. For real estate, the relevant scenarios are AI-generated video or audio that mimics a real person, and AI-generated "spokesperson" characters that imply a real human relationship that doesn't exist.
The line here isn't always obvious, but the FTC's frame is: would a consumer be deceived about the nature of who they're dealing with?
The Practical Upshot
None of this requires abandoning AI marketing tools. It requires applying the same standard of review you'd apply to any marketing communication: review AI-generated content before publishing, don't publish testimonials that aren't based on real client experiences, and verify factual claims. The FTC's AI-related enforcement is still in its early stages, but the direction is clear — and real estate, with its heavy marketing activity, is not an industry that will be able to claim it didn't get the memo.
- Jason