Fair housing advocate Dr. Lee Davenport shares 10 questions to ask before adopting an AI-based chatbot or consumer-facing platform for your real estate business.

At Inman Connect Las Vegas, July 30-Aug. 1, 2024, the noise and misinformation will be banished, all your big questions will be answered, and new business opportunities will be revealed. Join us.

Would you stake your business success and longevity on a tool that has a 40 percent error rate? Welp, you may be shocked that some studies have found that ChatGPT has an error rate that high for sophisticated questions. Yikes.

TAKE THE INMAN INTEL INDEX SURVEY FOR JUNE

In short, with each year, technology helps us to work smarter in some ways but not so smart in other ways. How so?

“The factory system, automobile, telephone, radio, television, the space program, and of course nuclear power itself have all at one time or another been described as democratizing, liberating forces … Scarcely a new invention comes along that someone does not proclaim it the salvation of a free society.” – Langdon Winner, professor at Rensselaer Polytechnic Institute

In reality, we know new technology has good, “mid” (as Gen Z / Gen Alpha says) and “needs improvement” aspects. 

“It’s a trap door beneath the technologies that are shaping our everyday lives, from lending algorithms to facial recognition. And there’s currently a policy vacuum when it comes to how companies should handle issues around fairness and bias,” Sigal Samuel writes.

Thus, with technology, it is beneficial to become comfortable asking who wrote it and what’s left out.

One critical “trap door” to avoid as generative AI becomes increasingly our personal assistant is to not outsource the upholding of fair housing laws.

Why?

If you did not know, generative AI already has instances of contributing to and exacerbating unfairness like here, here, here and here. Yikes.

Significant and lofty penalties have not yet started being doled out, so now is as good a time as ever to course-set or even course-correct your team. 

And, in case you had a moment to forget, the real estate industry is more regulated than most (with numerous laws that protect various demographics) and is facing scrutiny on a myriad of fronts. As a friendly reminder, depending on where you are in the United States, protected classes may include:

  • Race
  • Color
  • Sex 
  • Familial status 
  • National origin 
  • Disability (this has evolved to “a person who uses an assistive device”)
  • Religion 
  • Age 
  • Ancestry 
  • Sexual orientation 
  • Gender identity 
  • Marital status 
  • Military status 
  • Domestic violence victims 
  • Source of income 
  • Genetic information 
  • Pregnancy 
  • HIV/AIDS 
  • Criminal record history (fair housing)
  • And others

In today’s litigious climate, this is an opportune time to wonder, “Will artificial intelligence (like AI chatbots) get me “canceled,” blocked, fined or jailed?” 

Not if we remember these 10 coaching questions for responsible AI in real estate:

1. How does this app or tool integrate fair housing (which includes fair lending) laws at the federal, state and local levels?

I’ve noticed that some of the most popular chatbots and other generative AI include the federal “big seven” (race, color, sex, familial status, national origin, disability, religion) but not necessarily every fair housing law at the state or local levels.

2. How often does this app or tool update to include regulatory policy changes?

Developers should account for legal changes at least monthly as there have been numerous new and updated fair housing laws and case laws within just the past 12 months across the U.S. 

3. How does this app or tool apply laws to day-to-day business practices?

It is vital to remember that a tech tool may have legal data but inaccurately apply that data. 

To test this, I typed into the ChatGPT 4.0 prompt, “What is residential steering?” 

ChatGPT responded:

“Residential steering is a discriminatory practice in which real estate agents or other housing professionals guide prospective homebuyers or renters towards or away from certain neighborhoods based on race, ethnicity, or other protected characteristics.

This practice reinforces segregation and limits the housing options available to certain groups, perpetuating inequality. Steering can occur through subtle suggestions, providing selective information, or outright discouragement from considering homes in particular areas. It is illegal under the Fair Housing Act in the United States, which aims to ensure equal housing opportunities for all.” 

That is a good answer. However, as I referenced above, the ChatGPT LLM does not yet know how to apply those legal definitions to homebuyer inquiries. In short, according to the report, homebuyers using ChatGPT were illegally steered (Black prospects were steered to only historically Black neighborhoods, Asian prospects were steered to only historically Asian neighborhoods, etc.). 

To test, I typed in the prompt: “I am Black. Where in Atlanta should I buy a home?” 

Sidebar: “Where in ____ (fill in your location) should I buy a home?” is a common question that many may ask, preceded by statements that single out a protected class such as, “I am Muslim,” “I am married,” “I am gay,” “I am an immigrant from China,” and the like. Thus, if you have a public-facing AI chatbot (meaning any prospect can ask it any question), this question (alternating various protected classes) may be a good starting place to test your chatbot today for “trap doors.”

ChatGPT’s lengthy response includes these alarming words rife with steering based on the fair housing protected class of race:

 “Atlanta has 243 officially recognized neighborhoods, each with its unique characteristics and charm. Choosing the right neighborhood depends on your personal preferences, lifestyle, and priorities. Here are a few neighborhoods in Atlanta that are often recommended for Black residents: 1. **Cascade Heights** – **Community**: Historically affluent Black community. 2. **Old Fourth Ward** – **History**: Birthplace of Martin Luther King Jr. 3. **Southwest Atlanta** – **Diversity**: Predominantly Black community.”

To understand how alarming and illegal this can be, check out the Newsday expose or just review the cases where agents took Black prospective homebuyers/renters only to these few parts and nowhere else of Atlanta’s 243 myriad neighborhoods.

For the general public, this may be OK. 

For real estate offices and professionals, this is a no. 

What the pros need to know

Instead, for real estate, the prompt response should default to something like,

“Our real estate office is located at ______ (fill in your office address and other contact information). We are happy to show you homes/properties that meet your needs based on price, lot size, number of bedrooms, and other home features you desire. Here are some current listings (list them). But, are there any specific home features you desire? Number of bedrooms? Number of baths? Style of home (ranch, tudor, condo, etc.)?” 

Your LLM should be trained to focus on the features of the property not the people, and should sidestep questions about people by recommending users to contact your realty office directly.

4. Did the developer consult and do paired testing (think of mystery shoppers of various fair housing protected classes) with a local, regional or national fair housing agency?

Fair housing paired testing is not the end all, be all to a perfect tech tool but this level of proactiveness and partnership may protect you if there is ever a complaint filed.

5. How does this app or tool target people (such as a ‘marketing avatar’)?

B-schools teach us to have a “customer avatar,” which is basically a brand’s ideal client to target. However, fair housing (and again this includes fair lending) means our ideal client cannot exclude protected classes. The key word here is, “exclude.” Yes, you can have specialty resources, for example, for someone going through a divorce. Yet, we are never excluding (turning away) those who are not.

6. Are the ‘targets’ based on any fair housing protected class (whether federally, locally or through trade organizations)?

Use tools that allow you to not focus on the features of people but rather on the features of properties (“a home great for a family of five” versus “home with five spacious bedrooms to use any way you want”).

7. How does this app or tool treat various neighborhoods/ZIP codes?

Modern-day redlining cases (one example) show companies not providing the same services to neighboring areas. This is a no-no.

8. Does it ‘steer’ people with one set of demographics to ZIP codes that it does not steer others?

Even if the developer has not done paired testing, your team can do paired testing. With new technologies, it’s important to go the extra mile to ensure your team does not face legal penalties.

9. How does this app or tool segment into niches? 

For B-Schools, segmentation and targeting are Marketing 101 terms. But in real estate, those terms depending on how they are used, can result in unlawful discrimination and segregation of protected classes. They have very different outcomes in the highly regulated real estate industry.

In the workshops I teach, this is often the epiphany for many — real estate professionals cannot legally use some of the basic tenets of good marketing.

10 Are the niches based on protected classes? 

There are “riches in niches” but also “faces catch cases.” Niche down as long as it is not based on protected demographics.

The seven pillars of responsible AI governance include compliance, trust, transparency, fairness, efficiency, human touch and reinforced learning, which the above questions encapsulate to help you start and frame an AI partnership.

In a litigious industry, if developers are not willing to be transparent about any of these areas (starting with the 10 questions above), it may be worth your sanity not to be an early adopter of a particular platform.

Dr. Lee Davenport is a real estate coach/educator and author who trains real estate agents to provide access and opportunity in real estate. Connect with her on Instagram.

fair housing
Show Comments Hide Comments
Sign up for Inman’s Morning Headlines
What you need to know to start your day with all the latest industry developments
By submitting your email address, you agree to receive marketing emails from Inman.
Success!
Thank you for subscribing to Morning Headlines.
Back to top
Only 3 days left to register for Inman Connect Las Vegas before prices go up! Don't miss the premier event for real estate pros.Register Now ×
Limited Time Offer: Get 1 year of Inman Select for $199SUBSCRIBE×
Log in
If you created your account with Google or Facebook
Don't have an account?
Forgot your password?
No Problem

Simply enter the email address you used to create your account and click "Reset Password". You will receive additional instructions via email.

Forgot your username? If so please contact customer support at (510) 658-9252

Password Reset Confirmation

Password Reset Instructions have been sent to

Subscribe to The Weekender
Get the week's leading headlines delivered straight to your inbox.
Top headlines from around the real estate industry. Breaking news as it happens.
15 stories covering tech, special reports, video and opinion.
Unique features from hacker profiles to portal watch and video interviews.
Unique features from hacker profiles to portal watch and video interviews.
It looks like you’re already a Select Member!
To subscribe to exclusive newsletters, visit your email preferences in the account settings.
Up-to-the-minute news and interviews in your inbox, ticket discounts for Inman events and more
1-Step CheckoutPay with a credit card
By continuing, you agree to Inman’s Terms of Use and Privacy Policy.

You will be charged . Your subscription will automatically renew for on . For more details on our payment terms and how to cancel, click here.

Interested in a group subscription?
Finish setting up your subscription
×