
Waymo, Alphabet Sued for Bias After AI Allegedly Mislabels SF Doctor as Terrorist


Dr. Nasser Mohamed during Pride Sunday in San Francisco on June 26, 2022. The physician and civil rights activist is suing Alphabet and its subsidiary Waymo for denying him access to a public transportation service on the basis of AI software falsely flagging him as a terrorist. (Courtesy of Dr. Nas Mohamed)

After two years of trying and failing to sign up for Waymo, friends inside the company told Dr. Nasser Mohamed his Middle Eastern Muslim name set off the AI identity screening. But Dr. Mohamed alleges he couldn’t get a human to correct the error. So now he’s suing the company and its corporate parent, Alphabet.

In a lawsuit filed Tuesday in San Francisco Superior Court, the physician, who was born and raised in Qatar, claimed Alphabet, Inc. and its subsidiary Waymo, LLC, discriminated against him based on ethnicity, religion, and national origin when they denied him equal access to their services after their artificial intelligence-powered identity verification program erroneously identified him as a terrorist on the U.S. government’s Office of Foreign Assets Control sanctions list.

“My entire life and my background and my work are quite public,” Mohamed told KQED. “I’m a physician and an LGBT rights activist based in San Francisco, California. And I’m known for my work within medicine, but also in civil rights work.” He has served on the board of San Francisco Pride and was elected Grand Marshal of the 2023 San Francisco Pride Parade.


Mohamed’s grievance with Waymo and Alphabet goes beyond the companies’ use of overly broad matching criteria, which produced a “false positive” that flagged him as a national security risk.

His repeated attempts to get Waymo employees to override the decision failed. “Literally, there is no mechanism in place for me to pursue, to go and escalate this,” he said. “They were all dead ends.”

Mohamed is seeking damages and a ruling that would bar Waymo from using name-matching algorithms without human review.

A Waymo driverless taxi drives through Downtown San Francisco, California, on Nov. 2, 2023. (Carlos Avila Gonzalez/SF Chronicle)

In response, a Waymo spokesperson wrote to KQED, “We are committed to providing access to all in the communities we serve. We disagree with the claims made.”

In a statement posted on social media, Mohamed wrote, “This is not about conflict — it is about clarity, accountability, and ensuring that communities who have historically been subject to bias are not quietly left behind as technology evolves.”

“This case is not an anti-AI, anti-algorithm case,” added Shounak Dharap, Mohamed’s attorney, who teaches a course on applied AI for lawyers at the University of San Francisco School of Law, noting the case was brought under laws meant to protect Californians’ civil rights and prevent unfair business practices.

Companies in numerous industries are facing lawsuits seeking to establish their liability for discrimination involving artificial intelligence.

One example pending in federal court is Mobley v. Workday, in which a Black job applicant alleges the company’s AI-powered hiring tools discriminated against him and other applicants based on race, age, and disability.

The U.S. Department of Justice, along with California and other states, is suing RealPage, alleging that its algorithmic pricing software enabled landlords to collude and inflate rents.

“The same things that happen when people are in charge are gonna happen when algorithms are in charge of filtering information. But if there aren’t enough parameters and constraints, then we’re gonna be rolling back the time back to when we didn’t have the kind of civil rights protections we have now,” Dharap said.

Without more details from Waymo or Alphabet, it’s unclear how they are verifying customers’ identities.

“In fairness, I don’t know what Waymo is doing to verify identity,” wrote Hany Farid of UC Berkeley’s School of Information. “But if it is only doing a simplistic name matching, this is inexcusable because we now have fairly good technology to verify identity that is light years ahead of a simplistic (and lazy) name matching.”
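Neither the lawsuit nor Waymo has described the screening system in detail, but the kind of “simplistic name matching” Farid criticizes can be sketched in a few lines. The sanctions-list entries and customer names below are invented for illustration; real sanctions screening (and the OFAC list itself) is far more elaborate:

```python
# Hypothetical sketch of overly broad, token-based name matching.
# All names and logic here are invented for illustration only.

def normalize(name: str) -> str:
    """Lowercase and collapse whitespace so trivial variations still match."""
    return " ".join(name.lower().split())


def naive_screen(customer_name: str, sanctions_list: list[str]) -> bool:
    """Flag a customer if any sanctioned name shares even one token with theirs.

    Token overlap is exactly the kind of overly broad criterion that
    generates false positives for common names.
    """
    customer_tokens = set(normalize(customer_name).split())
    for entry in sanctions_list:
        if customer_tokens & set(normalize(entry).split()):
            return True  # flagged, with no human review or appeal path
    return False


# "Mohamed" is among the world's most common names, so a single shared
# token flags an innocent customer against this invented list entry.
fake_list = ["Ahmed Mohamed Ali"]
print(naive_screen("Nasser Mohamed", fake_list))  # shared token -> flagged
print(naive_screen("Jane Smith", fake_list))      # no overlap -> passes
```

The sketch also illustrates the remedy Mohamed’s suit seeks: because the flag is returned with no escalation mechanism, only a human-review step outside the algorithm could correct a false positive like this one.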
