Rachael Myrow: From KQED, this is Forum. I’m Rachael Myrow, in for Mina Kim.
Last week, Meta and YouTube were found negligent in a case centered on social media addiction. The argument at the heart of the plaintiffs’ case: the products themselves — the infinite scroll, the algorithms, the design choices that keep not just kids, but all of us, on the apps for too long until, as the kids like to say, we’re all cooked.
We all feel the addictive pull of these platforms. But after years of congressional hearings that went nowhere, and state regulation that’s nibbled around the edges of this elephant, the courts appear to be the stage where Silicon Valley may finally be forced to confront the question of accountability.
Joining me to break it all down today are two reporters who have been covering this: Jasmine Mithani, technology reporter for The 19th, joining us via Zoom from Los Angeles. Thank you for being here, Jasmine.
Jasmine Mithani: Thanks for having me.
Rachael Myrow: And Jeff Horwitz in studio, technology reporter covering Meta for the Reuters enterprise team. And I should also mention, the lead reporter on The Wall Street Journal’s “Facebook Files” series a few years back. Thank you, Jeff.
Jeff Horwitz: Thanks.
Rachael Myrow: Well, Jeff, why don’t we kick this off with you? This wasn’t the first lawsuit of its kind claiming behavioral harm at scale by design, but the lawyers said this verdict creates a playbook for how to win these kinds of cases. Does it, though? I mean, I’m thinking of a particular carve-out, Section 230, that has long shielded these companies. What’s your take?
Jeff Horwitz: Oh, we’re jumping straight into Section 230, are we? Okay.
Yes. So there are two of these cases. One, brought by private plaintiffs in Los Angeles, is viewed as sort of a test case for thousands of private claimants alleging that the company did not reveal the harm it knew its products were likely to cause.
And then another one that’s more focused on child predation, out of New Mexico, brought by the state attorney general.
Both of those cases, coincidentally, after running for years, reached jury verdicts within the last week. And in both instances, Meta was found responsible. And the penalties are substantial.
So, Section 230 — the longstanding liability shield that dates back to the Prodigy bulletin board era — protects platforms even when they choose to exercise some oversight over their services: they can’t be held responsible for harmful content that users post. That was the initial definition.
And it stayed pretty constant even as the world moved on from Prodigy bulletin boards and onto social media platforms — first connecting you to people you knew, and then to content from people you didn’t know.
That had been viewed as ironclad.
What the plaintiffs in both cases here — both the state of New Mexico and the private plaintiffs — went for was the idea that Meta had, one, knowingly designed its products in a way that was unsafe.
Think of it like, I don’t know, a baby crib that has bars too far apart and creates a strangulation hazard.
And then also that Meta had lied about what its products were likely to do. So they’ve said things like, “We made this safe,” when in fact there were people internally saying, “We have made this not safe.”
So this approach — this combination of consumer protection and bad product design — is being viewed as a potential end run around the Section 230 protections.
Rachael Myrow: Jasmine, set the stage for the months to come, especially this summer. Am I right in thinking there are about 350 families in the pipeline behind the plaintiff known as Kaylee?
Jasmine Mithani: Yes. There are thousands of cases, at both the state level and the federal level.
The case in Los Angeles, with plaintiff KGM, was the first of what they call bellwether cases: test cases meant to try out this legal argument and see what it could mean for those thousands of cases.
So there are two more bellwether cases coming this summer, and those will be consequential in determining what will happen for the thousands and thousands of plaintiffs.
But I do want to note that just because this one case was decided in the plaintiff’s favor — and we’re talking specifically about the individual plaintiffs, like the one in Los Angeles — it doesn’t necessarily mean that it’s an automatic win for anybody else.
There still has to be consideration of the specific circumstances, the specific liability, and the specific allegations of what kind of harm was done.
So it’s not quite as clear-cut.
But many tech accountability advocates are really heralding this as a turning of the tide.
Rachael Myrow: And, Jeff, of course, there are going to be appeals. Tell us more about that.
Jeff Horwitz: Yeah. Meta has promised to appeal both cases.
And I should mention the New Mexico case is kind of a two-phase thing.
Only the first phase — the jury’s monetary verdict — has been decided. That was $375 million.
The next stage is injunctive relief, where the state would be in a position to get a judge to order Meta to make product changes, at least for the state of New Mexico.
But obviously, it’s hard to change a social media platform in just one state.
So this is probably going to mean years more of fighting.
People have compared this to past product liability litigation — whether it’s related to the opioid crisis or tobacco.
Yes, addiction is a theme in some of these cases, though I’m not necessarily drawing a direct connection there.
But it should be a while until there’s real clarity as to exactly how this all plays out.
Rachael Myrow: Jasmine, I’ve heard predictions that a lot of cases are going to get consolidated across the country. Is that what you’re hearing too?
Jasmine Mithani: I think there’s the potential for larger-scale settlements, if that’s what you’re talking about.
The cases at the federal level have already been consolidated into multidistrict litigation.
And then at the state level, they’ve been consolidated in a similar way as well.
But, like I said earlier, all these cases are different.
So the questions of what kind of harm was caused, and who is really at fault, remain open.
Rachael Myrow: Open questions indeed.
Just a reminder to listeners: in the Los Angeles case, Meta has been ordered to pay $4.2 million — again, this is before the appeal. YouTube, $1.8 million.
Meanwhile, in New Mexico, a court ordered Meta to pay $375 million for misrepresenting child safety on its platforms.
I should mention these numbers sound big until you remember what these companies are worth.
Jeff, how likely is it that any of these cases will change the way the companies operate?
Jeff Horwitz: I mean, I hear you that for a company that did $160 billion in revenue last year, none of these are exactly breaking the bank.
And in fact, the afternoon these cases closed out, Meta’s stock was actually slightly up.
So I think how far this matters to them financially is unclear.
That said, look, New Mexico is 0.62 percent of the U.S. population. If you scale that $375 million out nationally, you’re multiplying it by more than 150, so we are talking a lot of money at that juncture.
And I think Meta has, for a number of years, understood that they are the lead dog in terms of being challenged.
Other companies have either settled or had less pressure.
So I think it’s likely that they’re already making some changes, and they certainly have been for several years in anticipation of this moment.
They’ve been launching Instagram for teens and really promoting that heavily.
Rachael Myrow: Oh, I should mention it’s $160 billion in revenue for Meta, by the way.
Friends, we’re talking about social media addiction and the role big tech plays in keeping users — especially young people — endlessly engaged on their platforms.
We’re talking with Jasmine Mithani and Jeff Horwitz, two fabulous tech reporters.
We want to hear from you. Keep listening.
You’re listening to Forum on KQED.