
Sex Workers Tried to Warn Us About Age Verification Laws


A distorted canary layered with glitching code and an android eye in a composite image about sex work, surveillance and age-verification laws. (Composite image by Morgan Sung using images by GeorgePeters and Vito Cangiulli / Getty Images)

View the full episode transcript.

Requiring internet users to verify their ages before accessing mature content may sound reasonable. Shouldn’t we be doing a better job protecting kids from online vulgarities? But free speech advocates say the push for age verification isn’t really about protecting children — and that bills like the Kids Online Safety Act (KOSA) would open the door to greater surveillance, censorship and control of what people can do online. Those same free speech advocates say the evidence lies in what happened to sex workers after the passage of the bills known as Allow States and Victims To Fight Online Sex Trafficking Act (FOSTA) and the Stop Enabling Sex Traffickers Act (SESTA) in 2018. 

In this episode, Morgan is joined by writer, researcher and dominatrix Dr. Olivia Snow and Mashable associate editor Anna Iovine to explore the connections between porn, sex work and surveillance — and what age verification laws could mean for the future of the internet. 


Guests:

  • Dr. Olivia Snow, research fellow at UCLA’s Center on Resilience & Digital Justice
  • Anna Iovine, associate editor of features at Mashable


Want to give us feedback on the show? Shoot us an email at CloseAllTabs@KQED.org

Follow us on Instagram and TikTok


Episode Transcript

This is a computer-generated transcript. While our team has reviewed it, there may be errors.

Morgan Sung: So you know the messaging app Discord? Well, they recently announced that later this year, everyone on Discord will be a teen by default. Does this mean you’ll be transported back to middle school, confront your teenage bullies, and kiss your childhood crush like some kind of reverse 13 going on 30? Sorry, but no. It does mean that the way you can use Discord now might be very different. It’s part of a bigger push to “age gate” the internet. Discord’s new Teen by Default setting means that all users automatically get the teen version of the platform. So sensitive content is blurred out, and certain servers are off-limits. Discord said it’ll use both AI detection and human review to decide which servers are for adults only. How do you get past the age gating? Easy. Just upload a face scan or a photo of your government ID. 

Dr. Olivia Snow: So on its face, that seems like a pretty good idea. Like, I mean, who needs to be accessing adult content on Discord? Like, sure, we’ll all be safe, fine, but I mean none of this was ever about protecting children, ever. This is about data farming and mass surveillance. 

Morgan Sung: This is Dr. Olivia Snow. She’s a researcher at UCLA’s Center on Resilience and Digital Justice, where she studies sex work and algorithmic surveillance. And she’s writing a book about this topic. 

Dr. Olivia Snow: And I’m also a dominatrix. 

Morgan Sung: It turns out, being a sex worker has become increasingly perilous on today’s internet. As a sex worker herself, Olivia has seen firsthand the way platforms have targeted and surveilled sex workers, even if they aren’t posting explicit or sexual content that violates the site’s rules. She says that Discord’s new age verification policy raises a lot of red flags about privacy. 

Dr. Olivia Snow: By requiring ID, like on one hand, that can prove that you’re of, you know, the right age. On the other, it also provides a digital footprint of the content that you are consuming, which under our current administration can be really dangerous if that content happens to be, for example, like queer-related. If it’s organizing around racial justice. Now Discord could potentially just offer up a list of names. 

Morgan Sung: Discord said they’re offering privacy forward verification options. They claim that facial scans would never leave the user’s device and that IDs would only be used to verify age. They also said that users’ real identities would never be associated with their accounts and that their third-party vendors wouldn’t store any of this verification data. It’s all supposedly deleted right after users are age-checked. 

Dr. Olivia Snow: Of course they’re not doing that, but like there have been multiple reports of that data getting breached and leaked. And you know, how would that happen if they were getting rid of our data? Oh, right. They aren’t. They’re selling it. 

Morgan Sung: Yeah, Discord had a major data breach last year that exposed about 70,000 users’ government IDs. The company initially enforced age checks in the UK and Australia last year to comply with local social media regulations. But hey, the company said that the vendors they’re working with now had nothing to do with that huge violation of user privacy, so it’s all good now. Last week, we talked about Roblox, the super popular kids’ gaming platform and their new age verification policy.

That came on the heels of dozens of lawsuits against the company over allegations of predators grooming children on the platform. Age gating is becoming the norm online as platforms face increasing pressure to keep kids from seeing potentially harmful content. With age verification laws sweeping the UK, Australia, much of Europe and even here in the U.S., free speech advocates are sounding the alarm about censorship and surveillance. 

Dr. Olivia Snow: Of course we want to protect children. We always want to protect children, but that’s not what the legislation is actually about. If any legislation were about protecting children, then we’d have like gun reform, but we don’t. It’s really about expanding the surveillance state and using protecting kids and protecting, you know, children’s purity, whatever, as an excuse. And it’s an excellent excuse. 

Morgan Sung: And there’s one group that’s been warning us about this exact issue for decades, sex workers. Today, we’re diving into the link between porn and the First Amendment and how the tactics first used to censor and surveil sex workers are now being used against everyone else.

This is Close All Tabs. I’m Morgan Sung, tech journalist and your chronically online friend, here to open as many browser tabs as it takes to help you understand how the digital world affects our real lives. Let’s get into it.

So what does sex work have to do with free speech? A lot more than you’d think. The UK’s Online Safety Act went into effect last year, which puts the onus on platforms to ensure that minors aren’t exposed to, quote, harmful content like porn or violence or self-harm. It’s a very broad and subjective umbrella, which means that all kinds of content can now be age-gated, like footage of police brutality against pro-Palestinian protesters. Or discussions of LGBTQ relationships. The UK’s Online Safety Act is responsible for the most recent and widespread changes, but it’s definitely not the first piece of legislation to require age verification.

Let’s start with a new tab. Why do I have to verify my age on Discord?

Joining me is Anna Iovine, associate features editor at Mashable, who primarily covers dating, relationships, sex, and sex work, and how they’re all linked to this current digital landscape. So she’s been covering the effects of age verification laws pretty closely. 

Anna Iovine: Very broadly, age verification laws require personal data, such as a facial scan or a government ID, in order to access content that might be deemed, quote, harmful for minors. So in a lot of cases, this has to do with pornography. And in the United States, around half of the country has these laws now, but they’re all different because they’re state laws. 

Morgan Sung: Louisiana was the first state to enact one of these laws back in 2023. 

Anna Iovine: And we’ve seen this moral panic around pornography really for years leading up to that point. And since then, there have been copycat bills of the Louisiana law. And then last year, the Supreme Court ruled that the Texas age verification law was constitutional. So that proved that age verification laws were here to stay in the country, at least for now. 

Morgan Sung: So the United Kingdom’s Online Safety Act went into effect last year. What are some of the unexpected places that we’re seeing this rollout? 

Anna Iovine: So some platforms that are not NSFW have started age gating their content, such as Spotify and now Discord. And even some subreddits have been age gated in the UK, such as Stop Drinking, which obviously doesn’t have anything to do with pornography. But the UK Online Safety Act deemed some content categories potentially harmful for minors, and addiction content does fall into that, even if you’re talking about recovery. Which is the issue with some of these laws: if you are discussing some of these quote unquote adult topics, you might not even be posting anything harmful, or you might be trying to get help. So that’s just one ill effect of these laws, but it’s spread way beyond pornography. 

Morgan Sung: Yeah, I mean, we’re seeing this across the world, like these laws initially targeted porn sites and sexually explicit content. But now we have to submit your face to Discord to chat with strangers. What is the logic behind trying to age gate some of this content? 

Anna Iovine: There are the outward purposes of these bills, which is to protect children. But in actuality, children are not protected when they have to scan their face and input their personal data, sensitive data, into websites that might not know how to hold this data. I don’t think that makes children safe. And I also don’t think it’s safe to prevent children from seeing certain types of content.

For instance, one category that falls under the Online Safety Act is eating disorder content, which can be very harmful. But if you’re in eating disorder recovery, why should you not have access to recovery content? I don’t think that algorithms or AI or whatever systems they use to filter out what content is, quote unquote, for adults, know the difference between what can be helpful and what can be harmful.

And I also think that, at least in the United States historically, moral panics have been outwardly centered on children, like, “Oh, think of the children.” Such as the satanic panic in the 80s. But really, I would speak for myself, I think it’s to chill speech and to chill sexuality and just blame it on like, “Oh, we cannot have this content around children.” But as a result, now adults have to input their personal data, adults may not have access to content that is their right to. 

Morgan Sung: Are these policies effective? Like, can’t you just use a VPN to get around it? 

Anna Iovine: Exactly. You can use a VPN, and the problem is that they are not effective. What people see when these laws go into effect is that searches for VPNs go way up, because people will figure out a way to get around them. That’s what happens with censorship. And in the case of porn sites that have to implement an age verification system in the US and in the UK, if a site is based there, there’s a high likelihood that they will comply or will try to comply. But otherwise, if the website isn’t based in one of these territories, then they might not comply at all. And that’s something that Pornhub has pointed out: if other porn sites are based in other European countries or what have you, why would they want to follow a law in a different territory? It doesn’t make sense. So people can either use a VPN or just go to a site that doesn’t comply with the law. 

Morgan Sung: While the use of age verification technology is relatively recent, the crackdown on porn really ramped up almost a decade ago. And similar to age verification laws today, the legislation that led to the porn crackdown made online platforms responsible for the content their users posted, sweeping changes that heavily surveilled and censored sex workers. We’re now seeing similar tactics being used against the general public. We’ll dive into the ripple effects of a pair of laws called FOSTA and SESTA. But first, a quick break.

We’re back, and we’re diving into the great porn ban of 2018. Let’s open a new tab. Are you ready to be surveilled like a sex worker?

Right now, age verification laws in the UK, Australia, and in the US are leading to a crackdown on porn in the name of protecting children. We saw similar restrictions years ago, but that was in the name of stopping sex trafficking. So back in 2018, President Trump signed a pair of bills into law called FOSTA and SESTA that drastically changed the internet. Here’s Mashable editor Anna Iovine again. 

Anna Iovine: So FOSTA-SESTA stands for the Fight Online Sex Trafficking Act and the Stop Enabling Sex Traffickers Act. And outwardly, it was to stop online sex trafficking, and it had bipartisan support, because if you said, “Oh, I’m not voting for the bill that stops online sex trafficking,” it looks really bad. But in actuality, this bill made sex workers less safe and did not do much for sex trafficking at all. I was actually looking at a report released in 2021 that found there was only one federal conviction of a sex trafficker from FOSTA-SESTA. But this made people less safe because as a result of FOSTA-SESTA, which was a carve out of Section 230, all these online platforms would now be liable for any content that is quote unquote soliciting or enabling prostitution or sex work. 

Morgan Sung: If you’ve been listening to the show for a while, you may notice that Section 230 comes up pretty often. It’s known as the 26 words that made the internet. Section 230 of the 1996 Communications Decency Act protects online platforms from liability for what their users post. It also protects platforms from liability if they choose to remove or restrict user content, even if it’s not criminal. That’s why platforms are allowed to remove hate speech. Now, section 230 doesn’t let platforms get away with criminal activity, like openly selling drugs. It holds platforms responsible for their own actions, but not for those of third parties. It means that if you have a blog and someone else leaves a comment that says, hey, buy drugs here, you aren’t liable for what they commented on your post. And they also can’t sue you if you delete their comment.

Section 230 enabled Facebook, Twitter, Reddit, Substack, and even the New York Times recipe comments section to become the vibrant town squares of discourse that they are today. Okay, maybe that’s optimistic. Whether you love or hate the incessant arguing online, that level of openness and exchange is only possible because of Section 230.

However, FOSTA and SESTA removed that legal immunity for platforms that “facilitate sex trafficking,” a phrase that has been interpreted very broadly. Platforms are now held legally accountable for any of their users’ activity that could be linked to sex work. One of the biggest sites to go down after FOSTA-SESTA passed was Backpage. It was kind of like Craigslist, an online bulletin board where users could advertise equipment rentals, seasonal gigs, and escort services. Backpage was notorious for its role in trafficking minors, and multiple investigations shut down the site. But FOSTA-SESTA also wiped out platforms for consensual sex work. 

Anna Iovine: As a result of this, people who wanted to do sex work suddenly found that they had fewer resources. They couldn’t talk to each other about clients or the like. They couldn’t vet clients online because all these avenues of finding clients suddenly went away. And if they wanted to, say, advertise their services or just live online and have an online presence like all of us do, they had their accounts banned or what’s called shadow banned, which means it gets deprioritized by the platform: it can’t be searched, they don’t show up on the explore page and such. So if anything, it actually was a boon to sex traffickers because it made sex workers more vulnerable. They were suddenly more disconnected from each other and from potential safer clients. And I think it made it easier for people who prey on sex workers, because sex workers had fewer online resources than they did before FOSTA-SESTA.

And there are multiple studies that say that as a result of FOSTA-SESTA, sex workers are less safe now. I also want to mention the ripple effects of FOSTA-SESTA: a ton of people who are not sex workers are also getting their accounts shut down or getting shadow banned, especially if they’re people of color or LGBTQ people and maybe show a little bit of skin. And these are all people who do not have power. Like Kim Kardashian can post whatever she wants, whatever scantily clad photo she wants. She can show it. She will not get de-prioritized because she’s rich and famous. But if an erotic artist or a content creator or, God forbid, someone who has an OnlyFans and wants to find some clients, they cannot post on Instagram the way that Kim Kardashian does. And that is a big effect of FOSTA-SESTA as well. 

Morgan Sung: This overcorrection has made it much harder to post anything related to sexuality or women’s bodies. And it extends into all online platforms. Pole dancing instructors have been banned for showing off their athletic feats. Sex educators have been banned for trying to raise awareness about STIs and birth control. Instagram bans images of, quote, female breasts that include the nipple, but makes exceptions for depictions of breastfeeding or posts about mastectomies. However, like many other platforms, Instagram’s automated moderation isn’t great at understanding context. So, content about breast cancer awareness is still taken down. 

Anna Iovine: And in terms of women’s health, we’ve also seen that on Meta in particular, women’s health ads get rejected, such as if you’re trying to sell period products or rather create ads for period products, you can get rejected for being sexual. And yet in a lot of cases, ads for male sex toys or erectile dysfunction medications often don’t have that problem. 

Morgan Sung: And it’s not just social media platforms. Payment processors, for example, don’t want to be held liable for potentially facilitating sex trafficking. So they’ve refused to service anything related to sex, from adult content creator subscriptions to the sale of sex toys. FOSTA-SESTA has rippled into every online service, and it’s affected sex workers on platforms totally unrelated to their work. Like how Airbnb flags sex workers and anyone close to them, like roommates and partners. Dr. Olivia Snow, the UCLA surveillance researcher and dominatrix, has been flagged on Venmo and Cash App. Even her DoorDash account was suspended a few years ago. 

Dr. Olivia Snow: So one day I had a friend who had just moved to Los Angeles and she was struggling and like living in a hostel and I was like, “Yeah, I’m gonna send her a sandwich. Like I’m going to be a good pal.” And I like get on DoorDash and I’m sending her a sandwich and it just suspended my account like as the transaction was about to go through. And then I got an email about it that was like your account has been suspended due to like X, Y, and Z. I was flagged as a high-risk user the same way that I’ve been flagged on Venmo or Cash App, but Venmo and Cash App make a little bit of sense because I’ve received tribute from clients on those. I haven’t received the same on DoorDash, but it’s still the same technologies that were able to flag me. 

Morgan Sung: How do platforms know? According to Olivia, algorithmic surveillance. Like everyone else on the internet, sex workers have been tracked across platforms. All of this data has been used to identify certain users as high risk. It’s not unlike the way that platforms track user activity to figure out what their specific demographic likes to buy. For example, Olivia’s cat made an appearance right before we started recording this interview. 

Dr. Olivia Snow: We were talking about my cat earlier, right? It is not unlikely that after me saying that, or you hearing that, or anyone watching this, if they open up Instagram, they’re gonna get an ad for cat food, right? They share data and they sell it to each other, mainly to better market to us. By us, I mean like everyone who’s using the internet. They know what device we’re on, they know our phone number, they know our email address, they know our credit card number, our social security number. So when you have these constellations of data points that we are willingly sharing with these platforms, it’s really not too difficult to link these things together. 

Morgan Sung: Olivia often compares sex workers to “the canary in the coal mine” when it comes to surveillance. She said sex workers are often the test population for data collection, surveillance, and censorship. 

Dr. Olivia Snow: One thing I really love about the canary in the coal mine metaphor is that the way canaries functioned in coal mining was that it was when the canary stopped singing that you wanted to take note, because that meant that the oxygen levels weren’t enough to sustain the canary’s consciousness. So it’s not like, “Oh, I finally hear the canary,” which is how I feel it’s often misinterpreted. It’s more like, “Oh, the canary’s been singing and now just isn’t anymore.”

And that is openly the plan for how to deal with sex workers on the internet and on various technologies. We don’t want to see porn. And then when we don’t see the porn, we know that our content moderation is working. But the people who you’re moderating out of sight are the same people who are saying, “Hey, you know, this is going to get used against you next,” but you don’t hear that because you can’t hear them anymore. 

Morgan Sung: Let’s look at a real-world example of this: surveillance and censorship after Roe v. Wade was overturned. FOSTA-SESTA cast a very broad net around the quote, facilitation of trafficking, which meant that platforms cracked down on anything related to sexuality. They responded similarly after the Dobbs decision in 2022. 

Dr. Olivia Snow: We almost immediately saw this censorship expand to people sharing information on how to access abortions, how to access contraceptives, safe sex in general. There’s no language in there that criminalizes talking about abortion on Instagram, especially if you’re in a state where abortion is protected. But, out of, you know, whether it’s an abundance of caution or plausible deniability or a genuine desire to silence activists or, you know, censor this information and stymie its circulation, platforms just started going after abortion and safe sex content, you know, pretty immediately. 

Morgan Sung: Olivia pointed to the Texas Heartbeat Act, which effectively bans abortions after six weeks. The law also allows private individuals to sue anyone who performs, induces, or aids and abets an abortion after the cutoff. “Aids and abets” is very broad. It can include clinic staff members, Uber drivers who take someone to an abortion clinic, or even friends who help pay for the procedure. The same methods used to identify sex workers as “high risk,” like tracking activity across platforms, collecting location data, tracking keystrokes, can also be used to flag anyone seeking or providing abortions, especially in states where it’s criminalized. 

Dr. Olivia Snow: So that certainly mimics FOSTA-SESTA in the criminalization of facilitation. And I mean, that rhetoric and that language and these other policies around prostitution already existing made it really easy to justify expanding that to people seeking or accessing or providing abortions. Another demographic that I’ve seen absolutely throttled is Palestinians and pro-Palestine activists. Like I noticed myself when I was tweeting about Palestine, I got far more severe shadow banning doing that than I ever did tweeting about sex work. But you know, the reason they were able to do that is because the infrastructure was already in place. 

Morgan Sung: So, FOSTA-SESTA led to a widespread suppression of sexual content, and much more surveillance, in an effort to stop trafficking. And now, a few years later, we’re seeing another legislative push to restrict the internet. But this time, it’s in the name of protecting children. The era of age verification is here. Since 2022, Congress has been trying to pass a federal law, similar to the UK’s, called the Kids Online Safety Act, or KOSA. Here’s Anna again. 

Anna Iovine: KOSA is interesting because it almost combines FOSTA-SESTA with age verification laws, because it requires platforms to have a duty of care to basically make it so children cannot see quote-unquote harmful content. And like the Online Safety Act, what qualifies as content harmful for children is really broad and includes harmful behaviors such as eating disorders and addiction, and also covers bullying and such, in addition to pornographic and sexual content. So what would this do? It’s so broad, I think it signals that people don’t know how the internet works, because it’s like, how would this even happen? How can you prevent children from seeing this? 

Morgan Sung: Anna thinks it’ll result in a combination of FOSTA-SESTA and age verification laws, which means that a lot of content will be wiped from the internet. 

Anna Iovine: I think a lot of this content would be removed. I think tech platforms would just delete everything. Or if they’re using different tools, I think they would install age verification systems in order to corner off this content from minors. So it potentially has the power to be worse. Granted, KOSA has been introduced and failed and reintroduced a bunch of times, so I don’t know the viability of this law, but it keeps coming back. So it does seem like it’s not going away, but it does have the potential to be scary because I think it’s internet policy written by people who don’t know how the internet works. 

Morgan Sung: So there’s a real dilemma here. Obviously, we don’t want to expose kids to sexual or harmful content. But do we have to give up our privacy for that? Let’s open one more tab. Protecting kids versus the First Amendment. Free speech advocates are very concerned about KOSA and laws like it. So how does age-gating porn affect the way we can interact with the internet outside of just sexual content? 

Anna Iovine: It seems like as age verification laws become more and more widespread, it will be harder to be anonymous online, which I think is a huge privacy and security concern. And it also is scary because, in terms of content that can be harmful for children, there’s a capacity for people to point at something that they don’t like, such as criticism of the government, and say “you’re actually bullying,” or to look at two people of the same sex and say that’s pornography.

So, it really has the capacity to chill free speech and make it so you cannot be yourself online or you cannot say what you want to say, and the internet could be fundamentally changed from what it was 30 years ago when Section 230 was passed. Whenever there is censorship, people will try to go around it, which is exactly what people are doing with VPNs. And now there’s a push to ban VPNs, or at least ban them for children, which, according to First Amendment and internet experts that I’ve spoken to, falls under second-order censorship. 

Morgan Sung: So a lot of lawmakers are offering assurances that they’re not trying to just ban porn with age verification laws. But in Project 2025, the blueprint for Trump’s second term, they literally did lay out a plan to ban porn and imprison sexual content creators. What is the environment right now around a so-called porn ban? 

Anna Iovine: They don’t want porn, I think because they don’t like it, they don’t like human sexuality, they think it’s disgusting, what have you. But I also think it’s because they know that if porn is banned, they can call things that are not pornographic porn and just chill speech that has nothing to do with explicit content. We’re already seeing it. There was a congressman who called Bad Bunny’s Super Bowl halftime show pornography, which I think is exactly the kind of thing that we should be paying attention to, because it shows that it’s not just about the explicit content. It’s not just about people being naked and having sex on camera. It’s about much more than that. 

Morgan Sung: Right. Is there an ethical way to keep kids from seeing sexual or harmful content without severely restricting the internet for everyone else? 

Anna Iovine: Yes, there is a way. It’s called device-level filters. Free speech advocates and sex workers and the like have been advocating for this method for years. A device-level filter is on the actual device, and it blocks all websites deemed “restricted to adults,” and you cannot get around it. You can’t use a VPN on the device. You can’t, like, suddenly, I don’t know, turn it off and turn it back on again and it’s gone. It is on the device. And yet, if you’re an adult, you won’t have this filter and you won’t have to submit your ID or a facial scan or what have you. So according to these advocates, it is the best way. And it’s also what Pornhub and its parent company have been advocating for for years. 

Morgan Sung: Why is sex so intertwined with free speech online? 

Anna Iovine: I think because it is so shamed in our society, because people in person don’t talk about sex a lot. We’re not taught it in schools. Our parents maybe don’t talk about it as much. So I think it’s really the third rail in a lot of these cases because it’s such a charged topic in our society. But human sexuality exists and it’s within us, and it’s natural to want to express it, whether it’s to talk about sex, learn about it, or watch people have it. And we also see that whenever technology advances, porn is the first thing to go on it. Pornographers were very early adopters of the internet. They were early adopters of VR, and now they’re early adopters of AI (if it’s done consensually; deepfakes are a whole different story). But I think it’s natural. It’s one of those things like food and sleep, and even though it’s not compulsory like those other two, I think it’s natural to want to discuss it, especially when it’s so squashed as a topic of conversation. 

Morgan Sung: Porn is often credited as the driving force behind the internet. The sex industry pioneered streaming, digital advertising, and e-commerce. Sex and the internet are so intertwined that it’s been enshrined in internet lore. Rule 34 is this meme that says, “If it exists, there’s definitely porn of it online.” And like we’ve heard throughout this episode, banning porn, or even restricting sexual content will affect everyone, regardless of whether you google smut. Olivia warns that teenagers will always be able to access porn if they’re determined enough, but these restrictions on sexual or harmful content will ultimately stop them from accessing information about safe sex or consent.

Many free speech advocates argue that it should be up to parents, not the government, to monitor and limit what their kids do online. In December, a House subcommittee advanced 18 bills that revolve around protecting kids online, including KOSA. The debate between protecting kids and protecting free speech and privacy is an ongoing fight. And Olivia said that it’s never been a better time to listen to sex workers. 

Dr. Olivia Snow: Because what they’re doing to us, they will do to you next and you’re not going to like it. It sounds excessive, it sounds like a moral panic, and yet all of these things that sex workers have been predicting for years are happening every day. 

Morgan Sung: Discord’s Teen-by-Default policy was supposed to start this month. It would require users to submit facial scans and upload their government IDs in order to access flagged servers. But last week, after significant backlash from users, the company announced that it’s postponing age verification requirements until the second half of this year.

In a blog post, Discord CTO and co-founder Stanislav Vyshnevskiy acknowledged that the platform “missed the mark” and said that he “gets the skepticism.” Discord’s age verification is still going to happen, but Vyshnevskiy at least promised more transparency and alternative options that don’t involve giving your face to a third-party vendor. Okay, let’s close all these tabs.

Close All Tabs is a production of KQED Studios and is reported and hosted by me, Morgan Sung. This episode was produced by Maya Cueva. It was edited by Chris Hambrick and Chris Egusa, who also composed our theme song and credits music. Our team includes Jen Chien, who is the director of podcasts. Additional music by APM. Brendan Willard is our audio engineer.

Audience engagement support from Maha Sanad. Ethan Toven-Lindsey is our editor in chief. Some members of the KQED podcast team are represented by the Screen Actors Guild, American Federation of Television and Radio Artists, San Francisco, Northern California Local.

This episode’s keyboard sounds were submitted by my dad, Casey Sung, and recorded on his white and blue Epomaker Aula F99 keyboard with Greywood V3 switches and Cherry Profile PBT keycaps. Do you like these deep dives? Are you closing your tabs? Then don’t forget to rate and review us on Spotify, Apple Podcasts, or wherever you listen to the show. Maybe drop a comment too. And if you really like Close All Tabs and want to support public media, go to donate.kqed.org slash podcasts.

Thanks for listening.

