This partial transcript was computer-generated. While our team has reviewed it, there may be errors.
Jasmine Mithani: So the Take It Down Act has two parts. First, it criminalizes the publication and distribution of nonconsensual intimate imagery. And this means anything from, you know, a selfie that someone took that was meant to be shared privately and then was posted online without their permission, to nonconsensual deepfakes, such as those made with "nudify" apps, where a person is depicted undressed. And then the second part of the law creates a request-and-removal provision, where Internet service providers are required to have a user-friendly way for people whose nonconsensual intimate images are posted online to request that they be removed, and they must be removed within forty-eight hours.
Mina Kim: And I also understand that the Take It Down Act also covers threats of releasing images.
Jasmine Mithani: Correct. It’s not just after the harm has happened. It’s also trying to take into account if someone’s trying to blackmail someone, saying, you know, you must do this or else I’ll release these images, and trying to criminalize that behavior as well, so it’s not exclusively after the harm has occurred.
Mina Kim: And what’s the timeline for the implementation? Has it gone into effect completely?
Jasmine Mithani: So the criminal provisions of the law went into effect as soon as President Trump signed the bill. For the request-and-removal, takedown part of the law, Internet service providers have one year to implement that process. So that would go into effect, and the Federal Trade Commission could start enforcing it, in May 2026.
Mina Kim: And so the Federal Trade Commission is responsible for enforcement. What are the penalties if you are, say, the creator or sharer of that kind of imagery, or if you’re a company that doesn’t take it down in forty-eight hours?
Jasmine Mithani: There are fines, and for the people who posted the images themselves, there is jail time as well as part of the law. And for companies, there are fines.
Mina Kim: So is it initially a misdemeanor then?
Jasmine Mithani: These, I believe, are penalized as felonies.
Mina Kim: Felonies from the start. Paresh, can you remind us why some kind of effort like this was needed? How bad has the problem of nonconsensual imagery become?
Paresh Dave: Yeah. I mean, I spoke to the offices of two of the senators who really led the charge on this, Amy Klobuchar and Ted Cruz. And I think the inspiration for them was hearing from some of the teenagers, even, who were affected by nudify apps in their high schools, or by the sharing of nonconsensual intimate imagery that they had sent, maybe consensually, to someone, and then it spread far and wide from there. And, you know, these are people who have families themselves. These stories that they heard were very concerning to them, and there’s no reason anyone would want to support this kind of behavior. And so it’s an easy victory for both parties.
Mina Kim: And how fast and easy is it to create something like this and share it?
Paresh Dave: Far too easy. These apps are even advertised sometimes on Facebook and places where people are congregating all the time. You download the app, you upload a picture, and in a few minutes you might have a synthetic image showing someone in a state of undress. And then, of course, there are images that people take, maybe to send to an intimate partner. Things go bad with that intimate partner, and they start spreading those images online. That is also a common issue. And, of course, that can happen instantly, where you post it in one place and it spreads across the Internet really quickly, which is why the forty-eight-hour takedown provision is so essential.
Mina Kim: Yeah. You also wrote about how hard it is to get these removed. You profiled Breeze Liu and her story. Can you talk about how hard it’s been?
Paresh Dave: Yeah. So she was, you know, a victim. She learned while she was in college about a video of her that was online. Eventually, she worked with some organizations to identify that there were hundreds of links to different images or screenshots of her in an intimate state. She tried contacting various organizations and websites to get those images and videos taken down. Eventually, she landed on around a hundred images that Microsoft just would not remove. She and a colleague had to corner a Microsoft executive at a conference. And it was only after that, and about eight, nine, almost ten months, that the images were taken down. And this was years after the original situation. So it can be grueling, and not everyone has the wherewithal and the courage to stick it out through a process like that, because this is a very emotionally draining thing to deal with.
Mina Kim: Jasmine, what have you heard from advocates in terms of the emotional toll this can have?
Jasmine Mithani: So a lot of advocates have been calling this distribution of nonconsensual images digital sexual violence. Research has shown that the impact on survivors is similar to that on people who have been sexually or physically assaulted, you know, in real life. And that’s why this is being treated like such a serious crime, because of the long-lasting impacts, and because it can be very difficult to get this material offline, as in the case Paresh just talked about. It’s something that can plague victims for years.
Mina Kim: Which may explain what you alluded to earlier, Paresh, just the widespread support for the bill from both parties, for example. Tech companies were behind it too, though. Right?
Paresh Dave: Absolutely. Meta, and Microsoft, which I just mentioned, got behind it, which is kind of funny, because Microsoft took so long in that case I just explained, and here they’re advocating for forty-eight hours. But I think tech companies like to have some sort of regulation to fall back on. It’s something that takes away the liability from them in a way. And in this case, that is what this law does. If they do comply with the takedown provision, it sort of absolves them of liability down the line.
Mina Kim: There were some other groups that were interested in this bill. For example, religious groups, Paresh, were strong supporters. What was their reason for that?
Paresh Dave: So that is an interesting element to this. There are a lot of Christian groups, conservative groups, that for years have been advocating for the removal of sexual content from the web, or for more guardrails. We’ve seen age verification laws passed in a number of states in the country, where to access pornography websites you have to upload an ID or prove your age in some other form or fashion to show that you’re an adult. Those laws, again, were very much pushed and advocated for by these Christian groups, these religious groups. And you’re seeing the same here. There’s a concern among a lot of sex-positive groups, or sexual advocacy groups, that this law, the Take It Down Act, is kind of a wedge at the federal level to move toward banning all kinds of sexual content across the web.
Mina Kim: You’ve heard about that too. Right, Jasmine? Concerns from sex-positive groups and others?