
Couch F***** Memes vs The Truth


An empty couch with a pillow in a room with wood paneling.

This episode was originally published October 19, 2024.


In this episode of Close All Tabs, host Morgan Sung dives into the world of viral political memes with TechCrunch senior writer Amanda Silberling, unraveling the origins of the infamous JD Vance “couch rumor.” What began as a seemingly frivolous post on X about the Republican VP nominee spiraled into a meme with real-world consequences, shifting the tone of the race. Morgan is also joined by Imran Ahmed, CEO of the Center for Countering Digital Hate, to explore the fragile line between satire and disinformation in modern politics.

Want to give us feedback on the show? Shoot us an email at CloseAllTabs@KQED.org

Follow us on Instagram



Episode Transcript

 

Morgan Sung: Please note that this episode contains explicit language and adult themes, so listen with care. The way disinformation spreads online can lead to pretty horrifying real-world consequences. We saw that in September when false claims about immigrants eating pets in Springfield, Ohio, led to widespread disruptions and bomb threats. Now, the people spreading those claims face legal action.

News Anchor: A nonprofit has filed charges against former President Donald Trump and his running mate, Senator J.D. Vance, accusing them of spreading false claims about immigrants in Springfield.

Morgan Sung: We’ll have to wait and see if those charges stick to the former president and his V.P. running mate. But back over the summer, Vance himself was the subject of a disinformation campaign that felt far less dire. Remember that rumor that went around about J.D. Vance and his appreciation for couches? 

Jesse Watters: And if you’re going to accuse someone of having sex with a couch, you better have video.

Morgan Sung: That was Fox News host Jesse Watters defending Vance. Looking more closely at what seems like a silly rumor can actually tell us a lot about how disinformation spreads online. This all started when a shitpost on the social media platform X claimed that Vance admitted to, I don’t know, let’s call it “recreational couch penetration,” in his 2016 memoir, Hillbilly Elegy. This one-off joke blew up online, breaking through Meme Twitter and into the mainstream news cycle.

Speaker 1: You may be wondering why you’re seeing a ton of memes and posts about Donald Trump’s VP pick, J.D. Vance, and his sectional relationship with his…

Speaker 2: Couch. And J.D. Vance talking about childless cat ladies and having sex with a couch. You what? Yes, you heard that correctly.

Morgan Sung: Suddenly, people were clamoring to buy a copy of Hillbilly Elegy just to check if it was real. Google searches for “J.D. Vance” and “couch” skyrocketed. Even Tim Walz joined in with a certified dad joke.

Tim Walz: And I got to tell you, I can’t wait to debate the guy. That is if he’s willing to get off the couch and show up. So. 

Morgan Sung: Disinformation has always been rampant on Twitter, now X. But under Elon Musk’s leadership, it’s a lot easier to spread misleading information and outright lies. How did X allow this raunchy post, which has been proven untrue, to spread so far? And where do we draw the line between a funny meme and harmful disinformation? This is Close All Tabs, a special series from KQED. I’m Morgan Sung. I’m a tech journalist, your chronically online friend, and your guide to the weirdest and most fascinating corners of the Internet. Together, we’re diving into election memes, disinformation campaigns, political influencers, and we’ll open as many browser tabs as it takes — all to better understand how the digital world affects our real lives. Okay, let’s dive in. Open new tab. How did the couch rumor go mainstream?

Amanda Silberling: So I saw the rumor first like the day that Donald Trump announced that J.D. Vance was his running mate. 

Morgan Sung: This is Amanda Silberling. She’s a reporter for TechCrunch and the co-host of the internet culture podcast Wow If True. And she is very online, which means she knows a lot about memes. And right after news broke that J.D. Vance was Trump’s VP pick, Amanda’s friend, who’s also a reporter, sent her a screenshot of an X post.

Amanda Silberling: Which said, quote, “I can’t say for sure, but might be the first VP pick to have admitted in a New York Times bestseller to f****** an inside-out latex glove shoved between two couch cushions. Parentheses: Vance, Hillbilly Elegy, pages 179 to 181.” Which came from an account with 1,700 followers on Twitter, which is, like, not a lot for something of this magnitude. Like, this is, like, a mid-level, like, fandom account on Twitter. And I remember seeing this and I was with another friend, and I, like, turned to them and was like, “Oh my God, this is so funny. Like, can you believe it?” Because it does sound like something that you would write about in, like, a coming-of-age memoir about growing up in, like, rural Ohio slash Kentucky, I don’t remember exactly. But, like, it’s something that’s kind of plausible to happen. So my first reaction was, “Haha, this is so funny.” And my next reaction was, “Wait a minute, I’m a journalist. Is this real?”

Morgan Sung: Amanda went digging to see for herself if the couch story was real. So here’s page 180 of Hillbilly Elegy, narrated by Vance himself. 

J.D. Vance (audiobook): Years later, I looked at my wedding party of six groomsmen… To a man, all of them had found careers outside of their hometowns, and none of them had any interest in ever going back.

Morgan Sung: And no couch f******. So Amanda moved on with her life and forgot about the rumor. A week later, Joe Biden dropped out of the race and endorsed Kamala Harris. Kamala-themed fan edits, memes, and pictures of homemade merch flooded X. And then the couch rumor started floating around again. Someone tweeted a picture of what they said was a page from Hillbilly Elegy. This is how I first saw the rumor. And I was like, “I guess it looks pretty legit.” Like, it looks like something my old college professor would send out, like some PDF scan of our reading for the class. But how would you describe it?

Amanda Silberling: It’s slightly pixelated, but it’s very legible, and it is slightly off-center, which, I think it’s very… it’s giving, like, “TA scanned this 20 minutes before class” vibes. Do we want to do a dramatic reading of that page? Yeah. “Years later, I looked at my wedding party of six groomsmen and realized that every single one of them had, like me, f***** a couch. All of us had found ourselves beheld by the eroticism of two cushions side by side with that lush, inviting valley between.”

Morgan Sung: In the middle of the night, Amanda woke up to a text from the friend who sent her the original couch post. This time, she sent a screenshot of the fake page. Was this real?

Amanda Silberling: And then I’m up for, like, an hour being like, “I need to prove this is fake. Like, this can’t be real.” And basically everybody was replying to it like, “Oh my God, this is in my copy too. Like, I have the first edition and it’s here. Like, wow, it’s so crazy.” And, like, these people are all in on the joke.

Amanda Silberling: Right? I mean, there is, like, a TikTok going around of a guy being like, “I work at Barnes & Noble and, like, this page is just blank.”

Speaker 3: And the funniest thing is that in the original book, there’s a whole scene where he f***** a couch. It is supposed to be page 178, right? Tell me why it’s a blank page. They cut it. They cut the couch f*****.

Morgan Sung: But at the same time, like, there are so many people who don’t, I guess, have meme literacy, or shitpost literacy, who are just taking it as fact.

Amanda Silberling: Yeah. And then the AP publishes a story called “No, J.D. Vance did not have sex with a couch.” And it’s one of their, like, AP fact-check stories. And of course, everyone is like, “Wow, this is so funny. What a great headline.” My reaction to it was like, “Wow, I will never write a headline as good as ‘No, J.D. Vance did not have sex with a couch.’”

Morgan Sung: But then the AP retracted the article, which only fanned the flames. It’s pretty rare for newsrooms to just take down a story. If there are corrections, or if the reporting didn’t meet editorial standards, they’ll usually leave it up but then add a note at the top, or sometimes the bottom, of the page.

Amanda Silberling: Pulling a story is, like, the nuclear button. Like, the Oppenheimer move, for lack of a better word.

Morgan Sung: Right. And like the fact that they retracted it made people believe the story even more because they were like, well, it must be true then. 

Amanda Silberling: And all they said about it was, quote, “The story, which did not go out on the wire to our customers, didn’t go through our standard editing process. We are looking into how it happened.” And then, like, this just gives people on the Internet more room to speculate. And the answer that people come up with is, “Maybe the reason why they pulled the article is because we cannot definitively prove that J.D. Vance did not write in Hillbilly Elegy that he f***** a couch, but we don’t know what he was doing.” Like, I don’t know, like, we don’t have cameras on him full time watching all the weird stuff he did when he was an adolescent boy in, like, Middle America.

Morgan Sung: Amanda said that according to Google Trends, searches for “J.D. Vance” and “couch” spiked after the AP took down the article. But before they even published that story, massive blue-wave liberal accounts on X had reposted the couch affair as fact. Okay, let’s talk about how this meme spread. New tab. Who was reposting the couch meme?

Amanda Silberling: When I was looking at who was actually talking about this before the AP reported on it, it was, like, MSNBC parents and, like, “vote blue no matter who,” just kind of the very “pink hat at the Women’s March” kind of liberals. Like, these people have millions of followers and they have a wide influence.

Morgan Sung: Even Democratic politicians who had stuck by Michelle Obama’s “when they go low, we go high” mentality up until now were cracking couch jokes.

Elizabeth Warren: Trust Donald Trump and J.D. Vance to look out for your family? Shoot, I wouldn’t trust them to move my couch.

Morgan Sung: But Amanda says one poster in particular caught her attention. Dan Savage, the longtime sex columnist and LGBT rights activist. 

Amanda Silberling: I thought it was so interesting that Dan Savage, of all people, was one of the spreaders of it because he’s the Santorum guy. 

Morgan Sung: In 2003, Republican then-Senator Rick Santorum made deeply offensive comments comparing gay sex to bestiality. Dan Savage wrote a column denouncing the homophobic comments and then took it a step further. He held a competition to coin a new definition for the word “santorum.” You can look up the winning definition for yourself, but I can tell you that it has something to do with bodily fluids. This new definition was so popular that whenever someone searched for Santorum, that’s what they’d find at the top of their results.

News Anchor 2: One of the top search results for Santorum on Google is a site called spreadingsantorum.com. And it’s got a big brown blotch on it. And we can’t say more than that.

Amanda Silberling: And because it’s 2003, search engine optimization is, like, wizardry. This is not something that there are professionals doing. So this basically tricked Google into thinking that the definition of santorum was legitimate.

Morgan Sung: More than 20 years later, it’s still at the top of Google results. And that’s why Dan Savage’s response to the couch rumors really stuck with Amanda. 

Amanda Silberling: He’s been trying to manipulate the Internet against Republican candidates for, like, 20 years. He knows what he’s doing. He posted one thing about the couch f*****, and then in a separate tweet, he was like, “LOL, this is definitely not real, but whatever.” I don’t know exactly what he posted, but he seemed to be in on the joke. So, I don’t know. I mean, I think this is why the couch f***** incident is so interesting to me, because I do think it falls in a moral gray area where, like, I simultaneously think it’s really funny, but I also think it sets a bad precedent.

Morgan Sung: It’s important to note here that the couch rumor started and spread on X. Other mainstream social media sites at least try to crack down on fake news and hate speech, but X has become a hotbed for disinformation. Brazil actually banned the site earlier this year over its refusal to stop the spread of political disinformation. Bot accounts are spreading conspiracy theories and sowing discord ahead of elections around the world, like anti-migrant “great replacement” posts in the United Kingdom, or, in Rwanda, hundreds of accounts suddenly sharing messages in support of the incumbent president. Here in the U.S., we have those false claims of pet-eating in Ohio. And both Donald Trump and Elon Musk have tweeted AI-generated images with no attribution. Why does this site in particular breed disinformation? We’ll open a new tab on that after this break.

Morgan Sung: So what is up with X and disinformation? Let’s open a new tab. X’s disinformation problem. It wasn’t always this bad. Disinformation existed on Twitter before, but the site made active efforts to curb it. They banned political advertising, had active moderators, and rolled out features to report election misinformation. But that all changed in 2022, when Musk took over. To get into this, I reached out to Imran Ahmed. He’s the CEO and founder of the Center for Countering Digital Hate, or CCDH, an international nonprofit organization tackling hate speech and disinformation on social media.

Imran Ahmed: It’s cost-free for social media platforms to be the primary vector of hate and disinformation. There’s no other industry in America — there’s no publisher — that cannot be held liable for what they publish, apart from social media companies.

Morgan Sung: Imran says the CCDH wants to change that. But the organization has been criticized by conservatives who claim that it promotes censorship. X actually sued the CCDH for loss of revenue after the organization reported on hate speech and disinformation on the platform. A federal judge dismissed the lawsuit this year and, ironically, said that X was the one trying to punish free speech. Imran says the platform actually incentivizes disinformation.

Imran Ahmed: So it would help to start by defining the terms. Misinformation is wrong information, right? But disinformation, which is another term that people will have heard being used, that’s intentional, and therefore it goes beyond being wrong to lying. And quite often, the bulk of the stuff that’s being posted is actually people reposting or engaging with or repeating disinformation. So the bulk of it is misinformation. It’s people who are just wrong, who may have swallowed the initial lie, which was spread by someone who was doing it with quite distinct intent. And in this election, we’ve seen a continuation, if not expansion, on some platforms like X of the volume of disinformation and misinformation. And it’s the confidence with which disinformation actors are able both to post freely without hindrance and to be amplified by the platforms’ own algorithms, which give the advantage to contentious information, disinformation, hate that generates an emotional reaction from most people. And it’s that engagement that they’ve realized keeps people on the platforms arguing and shouting at each other, screaming at each other, sometimes trying to stop people from spreading hate and lies. That’s the stuff that they’ve realized is so profitable to them. So we have seen not only bad actors becoming more sophisticated in using these platforms, but the platforms themselves becoming weaker at enforcing their own rules. So it may surprise you to know that since 2016, things have actually gotten worse, not better, at the platforms.

Morgan Sung: You know, you mentioned 2016. That was actually my next question. Misinformation and disinformation had always existed online, but in more fringe areas of the Internet. 2016 seemed like the turning point for that to go mainstream and, like, break into, you know, where everyone else was on the Internet.

Imran Ahmed: Well, look, I think in that election, in that year, a number of bad actors realized that these platforms are completely vulnerable. Like, I think people in the past had thought the platforms try their best to clean up this stuff. That was the year that we realized otherwise, and it’s because of a series of events. You have the U.S. elections, where there was a vast amount of disinformation flowing and there was foreign state involvement. But for me as a British person, you know, in the U.K. back then, we’d just had the Brexit referendum, where we’d seen a vast amount of disinformation flowing through the country, targeting Black and Muslim communities with lies. We’d seen a massive rise of anti-Semitism on the political left in the UK. We’d seen a political assassination, which is unprecedented in the UK, of my colleague and my friend Jo Cox, MP. So we’d seen these things happening, and there were these fractures happening all over politics, all over the world. And 2016 was the year when we realized, holy moly, it’s not just happening in one place, it’s happening everywhere simultaneously. And when something happens everywhere simultaneously, that’s because it’s not down to one bad actor. It’s down to the system being broken. Right? It’s like, if an iceberg melts, does anyone say, “Well, that’s just the iceberg deciding to melt, I don’t know why, guys”? It’s not. It’s because of something called climate change, and that’s fundamentally changing our atmosphere, changing our planet, the topography, the systems underneath. Something more fundamental is happening. And that’s the year that we realized.

Morgan Sung: Right? The year we realized we had allowed these social media giants to operate unchecked.

Imran Ahmed: Yeah. It’s the year that we realized that disinformation was a real systemic global problem that was destabilizing democracy and rolling back our ability to have cohesive communities.

Morgan Sung: I know that misinformation and disinformation had existed on Twitter for a long time, particularly in 2016, but it seemed like following 2016, Twitter made some effort to curb it, you know, whether by installing reporting systems or stricter moderation. But under Musk’s leadership, X has rolled back a significant amount of that. Can you talk about how misinformation and disinformation have worsened since Musk’s takeover?

Imran Ahmed: Well, Musk did three things when he took over. First of all, he signaled to people that you can say and do whatever you want on our platform. And, you know, we showed that in the week after he took over, the number of times the N-word was used on X tripled compared to when it was Twitter. So there was a tripling in the usage. And, you know, we’ve seen disinformation increase substantially.

Morgan Sung: Imran says they’ve also seen that so-called blue checks are more likely to post disinformation. And by blue check, he’s talking about the people who pay for X premium, the monthly subscription service that comes with a suite of special features, including a verified checkmark. 

Imran Ahmed: Because they get that additional algorithmic boost in visibility, we’re seeing more and more disinformation in our feeds. Then the second thing he did was let back onto the platform tens of thousands of people who’d been banned previously for being serial violators of the rules. And Twitter still has rules; X still has rules on the platform. They have to, because no one would advertise there if, you know, there were no rules at all. The third thing he did was unveil himself as the chief disinformation officer of Twitter by spreading disinformation himself. I mean, we did a study showing that 50 of his posts about election disinformation have reached an audience of 1.2 billion views. And that is deeply disturbing.

Morgan Sung: X rolled back many of Twitter’s existing features to curb misinformation, but at least it kept Community Notes. Community Notes, for those who don’t know, allows users to add their own fact-checking that will appear at the bottom of a post. According to X, over 500,000 users contribute to Community Notes. But critics point out that it can’t replace a paid, trained moderation team. What are your thoughts on Community Notes? I mean, that’s the one feature he’s added. But, you know, it puts the onus on the users to moderate themselves, to fact-check themselves.

Imran Ahmed: So I don’t think it’s a bad idea. I mean, I think it’s quite innovative, and quite clever, but it’s not the solution. I mean, the solution to everything is to have, like, moderation, to have a reporting platform, so you can report posts, someone has a look at it if it’s bad, you know, to ban people that are really problematic. Like, you know, it takes effort to keep a community healthy, and real debate requires healthy — it requires a healthy community in which everyone feels they can express their views. A debate is not someone posting the N-word to you 500 times, nonstop, after you post. That’s what happens on X. It just is not conducive to the public discourse that we need in a healthy democracy. So I do think that we have this particular systemic problem on X.

Morgan Sung: Imran says a proliferation of generative AI, including Grok — X’s own AI chatbot and image generator — has decreased the cost of producing disinformation. That makes the truth even more difficult to discern.

Imran Ahmed: It used to be that you couldn’t believe something unless you saw or heard it yourself. Well, now you can’t even believe what you see and hear because, you know, disinformation flows. And again, what does that lead to? Yes, some people believe the disinformation, but it also leads to a more general sense that you don’t know what’s true or not, because you don’t know if AI was involved in it.

Morgan Sung: Can you give any examples, on any social media platform, of successful features or successful attempts to curb misinformation and disinformation?

Imran Ahmed: Look, I think there are various individual techniques that could be used — inoculation, there’s working with the platforms. And I think some platforms are more responsive to research like CCDH’s and try to take steps. But the truth is that we have a systemic problem. So in 1996, Congress passed a law, Section 230. Basically, Congress said, you know, like news websites: if you run a news website, you’re responsible for the news that you publish. But the comments underneath, you’re not responsible for that legally. And social media companies came along and said, well, “Why don’t we just make a business out of the comments? And then we’re never responsible for anything.” So they can put out their products, and even if they cause harm, if they, you know, send your daughter thousands of images about eating disorders and how she’s too fat, there’s nothing that you can do to hold them liable. And it’s time for us to think about reforming that, because a healthy democracy requires a healthy information ecosystem.

Morgan Sung: Imran paints a pretty bleak picture of our social media landscape and our relationship to the truth. But what about untruths that don’t seem quite so diabolical? I mean, I admit it: the couch rumor was hilarious, even if it was untrue. What’s the harm in reposting it? Time for another tab. Is all disinformation harmful? I asked Imran about this.

Imran Ahmed: I don’t think disinformation is ever harmless. J.D. Vance is a vice presidential candidate. One person decides, “I’m going to spread a lie about him,” and that has gotten to millions of people and changed political discourse. And then you see other candidates talking about it in their convention speeches, even in major speeches in the election. And yes, it’s a joke. And yes, it’s so absurd that you wouldn’t believe it’s necessarily true. But you wouldn’t accept it the other way round, would you? If someone said that Barack Obama wasn’t born in America, he was born in Kenya and he faked his birth certificate, which is something that happened, we all got really upset. What’s the difference, exactly? And a system that allows for lies to spread at the speed of light, in fact, to get the advantage in spreading over the truth. That’s really problematic for us. It begs the question: would you want to run for election if you knew that the next week your opponent could spread a lie about you, and that could destroy your reputation and be something that hangs over you and your family, your loved ones and your children, for years, decades to come? And I think the reason why this particular rumor spread is because it was so bananas, you know, so lurid.

Morgan Sung: Right. 

Imran Ahmed: Like, yeah. He put on some gloves. 

Morgan Sung: Yeah. 

Imran Ahmed: He pulled his pants down. And he said “right, sofa.”

Morgan Sung: Right. 

Imran Ahmed: “You kinky little sod.” And the reason why it’s so outrageous is it sort of stimulates our prurient interest. That’s why it got seen by so many people.

Morgan Sung: Imran says studies show that content that violates the rules of a platform actually drives engagement. And when he says engagement, he means likes, shares, and other platform reactions. If you interact with a post, it doesn’t always matter if it’s a positive or negative reaction. It’s all still engagement.

Imran Ahmed: If I post “the sky is blue outside in Washington, DC,” everyone’s going to go, “Alright, whatever, dude.” If I post “the sky outside is purple,” or if I post that this morning I had salamanders for breakfast, you’re going to get a lot of engagement. And so what it says is that the closer you get to breaking social rules, or even the rules of the platform, the more violative the content is, the more engagement it gets. And it’s really stark: you get very, very low engagement until you get right to the edge of breaking the rules, and then it shoots up.

Morgan Sung: Wow.

Morgan Sung: So it seems like there’s almost an incentive, an engagement incentive to spread disinformation. 

Imran Ahmed: Because of the second part of the equation, which is that engagement equals amplification. So if you get high engagement, the algorithms say, hey, you know, there’s all this content that people post, a billion people, billions and billions of posts every week. How are we going to decide what we’re going to show people? Because we can’t show a billion people’s content simultaneously. So we’re going to elevate some content and we’re going to reduce the frequency of some content. And the elevation is given to that high-engagement content, because it keeps people on platforms, because it’s got high engagement, and so, therefore, you can show them more ads. And these are really just advertising companies. That’s all that these platforms make money from, is advertising. So how can we keep eyeballs on there for as long as possible? Pump the highest-engagement content. So that creates an incentive for the platforms to elevate the most engaging content, which they know is the stuff that’s basically BS: lies, hate, everything else. But it also gives an incentive to the people who are producing the content. If you want to be seen by lots of people, if you want to go viral, then post bullshit, because bullshit is the ultimate arbiter of virality.

Morgan Sung: Of course, disinformation isn’t the only content that goes viral. Still, it’s highly engaging. As of last year, posts containing misinformation aren’t allowed to be monetized on X, but that doesn’t stop them from spreading. You know, this couch example is a relatively lighthearted one, but it spread from a tiny account with 1,700 followers. What are the more serious ramifications of, you know, anyone with no following being able to spread a lie?

Imran Ahmed: I mean, I don’t want to get too serious about it, but the truth is that democracy itself is under threat by this phenomenon, not on an individual, meme-by-meme basis, but on a systems-wide basis. If people just give up on the truth, apathy is the natural next step. And that’s where tyranny takes hold, and civilizations do collapse. You know, American democracy is complex, and it underpins everything else: the performance of our economy, you know, our ability to have this rich tapestry within our society of identities, of people, of backgrounds. The melting pot that is America is reliant upon certain truths being inalienable. When truth disappears, I think that America will collapse, could collapse, very, very fast.

Morgan Sung: It seems like it’s very difficult for, like, a normal person, who isn’t a journalist like myself or, you know, an expert in this like you, to know who to trust and who not to trust. Like, how do you think people should address misinformation online? How do they make themselves less susceptible to it?

Imran Ahmed: Well, I mean, I’m also a human being that has a family, and I have things I like doing outside of work, so I can’t be bothered working out what’s true or not. And I’m not gonna fact-check everything I see. That would be the most depressing life on the planet, if everything you heard, you had to go and fact-check it. Holy moly. That would be… it would be unbelievable, like, how we live our lives. So, I mean, to my mind, use social media for what it’s good at: making you laugh. But don’t treat it like a serious place to get information. I mean, it’s like junk food. Every now and then, it’s fine. You know, it’s kind of great to have a Popeyes spicy chicken sandwich, but you can’t have that for every meal, right? Because it’s going to make you incredibly unhealthy. And that’s what our information diet is to us. We’ve become too reliant on fast information from social media, and, you know, our brains are starting to rot as a result.

Morgan Sung: Yes, these are all systemic issues that we’re dealing with here. Still, we all have choices to make as individuals.

Imran Ahmed: We’ve got to think about our own actions. Like we’ve learned when it comes to climate change and our physical ecosystem, each of us has a responsibility as well. Part of our responsibility is making sure that we don’t contribute to the pollution of our information ecosystem.

Morgan Sung: Okay, so within the close confines of my group chats, I’ll probably keep sending memes about the election. But at least I’ll think twice about posting them publicly, since, according to Imran:

Imran Ahmed: If you want to have it in the back of your mind that every time you hear J.D. Vance say something that you don’t agree with, just feel free to think to yourself, “But you are a couch f*****.” But saying it, publishing it, spreading it is undermining our democracy.

Morgan Sung: And as my responsibility to the information ecosystem, maybe I’ll community-note myself in the group chat. Just to be safe. That’s it for this deep dive. Let’s close these tabs. Close All Tabs is a production of KQED Studios and is reported and hosted by me, Morgan Sung. Our managing producer is Chris Egusa. Our producer is Maya Cueva. Jen Chien edited the series and is KQED’s director of podcasts. Original music and sound design by Chris Egusa. Additional music from APM. Audience engagement support from Maha Sanad. Katie Sprenger is our podcast operations director. Holly Kernan is our chief content officer. A special shout-out to the team at Political Breakdown for letting us share our episodes on their feed. We’d love to hear what you think about the series. Hit us up at podcasts@KQED.org. That’s “podcast” with an “s.”

