Ending Human Trafficking Podcast

349: Legislative Reform in the Fight Against Online Exploitation



Eleanor Kennelly Gaetan joins Dr. Sandie Morgan to discuss the critical need for legislative reform to combat online sexual exploitation, focusing on Section 230 immunity and emerging laws like the Take It Down Act.

Eleanor Kennelly Gaetan

Eleanor Kennelly Gaetan is director of public policy at the National Center on Sexual Exploitation in Washington, DC. She has been an advocate for stronger laws to fight sexual exploitation and played a role in passing key anti-trafficking laws, including the Justice for Victims of Trafficking Act and SESTA-FOSTA, which amended Section 230 to hold tech platforms more accountable for their role in enabling sex trafficking.

Key Points
  • Human trafficking was only identified as a crime in the year 2000 with the passage of the Trafficking Victims Protection Act, making it a relatively new field where small movements have achieved significant progress.
  • Eleanor witnessed firsthand in Romania how young women were lured abroad with false promises of legitimate work, only to be trafficked into commercial sexual exploitation, highlighting the critical need for proper victim services rather than detention centers.
  • The Take It Down Act represents a crucial breakthrough by criminalizing the uploading of non-consensual sexually explicit material for the first time and requiring platforms to provide real human help desks for removal requests within 48 hours.
  • Image-based sexual abuse creates ongoing trauma for victims because unlike other trafficking incidents that end, having images online means “you’re being raped and it’s online and you can’t get it down,” creating continuous retraumatization.
  • Section 230 of the Communications Decency Act, passed in 1996 when the internet was nascent, provides broad immunity to internet service providers and has been interpreted by courts as creating a “wall of immunity” for social media platforms.
  • The case against Twitter involving 13-year-old boys demonstrates how platforms monetize child exploitation material through advertising revenue while claiming Section 230 immunity protects them from liability.
  • California’s Age Appropriate Design Code represents one approach to reform by requiring companies to test products for age appropriateness before launch, using product liability law to sidestep Section 230 immunity issues.
  • Meta tracks children’s negative emotions and targets vulnerable youth with harmful content, including targeting kids who fear being “too fat” with eating disorder material, showing the deliberate exploitation of minors.
  • Bipartisan support exists for reform, with both Democratic and Republican senators preparing to introduce a bill to repeal Section 230, recognizing that tech companies are not policing themselves effectively.
  • The Social Media Victims Law Center currently represents over 4,000 families whose children have been harmed or killed due to social media platform irresponsibility enabled by Section 230 immunity.
  • Congressional education on online harms has accelerated with over 24 briefings since 2019, positioning the current Congress as potentially the most informed ever on these issues.
  • Federal guidance on best practices remains insufficient, with some jurisdictions like San Diego developing excellent collaborative models while others lack functional systems for moving victims into services.
  • Resources
    • Social Media Victims Law Center – Social Media Litigation Lawyers
    • Can’t Look Away: The Case Against Social Media
    • CDA230
    • Transcript

      [00:00:00] Welcome to the Ending Human Trafficking Podcast, brought to you by Vanguard University’s Global Center for Women and Justice in Orange County, California. I’m Dr. Sandie Morgan, and this is a show where we equip you to study the issues, be a voice, and make a difference in the fight to end human trafficking right where you are.

      [00:00:23] Today, I’m honored to welcome Dr. Eleanor Gaetan to the show. She’s director of public policy at the National Center on Sexual Exploitation in Washington, DC. She has been an advocate for stronger laws to fight sexual exploitation and has had a role in passing key anti-trafficking laws like the Justice for Victims of Trafficking Act and SESTA-FOSTA, which changed Section 230 to hold tech platforms more accountable for their role in enabling sex trafficking.

      [00:01:07] Sandie: Eleanor, we have been in the same movement for decades, and it is exciting to see how some of our long-held dreams have come to fruition.

      [00:01:21] And one of mine has been to have you on this podcast.

      [00:01:24] Eleanor: Oh, Professor Morgan, thank you so much. It’s really a delight to join the podcast. I’m speaking to you from Washington, DC. I know you’re there in California, and together we embrace all the advocates in between.

      [00:01:36] Sandie: Well, and for our listeners who have been longtime subscribers: my former podcast intern, Dallas,

      [00:01:46] is working with Dr. Gaetan. So it was full circle, both coasts, all of us hands held together in this work. It is hard work. It takes dedication and long-term determination. Some people might even say we’re a bit stubborn.

      [00:02:09] Eleanor: Certainly stubborn. You have to be persistent and stubborn. But the great thing about the field of human trafficking is that this was only identified as a crime in the year 2000. So let’s recall that there wasn’t a name for human trafficking until the year 2000 and the Trafficking Victims Protection Act.

      [00:02:27] So it’s a relatively new field, and yet the progress has been the result of human champions. It’s a small movement that has moved mountains.

      [00:02:42] Sandie: And you were instrumental in passing the first Trafficking Victims Protection Act here in the US. Tell me about that.

      [00:02:51] Eleanor: So I worked for USAID in Romania, and Romania was an example of a country where, when communism ended in 1989 and 1990, people had been trapped in their countries. People were desperate to travel, and they had lost a lot of jobs, so people really needed work and were seeking it abroad.

      [00:03:11] So young women were especially vulnerable to those promises of a babysitting job or a waitressing job or an elder care job around Western Europe and around the world, and so vulnerable to the promises of traffickers. We witnessed in Romania and Moldova entire villages of young women being lured abroad, and so many of them abused badly in both legal brothels and illegal prostitution around the world.

      [00:03:39] I mean, the US government was supporting a trafficking shelter in Bucharest, Romania, that I was helping to manage as a democracy officer. And it was shocking to me that it wasn’t a shelter; it was a detention center. So women who were being abused in, say, legal brothels in Germany were arrested and repatriated to their home country.

      [00:04:01] They’re 20 years old, they come back with nothing, and then they’re put in virtually a prison cell and told they’re supposed to stay in this shelter, but it’s a detention center. And of course they ran away. So I was witnessing the complete lack of services to help people who had been traumatized at a young age in the commercial sex trade around the world.

      [00:04:23] Sandie: Wow. And I was in Athens, Greece, at that time, and we saw the recidivism as the girls who were quote-unquote rescued just came back to us from Romania, from Moldova. I see their faces even as we’re talking right now.

      [00:04:38] Eleanor: But to answer your question, it was all those stories from people like you in Greece and me in Bucharest feeding into Washington. So Congress was hearing from both feminist and faith-based organizations about this terrible crime that really takes advantage of the hopes and dreams of youth. I mean boys too, as we know, but mainly women and girls.

      [00:05:04] And Congress was hearing that, and it became a bipartisan effort to exert US leadership to define this crime and end it. But at the beginning, as you well know, Professor Morgan, it was really treated as an international crime. The US didn’t look at itself and say, to what extent do we have human trafficking, both sex trafficking and labor trafficking, happening in our own country?

      [00:05:27] That came a little later, with iterations of the Trafficking Victims Protection Act. But it really was the reports of Americans abroad, of what you and I were seeing around the world, that caused the passage of the TVPA.

      [00:05:42] Sandie: Eventually that led to the ripple effect of similar legislation in dozens of countries, eventually over 150, and our US community there in Athens, Greece, was very much a part of eventually getting the Greek version passed before the Olympics arrived in 2004. And this really brings up how important legislation and policy are in our movement.

      [00:06:20] I know a lot of my students whose dream is, I’m gonna go and rescue girls, I’m gonna go and start a shelter, things like that. But the work that William Wilberforce did is a model that I have stuck to in my career of trying to bring together thought leaders and policy implementers. Tell me why policy is critical.

      [00:06:54] Eleanor: So we’ve just discussed a great example, in which human trafficking had to be defined legally before you could create a framework for preventing the crime, prosecuting the criminals, and helping the victims. Until we had that law, we didn’t have a frame of reference with which to prevent it and help those who suffer from the crime.

      [00:07:20] So policy is crucial to making prevention efforts, protection efforts, and prosecution efforts operational. What has happened, though, is that with the internet, trafficking is a perversely excellent example of how a crime changes as it moves online. It’s even more devastating today, the way people are groomed online, and then it moves into real life.

      [00:07:51] But the internet has been ruinous for many people who’ve fallen into the clutches of predators, and we don’t have a sufficient legal response to online trafficking.

      [00:08:06] Sandie: Well, in my world here, there was great celebration when Congress passed the Take It Down Act, and I felt like people sort of set their tools down and said, ah, now we’re done. But that’s not the case. We still have to fight the battle on the new front line, which is online, on the internet, where some folks estimate 80% of recruiting happens.

      [00:08:43] The stories of sextortion are heartbreaking and must be addressed. So tell me, what else do we need to do besides the Take It Down Act?

      [00:08:58] Eleanor: Wonderful. So the Take It Down Act was so crucial because it criminalizes for the first time the uploading of non-consensual sexually explicit material, and this was absolutely ruining lives, right? So, for example, I’m friendly with a woman who had no idea that her boyfriend took pictures of them involved in sexual relations.

      [00:09:22] She thought it was just happening in one room, and instead he made recordings. She had just turned 18. He posts them on Pornhub, and in fact, only years later does she get a call from a high school friend who says, you know, I think you’re on Pornhub.

      [00:09:40] And she looks it up. And indeed it is: a moment that she thought was private, and it’s online.

      [00:09:50] It immediately triggered in her so much anxiety and upset. She tried to contact Pornhub. There’s no human being you can reach to get material like that down. This is called image-based sexual abuse, and it wasn’t a federal crime until the Take It Down Act. So now it’s a federal crime, and the legislation also requires digital platforms to have a real-life help desk

      [00:10:18] where you can go and say, this is not consensual, take it down. And the platform is required to take it down within 48 hours. Now, this will probably remind you of copyright law. If I post three minutes of Beauty and the Beast on YouTube, it is immediately taken down, because copyright law is so embedded in our legal framework in the United States, and you cannot just rip off some Disney film and post it as your own

      [00:10:44] little personal snippet on YouTube. But the same coverage didn’t help individuals who had their images posted into the public. And of course, this friend of mine had that material circulated internationally. It showed up on 45 websites. There’s no way to take it down once it goes really viral and out into the world.

      [00:11:05] So post-traumatic stress disorder is common, losing jobs or relationships; it can be ruinous.

      [00:11:15] Eleanor: And interviewing women who’ve suffered image-based sexual abuse really underscores the fact that it feels like an ongoing trauma. So, you can have a trafficking incident.

      [00:11:26] It happened to you. You were violated by a predator. You had a horrible experience. It’s in the past, and you go through a healing process. But when images are up online, “you’re being raped and it’s online and you can’t get it down.” You experience that as ongoing sexual assault. So that’s why it was especially crucial to have an utterly bipartisan bill.

      [00:11:49] Eleanor: It was introduced a year ago by Senator Ted Cruz of Texas because of teenagers who were victims of deepfakes; the Take It Down Act also includes AI-generated deepfakes. So these teens in a high school in Texas had images made of them by a male student in their high school.

      [00:12:09] They couldn’t get it down. And they went to Senator Cruz, and he was really moved by their personal story. That’s the origin of the bill.

      [00:12:19] Sandie: Wow. Okay, so what’s the next step?

      [00:12:22] Eleanor: Right. So that criminalizes image-based sexual abuse. But there’s another bill pending called the Defiance Act. It targets the acts of uploaders, but it provides a civil right of action, so victims have a path to justice and can sue those who commit this crime.

      [00:12:45] In US law, of course, having both a criminal component and a civil component really helps amplify the weight of the law. People need to be convinced they really shouldn’t do this, but if they suffer no consequences, the criminality increases.

      [00:13:04] Sandie: And for the victim, sending someone to prison does give a sense of justice, but there is also the element of restitution, which is afforded more in our civil approach.

      [00:13:21] Eleanor: Exactly.

      [00:13:22] Sandie: So significant. So tell us about the Defiance Act.

      [00:13:27] Eleanor: Well, the Defiance Act is fascinating because we have Senator Durbin from Illinois, a Democrat, a very excellent leader in this field, and Senator Graham, a Republican from South Carolina. Again, we’re protecting that bipartisanship. On the House side, Representative Ocasio-Cortez is a lead sponsor together with

      [00:13:49] Representative Laurel Lee, a Republican from Florida. This is so important to our movement. You know, party label is irrelevant when this horrible thing happens to you. Image-based sexual abuse or human trafficking, it doesn’t matter what your politics are. So we’ve worked really hard on Capitol Hill.

      [00:14:09] We will only support bills that have both Democrats and Republicans in support. So Defiance has got that civil piece. So does a bill that Senator Blackburn from Tennessee and Senator Coons from Delaware propose, the NO FAKES Act. Now, this goes to a problem the music industry has identified: AI-generated voice likenesses can

      [00:14:38] be projected on the internet, and people don’t know the difference. A country music star, McBride, testified about this, along with a lawyer, Kristen Price. But the bill also allows an individual who suffers from image-based sexual abuse to sue for damages.

      [00:15:00] So there are a few approaches to the image-based sexual abuse problem, but think of car safety: we need seat belts, we need airbags, we need speed limits, we need driver’s licenses, all of the above, in order to protect people who are riding in cars. Well, the internet has been the Wild West.

      [00:15:21] There have been no restrictions on the internet. I can provide a policy explanation for why the internet is so dangerous, especially to our youth, and it also addresses the fact that the internet is an utterly unregulated market. Now, I personally am a small-government person.

      [00:15:40] I think we should have a limited government. However, this idea of liability is fundamental American jurisprudence: if you harm another person, you should be liable. That is one of the reasons we walk into a Walmart or a Target and there are no puddles on any floor, because those stores are highly aware of liability. If somebody slips and falls in that store, they will be sued.

      [00:16:06] And individuals get big settlements from slipping and falling, right? It is the job of that store to maintain health and safety for those who walk into it.

      [00:16:17] So we don’t have similar jurisprudence online. We don’t have laws to protect people online the way we do to protect people in the bricks-and-mortar reality of the three-dimensional world.

      [00:16:31] And that is the frontier in which people like me work on policy. Our most important frontier right now is to protect people online.

      [00:16:41] Sandie: So, I remember the first time somebody really made me stop and think about this, a long time ago: Ernie Allen. He started including me in letters to attorneys general about Section 230.

      [00:17:00] So if you’re one of my students and you’re like, what? Just look up Section 230, do a lit review. But let’s get it in a capsule: what is Section 230, and why is it like a cornerstone in this battle?

      [00:17:20] Eleanor: Excellent question. So Section 230 is one provision of a bill called the Communications Decency Act. It was passed in 1996, when the internet was a baby, and Congress wanted to benefit this promising reality of the internet and the World Wide Web. And there had been some cases involving one of the early manifestations of the internet, like MySpace or AOL.

      [00:17:50] There was concern: how on earth could the internet company, which is really just a highway for information, be held liable for activities by those posting? Right? It’s almost like saying the highway isn’t responsible for a drunk driver, because that drunk driver is in a car going down the highway, just using the highway as a method for transportation.

      [00:18:16] And with that idea of the internet as a big neutral highway, the CDA, the Communications Decency Act, included a few Good Samaritan provisions saying that internet service providers, the ISPs, could not be held liable for what was being posted and what was happening online, because they aren’t publishers; they’re just a highway for information.

      [00:18:45] Okay. That bill was actually gutted by the Supreme Court over some First Amendment questions. But what stood were the provisions known as CDA 230, which said internet service providers cannot be held responsible for the content that is being posted online.

      [00:19:04] Right? That’s all protected by the First Amendment. But fast forward to today, and so many courts have interpreted those provisions as providing a wall of immunity, an entire wall of immunity, for these social media platforms and internet service providers. So we have a case; let me give you an example.

      [00:19:27] We have a case against Twitter. Now it’s called X, but the case occurred when it was Twitter. Two boys, 13 years old, think they’re talking to a 16-year-old girl on Snapchat. It’s not a 16-year-old girl. She keeps asking them to do various sexual acts, including acts with each other, and when she wants to meet them, they block her.

      [00:19:53] Two years later, they’re 16. The boys of course thought the images were disappearing, right? Snap is a platform where people think the images disappear. That predator had made a video of all those separate images, and it was child pornography. These were young teens, 13-year-old boys, engaged in sexual activity; that is, by definition, child pornography.

      [00:20:18] It was posted on Twitter, it was circulating on Twitter, and thank God somebody called the mom of one of the boys and said, your son is up in his room talking about taking his own life, because he’s so mortified that these images of him fully unclothed are circulating at our high school. Mom intervened.

      [00:20:40] She was a nurse. She intervened, and thank God her son is alive today, but he still suffers the mortification of those images being blasted around the world. It’s an excellent family: the mom and dad were both nurses, and they shared shifts so that they would always be home for their kids.

      [00:20:59] They alerted the police. The police come and say, are you sure you wanna file a report? Your son produced pornography, which is illegal. We could arrest him for producing it.

      [00:21:10] Sandie: Mm.

      [00:21:10] Eleanor: I mean, what crazy...

      [00:21:11] Sandie: Oh my.

      [00:21:12] Eleanor: Right? And then the family contacted Twitter and said, take it down. These are young boys; we verify they’re 13 years old in these videos.

      [00:21:23] Twitter came back and said, this doesn’t violate our community standards, and left the material up. The only reason that video was taken down was that the family’s pastor happened to have a friend who was a DHS, Department of Homeland Security, agent, who flashed a badge from DC, and Twitter eventually took it down.

      [00:21:45] So we have a case; we are with a group of lawyers representing that family. The mom is Jane Doe, and the two boys are John Does in the filing. As we’ve gotten into discovery, Twitter actually says, we don’t care if you call this sex trafficking, because money was made by the advertising, right?

      [00:22:07] So Twitter was monetizing those images, making money through the ads as people were viewing the material. So our legal argument is: you’re monetizing commercial sexual material; that’s a commercial sex act, and so under the Trafficking Victims Protection Act, that’s sex trafficking. Twitter comes back and says, it doesn’t even matter; we have the rights under CDA 230.

      [00:22:35] That’s not our problem. You know, it’s on our websites, but that’s not our problem. CDA 230 protects us, and we don’t have to take it down. We don’t need to deal with it. Even though they demonstrated that they were knowingly aware of this material online, they left it up.

      [00:22:53] So this is a battleground. We’ve lost this lawsuit, but we hope it goes to the Supreme Court, because it’s time for the Supreme Court to really look at this and say, what are you talking about? That’s child pornography, a federal offense. You cannot be facilitating child pornography online. That’s just one example of how these companies really think they have no responsibility to take down material that is federally criminal, and they hide behind CDA 230 to leave it up.

      [00:23:26] Sandie: So how do we begin, not just at the national level but at the local level, to combat this,

      [00:23:38] to begin initiatives that call for reform? What can everyday people do: politicians on school boards, politicians in local, city, and state government, professors, thought leaders in our universities? What’s our role with CDA 230?

      [00:24:05] Eleanor: So two schools of thought have emerged, and California, as is so often the case, is a leader. California passed an excellent bill called the Age Appropriate Design Code,

      [00:24:17] and it said companies should be obligated, because these products are used by youth, used by all kinds of people all day long:

      [00:24:28] before a product, a service, a platform is launched, it needs to be tested, and we need to verify that it’s age appropriate and make sure that unsafe products aren’t launched.

      [00:24:41] So it’s also within the product liability zone of the law, right? And California passed the Age Appropriate Design Code, which was also passed in Vermont, for example, just a few weeks ago. It’s being challenged in court by the big tech companies, and we’ll see where that goes. But the idea is: let’s look at these platforms from the product liability standpoint.

      [00:25:06] So it sort of sidesteps the CDA immunity problem and brings new legal theory to bear in order to force these damn companies to be responsible. There’s unbelievably little accountability. There’s an excellent book I just finished reading this week called Careless People.

      [00:25:29] It’s written by Sarah Wynn-Williams. She was for 10 years the global public policy leader at Facebook, or Meta; you know, Meta owns Facebook and Instagram and WhatsApp. And she testified in the Senate, as she says in her book Careless People, that Meta tracks children so closely that it knows even their negative emotions, and they target ads to kids who are suffering, like kids with a fear of being too fat.

      [00:25:58] So they’re targeting those kids who fear they’re too fat with material about diets and even eating disorder material. It’s terrifyingly irresponsible, and yet they hide behind CDA 230. So the Age Appropriate Design Code in California, we’ll see how it goes in the courts, but more states should probably pass the Age Appropriate Design Code.

      [00:26:25] But also, states are looking at AI. What about curbing AI? You know, artificial intelligence is being used to generate deepfakes, pornographic material using your face but putting it on a different body, and that’s terrible for the victim too. So state-level legislative action is relevant. Meanwhile, just this week a Senate office shared with us a bill that would repeal CDA 230, and we were very excited to see four Democratic senators and four Republican senators

      [00:27:03] already committed to introducing that bill. It hasn’t been introduced yet, so I’m not gonna say who the members were, but high-level, important members of the US Senate have said, enough is enough; we must force these companies to be accountable, ’cause they are not policing themselves. So we believe there will be

      [00:27:24] a bill pending in Congress to repeal CDA 230. I’m gonna mention one thing that disturbed us, which came out of the House side. We’ve all heard about the big beautiful bill, the budget reconciliation bill, on the House side. It included a very disturbing set of provisions that said no state can regulate AI for 10 years.

      [00:27:46] Wait a second. A moratorium on state regulation of AI. Why? Our country is a set of 50 states that have their own community standards, their own laws. Why would states not be able to regulate AI? So protests have been raised from Republican and Democratic senators, so we hope it will not succeed on the Senate side and we’ll nip it in the bud.

      [00:28:11] But that’s how powerful big tech is in Congress, that they’re able to engineer a really absurd 10-year moratorium. Guess what that is? That is giving AI the same kind of immunity that the internet had under CDA 230. We’re still struggling to get away from the utter corporate irresponsibility that CDA 230 gave us, and now we’re facing an attempt to give that to AI.

      [00:28:40] Sandie: Wow. Okay. So much to think about here. Let’s give some links and suggestions for how people can gain the knowledge and insight to be an articulate advocate for change in their local community. I am all about pursuing knowledge so that we can show up in these spaces with more than passion, more than anger.

      [00:29:14] So give us three examples of resources we can look to.

      [00:29:21] Eleanor: So I’ll send you a link to Professor Mary Graw Leary’s testimony before the Senate Judiciary Committee, in which she really deep-dives into CDA 230. It’s an excellent testimony from earlier this year; that’s superb. There have been a number of investigative pieces. The Wall Street Journal has done several investigations that are good reads.

      [00:29:45] For example, in 2023 they revealed that social media algorithms connect pedophiles to children; they provide pedophiles with more and more links to kids. Quite shocking. So I can send you links to some of the exceptional Wall Street Journal investigations.

      [00:30:10] There is a superb law center called the Social Media Victims Law Center. They’re featured in a new documentary called Can’t Look Away.

      [00:30:21] Sandie: Mm-hmm.

      [00:30:21] Eleanor: This documentary is just extremely well informed about the CDA 230 immunity problem, and also the tragedies of families who’ve lost their loved ones because of the irresponsibility that CDA 230 immunity promotes.

      [00:30:38] Right now, the Social Media Victims Law Center has over 4,000 families whose children have been harmed, or who have lost their kids, as a result of irresponsibility in social media. And the Social Media Victims Law Center website is quite informative, so I’ll send you that link.

      [00:30:58] There have been over 24 congressional briefings on this subject since 2019, so in the last six years, over 24. This Congress is about the best educated ever on online harms. We are hoping this Congress, 2025 to 2026, the 119th Congress, is the one that will get serious.

      [00:31:22] Sandie: Okay. I think we will have to have another conversation for sure. And I wanna bring this also to, you mentioned, the faith-based community. I think this discussion belongs in our local

      [00:31:43] religious community: the youth leaders, the folks doing parenting classes, how do parents respond? So I’m gonna probably call you and ask you to come back and talk with me about that. All right?

      [00:31:58] Eleanor: Absolutely.

      [00:31:58] Love to. And I know that the White House has set up a faith-based office that is looking at these issues, so hopefully we’ll get some more federal guidance, because one thing you know, Professor Morgan, from the human trafficking movement is that there isn’t enough federal

      [00:32:17] guidance on best practices. You know, there are jurisdictions around the country doing some very good work. San Diego, for example, has a superb district attorney who has helped engineer MOUs between police and direct service providers, identifying women who are being exploited and trafficked and getting them into services. There are some jurisdictions doing this well, and then there are places that are absolutely without a model; they don’t have a highly functional system for moving victims into services. So I wish we got more tips from the federal government, right? The Department of Justice sees all this happening around the country.

      [00:33:03] Well, tell us who’s doing the best job.

      [00:33:06] 349-sandie: Those are good questions. I'm going to start a list of questions and look for guests who will join us to discuss them. Dr. Gaetan, I am delighted to have had this conversation.

      [00:33:24] We’re gonna keep it going, and thank you so much for being part of our podcast today.

      [00:33:30] 349-guest: It’s a pleasure. Thank you so much. I hope I can come back on, because it’s so important, and you’ve done such remarkable work over the years in combating human trafficking. Thank you for everything you’ve done.

      [00:33:43] 349-sandie: Oh, thank you. And we’ll get together, maybe in the same space, in the spring for Ensure Justice. I want our listeners to start planning now for March 2026. Thanks so much. Have a great day.

      [00:34:00] 349-guest: Bye-bye.

      [00:34:01] Wow, Eleanor, thank you for shedding light on Section 230 and its role in the fight against online sexual exploitation.

      [00:34:10] Your expertise really helps me understand how advocacy and legislative reform play vital roles in protecting communities. Listeners, I encourage you to take the next step by visiting our website at endinghumantrafficking.org to check out our show notes and other episodes. If you haven’t already, be sure to subscribe so you don’t miss any of the important conversations we’re having.

      [00:34:45] We’d also love your help in growing this podcast. You can start by following and connecting with us on Facebook, Instagram, and LinkedIn. And if you know someone who would benefit from today’s episode, invite them to subscribe and join our community. Thank you for listening. I’ll be back in two weeks.

      Ending Human Trafficking Podcast, by Dr. Sandra Morgan

      4.8 (122 ratings)
