Right now, there is a bill headed to President Trump’s desk that will protect Americans from deepfake and revenge pornography. This is an incredible win for Americans, and something I have fought for throughout my career.

Over decades in the cybersecurity field, I’ve helped solve a lot of crises for government agencies, major corporations and more. But there’s a certain kind of call that I still dread getting. It starts with a trembling voice on the other end of the line: “Can you help me?” As they tell their tale, my heart sinks. A lump forms in my throat. This is another life shattered by explicit images strewn across the digital public square, each notification on their phone another dagger into their sense of safety as the image spreads.

For years, my team and I have made a point to fight for these survivors. We make calls, write letters, beg social media giants to remove the content – real photos stolen from private moments or deepfakes conjured by AI’s dark alchemy. This crime is indiscriminate. It targets the socialites, the professionals, rising stars, the rich and the poor. One professional athlete, a courageous woman I helped, saw her stolen images resurface dozens of times over several years. She told me it made her want to stop leaving her house. Even Scarlett Johansson — a global icon with every possible resource at her fingertips — can’t banish her deepfakes for good. If she’s powerless, what chance do everyday victims have? They have to watch their trauma go viral, reliving it with every view. Some, in desperation, have taken their own lives.

There are different ways to refer to this phenomenon. Terms like “revenge porn” and “deepfake” are often used interchangeably. “Deepfake” describes a specific kind of AI-produced image that usually uses a reproduction of a real person’s face. I reject “revenge porn” because, while it may sound edgy in headlines, it re-victimizes the victim. People take intimate images of themselves for all kinds of reasons, from self-confidence to sharing with someone they trust. That may not be everyone’s cup of tea, but it doesn’t make it okay to share those people’s images without their consent, in ways they never intended. That’s why I refer to these pictures or videos as non-consensual intimate images, or NCII.

It wasn’t that long ago that pornography hid behind brown wrappers and clerks who checked IDs before you bought it. Now, anyone can steal an image, fabricate a lie, and blast it worldwide at lightning speed. The Cyber Civil Rights Initiative estimates over 15 million Americans – that’s 1 out of every 23 of us – have faced NCII sharing, a number that continues to balloon as AI fuels the carnage. Deepfakes — 98% pornographic, 99% targeting women — number in the hundreds of thousands, with just 16 sites known for hosting them racking up 200 million visits in a six-month period last year.

Social media companies can do better. It’s the right thing to do, and they have the technology to do it. If their algorithms can figure out my favorite shoe color and relentlessly push those shoes at me, they should be able to address this plague across their platforms. They are making some efforts. Meta removed 1.2 million NCII items in the first half of 2024, but gaps persist. And it’s not entirely the fault of the platforms themselves.

A tangle of laws and regulations used to combat NCII sharing makes the process slower and more difficult. A 2024 study found that X (formerly Twitter) left 25 reported deepfakes up for weeks, while removing copyright violations in hours – a stark sign that today’s laws prioritize property over people. YouTube and TikTok axe millions of videos over copyright claims, yet billions of views on deepfake sites reveal a glaring chasm. Victims currently have to navigate a maze of rules, pleading with faceless systems with no human to hear their pain and no guarantees the images will come down.

That’s why their haunting words echo in my mind: “Can you help me?” Yes, I can. But my team can’t chase resurgent ghosts across the internet forever.

Thankfully, Washington has woken up to the crisis. The “Take It Down Act” makes publishing non-consensual intimate imagery — real or AI-crafted — a federal crime and requires platforms to remove it within 48 hours of a victim’s plea. The bill’s sponsors are Sen. Ted Cruz and Sen. Amy Klobuchar – both outspoken, prominent members of their parties who have run for president in recent years. They have very little in common politically – although they are both parents of daughters. But they worked together to secure an astounding unanimous vote to pass the bill in the Senate. And this week, it won an overwhelming victory in the House and is now headed to President Trump’s desk.

Critics, many from the libertarian tech crowd, claim the “Take It Down Act” stifles free speech, but protecting victims from abuse isn’t censorship — it’s justice. It helps the rule of law catch up with the pace of technology. It’s not about politics, it’s about people. By one estimate, over 4 million American children have been scarred by AI porn fakes of themselves or peers. The NCII porn pirates don’t care who you vote for – they will strike your family, your friends, your coworkers. The sensible updates to our laws included in the “Take It Down Act” will help us protect ourselves and each other. This is a victory worth celebrating.

* * *

Theresa Payton is the CEO of Fortalice Solutions. She was the first female White House chief information officer, serving under President George W. Bush.

The views expressed in this piece are those of the author and do not necessarily represent those of the Daily Wire.

