On Tuesday, Texas GOP Sen. Ted Cruz introduced a bill to protect and empower victims of “revenge pornography.”

The bill, titled the “Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks (TAKE IT DOWN) Act,” would “criminalize the publication of non-consensual intimate imagery (NCII), including AI-generated NCII (or ‘deepfake pornography’), and require social media and similar websites to have in place procedures to remove such content upon notification from a victim,” according to Cruz’s office.

“There has been a staggering increase in exploitative sexual material online, largely due to predators taking advantage of new technologies like generative artificial intelligence, or AI,” Cruz stated. “Now almost instantaneously, perpetrators can use an app on their phones to create fake, explicit images depicting real people, commonly referred to as deep fakes. Disturbingly, this is increasingly affecting and targeting minors, particularly young girls. Up to 95% of all internet deep fakes depict sexual images, with the vast majority of it targeting women and girls.”

“It feels like almost every week now we’re seeing a new report of a middle school or high school young girl falling victim to this despicable activity,” he continued. “Young girls are waking up to a text message from their friends saying there’s an explicit photo of them circulating on social media or explicit video. And yet, the photos aren’t real. They were created using AI, typically taking the face of a real person and using AI technology to graph this seamlessly and imperceptibly to a different image, and a sexually explicit image. Imagine being 14 years old and showing up at school where your classmates believe they have seen an explicit photo of you that is a fake and a fraud. The photos may not be real, but the pain and the humiliation is.”

“For many young teens, it’s changed their lives forever,” he declared. “Even worse, these photos can prove very difficult to remove from the internet. They can spread quickly from one phone to another, to another to another, causing many women and girls to have to live with being victimized over and over again.”

“There are a few bills in the Senate right now that are working to tackle this issue,” he acknowledged. “I want to be clear, our legislation is designed to be complementary to those efforts. I’ve supported the other efforts, and this is designed to fill gaps that the other bills do not address. As the ranking member of the Senate Commerce Committee, my legislation is the first of its kind to require websites, including social media sites, to have in place a process to remove such content within 48 hours of a request from the victim. The sites must also remove any copies of those images.”

