
Why are police and school districts struggling to deal with a new type of bullying?

Students are targeting classmates by using AI to create fake nudes

HOUSTON — AI can do some pretty amazing things, but as the technology spreads, some are using it for alarming purposes. Last school year, school districts across the country started to report the spread of fake sexually explicit images of young students.

In most cases, a classmate took the pictures off the victim’s social media pages, used artificial intelligence apps to remove their clothing and then spread the fake nudes to other classmates.

Now law enforcement and school administrators are scrambling to combat the disturbing trend. Earlier this year, the FBI warned that these deepfake images of minors are illegal and that it is a crime to create and distribute them. But the issue is so new that many administrators are unsure how to handle it.

Politico reports that some school districts are not mandated to report these images to law enforcement.

Texas does have a law making sexually explicit deepfake videos illegal, but it doesn’t apply to photos. Senator Ted Cruz has introduced a bipartisan bill that would make it illegal to post nonconsensual nude images, real or fake.

When he introduced the bill in June, he was joined by Elliston Berry, a North Texas teen whose classmate took her image from Instagram and used it to create explicit fake pictures of her.
