It’s a scary time to raise a child. You want to protect them from everything, especially as things get out of hand on social media and on our phones. A Texas mom taught her 11-year-old the ins and outs of online safety but never expected him to fall victim to a scam. Luckily, she jumped into action, and now she’s speaking out about a terrifying scheme that could easily happen to anyone.
Kelsey (@kelseyleaaaa on TikTok) posted a video with the warning: “Please stop and listen to this if your child has a phone.” She says her 11-year-old son was out of town at a friend’s house, so she took the chance to clean. When she checked her phone, she saw a ton of missed calls from her child. She called him back and, when he answered, he was crying and hysterical. He told her, “Somebody did something to me,” words that strike fear into any parent’s soul.
Because her son wasn’t able to articulate what was going on through the tears, his friend’s parents helped him speak up. She said, “It turns out, somebody texted him and asked for a picture of his face.” Kelsey said she isn’t sure why, but her son sent it to them. From there, she said they “sent back pictures of what looked like him naked.” She said it’s known as a “sextortion” scam: the scammer creates AI-generated photos to blackmail the victim, usually for money.
Kelsey went on to say that one of the scariest parts of this is that the pictures “looked real.” Her son naturally panicked and told his friend, who then told his parents. They had all seen this happen to others on social media, and it’s a very real scam. She said the blackmail attempt “traumatized” her son, and that she’d be contacting the FBI.
Commenters were quick to defend Kelsey and her son, who was undoubtedly the victim in this situation. One person said, “He has no blame. He’s 11 and had no idea this would happen just from a regular pic.” Another said, “The fact AI has the capability of doing this is scary.”
One commenter gave some strong advice: “If the images were sexual in nature, you can file a report with the National Center for Missing and Exploited Children’s CyberTipline. Analysts can send it to the appropriate law enforcement. It’s such a common occurrence unfortunately and it’s called sextortion.” Another person urged, “Please look up Gavin’s Law. It was created after my friend’s son passed. His father got the law passed.” Gavin’s Law is a South Carolina law that makes sexual extortion a felony offense. It was passed after a state representative lost his teenage son to suicide following extortion over compromising photos.
It’s truly scary that these things could happen to our children, whom we would do anything to protect. And it’s only going to get worse if the law doesn’t catch up with generative AI.