If you’re an adult who follows “only young gymnasts, cheerleaders and other teen and preteen influencers active on” Instagram, what other content is the Instagram Reels algorithm likely to recommend that you check out? The answer, according to a recent Wall Street Journal investigation, is “jarring doses of salacious content … including risqué footage of children.”
To understand what’s going on here, let’s step out of the digital world and go “brick and mortar.” Think about that friend who always encourages you to order just one more drink at the bar. This friend can be a lot of fun, in moderation and in adults-only settings. In large doses and in all-ages settings, this friend can become a dangerous creep, and turn you into one, too.
Let’s call this friend Al. Al knows you, and he knows what you like. Al is out to show you a good time and keep the good times rolling. Al doesn’t know where the line is. Algorithms on social media platforms and search engines typically act like our friend Al.
As the U.S. Supreme Court explained in Twitter v. Taamneh, algorithmically generated recommendations mean that “a person who watches cooking shows on YouTube is more likely to see cooking-based videos and advertisements for cookbooks, whereas someone who likes to watch professorial lectures might see collegiate debates and advertisements for TED Talks.”
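For readers who want to peek under the hood, here is a minimal sketch of the engagement-chasing logic the court is describing. Everything in it is hypothetical and drastically simplified (the item names, the single-topic tags and the `recommend` function are all invented for illustration); real recommender systems use machine-learned models over millions of signals, but the core loop is the same: notice what you engage with, and serve you more of it.

```python
from collections import Counter

# Hypothetical toy catalog: each item is tagged with one topic.
# Real platforms infer topics from far richer behavioral signals.
CATALOG = {
    "knife-skills-101": "cooking",
    "weeknight-pasta": "cooking",
    "cookbook-ad": "cooking",
    "intro-to-philosophy": "lectures",
    "collegiate-debate": "lectures",
    "ted-talk-ad": "lectures",
}

def recommend(watch_history: list[str], k: int = 3) -> list[str]:
    """Recommend unseen items from the topic the user engages with most.

    Reduced to a caricature, this is the whole trick: the system has no
    notion of whether more of the same is good for you, only that you
    are statistically likely to keep watching it.
    """
    topic_counts = Counter(
        CATALOG[item] for item in watch_history if item in CATALOG
    )
    if not topic_counts:
        return []
    favorite_topic, _ = topic_counts.most_common(1)[0]
    unseen = [
        item for item, topic in CATALOG.items()
        if topic == favorite_topic and item not in watch_history
    ]
    return unseen[:k]

# Watch cooking shows, get more cooking content -- including the ad.
print(recommend(["knife-skills-101", "weeknight-pasta"]))
# ['cookbook-ad']
```

In the story’s terms, this is Al in a few lines of code: he remembers what you liked and keeps the same thing coming, with no built-in sense of where the line is.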
Let’s think about what happens when you and Al go watch football at the local high school. It’s fun to relive your glory days, until later when Al follows the students, and you follow Al … to the girls’ locker room. There are the cheerleaders, just off the field, still in their sports bras and athletic shorts. You try to tell yourself there’s nothing wrong with seeing them like that — it’s more than they’d be wearing if you were all at the town pool. But part of you — the part that Al doesn’t have, because he’s Al — knows it’s different for you to see the cheerleaders in the locker room. That’s their private space, not a public space, and you’re not a teen girl.
But you stay, and when Al suggests that you follow some of the cheerleaders back home and look through their bedroom windows (They left the shades open! Their parents know the shades are open! They’re dressed, wearing cute clothes!), you go along.
If this scenario feels uncomfortable to you, if it feels creepy to you — that’s because it is. Super uncomfortable, super creepy. Take the technology out of it. Take the technical jargon, like “algorithm,” out of it. Adults should not be looking into young people’s locker rooms, bedrooms or other private places (unless they are parents or other adults with legitimate, boundary-respecting reasons to be in those spaces).
This story about Al helps explain how algorithms work. The hypothetical also captures a deeply disturbing reality: Many adults are peering via social media into actual locker rooms, bedrooms and other private youth places, looking at content featuring youth of all ages doing personal or intimate things. Even when this content doesn’t cross the line into the sexually exploitative or abusive, it is profoundly creepy to have adults’ eyes in these youth spaces. And when the content does cross that line, this adult behavior is criminal, dangerous and morally reprehensible.