California just passed a law that could change the way Instagram and TikTok work for teenagers. Starting now, platforms can't serve "addictive feeds" to minors in California unless parents give explicit permission. These feeds, which use behavioral algorithms to keep users scrolling, are at the heart of how apps like Instagram and TikTok operate. For minors, though, they're no longer allowed to be the default.
The law, called SB 976, defines addictive feeds as those that recommend content based on what users do rather than what they explicitly ask to see. The idea is to stop these platforms from trapping young people in endless cycles of content. Companies now need to either adjust their algorithms for younger users or face penalties. And by 2027, they'll have to put age-verification technology in place and tailor feeds accordingly.
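To make the compliance problem concrete, here's a hypothetical sketch, not any platform's actual code, with invented field names, of how a feed service might gate its behavior-driven recommendations: serve them only to a verified adult or to a minor whose parent has opted in, and fall back to a plain chronological feed otherwise.

```python
from dataclasses import dataclass

@dataclass
class Account:
    age_verified: bool       # platform has completed age assurance for this account
    is_minor: bool           # under 18, per the verified age
    parental_consent: bool   # a parent has explicitly opted in to personalized feeds

def select_feed(account: Account) -> str:
    """Illustrative feed gate in the spirit of SB 976 (hypothetical, simplified)."""
    # Without age assurance, treat the account conservatively: no behavioral feed.
    if not account.age_verified:
        return "chronological_feed"
    # Verified adults can get the engagement-driven feed as before.
    if not account.is_minor:
        return "behavioral_feed"
    # Verified minors only get it with an explicit parental opt-in.
    if account.parental_consent:
        return "behavioral_feed"
    # Default for minors: a non-personalized feed of followed accounts, newest first.
    return "chronological_feed"

# Example: an unverified account falls back to the chronological feed.
print(select_feed(Account(age_verified=False, is_minor=True, parental_consent=False)))
```

The real rules are more involved (consent records, default settings, reporting), but the basic shape is the same: personalization becomes something a platform has to justify per user rather than apply to everyone.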
As highlighted by TechCrunch, this law didn't happen without a fight. NetChoice, a trade group representing companies like Meta and Google, sued to block it, arguing it violates First Amendment free speech rights. While the group managed to block parts of the law, including a provision restricting nighttime notifications to minors, the ban on addictive feeds is moving forward. California isn't alone here, though. New York passed a similar law earlier this year, and other states are starting to look at their options.
This is a big deal for platforms like Instagram. Their algorithms have faced criticism for years, especially around the mental health of teenagers. Studies are already digging into how these feeds affect young users, and the findings aren’t great. TikTok has also been under fire globally for its influence on younger audiences. It recently turned to AI to moderate content, but that’s a different problem altogether.
This isn’t just about laws, though. It’s part of a broader movement by governments and parents to step in where they feel tech companies aren’t doing enough. In the UK, for example, thousands of parents and schools have agreed to delay giving kids smartphones until they’re 14. California’s law takes a different approach. It’s not about keeping kids off devices but about making the platforms they use safer.
So, what happens now? Platforms like Instagram and TikTok need to figure out how to comply without losing the features that make their apps so popular. Behavioral algorithms are a big part of their success. They keep people engaged and bring in ad revenue. But now, companies will have to balance that with new rules designed to protect younger users.
For teens in California, this could mean less scrolling and more intentional use of these apps. For parents, it's a chance to have more control over what their kids see online. And for everyone else, it's a sign that social media's free rein might be coming to an end.