Is Big Tech Regulation Finally Coming? | Opinion

Last week, the Senate Judiciary Committee took the CEOs of Meta, X, TikTok, Snap, and Discord to task on child safety. And it was a tense discussion. Senators verbally flogged each tech executive. Senator Lindsey Graham (R-S.C.) started off by describing social media services as "dangerous products" that are "destroying lives" and "threatening democracy itself." Senator Dick Durbin (D-Ill.) said that these platforms subject children to "sextortion"—a practice in which predators trick children into sending sexually explicit pictures of themselves.

The bipartisan assault didn't stop there.

Senator Marsha Blackburn (R-Tenn.) pointed out that Meta specifically targets young users, and even places a monetary value of $270 on each child's life. At one point, she accused Meta of being the "premier sex trafficking site in this country." Senators Ted Cruz (R-Tex.), Tom Cotton (R-Ark.), and Josh Hawley (R-Mo.) all went after TikTok's apparent relationship with the Chinese government. Senator Laphonza Butler (D-Calif.) asked Snapchat's CEO to apologize to the parents who have lost children to overdoses from drugs obtained on the platform.

This all confirms what we already knew—tech's self-regulation schemes don't work. Why? Well, as Senator Sheldon Whitehouse (D-R.I.) bluntly put it, the "platforms really suck at policing themselves."

Just look at their track record.

The Wall Street Journal found that pedophiles are using Instagram to solicit sex from minors; the platform may have even promoted that content and connected kids to pedophiles through various hashtags. Kids are overdosing from drugs they obtained over Snapchat.

Things are even worse on TikTok. Use of the platform has led to a slew of accidental child deaths and hospitalizations. The Center for Countering Digital Hate released a report in December showing how TikTok "bombard[s]" kids "with eating disorder and self-harm content." What's more, suicide-related content has amassed over a million posts and 8.8 billion views on the platform. Our version of TikTok is so bad that it isn't even allowed in China. And its undeniable relationship with the Chinese government has a bipartisan chorus calling for the app to be banned in the U.S.

Senators expressed exhaustion and frustration at tech companies' refusal to come to the table on solutions. Senator Graham lamented that "there's not one law on the book" that lets parents hold Big Tech accountable. Along similar lines, Senator Amy Klobuchar (D-Minn.) said, "I'm so tired of this.... It's been 28 years...since the start of the internet. We haven't passed any of these bills, because everyone's 'double talk, double talk.' It's time to actually pass them."

Mark Zuckerberg, CEO of Meta, testifies during the U.S. Senate Judiciary Committee hearing, "Big Tech and the Online Child Sexual Exploitation Crisis," in Washington, D.C., on January 31, 2024. Brendan Smialowski/AFP/Getty Images

But here's the thing—the tides are turning.

This hearing demonstrated that. First, it showed that bipartisanship on this issue is stronger than ever. Better yet, Congress has proposed promising bipartisan bills aimed at protecting children online.

But the true shift is that we now have some industry support for these measures. In fact, at the recent Senate hearing, the CEOs of X and Snapchat both endorsed Senators Richard Blumenthal (D-Conn.) and Marsha Blackburn's Kids Online Safety Act (KOSA). KOSA would impose a duty of care on social media companies "to prevent and mitigate harms to minors," which includes barring platforms from promoting content involving "self-harm, suicide, eating disorders, substance abuse, and sexual exploitation" to children. LinkedIn has also publicly announced its support for KOSA. Industry backing was the missing piece in getting KOSA over the line last term, and now we have it.

X took its support one step further. It shockingly endorsed the Stop CSAM Act, which reforms Section 230 of the Communications Decency Act by allowing victims of online child exploitation to sue platforms. This is a complete one-eighty for tech platforms.

More surprisingly, Meta supported and even proposed legislative measures requiring parental consent to use social media services. This is a far cry from where the company was even five months ago. This approach would require app stores to give apps a way to verify a user's age without forcing adult consumers to hand over more personal information to social media companies, such as pictures of their government IDs.

How? Apple and Google—noticeably missing from this hearing—are the gatekeepers to that information because age data lives on the devices they control. It's why kids can't purchase products or services without a parent's say-so. Device-level verification makes sense because Apple and Google already collect this information, and mobile devices are the primary access points for kids using app store services. With this legislation, age verification and parental consent regimes become far more attainable.

We have all of the ingredients for success—bipartisan agreement, targeted solutions, and even industry backing—so there's no reason to wait any longer to protect our kids from Big Tech. Frankly, we should all heed Senator Graham's sentiments: "For all [social media's] upside, the dark side is too great to live with." It's time for Congress to put our children first and not cower to corporations that are more than willing to exploit them for profit.

Joel Thayer is president of the Digital Progress Institute and an attorney based in Washington, D.C. The Digital Progress Institute is a nonprofit seeking to bridge the policy divide between telecom and tech through bipartisan consensus.

The views expressed in this article are the writer's own.

Uncommon Knowledge

Newsweek is committed to challenging conventional wisdom and finding connections in the search for common ground.
