Social Media CEOs Brace for Senate Spotlight on Child Safety: What to Expect

(Bloomberg) — Some of the tech industry’s most prominent and powerful leaders will descend on Capitol Hill on Wednesday for a Senate hearing focused on protecting children online.

The chief executives of Meta Platforms Inc., X (formerly Twitter), Snap Inc., TikTok and Discord will provide testimony and take questions from members of the Senate Judiciary Committee, which has supported several bills related to children’s digital safety. Social media platforms have been increasingly scrutinized by Congress as growing evidence suggests that excessive use and proliferation of harmful content can harm young people’s mental health.

Several bipartisan proposals seek to hold technology companies accountable, strengthen protections for young users and stop the sexual exploitation of children online. But dozens of tech trade groups and civil liberties organizations have criticized many of the proposed measures as flawed and counterproductive, arguing that their passage would make online privacy and safety worse. A handful of social media companies, including Meta and TikTok, which is owned by ByteDance Ltd., are facing lawsuits in California claiming the companies were negligent and ignored potential harm to teenagers on their platforms.

Questions from elected officials rarely remain hyper-focused on the matter at hand when tech CEOs visit Washington, especially during hearings related to online content moderation. Wednesday’s session may be quite wide-ranging. Here’s a look at who will be testifying from each company, and what they’re likely to discuss.

Read more: Senators Examine Meta, X, TikTok Over Children’s Online Safety

Mark Zuckerberg – Meta

Meta, which owns the social networking apps Facebook and Instagram, has faced significant backlash over the years regarding its child safety practices. Investigations by whistleblowers, news organizations and academic researchers have found that the company’s sites can harm the mental health of young users and that its recommendation systems can connect networks of predators to child sexual content. In October, more than 30 states sued Meta, alleging that its social media apps were promoting content harmful to youth. Meta had planned in 2021 to create a version of Instagram for children under 13, but scrapped those plans after backlash over Instagram’s impact on teen mental health.

CEO Mark Zuckerberg, who has testified extensively before Congress in the past but has recently retreated from policy issues, is expected to focus on the company’s efforts to improve child safety. Earlier this month the company announced plans to tighten default messaging settings for teens on Instagram and Facebook and to prevent teens from seeing age-inappropriate content. Meta also has advertising policies that prevent marketers from showing certain types of ads to teenagers or from targeting them based on factors such as gender or their activity on the network.

Linda Yaccarino – X

Linda Yaccarino will make her first appearance before Congress as CEO of X, a role she took on last June. Yaccarino, a former head of advertising at NBCUniversal, has spent her first eight months in the job trying to win back advertisers and convince skeptics that X’s owner, Elon Musk, still cares about policing the social network. Musk has repeatedly spoken or posted about the importance of protecting children online, making it a central part of the company’s public campaign to regain user trust and approval.

Yaccarino was in Washington this week to meet with senators ahead of Wednesday’s hearing and discuss the company’s efforts to combat child sexual exploitation. During those meetings, she also emphasized that X is a completely different company from its predecessor, Twitter, and she is likely to try to distance X from its former self during the hearing by highlighting differences in content policies and strategy. X announced over the weekend that it will build a new Trust and Safety center in Austin where employees will focus primarily on combating child sexual exploitation. Yaccarino is also likely to face questions about the proliferation of AI-generated content, including explicit material, and its spread on X. Last week, explicit AI-generated images resembling Taylor Swift circulated on X for hours before they were removed, amassing millions of views and raising questions about the platform’s ability to quickly take down illegal and harmful posts.

Evan Spiegel – Snap

CEO Evan Spiegel oversees Snapchat, an app popular with teenagers that is focused more on one-on-one messaging than on public posting. That focus has not insulated it from criticism, however. Snap is facing a lawsuit in California brought by families who allege their children died of overdoses after buying drugs through the app.

In 2022, Snap introduced a feature that allows parents or guardians to see certain activity on their child’s account and apply controls, such as whether their child is allowed to interact with the company’s AI chatbot. Last year, the app also implemented a strike system for accounts that publicly post content to Stories or Spotlight that is inappropriate for teenagers.

Before Wednesday’s hearing, Snap became the first tech company to endorse the Kids Online Safety Act, opposing industry trade group NetChoice’s position on the bill.

Shou Chew – TikTok

CEO Shou Chew returns to Congress almost a year after his first solo testimony before the House. At that hearing, where he faced a hostile reception, he was questioned about child safety concerns, the addictive nature of TikTok, and content promoting eating disorders, drug sales and sexual exploitation. Chew argued that these issues are not unique to TikTok.

Last year, the company introduced a default one-hour daily time limit for users under 18, after which the app requires a passcode to continue viewing. Users who say they are between 13 and 15 years old have private accounts by default and cannot send direct messages. Like other apps, TikTok has a dashboard that can share usage information with parents and caregivers.

Chew could also get questions about the company’s relationship with China through parent company ByteDance. Senators may also ask about recent hot-button issues not related to children, including perceived biases in conflicts such as the Israel-Hamas war, as well as the proliferation of AI-generated videos.

Jason Citron – Discord

Originally a chat app for gamers, Discord has been involved in several high-profile investigations related to child exploitation, extremism and even terrorism. Today, Discord is mainstream among millennials and Gen Z for everyday communication with friends and acquaintances online. In 2021, the company reported 150 million monthly active users and even explored being acquired by Microsoft for about $12 billion, though no deal materialized.

With that growth has come a higher volume of abuse. Between 2021 and 2022, reports of child sexual exploitation on the platform increased nearly sixfold to 169,800, according to data from the National Center for Missing and Exploited Children. That’s 73% more than the number reported for X, although the increase is also partly due to better detection methods.

CEO and co-founder Jason Citron will represent the company before the committee and discuss its initiatives to protect children on the platform. Those will likely include its new open-source model for detecting novel child abuse material and its participation in Lantern, a cross-platform child safety program. In late 2023, Discord rolled out new features for teens and families to better control their online experience. — By Aisha Counts and Oma Seddiq

©2024 Bloomberg LP
