After repeatedly bringing in the same corporations and their cautious, heavily media-trained executives, Congress is turning its focus to two of the internet industry's newer but crucial faces: TikTok and Snap.
On Tuesday, senators on the Senate Subcommittee on Consumer Protection, Product Safety, and Data Security will question policy executives from those two companies, as well as YouTube, about how their platforms affect vulnerable teenage users.
In early October, shortly after revealing her identity, Facebook whistleblower Frances Haugen testified before the same subcommittee on related matters.
Snap’s VP of Global Public Policy Jennifer Stout, TikTok’s VP and Head of Public Policy Michael Beckerman, and YouTube’s Leslie Miller, who handles government affairs and public policy, will testify at the hearing on Tuesday at 7 a.m. PT.
Senator Richard Blumenthal (D-CT), the chair of the subcommittee, will lead the hearing, which will focus on the negative impacts of social media on children and teenagers.
“The bombshell reports about Facebook and Instagram—their toxic effects on young users and lack of truth or transparency—raise serious concerns about Big Tech’s approach to kids across the board,” Blumenthal said, linking reports about Instagram’s dangers for teens to the dangers of social media in general.
Marsha Blackburn (R-TN), the ranking Republican on the subcommittee, has expressed an interest in privacy concerns surrounding TikTok.
As members of the subcommittee take turns questioning the three policy heads, expect to hear about eating disorders, harassment, bullying, internet safety, and data privacy. The senators will also examine measures that may help protect children and teenagers online, though how solution-oriented the hearing will be remains to be seen.
The Kids Internet Design and Safety (KIDS) Act, which would introduce new online protections for people under the age of 16, is one potential answer. Blumenthal and Democratic Senator Ed Markey reintroduced the bill last month.
The mental health of children and teenagers isn't the only societal concern social media platforms are currently entangled in, but it is one that both Republicans and Democrats are rallying around. For starters, it's a rare line of criticism with genuine overlap between the two sides.
Both parties appear to agree that tech's biggest companies need to be regulated in some way, though they emphasise different aspects of the why: conservatives argue that these companies have too much control over what content is removed from their platforms, while Democrats are often more concerned about the content that is left up, such as extremism and disinformation.
The hearing on Tuesday will almost certainly delve into how algorithms promote dangerous content. Because social media companies keep their cards close to their chests when it comes to how their algorithms function, hearings are a rare opportunity for the public to learn more about how these platforms serve individualised content to their users.
We’d like to think that the often lengthy, repetitive tech hearings Congress has held in recent years have taught us a lot about that kind of thing, but between lawmakers asking uninformed or irrelevant questions and evasive tech executives with hours of media training under their belts, the best we can usually hope for is a few new tidbits of information.
While Facebook will not be present at this hearing, new disclosures about the company and Instagram are likely to shape what happens on Tuesday. All three social media companies set to testify have been focused on the public response to leaked Facebook documents, and new reporting on those documents came out just Monday.
TikTok announced a new set of safety precautions, including a well-being guide, better search interventions, and opt-in popups for sensitive search terms, shortly after the first reports that Instagram knows about the risks it poses to underage users.
Snap launched a new set of family-focused safety features this week, giving parents additional insight into what their children are doing on the app. Compared to platforms like Facebook, Instagram, and Twitter, these social networks have a higher percentage of younger users, making robust safety tools even more important.
Prior to the hearing, YouTube announced changes to which types of children's content will be eligible for monetization, along with other kid-focused safety features.