YouTube, TikTok, Snap want to distance themselves from Facebook

Officials appeared at a Senate committee hearing a day after a consortium of 17 news outlets, including Bloomberg, published dozens of articles based on leaked Facebook data suggesting that the company had prioritized profits over the safety of its users – especially teens.

The Consumer Protection Panel of the Senate Commerce Committee, led by Connecticut Democrat Richard Blumenthal and Tennessee Republican Marsha Blackburn, is investigating the efforts of Alphabet Inc.’s YouTube, ByteDance Ltd.’s TikTok and Snap Inc. to protect the online privacy of children and teens.

Blumenthal said in his opening remarks, “Being different from Facebook is not a defense. What we want is not a race to the bottom, but a race to the top.”

Blumenthal said tech companies shouldn’t rely on parents to protect kids’ privacy on their platforms; protections need to be built in.

“I want a market where competition is there to protect children,” he said.

Blackburn expressed concerns about the data TikTok collects and whether it is shared with the Chinese government; parent company ByteDance is based in China. She said that despite vague assurances, TikTok “has not eased my concerns in the slightest.”

TikTok stores its data outside of China, including in Singapore and the US, and “we do not share information with the Chinese government,” Michael Beckerman, TikTok’s vice president and head of public policy for the US, said at the hearing.

Witnesses also included Snap’s vice president of global public policy, Jennifer Stout, and YouTube’s vice president of government affairs and public policy, Leslie Miller.

Emphasis on safety

Blumenthal and Blackburn’s subcommittee first heard from Facebook whistle-blower Frances Haugen, the former product manager who leaked documents to the committee and the US Securities and Exchange Commission. Haugen highlighted how Facebook’s engagement-based algorithms drive harmful content to go viral on the platform. She said these algorithms particularly affect teenage girls who already have negative thoughts about their bodies.

The three social media companies sought to differentiate themselves from Facebook in their approach to online safety, as TikTok and Snap made their first appearances before Congress.

Last week, Blumenthal separately invited Facebook CEO Mark Zuckerberg to testify before the subcommittee at a future hearing.


Snap stressed that one of its strongest privacy protections is that it only allows users 13 and older, and that it has no plans to market to children under 13. The registration process fails for anyone under the age of 13 who tries to sign up.

“We make no effort – and have no plans – to market to children,” Stout told the committee.

Stout said that regulation alone will not solve the challenges of online privacy. “Technology companies must take responsibility and actively protect the communities they serve,” she said.

TikTok highlighted specific actions it has taken to protect children in recent years, including disabling direct messaging for users under the age of 16. The company has also prevented all users from sending photos and website links in direct messages, and only allows sharing of videos that have been approved through content moderation.

TikTok also removed 11 million suspected underage accounts from April to June 2021. But the company acknowledged the challenges it faces.

“We know trust must be earned, and we want to earn trust through a high level of action, transparency and accountability, as well as the humility to learn and improve,” Beckerman said.


YouTube’s Miller told the panel that YouTube Kids, created in 2015, provides tools for parents to control and customize the app for their kids. Miller said children under the age of 13 who are not in a parent’s “supervised experience” are not allowed on YouTube, and that YouTube does not allow personalized ads on YouTube Kids or in the supervised experience.

Miller said the company removed about 1.8 million videos from April to June 2021 for violating the company’s child safety policies.

Legislative efforts

Blumenthal and Massachusetts Democrat Ed Markey have sponsored legislation to update the Children’s Online Privacy Protection Act, which was enacted in 1998, several years before the social media companies launched. The law currently restricts the collection of personal information from children under the age of 13; the bill would extend those protections to users up to the age of 16. The bill has bipartisan support from Republican Senators Bill Cassidy of Louisiana and Cynthia Lummis of Wyoming.

Blumenthal has also backed legislation to prohibit certain manipulative marketing practices geared toward online users under the age of 16, including banning auto-play features and algorithms that amplify violent and dangerous content. That bill has no Republican sponsor to date.
