In summary
After pioneering consumer data privacy protections, California’s Legislature has tapped the brakes on proposals to further regulate social media companies.
California lawmaker Cristina Garcia was spending time with a friend when she asked her friend’s 8-year-old why the girl was refusing to eat.
“I said, ‘Why aren’t you eating lunch?’ And she’s like, ‘Well, because I have to look thin and look a certain way to be a YouTube star and get followers,’” said Garcia, a Democrat from Bell Gardens who is a member of the Assembly.
After spending 13 years as a teacher, she had seen how eating disorders and body image concerns affected her students — and she blames social media’s altered images for feeding the problem. “I think that affects you even as an adult, but I think it’s particularly damaging when you’re a young kid, when you’re still trying to figure things out,” she said. “They’re being bombarded with these images that are photoshopped into unrealistic expectations.”
She’s carrying legislation that would require social media companies to disclose when certain images have been retouched. If someone — influencers in particular — artificially smoothed their skin or toned their body in a post, and did so in order to make money, the platform would have to say so and even specify what was altered.
But California state lawmakers who have introduced bills to further tighten social media practices are having little success thus far this year.
Lawmakers postponed consideration of Garcia’s Assembly Bill 613 until next year amid pushback from social media firms. The companies said it was difficult for them to even know when a picture has been edited, since the editing often occurs on a third-party platform, such as a photo-editing app.
Other bills similarly shuffled over to a two-year track include one that would prohibit features like auto-play for children unless parents opt in to allow it, and another that would require social media companies to report obscene or violent posts on their platforms.
Michael Karanicolas, executive director of UCLA’s Institute for Technology, Law and Policy, said restrictions on social media are tricky to assess from a constitutional perspective, because laws and court rulings are constantly in flux.
“It’s not always easy to draw a clear and bright line between what the government can force you to disclose … and what the government can’t force you to say,” Karanicolas said. “That doesn’t mean that they’re not going to regulate the space because maybe the state government feels it’s worthwhile to roll the dice and see how the law survives a constitutional challenge.”
And the supermajority of Democrats gave the cold shoulder to a Republican-sponsored bill that would have deemed social media platforms the new “public square” — and sought to prevent them from restricting speech considered lawful under the First Amendment. A recent real-world example: Twitter’s ban on the account of former President Donald Trump.
Regulations — too far or not far enough?
So what tech-targeting social media bills are still moving forward this year? One that would require big social media firms to file quarterly reports on their terms of service with the state.
California’s landmark consumer privacy law, which took effect last year, was the first of its kind nationwide to give people more control over their digital data. It grants Californians the right to request, for free, the information businesses collect about them, and requires businesses to give users the chance to opt out of having their data sold.
Critics say the law doesn’t go far enough in reining in social media platforms.
They’ve called for increased protections for children roaming the Internet, and for regulation that stops the spread of misinformation and hate speech on social media.
While they spend less on lobbying than traditional heavyweights like oil interests and labor unions, technology companies still wield a great deal of influence in the state because of their hefty economic impact; California’s surprising budget surplus this year is due in part to the huge success of Golden State-based tech giants during the pandemic.
A bill still in play this year would require social media companies to file quarterly reports about their terms of service with the California attorney general. The bill’s author, Assemblymember Jesse Gabriel, an Encino Democrat, said there’s a lot of confusion over social media companies’ policies. For example, it’s difficult to find terms of service — which outline policies like how companies collect users’ data and moderate content — on companies’ websites, and there’s often no historical record of past versions.
‘Looking under the hood’ of social media
Gabriel’s AB 587 would also require companies to provide data on how well they complied with their own terms of service. Social media companies with over $100 million in revenue would have to report statistics such as the number of posts flagged by the company, the views those flagged posts received and the number of posts that were removed, demonetized or deprioritized.
Gabriel said he hopes the proposed law would achieve two things: encourage “good behavior” and allow policymakers to better understand how misinformation, hate speech and the like spread on social media and influence hate crimes.
The proposed legislation would force social media companies to “let folks look under the hood a little bit because there’s just a lot of confusion right now and a lot of skepticism about what they’re doing,” Gabriel said. “I’m a big believer in transparency, because it encourages people to behave in ways that they would want for the public to see them behaving.”
Noting that the bill targets companies with over $100 million in revenue in the past year, Gabriel said, “I think companies like that can pretty easily comply with what we’re asking them to do, and I think a lot of this information that we’re asking for, they’re already looking at on a daily basis and maybe even more frequently than that.”
A Facebook spokesperson told CalMatters via email that Facebook already publishes “regular transparency reports, including our quarterly Community Standards Enforcement Report.” The report shares data on how many posts violated Facebook’s content standards and what actions the company took to deal with them, and can be found on the company’s Transparency Center website.
Facebook recently announced those reports would be audited outside the company by Ernst & Young, “so we’re not grading our own homework,” the Facebook rep said.
Giving bad actors a blueprint?
Business interests including the Internet Association, which represents Big Tech companies such as Facebook, Twitter and Google, contend that Gabriel’s bill may undermine the goal of reducing misinformation and hate speech by providing “bad actors” with a granular blueprint for evading detection.
“While well intentioned, these requirements will ultimately allow scammers, spammers, and other bad actors to exploit our systems and moderators,” the groups argued, according to the Assembly bill analysis.
These groups also warn that Gabriel’s bill could open up social media companies to lawsuits over routine decisions by content moderators, and perhaps even for how effective companies’ moderation practices are in the first place — which platforms predict could deter them from investing in content moderation.
An Internet Association spokesperson, who would discuss the bill only on condition of anonymity, noted that it would not apply to some social media companies that play a part in spreading misinformation and hate speech. For example, the bill as written would cover fancy exercise bike company Peloton, but not right-wing-friendly social media sites Parler or Gab, since they don’t meet the $100 million revenue threshold.
One of the bills lawmakers held over until next year, AB 1545 by Berkeley Democratic Assemblymember Buffy Wicks, aims to add more parental controls around auto-play features and in-app purchases.
Do parents need an assist from Big Tech?
Wicks said auto-play features on websites like YouTube can lead to children watching objectionable content. Her example: If parents put on a “Thomas the Tank Engine” video on YouTube, an hour later their child might be watching a video of train crashes, depending on what YouTube’s algorithms deem related content.
Her measure would require websites like YouTube to add a parental opt-in for auto-play. A previous version — which died — would have created broader regulation.
Wicks said the bill has been popular with both Democrats and Republicans, especially those who are parents: “Any parent who has dealt with technology these days with their children knows this problem.”
The Internet Association objected to how the bill would be enforced. For example, the bill would require social media companies to disclose if an individual makes money from a post, which could be difficult to discern. It also said the bill’s requirement of an annual audit to ensure compliance with the Children’s Online Privacy Protection Rule was unnecessary because the attorney general already has the authority to enforce the rule.
“Just because you can create a product targeted at young people doesn’t mean you should,” said Marvin Deon, a vice president at Common Sense Media, a nonprofit that provides families with media literacy resources and age-based ratings of movies, TV shows and books. “We have to be sure that we’re keeping an eye on the Constitution, but also not skirting our duties to protect kids.
“Things that go after the addictive nature of some of the designs of these platforms, like the auto-play where a kid can start off looking at a Disney cartoon and 20 minutes later, he’s looking at a kid selling toys, and then 20 minutes later, some kid showing someone being blown up with some type of an explosive.”
Tech and social media companies often counter that parents are responsible for monitoring and regulating children’s online and social media use. But David Monahan, campaign director for the Campaign for a Commercial-Free Childhood, a nonprofit that advocates for children’s privacy, disagrees.
Monahan said that legislation is necessary until companies stop manipulative and unfair practices, such as targeting kids to spend excessive amounts of time online, share personal information, watch advertising and make in-game or in-app purchases.
“We find corporations pointing the finger at families and parents and saying, ‘You’re the gatekeepers. Why aren’t you protecting your kids?’ And that’s really unfair,” Monahan said. “Parents need an assist from Big Tech.”
###
CALmatters.org is a nonprofit, nonpartisan media venture explaining California policies and politics.