Pair of California Bills to Protect Kids Online Show Promise, but there are Concerns


May 19, 2022


by Larry Magid
This post first appeared in the Mercury News

California lawmakers are considering a pair of bills to regulate kids’ use of online services. One has a lot of promise, but the other may have some unfortunate unintended consequences. This is a national and international issue because what happens in California tech affects the entire world.

Although I have a few concerns over its details, I’m generally impressed by the California Age Appropriate Design Code Act (AB 2273), but I have my doubts about the Social Media Platform Duty to Children Act (AB 2408). Both bills are co-sponsored by Assemblymembers Jordan Cunningham (R-San Luis Obispo) and Buffy Wicks (D-Oakland).

Listen to “Interview with co-sponsor of California Age Appropriate Design Code Act,” an interview with Assemblywoman Buffy Wicks, on Spreaker.

The California Age Appropriate Design Code Act is designed to protect minors under the age of 18, unlike the current federal law, the Children’s Online Privacy Protection Act (COPPA), which applies only to children under 13. And unlike COPPA, whose limits on collecting personally identifiable information have effectively required social media companies to ban children under 13, this bill does not completely eliminate a company’s right to collect that information. Instead, the California bill smartly requires companies to default teens to “a high level of privacy protection,” something that some companies already do. It also requires companies to post their privacy information and terms of service prominently in language “suited to the age of children” who are likely to use the service. If it becomes law, the bill would also prohibit companies from using any information from people under 18 for any purpose other than to deliver their service. That makes a lot more sense than COPPA’s complete ban on collecting any personal information from children under 13.

One thing I love about this bill is that it requires the service to provide “an obvious signal” to the child when they are being tracked or monitored if the service has a feature that allows the “child’s parent, guardian, or any other consumer to monitor the child’s online activity or track their location.” I have long argued that parents should not use any parental control or monitoring tool in stealth mode.

The California bill is modeled after the UK’s Age Appropriate Design Code and, since most social media companies that operate in California and other US states also operate in the UK, many have already adopted parts of the UK code for their US users.

Details matter

As always, it’s important to read the fine print and consider how this bill would be implemented. I have some concerns about the proposed task force, which would be given regulatory powers. It would be appointed by the California Privacy Protection Agency, which is itself a new agency. Who would be on this task force, and to whom would they ultimately be accountable? This brand-new agency isn’t fully staffed, nor has it promulgated any rules. It’s important that the task force include child rights experts as well as child safety and development experts. It’s not uncommon for people who focus on child protection to take actions that could unintentionally limit child rights. Many young people turn to social media to explore and express concerns around politics, religion, sexuality, health and many other topics that are important to them.

I’m also concerned that this bill is aimed at services “likely to be accessed by a child.” I get that they’re trying to focus on companies whose content attracts children even if they claim they don’t market to kids, but “likely to be accessed” can include a great deal of content. The Super Bowl, for example, is watched by millions of children, but that hasn’t stopped TV networks from airing commercials for adult beverages. I know a kindergarten teacher who was unable to play children’s music from her YouTube Music Premium account to her class because of YouTube’s over-cautious reaction to the Children’s Online Privacy Protection Act, which was designed to keep kids’ personal information away from marketers, not to prevent teachers from playing music to their students. The content may have been aimed at children, but the person playing it was a responsible adult.

The bill also states that “age verification methods must be proportionate to the risks that arise from the data management practices of the business, privacy protective, and minimally invasive.” I completely agree, but it’s also important to understand that age verification is very difficult in the U.S., where many children don’t have government-issued ID and privacy laws prohibit access to school and social security records. In 2008, I served on the Internet Safety Technical Task Force, which, after hearing from multiple experts on age verification, concluded that it wasn’t practical within the context of US laws and technology. Admittedly, artificial intelligence has progressed since then, making this all worth a second look, but determining whether someone is a child is trickier than it might seem.

Finally, as with all state laws affecting internet use, there is the issue of state vs. federal regulations. Because of its population size and tech presence, California’s regulations will likely set a floor for how the companies behave in every state. But when states pass rules that might contradict each other, it creates a confusing playing field for industry, regulators and users.

Listen to “California bill would improve online privacy for children and teens,” the one-minute ConnectSafely Report with Larry Magid about the Age Appropriate Design Code Act.

Addiction bill

I have stronger concerns about the Social Media Platform Duty to Children Act. Although perhaps I’m quibbling over semantics, I’m not sure I even agree with the bill’s premise that social media is “addictive.” Although some psychologists and psychiatrists believe that it is, the official bodies that represent psychiatrists and psychologists don’t classify it as such, though they do recognize that obsessive use of technology can be problematic and harmful.

Having said that, I can’t argue with the bill’s backers that many kids spend too much time on social media and have a hard time getting away from their devices. For that matter, so do many adults, but there is a long tradition of laws that protect children from things that are legal for adults.

The operative part of the bill calls for a civil penalty of up to $250,000 for a social media platform having features “that were known, or should have been known, by the platform to be addictive to children.” The service could “be liable for all damages to child users that are, in whole or in part, caused by the platform’s features, including, but not limited to, suicide, mental illness, eating disorders, emotional distress, and costs for medical care, including care provided by licensed mental health professionals.” There are some carve-outs, so this bill doesn’t ban everything that makes social media sites compelling, but it nevertheless runs the risk of preventing companies from offering features that kids love and should be able to use in moderation and, in some cases, with parental supervision.

I get it. Social media companies employ techniques designed to keep people online longer, and some of these affect children as well as adults. But that’s true of just about any product. Plenty of people consider sugar to be addictive, but that doesn’t stop companies from selling and marketing sugary sweets to children. If a child becomes obese after eating an excessive amount of Ben and Jerry’s ice cream, should the company be liable for both the physical and mental health consequences? What if that child also eats a lot of Lay’s potato chips? Should PepsiCo, which owns Frito-Lay, be sued as well? How do we know how many pounds the child gained from ice cream versus how many from potato chips, and what about all the other aspects of the child’s life? Perhaps someone should sue their school for not having a vigorous enough PE program? Maybe food companies should be compelled to make their products unappealing to children as a way of preventing overconsumption.

There are also people who think TV is addictive, so what about shows that end an episode with a cliffhanger that hooks you into watching the next one, even though it’s way past your bedtime? By that definition, I’m addicted to just about every show I’ve binge-watched.

I don’t mean to trivialize a serious problem. My non-profit, ConnectSafely.org, has devoted a great deal of resources to helping families deal with problematic internet use, but both the problem and the solution are far more complex than just limiting screen time or punishing social media companies for employing features designed to keep people online longer.

I want to end by applauding Assemblymembers Wicks and Cunningham for both of these well-meaning bills. They should be given ample consideration, but it’s important to focus on all the details and possible unintended consequences. I look forward to seeing how these bills evolve.

Disclosure: Larry Magid is CEO of ConnectSafely.org, which receives financial support from Meta, Google and other technology companies that could be affected by these bills.
