Consumer Reports supports the California Age-Appropriate Design Code (AB 2273), if it is amended. For over 80 years, Consumer Reports has worked with consumers for truth, transparency, and fairness in the marketplace. We are strong proponents of public policy that protects consumer safety and bolsters consumers’ privacy. It is within this framework that we support the goals of AB 2273: to ensure greater protections for the safety and privacy of children and teens online. However, we believe that significant changes to this legislation are necessary to achieve these goals without creating new problems.
All Californians, including kids, need more protections online. And kids are online more than ever, leaving them vulnerable to inappropriate data practices. In 2020, there were nearly nine million apps worldwide, many of which were directed at children or designed to work in conjunction with a connected device for children. In 2021, 57% of kids between eight and 12 had a tablet, and 94% of households with children and teens between eight and 18 had access to a smartphone. And the amount of time children spend on these devices has increased over time. In 2021, 64% of kids between eight and 12 reported that they watched online videos “every day,” up from 56% in 2019. Eighteen percent reported that they used social media each day, up from 13% in 2019.
Too often, businesses have skirted the Children’s Online Privacy Protection Act (COPPA), the federal law that requires parental consent for processing the personal data of children under 13. For example, in 2018, researchers found that the majority of the 5,855 most popular free children’s apps were potentially in violation of COPPA, due to their use of third-party software development kits (SDKs). Although many SDKs offer the ability to comply with COPPA by disabling tracking and behavioral advertising, the study suggests that a majority of the apps tested either do not make use of these configurations or “incorrectly propagate them across mediation SDKs.” Further, the researchers found that “19% of children’s apps collect identifiers or other personally identifiable information (PII) via SDKs whose terms of service outright prohibit their use in child-directed apps.” In recent years, the Federal Trade Commission has taken action against Google and YouTube, as well as TikTok, for COPPA violations.
That’s why we particularly appreciate provisions in the bill that would strengthen privacy protections, including by requiring covered businesses to “Provide prominent, accessible, and responsive tools to help children exercise their privacy rights and report concerns[,]” and to “provide an obvious signal to the child when they are being monitored or tracked[.]” These are reasonable mandates that would give parents and children greater ability to limit unwanted and unwarranted privacy invasions.
Conversely, some of the provisions of the bill appear to undermine its privacy intent, and we make several suggestions to better ensure that privacy is adequately protected.
- Remove the age verification requirement. Section 1798.99.31(a)(3) of the bill appears to mandate that services identify and verify the ages of all users. Mandated identity verification would require invasive and expensive data collection and eliminate consumers’ right to read and speak anonymously, undermining the bill’s fundamental objectives.
- Accommodate teens’ unique circumstances. Protections that are appropriate for kids under 13 are very different from those appropriate for teenagers. In some cases, parental control over teens’ activity online could pose immediate threats to their well-being. We suggest either bifurcating the bill to provide different protections for kids under 13 and for teens, or limiting the bill’s protections to kids under 13.
- Clarify obligations. Many of the bill’s requirements are vague, such as the requirements to “maintain the highest level of privacy by default” or “universally uphold published terms, policies, and community standards.” Further, the broad prohibition on all secondary use of children’s data does not specify any targeted carveouts for operational uses such as fraud prevention or analytics. To provide more certainty for consumers as well as companies, these obligations should be more clearly articulated.
- Clarify applicability. Currently the bill applies to any service that “more likely than not” would be accessed by any child. Statistically, that would seemingly apply to the vast majority of websites and businesses. While all sites should have some obligation to ensure that kids are protected on their services, only sites with a substantial audience of child users should be obligated to enact heightened protections.
- Provide additional funding and time for the California Privacy Protection Agency (CPPA) to comply. The measure directs the CPPA to create a new task force focused on children’s privacy and safety online, and to conduct an extensive rulemaking to implement the measure. Given the CPPA’s scarce resources, these additional demands could undermine its work implementing and enforcing the California Privacy Rights Act (CPRA), which would compromise privacy protections for all Californians.
For the full letter, please see the attached PDF.