Episode 88 — California AADC: Age-Appropriate Design Code Protections

The California Age-Appropriate Design Code, or AADC, applies to online services that are “likely to be accessed” by children and teens under eighteen. This standard is broader than the federal COPPA framework, which focuses narrowly on services directed at children under thirteen. Instead, the AADC looks at practical use and market realities: does the service have child-friendly content, appeal to younger audiences, or collect signals indicating regular youth access? For example, a gaming platform with cartoon aesthetics, a social media app popular with teens, or an educational tool used in classrooms would all fall within scope even if not exclusively designed for minors. The “likely to be accessed” test shifts responsibility to businesses to evaluate risk indicators and act conservatively when children are present in their user base. Learners should understand this as a protective standard that expands obligations wherever youth engagement is reasonably foreseeable.
Default high-privacy settings are a cornerstone of the AADC. Covered services must configure accounts for minors to provide maximum protection without requiring manual adjustments. For instance, location sharing should be disabled by default, contact lists should not be publicly viewable, and profile visibility should begin at the lowest exposure setting. The logic is that children and teens may lack the judgment or technical skills to navigate privacy menus, so defaults must protect them upfront. This flips the typical business model of encouraging openness for engagement: instead, safety and restraint become the starting point. For learners, this principle reflects the shift from reactive to proactive privacy, ensuring protections are baked into the initial design rather than left to young users to configure.
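To make this concrete for technically minded learners, here is a minimal sketch, in Python, of how a service might apply protective defaults when an account is flagged as belonging to a minor. The field names and values are hypothetical illustrations, not settings prescribed by the statute.

```python
from dataclasses import dataclass

@dataclass
class AccountSettings:
    profile_visibility: str       # "private", "friends", or "public"
    location_sharing: bool
    contact_list_public: bool
    searchable_by_strangers: bool

def default_settings(is_minor: bool) -> AccountSettings:
    """Return account defaults; minors start at the most protective values."""
    if is_minor:
        return AccountSettings(
            profile_visibility="private",   # lowest exposure setting
            location_sharing=False,         # off unless the user turns it on
            contact_list_public=False,      # contacts never public by default
            searchable_by_strangers=False,  # not discoverable outside known contacts
        )
    # Adults start with moderate defaults and may opt into more openness later;
    # minors must actively opt up, never be opted in.
    return AccountSettings("friends", False, False, True)
```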
Data minimization under the AADC restricts collection to what is necessary for the service’s core functions. This means platforms cannot gather additional information simply because it may be useful for advertising, analytics, or future development. For example, a study app may legitimately collect login credentials and progress data but has no justification to harvest precise geolocation or browsing history outside the app. The goal is to prevent the accumulation of sensitive information about minors that could expose them to risks if repurposed or breached. Learners should see this as a safeguard against function creep: every piece of data collected must be tied to a clear, child-focused purpose, reinforcing the idea that less data equals lower risk in vulnerable populations.
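One simple way to operationalize minimization is an allow-list keyed to the service's core purposes, so any field outside the list is rejected at collection time. The purposes and field names below are illustrative assumptions, not a required schema.

```python
# Hypothetical allow-list: each core purpose maps to the only fields it may collect.
ALLOWED_FIELDS = {
    "authentication": {"username", "password_hash"},
    "progress_tracking": {"lesson_id", "score", "completed_at"},
}

def collect(purpose: str, data: dict) -> dict:
    """Keep only fields tied to a declared purpose; drop everything else."""
    allowed = ALLOWED_FIELDS.get(purpose, set())
    rejected = set(data) - allowed
    if rejected:
        # Precise geolocation, browsing history, etc. never reach storage.
        print(f"Rejected fields for purpose '{purpose}': {sorted(rejected)}")
    return {k: v for k, v in data.items() if k in allowed}

collect("progress_tracking", {"lesson_id": 7, "score": 92, "geolocation": "37.77,-122.41"})
```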
Privacy by design and by default are explicit obligations under the AADC. This means organizations must integrate child protection into product development from the earliest stages, rather than bolting on privacy features at the end. Teams must evaluate how design decisions—such as feed algorithms, notification settings, or sharing options—affect children and teens. For example, if a music streaming app includes a social feature, the default should be private listening unless a user, with age-appropriate understanding, chooses otherwise. Privacy by default ensures that protections are automatically enabled, while privacy by design ensures that risks are considered before release. Learners should view this as a holistic requirement: it is not just about a toggle in settings but about embedding child-centered thinking throughout the product lifecycle.
Profiling and targeted advertising are sharply constrained for minors under the AADC. Businesses must limit or avoid creating behavioral profiles of children and teens for marketing purposes. For example, an online store may recommend products based on current session activity but should not build long-term advertising profiles from browsing across platforms. The distinction between contextual and behavioral targeting matters here: contextual suggestions aligned with content are generally safer, while cross-context behavioral profiling raises serious concerns. This rule acknowledges that children may not fully understand or consent to being profiled, and targeted ads could exploit developmental vulnerabilities. Learners should see this as part of a global trend: regulators are increasingly skeptical about behavioral advertising directed at minors, with California setting clear boundaries against manipulative practices.
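The contextual-versus-behavioral distinction can also be expressed in code: for a minor, ad selection looks only at the content of the current session and never consults a stored cross-context profile. The function, categories, and ad names here are hypothetical, offered only to illustrate the boundary.

```python
# Illustrative sketch: contextual targeting for minors, no behavioral profile lookup.
def select_ad(user_is_minor: bool, current_page_topic: str, stored_profile: dict | None):
    contextual_ads = {
        "guitar_lessons": "ad_for_guitar_strings",
        "math_homework": "ad_for_graphing_calculator",
    }
    if user_is_minor:
        # Only the content being viewed right now informs the ad; the profile is ignored.
        return contextual_ads.get(current_page_topic, "generic_house_ad")
    # For adults, behavioral signals might be consulted, subject to their own rules.
    if stored_profile and stored_profile.get("interests"):
        return f"behavioral_ad_for_{stored_profile['interests'][0]}"
    return contextual_ads.get(current_page_topic, "generic_house_ad")
```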
Geolocation data poses particular risks, and the AADC imposes strict limits on its collection. Services must not collect or retain precise location data unless it is strictly necessary for providing a requested feature, and even then, clear and prominent controls must allow users to toggle it on or off. For instance, a mapping app may use location to provide directions but must not track background movement for unrelated analytics. The prominence of location controls reflects the heightened safety risk: misuse of geolocation data can expose minors to stalking, harassment, or exploitation. Learners should recognize this as a core safety feature, ensuring that sensitive information about a child’s whereabouts is only processed when truly necessary and always under transparent user control.
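In practice that might look like a gate that releases precise location only while a location-dependent feature is actively in use and the user's toggle is switched on, with nothing retained afterward. The structure below is an illustrative assumption, not a mandated design.

```python
class LocationGate:
    """Release precise location only for a requested, location-dependent feature."""
    LOCATION_FEATURES = {"turn_by_turn_directions"}  # hypothetical feature list

    def __init__(self):
        self.user_toggle_on = False  # prominent, user-controlled switch

    def get_location(self, feature: str, read_gps):
        if feature not in self.LOCATION_FEATURES:
            raise PermissionError("Feature does not require precise location")
        if not self.user_toggle_on:
            raise PermissionError("User has location sharing switched off")
        fix = read_gps()  # used immediately for the requested feature
        return fix        # not written to analytics or long-term storage

gate = LocationGate()
gate.user_toggle_on = True
print(gate.get_location("turn_by_turn_directions", lambda: (37.77, -122.41)))
```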
The AADC also addresses nudges, or manipulative prompts, that push minors toward weaker privacy settings or longer engagement. These include brightly colored “accept all” buttons contrasted with obscure “reject” options, prompts encouraging location sharing for the “full experience,” or game mechanics that reward oversharing. Such tactics are restricted because they exploit developmental vulnerabilities, steering children toward choices they may not fully understand. For example, a social media app cannot design its onboarding to encourage public profiles by default through gamified nudges. Learners should view this as an extension of dark pattern prohibitions: when applied to children, manipulative design crosses into harmful exploitation and is explicitly prohibited.
Transparency standards under the AADC require businesses to explain their data practices in clear, age-appropriate language. This means privacy notices cannot be dense legal text but must use simple terms, graphics, or even interactive explanations that children and teens can reasonably understand. For example, a video app might use a short animation to explain how watch history is saved and how to delete it. Transparency is not just about compliance—it builds trust and empowers young users to participate in their own privacy decisions. Learners should understand that transparency under AADC is developmental: communication must match the cognitive abilities of the audience, ensuring clarity for children at different age levels.
Age estimation is permitted under the AADC but must be proportionate to risk. Services are not expected to know every user’s exact age but must implement reasonable methods to assess whether children are likely to access the platform. Low-risk services may use self-declaration, while higher-risk services must employ stronger checks like age verification mechanisms. For example, a social platform with significant sharing features may need more robust age-gating than a homework helper app. The proportionality principle avoids over-collection of data while ensuring services cannot plead ignorance about youth usage. Learners should see this as a balancing act: the law seeks to protect children without forcing services to collect unnecessary sensitive data about age.
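One way to model proportionality is to map a service's assessed risk tier to the lightest age-assurance method that suffices, so higher-risk features trigger stronger checks without over-collecting data everywhere. The tiers and method names below are assumptions for illustration.

```python
# Hypothetical mapping from service risk tier to an age-assurance method.
AGE_ASSURANCE_BY_RISK = {
    "low": "self_declaration",           # e.g., a homework helper app
    "medium": "age_estimation_signals",  # e.g., analysis of existing usage signals
    "high": "verified_age_check",        # e.g., a platform with broad public sharing
}

def required_age_assurance(risk_tier: str) -> str:
    """Pick the least data-intensive method that matches the assessed risk."""
    return AGE_ASSURANCE_BY_RISK.get(risk_tier, "verified_age_check")

print(required_age_assurance("low"))   # self_declaration
print(required_age_assurance("high"))  # verified_age_check
```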
The AADC requires risk assessments similar to Data Protection Impact Assessments. These reviews must identify likely risks to children from features, data practices, and design choices. For example, an assessment might evaluate whether autoplay leads to excessive screen time or whether default visibility settings expose minors to strangers. The outcome must include mitigation measures and formal signoff before release. This creates a disciplined process where risks are identified, documented, and tracked. Learners should see this as the AADC’s structural safeguard: it requires organizations to look ahead and anticipate child safety risks, rather than reacting after harm has occurred.
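A lightweight record of that process might capture each identified risk, its mitigation, and a go/no-go check that blocks release until every risk has a documented mitigation and a sign-off. The fields are illustrative, not a regulator-prescribed template.

```python
from dataclasses import dataclass, field

@dataclass
class RiskItem:
    description: str         # e.g., "autoplay encourages excessive screen time"
    mitigation: str = ""     # e.g., "autoplay off by default for minors"
    signed_off_by: str = ""  # accountable risk owner

@dataclass
class ChildRiskAssessment:
    feature: str
    risks: list[RiskItem] = field(default_factory=list)

    def ready_for_release(self) -> bool:
        """Go/no-go: every risk needs a mitigation and an owner's sign-off."""
        return all(r.mitigation and r.signed_off_by for r in self.risks)

assessment = ChildRiskAssessment("video feed")
assessment.risks.append(RiskItem(
    "default visibility exposes minors to strangers",
    mitigation="visibility defaults to friends only",
    signed_off_by="trust_and_safety_lead",
))
print(assessment.ready_for_release())  # True only when all risks are mitigated and signed off
```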
Governance obligations ensure that risk management has accountable owners. Organizations must designate risk owners who sign off on assessments and track mitigation implementation. This accountability structure means responsibility cannot be diffused across departments. For instance, if a new chat feature poses risks of inappropriate contact, the designated owner must document safeguards, oversee deployment, and verify controls remain effective. Governance under the AADC resembles other compliance frameworks: clear ownership ensures visibility, follow-through, and accountability at the leadership level. Learners should view governance as the backbone of implementation, making sure promises are not only designed but actually enforced across the service lifecycle.
Restrictions on sharing and selling minors’ data are particularly strong. Covered services cannot sell children’s personal information and must impose strict limits on sharing, even for seemingly innocuous purposes like analytics. For example, a gaming platform popular with teens cannot transmit behavioral profiles to ad networks, regardless of consent. These restrictions cut off the flow of minors’ data into secondary markets, where risks of profiling, exploitation, or loss of control multiply. Learners should see this as an alignment with broader California privacy rules but elevated for youth populations: where adults may opt out, children are protected by outright bans and tighter default restrictions.
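An outbound data gate can enforce that rule mechanically: any transfer tagged as a sale or as cross-context sharing is refused outright when the data subject is a minor, regardless of any consent flag. The purpose categories below are simplified assumptions.

```python
# Hypothetical outbound-transfer gate for minors' personal information.
PROHIBITED_FOR_MINORS = {"sale", "cross_context_behavioral_advertising"}

def authorize_transfer(subject_is_minor: bool, purpose: str, consent_given: bool) -> bool:
    if subject_is_minor and purpose in PROHIBITED_FOR_MINORS:
        return False  # an outright ban; consent does not unlock it
    if purpose == "service_provider_processing":
        return True   # e.g., contracted hosting bound by contract terms
    return consent_given

print(authorize_transfer(True, "sale", consent_given=True))                           # False
print(authorize_transfer(True, "service_provider_processing", consent_given=False))  # True
```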
Security measures under the AADC must be proportionate to the sensitivity of children’s data and the risks of harm. This includes technical safeguards like encryption, monitoring, and access controls, as well as organizational measures like workforce training. For example, a social app collecting teen chats should pair strong encryption with strict moderation protocols to reduce exploitation risks. The proportionality requirement recognizes that services with richer data and higher risks must apply stronger protections. Learners should see this as a dynamic standard: compliance is not about adopting a static checklist but about tailoring safeguards to the nature of child data and the context of use.
Finally, incident handling practices must account for the special vulnerabilities of minors. Businesses must design response plans that prioritize child safety, including rapid containment, parental notification, and regulatory reporting when incidents affect children’s data. For example, if a breach exposes location data of teen users, the organization must act quickly not just to secure systems but to alert families and authorities about potential safety threats. The emphasis here is on child protection, not just technical remediation. Learners should recognize that in the context of children, incident response is not only about system recovery—it is about safeguarding vulnerable individuals in real-world scenarios that demand urgent, empathetic action.
For more cyber related content and books, please check out cyber author dot me. Also, there are other prepcasts on Cybersecurity and more at Bare Metal Cyber dot com.
High-risk features such as autoplay, social sharing, and public-by-default settings receive special scrutiny under the California AADC. Autoplay, while convenient, can encourage prolonged engagement that children may struggle to regulate. Social sharing features can expose minors to unintended audiences, and default public settings often reveal more than a child realizes. The law expects organizations to review these features carefully and set them conservatively, ensuring defaults lean toward safety rather than exposure. For example, a video platform popular with teens should disable autoplay by default for minors and restrict initial visibility of posts to friends only. This review process acknowledges that design choices carry behavioral consequences, and when those choices intersect with youth, the stakes become higher. Learners should see these obligations as reminders that privacy and safety are not just about data flows but about how digital environments subtly shape children’s behavior and risk.
Account controls are also emphasized, particularly around visibility and communication. Covered services must provide minors with tools to manage who can view their profiles, send them messages, or contact them in other ways. Importantly, these controls must default to protective settings, such as blocking messages from unknown users or limiting contact to approved connections. Consider a social app where a child signs up for the first time: by default, their profile should not be searchable by strangers, and only known friends should be able to initiate communication. By establishing such defaults, the AADC seeks to prevent predatory or exploitative contact. Learners should understand that account controls are not optional extras; they are core safeguards that reduce exposure to inappropriate interactions and reinforce a baseline of online safety for minors.
Design testing under the AADC must include usability checks tailored to children. This means organizations cannot assume that standard interfaces designed for adults will be comprehensible to younger users. Instead, they must test features with child participants or conduct age-appropriate evaluations to confirm understanding. For example, if a platform provides a toggle to limit location sharing, the wording and design must be clear enough for a thirteen-year-old to interpret correctly. Usability testing should ensure children can exercise rights and protections without confusion. Learners should see this as part of a broader shift toward human-centered privacy: it is not enough for options to exist—they must be designed and validated in ways that match the capabilities and comprehension levels of the people expected to use them.
Parental control transparency is another obligation. While parental controls can help safeguard children, they must not become covert surveillance mechanisms that undermine trust. The AADC requires services to disclose to both children and parents what monitoring or control functions exist, ensuring minors are aware when their activity is being tracked. For example, if a parent activates screen-time limits or content filters, the child should be informed about these measures rather than monitored secretly. This rule balances parental oversight with respect for children’s dignity and autonomy. Learners should appreciate the nuance here: child protection and parental rights are important, but so too is the child’s ability to develop trust in digital systems. Transparency keeps all parties aware and ensures oversight is applied responsibly.
The AADC also extends consumer rights handling to minors, requiring businesses to support access, deletion, and consent withdrawal in ways that are practical for children and their guardians. For example, a teen should be able to request deletion of their old posts or withdraw consent for data processing given earlier in their account use. Parents or guardians may also exercise rights on behalf of younger children, provided the process includes verification. These rights ensure minors are not locked into digital histories that may haunt them later. Learners should think of this as digital hygiene: just as one can clean up physical belongings over time, minors should have mechanisms to clean up digital traces, supported by businesses that must honor such requests promptly and respectfully.
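A sketch of a rights-request handler shows the moving parts: the request type, who is making it (the teen or a verified guardian), and the action the service takes in response. The function name, verification step, and responses are assumptions for illustration.

```python
from datetime import datetime, timezone

def handle_rights_request(request_type: str, requester: str, guardian_verified: bool = False):
    """Hypothetical handler for access, deletion, and consent-withdrawal requests."""
    if requester == "guardian" and not guardian_verified:
        return "pending: guardian identity must be verified first"
    actions = {
        "access": "export the minor's data in a readable format",
        "deletion": "delete stored posts and associated records",
        "withdraw_consent": "stop the processing that relied on the earlier consent",
    }
    action = actions.get(request_type)
    if action is None:
        return "unsupported request type"
    # Record when the request was honored so promptness can be demonstrated later.
    return f"{action} (completed {datetime.now(timezone.utc).date()})"

print(handle_rights_request("deletion", requester="teen"))
print(handle_rights_request("access", requester="guardian", guardian_verified=False))
```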
Dark pattern avoidance extends into sign-up, consent, and subscription cancellation flows. Businesses cannot design systems that pressure or trick minors into giving up more data or locking into services. For example, a game app should not use flashing colors or countdown timers to push a child into clicking “share location now,” nor should it hide cancellation options behind complex navigation. These prohibitions recognize that minors are particularly vulnerable to manipulative design tactics, making fairness in interface design a legal requirement. Learners should see this as part of a broader regulatory trend: laws increasingly scrutinize not just whether choices exist but how those choices are presented. When design crosses into manipulation, especially for children, it undermines autonomy and invites enforcement.
Vendor contracts take on heightened importance when third parties handle minors’ data. Contracts must include clauses that address how child data is processed, specify retention periods, and require deletion upon request. For example, if a children’s learning platform uses a cloud analytics provider, the contract must ensure that provider cannot retain student data for secondary uses after the service ends. Verification of deletion must also be documented. These clauses ensure accountability across the chain of processing, preventing gaps where minors’ data could escape protections. Learners should understand vendor management as the extension of compliance obligations: the promises made to children and parents must flow downstream to every partner, supported by enforceable legal agreements.
Documentation of risks and mitigations is another explicit duty under the AADC. Before launching a product or feature likely to be accessed by minors, organizations must record identified risks, the mitigation steps taken, and the decision whether to proceed. This creates a go/no-go checkpoint that ensures child safety is not overlooked in the rush to market. For example, if a social feature carries risks of bullying, the documented record might show the addition of robust reporting tools and default limits before approval. Learners should see this as evidence of maturity: documentation not only structures decision-making internally but also provides regulators with proof that risks were actively considered and managed rather than ignored.
Training requirements extend to all teams involved in building or maintaining services likely to be accessed by minors. Product designers, engineers, and trust and safety staff must understand AADC duties and how they apply to their daily work. Training should cover obligations such as data minimization, default privacy settings, and manipulative-design restrictions. For example, a designer creating a new onboarding flow should know how to avoid nudges that steer children toward weaker privacy. Training reinforces that compliance is not siloed in legal teams but is embedded across disciplines. Learners should appreciate this as a cultural change: child privacy becomes a shared responsibility, and training equips every contributor to uphold the standard consistently.
Cross-jurisdiction reconciliation is necessary because California’s AADC overlaps with federal and other state child privacy laws. For example, COPPA already requires verifiable parental consent for under-13 data collection, while states like Connecticut and Utah have their own child privacy provisions. Businesses must harmonize these obligations, applying the strictest rules where they overlap. For instance, a platform accessible to children nationally may adopt California’s higher-age protections across the board to simplify compliance. Learners should view this as a common compliance strategy: harmonization reduces fragmentation and ensures children enjoy consistent protections regardless of location, reinforcing fairness and operational clarity.
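Harmonization can be reduced to choosing the most protective value wherever obligations overlap, such as the highest protected age and the strictest consent rule. The figures below are simplified placeholders used only to show the pattern, not a statement of what each law requires.

```python
# Simplified, hypothetical harmonization: apply the most protective value everywhere.
JURISDICTION_RULES = {
    "COPPA":           {"protected_age": 13, "parental_consent_required": True},
    "California_AADC": {"protected_age": 18, "parental_consent_required": False},
}

def harmonized_policy(rules: dict) -> dict:
    return {
        "protected_age": max(r["protected_age"] for r in rules.values()),
        "parental_consent_required": any(r["parental_consent_required"] for r in rules.values()),
    }

print(harmonized_policy(JURISDICTION_RULES))
# {'protected_age': 18, 'parental_consent_required': True}
```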
Metrics are expected to monitor program performance and emerging risks. Organizations should track indicators such as the number of child complaints, the speed of remediation for reported issues, and trends in incident frequency. For example, a rise in reports of inappropriate contact on a platform may signal weaknesses in default messaging controls. Tracking these metrics helps organizations prioritize fixes and demonstrate to regulators that protections are continuously monitored. Learners should see metrics as the feedback loop: they show whether safeguards are working and provide evidence that issues are addressed proactively rather than reactively.
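A minimal metrics tracker might count child-safety complaints by category and average time-to-remediation so trends can be spotted early and shown to regulators. The metric names are illustrative assumptions.

```python
from statistics import mean

class ChildSafetyMetrics:
    """Illustrative tracker for complaint volume and remediation speed."""
    def __init__(self):
        self.complaints = []  # list of (category, remediation_days)

    def record(self, category: str, remediation_days: float):
        self.complaints.append((category, remediation_days))

    def summary(self) -> dict:
        by_category = {}
        for category, _ in self.complaints:
            by_category[category] = by_category.get(category, 0) + 1
        return {
            "total_complaints": len(self.complaints),
            "complaints_by_category": by_category,
            "avg_remediation_days": mean(d for _, d in self.complaints) if self.complaints else 0.0,
        }

m = ChildSafetyMetrics()
m.record("inappropriate_contact", remediation_days=2)
m.record("inappropriate_contact", remediation_days=5)
print(m.summary())  # a spike in one category flags a weak default to revisit
```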
Post-launch monitoring ensures that risks identified in assessments are not forgotten once a product is released. Organizations must track real-world usage, identify emergent risks, and implement rapid configuration updates to close gaps. For example, if children begin using a feature in unintended ways that increase exposure, the company must adjust defaults or controls quickly. This expectation reflects the reality that children’s behavior evolves, and protections must evolve with it. Learners should think of this as ongoing care: just as toys are recalled or updated when safety issues arise, digital services must be continuously monitored to keep children safe.
External communications are another important dimension. The AADC expects organizations to align messaging for parents, educators, and regulators, ensuring that disclosures are consistent and helpful. For example, FAQs for parents should explain privacy settings clearly, while regulators may require technical detail on compliance practices. Communication builds trust with external stakeholders who play key roles in safeguarding children. Learners should view this as transparency at multiple levels: not only must children and teens understand protections, but adults responsible for their wellbeing must also have confidence in how services are designed and maintained.
Finally, program governance under the AADC requires periodic reassessment and evidence retention. Organizations must revisit their risk assessments, update documentation, and retain evidence of compliance for potential audits. This cadence ensures protections are not static but evolve as services, risks, and laws change. For example, an annual review may identify that age-estimation tools need updating as technology advances. Retained evidence, such as training logs, consent flows, and deletion certificates, provides defensibility if challenged. Learners should understand this governance as a cycle: design, document, monitor, and revisit. By embedding periodic reviews, businesses can sustain compliance and continuously reinforce their commitment to protecting children online.
