Episode 92 — Other State Acts: Emerging Comprehensive Privacy Laws

Across the United States, more states are passing comprehensive privacy acts modeled loosely on pioneers like California, Virginia, and Colorado. These new laws share common DNA but often introduce their own twists, creating a patchwork that organizations must navigate. Applicability thresholds are one of the most consistent features: most state acts apply to entities that process personal data of a defined number of residents, generate revenue above a certain threshold, or handle a minimum volume of consumer records. For example, some states set the bar at 100,000 residents annually, while others lower it to 25,000 residents when a business derives a significant share of its gross revenue (half, in Virginia's case) from selling personal data. These thresholds aim to deliver meaningful consumer protection without burdening very small businesses. Learners should understand that applicability tests act like filters: only organizations with a significant data footprint fall under these laws, keeping compliance focused where it matters most.
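To make the filtering idea concrete, here is a minimal Python sketch of a Virginia-style applicability test. The class, function names, and threshold values are illustrative placeholders drawn from the examples above, not the exact terms of any statute.

```python
from dataclasses import dataclass

@dataclass
class BusinessProfile:
    residents_processed: int   # state residents whose data is processed per year
    sale_revenue_share: float  # fraction of gross revenue from selling personal data

def is_covered(profile: BusinessProfile,
               main_threshold: int = 100_000,
               reduced_threshold: int = 25_000,
               revenue_share_floor: float = 0.50) -> bool:
    """Virginia-style applicability filter: covered if the business processes
    data of enough residents, or fewer residents plus significant data-sale
    revenue. All threshold values are placeholders; statutes vary by state."""
    if profile.residents_processed >= main_threshold:
        return True
    return (profile.residents_processed >= reduced_threshold
            and profile.sale_revenue_share >= revenue_share_floor)

# A broker reaching 30,000 residents with 60% of revenue from data sales is covered.
print(is_covered(BusinessProfile(30_000, 0.60)))  # True
```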
Role definitions are another foundational element, clarifying whether an entity is a controller, processor, service provider, or contractor analog. Controllers determine the purposes and means of processing, while processors and their analogs execute tasks under instruction. Service providers and contractors often appear in California-style statutes, with explicit restrictions preventing them from using data outside contractual purposes. These role definitions matter because they distribute compliance duties across the ecosystem. For example, controllers must handle consumer rights requests, while processors must implement security safeguards and assist in compliance. Learners should recognize that defining roles is more than semantics—it creates accountability chains. Without clear boundaries, businesses could deflect responsibility, but with them, obligations flow predictably from decision-makers down to implementers.
Consumer rights are at the heart of every comprehensive state act. Residents are granted the ability to access personal data held about them, correct inaccuracies, request deletion, and receive data in a portable format. These rights empower individuals to engage directly with businesses and reclaim agency over their digital footprints. For instance, a consumer may ask to review their profile at a retailer, correct outdated contact information, or delete records once they stop using the service. Portability further enhances competition by allowing consumers to transfer their data to new providers. Learners should see these rights as operational levers: they transform abstract values of privacy into practical actions consumers can take, shifting power from companies back toward individuals.
Opt-out rights form another common pillar, giving consumers control over how their data is used for monetization and profiling. Most state acts allow residents to opt out of the sale of personal data, the sharing of information for cross-context advertising, and certain automated profiling activities that produce significant effects. For example, a user can prevent their browsing history from being sold to advertisers or block profiling that determines eligibility for financial products. While opt-out regimes differ in detail, the principle is consistent: consumers must be able to say no to intrusive or high-risk practices. Learners should appreciate that opt-outs align with the American tradition of balancing commercial flexibility with individual choice, contrasting with Europe’s opt-in-first approach.
Sensitive data categories require stricter standards, typically anchored to opt-in consent. These include information about race, ethnicity, religion, health, sexual orientation, precise geolocation, and biometrics. Before collecting or using such information, businesses must obtain clear, affirmative agreement. For example, a health tracking app must secure explicit consent before storing reproductive health data or precise GPS trails. Special attention is given to biometrics because they are immutable, and to location because of safety risks. Learners should see opt-in regimes as protective buffers: by raising the bar for processing data that carries the greatest potential harm, states ensure businesses cannot default into risky practices without informed consumer approval.
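The opt-in logic can be expressed as a simple gate. This sketch assumes Python, and the category names mirror the list in the paragraph above; real statutes enumerate their own sensitive categories, so treat the set as illustrative.

```python
# Sensitive categories from the discussion above (illustrative, not statutory).
SENSITIVE_CATEGORIES = {
    "race_ethnicity", "religion", "health", "sexual_orientation",
    "precise_geolocation", "biometrics",
}

def may_process(category: str, opted_in: set[str]) -> bool:
    """Opt-in gate: sensitive data requires affirmative consent on record;
    everything else falls back to the default opt-out regime."""
    if category in SENSITIVE_CATEGORIES:
        return category in opted_in
    return True  # non-sensitive: permitted unless the consumer opts out

print(may_process("precise_geolocation", opted_in=set()))            # False: no consent
print(may_process("precise_geolocation", {"precise_geolocation"}))   # True
```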
Children’s and teen protections are becoming more prominent across state acts. These laws often mirror federal COPPA requirements for children under 13, requiring verifiable parental consent, but many extend protections to teens under 16. For example, some statutes require opt-in consent for targeted advertising directed at adolescents. Others mandate default settings that prioritize safety, such as private profiles and restricted messaging. Learners should recognize that youth protections reflect growing concern about developmental vulnerabilities in digital spaces. When minors engage with online services, regulators expect higher safeguards and stricter default rules to reduce exploitation and manipulation. These provisions illustrate how privacy is layered differently for age groups, with stricter expectations as vulnerability increases.
Privacy notices are another core expectation, requiring businesses to inform consumers at or before collection about what data is gathered, why, how it is used, and with whom it is shared. Notices must include details about retention periods, consumer rights, and opt-out mechanisms. For example, a social platform must disclose categories like browsing data, explain its use in advertising, and provide instructions for opting out of sharing. These disclosures must be updated regularly to reflect actual practices, ensuring transparency. Learners should understand notices as the communication layer of privacy law: they turn internal operations into consumer-facing explanations, building trust while providing the baseline for informed choice.
Reasonable security safeguards appear consistently across state statutes. Businesses must implement measures proportional to the sensitivity of the data and the risks involved, but laws generally avoid prescribing exact technical controls. This flexibility ensures scalability for organizations of different sizes and sectors. For example, a bank processing sensitive financial information must deploy encryption, multifactor authentication, and robust monitoring, while a smaller retailer may focus on access controls and secure deletion. Learners should note that “reasonable” is a dynamic standard—it requires justification and documentation, not bare minimum practices. Security obligations tie privacy to resilience, ensuring personal data is not only handled transparently but also shielded against threats.
High-risk processing activities trigger data protection assessments under many state acts. These assessments evaluate the benefits and risks of sensitive data use, targeted advertising, profiling, or large-scale collection. Controllers must document safeguards, alternatives, and mitigation measures, and often make these assessments available to regulators upon request. For example, an ad-tech company planning to launch a new targeting algorithm must assess whether it might disproportionately impact protected groups. Learners should see assessments as compliance discipline: they force organizations to anticipate risks before harm occurs, embedding accountability into project planning. Like pre-flight checklists, they ensure systems are safe before they launch.
Processor contracts and flow-down requirements ensure obligations cascade through the data ecosystem. Controllers must include specific terms in contracts, such as instructions for processing, confidentiality duties, deletion obligations, and audit rights. Processors, in turn, must flow these obligations down to subprocessors. For example, if a cloud vendor uses a subcontractor for storage, the subcontractor must be bound by the same privacy and security clauses. This layered contracting ensures that promises made to consumers do not evaporate once data moves downstream. Learners should understand contracts as the skeleton of privacy compliance: they structure how obligations extend across entities and prevent accountability gaps in multi-party ecosystems.
Appeals mechanisms, verification standards, and fraud controls strengthen rights-handling processes. When a consumer’s rights request is denied, they must be given a way to appeal and receive a written outcome within a defined timeline. Verification standards ensure that rights are exercised by legitimate consumers, not fraudsters, and fraud controls prevent malicious requests like unauthorized deletions. For example, a business may require secure login or two-factor verification before fulfilling a deletion request. Learners should see these provisions as procedural fairness: they make rights not only available but reliable, ensuring protections cannot be exploited or arbitrarily denied.
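A rough sketch of that procedural flow, assuming Python: unverified requests are denied with a written reason and an appeal deadline rather than silently dropped. The field names and the 45-day window are assumptions for illustration; each statute sets its own timelines.

```python
from datetime import date, timedelta

APPEAL_WINDOW_DAYS = 45  # placeholder; statutory response windows vary by state

def process_rights_request(kind: str, identity_verified: bool) -> dict:
    """Verification gate with an appeal path for a consumer rights request."""
    if not identity_verified:
        # Deny rather than fulfill: fraud control against unauthorized deletions.
        return {
            "kind": kind,
            "status": "denied",
            "reason": "identity could not be verified",
            "appeal_by": (date.today() + timedelta(days=APPEAL_WINDOW_DAYS)).isoformat(),
        }
    return {"kind": kind, "status": "accepted"}

print(process_rights_request("deletion", identity_verified=False)["status"])  # denied
```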
Universal opt-out recognition is gaining momentum across states, requiring businesses to honor browser-level or platform-based privacy signals like Global Privacy Control. This allows consumers to express choices once and have them enforced broadly, eliminating the burden of opting out on every site. For example, if a user enables a browser signal, all compliant websites must stop processing their data for targeted advertising. Some states also maintain their own opt-out lists for businesses to integrate. Learners should see universal signals as a technical solution to operational challenges: they translate legal rights into enforceable digital actions, making privacy protections scalable and frictionless.
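Global Privacy Control is concrete enough to sketch: under the GPC specification, participating browsers send the HTTP request header Sec-GPC: 1 and expose navigator.globalPrivacyControl to page scripts. A minimal server-side check in Python might look like the following; the function name and surrounding plumbing are assumptions.

```python
def gpc_opt_out(headers: dict[str, str]) -> bool:
    """True when an incoming request carries a Global Privacy Control signal.

    Per the GPC specification, participating user agents send the request
    header `Sec-GPC: 1` when the user has enabled the control.
    """
    # HTTP header names are case-insensitive, so normalize before lookup.
    normalized = {k.lower(): v for k, v in headers.items()}
    return normalized.get("sec-gpc", "").strip() == "1"

# A request from a GPC-enabled browser should be treated as an opt-out
# of sale/sharing for targeted advertising.
print(gpc_opt_out({"Sec-GPC": "1", "User-Agent": "ExampleBrowser/1.0"}))  # True
```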
Loyalty programs and financial incentives receive explicit regulation in many state acts. Businesses can offer discounts or benefits in exchange for personal data, but they must disclose the terms, explain the data trade, and ensure participation is voluntary. Coercive designs, such as punishing users who opt out of data sharing, are prohibited. For example, a grocery store offering a discount card must explain that purchase histories will be used for analytics and marketing, but it cannot deny customers the ability to shop without signing up. Learners should see these provisions as balancing innovation and fairness: incentives can exist, but they must be transparent, proportional, and free from exploitation.
Finally, non-discrimination provisions prevent businesses from retaliating against consumers who exercise their rights. Companies cannot deny services, reduce quality, or inflate prices simply because a consumer opts out or requests deletion. Limited exceptions exist when data is integral to a program, such as a loyalty discount tied to sharing purchase data. For example, a consumer opting out of targeted advertising cannot be charged a higher subscription fee for basic service access. Learners should understand non-discrimination as a fairness anchor: it ensures privacy rights do not become hollow by protecting individuals from punishment when they assert control over their data.
State privacy acts differ in how they define key terms such as “sale,” “share,” and “targeted advertising.” California, for example, interprets “sale” broadly to include transfers for valuable consideration, while other states limit the term to monetary exchanges. “Share” often relates to cross-context advertising, though its scope varies. Similarly, targeted advertising may be narrowly defined as behavior-based profiling in some statutes and more expansively in others. These definitional nuances matter because they determine when opt-out rights apply. For instance, a brokered data exchange might trigger obligations in one state but not in another. Learners should recognize these differences as sources of complexity: businesses must carefully parse statutory definitions to avoid assuming uniformity across jurisdictions, since obligations may hinge on subtle shifts in language.
Coverage of employee and business-to-business data is another area of divergence. California extends privacy protections to employees, job applicants, and contractors; its initial workforce and B2B exemptions carried sunset clauses that expired at the start of 2023. Most other states exclude workforce and B2B data entirely, limiting the scope to consumer contexts. For example, under certain laws, an employee may request access to their HR records, while in others, those same records are outside the statute’s reach. This variability reflects different policy choices about whether privacy should extend beyond consumer transactions into professional relationships. Learners should view this as a critical operational distinction: companies must know whether their HR and vendor systems are subject to state privacy laws in each jurisdiction, adjusting compliance programs accordingly.
Private rights of action remain rare in emerging state acts, with most states relying on Attorney General enforcement. California is the notable outlier, allowing private suits in the event of certain security breaches. Other states prefer centralized enforcement to prevent litigation floods, balancing consumer protection with business certainty. For example, a consumer in Virginia cannot sue directly for a rights violation but must rely on the Attorney General to take action. Learners should recognize that enforcement posture influences risk: in California, organizations face both regulatory and civil exposure, while in most other states, enforcement risk flows through public authorities alone.
Cure periods, or grace windows to fix violations, also vary significantly. Virginia offers a 30-day right to cure, while Colorado initially allowed 60 days but sunset its automatic cure period at the beginning of 2025. California has largely moved away from mandatory cure opportunities, preferring direct penalties. This evolution reflects the maturation of privacy regimes: early laws favored education-first enforcement, but as businesses gain experience, tolerance for leniency is shrinking. Learners should see cure periods as both a transitional feature and a compliance signal: organizations must expect stricter accountability as frameworks mature, reducing reliance on correction windows as a safety net.
Opt-out signal recognition timelines are another area of difference. Colorado requires controllers to honor technical signals, such as Global Privacy Control, by specific deadlines, while some states have not yet mandated such mechanisms. Others, like California, already expect businesses to integrate these signals seamlessly. A few jurisdictions may explore maintaining their own opt-out registries, allowing consumers to register once and have their preference applied universally. For learners, this shows how states are experimenting with technical enforcement: some mandate global standards, others build state-specific tools, and businesses must adapt to both approaches simultaneously.
Profiling and automated decision-making governance also vary. Some states, like Colorado, require meaningful disclosures about logic and provide appeals processes for adverse outcomes, while others simply reference profiling in general terms without extensive obligations. Depth of assessment detail also shifts: Colorado requires data protection assessments for high-risk profiling, while other statutes stop at requiring notices. For example, an insurer in Colorado using AI-based pricing must document its risk mitigation steps, whereas in Virginia, profiling obligations are less elaborate. Learners should see this as a spectrum: some states embed ADM governance deeply into their laws, while others leave it to future rulemaking or industry standards.
Data broker registration is an emerging add-on that goes beyond comprehensive acts. Vermont, California, and Oregon already require registration, with California’s Delete Act introducing a centralized deletion mechanism. Other states are considering similar expansions to bring transparency to brokers who otherwise operate invisibly. These obligations supplement core privacy laws by targeting high-risk sectors. For example, a broker registered in California may have to respond to centralized deletion requests that bypass direct consumer contact. Learners should recognize this as a sign of modular privacy lawmaking: states build core frameworks, then layer broker-specific duties on top to address opaque corners of the data economy.
Children’s privacy rules also diverge significantly. Some states mandate default high-privacy settings for minors, similar to California’s Age-Appropriate Design Code, while others stick to COPPA’s federal baseline of under-13 parental consent. Age thresholds vary, with some extending heightened protections to teens under 16. Defaults for profiles, messaging, or geolocation features may also differ by jurisdiction. For example, a platform might be required to set teen accounts as private in one state but face no such obligation in another. Learners should see this divergence as reflecting cultural and political differences about youth autonomy and protection, making harmonization particularly challenging in child-focused services.
Sector carve-outs and preemption rules add further complexity. Many state acts exclude data already regulated under HIPAA, GLBA, or FCRA, but the breadth of these carve-outs varies. Some laws broadly exempt covered entities, while others exempt only data within the federal statute’s scope. This means a hospital may be exempt under one law but still covered for marketing activities under another. Preemption dynamics shape boundaries, preventing duplication but creating tricky edge cases. Learners should understand carve-outs as legal fault lines: they determine which systems are regulated under state privacy laws and which remain governed by federal frameworks, requiring careful legal interpretation.
Rulemaking authority differs widely among states. California empowers a dedicated agency to issue detailed regulations, while other states rely primarily on Attorney General guidance. Some provide little interpretive authority, leaving organizations to navigate statutory text with minimal clarification. This variability affects how much detail is available to businesses when designing compliance programs. For example, Colorado’s Attorney General issues FAQs and interpretive rules, providing operational clarity, while Virginia relies more on the plain statute. Learners should see rulemaking differences as affecting compliance certainty: more guidance creates clarity but also introduces evolving obligations as rules are updated.
Breach notification duties are already common across states, but integration with comprehensive privacy laws adds complexity. Some statutes cross-reference existing breach laws, while others expand them to include obligations for notifying regulators about broader privacy violations. For example, an organization may have to notify consumers not only when sensitive identifiers are breached but also when unauthorized profiling occurs. This convergence of breach and privacy duties highlights the shift from reactive security toward proactive privacy oversight. Learners should view this as evidence of convergence: privacy and security are increasingly treated as inseparable components of consumer protection.
Multi-state compliance strategies often rely on template playbooks to align notices, contracts, and rights operations. These playbooks standardize processes while documenting exceptions for states with stricter rules. For example, a business may adopt California’s retention disclosure model nationwide while flagging Virginia’s narrower opt-out provisions as exceptions. Metrics-driven monitoring then helps surface outliers, showing where compliance may lag across jurisdictions. Learners should see playbooks as practical tools for navigating patchwork laws: they bring structure to complexity and reduce the risk of inconsistent execution.
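One way to implement such a playbook is a protective national baseline plus a table of per-state exceptions. The sketch below assumes Python, and the rule names and state entries are invented purely for illustration, not a statement of current law.

```python
# National baseline aligned to the strictest common rules (illustrative keys).
BASELINE = {
    "honor_gpc_signal": True,
    "publish_retention_schedule": True,
    "sensitive_data_opt_in": True,
    "cure_period_days": 0,
}

# Documented exceptions where a state's rules diverge from the baseline.
STATE_EXCEPTIONS = {
    "VA": {"cure_period_days": 30},
    "CO": {"profiling_assessment_required": True},
}

def rules_for(state: str) -> dict:
    """Merge the baseline with any documented exception for the state."""
    return {**BASELINE, **STATE_EXCEPTIONS.get(state, {})}

print(rules_for("VA")["cure_period_days"])  # 30: Virginia's cure window
```

Keeping exceptions in a single table also supports the metrics-driven monitoring described above, since jurisdictional outliers are explicit entries rather than logic buried in per-state code paths.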
Finally, harmonization strategies typically involve adopting the most protective baseline across states and documenting exceptions. This means implementing universal opt-out recognition, sensitive data opt-in, detailed retention schedules, and strong profiling disclosures, even where not strictly required. By aligning upward, businesses simplify compliance and enhance consumer trust. For example, a retailer may treat all U.S. consumers as though they were Californians, honoring global privacy signals and publishing retention schedules universally. Learners should see harmonization as both defensive and strategic: it reduces risk by satisfying the strictest rules and positions businesses as leaders in consumer trust, transforming compliance into a competitive advantage.
