Episode 33 — FTC Authority: Section 5 and Consumer Protection in Privacy

The Federal Trade Commission’s authority under Section 5 of the FTC Act sits at the heart of U.S. privacy enforcement. Section 5 prohibits unfair or deceptive acts or practices in or affecting commerce, which has allowed the Commission to become the nation’s de facto privacy regulator even without a comprehensive federal privacy law. By using this broad mandate, the FTC has been able to address misleading privacy statements, insufficient data security practices, and a wide range of emerging issues in digital markets. For learners, understanding Section 5 is essential because it provides the legal foundation for most enforcement actions that shape how companies present privacy policies, design consent mechanisms, and manage consumer data. Without appreciating this foundation, it is difficult to grasp why so many organizations treat FTC consent decrees as the guiding template for compliance programs.
The deception prong of Section 5 is perhaps the most visible in privacy enforcement. A company engages in deception when it makes a representation, omission, or practice that is likely to mislead consumers, and where that misleading element is material to a consumer’s choice or conduct. Put simply, if a company promises to protect data in a certain way but fails to follow through, it risks a deception finding. Materiality is key, because trivial misstatements rarely form the basis of a case. For example, overstating encryption strength or falsely claiming compliance with a privacy framework can mislead reasonable consumers into trusting services they otherwise would not. The FTC emphasizes consumer reliance on these representations, making companies accountable for every statement in their privacy notices, marketing claims, or customer support scripts.
Unfairness, by contrast, focuses on practices that cause or are likely to cause substantial injury to consumers that is not reasonably avoidable and not outweighed by countervailing benefits. This standard allows the FTC to address harms even where no explicit promise was broken. Substantial injury often involves financial loss, health risks, or significant invasions of privacy. For instance, exposing sensitive personal information due to weak security controls may be deemed unfair because consumers cannot reasonably avoid the risk, and the benefits of lax safeguards do not outweigh the harm. Importantly, unfairness captures systemic issues where companies rely on manipulative designs or default settings that expose users to avoidable risks. This flexible framework allows regulators to address practices that evolve faster than legislation.
One of the most significant applications of unfairness in privacy has been data security. The FTC has argued that failing to implement reasonable security measures constitutes an unfair practice, especially when the risks and consequences are foreseeable. The reasonableness test here is pragmatic, asking whether the safeguards are proportionate to the sensitivity of the data and the likelihood of harm. For example, storing unencrypted Social Security numbers on an unsecured server would clearly fail the standard, while more complex debates arise around the adequacy of encryption, access controls, or patching cadence. For learners, the takeaway is that “reasonable” does not mean perfect, but it does mean defensible—companies must show they considered risks and adopted controls aligned with common security practices.
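The point above about unencrypted Social Security numbers can be made concrete with a minimal sketch. This is a hypothetical illustration, not any company's actual system: it shows tokenization, one common way to keep sensitive identifiers out of application records, with the `TokenVault` class and its methods invented here for the example (a real vault would itself be encrypted at rest and access-controlled).

```python
import secrets

# Hypothetical sketch: tokenizing Social Security numbers so the application
# record never holds them in plaintext. The dict stands in for a separately
# secured vault service; names here are illustrative only.
class TokenVault:
    def __init__(self):
        self._vault = {}  # token -> real value; would be encrypted at rest

    def tokenize(self, ssn: str) -> str:
        token = secrets.token_urlsafe(16)  # unguessable surrogate value
        self._vault[token] = ssn
        return token

    def detokenize(self, token: str) -> str:
        # Access here should be restricted and logged in a real system.
        return self._vault[token]

vault = TokenVault()
record = {"name": "Jane Doe", "ssn": vault.tokenize("123-45-6789")}
assert record["ssn"] != "123-45-6789"  # the stored record holds only a token
assert vault.detokenize(record["ssn"]) == "123-45-6789"
```

The design choice being illustrated is proportionality: a control like tokenization is "reasonable" for data as sensitive as an SSN precisely because a breach of the application database then exposes only surrogate values.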
Because privacy notices often form the first point of consumer interaction, they have become a cornerstone of FTC deception cases. When a company publishes a notice describing how it will collect, use, or share information, that document becomes a binding set of promises under Section 5. If the company deviates—such as by sharing data with advertisers despite claiming it never would—it can face charges of deception. The same holds true for vague or overly complex statements that obscure material details. In this way, the FTC treats privacy notices not just as compliance checklists but as enforceable consumer contracts. For organizations, this transforms a website footer into a critical legal obligation that must be carefully drafted, reviewed, and lived up to operationally.
Consent, choice, and preference handling have also been recurring flashpoints in FTC enforcement. If a company offers users the ability to opt out of targeted advertising but then disregards that preference, it risks a finding of deception. Similarly, burying key options in obscure menus or making opt-out procedures so cumbersome that users effectively cannot exercise choice may be deemed unfair. The Commission has underscored that true consumer autonomy requires both clarity and functionality. A familiar analogy is the unsubscribe button in an email: if clicking it leads to confusing steps or requires entering unnecessary information, the promise of choice is hollow. For learners, the principle is that offering options is not enough—companies must respect and honor them consistently.
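The principle that an opt-out must actually be honored at serving time can be sketched in a few lines. This is a simplified, hypothetical example; the `prefs` store and `serve_ad` function are invented for illustration and do not represent any real ad system.

```python
# Hypothetical sketch: a stored advertising opt-out is consulted on every
# request, not just recorded at signup. All names are illustrative.
prefs = {"user-42": {"targeted_ads": False}}  # set when the user opted out

def serve_ad(user_id: str) -> str:
    # Default to targeting only where no opt-out has been recorded; the
    # recorded preference must override everything downstream.
    if not prefs.get(user_id, {}).get("targeted_ads", True):
        return "contextual ad"  # non-personalized fallback
    return "targeted ad"

assert serve_ad("user-42") == "contextual ad"  # opt-out is respected
assert serve_ad("user-99") == "targeted ad"    # no preference on file
```

The deception risk described above arises exactly when the check in `serve_ad` is missing: the company has promised a choice in its interface but the serving path ignores the stored preference.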
In recent years, the FTC has increasingly scrutinized so-called dark patterns. These are interface designs that subtly manipulate users, nudging them toward less privacy-protective choices. Examples include pre-ticked consent boxes, confusing toggle labels, or misleading button colors that make declining harder than accepting. Dark patterns can cross the line into deception if they misrepresent choices, or into unfairness if they make it unreasonably difficult for consumers to avoid harm. Regulators view them as evidence of intent, suggesting that companies deliberately undermine user autonomy. For anyone preparing for the exam, it is worth remembering that design choices themselves can constitute a violation, even when no explicit misrepresentation is made. The FTC’s focus here highlights the intersection of law, psychology, and technology design in shaping privacy outcomes.
The Children’s Online Privacy Protection Act, or COPPA, is one of the few U.S. federal privacy laws with explicit statutory rules, and the FTC enforces it alongside Section 5. COPPA requires verifiable parental consent before collecting personal information from children under thirteen. The interplay with Section 5 arises because misrepresentations about COPPA compliance, or practices that unfairly expose children’s data, can trigger enforcement. For example, if a company claims to block underage signups but does nothing to verify age, it may mislead parents and regulators. In this way, COPPA operates as a specialized framework while Section 5 fills gaps in broader consumer protection. The combination gives the FTC a strong foothold in protecting children’s online privacy and sets the stage for debates about whether similar frameworks should exist for other vulnerable populations.
Another dimension of Section 5 analysis involves sensitive data categories. Information such as health data, financial details, geolocation, or biometric identifiers often triggers heightened expectations of protection. When companies handle these categories, even small missteps can be considered material or substantially injurious. For instance, failing to safeguard genetic data used in ancestry services could cause harms ranging from discrimination to identity theft. The FTC uses sensitivity as a lens to calibrate enforcement, recognizing that consumers reasonably expect greater diligence when stakes are higher. Learners should note that sensitivity is not formally codified in Section 5 but arises from a common-sense understanding of risk and consumer expectations. This flexible standard allows the Commission to adapt enforcement priorities as new data categories, such as machine learning training sets, gain prominence.
The responsibility of vendors and service providers also falls under the FTC’s Section 5 approach. When a company makes representations about its ecosystem, it implicitly vouches for the practices of its third parties. For example, if a social media platform promises that data shared with partners will be used responsibly, it cannot ignore misuse by those partners. The FTC has pursued cases where inadequate oversight of service providers led to consumer harm, framing it as either unfair or deceptive. This extends the duty of care beyond a company’s direct practices to its contractual relationships. For organizations, it means vendor management is not just a business concern but a compliance imperative. Proper due diligence, ongoing monitoring, and clear contractual controls become necessary safeguards against potential liability.
Algorithmic fairness and automated decision-making are emerging frontiers within Section 5 enforcement. The FTC has signaled that algorithms trained on biased data, or decision systems that produce discriminatory outcomes, may constitute unfair or deceptive practices. If a company claims its model is unbiased but evidence shows otherwise, that is classic deception. If the model causes unavoidable harm, such as denying housing or credit opportunities based on irrelevant characteristics, that may be unfairness. This focus reflects growing societal concern about artificial intelligence and its real-world impacts. For learners, the lesson is that privacy enforcement is not frozen in time. As technologies evolve, Section 5 is interpreted in ways that extend its principles to new contexts, ensuring the core values of fairness and honesty remain intact even in algorithm-driven systems.
Cross-device tracking and cross-context behavioral advertising are other areas where the FTC applies its Section 5 authority. These practices involve piecing together consumer activity across phones, laptops, and tablets, or inferring interests from browsing different types of websites. While powerful for advertisers, these techniques often occur without clear consumer understanding or consent. The FTC has pursued cases where companies claimed anonymity while in fact enabling detailed personal profiling. Such misrepresentations fall under deception, while the unavoidable nature of pervasive tracking can veer into unfairness. For individuals, the key idea is that transparency and choice must keep pace with technological sophistication. Companies cannot simply hide behind complex ecosystems to avoid accountability for how consumer information is linked and used.
When violations occur, the FTC has a wide palette of remedies. Injunctive relief can stop ongoing misconduct, while restitution and disgorgement address ill-gotten gains or consumer losses. In some cases, the Commission imposes extensive reporting and record-keeping duties to ensure transparency. These remedies not only address specific harms but also serve as public signals to the broader market. For organizations, the lesson is that noncompliance carries costs beyond fines—it can entail years of intrusive oversight. For learners, understanding these remedies helps illustrate why FTC orders are taken so seriously. They establish both corrective measures and deterrent signals, shaping corporate practices across entire industries.
Long-term order provisions are a distinctive feature of FTC enforcement. Companies found in violation often agree to twenty-year compliance obligations, including regular assessments, audits, and board-level reporting. These provisions institutionalize privacy and security controls, forcing organizations to embed them into governance structures. For example, an order might require annual independent assessments of security programs, with results reported to regulators. Such obligations extend far beyond initial penalties, effectively reshaping how companies operate over decades. For learners, this underscores the seriousness of Section 5 enforcement. It is not just about punishing past missteps but about engineering lasting cultural and structural changes in how organizations handle consumer data.
The FTC’s authority under Section 5 is more than theoretical; it is exercised through a range of investigative tools that allow the Commission to gather evidence and assess potential violations. Civil investigative demands, subpoenas, and compulsory process enable staff to compel documents, testimony, and data from companies under scrutiny. These powers are similar to discovery in litigation but operate during the investigatory stage, often before a formal complaint is filed. The ability to demand internal communications, system logs, or marketing drafts ensures that regulators can see both what a company told consumers and what it knew internally. For learners, the key insight is that Section 5 enforcement is not passive. The FTC has concrete mechanisms to pierce corporate opacity and test whether representations align with reality. Understanding these tools explains why even well-resourced companies invest heavily in compliance.
Once an enforcement order is in place, the FTC structures compliance to ensure that promises translate into ongoing accountability. Orders often require companies to engage independent assessors—outside professionals tasked with evaluating whether security and privacy practices meet specified standards. These assessors submit reports to the Commission, providing third-party validation that reforms are not just theoretical. Periodic certifications by company officers reinforce internal accountability, putting senior leaders on record as attesting to compliance. This structure recognizes that privacy and security are not one-time fixes but continuous obligations. For learners, it illustrates how enforcement extends beyond headlines into years of monitoring, shaping company governance and operations. Compliance here becomes a dynamic, ongoing process rather than a static checkbox.
One enduring focus of the FTC is the ecosystem of data brokers. These companies collect, aggregate, and sell consumer information often without direct relationships with the individuals involved. The opacity of these operations creates significant risks: consumers may have no idea their data is being used for profiling, targeted marketing, or even eligibility decisions. The Commission views this lack of transparency as both a policy challenge and a potential source of unfairness. By investigating data broker practices, the FTC pushes for clearer disclosures and stronger control mechanisms. For learners, data brokers exemplify how Section 5 must adapt to business models where traditional consent frameworks break down. The opacity itself becomes a harm, eroding trust in digital markets.
Biometric technologies, particularly facial recognition, have increasingly drawn FTC attention. These technologies rely on unique identifiers that cannot easily be changed if compromised, raising significant risks of long-term harm. The FTC examines whether companies obtain meaningful consent, limit retention of biometric data, and apply adequate security controls. For example, storing facial recognition templates without encryption could be seen as unreasonable. Misrepresentations about biometric use—such as claiming data is used solely for authentication while employing it for marketing—can lead to deception findings. For learners, biometrics highlight how Section 5 principles are applied to novel technologies. The sensitivity of the data elevates expectations, making careful governance essential for compliance.
The Internet of Things has created sprawling networks of connected devices, from smart thermostats to wearable fitness trackers. The FTC evaluates whether these ecosystems are designed with security defaults and whether manufacturers provide mechanisms for timely updates. Failing to secure default passwords or leaving devices unpatchable can constitute unfair practices. The Commission emphasizes that consumers cannot reasonably avoid risks when vulnerabilities are baked into devices they bring into their homes. For learners, IoT enforcement illustrates how Section 5 adapts to distributed, embedded technologies. It underscores that security must be integral to design, not an afterthought added later, because device lifecycles and consumer reliance make retroactive fixes difficult.
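The default-password problem above can be illustrated with a short, hypothetical sketch of a safer device design: each unit ships with a unique generated credential, and risky features stay disabled until it is changed. The `Device` class and its methods are invented for this example, not a real device API.

```python
import secrets

# Hypothetical sketch: avoiding a fleet-wide shared default password.
# Each device generates its own factory credential (e.g., printed on the
# label), and remote administration stays off until it is replaced.
class Device:
    def __init__(self):
        self.password = secrets.token_urlsafe(12)  # unique per device
        self.password_changed = False

    def login(self, password: str) -> bool:
        return secrets.compare_digest(password, self.password)

    def set_password(self, new_password: str) -> None:
        self.password = new_password
        self.password_changed = True

    def remote_admin_enabled(self) -> bool:
        # Gate risky functionality on credential rotation.
        return self.password_changed

d1, d2 = Device(), Device()
assert d1.password != d2.password     # no shared default across the fleet
assert not d1.remote_admin_enabled()  # locked down until rotated
```

A shared hardcoded default is the pattern the FTC treats as unfair: consumers cannot reasonably avoid a risk baked into every unit they bring home, while per-device credentials remove that systemic exposure at negligible cost.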
Claims about artificial intelligence models have become another area of scrutiny. Companies may market AI systems as accurate, unbiased, or capable of certain predictive feats without sufficient substantiation. If those claims are overstated, they risk being deceptive. Beyond claims, the governance of models—how they are trained, tested, and monitored—also comes under review. The FTC stresses the importance of risk mitigation, including addressing bias, ensuring explainability, and safeguarding training data. For learners, this shows that Section 5 is flexible enough to cover emerging AI practices. It reminds us that innovation does not exempt companies from accountability; rather, the more transformative a technology, the greater the duty to substantiate and responsibly manage it.
Health-related data outside of HIPAA’s coverage is another enforcement priority. Fitness apps, fertility trackers, and wellness platforms often collect sensitive information but operate outside traditional healthcare regulation. The FTC has applied Section 5 to ensure that such companies provide notice, obtain meaningful consent, and secure data appropriately. Misleading consumers about the confidentiality of health data, or sharing it with advertisers without disclosure, can be deceptive. Inadequate safeguards for such sensitive data can be unfair. For learners, this demonstrates how Section 5 fills regulatory gaps. Even without sector-specific laws, companies handling sensitive information must meet baseline standards of transparency and reasonableness.
Location data has proven especially sensitive, given its ability to reveal intimate aspects of people’s lives, from health clinic visits to political rallies. The FTC has scrutinized practices where companies collect and sell precise geolocation information without clear consumer knowledge or control. Representing such data as anonymous while allowing re-identification can constitute deception. Selling sensitive location information without meaningful opt-out may be unfair. Learners should recognize that location data carries heightened risks because it can reveal not only who we are but what we do and where we go. Section 5 provides a framework to rein in these practices by demanding clarity, choice, and limits on sensitive uses.
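Why "anonymous" location traces rarely stay anonymous can be shown with a toy example. This is a hypothetical sketch with made-up coordinates: even when a trace carries only a pseudonym, the most frequent overnight point typically reveals a home location, which can then be joined against other records to re-identify the person.

```python
from collections import Counter

# Hypothetical sketch: re-identification risk in a pseudonymized trace.
# Each entry is (hour_of_day, rounded lat/lon) for pseudonymous device
# "abc123"; the data is invented for illustration.
trace = [
    (2, (38.90, -77.04)), (3, (38.90, -77.04)), (4, (38.90, -77.04)),
    (14, (38.89, -77.01)), (15, (38.89, -77.01)),
]

# The modal overnight location is a strong proxy for a home address.
night_points = [loc for hour, loc in trace if hour < 6]
inferred_home = Counter(night_points).most_common(1)[0][0]
assert inferred_home == (38.90, -77.04)  # stable despite the pseudonym
```

This is the gap the FTC targets when it treats "anonymous" claims as deceptive: removing a name does not remove the behavioral fingerprint embedded in precise, repeated location points.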
The adtech supply chain—spanning publishers, platforms, and intermediaries—presents another challenge. Companies may disclaim responsibility by pointing to the complexity of these systems, but the FTC insists that accountability cannot be diffused away. If a platform claims to protect consumer privacy but allows opaque tracking by third parties, it risks liability. Enforcement in this space reinforces the principle that each actor must take responsibility for its role, even in distributed ecosystems. For learners, adtech demonstrates how Section 5 scales up to address systemic issues. The standard of fairness and honesty applies not just at the consumer-facing edge but throughout the supply chain.
The FTC also coordinates with other enforcement bodies to maximize impact. The Department of Justice often joins when remedies require court enforcement, while state attorneys general pursue overlapping violations under state law. Coordination ensures consistency and prevents companies from playing regulators off against one another. Learners should understand that Section 5 enforcement does not operate in isolation. Its effectiveness relies on partnerships that broaden reach and reinforce accountability. The presence of multiple enforcers amplifies the deterrent effect, making it clear that deceptive or unfair practices face scrutiny on multiple fronts.
Another important relationship exists with the Federal Communications Commission. The FCC regulates certain telecom and marketing practices, including robocalls, spam, and data handling by carriers. The FTC and FCC coordinate to avoid gaps or conflicts, ensuring that companies cannot evade oversight by exploiting jurisdictional boundaries. For learners, this illustrates how privacy and consumer protection span multiple domains. Section 5 is powerful, but it exists alongside other statutory regimes. Understanding these overlaps clarifies why compliance strategies must account for more than one regulator.
Consumer and market education initiatives form part of the FTC’s deterrence strategy. By publishing guidance, reports, and business education materials, the Commission sets expectations before enforcement occurs. These resources help organizations understand how Section 5 principles apply in practice, reducing the risk of inadvertent violations. For learners, education initiatives serve as a reminder that regulators are not solely punitive. They aim to shape market behavior by clarifying norms, providing best practices, and highlighting pitfalls. Awareness of these resources is as important as knowing the enforcement cases themselves.
International engagement has become crucial in a world of cross-border data flows. The FTC participates in global networks such as the Global Privacy Enforcement Network to coordinate investigations and share best practices. Cooperation with foreign regulators allows cross-border cases, such as those involving multinational platforms, to be addressed more effectively. For learners, this highlights the global reach of Section 5 principles. Even though it is a U.S. law, its enforcement often resonates internationally, shaping corporate behavior far beyond national borders. Understanding these dynamics is vital for companies operating globally.
Finally, translating Section 5 expectations into concrete compliance programs is an ongoing challenge. Companies must build controls, testing procedures, and documentation that align with FTC standards of fairness and honesty. This means integrating privacy by design, conducting regular audits, and ensuring clear governance structures. The lesson for learners is that Section 5 is not only about avoiding enforcement; it is about designing systems that respect consumer trust as a first principle. In practice, this means embedding accountability into every stage of the data lifecycle. A programmatic approach is not just good risk management—it is what regulators expect.
The conclusion of this discussion brings us back to the central synthesis: Section 5 of the FTC Act provides the twin anchors of deception and unfairness, applied flexibly to evolving technologies and business models. Through robust investigative tools, durable remedies, and long-term compliance orders, the FTC shapes how companies handle data in practice. Its authority is reinforced by coordination with other regulators, global engagement, and ongoing education initiatives. For learners, the enduring lesson is that privacy compliance in the United States is inseparable from Section 5. It provides both the legal framework and the cultural expectations that companies must meet. By internalizing these principles, organizations not only avoid enforcement but also build lasting consumer trust.
