Episode 83 — Health Data Rules: WA MHMD, NV Health Data Act, and IL GIPA

Washington’s My Health My Data Act begins with a strikingly broad definition of consumer health data. Instead of limiting protections to traditional medical records, the law extends its reach to any information that identifies, or is reasonably linkable to, a person’s physical or mental health status, their attempts to obtain health services, or inferences drawn from their behavior that suggest health status. This means that not only blood test results but also data like step counts, fertility tracking, or online searches for medical providers may fall under the statute’s protections. For learners, it is helpful to picture a wide umbrella: beneath it sit conventional health records as well as less obvious forms of data that reveal something about an individual’s body, condition, or care choices. The breadth of this definition signals a deliberate effort to address the realities of a digital age where health data is generated continuously and often outside traditional healthcare settings.
The law identifies two key groups: covered entities and regulated processors. A covered entity is any business or nonprofit that determines the purpose and means of collecting, processing, or sharing consumer health data within Washington. A regulated processor, by contrast, handles consumer health data on behalf of a covered entity, following its instructions. This distinction is crucial because it clarifies responsibilities. Covered entities must implement the full range of obligations, from publishing privacy notices to honoring consumer rights, while processors must comply with contractual requirements that bind them to safeguard data and restrict its use. Imagine a meditation app company as the covered entity and its cloud hosting provider as the processor. The app company decides what data is collected and why, while the hosting provider must secure the servers and process data only according to the app company’s instructions.
Privacy notices are a cornerstone of MHMD. Businesses must clearly describe what categories of health data they collect, the sources of that data, the purposes for using it, and the categories of recipients with whom it is shared. These notices must be specific and accessible, not buried in dense legal jargon. For instance, a wellness platform should tell users that it collects heart rate data from wearable devices, uses it to track fitness progress, and shares it with a contracted analytics vendor to improve performance metrics. Transparency ensures that consumers are not left guessing about how their sensitive information is being used. From a teaching perspective, think of privacy notices as nutrition labels for data practices: they break down the ingredients so individuals can make informed decisions.
Consent under Washington’s framework is treated as a separate and heightened obligation. Businesses must obtain explicit, affirmative consent before collecting consumer health data and again before sharing it. This consent cannot be bundled into general terms of service or assumed through silence. Instead, it must be clear and specific. Consider a fertility tracking app: it must request consent to collect reproductive health information at the point of entry and then seek separate authorization if it intends to share that data with third-party advertisers. This two-layer approach reinforces consumer control and prevents organizations from relying on vague, blanket agreements. In plain terms, Washington requires that companies ask permission at each stage where consumer choice is meaningful, ensuring that consent is both informed and deliberate.
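To make the two-layer idea concrete, here is a minimal sketch of a consent gate in which collection consent and sharing consent are tracked separately, so that permission to collect never implies permission to share. All class and method names here are illustrative assumptions, not anything drawn from the statute itself.

```python
from dataclasses import dataclass

# Hypothetical sketch of MHMD-style two-layer consent: collection and
# sharing each require their own explicit, affirmative opt-in.

@dataclass
class ConsentState:
    collect_ok: bool = False   # consent to collect health data
    share_ok: bool = False     # separate consent to share with third parties

class HealthDataStore:
    def __init__(self):
        self.consents = {}   # user_id -> ConsentState
        self.records = {}    # user_id -> list of collected data points

    def grant(self, user_id, *, collect=False, share=False):
        state = self.consents.setdefault(user_id, ConsentState())
        state.collect_ok = state.collect_ok or collect
        state.share_ok = state.share_ok or share

    def collect(self, user_id, datum):
        state = self.consents.get(user_id)
        if not (state and state.collect_ok):
            raise PermissionError("no collection consent on file")
        self.records.setdefault(user_id, []).append(datum)

    def share(self, user_id, recipient):
        state = self.consents.get(user_id)
        # Collection consent alone is NOT enough to share.
        if not (state and state.share_ok):
            raise PermissionError(f"no sharing consent for {recipient}")
        return list(self.records.get(user_id, []))
```

In this sketch, a fertility-tracking app that has obtained only collection consent can record data but will hit a `PermissionError` the moment it tries to pass that data to an advertiser, mirroring the statute’s insistence on a separate, deliberate authorization at each stage.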
The act takes a firm stance on the sale of consumer health data. Sale is defined broadly, including not just direct monetary transactions but also exchanges for other valuable consideration. Companies must obtain valid authorization from consumers before selling their data, and they must disclose the fact that a sale will occur. Imagine a weight-loss app that packages and sells aggregated user health profiles to marketing firms. Under MHMD, the app must inform users explicitly and secure their permission before completing the transaction. By treating sales as a sensitive activity requiring transparency and choice, Washington aims to curb the commodification of health data and give individuals more control over whether their personal information becomes part of secondary markets.
Consumer rights are a hallmark of the statute. Individuals can request access to their consumer health data, obtain a list of third parties with whom it has been shared, and demand deletion of their data from both covered entities and processors. They also have the right to withdraw consent, obligating businesses to stop collecting or sharing their health data moving forward. Picture a person who used a mental health app but later decides they want their information erased. The app company must delete the records and direct its hosting provider and analytics partners to do the same. These rights empower individuals to manage their digital health footprint, shifting the balance of control from organizations to consumers.
Geofencing is another area where Washington has broken new ground. The law prohibits the use of geofencing technology around healthcare facilities to track or target individuals. Geofencing involves creating virtual boundaries around a physical location, enabling businesses to identify when someone enters or leaves the area. Without this restriction, advertisers could, for example, push pregnancy-related ads to people visiting reproductive health clinics. Washington recognized the risk of coercion and stigma inherent in such practices and banned them outright. This measure demonstrates how privacy laws can evolve to address specific technological risks that threaten autonomy and dignity in sensitive contexts.
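Technically, a geofence is nothing more exotic than a point-in-radius test around a facility’s coordinates. The sketch below shows the underlying mechanism Washington has banned in healthcare contexts; the coordinates and radius are illustrative, and the haversine formula is a standard great-circle distance calculation, not anything specific to the statute.

```python
import math

# Illustrative sketch of what a geofence technically is: a virtual
# boundary implemented as a distance check against a center point.
# MHMD prohibits deploying this around healthcare facilities to
# identify, track, or target consumers.

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def inside_geofence(point, center, radius_m):
    """True if a device's reported location falls inside the boundary."""
    return haversine_m(*point, *center) <= radius_m
```

Seeing how trivially a few lines of arithmetic can flag every device that enters a clinic’s perimeter helps explain why the legislature chose an outright ban rather than a consent requirement for this practice.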
Security requirements under MHMD are proportionate, meaning that safeguards must align with the volume and sensitivity of data held. Rather than prescribing a rigid checklist, the law expects organizations to implement reasonable measures appropriate to their risk environment. A large telehealth provider might be expected to maintain end-to-end encryption, multi-factor authentication, and continuous monitoring, while a smaller fitness app might focus on encrypted storage and strict access controls. The principle here is adaptability: what counts as reasonable for one organization may not suffice for another. Learners should understand this as a sliding scale—bigger risks require stronger defenses, but all entities must demonstrate diligence.
Processors under Washington’s law are not passive actors; they must actively protect consumer health data. They are required to follow the instructions of the covered entity, implement security safeguards, and ensure confidentiality. This means they cannot use the data for their own purposes or share it without authorization. Consider a cloud storage provider that holds sensitive patient communications for a telehealth platform. The provider must process the data only to maintain storage services, secure it against breaches, and delete it upon instruction. By binding processors with direct obligations, Washington ensures accountability throughout the data ecosystem, not just at the level of consumer-facing companies.
Contracts play a pivotal role in operationalizing MHMD. Agreements between covered entities and processors must clearly define roles, mandate confidentiality, set requirements for security measures, and obligate deletion or return of consumer health data when services end. This contractual clarity prevents ambiguity about responsibilities and creates a paper trail for compliance. For instance, a health monitoring device company contracting with a data analytics firm must ensure that their contract specifies how the analytics provider may use the data, how long it may keep it, and how it must dispose of it afterward. Contracts become both the backbone of accountability and the enforcement tool for ensuring that promises made to consumers are carried through the supply chain.
Documentation is another central requirement. Businesses must keep records of consumer consents, rights requests, and the mechanisms they use for transferring data. This documentation serves as evidence for regulators and as a self-audit tool for organizations. Imagine an investigation into whether a reproductive health app properly obtained user consent before sharing location data. The company should be able to produce timestamped records of consent forms, logs of data transfers, and correspondence showing how deletion requests were honored. In this sense, documentation is not just administrative—it is the proof that an organization’s practices align with its obligations.
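One way to picture the documentation duty is as an append-only, timestamped log of consents and rights requests. The sketch below chains each entry to the previous one with a hash so that after-the-fact tampering is detectable; the event names and field layout are assumptions for illustration, not a prescribed record format.

```python
import json
import hashlib
from datetime import datetime, timezone

# Hedged sketch of a tamper-evident compliance log of the kind the
# documentation requirement implies: timestamped records of consents,
# rights requests, and data transfers. Field names are illustrative.

class ComplianceLog:
    def __init__(self):
        self.entries = []

    def record(self, user_id, event, details):
        prev = self.entries[-1]["digest"] if self.entries else ""
        entry = {
            "ts": datetime.now(timezone.utc).isoformat(),
            "user_id": user_id,
            "event": event,       # e.g. "consent_granted", "deletion_request"
            "details": details,
        }
        # Chain each entry to its predecessor so edits are detectable.
        entry["digest"] = hashlib.sha256(
            (prev + json.dumps(entry, sort_keys=True)).encode()
        ).hexdigest()
        self.entries.append(entry)
        return entry
```

In an investigation like the one described above, such a log lets the company produce the exact sequence of consent grants, transfers, and deletion confirmations, with each record anchored in time and order.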
Mixed-purpose apps and non-traditional contexts present unique applicability challenges. Many businesses that do not see themselves as healthcare providers may nonetheless fall under MHMD because they process health-related data. A fitness app that tracks calories, a meditation app that monitors stress levels, or even an e-commerce site selling pregnancy tests could all be covered. The key is whether the data collected relates to an individual’s health status or healthcare-seeking behavior. Learners should recognize that this broad scope means organizations must reassess their assumptions about whether they are subject to health privacy laws. The statute serves as a reminder that health-related data exists far beyond hospitals and clinics in today’s economy.
Enforcement of MHMD relies heavily on unfair or deceptive practice theories. Misrepresentations in privacy notices, failure to obtain valid consent, or inadequate handling of consumer rights can be treated as deceptive practices under Washington’s consumer protection law. Remedies may include civil penalties, injunctive relief, and consent decrees requiring changes to business practices. This mirrors the Federal Trade Commission’s approach at the federal level, making state enforcement an extension of broader consumer protection frameworks. Companies must therefore ensure that their practices match their promises and that their compliance programs are demonstrable. Regulators do not just look for harm; they also examine whether companies are transparent, accurate, and responsive.
To align programs with MHMD, organizations must begin with a comprehensive data inventory, mapping where consumer health data resides and flows. From there, they should publish compliant privacy notices, implement robust consent flows, and build systems for deletion and consent withdrawal. Verification processes should confirm that rights requests are honored across all vendors and systems. For example, an app that collects menstrual cycle data should maintain an inventory of where that data is stored, publish a notice describing its use, prompt users for separate consents, and ensure that deletion requests are verified and cascaded to third parties. This holistic alignment ensures that compliance is not piecemeal but integrated into the entire lifecycle of health data management.
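The "verified and cascaded" deletion step can be sketched as a simple fan-out: the covered entity deletes its own copy, forwards the request to each processor, and collects confirmations for the compliance record. The `Processor` interface and vendor names below are illustrative assumptions.

```python
# Sketch of a deletion request cascaded from a covered entity to its
# processors, with confirmations collected as compliance evidence.
# The Processor class and vendor names are illustrative assumptions.

class Processor:
    def __init__(self, name):
        self.name = name
        self.store = {}  # user_id -> held data

    def delete(self, user_id):
        self.store.pop(user_id, None)
        return user_id not in self.store  # confirmation for the audit trail

def cascade_deletion(user_id, primary_store, processors):
    """Delete locally, then fan out to every processor and verify."""
    primary_store.pop(user_id, None)
    confirmations = {"primary": user_id not in primary_store}
    for p in processors:
        confirmations[p.name] = p.delete(user_id)
    return confirmations  # retain as evidence the request was honored
```

The returned confirmation map is exactly the kind of artifact that would feed the documentation and evidence requirements discussed elsewhere in this episode.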
For more cyber-related content and books, please check out cyber author dot me. Also, there are other prepcasts on cybersecurity and more at Bare Metal Cyber dot com.
Nevada’s consumer health data law takes many of the principles pioneered by Washington and applies them with its own distinctive focus. At its core, the law defines consumer health data broadly, ensuring it captures not just medical test results or formal treatment records but also data streams that reveal or infer health conditions. This includes everything from app-based wellness information to purchase histories and geolocation data tied to healthcare visits. Covered entities—those that determine why and how health data is collected—must publish a privacy notice that is specific and comprehensive. The notice must describe categories of health data collected, the purposes for which it is processed, the sources from which it is obtained, and the categories of recipients. Just as important, the law requires affirmative consent before collection and separate consent before sharing with third parties. For a Nevada resident using a nutrition-tracking app, this means the app must ask permission twice: once to gather diet and weight records and again if it intends to share that data with advertisers.
Processors under Nevada’s framework are not free to improvise in how they use health data. They are bound by contractual controls that tether their operations directly to the instructions of the covered entity. Contracts must clearly spell out what the processor may and may not do, require confidentiality, and impose obligations to implement strong safeguards. They also require the processor to delete or return data at the end of the relationship. Consider a health startup that outsources analytics to a vendor specializing in predictive modeling. That vendor may not retain or repurpose Nevada consumer health data for its own algorithms unless the contract and consumer consent specifically allow it. These contractual provisions serve as guardrails, ensuring processors remain service partners rather than independent actors in the data economy. They also strengthen the accountability chain, ensuring downstream partners meet the same high bar as the original collector.
Nevada grants residents robust rights that force businesses to build responsive systems. Individuals can request access to their health data and receive a copy in a portable, readable format. They can demand deletion not only from the primary business but also from any processors holding their information. They can also change their preferences by withdrawing consent, obligating companies to halt certain uses or sharing of their data. Imagine a Nevada resident who initially allows a fertility-tracking app to share her cycle data with advertisers but later decides she no longer feels comfortable. The law requires the app to stop sharing immediately and to confirm deletion requests across its vendors. This design places meaningful control in consumers’ hands and demands that businesses create operational pathways to act quickly and accurately on such requests.
Nevada also sets clear expectations for security, with an emphasis on proportionality. The law requires organizations to establish safeguards that align with both the sensitivity of the health data and the volume being processed. This means a small wellness app may satisfy the law with encryption, strong authentication, and routine security testing, while a large digital health platform with millions of users would be expected to implement layered defenses such as network segmentation, continuous monitoring, and formal incident response drills. The proportionality approach reflects a pragmatic balance: not every company has the same resources, but every company must show it has thoughtfully matched its safeguards to its risks. For learners, the key idea is that security is not optional or symbolic—it must be demonstrable, documented, and continually assessed against evolving threats.
Illinois’s Genetic Information Privacy Act, or GIPA, focuses narrowly on genetic information but applies with remarkable strength. Genetic information is defined broadly to cover the results of genetic tests, identifiable genetic characteristics, and even family medical history that can be used to predict health conditions. Unlike general health data, genetic information carries unique risks: it can reveal predispositions to diseases, connect individuals to relatives, and persist across generations. Illinois recognizes that these risks warrant heightened protection, especially as consumer DNA testing has surged in popularity. A company offering ancestry analysis or direct-to-consumer DNA kits must treat genetic results under the stringent requirements of GIPA, even if it does not directly provide healthcare services. This specialized scope highlights how different types of health data can call for different legal approaches, reflecting both their sensitivity and their potential misuse.
Central to GIPA is the requirement for written authorization before disclosure. Businesses cannot share genetic information without obtaining a specific, signed consent from the individual. This authorization must be clear about what data will be disclosed, to whom, and for what purpose. Importantly, redisclosure is prohibited without a fresh authorization, ensuring that genetic information does not leak into secondary markets. For example, a genetic testing company cannot automatically pass results to pharmaceutical partners or insurers without renewed consent from the consumer. This requirement sets a higher bar than most data privacy frameworks, where a single opt-in might cover multiple uses. By requiring repeated, deliberate authorizations, Illinois ensures that individuals maintain active control over their genetic data at each stage of its use or transfer.
GIPA erects strong barriers in employment and insurance contexts. Employers are prohibited from using genetic information when making decisions about hiring, promotion, or job assignments, recognizing the potential for discrimination. Insurers face similar restrictions, preventing them from using genetic data to set premiums, determine eligibility, or deny coverage. Imagine an applicant who learns through testing that they carry a gene linked to increased cancer risk. Without GIPA, an employer might see that information as a reason to avoid hiring, or an insurer might raise premiums unfairly. Illinois blocks such outcomes by categorically excluding genetic information from consideration in these contexts. These protections not only safeguard individuals’ livelihoods and access to care but also encourage participation in genetic testing by reducing fears of negative consequences.
The law also imposes clear duties for retention and destruction. Genetic information must not be kept indefinitely but instead should be retained only as long as necessary for the purpose for which it was collected. Once that purpose has been fulfilled, businesses must securely destroy or de-identify the information. A direct-to-consumer DNA testing service, for example, may be required to delete raw data and samples once analysis is complete unless the consumer has explicitly consented to continued storage. Secure destruction protects individuals from future risks of unauthorized access or misuse. Illinois thus signals that respecting genetic privacy involves not just consent at the front end but also disciplined lifecycle management throughout the duration of data stewardship.
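The retention-and-destruction duty amounts to a lifecycle rule: keep the record while the purpose is live, then destroy or de-identify unless the consumer has opted in to continued storage. The sketch below encodes that decision; the field names and the idea of a fixed retention horizon are illustrative assumptions, since GIPA ties retention to purpose rather than a set number of days.

```python
from datetime import date, timedelta

# Hedged sketch of GIPA-style lifecycle management for genetic records.
# "retention_days" is an assumed internal policy horizon, not a figure
# from the statute; GIPA keys retention to the collection purpose.

def disposition(record, today):
    """Return what should happen to a genetic record as of `today`."""
    if record.get("continued_storage_consent"):
        return "retain"  # consumer explicitly opted in to ongoing storage
    horizon = record["collected_on"] + timedelta(days=record["retention_days"])
    if record.get("purpose_fulfilled") or today >= horizon:
        return "destroy_or_deidentify"
    return "retain"
```

Running such a check on a schedule is one way a direct-to-consumer testing service could demonstrate the disciplined lifecycle management the statute expects, rather than relying on ad hoc cleanups.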
Private litigation has become a defining feature of GIPA enforcement. Unlike many other privacy laws that rely primarily on regulators, GIPA grants individuals the right to sue for violations. This private right of action has led to a wave of class actions, often focused on procedural lapses such as failing to obtain written consent or retaining genetic data longer than permitted. Courts have shown willingness to treat even technical violations as actionable, reflecting the seriousness with which Illinois views genetic privacy. For organizations, this means compliance failures are not just regulatory risks but also potential sources of costly lawsuits. The lesson here is clear: companies cannot afford to treat GIPA as a low-priority statute, because its enforcement path empowers individuals directly.
Apps and wearables processing health or genetic signals across states must adapt to a patchwork of requirements. A fitness tracker collecting heart rate data may fall under Washington or Nevada rules, while a DNA ancestry service operating in Illinois is firmly within GIPA’s scope. Alignment guidance stresses that businesses should not silo their compliance efforts but should adopt common standards that meet the strictest requirements. For example, building consent workflows that satisfy Illinois’s written authorization rules and Washington’s explicit opt-in requirements can create a harmonized approach. While this may require additional upfront investment, it reduces complexity in the long run and builds trust by holding all consumer health and genetic data to the highest available standard.
Vendor governance is a shared theme across all three states. Companies must ensure that their processors and service providers comply with the same obligations they face. This requires rigorous contracts that specify permitted uses, security safeguards, and deletion requirements. It also requires oversight, including audits or attestations that vendors are following through. For instance, a wearable device company contracting with a cloud provider must verify not only that the provider encrypts stored health data but also that it deletes records upon termination of service. Without such governance, businesses risk liability for their vendors’ missteps. Vendor governance thus becomes both a compliance tool and a demonstration of accountability to consumers and regulators alike.
Certain categories of data demand heightened care across these statutes, particularly children’s health data, reproductive health information, and location-linked records. Washington explicitly bans geofencing around healthcare locations, while Nevada requires consent before location-linked health signals can be collected or shared. Illinois’s GIPA applies strict redisclosure limits that also protect children’s genetic data, often requiring parental authorization. These categories of data are treated as especially sensitive because they intersect with societal concerns about discrimination, stigma, and autonomy. For example, reproductive health data may expose individuals to political or cultural targeting, while children’s genetic information may carry lifelong implications if mishandled. Laws in all three states converge on the principle that these categories require exceptional transparency, security, and consent.
Cross-state harmonization is both a compliance strategy and a cultural commitment. Businesses often choose to adopt the most restrictive requirements as their baseline, applying them across all jurisdictions. This might mean honoring Washington’s opt-in consent model everywhere, implementing Illinois’s written authorization process for all genetic information, and applying Nevada’s comprehensive notice requirements universally. While such harmonization can be resource-intensive, it creates operational clarity and reduces the risk of missteps in states with overlapping but slightly different rules. More importantly, it positions the business as a privacy leader, willing to go beyond minimum compliance to provide consumers with strong, consistent protections across the board.
Evidence packages are essential for demonstrating compliance under these regimes. Regulators and courts expect organizations to produce documentation that shows how they implemented consent mechanisms, honored consumer rights, and secured sensitive data. This might include logs of consent transactions, records of rights requests and responses, contracts with vendors, and certificates of destruction for deleted records. For example, a reproductive health app operating in both Washington and Nevada should be able to show regulators a clear file containing its notice text, consent forms, deletion workflows, and vendor contracts. These evidence packages serve as the backbone of defensibility, proving that the organization’s practices are not just promises but documented, verifiable actions aligned with statutory requirements.
