Non-Lawyers Summary

The U.S. has no single federal privacy law. Instead, roughly 20 states have enacted their own comprehensive privacy statutes, each with different thresholds, consumer rights, enforcement models, and breach notification clocks. California leads with the most aggressive regime — a standalone enforcement agency (the CPPA), a private right of action for breach victims worth $100–$750 per consumer, and some of the broadest definitions of "sensitive data" in the country. Illinois has a biometric-specific law (BIPA) that generates more class action litigation per capita than any other privacy statute in the U.S., with per-scan damages reaching $5,000. For security researchers, this patchwork matters in three specific ways: (1) if you hold breach data during responsible disclosure, deletion demands from individuals are legally enforceable in California, (2) if your research scrapes or aggregates personal information, you may be holding data subject to multiple state laws simultaneously, and (3) every state has its own definition of what personal information triggers breach notification, meaning one incident may generate 12 different notification clocks running at different speeds.


Overview

January 1, 2020. California woke up with something no other U.S. state had: a law that gave consumers the right to look a corporation in the eye and demand to know what it had collected, and why, and to order it deleted. The California Consumer Privacy Act (CCPA) was the first U.S. state law granting consumers affirmative rights over their personal data — not just remedies after a breach. Privacy advocates called it a watershed. Corporate legal departments called their outside counsel.

But that wasn't the end. It was the beginning.

Three years later, the California Privacy Rights Act (CPRA) — Proposition 24 — took full effect, creating a dedicated enforcement agency with rulemaking power, eliminating the cure period for regulatory violations, and adding new rights that companies had not anticipated. Alongside California, Virginia, Colorado, Connecticut, Texas, Oregon, Montana, Iowa, Utah, and Nevada have enacted privacy laws of varying scope.

This module maps the landscape, identifies the provisions that most directly affect security researchers, and provides practical frameworks for data handling during research operations.


CCPA/CPRA: The California Framework

Statutory Foundation

The California Consumer Privacy Act, Cal. Civ. Code §§ 1798.100–1798.199, as amended by the California Privacy Rights Act, Proposition 24 (2020), is the most comprehensive state privacy statute in the U.S. and the closest American analog to GDPR.

Core consumer rights under Cal. Civ. Code § 1798.100 et seq.:

Right to Know (§ 1798.100, § 1798.110): Consumers may request disclosure of what personal information a business collects, uses, discloses, or sells. Businesses must respond within 45 days, extendable by another 45 days with notice. "Personal information" under § 1798.140(v) is intentionally broad: any information that identifies, relates to, describes, is reasonably capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household.

Right to Delete (§ 1798.105): Consumers may request deletion of personal information collected from them. Businesses must honor the request and direct their service providers to delete as well. Nine statutory exceptions exist — including data needed to complete a transaction, detect security incidents, comply with legal obligations, or engage in research in the public interest.

Right to Correct (§ 1798.106): Added by CPRA. Consumers may request correction of inaccurate personal information. The business must use commercially reasonable efforts to correct.

Right to Opt-Out of Sale or Sharing (§ 1798.120): Consumers may direct a business to stop selling or sharing personal information. "Sharing" was added by CPRA to capture behavioral advertising that did not technically involve a "sale."

Right to Limit Use of Sensitive Personal Information (§ 1798.121): Added by CPRA. Consumers may instruct businesses to limit use of sensitive personal information — a defined category under § 1798.140(ae) — to only what is necessary to perform the requested services. Sensitive personal information includes:

  • Social Security, driver's license, state ID, and passport numbers
  • Account log-in credentials (username + password)
  • Financial account information, debit/credit card numbers with security codes
  • Precise geolocation (within 1,850 feet)
  • Racial or ethnic origin
  • Religious or philosophical beliefs
  • Union membership
  • Contents of mail, email, or text messages (unless the business is the intended recipient)
  • Genetic data
  • Biometric information processed for unique identification
  • Health information
  • Sex life or sexual orientation

Right of Non-Discrimination (§ 1798.125): Businesses may not discriminate against consumers who exercise CCPA rights — including denying goods/services, charging different prices, or providing a different quality of service.

Business Thresholds

Not every business is subject to CCPA/CPRA. Under § 1798.140(d), a "business" subject to the law is a for-profit entity that:

  • Has annual gross revenues over $25 million (as adjusted);
  • Annually buys, sells, or shares the personal information of 100,000 or more consumers or households; or
  • Derives 50% or more of annual revenues from selling or sharing consumers' personal information.
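The three prongs are disjunctive: satisfying any one of them makes a for-profit entity a covered "business." A minimal sketch of the test (the function name and inputs are hypothetical, and the $25 million figure is inflation-adjusted in practice):

```python
def is_ccpa_business(annual_revenue: float,
                     consumers_processed: int,
                     share_of_revenue_from_selling: float,
                     for_profit: bool = True) -> bool:
    """Rough sketch of the CCPA/CPRA 'business' thresholds (disjunctive)."""
    if not for_profit:
        return False  # nonprofits are generally outside CCPA/CPRA
    return (annual_revenue > 25_000_000
            or consumers_processed >= 100_000
            or share_of_revenue_from_selling >= 0.50)

# A small threat-intel firm: $4M revenue, but 150K consumers' data aggregated
print(is_ccpa_business(4_000_000, 150_000, 0.0))  # True: the 100K prong alone triggers coverage
```

Because the prongs are independent, a low-revenue research operation can still be covered purely on processing volume, which is the scenario most relevant to security researchers.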

Nonprofit organizations, government agencies, and small businesses that do not meet these thresholds are generally not covered — though they may still face sector-specific laws (HIPAA, GLBA, FERPA) or the California breach notification statute, which applies to any person or entity that owns or licenses computerized data of California residents.

§ 1798.150: The Private Right of Action — Where the Litigation Lives

Cal. Civ. Code § 1798.150 is the provision that generates the most litigation. It provides a private civil cause of action specifically and narrowly for breaches of security:

Triggering condition: A consumer's nonencrypted and nonredacted personal information is subject to unauthorized access and exfiltration, theft, or disclosure as a result of the business's violation of the duty to implement and maintain reasonable security procedures and practices appropriate to the nature of the personal information.

Damages:

  • Statutory damages of $100 to $750 per consumer per incident (at the court's discretion)
  • Actual damages if greater than statutory damages
  • Injunctive or declaratory relief
  • Any other relief the court deems proper

What § 1798.150 does NOT cover: General CCPA/CPRA violations (failure to honor deletion requests, failure to disclose categories, opt-out violations) are enforced by the CPPA — not via private suit. Section 1798.150 is exclusively a breach remedy.

Pre-suit requirements: Before filing a § 1798.150 action, a consumer must provide the business with 30 days' written notice specifying the alleged violations. If the business cures within 30 days and provides written certification that violations will not recur, the action is barred. Courts have debated whether a "security breach" can be "cured" after the fact — the better view is that remediation of ongoing security failures can satisfy the cure requirement, but the breach itself cannot be undone.

Class action math: $750 statutory maximum × 100 million consumers = $75 billion theoretical exposure for a major breach. This figure drives settlement pressure in virtually every significant California data breach class action.
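That exposure calculation is simple multiplication over the statutory band; a toy sketch (illustrative arithmetic only; per-consumer amounts within the band are set at the court's discretion):

```python
def ccpa_breach_exposure(consumers: int) -> tuple:
    """Statutory damages band under Cal. Civ. Code § 1798.150 ($100-$750 per consumer)."""
    return consumers * 100, consumers * 750

low, high = ccpa_breach_exposure(100_000_000)
print(f"${low:,} to ${high:,}")  # $10,000,000,000 to $75,000,000,000
```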


CPPA: The Enforcement Architecture

The California Privacy Protection Agency

CPRA created the California Privacy Protection Agency (CPPA) as an independent state agency with full administrative power to implement and enforce the CCPA/CPRA. Cal. Civ. Code § 1798.199.10 et seq. This was a structural break from the pre-CPRA regime, where the California Attorney General held exclusive enforcement authority.

CPPA powers:

  • Rulemaking authority (Cal. Civ. Code § 1798.185)
  • Civil enforcement through administrative proceedings
  • Referral to the AG for civil penalty litigation
  • Investigation of violations
  • Audits of businesses

Civil penalties (§ 1798.155):

  • Up to $2,500 per unintentional violation
  • Up to $7,500 per intentional violation or any violation involving consumers under 16
  • Each consumer whose rights are violated counts as a separate violation for penalty purposes

CPPA vs. AG: The CPPA is the primary regulator. The AG retains concurrent authority to enforce the CCPA but has indicated it will defer to CPPA on consumer privacy matters. For breach-specific claims under § 1798.150, the private right of action is the primary enforcement mechanism — neither the CPPA nor the AG needs to bring suit for consumers to recover.

Rulemaking History

The CPPA has been active since its creation, issuing regulations in several tranches:

Initial CPRA Regulations (2023): Effective March 29, 2023, the first CPPA regulations addressed: required contents of privacy notices, consumer request handling procedures (including response timelines and verification standards), opt-out mechanisms (including requirement to honor opt-out preference signals), service provider and third party contracts, and requirements for businesses that sell or share personal information.

Cybersecurity Audit Regulations (proposed, 2023–2024): The CPPA proposed regulations requiring certain businesses to conduct annual cybersecurity audits and submit results to the CPPA. These regulations remain in rulemaking as of 2026 and would represent a significant expansion of the CPPA's reach into active security compliance.

Automated Decision-Making Technology (ADMT) Regulations (proposed, 2023–2024): Proposed rules that would require businesses to conduct risk assessments before using automated decision-making technology, and give consumers rights to opt-out of certain ADMT uses. Directly relevant to security tools that use ML/AI for behavioral analysis.

Data Broker Regulations: California's Data Broker Registration Law (Cal. Civ. Code § 1798.99.80) requires data brokers to register with the CPPA annually. The Delete Act (SB 362, 2023) requires the CPPA to build an accessible deletion mechanism allowing consumers to submit a single request to delete data from all registered data brokers — significant for security researchers who rely on data aggregators for OSINT.


Virginia CDPA

Virginia Consumer Data Protection Act

Va. Code § 59.1-571 et seq. (effective January 1, 2023)

Virginia's Consumer Data Protection Act was the second comprehensive state privacy law enacted, following California by roughly three years. Its structure borrows from the GDPR more directly than CCPA, organizing obligations around controller/processor roles.

Thresholds: A controller subject to the CDPA is a person that, during a calendar year, either (a) controls or processes personal data of at least 100,000 Virginia consumers, or (b) controls or processes personal data of at least 25,000 consumers and derives over 50% of gross revenue from the sale of personal data.

Consumer Rights under § 59.1-573:

  • Right to confirm processing and access data
  • Right to correct inaccurate data
  • Right to delete personal data
  • Right to data portability
  • Right to opt out of targeted advertising, sale of data, and profiling for decisions with legal or similarly significant effects

Sensitive Data: The CDPA defines sensitive data categories at § 59.1-571 — including racial/ethnic origin, religious beliefs, mental/physical health, sexual orientation, immigration status, and precise geolocation — requiring explicit consent before processing.

Data Protection Assessments (§ 59.1-578): Controllers must conduct and document data protection assessments for processing activities that present heightened risk — including targeted advertising, sale of personal data, processing sensitive data, and profiling. Assessments must be made available to the AG upon request. This is a significant accountability mechanism with no equivalent in CCPA/CPRA at the statutory level.

Enforcement — AG Only: The Virginia CDPA has no private right of action. Enforcement is exclusively by the Virginia AG under § 59.1-580. The AG must give controllers 30 days' notice and an opportunity to cure before filing a civil action. The cure period has no sunset under CDPA. Civil penalties: up to $7,500 per violation.

No Private Enforcement: This is the most significant difference from CCPA/CPRA. A consumer whose data is mishandled under the CDPA cannot sue the business — only the AG can act. This makes Virginia CDPA violations systematically under-enforced compared to California.


Colorado Privacy Act

C.R.S. § 6-1-1301 et seq.

Colorado Privacy Act (CPA), effective July 1, 2023.

Thresholds: Applies to controllers that (a) process personal data of 100,000+ Colorado consumers annually, or (b) derive revenue from selling personal data and process data of 25,000+ consumers.

Consumer rights: Know, access, correct, delete, portability, and opt-out of targeted advertising, sale, and profiling in furtherance of decisions that produce legal or similarly significant effects.

Sensitive data: Race/ethnicity, religious beliefs, mental/physical health diagnoses, sex life or sexual orientation, citizenship/immigration status, genetic/biometric data, precise geolocation, and children's data.

Universal Opt-Out Signals (§ 6-1-1306(1)(a)(III)): The CPA is notable for expressly requiring controllers to honor Global Privacy Control (GPC) signals and technically similar universal opt-out mechanisms by July 1, 2024. This was a first among state privacy laws — forcing businesses to build systems capable of recognizing browser-level privacy signals rather than requiring consumers to opt out manually per website.
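Mechanically, GPC is simple: a participating browser sends a `Sec-GPC: 1` request header (and exposes `navigator.globalPrivacyControl` to scripts), and the controller must treat that signal as a valid opt-out. A minimal server-side check, assuming a plain header dictionary:

```python
def gpc_opt_out(headers: dict) -> bool:
    """True if the request carries a Global Privacy Control signal (Sec-GPC: 1)."""
    return headers.get("Sec-GPC", "").strip() == "1"

# A request from a GPC-enabled browser
request_headers = {"User-Agent": "Mozilla/5.0", "Sec-GPC": "1"}
if gpc_opt_out(request_headers):
    # Under the CPA, this must be honored as an opt-out of sale/targeted advertising
    print("universal opt-out signal detected")
```

Real frameworks normalize header casing, so a production check should use the framework's case-insensitive header accessor rather than a raw dict lookup.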

Enforcement: AG and district attorneys enforce the CPA. No private right of action.

Cure period: The CPA originally included a 60-day cure period. Under § 6-1-1311(2), the cure provision expired January 1, 2025. After that date, violations do not require a cure opportunity before enforcement action — bringing Colorado closer to the California model.

Data Protection Assessments: Required for high-risk processing activities, similar to Virginia CDPA § 59.1-578.


Texas Data Privacy and Security Act

Tex. Bus. & Comm. Code § 541.001 et seq.

Texas Data Privacy and Security Act (TDPSA), effective July 1, 2024.

Scope: The TDPSA applies to persons that (a) conduct business in Texas or produce products or services consumed by Texas residents, (b) process or sell personal data, and (c) are not small businesses as defined by the U.S. Small Business Administration. Unlike California and Virginia, the TDPSA has no numerical threshold — the SBA size standard is the operative limit, meaning midsize companies that meet SBA "small business" definitions are excluded. This is a significant carveout relative to other state laws, though one obligation still reaches small businesses: they may not sell sensitive personal data without consumer consent.

Consumer Rights: Know, access, correct, delete, portability, opt-out of targeted advertising, sale of personal data, and profiling.

Sensitive Data Categories (§ 541.002(22)): Racial or ethnic origin, religious beliefs, mental or physical health diagnosis, sexuality, immigration status, genetic or biometric data processed for unique identification, precise geolocation, children's data.

Enforcement — AG Only, Permanent Cure Period: The Texas AG has exclusive enforcement authority under § 541.151. Before bringing an action, the AG must give the person 30 days' written notice and an opportunity to cure; unlike the sunsetting cure provisions in Colorado, Connecticut, Montana, and Oregon, the TDPSA's cure period has no expiration date. Civil penalties: up to $7,500 per violation.

No Private Right of Action: Like Virginia and Colorado, Texas has no private right of action under the TDPSA.

Interplay with Texas Breach Notification Law: The TDPSA is separate from Texas's breach notification statute (Tex. Bus. & Comm. Code § 521.053), which requires notification within 60 days of breach discovery and notification to the AG for breaches affecting 250 or more Texas residents. A company that suffers a breach may face enforcement under both the breach notification statute and the TDPSA if the breach resulted from a failure to implement reasonable security measures.


State Quick-Reference Matrix

State | Statute | Effective Date | Threshold | Private Right | Enforcement | Cure Period | GPC Required
Connecticut | Conn. Gen. Stat. § 42-515 et seq. (CTDPA) | July 1, 2023 | 100K consumers or 25K + 25% revenue from sales | None | AG only | Yes (expired Dec 31, 2024) | Yes (Jan 1, 2025)
Nevada | Nev. Rev. Stat. § 603A (SB 220 + SB 370) | Oct 1, 2019 (SB 220); Mar 31, 2024 (SB 370) | Operators serving Nevada residents; no numerical threshold | None | AG only | None specified | No
Utah | Utah Code § 13-61-101 et seq. (UCPA) | Dec 31, 2023 | $25M revenue plus 100K consumers or 25K + 50% revenue from sales | None | AG only | Yes (no sunset) | No
Montana | Mont. Code Ann. § 30-14-3001 et seq. (MTCDPA) | Oct 1, 2024 | 50K consumers or 25K + 25% revenue from sales | None | AG only | Yes (expires Apr 1, 2026) | Yes (Jan 1, 2025)
Oregon | ORS § 646A.570 et seq. (OCPA) | July 1, 2024 | 100K consumers or 25K + 25% revenue from sales | None | AG only | Yes (expires Jan 1, 2026) | Yes (Jan 1, 2026)
Iowa | Iowa Code § 715D (ICDPA) | Jan 1, 2025 | 100K consumers or 25K + 50% revenue from sales | None | AG only | Yes (90 days, no sunset) | No

Connecticut (CTDPA): Conn. Gen. Stat. § 42-515 et seq. is among the stronger comprehensive laws, adding data protection assessment requirements and an explicit requirement to honor opt-out signals beginning January 1, 2025. The 30-day cure period expired December 31, 2024. The AG may refer cases for civil penalties up to $5,000 per violation.

Nevada (SB 220/SB 370): Nevada's privacy regime predates the comprehensive wave — its original opt-out-of-sale requirement under SB 220 (Nev. Rev. Stat. § 603A.340) has been on the books since October 2019. Senate Bill 370 (2023) added a consumer health data protection layer for entities processing "consumer health data," creating obligations similar to Washington's My Health My Data Act. Both layers are enforced by the Nevada AG; neither creates a private right of action.

Utah (UCPA): Utah's Consumer Privacy Act is generally considered the most business-friendly comprehensive privacy law — it has no GPC requirement, no data protection assessment requirement, a permanent cure period, and no right to correct inaccurate data. Controllers must provide opt-out rights only for sale and targeted advertising, not for automated profiling. Controllers have 45 days (extendable to 90) to respond to consumer requests.

Montana (MTCDPA): Montana's law, effective October 1, 2024, applies to controllers processing data of 50,000 consumers — a lower threshold than most states, catching more midsize businesses. The cure period expires April 1, 2026, after which the AG may bring enforcement without prior cure opportunity.

Oregon (OCPA): Oregon's Consumer Privacy Act, effective July 1, 2024, covers nonprofit organizations above the threshold (with a delayed compliance date of July 1, 2025), unlike California and most other states, which exempt nonprofits. Its sensitive data definition is also among the nation's broadest, reaching status as transgender or nonbinary and status as a victim of crime.

Iowa (ICDPA): Iowa's law, effective January 1, 2025, is among the most business-friendly after Utah. A 90-day cure period applies with no sunset. Iowa has no right to correct inaccurate data and limits deletion rights to data collected directly from the consumer (excluding inferred data).


Breach Notification: State Patchwork

What "Personal Information" Means Varies by Statute

One breach. Twelve different definitions. Twelve different notification clocks.

The definition of personal information that triggers breach notification varies significantly across states — and a single data breach involving a mixed dataset may trigger notification requirements in some states but not others.

California — Cal. Civ. Code § 1798.82: For breach notification purposes (as distinguished from the broader CCPA definition), California's § 1798.82 defines personal information as an individual's first name (or first initial) and last name in combination with any of:

  • Social Security number
  • Driver's license or state ID number
  • Account number + access code/password that would permit access to the account
  • Medical information
  • Health insurance information
  • Unique biometric data (fingerprint, retina/iris scan, voiceprint)
  • Tax ID number
  • Passport number
  • Username/email address + password or security question permitting access to an online account (a category that triggers notification even without an accompanying name, under the statute's separate definition for online credentials)

Notification trigger: Unauthorized acquisition of computerized data that compromises the security, confidentiality, or integrity of personal information. California sets no fixed day count: notice must go out "in the most expedient time possible and without unreasonable delay," and a breach affecting more than 500 California residents additionally requires submitting a sample notice to the AG.
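The combination rule reduces to a predicate: an unencrypted, unredacted record triggers notification only if a name appears alongside at least one enumerated element. A simplified sketch (field names are hypothetical, the standalone online-credentials category is omitted, and real analysis is fact-specific):

```python
# Hypothetical field labels standing in for the enumerated data elements
TRIGGER_ELEMENTS = {
    "ssn", "drivers_license", "state_id", "account_number_with_access_code",
    "medical_info", "health_insurance_info", "biometric_data",
    "tax_id", "passport_number",
}

def triggers_ca_notification(fields: set, encrypted: bool, redacted: bool) -> bool:
    """Simplified sketch of the California breach-notification trigger."""
    if encrypted or redacted:
        return False  # the statute covers nonencrypted, nonredacted data only
    return "name" in fields and bool(fields & TRIGGER_ELEMENTS)

print(triggers_ca_notification({"name", "ssn"}, False, False))       # True
print(triggers_ca_notification({"name", "zip_code"}, False, False))  # False
print(triggers_ca_notification({"name", "ssn"}, True, False))        # False
```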

New York — N.Y. Gen. Bus. Law § 899-aa (SHIELD Act): New York's SHIELD Act (breach notification amendments effective October 23, 2019; data security requirements effective March 21, 2020) expanded the definition of "private information" to include:

  • Standard identifiers (SSN, driver's license, account numbers)
  • Biometric information
  • Username/email + password
  • Medical/financial account information
  • HIPAA-protected health information

New York likewise sets no fixed day count for consumer notice: disclosure must be made "in the most expedient time possible and without unreasonable delay," with copies of the notice going to the AG, the Department of State, and the State Police whenever New York residents are notified.

Texas — Tex. Bus. & Comm. Code § 521.053: Texas triggers breach notification upon unauthorized acquisition of "sensitive personal information" — defined similarly to other states (SSN, driver's license, financial accounts, medical information, email/password). Reporting deadline: within 60 days of determining a breach occurred. Companies must notify the Texas AG for breaches affecting 250 or more Texas residents, with the notification submitted through the AG's online form.

The "60-Day vs. 30-Day" Problem for Researchers: A security researcher who discovers breach data in the wild faces a threshold question: which state's law controls the data they hold? In practice:

  • If the researcher discovers a database dump containing personal information of residents from multiple states, multiple breach notification obligations may apply simultaneously to the original company that was breached.
  • If the researcher provides the data to the breached company as part of responsible disclosure, the company's notification clock begins upon its own "discovery" — a date that may predate the researcher's disclosure if the company had earlier indications of the breach.
  • The researcher holding the data is not the "owner or licensor" of the data within the meaning of most breach notification statutes, so notification obligations generally do not attach to the researcher directly. However, the researcher may have obligations under other theories.
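The multi-clock fan-out can be illustrated mechanically: given a discovery date and the states whose residents appear in the dataset, each fixed-deadline state yields its own due date. The day counts below are an illustrative subset only (Texas's 60 days from § 521.053; the 30-day entries are assumptions for illustration); states like California with a "most expedient time possible" standard simply have no entry:

```python
from datetime import date, timedelta

# Illustrative subset; real triggers and deadlines vary by statute
FIXED_DEADLINE_DAYS = {"TX": 60, "CO": 30, "FL": 30}

def notification_due_dates(discovered: date, affected_states: set) -> dict:
    """Map each affected fixed-clock state to its notification due date."""
    return {state: discovered + timedelta(days=FIXED_DEADLINE_DAYS[state])
            for state in affected_states if state in FIXED_DEADLINE_DAYS}

due = notification_due_dates(date(2025, 3, 1), {"TX", "CO", "CA"})
# CA has no fixed day count, so it does not appear in the result
```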

Biometric Privacy: Illinois BIPA and Washington

Illinois Biometric Information Privacy Act

740 ILCS 14/1 et seq. (BIPA)

Illinois BIPA, enacted in 2008, remains the most plaintiff-friendly biometric privacy statute in the United States — and arguably the single most litigation-generating privacy law by volume of class actions.

What BIPA covers (740 ILCS 14/10): BIPA defines "biometric identifier" as a retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry. The statute expressly excludes, among other things, writing samples, written signatures, photographs, physical descriptions, demographic data, and human biological samples used for valid scientific testing or screening.

"Biometric information" means information based on biometric identifiers used to identify an individual.

Key obligations under 740 ILCS 14/15:

  1. Inform and obtain consent (§ 15(b)): Before collecting biometric identifiers/information, a private entity must: (a) inform the subject in writing that biometric data is being collected and stored; (b) inform the subject of the specific purpose and length of time for which the data is collected and stored; and (c) receive a written release from the subject.
  2. Retention policy (§ 15(a)): The entity must develop a publicly available written policy establishing a retention schedule and guidelines for permanently destroying biometric data when the initial purpose has been satisfied or within 3 years of last interaction.
  3. Non-disclosure (§ 15(d)): Biometric data cannot be sold, leased, traded, or otherwise profited from. It can be disclosed only with consent, or if required by law or valid warrant.
  4. Reasonable care (§ 15(e)): Entities must store, transmit, and protect biometric data using a reasonable standard of care and in a manner that is the same as or more protective than how the entity protects other confidential data.

Damages (740 ILCS 14/20):

  • $1,000 per negligent violation (or actual damages if greater)
  • $5,000 per intentional or reckless violation (or actual damages if greater)
  • Attorneys' fees, costs, and injunctive relief

Private right of action: Any person aggrieved by a BIPA violation may bring a civil action in any Illinois court. No AG involvement required.

Class action math: In a workplace context, if an employer collects fingerprints from 50,000 employees without complying with the notice requirements, and a court finds reckless or intentional violation: 50,000 × $5,000 = $250 million maximum exposure. This arithmetic has driven multi-hundred-million-dollar settlements against employers, time-and-attendance software vendors, and facial recognition companies.

Cothron v. White Castle System, Inc. — The Per-Scan Rule

The Illinois Supreme Court's 2023 decision in Cothron v. White Castle System, Inc., 2023 IL 128004, resolved a critical question: does a BIPA cause of action accrue once — when the first violation occurs — or every time a biometric identifier is collected without compliant authorization?

Holding: A separate BIPA cause of action accrues each time a private entity scans or transmits a biometric identifier without complying with the statute. This is the "per-scan" rule.

Impact on litigation exposure: Under Cothron, an employer whose fingerprint time-clock system made daily scans over 5 years would have a separate BIPA cause of action for each scan — not just one cause of action per employee. A 500-employee company with two scans per day over five years: 500 × 2 × 1,825 = 1,825,000 individual claims. At $5,000 per reckless violation: $9.125 billion theoretical exposure.
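The per-scan ceiling generalizes to simple multiplication; a toy calculator (theoretical maximum only, since courts retain discretion to award far less):

```python
def per_scan_ceiling(employees: int, scans_per_day: int,
                     days: int, per_violation: int = 5_000) -> int:
    """Theoretical BIPA exposure under Cothron's per-scan accrual rule."""
    return employees * scans_per_day * days * per_violation

# 500 employees, 2 scans per day, 5 years of daily scans
print(f"${per_scan_ceiling(500, 2, 5 * 365):,}")  # $9,125,000,000
```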

Court's moderation: The Cothron court noted that BIPA damages are discretionary, so courts may award less even for intentional violations, and suggested that "annihilative" damage awards might raise due process problems. Practically, settlement values are anchored somewhere between $1,000 and $5,000 per unique claimant and negotiated down from the per-scan ceiling. Note also that Illinois amended BIPA in August 2024 (Public Act 103-0769) so that repeated collection of the same biometric identifier from the same person by the same method constitutes a single violation, legislatively curbing Cothron's per-scan accrual going forward.

Security research implication: Facial recognition systems used in security research — even for non-commercial purposes — that scan Illinois residents without compliant authorization face per-scan BIPA exposure. The private right of action is fully available to Illinois residents.

Washington My Health My Data Act

Wash. Rev. Code ch. 19.373 (WMHDA), effective March 31, 2024 for regulated entities and June 30, 2024 for small businesses.

Washington's WMHDA is a health data-specific privacy statute that fills gaps left by HIPAA — which does not cover non-covered-entity health apps, wellness platforms, period-tracking apps, or fitness wearables.

What it covers: "Consumer health data" — any personal information that is linked or reasonably linkable to a consumer and that identifies the consumer's past, present, or future physical or mental health status. This includes reproductive/sexual health, medications, diagnoses, health conditions, and precise geolocation data that could reasonably indicate an attempt to acquire health services.

Key requirements:

  • Consent required before collecting or sharing consumer health data
  • Right to access, withdraw consent, and request deletion
  • Data cannot be sold without valid authorization
  • Geofencing prohibition: geofence technology may not be used around facilities that provide in-person health care services to identify or track consumers, collect consumer health data, or send health-related messages or ads

Private right of action: Yes. A violation of the Act is enforceable as a per se violation of Washington's Consumer Protection Act, allowing a consumer to sue for actual damages (with treble damages capped at $25,000), injunctive relief, and attorneys' fees. Unlike BIPA, Washington's law does not specify per-scan versus per-incident accrual, which remains to be litigated.


Security Researcher Angle

Scraping and Data Aggregation Under State Privacy Laws

Security researchers routinely aggregate data — pulling from public sources, breach databases, OSINT tools — and must understand when this activity creates state privacy law exposure.

CCPA scraping: Cal. Civ. Code § 1798.140(v) defines "personal information" to include data "reasonably capable of being associated with" a particular consumer. This means aggregated data that is individually innocuous — IP addresses, browsing behavior, location data — may constitute personal information when combined. Aggregation itself — collecting many individually non-sensitive data points to build a profile of an identifiable individual — is not expressly prohibited by CCPA, but processing aggregated profiles for commercial purposes may constitute a "sale" or "sharing" that triggers opt-out obligations.

CCPA deletion demands during active research: If a security researcher has collected personal information about an individual as part of research — compiling victim profiles in a breach investigation — and that individual submits a CCPA deletion request, the researcher's obligations depend on whether the researcher is a "business" within the meaning of § 1798.140(d). Most individual researchers and small research organizations will fall below the $25 million revenue threshold and the 100,000-consumer processing threshold. However, research organizations, threat intelligence firms, and data broker-adjacent research operations may qualify and face real deletion obligations.

When you hold breach data: If a security researcher discovers and downloads a database of breached personal information as part of responsible disclosure, they become a custodian of personal information. Several legal frameworks bear on this:

  • CCPA: If the researcher qualifies as a "business" and the individuals in the dataset are California residents, the researcher may have notification obligations under § 1798.82 if the breach was of the researcher's own systems. But this is an unusual fact pattern — the researcher is the discoverer, not the original breached entity.
  • Stored Communications Act (18 U.S.C. § 2701): If the breach data was acquired from a communication service provider, possession of the data may raise SCA issues for the researcher depending on how it was acquired.
  • Minimization as best practice: The legally safest approach is to collect only what is necessary to document the vulnerability, immediately report to the affected organization, and delete the data once the vulnerability is confirmed. Extended retention of PII-containing breach data is where legal risk accumulates.

Responsible Disclosure and PII: Responsible disclosure frameworks (ISO/IEC 29147, CISA CVD guidance, platform-specific VDP terms) typically require the researcher to retain confidentiality of affected data. This aligns with the state privacy law objective of minimizing data exposure, but creates a tension: the researcher needs enough data to prove the vulnerability is real, but holding that data creates increasing legal exposure over time.

Practical framework:

  1. Capture the minimum necessary to prove exploitability (a truncated sample, masked fields, a count of affected records)
  2. Do not retain full PII datasets beyond what is needed for disclosure
  3. Notify the organization promptly — the earlier they know, the earlier their breach notification clock considerations begin
  4. If the organization is unresponsive, follow coordinated disclosure norms (90-day timeline) and document every communication
  5. Delete retained data once disclosure is complete
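Step 1's "truncated sample, masked fields" guidance can be illustrated with a small sketch. The field names and masking rules here are invented for the example, not a compliance standard:

```python
import hashlib

def mask_record(record: dict) -> dict:
    """Reduce a breached record to the minimum needed to prove exploitability."""
    masked = {}
    if "email" in record:
        local, _, domain = record["email"].partition("@")
        masked["email"] = local[:2] + "***@" + domain
    if "ssn" in record:
        masked["ssn_last4"] = record["ssn"][-4:]       # last 4 digits only
    if "card_number" in record:
        # Partial hash: enough to match against the source system, not to reuse.
        masked["card_hash_prefix"] = hashlib.sha256(
            record["card_number"].encode()
        ).hexdigest()[:12]
    return masked

# Fabricated test record, not real data.
sample = {"email": "jane.doe@example.com", "ssn": "123-45-6789",
          "card_number": "4111111111111111"}
print(mask_record(sample))
```

The point of the design is that the masked output is what goes into the bug report; the raw record never needs to leave the breached system or persist on the researcher's side.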

CCPA and BIPA overlap in research tools: Security researchers building or using facial recognition tools, fingerprint analysis tools, or voice identification tools that process Illinois residents' biometric data face BIPA exposure regardless of whether the purpose is security research. BIPA does not include a security research exemption. A researcher scanning social media photos to build a facial recognition test dataset from Illinois residents is processing biometric data without BIPA authorization. This is a real litigation risk.
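The scale of that litigation risk is easy to underestimate. A hypothetical exposure calculation with invented numbers shows why the accrual rule matters so much:

```python
# Hypothetical scenario: 500 employees clock in by fingerprint twice a day
# for 250 working days, all without BIPA-compliant notice and authorization.
# $1,000 is BIPA's negligent-violation tier; figures are illustrative only.
employees, scans_per_day, days = 500, 2, 250
negligent_tier = 1_000

per_person = employees * negligent_tier                       # one violation per person
per_scan = employees * scans_per_day * days * negligent_tier  # per-scan accrual

print(f"per-person accrual: ${per_person:,}")   # $500,000
print(f"per-scan accrual:   ${per_scan:,}")     # $250,000,000
```

The per-scan figure is why BIPA settlements dwarf those under other privacy statutes; note that a 2024 Illinois amendment limits repeated collections of the same biometric from the same person via the same method to a single violation, pulling many fact patterns back toward the per-person figure.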


12-State Comparison Table

| State | Private Right | AG Enforcement | Cure Period | Biometric Coverage | Sensitive Data Definition | Breach Notification Clock | Threshold | Effective Date |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| California | Yes (§ 1798.150 — breach only) | Yes (CPPA + AG) | No (CPPA enforcement); 30-day pre-suit (§ 1798.150) | Yes (in sensitive data definition) | Broad — 12 categories including biometric, geo, health, sexual orientation | 30 days | $25M revenue or 100K consumers or 50% revenue from sales | Jan 1, 2020 (CCPA); Jan 1, 2023 (CPRA) |
| Illinois (BIPA only) | Yes ($1K–$5K per violation) | AG may bring action | No | Yes — statute is biometric-specific | Biometric identifiers and information (740 ILCS 14/10) | N/A (BIPA is not breach notification) | Any private entity that collects biometrics | 2008 |
| Virginia | None | AG only | Yes (30 days; no sunset) | Yes (in sensitive data) | 10 categories including biometric, mental health, immigration status, geolocation | N/A (separate breach law) | 100K consumers or 25K + 50% revenue | Jan 1, 2023 |
| Colorado | None | AG + DAs | Expired Jan 1, 2025 | Yes (biometric in sensitive data) | Includes biometric, health, geolocation, immigration, union membership | N/A (separate law) | 100K consumers or 25K + revenue from sales | July 1, 2023 |
| Texas | None | AG only | Yes (30 days; no sunset) | Yes (biometric in sensitive data) | 8 categories including biometric, health, geo, sexuality, immigration | 60 days (§ 521.053) | SBA small business exception | July 1, 2024 |
| Connecticut | None | AG only | Expired Dec 31, 2024 | Yes (biometric in sensitive data) | Includes biometric, geo, health, sexual orientation, immigration | 60 days (separate statute) | 100K consumers or 25K + 25% revenue | July 1, 2023 |
| Nevada | Limited (SB 220 opt-out) | AG | None specified | Yes (SB 370 health data) | Health data under SB 370; standard categories under SB 220 | 30 days (§ 603A.220) | Operators selling data; no explicit threshold for SB 370 | 2019 (SB 220); 2024 (SB 370) |
| Utah | None | AG only | Yes — no sunset | Yes (biometric in sensitive data) | 7 categories; narrower than most | 30 days (§ 13-44-202) | 100K consumers or 25K + 50% revenue | Dec 31, 2023 |
| Montana | None | AG only | Yes — expires Apr 1, 2026 | Yes (biometric in sensitive data) | Similar to Virginia | N/A (separate law) | 50K consumers or 25K + 25% revenue | Oct 1, 2024 |
| Oregon | None | AG only | Yes — expires Jan 1, 2026 | Yes (biometric in sensitive data) | Includes employees; broader than most | 45 days (§ 646A.604) | 100K consumers or 25K + 25% revenue | July 1, 2024 |
| Iowa | None | AG only | Yes — 90 days, no sunset | Yes (biometric in sensitive data) | Narrower; no right to correct | 30 days (§ 715C.2) | 100K consumers or 25K + 50% revenue | Jan 1, 2025 |
| Washington (WMHDA) | Yes ($1K per violation) | AG | No | Yes (health data including reproductive) | Consumer health data (health-specific statute) | N/A (WMHDA is not breach notification) | Any regulated entity or small business | Mar 31 / Jun 30, 2024 |
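The "Breach Notification Clock" column above can be turned into a simple deadline tracker. This is a sketch using the table's day counts only; several states actually use flexible "without unreasonable delay" standards and different trigger events, so verify each statute before relying on any number:

```python
from datetime import date, timedelta

# Day counts from the comparison table; illustrative, not legal advice.
CLOCK_DAYS = {
    "CA": 30,   # table value; statute also requires "most expedient time possible"
    "TX": 60,   # Tex. Bus. & Com. Code § 521.053
    "NV": 30,   # NRS 603A.220
    "OR": 45,   # ORS 646A.604
}

def notification_deadlines(discovered: date, states: list[str]) -> dict[str, date]:
    """Map each affected state to its outside notification date, earliest first."""
    deadlines = {s: discovered + timedelta(days=CLOCK_DAYS[s]) for s in states}
    return dict(sorted(deadlines.items(), key=lambda kv: kv[1]))

# Hypothetical incident discovered March 1, 2025, with CA, TX, and NV residents.
for state, deadline in notification_deadlines(date(2025, 3, 1), ["TX", "CA", "NV"]).items():
    print(state, deadline.isoformat())
```

Sorting earliest-first reflects the operational reality: the shortest clock drives the incident response timeline.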

Safe / Grey / Red Matrix: Data Handling During Security Research

SAFE — Low Risk

| Activity | Why It Is Safe |
| --- | --- |
| Reviewing publicly available breach data on HaveIBeenPwned to verify exposure of a specific email address | Platform has authorization; no direct PII access; no download |
| Accessing anonymized aggregate statistics from a breach (record count, data categories) to document a vulnerability | No PII; no collection of personal information |
| Using masked/truncated sample data (last 4 digits only, partial hashes) in a bug report | Minimum necessary disclosure; PII not retained |
| Deleting breach data immediately after confirming vulnerability and filing report | Minimization; no extended retention |
| Requesting your own CCPA data deletion as a test of a company's privacy program | Legitimate consumer right exercise |
| Analyzing biometric data collected from yourself or consenting participants in a controlled research environment | Consent satisfies BIPA § 15(b); properly documented |
| Processing personal data of fewer than 25,000 consumers for nonprofit security research | Likely below CCPA/most state thresholds |

GREY — Elevated Risk; Requires Precaution

| Activity | Risk | Mitigation |
| --- | --- | --- |
| Downloading a breach database dump to verify scope before reporting | Possession of PII; SCA concerns if acquired from a provider's systems | Collect minimum sample; delete immediately; document chain of custody |
| Using facial recognition tools on public social media images to identify threat actors | BIPA exposure for Illinois residents; CCPA exposure for California residents | Do not process biometric data of Illinois residents without authorization; document research purpose |
| Aggregating personal data from multiple public sources to build a target profile | May constitute "sale" or "sharing" under CCPA if done for a business purpose | Keep below CCPA thresholds; purpose-limit to security research; do not retain |
| Receiving CCPA deletion requests while holding breach data as part of an investigation | If researcher qualifies as "business," deletion obligation may attach | Document research purpose; respond within 45 days; consult counsel on exemptions |
| Operating a threat intelligence service that processes personal data of state residents | Potential classification as "data broker" in California; registration and deletion obligations | Register if required; build deletion mechanism; audit data retention |
| Testing a health app's data security by registering as a consumer and reviewing data handling | WMHDA or HIPAA data may be collected as a byproduct | Do not retain health data; scope test to functional security aspects only |
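The "document chain of custody" mitigation in the first grey-zone row can be implemented as a minimal tamper-evident log that stores hashes of the retained sample rather than the sample itself. The structure and field names here are illustrative, not a standard:

```python
import hashlib
import json
from datetime import datetime, timezone

def custody_entry(sample_bytes: bytes, action: str, actor: str) -> dict:
    """One log entry: a hash of the held sample, never the sample itself."""
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "actor": actor,
        "action": action,   # e.g. "collected", "reported-to-vendor", "deleted"
        "sha256": hashlib.sha256(sample_bytes).hexdigest(),
        "size_bytes": len(sample_bytes),
    }

# Hypothetical masked sample from a disclosure, logged at each lifecycle step.
log = []
sample = b"masked sample: ja***@example.com, ssn_last4=6789"
log.append(custody_entry(sample, "collected", "researcher@example.org"))
log.append(custody_entry(sample, "reported-to-vendor", "researcher@example.org"))
log.append(custody_entry(sample, "deleted", "researcher@example.org"))
print(json.dumps(log, indent=2))
```

Because the log holds only hashes, it can be preserved indefinitely as evidence of minimal, time-bounded handling even after the underlying data is deleted.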

RED — High Risk; Do Not Proceed Without Counsel

| Activity | Risk |
| --- | --- |
| Retaining a full breach database containing PII for longer than needed to document and report the vulnerability | CCPA, state breach notification, SCA exposure; could be characterized as unauthorized possession |
| Collecting biometric data from Illinois residents (fingerprints, facial scans, voiceprints) without written authorization under 740 ILCS 14/15 | Per-scan BIPA liability at $1,000–$5,000 per violation; private right of action |
| Building or operating a commercial service that aggregates personal health data of Washington residents without consent | WMHDA private right of action; $1,000 per violation; AG enforcement |
| Using CCPA deletion exemptions as a justification for indefinitely retaining breach data from a third-party breach | Exemptions are narrow; extended retention is not covered by security research exemptions |
| Disclosing breach data to third parties (including other researchers, journalists, or law enforcement) without following proper legal process | May constitute unlawful disclosure under state privacy laws; potential CFAA issues |
| Operating a data broker service in California without registering with the CPPA | Cal. Civ. Code § 1798.99.82; enforceable fine per failure to register |
| Selling or licensing personal data collected during security research | Likely a "sale" under CCPA; triggers opt-out requirements and potential § 1798.150 exposure |

Practitioner Takeaways

1. CCPA § 1798.150 is the breach statute that matters most for security researchers. The private right of action is breach-specific. If your organization fails to implement reasonable security and a breach results in unauthorized access to California residents' nonencrypted and nonredacted personal information, class action litigation is nearly automatic for any breach above 10,000 consumers. Document your security posture now: penetration testing records, patch management logs, and incident response procedures are your defense exhibits.

2. Illinois BIPA is strict liability territory for biometric data. BIPA's $1,000 tier attaches to merely negligent violations: a company that collects fingerprints without compliant notice and written authorization is liable even if it had no idea BIPA existed. Under Cothron v. White Castle (2023), a claim accrues with each biometric scan, although a 2024 amendment (SB 2979) now treats repeated collections of the same biometric from the same person via the same method as a single violation. If your security tools touch biometric data of Illinois residents, get counsel before deployment.

3. The cure period map is shrinking fast. California (CPPA enforcement): no cure period. Colorado: expired January 1, 2025. Connecticut: expired December 31, 2024. Oregon: expires January 1, 2026. Montana: expires April 1, 2026. (Texas, Virginia, Utah, and Iowa retain cure periods with no sunset.) The legislative trend is clearly toward eliminating cure periods, bringing state laws closer to the GDPR model, where violations are immediately enforceable.

4. Breach notification clocks vary; build a multi-state checklist. A single breach affecting residents of California, Texas, and Nevada triggers three different notification regimes: California (30 days), Texas (60 days), Nevada (30 days). If the breach involves health data, Washington's WMHDA may also apply. If the company is a HIPAA covered entity, the 60-day HIPAA breach notification rule runs concurrently. The single most important operational question is: which states' residents are in the affected dataset?

5. PII minimization is the legal risk management strategy. Every state privacy law discussed in this module — from CCPA to BIPA to WMHDA — imposes less risk on actors who hold less data for shorter periods. For security researchers, this means: collect the minimum necessary to prove the vulnerability, report it promptly, and delete. Extended possession of breach data is where legal exposure compounds.

6. The CPPA cybersecurity audit regulations, if finalized, will be a major compliance burden. The proposed regulations would require covered businesses to conduct annual cybersecurity audits and report to the agency, making California the first U.S. state to require an annual affirmative security attestation. Security professionals should track this rulemaking: if finalized, it creates a mandatory market for penetration testing and third-party security assessment services.


Quiz

See: artifacts/quizzes/quiz-02f.md

Test your knowledge

Ready to check what stuck?

10 questions — cases, statutes, and the practical move for each. Takes 5 minutes.

Take the quiz now →