Non-Lawyers' Summary
Open-source intelligence (OSINT) — gathering information from publicly visible sources — is broadly legal in the U.S., but a series of narrow statutes and court decisions create hard edges that practitioners can cross without realizing it. The main dividing lines are: whether a data source requires credentials to access, whether the collected data is used to surveil or harass an individual, and whether operations move from passively viewing dark web content to actively participating in illegal markets. Blockchain activity is public by design, but privacy coins and mixer services carry money-laundering risk even for researchers.
Nobody knew it yet, but the investigator had already crossed the line — twenty-three database queries ago. She was a licensed private investigator, working a fraud case, pulling everything she could find on the target: LinkedIn profiles, court records, social media archives, forum posts going back a decade. All public. All visible without logging in. All legal, as far as she understood.
Then a source offered her login credentials for a private investigators' database — a professional tool, legitimately purchased by the source's firm. She used it. Six searches. The target's home address, his wife's workplace, his children's school. She put it all in her report. The client was delighted. Three months later, the FBI knocked. The source's firm had revoked that login two weeks before she used it. She had accessed the database "without authorization" — not because she hacked anything, but because a revocation she was never told about had transformed a legitimate credential into an unauthorized one. The Stored Communications Act does not require malicious intent — only intentional access without authorization. She had that.
The hardest thing about OSINT law is not that the rules are complicated. It is that they are invisible until you are already on the wrong side of them. Public data is legal. Credentialed data is legal. The space between — where authorization ends, where aggregation becomes surveillance, where the dark web becomes a crime scene — is the terrain this module maps.
1. OSINT and the Stored Communications Act
The 1986 Law That Still Controls Your Google Dorking
It started with a phone call. A voicemail, stored somewhere on a server rack in a telephone switching station, read by someone who was never meant to hear it.
Congress panicked — and passed the Stored Communications Act.
That was 1986. The statute they wrote (18 U.S.C. §§ 2701–2712), enacted as part of the Electronic Communications Privacy Act, prohibits "intentionally access[ing] without authorization a facility through which an electronic communication service is provided" and obtaining stored communications from that facility.
They meant voicemail. Courts stretched it to webmail. Then to private social media messages. Then to cloud-stored data from services that didn't exist when the ink dried. The key phrase — "without authorization" — became the conceptual battlefield that would define OSINT law for decades. It is the same battlefield as the CFAA, fought on different terrain.
The SCA draws a simple line. But simple lines get complicated fast.
| Tier | Data Type | SCA Status |
|---|---|---|
| Publicly available | Public posts, open profiles, publicly indexed pages | No SCA protection — anyone may read |
| Restricted | Private messages, friends-only posts, login-gated data | SCA applies — unauthorized access is criminal |
What this means for OSINT: Scraping a public Facebook profile, a public LinkedIn page, or a Twitter/X post visible without login does not implicate the SCA. Accessing a private inbox, a "friends-only" post, or a restricted group without the account holder's permission potentially does — even if you did not steal the credentials yourself (e.g., you found them in a leaked database).
Facebook v. Power Ventures — The Cease-and-Desist That Changed Everything
A startup called Power Ventures had a clever idea: let users log into their own Facebook accounts through Power's platform and pull in their friends and messages in one dashboard. Users loved it. Facebook didn't.
Facebook sent a cease-and-desist. Power continued anyway.
Facebook v. Power Ventures (N.D. Cal. 2012, aff'd in relevant part by the Ninth Circuit in 2016) was the moment courts drew a line in code. The court held that continued access after an explicit cease-and-desist constituted unauthorized access under both the CFAA and the SCA. The lesson was blunt: revocation of permission is legally effective. The moment you receive that letter and keep going, your access is unauthorized — and potentially criminal.
Practitioner note: If a platform sends you a cease-and-desist for scraping activity, further access is legally dangerous even if the data is publicly visible.
hiQ v. LinkedIn — The Case That Became OSINT's Constitution
In 2017, a startup called hiQ Labs was doing something remarkable: predicting which employees were about to quit their jobs. Their secret weapon was LinkedIn — publicly visible profiles, scraped at scale, fed into attrition models.
LinkedIn noticed. LinkedIn sent a cease-and-desist. LinkedIn threatened to sue.
But hiQ didn't fold. They sued first — and won.
The Ninth Circuit's 2022 ruling in hiQ Labs v. LinkedIn became the most important OSINT case of the decade. The court held:
- Scraping publicly available data does not constitute unauthorized access under the CFAA because the data is not "behind a gate" — anyone with a browser can see it.
- The Van Buren "gates-up/gates-down" framework (liability turns on entering a prohibited zone, not on misusing permitted access) is consistent with this conclusion.
- LinkedIn's cease-and-desist did not transform public scraping into a CFAA violation, because there were no access controls to bypass.
But here is the hidden clause. The part the headlines missed.
Critical limitation: The hiQ ruling protects scraping of genuinely public data. It does not protect scraping login-required data, does not resolve state law claims, and does not address GDPR or CCPA obligations.
The drawbridge is still up. hiQ just proved you could walk up to it. It didn't prove you could go through.
2. OSINT Scraping Legal Limits
When the Supreme Court Rewrote the Rules
Just before oral arguments in Van Buren v. United States (2021), prosecutors believed they had the CFAA exactly where they wanted it. A Georgia police sergeant had abused his access to a law enforcement database to look up a license plate for money. Classic unauthorized use. Clear case.
The Supreme Court disagreed — and in doing so, dismantled a theory prosecutors had relied on for twenty years.
The Court held that CFAA's "exceeds authorized access" provision applies only to accessing a zone of a computer system the person was not permitted to enter — not to misusing data from a zone they were legitimately permitted to access. Gates up means you're in. Gates down means you're out. What you do once you're inside is a different question.
The Van Buren framework reshaped every OSINT scenario on the map:
| Scenario | CFAA Risk |
|---|---|
| Scraping public pages (no login required) | No CFAA violation under hiQ/Van Buren |
| Scraping after creating an account, within public-scope pages | Low risk; debated; ToS violation at most |
| Scraping pages behind login using your own valid credentials but bypassing rate limits | Gray zone; circuit split remains |
| Using stolen credentials to scrape login-required data | CFAA violation — § 1030(a)(2) |
| Bypassing a paywall, authentication token, or IP block using technical means | Strong CFAA exposure |
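The low-risk scenarios in the table above assume the scraper stays on genuinely public pages and bypasses nothing. A minimal sketch of that posture: consult the site's published robots.txt rules before fetching anything. Robots.txt is not a legal boundary, but honoring it documents good faith. The robots content, domain, and user-agent string here are illustrative assumptions, not taken from any real site.

```python
from urllib import robotparser

# Hypothetical robots.txt for an example site -- in practice, fetch the
# live /robots.txt from the target domain before any automated collection.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
"""

def build_parser(robots_text: str) -> robotparser.RobotFileParser:
    """Load robots.txt rules into a parser the scraper can consult."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_text.splitlines())
    return rp

def may_fetch(rp: robotparser.RobotFileParser, url: str,
              agent: str = "osint-research-bot") -> bool:
    """True if the site's published crawl rules permit fetching this URL."""
    return rp.can_fetch(agent, url)

rp = build_parser(ROBOTS_TXT)
print(may_fetch(rp, "https://example.com/profiles/jdoe"))   # allowed path
print(may_fetch(rp, "https://example.com/private/inbox"))   # disallowed path
```

Note the design choice: the check happens before any request is made, so a cease-and-desist or a rule change can be honored by simply re-parsing the file.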
CFAA § 1030(a)(2) — The Prosecutor's Favorite Tool
This subsection is the primary weapon prosecutors deploy in scraping cases involving authentication bypass. Elements: (1) intentional access, (2) without authorization or exceeding authorization, (3) of a protected computer, (4) thereby obtaining information. Note that (a)(2) itself has no damage element; a civil suit under § 1030(g) additionally requires a qualifying loss of at least $5,000, which courts have allowed plaintiffs to satisfy through investigation and response costs.
Terms of Service — The Theory That Died in the Supreme Court
For years, prosecutors argued that clicking "I agree" and then violating the fine print was a federal crime. Van Buren largely killed that theory. ToS violation alone — using a scraper on a site that prohibits automated access in its terms — does not establish criminal liability under the CFAA.
The terms of service checkbox is not a criminal statute. Not anymore.
GDPR Article 5 — The European Alarm System
Here is where American OSINT practitioners encounter a surprise: you can be in full compliance with U.S. law while simultaneously violating European law.
If your scraping target includes personal data about EU residents — even from a U.S. website — GDPR applies to the extent you are a "controller" or "processor" of that data. Article 5 mandates:
- Purpose limitation: Data may only be collected for specified, legitimate purposes.
- Data minimization: Collect only what is necessary.
- Storage limitation: Don't retain longer than needed.
Scraping EU residents' personal data without a lawful basis (consent, legitimate interest, or other Article 6 grounds) is a GDPR violation. Fines reach the higher of €20 million or 4% of global annual revenue. OSINT practitioners operating in the EU or targeting EU subjects need a documented legitimate-interest assessment.
CCPA — California's Shadow Over Every Commercial Database
The California Consumer Privacy Act (Cal. Civ. Code § 1798.100 et seq.) applies to businesses meeting revenue or data-volume thresholds. If you commercially exploit scraped data about California residents, CCPA obligations attach. For most individual OSINT practitioners the exposure is civil: the private right of action covers data breaches involving personal data, and selling scraped personal data without the required disclosures invites regulatory enforcement.
3. Stalking and Harassment Exposure
When OSINT Becomes a Weapon
There is a version of OSINT that helps investigators find missing persons, expose fraud, and protect the innocent. There is another version that destroys lives.
The law has a name for the second version.
Federal cyberstalking under 18 U.S.C. § 2261A applies when a person, with intent to kill, injure, harass, or intimidate, uses "any interactive computer service or electronic communication service" to engage in a course of conduct that places a specific person in reasonable fear of death or serious bodily injury, or causes (or would reasonably be expected to cause) substantial emotional distress.
OSINT becomes stalking when:
- It targets a specific individual (not a corporate or infrastructure target).
- There is a pattern — not a single lookup, but a course of surveillance.
- The aggregated information is used to monitor, follow, threaten, or intimidate.
The aggregation problem is real and terrifying: individually harmless OSINT queries (employer, neighborhood, daily schedule, vehicle, gym) combine into a surveillance profile that a court could treat as a course of conduct supporting a stalking charge.
The Fifty-State Minefield
Every U.S. state has a stalking statute. Most require a course of conduct, a credible threat, and knowledge that the conduct would cause fear. California Penal Code § 646.9, New York Penal Law § 120.45, and Texas Penal Code § 42.072 are representative examples. State laws often have lower thresholds than federal law and do not require interstate communication.
Key principle: Collecting OSINT on a private individual (not a public official in their public capacity) for non-journalistic, non-law-enforcement purposes creates stalking exposure if the purpose is surveillance or if the data is shared with someone who uses it to harass.
The Line Between Research and Predation
- Using OSINT to determine when someone leaves home to time a physical approach: stalking.
- Aggregating location data, work schedule, and vehicle information on an ex-partner: stalking.
- Conducting OSINT on behalf of someone else who uses it to harass: aiding and abetting.
- Accessing private accounts, location-sharing apps, or Find My networks without consent: SCA + state computer fraud statutes.
The line is not always obvious. That's what makes it dangerous.
4. Doxxing
The Law That Doesn't Exist — And the Laws That Do
As of 2026, there is no dedicated federal anti-doxxing statute. Bills have been introduced — the SHIELD Act among various session proposals — but none has been enacted. The gap is real.
But the gap has edges. Federal exposure for doxxing flows through existing statutes that fill the silence:
- § 2261A (cyberstalking) if the doxxing is part of a harassment campaign.
- 18 U.S.C. § 875 (interstate threats) if the doxxed information is accompanied by threats.
- 18 U.S.C. § 1512 (witness tampering) if doxxing targets a witness or cooperator.
The states have moved faster than Congress.
| State | Statute | Coverage |
|---|---|---|
| California | AB 1732 (2022), Cal. Civ. Code § 1798.81.5 | Prohibits disclosure of personal info to facilitate violence against elected officials and their families; civil right of action |
| New York | Exec. Law § 79-n | Prohibits publishing personal information with intent to harm; applies to public officials and private individuals |
| Colorado | C.R.S. § 18-9-111 | Harassment statute covers electronic disclosure of personal information |
| Virginia | Va. Code § 18.2-60.4 | Criminal doxxing targeting law enforcement |
Trend: States are rapidly enacting doxxing statutes post-2020. Check current law in the target jurisdiction before relying on this table.
Civil Liability — The Courthouse No One Sees Coming
Even absent a criminal statute, doxxing exposes practitioners to:
- Intentional infliction of emotional distress (IIED): Requires extreme and outrageous conduct causing severe distress. Courts have found doxxing + coordinated harassment meets this threshold.
- Public disclosure of private facts: A privacy tort requiring that the disclosed facts were private, the disclosure was public, and the disclosure would be highly offensive to a reasonable person.
- Negligence: If you compiled a doxx file and shared it with someone you knew or should have known would use it to harm the target.
You don't have to pull the trigger to be liable for the bullet.
5. Dark Web Operations — Tor Legal Exposure
The Network Built by the Government, Used Against It
Here is the paradox that defines the dark web: Tor grew out of onion-routing research at the U.S. Naval Research Laboratory, funded in part by DARPA, and is now maintained by the Tor Project, a 501(c)(3) nonprofit. The FBI uses Tor. Journalists use it. Human rights activists use it from countries that would imprison them for their browsing history.
In the United States: using Tor is legal. No statute prohibits it. No court has held it criminal.
Some countries disagree — China, Russia, Iran block or criminalize Tor. For researchers working across jurisdictions, that matters.
Running a Tor Exit Node — The Risk That Comes With Being the Last Stop
Exit node operators have faced ISP termination, law enforcement visits, and civil liability claims from copyright holders. However:
- No U.S. court has imposed criminal liability on a Tor exit node operator solely for third-party traffic transiting the node.
- The Electronic Frontier Foundation and the Tor Project maintain legal guidance arguing exit nodes qualify for § 230 immunity (for content liability) and that operators are analogous to common carriers.
- Practical risk: Law enforcement will show up if your exit node is associated with CSAM, drug market traffic, or attacks. You will need to explain what a Tor exit node is. Have documentation prepared.
Silk Road and the Outer Boundary — United States v. Ulbricht
Once, there was a marketplace where you could buy anything.
Ross Ulbricht — known online as "Dread Pirate Roberts" — built Silk Road in a coffee shop. A libertarian experiment. A proof of concept that the internet could create a free market for everything. Within two years, it was processing millions of dollars in drug transactions monthly.
In 2013, the FBI found him. In 2015, United States v. Ulbricht (S.D.N.Y. 2015) established the outer boundary for dark web operations. Ulbricht was convicted of drug trafficking conspiracy, continuing a criminal enterprise, computer hacking, and money laundering.
For users — not operators:
- Browsing a dark web marketplace without buying anything: Legal in the U.S., assuming no active registration that constitutes agreement to participate in illegal activity.
- Creating an account and browsing: Gray zone — prosecutors argued in Silk Road prosecutions that account creation indicated conspiracy intent. Courts have generally not sustained conspiracy solely from registration without purchase.
- Purchasing drugs, weapons, or stolen data: Criminal charges under the relevant substantive statute (Controlled Substances Act, federal firearms statutes, § 1029 for access devices), potentially plus conspiracy.
Ulbricht received two life sentences without parole.
U.S. and European authorities subsequently dismantled AlphaBay (2017), Hansa (2017), and DarkMarket (2021) using similar frameworks. The marketplace graveyard grows.
Accessing Hidden Services Under § 1030
Accessing a .onion hidden service does not independently violate 18 U.S.C. § 1030. Tor's routing is just a network protocol. The content accessed determines legality. CFAA applies to hidden services as it does to clearnet services — unauthorized access to a restricted zone is a violation whether the service runs on clearnet or Tor.
6. Dark Web Marketplaces — Legal Exposure Matrix
What You Buy Can Buy You a Cell
| Item Category | Primary Statute | Sentence Range |
|---|---|---|
| Stolen credit card data / access devices | 18 U.S.C. § 1029 (Access Device Fraud) | Up to 15 years |
| Counterfeit currency | 18 U.S.C. § 471 | Up to 20 years |
| Schedule I/II drugs | 21 U.S.C. § 841 | Quantity-dependent mandatory minimums |
| Firearms (FFL bypass) | 18 U.S.C. § 922(a)(1) | Up to 5 years |
| CSAM | 18 U.S.C. § 2252 | 5-year mandatory minimum, up to 20 years (receipt/distribution) |
Selling — Where "Aiding and Abetting" Follows You Home
Selling on a dark web marketplace adds aiding and abetting (18 U.S.C. § 2) liability for the offenses your customers commit with your products. Fentanyl vendors have been charged under the "death results" enhancement of 21 U.S.C. § 841(b)(1)(C), which carries a 20-year mandatory minimum when death results from the drug distributed.
You don't have to know who dies. You just have to have shipped the product.
Operating — The Ulbricht Template
The Ulbricht charges remain the template for marketplace operator prosecutions:
- Drug trafficking conspiracy — operating the platform is itself participation in a conspiracy to distribute.
- Continuing criminal enterprise (CCE) — "kingpin" statute requiring 5+ persons under your direction, substantial income, series of violations.
- Computer fraud — operating the server infrastructure used for illegal activity.
- Money laundering — collecting and converting Bitcoin from drug sales.
Cryptocurrency as Evidence Trail — The Ledger That Never Forgets
Ulbricht believed Bitcoin was anonymous. He was wrong.
Blockchain analytics transformed dark web prosecution. Investigators traced Bitcoin from Silk Road's servers through hundreds of transactions to wallets on the laptop seized from Ulbricht at arrest — and firms like Chainalysis and Elliptic have since industrialized the technique. Every move he made was preserved, permanently, on a public ledger. Lesson: Bitcoin is pseudonymous, not anonymous. Every transaction is permanently public.
The blockchain is the witness that never dies.
7. Cryptocurrency and Blockchain OSINT
No Privacy Here — The Fifth Circuit Said So
United States v. Gratkowski (5th Cir. 2020) held that Bitcoin users have no reasonable expectation of privacy in their Bitcoin transactions recorded on the public blockchain. The Third-Party Doctrine from Smith v. Maryland applies: by broadcasting transactions to the network, users voluntarily expose them to the public.
This has profound implications:
- Law enforcement does not need a warrant to run blockchain analytics against public transaction data.
- Chainalysis, Elliptic, and TRM Labs reports are admissible without 4th Amendment suppression issues.
- The only meaningful privacy protection is coin-joining, mixing, or using privacy coins — all of which carry their own legal risk.
Blockchain Analytics in Court
Blockchain analytics have been admitted as expert testimony in multiple circuits. Prosecutors have successfully argued:
- Chainalysis Reactor's transaction graph tracing is a reliable methodology under Daubert.
- Heuristics like "common input ownership" (addresses that co-sign transactions are likely controlled by the same entity) are scientifically grounded.
- Exchange KYC records obtained via subpoena or MLAT close the pseudonymity gap.
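The common-input-ownership heuristic named above is, at its core, a clustering pass over the transaction graph: any addresses that ever co-sign inputs to the same transaction are assumed to share an owner. A toy sketch with union-find illustrates the idea — the transaction data here is invented, and real analytics tools layer many more heuristics and far larger datasets on top of this.

```python
from collections import defaultdict

# Toy input data: tx id -> addresses appearing together as inputs.
# (Illustrative only; real analytics ingest the full transaction graph.)
TXS = {
    "tx1": ["addrA", "addrB"],
    "tx2": ["addrB", "addrC"],
    "tx3": ["addrD"],
}

def cluster_by_common_input(txs: dict[str, list[str]]) -> list[frozenset]:
    """Union-find clustering: addresses that ever co-sign inputs of the
    same transaction are assumed to share an owner."""
    parent: dict[str, str] = {}

    def find(a: str) -> str:
        parent.setdefault(a, a)
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a

    for inputs in txs.values():
        root = find(inputs[0])            # registers singletons too
        for addr in inputs[1:]:
            parent[find(addr)] = root     # merge into the first input's set

    clusters: dict[str, set[str]] = defaultdict(set)
    for addr in parent:
        clusters[find(addr)].add(addr)
    return [frozenset(c) for c in clusters.values()]

clusters = cluster_by_common_input(TXS)
```

Because addrB appears in both tx1 and tx2, the heuristic transitively merges addrA, addrB, and addrC into one presumed owner, while addrD stays alone — exactly the kind of inference that collapsed Silk Road's pseudonymity.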
Privacy Coins and the Tornado Cash Verdict
Monero (XMR) uses ring signatures, stealth addresses, and RingCT to obscure sender, receiver, and amount. Using Monero is legal in the U.S. But:
- The IRS has offered bounties for Monero tracing tools.
- Using Monero specifically to layer proceeds of drug sales or other crime constitutes money laundering (18 U.S.C. § 1956) regardless of the coin's built-in privacy.
Then came the moment that shocked the crypto world.
Tornado Cash (OFAC 2022): Without warning, OFAC sanctioned Tornado Cash — an Ethereum mixing protocol — treating the smart contract-based mixer as a sanctionable "person" under IEEPA. A protocol. Code running on a blockchain. Sanctioned. Interacting with Tornado Cash after the designation created OFAC violation exposure even for non-malicious users, and two Tornado Cash developers were subsequently criminally charged with money laundering conspiracy. The story kept moving: in Van Loon v. Department of the Treasury (5th Cir. 2024), the Fifth Circuit held that Tornado Cash's immutable smart contracts are not sanctionable "property," and OFAC removed the designation in 2025 — though the criminal prosecutions of the developers proceeded independently.
The developers didn't steal anything. They just built the infrastructure that others used to hide what they stole.
Practitioner rule: Legitimate blockchain OSINT (tracing a hack, researching a wallet for threat intelligence) using public data is lawful. Actively mixing your own funds, accepting client funds to mix, or advising clients to use mixers to evade law enforcement creates serious money-laundering exposure.
8. GEOINT and Facial Recognition
Clearview AI — The Company That Scraped the World's Face
Clearview AI did what no one had done before: they scraped billions of facial images from public social media profiles and turned the internet into the world's largest lineup. Law enforcement loved it. Privacy advocates were horrified. The courts got involved.
In 2022, Clearview settled ACLU v. Clearview AI (N.D. Ill.) under the Illinois Biometric Information Privacy Act (BIPA), agreeing to a nationwide ban on selling its faceprint database to most private entities and a five-year ban on sales to Illinois government agencies, including law enforcement.
BIPA (740 ILCS 14) is the most plaintiff-friendly biometric privacy statute in the U.S.:
- Requires written informed consent before collecting biometric identifiers (including facial geometry).
- Requires a written policy governing retention and destruction.
- Private right of action: $1,000 per negligent violation, $5,000 per intentional or reckless violation, plus attorneys' fees.
- Clearview's Illinois liability was estimated in the billions for unconsented collection from Illinois residents.
GDPR Article 9 — The European Category That Changes Everything
Under GDPR, biometric data processed for the purpose of uniquely identifying a natural person is a special category requiring explicit consent or another elevated lawful basis. Running facial recognition scraping against EU social media subjects without consent is a high-severity GDPR violation. The Irish DPC and French CNIL have both issued enforcement actions against facial recognition scraping.
The Facial Recognition Risk Map
| Activity | Risk |
|---|---|
| Submitting a photo to a commercial service (PimEyes, Clearview via authorized channel) | Low; shifts liability to platform |
| Building your own facial recognition scraper against social media | BIPA (IL), GDPR (EU), potential state biometric laws (TX, WA) |
| Using facial recognition to track a specific private individual | Stalking statutes + biometric laws |
| Running bulk facial recognition to de-anonymize protest attendees | BIPA, First Amendment litigation target |
9. Threat Intelligence Operations
The Researcher in the Criminals' Forum
Every day, security researchers log into places that would horrify most people. BreachForums. RAMP. Exploit.in. Dark markets humming with stolen data. They're not there to buy. They're there to understand.
The legal analysis:
- Viewing public forum posts (no login required): Legal. No CFAA issue; no SCA issue.
- Creating an account to access restricted areas: Gray zone. Account creation may imply agreement to participate in the community's norms. No case has directly criminalized researcher-only accounts absent actual illegal activity.
- Purchasing stolen data "to analyze it": Almost certainly a violation of § 1029 (access device fraud) or other relevant statute. The "research purpose" defense has not succeeded in this context.
- Downloading leaked databases to check for client exposure: Legal in most contexts; CISA's guidelines support this for threat intelligence; civil privacy tort risk remains.
The Undercover Line Private Researchers Cannot Cross
Law enforcement agents routinely use undercover identities to infiltrate criminal organizations. Private researchers and investigators do not have the same authority.
- Private investigators posing as criminals on dark web markets risk criminal liability for any illegal acts they perform undercover (purchasing contraband, even to prove supply chains exist).
- Entrapment is a defense available only when government agents induce the crime — it offers no cover to private researchers, and a private actor who induces a crime risks accomplice or conspiracy liability.
- Sharing undercover intelligence with law enforcement before taking action is the safest path; memorialize the relationship in writing.
Threat Intel Sharing and the Antitrust Ghost
Sharing threat intelligence with competitors (e.g., through ISACs — Information Sharing and Analysis Centers) raises antitrust concerns if the information shared could facilitate price coordination or market allocation. The DOJ and FTC have issued guidance indicating that properly structured ISAC sharing does not violate antitrust law provided:
- The shared information is limited to technical threat indicators (IOCs, TTPs, vulnerability data).
- Pricing, output, or customer information is not shared.
- Participation is voluntary and non-exclusive.
Government Sharing Frameworks
- Automated Indicator Sharing (AIS): CISA's machine-to-machine STIX/TAXII feed for IOC sharing between government and private sector. Participation does not create legal liability for the shared indicators.
- ISACs (FS-ISAC, H-ISAC, E-ISAC): Sector-specific sharing; membership agreements govern liability. Federal antitrust safe harbors cover compliant ISAC activity.
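AIS and ISAC feeds exchange indicators as STIX objects carried over TAXII. A minimal sketch of what one shared indicator looks like — the field set below is a stripped-down STIX 2.1 Indicator (the full specification defines many more optional properties), and the IP address and indicator name are invented examples using a documentation address range:

```python
import json
import uuid
from datetime import datetime, timezone

def make_indicator(pattern: str, name: str) -> dict:
    """Build a minimal STIX 2.1 Indicator object (a sketch; consult the
    STIX 2.1 spec for the complete property list)."""
    now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.000Z")
    return {
        "type": "indicator",
        "spec_version": "2.1",
        "id": f"indicator--{uuid.uuid4()}",   # STIX ids are type--UUID
        "created": now,
        "modified": now,
        "name": name,
        "pattern": pattern,                   # STIX patterning language
        "pattern_type": "stix",
        "valid_from": now,
    }

# Hypothetical C2 address from the 198.51.100.0/24 documentation range.
ioc = make_indicator("[ipv4-addr:value = '198.51.100.7']", "C2 beacon address")
print(json.dumps(ioc, indent=2))
```

Note how the object carries only a technical indicator — no pricing, customer, or output data — which is exactly the boundary the antitrust guidance above draws.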
10. Safe / Grey / Red Activity Matrix
| Activity | Legal Status | Notes |
|---|---|---|
| Google dorking (public site search operators) | SAFE | Querying public indexes; no access controls bypassed |
| LinkedIn scraping (public profiles, no login) | SAFE (U.S.) | hiQ v. LinkedIn (9th Cir. 2022); GDPR applies for EU subjects |
| LinkedIn scraping (login-required data, bulk) | GREY | ToS violation; CFAA risk if auth controls bypassed |
| Shodan searches (public scan data) | SAFE | Shodan queries its own scan data; you are reading their results |
| Active port scanning a target without authorization | RED | State computer crime statutes; CFAA exposure if scanning causes damage or precedes intrusion |
| HaveIBeenPwned lookups (email check) | SAFE | Public API; checking your own or authorized domain data |
| Retrieving breach data for your own domain via HIBP for self-audit | SAFE | Authorized service; lawful purpose |
| Accessing Tor Browser / .onion sites (viewing only) | SAFE (U.S.) | Legal software; legal to browse without purchasing contraband |
| Visiting dark web markets without account creation | SAFE (U.S.) | No criminal exposure from passive browsing |
| Creating an account on a dark web drug market | GREY | Account creation arguably evidence of conspiracy intent |
| Purchasing anything on a dark web market | RED | Substantive criminal charges + conspiracy |
| Running Maltego on a public target (open sources only) | SAFE | Aggregating public data; same legal profile as manual OSINT |
| Running Maltego transforms that query login-required APIs | GREY | Depends on whether you have authorized API credentials |
| Reverse image search (TinEye, Google Images) | SAFE | Querying public indexes |
| Submitting target photo to PimEyes / facial recognition service | GREY | Legal in U.S.; GDPR risk for EU subjects; BIPA risk if IL resident |
| Building your own facial recognition scraper against social media | RED (IL/EU) | BIPA (IL), GDPR (EU), state biometric laws |
| Public records requests (FOIA, state equivalents) | SAFE | Statutory right; no legal exposure |
| Accessing paste sites with publicly posted leaked data | SAFE | Viewing publicly accessible data; no auth controls bypassed |
| Downloading and analyzing paste-site leaked data commercially | GREY | Civil privacy tort risk; GDPR/CCPA obligations if data monetized |
| Blockchain tracing (public transaction data, Chainalysis-style) | SAFE | Gratkowski — no 4th Amendment protection; public data |
| Using cryptocurrency mixing services for research funds | RED | OFAC sanctions exposure post-Tornado Cash (2022) |
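The HaveIBeenPwned rows above rate lookups SAFE partly because of how the service is designed: the publicly documented Pwned Passwords range endpoint uses k-anonymity, so the full password hash never leaves your machine. A sketch of the client-side computation (only the hashing runs here; the actual HTTP query is shown as a comment):

```python
import hashlib

# Publicly documented Pwned Passwords k-anonymity endpoint.
RANGE_API = "https://api.pwnedpasswords.com/range/"

def ka_prefix_suffix(password: str) -> tuple[str, str]:
    """Split the SHA-1 of a password into the 5-char prefix sent to the
    API and the suffix matched locally -- the server never sees the full
    hash, let alone the password."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]

prefix, suffix = ka_prefix_suffix("password")
# Query would be: GET {RANGE_API}{prefix}; then search the response body
# for `suffix` to learn the breach count without revealing the password.
print(prefix)  # → "5BAA6"
```

The same privacy-by-design pattern — share a coarse bucket, match locally — is worth copying in any OSINT tooling that touches sensitive identifiers.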
Key Cases Quick Reference
| Case | Court | Year | Holding |
|---|---|---|---|
| Van Buren v. United States | U.S. Supreme Court | 2021 | "Exceeds authorized access" = entering prohibited zone, not misusing permitted access |
| hiQ Labs v. LinkedIn | 9th Cir. | 2022 | Scraping publicly available data is not a CFAA violation |
| Facebook v. Power Ventures | N.D. Cal. | 2012 | Access after cease-and-desist = unauthorized access under CFAA + SCA |
| United States v. Ulbricht | S.D.N.Y. | 2015 | Dark web marketplace operation = drug conspiracy + CCE + money laundering; 2x life |
| United States v. Gratkowski | 5th Cir. | 2020 | No 4th Amendment protection for public Bitcoin blockchain transactions |
| ACLU v. Clearview AI | N.D. Ill. | 2022 | BIPA applies to facial recognition scraping; nationwide injunctive settlement |
Practitioner Checklist
Before conducting OSINT on an individual target:
- [ ] Is the target a public figure in their public capacity? (Lower privacy expectation)
- [ ] Am I accessing only publicly available data (no login required, no auth bypass)?
- [ ] Do I have a documented legitimate purpose (legal, journalistic, security research)?
- [ ] Is the target's personal data subject to GDPR or CCPA? If yes, document lawful basis.
- [ ] Will I aggregate location + schedule + personal data in a way that creates a surveillance profile?
- [ ] Have I received a cease-and-desist or platform notice about this activity?
- [ ] If operating on Tor or dark web, am I passively observing only — no purchases, no account creation on marketplaces?
- [ ] For blockchain OSINT: am I using only public transaction data without interacting with sanctioned protocols?
Next module: 02B — Zero-Day Markets, Export Controls, and Commercial Spyware
Test your knowledge
Ready to check what stuck?
10 questions — cases, statutes, and the practical move for each. Takes 5 minutes.