
Dark Patterns in Personal Data Collection: Definition, Taxonomy, and Lawfulness (2022)

Posted on: October 25, 2025

The paper investigates dark patterns in privacy (DPPs): interface‑design tactics that manipulate users into disclosing more personal data than they intend, to the benefit of the service provider.

Jarovsky’s paper delivers a clear, interdisciplinary framework for identifying and regulating dark patterns in privacy. By tying UI manipulation to cognitive bias theory and positioning the GDPR’s fairness principle as the legal fulcrum, the research equips regulators, designers, and legal practitioners with a concrete roadmap to detect, assess, and ultimately prevent privacy‑harmful design practices.

What is this research about?

Guiding questions:

  1. How can DPPs be defined in a way that distinguishes them from ordinary nudges?
  2. Which cognitive biases do DPPs exploit, and how can they be systematically categorized?
  3. What is the legal status of DPPs under the EU GDPR (and related regimes such as the DSA and CCPA/CPRA)?
  4. Can existing legal principles—especially the GDPR’s fairness principle—be leveraged to curb DPPs?

The study aims to (i) propose a precise definition, (ii) build a taxonomy grounded in cognitive‑bias theory, and (iii) evaluate the compatibility of DPPs with current EU data‑protection law, suggesting pathways for regulatory reform.

What do you need to know? – Context & Significance

  • Rise of manipulative UI – Users regularly encounter opaque settings, forced‑consent banners, and default‑heavy designs that funnel them toward data‑rich choices. Why it matters: it demonstrates a systemic problem that undermines the GDPR’s consent requirements.
  • Legal gap – While the GDPR mandates “freely given, specific, informed” consent, it does not explicitly address how consent is obtained via UI. Why it matters: this creates a loophole where controllers can obtain apparently lawful consent that is actually coerced.
  • Existing scholarship – Prior work on dark patterns (e.g., Brignull, Nouwens, CNIL) treats them mainly as marketing tricks; few connect them to privacy law. Why it matters: Jarovsky’s work bridges HCI, behavioral economics, and EU law, offering a multidisciplinary lens.
  • Policy momentum – The EU Digital Services Act (DSA) and California CPRA have begun naming dark patterns, but their scope is limited. Why it matters: this highlights the timeliness of a deeper legal analysis of DPPs.
  • Unique contribution – The paper (a) refines the definition to separate dark patterns from nudges, (b) maps each pattern to a specific cognitive bias, and (c) frames the GDPR’s fairness principle as the primary legal lever. Why it matters: it provides a concrete analytical toolkit for regulators, designers, and scholars.

What did the researchers find? – Key Findings & Illustrative Quotes

3.1 Definition & Conceptual Clarification

  • Dark patterns in privacy (DPPs) are manipulative UI choices that harm privacy while benefiting the controller.
  • Distinction from nudges: nudges are easy to avoid and non‑coercive; DPPs are hard to avoid and purposefully detrimental.

“To be considered a dark pattern, the design must be manipulative and have as an objective goal to make the data subject worse off according to the observed criteria.”

3.2 Taxonomy (Four Core Categories)

  • Pressure – Coercive language or conditional access (“you must share X to use Y”). Example: requiring marketing consent to complete a purchase.
  • Hinder – Deliberate friction: hidden settings, complex navigation, privacy‑invasive defaults. Example: a one‑click “Accept all” vs. a labyrinth of individual switches.
  • Mislead – Ambiguous wording, double negatives, visual tricks (color, contrast). Example: a green “deny” button next to a red “accept” button.
  • Misrepresent – False claims of necessity or benefit. Example: claiming data collection is legally required when it isn’t.

Each category is linked to a set of cognitive biases (e.g., default effect, framing, anchoring, social proof).
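The four‑category taxonomy lends itself to being encoded as a simple lookup structure, e.g. for an internal design‑review checklist. The sketch below is a hypothetical illustration, not from the paper: the category names and examples follow the taxonomy above, but which bias is listed under which category, and the `biases_for` helper, are illustrative assumptions.

```python
# Hypothetical encoding of the paper's four DPP categories as a data
# structure for audit tooling. Category names and examples follow the
# paper's taxonomy; the per-category bias assignments are illustrative
# assumptions drawn from the biases the paper mentions (default effect,
# framing, anchoring, social proof).

DPP_TAXONOMY = {
    "Pressure": {
        "mechanism": "coercive language or conditional access",
        "biases": ["social proof"],  # assumed assignment
        "example": "requiring marketing consent to complete a purchase",
    },
    "Hinder": {
        "mechanism": "deliberate friction and privacy-invasive defaults",
        "biases": ["default effect"],  # assumed assignment
        "example": "'Accept all' in one click vs. many individual switches",
    },
    "Mislead": {
        "mechanism": "ambiguous wording, double negatives, visual tricks",
        "biases": ["framing"],  # assumed assignment
        "example": "green 'deny' button, red 'accept' button",
    },
    "Misrepresent": {
        "mechanism": "false claims of necessity or benefit",
        "biases": ["anchoring"],  # assumed assignment
        "example": "claiming data collection is legally required when it isn't",
    },
}

def biases_for(category: str) -> list[str]:
    """Return the cognitive biases associated with a taxonomy category."""
    return DPP_TAXONOMY[category]["biases"]
```

A structure like this could back the “supervisory checklists” and internal audits discussed later in the article.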

3.3 Legal Analysis

  • Consent incompatibility: DPP‑driven consent breaches GDPR requirements of freely given and informed consent (Recitals 42‑43).
  • Fairness principle as a remedy: Since the GDPR repeatedly cites fairness yet never defines it, the author argues that fairness should prohibit practices that “materially distort or impair” autonomous decision‑making.
  • Regulatory gaps:
    • DSA Recital 67 mentions dark patterns but exempts GDPR‑covered practices, leaving DPPs largely untouched.
    • CPRA explicitly invalidates consent obtained via dark patterns, offering a model for EU reform.

“DPPs breach the principle of fairness, for cumulatively: (a) not respecting reasonable expectations … (c) involving manipulation … (d) negatively affecting data subjects’ privacy.”

3.4 Outlier / Particularly Interesting Insights

  • Homo Manipulable vs. Homo Economicus: The paper proposes a shift from the rational‑agent model to a “Homo manipulable” view, recognizing bounded rationality as a legal reality.
  • Cross‑domain relevance: Although focused on privacy, the taxonomy applies to other domains (finance, attention, emotion) where dark patterns appear.
  • Policy recommendation: Unpack the GDPR’s fairness principle into actionable guidance, akin to the EU’s Unfair Commercial Practices Directive.

How can you use this research? – Audience‑Specific Takeaways

  • Privacy Regulators & Policymakers – Draft interpretative guidance that treats DPPs as violations of the fairness principle. Amend the DSA to remove the GDPR exemption for privacy‑related dark patterns. Incorporate the taxonomy into supervisory checklists for DPIA reviews.
  • Designers & Product Teams – Conduct an internal audit using the four‑category taxonomy to spot DPPs in UI flows. Replace pressure and mislead patterns with transparent, opt‑in consent dialogs. Adopt “privacy‑by‑design” checklists that explicitly test for default‑effect exploitation.
  • Legal Counsel & Compliance Officers – Re‑evaluate consent records for evidence of pressure or misrepresent tactics. Advise clients that consent obtained via any of the four categories may be deemed invalid under GDPR and CPRA precedents. Prepare arguments for fairness‑principle defenses in enforcement actions.
  • Researchers & Academics – Extend the taxonomy to other jurisdictions (e.g., Brazil’s LGPD, India’s PDPB). Empirically test the impact of each pattern on user behavior using eye‑tracking or A/B experiments. Explore the intersection of DPPs with algorithmic profiling and AI‑driven personalization.
  • Consumer Advocacy Groups – Use the taxonomy to produce “dark‑pattern scorecards” for popular apps/websites. Educate users about the cognitive biases that make DPPs effective, encouraging critical UI scrutiny.

Future‑research directions highlighted by the author

  • Operationalize the fairness principle with concrete metrics (e.g., “fairness impact assessment”).
  • Develop automated detection tools that map UI elements to the four DPP categories.
  • Study the effectiveness of regulatory interventions (e.g., mandatory UI disclosures) across jurisdictions.
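To make the automated‑detection direction concrete, one very simple rule‑based heuristic would flag a consent flow as a candidate Hinder pattern when refusing costs noticeably more effort than accepting. This is a minimal sketch under my own assumptions: the `ConsentFlow` fields, the click‑count metric, and the asymmetry threshold are all hypothetical, not from the paper.

```python
# Hypothetical rule-based detector for one Hinder-type pattern: consent
# flows where rejecting data collection takes materially more steps than
# accepting it. All names and thresholds here are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class ConsentFlow:
    clicks_to_accept_all: int  # steps to consent to everything
    clicks_to_reject_all: int  # steps to refuse everything

def flag_hinder(flow: ConsentFlow, max_asymmetry: int = 1) -> bool:
    """Flag flows where refusal is more than `max_asymmetry` clicks
    costlier than consent."""
    return (flow.clicks_to_reject_all - flow.clicks_to_accept_all) > max_asymmetry

# One-click "Accept all" vs. a rejection path buried three screens deep:
flag_hinder(ConsentFlow(clicks_to_accept_all=1, clicks_to_reject_all=4))  # True
```

A real detector would of course need to parse actual UI trees and handle the other three categories, but even a crude asymmetry metric like this maps directly onto the taxonomy.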

What did the researchers do? – Methodology Overview

  • Conceptual analysis – Synthesized literature from HCI, behavioral economics, and EU data‑protection law to craft a precise definition separating DPPs from nudges.
  • Taxonomy construction – Mapped a non‑exhaustive list of cognitive biases (anchoring, default effect, framing, etc.) to observable UI manipulations, producing four high‑level categories (Pressure, Hinder, Mislead, Misrepresent).
  • Legal doctrinal review – Examined GDPR Articles 6, 7, 25, Recitals 42‑45, the DSA, and the CPRA, interpreting how each regime treats consent and dark‑pattern‑like practices.
  • Case illustrations – Presented realistic user scenarios (Alice, Bob, Charlie, Danah) to demonstrate each pattern type and its privacy impact.
  • Normative argumentation – Leveraged the EU’s fairness principle and the PECL’s contract‑law concepts (mistake, fraud, pressure, hindrance) to argue for a legal re‑framing of DPPs.

No empirical data (surveys, interviews, or experiments) were collected; the work is a theoretical‑legal synthesis supported by illustrative examples.

