DG Justice and Consumers, European Commission

Rue Montoyer 59 

1000 Brussels

17 March 2023

Re: Digital Fairness - Fitness Check on EU Consumer Law

Dark Patterns, Consumer Protection and Digital Fairness:

A roadmap to illuminating user-centric design

By Dr Mark Leiser[1]

As contemporary society grows increasingly reliant on technological advancements and the digital sphere, it is incumbent upon legal regimes to evolve concomitantly to safeguard consumer interests. In recent years, a phenomenon warranting particular attention has been the emergence of "dark patterns": manipulative and deceptive user interface designs that surreptitiously guide users' behaviours and impinge upon their autonomy and control. This submission assesses the extent to which extant and forthcoming legislation is equipped to address this pernicious practice through the prism of the digital fairness review.

I. Enforcement Actions by Consumer Protection Agencies

The European Commission has studied dark patterns and manipulative practices: interfaces designed to push consumers into making choices that may not be in their best interest.[2] The US Federal Trade Commission (FTC) released a report in 2022 showing how companies increasingly use sophisticated design practices, known as "dark patterns", that can trick or manipulate consumers into buying products or services or giving up their privacy.[3]

Within the EU, consumer protection authorities have adopted various enforcement measures targeting companies that employ dark patterns and deceptive design techniques in contravention of the Unfair Commercial Practices Directive (UCPD). The UCPD's objective is to safeguard consumers from unfair and deceptive commercial practices, including those that exploit or manipulate consumer behaviour through dark patterns. Regulatory actions instituted by consumer protection agencies range from imposing fines and injunctions to filing legal complaints. These enforcement measures deter businesses from utilising dark patterns: misleading or coercive methods that erode user autonomy and control and give rise to harms such as forced continuity, privacy Zuckering, and compelled action.

Personalised choice architectures employ personal data, such as browsing history, demographic information, and preferences, to customise user experiences and shape decision-making. By presenting options that capitalise on cognitive biases, these designs can subtly sway users towards choices that advantage the company, such as selecting a costlier product, committing to a subscription, or disclosing personal information. The Dutch Authority for Consumers and Markets (ACM), for instance, has sanctioned a company for inadequate transparency regarding personalised pricing. Other e-commerce platforms might use a customer's previous purchases to feature higher-priced items prominently, aware that the customer has a propensity for procuring expensive products. While this may not overtly contravene the GDPR or the UCPD, it engenders ethical quandaries about exploiting users' vulnerabilities and biases for commercial profit.

Innovative forms of dark patterns will continue to emerge as technology and user experience design evolve. This section critically analyses the European Union's preparedness to address these nascent challenges by scrutinising four distinct categories of developing dark patterns:

  1. Personalised Choice Architectures Compliant with the GDPR/UCPD: These dark patterns leverage personal information to tailor choice architectures and influence users' decisions while adhering to the stipulations of the General Data Protection Regulation (GDPR) and the Unfair Commercial Practices Directive (UCPD). Despite formal compliance with these frameworks, such practices may still exploit users' susceptibilities and cognitive biases, raising concerns about their ethical implications and the effectiveness of consumer protection enforcement. It is therefore paramount that the EU ensures the UCPD's scope remains sufficiently adaptable to encompass such dark patterns and that the GDPR's provisions on transparency and consent are stringently enforced.
  2. Personalised Choice Architectures Non-Compliant with the GDPR/UCPD: Where personalised choice architectures fail to adhere to GDPR and UCPD stipulations, the EU must enforce the extant legal framework effectively, imposing fines, injunctions, and other penalties on companies that violate the regulations, thereby safeguarding consumer rights and holding companies accountable for their deceptive practices. Furthermore, the EU could contemplate fortifying the UCPD by explicitly addressing dark patterns and augmenting the authority of consumer protection agencies to counteract these practices.
  3. Recommendations Powered by Human Avatars: Some dark patterns use human avatars to recommend products or services, giving the appearance of unbiased advice from a trusted source. As these avatars become more sophisticated and lifelike, their capacity to manipulate consumers grows. Together with AI-powered communication tools such as ChatGPT, they are transforming how businesses interact with consumers: while such innovations can improve the user experience and streamline communication, they also introduce new consumer protection and regulatory compliance challenges. The EU should consider updating its regulatory framework to address this deception by requiring clear disclosure of such avatars' artificial nature and commercial intent.
  4. Generative Dark Patterns Powered by Large-Language Models like ChatGPT: As artificial intelligence (AI) and machine learning advance, large-language models like GPT-4 can generate increasingly sophisticated dark patterns. These practices may be challenging to detect and regulate, as they can adapt to users' behaviour in real time. With companies like Meta already testing AI in consumer marketing, and conversational models envisaged as the medium between platforms and consumers, traditional means of delivering terms and conditions, privacy policies, and other transparency obligations will be powered by new forms of machine learning. This transformation presents both opportunities and challenges for consumer protection. The EU should invest in research and development to improve its ability to identify and monitor these advanced dark patterns and work closely with the AI community to develop ethical guidelines and technical safeguards.

I will now outline the opportunities, risks, and challenges for consumers under each of these four categories.

  1. Personalised Choice Architectures Compliant with the GDPR/UCPD

Opportunities for Improved Consumer Experiences:

1.1 Enhanced Personalisation: AI-powered communication can offer consumers more personalised experiences by tailoring information to their needs, preferences, and interests. This could make terms and conditions, privacy policies, and other disclosures more engaging and easier to understand, ultimately leading to more informed decision-making.

1.2 Streamlined Interactions: AI-driven systems like ChatGPT can improve the efficiency and effectiveness of consumer interactions with platforms, simplifying processes and saving time. For instance, AI could help users navigate complex legal documents, quickly locate relevant sections, and obtain concise explanations of key provisions.

1.3 Dynamic Adaptation: Large-language models can adapt to changes in user behaviour and preferences in real time, allowing platforms to serve their customers better and address their concerns. This can lead to more effective communication and greater satisfaction among consumers.
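To make the mechanics of a personalised choice architecture concrete, the following minimal sketch shows how a retailer might re-rank search results using a shopper's purchase history. All names, figures, and the scoring formula are illustrative assumptions rather than a description of any actual platform: the point is that a modest, formally GDPR-compliant signal (mean past spend) is enough to nudge higher-priced items to the top for users inferred to be willing to pay more.

```python
from dataclasses import dataclass

@dataclass
class Product:
    name: str
    price: float
    relevance: float  # relevance to the user's query, 0..1

def price_affinity(purchase_history: list[float]) -> float:
    """Crude proxy for willingness to pay: mean past spend, normalised
    against an assumed catalogue ceiling of 1000 (illustrative only)."""
    if not purchase_history:
        return 0.0
    return min(sum(purchase_history) / len(purchase_history) / 1000, 1.0)

def rank(products: list[Product], affinity: float, bias: float = 0.0) -> list[Product]:
    """bias=0 ranks purely by relevance; bias>0 up-weights expensive items
    for high-affinity users -- the 'higher-priced items featured
    prominently' example discussed in the text."""
    def score(p: Product) -> float:
        return p.relevance + bias * affinity * (p.price / 1000)
    return sorted(products, key=score, reverse=True)

catalogue = [
    Product("budget kettle", 25.0, relevance=0.9),
    Product("designer kettle", 240.0, relevance=0.8),
]
affinity = price_affinity([480.0, 620.0])  # hypothetical high-spending customer

print([p.name for p in rank(catalogue, affinity)])            # ['budget kettle', 'designer kettle']
print([p.name for p in rank(catalogue, affinity, bias=1.0)])  # ['designer kettle', 'budget kettle']
```

Nothing in this sketch processes special-category data or breaches any consent requirement, which is precisely the enforcement gap that category 1 identifies.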

  2. Personalised Choice Architectures Non-Compliant with the GDPR/UCPD

Challenges and Risks for Consumer Protection:

2.1 Deceptive Practices: AI-generated content can create new dark patterns that manipulate users' behaviour and exploit their vulnerabilities. For instance, AI-powered systems could provide misleading or biased information, selectively present terms and conditions, or pressure users into decisions against their best interests.

2.2 Lack of Transparency: AI-driven systems may not make explicit the distinction between human-generated and AI-generated content, making it difficult for consumers to assess the credibility and reliability of the information they receive. This lack of transparency can undermine trust in platforms and services.

2.3 Privacy Concerns: Using AI to personalise experiences may require collecting and processing large amounts of personal data, raising concerns about privacy and data protection. This can lead to potential violations of GDPR and other privacy regulations.

  3. Recommendations Powered by Human Avatars

Challenges and Regulatory Response:

3.1 Clear Disclosure Requirements: Companies employing human avatars should be required to disclose the artificial nature of these avatars and their commercial intent. This can be achieved through visual cues, disclaimers, or other means that make it apparent to users that they are interacting with a non-human entity (a minimal sketch of such a disclosure follows this list).

3.2 Guidelines for Responsible Avatar Design: The EU can develop guidelines for the ethical and responsible design of human avatars, outlining best practices and principles that companies should adhere to when developing and deploying these technologies.

3.3 Enhanced Monitoring and Enforcement: Consumer protection agencies should be equipped with the necessary tools and resources to monitor the use of human avatars and ensure compliance with disclosure and design requirements.
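As an illustration of the disclosure requirement in 3.1, the sketch below shows one way a platform could attach both a machine-readable flag (useful for regulator audits under 3.3) and a plain-language label to every message generated by a synthetic persona. The class, field names, and disclosure wording are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class AvatarMessage:
    text: str
    is_synthetic: bool = True       # machine-readable flag, auditable by regulators
    commercial_intent: bool = True  # the avatar is promoting a product or service

DISCLOSURE = "[Automated assistant - this recommendation is sponsored]"

def render(msg: AvatarMessage) -> str:
    """Prepend a plain-language disclosure whenever the message comes from
    a synthetic persona acting with commercial intent."""
    if msg.is_synthetic and msg.commercial_intent:
        return f"{DISCLOSURE} {msg.text}"
    return msg.text

print(render(AvatarMessage("Customers like you love our premium plan!")))
```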

  4. Generative Dark Patterns Powered by Large-Language Models like ChatGPT

Addressing the Challenges:

4.1 Develop Ethical Guidelines and Technical Safeguards: The EU should work closely with the AI community to develop ethical guidelines and technical safeguards that promote responsible design and discourage the use of dark patterns. This may include incorporating fairness, accountability, and transparency principles into AI systems to prevent deceptive practices and uphold consumer rights.

4.2 Invest in Research and Development: The EU should invest in research and development to improve its ability to identify and monitor AI-generated dark patterns. This may involve developing advanced detection and monitoring tools and exploring new methods for analysing and interpreting AI-generated content.

4.3 Strengthen Regulatory Frameworks: The EU should consider updating its regulatory frameworks to address the unique challenges AI-driven systems pose. This may include creating new disclosure requirements for AI-generated content, particularly in consumer settings like AI-powered communications with customers, enhancing data protection rules, and providing more explicit guidance on applying existing consumer protection laws to AI-generated dark patterns.

By proactively addressing the opportunities and challenges presented by AI-powered systems like ChatGPT, the EU can ensure that consumer protection remains a priority in the rapidly evolving digital landscape. This will involve balancing the potential of AI to improve consumer experiences with the need to safeguard against the risks associated with deceptive practices and privacy violations.

Maintaining the Efficacy of Consumer Protection Enforcement

To preserve the efficacy of consumer protection enforcement amid the emergence of dark patterns, the EU should contemplate the following initiatives:

  1. Bolster Coordination Amongst Consumer Protection Agencies: The EU must augment collaboration and coordination amongst its member states' consumer protection agencies. Establishing a centralised platform for exchanging information, best practices, and resources would enable agencies to consolidate their expertise and collaboratively address dark patterns more effectively, thereby ensuring a uniform approach to enforcement across the region.
  2. Allocate Resources for Advanced Detection and Monitoring Technologies: Consumer protection agencies require state-of-the-art detection and monitoring tools to identify and combat dark patterns proactively. Investment in advanced technologies, such as AI-driven analytics and real-time monitoring systems, would facilitate the swift and efficient identification of deceptive practices, enabling agencies to undertake prompt enforcement action and mitigate further consumer detriment (a minimal detection sketch follows this list).
  3. Escalate Penalties and Sanctions for Regulatory Non-Compliance: The EU should contemplate intensifying penalties and sanctions for non-compliance with consumer protection regulations to dissuade organisations from employing dark patterns. This might encompass imposing more significant fines, issuing injunctions, and pursuing legal action against recidivist transgressors. Demonstrating the significant repercussions of contravening consumer protection laws will prompt companies to prioritise ethical design and abstain from deceptive practices.
  4. Establish a Whistle-blower Mechanism: Encouraging employees and insiders to disclose dark patterns and other deceptive practices could significantly aid consumer protection agencies in identifying and addressing violations. The EU should consider instituting a whistle-blower mechanism that safeguards and incentivises those who divulge information regarding companies' engagement in dark patterns. This approach would enhance the likelihood of detecting such practices and convey a potent message to businesses about the importance of compliance.
  5. Elevate Public Awareness and Promote Consumer Reporting: Empowering consumers to recognise and report dark patterns is vital for efficacious enforcement. The EU should allocate resources to public awareness campaigns and educational initiatives that inform consumers of their rights and the potential harm caused by deceptive practices. Additionally, consumer protection agencies should devise user-friendly reporting mechanisms, such as hotlines and online portals, which encourage consumers to disclose their encounters with dark patterns. This strategy would assist agencies in pinpointing potential enforcement targets and foster a sense of collective responsibility among consumers to counteract deceptive practices.
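By way of illustration of the detection tooling envisaged in point 2, the sketch below is a deliberately simple rule-based scanner that flags interface copy matching two well-documented dark patterns (false urgency and confirmshaming). The rules are illustrative assumptions; agency-grade tooling would combine such heuristics with machine-learning classifiers and crawling infrastructure.

```python
import re

# Illustrative heuristics only: phrases commonly associated with two
# well-documented dark patterns.
RULES = {
    "false urgency": re.compile(r"\b(only \d+ left|offer ends in|hurry)\b", re.IGNORECASE),
    "confirmshaming": re.compile(r"no thanks, i (hate|don't want)", re.IGNORECASE),
}

def scan(ui_text: str) -> list[str]:
    """Return the name of every rule the supplied interface copy triggers."""
    return [name for name, pattern in RULES.items() if pattern.search(ui_text)]

page = "Hurry! Only 2 left in stock. [Subscribe] [No thanks, I hate saving money]"
print(scan(page))  # ['false urgency', 'confirmshaming']
```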

Prioritising consumer protection enforcement is essential to combat dark patterns effectively and safeguard consumers' rights within the digital marketplace. By reinforcing coordination among consumer protection agencies, allocating resources to advanced detection and monitoring tools, escalating penalties and sanctions, instituting a whistle-blower mechanism, and raising public awareness, the EU can ensure a robust and proactive approach to enforcement that evolves in tandem with the dynamic landscape of dark patterns. While personalised choice architectures adhering to GDPR and UCPD requirements may not contravene existing regulations, they still have the potential to exploit users' vulnerabilities and biases. Consequently, the EU must adapt its consumer protection enforcement strategies to capture such practices, fortify transparency and consent requirements, advocate ethical design principles, and enhance consumer awareness.

Final Thoughts

The EU's approach to regulating dark patterns within the evolving European digital policy agenda has thus far been proactive and responsive to emerging threats. However, the EU must continue to update and refine its legal framework to ensure it remains future-proof and can tackle the ever-changing landscape of dark patterns. By staying vigilant and adapting to new developments, the EU can protect consumers from deceptive practices and maintain trust in the digital marketplace.

To achieve this goal, the EU should consider the following measures:

  • Regularly review and update the UCPD to ensure it captures emerging dark patterns and deceptive practices. This includes expanding the directive's scope to address new forms of manipulation and strengthening the enforcement powers of consumer protection agencies.
  • Enhance collaboration between consumer protection agencies, the technology industry, and the AI research community. This collaboration can help develop shared ethical guidelines, technical safeguards, and best practices for responsible design while improving the detection and monitoring of dark patterns.
  • Increase public awareness and education on dark patterns and their potential harm. Empowering consumers with knowledge and tools to recognise and report dark patterns will strengthen enforcement efforts and discourage companies from employing such deceptive practices.
  • Encourage the development of industry standards and self-regulation initiatives to promote responsible design and discourage the use of dark patterns. By creating a culture of ethical design and transparency, companies can demonstrate their commitment to protecting consumer rights and building trust in the digital ecosystem.
  • Monitor global consumer protection and digital policy developments, ensuring the EU remains at the forefront of tackling dark patterns and maintaining a safe and fair digital marketplace for its citizens.

By taking these proactive steps, the EU can ensure that its consumer protection legislation remains future-proof and effective in the face of emerging dark patterns. As technology and the digital landscape evolve, a comprehensive and adaptable regulatory approach will be essential to preserving user autonomy, safeguarding privacy, and fostering a trustworthy digital environment for consumers and businesses.

ANNEX I: Design Assessment Framework and the Pursuit of Digital Equity

Given the escalating demand for user-centric design, formulating a 'Design Assessment' framework analogous to Data Protection Impact Assessments (DPIAs) conducted before processing personal data is warranted. This framework would ensure digital products and services are ethically designed, transparent, and user-oriented whilst minimising potential adverse impacts on users. The primary objective of the Design Assessment is the systematic identification, evaluation, and mitigation of potential risks and issues associated with the design and development of digital products and services. By undertaking a Design Assessment, designers can ensure compliance with regulatory principles, digital equity, user-centred design, and adherence to pertinent laws and regulations.

Essential Components of the Design Assessment

a) Scope and Context: Designers ought to delineate the scope and context of the digital product or service, encompassing target audiences, intended use, and any specific features or functions that may raise ethical, legal, or practical concerns.

b) User-centred Design Principles: Designers should ascertain how the product or service adheres to user-centred design principles, such as accessibility, inclusivity, transparency, and meaningful user engagement.

c) Regulatory Compliance: Designers must demonstrate how the product or service aligns with the regulatory framework by showcasing its modular, scalable, and consistent design and explicating its adherence to regulatory requirements.

d) Risk and Issue Identification: Designers should discern potential risks and issues associated with the design and development of the product or service, including ethical concerns, privacy implications, and potential for dark patterns or manipulative practices.

e) Risk Mitigation and Management Strategies: Designers should devise strategies to mitigate and manage identified risks and issues, such as implementing design changes, incorporating user feedback, or conducting additional testing and evaluation.

f) Stakeholder Involvement: Designers should engage relevant stakeholders, including users, regulators, and other affected parties, in the Design Assessment process to ensure diverse perspectives are considered and potential concerns are adequately addressed.

g) Documentation and Review: Designers should comprehensively document the Design Assessment process and its outcomes and periodically review and update the assessment to reflect any changes in the product or service, user feedback, or the regulatory landscape.

To effectively implement the Design Assessment framework, designers should adhere to these steps:

Step 1: Plan and prepare the Design Assessment by defining its scope, context, and objectives.

Step 2: Conduct a thorough analysis of the digital product or service, focusing on user-centred design principles, regulatory compliance, and identifying potential risks and issues.

Step 3: Develop and implement mitigation and management strategies to address the identified risks and issues.

Step 4: Engage stakeholders throughout the Design Assessment process to ensure diverse perspectives and concerns are considered.

Step 5: Document the Design Assessment process and outcomes and establish a process for regular review and updates.
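A minimal sketch of how components (a) to (g) and Steps 1 to 5 might be operationalised as a structured, auditable record is given below. All class and field names are hypothetical; the substantive point is that recording risks (d) and mitigations (e) in a single artefact makes gaps mechanically checkable during review.

```python
from dataclasses import dataclass, field

@dataclass
class DesignAssessment:
    scope_and_context: str = ""                                        # component (a)
    user_centred_principles: list[str] = field(default_factory=list)   # component (b)
    regulatory_compliance_notes: str = ""                              # component (c)
    identified_risks: list[str] = field(default_factory=list)          # component (d)
    mitigations: dict[str, str] = field(default_factory=dict)          # component (e): risk -> strategy
    stakeholders_consulted: list[str] = field(default_factory=list)    # component (f)
    last_reviewed: str = ""                                            # component (g), ISO date

    def unmitigated_risks(self) -> list[str]:
        """Risks recorded under (d) that still lack a strategy under (e)."""
        return [r for r in self.identified_risks if r not in self.mitigations]

assessment = DesignAssessment(
    scope_and_context="Checkout flow of a hypothetical retail app",
    identified_risks=["false-urgency banner", "pre-ticked consent box"],
    mitigations={"pre-ticked consent box": "all optional boxes default to unticked"},
    last_reviewed="2023-03-01",
)
print(assessment.unmitigated_risks())  # ['false-urgency banner']
```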

By integrating the Design Assessment framework into their design and development processes, designers can ensure that their digital products and services comply with regulations and contribute to a more equitable, user-centric digital environment.

A regulatory body would be responsible for ensuring that companies adhere to the guidelines in the framework and for enforcing penalties in the event of non-compliance. The regulatory framework should promote a more responsible and user-centric approach to designing and developing digital products and services. By standardising design practices, promoting transparency and accessibility, and empowering users to make informed decisions, the framework aims to foster digital equity and combat dark patterns and manipulative practices in the digital realm.

This framework is congruent with the European Commission's endeavours to promote digital equity and address gaps in existing regulations, such as the Directive on Unfair Commercial Practices. In its recent study on dark patterns and manipulative personalisation, the Commission acknowledged the growing recognition of the harmful effects of such practices on users and the need for regulatory action to ensure digital equity. In the context of the Digital Services Act, a design regulation framework offers a concrete approach to addressing the regulation of online advertising and dark patterns, which have been the subject of ongoing debates and discussions among policymakers. By providing a standardised and streamlined methodology for designing digital products and services, the framework can serve as a valuable tool for policymakers in developing regulations that promote digital equity and protect user rights. Overall, a design regulation framework should represent a step towards a more responsible and ethical approach to designing digital products and services, prioritising user needs and interests, and ensuring equal fairness offline and online.

The concept of 'digital fairness' is inextricably linked to our notion of a regulatory framework to ensure user-centred design, thereby preventing dark patterns across the entire "visibility spectrum."[4] As the European Commission opens public consultation on digital fairness, there is a growing recognition of the need for increased regulation in the digital realm to ensure users are treated fairly and not subjected to manipulative or harmful practices.

The Commission's study on possible directive gaps underscores the need to update existing legislation to better align with the online world. This includes addressing the dark patterns and manipulative personalisation used to coerce Internet users into actions against their will or interest. Design regulation could serve as a framework for addressing these issues by ensuring that digital products are designed with a user-centric focus and that harmful or manipulative practices are averted. The idea of a Digital Fairness Act further signals the growing recognition of the need for greater regulation in the digital realm. By regulating online advertising and dark patterns, policymakers can help to prevent harmful or manipulative practices and ensure that users are treated fairly. Regulation mandating user-centred design would, in turn, ensure that users are treated equitably. With the proper regulatory framework in place, it is possible to create a digital environment that is fair, transparent, and user-centric.



[1] Professor of Digital, Legal, and Platform Regulation at VU-Amsterdam (m.r.leiser@vu.nl)

[2] European Commission, Directorate-General for Justice and Consumers, Francisco Lupiáñez-Villanueva and others, Behavioural Study on Unfair Commercial Practices in the Digital Environment: Dark Patterns and Manipulative Personalisation: Final Report (Publications Office of the European Union 2022) https://data.europa.eu/doi/10.2838/859030 accessed 15 March 2023.

[3] FTC, Bringing Dark Patterns to Light (2022) https://www.ftc.gov/system/files/documents/reports/bringing-dark-patterns-to-light/bringing_dark_patterns_to_light.pdf accessed 16 March 2023.

[4] Leiser and Santos “Enforcement of Dark Patterns Regulation across the Visibility Spectrum”, (forthcoming).