DG Justice and Consumers, European Commission
Rue Montoyer 59
1000 Brussels
17 March 2023
Re: Digital Fairness - Fitness Check on EU Consumer Law
Dark Patterns, Consumer Protection and Digital Fairness:
A roadmap to illuminating user-centric design
By Dr Mark Leiser[1]
As contemporary society grows increasingly reliant on technological advancements and the digital sphere, it is incumbent upon legal regimes to evolve concomitantly to safeguard consumer interests. In recent years, a phenomenon warranting particular attention has been the emergence of "dark patterns": manipulative and deceptive user interface designs aimed at surreptitiously guiding users' behaviour and impinging upon their autonomy and control. This submission assesses the extent to which extant and forthcoming legislation is equipped to address this pernicious practice through the prism of the digital fairness review.
I. Enforcement Actions by Consumer Protection Agencies
The European Commission has studied dark patterns and manipulative practices: interfaces designed to push consumers into making choices that may not be in their best interest.[2] The Federal Trade Commission (FTC) released a report in 2022 showing how companies increasingly use sophisticated design practices, known as "dark patterns", that can trick or manipulate consumers into buying products or services or giving up their privacy.[3] Within the EU, consumer protection authorities have adopted various enforcement measures targeting companies that employ dark patterns and deceptive design techniques in contravention of the Unfair Commercial Practices Directive (UCPD). The UCPD's objective is to safeguard consumers from unfair and deceptive commercial practices, including those that exploit or manipulate consumer behaviour through dark patterns. Regulatory actions instituted by consumer protection agencies range from imposing fines and injunctions to filing legal complaints. These enforcement measures deter businesses from utilising dark patterns: misleading or coercive methods that erode user autonomy and control and give rise to harms such as forced continuity, privacy Zuckering, and compelled action.
Personalised choice architectures employ personal data, such as browsing history, demographic information, and preferences, to customise user experiences and shape decision-making. By presenting options that capitalise on cognitive biases, these designs can subtly sway users towards choices that advantage the company, such as selecting a costlier product, committing to a subscription, or disclosing personal information. For instance, the Dutch Authority for Consumers and Markets (ACM) has sanctioned a company for inadequate transparency regarding personalised pricing. Other e-commerce platforms might use a customer's purchase history to feature higher-priced items prominently, aware that the customer has a propensity for buying expensive products. While this may not overtly contravene the GDPR or the UCPD, it raises ethical questions about exploiting users' vulnerabilities and biases for commercial profit.
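To make this steering mechanism concrete, the following is a minimal, purely illustrative sketch in Python. The product data, spend threshold, and ranking rule are all hypothetical and are not drawn from any real platform's code; the point is only to show how trivially purchase history can be turned into a biased choice architecture.

```python
# Purely illustrative sketch: hypothetical data, threshold, and ranking rule.
# It shows how a personalised choice architecture could re-rank a catalogue so
# that users with a history of expensive purchases see pricier items first.

from dataclasses import dataclass


@dataclass
class Product:
    name: str
    price: float


def rank_products(products: list[Product], avg_past_spend: float) -> list[Product]:
    """Steer big spenders towards costlier options by ranking on price."""
    if avg_past_spend > 100:  # hypothetical "big spender" threshold
        return sorted(products, key=lambda p: p.price, reverse=True)
    return sorted(products, key=lambda p: p.price)


catalogue = [Product("Basic kettle", 25.0), Product("Designer kettle", 180.0)]
print([p.name for p in rank_products(catalogue, avg_past_spend=250.0)])
# -> ['Designer kettle', 'Basic kettle']
```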
Innovative forms of dark patterns emerge as technological advancements and user experience design evolve. This section critically analyses the European Union's preparedness to address these nascent challenges by scrutinising four categories of developing dark patterns, outlining the opportunities, risks, and regulatory challenges that each presents for consumers.
Opportunities for Improved Consumer Experiences:
1.1 Enhanced Personalisation: AI-powered communication can offer consumers more personalised experiences by tailoring information to their needs, preferences, and interests. This could make terms and conditions, privacy policies, and other disclosures more engaging and easier to understand, ultimately leading to more informed decision-making.
1.2 Streamlined Interactions: AI-driven systems like ChatGPT-4 can improve the efficiency and effectiveness of consumer interactions with platforms, simplifying processes and saving time. For instance, AI could help users navigate complex legal documents, quickly locate relevant sections, and obtain concise explanations of key provisions.
1.3 Dynamic Adaptation: Large language models can adapt to changes in user behaviour and preferences in real time, allowing platforms to serve their customers better and address their concerns. This can lead to more effective communication and greater satisfaction among consumers.
Risks of Deceptive and Manipulative Practices:
2.1 Deceptive Practices: AI-generated content can create new dark patterns that manipulate users' behaviour and exploit their vulnerabilities. For instance, AI-powered systems could provide misleading or biased information, selectively present terms and conditions, or pressure users into decisions that run against their best interests.
2.2 Lack of Transparency: AI-driven systems may fail to make explicit the distinction between human-generated and AI-generated content, making it difficult for consumers to assess the credibility and reliability of the information they receive. This opacity can undermine trust in platforms and services.
2.3 Privacy Concerns: Using AI to personalise experiences may require collecting and processing large amounts of personal data, raising concerns about privacy and data protection. This can lead to potential violations of GDPR and other privacy regulations.
Challenges and Regulatory Response:
3.1 Clear Disclosure Requirements: Companies employing human avatars should be required to disclose the artificial nature of these avatars and their commercial intent. This can be achieved through visual cues, disclaimers, or other means that make it apparent to users that they are interacting with a non-human entity; a minimal compliance-check sketch follows this list.
3.2 Guidelines for Responsible Avatar Design: The EU can develop guidelines for the ethical and responsible design of human avatars, outlining best practices and principles that companies should adhere to when developing and deploying these technologies.
3.3 Enhanced Monitoring and Enforcement: Consumer protection agencies should be equipped with the necessary tools and resources to monitor the use of human avatars and ensure compliance with disclosure and design requirements.
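By way of illustration, a regulator or platform could operationalise the disclosure requirement in 3.1 with a simple automated check. The sketch below assumes a hypothetical message format and disclosure wording; neither reflects any existing standard or enforcement tool.

```python
# Minimal sketch of an automated disclosure check; the message format and
# disclosure wording are hypothetical, not drawn from any existing standard.

DISCLOSURE_TEXT = "You are chatting with an automated assistant."


def message_is_compliant(message: dict) -> bool:
    """A message passes only if it is flagged as automated and carries the
    user-visible disclosure wording."""
    return (message.get("is_automated_flagged") is True
            and DISCLOSURE_TEXT in message.get("body", ""))


compliant = {"is_automated_flagged": True,
             "body": "You are chatting with an automated assistant. How can I help?"}
undisclosed = {"is_automated_flagged": False, "body": "Hi, I'm Anna from support!"}

print(message_is_compliant(compliant))    # True
print(message_is_compliant(undisclosed))  # False
```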
Addressing the Challenges:
4.1 Develop Ethical Guidelines and Technical Safeguards: The EU should work closely with the AI community to develop ethical guidelines and technical safeguards that promote responsible design and discourage the use of dark patterns. This may include incorporating fairness, accountability, and transparency principles into AI systems to prevent deceptive practices and uphold consumer rights.
4.2 Invest in Research and Development: The EU should invest in research and development to improve its ability to identify and monitor AI-generated dark patterns. This may involve developing advanced detection and monitoring tools and exploring new methods for analysing and interpreting AI-generated content; a toy illustration of such a detection heuristic follows this list.
4.3 Strengthen Regulatory Frameworks: The EU should consider updating its regulatory frameworks to address the unique challenges AI-driven systems pose. This may include creating new disclosure requirements for AI-generated content, particularly in consumer settings like AI-powered communications with customers, enhancing data protection rules, and providing more explicit guidance on applying existing consumer protection laws to AI-generated dark patterns.
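To illustrate the kind of detection tooling envisaged in 4.2, the sketch below flags common pressure-language cues in visible interface text. The cue patterns and category names are hypothetical simplifications; production-grade monitoring would require far richer analysis of page structure and behaviour.

```python
# Toy illustration of detection tooling: flags well-documented pressure-language
# cues in interface text. Patterns and labels are hypothetical simplifications.

import re

CUES = {
    "false scarcity": re.compile(r"only \d+ left", re.IGNORECASE),
    "countdown pressure": re.compile(r"offer ends in \d+:\d+", re.IGNORECASE),
    "confirmshaming": re.compile(r"no thanks, i (don't|do not) want", re.IGNORECASE),
}


def flag_dark_pattern_cues(page_text: str) -> list[str]:
    """Return the names of any cue families whose patterns match the text."""
    return [name for name, pattern in CUES.items() if pattern.search(page_text)]


sample = "Hurry! Only 2 left in stock. Offer ends in 09:59. [No thanks, I don't want savings]"
print(flag_dark_pattern_cues(sample))
# -> ['false scarcity', 'countdown pressure', 'confirmshaming']
```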
By proactively addressing the opportunities and challenges presented by AI-powered systems like ChatGPT-4, the EU can ensure that consumer protection remains a priority in the rapidly evolving digital landscape. This will involve balancing the potential of AI to improve consumer experiences with the need to safeguard against the risks associated with deceptive practices and privacy violations.
Maintaining the Efficacy of Consumer Protection Enforcement
To preserve the efficacy of consumer protection enforcement amid the emergence of dark patterns, the European Union (EU) should contemplate the following initiatives: reinforcing coordination among consumer protection agencies; allocating resources to advanced detection and monitoring tools; escalating penalties and sanctions; instituting a whistle-blower mechanism; and raising public awareness.
Prioritising consumer protection enforcement in this way is essential to combat dark patterns effectively and safeguard consumers' rights within the digital marketplace. Together, these initiatives would give the EU a robust and proactive approach to enforcement that evolves in tandem with the dynamic landscape of dark patterns. While personalised choice architectures that adhere to GDPR and UCPD requirements may not contravene existing regulations, they can still exploit users' vulnerabilities and biases. Consequently, the EU must adapt its consumer protection enforcement strategies to capture such practices, fortify transparency and consent requirements, advocate ethical design principles, and enhance consumer awareness.
Final Thoughts
The EU's approach to regulating dark patterns within the evolving European digital policy agenda has thus far been proactive and responsive to emerging threats. However, the EU must continue to update and refine its legal framework to ensure it remains future-proof and can tackle the ever-changing landscape of dark patterns. By staying vigilant and adapting to new developments, the EU can protect consumers from deceptive practices and maintain trust in the digital marketplace.
To achieve this goal, the EU should pursue the measures outlined above: developing ethical guidelines and technical safeguards, investing in research on detection and monitoring, and strengthening its regulatory frameworks.
By taking these proactive steps, the EU can ensure that its consumer protection legislation remains future-proof and effective in the face of emerging dark patterns. As technology and the digital landscape evolve, a comprehensive and adaptable regulatory approach will be essential to preserving user autonomy, safeguarding privacy, and fostering a trustworthy digital environment for consumers and businesses.
ANNEX I: Design Assessment Framework and the Pursuit of Digital Equity
Given the escalating demand for user-centric design, formulating a 'Design Assessment' framework analogous to the Data Protection Impact Assessments (DPIAs) conducted before processing personal data is warranted. This framework would ensure digital products and services are ethically designed, transparent, and user-oriented whilst minimising potential adverse impacts on users. The primary objective of the Design Assessment is the systematic identification, evaluation, and mitigation of potential risks and issues associated with the design and development of digital products and services. By undertaking a Design Assessment, designers can demonstrate compliance with regulatory principles, promote digital equity and user-centred design, and ensure adherence to pertinent laws and regulations.
Essential Components of the Design Assessment
a) Scope and Context: Designers ought to delineate the scope and context of the digital product or service, encompassing target audiences, intended use, and any specific features or functions that may raise ethical, legal, or practical concerns.
b) User-centred Design Principles: Designers should ascertain how the product or service adheres to user-centred design principles, such as accessibility, inclusivity, transparency, and meaningful user engagement.
c) Regulatory Compliance: Designers must demonstrate how the product or service aligns with the regulatory framework by showcasing its modular, scalable, and consistent design and explaining its adherence to regulatory requirements.
d) Risk and Issue Identification: Designers should discern potential risks and issues associated with the design and development of the product or service, including ethical concerns, privacy implications, and potential for dark patterns or manipulative practices.
e) Risk Mitigation and Management Strategies: Designers should devise strategies to mitigate and manage identified risks and issues, such as implementing design changes, incorporating user feedback, or conducting additional testing and evaluation.
f) Stakeholder Involvement: Designers should engage relevant stakeholders, including users, regulators, and other affected parties, in the Design Assessment process to ensure diverse perspectives are considered and potential concerns are adequately addressed.
g) Documentation and Review: Designers should comprehensively document the Design Assessment process and its outcomes and periodically review and update the assessment to reflect any changes in the product or service, user feedback, or the regulatory landscape.
To implement the Design Assessment framework effectively, designers should adhere to these steps (a minimal illustrative sketch follows the list):
Step 1: Plan and prepare the Design Assessment by defining its scope, context, and objectives.
Step 2: Conduct a thorough analysis of the digital product or service, focusing on user-centred design principles, regulatory compliance, and identifying potential risks and issues.
Step 3: Develop and implement mitigation and management strategies to address the identified risks and issues.
Step 4: Engage stakeholders throughout the Design Assessment process to ensure diverse perspectives and concerns are considered.
Step 5: Document the Design Assessment process and outcomes and establish a process for regular review and updates.
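As a minimal sketch of how Steps 1 to 5 might be recorded in practice, the following models a Design Assessment as a structured record with a basic completeness check. The field names and example data are assumptions made for illustration, not a prescribed standard.

```python
# Minimal sketch (assumed structure, not a prescribed standard) of recording a
# Design Assessment per Steps 1-5 and checking that every risk has a strategy.

from dataclasses import dataclass, field


@dataclass
class DesignAssessment:
    scope: str                                                       # Step 1
    identified_risks: list[str] = field(default_factory=list)       # Step 2
    mitigations: dict[str, str] = field(default_factory=dict)       # Step 3: risk -> strategy
    stakeholders_consulted: list[str] = field(default_factory=list) # Step 4
    review_log: list[str] = field(default_factory=list)             # Step 5

    def unmitigated_risks(self) -> list[str]:
        """Risks identified in Step 2 with no Step 3 strategy recorded."""
        return [r for r in self.identified_risks if r not in self.mitigations]


assessment = DesignAssessment(
    scope="Subscription sign-up flow for a hypothetical streaming service",
    identified_risks=["forced continuity", "pre-ticked consent"],
    mitigations={"pre-ticked consent": "default all optional boxes to unticked"},
)
print(assessment.unmitigated_risks())  # -> ['forced continuity']
```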
By integrating the Design Assessment framework into their design and development processes, designers can ensure that their digital products and services comply with regulations and contribute to a more equitable, user-centric digital environment. A regulatory body would be responsible for ensuring that companies adhere to the guidelines in the framework and for enforcing penalties in the event of non-compliance. The regulatory framework should promote a more responsible and user-centric approach to designing and developing digital products and services. By standardising design practices, promoting transparency and accessibility, and empowering users to make informed decisions, the framework aims to foster digital equity and combat dark patterns and manipulative practices in the digital realm.
This framework is congruent with the European Commission's endeavours to promote digital equity and address gaps in existing regulations, such as the Unfair Commercial Practices Directive. In its recent study on dark patterns and manipulative personalisation, the Commission acknowledged the growing recognition of the harmful effects of such practices on users and the need for regulatory action to ensure digital equity. In the context of the Digital Services Act, a design regulation framework offers a concrete approach to regulating online advertising and dark patterns, which have been the subject of ongoing debate among policymakers. By providing a standardised and streamlined methodology for designing digital products and services, the framework can serve as a valuable tool for policymakers in developing regulations that promote digital equity and protect user rights. Overall, a design regulation framework represents a step towards a more responsible and ethical approach to designing digital products and services, prioritising user needs and interests and ensuring that fairness online matches fairness offline.
The concept of 'digital fairness' is inextricably linked to our notion of a regulatory framework to ensure user-centred design, thereby preventing dark patterns across the entire "visibility spectrum."[4] As the European Commission opens public consultation on digital fairness, there is a growing recognition of the need for increased regulation in the digital realm to ensure users are treated fairly and not subjected to manipulative or harmful practices.
The Commission's study on possible gaps in the directive underscores the need to update existing legislation to better align with the online world. This includes addressing dark patterns and manipulative personalisation used to coerce Internet users into actions against their will or interests. Design regulation could serve as a framework for addressing these issues by ensuring that digital products are designed with a user-centric focus and that harmful or manipulative practices are averted. The idea of a Digital Fairness Act further signals the growing recognition of the need for greater regulation in the digital realm. By regulating online advertising and dark patterns, policymakers can help prevent harmful or manipulative practices and ensure that users are treated fairly. Regulation mandating user-centred design would give effect to these aims and ensure that users are treated equitably. With the proper regulatory framework in place, it is possible to create a digital environment that is fair, transparent, and user-centric.
[1] Professor of Digital, Legal, and Platform Regulation at VU-Amsterdam (m.r.leiser@vu.nl)
[2] European Commission, Directorate-General for Justice and Consumers, Francisco Lupiáñez-Villanueva and others, Behavioural Study on Unfair Commercial Practices in the Digital Environment: Dark Patterns and Manipulative Personalisation: Final Report (Publications Office of the European Union 2022) https://data.europa.eu/doi/10.2838/859030 accessed 15 March 2023.
[3] FTC, Bringing Dark Patterns to Light (2022) https://www.ftc.gov/system/files/documents/reports/bringing-dark-patterns-to-light/bringing_dark_patterns_to_light.pdf accessed 16 March 2023.
[4] Leiser and Santos, "Enforcement of Dark Patterns Regulation across the Visibility Spectrum" (forthcoming).