On March 14, the European Data Protection Board (EDPB) published draft guidelines on “dark patterns.” The guidelines are intended to provide UX designers and consumers with the means to identify dark patterns—deceptive marketing and UX designs that violate the General Data Protection Regulation (GDPR).
More specifically, the EDPB’s guidelines define “dark patterns” as “interfaces and user experiences” implemented on social media platforms that influence users to make “unintended, unwilling and potentially harmful decisions regarding the processing of their personal data.”
The guidelines make clear that this dark pattern analysis is applicable to all organizations that present an online user interface which prompts any provision of personal data by a consumer—not just social media platform providers.
These dark patterns can be divided into the following categories:
- Overloading – presenting users with an overwhelming amount of “requests, information, options, or possibilities” in order to lead the user to share more personal data;
- Skipping – an interface or user experience that’s designed to induce users to forget or disregard data protection;
- Stirring – appealing to users’ emotions or utilizing “visual nudges” to induce the user to make choices that are out of the ordinary for them;
- Hindering – preventing users from becoming informed or properly managing their decision through acts of obstruction or use of blockades;
- Fickle – designing the interface to be inconsistent or unclear so that users are unable to navigate data protection processes; and
- Left in the Dark – designing the interface to hide information or data protection processes so that users are unsure what data is being processed, or what options they have for preventing that processing.
The EDPB guidelines are not themselves legally binding. However, on December 14, 2021, a European Parliament committee voted to ban “dark patterns” and advertising that targets minors. On April 23, 2022, the European Parliament and the Council of the European Union reached a political agreement on the proposed Digital Services Act (DSA), which will effectively implement this ban.
And although the GDPR does not explicitly address the use of “dark patterns,” Article 5(1)(a) requires that personal data be “processed lawfully, fairly and in a transparent manner.” Data controllers can use this Article to assess whether data was processed properly or in a way that suggests dark patterns, and whether a processing practice violated the GDPR’s requirements for transparency, data minimization, and accountability.
Further, several state privacy laws, such as the California Consumer Privacy Act (CCPA), restrict the collection, sale, or sharing of personal information obtained through improper means, and treat consent obtained through manipulative user interfaces as invalid.
Platform Provider Best Practices
Considering the increased regulatory attention to dark patterns in both the European Union and the United States, online platform providers should assess their interfaces and compliance obligations.
The guidelines make clear that many of the dark pattern categories rely on techniques designed to sidestep consumer privacy requirements, chiefly the obligation to obtain consumer consent before collecting personal data. Additionally, even where a platform provider has properly obtained consent to collect consumer data, the use of dark patterns by third parties with access to the platform may invalidate that consent.
Because public platform providers cannot protect against every bad actor who may attempt to improperly collect consumer personal data, the guidelines offer providers the following recommendations for mitigating the compliance risks of dark patterns:
- Avoid continuous prompting – avoid use of frequent pop-ups so that users do not experience fatigue and decreased vigilance as to what they are responding to;
- Reasoned preselection – unless there is a good reason, maintain “privacy-friendly settings” and avoid preselecting privacy settings on users’ behalf;
- Availability of privacy settings – provide users with a website that clearly presents data protection actions and information in an accessible manner—and that provides them with significant availability to update or change the settings initially selected; and
- Consistent Policy Application – ensure that privacy policies are consistent across all user-interfacing websites and applications.
* * * * * * *
To read our coverage of the Connecticut Data Privacy Act (CDPA), including information on its applicability, enforcement, and the obligations it places on businesses, click here.
For ADCG’s Breach Report and more News Updates discussing: the Global Cross-Border Privacy Rules Forum’s meeting last week to work toward an agreement on worldwide data protection rules; the Minnesota Senate Education Committee’s approval of new limits on how tech companies can use student data gathered through school-issued devices; the Bank for International Settlements (BIS) report calling for new governance systems; and lawmakers zeroing in further on Meta/Facebook, click here.
To browse through our previously published articles and news alerts, please visit our website, and don’t forget to subscribe to receive free weekly Data and Cyber Governance news and Breach Reports directly to your email.