
Building for hate: Designing for deception
Understand the difference between persuasive and manipulative design in user interfaces, and the consequences of these design decisions.
Creating a new product begins with crafting the "happy path" – the ideal user journey. Designers often approach product development with intentionality and empathy, advocating for users. This is the foundation of user-centric design. Sadly, not all products and companies prioritize the greater good; some deliberately build features that work against their users. To avoid causing harm, it's crucial to understand how certain design choices – especially when used with malicious intent – can exploit user behaviour.
Back in 1995, Dr. Kimberly Young warned of the addictive potential of computers. She was one of the first to address technology addiction; 30 years later, here we are. While memes about digital dependence are widespread, Internet Addiction Disorder (IAD) is a serious behavioural condition, comparable to a gambling disorder, and it can severely affect someone’s well-being.
Design patterns – reusable solutions to common UI needs – shape user behaviour and set expectations based on shared standards. For example, tapping a Next button intuitively brings users to the following step of a flow. Now imagine if every Next button reacted differently. These patterns exist to make digital experiences easier – but they can also be engineered to keep users hooked.
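To make that predictability concrete, here is a minimal TypeScript sketch – all names hypothetical – of a flow where the Next action behaves the same way every time, which is exactly the consistency a good pattern provides:

```typescript
// A tiny wizard whose "next" action always advances exactly one step –
// the kind of predictable behaviour users learn to rely on.
type Step = "account" | "profile" | "confirm" | "done";

const ORDER: Step[] = ["account", "profile", "confirm", "done"];

function next(current: Step): Step {
  const i = ORDER.indexOf(current);
  // The consistent rule: move one step forward, stop at the end.
  return ORDER[Math.min(i + 1, ORDER.length - 1)];
}

console.log(next("account")); // "profile" – never a surprise
```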
When a pattern prioritizes keeping users in the tool over their well-being, it becomes poisonous. If users fully understood a product’s intentions, would they engage with it differently? Design patterns should support ease of use and clarity; the line between persuasive and manipulative lies in intent. A pattern can persuade for mutual benefit – or manipulate for unilateral gain. When patterns steer users toward actions that primarily benefit the company, they are called deceptive patterns.
The Rise of Deceptive Patterns
Deceptive patterns are subtle ‘tricks’ that mislead users into decisions they might not make knowingly. During the COVID-19 pandemic, many digital products adopted unfair mechanisms to boost profits; thankfully, awareness of those patterns grew in the same period, along with pushback against manipulative design. Digital products often collect demographic and behavioural data to understand users and improve the experience. While some data collection is harmless, ethical concerns arise around the intent behind the collection and how the data is managed.
A study on the impact of Facebook posts shows that exposure to negative content leads to more negative posts and worsens users' moods, while exposure to positive content generates more optimistic posts and lifts them. This indicates that social media platforms can significantly influence users' emotions, behaviour, and even voting decisions.
A 2024 study by the University of South Australia examined 240 popular mobile applications and found that 95% employed at least one deceptive design pattern. Most commonly, these apps used strategies that trick users into sharing personal data or spending money unnecessarily.
In 2023, the US Federal Trade Commission (FTC) ordered Epic Games to pay a record $245 million as part of a settlement over deceptive design: Fortnite steered users, many of them minors, into unintended purchases. The FTC has also filed a lawsuit against Amazon for employing similar tactics on its website.
The combination of multiple deceptive patterns drives the so-called enshittification of digital services – the way platforms degrade over time. Products often start with user-focused intent but slowly shift toward exploitative, profit-driven models.
Common Types of Deceptive Patterns
- Rewards. Notifications about new likes, exciting matches, and fresh lives to play create a feeling of winning – a quick hit of dopamine. That sensation of pleasure can easily become addictive.
- Scarcity and urgency. Few phrases are harder to resist than “only today!” or “expires in 5 minutes!”. Countdown pressure triggers FOMO (fear of missing out), pushing users to act before thinking and leaving no room to question the decision (see the sketch after this list).
- Trick questions. There is poorly written copy, and there is evil copy. Questions are tailored to mislead and confuse users, subtly guiding them to answer as the application wants. At a glance, a question seems to ask one thing; read carefully, and it asks another thing entirely.
- Confirm shaming. Guilt-inducing language, such as “Sad to see you go” or “You’ll lose your exclusive benefits”, aims to shame users into staying on the platform or taking a specific action.
- Hide and seek. Critical settings – like privacy controls – are buried under layers of taps, screens, and confusing menus, making them difficult to find or change.
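To show how thin the line is between honest and fabricated urgency, here is a hedged TypeScript sketch (function names are hypothetical) contrasting a countdown invented out of thin air with one tied to a real deadline:

```typescript
// Deceptive version: the "deadline" is fabricated, so the countdown
// simply restarts for every visitor on every page load.
function fakeCountdownSeconds(): number {
  return 5 * 60; // always "5 minutes left", regardless of any real offer
}

// Honest version: the time left is derived from a genuine end date.
function honestCountdownSeconds(offerEndsAt: Date): number {
  const remaining = (offerEndsAt.getTime() - Date.now()) / 1000;
  return Math.max(0, Math.floor(remaining)); // can legitimately reach zero
}

console.log(fakeCountdownSeconds());                         // 300, forever
console.log(honestCountdownSeconds(new Date("2026-01-01"))); // shrinks for real
```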
The Role of GDPR and Privacy
Manipulative design often clashes with user privacy laws. Adopted by the European Union in 2016 and enforced since 2018, the GDPR – General Data Protection Regulation – sets the legal standard for how companies handle data protection and privacy rights. It is considered one of the strictest laws of its kind and an essential component of EU privacy and human rights law.
Countries around the globe are following the EU and creating their own regulations. Yet deceptive patterns exploit grey areas in GDPR compliance: a 2020 study revealed that only 11.8% of cookie consent banners on websites meet the GDPR's minimum legal requirements.
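As a rough illustration of what that baseline implies, here is a hedged TypeScript sketch (types and names hypothetical) of consent state aimed at the GDPR's core requirements: granular purposes, nothing pre-ticked, and rejecting as easy as accepting:

```typescript
// Consent must be opt-in: non-essential purposes default to "off".
interface ConsentState {
  necessary: true;     // strictly necessary cookies need no consent
  analytics: boolean;  // opt-in only
  marketing: boolean;  // opt-in only
}

const defaultConsent: ConsentState = {
  necessary: true,
  analytics: false, // never pre-checked
  marketing: false, // never pre-checked
};

function acceptAll(): ConsentState {
  return { necessary: true, analytics: true, marketing: true };
}

// One action, with the same prominence and effort as acceptAll –
// no hide and seek behind extra screens.
function rejectAll(): ConsentState {
  return { ...defaultConsent };
}

console.log(rejectAll()); // { necessary: true, analytics: false, marketing: false }
```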
Toward Ethical Design
To fight back, the European Commission is preparing the Digital Fairness Act, a legislative proposal targeting deceptive patterns, personalisation, contracts, and influencer marketing. A 12-week public consultation is set to begin in spring 2025, with the Act expected to be proposed in 2026.
Ethical design starts with transparency, clarity, and respect. Language should be simple, intentions transparent, and user needs central. It prioritizes users by balancing business goals with a commitment to improving lives. There is no well-defined metric for whether a pattern is helpful or harmful, but developers should build on ethical principles and make choices that benefit people and society. This brings us back to Dr. Young's early warnings: some apps may cause harm unknowingly, but others are fully conscious of their addictive design. In a deeply connected era, creating for well-being seems more imperative than ever.
Digital products shape how people feel, think, and behave. When built with empathy and ethical responsibility, they can foster healthier relationships between humans and technology. But when design choices prioritise engagement metrics over human needs, the consequences can be profound – from diminished well-being to compromised trust.
Design is never neutral. Every interface, every flow, every notification carries intent. Developers aren’t just shaping user experiences; they’re shaping human experiences.
As regulations like the GDPR mature and new rules, such as the Digital Fairness Act, emerge, the industry will face pressure to justify how it earns attention and consent. But legal compliance alone isn’t the goal. True progress means building products that people can trust – products that respect autonomy, support clarity, and enhance lives.
The most sustainable path forward is one that prioritises human dignity. Because good design isn’t just what works – it’s what’s right.
This is an evolving topic – one that merges ethics, design, and policy. As legislation evolves to protect users from exploitative patterns, developers must stay informed and critically engaged. For those interested in digging deeper, Deceptive Design's reading list offers an extensive collection of resources on persuasive and manipulative design – and the real-world consequences of crossing that line.
