A dark pattern is a deliberate design choice that nudges a user toward the outcome the product wants, not the outcome the user wants. The distinction matters: ordinary bad design is usually accidental, fixable, and equally inconvenient for the company. A dark pattern is intentional and asymmetric — friction is added to the path the user prefers and removed from the path the product prefers.
This guide is a working catalog of the patterns most worth recognizing. Recognition is the goal here. Once you can name the pattern, you stop falling for it as easily.
Confusing defaults
The default state of a checkbox, toggle, or radio button does most of the work. Most users do not change defaults.
Patterns to watch for:
- Pre-checked boxes for "share data with partners" or "marketing emails."
- Default privacy settings set to the most permissive option, with stricter options buried.
- "Recommended" choices highlighted in a way that implies they are the safe option, when they are actually the option that benefits the product.
Test: ask whether the default works in the user's interest or the product's. If those align, the default is fine. If they diverge, the default is doing the talking.
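The alignment test can be sketched as a small audit script. This is a hypothetical illustration, not a real tool: the setting names and values below are invented, and "user-preferred" is stipulated per setting for the sake of the example.

```python
# Hypothetical audit of defaults: flag any setting whose default value
# diverges from the value a user would choose for themselves.
# All names and values here are invented for illustration.

SETTINGS = [
    # (name, default_value, user_preferred_value)
    ("share_data_with_partners", True,  False),
    ("marketing_emails",         True,  False),
    ("two_factor_auth",          False, False),  # aligned: nothing to flag
    ("public_profile",           True,  False),
]

def audit(settings):
    """Return the settings whose default diverges from the user-favorable value."""
    return [name for name, default, preferred in settings if default != preferred]

print(audit(SETTINGS))
```

Every name the audit prints is a default "doing the talking": a choice made for the user, in the product's favor, that most users will never revisit.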
Forced continuity
Forced continuity is the gap between "easy to start" and "hard to stop." A free trial that requires a card, a one-click signup followed by a multi-step cancellation, a subscription that auto-renews silently with no email reminder.
Indicators:
- Cancellation is buried under three or more menu levels.
- Cancellation requires a phone call, email, or chat session when signup did not.
- Renewal happens with no advance warning, then becomes "non-refundable" the moment it processes.
Forced continuity is often the most expensive dark pattern in real money.
Hidden opt-outs
Opt-outs that exist on paper but are hidden in practice. The setting exists, the regulator is satisfied, the user never finds it.
Common forms:
- Privacy controls split across five or six sub-pages with no overview.
- Settings labeled in product jargon rather than user language ("Personalization signals" instead of "Tracking").
- A toggle that turns off one specific use of your data while four related uses remain on by default.
If you cannot find the off switch in under thirty seconds, that is the design.
Misleading button hierarchy
The visual weight of a button communicates more than its label. A prominent colored button with a quiet gray link underneath sends a clear message about which action the product wants you to take.
Patterns to recognize:
- "Accept all" rendered as a bright primary button while "Reject all" or "Manage preferences" is a dim text link.
- Confirmation dialogs where the destructive option is visually styled as the safe one.
- Skip or dismiss buttons styled to look like cancel actions, when the primary CTA is the one that opts you in to something you did not ask for.
Read the labels, not the colors.
Consent fatigue
Consent fatigue is the cumulative effect of too many decisions. After the fifth cookie banner, a user clicks "Accept all" because every prompt feels identical and the cost of reading them is too high.
This is exploited deliberately:
- Consent flows that present dozens of partner toggles, nominally off by default but buried in a wall of text the user will not read, so "Accept all" becomes the path of least resistance.
- Re-prompting after a few weeks if the user previously declined.
- Region-specific banners that reset on every visit.
The most practical defense is browser-level controls and habits — not in-the-moment vigilance.
Notification pressure
Once an app has notification permission, it has direct access to your attention. Many apps treat this as a marketing channel rather than a signal channel.
Patterns:
- Engagement notifications styled to look like real activity ("Someone is looking at your profile!" with no specifics).
- Notifications that re-route into in-app feeds rather than the action they imply.
- Notification settings split across "system" and "in-app" toggles, so turning off one leaves the other on.
If you find yourself opening an app because of a vague nudge that turned out to be nothing, that was the point.
Dark patterns vs. ordinary bad design
Bad design is universal. Dark patterns are asymmetric. The simplest test:
- Bad design hurts both the user and the company.
- Dark patterns hurt the user and benefit the company.
A confusing settings page may be either. If the confusing path always ends in a more permissive default or a paid upgrade, it is no longer an accident.
What to check before clicking accept
A short checklist to apply to any consent prompt, signup flow, or permission request:
- Is the default the user-favorable option, or the product-favorable option?
- Is "Reject" or "Decline" given equal visual weight to "Accept"?
- Are pre-checked boxes hidden behind expanding sections?
- Is the cancellation path symmetric with the signup path?
- Are the toggles labeled in plain language, not product jargon?
- If you decline, will the same prompt return next session?
- Does the permission request match what the feature actually needs?
If multiple answers favor the product, treat the prompt with skepticism and look for the quieter alternative.
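The checklist lends itself to a quick scoring sketch. The question names, threshold, and example answers below are invented for illustration; the point is only that two or more product-favoring answers is enough to warrant the quieter alternative.

```python
# Hypothetical scoring of a consent prompt against the checklist above.
# Each answer is True when it favors the user; names are illustrative only.

CHECKLIST = [
    "default_favors_user",
    "reject_has_equal_weight",
    "no_hidden_prechecked_boxes",
    "cancellation_symmetric_with_signup",
    "plain_language_labels",
    "decline_is_remembered",
    "permission_matches_feature",
]

def review(answers):
    """Collect product-favoring answers; missing answers count against the prompt."""
    red_flags = [q for q in CHECKLIST if not answers.get(q, False)]
    if len(red_flags) >= 2:
        return "skeptical: " + ", ".join(red_flags)
    return "ok"

# Example: a cookie banner with a bright "Accept all" and a buried "Reject".
print(review({
    "default_favors_user": False,
    "reject_has_equal_weight": False,
    "no_hidden_prechecked_boxes": True,
    "cancellation_symmetric_with_signup": True,
    "plain_language_labels": True,
    "decline_is_remembered": True,
    "permission_matches_feature": True,
}))
```

Treating an unanswered question as a red flag is deliberate: if you cannot tell whether the decline will be remembered, assume it will not be.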
Summary
Dark patterns are not a moral category; they are design patterns with a measurable effect on outcomes. Calling them out is not about outrage, but about recognition. A user who can name the pattern in the moment makes a different choice. Multiply that across enough users and the incentive to ship the pattern starts to weaken. That is the long game.