Categorizing Dark Patterns

Yel Legaspi
5 min read · Jan 20, 2020


Illustration by absurd.design

I recently stumbled upon a study by Princeton analyzing ~53,000 pages from ~11,000 shopping websites to characterize and quantify implementations of dark patterns. It is one of the most extensive and detailed studies I’ve come across on the subject. It puts into perspective how prevalent the issue is, even pointing out that there are organized businesses selling interface designs that purposely deceive users (some of them openly).

We identified 22 third-party entities that provide shopping websites with the ability to create and implement dark patterns on their sites. Two of these entities openly advertised practices that enable deceptive messages.

Some context about Dark Patterns

Dark patterns are user interface designs whose main goal is to deceive users into an unintended action that benefits someone else, harms the user, or, at times, both.

A simple example of this would be a pop-up asking the user to take some action that shouldn’t be mandatory but cannot be dismissed. Here is an example from CrazyEgg (they have since removed it) tweeted by Michael Fienen @fienen.

To read more about this, there is plenty of information available, though it is scattered. There’s darkpatterns.org, a Wikipedia entry, a 2013 write-up from The Verge, and many more via a simple Google search.

Categorizing Dark Patterns

As I’ve mentioned, the study is quite extensive, and it is one of the first references I’ve read that properly defines and categorizes the different types of dark patterns. Here is the list from the study:

  1. Sneaking — Attempting to misrepresent user actions, or delay information that, if made available to users, they would likely object to.
  2. Urgency — Imposing a deadline on a sale or deal, thereby accelerating user decision-making and purchases.
  3. Misdirection — Using visuals, language, or emotion to steer users toward or away from making a particular choice.
  4. Social proof — Influencing users’ behavior by describing the experiences and behavior of other users.
  5. Scarcity — Signaling that a product is likely to become unavailable, thereby increasing its desirability to users.
  6. Obstruction — Making it easy for the user to get into one situation but hard to get out of it.
  7. Forced action — Forcing the user to do something tangential in order to complete their task.

Snippet from https://webtransparency.cs.princeton.edu/dark-patterns/

I’m pretty sure this list will still evolve, whether by rewording the terminology or by breaking a category down further.

Debatable Patterns

The other thought that popped into my head is how we may misjudge an implementation as a dark pattern when, in fact, it is both truthful and useful to the user. Take this scenario as an example:

  1. A user goes on a shopping website, browses for shoes, and clicks on one entry.
  2. On the product page, the website notifies the user via a closeable pop-up with big, bold, red text (in all caps) that the shoe they’re browsing has only 2 units left in stock.

Many would identify this as a dark pattern, which is understandable. However, is the implementation done really to “deceive users into an unintended action that would either benefit someone else or harm the user” if:

  1. The information presented is true.
  2. The dialog can be closed without extra cognitive effort. That is, it does not make the dialog harder to close in an attempt to keep the user from dismissing it.

Now, to add to the mix: What if the user really wanted to buy that shoe, did buy it, and felt happy that the website informed them about its scarcity, because they were able to act on information that brought them value?

Granted, there are different ways of doing this. eBay and Amazon do it effectively on their product pages, in my opinion. And I guess that’s the crux of this train of thought: as of now, even if all the Ifs above are satisfied, the classification of some implementations is subjective. Many would still classify the pattern in my example as a dark pattern, but I’m sure that one user who got the shoe won’t.

Illustration by absurd.design

There is a question somewhere in that scenario along the lines of “How can we objectively identify dark pattern implementations?” I think that question is too big to answer right now, considering that the effort to “combat” dark patterns is still young. There are many things to identify, classify, debate on, etc.

Luckily, Princeton has already tackled this in their study. If you read the paper, my scenario above is actually accounted for and is categorized under Low-stock Messages (page 21). They also did, in my opinion, a fair and objective filtering of what can be considered a deceptive low-stock message. I think Princeton’s approach is a great jumping-off point for future evaluation (or standardization).
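To make the idea of “objective filtering” concrete, here is a minimal sketch of one heuristic you could apply: observe the displayed stock count over repeated, purchase-free visits and flag counters that behave implausibly. This is purely illustrative and is not the study’s actual methodology or code; the function name and logic are my own assumptions.

```python
def looks_deceptive(observed_counts):
    """Flag a low-stock counter whose readings are implausible.

    observed_counts: stock numbers scraped on successive visits to the
    same product page (chronological order), with no purchases made.
    This is a hypothetical heuristic, not the Princeton study's code.
    """
    if len(observed_counts) < 2:
        return False  # not enough evidence either way
    # With no purchases happening, a truthful counter should never
    # increase between visits; a count that jumps back up is suspect.
    return any(later > earlier
               for earlier, later in zip(observed_counts, observed_counts[1:]))

# A counter that resets upward between visits gets flagged:
print(looks_deceptive([2, 1, 5, 2]))  # True
# A monotonically non-increasing counter does not:
print(looks_deceptive([3, 3, 2, 2]))  # False
```

Of course, a real evaluation would need many more signals (randomized counts, counts that never change, timers that reset, and so on), which is exactly why a shared, standardized filter like the study’s is valuable.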

What I’m also interested in is the question “Should we start categorizing these patterns by severity as well?” Why? Because answering it could make the categories actionable. Some patterns explicitly deceive you, like adding an item to your basket by default as a “promo,” while others merely annoy you, like a pop-up with adverts. That difference in probable harm should drive a severity-based categorization, so that we can tackle the most harmful patterns first. I also think some of the trivial, least harmful patterns (those that align or overlap with marketing gimmicks) will be weeded out as time goes by.

I think the How Question and the Should Question are both deep, and both very difficult for us to answer collectively. However, I think answering the Should Question can light a fire that forces us to answer the How Question more effectively and put that answer into action. That would be more valuable to us as a community, and it can trickle down into solving the issues raised by the patterns that the Should Question ranks as most severe.

As I’ve hopefully conveyed in my scenario above, some types of dark patterns are trivial, and judging them can be subjective. And while the community collaborates and decides on those trivial scenarios, there are types of dark patterns that we can take action against now, or at the very least propose a sound solution for.

Written by Yel Legaspi

UX & Product Designer. Santiago, Chile. www.yellegaspi.com
