Ethical Framework in UX: The Dark Patterns To Avoid

By Junaid Dar | Updated on Feb 17, 2021

Business ethics remains one of the most undervalued facets of organisational success. It reaches beyond the organisational sphere, however, and into user experience design, where disregarding ethical principles can severely undermine the user experience.

Studies suggest that over three-quarters of America’s largest corporations are actively trying to integrate ethical decision-making frameworks into their organisations, showing a general direction towards wanting to be “better”.

The same applies to the digital sphere, where the values embedded in design have sparked major debates about how technology is developed and whether it is developed ethically.

The concern isn’t that online design practices are illegal. Most of the time, they aren’t. However, knowledge is a powerful tool that can either be used for good, or for selfish purposes.

In the domain of digital marketing, knowledge has often been used to misinform, mislead, and even psychologically manipulate users into making decisions they would never have made otherwise. This is where an otherwise harmless practice crosses into unethical territory and gives rise to a range of practices that harm users on the web.

What Are Ethical Issues in Design?

Design is the undisputed backbone of any successful website, product, or application, and a key driver of user behaviour. For marketing purposes specifically, consumer trust and the perceived reliability of the brand become important factors in the decisions users make.

Sadly, design decisions aren’t always made in the user’s interest. Manipulative ones are known as dark patterns: UI/UX techniques that exploit a specific subset of design psychology and are built to mislead or trick users into taking actions they did not intend. The term was coined by design consultant Harry Brignull in 2010 while he was examining the ethical frameworks surrounding design practices.

Several practices fall into the ‘dark patterns’ category, notably:

The Roach Motel

[Image: Roach motel example]

Ever been in a situation where you wanted to download a particular software program, only for its partner software to be downloaded automatically because you didn’t notice a small, pre-ticked checkbox in the corner of the screen?

If so, you’re already familiar with what a roach motel looks like.

These are designs with easy, almost forced, opt-in systems and deliberately difficult opt-out systems. This less-than-ethical business practice is employed to generate more revenue, garner more subscriptions and receive more clicks.
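As a minimal sketch of how small the difference can be, the TypeScript snippet below (written for the browser, with an invented element ID and a hypothetical cancellation route) contrasts a pre-ticked bundle checkbox with an honest, unticked opt-in, and makes the exit as visible as the entrance:

```typescript
// Hypothetical installer / sign-up form. The ethical and unethical versions
// differ only in their defaults and in how easy it is to leave.

function buildOptIn(form: HTMLFormElement, darkPattern: boolean): void {
  const bundle = document.createElement("input");
  bundle.type = "checkbox";
  bundle.id = "install-partner-software"; // invented ID, for illustration only

  // Dark pattern: the box arrives pre-ticked, so inattentive users "agree".
  // Ethical default: unchecked, so consent is an explicit action.
  bundle.checked = darkPattern;

  const label = document.createElement("label");
  label.htmlFor = bundle.id;
  label.textContent = "Also install our partner's toolbar";

  form.append(bundle, label);
}

// A roach motel also hides the exit; an honest design gives the opt-out
// the same prominence as the opt-in.
function buildCancelLink(form: HTMLFormElement): void {
  const cancel = document.createElement("a");
  cancel.href = "/account/cancel"; // assumed route, purely illustrative
  cancel.textContent = "Cancel your subscription";
  form.append(cancel);
}
```

Flipping a single default, and making the way out as prominent as the way in, is usually all that separates the two.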

Manipulating User Behaviour

[Image: Hidden costs in UX]

In a world where user data is abundant, companies engage in user behaviour analytics to improve their understanding of user experience.

There is undoubtedly a grey area surrounding practices like user behaviour tracking and the tools that support it. At their core, these practices are about understanding how a consumer’s mind works so that the consumer can be more easily swayed by tactics designed to push them towards a particular action.

This by no means indicates that every business engages in psychological manipulation. However, certain practices have the potential to make users feel misled, which can be harmful to an organisation’s ethical integrity.

Tools like Hotjar, Clicky, and CrazyEgg are harmless on their own but can be put to harmful use. Deliberately misused, they can condition customers into stimulus-driven behaviours that leave them anxious or even angry.

A prominent example is the growing phenomenon of fake news, in which fabricated stories misguide viewers and can push them into behaving in line with the way the news is reported.

Users therefore, consciously or otherwise, establish trust with a website based on three key questions:

Firstly, were their actions what they initially intended? If not, did they consent to them willingly, or were they pushed to act that way?

Secondly, could the outcome be reasonably predicted? If not, what happened? Did they make a genuine mistake, or was the situation purposefully crafted to provoke that mistake?

Thirdly, does any step of the process seem unethical at face value? The user will ask, “Was my behaviour manipulated in any way?”

This places a significant amount of responsibility on UX professionals to design user experiences that do not influence the actions of their users in ways that may be perceived as negative or harmful.

Surveillance and Privacy

[Image: Privacy in UX]

The growth of surveillance technology has made life easier for people of all ages, unhindered by geographical boundaries. Equipment that assists the elderly, monitors that track babies’ movements, and software like Car Connection, which helps parents keep an eye on their children’s driving, are all examples of genuine improvements. UX professionals have made significant strides in ensuring that technology is not just more user-friendly, but genuinely helpful for many difficult tasks.

It is important not to let these stories deceive you into believing that it’s all sunshine and roses. There remain real concerns about the use of surveillance for commercial gain. In her book “The Age of Surveillance Capitalism”, Shoshana Zuboff, professor emerita at Harvard Business School, introduces the concept of “surveillance capitalism”: the idea that your personal data is treated as a commodity for sale, in a market that then allows organisations to engage in data manipulation on a grand scale.

Even though these products and services may be designed with the best intentions, UX professionals need to assume a degree of accountability for the data that drives them. A prominent example of user data being used nefariously is Facebook’s data scandal, in which Cambridge Analytica accessed the personal data of around 87 million users, largely because of inadequate safeguards around consumer data and the near-absent monitoring of third-party developers. The scandal raised genuine worries about the ethical frameworks that corporate technology businesses adhere to, and prompted governments worldwide to call for greater oversight and regulation.

Truthful Statistics

[Image: Reliable statistics in UX]

Visualising data allows people to extract knowledge from it.

Visualisation methods range from standard scatterplots to intricate interactive systems that convey the purpose and meaning of the data more easily.

The way this visualisation is approached can affect how people perceive the data and its message. Creating an accurate representation of a set of statistics requires a carefully assembled combination of design elements.

Often, the data collected confirms some parts of a study and rejects others. Due to factors like sample size, the results can land anywhere on a spectrum of accuracy, and results that represent only part of the sample don’t communicate an accurate picture of reality.

Data can also be manipulated or misrepresented to depict a distorted reality. Researchers sometimes use such opportunities to make their findings look more significant than they actually are. Alternatively, they might trim data or outlying results in the name of ‘cleaning data’ to skew the results in a particular direction.
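As a small, hypothetical illustration of how much ‘cleaning’ can move a headline number (the timings below are invented usability-test results, not real data):

```typescript
// Hypothetical task-completion times (in seconds) from a usability test.
const completionTimes = [12, 14, 15, 16, 18, 19, 21, 55, 61, 73];

const mean = (xs: number[]): number =>
  xs.reduce((sum, x) => sum + x, 0) / xs.length;

// Honest report: keep every participant, including the slow ones.
console.log(mean(completionTimes).toFixed(1)); // "30.4"

// "Cleaned" report: silently trim anything over an arbitrary cut-off.
const trimmed = completionTimes.filter((t) => t < 30);
console.log(mean(trimmed).toFixed(1)); // "16.4", a very different story
```

The underlying observations are identical; only the unstated decision about what counts as an outlier has changed.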

While these are not strictly categorised as illegitimate techniques for influencing user behaviour, their application still needs to be discussed. Objectivity in research is potentially compromised, since many of these mistakes or deliberate oversights occur as a result of inherent biases or an intent to manipulate user behaviour.

Reliable Testing

[Image: Testing and ethics in UX]

Any digital product released to the market needs to be held to a very high standard, both to ensure more efficient innovation and to protect the consumer experience.

The process of testing products during and after development can be biased, and that bias can lead to overambitious claims being made to consumers. One of the ways testing can compromise consumer expectations is through the wrong sample. For instance, conducting alpha testing only with people who have more technical expertise than the average person, or with too few participants, can produce a final product that is not viable for the wider consumer population.
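One rough way to see why very small panels mislead is to look at the margin of error around a measured success rate. The sketch below uses the standard normal-approximation formula for a proportion; it assumes random sampling, which real test panels rarely achieve, so treat it as an illustration rather than a rigorous method:

```typescript
// Approximate 95% margin of error for an observed success rate p
// measured with n participants (normal approximation).
function marginOfError(p: number, n: number, z = 1.96): number {
  return z * Math.sqrt((p * (1 - p)) / n);
}

// 8 of 10 testers completed the task: the true rate could plausibly sit
// anywhere from roughly 55% to 100%, far too wide to support a bold claim.
console.log(marginOfError(0.8, 10));  // ≈ 0.248

// With 400 representative users, the same 80% comes with about ±4 points.
console.log(marginOfError(0.8, 400)); // ≈ 0.039
```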

Furthermore, during development, testing should be used to confirm that the product delivers on all of its intended purposes and is subsequently marketed with transparency. A skewed testing group or ineffective white-box and black-box testing could allow a product that does not fulfil consumer expectations to make it to market, which could be detrimental.

Distraction in Interfaces

[Image: UX distractions]

The same features that allow user interfaces to make our lives easier also bring a risk of distraction that can be off-putting at best and life-threatening at worst.

The use of a GPS on one’s phone, for example, makes driving to unknown locations extremely convenient, right? Well, that same phone can be the cause of distracted driving, which sadly claims thousands of lives every year.

Similarly, user interfaces can be made purposefully distracting, drawing the user’s attention away from the content and disrupting their reading patterns. This directly conflicts with the visibility principle, which dictates that UI design should keep relevant information visible and accessible, and should not distract the user from the purpose of the website. The point is to let users make choices in a clear and transparent way, without overwhelming them with unnecessary information.

The practice of intentionally misleading consumers by displaying cheap prices that exclude additional product costs is referred to as hidden costs.

This tactic is used to trick users into buying seemingly cheap items and then trap them in a position where they need to buy other items incrementally, bolstering revenue. UX designers should be wary about how products are placed on a page so that users are not misdirected, which is particularly important for e-commerce interfaces.
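A hedged sketch of the contrast, with invented fee names and amounts: the transparent version surfaces the all-in price wherever the item price is shown, rather than deferring fees to the final checkout step.

```typescript
// Hypothetical line items for a product listing.
interface Listing {
  basePrice: number;
  shipping: number;
  serviceFee: number;
}

const headphones: Listing = { basePrice: 29.99, shipping: 6.5, serviceFee: 3.0 };

// Dark pattern: advertise only the base price and reveal the rest at checkout.
const advertised = headphones.basePrice;

// Transparent alternative: show the full total wherever the price appears.
const allIn = headphones.basePrice + headphones.shipping + headphones.serviceFee;
console.log(`£${advertised.toFixed(2)} advertised vs £${allIn.toFixed(2)} actually paid`);
```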

When a product is labelled as ‘nearly out of stock’, or a website shows live updates of other users buying certain items, users run the risk of being lured into buying quickly and buying now. Like chasing the Golden Snitch in Harry Potter, this situation pushes users towards fad buying, where they purchase an item simply because everyone else appears to be doing so. UX designers should make sure their interfaces are truthful and steer clear of even inadvertent attempts at deceit.
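A truthful alternative is to tie urgency messaging to real inventory rather than to a marketing rule. A minimal sketch, assuming a hypothetical stock count is available to the interface:

```typescript
// Only surface urgency messaging when it reflects real inventory.
function stockMessage(unitsInStock: number): string | null {
  if (unitsInStock <= 0) return "Out of stock";
  if (unitsInStock <= 5) return `Only ${unitsInStock} left`; // genuine scarcity
  return null; // plentiful stock: no artificial pressure
}

console.log(stockMessage(3));   // "Only 3 left"
console.log(stockMessage(250)); // null, so the UI shows nothing
```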

Though distraction is usually blamed on the user, it is important to recognise the role UX professionals play in minimising it. Transparency, honesty, and a user-first mindset are principles that UX designers should return to whenever they are confronted with difficult choices.

Build Trust With Ethical Practice

If architects are responsible for designs that assist the livability of people in homes, then – as people responsible for the information architecture of the internet – UX designers are responsible for the “livability” of people online.

What should this information architecture look like?

Practices that are based upon dishonesty, manipulation and breaches of privacy are clearly bad places to start.

To preserve the moral framework that should guide UX design, information architecture should build trust with users by providing important and relevant information about the service on offer, being transparent about policies, creating interfaces that minimise distraction, providing fact-checked information, and respecting user privacy.

After all, an ethical business is already a successful business, winning the support of both its users and the wider public.


About the Author:

Junaid is an Electrical & Electronics Engineer by profession with a passion for UI/UX and website development. At Rezaid he focuses on design, content creation, technical SEO, and new online growth strategies. Junaid enjoys reading, movies, blogging and squash.

