By Design

What does it mean to be a designer? How accountable are technology designers for the actions of users on their platforms? How do we reckon with addictive, profit-driven design?

What does it mean to be a designer? I really don’t remember the word design popping up much in casual conversation until I went to graduate school… for design and technology. But today, I feel like I hear and read the word design all the time, in so many different contexts outside of my job as a designer.

Perhaps I’m in a bubble. Or perhaps when the web exploded, design became something that everyone felt they could do. I mean, anyone can go on Squarespace and design a website, right?

But then again, Squarespace uses templates. A person using a service like Squarespace is designing within a selection of templates… designed by a professional designer.

It’s too easy to assume that design is easy and anyone can do it. Sure, a software developer could design and program a piece of software, but that’s like saying a carpenter could design and build a skyscraper.

I don’t think I’d pay an architect to design a dog house. I’d feel pretty good about paying a carpenter to make some small upgrades to my house. But building something that can last?

Something that is functional and equally inspiring? And completely safe to live in? I think I want someone who’s been trained, been licensed, and designed a lot of buildings. I’d want an architect.

When we talk about the design of digital products, there are a lot of titles out there and a lot of confusion about what specific roles actually do.

Generally speaking, though, the person who designs the experience you have with any given technology platform is called a User Experience Designer.

How many buttons should be on this screen? How many screens must a user go through in order to perform a task? What happens when the user clicks here? How does the user interact with this product? That’s all User Experience.

Unlike architecture, there is no licensing organization that deems someone a “credible designer.”

There are educational programs that seek to teach aspiring user experience designers, but even then the curriculum is all over the place.

Many people who occupy these positions were never formally trained; they just have a lot of tacit knowledge from starting on the ground floor as the internet exploded.

In Networks Without a Cause, Geert Lovink states that “Software does not result in a creed or a set of dogmas, but in a social order.”

That’s a lot of responsibility for a designer to shoulder, especially for the products that we use and rely on every day.

We need to hold the designers of these systems responsible.

Of course, these designers are paid by the executive team driving the agenda. Sometimes it’s not easy making ethical decisions when your paycheck depends on you being a bit blind.

So here we are today, trying to understand violent right-wing extremists storming the Capitol decrying a fraudulent election.

This situation is the product of a social order imposed upon us through the design of technology.

January 6th, 2021 was shocking, but to critics of social media it wasn’t surprising.

Donald Trump used Twitter throughout his first presidential campaign and throughout his presidency to spread lies and misinformation and to sow division amongst American citizens.

The technology was designed to allow and amplify this behavior. If anyone in the tech industry tells you otherwise, they’re either lying to you or completely incompetent.

To further illustrate this point, I’d like to talk about some horrible moments in Silicon Valley chronicled by Mike Monteiro in his fantastic book Ruined by Design.

In 2014, Facebook decided to run an experiment on over 600,000 people using its service. It filled their news feeds with overwhelmingly negative news to see if it had an effect on their mental health.

Were any of these Facebook users aware that they were being experimented on? No.

That same year, US Immigration and Customs Enforcement, ICE, awarded Palantir (pal-an-TEER) a $41 million contract to build a database to keep track of immigrants. For ICE, this database is absolutely essential in discovering targets.

In 2015, a Black software engineer tweeted out that Google Photos AI was categorizing photos of him and his Black friends as “gorillas.”

In 2016, Andrew Bosworth, a Facebook VP, circulated an internal memo stating that

“We connect people. That’s why all the work we do in growth is justified. All the questionable contact importing practices... That can be bad if they make it negative. Maybe it costs someone a life by exposing someone to bullies… Maybe someone dies in a terrorist attack coordinated on our tools.”

Four years before the attack on the Capitol, Facebook brushed off potential coordinated attacks using its platform as a side effect of growing the company.

In that same period, Facebook ran an emotional study, analyzing the usage patterns of 6.4 million Australian youth on its platform to figure out when they feel most worthless... in order to target them with higher-value ads.

And then a bit later that year, Cambridge Analytica used Facebook’s platform to collect data on 87 million users and sway an election.

Oh, and that bug with Google Photos categorizing black people as gorillas?

Google finally fixed it three years later by removing the gorilla category from the AI entirely. They literally had to delete the classification to stop their software from being racist.

And I don’t know what’s worse: understanding that your product has horrible side effects and brushing them off, or just being grossly incompetent.

For example, in 2018, Buzzfeed reported that the gay dating app Grindr, with 3.6 million daily active users, had shared its users’ HIV status with two other companies... because no one thought to question whether they should.

This reeks of a company that sees your personal and private information as their property. They can do whatever they want to do with it, because that data now belongs to them, not you.

And once a platform starts tracking you, there’s now a singular place for someone acting in bad faith to prey upon you.

The Guardian has written about how smart devices have become a new hunting ground for stalkers, jilted lovers and exes. It’s easy for an ex to use a device they had previously set up to keep tabs on their former partner.

Stalking and harassment isn’t limited to people in our close social network though. Emily Chang, a journalist for Bloomberg TV, reported a tweet from a troll asking her to “eat his high-quality sperm” and invited her to a whipping.

A Twitter spokesperson replied that the tweet wasn’t in violation of their anti-harassment rules. Nice.

And then you have Mark Zuckerberg, who has said that he’s okay with Holocaust denial viewpoints on Facebook because people sometimes get things wrong.

You would think after a couple of years of these horrible moments, tech companies would begin to curtail this kind of behavior, and yet...

In 2019, The Wall Street Journal reported that Flo Health’s Period & Ovulation tracker was telling Facebook when its users were having their periods or expressing an intent to get pregnant.

This happened even though Flo specifically stated in their privacy policy that it wasn’t doing it.

What’s the point of trying to read through a block of legalese before clicking the “I Agree” button if the company is just going to ignore their own policies?

And then, that same year, a white supremacist opened fire on two mosques in Christchurch, New Zealand.

Wearing a camera, he livestreamed his murder of 50 people.

Despite a public outcry to take it down, the video went viral across multiple platforms.

Facebook had to remove 1.5 million copies of the video in the first 24 hours.

Why? That’s the way Facebook was designed to work.

Mike’s book came out before Covid-19, so he doesn’t chronicle 2020 and the failings of social media during a global pandemic.

But, you lived through it.

You might have seen some tech executives starting to publicly realize that their technology is dangerous.

Do you think that it just dawned on them in the last year?

To what extent are companies that provide platforms responsible for the damage they inflict?

Facebook and other technology companies have vast numbers of users. When they make decisions, they ignore edge cases.

Edge cases are the problems or situations that occur only at extreme operating parameters.

Tech companies typically consider something an edge case when it affects under one percent of their user base.

For a company like Facebook, with over 2 billion users, an edge case would still affect 20 million people.
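That arithmetic is worth making concrete. A minimal sketch, using the illustrative figures from the text above:

```python
def edge_case_impact(user_base: int, fraction: float) -> int:
    """Number of people affected by an 'edge case' of a given fraction."""
    return int(user_base * fraction)

# Facebook: "over 2 billion users"; "edge case" threshold: under one percent.
affected = edge_case_impact(2_000_000_000, 0.01)
print(affected)  # prints 20000000 — twenty million people
```

Twenty million people is roughly the population of a large country, written off as statistically negligible.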

I know, it’s hard to deny that these companies provide some value.

Without technology, 2020 would have been much lonelier.

We would have felt less connected, that’s for sure.

There are always trade-offs.

Yet, they always seem to get just a bit more from us than we bargained for. And that’s from companies that rake in billions of dollars in profit. Those profits go to shareholders and executives.

That’s a very small share of people reaping the rewards while the world becomes more detached, more lonely, more tribal, more violent, more extreme.

And this is all behavior from companies that most of us would consider “above board,” as if their moral compasses weren’t already that broken.

But what about the bottom feeders? The people who design with inherent contempt for their users.

Have you ever filled out a form that tricked you into giving an answer you didn’t intend?

Have you ever tried to purchase something, but somewhere in that purchasing journey the site snuck additional items into your basket?

Have you subscribed to a free trial that converted into a premium subscription, and then spent hours trying to figure out how to cancel it?

Have you found yourself clicking through a series of buttons only to be directed exactly where you didn’t want to go?

These are dark design patterns.

And perhaps you just realized that one or more of those companies you consider above board is actually a bottom feeder.

Dark design patterns can be found throughout tech products. They’re methods of manipulating the user into specific behavior.
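The “sneak into basket” pattern mentioned above can be sketched in a few lines. This is a hypothetical illustration, not any real company’s code; all names are invented:

```python
def build_order(items, opt_outs=()):
    """Return the final basket.

    Note the pre-selected 'extras': the user gets them by default
    and must notice and actively opt out — a classic dark default.
    """
    extras = ["damage protection (+$4.99)", "premium trial (auto-renews)"]
    basket = list(items)
    for extra in extras:
        if extra not in opt_outs:  # silence is treated as consent
            basket.append(extra)
    return basket

# A user who just wants headphones ends up with two extras they never asked for:
print(build_order(["headphones"]))
```

The manipulation lives entirely in the default: an honest design would make the extras opt-in, so doing nothing costs the user nothing.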

At the end of the day, many tech companies are just mining their users to extract as much value as possible. For example, how many sites have you used that ask for your birthday and gender?

How many sites really need to know your gender?

If you find yourself thinking, “Well, that question wouldn’t bother me…” that’s probably because you’re not an edge case.

At least,

not this time.

2021 NERDLab