Ethical anti-design, or designing products that people can't get addicted to.
2021-02-21 12:00:00 +0100
In the photo above, a Wii Remote is sitting on a table next to an open window. People who grew up playing the Wii might remember it; while playing Wii Sports, the game would show a pop-up window with a message politely reminding you that you can always take a break. While it might seem a bit counterintuitive, there are a couple of different reasons why a game would tell you to stop playing. In the case of online, subscription-based games, the company might actually benefit from you logging off every once in a while (don't worry, I'll get to social media eventually). It's not like you're cancelling your subscription, and every second you're online chews up valuable bandwidth. In other games, things like level-grinding, while tempting, might ruin the experience. In a game like Pokémon, for example, if you get your entire team up to level 60 before taking on the Elite Four, you'll win, but not in the most glamorous or satisfying way. The case of the Wii was a bit different. Nintendo seemed aware that their audience was mostly younger children, and they felt the need to intervene when a player spent too much time in the game, out of concern for the player themself (and maybe the appeasement of the parents). That's not the sort of thing we see often anymore, and I always wondered why.
Bandwidth isn't as expensive as it used to be. The mechanics of many modern video games don't allow for level grinding anymore. Has Nintendo stopped caring about our children? Well, the industry has changed quite a bit. Today, many games benefit from having players spend as much time online as possible. In the mobile game Clash of Clans, players can prevent their base from being attacked by always staying online. The faster you complete a game, the quicker you'll be pushed to buy its DLC or, better yet, its Season Pass. Most obviously, if you spend more time online, you'll encounter more players, some with creative (and expensive) skins or other purchasable cosmetic items. The more people like this you see, the more you may feel pressured into buying them yourself. This is a worrying trend, and it seems to be affecting most of the Triple-A game market. However, we have someone else to thank for these companies' drive to keep our eyes on our devices for as long as possible.
In the case of social media, it was always in their best interest to keep us online. Very simply, the longer you look at your screen, the more likely your eyes are to come across an advertisement. Social media companies have poured billions of dollars into refining techniques to keep us engaged. I often joke that TikTok is the parent function of this sort of thing: short, colourful, algorithm-tailored, multi-sensory content that's constantly being streamed to you without interruption. The only thing that could make it more addictive would be if they IV-ed sugar directly into your veins while they were at it. And still, TikTok probably invests billions of dollars every year in refining their platform to make it as inviting as possible. Instagram is following suit; the way they present content is starting to feel more and more like TikTok with each passing month (especially since they introduced Reels). It's no surprise my generation spends so much time on social media, and nobody can blame them. As I was researching this topic, taking a closer look at the user interface of Instagram, my biggest challenge was to hold myself back from just scrolling through the endless stream of algorithmically chosen content.
It's scary to think about. So, I'll ask: all this user-retention stuff is great for business, but what if it didn't have to be like that? What if we deliberately designed products not to be addictive?
Dark Patterns and the Fediverse
I get it, a big part of an app's addictiveness comes from its usability, and the notion of designing apps to be less usable feels a bit ridiculous. That being said, there might be a market for it. Last week, I wrote an article called "The Fediverse only solves half the problem" where I talked about the issue of dark patterns such as infinite scrolling at length. The point is, FLOSS-abiding technology has a philosophical obligation to put the people using it first. In the case of social media, a big part of that was creating an ecosystem (i.e. the Fediverse) where you didn't need to depend on a central authority; an ecosystem where you could be in charge of your own data. While the organization of the Fediverse was a big step forward, we still have to deal with dark patterns.
Most open-source developers (myself included) don't have much formal work experience in UI/UX design. That's why so much open-source software is either extremely brutalist (see Blender) or very, very closely related to the design of its proprietary counterpart (see KDE). The Fediverse seems to fall into the second category. The UI of Mastodon (and all of its clients) looks quite a bit like Twitter. In a sense, this was probably intentional. The more closely the UI designers of Mastodon could mimic Twitter, the more newcomers would immediately feel comfortable with the Fediverse. With that comes the problem of dark patterns. Again, Twitter's interface was very intentionally designed to maximize the amount of time per day a person spends online. The Fediverse really doesn't need that, but it has it anyway. That's why I'm proposing that we reevaluate the way we design Fediverse interfaces and clients in light of how we can best strike a balance between creating a positive experience and one that puts the well-being of the person using your product first.
What is ethical anti-design?
Ethical anti-design comes in two parts. First of all, it's worth asking more generally, what exactly is anti-design?
Aesthetically, anti-design was an Italian design movement that served as a critique of consumer culture. While spiritually relevant, that's not the kind of anti-design I'm talking about. Instead, I'm talking about the much more pragmatic anti-design that's really common in interfaces. Anti-design is kind of like the opposite of design: designing not for usability, but for unusability. Again, somewhat unintuitive, but it can be very useful. We've all encountered anti-design many times in our lives. You know when the trash can on your computer asks you to confirm before permanently deleting your files? That's anti-design. Dangerous actions like permanently deleting files are often necessary. That being said, designers want to make sure you don't do them by accident, so they put things like confirmation checks in your way.
One of my favourite examples is how GitHub deals with repository deletion. On the settings page of your GitHub repository, all the "dangerous" actions are grouped together in a very scary-looking red box. At the very bottom of that box is the option to permanently delete the repository. If you click on it, GitHub takes this shtick one step further and asks you to type the name of the repository into a dialogue box. This forces you to be fully, consciously aware of what you're doing as you're doing it, and rightfully so. Deleting the wrong repository could be frustrating, and if you're coordinating a large number of collaborators, devastating. Fortunately, it's not every day that you have to delete a GitHub repository, so the company is fairly safe in making it a reasonably difficult thing to do.
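To make the pattern concrete, here's a minimal sketch of a typed-name confirmation gate. It's in Python purely to illustrate the logic; the function names and the `repos` dictionary are my own inventions, not how GitHub actually implements it:

```python
def confirm_deletion(repo_name: str, typed_name: str) -> bool:
    # The friction is deliberate: the action only proceeds once the
    # person has consciously typed the exact repository name.
    return typed_name == repo_name

def delete_repository(repos: dict, repo_name: str, typed_name: str) -> bool:
    """Remove `repo_name` from `repos` only if the confirmation passes."""
    if not confirm_deletion(repo_name, typed_name):
        return False  # refuse one-click destruction
    repos.pop(repo_name, None)
    return True
```

The point of the design isn't the string comparison itself; it's that a typo, a misclick, or an autopilot "OK" can no longer destroy anything.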
Ethical anti-design takes these ideas and applies them to an ethical code.
If we were all Marcus Aurelius, then maybe we wouldn't need to worry about social media addiction. The thing is, not everybody is a pure being of stoic efficiency; most people respond really well (or rather, really poorly) to behavioral design. Anti-design puts interventions in place to ensure a person doesn't take dangerous actions unintentionally. Ethical anti-design asks, "what else might be considered dangerous, as per my ethical code?" As I've surely made clear by now, one action that I consider dangerous is infinite scrolling. So, if I'm designing an app and I want to apply ethical anti-design, then I need to make sure the right measures are in place to prevent the person from zoning out and scrolling through social media all day.
What does ethical anti-design look like?
In the case of infinite scrolling, I think there are two routes an ethical anti-designer could take to help the person using their product scroll healthily: they could get rid of it, or they could interrupt it.
Before we had infinite scrolling, we had pagination. In fact, pagination is still relatively common. Content is broken down into separate pages. When the person scrolls to the bottom of the page, if they want to keep looking at content, they have to click the "next" button and wait for the new content to load. This feels a bit antiquated now that we have effective asynchronous web applications, but there are two key features here. First of all, the time it takes for the website to load in the new content interrupts the otherwise seamless scrolling. This period of time, whether an instant or several seconds, forces the person to pause and maybe even think about what they are doing. If they have a moment to reflect on it, then maybe they'll know for themself whether or not this is what they ought to be doing. The other component is the "next" button. This button tells the person, "that's all, though there is more if you want it," making the decision to keep going an active one rather than a mindless one.
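The mechanics of pagination are simple enough to sketch in a few lines. This is an illustrative Python version (the function name and twenty-per-page default are my choices, not any particular platform's):

```python
def paginate(items: list, page: int, per_page: int = 20):
    """Return one page of content, plus whether a 'next' button is needed."""
    start = page * per_page
    chunk = items[start:start + per_page]
    has_next = start + per_page < len(items)  # drives the "next" button
    return chunk, has_next
```

The `has_next` flag is the part doing the ethical work: when it's false, the interface honestly says "that's all," and when it's true, continuing requires a deliberate click.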
The other option would be simply to interrupt the infinite scrolling. This could be very similar to pagination, or it could be very different. Regardless, the approach is the same: force the person to actively choose whether or not they want to see more content, and offer them an opportunity to pause. Imagine this: you're scrolling through Instagram posts. After about twenty posts, you encounter a button that says "Show More." To see more content, you need to hold down the button for a little more than one second. As you do this, you're shown an animated circle that completes once the time is up. Once that happens, twenty more posts are shown. This way, you aren't relying on the person's network connection to break up their scrolling, and the hold gives just enough time to make them completely aware that this experience isn't continuous. The button, again, forces the person to physically interact with the interface to see more content. In practice, this would be very similar to pagination, but it would feel much more natural to someone who is used to infinite scrolling being the norm.
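The hold-to-continue button boils down to a tiny state machine: record when the press started, and only release the next batch if the press lasted long enough. Here's a sketch of that logic in Python (class name, batch size, and one-second threshold are all my own assumptions; a real client would wire this to touch events and an animation):

```python
class ShowMoreButton:
    """Gate that loads the next batch of posts only after a deliberate,
    sustained press of at least `hold_seconds`."""

    def __init__(self, feed: list, batch_size: int = 20, hold_seconds: float = 1.0):
        self.feed = feed
        self.batch_size = batch_size
        self.hold_seconds = hold_seconds
        self.shown = batch_size  # the first batch is visible right away
        self._pressed_at = None

    def press(self, now: float):
        # Start timing the press (and, in a real UI, the circle animation).
        self._pressed_at = now

    def release(self, now: float) -> list:
        held_long_enough = (
            self._pressed_at is not None
            and now - self._pressed_at >= self.hold_seconds
        )
        self._pressed_at = None
        if held_long_enough:
            self.shown = min(self.shown + self.batch_size, len(self.feed))
        return self.feed[:self.shown]  # everything currently visible
```

A quick tap does nothing; only a held press past the threshold reveals more, which is exactly the deliberate, non-continuous interaction described above.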
However, infinite scrolling isn't the only thing that causes social media addiction. Another way these platforms hook us is by getting us into the numbers game. In my generation, a lot of one's self-esteem comes from one's followers, one's follower-to-following ratio, and how many likes and comments one gets on one's posts. In reality, I haven't encountered many people who are seriously judgmental about this sort of thing; most people are harder on themselves about it than anyone else ever could be. Interestingly enough, this is something that Instagram actually seems to be aware of. They've redesigned the profile page a number of times to change the emphasis on people's follower count, and more recently, they've updated the look of posts to hide the number of likes they've received. It's still technically possible to see how many likes someone else's post has received, but you have to jump through enough hoops to make it hardly worthwhile. I would also consider this ethical anti-design: people are tempted to compare themselves to others online, and Instagram is intervening. Personally, I think this could go much further. For example, Instagram already shows mutual followers on a person's profile page; what if they didn't show your follower count at all? If we were to use social media as a way to connect to people we know, then your follower count shouldn't really matter, should it?
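The "mutual followers instead of a follower count" idea can be sketched just as simply. This Python fragment is my own illustration of what a profile payload could look like, not Instagram's actual design:

```python
def render_profile(username: str, followers: set, viewer_follows: set) -> dict:
    """Build profile-page data that shows who you both know,
    deliberately without any follower count."""
    mutuals = sorted(followers & viewer_follows)
    return {
        "username": username,
        "followed_by": mutuals,  # names you recognize, not a number
        # no "follower_count" key, on purpose
    }
```

By never computing the total, the interface removes the raw material for comparison while keeping the socially useful part: "followed by people you know."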
Where's all this going?
Behavioral design is as problematic as it is standard practice among the social media giants. Naturally, these dark patterns have leaked into the Fediverse. The only way for us to escape these patterns is to make an active effort to avoid them. It won't be easy; I too have a hard time bringing myself to use websites that still use rudimentary pagination, but at the same time, that's kind of the point.
If I haven't already made it super obvious, I intend on developing a new Fediverse client that applies ethical anti-design. My plan will probably take this idea to its extremes. The goal here isn't necessarily to create the Fediverse client everyone will use, but rather to demonstrate what this sort of thing could look like. My hope is that ultimately, other developers will be inspired by it and try to work these ideas into their own projects. I don't have a lot of information on it right now because, frankly, I'm still very early in the process. If you want to hear more about it, I'd encourage you to follow me on the Fediverse, where I plan on talking about this project quite a bit, and to keep an eye on this blog, where you can expect a more official announcement in the coming weeks.
If this is something you'd be interested in working on or in any way helping out with, I'd love to hear from you!