When you're autistic and you choose the computer route, you get a subclass. Some people choose programming language theory, compiler design, Plan 9 (in general)… I think I chose "human-computer interaction," and despite having had this subclass for something like five years now, I didn't actually realize that this was the thing I was into until fairly recently. In fact, I had to take a class on HCI in my second year of university and I hated it. There was never a moment where I looked at my computer screen, brightly rendering the UI for Android Studio in the dimness of the night, and said to myself "this is what I've been waiting for. This is why I came to university." A few things I learned in that class have come up in the years since, but even having done what is ostensibly "human-computer interaction" work, I basically never think about it.

I actually have another name for the sort of thing I'm into, and I think this is a big part of why it took me so long to realize HCI was something I even cared about. When I was writing my bespoke Rube Goldberg blog/gemlog machine, I had to organize my articles into categories for the first time. There was a whole bunch of articles on a similar theme that explored the way people interact with computers, how computers interact with us, and how that shapes our experience of the world. I wasn't taking HCI at the time; I was taking a course in forest ecology. So naturally, I decided to call it "human-computer ecology."

I remember in my first year I took a course in Women's Studies. In my very first lecture, my professor tried to explain what Women's Studies is as a discipline, and they contrasted it with most other fields by saying that Women's Studies is an area of research with an explicit agenda—that agenda being to understand and dismantle hierarchies of power, with an emphasis on patriarchal hierarchies. I didn't find this very convincing, because most fields have an agenda, implicit or otherwise. The stated agenda of the field of HCI is to "facilitate" interactions between humans and computers in service of accomplishing some goal. The implicit agenda of HCI as an industrial application is to maximize product engagement and/or attention.

When you hear people talk about the difference between the modern web and the "old web" or "indie web," especially those critical of the modern web, you'll often hear the subject of friction come up as though it's a good thing. Friction is typically considered a bad thing unless it's intentionally introduced to make you less likely to make a mistake (like cancelling your Amazon Prime subscription). There's something very smooth about the modern web—both in that smoothness is a sort of design motif that shows up everywhere you look, once you're looking for it, and in that the web doesn't seem to ask anything of you, or ask you to do anything, so much as you almost slip into it, sliding down an endless ramp of content that takes more effort to get out of than into. Compare this to the way an AI-generated paragraph seems to flow over your brain like water, leaving no evidence it was even read in the first place.

My first attempt to Do HCI, all the way back in high school, was with this Pixelfed client called Resin that actively re-introduced friction into the UI to make it easier to escape that content ramp. I called this practice ethical anti-design, referencing that same design principle Amazon uses to keep you trapped in your membership.
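To make that a little more concrete, here's a rough sketch of the kind of friction I mean. This isn't Resin's actual code, and the names and numbers are made up; it just shows the idea of a feed that only advances on an explicit request, and only after a cooldown, instead of auto-loading the next page the moment you hit the bottom:

```kotlin
// Purely illustrative, not Resin's actual code: a feed pager that refuses
// to auto-load. The next page only arrives when you explicitly ask for it,
// and only after a deliberate cooldown, so "just one more scroll" costs something.
class FrictionPager(
    private val pageSize: Int = 20,
    private val cooldownMillis: Long = 5_000,
    private val fetchPage: (offset: Int, limit: Int) -> List<String>,
) {
    private var offset = 0
    private var lastLoadAt = 0L

    /** Returns the next page, or null if not enough time has passed since the last one. */
    fun requestNextPage(now: Long = System.currentTimeMillis()): List<String>? {
        if (now - lastLoadAt < cooldownMillis) return null // the friction: you have to wait
        lastLoadAt = now
        val page = fetchPage(offset, pageSize)
        offset += page.size
        return page
    }
}

fun main() {
    val pager = FrictionPager(pageSize = 3) { offset, limit ->
        List(limit) { "post ${offset + it}" }
    }
    println(pager.requestNextPage()) // first page loads
    println(pager.requestNextPage()) // asked again right away: null
}
```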

More recently I've written about how my thoughts on ethical anti-design have changed over the years:

In hindsight, Resin feels like a parody of itself. But I guess that's fine; I was basically a kid back then. It was an idea that obviously came from a place of being dangerously addicted to social media, a reaction to seeing people I looked up to making all the same mistakes, and a desire to make something better. I'd always explained it as my attempt to inspire people to do UI design differently, and if nothing else, in the years since, it's got me rethinking a lot of basic assumptions about how we design computer systems for humans.

On my gemlog I have a much less annoying way of navigating my old posts, including a page that actually breaks down my archive by category. If you take a look at the "Ecology for computers and humans" category you'll notice those posts tend to have a lot less to do with software design and a lot more to do with how I and the people in my life relate to computers. I guess it's my attempt at being more descriptive than prescriptive. I think that a computer is a thing you should have a relationship with. A relationship that stems from an honest understanding of what it means to relate to something that is inanimate and yet alive in an ineffable way. It's something you take care of, and something that can enable you to take care of yourself. It can influence your life, but it shouldn't be the conduit through which you experience the real world. It should complement your life rather than control it.

I think there's some more insight to be gleaned from considering the way we relate to computers on a macro scale rather than how we interact with them on a micro scale. I think by reconsidering those basic assumptions about how computers should be interacted with, we might be able to find ways to escape the toxic patterns HCI researchers have left us with. This is the main reason I'm interested in things like uxn, esoteric programming languages, Plan 9, and really any project asking "what if we went on the computer, but differently?"

Respond to this article

If you have thoughts you'd like to share, send me an email!
See here for ways to reach out