Desire paths, path dependency and the inevitability of technology

Right now, I can't seem to escape the idea that technology is a lot like urban planning. Maybe this is because I just read James Scott's Seeing Like a State. The other day I was walking through the National Gardens in Athens, a beautiful almost tropical park in the centre of the city.

Palm trees at the Athens National Gardens.

It sticks out — literally. Only recently did I learn that it was commissioned by the former queen of Greece in the mid-19th century, in direct denial of the surrounding climate. Big circular paths wind around massive palm trees, with irrigation systems throughout.

The park is also peppered with 'desire paths': a weaving network of footpaths that people actually want to use, rather than the paths that were planned and built into the park's landscape. Desire paths are unplanned and reveal themselves gradually — they are literally produced by consistent foot traffic.

A classic desire path; stomped right out of the grass

Desire paths are in a fist fight with another concept I spend a lot of time reflecting on: path dependency. Path dependency describes the phenomenon in which events in the past constrain events in the future. E.g. "The anti-trust suit against Microsoft in the '90s will affect the anti-trust suit against [insert any Big Tech company here]." This means that a past set of assumptions is driving what's on offer now — rather than our expression of what we actually want.

Technology is not exempt from this phenomenon: digital products are designed in a way that makes it extremely hard to see any potential desire paths. The hard-coded nature of, say, the UI of a web app is susceptible to path dependency (I'm looking at you Bootstrap). You have to click these buttons in that sequence to achieve a desired effect.

Unfortunately, trying to find desire paths in digital spaces is a huge challenge without massive user research budgets and a team of data scientists looking solely at user experience. And that's assuming that all teams design humanely — dark patterns are much harder to fight than the landscaped paths in my local park.

If we can't cut corners in these situations, then we cannot remake systems to suit our needs

Here's one of many examples: a few weeks ago, I spent 45 minutes email-shouting at a company that insisted I call them to cancel my subscription. They'd provided no way to cancel online. This tactical, anti-user friction exploits the fact that desire paths are largely impossible to make in digital environments. If we can't cut corners in these situations, then we cannot remake systems to suit our needs. In turn, designers won't be able to see the patterns of desire that are necessary to inform change in those systems.

In capitalism, we expect consumerism to be the primary way of expressing our desire paths. But, just as with technological systems, we still have to choose from an existing and limited set of options. If we don't like any of them, we're told to build our own.

What happens when these habits and limitations appear in environments that have huge consequences for the people experiencing them? Immigration authorities operate under path dependency and a political mandate to make applications as challenging as possible — how does that manifest when they build websites for the application process? How can designers use randomness and research to spot and serve desire paths? What responsibilities do states and other powerful actors have to look for desire paths? And what role can organising play in articulating what we actually want out of the technology that hard-codes our systems?


© 2022 Alix Dunn