
An Ethics for Convivial Digital Infrastructures - Cian O'Donovan

The hand of an elderly person holding a hot drink in a ceramic teacup
Photo by Claudia Love on Unsplash

Cian introduced the provocation by explaining that a lot of his time is spent researching digital infrastructures in social housing schemes: how technologies such as CO2 detectors, alarm cords in toilets, and thermostats impact the residents and staff.

These questions push us to understand why certain technologies are available to use in the first place. The emergence of these technologies can be better understood if we look back to the development and application of technologies in the past — especially during war.

Cian used ‘Operation Igloo White’ as an example: a military operation initiated in 1968, during the Vietnam war. For six years, thousands of tiny sensors — shaped like twigs, leaves, and animal droppings — were deployed along the Ho Chi Minh trail. These sensors were designed, in part, to automate intelligence gathering. They would detect nearby movements and sounds, send reports back to control centres, and direct strike aircraft to specific targets. The operation was a failure in terms of cost, military strategy, and, of course, ethics.


However, it was deemed a success in that it infrastructuralised a facet of society in a way that had never been done before. Igloo White demonstrated that political and military operations could be automated, centrally controlled, and deployed on a global scale. This ambition is part of what Paul Edwards called ‘closed-world discourse’.

“Closed world discourse, through metaphors, techniques, and fictions, as well as equipment and salient experiences linked the globalist, hegemonic aims of post World War II American foreign policy with a high technology military strategy, an ideology of apocalyptic struggle, and a language of integrated systems”

An important attribute of the theory of closed-world systems is that you’re either in it or out of it. Alternative ideas and locally produced knowledge generally have no way in — and yet investment in military technology is often credited with instigating technologies such as microwaves, radar, and even the internet, which spill over into the rest of society.

So part of the work being done to examine the technologies used in housing associations is to challenge the design logics that underpin these spill-overs. Cian cited the work of Lucy Suchman, who has spent years in Palo Alto studying the ethics and practices of technologists producing robots and software. In her paper, Imaginaries of omniscience: Automating intelligence in the US Department of Defense, she analyses the language and promises of Silicon Valley, looking at white papers, blog posts, and legal hearings. It’s clear from this research that the logics of control shaping our technologies remain the same as they were forty years ago.

This is key because technologies developed for war, which have very specific purposes and narrow capabilities, are spilling out into hospitals, care homes, and households. This is not limited to individual technologies; this over-spill applies to digital infrastructures too, such as with internet connectivity. The re-shaping of environments via the proliferation of ‘control technologies’ is necessary to expand and maintain the closed-world — and this presents an opportunity to redirect tech policy away from this, and imagine alternative technologies and infrastructures.

Currently our collective imaginations around technology are shaped by markets, standards, patents, and the role of monopolies (echoing the globalist aims of post-WW2 America), and there is little room for collective participation in the direction of innovation. Imagining alternatives requires us to look at how these control technologies have persisted through the decades, and what knowledge they have created for the closed world.

"Currently our collective imaginations around technology are shaped by markets, standards, patents, and the role of monopolies... There is little room for collective participation in the direction of innovation."

Looking at how technology is used in a sheltered housing scheme for older people in Hastings, it’s clear to Cian that any alternatives we imagine should prioritise maximising care. Cian approached the scheme with two research questions:

  1. How do digital infrastructures contribute to the wellbeing of residents and staff?

  2. What opportunities do residents and staff have to alter these? (In other words: who or what is in control?)

The technologies used in the scheme — and elsewhere — are not necessarily novel or radically transformational. For instance, fall detectors, which alert members of staff if a resident falls over, have been around for decades. Other technologies include tablets mounted on the door of each resident’s room. Residents have mixed sentiments about these integrated technologies, and the following key themes emerged from qualitative research:

  • Distrust and praise: in some cases, technology made residents feel safer and more in control. But in other cases people felt disempowered, as if they had been reduced to a number in the system.

  • Coercion: residents felt pressured into using certain aspects of the system, such as participating in wellbeing calls.

  • Disconnect: there was a noticeable gap between residents and the apparent purpose of the technology. Some failed to see the point of tablets mounted on the door, with one resident pointing out that theirs would light up in the middle of the night for no clear reason.

These themes illustrate that the roll-out of digital technologies in this context has not resulted in any radical transformation of health or social care. Rather, the technology introduces new social dynamics in the housing schemes: some residents are empowered by new digital infrastructures, whereas others are made more vulnerable. While digital services may be improved incrementally, the improvements focus on the quality of the technology itself, not on how the technology improves social relations and care; surveillance technologies designed to understand needs fail to capture what matters most to the people themselves.

This dynamic was further exemplified when over a million people in the UK were put on a list of ‘highly vulnerable’ persons during the height of the Covid-19 pandemic. The awareness of being classified in this way by an opaque system will inevitably change how people feel about themselves.

With this in mind, how can we get ideas in and out of the closed world, and eventually build convivial infrastructures? A first step is to remind ourselves that standalone technologies, with no human dimension, will not be effective or meaningful solutions for care. For instance, an at-home blood pressure kit realistically requires a professional to use it correctly and to record the results. Furthermore, internet connectivity is framed as an individualistic benefit, when the prospect of free internet in communal areas could actually enable new capabilities that benefit the collective.

A more convivial approach to digital design would bridge the divide between front-line workers and the makers of these technologies by looking at the system as a whole, and devising solutions that are coordinated across housing and healthcare, rather than treating these two areas as separate considerations.

An important first step along this journey will be building capabilities amongst technologists that allow them to recognise, resist, and repurpose technologies designed to see the world as a battlefield. Critical here are the curricula of computer science and engineering departments, as well as the quantified success metrics that drive today’s internet enterprises. It is through these processes that technology design can open up to valuing human relations and worldly effervescence over instrumental control and efficiency.

Reclaiming control technologies and altering them to directly address needs requires significant rethinking of currently prevalent methods: reframing ignorance and uncertainty as positive values can help break out of the closed-world logic. Looking back at the sensors deployed in Vietnam, the persistent messaging around them was that they provided the military with certainty it could then act on — yet the sensors could not differentiate between people on the trail delivering bombs and those delivering food. The rudimentary nature of the technology did not extinguish uncertainty; it closed it off.

If we embrace uncertainty, we can avoid getting caught in the narrow applications of technology: the idea that any piece of technology can give you an irrefutable and wholly accurate view of a situation. When reimagining new digital infrastructures, we have to accept that we cannot know everything. The ‘need’ for certainty belongs to wartime and is not required in times of peace; technologies that spill over from wartime innovation are not only built for the wrong purpose, but also at odds with convivial ethics.

"We can avoid getting caught in the narrow applications of technology"

In order to resist the most pernicious logics of control, we must recognise how they’ve historically persisted in our technologies, and fight against the fallacy that the future is ‘inevitable’, as narrowly defined by a powerful minority. If we believe in that inevitability, we give up our right to have a say — which is why we need to have more conversations about conviviality, to reorient our positions in these pervasive systems and eventually enact positive change.

@DrCianO'Donovan

