Technology Power: for whose benefit?

Pragmatic ways to mitigate the downsides

Guest blog by Jeremy Green and Fred Barker

As part of Alan’s Scanning our Future project, Jeremy Green and Fred Barker were commissioned to research the potential UK future impacts of new technologies over the next 15 years. This blog is a summary of some of their conclusions.

Some might say we stand at a point where developments in technology can provide the basis for all humans to enjoy a life of material and cultural abundance. Drudge work – both mental and physical – can be eliminated through automation, leaving more time for creativity, community and caring for each other.

But this cornucopia has arrived, not onto a blank slate, but in a specific time and place, and within a specific social and economic framework. Here, productive activity is organised around an economic and financial system that requires most work and transactions to take place in the pursuit of financial profit.

Many clever people spend their waking hours thinking about how the capabilities of transformative technologies can be fitted into this economic model – how they can be ‘monetized’. They need to be clever, because their task is a difficult one. These transformative technologies have the potential to create abundance, but the economic model is based on scarcity. The technologies could do lots of wonderful things, but for some of the most powerful technologies, it’s not obvious how they could make money for anyone.

The way in which the technology industries, and their financial backers, have responded to this conundrum goes a long way towards explaining why technology developments are the source of so much anxiety, and why technology is often perceived to have negative rather than positive implications for personal and community resilience.

A number of themes in industry’s response are discernible, including: extension of the monetization of people’s attention; redefinition of the boundaries of privacy; and shifting the definition of ownership and personal property. The industry has also encouraged the belief that both governments and society are powerless in the face of the inevitable forward march of technology – a view that extends from the inability of governments to censor or control information, through the growth of digital currencies and markets for illegal services, to the inability of governments to collect taxes from the technology giants.

Further observations on these four themes follow:

  • Attention: Google and Facebook (and Twitter, and all the others…) didn’t invent advertising, but the tools available to them have allowed them to bring it to a new level. What they are selling is not the ‘services’ they provide to users, but those users’ attention. They are already good at it, and they keep getting better. That’s why the services are so distracting and so compulsive; that is how they are meant to be. In terms of personal resilience, the challenge is to manage this distraction and compulsion.
  • Privacy: The assault on attention is armed with information about our preferences, desires and behaviour. Apple and Google know where we walk, how long we stay there, where we drive and where we park, what we are thinking of buying, what news sources we regularly look at and how long we spend looking at them. If we buy a wearable device, they know how often we exercise and whether our performance is improving. If we buy ‘smart’ objects for our homes, they know how warm we like it, and when we’re not in. Robot vacuum cleaners map out the layouts of our homes and send the information to the manufacturer so that it can be sold on.
  • Ownership: It is not just our data that we do not own. There is an increasing trend towards ‘servitization’, whereby companies seek to sell us services where they would once have sold us things. This makes for continuing revenue streams on the balance sheet, reduces our ability to repair what we have ‘bought’, constrains the extent to which we can find innovative uses for our things, and limits our ability to share. This is even more the case with digital goods, where sharing is re-conceptualized as ‘piracy’.
  • Regulation and control: Over recent decades the state has become less able to regulate markets and transactions in the name of society. Communications technology, digital currencies and online markets do not respect borders.

Responding to the challenges posed by technology

Our society is still at an early stage of understanding the manipulative and pervasive impacts of technology, and how to take back some power. This issue is likely to be one of the priorities of the next Scanning our Future project. Here are some initial pointers:

  • Adaptation and Assimilation: learning to come to terms with a technology, including developing techniques to mitigate and minimize its negative impacts (e.g. social media diets, being a weekend Luddite).
  • Hacking and Subversion: learning how to change a technology so that it does more of what we want and less of what we don’t want (e.g. privacy settings, using ad blockers).
  • Embrace: careful and selective adoption of technologies which can contribute significantly to individual or group resilience (e.g. drones for community mapping projects).
  • Shape: engaging with the development of technology, especially at the level of development funding, to maximise the positive and minimise the negative impacts (e.g. participating in P2P networks, user support groups). There is much exciting work going on at the moment around the development of ‘platform cooperatives’.
  • Influence: seeking to influence public bodies at local, regional and national level, including local authorities and government.

For more about the Scanning our Future project see