Did you ever wonder what technology really wants?

Kevin Kelly, writer and founding editor of Wired magazine, did. And he answered it remarkably well in his book “What Technology Wants”, published in 2010.

It’s still well worth reading, offering inspiration to anyone who wants to understand how we should shape technology to do more good and less harm.

If you want Kelly’s short answer to what technology ‘wants’, it’s more or less:

“… to generate more options, more opportunities, more connection, more diversity, more unity, more thought, more beauty, and more problems. Those add up to more good, an infinite game worth playing.”

Kelly lays out a series of guidelines that are key to playing this game well, and I will come back to them at the end of this post (noting especially that they complement the importance of protecting human values, which I addressed in this post).

In conclusion, Kelly notes that technology actually seems to have a direction of its own:

“Technology is acquiring its own autonomy and will increasingly maximize its own agenda, but this agenda includes – as its foremost consequence – maximizing possibilities for us.”

What set Kelly off on his research, however, was a series of more basic questions that many people ask themselves, from small ones such as ‘Should I get my kid this gadget?’ to fundamental ones such as ‘Should we allow human cloning?’

He realized that in order to answer those, he first needed to understand what technology really is – what its nature is like.

Searching for the answer, he first discovered that ‘technology’ is a surprisingly anonymous and rarely used term, given that technology itself has been a close and useful partner to humanity for tens of thousands of years.

To encompass all aspects of technology before going further, he coins the term ‘technium’, which includes not only physical technology itself but also culture, art, social institutions and intellectual creations of all kinds.

And analyzing what the concept of ‘want’ means, he notes that even bacteria want something – food, for example – and that ‘wanting’ has to do with tendencies, urges and trajectories.

In line with what a number of other writers and thinkers have observed over the last decade, Kelly notes the similarities between the development of technology and the evolution of life, and he outlines the technium as a natural successor to biological evolution.

He notes six major stages in the evolution of life – six kingdoms – and nominates the technium as the seventh kingdom.

But he also observes three important differences between biology, which is self-assembled, and technology, which is created (mostly) by humans:

1. Biology rarely borrows a feature that is no longer in use to solve another problem. The technium does this all the time.

2. Biological life develops by incremental transformation; the technium by jumps.

3. In biological life, species go extinct; inventions don’t.

(He argues, convincingly, that not a single invention has ever gone entirely out of use or production.)

Kelly then discusses a couple of concepts:

Exotropy – the rising flow of sustainable difference, the inverse of entropy – noting that a modern semiconductor microprocessor sustains the highest energy flow per gram per second in the known universe.

Deep progress – arguing that it is beyond doubt that human life has gradually and substantially improved throughout history, but also that science needs prosperity and growing populations.

He then gets to another key idea, proposed by others before him but still controversial: that mutation and natural selection alone are not enough to explain the evolution of life.

One example is the DNA molecule, which has been found to be an essentially optimal design for what it needs to do. Yet given the immense number of possible variants of the molecule, it is exceedingly unlikely that it self-assembled by pure chance within the time span of life on Earth.

Adding a third component – a kind of push that gives evolution direction – helps. And this component seems to exist.

Kelly underlines that this is nothing supernatural. Instead he points to two driving forces in the evolution of complex systems:

1. Negative constraints – the laws of geometry and physics.

2. Positive constraints – self-organizing complexity that generates a few recurring new possibilities.

Together these explain what has been detected in several fields: complex adaptive systems tend to settle into a few recurring patterns – patterns that are not found in any of the system’s individual parts.

From this observation he proposes a triad of evolution with these aspects:

– Functional – adaptation through natural selection.

– Historical – the lottery of random changes, accidents and other circumstances.

– Structural – the inevitable patterns that emerge in complex systems.

Kelly sums this up, stating that “life is an inevitable improbability”.

He then observes that the development of technology can be described by a similar triad, with one fundamental difference: the functional aspect – adaptation in biological systems – is replaced by an intentional aspect in the technium, an openness to human free will and choice.

And here is the core of Kelly’s findings – our intimate and inseparable union with the technium on the one hand, and our opportunity and duty to shape it on the other.

“Humans are both master and slave to the technium, and our fate is to remain in this uncomfortable dual role. But our concern should not be about whether to embrace it. We are beyond embrace; we are already symbiotic with it.

Our choice is to align ourselves with this direction, to expand choice and possibilities for everyone and everything, and to play out the details with grace and beauty.

Or we can choose (unwisely, I believe) to resist our second self. When we reject technology as a whole, it is a brand of self-hatred.

By following what technology wants, we can be more ready to capture its full gifts.”

This is where Kelly starts to investigate how we should choose. Having rejected both cancelling technological development altogether out of fear of its consequences (the Unabomber) and slowing it down to find a more human pace (the Amish), he arrives at this dilemma – which is also a kind of golden rule for technology use:

“To maximize our own contentment, we seek the minimum amount of technology in our lives. Yet to maximize the contentment of others, we must maximize the amount of technology in the world.”

At this point Kelly gets practical and proposes a number of checklists that I find really useful.

They are all based on the concept of the ‘conviviality’ of technology.

The first is a five-point list on how we can deal with the inevitable risks and dangers of new technologies:

1. Anticipation

2. Continual assessment

3. Prioritization of risks, including natural ones

4. Rapid correction of harm

5. Not prohibition but redirection

The second is a list of six aspects by which we can measure the conviviality of a particular manifestation of a technology (look for more of …).
The third and last checklist is an observation of what life ‘wants’ – and consequently, given that technology is the inevitable extension of nature, what technology wants as well – something we should keep in mind when trying to shape technology to express its best aspects.

Life wants increasing:

– Efficiency

– Opportunity

– Emergence

– Complexity

– Diversity

– Specialization

– Ubiquity

– Freedom

– Mutualism

– Beauty

– Sentience

– Structure

– Evolvability
Apart from the beauty and elegance of Kelly’s analysis of technology and its origins, I find his conclusions remarkably effective and accurate. The checklists he proposes can be applied to countless cases over a very long time frame.

However, one aspect he hardly touches at all is the huge importance of the human values and social systems that have developed in parallel with technology – almost as virtual reflections of each other, tightly interlaced – though obviously with much more attention given to the human and social side than to the technological.

I addressed the importance of human values for the survival of a highly technologically developed society with superintelligent systems in this post – these values are in fact necessary to prevent self-destruction, and at the same time our only hope of being respected by a consciousness far more intelligent than ours.

And I believe that it is by following this double path – protecting fundamental human values and following the spirit of nature in shaping the technologies we create – that we can reach the highest level of good in evolution.

