Kevin Kelly has been “trying to listen to what the technology wants, and the technology is suggesting that it wants to be watched”. I have been a big fan of Kelly since I stumbled across his Cool Tools blog years ago. I was working as a product manager at the time, and I loved how he thought about products. I love how he thinks about other stuff, too. This post is a response to Kelly’s article, The Technium (which is quoted throughout). To you sir, I bow deeply.
I’m listening to my humanity, and what my humanity is suggesting is that it wants to be watched.
The Identity Machine
Our inner experiences and outer actions are getting much longer, and much more visible, half-lives through their instantiation as digital artifacts that we copy, push, aggregate, and endlessly revise. Digital technology, and social media in particular, are bringing our insides out, capturing our behaviors without context, and creating a fossil record of our impermanence. We are using the world’s largest copy machine primarily to make copies of ourselves.
Consider this behavior in the context of just a few questions relevant to our time:
- What does immediate and always-on connection with a population larger and more diverse than anything we’ve ever had access to mean for our human experience?
- What does the accelerating environmental instability and natural resource depletion – driven by us – mean for our environmental security?
- Why do we routinely use medical technology to extend our lives, even when the quality of that life is very poor?
- Why, despite the incredible abundance created by our scientific and financial advancements, and the existence of a high-functioning global distribution system, have we not distributed this abundance to the many, many people who are still struggling with basic needs for their health and safety?
And ask yourself, why has our cultural response to these and similar questions been to create and propagate more versions of ourselves? We actually have a great tool to solve the big questions of our time, but haven’t popularized it to solve questions much bigger than what we like.
Most prosumers still produce and consume in the pattern of the mass market era – we act out the message of the medium, which today mostly involves us copying ourselves on the internet and then staying very busy iterating all those copies. So the copy machine has become a pattern machine, and a pattern machine is an identity machine. We are creating strong patterns, in our private and collective channels, and for me the really interesting question is: why aren’t we designing technologies to disrupt our patterns instead of continually reinforcing them? Perhaps the best way to prepare for an unpredictable future is with technology that is designed to serve impermanence.
“It’s hard to convince people to take that long-term perspective because the future is so uncertain,” Kelly tells us, and he is right. But one way to frame long-term planning is by designing values instead of objects. You can’t plan for a future based on durable goods and discrete services – their obsolescence begins the moment you name them – but you can plan for the values that serve a caring humanity and a sentient planet, and start designing technology that either mutates or eliminates itself in service to those values.
The Trouble with Humans
We can use the technologies of identity proliferation and privacy collapse not to reinforce our notions of self and our values around privacy, but to break them. These are two sides of the same coin. The urge to reach out and share is innately human; it is beautiful. In a networked age, digital identities were a necessary first step to compensate for the lost intimacy of proximity that the web allows us to leave behind. But in doing this, we immediately introduced the uncomfortable experience of profile decay: watching our former selves die, via the asymmetrical change rates of our profiles and our emerging selves. In an effort to alleviate the dissatisfaction of our innately human condition, we quickly learned to amend, revise, and version our profiles to match the myriad contexts and developments of our constantly emerging lives. We’re already experts at doing this with our memories, but our digital memories are more resistant. In service of this Sisyphean task – capturing our complex and ephemeral nature with a tool that makes permanent a fraction of what we are, after it’s already happened – we willingly give more and more information away.
We are using our digital tools in a way that creates an unsatisfying result, but it is so, so close to our human experience that we mostly haven’t noticed that throwing more information at the problem of impermanence isn’t working. Creating better versions of our past selves is not going to make us comfortable with whatever is bothering us right now. Through our digital communication, we are trying to recreate the human experience in a non-physical context, and it’s frustrating the shit out of us, because we exist in a physical context. Digital versions of ourselves offer the tantalizing promise of a cleaner, more sterile, less painful humanity, but it is also a despairing promise, because such a thing does not exist. Our bodies force the full reality upon us, through our emotions, through our illnesses, and, of course, ultimately through death. In trying so hard to exorcise the painful, we are also forfeiting joy and beauty, all the freshness and lusciousness of a complete and present life. We’re so caught up in using our hyper-consumptive tools to craft a more accurate version of our human experience that we haven’t noticed they’re failing to serve our humanity. We look around and say, hey, where did all the lovely virgins go? Oh, we sacrificed them.
We reinforce our identity patterns with technology that recommends we consume and act in the same way our past selves did. Sure, we can influence the algorithm, but we don’t, because it’s just too difficult to resist a medium that continuously delivers a recognizable, incomplete (often preferable) version of ourselves back to us, based on who we were.
And it’s not just our identity we’re consuming through the content. The medium is payloaded with the identity of an elite design class that disproportionately values technology and the business models predicated on that technology; people are an afterthought, and we tacitly adopt the same position when we adopt the technology.
Information is compressed experience. Design is a compressed, and directive, value system. When we consume these things we consume the experiences and values of others, but we are not, by and large, asking if those are the right values and experiences for us, in the lives we want to live. The values of a twenty-something who makes a lot of money designing technology, works with other bright and talented colleagues, and is swaddled in a closed feedback loop and the extended adolescence companies like Google and Facebook provide their employees, might not be a very close match to the lives and values of the population that adopts so much of what they make. This is not a judgement of what is “better” – neither one is – but they are different, and this difference is amplifying the identity gap we already experience when we use this technology to look at ourselves. So when we consider whether technology is helping us achieve what we want in our lives – financial security, a healthy environment, more leisure time, and greater intimacy with our loved ones come to mind – we should consider whether these are also the most immediate and felt concerns of the people who are designing the technology, and setting the cultural standard of use, for the rest of the population.
We have accelerated the rate at which we replicate ourselves, and it’s become a compounding mechanism that reinforces the same identity patterns we are used to. But it is not so hard to imagine that we might rotate the lens to get a very different view of things. That we might collide the personal Higgs fields of our identities with enough awareness to shake loose some perspective that is broader than our assumptions about the well-practiced self.
If we run our pattern archive through a technology designed to disrupt, rather than reinforce, our behaviors, what will we learn about ourselves? What will we learn about each other? If there were no privacy filters on Facebook, and we had access to the social behavior stream of the third largest, most diverse country in the world, what would we see, and how might this inform our actions in the world? What would happen if we took the amplification power of our pattern machine and used it to start producing insights about our behaviors, instead of more of the same behavior?
The future doesn’t pull, we spring it. Our patterns pull, in that we fall most easily into the highest-volume practice of our past, but the future isn’t a force in its own right. We make it up based on our choices, and our choices are based on awareness. And because technology is increasingly becoming a ubiquitous and near-frictionless accomplice to our pattern making, we are losing our ability to even think about it as a disruptive, awareness-creating force, which is frightening. Because the technology runs all the time, if we follow its lead, we will come to the manifest destiny of singularity – but not because it was a foregone conclusion. It will happen because we forfeited our power, our human perspective as a partner to technology, and blindly followed our own invention into letting the fraction of the life we designed it for become all of the life we live.
The questions about how we handle digital overload, and how we protect our privacy, are valid. It seems strange, then, that our primary response has been to design and use more technologies that interrupt us so that we can give away more information. Despite our protests, our actions indicate that we are in a fairly willing collusion with our technology. If we are going to continue on this path, and it’s hard to imagine we won’t, it is time to start asking what we want to happen as a result. We need to design the human future we want, and then design and use our technology to help us create it. We have ceded so much of our power and perspective that our primary solution to these problems is how we might make the same tool more efficient at creating a result we don’t fucking like. Instead we might ask: in a future where everything is known about me, how do I want that society to treat me? A perfectly reasonable cultural response to that question, for instance, is to create a society in which we have eliminated shame, and the devastating consequences it brings. Shame is a social disease; it’s only contagious if you spread it, and our poor, our ill, our addicted, and our abused die from it every day.
Scaling Identity to the Collapse Point
“…it’s very common to see these network effects kick in where…the more you have, the more attractive you become…and so you have explosions…We shouldn’t be too envious of that kind of scaling, because it’s a very ephemeral thing, and it’s a very natural thing.” Kelly wrote this about the growth patterns of technology companies, but it’s equally true of our personal identities. The most glaring examples of this are the substance of celebrity media, but as prosumers, we are all engaged in dialing up the wattage of our personal spotlights. When our identities are deeply enmeshed with a system that scales to supernova – as its natural mode of operation – what should we be preparing for in terms of our human experience with our digital selves?
We have made some very useful things, and it is time to take a look at what we’ve done. We have made a magnificent tool to study ourselves, but in order to do so, we must change the technology to encourage reflection, rather than replication. Reflection is not the same as consuming our own performance. What is the design that will allow us to truly turn the technology on ourselves? What is the design that will foster space for the attention to our inner experience, instead of encouraging us to simply document it?
In a world where there will “be more minds and artificial minds everywhere” we are ready to start designing for the collective consciousness, instead of the user experience. Let us design for the human experience, for the sentient experience. It is time to question our complicity in selling ourselves back to the market as data-generating commodities, and start designing and demanding technologies that treat us like the gorgeous, interconnected beings we are. It is time to design the cultural reaction we want, in the future we are creating.