Innovation and Human 2.0—Are Superhuman Capabilities an Inevitable Reality or a Romanticized Prediction?
LPK’s Vice President and Managing Creative Director Valerie Jacobs and Creative Director Bryan Goodpaster are featured in the Spring/Summer 2013 issue of Viewpoint Magazine with their collaborative article “Dial E for Evolution.”
In the article, Val and Bryan introduce the idea of the “Human 2.0,” mankind’s predicted capabilities in response to rapid advances in technology and our fluent, almost dependent, use of information systems. In essence, our brains continue to evolve in concert with our environment, which is increasingly computerized.
PREPARING FOR HUMAN 2.0
Keeping pace with today’s speed of change arguably requires superhuman capacity. Leaps in knowledge, the world’s unabated data growth and the pace of compounding computation are disorienting, to say the least. The exponential rate of life is the incomprehensible stuff that existential questions are born of—what is our role as humans within this time of chronic disequilibrium? In the seemingly unnatural, hyper-accelerated time we live in, it’s logical to question whether our minds and bodies were meant to live at this tempo—whether technology is outrunning our evolutionary, adaptive capabilities. Or is the ability to thrive and survive in the new technological paradigm already deeply coded within our dinosaur brains, latent in our sensory capacity?
Half a million to two million years ago, our ancestors first assumed their place at the top of the food chain by learning to hunt to survive. Our ability to hunt was directly related to how far we could see and how far and how fast our legs could take us. Our bodies and brains adapted to meet prehistoric demands. At the same time, our innovations helped us surpass our bodily limitations and made us able to take down a gazelle or, much more recently, cultivate crops. Millennia on, we continue to both optimize our natural humanity and leverage science and technology as a means of surpassing our human limitations.
Today, our tools and the ways we use them are infinitely more complex. We are in many ways oblivious to the complexity within our environments, to the sensors, geo-locators and circuitry that surround us. We live within the now clichéd state of ubiquitous computing, a digital universe that is growing by five trillion bits of storage per second. “Computers are everywhere. They are now something we put our whole bodies into—airplanes, cars—and something we put into our bodies—pacemakers, cochlear implants,” observes Cory Doctorow, author of For the Win and Makers.
As circuitry and sensors become more intuitive, more compatible with our humanity, they seemingly evaporate from our awareness. While we are increasingly mindful of the fidelity of the foods, medications and beauty products that we put into and onto our bodies, we are blissfully negligent of technology’s effects on our minds and bodies, of our potential like-mindlessness. We happily concede our dignity and privacy in the name of innovation. Our tools have moved from being an extension of our physical selves to an extension of our mental, cognitive selves. The momentum of technology and its pervasive consolidation in our lives presents a straining tension of opportunity and apprehension.
As we push the physical limits of silicon, computing power more than doubles every 18 months. Similarly, we are at a point in history when every year we double our knowledge of the brain’s function and its 100 billion neurons. But what will we learn about the estimated 1,000 trillion neural connections that scientists believe are firing within the brain? Opening a Pandora’s box of human potential, there is no doubt we are paving the way for artificial intelligence. Yet we know that our brains have been shrinking over the past 30,000 years. In that time, our brains have, on average, lost mass equivalent to the size of a tennis ball. Like computer chips, our human circuitry is getting smaller—but what does that mean for its performance? Are we getting dumber? Could we be essentially offloading our mental capabilities to the machine—or are we simply losing unneeded neural networks and evolving to meet the demands of a new environment?
One theory is that our brains are getting smaller because there is a correlation between intelligence and survival; historically, brains have contracted in size as societies became larger and more complex. This suggests that the safety of industrialized and networked society requires less of our larger, “dinosaur” brain. Studies show that we may indeed be offloading some memory functions. In one study, research subjects appeared reluctant to commit to memory any fact that could be easily retrieved elsewhere. Perhaps our brains are devolving, relinquishing some of our evolved mental capacity and recommitting to our early hunter-gatherer programming. Perhaps that points to a new schism—a divide between the cognitive haves and have-nots, between machine rulers and machine ruled.
A second theory puts forth a slightly different, more optimistic perspective, theorizing that our modern brains are shrinking because, in fact, they are being rewired to work faster and more efficiently—working more collaboratively and less aggressively. In essence, they are evolving to function in a way that is more connected, perhaps more human.
Still, we cannot be certain whether technology is instigating our evolution or hampering it. Signs show that not only are we still evolving, but our evolution might even be accelerating. “We know the brain has been evolving in human populations quite recently,” says paleoanthropologist John Hawks from the University of Wisconsin-Madison. Hawks and his colleagues have found evidence of a host of new human mutations originating in the last 40,000 years.
Maybe we are already seeing the forebears of this evolution in the new breed of prodigious, attention-deprived savants. Historically, people with superhuman talents and capabilities have often possessed some sort of mental disability. Through a different lens, these savants may be viewed as the first ascendance of Human 2.0 or even 3.0. Nevertheless, it raises the question: what untapped capabilities and supersensory mutations might be emerging as we prepare to cope with the new technological paradigm?
Compared to the rest of the body, we know the least about our brain’s recent evolutions or its untapped capabilities. However, we know it adapts to the influence of technology incredibly rapidly. In 2008, Gary Small, author of iBrain: Surviving the Technological Alteration of the Modern Mind, gave us some of our first insights into just how fast our brains respond to technology. He authored what is considered the first study to actually observe the brain during internet use, introducing internet technology to people without prior internet experience. After only five days, subjects’ brains had adapted: their mental filters had basically learnt how to shift into overdrive. Small observed the brain adapting in a way that mimics the efficiencies of computer processing.
Indeed, we may be becoming more like our devices. Or are our devices becoming more like us? Ray Kurzweil, renowned futurist and author of The Age of Spiritual Machines, has calculated that the year 2045 will mark what he calls the Singularity—the time when machine intelligence meets and then surpasses human intelligence, taking on human-like characteristics. This can be a frightening proposition. One can assume that as we gain new human capacity, we will shed vital aspects of what has distinguished us as uniquely human—the carnage of evolution. The conundrum is whether our advancement compromises our humanity, whether our inevitable ascent discards vital portions of our humanness.
Sophocles said that nothing vast enters our lives without a curse. Many would assert that we have jeopardized the qualities we hold unique to our humanity as we opt into what anthropologist Amber Case calls “ambient intimacy.” The world we live in is populated by people with their heads down, hypnotically buried in their mobile devices. Tiffany Shlain, director of the film Connected, cautions that “we are—actively, daily—affecting the connections in our brains as we plug into our smartphones and tablets and laptops. And since it’s having such an impact on our minds, we need to do it mindfully, and sometimes not do it at all.” Diana Senechal, author of Republic of Noise, offers an even more critical view, noting, “The chatter of the present, about the present, cannot always grasp the present.” She asserts that we have compromised our ability to be self-reflective, experience downtime and maintain a healthy inner dialogue.
Could we be sacrificing our sense of self in the interest of the collective? Long before Kurzweil predicted the Singularity, the ancient yogis predicted a race of transcendent men with superhuman capabilities and expanded consciousness. We are on the precipice of a connected, global consciousness, a point at which we will be required to cast off some of our original code. In 1962, Marshall McLuhan presciently predicted this culmination point: “The next medium, whatever it is—it may be the extension of consciousness…”
While these scientists and futurists may be accurate, evolution is essentially a series of trade-offs. When we emerged from the waters hundreds of millions of years ago, we traded the physiology of swimming for the physiology of walking upright. Later, it was the species that adapted their diets and adjusted to new geographies that were able to step across evolution’s next frontier. Today we are adapting to an environment of convergence and connectivity. We as a species must reconcile our next trade-offs and harness the power of our next incarnation.
The next generation of bionetic interfaces is emerging. We will soon witness a flood of immersive computer systems interfaced with our thoughts and nano-computer circuitry manifested as organic tissue. The promise of genomic sequencing is absolutely fascinating when viewed as “organic software” that could potentially transform the physicality of the hardware. When the burgeoning future of this advancement is realized, we will have no choice but to reconcile our computer-human co-evolution.
Taking an optimistic view, the blueprint for our human-computer co-evolution may simply be unrealized. Our seemingly superhuman capacity may be merely untapped, deeply latent within an original code anticipating a complex new future. A new level of mental computation, calibre of sensory perception and spectrum of consciousness may unexpectedly awaken as we rise to hold our place at the top of the evolutionary food chain. Without question, we will follow parallel pursuits in an effort to evade existential catastrophe—simultaneously tapping into our inherent, natural and original code while also pursuing all means of science, computation and artificial intelligence to propel us toward a new dignity with super-intelligence in Human 2.0. One thing is certain: the speed of the unblinking infosphere will present an evolutionary challenge—to take the leap or not.
This article originally appeared under the title “Dial E for Evolution” in Viewpoint No. 32: Art, Spring/Summer 2013, pp. 90–91. It is just one of Jacobs’ and Goodpaster’s contributions to Viewpoint Magazine; others include “Bitches, Barbarians and Narcissists,” “Generation Why” and “Through A Screen Darkly.”
LPK Vice President and Group Director of Trends Valerie Jacobs isn’t just a forecaster of design; she’s a seasoned storm chaser. Her trend work is grounded in a strategic approach that incorporates research, analysis and the translation of data into actionable strategies for consumer brands with the nerve to keep up. Follow Val on Twitter at @futureglimmer or email her at email@example.com.
Bryan Goodpaster is a creative director at LPK, where he is often called upon for his non-traditional approach and strategic consultancy—helping crack wicked brand problems and strategic conundrums for many category-leading brands. Follow Bryan on Twitter at @bryangoodpaster or email him at firstname.lastname@example.org.