There's no magic behind creative thinking. You're born to do it -- you just need, as Steve Jobs did brilliantly, to keep asking: 'Why doesn't it work?'

What makes humans different is not our brains, our speech, or the mere fact that we use tools. It is that each of us is in our own way driven to make things better. We occupy the evolutionary niche of new. The niche of new is not the property of a privileged few. It is what makes humans human.
We do not know exactly what evolutionary spark caused the ignition of innovation 50,000 years ago. It left no trace in the fossil record. We do know that our bodies, including our brain size, did not change -- our immediate pre-innovation ancestor, Homo sapiens, looked exactly like us. That makes the prime suspect our mind: the precise arrangement of, and connections between, our brain cells. Something structural seems to have changed there -- perhaps as a result of 150,000 years of fine-tuning. Whatever it was, it had profound implications, and today it lives on in everyone. Behavioural neurologist Richard Caselli says: "Despite great qualitative and quantitative differences between individuals, the neurobiological principles of creative behaviour are the same from the least to the most creative among us." Put simply, we all have creative minds.
This is one reason the creativity myth is so terribly wrong.
Creating is not rare. We are all born to do it. If it seems magical, it is because it is innate. If it seems like some of us are better at it than others, that is because it is part of being human, like talking or walking. We are not all equally creative, just as we are not all equally athletic, but we can all create.
The human race's creative power is distributed in all of us, not concentrated in some of us. Our creations are too great and too numerous to come from a few steps by a few people. They must come from many steps by many people. Invention is incremental -- a series of slight and constant changes. Some changes open doors to new worlds of opportunity and we call them breakthroughs. Others are marginal. But when we look carefully, we will always find one small change leading to another, sometimes within one mind, often among several, sometimes across continents or between generations, sometimes taking hours or days and occasionally centuries, the baton of innovation passing in an endless relay of renewal.
Creating accretes and compounds, and as a consequence, every day, each human life is made possible by the sum of all previous human creations. Every object in our life, however old or new, however apparently humble or simple, holds the stories, thoughts and courage of thousands of people, some living, most dead -- the accumulated news of 50,000 years. Our tools and art are our humanity, our inheritance and the everlasting legacy of our ancestors. The things we make are the speech of our species: stories of triumph, courage and creation, of optimism, adaptation, and hope; tales not of one person here and there but of one people everywhere; written in a common language, not African, American, Asian or European, but human.
There are many beautiful things about creating being human and innate. One is that we all create in more or less the same way. Our individual strengths and tendencies cause differences, but they are small and few relative to the similarities, which are great and many. We are more like da Vinci and Einstein than not.
The Renaissance belief that creating is reserved for genius survived through the Enlightenment of the 17th century, the Romanticism of the 18th century and the Industrial Revolution of the 19th century. It wasn't until the middle of the 20th century that the alternative position -- that everyone is capable of creation -- first emerged.
In the 40s, the brain was an enigma. The body's secrets had been revealed, but the brain, producing consciousness without moving parts, remained a puzzle.
As the West recovered from World War II, new technologies appeared. One was the computer. This mechanical mind made understanding the brain seem possible for the first time. In 1952, Ross Ashby synthesised the excitement in a book called Design for a Brain. He summarised the new thinking elegantly: "The most fundamental facts are that the Earth is over 2,000,000,000 years old and that natural selection has been winnowing the living organisms incessantly. As a result they are today highly specialised in the arts of survival, and among these arts has been the development of a brain, an organ that has been developed in evolution as a specialised means to survival. The nervous system, and living matter in general, will be assumed to be essentially similar to all other matter. No deus ex machina will be invoked."
Put simply: brains don't need magic.

A San Franciscan named Allen Newell came of academic age during this period. He abandoned his plan to become a forest ranger (his first job was feeding gangrenous calves' livers to fingerling trout) to become a scientist instead. One Friday afternoon in November 1954, he experienced what he would later call a "conversion experience" during a seminar on mechanical pattern recognition. He decided to devote his life to a single question: "How can the human mind occur in the physical universe?" "We now know that the world is governed by physics," he explained, "and we now understand the way biology nestles comfortably within that. The issue is how does the mind do that as well? The answer must have the details. I've got to know how the gears clank, how the pistons go and all of that."

As he embarked on this work, Newell became one of the first people to realise that creating did not require genius. In a 1959 paper called The Processes of Creative Thinking, he reviewed what little psychological data there was about creative work, then set out his radical idea: "Creative thinking is simply a special kind of problem-solving behaviour." He made the point in the understated language academics use when they know they are on to something: "The data currently available about the processes involved in creative and non-creative thinking show no particular differences between the two. Creative activity appears simply to be a special class of problem-solving activity characterised by novelty, unconventionality, persistence and difficulty in problem formulation."
It was the beginning of the end for genius and creation. Making intelligent machines forced new rigour on the study of thought. The capacity to create was looking more and more like an innate function of the human brain -- no genius necessary. Newell did not claim that everyone was equally creative. Creating, like any human ability, comes in a spectrum of competence. But everybody can do it. There is no electric fence with genius on one side and the general population on the other.
Newell's work, and that of others in the artificial-intelligence community, undermined the myth of creativity. As a result, some of the next generation of scientists started to think about creation differently. One was Robert Weisberg, a cognitive psychologist at Philadelphia's Temple University.
Weisberg was an undergraduate during the first years of the artificial-intelligence revolution, spending the early 60s in New York before getting his PhD from Princeton and joining the faculty at Temple in 1967. He spent his career proving that creating is innate, ordinary and for everybody. Weisberg's view is simple. He builds on Newell's contention that creative thinking is the same as problem solving, then extends it to say that creative thinking is the same as thinking in general, but with a creative result. In Weisberg's words, "when one says of someone that he or she is 'thinking creatively', one is commenting on the outcome of the process, not on the process itself. Although the impact of creative ideas and products can sometimes be profound, the mechanisms through which an innovation comes about can be very ordinary."
Said another way, normal thinking is rich and complex -- so rich and complex that it can sometimes yield extraordinary, or "creative", results. We do not need other processes. Weisberg shows this in two ways: with carefully designed experiments and detailed case studies of creative acts -- from the painting of Picasso's Guernica to the discovery of DNA and the music of Billie Holiday.
In each example, by using a combination of experiment and history, Weisberg demonstrates how creating can be explained without resorting to genius and great leaps of the imagination.
Weisberg's work, with subtitles such as Genius and Other Myths and Beyond the Myth of Genius, did not eliminate the magical view of creation nor the idea that people who create are a breed apart.
It is easier to sell secrets. Titles available in today's bookstores include 10 Things Nobody Told You About Being Creative, 39 Keys to Creativity, 62 Exercises to Unlock Your Most Creative Ideas and 100 What-Ifs of Creativity.
Weisberg's books are out of print. The myth of creativity does not die easily.
But it is becoming less fashionable, and Weisberg is not the only expert advocating for an epiphany-free, everybody-can theory of creation. Ken Robinson was awarded a knighthood for his work on creation and education, and is known for the moving, funny talks he gives at TED. One of his themes is how education suppresses creation. He describes "the really extraordinary capacity that children have, their capacity for innovation." Robinson's conclusion is that "creativity now is as important in education as literacy, and we should treat it with the same status." Cartoonist Hugh MacLeod makes the same point more colourfully: "Everyone is born creative; everyone is given a box of crayons in kindergarten. Being suddenly hit years later with the 'creative bug' is just a wee voice telling you, 'I'd like my crayons back, please.'"
The psychologist Karl Duncker was the creator of the "candle problem", a test in which the subject is given a candle, a box of tacks and a book of matches, and asked to attach the candle to a wall. The solution is to tack the box to the wall and use it as a shelf for the candle, but most people cannot see past the box being anything more than a receptacle for tacks. Duncker also wrote that the act of creation starts with one of two questions: "Why doesn't it work?" or, "What should I change to make it work?"
These sound simple, but answering them can lead to extraordinary results. One of the best examples comes from Steve Jobs, cofounder and CEO of Apple. When Jobs announced Apple's first mobile phone, the iPhone, in 2007, he said: "The most advanced phones are called smartphones. They are definitely a little smarter, but they actually are harder to use. They all have these keyboards that are there whether you need them or not. How do you solve this? We solved it in computers 20 years ago. We solved it with a screen that could display anything. What we're going to do is get rid of all these buttons and just make a giant screen. We don't want to carry around a mouse. Are we going to use a stylus? No. You have to get them and put them away, and you lose them. We're going to use our fingers."
It is no coincidence that Jobs sounds like one of Duncker's subjects thinking aloud while trying to attach a candle to a wall.
The step-by-step process is the same. Problem: smarter phones are harder to use because they have permanent keyboards. Solution: a big screen and a pointer. Problem: what kind of pointer? Solution: a mouse. Problem: we don't want to carry a mouse around. Solution: a stylus. Problem: a stylus might get lost. Solution: use our fingers.
Apple sold four million phones in 2007, 14 million in 2008, 29 million in 2009, 40 million in 2010, and 82 million in 2011, for a total of 169 million sold in its first five years in the phone business, despite charging a higher price than its competitors did.
How?
For several years, starting around 2002, I was a member of the research advisory board of a company that made mobile phones. Every year it gave me its latest phone. I found each one harder to use than the last, as did other board members. It was no secret that Apple might enter the mobile phone market, but the risk was always dismissed, since Apple had never made a phone. A few months after Apple's phone became available, my board met and I asked what the company thought of it. The chief engineer said, "It has a really bad microphone."
This was true, irrelevant and revealing. This company thought smartphones were phones, only smarter. They had made some of the first mobile phones, which, of course, had buttons on them. These had been successful. So, as they added smarts, they added buttons.
They thought a good phone provided a good phone-call and the smart stuff was a bonus. Apple made computers. For Apple, as Jobs's announcement made clear, a smartphone was not a phone. It was a computer for your pocket that, among other things, made calls.
Making computers was a problem that Apple, as Jobs described it, had "solved" 20 years ago. It did not matter that Apple had never made a phone. It did matter that phone makers had never made a computer. The company I was advising, once a leading phone manufacturer, lost a large amount of money in 2007, saw its market share collapse and was eventually sold.
"Why doesn't it work?" deceives us with its simplicity. The first challenge is to ask it. The chief engineer did not ask this question about his phones. He saw rising sales and happy customers and so assumed that nothing was broken and there was nothing to fix.
But Sales + Customers = Nothing Broken is a formula for corporate cyanide. Most big companies that die kill themselves drinking it. Complacency is an enemy. "If it ain't broke, don't fix it" is an impossible idiom. No matter the sales and customer satisfaction, there is always something to fix. Asking, "Why doesn't it work?" is creation inhaling. Answering is creation breathing out. Innovation suffocates without it. "Why doesn't it work?" has the pull of a pole star. It sets creation's direction. For Jobs and the iPhone, the critical point of departure was not finding a solution but seeing a problem: the problem of keyboards making smarter phones harder to use.
Everything else followed.
Apple was not unique. Korean electronics giant LG launched a product much like the iPhone before the iPhone was announced. The LG Prada had a full-sized touchscreen, won design awards, and sold a million units. When Apple's very similar direction -- a big touchscreen -- was revealed, competitors built near replicas within months. These other companies could make an iPhone, but they could not conceive one. They could not look at their existing products and ask, "Why doesn't it work?"
The secret of Steve was evident in 1983, during the sunrise of the personal computer, when he spoke at a design conference in Aspen, Colorado. There was no stage, and there were no visual aids.
Jobs stood behind a lectern with yearbook hair, a thin white shirt, its sleeves folded as far as his forearms, and -- "they paid me 60 dollars, so I wore a tie" -- a pink-and-green bow tie. The audience was small. He gestured widely as he envisioned "portable computers with radio links", "electronic mailboxes", and "electronic maps".
Apple Computer, of which Jobs was then cofounder and a director, was a six-year-old startup playing David to IBM's Goliath. Apple's sling was sales; it had sold more personal computers than any other company in 1981 and 1982. But despite his optimism, Jobs was dissatisfied: "If you look at computers, they look like garbage. All the great product designers are off designing automobiles or buildings but hardly any of them are designing computers. We're going to sell ten million computers in 1986. Whether they look like a piece of shit or they look great. There are going to be these new objects in everyone's working environment, in everyone's educational environment, in everyone's home environment. And we have a shot at putting a great object there. Or if we don't, we're going to put one more piece of junk there. By 1986 or 1987 people are going to be spending more time interacting with these machines than they spend in a car. And so industrial design, software design and how people interact with these things must be given the consideration that we give automobiles today, if not a lot more."
Twenty-eight years later, Walt Mossberg, technology columnist for the Wall Street Journal, described a similar discussion that happened near the end of Jobs's life: "One minute he'd be talking about sweeping ideas for the digital revolution. The next about why Apple's current products were awful and how a colour, or angle, or curve, or icon was embarrassing."
A good salesman sells everybody. A great salesman sells everybody but himself. What made Steve Jobs think different was not genius, passion, or vision. It was his refusal to believe that sales and customers meant nothing was broken. He enshrined this in the name of the street encircling Apple's campus: Infinite Loop.
The secret of Steve was that he was never satisfied. He devoted his life to asking, "Why doesn't it work?" and "What should I change to make it work?"
But hang on. Surely there is an alternative to starting by asking, "Why doesn't it work?" What if you simply start with a good idea?
Ideas are a staple of myths about creating; they even have their own symbol, the light bulb. That comes from 1919, the age of silent movies, a decade before Mickey Mouse, when the world's favourite animated animal was Felix the Cat. Symbols and numbers would appear above his head, and sometimes he would grab them to use as props -- question marks became ladders, musical notes became vehicles. One symbol lived long after the cat: when Felix had an idea, a light bulb appeared above his head. Light bulbs have represented ideas ever since. Psychologists adopted the image: after 1926, they often called having an idea "illumination".
The creativity myth confuses having ideas with the actual work of creating. Books with titles such as Making Ideas Happen and How to Get Ideas emphasise idea generation, and idea-generation techniques abound. The most famous is brainstorming, invented by advertising executive Alex Osborn in 1939 and first published in 1942 in his book How to Think Up.
Research into brainstorming has a clear conclusion. The best way to create is to work alone. The worst way to create is to work in large groups and defer criticism. Steve Wozniak, Steve Jobs's cofounder at Apple and the inventor of its first computer, offers the same advice: "Work alone. You're going to be best able to design revolutionary products and features if you're working on your own. Not on a committee. Not on a team."
Brainstorming fails because it is an explicit rejection of ordinary thinking -- all leaps and no steps -- and because of its unstated assumption that having ideas is the same as creating.
Partly as a result, almost everybody has the idea that ideas are important.
Ideas are like seeds: they are abundant and most of them never grow into anything. Also, ideas are seldom original. Ask several independent groups to brainstorm on the same topic at the same time, and you will likely get many of the same ideas. This is not a limitation of brainstorming; it is true of all creation. Because everything arises from steps, not leaps, most things are invented in several places simultaneously. For example, four different people discovered sunspots independently in 1611; five people invented the steamboat between 1802 and 1807; and two people invented the silicon chip in 1957. When political scientists William Ogburn and Dorothy Thomas studied this phenomenon, they found 148 cases of big ideas coming to many people at the same time and concluded that their list would grow longer with more research.
Having ideas is not the same thing as being creative. Creation is execution, not inspiration. Many people have ideas; few take the steps to make the thing they imagine. One of the best examples is the aeroplane. The brothers Orville and Wilbur Wright were not the first people to have the idea of building a flying machine, nor were they the first people to begin building one, but they were the first people to fly.
Kevin Ashton cofounded the Auto-ID Center at MIT and coined the term "internet of things". This extract is from his new book, How To Fly A Horse: The Secret History of Creation, Invention and Discovery (William Heinemann), out now
This article was originally published by WIRED UK