David Byrne on how tech affects music and the way we listen

This article was taken from the October 2012 issue of Wired magazine.

David Byrne has turned an entire building into an instrument, created a 65-metre-long flow chart on a stretch of pavement, spoken at TED, used a PowerPoint presentation as public art, built a singing robot, made several films, and installed bike racks on the streets of New York. He also knows something about music from his origins as frontman of Talking Heads, not forgetting collaborations with Brian Eno and Fatboy Slim. His new book, How Music Works, out this month, explains exactly that, drawing on Byrne's own experiences, the history of music and scientific research. Byrne tells Wired about Edison, technology and the rise of zombie music.

Wired: Why choose to write a book on how music works, rather than a straightforward musician's autobiography? David Byrne: Well, it seems like everyone's doing an autobiography these days, and some of them maybe have a lot more to tell than I might. A few of these pieces began as commissions; one was a TED talk. They are all about music in context, or how music is affected by its context, whether that context is the technology that delivers it to you, or the financial structures that allow you to buy it, or maybe the architectural structures that affect the way it sounds. How they all exert pressures on the music to be a certain way.

When you started with Talking Heads, though, playing legendary music venue CBGB in Manhattan, you write that you tried to strip all that away and remove everything you could. It sounds almost like a design aesthetic rather than music -- something Steve Jobs would do? [Laughs] To some extent! Jonathan Ive: punk rocker!

We were just starting out, and for some of us, it was too much at that point to formulate that "I want my music to be this and this and this". That's too much to know. But you can work the other way, by elimination -- I know that I don't want this or this. Let's see what's left.

You take everything away until you get to the point where, if you take another leg away, the table falls down. And then you go: OK, what's left is the essence of what we want to be doing and we'll start with that. That totally minimal thing gets added to or corrupted really quickly -- it doesn't mean you abandon the rules, but you find ways to let other things seep in.

Those other things -- does technology shape them? You write that there's no such thing as neutral technology. Maybe people used to think PowerPoint and various other things were just tools, like a hammer. But no, every tool, including hammers, is about doing very specific things. With a hammer, it's easier to hit a certain kind of nail than it is other kinds of things. And so they lead you, subtly, to do certain kinds of things. At the end of the 1800s, Edison invented a recording system that didn't use any electronics. There was no microphone: the sound was focused through a physical horn that made a little needle etch a wax cylinder. The technology could record certain things, such as a singer, but percussive things, such as a bass drum -- those kinds of big bass-sound impulses -- made the needle jump in the recording and the playback, so a lot of the time they relegated the bass to the back of the [recording] venue, or sometimes took it out altogether in jazz recordings. Which meant that the jazz ensembles that were recorded were not the same instrumentation as you would hear live. And so what got disseminated, what people heard and recognised as jazz, early jazz, bore almost no resemblance to what was actually being played. The technology limited what could actually be distributed. That's an early and blatant example, but it continues -- there are things now that shape the music in other, more subtle ways.

How do you escape that influence? Is it through events such as Playing the Building, where you turned a building into an instrument? I think that there will be a movement.

As recorded music has become more ubiquitous and easy to access, music listening has also become more solitary -- you listen with your headphones, or ear buds plugged into your computer. Then there's this simultaneous urge to experience music as a social phenomenon, whether or not that means live music -- it doesn't have to be live; it can just be being in a place where sound is happening, where you're enveloped by sound in some kind of way, and that affects the social structure.

That social idea -- in the book you talk about sharing mixtapes as a powerful social interaction. Does seeing what your Facebook friends are listening to on Spotify have the same effect? My feeling with that is: OK, fine. But part of the mixtape thing was that it was reciprocal: you actually had to meet the person and exchange the tape or mix CD -- give them something. It establishes this system of obligations and mutual exchange that creates a little community. Which is very different from online, where people pass things around but there isn't that mutual obligation. It's more broadcast. Which is fine -- you can broadcast what you like, what you've found and seen -- but it doesn't create a real community.

Does using headphones change the music we listen to?

You would think there would be a response to that -- people have been getting their music through headphones and ear buds for quite a while. I think that music is about the texture and the space that is created in headphones or ear buds. It's not about a big beat -- that's going to sound better in a club or a car than through little tiny headphones. The songs are more about texture and atmosphere. It's an artificial space.

You describe MP3s as "zombie music" and "music in pill form". They've improved! When they first appeared, the music was gutted out. But it's amazing that you can lower the amount of information needed to get something that, perceptually, sounds the same. The illusion has been made pretty much perfect now.

You talk a bit about the science of music, especially neuroscience and psychology. Has research changed your approach to making music? Only a little bit. I'm a little aware of how the audience-performer relationship is an evolutionary thing at work.

In the sense that people come to experience a certain thing, they want to confirm their relationship with their peers, and you're there as a catalyst for that. But it's completely possible that, as a performer, you intuit that sort of thing ahead of time. You don't need neuroscience to tell you that.

Do you think it's useful to look under the bonnet and see what's going on? Like putting people in an MRI scanner and seeing how the brain is stimulated? I'm fascinated. The little bits I've read about it confirmed that music connects to a lot of different centres in the brain. Rather than there being one language centre, one vision centre, music is connecting all these different areas at once. Maybe that's what it's about, as opposed to one central thing. It's a phenomenon that is about networking all these other areas.

How Music Works (Canongate, £22) is published on October 18