Bone Flutes, Pianos, and iPods: Notes on Music, Technology, and Embodiment

Is there any connection in the way that music and technology interacted at the dawn of history and today?

The word “technology” commonly refers to modern inventions — the ubiquitous machines, large and small, that fill our daily lives. Likewise, in the world of music, “technology” is often shorthand for modern methods of sound reproduction. This association is evident in the subtitle of Mark Katz’s emblematic book on the impact of recording — Capturing Sound: How Technology Has Changed Music. But what if we take a broader view of technology, looking beyond our modern era? Doesn’t technology — especially in the realm of music — have a much longer history?

Consider, for instance, what archeologists unearthed in southwestern Germany during the summer of 2008. There, the archeological team uncovered fragments of three Paleolithic-era musical instruments, made from bone and ivory, that resembled the modern flute. One of these bone “flutes”, despite being found in a dozen pieces, was remarkably complete in design. This delicate instrument — not even a foot in length — was carved out of a vulture’s wing bone with stone tools. With five finger holes and a notched end, it was probably blown from the end like a Japanese shakuhachi. The other two flutes were made of mammoth ivory, just like a figurative sculpture found nearby.

Maybe the most important thing about the “Hohle Fels Flutes”, however, was their age. These primitive instruments were created some 35,000 years ago, predating even the wheel by roughly 30 millennia. As the Hohle Fels flutes demonstrate, people have used technology to modify the world since prehistoric times, employing objects and techniques to help them hunt, stay warm, and even create music. The Hohle Fels flutes serve, then, as a powerful reminder that technology began changing music (and changing musicians) many millennia before such modern inventions as the iPod or even the phonograph.

Hohle Fels Flute

Try for a moment to imagine the initial impact that the new sounds emanating from the bone flutes must have had on prehistoric ears. By covering the finger holes in various combinations, a Paleolithic musician could produce an array of pitches — new sounds that supplemented the vocal range of our ancestors. Perhaps just as importantly, these innovations, however primitive the context, not only opened up new ways of thinking about music, but also, at the same time, opened up new ways of touching it.

After all, tactile aspects of instruments are important because the acts of thinking and touching are deeply entwined. As theories of grounded cognition argue, the mind isn’t a metaphysical computer; it can’t be separated from the body. Instead, thought grows out of our embodied abilities. When we see, we only see through the viewpoint afforded by our eyes. When we hear, we only hear the range of frequencies detectable to our ears. Our sense of space is tied to our bodies’ immersion in an environment. What we know, understand, or imagine is related to our senses, to the ways we move in and interact with the world. These kinds of ideas have driven diverse research, such as J.J. Gibson’s ecological psychology or Maurice Merleau-Ponty’s Phenomenology of Perception.

This perspective bears on both technology and music. Tools and instruments don’t just enhance existing abilities. Often, they actually transform the way someone does (or thinks about) an activity. When you use a calculator, for example, you don’t need to do all the math in your head — what you have to do instead is push buttons in the right order. This is a key point from Edwin Hutchins’s seminal work on distributed cognition, which considers “the system of person-in-interaction-with-technology”. Music is full of such systems, in which technical instruments and embodied action influence each other. Instruments, as a rule, facilitate certain kinds of musical actions that the body performs through direct physical contact.

As technology goes, the bone flute is fairly simple, yet already made to plug into a human body. Its holes match up with the hand; its notched end, with the mouth. The body, then, shapes the instrument at the same time the instrument shapes the body. Instrumental music emerges from this resonance between body and tool, marked by both their possibilities and their constraints. A flutist, for instance, must learn special ways of breathing and coordinating the fingers — techniques that are specific to the technology.

* * *

Bartolomeo Cristofori Piano

Obviously, this three-way interaction between music, technology, and the body isn’t unique to the Stone Age. It’s taken various forms in different times and in different places. Consider, for instance, a more recent and familiar example: the piano.

Already, this may feel more “technological”. After all, a piano is a large and intricate machine, a carefully calibrated assemblage of wood and metal parts. Its present design developed over a fairly long period of time. Starting in the early 18th century, Bartolomeo Cristofori experimented with a hammer mechanism that could combine the power of the harpsichord with the dynamic range of the clavichord. While the piano was fairly well established by the 1770s, technical innovations continued throughout the 1800s. By the end of the 19th century, Steinway & Sons had registered nearly 70 patents for their technologically sophisticated instrument.

Initially, pianos were handmade by teams of skilled artisans. In the second half of the 19th century, though, the production of pianos, like so much else at the time, became increasingly industrialized. Prices dropped as more instruments — still of high quality — hit the market, which helped cement the piano’s place in middle-class parlors across Europe and North America.

Along with this boom in piano-building came a boom in music publishing. Through four-handed transcriptions and piano-vocal scores, people could now hear Mozart symphonies or Rossini operas — pieces that usually required big, expensive orchestras — played in the privacy of their own homes. Four-hand arrangements also let pairs of amateurs play difficult pieces that were originally written for a solo pianist. All of Beethoven’s piano sonatas, for example, were rewritten as do-it-yourself duets. As musicologist Thomas Christensen has discussed, such transcriptions both democratized and commercialized music in new ways.

This emerging popular culture, however, depended on specific technical features. The piano’s wide tonal range meant that it could imitate large ensembles. The size and layout of the keyboard meant that two people could play it simultaneously. Still, all this would have been meaningless without the pianists’ embodied knowledge. Though the instruments and sheet music were mass-produced, musical sound only emerged through a musician’s physical labor. Pianists practiced scales and finger exercises, developed a tactile intimacy with the space of the keys, and cultivated the technique necessary to operate this complex musical machine. As with the bone flute, then, the instrument opened itself to the player, while reshaping the player at the same time.

While this particular combination of technology and embodiment produced certain kinds of sounds, it also influenced ideas about music’s cultural significance. In Horace Greeley’s 1872 book, The Great Industries of the United States, the newspaperman-turned-politician claimed that “The piano-forte may be properly declared to have been the most important single influence which has wrought the social change for the better.” Playing the piano in the nineteenth century, therefore, helped construct social bodies, especially for bourgeois women. The physical discipline that musical training honed became a mark of their social distinction, of their gender and class identities — identities which also restricted their performance to the domestic sphere.

Throughout the 20th century, of course, sheet music and the parlor piano were largely displaced by new musical technologies — player pianos, records, radios, tapes, CDs, and, finally, in the 21st century, iPods. But what does the interaction of music, technology, and embodiment look and sound like in the Age of the iPod?

New Potential, New Limits

Even if music has lost its body, we haven’t lost ours.

Within the past decade or so, it may seem that music has suddenly dematerialized. Thanks to lightning-fast internet connections, there is now a global flow of information in which music is a primary commodity — even if it is not always paid for. iTunes and the iPod — both introduced in 2001 — make this particularly clear. When an “album” is purchased online, the buyer does not receive a physical object (nor does the buyer pay with physical money). Instead, the buyer downloads a file that can be played on a computer, transferred to an iPod, or burned to a CD. This music isn’t tied to a particular artifact like an LP. It’s ultimately just a cryptic, matrix-like pattern of ones and zeros that computers can convert into music like any other information-based file.

Even if music has lost its body, we haven’t lost ours. Digital technologies, from MP3 players to musical video games, have clearly changed the way we experience music — but we still experience it in our physicality. So what is the relationship between the iPod and our bodies, in terms of resonant contact between body and tool?

Once again, this technology is made for the human body. Tiny earbud headphones fit snugly into the ears like hand in glove — except in this case, your body is the glove and the earbuds are the hand. And the iPod itself is small enough to tuck into a pocket or wear on an armband, permitting complete mobility. Especially as new models shrink, the iPod almost disappears or blends in as a part of the body. Remember Apple’s iconic ads for the device? They don’t depict the machine on its own; they show exuberant, plugged-in, dancing bodies, waltzing, twirling, and swinging along to their iPods. The ads highlight the physical freedom that the iPod enables, even encourages. Whether you’re jogging around the park, driving down the freeway to work, or flying home on an airplane, a massive library of music, weighing no more than 4.9 ounces, can go with you.

New potential, though, also brings new limits. In Hutchins’s terms, the piano was a relatively “open” technology, visible and audible to everyone in the room. As mentioned earlier, it can even be played by two people. In contrast, the iPod is typically closed. The headphones are tightly integrated with your ears. The music masks the sounds of your environment but isn’t accessible to others around you. Headphone splitters or certain stereos allow multiple people to listen to the same iPod, of course — but usually with some loss of portability or bodily integration.

Compared again to the piano, the iPod (and other media for recorded music) fosters a passive mode of listening. You don’t need to produce a single note, nor do you even need to practice. Your body can practically disengage. But successfully operating an iPod does require certain skills that we can easily take for granted — consider using the click wheel while driving. On a more interesting level, iPods and iTunes guide the ways that we collect and categorize music. Playlists constitute a personal soundtrack that can shape the mood, even the feeling, of exercising, commuting, or anything else we find ourselves doing with our earbuds in. This ultimately contributes to a certain sense of identity — what sociologist Tia DeNora is getting at when she refers to “music as a technology of the self”.

iPods may not develop the motor memory involved with playing scales, but they can still change the way your body feels in the world. In a way, I may be making a somewhat perverse argument that the iPod is really a new musical instrument because it opens up musical possibilities and interacts with the body much the way bone flutes and pianos did. But I don’t want to mistakenly suggest that the bone flute, the piano, and the iPod represent either mutually exclusive moments or a continuous evolution. After all, these three technologies, these ways of knowing and touching music, coexist today. I have both a piano and an iPod. Even if I don’t play the bone flute, some people do. Funnily enough, the easiest way to hear this prehistoric sound is to look it up online. This kind of technological coexistence, of course, isn’t unique to music. I could have written this article with a pen, a typewriter, or a computer.

The future will doubtless bring other musical technologies, with new possibilities and new limitations — as well as unexpected uses for existing devices. The idea of iPod as instrument might not seem so strange in a few years. It would not be the first technology to blur the line between producing and reproducing music — just look at turntablism or even laptop music. The experiments, actually, have already begun. Ben Smith, a fiddler and composer of electroacoustic music, has started performing on a specially programmed iPod Touch. By tapping, tilting, and shaking it, he controls various sonic parameters, transforming the live input of his fellow musicians. (In a recent performance, Synchronic Shades, Smith’s collaborator was saxophonist/composer Shawn Allison, who has posted a recording on his website.) While this kind of interactive processing has been around for some time, it has usually involved someone sitting at a computer. Smith’s set-up, on the other hand, permits greater mobility. It turns the technician into another instrumentalist, making music through “performative gestures”. As the audience watches Smith touch the device and move it in real space, they listen to him physically manipulating sound through the iPod. And while at first glance, this might seem peculiar to us, perhaps the caveman’s initial experiments with skeletal leftovers once seemed so, too.

From Paleolithic flutes, to pianos, to iPods, the central point remains: just as the mind isn’t free from the body, music isn’t free from its material base. And, with the exception of certain vocal music, this base involves a resonant interface of tool and user, instrument and musician, machine and organism, technology and technique. In other words, the iPod listener, the pianist, even the prehistoric flutist, are all a similar kind of cyborg.

Jonathan De Souza grew up in Ontario, Canada, where he learned to play violin, viola, guitar, piano, mandolin, saxophone, concertina… He’s currently doing a PhD in music theory and history at the University of Chicago. Besides instruments and embodiment, Jonathan works on post-WWII experimental music and movie musicals about cowboys (though not at the same time). He also plays jazz, folk, and (sometimes) classical music around the Chicago area.