It was a pleasure to be on the Field & Foley podcast; we discussed field recording, game audio, experimental music, and my philosophy that there are no solid boundaries between any of them. Listen here!
Want a challenge? Try to play back interface sounds on the show floor at CES. [Intel Booth, CES 2012.]
For those who might not know, for the last decade I’ve earned my living designing digital installations: multi-touch interactive walls, interactive projection mapping, gestural interfaces for museum exhibits, that sort of thing. Sometimes these things have sound, other times they don’t. When these digital experiences are sonified, regardless of whether they’re imparting information in a corporate lobby or entertaining inside a museum, clients always want something musical over something abstract, something tonal over something mechanical or atonal.
In my experience, there are several reasons for this. [All photos in this post are projects that I creative-directed and created sound for while I was the Design Director of Stimulant.]
Expectations and Existing Devices
It’s what people expect out of computing devices. The computing devices that surround me almost all use musical tones for feedback or information, from the Roomba to the Xbox to Windows to my microwave. It could be synthesized waveforms or audio-file playback, depending on the device, but the “language” of computing interfaces in the real world has been primarily musical, or at least tonal/chromatic. This winds up being a client expectation, even though the things I design tend not to look like any computer one uses at home or work.
Yes, I strapped wireless lavs to my Roomba. The things I do for science.
Devices all around us also use musical tropes for positive and negative message conveyance. From Roombas to Samsung dishwashers, tones rising in pitch within a major key or resolving to a 3rd, 5th, or full octave are used to convey positive status or a message of success. Falling tones within a minor key or resolving to odd intervals are used to convey negative status or a message of failure. These cues, of course, are entirely culture-specific, but they’re used with great frequency.
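To make this concrete, here is a minimal Python/numpy sketch of those conventions: a rising major arpeggio that resolves up an octave for “success,” and a falling gesture toward flatter intervals for “failure.” The specific notes, durations, and file names are my own illustrative assumptions, not values taken from any shipping device.

```python
# A minimal sketch of the "rising = success, falling = failure" convention
# described above. Pure numpy sine synthesis; the note choices and timing
# are illustrative assumptions, not values from any real product.
import numpy as np
import wave

SR = 44100  # sample rate in Hz

def tone(freq_hz, dur_s, sr=SR):
    """A sine tone with short fades to avoid clicks."""
    t = np.linspace(0, dur_s, int(sr * dur_s), endpoint=False)
    sig = np.sin(2 * np.pi * freq_hz * t)
    fade = int(0.01 * sr)  # 10 ms fade in/out
    env = np.ones_like(sig)
    env[:fade] = np.linspace(0, 1, fade)
    env[-fade:] = np.linspace(1, 0, fade)
    return sig * env

def sequence(freqs, dur_s=0.12):
    """Play the given pitches back to back."""
    return np.concatenate([tone(f, dur_s) for f in freqs])

# "Success": rising through a major triad, resolving up an octave (C5 E5 G5 C6).
success = sequence([523.25, 659.25, 783.99, 1046.50])
# "Failure": falling gesture toward flatter intervals (C5 -> Ab4 -> F#4).
failure = sequence([523.25, 415.30, 369.99])

def write_wav(path, sig, sr=SR):
    """Write a mono float signal as 16-bit PCM."""
    data = (np.clip(sig, -1, 1) * 32767).astype(np.int16)
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(sr)
        w.writeframes(data.tobytes())

write_wav("success.wav", success)
write_wav("failure.wav", failure)
```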
The only times I’ve ever heard non-musical, genuinely annoying sound, it has been very much on purpose and always to indicate extremely dire situations. The home fire alarm is maybe the ultimate example, as are klaxons at military or utility installations. Trying to save lives is when you need people’s attention above all else. However, excessive use of such techniques can lead to alarm fatigue, which is a deep topic for another day. Do you really want a nuclear engineer to turn a warning sound off because it triggers too often?
The Problem with Science Fiction
Science fiction interface sounds often don’t translate well into real-world usage.
This prototype “factory of the future” had to have its sound design elevated over the sounds of compressors and feeders to ensure zero defects…and had to not annoy machine operators, day in and day out. [GlaxoSmithKline, London, England]
My day job has been inextricably linked to science fiction: the visions of computing devices and interfaces shown in films like Blade Runner, The Matrix, Minority Report, Oblivion, and Iron Man, and even television shows like CSI, set the stage for what our culture (i.e., my client) sees as future-thinking interface design. (There’s even a book about this topic.) People think transparent screens look cool, when in reality they’re a cinematic conceit so that we can see more of the actors, their emotions, and their movement. These are not real devices – they, and the sounds they make, are props to support a story.
Audio for these cinematic interfaces – what Mark Coleran termed FUI, or Fantasy User Interfaces – may be atonal or abstract so that it doesn’t fight with the musical soundtrack of the film. If such designs are musical, they’re more about timbre than pitch, more Autechre than Arvo Pärt. This just isn’t a consideration in most real-world scenarios.
Listener Fatigue
Digital installations are not always destinations unto themselves. They are often located in places of transition, like lobbies or hallways.
I’ve designed several digital experiences for lobbies, and there’s always one group of stakeholders that I need to be aware of, but that my own clients don’t bring to the table: the front desk and/or security staff. They’re the only people who have to live with this thing all day, every day, unlike visitors or other employees who’ll be with a lobby touchwall for only a few moments during the day. Annoy these lobby workers and you can be guaranteed that all sound will be turned off: they’ll unplug the audio interface from the PC powering the installation, or turn the PC volume to zero.
This lobby installation started with abstract chirps, bloops, and blurps, but became quite musical after the client felt the sci-fi sounds were far too alienating. Many randomized variations of sounds were created to lessen listener fatigue. There was also one sound channel per screen, across five screens. [Quintiles corporate lobby, Raleigh NC]
Music tends to be less fatiguing than atonal sound effects, in my experience, and triggers parts of the brain that evoke emotions rather than instinctual reactions (in ways that neuroscience is still struggling to understand). More specifically, sounds without harsh transients and with relatively slow attacks are more calming.
Randomized and parameterized/procedural sounds really help with listener fatigue as well. If you’re in game audio, you know these techniques: the tools used in first- and third-person games to vary footsteps and gunshots are just as important for creating everyday interface sounds that don’t get stale and annoying.
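As a rough illustration of both points – gentle attacks and per-trigger variation – here is a small numpy sketch that randomizes the pitch and level of a base sound and softens its onset each time it’s triggered. The variation ranges and attack time are arbitrary assumptions, not numbers from any particular middleware.

```python
# A rough sketch of per-trigger variation: each playback gets a slightly
# different pitch and level, plus a gentle attack so there's no harsh
# transient. The ranges here are arbitrary assumptions.
import numpy as np

rng = np.random.default_rng()

def vary(sample, sr=44100, pitch_cents=60, gain_db=3, attack_s=0.03):
    """Return a randomized variation of a mono float sample."""
    # Random pitch shift by simple resampling (+/- pitch_cents).
    cents = rng.uniform(-pitch_cents, pitch_cents)
    ratio = 2 ** (cents / 1200.0)
    src = np.arange(len(sample))
    dst = np.arange(0, len(sample), ratio)
    out = np.interp(dst, src, sample)

    # Random gain (+/- gain_db).
    db = rng.uniform(-gain_db, gain_db)
    out *= 10 ** (db / 20.0)

    # Slow-ish attack envelope to soften the onset.
    a = min(int(attack_s * sr), len(out))
    out[:a] *= np.linspace(0.0, 1.0, a)
    return out

# Usage: trigger the same UI sound many times without it becoming grating.
base = np.sin(2 * np.pi * 880 * np.linspace(0, 0.25, int(44100 * 0.25)))
variations = [vary(base) for _ in range(8)]
```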
The Environment
Another reality is that our digital experiences are often installed in acoustically bright spaces, and technical-sounding effects with sharp transients can really bounce around untreated spaces…especially since many corporate lobbies are multi-story interior atriums! A grab bag of ideas has evolved from my years of designing sounds for such environments.
This installation had no sound at all, despite our best attempts and deepest desires. The environment was too tall, too acoustically bright, and too loud. Sometimes it just doesn’t work. [Genentech, South San Francisco, CA]
Many clients ask for directional speakers, which come with three big caveats. First, they are never as directional as the specifications indicate. A few work well, but many don’t, so caveat emptor (they also come with mounting challenges). Second, their frequency response graphs look like broken combs, partly a function of how they work, so you can’t expect smooth reproduction of all sound. Finally, most are tuned to the human voice, so musical sound reproduction is not only compromised sonically, but anything lower than about 1 kHz starts to bleed out of the specified sound cone. That’s just physics – not much will stop low-frequency sound waves except large air gaps with insulation on both sides.
The only consistently effective trick I’ve found for creating sounds that punch through significant background noise is rising or falling pitch, which lends itself nicely to musical tones that ascend or descend. Most background noise tends to be pretty steady-state, so this can help a sound punch through the environmental “mix.”
One cool trick is to sample the room tone and make the sounds in the same key as the ambient fundamental – it might not be a formal scale, but the intervals will literally be in harmony with one another.
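Here is a hedged sketch of how one might do that in practice, combined with the rising-sweep idea above: estimate the strongest low-frequency component of a room-tone recording, snap it to the nearest equal-tempered pitch, and build a short ascending tone from that root. It assumes you’ve already captured the room tone as a mono numpy array; the synthetic hum below is just a stand-in for a real recording.

```python
# Sketch of the room-tone trick: find the strongest low-frequency component
# of an ambient recording, snap it to the nearest 12-TET pitch, and build a
# rising "attention" sweep (per the punch-through idea above) on that root.
import numpy as np

SR = 48000

def estimate_fundamental(sig, sr=SR, lo_hz=40, hi_hz=500):
    """Return the frequency of the strongest spectral peak in [lo_hz, hi_hz]."""
    spectrum = np.abs(np.fft.rfft(sig * np.hanning(len(sig))))
    freqs = np.fft.rfftfreq(len(sig), 1.0 / sr)
    band = (freqs >= lo_hz) & (freqs <= hi_hz)
    return freqs[band][np.argmax(spectrum[band])]

def nearest_note_hz(freq_hz):
    """Snap a frequency to the nearest equal-tempered pitch (A4 = 440 Hz)."""
    midi = round(69 + 12 * np.log2(freq_hz / 440.0))
    return 440.0 * 2 ** ((midi - 69) / 12.0)

# Stand-in for a real room-tone recording: a low hum plus noise.
t = np.arange(SR * 2) / SR
room = 0.2 * np.sin(2 * np.pi * 119.0 * t) + 0.05 * np.random.randn(len(t))

root = nearest_note_hz(estimate_fundamental(room))

# A rising sweep from the root up a fifth, to cut through steady background noise.
dur = 0.5
tt = np.arange(int(SR * dur)) / SR
sweep_freq = root * (1.5 ** (tt / dur))         # exponential glide, root -> fifth
phase = 2 * np.pi * np.cumsum(sweep_freq) / SR  # integrate frequency for phase
sweep = 0.5 * np.sin(phase)
```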
Broadband background noise can often mask other sounds, making them harder to hear. In fact, having the audio masked by background noise when you’re not right in front of the installation can be a really good thing. I did a corporate lobby project where an always-running water feature sat right behind the installation we created; since it was basically a white noise generator, it completely masked the interface’s audio for passersby, which kept the security desk staff much happier and kept the installation from intruding on the sonic landscape for the casual visitor or everyday employee.
Music, Music Everywhere
Of course, sometimes an installation is meant to actually create music! This was the first interactive multi-user instrument for Microsoft Surface, a grid sequencer that let up to four people play music.
These considerations require equal parts composition and sound design, plus a pinch of human-centered design and empathy. It’s a fun challenge, different from sound design for traditional linear media, which usually focuses on being strictly representative or on re-contextualized sounds recorded from the real world. Listen to the devices around you in real life and notice how frequently (pun intended) musical interface sounds turn up. If you have experiences and lessons from doing this type of work yourself, please share in the comments below.
Posted: November 20th, 2012 | Author: Nathan | Filed under: theory
Randy Coppinger recently wrote a great blog post about not getting caught up in tool choices or being dogmatic about approach, and just focusing on the problem at hand, tools be damned. I couldn’t agree more.
This reminded me of a corollary to Randy’s thesis: The tools we use shape what we create.
While that’s perhaps self-evident, it’s not always a good thing. Tools can be creatively inspiring, but they can also become handcuffs, blinding us to better ways of working. Or, even more sinister, subtly changing our work into something other than what we intended.
If you’re a woodworker, what you can do without a lathe is pretty limited. Or without a jigsaw. Or a coping saw. If you have all of these things, your creativity is freed from many restrictions, since you’ve removed constraints on your ability to execute your ideas. Such restrictions can really get in the way of earning a living in a crowded marketplace. A car mechanic, for example, won’t get a lot of work if he doesn’t have a hydraulic lift and an impact wrench.
So it goes with digital creative professionals. A visual designer these days can’t operate without a computer, and it’s a tough life without Adobe Photoshop. An audio professional can’t work without a DAW, and it’s hard to be competitive if you don’t know ProTools reasonably well. A field recordist needs microphones. And so it goes.
But microphones or recording techniques can have certain tonal characteristics, just like how raster artwork (e.g., created in Photoshop) looks different than vector artwork (e.g., created in Illustrator). It’s important to realize that the tool choices we do make aren’t always going to be neutral. Every tool choice imparts some color to our output. Rendering one’s idea in charcoal will be emotionally quite different than rendering it in pencils. Different sizes and shapes of chisels affect the texture of a sculpture. Capturing a sound with a Rode NT1a will sound different than with a Neumann TLM-103, even on the same material and from the same perspective, yielding different emotions and tones when listened to.
None of us can afford a warehouse of infinite tools…the Tardis tool shed doesn’t exist. But neither is poverty an excuse to be unaware of your tools’ influence on your work. Knowing the attributes of your microphones tells you how you might want to modify and sculpt your recordings in post, just as a woodworker might use fine-grit sandpaper by hand for those last touches that give a piece the personality of the artist, not just the texture of his or her tool’s “fingerprint.” A person on a limited budget with constrained equipment can achieve greatness by adding tons of knowledge and insight. A metalworker with limited tools might only be able to create things of a certain scale, just as a limited-resource recordist might pick only certain subjects to record due to the limitations of his or her kit.
Randy’s point is one that I absolutely agree with: Properly frame the problem and establish a conceptual framework for solving it, and let that dictate the tools you use. Don’t always rely on the old standards. Expanding this line of thinking, however, forces you to also look long and hard at the tools you use. Always be slightly suspicious of your equipment, which influences and colors what you create. That can be wonderful and enhance the source material. Or horribly inappropriate and lose the character of the original. But there’s a big difference between being conscious of those differences and being blind to their influence on your work. That way lies ambivalence, which I’ve written on before.
A common phrase on Jeff Wexler’s production sound forum regarding equipment versus the user goes something like, “It’s not the arrows, it’s the archer.” Knowing your arrows’ quirks lets you play with the results. And that’s where a creative professional moves from being an informed craftsperson to becoming an empowered artist.
Posted: February 3rd, 2011 | Author: Nathan | Filed under: theory
The audience says "meh" when you say "meh." Image: Arch Wear/Zazzle.
When you’re creating something, nothing kills it faster than ambivalence.
I’m not talking about ambiguity. When the viewer or listener comes to your work, it’s OK to be ambiguous. The best art and design only goes halfway: the viewers themselves must step up to the work and actively engage with it (or be engaged by it) for it to leave a significant emotional impact.
This is where a lot of abstract art fails. Too much mystery with too little to draw emotional interest can render the piece inaccessible even to willing viewers, a reaction that many have to the works of Rothko and Pollock, and even the much-maligned Wolff Olins Olympic logo design. Music can do this, too, when compositions are too abstract and even alienating, whether it’s some of the later works of Autechre or the atonal and complex works of Ligeti. But by leaving a few things tantalizingly uncommunicated, the audience can really engage their senses and curiosity to form a lasting impression which they, themselves, have helped create.
Ambivalence doesn’t lie in the work, or in the audience…it comes from the maker of the work. Ambivalence can be the result of making arbitrary decisions for the sake of being “done.” It can also come from facing an issue with the work and ignoring it, or punting on it for later and never circling back to it.
In sound, ambivalence often comes from not taking a stand on big issues, like representation versus abstraction. If one scene has a mix of both very literal and very abstract sounds, the viewer may not understand what emotional state the characters are in. A confusing mix of diegetic and non-diegetic sounds – a classic snafu by sound designers who are driven by the coolest sounds, not the most appropriate sounds – can seriously muddle the narrative message. In game design, this can mean the difference between a player being properly oriented in the game world and misinterpreting sound cues in ways that lead to poor in-game decisions.
You can recognize ambivalence when you say, “tsk, I guess this will be OK.” You can anticipate ambivalence when you hit the point of, “we’ll circle back on this sound later in the mix and see if we can make it better,” but it never happens. You can smell ambivalence when working with clients, producers, or directors who don’t have clear visions for the emotional content of certain moments.
How does one fight ambivalence? One makes a stand. One analyzes the context of the design problems, and creates a framework, theory, or design approach that all decisions can refer back to. One digs one’s heels in and says, “For this use, and in this context, this approach feels emotionally right, for these reasons, and all aural decisions should be based on this framework.”
It’s not all bad news if this decision-making framework fails to produce the right results. If it doesn’t solve the problem, you at least know it’s the core thinking that’s flawed, not the specific sounds you chose. It’s how you’re using the sounds that’s the problem. The great thing about discovering that level of failure is that you can revisit the highest level of the problem and discuss it…this keeps the discussion at a more strategic level, which will help to prevent the client(s) from micro-managing the actual sound design and implementation process. That’s where your expertise comes in, and is most relevant.
Sure, it’s important to be right. But I think it’s more important to have an opinion, early and forcefully, even if it doesn’t work out. Fail early and often, as so many creative professionals suggest. Get your co-workers and clients used to evaluating your approach and thinking rather than the nitty-gritty details of implementation. The former can help “scaffold” your decisions as you revise, whereas critiquing only the latter may never resolve the core issues of how sound can support the visual narrative.
Being wrong is better than being ambivalent…as long as you do so early enough that you can reframe the problem and course-correct before the due date.
Posted: November 9th, 2010 | Author: Nathan | Filed under: theory
"Done" can take a while. But what does "done" mean?
When is something done? Completed? Finished?
It was bad enough when we lived in an era of cutting tape, using paint, or processing film, when sometimes the medium could only support a certain amount of additions or revisions. Today, we live in a digital era of unlimited choices, multiple undo’s, and possibly infinite iteration. How and where can a creative professional call a piece finished? As they say, “Perfect is the enemy of ‘done.'”
I was an art major in college, and this topic came up a lot in critiques and discussions. I think part of the challenge is how we, as artists and designers and engineers, emotionally interpret “finished” as a word. Even when it’s the deadline that tells us something must be finished, we need to choose our battles over what needs more or less attention.
One phrase from one of my professors has stuck with me ever since: you’re done when the piece is largely “resolved.”
I like this phrase because it reframes the question of “done” into a context of problems that need solving, or issues that require resolution. Are the frequencies sufficiently different to prevent muddiness or masking? Is the color in this photo helping to tell the story of the moment? Are these paint strokes in this corner supporting the tone and balance of the piece? When you ask yourself what remains unresolved, you stop flailing around for vague emotional cues of “done-ness” and you start asking the hard questions of what is and isn’t succeeding in supporting the core story or message you’re trying to convey.
It’s easy to wonk out into semantic arguments around the meaning of “done,” so don’t let that become a procrastinating strategy! Ask yourself, in whatever words you like, if the piece is self-consistent, self-evident, unambiguous, and seems resolved. That simple act can help clear up a lot of personal and aesthetic confusion.
Posted: June 8th, 2010 | Author: Nathan | Filed under: theory
Understanding your relationship, or lack thereof, to your body can lead to creative insights.
Denial of the Physical
In 2005, I heard a 1993 radio interview with Frank Conroy, a now-deceased fiction writer, in which he described how he preferred writing in bed. He spoke with another author who did this as well. Nelson’s own take was that he wrote best when his mind flowed without concern for his surroundings. Staying in bed was a strategy to disconnect his brain from his body to facilitate creative flow. The less he was aware of his body, the more his mind could reel and wander.
While Conroy might have had a self-destructive streak, something about this insight seemed familiar. I started to notice my own patterns and methods for staying creative and generating ideas, and realized that, indeed, the idea of disassociating the body from the mind is something that I also do.
Over the years, I’ve learned to obey these rhythms and how to use them in order to stay creative in a deadline-centered world.
Last month, an international group of eight members of the online sound community (myself included) attended a “beta test” webinar with David Sonnenschein, and it was excellent. The format will be about half lecture and half discussion of attendees’ work, submitted beforehand. I think the opportunity to learn more about how the human brain interprets audio is essential for anyone involved in music or sound, just as the study of visual perception is paramount to visual and interaction design. This class will focus on taking theory and making it practical in one’s work.
Here’s David’s own description of this webinar.
SEMINAR TOPIC: PSYCHOACOUSTIC TOOLS FOR CREATIVITY
Do you desire to produce really effective soundtracks that reach your audience through neurobiological resonance, tapping into how they subconsciously perceive the world through sound? Would you like more access to your own brain power for finding innovative approaches and solutions? Every professional sound designer can benefit from understanding and experiencing the science of sonic storytelling. In this seminar we will explore the neurobiology and psychology of hearing and how these underlying principles can support creative sound design.
WHAT WE’LL DO
In the second half of each unique 2-hour seminar, David will screen, analyze and discuss video clips pre-selected from submissions by the participants (max. 5 min., 100mb file size). If you have something ready or a work-in-progress, send info on the genre, length and any particular area of sound that you’d like to discuss, to dsonn22@gmail.com.
It’s a steal, too: You get access to one of the best minds on sound design for US$40. It’s limited to 25 people, so definitely sign up soon. If you’re active on socialsounddesign.com (see my earlier post about this awesome community), you’ll probably recognize a lot of peeps in the class.
To paraphrase our state’s governor in Predator: DOOO EET! GET TO DA CHOPPAH!
The title of this article isn’t what you think it is.
You can’t shop for electronics or technology without hearing “prosumer.” People assume this portmanteau is a contraction of “professional” and “consumer” – but only because marketing wonks have made it so.
That is neither its original meaning, nor the topic of this post.
The term was coined in Alvin Toffler’s book The Third Wave as a contraction of “producer” and “consumer,” predicting the merging of the roles of consumption and production in the life of a single individual, primarily through the customization of mass-produced objects and the creation of highly specialized products. That is, person A makes widget X and sells it to person B, who makes widget Y, which person A, in turn, buys…it’s a massively networked set of cottage industries. This trend has exploded in the last decade. When Wired writes about micro-manufacturing and “no more factories,” we’ve probably arrived at a prosumer tipping point.
That, dear friends, is what this post is about. And yes, this is audio-related. Chances are, this article is probably about you, too.
I learned a long time ago to share my mistakes with others. It keeps me humble, and reaches two groups of people: Those more experienced than me who can help correct my errors, and those who might not have tread these waters before and who can learn from my experiences.
Which brings us to today’s post: recording ambiences using a pair of miniature omnidirectional microphones in boundary layer mounts. I learned a ton doing this, but the end results weren’t great. Today we’ll talk about what I accomplished and why it might not have worked out as well as I had hoped.
After my recent post on urban ambiences, I decided to record some fresh ambiences with a pair of DPA 4060 microphones, using two techniques I hadn’t tried before: a spaced stereo pair and boundary-layer mounting.
The deeper I get into field recording (or, as some call it, “phonography”), the more the parallels between it and how I approach photography come into focus. I feel that the differences between these media, while significant, are outnumbered by the similarities.
Photographs are single moments in time and sound is a stream of realtime stimuli. They reach different senses. One could argue that cinematography has more in common with audio recording, in terms of perception of the media over time, but I think that’s more in how it is consumed than how it’s captured.
I actually find that the physical activity and workflow of field recording, specifically, are conducted far more like nature photography. The two activities are also among the few digital creative endeavors that can actually yield some degree of exercise!
For me, the parallels between field recording and photography are most visible in fieldwork, temporal abstraction, recontextualization, and introspection.