By Michael Worth
Epic battles between armies. Intense chases through the far reaches of space. Gritty, “up close” combat between two mighty warriors. These are some of the backbones of great video game experiences. And, of course, the music has to support that gameplay element. There is nothing more exciting than jumping into a boss fight, accompanied by your own larger-than-life soundtrack.
As game designers, we love video game action music. Bombastic brass blasts. Walls of drums pounding away. Choirs chanting in obscure tongues. Yep, modern orchestral action music is about the biggest adrenaline rush you can put in your game (at least from an audio standpoint). However, music that sounds good by itself and music that works inside the game are often two totally different things, and it’s important to be able to choose and implement music that supports the gameplay and fits in with all of the other audio information flying around the game.
A good way to think about how audio fills up a game is to imagine a blank canvas on which you paint. Since the canvas is blank, the whole surface is white. Now, imagine you start painting on the canvas with black paint. Once you’ve painted an area, you really can’t add any more to it; it’s already saturated. Now, if we imagine that the lower portion of the canvas is where the low sound frequencies live, and the upper portion is where the high sound frequencies live, then once we paint into an area (meaning, there is sound in that region), that area is “filled up”, and we cannot add any more paint/sound to it.
How does this work in a game? Well, take that blank canvas. At the bottom, write “low frequencies”; at the top, write “high frequencies”. Sound exists as frequencies—the higher the pitch/note, the higher the frequency. Now, scribble a whole mess of black on the bottom of the canvas. That’s the equivalent of a lot of low frequencies being emitted by some audio event. You can see that there’s only so much white space in which to put frequencies. Once a section of your frequency canvas is colored in, you really can’t add any more color to that area.
When the composer puts heavy, deep drums, synthesizers, and brass into the score, those frequencies directly compete with the frequencies of gunfire, explosions, and damage impacts. This has a number of adverse effects. First, the music masks the sound effects, and the sound effects mask the music. Gunfire, explosions, and damage impacts are critical for conveying combat information to the player, so the game becomes less effective at helping the player read his game state. Bad Juju!!
Second, it means that your speakers are getting an immense burst of sound at specific frequencies, which can overload the speakers, resulting in ugly, distorted sound, and possibly even damaging the gamer’s sound system.
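To put a rough number on that competition, here’s a small Python sketch (my own illustration, not something from the article) that measures what fraction of a sound’s spectral energy falls in a given band. The drum and explosion stand-ins are made-up pure tones, but they show the point: both land almost entirely below 200 Hz, on the same patch of canvas, so whichever is louder simply paints over the other.

```python
# Illustrative sketch (my own, not from the article): measure what fraction
# of a sound's spectral energy falls in a given frequency band.
import numpy as np

SR = 44100                  # sample rate in Hz
t = np.arange(SR) / SR      # one second of samples

def band_energy(signal, lo, hi, sr=SR):
    """Fraction of the signal's spectral energy between lo and hi Hz."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), 1 / sr)
    return spectrum[(freqs >= lo) & (freqs < hi)].sum() / spectrum.sum()

# Toy stand-ins: a deep "epic" drum at 60 Hz, an explosion rumble at 80 Hz.
drum      = np.sin(2 * np.pi * 60 * t)
explosion = np.sin(2 * np.pi * 80 * t)

# Both live almost entirely below 200 Hz -- the same patch of canvas.
print(band_energy(drum, 0, 200))       # ~1.0
print(band_energy(explosion, 0, 200))  # ~1.0
```

Real drums and explosions are broadband, not single tones, but their energy still concentrates in this same low region, which is exactly why they fight each other.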
Now, the first solution a game producer would reach for is to turn down the music. Simple and effective. However, that means all the great music you spent money on is now not being heard. Additionally, the emotional momentum you’re trying to create dissipates, as your powerful music suddenly gets masked by the sound effects, which is jarring to the player.
For example, here (http://www.youtube.com/watch?v=be-JaNB0M60) you’ll initially hear some great, tribal orchestral music, with low drums and beefy brass. It sounds very epic and exciting. Now, listen to the section from :29 to about 1:10. The minute your troops engage in combat, several low spell, damage, and explosion sound effects come into the game. What happens to the music? It’s almost completely obscured. Why? Going back to the canvas analogy, the music existed in that low area of the canvas. When the sound effects came in, they painted that entire low section black. There was simply no more room for the music to speak through, so the music got “painted over” by the more important sound effects. This is a classic example of a great score that sounds great on its own, but runs into trouble when implemented in game, against sound effects.
An alternate solution (and one that I personally like) is to borrow from the Hollywood model and ask your composer to write music that avoids the frequencies of the sound FX. Simply put, if you have lots of gunplay, explosions, booms, and low frequencies, ask your composer to avoid low frequencies in his writing. That usually means avoiding low, boomy drums, low brass, and low synthesizers. Now, the music your composer writes may sound “light” by itself, but when put into the game, against the sound effects, it will fill out the game’s sound beautifully and allow the sound effects to speak clearly and powerfully. Again, using the canvas analogy, you’re coloring in the low part of the canvas (low frequencies) with your combat sound effects, and filling out the mid and high portions (mid and high frequencies) with music. The result? A more full, balanced, and appealing “painting of sound” for the gamer.
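In mixing terms, this “stay out of the low end” request behaves like a high-pass filter applied at the composition stage. As a hedged sketch (the 300 Hz cutoff and the toy signals are my own made-up examples, not anything prescribed in the article), here’s a simple first-order high-pass that carves the low band out of a toy “music” signal, leaving that region free for the sound effects:

```python
# Hedged sketch (my own illustration): carve the low band out of a toy
# "music" signal with a first-order high-pass, leaving room for the SFX.
import numpy as np

SR = 44100
t = np.arange(SR) / SR

def band_energy(signal, lo, hi, sr=SR):
    """Fraction of the signal's spectral energy between lo and hi Hz."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), 1 / sr)
    return spectrum[(freqs >= lo) & (freqs < hi)].sum() / spectrum.sum()

def high_pass(signal, cutoff_hz, sr=SR):
    """First-order RC high-pass filter (gentle 6 dB/octave roll-off)."""
    rc = 1.0 / (2 * np.pi * cutoff_hz)
    dt = 1.0 / sr
    alpha = rc / (rc + dt)
    out = np.empty_like(signal)
    out[0] = signal[0]
    for i in range(1, len(signal)):
        out[i] = alpha * (out[i - 1] + signal[i] - signal[i - 1])
    return out

# Toy "music": a 70 Hz bass line plus an 800 Hz melody line.
music  = np.sin(2 * np.pi * 70 * t) + np.sin(2 * np.pi * 800 * t)
carved = high_pass(music, 300)   # 300 Hz cutoff is an arbitrary example

print(band_energy(music, 0, 200))   # about half the energy sits in the low band
print(band_energy(carved, 0, 200))  # far less after the high-pass
```

Of course, the article’s real point is to have the composer avoid writing those low parts in the first place, rather than filtering them out afterward; the filter just makes the spectral “carving” visible in numbers.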
In The Empire Strikes Back (http://www.youtube.com/watch?v=wdPPht17jjs), music plays while the Millennium Falcon flees, pursued and fired upon by TIE fighters. Notice how the music has almost no low drum and brass sounds; it’s basically a bunch of strings chugging along, with higher brass and winds accenting parts of the cue. Now, lay that against the low engine sounds and laser fire in the movie, and it perfectly rounds out the audio experience while maintaining that sense of excitement and urgency.
To bring in a game example, notice the action music at about two minutes, when you begin flying with Leonardo da Vinci’s device (http://www.youtube.com/watch?v=W8Qz9ah8cKQ). The music sits entirely in the middle and high frequencies, allowing the fireworks, the wind rushes, and the combat hits and impacts to sound clearly. Again, the composer is clearly thinking about how his music will fit on the canvas, and avoids the areas that will be “painted in” with dialogue and sound effects.
So, as you are developing your upcoming title, and you talk with your audio developer about providing music, try to develop the music from a “canvas” standpoint, and ask your composer to write music that avoids the sound effect frequencies that you are building in the game. You will be pleasantly surprised to hear and feel the difference!
Mike Worth is a composer and Emmy award-winning orchestrator, and the audio director of Play Eternal, a Philadelphia-based studio developing games for digital distribution on consoles. As a composer, Mike has written original music for feature films such as Rush Hour 3 and Ivory. In television, Mike has written music for MTV, VH-1, NBC, and Comedy Central, and has received an Emmy award for orchestrations and MIDI orchestrations for Nickelodeon’s The Wonder Pets. Mike’s video game credits include sound design for the upcoming game Warlords: Battlegrounds!, sound design for the XNA title Fittest, and music for the upcoming iPhone game, Kirin Wars. Mike is also active as a game audio evangelist and speaker, and has spoken about game audio at the University of Pennsylvania, the Philadelphia Grammy Foundation, the 2009 Northeast G.A.N.G. Summit, and the 2009 GameX Industry Summit. Mike was the 2008 Chair of Game Audio Education for the IASIG, and the 2009 Audio SIG Chair for the IGDA. He lives outside of Philadelphia with his wife, Sarah, and daughter, Katharine.