
A Primer on Creating Interactive Music for Games


As the game industry continues to grow and change, composers and musicians are increasingly interested in learning more about 'how to break into the game industry'. What many novice composers and musicians do not realize is how sophisticated and involved the process of writing music for games has become.

In the very early days of games, a title might have had a single-line melody, played during the menu or at the end of a level. In those days, the music and sound FX were often composed (and programmed) by one person.

Thin Ice for Intellivision - composed by George "The Fat Man" Sanger

While revolutionary at the time, this type of music implementation has an obvious problem: the tune gets repetitive very quickly, and may even cause the player (or the parents) to mute the audio altogether.

There are some sectors of the game industry that still rely on the one-man-band approach - hiring one audio person to provide all the sound assets for a game, including sound effects, voiceover, music, and integration. Games for mobile phones, smaller web-based games, and some social-network games might fall into this category. While the quality bar has risen significantly and the available technology has given these individuals a lot more flexibility, this approach is still very limiting to both the composer and the music.

However, more and more developers are coming to realize the real return on investment for good audio in their games, and so are hiring audio specialists for each different aspect of the game. From iOS games to Xbox Live all the way up to AAA PC and console games, there is an increasing need for highly skilled, clearly defined job roles in game audio.

With this in mind, we're going to take a look specifically at the concept of interactive music for games.


What Is Interactive Music?

Interactive music can be loosely defined as "music which is composed and implemented to respond to particular 'inputs' during gameplay." Perhaps the simplest demonstration of this technique is the classic use of a unique musical theme assigned to specific levels in the game world.

Super Mario Brothers Level 1-1

Many of us will remember fondly the extremely catchy tunes played during the various levels of games like The Legend of Zelda and Super Mario Brothers, or even those from earlier home computers such as the Commodore 64. These games often had unique themes for the Overworld and Underworld, for Dungeons and Boss Battles. However, Super Mario Brothers also took things a step further by playing a unique theme when you collected a star (which granted invincibility) and increasing the music's tempo when you were running out of time.

By 1985, then, interactive game music had already reached a relatively high level of sophistication, accounting for player state, game area, and so forth. That said, if you spend a lot of time exploring that first level, you may still quickly tire of the theme.


Avoid Repetition

In a recent presentation at the Develop conference in Brighton, Game Composer Jason Graves explained that his three guidelines for composing game music were: "repetition = certain death, repetition = certain death, repetition = certain death..." Indeed it seems that while the primary purpose of interactive music is to underscore and emphasize (and occasionally influence) the gameplay, it is equally important to avoid too much repetition.

With this in mind, we could propose that some of the primary goals of interactive music in games are to:

  • underscore gameplay
  • emotionally emphasize key elements of gameplay
  • influence the player's decisions where necessary
  • avoid listener fatigue.

To achieve the goals above, let's examine some interactive music techniques used in today's games.


Interactive Music Approaches and Examples

In order to help you visualize how each of these techniques might be used in a game, let's use a very simple gameplay scenario as a demonstration:

"The Hero/Player is exploring the Overworld area and is attacked by a creature. He enters into combat and defeats the creature. The hero then continues to explore and approaches the entrance to a cave. He hesitates at the mouth of the cave, then slowly enters. As he continues to walk forward, the player is again attacked, this time by a much larger creature (a Boss!), and enters into combat. The player nearly dies in combat, with his health nearing zero, but narrowly defeats the creature. This completes the level, and the player is awarded a bonus for completing his objectives."

Referencing the above scenario, we'll take a look at the main techniques used in designing interactive music. These can be loosely divided into a few categories: Horizontal, Vertical, Generative, and Hybrid.


Horizontal

Horizontal interactivity in music is perhaps the simplest technique, and it is a fancy way of saying "move intelligently from one piece of music to another in a linear fashion." In our example above, we may have one theme for the Overworld, one theme for combat, one theme for the cave, and a final theme for the boss battle. Each of these themes would be composed and scripted in such a way as to allow the music to transition intelligently from one theme to the next while still sounding musical.

Some examples of horizontal interactivity include:

  • Musical themes change based on game area/level
  • Musical themes change based on game state (such as exploration or combat)
  • Musical themes change based on player health or time (e.g. the way the Super Mario Brothers theme speeds up as time runs out).

These are the simplest uses of horizontal techniques, but they can be extended to become more sophisticated. For example, rather than a single, static theme transitioning to another static theme, we can have intelligent cueing or branching of themes. In our example above, rather than a single Overworld theme, we may have several variations, and as we continue to explore these variations play at random rather than one variation looping incessantly. Likewise, as we approach the cave, we may randomly choose one of several cave themes to play. The general idea here is that rather than a linear, scripted approach, we allow a bit more musical diversity by 'cueing' up multiple possibilities.
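
To make this concrete, here is a minimal, engine-agnostic sketch in Python (the state names, cue names, and class are invented for illustration, not taken from any particular game or middleware) of how a horizontal system might pick its next cue: each game state maps to a pool of theme variations, and a new variation is chosen whenever the state changes or the current cue finishes.

    import random

    # Hypothetical pools of theme variations, keyed by game state.
    # In a real game these would be references to authored audio cues.
    THEME_POOLS = {
        "overworld": ["overworld_a", "overworld_b", "overworld_c"],
        "combat":    ["combat_a", "combat_b"],
        "cave":      ["cave_a", "cave_b"],
        "boss":      ["boss_main"],
    }

    class HorizontalMusicPlayer:
        """Chooses cues horizontally: one theme after another."""

        def __init__(self):
            self.state = "overworld"
            self.current_cue = None

        def set_state(self, new_state):
            # A state change (e.g. entering combat) requests a new theme.
            # A real engine would defer the actual switch to the next bar
            # or phrase marker so the transition still sounds musical.
            if new_state != self.state:
                self.state = new_state
                self.current_cue = None  # force a fresh pick

        def next_cue(self):
            # Avoid repeating the same variation back-to-back when the
            # pool has more than one entry.
            pool = THEME_POOLS[self.state]
            choices = [c for c in pool if c != self.current_cue] or pool
            self.current_cue = random.choice(choices)
            return self.current_cue

    # Example: explore, get attacked, then return to exploring.
    music = HorizontalMusicPlayer()
    print(music.next_cue())       # one of the Overworld variations
    music.set_state("combat")
    print(music.next_cue())       # a combat variation
    music.set_state("overworld")
    print(music.next_cue())       # back to exploration, a fresh variation

The point of the sketch is simply that the 'next piece' is decided by game state plus a little randomness, rather than being hard-wired into one linear track.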

This approach can be extended much further with a sort of 'micro-composing' technique, where instead of writing one long, linear theme, the composer creates multiple smaller chunks and phrases which can intelligently and musically connect and interchange with one another. In this way, a great deal of variation can be introduced. A few years ago, I interviewed "Tomb Raider" composer Troels Folmann and he went into more detail about the process of micro-composing...
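
As a rough illustration of the idea (the chunk names and their allowed connections below are made up for this example, not drawn from any actual score), micro-composed music can be thought of as a graph of short phrases, where each chunk declares which chunks it can musically lead into:

    import random

    # Hypothetical micro-composed chunks: each short phrase lists the
    # phrases it can musically connect to (same key, compatible cadence).
    CHUNKS = {
        "intro":   ["motif_a", "motif_b"],
        "motif_a": ["motif_b", "bridge"],
        "motif_b": ["motif_a", "bridge"],
        "bridge":  ["motif_a", "motif_b", "cadence"],
        "cadence": ["intro"],  # loop back around
    }

    def generate_sequence(start="intro", length=8):
        """Walk the chunk graph to build a varied but coherent sequence."""
        sequence = [start]
        current = start
        for _ in range(length - 1):
            current = random.choice(CHUNKS[current])
            sequence.append(current)
        return sequence

    print(generate_sequence())  # e.g. ['intro', 'motif_a', 'bridge', ...]

Because every chunk only ever connects to chunks it was written to follow, the result stays musical even though the exact order is different every time.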


Vertical

Vertical interactivity introduces the concept of 'stems', which are musically grouped divisions of a given theme. For example, in a given orchestral theme you may have a string section, horns, winds, percussion, and basses. Each of these could be grouped into its own 'stem' which could be played independently of, or in conjunction with, any other stem. Similar to track groupings in your DAW, stems are a way to increase musical variation as well as increase or decrease musical intensity.

Typically, the vertical and horizontal approaches are used in tandem, creating a sophisticated musical mix that can adapt in intensity, phrase variation, and thematic variation. In our example above, we may begin with strings and winds lightly playing the Overworld theme. As the player continues to explore, we gradually bring in the rest of the stems to add intensity just before the first combat encounter. As we enter combat, the music transitions to a more intense theme with the full orchestra in swing. Once we defeat the enemy, the music transitions back to the main theme. We then approach the cave and all the stems fade out except the basses. We enter the cave and a new theme begins, again with just basses. Slowly we fade in the strings and winds, and just before the boss encounter the horns swell in to join them. The theme changes once again, and in full orchestra we hear the boss battle theme. As the player takes damage, however, the stems begin to fade out individually, and we slowly begin to hear a new stem - a scary, high-pitched violin mix which reinforces the idea that the player is about to die. Upon defeating the boss, we switch to a new theme - perhaps something more lighthearted - to confirm the player's victory.
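
Here is a minimal sketch of the vertical idea, assuming invented stem names and intensity thresholds: every stem of the theme plays in sync from the start, and a single gameplay 'intensity' value decides which stems are actually audible.

    # Hypothetical stems of one theme, each with the minimum intensity
    # (0.0 - 1.0) at which it becomes audible. All stems are assumed to
    # start together and stay locked in sync; only their volumes change.
    STEMS = {
        "basses":     0.0,   # always present
        "strings":    0.2,
        "winds":      0.4,
        "horns":      0.6,
        "percussion": 0.8,
    }

    def stem_volumes(intensity):
        """Return a target volume per stem for the current intensity.

        In a real engine these targets would be reached with short fades
        rather than hard switches, so the mix never pops.
        """
        return {
            name: (1.0 if intensity >= threshold else 0.0)
            for name, threshold in STEMS.items()
        }

    # Quiet exploration: just basses and strings.
    print(stem_volumes(0.3))
    # Boss battle: full orchestra.
    print(stem_volumes(1.0))

In practice the thresholds would be tuned per theme, and each stem would fade in or out over a bar or two rather than switching instantly.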

Quite frequently, you will encounter different parts of a game which employ each of these techniques to different degrees. Sometimes a straightforward, scripted theme is called for, but other times a more sophisticated approach is needed - one which employs both horizontal variation and vertical addition and subtraction. Composer Jason Graves used these techniques to great effect in his "Dead Space" score, which has received critical acclaim. Graves uses a combination of vertical and horizontal approaches in his score, but also employs stingers, which we'll talk about in a bit. He takes the approach of 'overproducing', providing up to 20 individual stems for a given theme, and he believes that longer themes yield better results. As such, his minimum 'theme' length is 2 minutes.

Jason Graves's score for "Dead Space" is a great example of horizontal and vertical composing techniques, as well as effective use of stingers.

Another excellent example of interactive music technique is demonstrated in the hit game "Batman: Arkham City" from Rocksteady Studios. Composer and Audio Director Nick Arundel recently spoke to Develop magazine about the musical approach used in the game, stating that "There are types of harmonic writing that work when you're going to branch that don't work when you're layering... we have a sheet of cues which he constantly refers to which define the harmonic movement of patterns of chords, ostinato figures, the theme, and types of variations of the theme - all four or eight bars long." The full interview can be found here.

And here's a short sample of how the music underscores the gameplay.

Batman: Arkham City


Transition Techniques

With the primary design approaches delineated, we can now turn our attention to the important topic of transitions. Whether you use a horizontal, vertical, or hybrid approach to composing, you will need to consider how each theme transitions to the next in a musically pleasing way. Below is a short overview of the techniques used for creating transitions.

  • Looping - It goes without saying that very often the musical themes used will be looping. Sometimes the loop points will be close together, and sometimes the theme may run to 2 minutes or more (as in the Dead Space example).
  • Crossfading - The simplest way of transitioning from one piece to the next, crossfading uses a short fade-out/fade-in approach. Depending on the audio engine used, these crossfades can be triggered at specific points in the theme - for example at the end of a bar, beat, or custom phrase marker.
  • On-Bar/On-Beat - Speaking of bars and beats, it is important to consider how and when your transitions will happen. Today's game audio engines can often follow a specific tempo and time signature, allowing you to determine exactly when and where you'd like to transition out of one piece and into another (a short sketch after this list illustrates the idea).
  • Flourishes/Stingers - The use of one-shot phrases to overlay important moments is called a flourish or a stinger. In our example above, a flourish might be used when the player defeats the creature, or perhaps when he picks up a health unit or treasure. These can also be used to induce emotion - for example by playing a high-pitched (and scary) string part as the game camera cuts to the end boss. Lastly, stingers can be used to mask the transition between one phrase and another - for example a drum roll followed by a cymbal crash may play at the end of one phrase to mask the beginning of the next phrase.
  • DSP - Finally, the use of DSP can be helpful in some scenarios. For example, when the player's health gets low, we may apply a lowpass filter, cutting the higher frequencies to underscore the dire nature of the player's situation.
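
To tie a few of these together, here is a small, engine-agnostic sketch (the tempo, numbers, and function names are my own illustration, not Wwise or FMOD API calls) of how a transition might be quantized to the next bar line and how a simple crossfade could then be run across it:

    # Sketch: schedule a crossfade (and optionally a stinger) on the
    # next bar line, given a tempo and time signature.

    def seconds_per_bar(bpm, beats_per_bar=4):
        """Length of one bar in seconds at the given tempo."""
        return beats_per_bar * 60.0 / bpm

    def time_until_next_bar(playback_position, bpm, beats_per_bar=4):
        """How long to wait so a transition lands exactly on a bar line."""
        bar = seconds_per_bar(bpm, beats_per_bar)
        return bar - (playback_position % bar)

    def crossfade_gains(progress):
        """Linear gains for the outgoing/incoming themes, progress in [0, 1].

        A real mixer would typically use an equal-power curve rather than
        a straight linear fade to keep the perceived loudness constant.
        """
        progress = min(max(progress, 0.0), 1.0)
        return 1.0 - progress, progress

    # Example: combat ends 7.3 seconds into a combat theme at 140 BPM.
    wait = time_until_next_bar(7.3, bpm=140)
    print(f"start the crossfade (and any stinger) in {wait:.2f} s")
    print(crossfade_gains(0.5))  # halfway through: both themes at 0.5

A lowpass filter for the low-health case would sit on top of this, sweeping its cutoff frequency down as health drops and back up as it recovers.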

Conclusion

I hope you've found this overview of interactive music useful. The techniques described above require the composer to think very differently about the way their music is designed and structured. It is also important to mention that a good familiarity with game audio engines such as Wwise and FMOD will greatly help the composer write music that is both useful and technically relevant to the game design.

If you've found this tutorial interesting, please let us know in the comments. If there is enough interest we can provide additional tutorials which further illustrate the techniques described above, as well as provide specific integration examples within Wwise or FMOD.
