Adaptive music

From Wikipedia, the free encyclopedia

Adaptive music is music which changes in response to real-time events or user interactions, found most commonly in video games.[1] It may change in volume, arrangement, tempo, and other parameters. Adaptive music is a staple of the role-playing game genre, often used to change the tone and intensity of the music when the player enters and leaves combat.[2]

History

The first example of adaptive music is generally said to have been in Space Invaders by Taito in 1978. The game's simple background music, a four-note ostinato which repeats continuously throughout gameplay, increases in tempo as time goes on and the aliens descend upon the player.[3] However, some argue that this is not an example of adaptive music, as the sounds that speed up could just as easily be considered sound effects of the aliens' movement.

Other early examples of adaptive music include Frogger by Konami from 1981, where the music abruptly switches once the player reaches a safe point in the game, and Sheriff by Nintendo from 1979, where different pieces of music play in response to events such as a condor flying overhead or bandits approaching the player.

George Lucas's video game development group LucasArts (formerly Lucasfilm Games) created and patented the iMUSE interactive music system in the early 1990s, which was used to synchronise video game music with game events.[4][5] The first game to make use of this system was Monkey Island 2: LeChuck's Revenge in 1991.

Techniques

Vertical orchestration

Vertical orchestration is the technique in which the music's arrangement is changed. Musical layers are added and removed in response to game events to affect the music's texture, intensity, and emotional feel without interrupting the flow of music.[1] Layers are generally faded in and out for smoother transitions.

In video games, this technique may be used for more subtle game events than horizontal re-sequencing, such as an increase or decrease in intensity during a battle.
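The layering described above can be sketched in a few lines of code. This is a minimal illustration, not any particular engine's implementation: the layer names, intensity thresholds, and fade step are all hypothetical, and a real audio engine would apply the resulting gains to actual audio streams.

```python
class VerticalMix:
    """Fades musical layers in and out based on a 0-1 intensity value."""

    def __init__(self, layers):
        # Each layer becomes audible once intensity reaches its threshold.
        self.layers = layers                            # {name: threshold}
        self.gains = {name: 0.0 for name in layers}

    def update(self, intensity, fade_step=0.1):
        """Move each layer's gain toward its target by at most fade_step,
        so layers fade smoothly instead of cutting in and out."""
        for name, threshold in self.layers.items():
            target = 1.0 if intensity >= threshold else 0.0
            gain = self.gains[name]
            delta = max(-fade_step, min(fade_step, target - gain))
            self.gains[name] = round(gain + delta, 6)
        return self.gains


mix = VerticalMix({"pads": 0.0, "drums": 0.4, "brass": 0.8})
for _ in range(10):        # combat begins: intensity jumps to 0.5
    mix.update(0.5)
# "pads" and "drums" fade in fully; "brass" stays silent
```

Because each call moves gains by at most one fade step, a sudden change in game intensity produces a gradual change in texture rather than an abrupt cut.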

Horizontal re-sequencing

Horizontal re-sequencing is the technique in which different pieces of music are transitioned between. Musical pieces in a "branching" sequence are transitioned between in response to game events. The simplest kind of transition is a crossfade; when triggered by an event, the old piece is faded out while the new piece fades in.[1] A second kind is phrase branching, in which the change to the next segment starts once the current musical phrase has ended.[6] A third kind uses dedicated "bridge" transitions, which are sections of music composed to join the two pieces of music together.[1]

In video games, this technique may be used for more significant game events, such as a change in location, beginning of a battle, or opening of a menu, as it generally draws more attention and makes a greater impact than vertical orchestration.
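Phrase branching, the second transition type above, can be sketched as a queue that defers the switch until a phrase boundary. The piece names and the eight-beat phrase length are hypothetical; a real engine would schedule audio buffers rather than count beats.

```python
class PhraseBrancher:
    """Switches between pieces only when the current phrase ends."""

    def __init__(self, piece, phrase_beats):
        self.current = piece
        self.phrase_beats = phrase_beats
        self.beat = 0
        self.queued = None

    def request(self, piece):
        """Queue a new piece; the change waits for the phrase boundary."""
        self.queued = piece

    def tick(self):
        """Advance one beat; branch if a phrase has just completed."""
        self.beat += 1
        if self.queued and self.beat % self.phrase_beats == 0:
            self.current = self.queued
            self.queued = None
        return self.current


seq = PhraseBrancher("exploration", phrase_beats=8)
seq.request("combat")                       # event fires mid-phrase
pieces = [seq.tick() for _ in range(9)]
# beats 1-7 stay on "exploration"; the switch lands on the phrase
# boundary at beat 8, so "combat" plays from there onward
```

Deferring the switch this way trades responsiveness for musicality: the transition can lag the triggering event by up to one phrase, but it always lands on a musically sensible boundary.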

Algorithmic generation

Some video games generate musical content live using algorithms instead of relying solely on pre-made musical pieces (such as in horizontal re-sequencing and vertical orchestration).

Spore uses an embedded version of the music software Pure Data to generate music according to certain game events such as the phase of gameplay, the player's actions in the "creature editor", and the duration of the gameplay session.[7] In addition, Ape Out features a procedurally generated jazz soundtrack which changes based on the intensity of gameplay and the player's inputs.[8]
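A toy example of the general idea, not how Spore's Pure Data patches or Ape Out's system actually work: a rule-based generator picks pitches from a scale, with note density driven by a game-supplied intensity value. The scale and bar length are arbitrary choices for illustration.

```python
import random

SCALE = [60, 62, 64, 67, 69]     # MIDI pitches of a C major pentatonic

def generate_bar(intensity, beats=8, seed=None):
    """Return one bar as a list of MIDI pitches and None rests; a higher
    intensity yields a denser bar."""
    rng = random.Random(seed)
    bar = []
    for _ in range(beats):
        if rng.random() < intensity:     # note density tracks intensity
            bar.append(rng.choice(SCALE))
        else:
            bar.append(None)             # rest
    return bar


calm = generate_bar(0.2, seed=1)
tense = generate_bar(0.9, seed=1)
# the tense bar tends to contain far more notes than the calm one
```

Because the music is computed rather than recorded, the generator can react to any game variable the developer exposes, at the cost of giving the composer less direct control over the result.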

Blending music and sound effects

Some video games, such as Rez and Extase, synchronise player-triggered sound effects with the background music in order to blend the two. This is done by delaying playback of a sound effect until the beginning of the next bar or measure. Dead Space 2 is another example. Its background music appears to be arranged into four layers, each a stereo track corresponding to a specific level of "fear". These layers are then mixed, individually or collectively, during gameplay depending on a variety of game variables, such as the player's distance from enemies.[9] This creates an interactive musical landscape which the player's actions instantaneously shape and influence.
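The bar-quantisation delay described for Rez and Extase reduces to a small calculation. The tempo and time signature below are hypothetical values for illustration; the function assumes the music started at t = 0.

```python
import math

def next_bar_time(trigger_time, bpm=120, beats_per_bar=4):
    """Return the time (in seconds) of the next bar line at or after
    trigger_time, assuming playback began at t = 0."""
    bar_len = beats_per_bar * 60.0 / bpm     # seconds per bar
    bars_elapsed = trigger_time / bar_len
    return math.ceil(bars_elapsed) * bar_len


# At 120 BPM in 4/4, a bar lasts 2.0 s. A sound effect triggered at
# t = 2.5 s is held until the bar line at t = 4.0 s, a delay of 1.5 s.
delay = next_bar_time(2.5) - 2.5
```

An engine would hold the triggered sample in a queue and start it at the returned time, so every player-triggered sound lands on the musical grid.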

Uses

As goal or reward

Music games such as Sound Shapes use an adaptive soundtrack to reward the player. As the player improves at the game and collects more "coins", the soundtrack, which is entirely composed of the melodies and beats created by these "coins", intensifies.[citation needed]

References

  1. ^ a b c d Redhead, Tracy (2024-07-04). Interactive Technologies and Music Making: Transmutable Music (1 ed.). London: Routledge. doi:10.4324/9781003273554. ISBN 978-1-003-27355-4.
  2. ^ Sporka, Adam; Valta, Jan (2 October 2017). "Design and implementation of a non-linear symphonic soundtrack of a video game". New Review of Hypermedia and Multimedia. 23 (4): 229–246. Bibcode:2017NRvHM..23..229S. doi:10.1080/13614568.2017.1416682. S2CID 46835283.
  3. ^ Fritsch, Melanie (2021). Summers, Tim (ed.). The Cambridge companion to video game music. Cambridge companions to music. Cambridge, United Kingdom: Cambridge University Press. ISBN 978-1-108-60919-7.
  4. ^ Sweet, Michael (October 2, 2014). Writing Interactive Music for Video Games: A Composer's Guide. Addison-Wesley Professional. p. 99. ISBN 978-0321961587. Frustrated with the state of music in games at the time, two composers at LucasArts, Peter McConnell and Michael Land, created one of the first adaptive music systems, called iMuse. iMuse (Interactive MUsic Streaming Engine) let composers insert branch and loop markers into a sequence that would allow the music to change based on the decisions of the player. The iMuse engine was one of the first significant contributions to interactive music for video games. Its importance in shaping many of the techniques that you see in video games today cannot be overemphasized. (...) Other excellent iMuse titles include Grim Fandango (1998), which features an incredible jazz-based soundtrack composed by Peter McConnell. (...)
  5. ^ Collins, Karen (August 8, 2008). Game Sound: An Introduction to the History, Theory, and Practice of Video. The MIT Press. p. 102, 146. ISBN 978-0262033787.
  6. ^ Sweet, Michael (13 June 2016). "Top 6 Adaptive Music Techniques in Games - Pros and Cons - Designing Music NOW". Designing Music Now. Archived from the original on 13 November 2018. Retrieved 13 November 2018.
  7. ^ Kosak, Dave (20 February 2008). "The Beat Goes on: Dynamic Music in Spore". GameSpy. IGN Entertainment, Inc.
  8. ^ Wright, Steven (27 February 2019). "How 'Ape Out' Creates a Soundscape Worthy of Smashing". Variety. Variety Media.
  9. ^ Kamp, Michiel; Summers, Tim; Sweeney, Mark, eds. (2016). Ludomusicology: Approaches to Video Game Music. Sheffield: Equinox. pp. 188–189. ISBN 9781781791974.