Today when teaching Sound for Interactive Media at AIM, I got to nerd out about music games. I mean, I always get to, but this was closer to serious nerdage. This is what we spent two hours watching and discussing. I've included some Kickstarter videos, and while they're old, I think the stories behind their musical aims were valuable to hear.
I'm starting off this playlist with a game a student mentioned in class today that I hadn't heard of. Vib Ribbon is a cute, and maybe a little bit rude (?), music game for the PlayStation. The PlayStation 1 was the only console I've ever owned, so I am so disappointed I never got to play this. Vib Ribbon has some built-in songs, but as you progress, you can put in your own CD and play songs off of it.
Next is classic Audiosurf. It analyses a few aspects of any music you want to plug in from your library to craft the level. Here is Moby's "Thousand", which demonstrates a little about its speed and beat detection capacities.
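To give a feel for what "beat detection" means here: one of the simplest approaches is to watch for sudden jumps in energy from one frame of audio to the next. This is just an illustrative sketch, not Audiosurf's actual algorithm; the function name, frame size, and threshold are all my own assumptions.

```python
import math

def energy_onsets(samples, frame_size=1024, threshold=1.5):
    """Flag frames whose energy jumps well above the previous frame's.
    A crude stand-in for the kind of beat detection a game like
    Audiosurf might run over a track (all parameters are illustrative)."""
    energies = []
    for start in range(0, len(samples) - frame_size + 1, frame_size):
        frame = samples[start:start + frame_size]
        energies.append(sum(s * s for s in frame) / frame_size)
    onsets = []
    for i in range(1, len(energies)):
        # An "onset" is a frame much louder than the one before it.
        if energies[i] > threshold * energies[i - 1] + 1e-9:
            onsets.append(i)
    return onsets

# Fake track: a quiet tone with a loud "beat" every 4th frame.
signal = []
for frame in range(16):
    amp = 0.8 if frame % 4 == 0 else 0.05
    signal.extend(amp * math.sin(2 * math.pi * 220 * n / 44100)
                  for n in range(1024))

print(energy_onsets(signal))  # loud frames after a quiet one: [4, 8, 12]
```

A real game would map those onset times to track features like jumps, blocks, or speed changes.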
Audio Overdrive is an Ableton plugin made by Nils Iver Holtar. It takes in MIDI from Ableton instruments to craft the game level.
140 also uses the soundtrack to create the game level, but arguably its gameplay involves a certain amount of musical sensibility: predicting and "feeling" what's coming next.
Unlike Audio Overdrive and 140, Dubwars focuses on player actions. In Dubwars, the player has no control over the gun: there are no trigger, weapon-swap, or reload functions. The gun types, and when and how they shoot, are based on the music. This gives the player the capacity to "learn" the song in a way. As they get to know the song better, they know the optimal times to focus on getting power-ups, wiping out enemies, or fleeing when they can't shoot.
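A hypothetical sketch of that Dubwars-style inversion: the game, not the player, decides when each weapon fires, from a list of beat times authored (or detected) for the track. Every name and number here is illustrative, not taken from the game.

```python
# Beat times in seconds (quarter notes at 120 BPM) and which weapon the
# track assigns to each beat -- both are made-up example data.
BEAT_TIMES = [0.0, 0.5, 1.0, 1.5, 2.0]
WEAPON_FOR_BEAT = {0: "laser", 1: "laser", 2: "missile",
                   3: "missile", 4: "spread"}

def shots_between(t_start, t_end):
    """Return (time, weapon) for every beat in the window. The player
    can't trigger these; they can only position themselves to exploit
    the moments they know are coming."""
    return [(t, WEAPON_FOR_BEAT[i])
            for i, t in enumerate(BEAT_TIMES) if t_start <= t < t_end]

print(shots_between(0.4, 1.6))
# [(0.5, 'laser'), (1.0, 'missile'), (1.5, 'missile')]
```

Knowing this schedule is exactly what "learning the song" buys the player.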
So, now that we've looked a little at direct translation of music to game elements, David Kanaga takes us a little further. He discusses music as object, substance, and organism, highlighting different ways that we interact with sound in conjunction with other media.
Panoramical is a collaboration between David Kanaga and Fernando Ramallo. Ideally, I think, you use a MIDI controller like the Korg NanoKontrol to play with the audio and visuals. Each fader or knob is mapped to a different track, effect, instrument, or other aspect of the audio, as well as to different visual elements. I think this continues an interesting discussion around music games: players are more likely to have a visual bias than an audio one. So for how long, and when, is the audio a key part of play?
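The "each fader maps to a parameter" idea is simple to sketch. This is not Panoramical's code, and the parameter names are invented; it just shows the shape of a MIDI CC-to-parameter mapping like a NanoKontrol setup implies.

```python
# Hypothetical mapping: each MIDI CC number (one fader or knob on the
# controller) drives one named audio or visual parameter.
CC_MAP = {
    0: "terrain_height",   # fader 1 -> a visual element
    1: "reverb_wet",       # fader 2 -> an audio effect
    16: "melody_volume",   # knob 1  -> an instrument track
}

def handle_cc(state, cc_number, cc_value):
    """Translate a raw 0-127 CC value into a 0.0-1.0 parameter update."""
    if cc_number in CC_MAP:
        state[CC_MAP[cc_number]] = cc_value / 127.0
    return state

state = {}
handle_cc(state, 0, 127)   # fader 1 pushed all the way up
handle_cc(state, 16, 64)   # knob 1 around halfway
print(state)
```

Because audio and visual parameters live in the same map, one gesture can move both at once, which is a lot of what makes the game feel like playing an instrument.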
Soundself takes in audio from the player and generates a visual and auditory response. This video talks about pitch and volume as the main data, but Soundself goes on to examine other expressive techniques, like timbre and rhythm, as development continues.
Robin shares his thoughts on the role of sound design and music in games and interactive media.
Evan Balster, the main programmer for Soundself, has also gone on to make his software into an app. It detects a variety of qualities in your voice and MIDI-maps them to a library of instruments.
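For the curious, the two core measurements mentioned above, pitch and volume, can be sketched in a few lines: RMS for volume and a naive autocorrelation for pitch. This is an illustrative toy, assuming a clean input buffer, and is certainly not the algorithm Soundself or Balster's app actually uses.

```python
import math

def analyse_voice(samples, sample_rate=44100):
    """Rough pitch (autocorrelation peak) and volume (RMS) from one
    buffer of audio -- a toy version of pitch/volume tracking."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    # Search lags covering roughly 60 Hz to 1 kHz, a vocal-ish range.
    best_lag, best_score = 0, 0.0
    for lag in range(sample_rate // 1000, sample_rate // 60):
        score = sum(samples[i] * samples[i + lag]
                    for i in range(len(samples) - lag))
        if score > best_score:
            best_lag, best_score = lag, score
    pitch = sample_rate / best_lag if best_lag else 0.0
    return pitch, rms

# A 220 Hz test "voice" at half amplitude.
tone = [0.5 * math.sin(2 * math.pi * 220 * n / 44100) for n in range(2048)]
pitch, rms = analyse_voice(tone)
print(round(pitch, 1), round(rms, 3))
```

Real-time systems use far more robust estimators, but the idea is the same: two continuous signals from the voice, each free to drive any instrument or visual parameter.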
As we began to talk about affordances for musicianship and the line between game and tool, we looked at Björk's Solstice, which shows you pitched notation and takes inspiration from the rhythm of orbiting planets to let you make your own music in a very playful and visually pleasing manner.
Next we look at Samantha Kalman's Sentris. Samantha's aim in this game is to convey the feeling and outcomes of musicianship. This game allows you to make music and then keep it!
Since my students are composers hoping to make music for games, I also show them how developers make their own music, using the wide variety of very accessible composition tools, like Sentris. The soundtrack to Ian Maclarty's Action Painting Pro was made in Figure, an iOS app made by Propellerhead, the makers of the prominent DAW Reason.
We then also took a quick look at the Unity Asset Store, in both the audio library sections and the scripting assets for playing with audio. This gave the students a little more idea of how much developers can do without them.
We then had a quick look at David O'Reilly's Mountain. Music being inherently a manipulation of time, these sorts of games that also play explicitly with time can be very appealing to composers. A student also mentioned a few secret things that happened when they played "Ode to Joy" and "Mary Had a Little Lamb" to their Mountain. I'm yet to try it.
We then also looked at tools for musicians. VIZZable, by Bob Jarvis, is a VJ Max for Live plugin for Ableton. It has a lot of potential for gameplay aspects, especially in its use of a visual drum rack. Being able to trigger visuals with MIDI gives us more potential for branching narratives, and the effects give us potential to convey variations in visual tone and setting.
From there we took a quick look at John Zorn's Cobra. John Zorn is an experimental composer who creates "game scores" based on interactive parameters for improvisation.
To take a break from brain overload, we watched the audio team for Guild Wars 2 talk about their process on the 2015 expansion, Heart of Thorns. You often get great results when Googling your favourite Hollywood or AAA film or game plus "sound design".
We took a glimpse at DIY hardware capabilities. I'd love to play a duet with a mango.
We talked a little bit about hardware interfaces for games, and how they might impact musicianship.
That's where I probably should have talked about my and Amani's project, CTRL_Coda - a series of games controlled by musical instruments. This is a video of Etch A Synth by Richard Adem, Dom Willmott, and Alexander Perrin, one of the projects involved in CTRL_Coda.
I also wanted to show Cello Fortress, but I forgot.
Then, just for the composers and Ableton nerds, here's Greg Gordon showing how he did the soundtrack to the Disney Tron game.
So, that brought us to my conclusion! It was a bit brain-heavy, but I loved the chance to nerd out about the potential of music in games. I could have gone on for much, much, much longer!