Music, sound design, audio integration

Rookery is currently in development. This blog post describes some of the work-in-progress concepts and the decisions made so far.

Rookery is a walking simulator set in Victorian London. As a poor woman living in the rookeries, you journey to the rich neighbourhoods of the city, experiencing more and more hallucinations along the way. Extreme poverty and guilt over your lost child drag you further and further down.

The game should get under your skin; it is not meant to be a 'happy' experience. Rather than glorifying this period and place, the game depicts the extreme everyday horrors of Victorian London.

Audio plays an important part in bringing the world to life and supporting the story and setting. Themes such as the guilt over your lost child and the widespread disease are audible throughout the game. By blending realistic sounds and instrumentation with unsettling effects and modulation, the audio aims to get on the player's nerves.

Adaptive audio design

As the project is being developed by a very small team over a short period of time, the audio plays a major role in bringing the world and story to life. The video showcases a work-in-progress (April 2020) version of the FMOD Studio project. Adaptive parameters and randomness are used to create a dynamic, interactive world, and blending non-diegetic music with diegetic sounds blurs the line between music and sound design, which improves immersion.
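As a rough illustration of the kind of per-playback randomness involved (FMOD Studio exposes this through randomised volume and pitch ranges on instruments), here is a minimal Python sketch. The function name and jitter values are hypothetical, not part of any FMOD API:

```python
import random

def randomized_playback(base_volume_db=0.0, base_pitch_st=0.0,
                        volume_jitter_db=3.0, pitch_jitter_st=1.0,
                        rng=random):
    """Return per-playback volume (dB) and pitch (semitone) offsets,
    mimicking the volume/pitch randomisation knobs on an instrument.
    Each trigger of the same sound comes out slightly different,
    which keeps constant loops and repeated one-shots from sounding
    mechanical."""
    volume = base_volume_db + rng.uniform(-volume_jitter_db, volume_jitter_db)
    pitch = base_pitch_st + rng.uniform(-pitch_jitter_st, pitch_jitter_st)
    return volume, pitch
```

Seeding the generator (e.g. `random.Random(42)`) makes the variation reproducible while testing.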

The sketches below illustrate the nonlinear audio design. As the game mostly consists of walking around, most sounds are constantly varying loops (atmosphere, music, etc.) or are emitted from world objects, with a few sound triggers for specific effects. Music can transition either by (cross)fading or by quantised timing. Strange-sounding DSP effects are modulated over the audio using an LFO whose intensity increases with the delirium the main character experiences. Using a simple priority system and FMOD's snapshot functionality, a basic hierarchy has been created: sounds such as certain hallucinations or world objects lower the amplitude or frequency range of sounds further down the hierarchy.
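The delirium-driven LFO and the snapshot-style ducking can be sketched outside FMOD. This is a minimal Python illustration of the two ideas, with hypothetical names, rates, and attenuation values rather than the project's actual settings:

```python
import math

def delirium_lfo(time_s, delirium, rate_hz=0.25, max_depth=1.0):
    """Sine LFO whose modulation depth scales with the delirium
    parameter (0..1): at 0 the effect is inaudible, at 1 the
    modulation swings over the full depth. The returned value would
    drive a DSP parameter such as filter cutoff or pitch warp."""
    depth = max_depth * max(0.0, min(1.0, delirium))
    return depth * math.sin(2.0 * math.pi * rate_hz * time_s)

def duck_gain_db(own_priority, active_priorities, duck_db=-9.0):
    """Snapshot-style ducking: if any currently active sound outranks
    this one in the priority hierarchy, attenuate it by duck_db;
    otherwise leave it at unity gain (0 dB)."""
    if any(p > own_priority for p in active_priorities):
        return duck_db
    return 0.0
```

In FMOD Studio the same behaviour would come from a game parameter driving an automated effect, and from snapshots that lower the volume of lower-priority buses while a hallucination plays.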

Most of the game data the audio reacts to is based on the player's position in the world, along with sounds emitted from objects and creatures. This map shows all location-based audio triggers. As the game's arc is a constant build-up, the music and atmosphere increase in overall intensity from the player start (1) to the final scene (12). Certain sounds, such as sick children inside the houses you pass and frightening hallucination sounds, are emitted as the player walks by. At trigger 10, an impactful visual effect plays: birds break out of a window and fly past. This effect has a strong sound that drowns out most other audio and is used to transition to the final part of the music. From this point on, the more held-back, melancholic sound gives way to an extreme and dramatic sonification of the character's emotions.
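The location-driven build-up can be sketched as a mapping from trigger number to an intensity parameter, plus a radius check for one-shot emitters. This is a hypothetical illustration of the idea, not the game's actual code; the linear ramp and 2D radius check are simplifying assumptions:

```python
def intensity_at(trigger_index, n_triggers=12,
                 min_intensity=0.0, max_intensity=1.0):
    """Map the player's progress through the numbered triggers
    (1 = start, n_triggers = final scene) to a music/atmosphere
    intensity parameter that builds up linearly over the route."""
    t = (trigger_index - 1) / (n_triggers - 1)
    return min_intensity + (max_intensity - min_intensity) * t

def in_trigger_radius(player_pos, trigger_pos, radius):
    """One-shot emitters (sick children behind windows, hallucination
    stingers) fire when the player comes within a radius of the
    trigger. Uses squared distance to avoid a square root."""
    dx = player_pos[0] - trigger_pos[0]
    dy = player_pos[1] - trigger_pos[1]
    return dx * dx + dy * dy <= radius * radius
```

The engine would feed `intensity_at` into an FMOD parameter each time the player crosses a trigger, while `in_trigger_radius` style checks fire the positional one-shots.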



Because the game has a realistic visual art style and a setting based on real life, I decided to use mainly non-synthetic instruments. For the delirium elements I used some synthesis, but mostly processed the acoustic tracks. This signals to the player that when strange things start happening to the music, the character is hallucinating. Virtual orchestration is combined with recorded instruments to achieve a realistic sound on a small budget.


The concept for the music and its nonlinear system was prototyped and tested using NADT, a tool I am developing for a range of experiments. The video shows how the music can be tested quickly, which saves a lot of time implementing and switching back and forth between programs.