Merch: Vinyl

Here's a list of all the records we've done vinyl runs for. Availability may vary.

'Jugglernaut' from Everhood

'Jugglernaut', written for Everhood : Eternity Edition

Boss Fight Challenge

100 bosses, 5 seconds at a time. A medley of three new tracks, written for Pwnisher's Boss Fight Challenge.

Paradise Marsh

This is a working draft.

Track Descriptions

Ribbiting

This is the music from the release trailer. I brought back the "turtle drum" from the original demo, "First Visit to the Marsh". Despite consisting of only four samples, one covering each dynamic level, this drum sounds good across a wide register.

Hop Along

This is the music from the big second trailer we did, which was a full score, both music and sound effects. A lot of audio that ended up in the game (for instance, the frog dialogue) I designed for this trailer first. Trailers can sometimes be a bit of a distraction from actual work on a project, so I've often used them as opportunities to make progress on things the game still needs.

Falling to Earth

This is the music and sound design that plays when you first start the game. The music synchronizes to the stars falling, and this whole segment ended up having a very FEZ feel to it. Eti always hoped for this kind of sound, while I tried to avoid it if possible. I try to make my projects all feel different.

Fishing Trips

This track combines the music from eating mushrooms and going fishing in the game. The Lost Souls content update added fishing. Both of these ditties use physically-modeled drums: kick, snare and cymbal. These were fun to make as the percussive instruments used are very expressive to play. You'll also notice that the fishing music is only percussion. The music system is very ambient and goes through many different chord changes. As a result, I thought it would make sense for another global system such as fishing to only use percussion. This way, the two interplay seamlessly, without the need for mutual exclusivity.

The Airwaves of This Place

This is the first of five radio pieces from the Lost Souls content update. It came together while experimenting with clarinet sounds collected over the years. The ambience combines the sound of thin ice vibrating on a lake with distant siren sounds. This tries to evoke both a recent time in the marsh, and something from a bygone era. The ice sounds tie you to the marsh as it is, while the sirens evoke a different time, when the marsh was more inhabited. The radio pieces are also littered with garbled chatter. These could be spoken fragments from people on a past radio program, or perhaps from people who were at that location at some point.

Fellow Traveler

This radio piece suggests people enjoying a campfire while someone plays acoustic guitar. I recorded the guitar in my bedroom while I was attending music college, where I was taking an elective guitar course on the fingerpicking style of Joni Mitchell. During the course we explored lots of alternate guitar tunings; this track uses DADEBE.

A Selfish Mind

This radio piece suggests someone playing a toy synthesizer in a treehouse. The branches creak as they sway back and forth, in an attempt to create an ominous feeling. I recorded the keyboard during the development of Hyper Light Drifter. During a work trip to Los Angeles I stayed at the house of its game director, Alx Preston. He had a little keyboard with an arpeggiator for self-accompaniment, and a pitch knob. At the time, I recorded the piece into my phone as "El Niño".

The Meaning Was Lost

This radio piece suggests someone whistling while trapped at the bottom of a well.

Already Home

This radio piece suggests people living in one of the game's eroded stone buildings. Someone plays the piano on an especially cold evening. A heavy door closes in the background of a large, castle-like space. I recorded the piano in 2018 at the Heart Machine studio, while working on Solar Ash. Most of these radio pieces started out as voice memos or demos taken with a field recorder.

Found Souls

This short piece plays when you collect all the souls in the Lost Souls content update. The sound design of this segment is a collage of pre-existing sound effects from other parts of the game. This is something I did often on this project to save time, since there were SO many things to make for it. In the end, I didn't even come close to all the sounds I wanted to create for the game.

The Celestial Fields

This is the music for the end-of-game area, a kind of heavenly plane. Eti wanted something uplifting, ethereal, and choral for this segment. The music builds slowly with the addition of more and more layers over time. It then recedes a bit towards the end to create a sense of tension as you approach the final bench. As you walk, there are special flowers you can brush by that trigger musical tones keyed to the track. These stay consonant no matter how far along in its playback you are.

Famicase (Demo)

When I first played a demo of Paradise Marsh, I saw that the main menu had this striking, rotating cartridge. This object is actually the source of inspiration for the entire game: the cartridge was an artist's submission to the famous Famicase exhibition, where people submit cover art for Famicom games that don't exist. Eti found inspiration in that imaginary game and set out to bring it to life.

This design inspired me to create a thematic piece of music with a downtempo, Zelda-type feel to it. Eti felt it had too adventurous a feel for the game; he imagined the music could be more ambient and FEZ-like in nature. To make us both happy, we ended up somewhere in the middle.

I wrote this demo during a period where I got very deep into digging up old soundfonts. Soundfonts are digital sample banks that often feature sounds recorded in the 1990s. They've fallen out of favor for the most part, and so are something of a novelty now. Sometimes hobbyists are the ones who create these banks. Other banks may come from old hardware synths and samplers. Through the adoption of soundfonts, I've been able to introduce a lot of older sounds to my palette.

First Visit to the Marsh (Demo)

This is the very first attempt I made to score game footage. The footage that you see in the music video is an obfuscated version of the same footage I scored to. Some of the ideas we landed on were present from the very beginning. This includes creatures with calls that tune to the score, and a music system inspired by Breath of the Wild. The system plays little ambient phrases and takes its time. There's plenty of room to breathe for the player; the ambience of the environment fills in the gaps.

Eti vetoed some of the early instrumentation choices here, like the string-ish bass. But the general vibe has remained the same since the beginning.

This demo also predates my decision to synthesize all the creature sounds, which ended up being a very challenging path to go down. The beetle's call in this track is actually an electric piano from a sample library called "Jazzman", with tape effects on it.

Standing in Place (Demo)

This demo represents a turning point in how the music system worked. I found that re-orienting the same elements in different ways gets you a lot of variety. The system creates this by changing timing, pitch, volume, and panning, and by adding the chance of muting, reversing, and applying effects like filters and reverb.

Undulate

An hour of crickets and music. Undulate is a 60-minute screen recording of the game's main menu. The main menu uses an even more ambient variant of the gameplay music system. Phrases trigger in a similar fashion but are longer and more stretched out. I used a spectral plugin from Michael Norris to blur the tracks out in a way similar to the popular "Paulstretch" effect.

In addition, there is a procedural cricket generator that triggers the sounds of a bunch of crickets, each with a set position in the stereo field (including distance, which is conveyed through volume). Each cricket also has its own seeded pitch and its own rate of repetition: both how much time passes between the individual chirps in a call, and how frequently that cricket takes a break.

All of this is handled with both deterministic randomness and "non-deterministic randomness", which is just another way of saying randomness whose outcome you aren't trying to control every time. Put together, it has an ebb-and-flow quality, where the long ambient music phrases drift in and out periodically along with the cricket sounds.
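
As a rough illustration, here's a minimal sketch of how one of these crickets could be seeded. The class and the PlayChirp stub are placeholders of my own, not the game's actual code; the point is that everything fixed about a cricket's voice derives from its seed, while the call lengths stay loosely random.

// SKETCH - a seeded cricket voice. Helper names are hypothetical.
using System.Collections;
using UnityEngine;

public class Cricket : MonoBehaviour {

    public int seed;
    public float distance01;   // 0 = right next to the listener, 1 = far away
    public float pan;          // fixed position in the stereo field, -1..1

    float pitch, chirpGap, restLength, volume;

    void Start() {

        // deterministic randomness: everything fixed about this cricket comes from its seed.
        var rng = new System.Random(seed);
        pitch      = Mathf.Lerp(0.9f, 1.3f, (float)rng.NextDouble());
        chirpGap   = Mathf.Lerp(0.15f, 0.35f, (float)rng.NextDouble());
        restLength = Mathf.Lerp(2f, 8f, (float)rng.NextDouble());
        volume     = Mathf.Lerp(1f, 0.2f, distance01);   // distance captured by volume
        StartCoroutine(Chirp());
    }

    IEnumerator Chirp() {

        while (true) {

            // non-deterministic randomness: the call length just falls within a range.
            int chirps = Random.Range(3, 7);
            for (int i = 0; i < chirps; i++) {

                PlayChirp(pitch, volume, pan);
                yield return new WaitForSeconds(chirpGap);
            }
            yield return new WaitForSeconds(restLength);   // the cricket takes a break
        }
    }

    // stand-in for firing a one-shot chirp sample with the given settings.
    void PlayChirp(float p, float vol, float stereoPan) { }
}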

Each phrase has three elements, and sometimes it will play all three. Sometimes it only plays two or one. This contributes to the variability of the system. Additionally, there's variation in things like reverb, delay, pitch, filtering, volume, panning and timing between elements.

Messages in Bottles

This track is composed of a set of phrases initially designed for gameplay music in an earlier version of the procedural music system. However, it turned out that these phrases had too much harmonic variation to be a good foundation.

Instead, these phrases found a home in what is essentially a miniature version of the music system, which triggers them whenever you find a message in a bottle. As each line of a bottle's poem appears, the system plays a single phrase, stitching them together with variations in timing and pitch.

This track encompasses all of those phrases in a random pattern so that you hear all of the material at least once. I also added effects just to flesh it out and give it a slightly different feeling for the soundtrack album.

The primary music system we ended up with uses only three types of harmony: F# Lydian, F# Minor Variants (essentially a combination of Aeolian and Dorian), and C# Mixolydian, which was added in the Lost Souls content update. While the samples for these have the aforementioned key centers, the game is always slowly modulating the pitch of the music, +/- 3 quartertones.
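
For reference, +/- 3 quartertones is +/- 1.5 semitones. Here's a small sketch of how such a slow global drift could be produced; the noise source and rate are just illustrative, not the game's actual values.

// SKETCH: slow global pitch drift of +/- 3 quartertones (1.5 semitones).
using UnityEngine;

public static class GlobalDetune {

    const float RangeSemitones = 1.5f;   // 3 quartertones
    const float DriftSpeed = 0.005f;     // very slow wander

    // multiply an AudioSource's pitch by this value.
    public static float CurrentPitchMultiplier() {

        // Perlin noise in 0..1, re-centered to -1..1.
        float n = Mathf.PerlinNoise(Time.time * DriftSpeed, 0f) * 2f - 1f;
        float semitones = n * RangeSemitones;
        return Mathf.Pow(2f, semitones / 12f);
    }
}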

High Concept

For mood, we sought to create all of the sounds using synthesized sources as much as possible, and to create a stable foundation from which to have fun things pop out at you, like musical critters and the different sorts of interactions. The music is generally meant to be introspective, while also helping you find creatures. The ambient audio is highly dynamic, reacting to the environment around you and what you do, whether it be brushing through cattails or trees, thunder coming from a particular spot with accurate timing and direction, or the pitter patter of rain being different on the surface of water or land.

Eti gave lots of really solid direction about how he wanted things to feel, and I generally ran with those ideas. We wanted the marsh to feel very dynamic and grounded, while making the sky feel ethereal and magical. When designing the audio for the star system, I was inspired by The Witness' environmental puzzles, which really make you feel like a wizard as you drag lines across large swaths of the environment.

We wanted the monolith to feel ominous, but also uncanny, so it operates in a different way than the rest of your surroundings. As you approach it and look at it from certain angles, the rest of the audio drops out, or the sound of the monolith modulates completely. Sometimes the hardest part is getting these sorts of choices to really stand out in an effective way. The sanity test for any clever implementation is to test it against a simple one-time sound, or a simple piece of looping audio, and see if it makes that much of a difference in people's experiences.

Attention to Detail

A lot of attention was given to making the sounds (and by extension, the world) feel organic, and to helping the player be attuned to what is happening around them. I tried to incorporate lots of subtleties in hopes that they would add up. Some of the minor details include the way the wind changes as you walk around a tree, or the way it blows past your ears when you're moving extra fast. The sound of creatures is a bit different when it's reverberating from further away, and each creature, even within the same species, has slightly different sonic behavior. There are lots of little things like this in the game, and in a vacuum they may not be noticeable or amount to much, but the hope is that they accumulate to encourage a heightened sense of awareness and immersion.

All Synths, All the Time

The choice to limit myself to only using synthesizers (and effects) to design all of the sounds proved to be quite the challenge. Certain sounds, like water and voices, are very difficult to pull off convincingly. I tend to try and find a unique angle on every project that will be fun and challenging. In this case, the challenge has at times outweighed the fun, but the sense of accomplishment has generally made up for the grind.

In order to get synthesizers to sound like realistic(ish) sources, lots of effects and modulation tricks were paramount.

Procedural Voiceover

The creature voices were one of the more challenging aspects of the sound design for Paradise Marsh. Some characters went through numerous iterations before we landed on the right feeling. The voiceover system concatenates short soundbites together to create longer sentences and phrases. This system also has modifiable controls, such as speaking rate, ‘terseness’ (how many soundbites it takes to say something), base pitch and pitch range. The volume of dialogue also changes to match changes to the visuals, such as font size.
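
A bare-bones sketch of the concatenation idea follows; the parameter names and the simple PlayOneShot routing are mine, not the shipped system's, but they show how speaking rate, terseness, and pitch range could interact.

// SKETCH: concatenative creature speech. Parameter names are illustrative.
using System.Collections;
using UnityEngine;

public class CreatureSpeech : MonoBehaviour {

    public AudioSource source;
    public AudioClip[] soundbites;   // short slices cut from longer rendered performances
    public float speakingRate = 6f;  // soundbites per second
    public int terseness = 8;        // how many soundbites it takes to say something
    public float basePitch = 1f;
    public float pitchRange = 0.2f;
    public float volume = 1f;        // scaled elsewhere to match the visuals (font size, etc.)

    // run with StartCoroutine(Say()) when a line of dialogue appears.
    public IEnumerator Say() {

        for (int i = 0; i < terseness; i++) {

            var clip = soundbites[Random.Range(0, soundbites.Length)];
            source.pitch = basePitch + Random.Range(-pitchRange, pitchRange);
            source.PlayOneShot(clip, volume);
            yield return new WaitForSeconds(1f / speakingRate);
        }
    }
}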

The actual soundbites were created from longer, rendered out performances. Typically these were performed using a keyboard with modulation such as pitch bend and mod wheel automation to add variety and to bring out the full sonic possibilities of the synth patch.

We also experimented with creating progression in the dialogue. Before a creature's constellation is completed, a high-pass filter is applied to its voice. Musical twinkle sounds are also played underneath the dialogue to give it a more magical feel. These were added using a simple system, although it might have been more economical to bake them into the creature sounds. But sometimes an idea would arise during development, and the easiest way to try it was to hack something together in code using samples already imported into the game.

Trees

Ambience

The trees in the game are a prime example of a deployed system that may or may not be more sophisticated than necessary. This is why it can be prudent to playtest and ask targeted questions afterwards, to get a sense of what people do and don't notice. That said, sometimes things that go unnoticed are still working well, so it can be hard to parse what is best.

Alignment Based Crossfading

This system utilizes two mono sound sources that emit from the location of each tree. The output is a crossfade of these two sources, determined by the player's orientation relative to the tree. For instance, a player on the north side of the tree might hear 100% of one sound and none of the other, while the reverse would be true when standing on the south side.
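
A minimal sketch of the idea, assuming an equal-power curve (the shipped crossfade may be shaped differently):

// SKETCH: crossfading two mono sources by the player's bearing around a tree.
using UnityEngine;

public class TreeAmbience : MonoBehaviour {

    public AudioSource sourceA;   // heard fully from the "north" side
    public AudioSource sourceB;   // heard fully from the "south" side
    public Transform listener;

    void Update() {

        Vector3 toListener = listener.position - transform.position;
        toListener.y = 0f;

        // 0 when the listener is due north of the tree, 1 when due south.
        float t = Vector3.Angle(Vector3.forward, toListener) / 180f;

        sourceA.volume = Mathf.Cos(t * Mathf.PI * 0.5f);   // fades out as you circle around
        sourceB.volume = Mathf.Sin(t * Mathf.PI * 0.5f);   // fades in
    }
}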

Poor Man's Capsule Attenuation

When Eti introduced pickups into the game, specifically the mushroom which makes the player quite large, an issue arose where the attenuation of tree sounds worked fine at ground level, but would be too quiet when walking around as a giant. In order to solve this in a way that didn't just use a larger sphere (which would increase the distance attenuation in all directions), I opted to track the vertical position of tree sound emitters with the height of the camera, but only within a set range.

This allows the emitter to move up and down along the vertical axis of the tree, in essence replicating the attenuation behavior of a capsule.
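
Something along these lines, where the range values are placeholders:

// SKETCH: slide the emitter along the tree's vertical axis to match the camera height,
// clamped to a set range, so a simple sphere attenuation behaves roughly like a capsule.
using UnityEngine;

public class CapsuleishEmitter : MonoBehaviour {

    public Transform emitter;   // the tree's sound emitter
    public float minY = 0f;     // base of the trunk, relative to the tree
    public float maxY = 12f;    // top of the canopy

    void LateUpdate() {

        float camY = Camera.main.transform.position.y;
        Vector3 p = emitter.position;
        p.y = Mathf.Clamp(camY, transform.position.y + minY, transform.position.y + maxY);
        emitter.position = p;
    }
}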

Global Wind

While working on the Lost Souls update after release, I came up with the idea to generate a global 0..1 value that would represent the current intensity of wind in the world. This value uses a simplex waveform and properties such as min/max amplitude and frequency to create a sense of pseudorandom fluctuation. The signature of this wind changes periodically throughout each in-game day, and not only drives the volume and crossfade amount between the two sound sources, but also the shaders for trees and other leafy objects.
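
A rough sketch of the shape of that value, using Unity's built-in Perlin noise as a stand-in for the simplex source and placeholder ranges:

// SKETCH: a global 0..1 wind intensity shared by audio and shaders.
using UnityEngine;

public class GlobalWind : MonoBehaviour {

    public static float Intensity { get; private set; }   // read by tree audio and foliage shaders

    public float minAmplitude = 0.2f;   // these would be re-rolled periodically through the in-game day
    public float maxAmplitude = 1f;
    public float frequency = 0.05f;

    void Update() {

        // pseudorandom fluctuation between the min and max amplitude.
        float n = Mathf.PerlinNoise(Time.time * frequency, 0.37f);
        Intensity = Mathf.Lerp(minAmplitude, maxAmplitude, n);

        // "_WindIntensity" is a hypothetical shader property name.
        Shader.SetGlobalFloat("_WindIntensity", Intensity);
    }
}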

Spirit Forest

Drones

For the Lost Souls content update, Eti built a special "spirit forest" biome, which had a huge canopy of trees. I thought this would be the perfect place to dust off my old "tree drones" experiment from Solar Ash.

Instead of using actual synths, I used subtly effected sine-wave loops, while reimplementing similar features such as scale-based portamento of pitch as you move around the trees, and focus-based attenuation.

Call and Response

Another feature of the spirit forest is that there is a huge center tree, and then a bunch of smaller trees around it. As an aesthetic feature I thought it would be cool to have these trees communicate with each other in some fashion, and so I came up with a simple call and response system. The center tree emits a sound periodically, and then one of the outside trees responds to it in kind, playing a similar variation of the same emotive sound, but at a different pitch. I ended up just rendering stereo assets for this and then using the left and right channels individually as mono sounds, where the center tree would play one and an outer tree would play another.

Foliage

I devised a system to generate audio for when the player moves through leafy, collisionless objects such as cattails, bushes, and tree leaves. The player's movement speed is used to modulate the pitch and volume of a looping piece of sound. This is done using colliders, typically spheres and boxes; the modulation happens while the player is within said collider (or a group of colliders working in tandem). Once the player leaves the collider(s), the sound remains at its last known position, ie. near the foliage you were just moving through. This adds just a bit of realism as you're running through stuff - the faster you're moving away from the foliage, the quicker the sounds attenuate and pan away from your current position.
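
A simplified sketch of one such foliage trigger (the speed range and tag check are placeholders):

// SKETCH: a rustle loop driven by player speed while inside the foliage trigger.
// On exit the loop simply stays where the player last was, so it pans and fades away behind them.
using UnityEngine;

public class FoliageRustle : MonoBehaviour {

    public AudioSource loop;    // looping rustle sound
    public Rigidbody player;    // whatever provides the player's velocity
    public float maxSpeed = 8f;

    bool inside;

    void OnTriggerEnter(Collider other) { if (other.CompareTag("Player")) inside = true; }
    void OnTriggerExit(Collider other)  { if (other.CompareTag("Player")) inside = false; }

    void Update() {

        if (!inside)
            return;   // leave the loop at its last known position

        float speed01 = Mathf.Clamp01(player.velocity.magnitude / maxSpeed);
        loop.volume = speed01;
        loop.pitch = Mathf.Lerp(0.8f, 1.2f, speed01);
        loop.transform.position = player.position;   // follow while moving through the foliage
    }
}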

Outro Foliage : Tuned to Music Progression

During the outro there is special foliage that when walked through, plays musical tones. These actually choose from different subsets of pitches based on how much of a linear, looping piece of music has been played. To accomplish this, we check what the current sample position of the linear piece is, and then choose different pitches accordingly.

// PSEUDO-CODE

// hash is a string of duodecimal digits that represents the 12 pitches.
// ie. 0123456789ab == C C# D D# E F F# G G# A A# B
if (samplePos < 0) {

    if (Distance(benchPosition) < 25)
        hash = "1468ab";
    else
        hash = "038a";
}
else if (samplePos <= 483116)
    hash = "0368a";
else if (samplePos <= 552132)
    hash = "0258a";
else if (samplePos <= 1104264)
    hash = "0357a";
else
    hash = "0138a";

// come up with a new repeating sequence whenever the hash has changed.
if (mxHash != hash) {

    mxOrder = Liszt.Make(hash.Length, (i) => i).Shuffle();
    mxHash = hash;
}

Vertex Environmental Audio

By far the most complex (and expensive) audio system in Paradise Marsh is the vertex system, used for intelligently attenuating environmental sounds unique to each biome. The game's terrain is generated using 3D Perlin noise, and along its topography, a lattice of vertices is also generated for use in audio. Because geographic features such as water, coast, and dry land are all determined by the heightmap of the terrain, it was easy to piggyback off of this idea in order to create subsets of vertex data for tracking different sounds at those same geographic features. This system is also used to attenuate rain.

Calculations

To figure out distance, we simply calculate the distance to the closest vertex. Panning is calculated by measuring the angle between the two "widest" vertices - in other words, the vertices that are the most "left" and "right" from the camera's perspective. These vertices are also weighted by distance, so that closer vertices matter more than those further away. This means that if there is a vertex all the way off to the left, but it's super far from the camera, it won't make the stereo field wider than expected.
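
In sketch form, and under the assumption that "widest" means the largest distance-weighted signed angles to the left and right of the camera:

// SKETCH: distance and stereo width from a set of nearby vertices.
using System.Collections.Generic;
using UnityEngine;

public static class VertexAudioMath {

    public static void Evaluate(Transform listener, List<Vector3> vertices, float maxRange,
                                out float closestDistance, out float width01) {

        closestDistance = float.MaxValue;
        float leftMost = 0f, rightMost = 0f;   // signed angles in degrees

        foreach (var v in vertices) {

            Vector3 to = v - listener.position;
            float dist = to.magnitude;
            closestDistance = Mathf.Min(closestDistance, dist);

            // signed angle from the camera's forward axis.
            float angle = Vector3.SignedAngle(listener.forward, to, Vector3.up);

            // weight by distance so a far-away vertex can't widen the field on its own.
            angle *= Mathf.Clamp01(1f - dist / maxRange);

            leftMost  = Mathf.Min(leftMost, angle);
            rightMost = Mathf.Max(rightMost, angle);
        }

        // 0 = a point source straight ahead, 1 = sound wrapping all the way around you.
        width01 = Mathf.Clamp01((rightMost - leftMost) / 360f);
    }
}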

Optimization Using Spatial Hash Grids

At one point, this system was using around 10% of the game's entire performance budget. To remedy this, the system went through many stages of optimization, the most important being the implementation of a spatial hash grid. I was introduced to this concept by Dan Reynolds from Unreal Engine's audio team, who had implemented something similar while working on The Matrix Awakens demo. I knew that while I could build such a system myself, it seemed prudent to ask for help from a more experienced friend, so I asked my colleague Charlie Huguenard to help build a hash grid system we could implement.

Before this, when you were in a biome, the game would iterate over all the vertices of a particular terrain type (ie. coast) in that biome, despite the fact that a large portion of those vertices were too far away to matter. To reduce the amount of iteration needed, the spatial hash grid breaks the biome up into a grid of smaller chunks, where each chunk has a smaller list of vertices to iterate through. This way, we only iterate through vertices that are nearby enough to matter to the audio output of the system.
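
The core of the idea looks something like this (a minimal sketch, not Charlie's actual implementation):

// SKETCH: a minimal 2D spatial hash grid for the audio vertices.
using System.Collections.Generic;
using UnityEngine;

public class VertexHashGrid {

    readonly float cellSize;
    readonly Dictionary<Vector2Int, List<Vector3>> cells = new Dictionary<Vector2Int, List<Vector3>>();

    public VertexHashGrid(IEnumerable<Vector3> vertices, float cellSize) {

        this.cellSize = cellSize;
        foreach (var v in vertices) {

            var key = Key(v);
            if (!cells.TryGetValue(key, out var list))
                cells[key] = list = new List<Vector3>();
            list.Add(v);
        }
    }

    Vector2Int Key(Vector3 p) =>
        new Vector2Int(Mathf.FloorToInt(p.x / cellSize), Mathf.FloorToInt(p.z / cellSize));

    // Only the vertices in the listener's cell and its eight neighbors are iterated,
    // instead of every vertex of that terrain type in the biome.
    public IEnumerable<Vector3> Nearby(Vector3 listener) {

        var center = Key(listener);
        for (int dx = -1; dx <= 1; dx++)
            for (int dz = -1; dz <= 1; dz++)
                if (cells.TryGetValue(new Vector2Int(center.x + dx, center.y + dz), out var list))
                    foreach (var v in list)
                        yield return v;
    }
}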

Storms

The rain sounds in the game are also attenuated using the vertex audio system. There are different rain sounds that emanate from the water and from land. This was done to create a more immersive sense of weather as you move around. Because the lightning is a global shader effect and can't be pinpointed to a certain location, the sound of thunder has to convey position on its own. Thunder sounds are triggered from a random direction and with a random delay time that is correlated to an attenuation amount. This mimics the way thunder behaves in real life, where it takes time for the sound of a lightning strike to reach your ears - the longer it takes, the further away the strike was. I originally learned this as a kid while playing the Cyan game "Spelunx and the Caves of Mr. Seudo".

// PSEUDOCODE

LightningStrikeVFX();

float thunderMinDelay = 0f;
float thunderMaxDelay = 1.5f;

// delay between light and sound:
float delay = Rando.Range(thunderMinDelay, thunderMaxDelay);
yield return new WaitForSeconds(delay);

// get a random upward position.
Vector3 origin = 100f * new Vector3(
    x: Rando.Range(-1f, 1f),
    y: 1f,
    z: Rando.Range(-1f, 1f)
);

// thunder audio: a longer delay maps to a quieter, more distant-sounding clap.
AudioSystem
    .Get("enviro.thunder")
    .Prep(volume: Map(
        map: delay,
        fromA: thunderMinDelay,
        fromB: thunderMaxDelay,
        toA: 1f, toB: 0.5f)
    )
    .Relocate(playerPosition + origin)
    .SpatializeOnly()
    .Fire();

--

Object Pooling

In an open-world game where there are thousands of objects (and by proxy, sounds), a pooling system can help reduce performance cost by avoiding constantly creating and destroying objects. So we created a pooling system where audio objects are lent out when needed and returned when no longer in use - it works a bit like borrowing books from a library.
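
A bare-bones version of such a pool might look like this; the real system presumably tracks more state per audio object.

// SKETCH: the library-card version of an audio object pool.
using System.Collections.Generic;
using UnityEngine;

public class AudioSourcePool : MonoBehaviour {

    public AudioSource prefab;
    public int initialSize = 32;

    readonly Stack<AudioSource> shelf = new Stack<AudioSource>();

    void Awake() {

        for (int i = 0; i < initialSize; i++)
            Return(Instantiate(prefab, transform));
    }

    // borrow a source instead of instantiating a new one.
    public AudioSource Borrow() {

        var source = shelf.Count > 0 ? shelf.Pop() : Instantiate(prefab, transform);
        source.gameObject.SetActive(true);
        return source;
    }

    // return it instead of destroying it.
    public void Return(AudioSource source) {

        source.Stop();
        source.gameObject.SetActive(false);
        shelf.Push(source);
    }
}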

Music System

Because the game is an open-world sandbox, I felt that an ambient, phrase-based approach might work best. The music is composed of "vibes", each of which has its own scale, key, and instrumentation. Each vibe has 20 phrases, each of which contains three layers that can be played in conjunction at any one time. The game initially shipped with two of these vibes: F# Lydian and F# Minor Variants (bits of both Aeolian and Dorian), but the intention was always to have more. In practice the amount of effort required to create a vibe was significant, and it also turned out the game did not need very many to feel fleshed out. A third vibe, C# Mixolydian, was added for the Lost Souls update.

I think "open" is a key concept in trying to devise a system that feels natural, and works in most contexts. I took cues from my work with Troupe Gammage on Solar Ash, which in turn was heavily influenced by the music system from Breath of the Wild. The music phrases are generally slow and have plenty of silence between them, so that the game breathes and the player is never overly inundated.

The system is also driven to a large degree by the location of creatures. Music will emanate from nearby creatures, and if there are none nearby, you will get music that is lonelier and more ambient.

In nature there are all sorts of genetic mutation and differentiation, and the music system plays with a similar idea by creating infinite variations upon itself. The music contains a few different "moods", each with 20 phrases, where each phrase contains three solo performances that are meant to be played alongside each other. However, these three layers are modulated in all sorts of ways, whether it's volume, panning, pitch, effects, delayed timing, reversing and so forth. Some of that can be attributed to laziness, as at times I had more fun building out variation in code than just creating more assets. But ultimately the focus on diegetic audio (ie. sound in the physical world), whether it be environmental cues or music, and having systems drive a lot of the audio, helps to keep things feeling intelligent and natural.

Variability

A lot of different techniques were employed to make this phrase-based system as dynamic and varied as possible. The phrases were designed to work in different configurations, which can mean things like only hearing one or two of the three layers, having the layers be randomly out of phase with each other, or even pitching one layer up or down. Effects such as high- and low-pass filters and an echo are also applied to each phrase to create even more variety.
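
One way those per-phrase rolls could be bundled up, with placeholder ranges (the real system has many more knobs):

// SKETCH: rolling a per-phrase variation.
using UnityEngine;

public struct PhraseVariation {

    public bool[] layerActive;   // which of the three layers actually play
    public float[] layerDelay;   // layers pushed out of phase against each other
    public float pitchSemitones; // whole phrase pitched up or down
    public float lowpassHz;      // per-phrase filter setting
    public float echoMix;        // per-phrase echo amount

    public static PhraseVariation Roll() {

        var v = new PhraseVariation {
            layerActive = new bool[3],
            layerDelay = new float[3],
            pitchSemitones = Random.Range(-2f, 2f),
            lowpassHz = Random.Range(2000f, 20000f),
            echoMix = Random.Range(0f, 0.4f)
        };

        for (int i = 0; i < 3; i++) {
            v.layerActive[i] = Random.value > 0.3f;   // sometimes only one or two layers
            v.layerDelay[i] = Random.Range(0f, 1.5f);
        }
        return v;
    }
}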

Predictability in Place

During the development of the Lost Souls update, I made it so the system re-uses the same phrase assets repeatedly as long as the player stays within a certain world radius. Once you move out of this small area, the system will use new phrases. Even so, all of the other variability of the system continues to operate regardless, creating subtler variations.

Creature Echolocation

One of the more unique components of the music system is that its layers are attenuated in the world based on the location of (up to) three nearby creatures. The intention is that the game is busier musically as you approach creatures, giving the player a different feel depending on whether they are in a more active creature area or not. If there are only, say, one or two creatures nearby, the music system will only trigger as many layers as there are creatures.
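
In sketch form, the layer-to-creature assignment could be as simple as this; the creature list is assumed to come from however the game already tracks nearby creatures.

// SKETCH: pinning the phrase layers to up to three nearby creatures.
using System.Collections.Generic;
using UnityEngine;

public class CreatureMusicLayers : MonoBehaviour {

    public AudioSource[] layers;   // the three phrase layers

    public void TriggerPhrase(List<Transform> nearbyCreatures) {

        // only as many layers as there are creatures nearby (capped at three).
        int count = Mathf.Min(layers.Length, nearbyCreatures.Count);
        for (int i = 0; i < count; i++) {

            layers[i].transform.position = nearbyCreatures[i].position;
            layers[i].Play();
        }
    }
}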

Night and Day

At night, the music system behaves slightly differently. The playback is randomly transposed up either a fifth or an octave, so that the music is lighter, brighter, and more bouncy. An echo effect is also applied to each phrase, with random variations. All of this is intended to create a more twinkly, magical feeling, which coincides with the monolith becoming accessible at night as well.

Biome Specific Loadouts

There are certain biomes that modify the music system in unique ways. When you are in a landfill, the music is reversed. When you are in a snowy area, the music is made to be more muffled sounding.

Spacing

There are many circumstances where music is undesirable. Typically this is because a feature is active that doesn't require gameplay music, such as messages in a bottle, but there are also environmental and behavioral circumstances where we don't trigger music. Some examples of this are when the player is jumping, using the net, or near a monolith. These are just little ways to break up the pacing of the music, and to reward the player for taking a break, or changing their behavior.

Empty Mode

Late in development it became clear that there was something missing from the progression of the game's music. Because the amount of music is directly linked to how many catchable creatures are nearby, the game gets more sparse musically over time. While this feels nice as the music ramps down, it would get to a point where the game became eerily silent. In order to alleviate this, I took all of the music assets from the game and created long, blurred out, even more ambient versions that play one at a time when you're in an environment with no creatures about. This keeps the soundscape a bit more active, while commenting a bit on the loneliness of your current predicament. It also does a nice job of differentiating between areas where there are currently creatures to catch and areas where there are not, whether because you've already caught them, or because it's a time of day when they're all "asleep".

Main Menu

The ambient music for the main menu was designed on a bit of a whim - I knew we needed something and thought I might be able to make the pre-existing music system work with a bit of tweaking. The main menu randomly picks "empty" music assets to play, in groups of 1 to 3 layers per trigger, along with effects and the chance to play things in reverse. In addition, I wanted to give the sense that the perspective of the menu is of someone looking up at the sky from the marsh, as opposed to something more otherworldly. So I created a system that procedurally creates a wave of crickets - it ebbs and flows at different rates.


Monoliths

In Paradise Marsh, monoliths are the ancient, mysterious pillars that contain the technology (or magic) needed to send the creatures you collect back into the sky to unlock the stars. From early on, Eti and I had discussions about making them sound otherworldly and a bit eerie. I wanted them to literally stand out from the environment, so as you approach them, other sounds in the world duck out almost completely to make way for eerie, alien-sounding drones. Beyond that, the amount of ducking is stronger when you're standing parallel to the monolith, while facing its side will not affect the mix of the game as much. The sound of the monolith is composed of different loops which crossfade as you walk around it. This is meant to make it feel a bit dynamic and responsive to the player. In order to figure out how to continuously crossfade between multiple sounds, I came up with the following algorithm:

/// Equal-power volume (0..1) for sound i of count, given a 0..1 crossfade position x.
public static float MultiCrossfadeEqP(float x, int i, int count) {

    var startX = (float)i/count;
    var domainSize = 1f / count * 2;
    var endX = startX + domainSize;

    if (x >= startX && x <= endX) {

        // map x within this domain to 0..2.
        x = (x - startX) / (endX - startX) * 2f;

        return x <= 1f ? Mathf.Sqrt(x) : Mathf.Sqrt(1 - (x - 1));
    }
    // edge case for final crossfade looping back around.
    else if (i == count - 1 && endX > 1f) {

        endX -= 1f;
        if (x <= endX)
            return Mathf.Sqrt(1f - (x/endX));
    }

    return 0f;
}

This lets you iterate over multiple sounds and calculate the correct volume level for all of them, based on where in a 0..1 range you are and how many sounds you are crossfading between successively.
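
For example, assuming loops is an array of AudioSources for the monolith and x is the 0..1 walk-around position:

// usage sketch
for (int i = 0; i < loops.Length; i++)
    loops[i].volume = MultiCrossfadeEqP(x, i, loops.Length);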

Additive Mixing

This system was in part inspired by conversations I had with Dan Reynolds while working on the Lyra sample project for Unreal Engine 5. Essentially, when you have a lot of different gameplay scenarios that could theoretically want to adjust the mix of the game, it's useful to allow these mix changes to stack. This way, you don't have two systems both trying to directly change, say, the volume level of the music mix at the same time, creating a race condition where the two changes zipper against each other and produce weird, undesirable results. In order to create such a system, I used a dictionary to keep track of the different mix adjustments that can be requested simultaneously. These adjustments are then summed in decibels.

It was also important to separate out the user's volume controls, so that they would not be affected by in-game mix changes, or vice versa.

// each mix channel has a user volume and a variable-length dictionary of faders, each with a dB value.
// These all get summed every frame to determine the correct dB for the channel.
public void Update() {

    float x = USER_DB;

    foreach (float f in Faders.Values)
        x += f;

    // clamp to the channel's ceiling so stacked boosts can't blow past it.
    if (x > DB_CEIL)
        x = DB_CEIL;

    SetFloat(Name, x);
}

Live Consoles

To reduce the number of times I had to hop back and forth in code, I came up with the idea of using data assets as if they were singletons, in order to drive parameters while the game was running. Typically this was used to mix the volume of systems such as creature calls, weather, etc., but it could also apply to other types of properties. I opted for data assets over putting system values on prefabs where possible, because prefab instances create multiple places a value can be controlled from, and I wanted to minimize that room for error as much as possible.
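
In Unity, a data asset like this is typically a ScriptableObject; here's a minimal sketch with made-up field names.

// SKETCH: a data asset used like a live mixing console.
using UnityEngine;

[CreateAssetMenu(menuName = "Audio/Live Console")]
public class AudioLiveConsole : ScriptableObject {

    [Range(-60f, 0f)] public float creatureCallsDb = -6f;
    [Range(-60f, 0f)] public float weatherDb = -3f;
    [Range(-60f, 0f)] public float foliageDb = -9f;

    // systems read these fields from the one shared asset, so values can be tweaked
    // in the inspector while the game is running, without touching prefab instances.
}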

Creatures

For some of the animals, I tried to mimic their real-life sounds - for others, I had to find other influences, or come up with something symbolic that would really put their personality first. The spider, for instance, I imagined to be somewhere between 'Boo' from Super Mario 64 and the spiders from Peter Jackson's Tolkien films. Eti asked if we could model the beetle's voiceover after Johnny Bravo, so it has a deep, arrogant bug voice. The animal sounds can also differ a bit stylistically between dialogue and the calls heard out in the world. This has a lot to do with the choice to make the creature calls musical: they always select a harmonious pitch, allowing them to weave seamlessly into the musical soundscape of the game.

Wind Chimes

One of the many small features that got some attention - these use physics to figure out the magnitude of impacts between the chimes. Each set of chimes has a deterministic sound profile (sample set, octave, pitches).
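
A rough sketch of the impact side, with a placeholder pitch table and threshold:

// SKETCH: chime impacts driven by physics.
using UnityEngine;

public class Chime : MonoBehaviour {

    public AudioSource source;    // configured with this chime set's sample
    public float[] pitchSet;      // deterministic per set (sample set, octave, pitches)
    public float maxImpulse = 2f;

    void OnCollisionEnter(Collision collision) {

        // impact magnitude maps to volume.
        float strength = Mathf.Clamp01(collision.impulse.magnitude / maxImpulse);
        source.pitch = pitchSet[Random.Range(0, pitchSet.Length)];
        source.PlayOneShot(source.clip, strength);
    }
}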

Compass

While debugging the game, I had made a teleport function to help us get around quickly, randomly moving to different biomes. At some point I thought this could actually become a fun feature. One of the best parts about working with Eti on this game was that I could go off and build something (even a gameplay feature) and pitch it as something to include. To give the feature more polish, I combined teleportation with the eye-closing mechanic that quits the game, and added a chord of chime tones. The eye-closing mechanic uses a custom filter I built called 'Molasses', which is sort of like a gain fader and a low pass filter combined. Throw in the cartridge-appear sound from the menu, pitched down, and you get a sort of ethereal 'whoosh' each time you reopen your eyes. I made sure to build this on a separate branch and take some good videos to help pitch the idea, and luckily Eti was on board.

We added some polish by incorporating it into a special item you could find and pick up: the compass. Sound-wise, we added a special sound design for it that acts as an 'attract mode'. This sound is based on the rotation speed of the compass dial (re-using fishing rod sounds).

// called each frame: the faster the needle spins, the more frequent and louder the ticks.
float audioTimer = 0f;
bool noAudio = true;
void TriggerAudio() {

    if (noAudio)
        return;

    audioTimer += Time.deltaTime;
    float rotationalDelta = Mathf.Abs(needle.transform.rotation.eulerAngles.y - needlePreviousEulerY);
    float rate = needleAudioRate / rotationalDelta;
    if (audioTimer > rate) {

        float dynamicVolume = 0.1f + Maf.VolCurve(Maf.Normalize(rotationalDelta, 0f, 12f)) * 0.9f;
        float baseVolume = active ? 1f : 0.2f;

        AudioSystem.Instance
            .Get("compass.tick")
            .Attenuate(transform, minDist: Constant.ATTEN_MAX_DIST_DEFAULT * 0.5f)
            .Prep(baseVolume * dynamicVolume)
            .Fire();

        audioTimer = 0f;
        needlePreviousEulerY = needle.transform.rotation.eulerAngles.y;
    }
}

Audio for Particles

Options for triggering audio in synchronization with GPU effects are somewhat limited. We tried to tie sounds to things like meteor showers, which used particle systems, but ultimately we shied away from these sorts of implementations where possible due to the limitations they placed on audio feedback. The only prominent use of audio feedback for particles in the game is the sparkle effect that you see on interactable objects. These trigger at the end of each particle playback loop, which is one of the few callbacks that could easily be set up for audio.

Cut Features

1 Star, 1 Sine

One of the earliest audio ideas I tried was mapping a looping sine tone to every individual star, in such a way that by moving across the star field you'd get shifting pad-like chords, with attenuation and panning based on look direction. This proved to be very expensive, and also required DSP operating at audio rate (instead of on the game thread) in order to get all of these looping sounds to fade in and out smoothly enough for the player to whip their head around wildly. There were some ear-shattering bugs, and lots of finickiness in trying to get the volume changes to feel just right. At the end of the day we went with a simpler approach: one-shots when you hover over stars, saving the looping sounds for the constellation drawing. This proved to be the better way to go.

Culling Sounds

For a while, we had sounds for when trees and other tall objects would cull in and out of view. Typically this happens when an object crosses a distance threshold in front of the player, or when the camera turns towards an object that is suddenly in view. This was interesting, as it added a lot of new sonic detail, but ultimately it was a bit distracting. After all, culling is not something that conveys much useful information to the player.


Locator

In a procedurally generated environment, sometimes it's useful to be able to stumble upon features at their normal location(s), instead of just plopping them down by hand wherever. In order to support this, I built a debug script that could be placed on a prefab, which would draw a single-pixel line straight up into the sky from wherever the object was, as well as display the distance and facing direction of listed objects in the debug HUD.