Presentation: Abracadata!

I gave a microtalk at GDC 2018 as part of a session at the Artificial Intelligence Summit called 'Turing Tantrums: Devs Rant!'. I shared a thought experiment about exploring the possibility space of abstracted data relationships that cross disciplinary boundaries. Unlikely data marriages!

Transcript:

As a bit of an outsider I thought instead of ranting, it might be better to share a thought experiment around an area of interest for me lately ...

AbracaDATA!

Games are a treasure trove of data.

A lot of what happens in games has something to do with numbers and math, and this stuff is great for creating and reinforcing the internal relationships in a game.

The most common relationship is player input, and how it drives just about everything. But let’s focus elsewhere.

Maybe your game has an enemy. A blue rectangle OH NO! It’s being shuffled around the world with some movement instructions. And you spruce it up with an animation, and maybe it feels good with some tweaking, but if physics and animation share their data, maybe they make even better decisions. Also that bearded man is now a giant dog thing.

Of course, you may not want all your systems to share, and sometimes a hand-authored, isolated thing might be what you need. But tying camera movements to explosions or using text to drive a gibberish voiceover are examples of ways that tentacular relationships can improve the way a game feels.

Speaking of gibberish, I spend a lot of time thinking about sound, and lately I’ve been thinking about data sonification: using a data input to generate a sonic output. When a gameplay event occurs, like a footstep, we like to trigger a sound. This is a very useful and simple form of data sonification.

Another common practice is to map an object’s spatial position visually, to a relative spatial position, sonically. For instance, mapping the X position on screen to the stereo pan position of a sound, or mapping an object’s distance from the camera to a sound’s volume and brightness.

These mimic the way we hear things in the real world and are simple victories. But the examples I’ve given so far are well known and commonly employed. They’re perfect for clarifying, giving the player more coordinated feedback about what they’re interacting with.
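Those positional mappings can be sketched roughly like this. The function names and the linear curves are my own; real audio engines often prefer equal-power panning and logarithmic distance attenuation, but the idea is the same:

```python
def pan_from_x(x, screen_width):
    """Map a horizontal screen position to a stereo pan value in [-1.0, 1.0]."""
    return (x / screen_width) * 2.0 - 1.0

def volume_from_distance(distance, max_distance):
    """Attenuate volume linearly with distance from the camera, clamped at zero."""
    return max(0.0, 1.0 - distance / max_distance)
```

An object at the left edge of an 800-pixel screen pans hard left (`pan_from_x(0, 800)` gives `-1.0`), and an object halfway to `max_distance` plays at half volume.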

But I want to talk about some of the less utilitarian places these relationships can go. Why not do more to re-contextualize this data instead? We could springboard ourselves into explorations of relationships that are weird, novel, counterintuitive and wonderfully asymmetric.

Here’s a silly one. There’s a game called ‘Sonic Dreams Collection’, where changing the size of the game window on the main menu changes the pitch and speed of the music. But what if it went beyond that? Suddenly you might care about window sizes in this strange new context, and it might elicit a reaction not normally reserved for the size of your window…

Or what if you tied the movement pattern of the ripples of a nearby river to the hair physics of your player, but only inject the data as you move away from it? What is this environment trying to evoke? Negative magnetism? (what does that even mean)

Finding meaning here can be a bit like trying to parse through a tarot card reading. You draw some random sources and try to map meaning onto their relationship.

Maybe the gag about window size didn’t inspire a deeper search for meaning, but what about a more opaque & esoteric data abstraction? You might experience it as a kind of intelligence. And we could employ these relationships in subtle but cumulative ways.

Bees, for instance, perform a figure-8 called the ‘waggle dance’ that relays important locational info to other bees. You could create a dumb version of this cooperative relationship using abstracted data, and employ it within a system of similar objects. Maybe the relationship relies on distance between the objects, so that when they get close, they appear to share information with each other through sound or movement.
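A minimal sketch of the distance trigger for such a relationship might look like this. The function name and the pairwise check are my own inventions; what a pair actually "shares" when it gets close (a sound, a movement cue, a data value) is the fun part left open:

```python
import math

def close_pairs(objects, radius):
    """Return index pairs of objects near enough to 'share' information.

    `objects` is a list of (x, y) positions. For each returned pair, the
    game would trigger some visible or audible exchange between the two.
    """
    pairs = []
    for i in range(len(objects)):
        for j in range(i + 1, len(objects)):
            ax, ay = objects[i]
            bx, by = objects[j]
            if math.hypot(ax - bx, ay - by) <= radius:
                pairs.append((i, j))
    return pairs
```

Run every frame (or every few frames), this gives a dumb but serviceable backbone for a waggle-dance-like behavior among a flock of similar objects.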

As worldbuilders, we could hint at a deeper ecology, through layers of data abstraction that might seem cooperative, adversarial, emergent, or mysterious and difficult to verbalize. We can suggest that with or without the player, the actors in this ecosystem are hopelessly entangled, and will carry on with their ebb and flow, just like we all do. Could be cool ...

The nice thing is that unlike the "waggle dance", you don’t need to prove it out with science. Maybe even the most arbitrary data relationship could feel like real intelligence if it’s been sufficiently abstracted. Players will conjure up their own interpretations, they like to do that. So you just need to convince them that they are experiencing something meaningful.

In other words, I think that by making different parts of a game communicate and share information in non-traditional ways, we can emulate the vitality we experience from real intelligence, and as a result it may be possible to manufacture a deeper sense of meaning and causality.

And the more liberal the different parts are in communicating with unlikely partners, the more things may start to get downright ecological.

And an interesting ecology of data relationships would probably have different kinds at play ... opaque ones, transparent ones, those that seem arbitrary, those that are rational, the esoteric (opaque + arbitrary), the absurd (transparent + arbitrary), the accessible (transparent + rational), the intelligent? (opaque + rational) (I totally made this up)

I think the abstraction and recontextualization of data can lead to all sorts of results. But if we sense there are meaningful relationships of cause and effect at play, that could lead us to suppose there is intelligence, and that could bring more depth to our experience.

So give it a shot! Things will definitely happen if you let your systems co-mingle.

You could let the volume level of a creature’s mating call drive the probability that other creatures respond in kind.

You could light a room using the average color of the last 60 frames.

You could take the wave propagation system used to drive visual wind FX and map it to the size of an NPC’s shoes.
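One of these, the frame-averaged room light, could be sketched like so. The `RoomLight` class is hypothetical; it just averages whatever per-frame color you feed it (e.g. a downsampled frame color computed on the GPU):

```python
from collections import deque

class RoomLight:
    """Tint a room light with the average color of the last N frames."""

    def __init__(self, frames=60):
        # Old frames fall off automatically once the window is full.
        self.history = deque(maxlen=frames)

    def push_frame(self, avg_rgb):
        """Record one frame's average color as an (r, g, b) tuple."""
        self.history.append(avg_rgb)

    def color(self):
        """Average of all recorded frames, or black if none yet."""
        if not self.history:
            return (0, 0, 0)
        n = len(self.history)
        return tuple(sum(c[i] for c in self.history) / n for i in range(3))
```

Feed it the scene's average color each frame and use `color()` to drive the light: the room ends up glowing with a one-second echo of whatever was just on screen.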

But in any case,

AbracaDATA!

Or perhaps … Abraca...dada?(ism)

Link: GDC Vault: 'Turing Tantrums! AI Devs Rant'

Presentation: Serialism & Sonification in Mini Metro

I gave a talk at GDC 2018 as part of a session at the Artificial Intelligence Summit called 'Beyond Procedural Horizons'. I talk about how we combined data sonification and concepts taken from the musical approach known as Serialism to build a soundscape for Mini Metro.

Transcript:

Serialism & Sonification in Mini Metro

or, how we avoided using looping music tracks completely, by using sequential sets of data to generate music and sound.

Serialism

In music, there is a technique called Serialism that uses sequential sets of data (known as series), set about on different axes of sound (pitch, volume, rhythm, note duration, etc.), working together to create music.

In Mini Metro, we apply this concept by using internal data from the game and externally authored data in tandem to generate the music.

You might have noticed that the game has a clock - the game is broken up into time increments that are represented as hours, days and weeks (though of course faster than real time). And before diving in, it’s important to know that we derive our music tempo from the duration of...

1 In-Game Hour = 0.8 secs = 1 beat @ 75 bpm = our master pulse.

We use this as our standard unit of measurement for when to trigger sounds. In other words, most of the sounds in the game are triggered periodically, using fractional durations of 0.8 seconds.
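As a sketch of that scheduling idea (the function is hypothetical; the 0.8-second figure is the talk's master pulse), here's how trigger times derived from fractions of the pulse might be computed:

```python
HOUR = 0.8  # one in-game hour in seconds: the master pulse

def trigger_times(fraction, total_seconds):
    """Times (in seconds) at which a sound triggers, given a fractional
    duration of the master pulse. fraction=1.0 is one trigger per beat;
    fraction=0.5 triggers twice per beat, and so on."""
    interval = HOUR * fraction
    times = []
    t = 0.0
    while t < total_seconds:
        times.append(round(t, 6))  # round away float drift for readability
        t += interval
    return times
```

`trigger_times(1.0, 2.0)` yields `[0.0, 0.8, 1.6]`: one trigger per in-game hour over two and a half beats. In an actual engine you'd schedule these against the audio clock rather than precompute them, but the arithmetic is the same.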

In Mini Metro, the primary mode of authorship lies in drawing and modifying metro lines. They’re also the means by which everything is connected, and they serve as the foundation upon which the soundscape of pitches and rhythms is designed. Each line is represented by a unique stream of music generated using data from different sources.

The simplest way to describe this system is that each metro line is represented by a musical sequence of pulses, triggered at a constant rate with a constant pitch. This rate and pitch hold steady until they are shifted by a change in gameplay. Each metro station represents one pulse in that sequence. Each pulse has some unique properties, such as volume, timbre, and panning, and these are calculated using game data. Still other properties are inherited from lower levels of abstraction, namely unique loadouts for each game level. Some levels tell the pulses to fade in gradually, other levels might tell the pulses to trigger using a swing groove instead of a constant rate, and the levels are differentiated in other ways as well.
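A toy version of one line's pulse sequence might look like this. The volume and pan formulas here are illustrative stand-ins of my own, not Mini Metro's actual calculations; the point is only that each station becomes a pulse whose properties are derived from game data:

```python
def line_pulses(stations, screen_width):
    """Build one pulse per station on a metro line.

    `stations` is a list of (x, y) station positions. Each pulse gets a
    stereo pan from its screen position and a volume that thins out as
    the line grows (both mappings are hypothetical examples).
    """
    pulses = []
    for (x, y) in stations:
        pulses.append({
            "pan": (x / screen_width) * 2.0 - 1.0,  # screen position -> stereo field
            "volume": 1.0 / len(stations),          # busier lines, quieter pulses
        })
    return pulses
```

Triggering these pulses in order at the master-pulse rate gives each line its own repeating musical figure that changes the moment the player redraws the line.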

[1, 2, 3, 4, 6]

All of this musical generation is done using sets of data. And referring back to Serialism, the data is quite often sequential, in series. These numbers actually represent multiples of time fragments. Or, to put it more simply, rhythms. In the case of rhythms and pitches, the data is authored, so we have more control over what kind of mood we’d like to evoke. This data is cycled through in different ways during gameplay to generate music.
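Cycling through authored series, serialism-style, can be sketched in a few lines. The rhythm series is the one from the slide above; the pitch series here is a made-up example of authored data:

```python
from itertools import cycle, islice

rhythm_series = [1, 2, 3, 4, 6]   # multiples of the master pulse (from the talk)
pitch_series = [0, 3, 5, 7, 10]   # hypothetical scale degrees, authored per level

def next_notes(n):
    """Pair cycled rhythms with cycled pitches to produce the next n notes."""
    return list(islice(zip(cycle(rhythm_series), cycle(pitch_series)), n))
```

Because both series are the same length they repeat in lockstep; giving them different lengths makes the pairings drift against each other, which is one of the classic serialist tricks for getting variety out of small sets of data.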

So we’ve got some authored data generating musical sequences, but what about using game data? Ideally we could sonify it to give the player some useful feedback about what is happening.

Combinations of game and authored data are found throughout Mini Metro’s audio system. There’s lots of game data and authored data being fed into the system, working in tandem. Authored data is often used to steer things in a musical direction, while game data is used to more closely marry things to gameplay. In some cases, authored data even made its way into other areas of the game. Certain game behaviors like the passenger spawning were retrofitted to fire using rhythm assignments specified by the sound system.

You might ask why go through the trouble of doing things this way? Well, it is really fun. But beyond that there are a variety of answers, and I could go into a lot of depth about it, but I think the most important reasons are:

Immediacy & Embodiment

Immediate feedback is often reserved for sound effects and not music, and immediate feedback in music can feel forced if not handled correctly. This type of system allows us to bring this idea of immediacy into the music in a way that feels natural.

The granularity of the system allows the soundscape to respond to the game state instantaneously and evolve with the metro system as it grows, and a holistic system handles all of the gradation for you. When your metro is smaller and less busy, the sound is smaller and less busy. As your metro grows and gets more complex, so does the music and sound to reflect that. When you accelerate the simulation, the music and sound of your metro accelerates. When something new is introduced into your system, you’re not only notified up front, but also regularly over time as its sonic representation becomes a part of the ambient tapestry.

Embodiment

And this all (hopefully) ties into a sense of embodiment. Because all of these game objects have sounds that trigger in a musical way, and all use a shared rhythmic language that is cognizant of the game clock, and use game data to further tie them to what is actually happening in the game, things start to feel communal and unified.

It’s an ideal more than a guarantee, but if executed well, I think you can start to approach something akin to a holistic experience for the player.

Thanks!

Link: GDC Vault: 'Beyond Procedural Horizons'

Presentation: Curveballs

I gave a microtalk at GDC 2017 as part of a session called 'Composer Confessions'.

Transcript:

I often find the beginning of a project to be the most exciting period. There's a sense that anything is possible, which has often propelled me down many avenues in search of the 'thing'. Along the way, Murphy's Law that "anything that can go wrong, will go wrong" kicks in, and I rarely end up where I intended. Sometimes projects turn out worse, but often in my experience, they turn out different, or better. During the course of my work on a thing, there tends to be a psychological transition, from an exciting beginning of boundless possibility, to a starker, more responsible reality. You bring an idea out of the ether, and give it life. It gradually frames itself and the edges come into view, and increasingly, the ‘thing’, as I’ve so lovingly called it, solidifies. This process is a necessary part of the work, and it's a part of life too, right? We have little choice but to persist through the unending number of curveballs thrown our way, starting with being born! … And so we do our best to adapt.

The curveballs often take us down paths we would never take... or could have never imagined. They open doors for us that would otherwise remain closed, and they lead us to new conclusions. They change us. In my opinion this malleable process is often necessary to complete a work, regardless of whether we see or believe in it at the time. I find that only in the days that follow, can I truly begin to comprehend its meaning.

When starting work on a new project, I often set off in the first direction that sparks my curiosity. I try to keep the utilitarian needs of a video game in mind, but if I’m staring down a new idea, I seize it as an exciting challenge.

This tendency of mine perpetuates a myth I can’t help but tell myself time and time again... THIS WILL WORK! ...right? Before I know it, I might be neck deep in an esoteric cobweb of systems, or sinking large units of time into something wonderful! … and wonderfully inappropriate. I’m guilty of occasionally forgetting that when working on a project with real-world constraints, my ideas have to be grounded in that same reality.

I spent a few years creating music and sound for a game called ‘Hyper Light Drifter’. During this time I had a dream that I would write long, artfully composed pieces of music. They would move effortlessly from idea to idea, all while evoking the perfect expression every step of the way. I wanted to write these songs at the piano, and notate them completely before even thinking about the electronic production or how they would even be implemented in a video game.

Here I had created a set of goals for myself, which is something I often do during a project ... but this time, they ended up not suiting me or the project at all. First off, I hated using software notation, and I wasted lots of time inefficiently grumbling my way through it. Second, the piano pieces I had written, with all their shifts in tempo, energy and tone, were very hard to translate to an electronic sound.

I underestimated the difficulty of recreating the essence of a performance in a completely different context. I’ve transferred pieces to new instruments before, and thinking this was no different, I took it for granted. Lastly, the compositional style was not nearly agile enough to keep up with contextual changes during gameplay. For example: The player would move into a new area, and start doing a new thing, and the music would be off somewhere else, expressing some other idea.

It seems obvious looking back, but at the time, I thought I could pull it off. The path I had chosen proved to be a long, brambly one with a big dead end sign. It was emotionally tough to put so much time and effort into an unsuccessful direction, and it took a while to pick myself up and set out again.

I only stumbled into the right solution when I was thrown a curveball.

The developer decided to create a demo, and I suddenly needed to write music for a very large vertical slice of the game in a very short period of time. Granted, it was a curveball I was aware of, but due to poor planning, it crept up on me in a way that I think still serves the purpose of this talk. Feeling urgency, I had to forego my agenda and just write something ... Anything! And put it in the game.

This simple process yielded the best results, and changed the trajectory of the project for me. I learned a valuable lesson …

Sometimes I am too precious about my work. I tend to inflate the importance of ideas I've devised before they've proven themselves to work. And sometimes it's hard for me to let go of those expectations and move forward.

Towards the end of Hyper Light Drifter, as the pressure really started to mount and I was struggling creatively, I picked up a mantra ...

Broad Strokes

Painting with broad strokes has often been counter-intuitive for me. The rabbit hole is my favorite pitfall ... (pun intended). But by prototyping my work in practice, not in a vacuum, and saving detail-oriented decisions for later, I've started to save lots of time.

The curveballs are frequent now. But they are more useful than ever.

The full session also features Austin Wintory (Journey), Mick Gordon (DOOM), Wilbert Roget, II (Star Wars), Jason Graves (Dead Space) and Grant Kirkhope (Banjo-Kazooie).

Link: GDC Vault: 'Composer Confessions'