For the month-long Garden Game Jam, I was responsible for implementing an audio system framework in Godot. This system features:
Adaptive audio behaviors in response to states, switches, and parameters.
Per-sound, priority-based voice limiting with a global in-game voice cap.
Modular music system with multiple tempo-synced horizontal transition modes and vertical layering.
Simplified workflow requiring minimal code for audio hooks, with the audio system decoupled so that all audio work and changes can be handled within the audio code environment.
Extensible audio behaviors through inheritance of core classes.
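To illustrate the voice-limiting idea above, here is a minimal sketch of per-sound, priority-based voice limiting with a global cap. It is written in Python rather than GDScript purely for readability, and the class and parameter names (`VoiceLimiter`, `max_per_sound`, etc.) are hypothetical, not taken from the actual framework:

```python
class Voice:
    """A single playing sound instance."""
    def __init__(self, sound_id, priority):
        self.sound_id = sound_id
        self.priority = priority  # higher value = more important

class VoiceLimiter:
    """Culls or steals voices when per-sound or global caps are hit."""
    def __init__(self, max_total_voices, max_per_sound):
        self.max_total = max_total_voices
        self.max_per_sound = max_per_sound  # e.g. {"footstep": 2}
        self.active = []

    def _steal_or_cull(self, candidates, priority):
        # Steal the lowest-priority voice if the new sound outranks
        # (or ties with) it; otherwise cull the new request.
        victim = min(candidates, key=lambda v: v.priority)
        if victim.priority <= priority:
            self.active.remove(victim)
            return True
        return False

    def request_play(self, sound_id, priority):
        """Return a Voice if the sound may play, else None (culled)."""
        same = [v for v in self.active if v.sound_id == sound_id]
        cap = self.max_per_sound.get(sound_id, self.max_total)
        if len(same) >= cap and not self._steal_or_cull(same, priority):
            return None
        if len(self.active) >= self.max_total and \
                not self._steal_or_cull(self.active, priority):
            return None
        voice = Voice(sound_id, priority)
        self.active.append(voice)
        return voice
```

For example, with `VoiceLimiter(3, {"footstep": 2})`, a third footstep at a lower priority than the two already playing is culled, while a higher-priority one steals the weakest active footstep voice.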
Below is an overview of the class structure.
I teamed up with two friends to participate in the Pirate Software Game Jam 14, my first game jam. In this game you are a concert hall attendant trying to keep a noisy, disruptive crowd from ruining the performance! We interpreted the jam's theme of “It's Spreading!” through game mechanics in which whispers and conversation spread between audience members. In addition to composing and creating sound effects, I was assigned lead of the project, making gameplay decisions and guiding the overall vision of the game.
I originally made this project in 2023 for teaching audio middleware using FMOD; it was completed in 2024 so my friend Nathan Link could use it to learn FMOD and practice game audio creation as part of his Master's project.
A very short, linear, third-person shooter that I programmed in Unity using C#. I originally created this project with the intention of using it to teach basic audio middleware implementation. The audio setup is designed to let students create their own Wwise project and implement their sounds through easy-to-use dropdown menus in the game engine.
This project is the first game I've developed completely solo, with the goal of learning Unreal Engine blueprints. It's an 8-level, first-person shooter inspired by games like Doom and Halo. It was my first venture into programming and engaging directly with a game engine's systems. The game is currently fully playable from start to finish.
When I first started my Master's Thesis in 2021, I knew next to nothing about music production: I had never composed in any electronic genres, had only ever written notes on scores, and had no idea what most sounds in modern music were or how they were made. I had just started using Reaper as my DAW and, with its sparse initial state, was building my plugin library alongside my knowledge from the ground up. I wanted to use this project as a springboard into learning as much as I could about modern music production, and it ultimately played a huge role in my ability to go deeper into game audio.
My last year participating in the SFSU Game Collab. I was assigned audio lead on Mystic Mayham, a multiplayer AR spell combat mobile game. I also assisted Bard Battles with its Wwise rhythm-game implementation, in which players could trigger a modular melody at any time, layering it over the background music as the rhythm for the player to follow.
This was a very busy year for the SFSU Game Collab. We were a team of about 6 music students supporting 12 game teams simultaneously. I was assigned audio lead on two projects and did most of the audio work for both games: Containment and Godsworn. I later took on audio lead for City Run and Descend midway through the semester.
This is the first game I ever worked on as part of the SFSU Game Collab course and where my sound design and implementation journey began. Congratulations on making it this far down the page!