Published: September 2021, Square Enix
My first project with Deck Nine Games, and therefore the first project I ever worked on, was Life is Strange: True Colors.
About a year before landing the job, I had just finished up my Master of Fine Arts in Digital Production Arts at Clemson University. The program itself was geared mostly toward (or at least, had the most infrastructure for) creating VFX and animation tech directors for feature films and television. Going into video games was not unusual, but required a little more self-direction to get into. And while many people there specialized in a certain digital art field (character VFX, content pipeline management, surfacing, animation, etc.), I fell into the very broad category of “generalist” – which, at least according to the animation industry people we talked to, was not exactly super marketable.
It was not until my thesis defense that I was given a different term to go by and market myself as: Technical Artist. This turned out to be a good fit for me, as the job of a technical artist is super, super vague and all-encompassing. I interviewed with many companies following my graduation, each time for a technical artist role, and every time the job description was completely different – sometimes it was tool creation, sometimes it was animation, sometimes it was materials and graphics programming.
Eventually, about a year after graduating, I was interviewed by Deck Nine Games – very likely with the help of another DPA graduate who worked there, also a technical artist, who specialized in graphics programming. I was given the job, moved out to Colorado just as the country shut down for Covid, and was finally introduced to the game I would be working on – known at the time as Life is Strange 3 – and the little tech art niche I would be filling: character locomotion.
I learned this later, but when I was initially going to be hired, I was going to be put on the Life is Strange: Remastered team doing material work. However, my Master’s Thesis all about narrative based gameplay animations got me moved over to True Colors, where the locomotion animation system needed some love and care.
The Locomotion System: as it was
The locomotion system in True Colors was largely proprietary at the time. I’m not entirely sure how movement itself worked (I’m fairly certain we used the character movement component and root motion), but I grew intimately familiar with how the animation handling worked.
First, the usual: animations were mo-capped, cleaned up, then handed to gameplay animators to polish further and tweak where needed. From there, animation sets were set up per character in a pretty nifty, specialized tool. The tool (whose name I forget) was used to define animation state machines, where someone would define the rules between states – the blend times, what state goes to what, and most importantly, which frames go to which. Animations for a state could be defined, then processed to create a huge list of animation transitions per animation, as a sort of brute-force motion matching system.

The state definitions and motion matching lists would be exported out as Data Assets in Unreal and applied to the desired character. Desired states matching the ones in the state machine we created could be passed into a special animation node, which would make sure that the transition we wanted existed, and that a frame existed to transition to. If both were true, the animation at the given frame would be loaded up, and we’d transition out of the animation we were currently in. The system as a whole had a huge data footprint, but at runtime the locomotion animation node only ever had two animations loaded at a time.
The rules of WHEN to transition between states were not handled in the proprietary tool, but instead in a separate C++ class. If a given requirement was met, the new state would be requested of the animation node.
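The actual tool and node were proprietary, but the runtime check described above – “does this transition exist, and does a frame exist to transition to?” – can be sketched as a lookup into a precomputed transition table. Everything here (the struct, names, and data layout) is my own hypothetical reconstruction, not the studio’s code:

```cpp
#include <map>
#include <optional>
#include <string>
#include <utility>

// Hypothetical sketch of the precomputed transition data: for each
// (fromState, toState) pair, map a source frame to the destination frame
// that the offline "brute force" pass found to match it best.
struct TransitionTable {
    std::map<std::pair<std::string, std::string>, std::map<int, int>> table;

    // Returns the frame to blend into, or nothing if either check fails:
    // the state pair was never authored, or no matched frame exists for
    // the frame we are currently on.
    std::optional<int> findTransition(const std::string& from,
                                      const std::string& to,
                                      int currentFrame) const {
        auto it = table.find({from, to});
        if (it == table.end())
            return std::nullopt;            // transition not authored
        auto frameIt = it->second.find(currentFrame);
        if (frameIt == it->second.end())
            return std::nullopt;            // no matching frame to jump to
        return frameIt->second;             // frame to start "to" at
    }
};
```

Only when both lookups succeed would the node load the target animation at the returned frame and begin blending out of the current one.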

My work on the Locomotion System
My first task was making the state machine handling more designer-friendly. This meant exposing a lot of variables that, at the time, were just magic numbers in the C++ class – things like minimum and maximum angle thresholds for start walks, stick hold times, etc. It was a fairly easy task, and a good introduction to using Unreal Engine, which I had not used in any REAL capacity before that point.
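As a rough illustration (with made-up names and default values – the real fields were whatever the class actually needed), the change amounts to pulling the magic numbers into named, designer-editable properties. In Unreal these would be `UPROPERTY(EditAnywhere)` fields; a plain-C++ sketch of the idea:

```cpp
// Illustrative only: tuning values that were previously hard-coded
// "magic numbers" inside the locomotion C++ class. Exposing them as a
// named, editable set lets designers tweak feel without a programmer.
struct LocomotionTuning {
    float minStartWalkAngle = 15.0f;  // degrees; below this, no turn-start
    float maxStartWalkAngle = 180.0f; // degrees; largest handled turn
    float stickHoldTime     = 0.1f;   // seconds input must be held to start
};

// A rule that used to bury the numbers inline now reads off the struct.
bool shouldStartWalk(const LocomotionTuning& t, float inputAngleDeg,
                     float heldSeconds) {
    return heldSeconds >= t.stickHoldTime &&
           inputAngleDeg >= t.minStartWalkAngle &&
           inputAngleDeg <= t.maxStartWalkAngle;
}
```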
Following that, my main task was to polish the transitions from idle to movement. What we wanted originally was the ability to have start walk animations for 8 directions. This meant a lot of fiddling around with motion matching and state machine setups to make sure I could get transitions at the proper angles that also felt reasonably smooth.
While I was able to implement this rather quickly, there was back and forth about whether or not it was what we actually wanted. The main issue with this kind of robust start animation (a battle I fought in each game afterwards) in a step-by-step animation system such as this is that robust start walks can be really, really clunky, for the following reasons:
- To minimize foot sliding or harsh rotations, you might want to make sure the animation plays out in its entirety, and/or cut off input during the start walk state. This might mean the player gets stuck waiting for the animation to end before being able to change directions
- Additionally, any commitment to these animations means making bigger movements than you might want in places that call for more precise movement control
- If you don’t have a way to handle mid-animation direction changes, the start walk you began with could slide or rotate in odd directions that look bad

For all these reasons, and the general direction of keeping the game responsive to player input, we ended up scrapping multi-directional start walks and opted instead to play a generic forward start for every direction.
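For what it’s worth, the selection step of the scrapped 8-direction version is simple to sketch: bucket the desired movement angle into one of eight 45-degree sectors, each centered on a start-walk direction. This is a generic illustration of the technique, not the shipped code:

```cpp
#include <cmath>

// Illustrative 8-way start-walk selection. Angle is in degrees relative
// to the character's facing: 0 = forward, positive = clockwise.
// Buckets: 0 = Fwd, 1 = FwdRight, 2 = Right, 3 = BackRight, 4 = Back,
//          5 = BackLeft, 6 = Left, 7 = FwdLeft.
int pickStartWalkDirection(float angleDeg) {
    // Normalize into [0, 360).
    float a = std::fmod(angleDeg, 360.0f);
    if (a < 0.0f) a += 360.0f;
    // Shift by half a sector (22.5 deg) so each bucket is centered
    // on its direction rather than starting at it.
    return static_cast<int>((a + 22.5f) / 45.0f) % 8;
}
```

The hard part, as described above, was never this lookup – it was everything that happens after you commit to the chosen animation.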
From there, most of my work was owning the locomotion system and adding polish to transitions. I also worked on functionality for:
- Fidget animations
- Fidget animation handling came in two flavors: ones that ran between random delay times (true for most of the game), and ones that had a chance of randomly happening only at the end of a loop of the idle animation (true for when Alex is in the mine shaft at the end of the game, as the idles could otherwise be way too far off to blend nicely)
- Accessory animations
- A character could also be given an accessory – think Alex’s bag in the first scene, or the lantern in the mine shaft. These were separate actors with synced animations. While the system to run that was there before me, I was able to work with the content teams to ensure that the animations and rigs of the accessories were where they should be, so the character and accessory stayed properly synced.
- Gameplay Facial Animations
- In my first brush with custom C++ animation handling, I was able to expand our locomotion animation node to also handle blending between facial animations based on the running speed of the player. However, not much time was dedicated to getting all the animations made and polished, so it mainly served to add a consistent facial animation on top of gameplay animations for a more seamless result overall. This was toggleable, so that in certain situations (again, the mine shaft scene) it could be turned off to let the normal animations handle the facial animation for us.
- Point of Interest look-ats
- There was already a system built in for looking at characters, but nothing for taking certain interactable objects and making them targets for a look-at. This meant creating new behavior for entering the range of an interactable that sent its position data to the player to look at. The system included settings for how long to look at a new target, how long to linger when leaving the range of the interactable, how quickly to look at it, and how to blend in and out of look-ats. It also meant making a special case for being within range of an interactable at the end of a cinematic, which could cause a really weird, literally eye-popping blend to happen. And a lot, a LOT, of troubleshooting single-frame head pops on hard cuts between cinematics and gameplay.
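Of the systems above, the fidget scheduling is the easiest to sketch, since the two trigger flavors are just two policies on the same question: "should a fidget fire this frame?" The structure and names below are my own hypothetical reconstruction:

```cpp
// Hypothetical sketch of the two fidget-trigger modes described above.
enum class FidgetMode {
    RandomDelay,   // fire after a random delay, regardless of idle phase
    EndOfIdleLoop  // roll a chance only when the idle animation loops
};

struct FidgetScheduler {
    FidgetMode mode;
    float nextDelay;        // seconds until next fidget (RandomDelay mode)
    float endOfLoopChance;  // 0..1 chance per loop (EndOfIdleLoop mode)

    // `dt` is the frame's delta time; `idleLooped` is true on the frame
    // the idle animation wrapped; `roll` stands in for a random 0..1
    // draw so the logic stays deterministic here.
    bool shouldFidget(float dt, bool idleLooped, float roll) {
        if (mode == FidgetMode::RandomDelay) {
            nextDelay -= dt;
            return nextDelay <= 0.0f;  // caller re-randomizes nextDelay
        }
        // EndOfIdleLoop: only ever fire on a loop boundary, by chance,
        // so the fidget always blends from the idle's matching pose.
        return idleLooped && roll < endOfLoopChance;
    }
};
```

The second mode exists exactly for cases like the mine shaft idles: by only firing at the loop boundary, the fidget always starts from a pose the idle is guaranteed to be near.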
Adventures in Bug Fixing
Late into production at Deck Nine, coding specialties are thrown out the window and it’s all hands on deck to fix whatever comes your way. This led me to work on a few different systems of note, such as:
- The Disappearing Color Filter
- One of the options when setting up the game is a color blindness filter. The issue was that in our shipping builds, the filter would only appear to work when subtitles were onscreen. This bug was thrown my way, and while I was very confused as to why it was given to me, I did my best to find the issue. When I dug into the code, I realized Slate, the back-end that handles UI, only draws the filter if it has ANY UI to draw – this explained why we were seeing it work in our debug builds (on-screen debug messages count as UI and were always on) and not in shipping. To fix the issue, I went into the main gameplay window UI that is always open and, in the very bottom left corner juuuust off screen, added a blank white square. That fix has lasted in every game published since (though, having recently learned that we’re supposed to be using the filter as an in-house debugging tool to check whether the game is accessible to the colorblind on OUR end, it makes more sense why Slate handles it like that).
- Using One Controller
- Unreal Engine is built for multiplayer games, and as such, the engine-level controller handling makes sure to give different controllers different controller IDs. However, I was one day given a design list that detailed how controllers should work – basically, you should just be able to plug in any controller – even have two in at the same time – and be able to play with either one. This meant going into engine-level code and making a change to ensure that ANY controller input was just pushed through as controller ID 0.
- High Score Tables in the Arcade Games
- My other main non-bug-fix contribution to True Colors was programming the high score tables in the arcade games. This meant making the table contents into saveable data, and making sure there were two sets of it – one for the story, and one for the extras. For the Arkanoid port, there was also a push from Taito to make sure that certain words weren’t even allowed as names – while I was able to get a few of these in, we didn’t have much direction on what ALL was disallowed – especially since, in game, we HAD to have “ASS” as the previous high scorer.
- Speaking of “ASS” – there were supposed to be multiple references to ASS having the top score, depending on the scene. While this was never implemented, there still exists in code a function I wrote that, when entering a scene with one of these special dialogs, would add a new high score for ASS above the player if they had the top score – an “ASS Injection”, I called it.
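Since the actual function never shipped a caller, here is a hypothetical reconstruction of what an “ASS Injection” amounts to – everything below (names, table shape, the one-point margin) is invented for illustration:

```cpp
#include <string>
#include <vector>

// Hypothetical reconstruction of the unused "ASS Injection": on entering
// a scene with one of the special dialogs, if the player holds the #1
// spot, insert an "ASS" entry just above them so the dialog about ASS
// having the top score still makes sense. Table is sorted highest-first.
struct HighScore {
    std::string name;
    int score;
};

void injectAssAboveTopPlayer(std::vector<HighScore>& table,
                             const std::string& playerName) {
    if (table.empty() || table.front().name != playerName)
        return;  // player isn't on top; the dialog already reads fine
    // One point above the player is enough to reclaim the top slot.
    table.insert(table.begin(), {"ASS", table.front().score + 1});
}
```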
Since it’s been forever ago (a long, long four and a half years), I’m sure I’m forgetting certain bugs I could talk about, or at least forgetting enough details to make them interesting: a crash caused by some objects not having generated IDs; a weird quirk in the checkpoint system during replay mode that would cause Ducky to have a hat when he shouldn’t; some characters having bad positions when loading; the Foosball minigame scoring UI; a weird hole in the navmesh that only appeared on Xbox and eventually cleared itself up after I’d spent weeks trying to solve it – but that should cover my main responsibilities.
