GDC Recap 1: Getting to SF and getting my GDC on!

This year I attended GDC, the Game Developers Conference. It is the largest conference for those involved in the game development industry, and it is seen as a fantastic resource to learn skills, network with peers, and see where the industry is headed. It's not really a place for those who want to play games, but there are tons of interesting things for those who want to know how games are made.

Getting There

GDC is in San Francisco. I do not live in San Francisco. Getting to GDC involved a long bus ride from Southern California all the way up to the Bay Area. I could fly, but I actually prefer the relaxed feel of riding the bus over the panic of airports. I always fill my phone up with game and game dev podcasts to get myself in the zone, and try to plan out what I will be doing each day using the GDC app.

This year I picked out a hotel that was much nicer and much closer to Moscone Center than the place I stayed last year. I learned my lesson about booking last minute. Last year I was unprepared. I didn't give myself enough time to relax, and I planned an activity for every minute of every day. I didn't know about the importance of EventBrite RSVPs for getting into events, and I didn't know what to expect when it came to networking with people around me. This year turned out a lot better!

Day 1

AI Arborist: Proper Cultivation and Care for Your Behavior Trees
More “I”, Less “A” in AI Interviews
Math for Game Programmers: Ranking Systems: Elo, TrueSkill and Your Own
How Defold Helps Indie Devs Perform Live Updates with AWS (Presented by Amazon)

The highlight of this day was Upload VR's party! I couldn't RSVP for it because I waited too long, but later in the day a person on the GDC app linked to a post by the CEO about extra invites. A few minutes later, I had one! The party was an interesting mix of music, dancing, booze, food, and all things VR. There were about 6 full green-screen stages for shooting VR/AR video, and each one had a unique demo to experience. There were multiple live performances using VR, including a demo where two ballerinas in motion tracking suits performed a dance with CGI-rendered backup dancers mimicking their movement. There was also a live demo of a surgery training VR exercise.


Note: I know these posts are coming very late; I have been very busy lately and I'm gonna do my best to catch up!

Open Source…works!: bbtools success story!

So, some of my more fun recent projects have been bbtools and boxdox-bb. These tools are aimed at reverse engineering ArcSys games, most notably the Guilty Gear Xrd and Blazblue series of games. As with any of my projects, at some point they were my primary daily focus, but over time I have moved on to other things.

boxdox-bb is public and updated to the most recent version of Blazblue Central Fiction, and the Xrd files are updated to the current version of Revelator. As far as I am concerned, the projects are live and workable.

bbtools’ source code is public on Github. This means that anyone can see the code I used, fork the project if I disappear and continue it, etc. This takes a lot of work on my part: I have to stop working on the project itself, ensure that all code and dependencies are properly accounted for, write a README, and keep the online source code repository up to date.

With bbtools, several users have gone out of their way to add features and help with what I started.

This is the holy grail of open source as far as I am concerned.

I got started by picking apart the work of others and learning what I could. Being able to provide similar opportunities and mentorship towards people that are interested in reverse engineering games is important to me.

For example, bbtools has code to decompile character moveset scripts for Blazblue Chronophantasma into Python and recompile them back, to help users make custom movesets. At the time, I didn't bother doing this for Xrd because Revelator wasn't out on PC yet, so interest was very low.
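To give a flavor of what that kind of decompilation involves, here is a toy sketch. This is not bbtools' actual code, and the two-command database and binary layout below are made up purely for illustration; the real command databases are far larger.

```python
import struct

# A toy command database: opcode -> (name, number of int arguments).
# Entirely hypothetical; bbtools' real databases are community-maintained.
COMMANDS = {
    0x0001: ("sprite", 2),
    0x00FF: ("end", 0),
}

def decompile(script):
    """Walk a binary move script and emit readable Python-style call lines."""
    lines, pos = [], 0
    while pos < len(script):
        # Each command: a 2-byte opcode followed by its int arguments.
        (opcode,) = struct.unpack_from("<H", script, pos)
        name, argc = COMMANDS[opcode]
        args = struct.unpack_from("<%di" % argc, script, pos + 2)
        lines.append("%s(%s)" % (name, ", ".join(map(str, args))))
        pos += 2 + 4 * argc
    return lines
```

Recompiling is the same walk in reverse: parse the call lines and pack the opcodes and arguments back into bytes.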

However, Github user Labreezy had something else in mind, and implemented this feature for Xrd for themselves. Now, in a closed-source world, maybe Labreezy would have given up. Maybe they would have built their own tool for themselves. But instead, they submitted their changes as a pull request, sharing it with the world, so everyone gets this new feature I didn't have time to make myself!

Github user suShirow has been helping out as well, updating the command databases bbtools relies on with findings from our Discord and independent research!

Others have done the same, and honestly nothing makes me happier than seeing other people dive into the code alongside me and head into battle!


Automating Framedata: How they do it.

Sometimes I feel like the way framedata works is just as abstract and random as this video about plumbuses. Anyways, let's start from the beginning.

Note: This article was sponsored by my Patrons: To help support the creation of more content like this, please consider becoming a Patron here

What is Framedata Anyways?

When people talk about framedata, they are generally talking about a specific set of properties associated with an attack in a fighting game. Knowledge of these properties can be used to understand how fast a move is, what combos after said move, and how safe the attacker is when the move is blocked. This data is essentially what dictates the flow of the game.

Startup – How many frames the attack takes to become active.
Active – How many frames the attack remains active.
Recovery – How many frames until the character can move or block after the move is over.
Hitstun – How many frames the opponent is stunned when the attack hits.
Blockstun – How many frames the opponent is stunned when the attack is blocked.

To give an example of a situation where framedata is effective, let's say you are playing SF5, and it's a mirror match: Ryu vs Ryu, the classic matchup. Your opponent keeps doing st.MP followed by st.MP, hadouken on block. You keep feeling like you have to just sit back and watch it happen, as pushing any button just ends up with you being counterhit by the second st.MP.

Ryu's st.MP is +1 on block. It has a 5 frame startup. This means that you have a 4 frame gap to act before the second st.MP hits. Using a list of framedata allows you to come to this conclusion, and then look for moves with 4 frames of startup or less to counter your opponent's simple blockstring.
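That arithmetic is simple enough to put in code. A tiny sketch, using the numbers from the Ryu example above:

```python
def gap_on_block(advantage_on_block, next_move_startup):
    """Frames the defender has to act before a follow-up move becomes active.

    If the attacker is +1 on block and the follow-up takes 5 frames to
    start up, the defender has a 4-frame window to interrupt.
    """
    return next_move_startup - advantage_on_block

# Ryu's st.MP into st.MP: +1 on block, 5 frames of startup on the follow-up.
window = gap_on_block(1, 5)  # -> 4
```

Any move whose startup is at most that window can interrupt the blockstring.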

Continue reading Automating Framedata: How they do it.

Introducing StreamGen!

StreamGen: Because dantarion is Lazy

So, I have found myself wanting to not only make more YouTube videos, but also enter the Twitch/Hitbox arena. Now, one of the things that makes my streaming efforts a lot different from other people's is that I have a LOT of things I need to display at once for my stream to work, beyond just the game, a cam, and a blurb of info.

When I stream KenBot, I'd like to show the game itself, the GUI of the bot, and possibly the monitor I am using to code. Now, OBS is perfectly capable of doing what I want, but when it comes time to polish it and make overlays and backdrops, I found myself annoyed.

I am perfectly capable of opening up Photoshop, creating something that looks amazing, and then using that on my stream. However, one thing that bugs me is that if I wanted to, say, lower the game size by 20% and increase everything else in my stream layout to match, I'd have to go through the following process.

  1. Make whatever adjustments I need to do in my PSD in photoshop.
  2. Export to PNG
  3. Move around sources in OBS and resize them to fit perfectly on the new layout
  4. Stream

Now, this is fine for a streamer that only has a couple different layouts, but I wanted a bit more flexibility! I am a web developer by trade, so my initial thought was: why not HTML?

Continue reading Introducing StreamGen!

KenBot: C# has yield!?! and other revelations

So, it just hit me that the yield keyword I wanted to use to implement a bot in Python can actually be used in C# in an iterator. This has inspired me to update my design to use yield.

Let me explain what I mean. Now, normally, my bot goes through a loop like this.

  1. Collect information about current game state.
  2. Check and see if something is happening that requires the bot to stop what it is doing to react
  3. Continue doing what the bot was doing.

I implemented this using a simple state framework, where each state has variables it uses to store how far along it is in whatever it's doing. Each frame the state is called, and it presses whatever buttons it needs to press that frame. The annoying part about this is that if you wanted to make a state that, say, presses D, F, D, F, HP, you would have to write it as a function that gets called 5 times and inputs the proper button on each frame. That would look something like this.

int i = 0;
void Example() {
    if (i == 0) Press("D");       // Press() stands in for the real input code
    else if (i == 1) Press("F");
    // ...one branch per button, one call per frame...
    i++;
}

I don’t like this, and it was the primary thing that made me get annoyed working on KenBot.

Now, the yield keyword isn’t normally used the way I am preparing to use it, but it actually solves more than one issue. The updated code will work like this.

IEnumerable<string> Example() {
    yield return "I just pressed D";
    yield return "Oh yeah, just pressed F";
}

In this example, it may seem like it's the same amount of glue code in between the code for each button press, but there's normally a lot more decision making and programming going on than this.
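Since I originally wanted this pattern in Python, here is the same idea as a Python generator, with a trivial per-frame driver. The action strings and the function names are placeholders, not real KenBot code:

```python
def example():
    # Each yield hands control back to the main loop until the next frame.
    yield "press D"
    yield "press F"
    yield "press D"
    yield "press F"
    yield "press HP"

def run(state):
    """The per-frame loop: advance the state one step each game frame."""
    inputs = []
    for frame, action in enumerate(state):
        inputs.append((frame, action))  # in a real bot, send the input here
    return inputs
```

The state keeps its local variables between frames for free, which is exactly the bookkeeping the if/else version forced me to do by hand.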

I am going to try to refactor my code this weekend and see how far I get, and whether it helps anything.

KenBot: Postmortem or New Beginnings?

So, I put a lot of work into the most recent framework for KenBot, and I created something that was usable by more than just me. I brought it to EVO, but I wanted to spend more time enjoying EVO than sitting around babysitting a KenBot station, so not many people got a chance to play it. The code repository I published has been used by multiple people to create SF4-playing bots for different characters, to the point where I consider it a success.

However, I am not satisfied with the framework. It is clunky to use, and I want to make something that I can modify and use for any game, period. To do this, I need to abstract some major features out of my current codebase.

Continue reading KenBot: Postmortem or New Beginnings?


So, for the rest of this year at least, I want to try to hold myself to a schedule. One blog post a day.

This would be hard if I really wanted every post to be an in-depth article about game modding, or programming, or an opinion piece, so I am opening up my blog to a bunch of new topics, bringing it more in line with a personal blog than just a dev one.

Here are a few ideas I want to try.


Not just game reviews, or media reviews, ANYTHING. I plan on trying to review one thing a week, whether that is a game, a movie, a TV dinner, a hair product, a piece of clothing, microcontrollers from Adafruit, etc.


There are a lot of obscure and interesting things that shaped who I am as a person today, and I think it would be nice to reflect on some of these things and people, and write a bit about how they affected me.

Dev Progress Update

I want to take one day a week after work and focus on either working on, or releasing the source code to, one of the many, many unfinished projects I have accumulated over the years. I want to put all my code on either Github or Google Code so that others can take my projects and fork them if they want to continue development in the future.

Project Dantarion Update

I also want to post once a week about progress being made on the rewrite of my game prototype using Lua and UE4.

Cool Link

This is kinda the "I don't have time" post that I want to do, rather than go a day without posting. These will just be a link to something, like a page, a song, a video, etc., and a short paragraph about why I think it's cool or interesting. I have no excuse not to do one of these even if time is short; I could even post them from my phone.


I live in a beautiful place. I want to try to post a picture once a week of something, even if it's just my roommate's cat or the sunset.

A New Era of Dantarion


This month begins another attempt at self-reinvention. As time passes and we get older, I think it's important to take moments in time and reevaluate what you are doing with your time here on earth. Some call it a "mid-life crisis," but the concept is the same no matter what your age is. You wake up one day, think about your current life, and wonder, "What's next?"

I had been thinking about this a lot recently, and came to a bit of a conclusion as to what action I want to take in this regard. At first, I thought about each aspect of my life separately.

I want to get in better shape.
I want to advance my skills and my career.
I want to eat better.
I want to meet new people and do exciting things.

Yes, these are stereotypical New Year's resolutions, and at first glance they are only kinda related, so it might make sense to think about them separately. But I thought in depth about them, and realized that the thing holding me back is the same in all cases.

Continue reading A New Era of Dantarion

Breaking UNIEL: Part 2: Archives and Art

So, today I finished work on

It is a site that allows you to view hitboxes, hurtboxes, and sprites for UNIEL. However, it took a lot of work to get from nothing to this, so I wanted to document this. So lets start with the files on the disc!

Getting to the files

Once I had the disc extracted onto my computer, I looked around and located what looked like the character files.

I assumed that .pac was a container format, and that the files inside were gzipped. I used WinRAR to extract these, and took a look inside using Hex Workshop. Examining the file quickly revealed a simple structure with a list of file offsets, sizes, and pointers to filenames. Writing an extractor took about 15 minutes.
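An extractor for a container like that really is only a few lines. Here is a sketch in Python; the exact field order below (count, then name offset/data offset/size records) is an illustrative stand-in, not the actual .pac layout:

```python
import struct

def extract_pac(data):
    """Pull named files out of a simple container format.

    Assumed layout for illustration: a uint32 file count, then one
    (name_offset, data_offset, size) uint32 record per file, with
    NUL-terminated filenames stored elsewhere in the blob.
    """
    (count,) = struct.unpack_from("<I", data, 0)
    files = {}
    for i in range(count):
        name_off, data_off, size = struct.unpack_from("<III", data, 4 + i * 12)
        # Filename runs from its offset up to the next NUL byte.
        name = data[name_off:data.index(b"\x00", name_off)].decode("ascii")
        files[name] = data[data_off:data_off + size]
    return files
```

The real work is staring at the hex dump long enough to figure out which fields are which; once you know that, the loop writes itself.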

Now, it was on to the next step. Getting the ART.

Getting the Art

I referenced muave's Melty Blood viewer for information on the .HA6 file, which contains all of the art. It's just another container file, holding uncompressed DDS textures. I was able to write an extractor easily, which left me a folder of DDS textures. After realizing that the DDS format for all of them was the same, I modified my script to output PNG instead of DDS.

So, now we have the sprites? 

Merkavas Standing Sprite, Tiled

No, not exactly. You see, video card hardware works best with textures whose dimensions are powers of 2, like 128, 256, 512, etc. It seems like French Bread used some kind of tool that split their sprites into 32×32 squares on a texture that is 512 pixels tall. In addition to that, the textures aren't stored in color. Since the game lets you select a color scheme for each character, the textures are stored with an indexed palette. This means they are all 1-byte greyscale, where color 0, black, references palette_color[0] and color 255, white, references palette_color[255].

For the next step, I needed to apply the palette as well as take the 32×32 tiles and rearrange them into sprites.
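That tile-plus-palette step looks roughly like this in Python. The placement tuple format here is my own stand-in for illustration; the game's actual data is laid out differently:

```python
TILE = 32  # the sprites are stored as 32x32 chunks on the texture

def rebuild_sprite(texture, placements, width, height, palette):
    """Copy 32x32 tiles from an indexed texture onto a sprite canvas,
    converting palette indices to RGB along the way.

    texture: 2D list of palette indices (0-255)
    placements: list of (src_x, src_y, dst_x, dst_y) tile copies --
        a hypothetical format, not the game's real arrangement data
    palette: 256 (r, g, b) tuples
    """
    sprite = [[(0, 0, 0)] * width for _ in range(height)]
    for src_x, src_y, dst_x, dst_y in placements:
        for y in range(TILE):
            for x in range(TILE):
                index = texture[src_y + y][src_x + x]
                sprite[dst_y + y][dst_x + x] = palette[index]
    return sprite
```

Swapping in a different 256-color palette at the last line is all it takes to render a character's alternate color schemes.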

The “cg” file contains the information about the texture chunks and how to arrange them into sprites. Luckily, it also contains the default palette for each character. After a lot of trial and error, the above picture ended up as…

Reconstructed, Colored Merkava Sprite

Some of you may look at this and ask: why is his hair cut off?

The hair is actually stored in a separate sprite for this particular frame. The file that holds the character script dictates which images are drawn where, along with the hitboxes, hurtboxes, character state, etc. But that I'll have to talk about in the next post.