This month begins another attempt at self-reinvention. As time passes and we get older, I think it's important to take a moment and reevaluate what you are doing with your time here on earth. Some call it a “mid-life crisis,” but the concept is the same no matter what your age is. You wake up one day, think about your current life, and wonder, “What's next?”
I had been thinking about this a lot recently, and came to a conclusion about what action I want to take. At first, I thought about each aspect of my life separately.
I want to get in better shape. I want to advance my skills and my career.
I want to eat better.
I want to meet new people and do exciting things.
Yes, these are stereotypical New Year's resolutions, and at first glance these things are only loosely related, so it might make sense to think about them separately. But after thinking about them in depth, I realized that the thing holding me back is the same in every case.
So, today I finished work on http://finalclause.dantarion.com/hitboxes
It is a site that lets you view hitboxes, hurtboxes, and sprites for UNIEL. However, it took a lot of work to get from nothing to this, so I wanted to document the process. So let's start with the files on the disc!
Getting to the files
Once I had the disc extracted onto my computer, I looked around and located what looked like the character files.
I assumed that .pac was a container format and that the files were gzipped. I used WinRAR to extract them and took a look inside using Hex Workshop. Examining the file quickly revealed a simple structure: a list of file offsets, sizes, and pointers to filenames. Writing an extractor took about 15 minutes.
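Sketched in Python, a container walk like the one described might look like this. The exact field order and sizes here are my guesses for illustration, not the real .pac layout:

```python
import struct

def parse_pac(data: bytes) -> dict:
    """Parse a simple container: an entry count, then per-entry
    (name_offset, data_offset, size) triples, with NUL-terminated
    filenames. This layout is assumed, not the actual UNIEL spec."""
    (count,) = struct.unpack_from("<I", data, 0)
    files = {}
    for i in range(count):
        name_off, data_off, size = struct.unpack_from("<III", data, 4 + i * 12)
        end = data.index(b"\x00", name_off)  # filenames are NUL-terminated
        name = data[name_off:end].decode("ascii")
        files[name] = data[data_off:data_off + size]
    return files
```

Once the header fields are identified in a hex editor, an extractor like this really is a 15-minute job: it is just `struct.unpack_from` in a loop plus byte slicing.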
Now, it was onto the next step. Getting the ART.
Getting the Art
I referenced muave’s Melty Blood viewer for information on the .HA6 file, which contains all of the art. It's just another container file, holding uncompressed DDS textures. I was able to write an extractor easily, which left me with a folder of DDS textures. After realizing that the DDS format was the same for all of them, I modified my script to output PNG instead of DDS.
So, now we have the sprites?
No, not exactly. You see, video card hardware works best with textures whose dimensions are powers of 2, like 128, 256, 512, etc. It seems like French Bread used some kind of tool that split their sprites into 32×32 squares packed onto a texture 512 pixels tall. In addition, the textures aren't stored colored. Since the game lets you select a color scheme for each character, the textures are stored with an indexed palette. This means they are all 1-byte greyscale, where color 0 (black) references palette_color[0] and color 255 (white) references palette_color[255].
For the next step, I needed to apply the palette as well as take the 32×32 tiles and rearrange them into sprites.
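Both steps can be sketched in plain Python. The function names and the placement-tuple layout here are my own invention for illustration, not the real structures from the game's files:

```python
def apply_palette(indexed_tile, palette):
    """Map each 1-byte palette index to its RGBA palette entry.
    indexed_tile is a list of rows of ints 0-255; palette is a
    list of 256 RGBA tuples (the default one comes from the cg file)."""
    return [[palette[i] for i in row] for row in indexed_tile]

def assemble_sprite(tiles, placements, width, height, blank=(0, 0, 0, 0)):
    """Paste colored tiles onto a sprite canvas.
    placements is a list of (tile_index, dest_x, dest_y) tuples; the
    field names are assumptions, not the actual cg layout."""
    canvas = [[blank] * width for _ in range(height)]
    for tile_idx, dx, dy in placements:
        for y, row in enumerate(tiles[tile_idx]):
            for x, px in enumerate(row):
                canvas[dy + y][dx + x] = px
    return canvas
```

In the real pipeline the tiles are 32×32 and the canvas is the full sprite size, but the logic is the same: index into the palette, then blit each chunk to its destination.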
The “cg” file contains the information about the texture chunks and how to arrange them into sprites. Luckily, it also contains the default palette for each character. After a lot of trial and error, the above picture ended up as…
Some of you may look at this and ask: why is his hair cut off?
The hair is actually stored in a separate sprite for this particular frame. The file that holds the character script dictates which images are drawn where, along with the hitboxes, hurtboxes, character state, etc. But that I'll have to cover in the next post.
July 24th saw the release of a new fighting game, Under Night In-Birth, on both PSN and retail media. As with any fighting game, understanding move properties is very important to gaining an advantage over your opponent. In the past, this data was gathered through intensive training sessions and extended periods of gameplay: after hundreds of hours, you can figure out that X move is faster than Y move, and so on.
However, it's 2014 and I don't have time for that. Kappa.
Let's get the data we want!
The first thing I did was buy the game and get it running on both a retail and a jailbroken PS3. Then I examined the game's disc structure. A quick glance around showed a folder called script, and a quick glance in that folder showed a bunch of plain-text scripts.
There was literally a text file with the words “NoLocalDebug” in it! It contained a list of constants for debug mode, with everything set to 0 (off).
So, I took this file, changed all the 0's to 1's, enabling all the listed debug functions available in the game, and booted it back up! The result was some extra data appearing in-game: the game now shows the startup, active, and recovery frames of each move, as well as frame advantage on hit/block. Here's a video of it in action!
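The flag flip itself is trivial to script. A minimal sketch, assuming each line of the file is a flag name followed by whitespace and a 0 or 1 (the real syntax may differ):

```python
import re

def enable_debug_flags(text: str) -> str:
    """Flip every 'Name 0' style line to 'Name 1'.
    The line format is an assumption about the NoLocalDebug script."""
    return re.sub(r"^(\s*\w+\s+)0$", r"\g<1>1", text, flags=re.M)
```

Doing it with a script instead of by hand means the edit is repeatable after a game update re-ships the file.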
And here's another!
However, this was not good enough for me. It's one thing to be able to see the frame data in-game, but it's another to be able to review it offline. Ideally, you'd want all the info laid out in front of you in a table, allowing you to study it when away from the game.
I will cover the voyage towards that in my next post.
KenBot v1 was very basic. Most of his gameplay revolved around one specific chain of events.
Is the opponent doing nothing? Mash DB,DF.
Is the opponent doing something near me? Mash D+PPP,F+PPP.
Am I being thrown? Mash Tech!
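Those three rules boil down to a tiny priority list. Here is a sketch of that decision loop; the state names and the distance threshold are placeholders of mine, not KenBot's actual variables:

```python
def kenbot_v1(opponent_state: str, distance: float) -> str:
    """Pick an input from v1's hardcoded rules, highest priority first.
    State names and the 'near' threshold are illustrative placeholders."""
    if opponent_state == "throwing_me":
        return "tech_throw"      # mash throw tech
    if opponent_state == "attacking" and distance < 1.5:
        return "mash_ex_srk"     # mash D+PPP, F+PPP
    if opponent_state == "neutral":
        return "mash_db_df"      # alternate down-back / down-forward
    return "block"
```

Each frame the bot reads the opponent's state, runs this function, and mashes whatever it returns, which is exactly why anything outside these three cases gave it trouble.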
This alone proved effective, but with some limitations. I can't seem to get below about 3 frames of input delay, which means KenBot will never be able to punish a 3-frame move on reaction! However, since he mashes DB, DF, he has a 50% chance of randomly blocking one of them anyway. Command grabs will hit him, unless they are slow ones like Abel's or Honda's.
Here KenBot counters cr.LP and cr.LK, but NOT st.LP or st.LK, because they are 4 frames. So many jabs, and a ton of command grabs, will just hit KenBot! And Shoryukens! And… a bunch of other stuff too. At this stage KenBot didn't understand overheads, or moves that were too fast to punish with DP or Ultra. You could Focus backdash at the right distance and KenBot would fierce DP anyway!
He had to become smarter! He had to become more aware! And to do that, I needed more feedback from the bot. So I took KenBot, who at this point had a ton of hardcoded reactions for a few character states, rewrote the code, and added a GUI.
My next article will be about training KenBot v2 to… do a lot more than DP.
So, the world likes KenBot.
I've gained a bunch of new Twitter followers: http://twitter.com/dantarion
Thanks to all who have been spreading my work around. It's pretty awesome to see something I made in the past week get thousands of views.
Normally I work on boring things like modding tools or reference sites. It's fun to work on something a bit more entertaining for the average gamer out there.
Well, let's think about how any AI needs to work. Not even that, let's think about how any game is played. Almost all gameplay can be summed up like this…
The player makes inputs based on stimuli provided by the game.
It's easy to think about this from a real player's perspective. You see your opponent jump at you, you wait until they are on their way down, input a Shoryuken motion (forward, down, down-forward), and press medium punch… and then you hold focus attack, dash forward, and input your Ultra!
This sequence is common, but what does it take for a bot to do this?
The bot must be able to input commands.
The bot must be able to read game state.
For example, in the sequence above, the bot may need to know…
The X and Y locations of itself and its opponent. It needs to know when the opponent leaves the ground, and how close they are, in order to time the SRK.
How much meter it has, so it knows whether it has Ultra, or enough meter to FADC into Ultra.
To input commands, the bot simply sends keyboard events to the game. To read game state, the bot reads the memory of the game's process to determine what is happening. The relevant memory addresses were found with Cheat Engine.
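Here is a sketch of the memory-reading half. The offsets below are placeholders, not the real addresses found with Cheat Engine, and in the actual bot the buffer would come from the Win32 ReadProcessMemory call on the game's process rather than a bytes snapshot:

```python
import struct

# Illustrative placeholder offsets, NOT the real SF4 addresses.
P1_X, P1_Y, P1_METER = 0x00, 0x04, 0x08

def read_state(mem: bytes) -> dict:
    """Decode a few values from a raw memory snapshot at known offsets.
    In the real bot, 'mem' would be filled by ReadProcessMemory."""
    x, y = struct.unpack_from("<ff", mem, P1_X)
    (meter,) = struct.unpack_from("<I", mem, P1_METER)
    return {"x": x, "y": y, "meter": meter}
```

Once the addresses are known, the whole "read game state" problem reduces to unpacking floats and ints at fixed offsets every frame.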
Seems pretty simple, right? However, what happens if the person jumps backwards? The bot sees the person in range, but in reality the person will be out of range by the time the SRK is input and comes out!
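One way to handle this is to lead the target: project the opponent's position forward by the move's startup before checking range, instead of checking where they are right now. A sketch, with made-up units and values:

```python
def should_srk(my_x: float, opp_x: float, opp_vx: float,
               startup_frames: int = 5, srk_range: float = 1.2) -> bool:
    """Decide whether to SRK based on where the opponent WILL be after
    the move's startup. All numbers here are illustrative, not tuned."""
    predicted_x = opp_x + opp_vx * startup_frames
    return abs(predicted_x - my_x) <= srk_range
```

With this check, an opponent drifting backwards (positive velocity away from the bot) falls out of predicted range and the bot holds its punch instead of whiffing.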
KenBot v1 wrote status information to a file. I then used OBS to overlay the status information onto the game in my recordings.
When KenBot loses, I look at the footage frame by frame and adjust how things work, coding exceptions, new rules, and some character specific matchup knowledge!
Here is a good example from my previous post.
In the last two rounds the bot keeps getting hit by long-range pokes. These moves are so slow that the bot SHOULD be blocking or SRK'ing on reaction based on distance… so what was happening?
Two failures: the bot tried to SRK backdashes and failed multiple times, and then it got stuck in its kara-throw script.
This was unacceptable. Time to take KenBot to the lab.
In this video, you can see KenBot's accuracy in offline play. It won't walk forward unless Balrog goes to neutral, and its SRK mashing is fast enough to ATTEMPT to Shoryuken Balrog's EX headbutt (even though the HP Shoryuken happens second, the EX headbutt wins).
In my next article I'll write more about making KenBot… PSYCHIC.
So, I have dabbled in messing around with Street Fighter 4 on PC: reverse engineering the file formats, building the Ono! editor, and trying to rip frame data. But this post is about a new project.
I stumbled upon lullius's http://www.slitherware.com/, a site where he has made a variety of tools that do things in real time while the game runs, such as a hitbox viewer, camera controls, a macro playback engine… and even a bot.