I’m very pleased to be able to announce that Baseborn (formerly “codename Ifrit”) is finally out of beta and has hit its first release. Thanks to all of you who tested the game back in January, we were able to squash a ton of bugs and add some new features, including a complete overhaul of the AI, a remastered soundtrack, new physics and effects for weapons, and the ability to restart the game at any time.
Perhaps the most important change is that it’s no longer necessary to download a specific version of the game if you’re a Mac or Linux user; thanks to the most recent version of FLAKit, we can bundle the whole game into a single file and run it on any computer through a web page. This should also allow the game to run on computers that aren’t supported by Adobe’s Flash projector, like PowerPC Macs.
If you missed it, we ran a series of posts during the last week that went into detail on various aspects of Baseborn’s development process. You can find them here:
Click on “Buy Now” to download it. You can name your price, or type in a “0” to get it for free. Donations are appreciated, though.
In the beginning levels, I wanted the music to flow together as seamlessly as possible. Each time you change “zone” (go from beach to forest, or forest to tower), a new layer of instrumentation is added. Each layer is meant to alter the feeling to fit the zone, while still maintaining a theme and underlying tone. The Hellther and boss battle music are intentionally independent; I’ll explain that in a bit. Ideally, I would have created my own audio mixer in code and faded each track in one at a time as needed, but I wasn’t able to due to limits on time and programming experience. Instead, I ended up with a separate audio file for each layer, with each file also including all the previous layers. Whenever I needed to change songs, I would simply stop one track and start the other. Fortunately, this was still enough to convey the feelings I wanted.
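The layered-track workaround can be sketched in a few lines. This is a hypothetical Python illustration, not the game’s actual ActionScript; the track file names are made up. Since each file already contains all the previous layers mixed in, changing zones is just stopping one track and starting another:

```python
# Hypothetical sketch of the layered-music workaround. Each "file"
# already contains all previous layers, so a zone change is just a
# stop-and-start. Track names are illustrative only.

ZONE_TRACKS = {
    "beach":  "layer1_beach.mp3",    # piano only
    "forest": "layer2_forest.mp3",   # piano + violin + triangle
    "tower":  "layer3_tower.mp3",    # previous layers + brass
}

class LayeredMusic:
    def __init__(self):
        self.current = None

    def change_zone(self, zone):
        track = ZONE_TRACKS[zone]
        if track == self.current:
            return                   # already playing the right mix
        if self.current is not None:
            self.stop(self.current)
        self.play(track)
        self.current = track

    # Placeholder hooks for whatever audio backend is in use.
    def play(self, track): pass
    def stop(self, track): pass
```

The obvious downside is that layers can’t fade in independently, but as noted above, a hard cut between pre-mixed files was enough to sell the mood.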
As a warning, these descriptions might become quite “artsy”, “romantic”, even “chimerical”.
When I wrote this beginning piano piece, I literally sat there with my eyes closed until I brought my mind into a state of empathy for what this shipwrecked character would be feeling. Amnesia, smallness, loneliness, desperation. Then, I just started playing. What you hear is the result. It’s very sparse. This is to contribute to the aforementioned mood, and so that the audio wouldn’t sound cluttered when other layers were added. It’s pretty straightforward, with only a brief key change towards the end of the movement. This key change is important though. It lets you know that everything is not safe, and something is definitely wrong here.
When you enter the forest, a small violin and triangle enter the scene. It was inspired by “Gustaberg” and “Ronfaure” from the Final Fantasy XI Soundtrack. A high, suspended violin note persists, and later a second violin harmonizes to give some motion. This and the sparse triangle give a feeling of “lostness” and “barrenness”.
The tower layer has a noticeable change in mood. It gives a royal sound with a rolling snare drum and a brass melody line. This one is inspired by “The Kingdom Of San D’Oria” from the Final Fantasy XI Soundtrack. The melody starts off with a mid-range French horn and trombone harmonization. Then, it’s joined by a high trumpet harmony, which actually feels more like a melody. Towards the end, a weighty tuba part comes in to give the key change some emphasis. The intensity grows with this track.
This is where the bass kicks in. There is a sudden drop when the contrabass part comes in. The deepness is meant to set off worry and convey “fear”. The notes I chose for it are intentionally dissonant. I found and amplified the tensions that existed within each chord. A steady and cold bass drum beats to bring out the darkness. The audio is rather full at this point, and the tension of the music is growing.
The instant this track starts, everything is suddenly silenced. No more instruments, no more music. All you hear is some awful ringing and a relentless drone in the distance. This one was inspired by Ravenholm from Half-Life 2. The purpose was to give a sudden and extreme change so that this strange area stood out from the rest. The goal was to make something awful and twisted. The place is riddled with serpents and demons and horrifying shadows of yourself that are all trying to kill you.
After the quietness of the Hellther ambiance, I wanted to give a jolt when the player encountered the boss. This comes with a string ensemble playing very dissonant, stabbing, staccato chords. A descending hammer dulcimer line comes in to add some variety in timbre. Then, when the rolling snare comes in, the cello begins to branch off from the main rhythm, playing some beats in between to add to the sense of urgency. At this point, for variety, the dulcimer first plays an ascending line, and then the normal descending line with harmonizing 5ths.
From here, a nylon-string guitar plays some sustaining arpeggios to add to the layer. Now, we add some bass drum hits and cymbal crashes to really amp up the panic. The dulcimer becomes more active by playing bits of a steadily ascending scale. The end of each bit ends on a tense note. By now, the player has begun combat and is realizing the intensity of the battle. This section ends with all instruments hitting and cutting out, and the dulcimer ringing out with the now-familiar descending line.
The next section quiets down a bit for drama. We’ve got the cello laying a foundation and the dulcimer plays a more active line to keep things going. It plays three-note sequences that rise up the scale each measure. The rhythm of the dulcimer and the cello complement each other in such a way that adds to the “unsettling” feeling.
At the end of the bar, we hear the familiar descending dulcimer line, and the string ensemble plays descending chromatic stabs, alternating in octave as it goes down. Then the snare drum returns to pick things back up again. The dulcimer keeps the same rhythm, but continues to climb in the scale, getting more tense as it goes. At the end of this bar, it all climaxes and then releases into the next section.
Here, everything falls and the tension eases. Insert some lush, flowing strings, and some peaceful harp arpeggios. The peace doesn’t last long, as the final chord of this bar re-introduces the tension. Now the dulcimer takes the lead with a somber melody. We have the strings and percussion hitting every other beat, with some offset rhythms at the end of each measure. It all quickly grows in intensity, then we have a big cymbal wash and . . . a small triangle trills out for a measure or two. Then the piece repeats from the very beginning.
This was a piece I wrote last-minute to go over the credits. I wanted it to be peaceful, and leave the player with a calm feeling. It was inspired by the faint blue-glowing crescent moon in the background as the credits roll. I was listening to a lot of George Winston at this time, so he was consequently an inspiration for this as well. This one was not played by hand. I created the notes in my MIDI editor.
Most of the sound effects in Baseborn were not created by me. Nearly all of them came from a great resource: freesound.org. There were some sounds that I created, though.
First, I found a synth sound that sounded “airy”. Then, I essentially ran a finger over several piano keys. This created the sound! It took a few tries to get it just right, but it was pretty straightforward.
The creation process for this one was very similar to the frost attack sound. It just took some extra equalization to get the tone right.
This one was just a matter of finding a synth sound, and playing a sustained note. I took a small chunk of the sustained note, and copy/pasted that chunk one after the other. Then I did a crossfade between each one so it all sounded like one piece.
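The copy/paste-and-crossfade trick can be illustrated with a short sketch. The real work happened in an audio editor, not code; this hypothetical Python version operates on a plain list of samples and blends each seam linearly so the repeats sound like one continuous note:

```python
# Illustrative sketch of extending a sustained note by repeating a
# chunk of it and crossfading every seam. Done by hand in an audio
# editor originally; this is just the idea in code.

def crossfade_extend(chunk, copies, fade_len):
    """Repeat `chunk` end to end, linearly crossfading each seam."""
    out = list(chunk)
    for _ in range(copies):
        # Blend the tail of `out` with the head of the next copy.
        for i in range(fade_len):
            t = i / fade_len         # 0.0 -> 1.0 across the seam
            out[-fade_len + i] = (1 - t) * out[-fade_len + i] + t * chunk[i]
        out.extend(chunk[fade_len:])
    return out
```

The crossfade is what prevents an audible click at each splice point; a hard cut between copies would leave a discontinuity in the waveform.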
Working on Baseborn was one of the best times of my life. I’ve never been so productive and learned so many new things in such a short time. Working with Chris was awesome, and I’m glad we decided to go on working together after the class was over. However, we encountered a number of technical problems during development, and I wanted to share some of them.
One major hurdle early on was that both Chris and I were using FlashDevelop to do our programming, but our professor required all work to be submitted in the form of a Flash Professional .FLA project file. Due to some incompatibilities between the Flex and Flash Professional compilers, all the code we were using to embed art and media assets would have to be rewritten, which, in a project of Baseborn’s scale, would have required an absurd amount of refactoring. Since neither Chris nor I had access to Flash Professional except for one day per week (when our class took place), I knew we had to find a way to make our project work in both environments with a minimal amount of changes.
From this necessity FLAKit was born. Instead of embedding assets with Flex metadata tags or importing them as FLA project symbols, FLAKit loads images and sounds asynchronously from the project directory. This actually helped speed up our iteration time; embedding assets slows down compile time dramatically, while loading them at runtime is almost instant. In a future version of FLAKit I hope to add support for live updating to further increase productivity.
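The core idea, stripped of Flash specifics, is small. Here’s a hypothetical Python sketch (FLAKit itself is ActionScript, and its real API isn’t shown in this post): rather than baking assets into the compiled output, resolve paths under the project directory and load files on demand, caching the results so repeated requests are instant:

```python
# Hypothetical sketch of runtime asset loading in the spirit of FLAKit.
# Class and method names are made up; the real library is ActionScript.

import os

class AssetLoader:
    def __init__(self, root):
        self.root = root             # project directory to load from
        self.cache = {}              # path -> raw bytes, loaded once

    def load(self, relative_path):
        if relative_path in self.cache:
            return self.cache[relative_path]   # already loaded: instant
        with open(os.path.join(self.root, relative_path), "rb") as f:
            data = f.read()
        self.cache[relative_path] = data
        return data
```

Because nothing is embedded at compile time, changing an art file doesn’t trigger a rebuild, which is where the iteration-speed win described above comes from.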
In the end we were able to take our finished project and open it in Flash Professional with no changes whatsoever. I’ll never forget the rush of excitement I felt as I checked and double-checked that I had opened the correct file. It was almost too good to be true.
Before Chris and I started working together, he asked me how one would go about creating and using animations when the tools built into Flash Professional were unavailable. I briefly explained the concept of a sprite sheet and demonstrated my usage of it in my own project. I was using my own custom bitmap canvas rendering framework at that point with sprite sheet rendering built in, but it would have required far too many changes in order to work in Chris’ project. I wrote a quick class that showed off the basics of a sprite sheet system, then polished up the interface afterward. As you can see from his article on Baseborn’s art process, Chris took to sprite sheets almost immediately. This class ended up serving us through the entirety of Baseborn’s development, and it’s now included in FLAKit.
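The basics of a sprite sheet system boil down to a little index arithmetic. This is a minimal Python sketch of the concept, not Jake’s actual class (that code isn’t reproduced here): equal-sized frames are packed into a grid, and a frame index maps to a source rectangle to copy out of the sheet:

```python
# Minimal sketch of the sprite-sheet concept: frames of equal size
# packed in a grid, addressed by a single index. Not the real FLAKit
# class; names here are illustrative.

class SpriteSheet:
    def __init__(self, sheet_width, frame_width, frame_height):
        self.frame_width = frame_width
        self.frame_height = frame_height
        self.columns = sheet_width // frame_width   # frames per row

    def frame_rect(self, index):
        """Return (x, y, w, h) of frame `index` within the sheet."""
        col = index % self.columns
        row = index // self.columns
        return (col * self.frame_width, row * self.frame_height,
                self.frame_width, self.frame_height)
```

An animation is then just a list of frame indices played back on a timer, with the renderer copying `frame_rect(i)` from the sheet each frame.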
Another challenge was learning to use version control as a team. I’d been using Mercurial and Bitbucket on my own for some time before the class, but I’d never been in a situation where other people were contributing to the repository as well. The first few revisions were full of overwritten changes and merge errors, and more than once my commit message contained an apology for messing up Chris’s work or reintroducing a bug we had previously fixed.
Fortunately, it didn’t take too long for us to get familiar with the tools. TortoiseHG and Kdiff3 make using Mercurial a breeze, and Bitbucket’s hosting and issue tracker helped us to stay on track with our goals and tasks. I can’t recommend them highly enough.
One of the biggest mistakes I feel we made was in designing the structure of the code. The update method of our main class is over 450 lines of code and contains a lot of logic that should have been encapsulated into the player class. Towards the end it was difficult to make even minor changes. The enemy class is an example of good entity design; all of its logic is hidden away and it never directly modifies global data. If we had followed this pattern for the player class, our code would have been much more manageable. Unfortunately, the AI logic has received the most work and is the most recent code in the game, so the lessons I learned from writing it came too late to be of any use elsewhere in the project. We’ll be sure to keep this kind of encapsulation in mind in our future projects.
We encountered a lot of technical challenges during Baseborn’s development, but overall we managed to pull through and ship a game I’m quite proud of. Working on a team to bring a game to completion was a great learning experience, and I’m excited to see where we go from here.
Baseborn was the result of many “firsts” for us. For Jake, this was the first game he ever completely finished, and the first project of this kind he had ever worked on with another person. There were many more firsts for me; too many to count. Since I put together most of what you see and hear in the game, we thought it’d be appropriate for me to write about the experience, the process, the tools, and the thoughts and inspiration behind it all.
I had fiddled around making graphics before, but nothing like what Baseborn would require of me. While I certainly did my fair share of coding, the vast majority of the hours I put into this project went into the artwork. This was partly due to my inexperience and perfectionism, but mostly because of the seemingly endless amount of art we needed. Paint.NET was my weapon of choice. We became good friends, and shall remain so. I enjoy its balance between simplicity and capability, and its built-in effects were invaluable to much of the art.
My approach to drawing the graphics started out as sloppy trial and error. Chuck some pixels onto the canvas and try to connect the dots until it looks like a thing. It didn’t take too long to develop a real workflow, though, and it went something like this:
Put on some music that “invokes the essence” of what I want to draw.
Google Images for reference.
Using existing images for scale, draw an outline of the basic parts of the vision.
Choose color palette and fill with basic color.
Create details and shade with slight deviations from original palette.
There were a few things I came to recognize as crucial to the process of drawing. They’re kind of no-brainers, but I often overlooked them in the beginning:
USE LAYERS. That is, of course, unless you love to spend hours redrawing the exact same thing except for one or two pixels.
Use and don’t use a grid. It’s very useful for lining things up and keeping everything even, but sometimes it can make your drawing feel stale and too mathematical. Be sure to turn off the grid and just free-hand it now and then. I found that it usually resulted in a more natural and emotional drawing.
Often look at your progress from different distances. When you’re up close and a pixel takes up a third of the screen, you can easily lose perspective. You should zoom out now and again to make sure things are looking the way you think they are.
The first character I made for Baseborn was a little elf thing, which, after some tweaks and color changes, became the Elf Mage enemy.
That drawing was pretty much free-handed. I decided that I needed a better starting point for the rest of the characters. So, I looked up the original Final Fantasy White Mage, analyzed and traced it, shuffled some pixels around, recolored it, and BAM. Our Mage character was born.
By copying the White Mage, I was able to figure out how to shape different body parts and clothing to make them look the way I wanted. From this point on, I loosely based all my characters on my Mage sprite. For some, I referred to other original Final Fantasy characters.
Most of the backdrops were influenced by the music I was listening to while drawing them. Interestingly, the backdrops would go on to influence the music I created for them. I made good use of Paint.NET’s built-in effects for the backdrops. I used the noise generator frequently, as well as blurring, cloud and flame renderers, and glowing effects. For our fiery level backdrop, we actually wound up combining the ideas of a stereotypical “hell” scene and Minecraft’s “Nether”. Since then, we have referred to it as the “Hellther”.
Unfortunately, I was not aware of the various animation assistance plugins available for Paint.NET. Consequently, I wound up spacing everything out by hand, which ate up a ton of time and was a real hassle when we had to tweak frame sizes.
Jake created some awesome animation classes inspired by those found in the Flashpunk framework. I’ll keep code out of this blog post, but you can download and read about Jake’s tools here. Nearly all of the animations were done by eye, without much reference. The key was to test often to make sure it flowed naturally.
When creating the animations for the skeletons, I wanted it to look like they were dragging their feet while having a little bit of a zombie-style, outstretched arm pose. What I ended up with was what looked like some sort of nerve spasm dance. It was funny and all, but I really wanted the foot-dragging. I tweaked it further until I thought it looked good. I go and test it out again, and now the skeleton looks like he’s a DJ, laying down beats and scratching vinyls. Obviously it was meant to stay that way. So it did.
And with that, I bid thee farewell. In another post to come, I’ll talk about the music of Baseborn!
Probably the thing I’m most proud of in Baseborn is the AI. In the original design document, the enemies you faced were simply going to be static targets that the player had to eliminate in a set amount of time, as a “combat training exercise”. When Chris and I teamed up, we decided to greatly increase the scope of the project, and proper enemy behavior was one of the first things I started on.
One of my early goals was that the core of the AI system be versatile enough that creating a new type of enemy was as simple as changing a few variables. To facilitate this, I specified a number of behavior parameters that an enemy type could combine, such as “stand ground”, “afraid”, “no melee”, and “no ranged”. By designing the base AI procedure to work differently depending on these parameters, we were able to quickly design new enemy types with almost no class-specific code. Additionally, any bugs we found only needed to be fixed in one place, which was a lifesaver as the system started to grow in complexity.
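The parameter-driven approach can be sketched briefly. This is a hypothetical Python illustration (the actual ActionScript class isn’t shown here, and the flag names beyond those quoted above are guesses): an enemy type is just a bundle of flags, and the shared AI consults them instead of relying on per-class code:

```python
# Hypothetical sketch of behavior parameters: new enemy types are
# defined by overriding a few flags, not by subclassing. Only the
# quoted flag names come from the post; the enemy types are made up.

BEHAVIOR_DEFAULTS = {
    "stand_ground": False,   # don't patrol; hold position
    "afraid":       False,   # retreat rather than engage
    "no_melee":     False,   # never close in for melee attacks
    "no_ranged":    False,   # never fire projectiles
}

def make_enemy_type(**overrides):
    """Build an enemy type as defaults plus a handful of overrides."""
    params = dict(BEHAVIOR_DEFAULTS)
    params.update(overrides)
    return params

# Two illustrative enemy types that differ only in their parameters:
archer = make_enemy_type(no_melee=True, stand_ground=True)
rat    = make_enemy_type(no_ranged=True, afraid=True)
```

Since every enemy shares the same decision procedure, a bug fix in that procedure fixes every enemy type at once, which is the maintenance win described above.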
The base AI procedure (called the “think” method) is called each frame. Here’s the structure of the method:
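The original listing isn’t reproduced in this chunk, but the task order can be reconstructed from the descriptions that follow. This is a hedged Python sketch with guessed names, using stubs that simply record which task ran:

```python
# Reconstruction of the "think" method's task order, based on the
# prose descriptions of each task. Method and field names are guesses;
# the stubs only record call order for illustration.

class Enemy:
    def __init__(self):
        self.dead = False
        self.no_default_behavior = False
        self.health = 10
        self.calls = []                  # records task order

    def think(self):
        """One AI tick; each task owns the data it is allowed to modify."""
        if self.dead:
            self.check_pickup()          # dead enemies only offer their drop
            return
        if self.no_default_behavior:
            return                       # inert or specially scripted enemies
        if self.health <= 0:
            self.dead = True             # defeated: no moving or attacking
            return
        self.update_sight_rect()         # reposition the line-of-sight rect
        self.adjust_heading()            # the only task that sets heading
        self.attack()                    # melee, ranged, or retreat, per flags
        self.check_flee()                # low health triggers flee mode
        self.move()                      # the only task that sets position

    # Stubs standing in for the real tasks.
    def check_pickup(self):      self.calls.append("pickup")
    def update_sight_rect(self): self.calls.append("sight")
    def adjust_heading(self):    self.calls.append("heading")
    def attack(self):            self.calls.append("attack")
    def check_flee(self):        self.calls.append("flee")
    def move(self):              self.calls.append("move")
```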
For the sake of my own sanity, I designed each task to only modify a certain set of data. For example, the enemy’s heading is only ever set in the adjustHeading() method, and his position is only ever set in the move() method.
When an enemy is killed, there is a chance that he’ll drop a pickup of health, mana or ammo for the player to collect. This method simply checks for collision with the player and applies the effect accordingly.
Some enemies have no default behavior. Whether this is simply because they aren’t supposed to do anything (the debris on the beach near the beginning of the game is actually an enemy type), or because their behavior is designed specifically (such as the Doppelganger enemies that appear towards the end of the game), this allows the think procedure to exit early.
Here we check if the enemy’s health has been reduced to zero, and stop the procedure if it has. From here on, all the tasks are related to moving and attacking, and we can’t have our defeated enemies doing that.
Line-of-sight is calculated as a simple rectangle check: the enemy tests whether the player falls within a rectangle offset from his own position. This step moves that rectangle to correspond to his current location.
This is the most involved task by far, and deserves a more in-depth explanation.
There are a number of cases in which the enemy should reverse his heading. The most obvious is that he should turn to face the player, if the player is in his field of view…but not if a wall is between them. Additionally, he should reverse his movement direction if he reaches the edge of a platform (but not if the player is across a gap and within ranged weapon distance) or bumps into an obstacle (but not if the obstacle he’s bumped into is the player backed into a corner).
The first step is to find out whether the player is in view. This involves a rectangle check and a raycast, as shown here:
In this case, even though the player is obviously within the view rectangle, a wall is occluding the enemy’s view.
Not so in this case. Since the player is in view, the enemy should turn and walk towards him.
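The two-part visibility test can be sketched as follows. This is a simplified Python illustration, not the game’s actual code; the tile-grid wall representation, the rectangle dimensions, and the step-sampled “raycast” are all assumptions standing in for the real implementation:

```python
# Simplified sketch of the visibility test: a cheap rectangle check
# first, then a ray sampled between enemy and player to catch
# occluding walls. Tile-grid walls and all constants are assumptions.

def in_view_rect(enemy, player, rect_w, rect_h):
    ex, ey = enemy
    px, py = player
    return abs(px - ex) <= rect_w / 2 and abs(py - ey) <= rect_h / 2

def line_clear(enemy, player, walls, tile=16, steps=32):
    """Walk the enemy->player segment, failing if it crosses a wall tile."""
    ex, ey = enemy
    px, py = player
    for i in range(steps + 1):
        t = i / steps
        x = ex + (px - ex) * t
        y = ey + (py - ey) * t
        if (int(x // tile), int(y // tile)) in walls:
            return False
    return True

def can_see(enemy, player, walls):
    # Cheap rectangle test first; only raycast when it passes.
    return (in_view_rect(enemy, player, 200, 64)
            and line_clear(enemy, player, walls))
```

Doing the rectangle test first means the more expensive ray sampling only runs when the player is plausibly in range, which matters when every enemy thinks every frame.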
Each time an enemy moves, he remembers where he was last. If at any time his position hasn’t changed since last time, he knows he’s hit a wall and that the physics engine is preventing him from moving forward. Unless the player is in view, he should turn around and walk the other way.
Next, the enemy checks to see if his next move would cause him to fall off the platform he’s standing on. Most of the time this would cause him to turn so as not to fall, but if the player is in view he will wait on the edge and try to attack him or knock him off.
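The two turn-around checks just described might look something like this. It’s a hypothetical Python sketch: the function names, the tile-grid ground test, and the probe one tile below the feet are all assumptions, not the game’s actual code:

```python
# Sketch of the two turn-around checks: comparing against last frame's
# position to detect a wall hit, and probing the tile below the next
# step to detect a platform edge. Names and grid layout are assumptions.

def hit_wall(pos, last_pos):
    """If physics stopped us, our position didn't change since last frame."""
    return pos == last_pos

def at_edge(x, y, heading, speed, solid_tiles, tile=16):
    """Would the next step leave no ground beneath our feet?"""
    next_x = x + heading * speed
    below = (int(next_x // tile), int((y + tile) // tile))
    return below not in solid_tiles
```

Either check would normally flip the enemy’s heading, but as described above, both are overridden when the player is in view, so an enemy can pin the player against a wall or wait at a ledge instead of politely turning away.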
If the enemy can see the player, he goes into attack mode. Depending on his behavior parameters, he may try to shoot, get in closer for a melee attack, or run away.
If the enemy’s health is below 33%, he goes into flee mode. During this phase he’ll rush off of platform edges, and stop trying to chase the player. After a certain amount of time he’ll start to regain health, and eventually drop out of flee mode and resume his normal behavior.
Finally, the enemy’s position is updated based on his current speed.
This was my first time working with any kind of AI, and I’m very pleased with the way it came out. I look forward to taking what I’ve learned on this project and applying it to future implementations. If you’re interested in seeing the code behind the tasks, you can find it in the Enemy class on Baseborn’s BitBucket repository.
Ifrit has reached Alpha status! I’m very excited about all we’ve done so far, and I’m looking forward to moving forward with the project. First, however, I’d promised we’d release a preview build this week, so here we are. The project is hosted over at Bitbucket, and you can download the release here.