Categories
Game Development Geek / Technical Linux Game Development Personal Development

Ludum Dare #24: Are You In?

August 24-27 is Ludum Dare #24. Ludum Dare is a 48-hour solo game development competition. It’s also a 72-hour game jam where the rules about tools, team sizes, and distribution are a bit more lax.

I realized that the last time I participated in a compo was Ludum Dare #20. It’s been too long.

I’m not sure if I will be working in C++ with libSDL or if I will be using Stencyl, which I used successfully during the 2011 Meaningful Game Play Game Jam to create two prototypes. I’d love to try out Unity, but according to the developers, they “currently have no plans or commitments to port the editor to Linux.”

Theme voting is happening right now. Friday night is the announcement of the theme and the kick-off of both the compo and jam. Are you in?

Categories
Games Geek / Technical

The Localization of Zelda

Thanks to a Google+ post by Brian Green, I spent part of last night reading through Legends of Localization: The Legend of Zelda.

It’s a fascinating breakdown of the differences between the NES and Famicom Disk System versions of The Legend of Zelda. I had no idea that the Famicom had a microphone on the second controller or that the audio capabilities of the systems were so different.

I also realized that it has been so long since I have played the game that I’ve forgotten some of the strange text and secrets.

The author has done a few other comparisons, including Earthbound, Super Mario Bros, and Final Fantasy IV, or as those of us who grew up in the US called it, Final Fantasy II.

Categories
Game Design Game Development Geek / Technical

A New Direction for “Stop That Hero!”

As much as Stop That Hero! has provided me with a great opportunity and learning experience, recent events have led me to seriously invest time in a much-needed redesign. As a casual strategy game, it gave players a fun and exciting way to be evil. Still, I’ve received feedback over the months that has led me to question some of the original design decisions I made, some of which might be leaving money on the table, so to speak.

So, the good news is that I’m taking all of the great stuff I’ve done so far, and I’m going to recreate “Stop That Hero!” as an FPS. As a strategy game, I find it enjoyable, but the masses seem to yearn for something a bit more visceral.

“Stop That Hero!: Reloaded!” puts you in the role of the hero, fighting off the minions of an evil villain bent on taking over the world. With the roles reversed in this new design, I think it can be much more enjoyable and easier for fans to relate to the characters.

It will feature multiple weapons, urban and jungle environments, an innovative cover system, and customizable uniforms for you and members of your elite squad of minion-hunting friends (multiplayer content, including special hats, exclusively available as DLC).

I don’t want to give too much away, but I am excited about this new direction for “Stop That Hero!” For now, I’ll leave you with this mock-up to give you a taste of what to expect:

STH: Reloaded
STH: Reloaded mock up. I had to use my niece's toy since I didn't have a gun or banana handy.
Categories
Game Development Geek / Technical Marketing/Business Personal Development

GDC Badge Pro Tips

While I won’t be going to the Game Developers Conference this year, I thought I would share some tips for making the most of your GDC 2012 badge and holder. These tips are especially important for people who will be attending their first GDC, such as some of the fantastic students I met when I spoke at the University of Iowa last Friday.

Feel free to share this post. And thanks to Ian Schreiber for giving me these tips when I attended my first GDC last year!

Categories
Game Development Geek / Technical Marketing/Business Personal Development

Hear Me Speak Live at the University of Iowa

I’ll be part of a group of game developers talking to students at the University of Iowa on Friday, February 24th, 2012.

Where: Room 240 of Art Building West, Iowa City, IA

When: 4PM

Other speakers include people from Glass Cannon Games, Zach Ellsbury of Seraphic Software, iOS developer Karl Becker, and P.J. Lorenz, organizer of the Midwest Indie Game Developers Meetup group.

Categories
Geek / Technical

The Future Is Disorienting

I just jumped years into the future with both my cell phone and my computer. The future is a mixed bag.

The Cell Phone

For quite a few years now, I’ve had a flip-phone, paying for service out of contract. When people started getting smartphones, I kind of wanted one, but never enough to justify the cost of a phone or tying myself to a new 2-year contract to get a free one.


I did the math recently and found that what I pay just for voice, data, and texting would cover a new smartphone with a contract. I’d even be able to actually take advantage of the data plan, since the one on my flip-phone became nearly unusable when the carrier stopped supporting some of the already-poorly-supported video options. I went with my fiancee’s Sprint-based provider instead of AT&T, and the referral netted me a discount that made it even cheaper.

It's here! And freezing!

So now I have an Android, the HTC EVO Shift, and I’m fairly happy with it. It’s way more useful than my old phone.

Plus, now I can play the games everyone has been talking about.

Plus plus, I have a device to test my own projects when I start mobile development.

However, I’m still getting used to figuring out how to hold it since I seem to keep accidentally hitting the search button or some part of the screen. I still hit the volume control on the side too often, and I press the power button as I slide the device into my pocket if I’m not careful. And as great as having a keyboard is, the phone automatically unlocks when I slide it out, which has accidentally happened in my coat pocket more than once already. While my old phone let me easily look up a name by typing a letter to jump to that section of my phone book, this new device seems to think I want to include everyone I’ve ever emailed, Tweeted, circled on G+, or friended on Facebook when I search for someone to call.

But this is the future, and the benefits are outweighing the negatives. Being able to fit a computer more powerful and integrated than my first desktop into my pocket has so far been pretty amazing.

The Computer

And speaking of computers, I mentioned my dying laptop a few weeks ago. My Dell Precision M90 was a fine machine until the video card started failing. I’ve purchased a replacement: the Precision M6600.


I’m going from:

  • 32-bit 2GHz CoreDuo -> 64-bit 2.2GHz i7 Quad Core
  • 2GB RAM -> 8GB RAM
  • NVIDIA Quadro FX 2500M (512MB) -> Quadro 3000M (2GB)
  • 100GB HD -> 750GB HD
  • 17″ WUXGA display -> 17.3″ 1920×1080 display

That last item unfortunately feels like a downgrade to me. I really liked having a 1920×1200 display. The extra lines of resolution were great for coding work. Unfortunately, it seems that 1920×1200 will no longer be available on most laptops. There’s no real explanation, either, merely speculation that hardware manufacturers are saving money, especially since most people probably just want to watch movies on their laptops anyway.

Still, this new laptop came with Windows 7, and while I’ve used Windows Vista as the default tech support guy for my family, I hadn’t really become familiar with anything since Windows XP, which is what my old laptop came with. The new interface is somewhat jarring. When I was running Firefox with the download window open as well as the main browser, I wanted to minimize it, so I clicked on the button in the taskbar, like I normally would. Instead of minimizing Firefox, it popped up the windows as buttons so I could choose which one I wanted. Darn new-fangled tech doing things differently than I expected…

But no matter! I repartitioned the drive so that Windows gets about 100GB, and I installed Ubuntu 11.10 on the remaining space. And it turns out that Ubuntu is all new and shiny, too.

See, I used Ubuntu 10.04 Long-Term Support (LTS), and I didn’t want to upgrade to newer versions because I wanted a stable development system to do my work on.

So here I am, jumping two years’ worth of Ubuntu development, and this new Unity interface threw me off for a bit until I realized they were trying to make it seem more Mac-like. There’s a launcher, the buttons are all big and shiny, and there’s a Software Center where I can easily click to download apps and even purchase them. There are quite a few indie games there, and it seems like it could be a decent marketplace.

But after looking at the entire games collection, I took the time to find the Terminal (more on that later) and typed: “apt-get install nethack-console” as fast as I could.

And then I added the Terminal to the launcher.

Seriously, it’s nice that the UI is 3D-powered and all, but sometimes I don’t want to mess around with a frickin’ mouse. I LIKED being able to open the system menu at a keypress and see ALL of my installed applications. Now I have to explicitly type to search or click on Dash, then click on “More Apps”, and then click some more? On the old system, I could hit Alt+F1 to open the “Applications” menu, then use the arrow keys to move to “Accessories” > Terminal. Now, I press the Windows key to bring up Dash, and if I know the name of the application, I can type it out right there, but if I don’t, it’s faster to click on “More Apps”, then go to “Installed” and click on “see all”, then scroll down until I find it.

I also liked being able to figure out how to open a file within an application by just looking at the screen. Unity has Mac-like context menus that applications share at the top of the screen, which…ok, whatever. Maybe it’s great, and I’ll try it, but why Unity hides the menu until you mouse over it, I don’t know, but I’m not the only one who is unhappy with it. Maybe I just need to get used to it, but there were a couple of times when I opened an application and wondered why the heck I had no way to open a file since the window I was using (seriously, why would I want to shift my eyes all the way to the top of the screen when the application I’m using is over here?) didn’t have a File option.

Unity also seems to hate the idea of customization. At least Windows let me move the taskbar around the screen. Why does the launcher have to stay stuck to the left side? Why do my windows have to have the close and minimize buttons to the left like on a Mac when I prefer them on the right? I’ve yet to find a way to change anything to work the way I want it to work.

I’m aware that it is possible to replace Unity with the Ubuntu Classic UI or KDE, but I’m going to try to stick it out just to see how most users are expected to run their new Ubuntu-based systems. I actually do like being able to see all running application instances in an Exposé-like way, and there are parts of the UI that feel more intuitive. And Ubuntu 11.10 does seamlessly run 32-bit and 64-bit applications. I ran Stop That Hero! just to see if I had all of my development libraries installed, and I didn’t even realize that it shouldn’t have worked on this new 64-bit machine. So that was a pleasant surprise.

But if this is the future of computers, where laptop screens get smaller resolutions instead of larger ones, where the UI is made out of candy and shiny at the expense of being useful for people who know what they are doing, I’m not sure I’m happy about it.

I feel like the future is pushing me out of the way to make room for people who just want to watch widescreen movies or use a “workstation” as nothing more than a consumer electronics device.

Am I being unreasonable? Am I just disoriented because everything is different in the future? Is it better and I just need to get used to it?

Categories
Game Design Game Development Geek / Technical Marketing/Business Personal Development

An Online Conference You Can Attend #AltDevConf

If you’re not familiar with AltDevBlogADay, you should be. Each day, a game developer posts on a variety of game development topics. There’s a huge backlog of content there now, and while the recent redesign has made it difficult to find the category you want (you have to click on a post to see only some of the tags available as of this writing), it’s great getting regular, up-to-date, state-of-the-art tips and tricks from the people in the trenches. Authors can be mainstream game programmers, indie developers, academics, or anyone who has something valuable to share.

AltDevConf

It seems to be such a successful site that they’ve decided to host an online conference. AltDevConf will be held on February 11th and 12th (that’s this coming weekend), featuring three tracks: education, programming, and design & production.

Our goal is twofold: To provide free access to a comprehensive selection of game development topics taught by leading industry experts, and to create a space where bright and innovative voices can also be heard. We are able to do this, because as an online conference we are not subject to the same logistic and economic constraints imposed by the traditional conference model.

As it doesn’t look like I’ll be attending GDC this year (I’m still hoping to win an All Access Pass with my GDC magnets), AltDevConf seems like a high-quality substitute. While it won’t be the same as rubbing elbows with other indies or meeting cool celebrities in the gaming world, I’m excited about it.

Do you plan to attend? Will you be speaking?

Categories
Geek / Technical Marketing/Business

My Offsite Backup Solutions

To go along with my last post on indie maintenance and disaster plans, I’d like to mention how I currently back up my important data.

Local Backups

I have two active computers. My main development machine is my currently dying laptop. My desktop has a backup of my laptop’s data. Using rsync and SSH, I can transfer files between devices easily, which was really helpful when I needed to replace the desktop in 2010. I simply rsynced files to my laptop, then rsynced them back to the new desktop. As my laptop has been failing recently, I’ve been using any lucidity on its part as an opportunity to rsync files to my desktop in anticipation of the laptop dying at any moment.

I also have a 1TB hard drive connected to the router, which means any computers on my network could make use of it. Unfortunately, it has to be formatted as NTFS and requires the use of Samba, which means it isn’t a perfect solution, and it also means that I use it way less than I should.

So there’s data redundancy within my computer network. If one machine or drive fails, the important data is also available on the other, and I could always find a way to make better use of my 1TB drive so that losing both computers wouldn’t be a catastrophic data loss.

But what happens if I get robbed and lose all of this equipment? Or if a fire breaks out? Or some other disaster that takes out all of the data since it is all in the same office?

Remote Solutions

Since my main project, Stop That Hero!, uses git for version control, I paid for a Micro account on GitHub, which gives me 5 private repositories and the ability to add one collaborator for just $7/month. So if I lose everything, at least I can continue to work on the project once I get a new computer.

What about other data?

While I have a DropBox account, I only have limited space available (although signing up for your own account with that link gives us each 250MB extra). DropBox offers tiered pricing plans and a team/business plan, but I can’t justify the expense at this time. I’ve been using DropBox for private data backups and as a way to quickly provide a link to a file. I know a few Flash game developers have used DropBox to put up their game for FlashGameLicense.com.

An alternative to DropBox is SpiderOak. It offers way more space than DropBox, and if you choose to pay for more space, you get more than double the capacity for the same price. Plus, data is encrypted both ways, when sending and receiving. The SpiderOak site claims it is a “zero knowledge” backup provider:

This means that we do not know anything about the data that you store on SpiderOak — not even your folder or filenames. On the server we only see sequentially numbered containers of encrypted data.

Now, this encryption means that it takes a lot longer to back up files. Since you get so much more space (2GB to start, and we each get a free GB if you use the link above), backing up a few GBs of data can take a good part of your day when you start out. Plus, unlike with DropBox, you aren’t tied to a single designated folder. You can configure whatever folders and files you want to back up, and you can still share files publicly. You can configure SpiderOak to automatically back up changes on a schedule you set, and it will keep track of previous versions of files for you, too.

I’ve been fairly happy with SpiderOak so far. The only issue I ran into was related to how it was backing up my Projects folder while I was working in it. I rebuilt my project, and apparently SpiderOak was in the middle of processing the folder my project lives in, and it choked. It was probably because a file like Game.o was deleted and replaced, and it didn’t know how to handle that mid-processing. I managed to get it unstuck, but it took a support email and a perusal of the support forums to find out how. To prevent problems in the future, explicitly tell it not to back up the folder where your project build lives.

So these are some free solutions with paid options that allow you to sync multiple computers and share with friends. What if you’re looking for something more cost-efficient as well as private? I have a friend who pays for a dedicated remote solution. Carbonite is $59/year for one computer, which implies that you can’t use it to sync multiple computers or share files with friends, but it’s another option available to you. Mozy is an alternative, and its basic plan is $5.99/month for 50GB with options to pay a little extra to sync multiple computers. There’s also a Mozy Pro set of plans for servers as well as desktops that charges per gigabyte on top of a flat fee.

What does your backup plan look like?

Categories
Game Development Geek / Technical Marketing/Business

Indie Maintenance and Disaster Plans

My Dell Precision M90, which has been running like a champ for more than half a decade despite my cats’ attempts to get their fur clogged in its fans, is finally dying. I’ve been seeing graphical glitches for some time, but I’ve been able to continue working, and the glitches eventually go away. Except when they don’t. And recently, the machine won’t boot correctly.

Well this isn't a good sign.

The culprit seems to be a failing video card, which is way too expensive to replace. It’s frustrating since doing so would probably give this machine another few years of life.

I’ve been very happy with this machine, but it’s been slowly getting worse, and I realized that I had no plans for replacing it. So I’ve been either putting off the research so I can do the work I need to do, or I’ve been desperately trying to get the machine back up and running so I can continue to do that work, all the while knowing that I am going to need to spend some time (and money) on finding a replacement.

A large company probably has plans for this sort of thing, with IT departments bringing in spare equipment or ordering replacements. In fact, some companies have entire disaster preparedness plans in place. Replacing equipment quickly to ensure business continuity is just a part of such plans.

Since I purchased this laptop through Dell Small Business, I was able to get next-day on-site tech support, which I only needed to take advantage of once, towards the end of the extended warranty last year. I was also able to replace the A/C adapter quickly after the cats chewed through the old cord a few years ago. Even knowing that the warranty was expiring, I didn’t really think through how I would continue to work without the laptop, whose failure I should have realized was as inevitable as a hard drive dying.

And now that I think about it, perhaps the cats should worry about a replacement plan as well…

As an indie or solo entrepreneur, what do you do? How prepared are you for equipment failure? Do you only start to worry about it the day your computer fails to boot, or do you anticipate the day your development equipment needs maintenance and replacement? Or do you constantly replace your machines with the latest and greatest and so don’t need to worry about longevity?

Categories
Game Development Geek / Technical General

Integrating LodePNG with an SDL Project

In my efforts to port Stop That Hero! to the Mac, I ran into a strange issue involving PNG image data.

See, the level layout in “Stop That Hero!” is defined by a 50×33 PNG. The colors of pixels in the PNG correspond to tiles and structures in the game. Grass tiles are represented in the PNG as green pixels, water is blue-green, mountains are gray, and so on.

This PNG (blown up since 50×33 is so tiny):

Level 4 PNG

results in this level layout:

Stop That Hero! Bringing evil back...

I use libSDL_image in order to load image formats other than BMP, and on Windows and GNU/Linux, everything works as expected.

On the Mac, however, I was seeing a problem. It was as if most of the tile data was not getting loaded correctly. Instead of seeing grass, fields, forests, and mountains, I was only seeing mountains. Structure data was also not loading correctly. The player’s castle wasn’t appearing either, so the game always ended in defeat.

After ruling out endian issues (Intel-based Macs aren’t going to require any data-parsing changes from Intel-based Windows or GNU/Linux), I found that the pixel colors being returned from a loaded PNG weren’t what I expected.

I expect red-green-blue (RGB) data to be (0, 255, 0) for grass tiles, but the color that I was seeing was slightly different.

And it turned out that I wasn’t alone. A thread on the libSDL mailing list discussed a similar pixel bug on Mac OS X, and it turned out to be related to libSDL_image’s use of Apple’s ImageIO as the backend. I’m still not quite 100% clear on what the actual problem is, but the best I can figure is that ImageIO tries to helpfully convert the image you’re loading so that it is more optimized for rendering on the specific Mac running the code. That’s not a problem if all you want to do is render images to the screen, but it is a problem if you’re depending on the pixel data being accurately decoded.

Last week a fix was introduced to solve this issue, but as it isn’t in the release version yet, and I didn’t want to convert my data or change how I was doing things, I decided a better way would be to replace libSDL_image in my own code. Thanks to a conversation on Google+, I was introduced to stb_image and LodePNG, both of which are liberally licensed, comprehensive, PNG-handling modules of code. By comprehensive, I mean that unlike libSDL_image, I don’t also have to require zlib. You just drop in a couple of files into your project, and you’re done.

I opted for LodePNG because unlike stb_image, it not only loads PNGs but also saves them, and I want to make sure I don’t have to switch libraries again when I get around to creating a level editor. Also, quite frankly, it was less intimidating than stb_image being a .c file that leaves the production of the associated .h as an exercise for the programmer.

LodePNG had some examples associated with it, and while one example uses libSDL, it wasn’t clear how to load a PNG into an SDL_Surface. The example simply rendered the PNG to the screen. It was not what I wanted, and I could not find any code out on the Internet that used LodePNG and libSDL together.

So, in the interest of filling this gap, here’s how to load a PNG with LodePNG and store it into an SDL_Surface:

#include <cstring>
#include <vector>

#include "SDL.h"
#include "lodepng.h"

SDL_Surface * loadImage(const char * filename)
{
    //Using LodePNG instead of SDL_image because of a bug with Mac OS X
    //that prevents PNGs from being loaded without corruption.

    std::vector<unsigned char> image;
    unsigned width, height;
    //Decode the PNG into a raw buffer of 32-bit RGBA pixels.
    unsigned error = LodePNG::decode(image, width, height, filename);

    SDL_Surface * surface = 0;
    if (error == 0)
    {
        //LodePNG returns bytes in R, G, B, A order, so the channel
        //masks depend on the byte order of the machine.
        Uint32 rmask, gmask, bmask, amask;

#if SDL_BYTEORDER == SDL_BIG_ENDIAN
        rmask = 0xff000000;
        gmask = 0x00ff0000;
        bmask = 0x0000ff00;
        amask = 0x000000ff;
#else
        rmask = 0x000000ff;
        gmask = 0x0000ff00;
        bmask = 0x00ff0000;
        amask = 0xff000000;
#endif
        int depth = 32;
        surface = SDL_CreateRGBSurface(SDL_SWSURFACE, width, height, depth, rmask, gmask, bmask, amask);
        if (surface == 0)
        {
            return 0;
        }

        // Lock the surface, then store the pixel data.
        SDL_LockSurface(surface);

        //Copy one row at a time. The surface's pitch (bytes per row)
        //can be larger than width * 4 due to padding, so copying the
        //decoded buffer in one pass could skew every row after the first.
        unsigned char * pixelPointer = static_cast<unsigned char *>(surface->pixels);
        const unsigned bytesPerRow = width * 4;
        for (unsigned row = 0; row < height; ++row)
        {
            std::memcpy(pixelPointer + row * surface->pitch,
                        &image[row * bytesPerRow],
                        bytesPerRow);
        }

        SDL_UnlockSurface(surface);

        //Convert the surface to the display format for faster blitting.
        SDL_Surface * convertedSurface = SDL_DisplayFormatAlpha(surface);
        if (convertedSurface != NULL)
        {
            SDL_FreeSurface(surface);
            surface = convertedSurface;
        }
    }

    return surface;
}

Technically, the piece of code related to convertedSurface isn’t necessary, but SDL_DisplayFormat and SDL_DisplayFormatAlpha convert the surface to one that is optimized for rendering. And the conversion doesn’t corrupt the pixel data, which means that if you depend on it for map layout or for doing interesting effects at run-time, it just works, as you’d expect.