
Freshly Squeezed Progress Report: Very Close to Finishing Linux Port

In last week’s report, I had figured out that my installation of Docker was missing some of the newer features, which explained why I had to do so much extra work for what seemed like a simple task.

This week, I managed to make great progress on my port work.

Sprint 59: Ports

Planned and Incomplete:

  • Create Linux port

My work was split up into the following tasks:

  • T: Use bind mount to load current project and relevant Projects directories as read-only
  • T: Build custom libraries for 32-bit
  • T: Build custom libraries for 64-bit
  • T: Build game using libraries for 32-bit
  • T: Build game using libraries for 64-bit
  • T: Update CMake files to package up assets correctly
  • T: Write script to coordinate all of the tasks to put together a finished binary with 64-bit and 32-bit support

I once again put in more hours than expected, and I successfully built custom SDL2 libraries in both 64-bit and 32-bit varieties.

As I’ve mentioned, I am trying to use an older version of Debian to build my code because I want the resulting binaries to run on as many Linux-based distributions as I can.

So I have been using debian/eol:etch as a base image, but I quickly discovered that the version of CMake available in Etch is v2.4.5, which is too old for some of the features my CMake-related scripts needed.

I tried Debian Lenny, which has CMake v2.6, but it turned out that the FindSDL2.cmake I was using relies on the QUIET keyword, which that version didn’t know about.

So I switched to Debian Squeeze, which has CMake v2.8.2, and that seemed fine. I did run into an issue that I thought was yet another compatibility problem, but it turned out to be an environment variable that wasn’t actually being passed in.

In trying to build my custom libraries, I have this section in my CustomLibs.cmake:

# Extract SDL2 into build directory (assume only one tarball exists).
# Creates ${SDL2_EXTRACTED_DIR}/build/.libs/libSDL2.so files.
    FILE (GLOB SDL2_TARBALL "${THIRD_PARTY_LIB_DIR}/SDL2-*.tar.gz")
    EXECUTE_PROCESS(COMMAND ${CMAKE_COMMAND} -E tar xzf "${SDL2_TARBALL}" WORKING_DIRECTORY "${PROJECT_BINARY_DIR}/")
    STRING(REGEX REPLACE "^.*(SDL2-[A-Za-z0-9.]+).tar.gz*$" "\\1" SDL2_EXTRACTED_DIR ${SDL2_TARBALL})

And I was getting an error along the lines of:

string sub-command REGEX, mode REPLACE needs at least 6 arguments total to command.

Oh, was this something else that had changed? CMake’s documentation didn’t seem to indicate anything. I spent quite some time trying to track down what was going on, eventually upgrading to Debian Wheezy before deciding I could downgrade back to Squeeze.

Eventually, I figured out that THIRD_PARTY_LIB_DIR wasn’t set to anything, and so SDL2_TARBALL was similarly not set correctly, which meant the remaining commands were not able to execute properly.

Before this porting work, I had a toolchain file that set THIRD_PARTY_LIB_DIR to a particular directory on my machine that had the SDL2 library’s tarball. That is, a CMake-specific toolchain file set the variable, which meant that the CMakeLists.txt and other files knew about that variable.
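For reference, the old toolchain-file approach amounted to something like this (a sketch; the file name is hypothetical, and the path placeholder mirrors how I refer to it elsewhere):

# LinuxToolchain.cmake (hypothetical name): hardcode the location of my third-party libraries on my machine.
SET(THIRD_PARTY_LIB_DIR "/home/[my_username]/Projects/Tools")

That file then gets passed to CMake with -DCMAKE_TOOLCHAIN_FILE, which is why everything downstream saw the variable.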

Obviously that specific path won’t exist inside the Docker container, so I wanted to set that path with an environment variable instead. And I struggled to figure out why my Docker container’s CMD line could output the environment variable, and the script being called could output it, but CMake seemed ignorant of it.

Well, I realized that within my CMakeLists.txt and CustomLibs.cmake, I never needed to pay attention to environment variables before, as I was using that toolchain file to set the variable.

In order to make use of an environment variable, I had to do the following:

IF (NOT DEFINED ENV{THIRD_PARTY_LIB_DIR})
    MESSAGE(FATAL_ERROR "THIRD_PARTY_LIB_DIR is not defined.")
ELSE()
    SET(THIRD_PARTY_LIB_DIR $ENV{THIRD_PARTY_LIB_DIR})
    MESSAGE("THIRD_PARTY_LIB_DIR is defined as ${THIRD_PARTY_LIB_DIR}")
ENDIF()

Note that in the IF statement, I can refer to the environment variable with ENV, but in the SET statement, I need to use a dollar sign to refer to it.
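With that check in place, the variable just needs to be supplied at configure time, whether directly on the host or handed to the container (the build script name below is a placeholder):

# On the host, prefix the cmake invocation...
THIRD_PARTY_LIB_DIR=/home/[my_username]/Projects/Tools cmake ../..

# ...or pass it into a container with -e (hypothetical build script):
docker run --rm -e THIRD_PARTY_LIB_DIR=/home/gbgames/Projects/Tools gbinfra/linuxportbuilder64 ./build-libs.sh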

Once I figured that issue out, I found that I needed to install libxext-dev because otherwise SDL2 wasn’t going to build.

Then I ran into a different environment variable being missing, so I solved it similarly to the way I did it above.

And then I ran into a frustrating problem.

My custom libraries get built by extracting the tarballs for the various SDL2-related libraries. Once SDL2_image is extracted, I run configure on it with some custom arguments to get as small a binary as I can, then build it.

Except I kept getting the following error:

 aclocal-1.16: command not found
WARNING: 'aclocal-1.16' is missing on your system.
         You should only need it if you modified 'acinclude.m4' or
         'configure.ac' or m4 files included by 'configure.ac'.
         The 'aclocal' program is part of the GNU Automake package:
         <https://www.gnu.org/software/automake>
         It also requires GNU Autoconf, GNU m4 and Perl in order to run:
         <https://www.gnu.org/software/autoconf>
         <https://www.gnu.org/software/m4/>
         <https://www.perl.org/>

I tried installing automake, then wondered if I needed to install a bunch of other automake-related tools separately, but then I came across this Stack Overflow response that pointed me in the right direction:

It defeats the point of Autotools if GNU This and GNU That has to be installed on the system for it to work.

As well as this response:

As the error message hint at, aclocal-1.15 should only be required if you modified files that were used to generate aclocal.m4

If you don’t modify any of those files (including configure.ac) then you should not need to have aclocal-1.15.

In my case, the problem was not that any of those files was modified but somehow the timestamp on configure.ac was 6 minutes later compared to aclocal.m4.

So I checked, and the directory that SDL2_image was being extracted to had files with timestamps reflecting my system’s current time.

If I manually extracted those files on my host system instead of within the Docker container, the timestamps showed 2019 as the year, meaning the timestamps of the files inside the tarball were being preserved.

So either Docker bind mounts do something wacky with timestamps, similar to how virtual machine shared drives can get kind of iffy, or maybe there was a compatibility issue with tar just like how I was dealing with compatibility issues with CMake.

And sure enough, I found that tar v1.26, released around 2011, included a major fix to make tar’s --atime-preserve option work correctly.

So once again, I upgraded my Docker image to use Debian Wheezy as a base image, and tar was suddenly extracting files with preserved timestamps, which meant that it no longer was asking me to have automake installed.
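In case anyone hits the same aclocal complaint and can’t just switch base images, a common workaround (a sketch of an alternative, not what I ended up doing) is to fix the timestamps by hand after extraction so the generated files look newer than their inputs:

cd SDL2_image-*/

# See which files make thinks are out of date:
ls -l --full-time configure.ac aclocal.m4 configure Makefile.in

# Touch the generated files so make doesn't try to re-run aclocal/automake.
# (The exact list of generated files varies by project.)
touch aclocal.m4 configure Makefile.in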

And the rest of my build scripts were successfully able to build and package up the custom libraries! Woo!

So the next piece to work on was similar: I had to update my scripts to accept environment variables and such so that I could build the Toy Factory Fixer project itself.

And the only reason it isn’t finished yet? I ran out of time in the week. Basically, I ran into an error due to code that requires C++11 support, and Debian Wheezy uses GCC 4.7, which needs the -std=c++0x flag to enable it.

So either I change my code or I pursue that flag, which I think might require me to install an updated libstdc++, and at that point, why not just upgrade to Debian Jessie?
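If I do pursue the flag, the change itself is small; a rough sketch of what it might look like in my CMake files (GCC 4.7 only understands the older c++0x spelling of C++11):

IF (CMAKE_COMPILER_IS_GNUCXX)
    # Enable C++11 support on older GCC versions such as 4.7.
    SET(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -std=c++0x")
ENDIF()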

Either way, I expect my project to build successfully soon, and then I can work on making sure my CMake packaging rules account for the separate directories I created for my resource files since I last worked on this packaging years ago.

After that, I want to make sure all of the library building, project building, and packaging into a single tarball to distribute becomes a one-line script.
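I’m picturing something along these lines for that one-liner, with the helper script names standing in for my existing build and packaging scripts:

#!/bin/sh
set -e

# The host path to my Tools directory is needed for the compose bind mounts.
: "${THIRD_PARTY_LIB_DIR:?Set THIRD_PARTY_LIB_DIR to the Tools directory}"
export THIRD_PARTY_LIB_DIR

# Build the custom libraries and the game in each container, then combine the results.
docker compose run --rm linuxportbuilder64 ./build-libs-and-game.sh
docker compose run --rm linuxportbuilder32 ./build-libs-and-game.sh
./package-linux-port.sh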

And then the port will be done.

Thanks for reading!

Want to learn when I release updates to Toytles: Leaf Raking, Toy Factory Fixer, or about future Freshly Squeezed games I am creating? Sign up for the GBGames Curiosities newsletter, and get the 19-page, full color PDF of the Toy Factory Fixer Player’s Guide for free!


Freshly Squeezed Progress Report: More Docker Work

In my previous report, I talked about how my initial plan to set up Docker containers to help build the Linux-based version of Toy Factory Fixer was a bit unclear. I didn’t know what workflow I was trying to support, which made it difficult to envision how to set up Docker to help me make it happen.

This week, I continued the work with a clearer vision.

Sprint 58: Ports

Planned and Incomplete:

  • Create Linux port

Despite putting in more hours than expected, I only made a little progress.

My work was split up into the following tasks:

  • T: Create 32-bit docker container using Debian
  • T: Use bind mount to load current project and relevant Projects directories as read-only
  • T: Build custom libraries for 32-bit
  • T: Build custom libraries for 64-bit
  • T: Build game using libraries for 32-bit
  • T: Build game using libraries for 64-bit
  • T: Update CMake files to package up assets correctly
  • T: Write script to coordinate all of the tasks to put together a finished binary with 64-bit and 32-bit support

I only finished the first task, as I spent the week struggling to figure out why Docker’s support for multiple architectures (multi-arch) was not working as advertised.

My host system is using amd64 as the architecture. According to the documentation, I should be able to use a Dockerfile or a docker-compose.yml file, and either way, I should be able to specify the platform as pretty much anything supported by the specific Docker image I’m referencing.

In my case, I am using debian/eol:etch, which has an i386 version out there.

So my docker-compose file looks like:

version: "2.4"
  
networks:
    linuxportbuilder:
        external: false

volumes:
    linuxportbuilder:
        driver: local

services:
    linuxportbuilder64:
        image: gbinfra/linuxportbuilder64
        build:
            context: .
        restart: "no"
        networks:
            - linuxportbuilder
        volumes:
            - linuxportbuilder:/data
        ports:
            - 22

    linuxportbuilder32:
        platform: linux/386
        image: gbinfra/linuxportbuilder32
        build:
            context: .
        restart: "no"
        networks:
            - linuxportbuilder
        volumes:
            - linuxportbuilder:/data
        ports:
            - 22

And my Dockerfile was:

FROM debian/eol:etch

RUN apt-get update && apt-get install -y mawk gcc autoconf git build-essential libgl1-mesa-dev chrpath cmake pkg-config zlib1g-dev libasound2-dev libpulse-dev
 
CMD ["/bin/bash", "-c", "echo The architecture of this container is: `dpkg --print-architecture`"]

As you can see, the service linuxportbuilder32 specifies a platform of linux/386.

But when I run docker-compose up, I see output indicating that it is running amd64.

I tried modifying my Dockerfile’s FROM line to say:

FROM --platform=linux/386 debian/eol:etch

And I still see amd64.

I eventually learned about the buildx plugin for Docker, which uses BuildKit (something I haven’t looked into yet, but apparently it is a Thing), but the docs say it comes with Docker Desktop out of the box. Well, I don’t use Docker Desktop, as I just have the docker package that comes with Ubuntu 20.04 LTS’s apt repos. I would expect that things should still work, right?

Well, I find I can download buildx and place it in ~/.docker/cli-plugins. And then I need to use it to create a new builder, then tell Docker I want to use that builder instead of the default one.

And I think I needed to turn on experimental features by creating a ~/.docker/config.json file and adding an entry for it.

And now if I use buildx directly, I can spin up an image and a container that thinks it is using a 32-bit architecture.
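For anyone else stuck on the older tooling, the manual setup amounted to roughly the following (the builder name is arbitrary, and the buildx binary comes from its releases page):

# Turn on experimental CLI features in ~/.docker/config.json:
#   { "experimental": "enabled" }

# Put the downloaded buildx binary where the Docker CLI looks for plugins:
mkdir -p ~/.docker/cli-plugins
chmod +x ~/.docker/cli-plugins/docker-buildx

# Create a new builder and make it the active one:
docker buildx create --name portbuilder --use

# Then a 32-bit image can be built explicitly:
docker buildx build --platform linux/386 -t gbinfra/linuxportbuilder32 --load .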

But when I used docker-compose, it still seemed to use the old builder. Frustrating. And the docs make it sound like it should just work.

I eventually got it to build a 32-bit image, which I could verify using docker manifest, but I couldn’t run a 32-bit container because I would get “WARNING: The requested image’s platform (linux/386) does not match the detected host platform (linux/amd64) and no specific platform was requested” DESPITE THE FACT THAT I WAS SPECIFYING THE PLATFORM EVERYWHERE!

But I found that despite a certain environment variable value supposedly being a default, I needed to specify two different environment variables:

DOCKER_BUILDKIT=1 COMPOSE_DOCKER_CLI_BUILD=1 docker-compose up

And then I was able to spin up containers in both 64-bit and 32-bit! Success!

That is, until a colleague informed me that docker compose (note the lack of a hyphen) was a thing.

WHAT?!

So I discover that docker-compose is v1 and old, while docker compose is v2 and the new hotness.

And while I didn’t want to install Docker Desktop for Linux, I found that docker compose didn’t exist in the default apt repos. So I uninstalled my existing Docker and Docker-related tooling, added Docker’s repos for my Ubuntu system, and installed from those. Now I have the official Docker tooling installed.
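For reference, the switch looked roughly like this on Ubuntu (the repository setup step is summarized; Docker’s install docs have the exact commands for each release):

# Remove the distro-provided packages first:
sudo apt-get remove docker docker.io docker-compose containerd runc

# ...add Docker's GPG key and apt repository per the official instructions...

# Then install the official packages, including the Compose v2 plugin:
sudo apt-get update
sudo apt-get install docker-ce docker-ce-cli containerd.io docker-compose-plugin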

And suddenly I no longer needed to set up a new builder or use environment variables. Everything worked as advertised.

And I’m a little annoyed that I spent a good chunk of the week figuring out what was essentially a workaround for using older Docker tools, but I suppose that means now you know in case you need it.

Annoyingly, if I had used uname -m instead of dpkg --print-architecture, I would still see the host’s 64-bit architecture, even though the docs seem to claim it should show i386. I see people reporting this behavior in posts from a few years ago, so it seems to still be the case today. I am not sure if there is something special happening with Alpine images or if the docs are just wrong, but it also makes sense, as Docker containers share the host system’s kernel and resources.
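A quick way to see the difference for yourself, using the services from my compose file:

# dpkg reports the architecture the image was built for...
docker compose run --rm linuxportbuilder32 dpkg --print-architecture    # i386

# ...while uname reports the host kernel's architecture, at least on my setup:
docker compose run --rm linuxportbuilder32 uname -m                     # x86_64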

Anyway, once I could consistently spin up 32-bit and 64-bit containers, the next step was to get my toolchains and source libraries for SDL2 and such mounted into the containers. It was both easy and hard, because while the mechanism to mount a directory as read-only into the container is straightforward, some of my build scripts assume the directory structure on my host system is in place.

Basically, I needed to change some things to make it more generic so that it works locally on my host system as well as in the containers.

For example, my third-party libraries and my toolchains are located at ~/Projects/Tools, but I didn’t want the container to depend on my own user account’s name, and it makes no sense to replicate my home directory in the container either.

Currently, my CMake toolchains hardcode the value. Instead, I can specify the environment variable myself when I run cmake.

THIRD_PARTY_LIB_DIR=/home/[my_username]/Projects/Tools cmake ../..

And when I create the docker containers, my new docker-compose.yml file has these entries added to volumes:

        volumes:
            - linuxportbuilder:/data
            - ${THIRD_PARTY_LIB_DIR:?err}:/home/gbgames/Projects/Tools:ro
            - ${THIRD_PARTY_LIB_DIR:?err}/CustomLibs:/home/gbgames/Projects/Tools/CustomLibs:rw

And my Dockerfile adds the gbgames user and ensures that the home directory for that user exists.
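The Dockerfile addition for that is small; something along these lines (my actual file may differ slightly):

# Create the gbgames user with a home directory so the mount points above have somewhere to land.
RUN useradd --create-home gbgames
USER gbgames
WORKDIR /home/gbgames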

So now I need to do:

THIRD_PARTY_LIB_DIR=/home/[my_username]/Projects/Tools docker compose up

And the resulting containers will have my Tools directory mounted read-only. If I don’t specify the environment variable, it fails instead of silently trying to continue to spin things up. I am also pleased that I can make a separate subdirectory read-write, as my current scripts like to build the custom libraries and place them in that directory, which means I won’t need to change my scripts too much.

Next up is mounting my current project’s git repo. I think I will similarly want to use an environment variable here to specify the source directory to make things more generic across projects.
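It would presumably mirror the Tools mount above; the variable name and container path here are just placeholders:

        volumes:
            - ${PROJECT_SRC_DIR:?err}:/home/gbgames/Projects/ToyFactoryFixer:ro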

But now that I don’t have to fight against tools that aren’t matching the docs, I should have an easier time of it.

Thanks for reading!

Want to learn when I release updates to Toytles: Leaf Raking, Toy Factory Fixer, or about future Freshly Squeezed games I am creating? Sign up for the GBGames Curiosities newsletter, and get the 19-page, full color PDF of the Toy Factory Fixer Player’s Guide for free!


Freshly Squeezed Progress Report – Initial Linux Porting Work with Docker

Last week, I previewed the work I was planning, and this past week I planned and worked on my first sprint since Toy Factory Fixer was published in December.

Sprint 57: Ports

Planned and Incomplete:

  • Create Linux port

I usually try to plan my weekly sprints on Sunday, but I was not able to dedicate the time to it. The plan came together a day later, and between various family commitments earlier in the week and my back giving me problems later in the week, I didn’t get as much time to dedicate to the work as I wanted.

Now, “Create Linux port” is currently split into six tasks, mostly related to creating scripts. As I mentioned last week, I wanted to replace the manual and cumbersome virtual machines I have used in the past with Docker containers that I expect to be able to automate more easily.

First, I wanted to know how easy it would be to set up a 32-bit Docker image and container.

Two weeks ago, I was trying to figure out how much demand there might be for 32-bit Linux binaries. Most of the player metrics I have access to, such as Steam’s reports, show that 64-bit systems reign supreme, which makes sense. And I remember that various distros were trying to get rid of 32-bit architecture support.

But I also know that they got a lot of pushback, partly because people want to be able to play older games. I also know that people like to breathe new life into older computer hardware by installing Linux-based systems.

So here’s my current thinking: I’m going to create a 32-bit binary option for Linux-based systems.

Why? Well, I’ve done it before. Years ago, I released a game and made sure it worked on both architectures, and it basically amounted to having two VMs, running the same build scripts on each, combining the binaries and libraries together, and then providing a script that detects the current architecture and runs the appropriate binary. So it shouldn’t require much more work, assuming I can replace the VMs with Docker containers.
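That architecture-detecting script is tiny; a sketch of the idea, with placeholder directory and binary names:

#!/bin/sh
# Run whichever binary matches the player's system.
DIR="$(dirname "$0")"
if [ "$(uname -m)" = "x86_64" ]; then
    exec "$DIR/bin64/game" "$@"
else
    exec "$DIR/bin32/game" "$@"
fi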

And I was pleased to find that Debian images exist for both architectures. I have yet to look into whether or not I will run into problems trying to get my 64-bit host to have a 32-bit container running on it, but that will come in due time.

I want to use an older EOL Debian version as my base Docker image because years ago I was trying to ensure a game of mine was compatible with as many systems as possible, and I found that Ubuntu was automatically adding stack protection even though I was setting a flag to say I didn’t want it (read Linux Game Development: GLIBC_2.4 Errors Solved to learn more). Stack protection required a newer version of GLIBC, which meant that people running older distros couldn’t play the game. The Debian-based VMs I was using didn’t have that problem. I could argue that people should update their systems, but I could also just as easily make my game available by not having arbitrary requirements.

So far, I have a docker-compose file and a Dockerfile that pull down Debian Etch, which is the earliest version that introduced 64-bit support, install a bunch of development tools and needed libraries, and…that’s it. I didn’t get too far. In fact, I spent most of my time getting a refresher on Docker configuration as it has been some time since I last messed with it.

I realized that I don’t exactly have a workflow I’m aiming towards. I know I want to be able to spin up a container, build custom libraries that reduce the number of dependencies I need to provide, then build my game project based on those custom libraries, and produce a tarball that I can distribute to players.

But tasks I didn’t identify before the sprint started include actually figuring out how those things happen. I have existing scripts that create the custom libraries, create the game binary, and combine the 32-bit and 64-bit results into one package, and so I expect that if I have to do any work on those, it is minimal.

How do I get my scripts into the containers? How do I get my source code into the containers? How do I access the custom libraries, either when they are built and need to be stored, or when the container needs them to do the building?

With the VMs, I think I remember I copied files into a shared location, then extracted them inside the VM, then ran the scripts manually. It was a bit cumbersome and annoying, and I even wrote down instructions so I could reproduce it consistently, with the expectation that I would eventually turn it into an automated script that would actually reproduce the work consistently.

With Docker containers, I could have the source pulled in by using git to grab the current version from source control, but it feels redundant when I already have my current version of the project in the place that I am probably launching the container from anyway. I envision a Jenkins job doing this work, and it likely already pulled the source from source control and doesn’t need to do it a second time.

I could use a bind mount instead. Boom. As soon as the container is up, it has access to my project. I could similarly give it access to my toolchains and the source libraries.
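Conceptually it boils down to something like this, with the image name, container paths, and build script as placeholders:

# Mount the current project into the container read-only, with a separate writable spot for build output.
docker run --rm \
    -v "$(pwd)":/home/builder/project:ro \
    -v "$(pwd)/build-output":/home/builder/output:rw \
    my-build-image /home/builder/project/build.sh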

And if I can get the containers access to all of those directories and files, then it should be a simple matter of using my existing scripts to build everything I need.

And then I could always write an actual master script that does it all for me with a single command rather than follow instructions manually.

So my original sprint plan wasn’t getting me where I needed to be, and I was struggling with figuring out what exactly I needed the Docker configuration to look like, but I have a much clearer idea now, and while I expect that my capacity to work on it will always be limited, I should be able to make good progress in this coming week.

Thanks for reading!

Want to learn when I release updates to Toytles: Leaf Raking, Toy Factory Fixer, or about future Freshly Squeezed games I am creating? Sign up for the GBGames Curiosities newsletter, and get the 19-page, full color PDF of the Toy Factory Fixer Player’s Guide for free!


Freshly Squeezed Progress Report: Planning Desktop Ports

My last progress report was in December when I announced that my first Freshly Squeezed Entertainment game, Toy Factory Fixer, was published and available for both Android and iOS devices.

I also mused about what I would do next. At the time, I said I was going to create a post-mortem presentation about the Toy Factory Fixer project, create desktop ports for it and Toytles: Leaf Raking, and make plans for my next Freshly Squeezed Entertainment project.

I meant to take a short break, play some games, and be a bit contemplative before getting back into regular development. Instead, it has been months, and I’m only now making such plans.

My Toy Factory Fixer Post-Mortem Presentation

Toy Factory Fixer’s post-mortem was finished in January, but then I worked on creating a presentation as well. Unfortunately, my scheduled date in February for actually giving it was canceled and never rescheduled, so I have spent the last few months updating and improving the presentation on and off. I did a final pass last week and feel good about it, and I plan to record myself giving the presentation and upload the video myself.

I realize that this presentation’s development was holding up a lot of planning, as I didn’t feel good about moving on until it was done. Unfortunately, I wasn’t good about working on it consistently, and without an external deadline, I allowed myself to let it prevent me from thinking about any other work.

Once Toy Factory Fixer was finished and published, I suddenly didn’t have the project plan guiding my efforts anymore. Instead of knowing what I was going to do each day during my designated game development time, I had a vague sense that I would finish the post-mortem presentation, then plan for future efforts. But I didn’t simply use my normal game development time for regular presentation development time, so my day to day was less focused and disciplined.

And eventually weeks turned into months. Whoops.

But the presentation is done. I’ve deleted a number of unnecessary slides, corrected typos and errors, and tightened up the pacing and delivery. I think.

I am sure I could continue to improve it, but after some concentrated efforts on it last week, I think I can say I am done. Now it is just a matter of making the priority and effort to record myself giving the presentation.

Desktop Ports

I’ve been thinking on and off about porting plans, but I did not put anything into an actual plan until late last week.

As the weekend was a bit busy, I’ll need to spend my upcoming week on finishing the plans.

Basically, I think the easiest thing for me to do is make a Linux-based desktop port, mainly because it is the system I do development on. I already have build scripts that ultimately create a game that players should find works across any number of Linux-based distros and even versions, and I believe it will mostly be a matter of making the process easier for me.

My old scripts depended on using virtual machines of old 64-bit and 32-bit Debian systems, which required setting up shared directories between the VMs and my main development machine, and then copying all resulting libraries and binaries into a single place to package up.

I could probably bring those VMs up again (although I don’t know where on my hard drive they might be), but I would rather find a way to use containers if I could, as well as automate scripts to make it easier for future projects.

While most people have 64-bit systems, and many distros are trying to get rid of their 32-bit architectures, I think enough people might put Linux on old computers to give them new life that a 32-bit option for a new game would be welcome. It isn’t much more work, and it won’t hurt or slow things down for me, so why not?

The next desktop to port to might be Windows, something else I’ve done in the past. In fact, thanks to Ludum Dare 50, I discovered that my existing build scripts create a Windows port mostly fine, although I needed to tweak them and download updated SDL2 libraries.

I think the main concern I have is that I know Microsoft scares people with warnings if your game isn’t signed. I think the last time I looked into it, it has to be signed a particular way with a particular certificate authority, which costs some money. Considering Freshly Squeezed Entertainment games are meant to be given away for free, it means I’ll be spending some money for the privilege of doing so in a way that won’t have Windows telling people to be afraid of the game they just downloaded.

Finally, I will need to look into what it takes to port to Mac. I’m hoping it is fairly straightforward, but I also know that Apple likes to do things very differently. I already have the iOS port, so maybe I’m most of the way there? But I won’t know until I start working on it. As for signing, I am hoping the mechanism is the same as the one used to sign the iOS port and that I don’t need to do anything special. And I hope that Apple doesn’t deprecate my Mac Mini and require me to buy all new hardware to support their new chips.

If I can, I want to look into creating a web version of my games. I believe using Emscripten and maybe some very minor platform-specific code means I am pretty close.

My Next Freshly Squeezed Entertainment Project

I’ve had a few game ideas rolling around in my notes and in my head, but my efforts during Ludum Dare 50 at the beginning of April resulted in a turn-based game about a city dealing with regular disasters.

Well, the game only had one disaster to speak of because I ran out of time, but my original plan featured multiple. And I think I want to try to turn Disaster City into a Freshly Squeezed Entertainment game now that I don’t have a time crunch to worry about.

The feedback I got during the review period showed me that I had the start of something compelling.

Once I’m done with the desktop port work, I expect to put together some plans for this next game.

Thanks for reading!

Want to learn when I release updates to Toytles: Leaf Raking, Toy Factory Fixer, or about future Freshly Squeezed games I am creating? Sign up for the GBGames Curiosities newsletter, and get the 19-page, full color PDF of the Toy Factory Fixer Player’s Guide for free!


LD50: How My Entry Disaster City Did #LDJam #LD50

Late last week was the end of Ludum Dare 50, culminating in Results Day.

After giving everyone a few weeks to play and rate everyone else’s games, the final numbers were calculated, and the ratings announced.

Congratulations to those who ranked at the top in both the Compo and Jam categories! And congratulations to everyone who managed to finish and submit a game in a weekend!

So, how did my entry, Disaster City, do?

There were about 2,900 games entered. Of those, 1,413 were in the 72-hour Jam category, which is the one I ended up entering my game in. I intended to submit it for the 48-hour Compo, but I ran out of time and needed the extra day to finish the game.

Here are my game’s final stats:

Overall: 1137th / 1413 (3.136 average from 35 ratings)
Fun: 987th / 1413 (3.106 average from 35 ratings)
Innovation: 783rd / 1412 (3.212 average from 35 ratings)
Theme: 947th / 1398 (3.424 average from 35 ratings)
Graphics: 1206th / 1279 (2.682 average from 35 ratings)
Audio: 870th / 943 (2.734 average from 34 ratings)
Humor: 602nd / 1147 (3.078 average from 34 ratings)
Mood: 1068th / 1353 (3.109 average from 34 ratings)

So, Disaster City was pretty much in the bottom of the pack, with my worst scores in the Graphics and Audio categories, which…ok, fair. Jam entries are made by teams usually, and I did everything solo, and I am not an artist or great with audio. I do wonder how it would have compared to other Compo games though. My Overall, Graphics, Audio, and Mood scores were in the 1st quartile, which feels a bit disappointing.

All of my other scores were in the 2nd quartile. My best scores came from the Innovation and Humor categories. While they were in the 2nd quartile, they were near the top of it.

LD50: Monster emerging mock-up

I could spend a lot of time analyzing it, but Honest Dan shared a post titled “Why didn’t my game rate higher?”, which addressed that and other similar questions and made me realize something: I’ve had a low-priority bucket list item to create a #1 game for Ludum Dare (despite not having participated in years), but that goal is kind of out of my hands to make happen.

My ratings/rankings don’t matter?

35 people submitted ratings for my game out of thousands of players. That’s not statistically significant. Out of the “winning” entries for LD50, I think I only played one of them. The others were mostly off of my radar. And I imagine it is the same for many participants.

I set a goal to review at least 40 games, as reviewing more games means the Ludum Dare algorithm will show your game to more people who are looking to review games.

One person managed to review 400 games, which is prolific! When I checked their own game, though, I saw that while they had way more than the 35 reviews I did, it was still only about 100 ratings for their game. So there is a limit to how much the algorithm helps players find your game, I suppose. Also, having 100 ratings means you can be more confident about what people generally thought of your game, but it still isn’t much more statistically significant.

The winners of LD didn’t have more participants rating their game than others. They had a small subset of people rate their game, and those ratings were higher than the subset that rated someone else’s game.

In a way, it’s hard to say how someone’s game actually compared to someone else’s game because there’s some randomness in terms of who played which games and how they felt about them.

And as Honest Dan pointed out, you can’t even really compare your previous LD results to your current results, as it isn’t like you had the same 20 or so people rate your game in each. It’s an apples and oranges comparison. You can’t say whether you improved based on how others rated you in two different LD compos.

So despite leading with the stats up above, I’m trying not to take them terribly seriously. Maybe the general average ratings can give me a sense of what someone’s gut feel for my entry was.

But what I found incredibly valuable was the comments section of my game. I think even though there was a major bug that people reported that they exploited, and even though they said the game was too easy, a lot of them also reported that they basically saw the promise of the game. They liked the turn phases, the general concept, the mechanics and dynamics. They enjoyed playing it.

Which tells me that in a weekend I was able to put together something that other people found entertaining, if flawed. And that’s an accomplishment, something a low-ish rating can’t take away.

Now what?

I would like to make a post-LD version of Disaster City. My task list was incomplete, so what you can play now is a subset of what I envisioned. Adding more disasters and more things for the player to do to mitigate or rebuild after them should make it more compelling. And balancing the game should help, too.

But if I do, I want it to be part of my Freshly Squeezed Entertainment line of games, and I want to be deliberate here. Rather than build onto what I did, I might start over without the time crunch of a weekend deadline.

More on those plans later.

Ludum Dare 51 is scheduled for September 30th, and I would love to participate in it. I’ve marked it on my calendar, and I’ll need to work out logistics with my wife in terms of how to ensure I can focus on the compo that weekend. She arranged multiple outings with the kids and babysitters so I could fully participate in LD50, and I am incredibly grateful.

More immediately, this LD got me focused on putting together desktop ports of my entry so that people could more easily play them, which is something I’ve been meaning to work on for my existing games. So I’m going to tweak some build scripts, create some desktop ports of Toy Factory Fixer and Toytles: Leaf Raking, and then look into how easy it would be to create a web build using Emscripten.

Thanks for reading, and thanks for a great weekend and past few weeks!


LD50: My Favorite Entries So Far #LDJam #LD50

Ludum Dare 50 is almost over, with the review/ratings period and the new Extra option continuing until April 21st.

In the roughly two weeks after the Compo and Jam deadlines ended, entrants are expected to play and rate other games, leave feedback, and otherwise enjoy the number of new games created.

An entry needs 20 ratings to receive a score, and I am happy to say that so far my entry Disaster City has an average of 27.875 ratings per category. While I have 28 ratings in each of the other categories, I only have 27 for Audio. Weird. Also, someone live streamed playing it, which was exciting to see!

I’ve played and rated only 23 games as of writing this post, and my goal is to try to play and rate at least a total of 40 entries before the 21st.

In the meantime, I want to highlight a few of my favorites so far.

Compo Entry: Ready, uNsTeAdY, FIRE!

The genre is familiar, the production values look and sound simple, and yet the mechanics are compelling. You have to move your ship to prevent your gun from firing too early, you have to move to avoid crashing into the Space Beasts, and if you fire your laser but don’t manage to kill any of the Space Beasts, the power of the laser overloads your ship and you die. The fact that some of the Space Beasts can only be killed by a much more charged-up laser blast means that you spend a lot of time watching your meter while moving, trying to keep the laser charging and avoid enemies at the same time.

I know it was the creator’s voice the entire time, but I found the music catchy (it was stuck in my head while I was washing dishes) and the sound effects effective as well as hilarious. The highest laser beam charge is incredibly satisfying to fire off, too.

Extra Entry: Princess: Unwed

You play a medieval princess trying to use her carrier pigeon to send and receive messages across the land, getting the dirt on potential suitors, all in the name of convincing your mother to call off any arranged marriages.

It’s a mystery game, in which you try to gather clues to piece together where the suitor might be, what they might be up to, and what kind of person they are.

And there’s a pigeon. What’s not to love?

Jam Entry: 22h38

A mystery adventure with a very intimidating end of the world. The art and audio work well together, and the concept kind of reminded me of a few time-based sci-fi stories I’ve read or watched.

Jam Entry: Kiwis Can’t Fly

This game is emotional, both thanks to the excellent audio design and to the writing. The art is adorable, and as some of the reviewers have mentioned, it is reminiscent of Orisinal. It feels like playing a musical instrument, and it brought a smile to my face. The ending is both sad and beautiful. I loved the entire experience.

It’s been years since I last participated, but I am once again loving the variety of concepts that people have implemented in a short period of time. Good work, everyone!

What have been your favorite entries so far?


LD50: Disaster City Ports and a Time Lapse #LDJam #LD50

Disaster City, my Ludum Dare 50 entry in which you must try to R&D your way to averting a seemingly inevitable disaster while dealing with other disasters at the same time, was originally released with a Linux-based build.

I have since ported it to Android in the form of an APK you can sideload.

And now I have a Windows version of the game!

You can find my LD 50 entry page and the download links at: https://ldjam.com/events/ludum-dare/50/disaster-city

LD50: Disaster City entry

Thanks for playing!

In the meantime, here’s a time lapse of my desktop during the 48 hours of the compo, plus a few more hours as I tried to see how far I could get even when the deadline passed. I didn’t record the next day when I finally got the entry in by the Jam deadline, though.


LD50: Introducing Disaster City, Submitted as a Jam Entry #LDJam #LD50

I squeezed in a few more hours of development today, and in between picking up the kids from school, meetings, dinner, play time with the kids, and getting them ready for bed, I managed to get my game to a somewhat finished state, enough of a game that I felt comfortable submitting it.

And then of course I discovered a number of game-breaking bugs with my submission that I managed to fix shortly after.

But I did it. You can learn more about Disaster City at the official Ludum Dare 50 entry page.

LD50: Disaster City entry

I’m going to try to create an Android port next, but for now, I need to rest and prepare for the rest of the week.

Happy 20th birthday, Ludum Dare! I’ve missed this. B-)


LD50: Didn’t Make the Compo Deadline, Switching to Jam/Extra? #LDJam #LD50

So a lot got done in the final 12 hours of Ludum Dare 50 before the compo deadline.

Just not enough of it.

Disaster City already had the super secret anti-meteor R&D base (modeled off of the NASA Langley Research Center) and some skyscraper buildings. There was a river nearby that I intended to cause floods.

I set about creating the core game loop. Each turn would occur in phases.

There would be a report phase, in which you learn about any new disasters as well as how the previous day went. Then there would be a player command phase, in which you can decide to do various repairs or investments. Then the day would resolve based on what you did and what was going on.

Getting these turn phases in meant I could quickly put together an ending in which the inevitable meteor hits and you lose the game.

Then I created the ability to invest in R&D to prevent it.

Once I did so, I could make a happy ending, in which you successfully divert the meteor.

Then, the real game development needed to happen to make it interesting. Basically, I wanted a lot of disasters to befall the well-named city that you needed to spend your time and resources on instead of spamming the Invest button.

And since I didn’t have much time, I changed priorities to get the monster attack in sooner because I really wanted to take advantage of the drawing I had made earlier haphazardly and because I thought it would be more interesting than a flood or fire.

LD50: A disaster alert

LD50: A disaster warning

LD50: Changed scale of game

LD50: Monster attack

Unfortunately, getting the monster to appear, attack a building a bit, and disappear took a very long time. There were weird bugs, like the monster teleporting back into the water and then coming at the same building turn after turn without ever doing anything. There was also one defect I couldn’t figure out but that somehow seemed to resolve itself, and since I didn’t have a lot of time, I had to just hope I had truly fixed it.

I scaled the game up since I obviously wasn’t going to get to implement the other city features I wanted, like parks that could be destroyed (people leave when there isn’t greenery nearby), fire departments to handle fires, roads that could get destroyed and need repairing, etc. Luckily, I learned from a previous project to create my art at 1024×1024, then scale it down, which made it easier to scale up without it looking wrong. The game and the monster look slightly more interesting without all of the blank space.

As for meals, I ate a quick lunch of a hamburger with a veggie patty and some condiments, plus some veggie straws, and later as there were only a few hours left in the compo my wife brought me the pasta and salad dinner she made to my office on a tray.

LD50: Veggie patty burger and veggie straws

LD50: Pasta and salad and a jealous Gizmo

Anyway, right around the deadline, I finally got buildings to be destroyed after being attacked enough, and the resulting population loss (uh, the people had to move because their home was destroyed, obviously) reduces your income, which makes it harder to win.

Except not yet, because all you need to do is spam that invest button still, and you can still win by ignoring the destruction.

You have no other meaningful actions yet, so the main dynamic of figuring out your immediate and long-term priorities isn’t in.

About half of my task list got done yesterday, so that’s a big accomplishment on its own. I went from having a title screen and some paper prototypes and notes to having a playable experiment to build upon.

I only blocked off this past weekend to work on it, though. I took today off from the day job to recover from LD, and there is a lot of catching up to do from the weekend in terms of the rest of my life and obligations.

The LD 50 Jam deadline is in 9 hours, and practically speaking I won’t be able to dedicate the entirety to game development.

But what can I do with a few more hours?

I have a monster that successfully attacks buildings. My intention was to have only one monster, as a monster attack was but one disaster possibility out of many, but I could easily make more instead of trying to make random fires or cause flooding.

So maybe there is a meteor coming, and you have to deal with multiple simultaneous monster attacks? Eh.

Giving you the ability to repair buildings that are damaged is my first priority.

Then I can balance the numbers and see if it might be enough to feel like a compelling game.

Then I’ll worry about multiplying monsters.

Oh, I guess I decided I was going to continue working on this game today after all.


LD50: A Very Long Task List, Only 12 Hours Left, and a Short Pep Talk for You #LDJam #LD50

Last night I realized that, other than a bunch of disconnected notes and a few standalone images I had made, I didn’t have a rough plan in place, which made it more difficult to move the project forward.

So I made myself a plan. Normally I use a spreadsheet and track weekly tasks, but I think for a 48-hour compo a simple TODO list in a text file works.

Right away, it felt simultaneously daunting (that was a very big task list, and I know I will discover more work as I go) and manageable (just start from the top and work my way down).

So I set to work, first by creating a city layout. It’s simple at the moment, and the most immediate urgency was putting in the river tiles, the ground tiles, and the super secret anti-meteor research and development center. I can put in other buildings (already drawn) later.

But then I thought it would make for a slightly better screenshot if skyscrapers and other buildings were there, too, and I was pleased that the work to add it was almost nothing.

LD50: Initial city layout

It’s not much now, but I hope to make this a more bustling metropolis soon.

I called it a night around midnight (so technically a morning?), even though I still didn’t have gameplay in, but that was a bigger lift than I was ready for without sleep.

This morning, I did some light exercise and ate breakfast.

Behold, my peanut butter, cinnamon, raisin, and pickle sandwich.

LD50: the classic PB, raisin, and pickle sandwich

If you know, you know.

I washed it down with some orange juice (hah, classic Me, amirite?), and since there was only so much left in the bottle, I finished it off.

LD50: Small glass of OJ

LD50: There wasn't enough left in the bottle to put back in the fridge

And my mother-in-law left us some cinnamon rolls, which my wife baked just now.

LD50: Cinnamon rolls!

So before I get back to game development, I did want to give you a pep talk if you need one.

Whether this is your first time participating in Ludum Dare, your first time doing game development, or even if you have been doing game development for a long time, you might look around at what everyone is accomplishing around you and think to yourself, “I don’t belong here.”

And I’m here to tell you that you do, in fact, belong here.

My first LD was #11 in 2008, but my first game jam was Game in a Day in 2005. I remember early on in the 24 hours I had to make a game that I kind of froze up.

I remember feeling like I simultaneously could do it but also that I should stop. It was a weird mix of fear and confusion. I didn’t know why I felt like I should stop other than a vague fear that I shouldn’t even bother, that I didn’t know what I was doing, etc.

I pushed through somehow, and even though the game I made was a very, very far cry from what I set out to do (24 hours isn’t a lot, it turns out), and even though it was buggy, and even though other game developers participating were professionals who made amazing things in those same 24 hours while I was merely an aspiring wannabe, I can say that I participated and I did, in fact, make a game that day.

When I participated in LD #11 a few years later, that same weird fear gripped me shortly after starting, but this time I recognized it. In hindsight, maybe it was Imposter Syndrome? But whatever it was, I successfully ignored it. I managed to successfully make a game and rank relatively highly, too!

And maybe it was because Ludum Dare in 2008 had something like tens of entries rather than thousands, and the community was smaller and more intimate, but there was definitely a home base in that IRC channel of supportive people who made you feel like you belong there just as much as they did. Game in a Day was even smaller, and I wish I could remember the name of the person who gave me advice to cut my scope, but that person also made me feel like I was being taken under his wing, that I was encouraged to be part of that community, too.

We have less than 12 hours to finish a game for the compo. Maybe you’re like me and have a large task list in front of you and you worry you might not finish in time. Maybe you don’t even have an idea yet, or you gave up on one project already. Maybe you see some of the amazing things that people are posting and think, “I am nowhere near that level!”

But don’t compare your efforts and struggles to published, polished efforts of a team of veterans. It is easy to imagine that everyone else knows what they are doing, but you’re only seeing the people willing to post their awesome stuff and not seeing the mistakes, dead-ends, and struggles behind the scenes for them and for many others.

Don’t compare your efforts except to your own previous experiences. And if you have no previous experiences, then consider this your baseline to compare yourself to next time.

And if you feel like you failed Ludum Dare because you couldn’t get it all together in time and publish a game, I’m here to say that after LD 11, I failed to put together a playable game in LD 14 and again in LD 32. There were Mini LDs that I “failed” as well. It was disappointing, but I never felt like I didn’t belong and shouldn’t have made the attempt.

Now, part of that might be cishet white privilege talking, but part of it was that by that point I was part of the Ludum Dare community.

So I want you to know that you belong and are welcome here, too.