Categories
Game Design Game Development Geek / Technical

Freshly Squeezed Progress Report: Windows Port and Preparing to Publish

Last week, I reported that I finally finished my Linux-based port of Toy Factory Fixer.

This past week, I set out to focus on the work of actually publishing it.

Sprint 61: Ports

Planned and Complete:

  • Create Windows port
  • Create itch.io page

It’s always a good feeling to finish all of the work I set out to do.

I didn’t know how long the Windows port work would take, and since the Linux port was finished, I wanted to get that into the hands of players as soon as I could.

I wanted to publish my games on itch.io, the open marketplace for independent digital creators with a focus on indie video games. I already had an account there thanks to some major charity bundles the site hosted, and I like how creator-friendly the organization is and how proactive it is about keeping the site free of toxicity, so I started by learning what it takes to create a page.

And then…I made a page for the game.

The itch.io page editing tools were for the most part quite intuitive; other than needing to shrink some of my animated GIFs to fit the 3MB limit, the process was painless. In fact, it made me wish there was an itch.io-inspired theme for WordPress, because I love the way my game’s page looks and would love to match it on my own website.

I’ll be announcing the game’s release quite soon (as in, this week), so sign up for the newsletter (see below) if you want to hear about it first!

In the meantime, I found that creating the Windows port was fairly straightforward, but I also discovered that some of my code was causing compatibility issues and had to be changed.

Specifically, I had code using std::fabs, which I changed to abs, but it turned out I was calling the C version, which only handles integers, rather than the floating-point overload you get by specifying std::abs from <cmath>.
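A minimal sketch of the pitfall (helper names are mine, and which abs you actually get depends on headers and namespaces, so the C behavior is modeled explicitly here):

```cpp
#include <cmath>  // std::abs / std::fabs floating-point overloads

// Mimics the C library's abs(), which takes and returns int, so a double
// argument is silently truncated before the sign is stripped.
int c_style_abs(int x) { return x < 0 ? -x : x; }

double lossy_abs(double x) {
    // The fractional part is lost in the int conversion.
    return static_cast<double>(c_style_abs(static_cast<int>(x)));
}

double correct_abs(double x) {
    return std::abs(x);  // the <cmath> floating-point overload
}
```

With an input like -2.75, the C-style path yields 2.0 while the std::abs overload yields 2.75, which is exactly the kind of subtle bug that only shows up at runtime.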

I also realized that my Linux port had no music-playing capability, because the custom SDL2_mixer library could not find OGG Vorbis in my Docker container. The docs only mention that it needs OGG/Vorbis libraries installed on the system, and I struggled to figure out exactly which of the handful of candidate libraries to install until I installed the libvorbis-dev package. Hopefully that detail helps you if you run into something similar.

The main problem with my Windows build was figuring out what it takes to sign the executable. Basically, if the executable isn’t signed, Windows likes to put up a scary modal dialog informing the player that it could be something dangerous from the Internet.

From what I could tell, I could sign it with my own credentials, but since no one has my certificate, it might as well not be signed at all. I could pay for a certificate, but certificates aren’t cheap, especially for a free game. And on top of it all, I am reading that unless a critical mass of users downloads and runs the game, Microsoft’s Defender SmartScreen will see that the game doesn’t have a positive reputation and still flag it anyway.

So, I decided not to sign it for now. Maybe I’ll come back to figuring this piece out if my players worry about it.

Anyway, itch provides a tool called butler, so I spent time figuring out how itch prefers you to upload your distributables. I was expecting to provide .tar.gz files for Linux or an installer for Windows, but with butler, you simply point at your extracted directory and say which “channel” it is in (linux, windows, mac, etc), and it uploads everything in a nice way that allows you to upload updates in simple patches.
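For reference, a sketch of that butler workflow (the account/game slug, build directories, and version number here are illustrative placeholders, not my actual values):

```shell
# Authenticate once, then push each extracted build directory to a channel.
# butler diffs against the previous upload and sends only a patch.
butler login
butler push build/linux   gbgames/toy-factory-fixer:linux   --userversion 1.0.0
butler push build/windows gbgames/toy-factory-fixer:windows --userversion 1.0.0
```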

On the player’s side, they see .zip files, with the version specified.

Toy Factory Fixer downloads on itch.io

What I don’t like is that when you unzip the file, it extracts the contents directly into whatever directory you’re in, as opposed to providing a root directory.

On the other hand, due to how butler and the itch.io app work, when you download the game through their app, it automatically creates a root directory with the name you provide for the page. It’s just a poorer experience (I think) for anyone who does a direct download from the page, while providing a root directory myself would mean a poorer experience for those who use the app.

I will say that the experience with the itch.io app is nice. You click download, then hit the Launch button, and even though I didn’t specify a manifest or anything, the app figured out which file was the executable, so the game just works.
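For anyone who would rather be explicit than rely on that heuristic, the itch app supports an optional manifest file placed at the root of the upload; a minimal sketch (the action name and binary path are illustrative):

```toml
# .itch.toml — optional manifest read by the itch app to find the executable.
[[actions]]
name = "play"
path = "ToyFactoryFixer"
```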

So now I have a Linux port AND a Windows port ready to go, and I’ll soon launch the itch.io page for the game (again, sign up for the mailing list to be the first to know!).

Next up is to create the Mac port, which I worry will not be as straightforward.

Thanks for reading!

Want to learn when I release updates to Toytles: Leaf Raking, Toy Factory Fixer, or about future Freshly Squeezed games I am creating? Sign up for the GBGames Curiosities newsletter, and get the 19-page, full color PDF of the Toy Factory Fixer Player’s Guide for free!


Freshly Squeezed Progress Report: A Finished Linux Port

Last week, I reported that I was close to finishing the creation of infrastructure to create a distributable Linux port of Toy Factory Fixer and other games I have made and will make.

I continued the work this past week.

Sprint 60: Ports

Planned and Complete:

  • Create Linux port

My work was split up into the following tasks:

  • T: Use bind mount to load current project and relevant Projects directories as read-only
  • T: Build custom libraries for 32-bit
  • T: Build custom libraries for 64-bit
  • T: Build game using libraries for 32-bit
  • T: Build game using libraries for 64-bit
  • T: Update CMake files to package up assets correctly
  • T: Write script to coordinate all of the tasks to put together a finished binary with 64-bit and 32-bit support

Due to the Memorial Day weekend, I did not work on game development earlier in the week, and I put in fewer hours than I have in any week of the last month.

Still, I managed to finish the work, partly thanks to being able to leverage work I did many years ago the last time I tried to distribute a Linux version of a game.

I ended the previous sprint with the challenge that my C++ code was accidentally using C++11 features and so wasn’t compatible with older versions of gcc/g++.

I decided to change my code to not require C++11. In one case, that meant writing an assignment operator to properly handle a member that was a reference, and in another case I learned that, before C++11, std::vector’s erase() can’t take a const_iterator. Luckily I didn’t strictly need a const_iterator there, but I found it surprising.
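The reference-member situation can be sketched like this (the class names are hypothetical, not from the game’s code): the compiler cannot generate a usable operator= for a class with a reference member, because references cannot be reseated, so you write one that copies the state behind the reference instead.

```cpp
// Hypothetical example types for illustration only.
struct Settings { int volume; };

class Mixer {
public:
    explicit Mixer(Settings& s) : settings(s) {}

    // The implicitly generated operator= is unusable with a reference
    // member, so copy the referred-to state by hand. The reference
    // itself continues to point at the same Settings object.
    Mixer& operator=(const Mixer& other) {
        settings.volume = other.settings.volume;
        return *this;
    }

    int volume() const { return settings.volume; }

private:
    Settings& settings;
};
```

Note that this copies *through* the reference, so the Settings object the left-hand Mixer refers to is modified; whether that is the right semantics depends on the class in question.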

Anyway, once I had the game building in both 64-bit and 32-bit Docker containers using Debian Wheezy images (remember, I am using this older version of Debian to ensure that the game will run on as many varieties of Linux-based systems as possible), I then had to ensure that all of the game’s assets were packaged properly, which basically meant updating my CMake files to refer to my GameSpecific directories.

What’s nice about my Docker-based infrastructure is that I can switch SDL2 versions very quickly. I just download the latest SDL2 tarball, put it in the appropriate directory, and spin up my container, which rebuilds the libraries, builds my game, and packages it all together.

I can not only port Toy Factory Fixer to Linux-based systems, but I should also be able to do similar work for Toytles: Leaf Raking. I want to play that game with an eye toward what the desktop might afford, in case there are features, such as keyboard shortcut support, that I might want to add.

But otherwise the porting work is done. Now I need to do the work of publishing it. I have vague plans to upload the game to itch.io, but I did not spend time on solidifying those plans last week like I wanted to, so I will be doing so this week, and next I will work on porting to Windows and Mac.

Thanks for reading!



Freshly Squeezed Progress Report: Very Close to Finishing Linux Port

In last week’s report, I had figured out that my installation of Docker was missing some of the newer features available, which explained why I had to do so much more work to do what seemed like a simple thing.

This week, I managed to make great progress on my port work.

Sprint 59: Ports

Planned and Incomplete:

  • Create Linux port

My work was split up into the following tasks:

  • T: Use bind mount to load current project and relevant Projects directories as read-only
  • T: Build custom libraries for 32-bit
  • T: Build custom libraries for 64-bit
  • T: Build game using libraries for 32-bit
  • T: Build game using libraries for 64-bit
  • T: Update CMake files to package up assets correctly
  • T: Write script to coordinate all of the tasks to put together a finished binary with 64-bit and 32-bit support

I once again put in more hours than expected, and I successfully built custom SDL2 libraries in both 64-bit and 32-bit varieties.

As I’ve mentioned, I am trying to use an older version of Debian to build my code because I want the resulting binaries to run on as many Linux-based distributions as I can.

So I have been using debian/eol:etch as a base image, but I quickly discovered that the version of CMake available in Etch is v2.4.5, which is too old for some of the features my CMake-related scripts needed.

I tried Debian Lenny, which has CMake v2.6, but it turns out the FindSDL2.cmake I was using relies on a QUIET keyword that that version didn’t know about.

So I switched to Debian Squeeze, which has CMake v2.8.2, and that seemed fine. I did run into an issue that I thought was yet another compatibility problem, but it turned out to be a problem with an environment variable not actually being passed in.

In trying to build my custom libraries, I have this section in my CustomLibs.cmake:

# Extract SDL2 into build directory (assume only one tarball exists).
# Creates ${SDL2_EXTRACTED_DIR}/build/.libs/libSDL2.so files.
FILE(GLOB SDL2_TARBALL "${THIRD_PARTY_LIB_DIR}/SDL2-*.tar.gz")
EXECUTE_PROCESS(COMMAND ${CMAKE_COMMAND} -E tar xzf "${SDL2_TARBALL}" WORKING_DIRECTORY "${PROJECT_BINARY_DIR}/")
STRING(REGEX REPLACE "^.*(SDL2-[A-Za-z0-9.]+).tar.gz*$" "\\1" SDL2_EXTRACTED_DIR ${SDL2_TARBALL})

And I was getting an error along the lines of:

string sub-command REGEX, mode REPLACE needs at least 6 arguments total to command.

Oh, was this something else that has changed? CMake’s documentation doesn’t seem to indicate anything. I spent quite some time trying to track down what was going on, eventually upgrading to Debian Wheezy before deciding I could downgrade again to Squeeze.

Eventually, I figured out that THIRD_PARTY_LIB_DIR wasn’t set to anything, and so SDL2_TARBALL was similarly not set correctly, which meant the remaining commands were not able to execute properly.

Before this porting work, I had a toolchain file that set THIRD_PARTY_LIB_DIR to a particular directory on my machine that had the SDL2 library’s tarball. That is, a CMake-specific toolchain file set the variable, which meant that the CMakeLists.txt and other files knew about that variable.

Obviously that specific path won’t work inside the Docker container, so I wanted to set that path by way of an environment variable instead. And I struggled to figure out why my Docker container’s CMD line seemed able to output the ENV variable, and the script being called seemed able to output the variable, but CMake seemed ignorant of it.

Well, I realized that within my CMakeLists.txt and CustomLibs.cmake, I never needed to pay attention to environment variables before, as I was using that toolchain file to set the variable.

In order to make use of an environment variable, I had to do the following:

IF (NOT DEFINED ENV{THIRD_PARTY_LIB_DIR})
    MESSAGE(FATAL_ERROR "THIRD_PARTY_LIB_DIR is not defined.")
ELSE()
    SET(THIRD_PARTY_LIB_DIR $ENV{THIRD_PARTY_LIB_DIR})
    MESSAGE("THIRD_PARTY_LIB_DIR is defined as ${THIRD_PARTY_LIB_DIR}")
ENDIF()

Note that in the IF statement, I can refer to the environment variable as ENV{THIRD_PARTY_LIB_DIR}, but in the SET statement, I need a dollar sign to dereference it: $ENV{THIRD_PARTY_LIB_DIR}.

Once I figured that issue out, I found that I needed to install libxext-dev because otherwise SDL2 wasn’t going to build.

Then I ran into a different environment variable being missing, so I solved it similarly to the way I did it above.

And then I ran into a frustrating problem.

My custom libraries get built by extracting the tarballs for various SDL2-related libraries. When SDL2_image gets extracted, I run configure on it with some custom arguments to get as small a binary as I can, then build it.

Except I kept getting the following error:

 aclocal-1.16: command not found
WARNING: 'aclocal-1.16' is missing on your system.
         You should only need it if you modified 'acinclude.m4' or
         'configure.ac' or m4 files included by 'configure.ac'.
         The 'aclocal' program is part of the GNU Automake package:
         <https://www.gnu.org/software/automake>
         It also requires GNU Autoconf, GNU m4 and Perl in order to run:
         <https://www.gnu.org/software/autoconf>
         <https://www.gnu.org/software/m4/>
         <https://www.perl.org/>

I tried installing automake, then wondered if I needed to install a bunch of other automake-related tools separately, but then I came across this Stack Overflow response that pointed me in the right direction:

It defeats the point of Autotools if GNU This and GNU That has to be installed on the system for it to work.

As well as this response:

As the error message hint at, aclocal-1.15 should only be required if you modified files that were used to generate aclocal.m4

If you don’t modify any of those files (including configure.ac) then you should not need to have aclocal-1.15.

In my case, the problem was not that any of those files was modified but somehow the timestamp on configure.ac was 6 minutes later compared to aclocal.m4.

So I checked, and I saw that the directory SDL2_image was being extracted to had files with timestamps reflecting my system’s current time.

If I manually extracted those files on my host system instead of within the Docker container, the timestamps show 2019 as the year, meaning that whatever files were in the tarball, the timestamps of those files were getting preserved.

So either Docker bind mounts do something wacky with timestamps, similar to how virtual machine shared drives can get kind of iffy, or maybe there was a compatibility issue with tar just like how I was dealing with compatibility issues with CMake.

And sure enough, I found that in tar v1.26, around 2011, tar’s --atime-preserve option had a major fix to make it work correctly.

So once again, I upgraded my Docker image to use Debian Wheezy as a base image, and tar suddenly extracted files with preserved timestamps, which meant the build no longer asked me to have automake installed.
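If you want to check whether a given system’s tar preserves archived modification times on extraction, a quick sanity test (the paths here are illustrative) looks like this:

```shell
# Create a file with an old timestamp, archive it, extract it elsewhere,
# and confirm the extracted copy keeps the archived mtime.
mkdir -p /tmp/tar_ts_demo/src /tmp/tar_ts_demo/out
echo hello > /tmp/tar_ts_demo/src/file.txt
touch -t 201901010000 /tmp/tar_ts_demo/src/file.txt
tar -C /tmp/tar_ts_demo/src -czf /tmp/tar_ts_demo/demo.tar.gz file.txt
tar -C /tmp/tar_ts_demo/out -xzf /tmp/tar_ts_demo/demo.tar.gz
date -r /tmp/tar_ts_demo/out/file.txt +%Y   # a healthy tar prints 2019
```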

And the rest of my build scripts were successfully able to build and package up the custom libraries! Woo!

So the next piece to work on was similar. I had to update my scripts to accept environment variables and such, and then I could build the Toy Factory Fixer project just fine.

And the only reason it isn’t finished yet? I ran out of time in the week. Basically, I ran into an error due to code that requires C++11, and Debian Wheezy uses GCC 4.7, which would require the -std=c++0x flag for compatibility.

So either I change my code or I pursue that flag, which I think might require me to install an updated libstdc++, and at that point, why not just upgrade to Debian Jessie?

Either way, I expect that my project will successfully build soon, and then I can work on making sure that my CMake packaging-related lines understand how I’ve created separate directories for my resource files since the last time I worked on it years ago.

After that, I want to make sure all of the library building, project building, and packaging into a single tarball to distribute becomes a one-line script.

And then the port will be done.

Thanks for reading!



Freshly Squeezed Progress Report: More Docker Work

In my previous report, I talked about how my initial plan to set up Docker containers to help build the Linux-based version of Toy Factory Fixer was a bit unclear. I didn’t know what workflow I was trying to support, which made it difficult to envision how to set up Docker to make it happen.

This week, I continued the work with a clearer vision.

Sprint 58: Ports

Planned and Incomplete:

  • Create Linux port

Despite putting in more hours than expected, I only made a little progress.

My work was split up into the following tasks:

  • T: Create 32-bit docker container using Debian
  • T: Use bind mount to load current project and relevant Projects directories as read-only
  • T: Build custom libraries for 32-bit
  • T: Build custom libraries for 64-bit
  • T: Build game using libraries for 32-bit
  • T: Build game using libraries for 64-bit
  • T: Update CMake files to package up assets correctly
  • T: Write script to coordinate all of the tasks to put together a finished binary with 64-bit and 32-bit support

I only finished the first task, as I spent the week struggling to figure out why Docker’s support for multiple architectures (multi-arch) was not working as advertised.

My host system is using amd64 as the architecture. According to the documentation, I should be able to use a Dockerfile or a docker-compose.yml file, and either way, I should be able to specify the platform as pretty much anything supported by the specific Docker image I’m referencing.

In my case, I am using debian/eol:etch, which has an i386 version out there.

So my docker-compose file looks like:

version: "2.4"
  
networks:
    linuxportbuilder:
        external: false

volumes:
    linuxportbuilder:
        driver: local

services:
    linuxportbuilder64:
        image: gbinfra/linuxportbuilder64
        build:
            context: .
        restart: "no"
        networks:
            - linuxportbuilder
        volumes:
            - linuxportbuilder:/data
        ports:
            - 22

    linuxportbuilder32:
        platform: linux/386
        image: gbinfra/linuxportbuilder32
        build:
            context: .
        restart: "no"
        networks:
            - linuxportbuilder
        volumes:
            - linuxportbuilder:/data
        ports:
            - 22

And my Dockerfile was:

FROM debian/eol:etch

RUN apt-get update && apt-get install -y mawk gcc autoconf git build-essential libgl1-mesa-dev chrpath cmake pkg-config zlib1g-dev libasound2-dev libpulse-dev
 
CMD ["/bin/bash", "-c", "echo The architecture of this container is: `dpkg --print-architecture`"]

As you can see, the service linuxportbuilder32 specifies a platform of linux/386.

But when I run docker-compose up, I see output indicating that it is running amd64.

I tried modifying my Dockerfile’s FROM line to say:

FROM --platform=linux/386 debian/eol:etch

And I still see amd64.

I eventually learned about the buildx plugin for Docker, which uses BuildKit (something I hadn’t looked into yet, but apparently it is a Thing), but the docs say that it comes with Docker Desktop out of the box. Well, I don’t use Docker Desktop, as I just have the docker package that comes with Ubuntu 20.04 LTS’s apt repos. I would expect that things should work, right?

Well, I found I could download buildx and place it in ~/.docker/cli-plugins. And then I needed to use it to create a new builder, then tell Docker I wanted to use that builder instead of the default one.

And I think I needed to turn on experimental features by creating a ~/.docker/config.json file and adding an entry for it.

And now if I use buildx directly, I can spin up an image and a container that thinks it is using a 32-bit architecture.

But when I used docker-compose, it still seemed to use the old builder. Frustrating. And the docs make it sound like it should just work.

I eventually got it to build a 32-bit image, which I can determine from using docker manifest, but I couldn’t run a 32-bit container because I would get: “WARNING: The requested image’s platform (linux/386) does not match the detected host platform (linux/amd64) and no specific platform was requested” and I get this error DESPITE THE FACT THAT I WAS SPECIFYING THE PLATFORM EVERYWHERE!

But I found that despite a certain environment variable value supposedly being a default, I needed to specify two different environment variables:

DOCKER_BUILDKIT=1 COMPOSE_DOCKER_CLI_BUILD=1 docker-compose up

And then I was able to spin up containers in both 64-bit and 32-bit! Success!

That is, until a colleague informed me that docker compose (note the lack of a hyphen) was a thing.

WHAT?!

So I discovered that docker-compose is v1 and old, while docker compose is v2 and the new hotness.

And while I didn’t want to install Docker Desktop for Linux, I did find that docker compose didn’t exist in the default apt repos. So I uninstalled my existing Docker and Docker-related tooling, added Docker’s repos for my Ubuntu system, and installed from those, and now I have the official Docker tooling installed.

And suddenly I no longer needed to set up a new builder or use environment variables. Everything worked as advertised.

And I’m a little annoyed that I spent a good chunk of the week figuring out what was essentially a workaround for using older Docker tools, but I suppose that means now you know in case you need it.

Annoyingly, if instead of dpkg --print-architecture I had used uname -m, I would still see amd64, even though the docs supposedly claim it should show the i386 architecture. People have reported this behavior in posts from a few years ago, so it seems to still be the case today. I am not sure if there is something special happening with Alpine images or if the docs are just wrong, but the result also makes sense, as Docker containers share the host system’s kernel.

Anyway, once I could consistently spin up 32-bit and 64-bit containers, the next step was to get my toolchains and source libraries for SDL2 and such mounted into the containers. It was both easy and hard, because while the mechanism to mount a directory as read-only into the container is straightforward, some of my build scripts assume the directory structure on my host system is in place.

Basically, I needed to change some things to make it more generic so that it works locally on my host system as well as in the containers.

For example, my third-party libraries and my toolchains are located at ~/Projects/Tools, but I didn’t want my container to require my own user account’s name, and it makes no sense to replicate my home directory in the container either.

Currently, my CMake toolchains hardcode the value. Instead, I can specify the environment variable myself when I run cmake:

THIRD_PARTY_LIB_DIR=/home/[my_username]/Projects/Tools cmake ../..

And when I create the docker containers, my new docker-compose.yml file has these entries added to volumes:

        volumes:
            - linuxportbuilder:/data
            - ${THIRD_PARTY_LIB_DIR:?err}:/home/gbgames/Projects/Tools:ro
            - ${THIRD_PARTY_LIB_DIR:?err}/CustomLibs:/home/gbgames/Projects/Tools/CustomLibs:rw

And my Dockerfile adds the gbgames user and ensures that the home directory for that user exists.

So now I need to do:

THIRD_PARTY_LIB_DIR=/home/[my_username]/Projects/Tools docker compose up

And the resulting containers will have my Tools directory mounted read-only. If I don’t specify the environment variable, it fails instead of silently trying to continue to spin things up. I am also pleased that I can make a separate subdirectory read-write, as my current scripts like to build the custom libraries and place them in that directory, which means I won’t need to change my scripts too much.
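That fail-loudly behavior comes from standard shell parameter expansion, which compose files borrow with the ${VAR:?message} syntax. A small demonstration (the directory value is illustrative):

```shell
# ${VAR:?message} substitutes the value when VAR is set and non-empty,
# and aborts with the message on stderr when it is not.
show_dir() { echo "${THIRD_PARTY_LIB_DIR:?THIRD_PARTY_LIB_DIR must be set}"; }

export THIRD_PARTY_LIB_DIR=/tmp/Tools
show_dir                                                    # prints /tmp/Tools
( unset THIRD_PARTY_LIB_DIR; show_dir ) 2>/dev/null || echo "refused to run"
```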

Next up is mounting my current project’s git repo. I think I will similarly want to use an environment variable here to specify the source directory to make things more generic across projects.

But now that I don’t have to fight against tools that aren’t matching the docs, I should have an easier time of it.

Thanks for reading!



Freshly Squeezed Progress Report – Initial Linux Porting Work with Docker

Last week, I previewed the work I was planning, and this past week I planned and worked on my first sprint since Toy Factory Fixer was published in December.

Sprint 57: Ports

Planned and Incomplete:

  • Create Linux port

I usually try to plan my weekly sprints on Sunday, but I was not able to dedicate the time to it, so the plan came together a day late. And between various family commitments earlier in the week and my back giving me problems later in the week, I didn’t get as much time to dedicate to the work as I would have liked.

Now, “Create Linux port” is currently split into six tasks, mostly related to creating scripts. As I mentioned last week, I wanted to replace the manual and cumbersome virtual machines I have used in the past with Docker containers that I expect to be able to automate more easily.

First, I wanted to know how easy it would be to set up a 32-bit Docker image and container.

Two weeks ago, I was trying to figure out how much demand there might be for 32-bit Linux binaries. Most of the player metrics I have access to, such as Steam’s reports, show that 64-bit systems reign supreme, which makes sense. And I remember that various distros were trying to get rid of 32-bit architecture support.

But I also know that they got a lot of pushback, partly because people want to be able to play older games. I also know that people like to breathe new life into older computer hardware by installing Linux-based systems.

So here’s my current thinking: I’m going to create a 32-bit binary option for Linux-based systems.

Why? Well, I’ve done it before. Years ago, I released a game and made sure it worked on both architectures, and it basically amounted to having two VMs, running the same build scripts on each, combining the binaries and libraries, and providing a script that detects the current architecture and runs the appropriate binary. So it doesn’t require much new work, assuming I can replace the VMs with Docker containers.
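That architecture-detecting launcher can be sketched as a small shell function (the binary names are placeholders, not the actual files I ship):

```shell
# Map the machine architecture reported by uname -m to the bundled binary.
pick_binary() {
    case "$1" in
        x86_64)              echo "Game.x86_64" ;;
        i386|i486|i586|i686) echo "Game.x86" ;;
        *)                   echo "unsupported" ;;
    esac
}
# The real launcher would then do something like:
#   exec "./$(pick_binary "$(uname -m)")" "$@"
```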

And I was pleased to find that Debian images exist for both architectures. I have yet to look into whether or not I will run into problems trying to get my 64-bit host to have a 32-bit container running on it, but that will come in due time.

I want to use an older EOL Debian version as my base Docker image because years ago, while trying to ensure a game of mine was compatible with as many systems as possible, I found that Ubuntu was automatically adding stack protection even though I was setting a flag to say I didn’t want it (read Linux Game Development: GLIBC_2.4 Errors Solved to learn more). Stack protection required a newer version of GLIBC, which meant that people running older distros couldn’t play the game. The Debian-based VMs I was using didn’t have that problem. I could argue that people should update their systems, but I could just as easily make my game available to them by not imposing arbitrary requirements.

So far, I have a docker-compose file and a Dockerfile that pull down Debian Etch (the earliest version with 64-bit support) and install a bunch of development tools and needed libraries, and…that’s it. I didn’t get too far. In fact, I spent most of my time getting a refresher on Docker configuration, as it has been some time since I last messed with it.

I realized that I don’t exactly have a workflow I’m aiming towards. I know I want to be able to spin up a container, build custom libraries that reduce the number of dependencies I need to provide, then build my game project based on those custom libraries, and produce a tarball that I can distribute to players.

But tasks I didn’t identify before the sprint started include actually figuring out how those things happen. I have existing scripts that create the custom libraries, create the game binary, and combine the 32-bit and 64-bit results into one package, and so I expect that if I have to do any work on those, it is minimal.

How do I get my scripts into the containers? How do I get my source code into the containers? How do I access the custom libraries, either when they are built and need to be stored, or when the container needs them to do the building?

With the VMs, I think I remember I copied files into a shared location, then extracted them inside the VM, then ran the scripts manually. It was a bit cumbersome and annoying, and I even wrote down instructions so I could reproduce it consistently, with the expectation that I would eventually turn it into an automated script that would actually reproduce the work consistently.

With Docker containers, I could have the source pulled in by using git to grab the current version from source control, but it feels redundant when I already have my current version of the project in the place that I am probably launching the container from anyway. I envision a Jenkins job doing this work, and it likely already pulled the source from source control and doesn’t need to do it a second time.

I could use a bind mount instead. Boom. As soon as the container is up, it has access to my project. I could similarly give it access to my toolchains and the source libraries.

And if I can get the containers access to all of those directories and files, then it should be a simple matter of using my existing scripts to build everything I need.

And then I could always write an actual master script that does it all for me with a single command rather than follow instructions manually.

So my original sprint plan wasn’t getting me where I needed to be, and I was struggling to figure out what exactly my Docker configuration needed to look like. But I have a much clearer idea now, and while my capacity to work on it will always be limited, I should be able to make good progress this coming week.

Thanks for reading!

Want to learn when I release updates to Toytles: Leaf Raking, Toy Factory Fixer, or about future Freshly Squeezed games I am creating? Sign up for the GBGames Curiosities newsletter, and get the 19-page, full color PDF of the Toy Factory Fixer Player’s Guide for free!


Freshly Squeezed Progress Report: Planning Desktop Ports

My last progress report was in December when I announced that my first Freshly Squeezed Entertainment game, Toy Factory Fixer, was published and available for both Android and iOS devices.

I also mused about what I would do next. At the time, I said I was going to create a post-mortem presentation about the Toy Factory Fixer project, create desktop ports for it and Toytles: Leaf Raking, and make plans for my next Freshly Squeezed Entertainment project.

I meant to take a short break, play some games, and be a bit contemplative before getting back into regular development. Instead, it has been months, and I’m only now making such plans.

My Toy Factory Fixer Post-Mortem Presentation

Toy Factory Fixer’s post-mortem was finished in January, but then I worked on creating a presentation as well. Unfortunately, my scheduled date in February for actually giving it was canceled and never rescheduled, so I have spent the last few months updating and improving my presentation on and off. I did a final pass last week and feel good about it, and I plan to record myself giving the presentation and upload the video myself.

I realized that this presentation’s development was holding up a lot of planning, as I didn’t feel good about moving on until it was done. Unfortunately, I wasn’t good about working on it consistently, and without an external deadline, I allowed it to prevent me from thinking about any other work.

Once Toy Factory Fixer was finished and published, I suddenly didn’t have the project plan guiding my efforts anymore. Instead of knowing what I was going to do each day during my designated game development time, I had a vague sense that I would finish the post-mortem presentation, then plan for future efforts. But I didn’t simply repurpose my normal game development time for presentation work, so my day-to-day was less focused and disciplined.

And eventually weeks turned into months. Whoops.

But the presentation is done. I’ve deleted a number of unnecessary slides, corrected typos and errors, and tightened up the pacing and delivery. I think.

I am sure I could continue to improve it, but after some concentrated effort on it last week, I think I can say I am done. Now it is just a matter of making it a priority to record myself giving the presentation.

Desktop Ports

I’ve been thinking on and off about porting plans, but I did not put anything into an actual plan until late last week.

As the weekend was a bit busy, I’ll need to spend my upcoming week on finishing the plans.

Basically, I think the easiest thing for me to do is make a Linux-based desktop port, mainly because Linux is the system I do development on. I already have build scripts that ultimately create a game that should work across any number of Linux-based distros and versions, so I believe it is mostly a matter of making that process easier for me to run.

My old scripts depended on using virtual machines of old 64-bit and 32-bit Debian systems, which required setting up shared directories between the VMs and my main development machine, and then copying all resulting libraries and binaries into a single place to package up.

I could probably bring those VMs up again (although I don’t know where on my hard drive they might be), but I would rather find a way to use containers if I could, as well as automate the scripts to make the process easier for future projects.

While most people have 64-bit systems, and many distros are trying to get rid of their 32-bit architectures, I think enough people might put Linux on their old computers to give them new life that a 32-bit option for a new game would be welcome. It isn’t much more work, and it won’t hurt or slow things down for me, so why not?
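For what it’s worth, producing that 32-bit build can be sketched as adding a compiler flag, assuming multilib support and 32-bit copies of the dependencies are installed (the file and output names here are hypothetical):

```shell
# Hypothetical sketch: build 64-bit and 32-bit binaries from the same
# source. The -m32 build requires multilib (e.g. g++-multilib on
# Debian) plus 32-bit versions of SDL2 and every other dependency.
# Note: sdl2-config typically reports the host's 64-bit paths, so the
# 32-bit link flags may need adjusting.
g++ -m64 -O2 main.cpp -o game.x86_64 $(sdl2-config --cflags --libs)
g++ -m32 -O2 main.cpp -o game.x86 $(sdl2-config --cflags --libs)
```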

The next desktop to port to might be Windows, something else I’ve done in the past. In fact, thanks to Ludum Dare 50, I discovered that my existing build scripts create a Windows port mostly fine, although I needed to tweak them and download updated SDL2 libraries.

I think the main concern I have is that I know Microsoft scares people with warnings if your game isn’t signed. The last time I looked into it, the game has to be signed a particular way with a particular certificate authority, which costs some money. Considering Freshly Squeezed Entertainment games are meant to be given away for free, it means I’ll be spending money for the privilege of distributing them in a way that won’t have Windows telling people to be afraid of the game they just downloaded.

Finally, I will need to look into what it takes to port to Mac. I’m hoping it is fairly straightforward, but I also know that Apple likes to do things very differently. I already have the iOS port, so maybe I’m most of the way there? But I won’t know until I start working on it. As for signing, I am hoping the mechanism is the same as for the iOS port and that I don’t need to do anything special. And I hope that Apple doesn’t deprecate my Mac Mini and require me to buy all new hardware to support their new chips.

If I can, I want to look into creating a web version of my games. I believe using Emscripten and maybe some very minor platform-specific code means I am pretty close.
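As a rough idea of how close, here is a hedged sketch of what an Emscripten build of an SDL2 game can look like, using Emscripten’s built-in SDL2 ports (the source and asset paths are assumptions, not my actual project layout):

```shell
# Hypothetical sketch: compile an SDL2/SDL2_mixer game to WebAssembly
# with Emscripten and bundle the game assets into the page.
emcc src/*.cpp -O2 \
    -s USE_SDL=2 \
    -s USE_SDL_MIXER=2 \
    -s USE_SDL_IMAGE=2 -s SDL2_IMAGE_FORMATS='["png"]' \
    --preload-file assets \
    -o index.html
```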

My Next Freshly Squeezed Entertainment Project

I’ve had a few game ideas rolling around in my notes and in my head, but my efforts during Ludum Dare 50 at the beginning of April resulted in a turn-based game about a city dealing with regular disasters.

Well, the game only had one disaster to speak of because I ran out of time, but my original plan featured multiple. And I think I want to try to turn Disaster City into a Freshly Squeezed Entertainment game now that I don’t have a time crunch to worry about.

The feedback I got during the review period showed me that I had the start of something compelling.

Once I’m done with the desktop port work, I expect to put together some plans for this next game.

Thanks for reading!



LD50: How My Entry Disaster City Did #LDJam #LD50

Late last week was the end of Ludum Dare 50, culminating in Results Day.

After giving everyone a few weeks to play and rate everyone else’s games, the final numbers were calculated, and the ratings announced.

Congratulations to those who ranked at the top in both the Compo and Jam categories! And congratulations to everyone who managed to finish and submit a game in a weekend!

So, how did my entry, Disaster City, do?

There were about 2,900 games entered. Of those, 1,413 were in the 72-hour Jam category, which is the one I ended up entering my game in. I intended to submit it for the 48-hour Compo, but I ran out of time and needed the extra day to finish the game.

Here are my game’s final stats:

Overall: 1137th / 1413 (3.136 average from 35 ratings)
Fun: 987th / 1413 (3.106 average from 35 ratings)
Innovation: 783rd / 1412 (3.212 average from 35 ratings)
Theme: 947th / 1398 (3.424 average from 35 ratings)
Graphics: 1206th / 1279 (2.682 average from 35 ratings)
Audio: 870th / 943 (2.734 average from 34 ratings)
Humor: 602nd / 1147 (3.078 average from 34 ratings)
Mood: 1068th / 1353 (3.109 average from 34 ratings)

So, Disaster City was pretty much at the bottom of the pack, with my worst scores in the Graphics and Audio categories, which…ok, fair. Jam entries are usually made by teams, and I did everything solo, and I am not an artist or great with audio. I do wonder how it would have compared to other Compo games, though. My Overall, Graphics, Audio, and Mood scores were in the bottom quartile, which feels a bit disappointing.

All of my other scores were in the second quartile from the bottom. My best scores came from the Innovation and Humor categories; while they were still in that quartile, they were near the top of it.

LD50: Monster emerging mock-up

I could spend a lot of time analyzing it, but Honest Dan shared a post addressing “Why didn’t my game rate higher?” and other similar questions, which made me realize something: I’ve had a low-priority bucket list item to create a #1 game for Ludum Dare (despite not having participated in years), but that goal is largely out of my hands.

My ratings/rankings don’t matter?

35 people submitted ratings for my game out of thousands of players. That’s not statistically significant. Out of the “winning” entries for LD50, I think I only played one of them. The others were mostly off of my radar. And I imagine it is the same for many participants.

I set a goal to review at least 40 games, as reviewing more games means the Ludum Dare algorithm will show your game to more people who are looking to review games.

One person managed to review 400 games, which is prolific! When I checked their own game, though, I saw that while they had received far more ratings than my 35, it was still only about 100 ratings for their game. So there is a limit to how much the algorithm helps players find your game, I suppose. Also, having 100 ratings means you can be more confident about what people generally thought of your game, but it still isn’t much more statistically significant.

The winners of LD didn’t have more participants rating their game than others. They had a small subset of people rate their game, and those ratings were higher than the subset that rated someone else’s game.

In a way, it’s hard to say how someone’s game actually compared to someone else’s game because there’s some randomness in terms of who played which games and how they felt about them.

And as Honest Dan pointed out, you can’t even really compare your previous LD results to your current results, as it isn’t like you had the same 20 or so people rate your game in each. It’s an apples and oranges comparison. You can’t say whether you improved based on how others rated you in two different LD compos.

So despite leading with the stats up above, I’m trying not to take them terribly seriously. Maybe the general average ratings can give me a sense of what someone’s gut feel for my entry was.

But what I found incredibly valuable was the comments section of my game. Even though people reported a major bug that they exploited, and even though they said the game was too easy, a lot of them also reported that they saw the promise of the game. They liked the turn phases, the general concept, the mechanics and dynamics. They enjoyed playing it.

Which tells me that in a weekend I was able to put together something that other people found entertaining, if flawed. And that’s an accomplishment, something a low-ish rating can’t take away.

Now what?

I would like to make a post-LD version of Disaster City. My task list was incomplete, so what you can play now is a subset of what I envisioned. Adding more disasters and more things for the player to do to mitigate or rebuild after them should make it more compelling. And balancing the game should help, too.

But if I do, I want it to be part of my Freshly Squeezed Entertainment line of games, and I want to be deliberate here. Rather than build onto what I did, I might start over without the time crunch of a weekend deadline.

More on those plans later.

Ludum Dare 51 is scheduled for September 30th, and I would love to participate in it. I’ve marked it on my calendar, and I’ll need to work out logistics with my wife in terms of how to ensure I can focus on the compo that weekend. She arranged multiple outings with the kids and babysitters so I could fully participate in LD50, and I am incredibly grateful.

More immediately, this LD got me focused on putting together desktop ports of my entry so that people could more easily play them, which is something I’ve been meaning to work on for my existing games. So I’m going to tweak some build scripts, create some desktop ports of Toy Factory Fixer and Toytles: Leaf Raking, and then look into how easy it would be to create a web build using Emscripten.

Thanks for reading, and thanks for a great weekend and past few weeks!


LD50: My Favorite Entries So Far #LDJam #LD50

Ludum Dare 50 is almost over, with the review/ratings period and the new Extra option continuing until April 21st.

In the roughly two weeks after the Compo and Jam deadlines ended, entrants are expected to play and rate other games, leave feedback, and otherwise enjoy the number of new games created.

An entry needs 20 ratings to receive a score, and I am happy to say that so far my entry Disaster City averages 27.875 ratings per category: I have 28 ratings in each category except Audio, which only has 27. Weird. Also, someone live streamed playing it, so that was exciting to see!

I’ve played and rated only 23 games as of writing this post, and my goal is to try to play and rate at least a total of 40 entries before the 21st.

In the meantime, I want to highlight a few of my favorites so far.

Compo Entry: Ready, uNsTeAdY, FIRE!

The genre is familiar, the production values look and sound simple, and yet the mechanics are compelling. You have to move your ship to prevent your gun from firing too early, you have to move to avoid crashing into the Space Beasts, and if you fire your laser but don’t manage to kill any of the Space Beasts, the power of the laser overloads your ship and you die. The fact that some of the Space Beasts can only be killed by a much more charged-up laser blast means that you spend a lot of time watching your meter while moving, trying to keep the laser charging and avoid enemies at the same time.

I know it was the creator’s voice the entire time, but I found the music catchy (it was stuck in my head while I was washing dishes) and the sound effects effective as well as hilarious. The highest laser beam charge is incredibly satisfying to fire off, too.

Extra Entry: Princess: Unwed

You play a medieval princess trying to use her carrier pigeon to send and receive messages across the land, getting the dirt on potential suitors, all in the name of convincing your mother to call off any arranged marriages.

It’s a mystery game, in which you try to gather clues to piece together where the suitor might be, what they might be up to, and what kind of person they are.

And there’s a pigeon. What’s not to love?

Jam Entry: 22h38

A mystery adventure with a very intimidating end of the world. The art and audio work well together, and the concept kind of reminded me of a few time-based sci-fi stories I’ve read or watched.

Jam Entry: Kiwis Can’t Fly

This game is emotional, both thanks to the excellent audio design and to the writing. The art is adorable, and as some of the reviewers have mentioned, it is reminiscent of Orisinal. It feels like playing a musical instrument, and it brought a smile to my face. The ending is both sad and beautiful. I loved the entire experience.

It’s been years since I last participated, but I am once again loving the variety of concepts that people have implemented in a short period of time. Good work, everyone!

What have been your favorite entries so far?


LD50: Disaster City Ports and a Time Lapse #LDJam #LD50

Disaster City, my Ludum Dare 50 entry in which you must try to R&D your way to averting a seemingly inevitable disaster while dealing with other disasters at the same time, was originally released with a Linux-based build.

I have since ported it to Android in the form of an APK you can sideload.

And now I have a Windows version of the game!

You can find my LD 50 entry page and the download links at: https://ldjam.com/events/ludum-dare/50/disaster-city

LD50: Disaster City entry

Thanks for playing!

In the meantime, here’s a time lapse of my desktop during the 48 hours of the compo, plus a few more hours as I tried to see how far I could get even when the deadline passed. I didn’t record the next day when I finally got the entry in by the Jam deadline, though.


LD50: Introducing Disaster City, Submitted as a Jam Entry #LDJam #LD50

I squeezed in a few more hours of development today, and in between picking up kids from school, meetings, dinner, play time with the kids, and getting them ready for bed, I managed to get my game to a somewhat finished state: enough of a game that I feel comfortable submitting it.

And then of course I discovered a number of game-breaking bugs with my submission that I managed to fix shortly after.

But I did it. You can learn more about Disaster City at the official Ludum Dare 50 entry page.

LD50: Disaster City entry

I’m going to try to create an Android port next, but for now, I need to rest and prepare for the rest of the week.

Happy 20th birthday, Ludum Dare! I’ve missed this. B-)