Categories
Games, Geek / Technical, Marketing/Business

How Sony Killed Its Brand

I found a video on YouTube called PS3 Song that sets footage to the song “How to Kill a Brand” by Doc Adams. It is actually a good look at how Sony’s marketing hype machine (“4D Graphics”? WTF?) and uninteresting game lineup failed to dazzle gamers or to prevent Microsoft and Nintendo from basically laughing all the way to the bank.

Even if you don’t think that there is anything wrong with Sony, the song is a good parody.

Categories
Games, Geek / Technical, General

If Old Games Were Made Today…

Plenty of people would argue that today’s games are influenced by yesterday’s games. For example, jumping puzzles are not as common as they once were because game developers have learned that jumping puzzles generally suck, something we wouldn’t know if game after game hadn’t used such puzzles as filler. Likewise, using the WASD keys to control the game is so pervasive that no one even thinks twice about putting it in a game such as Harry Potter and the Chamber of Secrets, a game that is not targeted at hardcore audiences, the only players who would expect to use WASD.

But what would happen if old games were made for the first time today? Would they be the same games, or would “conventional wisdom” dictate changes? Below are a few guesses:

  • Donkey Kong, Mario Bros, and Berzerk wouldn’t be considered casual enough. Significant changes would need to take place to make them more acceptable to mainstream audiences. Donkey Kong would need to throw different colored barrels that Jumpman would need to collect and match. Player clicks would dictate where Mario and Luigi should go and in what order to clean the pipes. Robots in Berzerk would be changed to colorful bugs, and the player would control the bouncing smiley face to try to save the flowers that for some reason are growing everywhere.
  • Pac-Man, Space Invaders, and Asteroids would provide medals with different shapes, names, and colors. Collect all of the medals, and show off to your friends!
  • Centipede would be made into an RTS based on insects. There would be three factions, each with different abilities. Koreans would watch people play this game in stadiums and on television.
  • Tetris would feature pop music and psychedelic colors flashing to a beat.
  • Defender would be panned as too simplistic as the enemies don’t shoot nearly often enough to provide a real challenge.
  • SimCity would be considered too free form. There should be specific goals, such as destroying as many buildings as possible in three minutes or tearing up the streets to prevent the SimCitizens from getting to work on time. Also, you would need to match three Residential Zones to get the condos, not just two. Eventually politicians would blast it for providing training to terrorists since they could set the city on fire or cause an earthquake on command.
  • Jack Thompson would point to Custer’s Revenge as typical of the sex-and-violence training simulators being marketed to children and would take it upon himself to “shut down Mystique.” Sales of the game would skyrocket due to the publicity.
  • E.T. would have multiplayer modes featuring kids flying through moonlit skies and saving dying flowers. Co-op mode would feature multiple phone components strewn throughout the world. Naturally, it would be a prime candidate for in-game advertising, specifically for The Hershey Company’s Reese’s Pieces. E.T. would still be considered the worst video game ever, and I would probably still be the only person who liked it.

Any other guesses?

Categories
Games, Geek / Technical, General

Every NES Game on eBay

A friend of mine sent me this eBay auction for every NES game ever made, including the NES system.

Besides the 670 officially licensed NES games and the NES system, you can also win the unlicensed games, R.O.B. the robot, the Power Glove, the Power Pad, and a number of other accessories.

Normally I wouldn’t give much thought to an NES auction, but as of this writing the bidding is at $4,550.00. Eep!

I already have the two Wizardry and Zelda games listed, and so I’m good for now; however, if someone in Chicago wins this auction, I would have no problem hanging out to play any number of the other games.

Categories
Game Design, Game Development, Games, Geek / Technical, Marketing/Business

Measuring What Players Find Most Rewarding in Games

One of the problems game developers have is figuring out what players want. There are various papers, arguments, and forum threads on what constitutes fun and how to engineer it. Entire books could be dedicated to the question of what players like about video games. If we can find out what they like, we can make more of it.

The PENS model suggested in the article Rethinking Carrots: A New Method For Measuring What Players Find Most Rewarding and Motivating About Your Game seems to be a statistically significant predictor of player enjoyment. That is, someone has come up with a model that does a remarkably good job of predicting what a player will enjoy about playing video games.

The article is eight pages long and goes into some detail, but the Player Experience of Need Satisfaction model breaks everything down to three psychological needs:

  • competence
  • autonomy
  • relatedness

Competence suggests that players enjoy activities in which they can feel effective. Getting to the next level, finding the next item, and surviving the next zombie all allow the player to overcome challenges, and the player enjoys becoming better at these activities.

Autonomy simply means that the player feels he/she has a choice. A game that allows the player to choose his/her way through will be more enjoyable than one that feels like it is on rails.

Relatedness is about the fact that video game players are social animals. The article suggests this part of the model has only recently become relevant to the mainstream player, but I think that MUDs, BBSes, and various multiplayer video games have existed for a long time, and I am sure relatedness applied there as well.

What’s interesting about this PENS model is that it seems to be much more accurate at predicting the success and popularity of a game than trying to measure “fun” in other ways. One of the more interesting quotes:

Describing the player experience in terms of genuine need satisfaction, rather than simply as “fun,” gives the industry the deeper language it deserves for communicating what makes games so powerfully unique. It allows us to speak meaningfully about the value games have beyond leisure and diversion, diffuses the political bias against games as empty experiences, and provides an important new lexicon in the Serious Games arena where, as the name implies, fun is not always the primary goal. When we speak of games in terms of their satisfaction of competence, autonomy, and relatedness, we respect that this is both what makes them fun and also what can make them so much more.

Some new words to make it easier to talk about video games? I’ll take them.

Categories
Game Development, Games, Geek / Technical, General, Marketing/Business

Miss Out on GDC Again?

Last year, I had to enjoy the Game Developers Conference remotely, reading the coverage of the event by other indie bloggers.

Yesterday was the last day that I could get a discount on registration for the March event. Even discounted, the prices are a bit steep, and I really would have liked to attend the keynotes and the tutorials. Since I am not a VIP or “giga” in any way, I had to settle for the Indie Games Summit pass. I could manage that price, even without the discount.

Before I registered, I checked to see how much airfare and a hotel room would cost for the week. Ouch.

I don’t want to miss out on GDC again, so I am trying to figure out how I can afford to go. I think if I can find someone to share a hotel room with, it won’t be so bad. Failing that, perhaps I won’t stay for the entire week. Maybe I’ll just stay for a few days, timing it so that I can attend the Independent Games Festival.

No matter what, I don’t want to miss GDC again.

EDIT: OK, apparently the pass I was going to purchase is now sold out, which means that the only way I could go is with the Expo pass. I don’t think it will be worth the cost of a hotel and the hassle involved if I can’t even go to the Indie Games Summit.

Categories
Geek / Technical, General

POTM for January: GnuCash

This month’s Project of the Month is GnuCash, the personal and small-business financial-accounting software.

I use GnuCash to keep track of my finances. Even before I started my business, GnuCash helped me organize my income, expenses, and bills. I have stopped using my checkbook’s register just because it is easier to use GnuCash to update my savings and checking accounts.

What appealed to me was the familiarity I already had with this system. I took a couple of years of accounting in high school (thanks, Mr. Mullin!), so I was quite familiar with ledgers, journals, debits, and credits. Don’t think that you need to know accounting to use GnuCash; it’s just that knowledge of the principles of accounting helps, regardless of the application or tools you use to balance your books. I haven’t used anything like Quicken or Money, so I can’t comment on them. I do know that when trying to set up either of those programs on someone else’s machine, I had a tough time figuring out how to enter transactions. It just wasn’t worth the effort to figure out how those applications tried to make things “easier” for me.

By the way, if you already use Quicken or Money, GnuCash can import your data, so if accounting software is the only thing preventing you from moving to a different operating system, you don’t have to feel that your data is stuck.

GnuCash provides a way to see your financial data in customizable reports, complete with graphs and charts. You can use multiple currencies, track your stock portfolio, manage your small business, reconcile your statements, and schedule transactions. I personally haven’t used all of these features, but I am starting to do more and more. I recently decided to transfer a set amount each month to my ING Direct account, and instead of requiring me to remember to manually enter the amount each month, GnuCash does it for me.

The only feature I am waiting for is the ability to close the books. Supposedly, the code for this feature is actually written, but it is disabled because it has not been tested thoroughly enough. For now, each December 31st, I manually transfer all of my expenses and revenues into a temporary account and then transfer the balance to my equity account.

To donate to this project, please visit http://sourceforge.net/donate/index.php?group_id=192 or learn about other ways to help.

Categories
Geek / Technical, General

Project of the Month

Larry Garfield has announced the Open Source Project of the Month.

While there are some major, ubiquitous pieces of open source software such as the Linux kernel, OpenOffice.org, Mozilla, and Apache, there are plenty of great open source projects out there that don’t get funding from companies such as Red Hat, IBM, or hardware manufacturers.

While many would argue that fame is more important than money to open source volunteers, I’ve yet to meet one that didn’t like money as well as fame. Really, who wouldn’t? The goal of Project of the Month is to provide a little of each to open source developers, whether they’re already getting revenue from their work or not. The vast majority of open source code is also free-as-in-beer, and while I won’t say that anything is “owed” to those developers (they do, after all, release their code free-as-in-beer by choice), it’s still polite to acknowledge their work.

POTM has two steps each month:

  • Donate $25 USD to an open source project of your choice.
  • Blog about the project.

The idea is to promote and show appreciation for the lesser-known open source projects out there. For more details on how to participate, check Larry’s POTM blog post.

Categories
Geek / Technical

MythTV and Me

I finished writing an article about My MythTV Install Experience.

MythTV is an open source project that lets you use your PC as a custom-made digital video recorder. It has even more functionality than anything the media companies would allow in a commercial DVR, and it is actively developed.

Unfortunately, it is also still maturing, which means that quite a few nights were spent trying to get it to work. Then again, MythTV is only at v0.20, so I suppose expecting it to “just work” was too much to ask.

If you want to make your own MythTV box, I would strongly suggest researching your motherboard and your capture cards. I had the most trouble with the nForce 410 chipset, which prevented me from using the onboard audio. I also had trouble with two non-hardware-encoding capture cards, but once I replaced them with well-supported hardware encoders, my problems seemed to go away.

It took way more time than I would have liked, but it is pretty much finished now…just in time for Battlestar Galactica and other shows to start.

Categories
Game Development, Games, Geek / Technical, Linux Game Development, Politics/Government

Second Life Client Source Code Released as Open Source

I found this bit of news on LinuxGames.com. Second Life, the virtual world created by its players, has had its client code open sourced. Linden Lab released the client code under the GPL.

I’m excited by this news because it means that progress on the GNU/Linux client might actually move forward, putting it on par with the Windows and Mac clients.

Categories
Geek / Technical

Interesting Results from Testing Random Number Generation Assertions

Taking Ken’s advice from my previous post, I decided to write a small test program to verify that the RNG method touted as the best really was the best.

To recap, there are three methods.

The first method uses the modulus operator:

int pickedNum = rand() % range;

The better method uses some trickery to get past the low-order-bits problem:

int pickedNum = rand() / (RAND_MAX / range + 1);

And the best method uses a loop to discard values that are not within the range:

int randDivisor = RAND_MAX / MAX_RANGE;
unsigned int numPicked;
do
{
    numPicked = rand() / randDivisor;
}
while (numPicked >= MAX_RANGE);

I wrote up some code to seed the C++ random number generator, run through a loop of a certain sample size for each method, and then output the results. I was expecting the best method described in the article to show a better, more even distribution. My test checked the distribution over a range of numbers: I kept a vector that stored the number of times each value within that range was picked. That is, if the RNG method output a value of 3, I would increment the count stored at index 3 of the vector.

I also stored the start time and end time for each loop. Besides accuracy, I think speed would be important for game developers.
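
To make the setup concrete, here is a minimal sketch of the harness for one method. This is illustrative only; the names and structure are mine from memory rather than the exact test code, and the real program repeats the timed loop once per method, reusing the same seed each time.

#include <cstdlib>
#include <ctime>
#include <iostream>
#include <vector>

int main()
{
    const int RANGE = 10000;         // how many possible values
    const long SAMPLES = 100000000;  // how many numbers to pick

    // Create the seed once so each method's loop can reuse the same value.
    unsigned int seed = static_cast<unsigned int>(std::time(NULL));
    std::srand(seed);

    // tally[n] counts how many times the value n was picked.
    std::vector<long> tally(RANGE, 0);

    std::time_t start = std::time(NULL);
    for (long i = 0; i < SAMPLES; ++i)
    {
        int numPicked = std::rand() % RANGE;  // swap in each method here
        ++tally[numPicked];
    }
    std::time_t end = std::time(NULL);

    std::cout << "Time to finish first method: " << (end - start) << std::endl;

    return 0;
}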

I ran it with different sample sizes and different seeds, and each time I came to the same conclusion.

The supposedly “best” method was actually slightly worse than the “better” method.

How did I define what was good and what was bad? I already think that the method described in the article was simpler to write and understand, which means that it wins when it comes to readability and maintainability. Of course, this code will likely be written once and left in some utility class, and the other methods are fairly easy to understand, so we’ll look at other criteria.

How flat is the distribution? If an RNG picks one number much more often than the rest, then it wouldn’t be very good. We would ideally like something that is truly random, so the method that gives us the distribution closest to random would be best. Now, I am no expert on statistics, but my check was to subtract the minimum number of times a value was picked from the maximum number of times a value was picked. For an ideal distribution, the answer would be 0. If one distribution’s min-to-max difference is closer to 0 than another’s, then we can probably assume that the method used to create the first distribution is closer to truly random.
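
In code, that check is just a scan over the tally vector from the sketch above (again, an illustration rather than the exact test code):

// Given the tally vector from the harness above, return the
// difference between the most-picked and least-picked counts.
// A perfectly flat distribution would return 0.
long spread(const std::vector<long>& tally)
{
    long minCount = tally[0];
    long maxCount = tally[0];
    for (std::vector<long>::size_type i = 1; i < tally.size(); ++i)
    {
        if (tally[i] < minCount) minCount = tally[i];
        if (tally[i] > maxCount) maxCount = tally[i];
    }
    return maxCount - minCount;
}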

How long does it take to compute a result? If a method produces superior results but takes multiple seconds to calculate each time, it won’t really be useful in a video game which generally tries to calculate results within milliseconds.

So those were my two criteria. If you are very familiar with statistics, perhaps you have already seen a flaw in my test. Please let me know, because I would rather learn the truth than merely believe I am correct.

Here are the results when run over a range of 10000 numbers and a sample size of 100000000, using a seed of time(NULL) (the same value was used for each method, so the seed value was created at the start of the program):


Time to finish first method: 9
Time to finish better method: 8
Time to finish best method: 12


Diff between min and max values:
First: 10398 - 9641 = 757
Better: 10414 - 9626 = 788
Best: 10416 - 9621 = 795

I ran it multiple times, and the Best method seemed to always have a large difference between the minimum and maximum number of times a value was picked. It also consistently ran 2 or 3 seconds slower than the other methods.

And using a seed of 0:


Time to finish first method: 9
Time to finish better method: 8
Time to finish best method: 11


Diff between min and max values:
First: 10445 - 9629 = 816
Better: 10409 - 9586 = 823
Best: 10395 - 9621 = 774

I ran the test using a much smaller range of values. Instead of 10,000, I used 10.


Time to finish first method: 9
Time to finish better method: 8
Time to finish best method: 10


Diff between min and max values:
First: 10006453 - 9992621 = 13832
Better: 10001752 - 9998124 = 3628
Best: 10001752 - 9998124 = 3628

Over a smaller range of values, it seems that the Best and Better methods have the same difference between the minimum and maximum number of times a value was picked. Best was still consistently slower, though.

So what do these results mean?

The article puts it this way:

The trick of dividing by RAND_MAX uses the high order bits of a random number produced by rand(), thus alleviating the old problem of poor rand() implementations with non-random low order bits. But as I said, these days that problem has been largely fixed by library vendors, so the only thing you have managed to do is make the code more complicated. It buys you nothing, and before you start thinking that doing the same thing with floating-point will help, I assure you that it will not.

Well, according to my tests the extra complication actually costs time, but the article does mention using a uniform deviate computed from the random number. I then added that version to my test and recomputed the results.
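
If I understood the suggestion correctly, the uniform deviate version scales rand() into [0, 1) with floating-point math and then multiplies by the range. My version was something along these lines (illustrative; the article’s exact expression may differ):

// Sketch of the uniform deviate method: scale rand() into [0, 1),
// then multiply by the range. Dividing by RAND_MAX + 1.0 keeps the
// result strictly less than range.
int pickDeviate(int range)
{
    double deviate = std::rand() / (RAND_MAX + 1.0);
    return static_cast<int>(deviate * range);
}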

For a small range, I found that it ran at the same speed as the first two methods while also providing the same difference between minimum and maximum number of times a value is picked as the Better or Best method.

For a very large range (100000 values), however, I got the following results:

Time to finish first method: 11
Time to finish better method: 11
Time to finish best method: 13
Time to finish deviate method: 11


Diff between min and max values:
First: 1178 - 860 = 318
Better: 1140 - 238 = 902
Best: 1135 - 861 = 274
Deviate: 1142 - 865 = 277

As you can see, the uniform deviate method ran at about the same speed as the non-looping methods. While only slightly worse than the Best method at flattening the distribution, it blew away the Better method.

However, I would also like to note that the original method, which was supposedly so bad, seems to have a fairly flat distribution, too. Of course, once you get to a smaller range of values, it did much, much worse. When checking a range of only 10 values, the first method came back with over 3 times the difference of the other methods, and the deviate method worked about as well as the Best method, coming back with the same flatness in distribution.

Apparently, which method you should use depends on what range of values you are checking. If you are checking a small range of values, then you would do well NOT to use the First method. If you are checking a large range of values, it doesn’t seem to matter as much. In any case, the “Best” method does not live up to its name, though the uniform deviate version does fix its speed problem.

Potential Problems with These Tests:
Subtracting the minimum number of times a value is picked from the maximum number of times a value is picked might not really give me a meaningful value like I think it does. If you can think of a better way to handle this test, please feel free to correct me.

If you would like the C++ source, you can download it:
randomtest.tar.gz
randomtest.zip