Categories
Geek / Technical

MAKE: 10 Things to Do With Old PCs

People have been fairly excited about the new Make magazine. From the page:

The first magazine devoted to digital projects, hardware hacks, and D.I.Y. inspiration.

They actually have a blog where they post interesting projects. Today I found an entry on 10 Things to Do With Old PCs. I was amused that out of all of the cool things they mentioned, “Install Linux” was simply one option and not a prerequisite for most of them. B-)

I’ve been using a home network for some time and it was a great learning experience. As a gamer, LAN parties are a semi-normal part of life, and setting up LANs is second-nature now. My main system is my multimedia center, and I am interested in making a DVR. I run my own webserver and file server. I even make automated backups across the network.

Sure, none of it “requires” you to use Gnu/Linux. Windows is currently much preferred for LAN parties due to the number of games it has. Still, what’s the point of tinkering with an old box if you aren’t trying to learn something? And isn’t it easier to learn how an engine works when it is visible? When you can take it apart and put it back together again? One of the big things that appealed to me about Gnu/Linux was that it was hackable, meaning that I could mess with it all I wanted without worrying about some violation of a draconian EULA.

Make obviously needs to cater to as many customers as it can, and most people run Windows in some form. Still, I can’t help wondering how much easier it would be to set up a central repository for files or create a personal website for your LAN when running Gnu/Linux. It would also be cheaper, since you would technically have to buy a license for Windows in order to run it on the new member of your computer family. You’ll be hard pressed to find a copy of an older version of Windows for sale, and a current version likely won’t run on an older machine, so be prepared for potentially expensive hardware upgrades just to meet minimum specs. Some Linux-based distros, on the other hand, can run on machines that gave Windows 98 problems, and most distros will run on the low specs described in the article. Since you don’t need a per-machine license, you can even create your own Beowulf cluster if you have multiple old machines.

Categories
Game Development Games Geek / Technical

Out of Touch with Games

I’ve been meaning to post about how I am out of the loop when it comes to mainstream games these days. Wario Ware? I still haven’t seen it, but apparently it is huge. I didn’t see the Nintendo DS in action until a few weeks ago. In fact, I haven’t seen the PSP. Lumines? Wipeout? I haven’t seen the Game Boy Advance SP except in other people’s hands.

I wanted to check out Prince of Persia: Sands of Time a few months ago, but it was sold out. I ended up getting Metroid Prime, which is definitely a fun game, but my keyboard/mouse hands can’t remember how I used to play Goldeneye 64 with a passion. I bought Civilization III months ago because it was in impulse-buy range (under $15). I still haven’t pulled the cellophane off of it. I’m only now considering purchasing Unreal Tournament 2004, which got to the top of my friends’ and my want lists because it was available for Linux out of the box (thank you, Epic! I’ll thank you with my dollars soon).

In the meantime, there has been a new Metroid game, a new SimCity (how did THAT escape my attention?!), and an expansion pack for Doom 3. Nintendogs was out. Nintendo was announcing a new game console. All of these things were news to me way after they should have been.

I used to be on top of Nintendo-related news since I had a subscription to Nintendo Power. Last year I realized that I wasn’t playing Nintendo games as much so I gave it up, but now I find myself surprised that other people know about things before I do. I used to be the GOTO guy for Nintendo trivia and knowledge!

And regarding PC games: I just don’t have the time to play all the games I currently have, let alone buy new ones. I loved Homeworld Cataclysm, but I wasn’t going to buy Homeworld 2 since the demo didn’t impress me that much. Dungeon Siege? Sounded cool, but I wasn’t really compelled to play it. Black & White’s expansion didn’t end up in my collection even though I loved the original.

Am I really capable of making good games if I don’t even play many of the existing games? You can’t become a good writer without reading a lot of books, and I think you can’t make good games without playing a lot of games. The good games will inspire, and the bad ones will act as signposts, warning the intrepid developer about what not to do and where not to go.

Clearly, I have some improving to do.

Categories
Geek / Technical

Stupid Hypotheticals While in Line for Star Wars

While I was waiting in line for the new Star Wars movie last night, I posed this joke of a question: who would win in a fight, Voldemort or Vader?

Funny, right?

Well, my girlfriend and our friend started discussing it. Seriously discussing it. They concluded that Voldemort would win since he only had to say something to kill Vader. I protested that Vader would do his patented choke-from-across-the-room Force move, rendering Voldemort speechless.

How did I get sucked into arguing about this?! It was a joke! I was poking fun at the die-hard fans who would ask such questions! After a few calls, we had three people for Voldemort and two for Vader. Whatever.

So then I said, “Dumbledore or Yoda?” That one I left alone, and I don’t know if they concluded anything. But “Luke Skywalker and Harry Potter? No, they’d fight on the same side!”

Oh, and the movie was great.

Categories
Geek / Technical Linux Game Development

Wine Talk

This Thursday I am giving a talk on using Wine to run Windows applications on Linux. Wine stands for Wine Is Not an Emulator. It is an implementation of the Win32 API, which means applications run at essentially native speed, rather than paying the overhead of an actual emulator or a virtual machine like VMware.

That is, if they run properly. Wine is still not at version 1.0, which means that nothing is expected to work. Still, a great many apps do work, as can be seen in the Application Database.

My first test was to install Starcraft. I found a few how-tos, but they aren’t terribly consistent, and it seems that the authoritative document is years old. Fortunately it wasn’t hard to install and run. The only problem I’ve found is that Battle.net isn’t usable: I get a black screen, and the mouse cursor disappears, so I can’t join games. Cedega is a fork of the Wine code that was made into a closed source app; it was previously called WineX. Its database lists Starcraft as highly rated and doesn’t mention the Battle.net problem. Still, I am not ready to shell out $5/month just to have Windows games I’ve already paid for running on my Gnu/Linux system, especially when some months I don’t get to play said games, and especially when there are plenty of native apps already. Also, it seems that newer versions of Wine tend to break Battle.net support whenever it does work. I’ll wait.
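For the curious, the install boiled down to a couple of commands roughly like these (a sketch from memory; the CD mount point and install path are assumptions, so adjust them for your system). The first line runs the installer from the CD, and the second launches the installed game:

$ wine /cdrom/install.exe
$ wine "C:\Program Files\Starcraft\StarCraft.exe"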

My next test was Wizardry 8. I love the Wizardry series, and I feel bad that I haven’t played 8 very much. It works perfectly according to the Wine application database, so I set out to install it. I had forgotten how long it took to install this game! This is one of those 3-CD games that take up a lot of space on the hard drive. At 50% I had already added 1GB to my home directory (well, including Starcraft and the rest of Wine, but still).

Unfortunately, when I tried to run it, Wizardry 8 thought I was running a debugger and wanted me to stop doing so. I ended up having to find a no-CD crack in order to play this game, which goes to show you how annoying copy protection can be for legitimate owners of games. In the end, I managed to get it to play, and aside from a few sound crackles during the opening video, it ran flawlessly. I am very pleased, but I wish I didn’t have to go hunting for a crack just to play a game I own. Still, it shows a legitimate use of what would otherwise be considered illegal under the DMCA.

And Total Annihilation’s setup program wouldn’t even run on the version of Wine that I was using (20050310).

Wine just isn’t ready yet, but then no one pretends it is. Still, I have the ability to play one game as it was meant to be played, and another one works fine in single-player mode. Total Annihilation was possible at one time, so I can look forward to it in the future, I’m sure. Not bad for an incomplete project.

I’ve updated the Wine-Wiki for Wizardry 8, Starcraft, and Total Annihilation. Someone else already added their own comments about Starcraft. Wikis rule.

Categories
Geek / Technical General

DePaul Linux Community Install Fest

May 14th, this Saturday, will be the DePaul Linux Community Install Fest. It’s open to everyone, so if you’re in the Chicagoland area, check out the information on DLC’s website. Here’s a handy map to the event.

Categories
Geek / Technical

Snapback2 How-To

Not too long ago I created my own automated backup script. Shortly afterwards, helpful people sent me links to other, more robust scripts that have been written. One of those was called Snapback2.

Snapback2 is a backup script based on rsync and hard-links. I could explain what that means, but why reinvent an already well invented wheel? Again?

My original script was alright, but it worked by making an exact copy of everything each time it ran. For my 4 GB home directory, backing it up weekly over the course of a month would result in a backup directory that is 16-20 GB in size! That’s a lot of wasted space, especially when some files don’t change at all.

Snapback2 uses hard links and only stores changes from one backup to the next, which means that if I only changed files that were 30MB in size, then the next backup only takes up another 30MB of space. If no changes were made, then no space is wasted at all. Clearly this method is superior to what I had written.
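To give a rough idea of the underlying trick (this is the general rsync hard-link technique, not Snapback2’s actual code), a command like the following creates a new snapshot in which every file unchanged since the previous snapshot is just a hard link to the old copy, so only changed files take up new space:

$ rsync -a --link-dest=/backups/week1 /home/gberardi/ /backups/week2/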

Setting up Snapback2 is supposed to be very simple, but I found that the documentation assumes you know what you’re doing. The following is my Snapback2 How-To:

You can download Snapback2 from http://search.cpan.org/~mikeh/Snapback2-0.5/, which is the latest version as of this writing. Technically you should be able to use Perl to download it from CPAN, but I didn’t. Most of the prerequisites should be on your Linux-based system already. According to the documentation, you’ll need:

Gnu toolset, including cp, rm, and mv
rsync 2.5.7 or higher
ssh
Perl 5.8 or higher
Perl module Config::ApacheFormat

On my Debian Sarge system, I have rsync 2.6.4, so your distribution will likely have at least 2.5.7. Similarly, I have Perl 5.8.4. The one thing that you need to do is get and install Config::ApacheFormat. To do so, make sure you have root privileges and run:

# perl -MCPAN -e 'install Config::ApacheFormat'

If it is the first time you’ve used CPAN through Perl, you will be prompted to configure it. If you aren’t sure, you can simply cancel the configuration step and it apparently grabs some defaults just fine. Any and all dependencies will also be installed.

Once you have all of the prerequisites, you can install Snapback2. Again, you could probably do the same thing above to grab it from CPAN, and it will probably grab Config::ApacheFormat for you, but as I didn’t do that, I won’t cover it here.

If you grabbed the tar.gz file from the link I provided above, you should run the following:

# tar xzf Snapback2-0.5.tar.gz

It will create a directory called Snapback2-0.5. The README tells you what to do, but for completeness, here are the next steps:

# cd Snapback2-0.5
# perl Makefile.PL
# make
# make test
# make install

Snapback2 should now be installed on your system. If it isn’t, you should double-check that you have all the prerequisites. The fourth line in the previous list runs tests before installing. If something failed, you should know why from the test results. Even if you did install it successfully, it isn’t going to do anything yet. You now need to make a configuration file.

You can read the documentation on setting up the configuration file in the man page for snapback2, but you can also view it online.

Here is my file, snapback.conf:

Hourlies 4
Dailies 7
Weeklies 4
Monthlies 12
AutoTime Yes

AdminEmail gberardi

LogFile /var/log/snapback.log
ChargeFile /var/log/snapback.charges

Exclude core.*

SnapbackRoot /etc/snapback

DestinationList /home/gberardi/LauraGB

<Backup 192.168.2.41>
Directory /home
</Backup>

I didn’t change it much from what was in Snapback2-0.5/examples. I installed Snapback2 on the machine called MariaGB. MariaGB will connect to 192.168.2.41, which is the IP address of my main machine, LauraGB. This is why the DestinationList refers to LauraGB. If I wanted to back up another system, say BobGB, I would keep those backups separate in their own directory called BobGB. Normally, the ssh/rsync connection would ask for a password, but since I am setting up the backups to run automatically, it won’t be useful if I need to be present to log in. You can do the following to create a secure public/private key pair:

$ ssh-keygen -t rsa

The above line will create keys based on RSA encryption, although you could alternatively use DSA. You will be prompted for a passphrase, which is optional. Still, a good passphrase is much better than no phrase at all. Using the defaults, you should now have two files in your .ssh directory: id_rsa and id_rsa.pub. The first one is your private key. DO NOT give it to anyone. The second one is your public key, which you can give to anyone. When setting up key-based authentication, you append the contents of this file to the server’s .ssh/authorized_keys file. The next time you log in, instead of being prompted for a password, you will find yourself at a prompt, ready to work. For more detailed information, read this document about using key-based authentication with SSH.
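For example, something like the following appends your public key to the other machine’s authorized_keys (the username here is just a placeholder, and as noted in the caveats further down, the key that matters is the one belonging to the account that actually runs snapback2):

$ cat ~/.ssh/id_rsa.pub | ssh gberardi@192.168.2.41 'cat >> ~/.ssh/authorized_keys'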

So now, if I run the following command on MariaGB:

# snapback2

It will back up any changes from LauraGB’s /home directory to MariaGB. If, however, it hasn’t been an hour since the last backup, it won’t do anything.

Still, manually running this command isn’t very useful, and while I could install a cron job to run snapback2, I will instead make sure that snapback_loop is running. It acts as a daemon, checking to see if a file gets created in /tmp/backups. Now I can create the following entry in my crontab:

# Create file for snapback_loop to run
0,30 * * * * touch /tmp/backups/snapback

So now, every 30 minutes, I create the file /tmp/backups/snapback, which snapback_loop will take as its cue to delete that file and run snapback2. Then snapback2 will make a backup if there has been enough time since the last backup was made.

Now, I have automated backups that run regularly. Some caveats:

  • Verify that snapback2 is in /usr/local/bin. On my system, snapback2 would run manually, but snapback_loop would output errors to /tmp/backups/errors that weren’t too clear. I had to create a symlink to /usr/bin/snapback2 in /usr/local/bin in order to get it to run; see the command after this list.
  • Make sure snapback_loop is running with root privileges. It has to call snapback2, which needs access to files in /var/log and other directories with restricted access. If you run it as a regular user, you may get errors. You could also change the location of the log file, but /var/log is a standard spot to keep such output.
  • Because you are running it with root privileges, you’ll need to make sure it is root’s public key that ends up in the other machine’s authorized_keys file rather than your user account’s. Otherwise, you’ll get errors like “permission denied” when rsync tries to connect to the other machine.
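For the first caveat, the symlink I mentioned was created like this (adjust the source path if snapback2 ended up somewhere else on your system):

# ln -s /usr/bin/snapback2 /usr/local/bin/snapback2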

If you don’t use a second computer, you can always use a second hard drive instead. Either way, you now have an effortless system for automating your backups!

Categories
Geek / Technical

Automating Backups: I Reinvented the Wheel

I already knew that I had reinvented the wheel when it came to writing my automated backup solution. I spent some time researching this topic, and I found a number of resources.

Of course, as soon as I posted my results on the LUNI mailing list, I got email from helpful people pointing out that other projects are out there and have already been tested. They are way better than what I had, and much more optimized. My solution was making multiple complete copies of my data, resulting in many gigabytes of storage, whereas some of these solutions are clever and use much less space.

So in true blog fashion, here are some links:

Categories
Geek / Technical

Automating Backups for Fun and Profit

At the end of last month, I mentioned that I had bought a new hard drive for the purposes of backing up my data. I just now installed it in my second computer. Was getting the backup system in place important? Yes, absolutely! I don’t think that a hard drive might fail. I know it WILL fail.

While I could simply copy files from one drive to the other manually, that depends on me to do so regularly. I’m only human, and I can forget or get sick, resulting in potentially lost data. Computers are meant to do repetitive tasks really well, so why not automate the backup process? My presence won’t be necessary, so backups can take place even if I go on vacation for a few weeks or months. The backups can be scheduled to run when I won’t be using the computer. Copying lots of data while trying to write code, check email, and listen to music at the same time can be annoyingly slow, but if it happens while I am sleeping, it won’t affect my productivity at all. Also, while the computer can faithfully run the same steps each time successfully, I might mess up if I run the steps manually and in the wrong order. So with an automated system, my backups can be regularly recurring, convenient, and reliable. Much better than what I could do on my own week after week. I decided to make those three concerns the goals for this system.

I’ll go over my plan, but first I’ll provide some background information.

Some Background Information
LauraGB is my main Debian GNU/Linux machine. MariaGB is currently my Windows machine. The reason for the female names is that I am more of a computer enthusiast than a car enthusiast. People name their cars after women, so I thought it was appropriate to name my computers similarly. For the record, my car’s name is Caroline, but she doesn’t get nearly the same care as Laura or Maria. My initials make up the suffix, GB. I guess I am not very creative, as my blog, my old QBasic game review site, and my future shareware company will all have GB in their names. Names can change of course, but now I see I am on a tangent, so let’s get back to the backup plan.

LauraGB has two hard drives. She can run on the 40GB drive, as it has the entire filesystem on it, but the 120GB drive acts as a huge repository for data that I would like to keep. Files like my reviews for Game Tunnel, the game downloads, my music collection, funny videos I’ve found online, etc. It’s a huge drive.

For the most part, I’ve simply had backup copies of data from the 40GB drive to the 120GB drive. I also had data from an old laptop on there. Then I started collecting files there, but they don’t have a second copy anywhere. If I lose that 120GB drive, I could recover some of the files, but there is no recovery for a LOT of data. Losing that drive spells doom.

At this point, MariaGB can become much more useful than its current role as my Games OS. With the new 160GB drive, I can now have at least two copies of any data I own.

The Backup System
I spent the past few weeks or months looking up information on automating backups. I wanted something simple and non-proprietary, and so I decided to go with standard tools like tar and cron. I found information on sites like About.com, IBM, and others. I’m also interested in automating builds for projects, and so I got a lot of ideas from the Creatures 3 project.

On Unix-based systems, there is a program called cron which allows you to automate certain programs to run at specific times. Each user on the system can create his/her own crontab file, and cron will check it and run the appropriate commands when specified. For example, here is a portion of my crontab file on LauraGB:

# Backup /home/gberardi every Monday, 5:30 AM
30 5 * * 1 /home/gberardi/Scripts/BackupLaura.sh

The first line is a comment, signified by the # at the beginning, which means that cron will ignore it. The second line is what cron reads. The first part tells cron when to run, and the second part tells cron what to run. The first part, according to the man page for crontab, is divided as follows:

field          allowed values
-----          --------------
minute         0-59
hour           0-23
day of month   1-31
month          1-12
day of week    0-7

So as you can see, I have told cron that my job will run at 5:30 AM every Monday (30 5 * * 1). The asterisks tell cron that those fields can match any value.

The second part tells cron to run a Bash script I called BackupLaura.sh, which is where most of the work gets done.

Essentially, it gets the day of the month (1-31) and figures out which week of the month it is (1-5). There are five because it is possible to have five Mondays in a single month. Once it figures out which week it is, it then goes to my 120GB drive and removes the contents of the appropriate backup directory. I called them week1, week2, etc. It then copies all of the files from my home directory to the weekX directory using the rsync utility. I use rsync because a standard copy utility would change the file metadata, resulting in files that look like they were all accessed at once. Rsync keeps the permissions and access times exactly as they were before the backup.
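I won’t post the whole script, but a stripped-down sketch of the logic looks something like this (the backup root here is a placeholder, not my actual mount point):

#!/bin/bash
# Rough sketch of BackupLaura.sh: paths are placeholders.
BACKUP_DIR_ROOT=/mnt/backup120           # the 120GB drive
DAY=$(date +%d)                          # day of the month, 01-31
WEEK=$(( (10#$DAY - 1) / 7 + 1 ))        # which week of the month, 1-5
BACKUPDIR=$BACKUP_DIR_ROOT/week$WEEK

rm -rf "$BACKUPDIR"                      # clear out last month's copy of this week
mkdir -p "$BACKUPDIR"
rsync -a /home/gberardi/ "$BACKUPDIR/"   # -a preserves permissions and timestamps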

So tomorrow at 5:30 AM, this script will run. As it will find that the date is 11 (April 11th), it knows that it is in the second week. So the directory week2 will be emptied, and then all files will be copied from my home directory to week2.

That’s all well and good, but you have probably noted that every week, the weekly backup erases the same week’s backup from the month before. When this script runs on May 9th, week2 will be erased, losing all of April’s 2nd week backup data! I’m ok with that.

Here’s why: every 1st Monday of the month, the script will make the week1 backup, but it will ALSO make a monthly backup. It takes week1 and runs another utility on it called tar. Multiple files can be combined into one giant file by using tar. The resulting file can also be compressed. Most people on Windows will create .zip files using similar utilities, but tar fits the Unix programming philosophy: a robust tool that does one thing really well.

Usually tar is used with utilities like gzip or bzip2, but sometimes compression is avoided for stability reasons in corporate environments. Compressing might save you a lot of space, but on the off chance that something goes wrong, you can lose data. And if that data is important enough, the risk isn’t worth the space savings.

In my case, I decided to use bzip2 since it compresses much better than gzip or the Deflate method that .zip files use. If I corrupt something, it isn’t the end of the world, since bzip2 comes with bzip2recover, which can salvage the undamaged blocks of a broken archive. So the script will take the directory week1 and compress it into a file with the date embedded in the name. The appropriate line in the script is:

tar -cvvjf $BACKUP_DIR_ROOT/monthly/LauraGBHomeBackup-$(date +%F).tar.bz2 $BACKUPDIR

The $BACKUP_* names are variables in the Bash script that I had defined earlier; they basically tell the script where to go in my filesystem. The file created is “LauraGBHomeBackup-” + the date in YYYY-MM-DD format + “.tar.bz2”. The script runs date +%F and inserts the result in the filename. The file is placed in the directory called monthly. I can manually clean that directory out as necessary, and since the dates will all be unique, none of the monthly backups will get erased and overwritten like the weekly backups.
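For example, on the April 11th run described above, the command would print the following, so the archive would be named LauraGBHomeBackup-2005-04-11.tar.bz2:

$ date +%F
2005-04-11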

Conclusion
And so the backup process has been automated. Every week, a copy of my home directory is synced to a second hard drive. Every month, a copy of the data will be compressed into a single file named by date to make it easy for me to search for older data. What’s more, cron will send me an email to let me know that the job ran and also tell me what the output was. If I forget to mount the second hard drive for some reason and the script can’t copy files over, I’ll know about it Monday morning when I wake up.

Once MariaGB is up and running, I can configure the two systems to work together regarding the monthly backups. After the .tar.bz2 file is created, I can then copy it from LauraGB’s monthly directory over to MariaGB in a directory to be determined later. Of course, normally when I copy a file from one machine to another, I need to manually enter the username and password. I can get around this by letting the two systems consider each other “trusted”, meaning that the connection between the two can be automated, which is consistent with one of the goals of this system. I’m very proud of what I have created, and I am also excited about what I can do in the future.
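Once that trust is in place (passwordless SSH keys being the usual way to do it), the copy could be a single extra line at the end of the backup script, something like this (the hostname and destination directory are placeholders, since I haven’t decided on them yet):

scp $BACKUP_DIR_ROOT/monthly/LauraGBHomeBackup-$(date +%F).tar.bz2 gberardi@mariagb:/backup/monthly/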

Currently LauraGB’s home directory has images, code, homework files, game downloads, and other files taking up 4GB of space, which clearly won’t fit on a CDROM uncompressed. To improve my backup system, I will need to purchase a DVD burner. I can then place a blank DVD in the drive and have the computer automatically burn the monthly backup when it occurs, giving me another copy of my data, one that I can then bring anywhere. Ideally, remote backups would complete my system, but I think I will use those only for specific data, such as my programming projects and business data when I get some. Losing my music files and pictures in a house fire isn’t going to be as big of a concern as the fire itself, but losing sales info, game code, and the customer database, essentially THE BUSINESS, would be something that I probably won’t easily recover from.

For now, I am mitigating the disaster of a failing hard drive, which is my main concern. If you do not have your own backup system in place, either something homemade like my setup or through some commercial product, you’re walking on thin ice. Remember, hard drive failure isn’t just a potential event. It’s a certainty. It WILL fail. Prepare accordingly.

Categories
Games Geek / Technical

LAN Par-tay! or Stupid Processor…

This past weekend I went to a LAN party at my friend’s dorm. For those of you new to the term, you basically take your PC to this party and everyone connects to a Local Area Network (hence the LAN part) to play games for hours on end. This party was in the basement of the dormitory, and it started at 2PM on Saturday and ended at 12PM on Sunday. I think. It was a long night. B-)

While my Debian GNU/Linux system has a new video card and has a slightly faster processor than my Windows system, the fact was that we were playing games, and a lot of them aren’t available for GNU/Linux yet. So I took my Windows machine.

I had problems right away. My Windows machine didn’t have as complete a cooling system as my Debian system. I barely use it, so there was never a need. And I have played games on it before. Yet this weekend of all weekends, games would crash to the desktop. At first I thought the culprit was possibly Windows 98. I’ve refused to install Windows XP for reasons I may go into another day, but at the urging of others, I installed Windows XP. Luckily I had brought the CD that I got for free for attending some Microsoft seminar on .NET. It still crashed to the desktop when playing games. So rule out the OS.

I opened the system and found that the ATI Radeon 8500 video card was really, really hot. I think the ribbon cables were blocking airflow. Someone had a spare GeForce 2 MX, so I installed that. I still had crashes, so I opened the case to find that the video card was hot after only a few moments in the system. Was it overheating?

My friend let me use his fan to cool my system. It was funny seeing the case opened and a giant fan blowing into the system, but it kept it quite cool. Unfortunately, games would still crash to the desktop. So rule out overheating.

Someone else insisted I should lower the clock speed on my processor. I had a motherboard that allowed me to flip switches to lower the speed, but I didn’t want to do it at first. I paid money for an AMD Athlon XP 2100+, so why lower it? Well, it did the trick. Games stopped crashing, and it still ran fast enough to handle games like Alien vs Predator 2 and Unreal Tournament 2004. I’m still upset that I had to lower the speed, but apparently the processor was overheating otherwise. Perhaps the CPU fan isn’t working well anymore.

Me and my computer woes, eh? Two other people had some issues, but we all eventually got to play.

I haven’t been to a LAN party in a long time, and these days it is less likely since I work 40 hour weeks. It was a completely different situation when I just had school to worry about. It was a good time. I think everyone should attend a LAN party. Besides reminding you that playing games is important if you are going to make good games, it also reminds you that gaming is every bit as social an activity as any other. The next time someone tells you to “put down the controller, go outside, and get a life”, remind them that playing video games with friends is a bit healthier than getting overly drunk at bars and smoking. And arguably more fun.

Categories
Geek / Technical

My Computer is Back and Badder Than Ever!

Last time I mentioned how my Debian GNU/Linux system needed to be upgraded due to a malfunctioning video card.

I am happy to say that my system is running quite fine now. Here is a listing of some of the upgrades and changes:

  • nVidia GeForce2 GTS to nVidia GeForce FX 5500
  • Linux kernel 2.4.24 to Linux kernel 2.6.11
  • Deprecated OSS sound drivers to the new standard ALSA
  • A restrictive hard drive partition scheme to a less restrictive one

The video card runs amazingly well, although my true test is to verify that Quake 3 Arena will run on the system. So far, Tuxracer and Frozen Bubble run fine. The 2.6 kernel is a good iteration over the 2.4 kernel since it allows the system to respond faster and has more supported hardware under its belt. The OSS sound drivers, for example, have been deprecated in 2.6, but I had always used them before since ALSA was iffy at best, requiring a separate download and kernel recompile. Now ALSA is actually part of the kernel, and I am pleased that my system is so much more up to date. Debian is the distro that people make fun of for being less than cutting edge, but it makes up for it by being stable.

The hard drive partition scheme I had before made things difficult. I had a lot of space for my home directory, which is where I keep data files and the like, but less space for /usr, which I need for programs, and for /tmp, which made installation of new programs difficult. The new scheme doesn’t differentiate between /usr and /tmp, and they share a LOT more space, since I have a bigger hard drive to store data on anyway.

So my system is faster, more convenient, and more compatible with what’s new in the world of technology. Games that were out of reach before are now playable. Not bad for an emergency repair.