
Learn Computer Science, Not Coding

When I was in college, I was in the computer science program. Some of the classes were about learning programming.

I enjoyed programming, but I didn’t always enjoy the classes. Part of the reason was that we were learning how to program in Visual C++ 6.0.

For anyone who remembers it, 6.0 was…OK. That is, if you wanted to code and compile and run a project, it worked more or less as expected.

But it had a reputation for not supporting the C++ Standard very well. I didn't know the Standard very well myself, so I had no idea at the time, but in hindsight it explains why my professors almost universally ran into difficulties teaching certain parts of the language.

Eventually the school’s official language switched to Java. I recall hearing it was because a professor started talking about linkers and object code and got a bunch of stares from confused students who were used to doing nothing more than clicking the “Build” button in a UI.

Similarly, my classes that were ostensibly about teaching database concepts were almost always really classes about using Microsoft Access. Even before I started using GNU/Linux as my main OS and cared about cross-platform compatibility, it seemed wrong to me that the tool we were supposed to use to practice what we learned was so proprietary.

I found I preferred classes in which I learned concepts and theories that could be applied in many contexts. I always liked general principles rather than specific solutions.

Now, when I neared graduation and was worried about finding a job, my disinterest in specific tools such as Microsoft Access suddenly turned out to be a liability. I was looking for programming jobs in particular, and most of them seemed to want people who knew not only how to work with databases but how to work with specific types of databases.

On the other hand, if I cared, learning Access, or Oracle, or MySQL was well within my grasp if I applied myself. My issue wasn't skill but experience. I understood how databases worked, and I could apply my knowledge to most of them. In fact, my knowledge of how they worked could be applied outside of formal databases. When I was creating a component-based entity system for my game objects, I was basically creating my own database system. It's been said that video games are just databases with pretty front-ends.
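To make that analogy concrete, here is a minimal sketch in C++ (not the actual system I built; the component types and names are invented for illustration) of a component-based entity system where each component store behaves like a database table keyed by entity ID, and finding entities that have multiple components looks a lot like a join:

```cpp
#include <cstdint>
#include <iostream>
#include <unordered_map>

// Hypothetical components. Each map below acts like a database table,
// with the entity ID serving as the primary key.
using EntityId = std::uint32_t;

struct Position { float x, y; };
struct Health   { int current, max; };

int main()
{
    std::unordered_map<EntityId, Position> positions; // the "positions" table
    std::unordered_map<EntityId, Health>   healths;   // the "healths" table

    EntityId player = 1;
    EntityId rock   = 2;

    // INSERT-like operations: attach components (rows) to entities (keys).
    positions[player] = {10.0f, 5.0f};
    healths[player]   = {100, 100};
    positions[rock]   = {3.0f, 7.0f}; // the rock has no health component

    // SELECT-like query: visit every entity that has both components,
    // much like an inner join on the entity ID.
    for (const auto& [id, hp] : healths)
    {
        auto it = positions.find(id);
        if (it != positions.end())
        {
            std::cout << "Entity " << id << " at (" << it->second.x << ", "
                      << it->second.y << ") with " << hp.current << "/"
                      << hp.max << " HP\n";
        }
    }
}
```

The point isn't the particular containers used; it's that understanding how tables, keys, and joins work lets you recognize and build the same structure anywhere, whether or not a formal database is involved.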

So I preferred being able to think generally about solutions rather than learn specific tools. I still hate picking up a game development book and discovering that it’s actually a Windows-specific game development book, or a Game Maker-specific development book, or a Game Salad-specific development book, or a Unity-specific development book.

A few days ago, I enjoyed reading Don't Learn to Code, Learn to Think. The author is railing against the misconception that computer science is nothing more than programming, against learning a tool as an end in itself. Computer science is a way of thinking, and programming is how you apply that thinking. The former is more general and has broad applications; the latter…well, creates specific applications. B-)

Not everyone needs to learn how to write code in the same way that not everyone needs to learn how to fly a plane, but knowing logic and information theory helps you in life the same way that knowing physics and math would.

If you want to learn something that also has broad applications, you should check out the online course Model Thinking led by Scott E. Page at the University of Michigan.

As Page puts it in the course description:

"In these lectures, I describe some of the reasons why a person would want to take a modeling course. These reasons fall into four broad categories:"

  • To be an intelligent citizen of the world
  • To be a clearer thinker
  • To understand and use data
  • To better decide, strategize, and design

Sounds good to me.

Being able to improve my ability to think about problems has wide-ranging benefits. For one, I have more tools at my disposal. Instead of learning to use a hammer and treating everything as a nail, I learn the screwdriver, the ax, the jigsaw, and more; I apply the right tool to the problem, and I can even combine tools in ways that make sense.

Similarly, if I learn programming but don’t learn the concepts, I’ve basically learned how to use a hammer. While you could hammer a screw into a board, you can’t use a hammer to remove the screw, or to cut the board. When problems arise, such as compiler or linker issues, or logic bugs, I have almost no frame of reference for how to solve them if I haven’t learned about compilers or De Morgan’s Law.
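As a small, made-up example of the kind of logic bug that De Morgan's Law helps you untangle: negating a compound condition by hand often goes wrong unless you know how the negation distributes over the individual tests. The function names and bounds check below are purely illustrative.

```cpp
#include <iostream>

// The condition we want to negate: "x is within [0, width)".
bool isInBounds(int x, int width)
{
    return x >= 0 && x < width;
}

// Buggy negation: !(x >= 0 && x < width) is NOT the same as
// (x < 0 && x >= width). That condition can never be true,
// so the out-of-bounds check silently does nothing.
bool isOutOfBoundsBuggy(int x, int width)
{
    return x < 0 && x >= width;
}

// Correct negation via De Morgan's Law: !(A && B) == (!A || !B),
// so flipping the tests also means flipping && to ||.
bool isOutOfBounds(int x, int width)
{
    return x < 0 || x >= width;
}

int main()
{
    const int width = 10;
    const int samples[] = {-1, 5, 12};

    std::cout << std::boolalpha;
    for (int x : samples)
    {
        std::cout << "x=" << x
                  << " buggy="   << isOutOfBoundsBuggy(x, width)
                  << " correct=" << isOutOfBounds(x, width) << "\n";
    }
}
```

Someone who only learned "how to program" can stare at the buggy version for a long time; someone who learned the underlying logic sees immediately which operator has to flip.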