on computer science

CTD posts a question from a school exam which appears to come down to a syntax error (I didn't pick it; someone in the blog comments did).

Apart from illustrating that not much has changed since I did computer science at school in 1997, it is an example of everything that is wrong with teaching computer science.

You'll think I'm crazy, but maybe we should rename computer science to computer art-economics. Firstly, you'll notice that there is a fair bit of maths and science behind both of those fields, but it is not the primary focus of either.

Both artistry and programming are about plucking something out of your head and representing it in some tangible form. Just as we can look at a piece of art and decide if it is a good or bad representation of what the artist had conceived, we can look at an algorithm represented in code and decide if it is an elegant solution to a problem.

The senior English department does not teach you to write by presenting you with grammatical errors and marking you on how many misplaced apostrophes you find. The music department doesn't teach you to listen for the slip of a finger in a concerto, and the art department doesn't make you study the five rough drafts of the Mona Lisa. In all of these departments the goal is to open your mind to the cornerstones of the field and give you a foothold and the skills to hang on, from which you can climb as high as you want.

Standard algorithms and data structures are the cornerstones of computer science. The skills to hang on come from learning a language with which to express those ideas. Binary logic is the basis of everything that gets built on top; the way 'and', 'not' and 'or' can be made to conspire and build almost anything imaginable should be your first stop.
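
To make that concrete, here's a rough sketch in Python (the function names are my own, nothing standard) of how far those three operators can take you on their own: an exclusive-or and a half-adder, the seed of every adder in every processor.

    # XOR and a half-adder built from nothing but 'and', 'or' and 'not'.
    def xor(a, b):
        # True exactly when one input is true and the other is not.
        return (a or b) and not (a and b)

    def half_adder(a, b):
        # Adds two one-bit numbers: the sum bit is XOR, the carry bit is AND.
        return xor(a, b), a and b

    # 1 + 1 = binary 10: sum bit 0, carry bit 1.
    print(half_adder(True, True))   # (False, True)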

In other ways, however, computer science is much more like economics. Anyone who has studied economics will very quickly realise that playing around with the economy comes down to two things -- you either do something more efficiently (get more output from the same input) or accept a trade-off (get less of one thing for more of something else).

A computer only has so many cycles per second to crunch through data. It only has so much RAM, cache, memory bandwidth, disk space and network bandwidth, and only so many TLB slots. When you write your algorithm, you either need to find a way to be more efficient with those resources, or accept a trade-off.
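
The classic trade is spending RAM to save cycles. As a toy illustration (the function names are my own), here is a naive Fibonacci next to a cached one: the second answers instantly, but only because it keeps every result it has ever computed.

    # Trading memory for time: cache results instead of recomputing them.
    from functools import lru_cache

    def fib_slow(n):
        # Recomputes the same subproblems over and over:
        # almost no memory, exponential time.
        return n if n < 2 else fib_slow(n - 1) + fib_slow(n - 2)

    @lru_cache(maxsize=None)
    def fib_fast(n):
        # Remembers every result it has seen:
        # linear time, but the cache grows with n.
        return n if n < 2 else fib_fast(n - 1) + fib_fast(n - 2)

    print(fib_fast(35))   # instant; fib_slow(35) takes noticeably longer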

Understanding where you can find efficiencies starts with learning about algorithms. We formalise this with big-O notation, but you don't need to know about that to realise a quick sort gets the job done faster than a bubble sort. When you get really smart, you start coming up with ideas better than anything anyone has thought of before.
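
If you would rather see it than take my word for it, here is a rough, clarity-over-speed sketch of both sorts; time them on a few thousand random numbers and the gap speaks for itself, no notation required.

    # Bubble sort versus quick sort, written for clarity rather than speed.
    import random, time

    def bubble_sort(xs):
        xs = list(xs)
        for i in range(len(xs)):
            for j in range(len(xs) - 1 - i):
                if xs[j] > xs[j + 1]:
                    xs[j], xs[j + 1] = xs[j + 1], xs[j]
        return xs

    def quick_sort(xs):
        if len(xs) <= 1:
            return list(xs)
        pivot, rest = xs[0], xs[1:]
        return (quick_sort([x for x in rest if x < pivot])
                + [pivot]
                + quick_sort([x for x in rest if x >= pivot]))

    data = [random.random() for _ in range(3000)]
    for sort in (bubble_sort, quick_sort):
        start = time.time()
        sort(data)
        print(sort.__name__, round(time.time() - start, 3), "seconds")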

Understanding trade-offs is what makes you a programmer. You need to understand what happens between what you write down and the way the instructions get issued to the processor. You need to understand what the operating system is doing between your program and the hardware, and the myriad other variables at play on a modern computer. Think of all the other things you need to understand to make good decisions: networking, databases, electronics, mathematics. How to make a good interface is as much about psychology as anything else.

This is an absolute gold mine of material, and it exposes you to everything that modern computer science is about. You could set an infinite number of exams from it without a single stupid "find the syntax error" question.

Our goal should be striving to open the eyes of the next generation of hackers to the amazing world that lives beneath their keyboard. If things are anything like when I was at school, the only wonderment was wondering what the hell I was doing sitting in that class when it looked so nice outside!