Thursday, July 14, 2011

Programming puzzle: Testing probability

I've been using the comments section of the previous post as a sort of impromptu classroom for my friend James to work out simple programming problems. Since I've got nothing else in particular going on with this blog right now, I might as well keep it up and post a more interesting (novice-level) problem. So here it is.

One of my favorite puzzle books, "Aha! Gotcha" by Martin Gardner, used the game of "Chuck-a-Luck" to illustrate some basic probability concepts. You can read the rules and history here.

In a nutshell, you pick a number from one to six, place a $1 bet, and roll three dice. If your number doesn't come up, you lose the dollar. If it does come up, you keep your dollar and win $1 for each die that shows your number.

A session with a program implementing these rules might look like this:

What number do you bet on? 3
You rolled: 4 6 2
You lose $1!

What number do you bet on? 2
You rolled: 2 1 5
You win $1!

What number do you bet on? 1
You rolled: 1 3 1
You win $2!


Gardner concocted a rationalization for a gambler who believed that the game favors the player, and then challenged the reader to show why the argument is faulty. But if you actually write such a program, you can show empirically that the game is stacked against you.
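(If you'd rather discover the answer yourself, skip this aside. As a back-of-the-envelope check of my own, not taken from Gardner: the chance that your number misses all three dice is (5/6)^3 = 125/216; one match has probability 75/216, two matches 15/216, and three matches 1/216. So the expected result of a $1 bet is (-125 + 75 + 2*15 + 3*1)/216 = -17/216, roughly an 8-cent loss per game, which is the figure your simulation should converge on.)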


I suggest that you do this in four phases.

Phase one: Implement a program that has output similar to the above.
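Here's a minimal Python 3 sketch of phase one (the exact structure and messages are my guesses at matching the transcript above):

import random

# Play a single round of Chuck-a-Luck.
bet = int(input("What number do you bet on? "))
dice = [random.randint(1, 6) for _ in range(3)]
print("You rolled:", dice[0], dice[1], dice[2])

matches = dice.count(bet)
if matches == 0:
    print("You lose $1!")
else:
    print("You win $%d!" % matches)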

Phase two: Keep track of how much total money the player has won or lost.
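One way to read phase two is "keep playing rounds and report a running total." A sketch (the press-Enter-to-quit convention is my own addition, not part of the puzzle):

import random

total = 0
while True:
    answer = input("What number do you bet on? (press Enter to quit) ")
    if not answer:
        break
    bet = int(answer)
    dice = [random.randint(1, 6) for _ in range(3)]
    print("You rolled:", dice[0], dice[1], dice[2])
    matches = dice.count(bet)
    if matches == 0:
        total -= 1
        print("You lose $1!")
    else:
        total += matches
        print("You win $%d!" % matches)
    # Running total can go negative; that's the point of the exercise.
    print("Running total: $%d" % total)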

Phase three: Ask the player how many games they want the program to play automatically, like this:

What number do you bet on? 1
How many games do you want to play? 10
[optional: display the outcome of each game]
...You lost a total of $3!
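A sketch of phase three, with the optional per-game output left out:

import random

bet = int(input("What number do you bet on? "))
games = int(input("How many games do you want to play? "))

total = 0
for _ in range(games):
    dice = [random.randint(1, 6) for _ in range(3)]
    matches = dice.count(bet)
    # Win $1 per matching die, or lose the $1 bet.
    total += matches if matches > 0 else -1

if total < 0:
    print("...You lost a total of $%d!" % -total)
else:
    print("...You won a total of $%d!" % total)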


Phase four: Have your program play a very large number of games, and calculate the average amount you lost per game.
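For phase four, drop the prompts and crank up the game count (one million is an arbitrary choice, and this assumes Python 3, where / is true division):

import random

games = 1000000
bet = 3  # by symmetry, the bet number doesn't affect the odds
total = 0
for _ in range(games):
    dice = [random.randint(1, 6) for _ in range(3)]
    matches = dice.count(bet)
    total += matches if matches > 0 else -1

# A negative average means the house wins in the long run.
print("Average result per game: $%.4f" % (total / games))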

Let me know how it goes.

Thursday, July 7, 2011

Adventures in junior programming, episode 3

Haven't been posting on this blog much lately. Since I last posted, I got a new full-time job at a high-tech marketing firm and also taught myself the fundamentals of both Android and iPhone programming. Interesting stuff, and probably good fodder for a future post. But for now, I'd like to go over my recent efforts to teach my son more programming.

We let it slip during the school year, but we've picked it up again over the summer, doing about an hour of work several nights a week. Teaching a nine-year-old to program irregularly is like pushing the stone of Sisyphus: every time we take it up again, I have to re-teach concepts that seemed to stick the last time but didn't. Nevertheless, my feeling about programming has always been that you have to learn to "think like a programmer" first; once you have burned this style into your memory, it becomes much easier to pick up any language or platform in existence.

So what is the core of thinking like a programmer? As far as I can sum up for a kid, it boils down to a few key elements which need to be practiced over and over under a variety of conditions:

  1. Output
  2. Tinkering with variables
  3. Input
  4. Loops
  5. Conditionals
  6. Logical operators
  7. Functions and classes, which fall broadly under the category of "splitting up the work into manageable smaller chunks."

I've fallen into a sort of systematic bootstrapping pattern of teaching, which goes something like this: I have milestones in mind, things Ben should know how to do so well they're second nature. The current milestone is: "Write a Python program that counts to ten." Once he can do this without hesitation and without complaining that he needs more hints, we'll bump it up and move on to another milestone.

Each session, I ask him to hit the milestone. If he can't do it well enough, I'll point out what he's missing and go over how it works again. After we get through this, if it's appropriate, I'll introduce a new concept, which we'll drill in the future until that becomes the new milestone.

I've also been making him follow the program logic step by step. It's not as intuitive as you might think. For instance, I ask him to explain HOW the computer goes through the stages of counting to ten, and this is the kind of response I'm looking for:

"First I set the counter to one. One is less than ten, so I enter the loop. I print the counter value. Then I add one to the counter, making it two. Now I return to the beginning of the loop. Two is less than ten, so I enter the loop..."

The kind of response I get takes frequent shortcuts, like "I keep doing that until it's ten." That may be technically correct, but it doesn't capture the essence of thinking through every step, which is critical to catching bugs. If you wind up writing a program that only counts to nine, or goes into an infinite loop, you can keep modifying lines until you get lucky, but if you can see what it's doing at every step, then you're less likely to make mistakes in the first place.
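For concreteness, the program behind that trace might be as simple as this (my reconstruction; Ben's actual code isn't shown in this post):

# Count from one to ten, one step at a time.
counter = 1
while counter <= 10:
    print(counter)
    counter = counter + 1

Note the "less than or equal to": write "less than" instead and you get exactly the counts-only-to-nine bug.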

I don't even see the code. All I see is blonde, brunette, red-head...


Programming can seem tedious and repetitive, but at its best you get those "light bulb" moments when it's suddenly crystal clear what you want your program to be doing and how. And sometimes it's even more fun to write the code that solves a puzzle than to grind out the solution on your own.