Monday, November 29, 2010

Retrospective on my first do-it-yourself computer experience

I've always considered myself a software geek, and in the past I've had no desire to fool around with hardware beyond the occasional RAM upgrade or extra hard drive. However, so many friends have reported positive experiences building their own computers that I decided I really needed to try it.

I was pushed to hurry along this path when my old desktop's hard disk got corrupted and I could no longer run Windows. I was planning to pull the data off and reinstall, but I recognized that it was really time to replace the eight-year-old system. Since I got a new laptop a year ago, my old computer was mostly sitting dormant, used occasionally to recover documents and sync my iPod with music that I've recorded and rated from my CD collection over the last several years.

A Facebook friend gave me some words of wisdom: You will not save money by building your own computer. Not because the parts won't be cheaper -- they will! But when you buy them individually, it's impossible to resist the siren call of "100 more watts of power couldn't hurt!" or "Gosh, this terabyte hard drive is not nearly twice as expensive as the 500GB, how can I pass that up?" or "I'm saving money! A little more muscle in the graphics card? Why not?"

So you'll spend just as much as you would for a prefab desktop, but you will get more than you would have for your money. And then there's the "priceless" category, which is pride in creation, as well as a better understanding of what's going on under the hood of all future computers you may work with.

Another friend who used to build and test hardware for Dell pointed out that I was working at a severe disadvantage as a first-timer. An experienced or professional builder will have lots of spare parts lying around to swap out and help test the machine. Nothing on the screen? Is the graphics card okay? Put in a different card and find out. And so on. I had to debug all my problems with just one set of parts, as my other PC was much too old to have any compatible bits.

Phase 1: Shopping spree

The first thing I did right was, instead of buying a motherboard, case and power supply online, I went down to the local Fry's and chatted up an extremely knowledgeable clerk. With all the dire warnings I'd heard about messing up compatibility requirements, I just didn't want to leave things to guesswork. Before I went, I bought a short trade magazine from PC Gamer about building a computer. I read it thoroughly. They of course recommended a ridiculously overpowered and overpriced system, but later in the magazine they had some recommendations for working on a budget. They came up with something for under $600, though this estimate is a bit deceptive because it doesn't include an OS or any peripherals like a monitor.

Anyway, I dog-eared the page full of cheap stuff, figuring it to be a minimum benchmark for what I could buy. Then I brought the list to the Fry's guy, and I said "I am prepared to spend several hundred dollars today. Pay attention to me!" Basically I went through each item on the "cheap" list, and asked the following questions:
  1. Do you have this particular part?
  2. If not, what do you have that's comparable in price and performance?
  3. What would I get by spending more on a better part?
  4. How likely is it that I will need that extra power?
  5. (In some cases) That's more expensive than the guide implies. Can I get it cheaper on Newegg?

Your mileage may vary, but this guy was extremely helpful, and didn't argue too hard to get me to buy only at Fry's or to buy stuff that was more powerful than I needed. In fact I only wound up buying a motherboard and case that day, and I left armed with a list of precise specs that would be compatible with the mobo, and deals to look for on Newegg. (For example: "That CPU you're looking for is absolutely the best thing in that price range," he said. "In fact it's so popular that it's been out of stock for weeks. Go look for that exact thing on Newegg and see if you can find it at the price you want." And I did.) I was so happy with the experience that I made a point of coming back when I needed some odds and ends, like a keyboard and a replacement hard drive.

It happens to be the holiday season, so it seems like Newegg had even more combo deals and discounts than usual, or so they told me via email every single day. You just have to know approximately what you're looking for, and then keep an eye out for discounts paired with other parts you need. A hard disk may be bundled with Windows 7. A graphics card may come with extra RAM. Stuff like that. $10-$20 in savings here and there adds up, or as I warned, encourages you to buy slightly more power. :)

Phase 2: Construction

In addition to getting friendly with a retail guy, it's always a good idea to identify a friend or coworker who has done this before and can guide you through it. If your workplace makes this likely, talk to everyone about your intent and see what kind of feedback you get. A guy who works in QA here is my new best friend. :)

Building the computer was both less complicated and more nerve-wracking than I expected. I was frankly terrified of making mistakes, and I did make mistakes. I was curious about the CPU connections, and brushed them with my finger before remembering that you NEVER EVER TOUCH THE PINS. Similarly, I couldn't find the tube of thermal grease that my magazine claimed would come with either the CPU or the motherboard. It wasn't until I poked the underside of the CPU cooler and came away with sticky fingers that I realized the stuff was pre-applied, and then I was afraid I'd contaminated and ruined it somehow. I forgot to put on my static wrist strap several times. I dropped the graphics card a little too hard as I was pulling it in and out so often. Luckily, computer parts are considerably less fragile than I had feared. Not that they aren't fragile, but they can take a few knocks. (Except hard drives. The drive I bought from Newegg made clicking noises and didn't get detected, which is why I wound up returning it and buying a replacement.)

Figuring out how many parts need power was also a challenge. I initially thought my power supply was busted because nothing happened when I simply plugged it in -- no fan action or anything. Turns out it needs to be hooked into the computer's power button or else it doesn't get an "on" signal. Also, count the number of fans and make sure they are ALL running. My case has two built in, with space to install another one. The CPU has its own fan; the graphics card has its own fan; and the power supply has an intake fan. At first I had plugged the case fans into the power supply, but my QA friend pointed out that you are supposed to plug them into the motherboard so it can sense the temperature and regulate the fan speed.

The CPU needs power, and the mobo needs two cables plugged in SNUGLY (loose cables were also a hallmark of the experience). Then there are all the little things like LEDs and USB power supplies, which seem intimidating at first (lots of cables!) but the motherboard's manual is very specific about what goes where. It's not so bad. Just don't forget to find the loose parts you need, like a tiny speaker.

Miscellaneous stuff I learned

  • If you haven't upgraded your system for a while, you may be surprised to learn what comes built into motherboards as standard these days. You used to need to plug in a sound card and a network card; now you don't. I bought a sound card, and I'm installing it only because it's a cheap part that's hardly worth the trouble of returning. The built-in sound works fine for the most part.
  • The BIOS may take a long time to appear on screen the first time you start it up. Give it a couple of minutes.
  • If your graphics card has two ports, they are not necessarily interchangeable. One is for the primary monitor, and the other is for a dual setup. Make sure your monitor is plugged into the right one.
  • The first sign that you are doing something right is a beep. Make sure the little internal speaker is set up. Don't put in the RAM right away: boot without it, and you will hear beeps indicating an error. At that point, you know your motherboard is probably okay.
  • All you have to do with the new system is drop in a CD for the operating system of your choice and watch it go. There is no other initial prep work (I thought I would have to screw around in the BIOS more).
  • Don't talk too much about your issues on Facebook, because an army of annoying drones will tell you to switch to Mac. They can STFU if they're not planning to play grown-up games or develop software. :)

Specs

Case: Cooler Master HAF 912
Motherboard: MSI 870-G45
CPU: AMD Phenom II X2 555 Black Edition Callisto 3.2GHz Socket AM3 80W Dual-Core
Power supply: COOLER MASTER Silent Pro M700 RS-700-AMBA-D3 700W
Hard drive: Seagate Barracuda 7200.12 ST31000528AS 1TB (actually that's the one that died and got replaced; the new one is Toshiba or something)
Mouse: RAZER Lachesis Banshee Blue 9 Buttons 1 x Wheel USB Wired Laser Gaming Mouse
RAM: G.SKILL Ripjaws Series 4GB (2 x 2GB) 240-Pin DDR3 SDRAM DDR3 1600 (PC3 12800)
DVD: SAMSUNG CD/DVD Burner Black SATA Model
OS: Windows 7 64 bit Home Premium
Graphics: SAPPHIRE 100284L Radeon HD 5750 1GB 128-bit GDDR5 PCI Express 2.0 x16

The Gaming Experience

Sure, I've installed Eclipse and JBoss on the new box, and I'm enjoying the fact that I won't be coming close to filling up the hard drive for a long time. Compiling programs is nice when it's fast, but I won't lie to you; the real benefit of a new system is the games.

For a moderately priced system, it runs great. I've been playing World of Warcraft at "Ultra" graphics settings with hardly any drop from the full 60 FPS. Of course, WoW is not such a graphics-intensive game, but with the new changes in place for Cataclysm, it looks pretty nice: the water has good distortion effects, you can see landscape features a good half a zone away, and the spell effects really sparkle.

With Starcraft II, it's hard for people who don't play to notice the difference, since the view is set such a long distance from the action. When playing, though, you can see enormous detail in the units, and the glowing effect of workers carrying minerals and gas is a nice bonus. On a slower machine, the machinima in-game cutscenes are noticeably worse than the pre-rendered movies; at higher settings you can still tell them apart, but they are pretty impressive.

Left 4 Dead 2 looks great and benefits enormously from running quickly.

All in all I'm a pretty satisfied customer, and again, it's not so much the newness of the machine as the pride of ownership.

Friday, May 14, 2010

The State of the Internet Operating System

I'm working my way through a couple of long, excellent articles by Tim O'Reilly about how the operating system of the future is the entire internet.


I haven't got much to add that would improve the articles so far, so I'll just quote an excerpt and get out of the way.

The Internet Operating System is an Information Operating System

The underlying services accessed by applications today are not just device components and operating system features, but data subsystems: locations, social networks, indexes of web sites, speech recognition, image recognition, automated translation. It's easy to think that it's the sensors in your device - the touch screen, the microphone, the GPS, the magnetometer, the accelerometer - that are enabling their cool new functionality. But really, these sensors are just inputs to massive data subsystems living in the cloud.

When, for example, as an iPhone developer, you use the iPhone's Core Location Framework to establish the phone's location, you aren't just querying the sensor, you're doing a cloud data lookup against the results, transforming GPS coordinates into street addresses, or perhaps transforming WiFi signal strength into GPS coordinates, and then into street addresses. When the Amazon app or Google Goggles scans a barcode, or the cover of a book, it isn't just using the camera with onboard image processing, it's passing the image to much more powerful image processing in the cloud, and then doing a database lookup on the results.

Increasingly, application developers don't do low-level image recognition, speech recognition, location lookup, social network management and friend connect. They place high level function calls to data-rich platforms that provide these services.

Monday, March 15, 2010

Wondrous number graph

Like many a great idea, my latest pet project was inspired by an XKCD cartoon.




Of course I recognized the graph right away. I had never heard it described as the Collatz Conjecture before, but the concept also makes an appearance in my very favorite book of all time, Gödel, Escher, Bach. In a dialogue between Achilles and the Tortoise, such numbers are referred to as "Wondrous Numbers," and Wikipedia confirms that both terms are acceptable.

After reading the cartoon, I started doodling my own graphs, which -- despite the convoluted twistiness of Randall's picture -- are essentially binary trees, going downward from one. If you draw the left branches as multiplying by two, and the right branches as "minus one, divided by three," then you may get a graph like this:

1
|
2
|
4
|
8
|
16
|...\
32...5
|....|
64...10



... and so on.

The tree is easy to generate. Of course, it's an infinitely large tree, so you have to decide which nodes to expand. The two main ways I can see to do it are:
1. Start with 1, balance the depth by always expanding the leaves that are at the least distance from the origin.
2. Start with 1, always expand the leaves that have the smallest numerical value.

So for example, in the tree above, I've used method 1. If I had used method 2, I would have needed to add 5, 10, 20, 3, 6, 12, 24, and 40 to the tree before I could expand node 32.
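Generating the tree under method 2 is a natural fit for a min-heap. Here's a sketch (my own code, not the project's; the names and the edge-list representation are mine):

```python
import heapq

def build_tree(expansions):
    # Reverse-Collatz tree: each node n has a left child 2n, and a right
    # child (n - 1) / 3 whenever that value is an odd integer greater than 1
    # (i.e. when n = 4 mod 6, excluding child 1 to avoid looping back to the root).
    # Method 2: always expand the leaf with the smallest numerical value.
    edges = []
    frontier = [1]
    for _ in range(expansions):
        n = heapq.heappop(frontier)
        left = 2 * n
        edges.append((n, left))
        heapq.heappush(frontier, left)
        if n % 6 == 4 and (n - 1) // 3 > 1:
            right = (n - 1) // 3
            edges.append((n, right))
            heapq.heappush(frontier, right)
    return edges
```

After seven expansions, the edge list contains the branches (16, 5) and (10, 3) from the sketches above.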

Now, obviously this tree gets large in the vertical direction faster than the horizontal direction at first; later on, when the branches increase exponentially, the opposite is true. But when I tried sketching the graphs, I was annoyed by all the white space at the top. I thought the tree would look much neater if I started with the root in the upper left corner and tried to push the left nodes downward, and the right nodes rightward. So the above graph would be changed to something like this:

1...5--10--3--6--12--24
|../../
2.|.20
|.|.|
4.|.40
|.|
8.|
|/
16
|
32
|
64

and so on like that, giving me more space to expand toward the bottom right. (Hope that's clear.)

Once I was able to sketch this, I started writing it as a program, but currently haven't finished working out the algorithm to decide where in the grid to place each node. Basically the requirements are as follows:

1. You should be able to display any binary tree with this program (it doesn't have to be tied to Wondrous Numbers).
2. Root is displayed in the upper left corner.
3. Prefer to display children to the right of and below the parent, as much as possible.
4. The algorithm should eventually fill up all available space in a specified grid size.
5. The nodes should be as tightly packed as possible, avoiding empty spots.
6. Parents and children should be connected by straight lines, which do not cross. Failing that, fewer crossed lines is preferred.


I thought of a few different ways to accomplish these goals, but haven't been able to decide which one to focus on first. My first thought was just to place new nodes anywhere nearby, and then gradually move them out of the way as necessary when adding new nodes. This would require me to decide which nodes need to be moved, based on either position in the tree or position on the grid.

I also thought I might go with a hill-climbing algorithm, where I keep swapping nodes at random and keep a swap only if it decreases the number of line intersections. However, I don't much like this solution, because calculating all possible intersections repeatedly could eat up more processing time than I'd like.
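Counting intersections boils down to a lot of segment-pair tests. A standard orientation-based test for whether two segments properly cross would look like this (a sketch; it deliberately ignores collinear overlaps, and segments that merely share an endpoint -- like a parent edge and a child edge -- don't count as crossing):

```python
def crosses(p1, p2, p3, p4):
    # Sign of the cross product: which side of the line a->b point c is on.
    def side(a, b, c):
        v = (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])
        return (v > 0) - (v < 0)
    # Proper crossing: the endpoints of each segment lie strictly on
    # opposite sides of the other segment.
    return (side(p1, p2, p3) * side(p1, p2, p4) < 0 and
            side(p3, p4, p1) * side(p3, p4, p2) < 0)
```

With n edges, that's O(n^2) pairs to check per evaluation, which is exactly the processing-time worry.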

That's about as far as I've gotten. Any thoughts?

Wednesday, March 10, 2010

Adventures in junior programming, episode 2

Thank you, everyone, for all the great suggestions that you submitted to my previous post. As a result of all the recommendations, I decided to go ahead and teach myself Python. I'm sure Python enthusiasts will not be surprised to learn that I've found it... almost embarrassingly easy to work with. For instance, the Java program I listed:

class Hello {
    public static void main(String[] args) {
        System.out.println("Hello world");
    }
}

And the Python equivalent:

print "Hello world"

Not even so much as a semicolon is required in this one, because Python actually enforces proper indentation and uses it to recognize code blocks. Since last week, I've checked out a book from the library and read much of the online tutorial, and implemented a simple object-oriented puzzle game that I used to assign to my C++ students. A special "Thank you" goes to Shawn, who volunteered to be my Python guru and code reviewer all week.

So the real question is, how would Ben take to it? I sought the answer to that question over some wings at Plucker's last night. (Before I get to the coding part, let it be known that Ben actually tried one of the hot ones, and pronounced it good. Then he stuck to the mild and drained two root beers.) We did some stuff before the food arrived, and after that we moved to a Starbucks inside a Barnes & Noble and got hot chocolate.

Python works pretty well as a first time language for a few reasons. It is an interpreted language, so you can run it line by line. To facilitate this, there is a command line interface. Type print "hello world" at the prompt and it gives immediate feedback. In fact, it's not even necessary to type "print", as the CLI will assume that if you type an expression without context, you want to see the value of that expression. Type 2+2 and the interpreter dutifully spits out 4. I got to work in some easy math lessons first by making him guess what the result was going to be as I typed various expressions.

After that I re-introduced him to variables. name = "Ben" is all you need to declare a string, and then you type name at the prompt and the interpreter returns 'Ben'. I pointed out the difference between typing name and "name", as one would print "Ben" and the other would print the string "name."

So next, input. By now we were getting to the point where the command line would just get in the way, so I started a text file and typed a two-line program that says "Enter your name: "; you type something and then it says "Hello, [name]". Ben likes to try to confound the computer, so he banged out a long gibberish string, and then thought it was hilarious when the computer treated that as a name too. Lesson: computers aren't smart. They can only do what you tell them.

I demonstrated a simple "if" statement by writing a five line program where you are supposed to guess the secret number. (It was 17. No hints, just a magic number in the code. Maybe I'll work on the "higher"/"lower" binary search game next time.)
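For reference, the guessing program's logic fits in a handful of lines. This sketch wraps it in a function of my own invention so it can run without a console prompt (the real version read the guess interactively):

```python
SECRET = 17  # no hints, just a magic number in the code

def check_guess(guess):
    # the whole program hangs on one "if" statement
    if guess == SECRET:
        return "You got it!"
    else:
        return "Nope, try again."
```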

At this point, I explained, you know all the basic things that a computer program can do. Input, output, and logic. The rest is all variations on that from now on.

We did some more math, and then Ben said he wanted the computer to print the number googol. He was starting to type in all those zeroes, when I said "Wait! Don't count the zeroes!" Then I got to make a "for" loop, which starts with the number one and multiplies it by ten a hundred times.

That managed to impress him, but he was surprised that a hundred zeroes didn't take up more screen space. I said "Ah, but here's the great thing! Instead of 100 zeroes, can you change one thing in the program to make it a thousand?" After some thinking, he got it.
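The googol loop looked something like this (reconstructed from memory of the lesson; the variable names are my guesses). The one thing to change for a thousand zeroes is the 100:

```python
# start with one, then multiply by ten a hundred times: a googol
number = 1
for i in range(100):
    number = number * 10
print(number)  # a 1 followed by 100 zeroes
```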

But then he insisted on upping the number to a million zeroes, which didn't work so well. The loop was too long; the computer got stuck calculating the number and didn't print anything. Lesson: computers are fast, but they don't have infinite speed. You have to write your programs so they don't get stuck.

I then showed how we would modify the loop to just print sequential numbers on each iteration. I was going to show how the computer can write progressive doubles, but at that point my laptop's battery ran out and we couldn't find a free plug in the entire bookstore, so we browsed some books and called it a night. Total time on programming was a bit less than two hours.

Ben's into it. We'll see how it goes next time I get a chance at him.

Wednesday, March 3, 2010

Adventures in junior programming, episode 1

In my last post I mentioned that I had been writing an instructional applet for my son to learn some math concepts. I received this half-joking reply:

If you truly loved your son, you would have made HIM program that applet! He's seven - that's old enough for Java, at the very least.

I actually think seven is a little young to get started, but as I thought about it, I decided, heck, why not? I picked up some extra money in college tutoring a thirteen year old in C++, and later taught classes for half a year in Fort Worth. It's been over ten years since I taught any actual classes, but is it all that unreasonable to try and get the essential concepts of Java across to a seven year old?

I was already six when my dad bought our first computer, and it wasn't for a few more years that I started reading and understanding the BASIC programs that were in vogue back then. But Ben, like many kids of his generation, was introduced to computers at an early age, and has been comfortable using them for most of his life. He's already seen what the inside of a computer looks like when I swapped out components, and I've even gone over some explanations of binary numbers with him, including the "counting on your fingers" trick.

I have done two lessons so far, first on paper and last night on a laptop. Since I've done only sporadic posts on this blog recently, I think it's not a bad idea to chronicle some of this effort. One thing I discovered while teaching classes is that there is no better way for you to solidify fundamental concepts in your own mind than to try to explain it to somebody else. There is nothing like having to break down assumptions that are second nature to you but completely foreign to another person.

So the first thing I've discovered is: Java is kind of hard.

Programmers traditionally write "Hello world" as their first program, so let's take a look at some ways that this is handled in various languages.

BASIC:

10 print "Hello world"

Nobody learns BASIC anymore, but it was a standard on those old PC 8086's I had as a kid. In fact, if you booted up the computer without inserting a floppy disk, the computer (which lacked a hard drive, and hence lacked a useful operating system in memory) would just load up a BASIC interpreter environment directly from the BIOS. It was designed to be a beginner language.

Next let's look at Perl, which is known as a scripting language that is powerful at handling string manipulation:

print "Hello world";

Even simpler, superficially, because Perl is a procedural language which does not use line numbers. Once you start getting into control flow, BASIC only keeps looking simple through heavy use of the "goto" statement. But as everyone should know, "goto" sucks.

Next we'll look at C, which is what I taught for a semester before moving my classes to C++.

#include <stdio.h>

main() {
    printf("Hello world");
}

Slightly trickier. In C, nothing runs unless it is directly or indirectly invoked by a "main" function. Thus, unlike in the other two languages, before you can run even the most trivial of programs, you have to have at least a passing understanding of what a function is.

There's also the rather cryptic line "#include <stdio.h>", which contains additional wacky symbols that have to be either explained or hand-waved away. When I introduced students to this language, I initially just gave them a complete "hello world" program and said "Focus on what's inside the braces, you'll understand what they mean later." Students love to complain about this sort of thing, as they tend to view unexplained stuff as unnecessary and annoying.

Now here's the Java program.

class Hello {
    public static void main(String[] args) {
        System.out.println("Hello world");
    }
}

Seriously now, those are a LOT of details to expect a pure novice to absorb, even if he were older. Let me count the unfamiliar concepts:

  1. "class Hello": Right from the get-go, all Java programs enforce object-oriented structure. Objects and classes are a beast to understand -- I first heard the concept as a senior in high school, failed to self-teach it, and then didn't really get it until my first college course in C++. Now let's explain the difference between a class and an instantiating object to a kid who's going on eight.
  2. "main": Just like in the C example, you're not going to get anywhere until you understand methods (aka functions inside a class). I had to make an analogy to nouns and verbs in sentences, since Ben has played Mad Libs before. Even so, it's all pretty abstract, and the word "main" does not suggest an action word at all.
  3. "(...)": Now that we know that "main" is a verb, it's time to learn that all verbs in Java must be followed up with parentheses, which may or may not contain some arguments.
  4. "void": Some methods return an object type and others don't... you know what? I'll tell you later.
  5. "public": So here we encounter the notion of public and private methods, which is intended to separate interface from implementation. Simple, right?
  6. "static": So yeah, this function can be called generically through the class itself, and does not need to be applied to a particular instance of the class. And by the way, all methods called directly from "main" must also be called "static," and the only way to simplify your calls is if you declare a second class and make an instance object.
  7. "String ... args": It's an object declaration. A string is a sequence of characters. Your program won't recognize the "main" method unless that String argument is in there, even though we have no intention of reading the command line arguments yet...
  8. "[]": ...but this string is not just any old single string, it's actually an array of strings, which means there can be any number of them sequentially, see?
  9. "System.out.println": These are not even a proper part of the language definition. They are objects that are accessed by loading in the standard Java libraries, and they actually have a bunch of additional code written to accomplish their task somehow, but you will never see that code. You just have to accept that it will work as a "black box" that does what you expect. And then there's the idea that "System" contains an object called "out" which has a method called "println", where even the last word is not as easy to understand as plain old "print".
  10. All the syntax: This is probably second nature to every programmer by now, but just try to think about how you would explain and reinforce the idea of all the funny characters and when to use them. Quotation marks must signify string literals. Parentheses are required for both declaring and calling methods. Semicolons must end all statements, but NOT the method declaration (I don't know for sure if he ever saw a semicolon before). And, again, what the dots mean in between "System", "out", and "println".

That's quite a lot to bite off just to write two words. And by the way, when I try to get into inputting text, the proper format will be something like this:

String name = "";
InputStreamReader input = new InputStreamReader(System.in);
BufferedReader reader = new BufferedReader(input);
try {
    name = reader.readLine();
} catch (Exception e) {}

Compare that to

INPUT "Input name: ", name$

in BASIC, and

$name = <>;

in Perl.

Nevertheless, we heroically slogged through that stuff, discussing some parts in detail and skipping over other parts. Then we finally got to String variables.

No input yet, and no string manipulation or concatenation. Just
String name = "Ben";

followed by

System.out.print("Hello ");
System.out.print(name);

I had a tough time getting him to see that the second print statement would print "Ben" rather than "name," and why.

As far as I can tell, I still have his attention and he wants to keep at it. He would like to write a graphical game, but I told him he has a lot to learn about text-only programs before we can start to cover that. I know a lot about graphics, but much of it requires knowledge of not only Cartesian coordinates but also trigonometry. Believe me, I'm not touching trig.

A couple of open questions for readers:

  1. Was it a mistake to start in Java?
  2. If not, why not?
  3. If you were being introduced to programming for the very first time, which language would you want to be taught first?
  4. Crazy thought: Ben has a bunch of favorite sites stuffed with Flash games. He even said that he wished he could make one someday, which prompted me to say that it's too advanced for now. I have never learned Flash myself. Is it easy? Is it free? Would it be better or worse than Java as a "first time" language?

Thursday, February 11, 2010

Angle math

I've got a new applet posted on my web page, which I wrote for my son Ben. He is seven. It is a demonstration of how angles work. You can drag the points of the triangle around, and it will continuously update the display of numbers showing what angle is formed at each point. It also gives a little readout showing that the three angles will, indeed, always add up to 180 degrees.

I like to do a little graphical application every once in a while just to stay in practice. There are a lot of concepts from trigonometry that have to be applied. Debugging graphics is sometimes a tricky affair, because if you don't do the right thing then you might wind up with nothing but a blank screen, or a line might appear wildly out of place. Finding the angles required remembering what sines and cotangents and such represent. (SOH CAH TOA! That's one that never leaves your memory, but I got sin and asin mixed up for a while.)

I also had to fudge the numbers a little bit. For instance, one angle might be 70.14 and another might be 50.34. The third angle should then be 59.52. However, if you round those to one decimal place, you find that 70.1 + 50.3 + 59.5 = 179.9. So I had to fake the third angle (a3) as displaying 180 - a1 - a2.
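A few lines make the problem and the fudge concrete (hypothetical display code of my own, using the angles from the example):

```python
a1, a2 = 70.14, 50.34
a3 = 180 - a1 - a2  # 59.52, the true third angle

# rounding each angle independently: the displayed sum comes out to 179.9
naive_sum = round(a1, 1) + round(a2, 1) + round(a3, 1)

# the fudge: display the third angle as 180 minus the two displayed angles
d1, d2 = round(a1, 1), round(a2, 1)
d3 = round(180 - d1 - d2, 1)  # shows 59.6 instead of 59.5, so the sum reads 180
```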

The hardest challenge came after I decided to change "nearly" right angles into true right angles. Notice that if you drag one corner so that it forms an angle between 80 and 100 degrees, it will automatically snap to the correct position so that it is 90 degrees. I had to put some thought into making that work, and here's the solution I wound up with.

  1. Write an expression of the line segment opposite the point being moved.
  2. Find a vector perpendicular to that line.
  3. Project the moved point along that vector to find where it intersects the opposite segment.
  4. Find the actual distance that the point must be from the segment in order to make a 90 degree angle.
  5. Move the point there.
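Here is a rough reconstruction of those five steps (my own sketch, not the applet's actual source). It leans on the geometric-mean relation: if F is the foot of the perpendicular from the moved point A to segment BC, the angle at A is exactly 90 degrees when |AF|² = |BF|·|FC|. The sketch assumes the foot lands inside the segment, which holds for angles near 90 degrees:

```python
import math

def snap_right_angle(a, b, c):
    # Move point a perpendicular to segment b-c so that the angle at a is 90 degrees.
    dx, dy = c[0] - b[0], c[1] - b[1]
    seg_len = math.hypot(dx, dy)
    # Steps 1 and 3: project a onto the line through b and c to find the foot f.
    t = ((a[0] - b[0]) * dx + (a[1] - b[1]) * dy) / (seg_len * seg_len)
    fx, fy = b[0] + t * dx, b[1] + t * dy
    # Step 4: required height above the segment, |AF| = sqrt(|BF| * |FC|).
    h = math.sqrt(t * seg_len * (1 - t) * seg_len)
    # Step 2: unit vector perpendicular to b-c, flipped to a's side of the line.
    px, py = -dy / seg_len, dx / seg_len
    if (a[0] - fx) * px + (a[1] - fy) * py < 0:
        px, py = -px, -py
    # Step 5: move the point there.
    return (fx + h * px, fy + h * py)
```

For example, snapping a = (1, 2) against the segment from (0, 0) to (4, 0) lands at (1, √3), where the lines from the moved point to the two endpoints are exactly perpendicular.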

I had originally used only a "Point" class and a "Triangle" class to represent the problem space; I soon realized I needed a "Vector" class as well (one which could offset a point, or be normalized, reversed, or made perpendicular to another vector). Then I had to make yet another class, "Line," in order to properly calculate where intersections can be found. I borrowed a lot from this page, and found out just how long it had been since I needed to do that math -- I had a totally incorrect idea of how a line formula is expressed.

Basically the key to correcting your math mistakes -- and I made a lot! -- is to create either a printout or a visual representation at every step. For instance: I think I've normalized the vector correctly, better print out the coordinates and make sure the length is really 1. I need to draw an extra line to make sure it really goes through this point. I need to draw an extra point to make sure it really intersects that line. Do these two lines LOOK perpendicular to me? And so on.
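Those checks can be written as plain assertions so they run on every frame instead of relying on eyeballing (a hypothetical example, not the original code):

```python
import math

# "I think I've normalized the vector correctly" -- verify the length is 1.
vx, vy = 0.6, 0.8
assert abs(math.hypot(vx, vy) - 1.0) < 1e-9, "vector is not unit length"

# "Do these two lines LOOK perpendicular?" -- check the dot product instead.
ux, uy = -0.8, 0.6
assert abs(vx * ux + vy * uy) < 1e-9, "vectors are not perpendicular"
```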

If I had remembered the math better, some of this rigor wouldn't have been necessary. But really, caution and constant testing while coding eliminate the need to be a perfect math whiz. Coding is both theoretical and experimental, you see.

Monday, January 25, 2010

The Chess Master And The Computer

Garry Kasparov On Chess Metaphors

Thirteen years after Deep Blue beat him at chess, Garry Kasparov has written a long, thoughtful article about humanity's search for a meaning behind the event. He manages to cram in quite a lot of insight about both artificial intelligence in general and the way the game of chess has changed among human players. As someone with a small fondness for chess and a large fondness for AI, I enjoyed it a lot.

There have been many unintended consequences, both positive and negative, of the rapid proliferation of powerful chess software. Kids love computers and take to them naturally, so it's no surprise that the same is true of the combination of chess and computers. With the introduction of super-powerful software it became possible for a youngster to have a top-level opponent at home instead of needing a professional trainer from an early age. Countries with little by way of chess tradition and few available coaches can now produce prodigies. I am in fact coaching one of them this year, nineteen-year-old Magnus Carlsen, from Norway, where relatively little chess is played.

The heavy use of computer analysis has pushed the game itself in new directions. The machine doesn't care about style or patterns or hundreds of years of established theory. It counts up the values of the chess pieces, analyzes a few billion moves, and counts them up again. (A computer translates each piece and each positional factor into a value in order to reduce the game to numbers it can crunch.) It is entirely free of prejudice and doctrine and this has contributed to the development of players who are almost as free of dogma as the machines with which they train. Increasingly, a move isn't good or bad because it looks that way or because it hasn't been done that way before. It's simply good if it works and bad if it doesn't. Although we still require a strong measure of intuition and logic to play well, humans today are starting to play more like computers.

Monday, January 11, 2010

More on gambling and random seeds

I always appreciate it when people write in with questions about something I've posted before. Gives me an excuse to keep this blog at least somewhat active.

Joe in Illinois writes:

Enjoyed reading your article on google regarding how to set a seed and randomness. Would different seeds contain different overall results. For example, some people would argue that a simple Jacks or Better video poker game returning 99.54% (in the long run, whatever that is) is still not purely random because the maker must still set this % ( over time), So would/could some seeds produce more winning combinations, maybe with fewer overall winning numbers in the seed. Players would not recognize one seed starting or ending. The next seed would/could have say, only 16% of winning combinations. Some people are overly concerned with this RNG, I say it's just doing its job, running constantly. But I also think that the player should hope for a positive seed, along with some luck and knowledge.

The short answer is, yes, different random seeds would lead to better or worse luck. But in the long run, with a good random algorithm, it wouldn't matter to you. Hoping for a "good" string of numbers coming from the RNG makes neither more nor less sense than hoping for good luck when you sit down at a physical card table.

Look at it this way. If you take a video recording of your session at a poker game, sometimes you'll obviously draw more good hands than bad hands. If you took different videos on successive days and later compared the tapes, you could say, "Oh look: on day one I had a lucky streak, and on day two I had an unlucky streak." But that's just the way random numbers actually behave: it's a rare string of randoms that doesn't show signs of patterns that appear meaningful but aren't.

Since the nature of a correctly designed random number generator is to simulate real random numbers, of course an RNG will create those same apparent patterns. But as long as there is no way for you to reverse engineer the algorithm or figure out what the seed actually was, hoping for a certain pattern is not much more than superstition.

Another interesting property of probability is that the larger your sample set, the more closely it is likely to conform to the expected distribution. In other words, if you only play three games in a row, it's not entirely unlikely that you could hit a lucky streak and win all three. Of course, it's slightly more likely that you would lose all three; but still, if you believe in luck, betting a lot on a small number of hands makes some sense. The longer you play, though, the less the game's outcome will look like a sequence of wild streaks, and the more it will look like a typical bell curve distribution.
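A toy simulation makes the point concrete. This is not real video poker -- just a made-up even-money game that wins 49.77% of the time, loosely mirroring a 99.54% return -- with different seeds standing in for different sessions:

```python
import random

def session_win_rate(n_hands, seed):
    """Fraction of hands won in one seeded session of the toy game."""
    rng = random.Random(seed)
    wins = sum(1 for _ in range(n_hands) if rng.random() < 0.4977)
    return wins / n_hands

# Three hands swing wildly from seed to seed; fifty thousand hands
# land close to the expected rate regardless of the seed.
for seed in (1, 2, 3):
    print(f"seed {seed}: 3 hands -> {session_win_rate(3, seed):.2f}, "
          f"50000 hands -> {session_win_rate(50000, seed):.4f}")
```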

Thus, if you play 5000 hands of video poker, you are almost guaranteed to gradually lose money at the expected rate, as very lucky or very unlucky streaks will tend to cancel each other out.