Wednesday, December 19, 2007

A Tale About a Mistake

While working on a computer program a little while ago, I made a mistake. That's not unusual; mistakes are part and parcel of a programmer's job. There are so many layers and so many details that we have to juggle that mistakes become statistically inevitable. However, the one I refer to falls into the category of things I should have known better than to do.

The problem was simple enough: I needed to read in a series of dollar values as text strings and convert them to text strings representing their value in cents. The solution I initially chose was to convert the input string to a floating-point number, multiply by 100, convert the result to an integer, and then convert the integer to the final string. And yes, if there are any programmers in the audience, I am terribly embarrassed that I actually did this... If you aren't a programmer, being confused at this point is OK, because it looks like the solution should work. What actually happens when I attempt to convert 0.29 using the above method is a result of 28 instead of the 29 I expected. Yes, I had somehow forgotten that computers can't really do math.
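The post doesn't show the original code, but a minimal sketch of the flawed approach might look something like this in Python (the language and the function name are my assumptions; only the steps themselves come from the description above):

    # A sketch of the flawed float-based conversion described above.
    # Steps: string -> float -> multiply by 100 -> int -> string.
    def dollars_to_cents_broken(dollars):
        value = float(dollars)     # "0.29" becomes roughly 0.29, not exactly
        cents = int(value * 100)   # 28.999999999999996 truncates to 28
        return str(cents)

    print(dollars_to_cents_broken("0.29"))   # prints "28", not "29"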

Computers can only be precise when they are working with integers (and then only in a certain range). Most decimal fractions, 0.29 among them, have no exact binary representation, so the steps where I converted the input to a floating-point value and then back to an integer introduced a small rounding error. A little bit of trivia: programs that handle money correctly represent it as an integer value, for instance by tracking cents or tenths of a cent. The example above is the reason why.
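To make that concrete, here is a small illustration (again in Python, which is an assumption on my part) of where the missing cent went, and of why integer cents sidestep the problem:

    # 0.29 cannot be stored exactly in binary floating point; the stored
    # value is a hair below 0.29, which is where the cent disappeared.
    print("%.17f" % 0.29)       # 0.28999999999999998

    # The classic illustration of the same effect:
    print(0.1 + 0.2 == 0.3)     # False -- the values differ in the last bits

    # Money tracked as integer cents stays exact.
    subtotal_cents = 29 + 71
    print(subtotal_cents)       # 100, no rounding anywhere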

My first instinct was to change the numeric conversion to a semi-complicated affair whereby I read each digit individually and use them one at a time to construct the proper number. That turned out to be a pretty silly idea too, but at least it was an accurate one. I even had it implemented and tested before I had the typical smack-forehead moment. What I really needed was to change my approach. Instead of using math, I used a touch more common sense. What's the difference between a dollar written as 1.00 and one hundred cents written as 100? Yep, that little decimal point. All the inaccurate or overly complicated math gave way to a simple string manipulation to remove the decimal point. Voilà, a simple, precise conversion from dollars to cents.
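The post doesn't include the final code either, but the string-based fix could be sketched roughly like this, assuming (as the examples above suggest) that every input has exactly two digits after the decimal point; the function name and the leading-zero handling are mine, not the post's:

    # A sketch of the string-manipulation fix: drop the decimal point
    # instead of doing any arithmetic at all.
    def dollars_to_cents(dollars):
        cents = dollars.replace(".", "")   # "1.00" -> "100", "0.29" -> "029"
        return cents.lstrip("0") or "0"    # trim leading zeros; keep "0" for "0.00"

    print(dollars_to_cents("1.00"))   # 100
    print(dollars_to_cents("0.29"))   # 29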

The wonderful, thorough people in quality assurance found my mistake before it could cause any damage. I diagnosed and corrected the root cause quickly once my attention was focused on the symptoms. But the situation serves as a good reminder that we should question what we are taking for granted. The lesson works on a grander scale too, especially this time of year.
