Even and Odd

When I’m learning a new language, there are a few basic things I like to do, such as playing with arrays and iterators. A common practice for me is to filter or map items based on whether they are even or odd.

Today I decided to mix it up a bit and see if some computer theory and mathematics still hold true. Normally I just throw the good ol’ modulus operator at the algorithm and call it good. You’ve probably done it before. Whatever the language, it probably looked something like this pseudo-code:

if the_number % 2 == 0
  return true // Assume true represents even numbers here.
else
  return false
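
In Swift, for example, a literal version of that check might look like the quick sketch below (isEven is just a name I’m using for illustration):

// Straight translation of the pseudo-code above: even when the remainder is zero.
func isEven(_ number: Int) -> Bool {
    return number % 2 == 0
}

isEven(4) // true
isEven(7) // false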

It’s fast enough. It’s normal. It’s familiar. Either there’s a remainder, or there’s not. Now for the juicing…

Doing it differently with bits

My brain said to myself, “self, all you really care about is the least significant bit. Stop wasting CPU cycles on division.” At first it seemed like a silly thing to even be thinking. This was, after all, a very trivial example. Still, I couldn’t help myself. I like optimizing things.

With the exception of a few languages on the outskirts (I’m looking at you, JavaScript), bit-wise operations tend to be faster than their other mathematical counterparts, especially multiplication and division. And it held true for today’s test: checking whether a number is odd or even with a number & 1 comparison.

Instead of looking for a remainder, all we do is check if the “one” bit is, um, one. If not, the number is even. Simple. Fast. Faster than number % 2 (usually).
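
To make that concrete, here’s a tiny Swift sketch using binary literals (the names are just for illustration):

// The least significant bit is all we actually care about.
let six = 0b110      // 6 in binary: low bit is 0, so it's even
let seven = 0b111    // 7 in binary: low bit is 1, so it's odd

print(six & 1)   // 0 -> even
print(seven & 1) // 1 -> odd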

The Pudding (where the proof is)

This all came about because I decided to start learning Swift. Pretty much just because I feel like it. I wrote two very basic functions. One using the bit operation, the other using modulus.

Here’s an excerpt from the code:

// Map each number in 1...array_size to "odd" or "even" by checking
// the least significant bit with a bitwise AND.
func even_or_odd(array_size: Int = 10) -> [String] {
    let num_range = 1...array_size
    return num_range.map({(number: Int) -> String in
        return (number & 1) == 1 ? "odd" : "even"
    })
}

// The same mapping, but using the modulus operator instead.
func even_or_odd_2(array_size: Int = 10) -> [String] {
    let num_range = 1...array_size
    return num_range.map({(number: Int) -> String in
        return (number % 2) == 1 ? "odd" : "even"
    })
}

var the_result = even_or_odd(array_size: 400000000)
var the_result2 = even_or_odd_2(array_size: 400000000)

Even if you don’t know Swift, I’m sure you can read what is happening here.

First, a range of four hundred million sequential integers is created. I chose the rather large size to get timings on a human scale: seconds instead of milliseconds. Each value is then mapped to “odd” or “even” based on its parity.

I threw this into some instrumentation and found the modulus version consistently accounted for around 60% of the combined runtime of the two functions. Everyone likes pictures, so here is a fancy screenshot of the simple time instrumentation:

Screenshot of profiler timings.
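
If you’d rather skip the profiler and just eyeball wall-clock numbers, a rough sketch like the one below would do. It uses DispatchTime and assumes the two functions from the excerpt above; it’s a quick comparison, not a rigorous benchmark.

import Dispatch

// Crude wall-clock timer: run the closure once and print the elapsed seconds.
func time(_ label: String, _ work: () -> Void) {
    let start = DispatchTime.now()
    work()
    let elapsed = Double(DispatchTime.now().uptimeNanoseconds - start.uptimeNanoseconds) / 1_000_000_000
    print("\(label): \(elapsed)s")
}

time("bitwise") { _ = even_or_odd(array_size: 400000000) }
time("modulus") { _ = even_or_odd_2(array_size: 400000000) }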

Why?

Things like this have their place. It isn’t something we have to think about for every situation. If you’re working with small amounts of information, this isn’t going to be a big win.

But if you’re running massive numbers of transformations and looking to squeeze every last cycle out of your CPU, consider the small things. Like bit-wise operations.

Yes, there are always exceptions. So play with stuff.
Experiment. Instrument. Tweak. Repeat.