Sorting Algorithms: Asymptotes

      Back in the day, it took computers forever to perform a calculation, but these days we measure computations in millions per second. So yeah, we don't need to worry about the runtime of an algorithm until we're talking about millions of computations.

      What we do need to worry about is asymptotic behavior. Specifically, what happens when the number of items we need to sort approaches infinity?

      Deep, we know.

      We only really care about the largest-order term as we approach infinity. If we're talking in terms of the variable x (math's favorite letter), then as x grows, any term that grows slower than the biggest power of x matters less and less.
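
      To see that in action, here's a quick sketch (plain Python, with made-up sample values) that prints a linear term next to a quadratic one as x grows:

          # As x grows, the quadratic term leaves the linear term in the dust.
          for x in (10, 1_000, 1_000_000):
              print(f"x = {x:>9,}   x^2 = {x**2:>19,}")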

      Let's talk about it in terms of barrels of monkeys over time. Because…why not? Say we have a barrel of Gibson monkeys, and the number of barrels grows quadratically every hour (as Gibsons do). In comparison, the number of barrels of Blue monkeys only increases by one every hour, probably because those monkeys are too busy playing rhythm guitar in smoky nightclubs.

      In the first couple of hours, the difference between the number of barrels of Gibsons and the number of barrels of Blues isn't that impressive, but once we get to a million hours, barrels of Gibsons will outnumber barrels of Blues by a factor of a million. That many monkeys can cause a lot of mayhem.

      At that point, it won't matter (except to Mom and Dad monkey) if two barrels of monkeys die of old age every hour, because we'll still have almost a trillion barrels.

      In case you're wondering, here's what that difference looks like after a million hours:

      Blues Monkeys' Linear Growth:                  1,000,000
      Gibson Monkeys' Quadratic Growth:              1,000,000,000,000
      Gibson Monkey Growth, Tempered by Death Rate:  999,998,000,000
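
      Want to double-check that math? Here's a tiny Python sketch (assuming the growth rules above: one barrel per hour for the Blues, hours squared for the Gibsons, minus two deaths an hour) that reproduces those numbers:

          hours = 1_000_000

          blues = hours                    # linear: one new barrel every hour
          gibsons = hours ** 2             # quadratic growth, as Gibsons do
          tempered = gibsons - 2 * hours   # minus two barrels dying every hour

          print(f"Blues (linear):       {blues:,}")
          print(f"Gibsons (quadratic):  {gibsons:,}")
          print(f"Gibsons, tempered:    {tempered:,}")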

      As we approach an infinite number of hours, the smaller powers don't matter, because the largest power makes the number of barrels grow so much faster.

      And that's why we can safely approximate the runtime of an algorithm using just its largest term. It's pretty awesome, we know.
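
      To bring it back to sorting: here's one last sketch comparing rough operation counts for an O(n^2) algorithm (think bubble sort) against an O(n log n) one (think merge sort). The counts are ballpark figures, not exact comparison tallies, but the gap is the point:

          import math

          for n in (1_000, 1_000_000):
              quadratic = n ** 2               # roughly what an O(n^2) sort costs
              linearithmic = n * math.log2(n)  # roughly what an O(n log n) sort costs
              print(f"n = {n:>9,}   n^2 = {quadratic:>19,}   n*log2(n) = {linearithmic:>14,.0f}")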