Here’s a real-world analogy to better understand how algorithmic complexity is most frequently described. Algorithms, such as those used for sorting an array of numbers, are commonly measured by how long they take to run. Time taken isn’t necessarily the only thing you care about, but it’s a very important proxy for an algorithm’s efficiency. And of course, faster is better.
The measure is often referred to as the Big-O of the algorithm. An algorithm can be constant-time, i.e. twice the amount of work still takes the same amount of time; linear, i.e. twice the amount of work takes twice the amount of time; quadratic, where twice the amount of work may take four times the amount of time; and so on.
Bubble Sort, for example, has a Big-O of n², while Merge Sort has a Big-O of n log(n).
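To see where those two figures come from, here is a minimal Python sketch of both algorithms (one of many possible ways to write them, not a production implementation). Bubble Sort’s two nested loops over the list are the source of the n², while Merge Sort halves the list about log(n) times and does roughly n work at each level:

```python
def bubble_sort(items):
    """O(n^2): two nested passes, so doubling the input roughly quadruples the comparisons."""
    items = list(items)
    n = len(items)
    for i in range(n):
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
    return items


def merge_sort(items):
    """O(n log n): the list is halved about log(n) times, and each level of merging touches all n elements."""
    if len(items) <= 1:
        return list(items)
    mid = len(items) // 2
    left, right = merge_sort(items[:mid]), merge_sort(items[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]  # append whatever is left over


print(bubble_sort([5, 2, 9, 1]))  # [1, 2, 5, 9]
print(merge_sort([5, 2, 9, 1]))   # [1, 2, 5, 9]
```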
To get a feel for this notation through a real-world analogy, let’s go to the city of Squareville. Squareville is mostly flat, grassy land, and you are interested in buying land here to graze your cattle.
In Squareville, they like to keep things simple, so there are just two rules for buying land: every plot is a perfect square, and its size is described by the length of its side.
Next, once you buy the plot of land, you must agree to adhere to the city rules. Say your first plot is a modest 10x10. Here’s what that commits you to:

- Put up a sign with your name at one corner of the plot: 1 sign.
- Fence the entire perimeter: 40 units of fencing.
- Mow all the grass on the plot: 100 square units of lawn.
- Pay the property tax: 3 units for a plot this size.
Now let’s see what happens if you buy another plot of land whose side is 4 times longer, i.e. 40x40. Here’s how your obligations change:

- Sign: still just 1, at one corner.
- Fence: 160 units, 4 times as much, growing in step with the side.
- Mowing: 1600 square units, 16 times as much, growing in step with the area.
- Property tax: 4 units.
As a final example, you buy a plot that is 100x100. Now here’s what you have to do:

- Sign: still just 1.
- Fence: 400 units.
- Mowing: 10000 square units.
- Property tax: 5 units.
If you pause here for a minute, you’ll notice how the change in the side of your plot affects the rate at which each obligation grows: the sign stays the same no matter how big the plot gets, the fence grows in step with the side, the mowing grows with the square of the side, and the property tax grows so slowly it barely changes at all.

To use Big-O terminology, here’s how each of your obligations grows: the sign is O(1), the fence is O(n), the mowing is O(n²) and the property tax is O(log(n)).

Here it is summarized in table form below:
| Side of Plot | Sign (Corner) | Fence (Perimeter) | Mow (Area) | Property Tax |
|---|---|---|---|---|
| 10 | 1 | 40 | 100 | 3 |
| 40 | 1 | 160 | 1600 | 4 |
| 100 | 1 | 400 | 10000 | 5 |
| Rate of growth | O(1) | O(n) | O(n²) | O(log(n)) |
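If you prefer to see the table as code, here is a small Python sketch that recomputes each row. The tax rule used here, one unit per digit of the plot’s area, is an assumption on my part; it simply matches the 3, 4 and 5 shown above:

```python
def obligations(side):
    sign = 1                     # O(1): one sign, regardless of plot size
    fence = 4 * side             # O(n): grows with the perimeter
    mow = side * side            # O(n^2): grows with the area
    tax = len(str(side * side))  # O(log n): assumed rule, one unit per digit of the area
    return sign, fence, mow, tax


for side in (10, 40, 100):
    print(side, obligations(side))
# 10 (1, 40, 100, 3)
# 40 (1, 160, 1600, 4)
# 100 (1, 400, 10000, 5)
```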
Clearly, it would be ideal to have algorithms that are O(1), taking the same amount of time no matter how large the input. Next best would be logarithmic, O(log(n)), where the time grows as the input grows, but very slowly. After that comes linear, O(n), where the time is proportional to the input. The worst of the four is quadratic, O(n²), where the time grows very rapidly as the input grows.
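To get a feel for how far apart these classes really are, here is a tiny sketch that prints a rough step count for each class; the input size of one million is an arbitrary choice, used only for illustration:

```python
import math

n = 1_000_000  # arbitrary input size, for illustration only

print("O(1):     ", 1)                    # one step, always
print("O(log n): ", round(math.log2(n)))  # about 20 steps
print("O(n):     ", n)                    # a million steps
print("O(n^2):   ", n * n)                # a trillion steps
```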
Now you know and can handle Big-O questions as deftly as President Obama did at Google!