Big O notation and algorithm complexity analysis give us a formal way to measure how fast a program or algorithm runs.
Complexity analysis also tells us how fast an algorithm will run for a given input and, more importantly, how its running time behaves as the input grows larger.
How to do that?
One option is a profiler: a tool that measures the run time of a program in milliseconds and helps us optimize the code by identifying bottlenecks. But timings depend on the machine; complexity analysis instead describes how the running time grows with input size, independent of any particular hardware.
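As a minimal sketch, a poor man's profiler can be built with Python's `time.perf_counter`; the function names here (`time_it`, `linear_sum`) are illustrative, not from any particular library:

```python
import time

def linear_sum(n):
    # A simple O(n) workload: sum the integers 0..n-1 with a loop.
    total = 0
    for i in range(n):
        total += i
    return total

def time_it(func, *args):
    # Measure the wall-clock run time of one call, in milliseconds.
    start = time.perf_counter()
    result = func(*args)
    elapsed_ms = (time.perf_counter() - start) * 1000
    return result, elapsed_ms

result, ms = time_it(linear_sum, 100_000)
```

Note that such timings vary from run to run and machine to machine, which is exactly why the asymptotic view below is useful.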
Asymptotic Behaviour
Dropping all the slower-growing terms (and constant factors) from an algorithm's cost function, keeping only the fastest-growing term, is what we call its asymptotic behavior.
The following examples show asymptotic behavior: we drop the constant terms and keep only the term that grows fastest.
- f( n ) = 5n + 12 gives f( n ) = n.
- f( n ) = 109 gives f( n ) = 1.
We're dropping the multiplier 109 * 1, but we still have to put a 1 here to indicate that this function has a non-zero value.
- f( n ) = n² + 3n + 112 gives f( n ) = n².
Here, n² grows larger than 3n for sufficiently large n, so we keep only that term.
- f( n ) = n³ + 1999n + 1337 gives f( n ) = n³.
- f( n ) = n + √n gives f( n ) = n.
- Any program that doesn't have any loops will have f( n ) = 1, since the number of instructions it needs is just a constant (unless it uses recursion).
- Any program with a single loop that goes from 1 to n will have f( n ) = n, since it does a constant number of instructions before the loop, a constant number after it, and a constant number within the loop body, which runs n times.
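The single-loop rule can be checked by counting operations directly. A rough sketch, with made-up constants standing in for the "constant number of instructions":

```python
def count_single_loop(n):
    # Count the "instructions" a single 1..n loop performs.
    ops = 2            # constant setup before the loop
    for _ in range(n):
        ops += 1       # constant work inside the loop body
    ops += 2           # constant work after the loop
    # Total: 2 + n + 2 = n + 4 operations, asymptotically just n.
    return ops
```

Dropping the constant 4 leaves f( n ) = n, exactly as the rule above states.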
Key Points to Note about Asymptotic Behavior-
1. Any program that doesn't have any loops will have f(n) = 1.
2. Any program with a single loop that goes from 1 to n will have f(n) = n.
3. Two loops nested within each other give asymptotic behavior f(n) = n².
4. A loop within a loop within a loop yields f(n) = n³.
5. Given a series of sequential for loops, the slowest of them determines the asymptotic behavior of the program. For example, two nested loops followed by a single loop are asymptotically the same as the nested loops alone, because the nested loops dominate the simple loop: f(n) = n².
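Rules 3 and 5 can be illustrated with a small counting sketch (the function names are my own, for illustration only):

```python
def count_nested_loops(n):
    # Two loops nested within each other, each from 1 to n:
    # the inner body runs n * n times.
    ops = 0
    for _ in range(n):
        for _ in range(n):
            ops += 1
    return ops

def count_nested_then_single(n):
    # Two nested loops followed by a single loop: n*n + n operations.
    # Asymptotically this is still n², since n² dominates n (rule 5).
    ops = count_nested_loops(n)
    for _ in range(n):
        ops += 1
    return ops
```

For n = 10 the second function counts 110 operations, and the extra 10 from the single loop becomes negligible as n grows.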
Θ Notation
- When we've figured out the exact such f asymptotically, we say that our program is Θ( f( n ) ), e.g. Θ(1), Θ(n), Θ(n²), etc.
- We call this function, i.e. the f in Θ( f( n ) ), the complexity of our algorithm.
- A program with a bigger Θ runs slower than a program with a smaller Θ, for sufficiently large inputs.
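To see why a bigger complexity class means slower for large inputs, we can tabulate a few common growth functions. A quick sketch (the helper name `growth_table` is made up):

```python
import math

def growth_table(ns):
    # For each input size n, evaluate the common complexity functions
    # so their relative growth can be compared side by side.
    rows = []
    for n in ns:
        rows.append({
            "n": n,
            "log n": round(math.log2(n), 1),
            "n log n": round(n * math.log2(n)),
            "n^2": n * n,
        })
    return rows
```

For n = 1024, log n is only 10 while n² is over a million, which is why a Θ(log n) algorithm vastly outperforms a Θ(n²) one on large inputs.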
- Complexity of binary search- O(log n)
- Complexity of merge sort- O(n log n)
- Complexity of selection sort- O(n²)
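As an illustration of the first entry, a standard iterative binary search halves the search range on every step, so it needs about log₂(n) steps. A sketch, with a step counter added purely for demonstration:

```python
def binary_search(arr, target):
    # Search a sorted list; return (index or -1, number of halving steps).
    lo, hi = 0, len(arr) - 1
    steps = 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid, steps
        elif arr[mid] < target:
            lo = mid + 1   # discard the left half
        else:
            hi = mid - 1   # discard the right half

    return -1, steps

# On a sorted list of 1024 elements, only about 10 steps are ever
# needed, since 2^10 = 1024 — this is the O(log n) behavior.
```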
Reference-
The material in this post is based on http://discrete.gr/complexity/. Visit it for an explanation of how to compute the complexity of popular search and sorting algorithms.