Introduction
Concept
More concepts
More on algorithms, built up from scratch, is still to come.
“A simplified way to calculate the running time of an algorithm. The primitive way is to count the number of iterations and instructions along with their constant running time on the machine. But tracking the running time on every machine is impossible. Instead, we can use Asymptotic Notation to eliminate the constants.”
Metaphor
The time taken to do something matters a great deal these days, and so does the amount of resources consumed. There are many statistical ways to calculate how much time or how many resources a job takes, and we need the same kind of measurement for a computer algorithm.
Concept
Consider that your whole algorithm can be represented as a function f(n), where n is the input size. We need to calculate the running time f(n). How can we do this? Take the constant time taken on a particular machine as “c”. Beyond a certain input size, the running time grows in proportion to the input, and that growth is captured by a separate function g(n), so the running time is usually written as a multiple of it, c.g(n). This “c” value is machine dependent and should never be taken into account, so an inequality is created to avoid it.
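To make the counting concrete, here is a minimal sketch (the summation function and its names are my own illustration, not from the text above) that tallies constant-time operations for a simple loop, so that f(n) comes out proportional to n:

def sum_values(values):
    total = 0                  # one constant-time operation
    for v in values:           # the loop body executes n times
        total += v             # one constant-time operation per iteration
    return total               # one constant-time operation

Roughly, f(n) = c.n + d for some machine-dependent constants c and d, so the growth function is g(n) = n once the constants are dropped.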
There are three inequalities possible between f(n) and c.g(n) (note that equality holds only for some values of n, since the running time always grows with the input size and depends on the type of input):

f(n) ≤ c.g(n) for all n beyond some n0
f(n) ≥ c.g(n) for all n beyond some n0
c1.g(n) ≤ f(n) ≤ c2.g(n) for all n beyond some n0
The above notations are called Big-Oh, Big-Omega and Big-Theta respectively. For most algorithms, only the Big-Oh case is asked about. Read more about how to determine this value here: http://rob-bell.net/2009/06/a-beginners-guide-to-big-o-notation/
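As a rough, hedged illustration (my own sketch, not taken from the linked article), the Big-Oh class can often be read off from the loop structure of a function:

def first_element(values):
    # One operation no matter how large the list is: O(1)
    return values[0]

def contains(values, target):
    # At most one pass over the n elements: O(n)
    for v in values:
        if v == target:
            return True
    return False

def has_duplicate(values):
    # Two nested passes over the n elements: O(n^2)
    for i in range(len(values)):
        for j in range(len(values)):
            if i != j and values[i] == values[j]:
                return True
    return False

The constants hidden inside each loop body are ignored; only how the number of operations grows with n decides the class.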
Important points