## Complexity of Algorithm

• The complexity of an algorithm M is the function f(n) that gives the running time and/or storage space requirement of the algorithm in terms of the size n of the input data.
• In most cases, the storage space required by an algorithm is simply a multiple of the data size n.
• Unless stated otherwise, "complexity" means the running time (i.e., the time complexity) of the algorithm.
• The function f(n), which gives the running time of an algorithm, depends not only on the size n of the input data but also on the particular input.
• Algorithms are compared on the basis of their time complexity and space complexity.
• An algorithm with lower time complexity and lower memory requirement is considered the better algorithm.
• The performance analysis of an algorithm therefore depends mainly on two factors: space complexity and time complexity.

#### Types of Complexity

(1)

##### (A) Space Complexity
• The space complexity of a program is the (maximum) amount of memory space required by the program or algorithm to run to completion.
• The space needed by a program has the following components:
• Instruction space: the space needed to store the compiled version of the program instructions. The amount of instruction space needed depends on factors such as:
• The compiler used to compile the program into machine code.
• The compiler options in effect at the time of compilation.
• The target computer.
• Data space: the space needed to store all constant and variable values. Data space has two components:
(i) Space needed by constants and simple variables in the program.
(ii) Space needed by dynamically allocated objects such as arrays and class instances.
• Environment stack space: the environment stack is used to save the information needed to resume execution of partially completed functions.
##### (B) Time Complexity
• The time complexity of a program is the amount of computer time required by an algorithm to run to completion.
• The number of machine instructions that a program executes during its run is called its time complexity. This number depends primarily on the size of the program's input.
• Time complexity is measured by counting the number of key operations involved in the process; for example, in sorting and searching algorithms, the number of comparisons is the number of key operations.
• The time complexity of an algorithm is calculated from the number (count) of steps taken by the algorithm to compute the function it was written for or to complete its task. The number of steps is itself a function of the instance characteristics.
• The limiting behavior of the complexity as the input size increases is called the asymptotic time complexity. It is the asymptotic complexity of an algorithm that ultimately determines the size of the problems that can be solved by the algorithm.
• The time tp taken by a program P is the sum of its compile time and its run (execution) time. The compile time does not depend on the instance characteristics (number of inputs, number of outputs, magnitude of inputs, magnitude of outputs, etc.), so we are concerned only with the running time of a program.
• Calculation of time complexity:
• To calculate the time complexity of a program, we measure the cost of every step in the program and then sum them to obtain the overall time complexity.
• Only the run time is considered.
• The time required by an algorithm is determined by the number of elementary operations it performs.
• The following operations are conventionally counted when calculating the running time of a program:
• Assigning a value to a variable
• Calling a function
• Performing an arithmetic operation
• Comparing two variables
• Indexing into an array or following a pointer reference
• Returning from a function
• For example:

To determine the running time of the program, we have to count the number of statements that are executed in the procedure.

```c
int sum(int n)
{
    int i, sum;
    sum = 0;                    // Add 1 to the time count.
    for (i = 1; i <= n; i++)    // Add (n+1) to the time count.
    {
        sum = sum + i * i;      // Add n to the time count.
    }
    return sum;                 // Add 1 to the time count.
}
```
According to the above conventions, the time complexity is calculated as follows: the statement [sum=0;] executes 1 time, the loop header [for(i=1; i<=n; i++)] executes n+1 times (the final test terminates the loop), the statement [sum=sum+i*i;] executes n times, and [return sum;] executes 1 time. Hence, the total step count of the above program is
1 + (n+1) + n + 1 = 2n + 3.
Thus, in terms of O-notation, this function is O(n).

(2) There are, in general, 3 cases in which the complexity function f(n) of an algorithm is analyzed. Thus, the complexity may take three forms –

(A) Best Case (B) Average Case (C) Worst Case

#### (A) Best Case

• In the best case, the algorithm executes the minimum number of steps for a given input size.
• The minimum possible value of f(n) over all possible inputs is called the best case.
• The best case corresponds to the arrangement of the data for which the algorithm performs best.
• To calculate the best-case complexity of a program, we count the minimum number of steps executed to complete the given task.

#### (B) Average Case

• In the average case, the algorithm executes an average number of steps for a given input size.
• It is the value of f(n) that lies between the minimum and the maximum over the possible inputs.
• Generally, the average case means the expected value of f(n).
• To calculate the average-case complexity of a program, we count the average number of steps executed to complete the given task.
• Average-case analysis assumes a certain probability distribution for the input data. One common assumption is that all possible permutations of an input data set are equally likely.
• The average case therefore uses probability theory. Suppose the step counts N1, N2, ……, Nk occur with respective probabilities p1, p2, ……, pk. Then the expectation, or average value, is E = N1p1 + N2p2 + ………… + Nkpk.

#### (C) Worst Case

• In the worst case, the algorithm executes the maximum number of steps for a given input size.
• The maximum possible value of f(n) over all possible inputs is called the worst case.
• To calculate the worst-case complexity of a program, we count the maximum number of steps executed to complete the given task.

Example to explain Best, Average and Worst case:

(1.)

To explain the best, worst, and average cases of an algorithm, consider a linear array A[1…n] containing n elements. Suppose we want to search for a specific element x in A and report its location (index) LOC. The linear search algorithm solves this problem by comparing the given value x, one by one, with each element of A. That is, we compare x with A[1], then A[2], and so on, until we either find a location LOC such that x = A[LOC] or exhaust the array, in which case x is not present.

The complexity of the search algorithm is given by the number C of comparisons between x and the array elements A[K].

Best case: the element x is found at the first position in the array A, that is, x = A[1]. This gives C(n) = 1.

Worst case: the element x is found at the last position in the array A, or x is not present in A at all. In this case, C(n) = n.

Average case: here we assume that the searched element x appears in the array A and is equally likely to occur at any position. The number of comparisons can then be any number from 1 to n, each occurring with probability p = 1/n, so C(n) = (1 + 2 + … + n)/n = (n+1)/2. That is, the average number of comparisons needed to find the location of x is approximately half the number of elements in A. The average case of an algorithm is usually much more complicated to analyze than the worst case; unless otherwise stated or implied, we therefore state the complexity of an algorithm for the worst case.

(2.) Searching for an element x in a sorted array A[1…n] by repeatedly halving the search range (binary search):

Best case: clearly the best case occurs when the first division of the array immediately yields the searched element x, i.e., the middle element equals x. This requires only one step (the minimum possible), so the number of comparisons is C(n) = 1 = O(1).

Worst case: suppose we are searching for a key x that either is not in the array A or is found only after the maximum possible number of halving steps. The worst case therefore occurs when x is found at the last step or not at all. In this case, the number of comparisons is C(n) = O(log n).

Average case: when finding the key x requires neither the minimum (best-case) nor the maximum (worst-case) number of steps, we have the average case; here x is assumed to be present in the array A. In this case as well, C(n) = O(log n).
