
////////////////////////////////////////////////////////////////////////////////////////////////////////////////

WHAT IS EFFICIENCY?

To evaluate the efficiency of an algorithm, two aspects are used as criteria: TIME and SPACE.

Space complexity is about how much memory the algorithm occupies.

Time complexity is about how fast the algorithm executes.

                   The best algorithm is the one that uses less of both time and memory.

Recently, due to technical advances, the time aspect has become more important than the space aspect.
Generally, analyzing an algorithm means determining its time complexity by counting the number of executions of the commands that make up the algorithm.

BUT

Why should the time complexity be specified?

  • Some computer systems require an upper limit on the execution time of a program.

  • If there are many solutions to a problem, we want to select the fastest one (see the sketch below).
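For instance, here is a minimal sketch (the names sumLoop and sumFormula are illustrative, not from the original post) of two solutions to the same problem, summing the integers 1..n: one repeats its addition n times, the other performs a fixed number of operations no matter how large n is.

/* Two solutions to the same problem (illustrative sketch). */
long sumLoop(long n)
{
    long sum = 0;
    for (long i = 1; i <= n; i++)   /* the addition runs n times */
        sum += i;
    return sum;
}

long sumFormula(long n)
{
    return n * (n + 1) / 2;         /* a constant number of operations */
}

Counting command executions tells us sumFormula is the faster solution for large n, whatever machine it runs on.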

How do we count the time complexity?

  • Count the number of operations

  • Count the number of steps

  • Asymptotic complexity

Generally, algorithm analysis considers only the number of command executions, not the actual execution time, because measured times differ depending on the particular hardware and programming language.

 

And since it is tedious to calculate these exact execution counts every time, asymptotic complexity is used to express time complexity.
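As a warm-up for the counting procedure used later in this post (sumArray is an illustrative name, not from the original), here is a single loop with the execution count of each command written beside it:

int sumArray(int *a, int n)
{
    int sum = 0;                    /* 1      */
    for (int i = 0; i < n; i++)     /* n + 1  */
        sum += a[i];                /* n      */
    return sum;                     /* 1      */
}

The total is 1 + (n + 1) + n + 1 = 2n + 3; dropping the constants and keeping the highest-order term gives the asymptotic complexity O(n).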

Cases of time complexity

Time complexity is considered in three cases.

 

Worst case

When the data we want to find is located at the end of the data array.

Average case

All inputs of a given size are assumed to be equally likely, and the running time is averaged over them.

Best case

When the data we want to find is located at the beginning of the data array.

An algorithm is evaluated based on the worst of the three cases.
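A linear search makes the three cases concrete. The sketch below (linearSearch is an illustrative name, not from the original post) finds the key in one comparison in the best case and only after scanning all n elements in the worst case.

/* Returns the index of key in a[0..n-1], or -1 if it is absent.      */
/* Best case:    key is at a[0]              -> 1 comparison           */
/* Worst case:   key is at a[n-1] or absent  -> n comparisons          */
/* Average case: key equally likely anywhere -> about n/2 comparisons  */
int linearSearch(const int *a, int n, int key)
{
    for (int i = 0; i < n; i++)
        if (a[i] == key)
            return i;
    return -1;
}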

Asymptotic complexity

It is a rough estimate of the number of steps performed by a given computation as the input size grows toward infinity.

Calculating the exact execution count involves too many details to be practical, so asymptotic complexity is normally used instead.

 

There are three kinds of asymptotic notation.

  • Big-O [O()] expresses the upper bound (assume the program's worst case)

Example

   1. T(n) = 3n² + 2n + 5                 T(n) ∈ O(n²)   (since 3n² + 2n + 5 ≤ 10n² for all n ≥ 1)

   2. T(n) = n³ + 3n² + 1                 T(n) ∈ O(n³)

 

   3. To calculate the time complexity, write down how many times each command is executed,

       as in the annotated function below.

 

 

void Func(int *a, int n)
{
     int i = 0, j = 0;                               /* 1             */
     for (i = 0; i < n - 1; i++)                     /* n             */
         for (j = i + 1; j < n; j++)                 /* (n-1) * n     */
             if (a[i] == a[j]) a[j] = 0;             /* (n-1) * (n-1) */
}

Summing up all the execution counts gives 1 + n + (n-1)·n + (n-1)·(n-1) = 2n² - 2n + 2.

Then the constants are dropped and the highest-order term is kept, so the answer is O(n²).
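As an illustrative usage sketch (this driver is not part of the original post, and assumes Func above is in the same file), Func zeroes out later duplicates in the array, and the nested loops give the O(n²) running time derived above.

#include <stdio.h>

int main(void)
{
    int a[] = { 3, 1, 3, 7, 1 };
    int n = sizeof a / sizeof a[0];

    Func(a, n);                        /* later duplicates become 0: 3 1 0 7 0 */

    for (int i = 0; i < n; i++)
        printf("%d ", a[i]);
    printf("\n");
    return 0;
}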

  • Omega [Ω()] expresses the lower bound (assume the program's best case)

  • Theta [Θ()] expresses a tight bound, where the Big-O and Big-Omega bounds coincide (often associated with the average case)

The order of Big-O time complexities is as follows; the further to the right, the slower the algorithm.

O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(2ⁿ) < O(n!) < O(nⁿ)
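As a rough illustration (not from the original post), the short program below prints a few of these growth rates for small n, which makes the ordering visible; n! and nⁿ are omitted because they overflow almost immediately.

#include <stdio.h>
#include <math.h>   /* link with -lm for log2 and pow */

int main(void)
{
    printf("%4s %10s %10s %10s %14s\n", "n", "log n", "n log n", "n^2", "2^n");
    for (int n = 1; n <= 20; n++)
        printf("%4d %10.2f %10.2f %10d %14.0f\n",
               n, log2(n), n * log2(n), n * n, pow(2.0, n));
    return 0;
}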
