# Asymptotic analysis 101

## Big O

## Asymptotic

The term asymptotic means approaching a value or curve arbitrarily closely (i.e., as some sort of limit is taken). A line or curve that is asymptotic to a given curve *c* is called the asymptote of *c* (http://mathworld.wolfram.com/Asymptotic.html).

When studying algorithms, even at a beginner level, we need to understand what **asymptotic (or limiting) behaviour** means. Asymptotic analysis is a way to classify algorithms according to how their running time or space requirements grow as the input grows. We want to predict **how an algorithm will behave with a very large amount of data**.

# Asymptotic functions

The graph below represents the function *f(x) = 1/x*. What's important to observe in this graph is the **"tail behaviour"**: we are interested in how *f(x)* behaves for very large *x* values:

We can see that as the value of *x* approaches infinity, the output *y* gets really close to *zero*. *f(x)* has a horizontal asymptote at *y = 0* and a vertical asymptote at *x = 0*.
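The tail behaviour can also be seen numerically. A small Python sketch evaluating *f(x) = 1/x* at increasingly large *x* (the sample points are my own choice for illustration):

```python
# As x grows toward infinity, f(x) = 1/x approaches the
# horizontal asymptote y = 0.
def f(x):
    return 1 / x

for x in [1, 10, 1_000, 1_000_000]:
    print(f"f({x}) = {f(x)}")
# f(1_000_000) is already 1e-06, i.e. very close to zero
```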

# Asymptotic bounding

## Upper Bound

We use the **big O notation** to express upper bounding.

If we have a function *T(n)* that represents the time an algorithm takes:

*T(n)* is upper bounded by *f(n)* if and only if *T(n) ≤ c·f(n)* for some constant *c* and for all *n ≥ n₀*, so even when *n* gets very large.

So, given a function *T(n)*, we can bound it with a function *f(n)*. Take for example a linear *T(n)* and a quadratic *f(n) = n²*. *T(n)* is upper bounded by *n²* as long as there is a constant *c* that keeps *c·f(n)* above *T(n)* at all times. We can see that many different functions can upper bound *T(n)*, so we need to know how tight each bound is and choose the tightest one. In this example, where *T(n)* is linear, we can also take *f(n) = n* and find a constant *c* so that *T(n) ≤ c·f(n)* for all *n ≥ n₀*. The upper bound gives us the order of growth of an algorithm's running time in the worst-case scenario.
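A quick empirical check of the definition in Python (a sketch, not a proof), using a made-up linear running time *T(n) = 5n + 20*. The constants *c = 6* and *n₀ = 20* witness *T(n) = O(n)*, since *5n + 20 ≤ 6n* whenever *n ≥ 20*:

```python
def T(n):
    return 5 * n + 20   # hypothetical running-time function

def f(n):
    return n            # candidate upper-bounding function

c, n0 = 6, 20
# The bound holds for every n from n0 upward (checked over a sample range).
assert all(T(n) <= c * f(n) for n in range(n0, 10_000))
# Below n0 the inequality can fail, which is why the definition
# only requires it for n >= n0:
print(T(19), ">", c * f(19))  # 115 > 114
```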

## Lower bound

The function *T(n)* is lower bounded by *f(n)* if and only if there is a constant *c* such that *c·f(n)* stays below *T(n)* at all times, i.e. *T(n) ≥ c·f(n)* for all *n ≥ n₀*. For a linear function *T(n)*, *Ω(log(n))* is an example of a lower bound: a constant multiple of *log(n)* eventually stays below any linear function.

## Theta

So we have the upper bound *O(·)* and the lower bound *Ω(·)*. An exact (tight) bound, when *T(n)* is both *O(f(n))* and *Ω(f(n))*, is denoted by theta: *Θ(f(n))*.
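Continuing the sketch with the same hypothetical *T(n) = 5n + 20*: it is *Θ(n)*, because the constants *c₁ = 5* and *c₂ = 6* squeeze it from both sides once *n ≥ 20*:

```python
def T(n):
    return 5 * n + 20   # hypothetical running-time function

# 5n <= 5n + 20 <= 6n for all n >= 20, so T(n) is both
# Omega(n) and O(n), i.e. Theta(n).
assert all(5 * n <= T(n) <= 6 * n for n in range(20, 10_000))
print("T(n) is Theta(n): squeezed between 5n and 6n for n >= 20")
```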

## Some asymptotic behaviours

In increasing order of growth:

- O(1)
- O(log(n))
- O(n)
- O(n·log(n))
- O(n²)
- O(n³)
- O(n!)
- …
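To get a feel for how quickly these classes diverge, here is a small Python sketch evaluating each one at *n = 20* (the choice of *n* is mine, purely for illustration):

```python
import math

n = 20
growth = {
    "O(1)":       1,
    "O(log n)":   math.log2(n),
    "O(n)":       n,
    "O(n log n)": n * math.log2(n),
    "O(n^2)":     n ** 2,
    "O(n^3)":     n ** 3,
    "O(n!)":      math.factorial(n),
}
for name, value in growth.items():
    print(f"{name:>10}: {value:,.0f}")
# Even at n = 20, the factorial class dwarfs every polynomial class.
```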

# Frequency count method :

We can analyse the time and space complexity of an algorithm with the **frequency count method**. I found really good explanations and examples in this YouTube video by Abdul Bari and I recommend subscribing to his channel. Here are some examples of *O(1)*, *O(n)* and *O(nˣ)* algorithms:

In the case of a simple swap algorithm where we swap two integers *a* and *b*, each statement takes 1 unit of time:

```
Swap(a, b)
{
    temp = a;   /* 1 unit of time */
    a = b;      /* 1 unit of time */
    b = temp;   /* 1 unit of time */
}
```

T(n) = 3, which is O(1).

What about space complexity? We have 3 variables: *a*, *b* and *temp*. Our space function is therefore:

S(n) = 3, which is O(1).
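As a runnable counterpart to the pseudocode above, here is the same swap in Python (returning the pair, since Python has no output parameters); three constant-time statements, so the running time is O(1) regardless of input size:

```python
def swap(a, b):
    temp = a   # 1 unit of time
    a = b      # 1 unit of time
    b = temp   # 1 unit of time
    return a, b

print(swap(3, 7))  # (7, 3)
```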

When a *for* loop is involved, the repeated statements execute *n* times each, and the loop condition itself is evaluated *n+1* times:

```
Sum(A, n)
{
    s = 0;                    /* 1 unit of time */
    for (i = 0; i < n; i++)   /* n+1 units of time */
    {
        s = s + A[i];         /* n units of time */
    }
    return s;                 /* 1 unit of time */
}
```

T(n) = 2n + 3, which is O(n).

We have 4 variables here: *A*, *n*, *s* and *i*. *A* takes *n* units of space, and *n*, *s* and *i* take 1 unit of space each.

S(n) = n + 3, which is O(n).
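To make the frequency count concrete, here is a Python sketch of Sum that tallies executed statements, written with an explicit while loop so the *n+1* condition tests are visible (the step-counting scheme is my own):

```python
def sum_with_count(A):
    n = len(A)
    steps = 0
    s = 0;  steps += 1             # 1 unit of time
    i = 0
    while True:
        steps += 1                 # loop test: runs n+1 times in total
        if not (i < n):
            break
        s = s + A[i];  steps += 1  # loop body: runs n times in total
        i += 1
    steps += 1                     # return: 1 unit of time
    return s, steps

total, steps = sum_with_count([1, 2, 3, 4])
print(total, steps)  # 10 11  -> matches T(n) = 2n + 3 with n = 4
```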

In the case where we have two nested *for* loops, for example an algorithm that computes the sum of two *n × n* matrices:

```
Add(A, B, n)
{
    for (i = 0; i < n; i++)            /* n+1 units of time */
    {
        for (j = 0; j < n; j++)        /* n * (n+1) units of time */
        {
            C[i,j] = A[i,j] + B[i,j];  /* n * n units of time */
        }
    }
}
```

T(n) = 2n² + 2n + 1, which is O(n²).

We have 6 variables here: *A*, *B*, *C*, *n*, *i* and *j*. *A*, *B* and *C* are *n × n* matrices and take *n²* units of space each, while *n*, *i* and *j* take 1 unit of space each.

S(n) = 3n² + 3, which is O(n²).
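A runnable Python version of Add, using lists of lists for the matrices; the innermost statement runs *n · n* times, which dominates and gives O(n²):

```python
def add(A, B, n):
    C = [[0] * n for _ in range(n)]     # n x n result matrix
    for i in range(n):
        for j in range(n):
            C[i][j] = A[i][j] + B[i][j]  # executed n * n times
    return C

print(add([[1, 2], [3, 4]], [[5, 6], [7, 8]], 2))  # [[6, 8], [10, 12]]
```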

Another example, the multiplication of two *n × n* matrices:

```
Multiply(A, B, n)
{
    for (i = 0; i < n; i++)                         /* n+1 units of time */
    {
        for (j = 0; j < n; j++)                     /* n * (n+1) units of time */
        {
            C[i,j] = 0;                             /* n * n units of time */
            for (k = 0; k < n; k++)                 /* n * n * (n+1) units of time */
            {
                C[i,j] = C[i,j] + A[i,k] * B[k,j];  /* n * n * n units of time */
            }
        }
    }
}
```

T(n) = 2n³ + 3n² + 2n + 1, which is O(n³).

We have 7 variables here: the *n × n* matrices *A*, *B* and *C* take *n²* units of space each, and *n*, *i*, *j* and *k* take 1 unit of space each.

S(n) = 3n² + 4, which is O(n²).
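And a runnable Python version of Multiply, again with lists of lists; the innermost statement runs *n³* times, which dominates the running time, while the three matrices dominate the O(n²) space:

```python
def multiply(A, B, n):
    C = [[0] * n for _ in range(n)]   # n x n result matrix
    for i in range(n):
        for j in range(n):
            C[i][j] = 0               # executed n * n times
            for k in range(n):
                # executed n * n * n times -> dominates, O(n^3)
                C[i][j] = C[i][j] + A[i][k] * B[k][j]
    return C

print(multiply([[1, 2], [3, 4]], [[5, 6], [7, 8]], 2))  # [[19, 22], [43, 50]]
```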