Big O Notation: How Fast Is Your Code?

Strider · 4 min read

Big-O notation: many people talk about it, but what is it? It answers the question of how fast the code in front of you is. You have an algorithm that is supposed to solve a task, all well and good, but how long does it take? And more importantly, how does it behave with a lot of data?

Google, for example, needs just 0.33 seconds to deliver 68,300,000 results for the search query "c++". That shows how good the search engine's performance is, and performance is exactly our keyword. When we talk about performance in code, we mean how long the code takes to solve a task; this is also called time complexity. So the question is: how much effort does the code need to solve the problem?

A small example from the Linked Lists post, where we want to insert an element at the end of a doubly linked list. Each node has a next and a prev pointer, and the list keeps a head and a tail pointer. Does it make sense to walk through the whole list from head just to add an element at the end?

Short answer: no, that would be wasteful. Since we have tail and a prev pointer, it makes more sense to start directly at tail and append the element there. Why? Simply because we don't have to walk through the whole list of n elements; we can skip the traversal and go straight to the operation, turning O(n) into O(1).
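
A minimal sketch of such a tail insert, assuming a doubly linked list with the next/prev/head/tail names from the Linked Lists post (the Node layout here is illustrative):

struct Node
{
    int value;
    Node* next = nullptr;
    Node* prev = nullptr;
};

// Appends in O(1): no traversal, we jump straight to tail.
void push_back(Node*& head, Node*& tail, int value)
{
    Node* node = new Node{value};
    if (tail == nullptr)      // empty list: the new node is head and tail
    {
        head = tail = node;
        return;
    }
    node->prev = tail;        // link the new node behind the old tail
    tail->next = node;
    tail = node;              // the new node becomes the tail
}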

[Diagram: growth curves of the common complexity classes]

If we look at the diagram, we can see the types of effort that exist, ordered from cheapest to most expensive: O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(2^n) < O(n!). Basically, everything greater than O(n) is in the bad zone; everything equal to or less than O(n) is in the good zone.

There are a few rules for the notation:

  • Efforts that consist of one or more operations executed only once are constant -> O(1), also written O(c).
  • Constants are dropped, because n is considered to be very large -> 2n, 3n, 4n, 5n = O(n).
  • Only the largest term counts as the total effort -> O(n^3 + n^2 + n + c) = O(n^3) (see the worked example after this list).
  • Strive to reduce the complexity, ideally down to O(1).
  • Always consider the worst case.
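
For example, suppose counting the operations of some algorithm gives the exact cost T(n) = 3n^2 + 5n + 42. The constant factors 3 and 42 are dropped, and the lower-order term 5n is dominated by n^2 for large n, so the total effort is O(n^2).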

O(1)

#include <iostream>

void foo()
{
    std::cout << "bar" << std::endl;
}

Here we see a function that performs a single operation. The work does not depend on any input size, so the effort is O(1): constant effort.

O(n)

void foo(int n)
{
    for(int i = 0; i < n; i++)
    {
        std::cout << "bar" << std::endl;
    }
}

Here we have added a loop that runs up to n, so the effort is O(n).

O(log n)

void foo(int n)
{
    // i must start at 1; starting at 0, i = i * 2 would stay at 0 forever
    for(int i = 1; i < n; i = i * 2)
    {
        std::cout << "bar" << std::endl;
    }
}

Here we have almost the same function, except that the counter now doubles on each pass. The loop body therefore runs only as often as n can be halved, so the effort is O(log n).
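
The classic real-world example of O(log n) is binary search, which halves the search range on every step. A minimal sketch (the function name and signature are my own, not from any particular library):

#include <vector>

// Returns the index of target in the sorted vector, or -1 if not found.
// Each iteration halves the remaining range -> O(log n).
int binary_search(const std::vector<int>& sorted, int target)
{
    int lo = 0;
    int hi = static_cast<int>(sorted.size()) - 1;
    while (lo <= hi)
    {
        int mid = lo + (hi - lo) / 2;   // written this way to avoid overflow
        if (sorted[mid] == target)
            return mid;
        if (sorted[mid] < target)
            lo = mid + 1;               // discard the lower half
        else
            hi = mid - 1;               // discard the upper half
    }
    return -1;
}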

O(nm) and O(n^2)

void foo(int n)
{
    int m = n / 2;
    for(int i = 0; i < n; i++)
    {
        for(int j = 0; j < m; j++)
        {
            std::cout << "bar" << std::endl;
        }
    }
}

Here we now have two nested loops, one that runs up to n and one that runs up to m. This results in an effort of O(nm). And if n = m, wouldn't that be O(n^2)? Yes, it would. In fact, since m = n / 2 here, O(nm) = O(n^2 / 2), and with the constant dropped that is already O(n^2).

void foo(int n)
{
    for(int i = 0; i < n; i++)
    {
        for(int j = 0; j < n; j++)
        {
            std::cout << "bar" << std::endl;
        }
    }
}

Here we have code with an effort of O(n^2), because both loops run up to n and are nested inside each other.

O(n^3)

void foo(int n)
{
    for(int i = 0; i < n; i++)
    {
        for(int j = 0; j < n; j++)
        {
            for(int k = 0; k < n; k++)
            {
                std::cout << "bar" << std::endl;
            }
        }
    }
}

Here we have cubic effort: three loops nested inside each other, all running up to n. So the effort here is O(n^3).

O(2^n)

We get an effort of O(2^n) with, for example, the game Towers of Hanoi or with brute-force password attacks.
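
A minimal sketch of the Towers of Hanoi recursion (the rod names and the printout are illustrative): moving n discs means moving n - 1 discs twice plus one single move, which adds up to 2^n - 1 moves.

#include <iostream>

// Moves n discs from rod `from` to rod `to`, using rod `via` as a buffer.
// T(n) = 2 * T(n - 1) + 1 = 2^n - 1 moves -> O(2^n).
void hanoi(int n, char from, char to, char via)
{
    if (n == 0)
        return;
    hanoi(n - 1, from, via, to);   // move the top n-1 discs out of the way
    std::cout << "disc " << n << ": " << from << " -> " << to << std::endl;
    hanoi(n - 1, via, to, from);   // move them back on top of disc n
}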

O(n!)

An effort of O(n!) occurs, for example, when we generate all permutations of n elements, as in a brute-force attack on the traveling salesman problem. Calculating the Fibonacci numbers recursively is often named here as well, but its effort is actually "only" exponential, roughly O(2^n), since each call spawns at most two further calls:

// Naive recursive Fibonacci: up to two recursive calls per level -> roughly O(2^n).
int fib(int num)
{
    if(num <= 1)
        return num;

    return fib(num - 2) + fib(num - 1);
}
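
For a genuine O(n!) example, here is a minimal sketch that visits every permutation of a sequence using std::next_permutation (the vector contents are just an illustration):

#include <algorithm>
#include <iostream>
#include <vector>

int main()
{
    std::vector<int> v = {1, 2, 3};
    std::sort(v.begin(), v.end());   // next_permutation expects a sorted start
    do                               // the body runs n! times -> O(n!)
    {
        for (int x : v)
            std::cout << x << ' ';
        std::cout << '\n';
    } while (std::next_permutation(v.begin(), v.end()));
}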

I hope this gave you a little insight into the topic 😄