Big-O notation in nested loops

What is the Big-Oh formula for the following code fragment:
k = 0
for i in range(1, 100):
    for j in range(i, 100):
        k = k + 1
I think it's n^2? Is this right? Also, does it have to have the n variable in it?

The algorithmic complexity of code isn't determined by its absolute time to run. It is determined by how much extra work the computer has to do as the input grows.
This code is O(1): regardless of what input you give it, it does exactly the same thing in the same amount of time:
k = 0
for i in range(100):
    for j in range(100):
        k += 1
This code, however, would have O(N) time if N is a number given as input, because each increment of N makes it do a fixed amount of additional work:
k = 0
for i in range(N):
    for j in range(100):
        k += 1
And this code would have O(N^2) time if N is a number given as input, because each increment of N makes the computer do on the order of N more things:
k = 0
for i in range(N):
    for j in range(N):
        k += 1
So your code is O(1), since it does the same amount of work regardless of any input.

Related

Run-Time complexities of the following functions

I need some help with these functions and whether the run-time complexities I have for them are correct. I'm currently learning the concepts in class, and I've looked at videos and such, but I can't find any explaining these tougher ones, so I'm hoping I can get some help here to see if I'm doing it right.
sum = 0
for i = 1 to n*n
    for j = 1 to i * i * i
        sum++
For this one I am thinking the answer is O(n^5) because the outer loop is running n^2 times while the inner loop will be running n^3 times and together that'll make n^5
sum = 0
for i = 1 to n^2    // O(n^2) times
    j = i
    while j > 0     // O(n+1) since the while loop will check one more time if the loop is valid
        sum++
        j = (j div 5)
for this run time I'm assuming it's going to run O(n^3 + 1) times, since the outer loop is running n^2 times and the while loop will be n+1, and together that's n^3 + 1.
for i = 1 to n          // n times
    for j = 1 to n {    // n^2 times
        C[i,j] = 0
        for k = 1 to n  // n^3 times?
            C[i,j] = C[i,j] + A[i,k]*B[k,j]
    }
so for this one I'm thinking it's O(n^6) but I am really iffy on this one. I have seen some examples online where people will figure the loop to be O(n log n) but I am totally lost on how that is found. Any help would be greatly appreciated!
Your loop-count annotations on the third algorithm are right: three nested loops of n iterations each give Θ(n^3) in total, not the n^6 you guessed. The first is off as well: its inner loop runs i^3 times while i goes up to n^2, so the total work is sum[i: 1..n^2] i^3 = Θ((n^2)^4) = Θ(n^8), not n^5. The second is also off. The inner loop
while j > 0     // O(n+1) since the while loop will check one more time if the loop is valid
    sum++
    j = (j div 5)
starts with j equal to i and divides j by 5 at each iteration, so it runs log(i) times. In turn, i varies from 1 to n^2, and the total execution time is
sum[i: 1..n^2] log(i)
By the properties of logarithms, this sum is equal to log((n^2)!). Using Stirling's approximation for the factorial, one obtains a time complexity of O(n^2 log(n^2)) = O(2 n^2 log(n)) = O(n^2 log(n)).

Is this algorithm O(log log n) complexity?

In particular, I'm interested in finding the Theta complexity. I can see the algorithm is bounded by log(n) but I'm not sure how to proceed considering the problem size decreases exponentially.
i = n
j = 2
while (i >= 1)
    i = i/j
    j = 2*j
The simplest way to answer your question is to look at the algorithm through the eyes of the logarithm (in my case the binary logarithm):
log i_0 = log n
log j_0 = 1
k = 0
while (log i_k >= 0)    # as log increases monotonically
    log i_{k+1} = log i_k - log j_k
    log j_{k+1} = (log j_k) + 1
    k++
This way we see that log i decreases by log j = k + 1 during every step.
Now when will we exit the loop?
This happens for
log i_k = log n - (1 + 2 + ... + k) = log n - k(k+1)/2 < 0.
The maximum number of steps is thus the smallest integer k such that
k(k+1)/2 > log n
holds.
Asymptotically, this is equivalent to k ~ sqrt(2 log n), so your algorithm is in Θ(sqrt(log n)).
Let us denote i(k) and j(k) the values of i and j at iteration k (so assume that i(1) = n and j(1) = 2). We can easily prove by induction that j(k) = 2^k and that
i(k) = n / 2^(1 + 2 + ... + (k-1)) = n / 2^(k(k-1)/2).
Knowing the above formula for i(k), you can compute an upper bound on the value of k that is needed in order to have i(k) <= 1, and you will obtain that the complexity is O(sqrt(log n)).

Time complexity of this simple code

In pseudo-code:
j = 5;
while (j <= n) {
    j = j * j * j * j;
}
What is the time complexity of this code?
It is way shorter than O(log n); is there even any reason to go lower than that?
Let's trace through the execution of the code. Suppose we start with initial value j0:
0. j ← j0
1. j ← j0^4
2. j ← [j0^4]^4 = j0^(4^2)
3. j ← [j0^(4^2)]^4 = j0^(4^3)
4. j ← [j0^(4^3)]^4 = j0^(4^4)
...
m. j ← [j0^(4^(m-1))]^4 = j0^(4^m)
... after m loops.
The loop terminates when the value exceeds n:
j0^(4^m) > n
→ m > log(4, log(j0, n))
Thus the time complexity is O(m) = O(log log n).
I used help from MathSE to find out how to solve this. The answer is the same as the one by #meowgoesthedog, but I understand it the following way:
On every iteration, the value of j is raised to its own 4th power. Or, we can look at it from the side of n: on every iteration, n is reduced to its 4th root. Hence, the recurrence will look like:
T(n) = 1 + T(n^(1/4))
For any integer k with 2^(4^k) + 1 <= n <= 2^(4^(k+1)), the recurrence will become:
T(n) = 1 + k
if we go on to assume that the 4th root will always be an integer. It won't matter if it is not, as the constant of +/- 1 is ignored in the Big-O calculation.
Now, since the assumption of the 4th root being an integer simplifies things for us, we can solve the equation:
n = 2^(4^k),
which yields k = (Log(Log(n)) - Log(Log(2)))/Log(4).
This implies that O(T(n)) = O(Log(Log(n))).

Order of growth for loops

What would be the order of growth of the code below? My guess was that each loop's growth is linear, but the if statement is confusing me. How do I include it in the analysis of the whole thing? I would very much appreciate an explanatory answer so I can understand the process involved.
int count = 0;
for (int i = 0; i < N; i++)
    for (int j = i+1; j < N; j++)
        for (int k = j+1; k < N; k++)
            if (a[i] + a[j] + a[k] == 0)
                count++;
There are two things that can be confusing when trying to determine the code's complexity.
The fact that not all loops start from 0. The second loop starts from i + 1 and the third from j + 1. Does this affect the complexity? It does not. Let's consider only the first two loops. For i = 0, the second runs N - 1 times, for i = 1 it runs N - 2 times, ..., for i = N - 1 it runs 0 times. Add all these up:
0 + 1 + ... + N - 1 = N(N - 1) / 2 = O(N^2).
So not starting from 0 does not affect the complexity (remember that big-oh ignores lower-order terms and constants). Therefore, even under this setting, the entire thing is O(N^3).
The if statement. The if statement is clearly irrelevant here, because it is only part of the innermost loop's body and contains no break statement or other code that would affect the loops. It only affects whether count is incremented, not the execution of any of the loops, so we can safely ignore it. Even when count isn't incremented (an O(1) operation), the if condition is still checked (also an O(1) operation), so roughly the same number of operations is performed with and without the if.
Therefore, even with the if statement, the algorithm is still O(N^3).
The order of growth of the code would be O(N^3).
In general, k nested loops each of length N contribute a growth of O(N^k).
Here are two ways to find that the time complexity is Theta(N^3) without much calculation.
First, you select i<j<k from the range 0 through N-1. The number of ways to choose 3 objects out of N is the binomial coefficient N choose 3 = N*(N-1)*(N-2)/(3*2*1) ~ (N^3)/6 = O(N^3), and more precisely Theta(N^3).
Second, an upper bound is that you choose i, j, and k from N possibilities, so there are at most N*N*N = N^3 choices. This is O(N^3). You can also find a lower bound of the same type since you can choose i from 0 through N/3-1, j from N/3 through 2N/3-1, and k from 2N/3 through N-1. This gives you at least floor(N/3)^3 choices, which is about N^3/27. Since you have an upper bound and lower bound of the same form, the time complexity is Theta(N^3).

Give both an exact and asymptotic answer for the pseudo code below

for i <-- 1 step i <-- 2*i while i < n do
    for j <-- 1 step j <-- 2*j while j < n do
        if j = 2*i
            for k <-- 0 step k <-- k+1 while k < n do
                ... CONSTANT NUMBER OF ELEMENTARY OPERATIONS
            end for
        else
            for k <-- 1 step k <-- 3*k while k < n do
                ... CONSTANT NUMBER OF ELEMENTARY OPERATIONS
            end for
        end if
    end for
end for
What is the running time for the following code fragment as a function of n?
The 'exact answer' refers to the equation relating to the code BEFORE you determine the asymptotic running time.
It sounds like homework; however, after a few considerations, the asymptotic complexity of that pseudo-code should be O(n*log(n)).
You cannot estimate the exact running time, since it depends heavily on your system.
