I have the following code and have to determine its big-Θ complexity:
for i = 1 to n do
    for j = 1 to n do
        k = j
        while k <= n do
            k = k * 3
        end while
    end for
end for
It's easy to see that the first two for-loops run n times each, but the while loop is throwing me off. The first time it runs log3(n) times, but after that I can't really tell.
Can anyone help?
Let T be the run time. It is clear T is Ω(n²). We can use Stirling's Approximation to expand ln n! to get
T = ∑_{i=1}^{n} ∑_{j=1}^{n} ⌈log₃(n/j)⌉
  = n · O(∑_{j=1}^{n} (ln n − ln j + 1))
  = n · O(n ln n − ln n! + n)
  = n · O(n ln n − (n ln n − n + O(ln n)) + n)
  = O(n²)
Thus T = Θ(n²).
Solution without using heavy-weight math:
Turn the problem on its head: instead of thinking about the first time the inner loop runs, think about the last time: it runs only once. In fact, the innermost loop runs only once for most values of j.
It runs once when j > n/3, that is, for 2n/3 values of j
It runs twice when n/9 < j <= n/3, that is, for 2n/9 values of j
It runs 3 times when n/27 < j <= n/9, that is, for 2n/27 values of j
It runs 4 times when n/81 < j <= n/27, that is, for 2n/81 values of j
...
The total number of times the innermost loop runs is going to be
1 * 2n/3 + 2 * 2n/9 + 3 * 2n/27 + 4 * 2n/81 + ...
= 2n(1/3 + 2/9 + 3/27 + ... )
< 2n Sum[k/3^k, for k=1 to infinity]
It's easy to see that the series Sum[k/3^k] converges (ratio test). Therefore the j-loop runs in O(n) time, and the entire thing in O(n²) time.
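Either way, you can check the constant empirically. Here is a quick sketch in Java (my own, not from either answer; the class name and test sizes are arbitrary) that counts the inner-loop iterations directly; count/n² should settle near 2·∑ k/3^k = 3/2:

public class ThetaCheck {
    public static void main(String[] args) {
        for (int n : new int[]{500, 1000, 2000, 4000}) {
            long count = 0;
            for (int i = 1; i <= n; i++) {
                for (int j = 1; j <= n; j++) {
                    for (long k = j; k <= n; k *= 3) {
                        count++;  // one iteration of the while loop
                    }
                }
            }
            // count / n^2 settles near 1.5 = 2 * sum(k/3^k),
            // consistent with T = Theta(n^2)
            System.out.printf("n=%d count=%d ratio=%.3f%n",
                    n, count, count / ((double) n * n));
        }
    }
}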
Can someone help me with calculating the time complexity of the inner loop? As far as I understand, the outer one will be O(n). But I have no idea how to calculate what happens inside the second one.
for (int i = 2; i < n; i++) {
    for (int j = 2; i * j < n; j++) {
    }
}
For every iteration of "outer loop", inner loop runs n/i times
So, total complexity of this will be given by:
n/2 + n/3 + n/4 + ...
= n * (1/2 + 1/3 + 1/4 ...)
The sum in parentheses is a prefix of the harmonic series, whose upper bound is ln(n).
Hence, complexity of this code is O(n log n).
The inner loop runs j from 2 up to, but not including, n/i, so you can express the iteration count as n/i - 2.
If we run the inner loop n - 2 times (since that's the number of times the outer loop runs), we get the following summation:
(n/2 - 2) + (n/3 - 2) + ... + (3 - 2)
This series is n times a prefix of the harmonic series (minus the constant terms), so it sums to roughly log_e(n) * n. So in terms of time complexity, this becomes O(log n * n).
The loop exits as soon as i * j ≥ n, i.e. when j reaches ceiling(n / i) ~ n / i. As it starts from j = 2, the number of iterations is ceiling(n / i) - 2.
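Both counts lead to the same place; here is an empirical check (my own sketch in Java; class name and test sizes are arbitrary) that tallies the inner-loop iterations against n·ln(n):

public class HarmonicCheck {
    public static void main(String[] args) {
        for (int n : new int[]{1_000, 10_000, 100_000, 1_000_000}) {
            long count = 0;
            for (int i = 2; i < n; i++) {
                for (int j = 2; (long) i * j < n; j++) {
                    count++;  // one iteration of the inner loop
                }
            }
            // the ratio stays bounded and creeps toward 1 as n grows,
            // consistent with Theta(n log n); the lower-order -3n terms
            // still show at these sizes
            System.out.printf("n=%d count=%d count/(n ln n)=%.3f%n",
                    n, count, count / (n * Math.log(n)));
        }
    }
}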
I was asked what the time complexity of this is:
What is the time complexity (with respect to n) of this algorithm?
k = 0;
for (i = n / 2; i < n; i++) {
    for (j = 0; j < i; j++)
        k = k + n / 2;
}
The choices were: a. O(n), b. O(n/2), c. O(n log(n)), and d. O(n^2).
It can have multiple answers.
I know the algorithm above is d. O(n^2), but I also came up with a. O(n), since the question asks for the complexity with respect to n only.
If you were given this question, how would you answer it? I'm curious about the answer.
The answer is O(n²).
This is easy to understand; let me walk you through it.
See, the outer for loop block is executed n - n/2 = n/2 times.
Of course it depends on whether the number n is even or odd. If it's even, then the outer loop is executed n/2 times. If it's odd, then (with integer division) it's executed (n+1)/2 times.
But for time complexity, we don't consider this. We just assume that the outer for loop is executed n/2 times where i starts from n/2 and ends at n - 1 (because the terminating condition is i < n and not i <= n).
For each iteration of the outer loop, the inner loop executes i times.
That is, on every pass the inner loop runs j from 0 to i - 1, which means it executes i times (not i - 1 times, because j starts from 0 and not from 1).
Therefore, for the 1st iteration of the outer loop the inner loop executes i = n/2 times, for the 2nd iteration i = n/2 + 1 times, and so on, up to i = n - 1 times.
Now, the total no. of times the inner loop executes is n/2 + (n/2 + 1) + (n/2 + 2) + ... + (n - 2) + (n - 1). It's simple math (an arithmetic series with n/2 terms, averaging (n/2 + (n - 1))/2) that this sums up to (3n² - 2n)/8 times.
So, the time complexity becomes O((3n² - 2n)/8).
But we drop the lower-order n term, because n² dominates it, and the constant factors, because they are the same for every n.
Therefore, the final time complexity is O(n²).
Hope this helps you understand.
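If you want to double-check that closed form, here is a quick counter (my own sketch in Java; class name and test sizes are arbitrary, and it assumes even n so the division is exact):

public class InnerCount {
    public static void main(String[] args) {
        for (int n : new int[]{10, 100, 1_000, 10_000}) {
            long count = 0;
            for (int i = n / 2; i < n; i++) {
                for (int j = 0; j < i; j++) {
                    count++;  // one execution of k = k + n / 2
                }
            }
            long closedForm = (3L * n * n - 2L * n) / 8;  // (3n^2 - 2n)/8, n even
            System.out.printf("n=%d count=%d closed form=%d%n",
                    n, count, closedForm);
        }
    }
}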
I have a nested for loop that I am trying to analyze the efficiency of. The loop looks like this:
int n = 1000;
for (int i = 0; i < n; i++) {
    for (int j = 0; j < i; j++) {
        System.out.print("*");
    }
}
I don't believe that this algorithm is O(n^2) because the inner loop does not run n times, it only runs i times. However, it certainly is not O(n). So I hypothesize that it must be between the two efficiencies, which gives O(n log(n)). Is this accurate or is it really a O(n^2) algorithm and I'm misunderstanding the effect the inner loop has on the efficiency?
Your algorithm will run a triangular number of times:
n * (n + 1) / 2
In the above case, n = 999 because the first j loop doesn't run:
(999 * 1000) / 2 = 499500
It is lower than n**2, but it still is O(n**2), because n * (n + 1) / 2 is n**2 / 2 + n / 2. When n is large, you can ignore n / 2 compared to n**2 / 2, and you can also ignore the constant 1 / 2 factor.
I understand your doubts, but try to think of it this way: what value will i have in the worst case? n - 1, right? Since complexity is evaluated for the worst case, it turns out to be O(n^2), as n * (n - 1) ~ n^2.
The number of iterations is ∑_{i=0}^{n−1} ∑_{j=0}^{i−1} 1. The inner sum is obviously equal to i, and ∑_{i=0}^{n−1} i = n * (n - 1) / 2 = O(n^2) is well known.
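To make the triangular number tangible, here is a variant (my own sketch; the class name is arbitrary) that counts the stars instead of printing them and checks the formula:

public class StarCount {
    public static void main(String[] args) {
        int n = 1000;
        long stars = 0;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < i; j++) {
                stars++;  // stands in for System.out.print("*")
            }
        }
        System.out.println(stars);                  // 499500
        System.out.println((long) n * (n - 1) / 2); // 499500, the triangular number
    }
}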
In the book Programming Interviews Exposed it says that the complexity of the program below is O(N), but I don't understand how this is possible. Can someone explain why this is?
int var = 2;
for (int i = 0; i < N; i++) {
    for (int j = i + 1; j < N; j *= 2) {
        var += var;
    }
}
You need a bit of math to see that. The inner loop iterates Θ(1 + log [N/(i+1)]) times (the 1 + is necessary since for i >= N/2, [N/(i+1)] = 1 and the logarithm is 0, yet the loop iterates once). j takes the values (i+1)*2^k until it is at least as large as N, and
(i+1)*2^k >= N <=> 2^k >= N/(i+1) <=> k >= log_2 (N/(i+1))
using mathematical division. So the update j *= 2 is called ceiling(log_2 (N/(i+1))) times and the condition is checked 1 + ceiling(log_2 (N/(i+1))) times. Thus we can write the total work
∑_{i=0}^{N-1} (1 + log(N/(i+1))) = N + N·log N − ∑_{j=1}^{N} log j
                                 = N + N·log N − log N!
Now, Stirling's formula tells us
log N! = N*log N - N + O(log N)
so we find the total work done is indeed O(N).
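If the Stirling step feels opaque, an empirical check makes the conclusion concrete (my own sketch, not part of Daniel Fischer's answer; class name and test sizes are arbitrary):

public class LinearCheck {
    public static void main(String[] args) {
        for (int N : new int[]{1_000, 10_000, 100_000, 1_000_000}) {
            long count = 0;
            for (int i = 0; i < N; i++) {
                for (long j = i + 1; j < N; j *= 2) {
                    count++;  // one execution of var += var
                }
            }
            // count / N stays bounded near a small constant
            // (empirically about 2), confirming linear total work
            System.out.printf("N=%d count=%d count/N=%.3f%n",
                    N, count, (double) count / N);
        }
    }
}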
The outer loop runs n times. Now it all depends on the inner loop, which is the tricky one. Let's follow it:
i = 0          --> j = 1        --> log(n) iterations
...
i = (n/2) - 1  --> j = n/2      --> 1 iteration
i = n/2        --> j = n/2 + 1  --> 1 iteration

Grouping the values of i by iteration count (the boundaries are off by one here and there, but that doesn't matter asymptotically):

i > n/2               --> 1 iteration
n/4 < i <= (n/2) - 1  --> 2 iterations
n/8 < i <= n/4        --> 3 iterations
n/16 < i <= n/8       --> 4 iterations
n/32 < i <= n/16      --> 5 iterations
Summing (values of i per group) × (iterations per group), the total number of times the innermost body runs is
(n/2)*1 + (n/4)*2 + (n/8)*3 + (n/16)*4 + ... + [n/(2^i)]*i
≤ n · ∑_{i=1}^{∞} [i/(2^i)] = 2n

--> O(n)
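The only fact used in the last step is that ∑ i/2^i converges to 2, which a two-line check confirms numerically (my own sketch):

public class SeriesCheck {
    public static void main(String[] args) {
        double sum = 0;
        for (int i = 1; i <= 60; i++) {
            sum += i / Math.pow(2, i);  // partial sums of sum(i / 2^i)
        }
        System.out.println(sum);  // prints approximately 2.0, the series' value
    }
}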
@Daniel Fischer's answer is correct.
I have a short program here:
Given any n:
i = 0;
while (i < n) {
    k = 2;
    while (k < n) {
        sum += a[j] * b[k];
        k = k * k;
    }
    i++;
}
The asymptotic running time of this is O(n log log n). Why is this the case? I get that the entire program will run at least n times. But I'm not sure how to find the log log n. The inner loop depends on k * k, so it's obviously going to run fewer than n times. And it would just be n log n if k were doubled each time. But how would you figure out the answer to be log log n?
For a mathematical proof, the inner loop can be written as:
T(n) = T(sqrt(n)) + 1
W.l.o.g. assume 2^(2^(t-1)) <= n <= 2^(2^t). Since 2^(2^t) = 2^(2^(t-1)) * 2^(2^(t-1)), unrolling gives
T(2^(2^t)) = T(2^(2^(t-1))) + 1 = T(2^(2^(t-2))) + 2 = ... = T(2^(2^0)) + t
T(2^(2^(t-1))) <= T(n) <= T(2^(2^t)) = T(2^(2^0)) + log log 2^(2^t) = O(1) + log log n
==> O(1) + (log log n) - 1 <= T(n) <= O(1) + log log n ==> T(n) = Θ(log log n),
and then the total time is O(n log log n).
Why is the inner loop T(n) = T(sqrt(n)) + 1?
First see when the inner loop breaks: when k >= n. One squaring before that, k was at least sqrt(n); two squarings before, it was at most sqrt(n). So the running time satisfies T(sqrt(n)) + 2 >= T(n) >= T(sqrt(n)) + 1.
The time complexity of a loop is O(log log n) if the loop variable is reduced/increased exponentially (e.g. squared) on each iteration. If the loop variable is divided/multiplied by a constant on each iteration, the complexity is O(log n).
E.g., in your case the value of k evolves as follows, where the number in parentheses denotes how many times the loop has executed:
2 (0), 2^2 (1), 2^4 (2), 2^8 (3), 2^16 (4), 2^32 (5), 2^64 (6), ... until n is reached.
That step count is O(log log n), which is the number of times the loop executes.
For a concrete example, let's assume n is 2^64. Now log(2^64) = 64 and log 64 = log(2^6) = 6. Hence the inner loop ran 6 times when n is 2^64.
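Here is that trace as runnable code (my own sketch; it uses doubles because 2^64 overflows a 64-bit signed long):

public class SquareTrace {
    public static void main(String[] args) {
        double n = Math.pow(2, 64);
        double k = 2;
        int steps = 0;
        while (k < n) {
            k = k * k;  // squaring doubles the exponent of k
            steps++;
        }
        System.out.println(steps);  // 6 = log2(log2(2^64))
    }
}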
I think if the code were like this instead, it would be n*log n:
i = 0;
while (i < n) {
    k = 2;
    while (k < n) {
        sum += a[j] * b[k];
        k *= c;  // c is a constant bigger than 1 and less than k
    }
    i++;
}
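A quick count backs this up (my own sketch, taking c = 2; class name and test sizes are arbitrary). The total work tracks n*log2(n):

public class GeometricCheck {
    public static void main(String[] args) {
        for (int n : new int[]{1 << 10, 1 << 14, 1 << 18}) {
            long count = 0;
            for (int i = 0; i < n; i++) {
                for (long k = 2; k < n; k *= 2) {  // c = 2 for this check
                    count++;
                }
            }
            double log2n = Math.log(n) / Math.log(2);
            // the ratio approaches 1, i.e. total work ~ n * log2(n)
            System.out.printf("n=%d ratio=%.3f%n", n, count / (n * log2n));
        }
    }
}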
Okay, so let's break this down first.
Given any n:
i = 0;
while (i < n) {
    k = 2;
    while (k < n) {
        sum += a[j] * b[k];
        k = k * k;
    }
    i++;
}
while( i<n ) will run for n+1 times but we'll round it off to n times.
Now here comes the fun part: the inner loop will not run n times; instead it will run log log n times, because rather than incrementing k by 1 on each pass, we square it. Each squaring doubles the exponent of k, so k reaches n after only about log log n squarings.
Now we combine the two and get n * log log n time. I hope you get it now.
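As a final sanity check, here is the whole routine instrumented (my own sketch; class name and test sizes are arbitrary). The iteration count divided by n*log2(log2(n)) stays bounded near 1, which is what Θ(n log log n) predicts:

public class LogLogCheck {
    public static void main(String[] args) {
        for (int n : new int[]{1 << 10, 1 << 14, 1 << 18, 1 << 22}) {
            long count = 0;
            for (int i = 0; i < n; i++) {
                for (long k = 2; k < n; k *= k) {
                    count++;  // one pass of the inner while body
                }
            }
            double loglog = Math.log(Math.log(n) / Math.log(2)) / Math.log(2);
            // the ratio stays bounded a little above 1
            System.out.printf("n=%d count=%d count/(n loglog n)=%.3f%n",
                    n, count, count / (n * loglog));
        }
    }
}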