I am stuck on a review question for my upcoming midterms, and any help is greatly appreciated.
Please see the function below:
void george(int n) {
    int m = n;                          // c1 - 1 step
    while (m > 1)                       // c2 - log(n) steps
    {
        for (int i = 1; i < m; i++)     // c3 - log(n)*<Stuck here>
        {
            int S = 1;                  // c4 - log(n)*<Stuck here>
        }
        m = m / 2;                      // c5 - log(n) steps
    }
}
I am stuck on the inner for loop since i is incrementing and m is being divided by 2 after every iteration.
If m = 100:
1st iteration m = 100: loop would run 100 times, i iterates 100 times + 1 for last check
2nd iteration m = 50: loop would run 50 times, i iterates 50 times + 1 for last check
..... and so on
Would this also be considered log(n) since m is being divided by 2?
The external loop executes log(n) times.
The internal loop executes n + n/2 + n/4 + ... + 1 ~ 2n times in total (a geometric progression sum).
Overall time is O(log(n) + n) = O(n).
Note: if we replace i < m with i < n in the inner loop, we obtain O(n*log(n)) complexity, because in that case we have n + n + n + ... + n operations for the inner loops, where the number of summands is log(n).
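If it helps to see the ~2n total concretely, here is a minimal counting sketch (the counter and the printout are my additions; the loop structure is the same as in george):

#include <iostream>

int main() {
    for (long long n : {100LL, 10000LL, 1000000LL}) {
        long long count = 0;
        long long m = n;
        while (m > 1) {
            for (long long i = 1; i < m; i++)   // same inner loop as in george
                count++;                        // stands in for "int S = 1;"
            m = m / 2;
        }
        std::cout << "n = " << n << "  inner iterations = " << count
                  << "  2n = " << 2 * n << "\n";
    }
    return 0;
}

The printed counts stay just below 2n, as the geometric sum predicts.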
Can someone help me with calculating the time complexity of the inner loop? As far as I understand, the outer one will be O(n). But I have no idea how to calculate what happens inside the second one.
for (int i = 2; i < n; i++) {
    for (int j = 2; i * j < n; j++) {
    }
}
For every iteration of the outer loop, the inner loop runs about n/i times.
So, total complexity of this will be given by:
n/2 + n/3 + n/4 + ...
= n * (1/2 + 1/3 + 1/4 ...)
For the parenthesized term above (the harmonic series), an upper bound is ln(n).
Hence, complexity of this code is O(n log n).
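If you want to check the n*ln(n) estimate empirically, here is a small counting sketch (the counter and the output are my additions):

#include <cmath>
#include <iostream>

int main() {
    for (long long n : {1000LL, 100000LL, 10000000LL}) {
        long long count = 0;
        for (long long i = 2; i < n; i++)
            for (long long j = 2; i * j < n; j++)
                count++;
        std::cout << "n = " << n << "  iterations = " << count
                  << "  n*ln(n) = " << (double)n * std::log((double)n) << "\n";
    }
    return 0;
}

The count grows like n*ln(n), up to lower-order terms.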
The inner loop runs j from 2 up to but not including n/i, so you can express its iteration count as roughly n/i - 2.
If we run the inner loop n - 2 times (since that's the number of times the outer loop runs), we get the following summation:
(n/2 - 2) + (n/3 - 2) + ... + (3 - 2)
This series is n times the harmonic series, minus lower-order terms, so it sums to roughly n * log_e(n). So in terms of time complexity, this becomes O(n log n).
The loop exits as soon as i * j ≥ n, i.e. when j = ceiling(n / i) ~ n / i. Since j starts from 2, the number of iterations is ceiling(n / i) - 2.
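And here is a tiny brute-force check of that per-i count (the test harness is mine):

#include <iostream>

int main() {
    const int n = 100;                      // small n, checked exhaustively
    bool ok = true;
    for (int i = 2; i < n; i++) {
        int brute = 0;
        for (int j = 2; i * j < n; j++)     // the loop from the question
            brute++;
        int formula = (n + i - 1) / i - 2;  // ceiling(n/i) - 2
        if (formula < 0) formula = 0;       // no iterations once n/i <= 2
        if (brute != formula) ok = false;
    }
    std::cout << (ok ? "all counts match ceiling(n/i) - 2\n"
                     : "mismatch found\n");
    return 0;
}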
I am practicing time complexity and some of the problems are a bit too complicated for me.
I would really appreciate it if someone could explain these to me.
A) The time complexity is O(n). How is that?
for (int i = N; i > 0; i = i/2) {
    for (int j = i+i; j > 0; j--) {
        doSomething(i, j);
    }
}
B) The time complexity is O(n logn). How is that?
for (int i = N+N; i > 0; i--) {
    for (int j = N; j > 0; j = j/2) {
        doSomething(i, j);
    }
}
I suppose we must assume that the execution of doSomething takes constant time, independent of the values it gets as arguments.
Algorithm A:
On the first iteration of the outer loop, the inner loop iterates 2N times. On every subsequent iteration of the outer loop, the number of iterations of the inner loop is halved. So we get this series:
2N + N + N/2 + N/4 + N/8 + ... + 2
This series is finite, and since each term is half of the previous one it follows the pattern of 1 + 1/2 + 1/4 + 1/8 + ... (scaled by 2N), so we can conclude that its sum is less than 4N, and hence it is O(N).
Algorithm B:
Here the number of iterations of the inner loop does not depend on the value of i, so it is always the same: each time it performs about log2(N) iterations (since j is halved each iteration). As the outer loop iterates 2N times, doSomething is called about 2N*log2(N) times, which is O(N log N).
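If you want to sanity-check both bounds, here is a small counting sketch with doSomething replaced by counters (that substitution is mine; for N a power of two the counts come out exactly 4N - 2 and 2N*(log2(N)+1)):

#include <iostream>

int main() {
    for (long long N : {1024LL, 65536LL}) {            // powers of two for exact formulas
        long long a = 0, b = 0;
        for (long long i = N; i > 0; i = i / 2)        // algorithm A
            for (long long j = i + i; j > 0; j--)
                a++;
        for (long long i = N + N; i > 0; i--)          // algorithm B
            for (long long j = N; j > 0; j = j / 2)
                b++;
        long long log2N = 0;
        while ((1LL << (log2N + 1)) <= N) log2N++;     // floor(log2(N))
        std::cout << "N = " << N << "  A = " << a << " (4N-2 = " << 4 * N - 2
                  << ")  B = " << b << " (2N*(log2(N)+1) = "
                  << 2 * N * (log2N + 1) << ")\n";
    }
    return 0;
}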
problem A
Here the outer loop will execute log2(n)+1 times, and on each pass the inner loop executes i+i = 2i times, where i takes the values n, n/2, n/4, ... down to 1. So the total count is
2n + n + n/2 + n/4 + n/8 + ...
and the summation of this series is the answer.
As we know, a geometric progression sums as
a + ar + ar^2 + ar^3 + ... + ar^m = a*(1 - r^(m+1))/(1 - r)
Here a = 2n, r = 1/2 and m = log2(n), so
2n + n + n/2 + ... + 2n/2^m = 4n - 2n/2^m = 4n - 2
(for example, n = 16 gives 32 + 16 + 8 + 4 + 2 = 62 = 4*16 - 2).
So the complexity is O(4n - 2) = O(n).
problem B
Here the outer loop will execute 2n times, and for every outer-loop iteration the inner loop will be executed log2(n)+1 times:
for (int j = n; j > 0; j = j/2)
For example, with n = 10 the value of j will be 10, 5, 2, 1, and then 0, which exits the loop; so the body executes 4 times, which is floor(log2(10))+1.
So the inner loop executes log2(n)+1 times per outer iteration, and the complexity is
O(2n*(log2(n)+1)) = O(n log n)
This code snippet is supposed to have a complexity of O(n), yet I don't understand why.
sum = 0;
for (k = 1; k <= n; k *= 2)   // for some arbitrary n
    for (j = 1; j <= k; j++)
        sum++;
Now, I understand that the outer loop by itself is O(log n), so why does adding the inner loop make this O(n)?
Let's assume that n is a power of 2 for a moment.
The final iteration of the inner loop will run n times. The iteration before that will run n/2 times, the second-to-last iteration n/4 times, and so on up until the first iteration which will run once. This forms a series which sums to 2n − 1 total iterations. This is O(n).
(For example, with n = 16, the inner loop runs 1 + 2 + 4 + 8 + 16 = 31 total times.)
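Here is a quick check of the 2n − 1 total for powers of two (the counting harness is my addition):

#include <iostream>

int main() {
    for (long long n : {16LL, 1024LL, 1048576LL}) {   // powers of two
        long long sum = 0;
        for (long long k = 1; k <= n; k *= 2)
            for (long long j = 1; j <= k; j++)
                sum++;
        std::cout << "n = " << n << "  sum = " << sum
                  << "  2n-1 = " << 2 * n - 1 << "\n";
    }
    return 0;
}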
Let m = floor(lg(n)). Then 2^m = C*n with 1/2 < C <= 1. The number k of steps in the inner loop goes like:
1, 2, 4, 8, ..., 2^m = 2^0, 2^1, ..., 2^m
Therefore the total number of operations is
2^0 + 2^1 + ... + 2^m = 2^(m+1) - 1    ; think binary: e.g. 1 + 2 + 4 + 8 = 1111 in binary = 2^4 - 1
                      = 2*2^m - 1
                      = 2*C*n - 1      ; replace 2^m by C*n
                      = O(n)
In the book Programming Interviews Exposed it says that the complexity of the program below is O(N), but I don't understand how this is possible. Can someone explain why this is?
int var = 2;
for (int i = 0; i < N; i++) {
    for (int j = i+1; j < N; j *= 2) {
        var += var;
    }
}
You need a bit of math to see that. The inner loop iterates Θ(1 + log [N/(i+1)]) times (the 1 + is necessary since for i >= N/2, [N/(i+1)] = 1 and the logarithm is 0, yet the loop iterates once). j takes the values (i+1)*2^k until it is at least as large as N, and
(i+1)*2^k >= N <=> 2^k >= N/(i+1) <=> k >= log_2 (N/(i+1))
using exact (real, not integer) division. So the update j *= 2 is called ceiling(log_2 (N/(i+1))) times and the condition is checked 1 + ceiling(log_2 (N/(i+1))) times. Thus we can write the total work as
∑_{i=0}^{N-1} (1 + log(N/(i+1))) = N + N*log(N) - ∑_{j=1}^{N} log(j)
                                 = N + N*log(N) - log(N!)
Now, Stirling's formula tells us
log N! = N*log N - N + O(log N)
so we find the total work done is indeed O(N).
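Empirically the total work is indeed linear; here is a small counting sketch (the counter replaces var += var, which is my simplification):

#include <iostream>

int main() {
    for (long long N : {1000LL, 100000LL, 10000000LL}) {
        long long count = 0;
        for (long long i = 0; i < N; i++)
            for (long long j = i + 1; j < N; j *= 2)   // the loop from the book
                count++;
        std::cout << "N = " << N << "  iterations = " << count
                  << "  iterations/N = " << (double)count / N << "\n";
    }
    return 0;
}

The ratio settles near a small constant (about 2), consistent with the 2n bound derived in the next answer.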
The outer loop runs n times, so everything depends on the inner loop, which is the tricky one. Let's follow it:
i = 0 --> j = 1 ---> log(n) iterations
...
...
i = (n/2)-1 --> j = n/2 ---> 1 iteration
i = n/2 --> j = (n/2)+1 ---> 1 iteration
Grouping the values of i by how many times the inner loop runs:
i > n/2 ---> 1 iteration
(n/2)-1 >= i > n/4 ---> 2 iterations
n/4 >= i > n/8 ---> 3 iterations
n/8 >= i > n/16 ---> 4 iterations
n/16 >= i > n/32 ---> 5 iterations
Summing over all the groups, the total number of inner iterations is
(n/2)*1 + (n/4)*2 + (n/8)*3 + (n/16)*4 + ... + [n/(2^i)]*i
= n * ∑_{i=0}^{N-1} (i/2^i) <= 2n
(the series ∑ i/2^i converges to 2: ∑_{i>=1} i*x^i = x/(1-x)^2, which gives 2 at x = 1/2)
--> O(n)
@Daniel Fischer's answer is correct.
I have a short program here:
Given any n:
i = 0;
while (i < n) {
    k = 2;
    while (k < n) {
        sum += a[j] * b[k];
        k = k * k;
    }
    i++;
}
The asymptotic running time of this is O(n log log n). Why is this the case? I get that the entire program will run at least n times. But I'm not sure how to find the log log n. The inner loop depends on k * k, so it's obviously going to run fewer than n times, and it would just be n log n if the update were k *= 2 each time. But how would you figure out that the answer is log log n?
For a mathematical proof, the running time of the inner loop can be written as the recurrence:
T(n) = T(sqrt(n)) + 1
W.l.o.g. assume 2^(2^(t-1)) <= n <= 2^(2^t). We know 2^(2^t) = 2^(2^(t-1)) * 2^(2^(t-1)), i.e. sqrt(2^(2^t)) = 2^(2^(t-1)), so unfolding the recurrence gives
T(2^(2^t)) = T(2^(2^(t-1))) + 1 = T(2^(2^(t-2))) + 2 = ... = T(2^(2^0)) + t
Therefore
T(2^(2^(t-1))) <= T(n) <= T(2^(2^t)) = T(2^(2^0)) + t = O(1) + log(log(2^(2^t)))
and since t - 1 <= log(log(n)) <= t, this gives
O(1) + log(log(n)) - 1 <= T(n) <= O(1) + log(log(n)) + 1 => T(n) = Θ(log log n),
and then the total time is O(n log log n).
Why is the inner loop T(n) = T(sqrt(n)) + 1? First see when the inner loop breaks: when k >= n. One squaring before that, k was already at least sqrt(n), and two squarings before that it was at most sqrt(n), so the running time satisfies T(sqrt(n)) + 2 >= T(n) >= T(sqrt(n)) + 1.
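You can also check numerically that the recurrence and the loop agree (this little comparison harness is mine; it uses doubles so the squaring never overflows):

#include <cmath>
#include <iostream>

// direct count of the inner loop from the question
int direct(double n) {
    int c = 0;
    for (double k = 2; k < n; k = k * k)
        c++;
    return c;
}

// the recurrence T(n) = T(sqrt(n)) + 1; n <= 4 means k = 2 gets squared at most once
int rec(double n) {
    if (n <= 4) return n > 2 ? 1 : 0;
    return rec(std::sqrt(n)) + 1;
}

int main() {
    for (double n : {1e3, 1e6, 1e12, 1e24}) {
        std::cout << "n = " << n << "  direct = " << direct(n)
                  << "  recurrence = " << rec(n) << "\n";
    }
    return 0;
}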
The time complexity of a loop is O(log log n) if the loop variable is increased or decreased by repeated squaring or square-rooting. If the loop variable is divided or multiplied by a constant factor, the complexity is O(log n).
E.g.: in your case the value of k is as follows (the number in parentheses denotes how many times the loop has executed):
2 (0), 2^2 (1), 2^4 (2), 2^8 (3), 2^16 (4), 2^32 (5), 2^64 (6), ... until n is reached.
The number of times the loop executes is therefore O(log log n).
For the sake of illustration, let's assume that n is 2^64. Now log(2^64) = 64 and log(64) = log(2^6) = 6. Hence the inner loop runs 6 times when n is 2^64.
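Here is a tiny sketch that just counts the inner loop for growing n. To avoid 64-bit overflow when squaring k, it tracks the exponent e with k = 2^e, so k = k*k becomes e = 2*e (that representation is my own device):

#include <iostream>

int main() {
    // k = 2^e, so k < n = 2^p is exactly e < p, and squaring k doubles e
    for (int p : {8, 16, 32, 64}) {        // n = 2^p
        int count = 0;
        for (long long e = 1; e < p; e *= 2)
            count++;
        std::cout << "n = 2^" << p << "  inner iterations = " << count << "\n";
    }
    return 0;
}

For p = 64 this prints 6, matching the hand computation above.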
I think if the code were like this, it would be n*log(n):
i = 0;
while (i < n) {
    k = 2;
    while (k < n) {
        sum += a[j] * b[k];
        k *= c;    // c is a constant bigger than 1
    }
    i++;
}
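To see the contrast directly, here is a sketch counting both variants side by side, with c = 2 (the harness and the exponent trick for the squaring loop are mine):

#include <iostream>

int main() {
    for (int p : {10, 20, 40}) {                     // n = 2^p
        long long n = 1LL << p;
        int doubling = 0, squaring = 0;
        for (long long k = 2; k < n; k *= 2)         // k *= c with c = 2: about log2(n) steps
            doubling++;
        for (int e = 1; e < p; e *= 2)               // k = 2^e; k = k*k doubles e: about log2(log2(n)) steps
            squaring++;
        std::cout << "n = 2^" << p << "  k*=2: " << doubling
                  << " iterations  k=k*k: " << squaring << " iterations\n";
    }
    return 0;
}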
Okay, so let's break this down first:
Given any n:
i = 0;
while (i < n) {
    k = 2;
    while (k < n) {
        sum += a[j] * b[k];
        k = k * k;
    }
    i++;
}
while (i < n) checks its condition n+1 times and executes its body n times, so we'll count it as n times.
Now here comes the fun part: while (k < n) will not run n times; instead it will run log log n times, because instead of incrementing k by 1 on each pass we square it. Squaring doubles k's exponent, so the exponent only needs to double about log2(log2(n)) times before k reaches n.
Combining the two loops, we get n * log log n time.