My question is about finding the complexity of this algorithm. The value of j is related to n, so I'm confused about this.
What is the asymptotic complexity of this pseudocode?
for i=1 to n
do
j = 1;
while (j < n)
do
j = j * 2;
Thanks.
I believe it is O(n log2 n)
The outer loop runs n times and the inner loop runs log2 n times, since j is doubled on every iteration. On the k-th pass of the inner loop, j is equal to 2^k: it starts at 1 and goes 2, 4, 8, ... until 2^k >= n
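A quick sketch of this (mine, not from the post): count the inner-loop iterations directly and compare against n * ceil(log2 n).

```python
import math

def count_iterations(n):
    count = 0
    for i in range(1, n + 1):   # outer loop: n passes
        j = 1
        while j < n:            # inner loop: j doubles each pass
            j *= 2
            count += 1
    return count

# The count matches n * ceil(log2 n) for n > 1.
for n in [5, 8, 100, 1024]:
    assert count_iterations(n) == n * math.ceil(math.log2(n))
```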
If I add a print of (i, j, n) at the end of the inner loop (with n = 5), I see:
(1,2,5)
(1,4,5)
(1,8,5)
(2,2,5)
(2,4,5)
(2,8,5)
(3,2,5)
(3,4,5)
(3,8,5)
(4,2,5)
(4,4,5)
(4,8,5)
(5,2,5)
(5,4,5)
(5,8,5)
At first glance this might suggest O(n^2), but the inner loop runs the same number of times for every i (here 3 = ceil(log2 5) passes), i.e. log2 n times, so the total is O(n log n). If the (j < n) part is switched to (j < i), the trace becomes smaller but the asymptotic behavior is still O(n log(n)):
(2,2,5)
(3,2,5)
(3,4,5)
(4,2,5)
(4,4,5)
(5,2,5)
(5,4,5)
(5,8,5)
What is the big-O for the following code:
y = 1;
x = 3;
for (int i = 1; i <= n; i *= 2)
    for (int j = 1; j <= i * i; j++)
        if (i % j == 0)
            for (int k = 1; k <= j; k++)
                y = y * x;
My thoughts:
Looking at other similar questions, I think the innermost loop is O(n) and the first loop is O(log n); as for the middle one, it's O(n^2),
so the overall result would be O(log(n) * n^3).
Are my answer and way of thinking right? I'm new to this, so I hope I can get some help explaining how these loops work.
The innermost loop runs j times, but only when i % j == 0. The middle loop runs i^2 times, and since every divisor of i is at most i, the condition can only be satisfied when j <= i. Hence, among the i^2 iterations of the middle loop, at least i^2 - i do not satisfy the condition.
If we denote the number of divisors of i by tau(i), then among j <= i the condition is satisfied exactly tau(i) times. That means the total work of the innermost loop equals the sum of the divisors of i, which is at most (77/16) i (see this post for the proof).
Hence, the total complexity of the middle loop together with the inner loop is at most (i^2 - i) + (i - tau(i)) + (77/16) i = i^2 + (77/16) i - tau(i).
We also know that tau(i) is in O(i^(1/loglog(i))) (see the proof here). Now, to find the complexity of the whole loop, we need to sum the last expression for i = 1, 2, 4, ..., n. As we want the asymptotic complexity and we have a sum, we can ignore the lower powers of i. The dominant part is 1 + 2^2 + (2^2)^2 + ... + (2^2)^log(n) = ((2^2)^(log(n)+1) - 1)/(2^2 - 1) = Theta(n^2) (a geometric sum with ratio 2^2 and log(n) + 1 terms).
In sum, the tight time complexity for the specified code is Theta(n^2), which is of course in O(n^2) as well.
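A rough empirical check (my own sketch, not part of the answer): count the basic operations of the loop nest (one per middle-loop iteration plus j per innermost run) and confirm the count stays within a constant factor of n^2.

```python
def count_ops(n):
    ops = 0
    i = 1
    while i <= n:                        # outer loop: i = 1, 2, 4, ..., n
        for j in range(1, i * i + 1):    # middle loop: i^2 iterations
            ops += 1
            if i % j == 0:               # true only for divisors of i
                ops += j                 # innermost loop runs j times
        i *= 2
    return ops

# Total operations stay between n^2 and 2n^2 for these powers of two.
for n in [16, 256, 1024]:
    assert n * n <= count_ops(n) <= 2 * n * n
```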
for (int i = n; i > 0; i /= 2) {
for (int j = 0; j < i; j++) {
//statement
}
}
Answer: O(N)
I know that the first loop, for (int i = n; i > 0; i /= 2), results in O(log N).
The second loop, for (int j = 0; j < i; j++), depends on i and will iterate first i times, then i / 2, i / 4, ... times (where i depends on n).
I don't know the Big O for the second loop, but I thought the answer would be O(log N * something) where O(log N) is the outer loop and something is the inner loop?
How do you get O(N)?
The outer loop has a complexity of O(log n), because of the i /= 2. But the inner loop is a little bit more tricky.
The inner loop has a complexity of O(i), but i changes on every iteration of the outer loop. In combination with the outer loop, you can treat it as doing O(n / log n) work per iteration on average. You get this as follows:
The number of steps the inner loop does in total is n + n/2 + n/4 + ..., which by the geometric series 1/2 + 1/4 + 1/8 + 1/16 + ⋯ (see https://en.wikipedia.org/wiki/1/2_%2B_1/4_%2B_1/8_%2B_1/16_%2B_⋯) sums to at most 2n. At first you do n steps, then only n/2 steps, then n/4 steps, and so on, until you do only 2 steps and finally 1 step. In total you run the inner loop log n times (as defined by the outer loop). This means the inner loop runs on average 2n / log n times, so on average it contributes O(n / log n) per outer iteration.
With the outer loop of O(log n) and the inner loop of O(n / log n) you get O(log n * n / log n), which can be simplified to O(n).
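A small sketch (mine, not from the answer): count the inner-loop steps of this exact loop nest and confirm the total is bounded by 2n, which is what makes the whole thing O(n).

```python
def count_steps(n):
    steps = 0
    i = n
    while i > 0:             # outer loop: i = n, n/2, n/4, ..., 1
        for _ in range(i):   # inner loop: i iterations
            steps += 1
        i //= 2
    return steps

# The total is between n and 2n, i.e. Theta(n).
for n in [1, 10, 1000, 1 << 16]:
    assert n <= count_steps(n) <= 2 * n
```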
The code below is actually bounded by O(n^2); could anyone please explain why?
for (int i = 1; i < n; i++) {
    int j = i;
    while (j < n) {
        // do operation with cost O(j)
        j = j * 3;
    }
}
It's not that tricky.
Your inner loop's cost forms a geometric progression with total complexity O(n).
Without filling in the details all the way (this seems like a HW problem), note that the formula for the sum of a geometric sequence is
a_0 (q^k - 1) / (q - 1), (a_0 = first element, q = multiplication factor, k = number of elements),
and your q^k here is O(n).
Your outer loop is O(n).
Since it's a nested loop, and the inner term does not depend on the outer index, you can multiply.
The proof is geometric progression :)
For i > n/3, the inner loop executes only once (because on the next iteration j is >= n).
So for the last 2n/3 iterations of the outer loop, the inner loop runs only once and does an O(j) = O(n) operation each time. These 2n/3 iterations therefore cost O(n^2) in total.
To calculate the cost of the remaining iterations, replace n with n/3 and apply steps 1, 2 and 3 again; the resulting recurrence T(n) = T(n/3) + O(n^2) still solves to O(n^2).
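An empirical sketch (mine, not from the answers): sum the O(j) costs of the loop nest above and check the total stays within constant factors of n^2.

```python
def total_cost(n):
    cost = 0
    for i in range(1, n):   # outer loop: i = 1 .. n-1
        j = i
        while j < n:        # inner loop: j = i, 3i, 9i, ...
            cost += j       # the operation with cost O(j)
            j *= 3
    return cost

# Each outer iteration contributes between n/3 and 3n/2, so the total
# is Theta(n^2).
for n in [10, 100, 1000]:
    assert (n - 1) * n // 3 <= total_cost(n) <= 3 * n * n // 2
```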
Methodically, using Sigma notation, you may do the following: the total cost is Sum_{i=1}^{n-1} (i + 3i + 9i + ... + i*3^K), where i*3^K is the last term below n. Each inner sum is a geometric series bounded by (3/2) * i * 3^K < (3/2) n, so the whole sum is O(n^2).
I have an array of lists (i.e. each cell in the array contains a list). The length of the array is n, and the sum of the lengths of all the lists is k.
I want to iterate over all the list elements(in the whole array):
for(int i = 0; i < n; ++i) {
for(int j = 0; j < array[i].list.Length(); ++j) {
//do something in O(1)
}
}
NOTE: the inner loop runs at most k times per iteration of the outer loop, but the total number of iterations it does across all i is k.
Question: Is the time complexity of the code O(n + k)? Or would it be O(n*k)?
Question: Is the time complexity of the code O(n + k)? Or would it be O(n*k)?
The complexity is O(n + k). In the case where n <= k, this would equal O(k), but this is not necessarily the case.
n <= k (original answer)
If the sum of all lengths is k, then, if you don't do anything else in the outer loop, the running time would be O(k). n is irrelevant in this case, since there is nothing interesting you're doing n times. Your data just happens to be split up in n chunks.
On average, each list's size would be k/n. That makes the time complexity of the algorithm O(n * k/n) which results in O(k).
n > k
In the case that n is larger than k, n becomes relevant since work has to be done each time, even if it's only checking the Length() of array[i]. Because of that, in this case the complexity is O(n + k).
Update
As Jordi Vermeulen correctly points out in the comments, my original answer, which only took into consideration the case where n <= k, was incorrect. The answer has been edited accordingly.
This is O(n + k), which is O(k) when n is O(k). That is, however, not necessarily the case (as suggested in the answer by Bart van Nierop). Consider, for instance, the case where n = k^2. The loop still runs k^2 times, so you can't say the complexity is O(k), even though in many iterations no work is done other than increasing the counter.
For every i of the external loop, the inner loop runs array[i].list.Length() times, which you say is k:
k times -+
k times  |
...      |
...      +--- n times
...      |
k times -+
So the resulting time is O(n * k).
You should use n*k.
For each column, process each line.
You have to do a loop (for or foreach) over each of the n columns,
and then inside that loop, another loop (for or foreach) that processes each of the k rows.
for (int i = 0; i < n; i++) {
for (int j = 0; j < array[i].list.length(); j++) {
// do something with array[i][j]
}
}
O(k). The "do something" part will occur k times in total.
n is irrelevant in this case.
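A small sketch (mine, not from any of the answers above): count both loop counters separately to see that the work is n outer steps plus k inner steps, i.e. O(n + k).

```python
def count_work(lists):
    outer = inner = 0
    for lst in lists:        # runs n times
        outer += 1
        for _ in lst:        # runs len(lst) times; k times in total
            inner += 1
    return outer, inner

# n = 5 cells, total list length k = 6: the empty cells still cost
# one outer step each, which is why n cannot simply be dropped.
lists = [[1, 2, 3], [], [4], [], [5, 6]]
outer, inner = count_work(lists)
assert outer == 5 and inner == 6
```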
Stuck with my HW - need to find the complexity:
time=0;
for (i=n; i>=1; i = sqrt(i))
for (j=1; j<=i; j++)
time++;
What I did - the first loop's counter goes like this:
i = n, n^(1/2), n^(1/4), ..., 1
Then we get:
n^(1/2^k) = 1
but if I take the log of both sides, one side becomes 0... what should I do?
I suppose there is a typo somewhere because otherwise it's Θ(∞) if the input n is not smaller than 1. (For i == 1, the update i = sqrt(i) doesn't change i, so that's an infinite loop.)
So let us suppose it's actually
time = 0;
for (i = n; i > 1; i = sqrt(i))
for (j = 1; j <= i; j++)
time++;
Then, to get the complexity of nested loops, you need to sum the complexity of the inner loop for each iteration of the outer loop. Here, the inner loop runs i times, obviously, so we need to sum the values i runs through in the outer loop. These values are n, n^0.5, n^0.25, ..., n^(1/2^k), where k is characterised by
n^(1/2^(k+1)) < 2 <= n^(1/2^k)
or, equivalently,
2^(2^k) <= n < 2^(2^(k+1))
2^k <= lg n < 2^(k+1)
k <= lg (lg n) < k+1
k = floor(lg(lg n))
Now it remains to estimate the sum from above and below to get the Θ of the algorithm. This estimate is very easy if you start writing down the sums for a few (large) values of n.
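An empirical sketch (mine): run the corrected loops and check that time is dominated by the first pass of the inner loop, i.e. Theta(n). I use integer square roots here, which is an assumption about the intended semantics.

```python
import math

def run(n):
    time = 0
    i = n
    while i > 1:                  # corrected condition: i > 1
        for _ in range(i):        # inner loop: i iterations
            time += 1
        i = math.isqrt(i)         # i = sqrt(i), rounded down
    return time

# The later passes add only about sqrt(n) extra steps.
for n in [4, 100, 10**6]:
    assert n <= run(n) <= n + 3 * math.isqrt(n)
```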