What is the time complexity for this loop?

How can we find the time complexity of this loop?
int c = 0;
int j = 1;
while (j < n * n * n) {   // i.e. while j < n^3
    c += 1;
    System.out.println(c);
    j = j * 4;
}

Since j is multiplied by 4 on every iteration, the values it takes can be written as:
1, 4, 4^2, ..., 4^k
For the loop condition to become false we need
4^k >= n^3
k >= log4(n^3)
You can simplify this further:
log4(n^3) = 3*log4(n), and the constant factor 3 can be dropped, as we always do for constants, so
k = O(log n)
This is the complexity of your loop.
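As a quick sanity check, here is a minimal sketch that counts the iterations and compares them with log4(n^3) = 3*log4(n). The class and main wrapper, the counter, and the range of n are assumptions added for illustration, not part of the original question:
public class LoopCount {
    public static void main(String[] args) {
        for (int n = 2; n <= 1 << 10; n *= 2) {
            long cube = (long) n * n * n;              // n^3
            int iterations = 0;
            for (long j = 1; j < cube; j *= 4) {       // same loop, with a counter
                iterations++;
            }
            double predicted = Math.log(cube) / Math.log(4);   // log4(n^3)
            System.out.println("n=" + n + " iterations=" + iterations
                    + " log4(n^3)=" + predicted);
        }
    }
}
The count grows like 3*log4(n), i.e. O(log n).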

Related

Why is this snippet O(n)?

This code snippet is supposed to have a complexity of O(n), yet I don't understand why.
sum = 0;
for (k = 1; k <= n; k *= 2)      // for some arbitrary n
    for (j = 1; j <= k; j++)
        sum++;
Now, I understand that the outer loop by itself is O(log n), so why does adding the inner loop make this O(n)?
Let's assume that n is a power of 2 for a moment.
The final iteration of the inner loop will run n times. The iteration before that will run n/2 times, the one before that n/4 times, and so on, down to the first iteration, which runs once. This forms a series which sums to 2n − 1 total iterations, which is O(n).
(For example, with n = 16, the inner loop runs 1 + 2 + 4 + 8 + 16 = 31 total times.)
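Here is a minimal sketch that counts the inner-loop iterations for powers of two and compares them with 2n − 1 (the class, main, and counter name are assumptions added for illustration):
public class InnerLoopCount {
    public static void main(String[] args) {
        for (int n = 1; n <= 1 << 20; n *= 2) {   // powers of two, as assumed above
            long sum = 0;
            for (int k = 1; k <= n; k *= 2)
                for (int j = 1; j <= k; j++)
                    sum++;                         // total inner-loop iterations
            System.out.println("n=" + n + " iterations=" + sum + " 2n-1=" + (2L * n - 1));
        }
    }
}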
Let m = floor(lg(n)). Then 2^m = C*n with 1/2 < C <= 1. The number k of steps in the inner loop goes like:
1, 2, 4, 8, ..., 2^m = 2^0, 2^1, ..., 2^m
Therefore the total number of operations is
2^0 + 2^1 + ... + 2^m = 2^(m+1) - 1    ; think binary
                      = 2*2^m - 1
                      = 2*C*n - 1      ; replace 2^m by C*n
                      = O(n)

Time complexity of nested while with changing condition

I'm trying to work out the complexity of this loop:
for (int i = 0; i < n; i++) {
    c = i;
    while (c > 1) {
        // O(1) work
        c = c / 2;
    }
}
Since the while condition changes on every iteration of the for loop, I don't know how to work out the resulting series. I mean, if the loop were
for (int i = 0; i < n; i++) {
    c = n;
    while (c > 1) {
        // O(1) work
        c = c / 2;
    }
}
I know the while loop has a complexity of O(log n) and it repeats n times, so the complexity would be O(n log n).
The problem I have with the original loop is "c = i". Since c = i, the first time (i = 0) the while loop runs 0 times; when i = 1 it runs 0 times again; when i = 2 it runs once; and the sequence continues 0, 0, 1, 2, 2, 3, 3, ... (the number of while iterations on each pass of the for loop).
So the O(log n) body does not repeat n times but some other number of times I can't work out, and I don't know how to solve it.
This needs a bit of math. Using the fact that, for positive a and b,
log(a) + log(b) = log(ab)
here you have
log(1) + log(2) + ... + log(n) = log(1 * 2 * ... * n) = log(n!)
There is a well-known approximation for log(n!) (Stirling's approximation), namely
log(n!) ~ n*log(n) - n + 1
which shows that O(log(n!)) = O(n*log(n)).
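As a rough empirical check, the sketch below counts the total while-loop iterations and compares them with n*log2(n). The class, main, counter, and the choice n = 2^20 are assumptions for illustration; only the growth rate matters, not the exact constant:
public class NestedWhileCount {
    public static void main(String[] args) {
        int n = 1 << 20;
        long total = 0;
        for (int i = 0; i < n; i++) {
            int c = i;
            while (c > 1) {            // runs about floor(log2(i)) times
                total++;
                c = c / 2;
            }
        }
        double nLog2n = n * (Math.log(n) / Math.log(2));
        System.out.println("total=" + total + "  n*log2(n)=" + (long) nLog2n);
    }
}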

Time complexity for two nested for loops

This question is for revision from a past test paper; I just need advice on whether I'm on the right track.
Work out the time complexity T(n) of the following piece of code in terms of number of operations for a given integer n:
for (int k = n; k > 0; k /= 3) {
    for (int i = 0; i < n; i += 2) {
        // constant number C of elementary operations
    }
    for (int j = 2; j < n; j = j * j) {
        // constant number C of elementary operations
    }
}
So I thought the outer loop would be O(log n), the first inner loop O(n), and the second inner loop O(log n). I just wanted to know whether that rough idea is right and how to move forward from here.
There was a somewhat similar question a few days ago for which I provided a step-by-step complexity analysis: https://stackoverflow.com/a/49093954/926701
The outer loop is indeed O(log3(n))
The first inner loop is indeed O(n)
The second inner loop is O(log2(log2(n)))
Informally, for the second inner loop, with j(k) the k-th value taken by the index j, we can write:
j(1) = 2, j(2) = j(1)^2 = 4, j(3) = j(2)^2 = 16, ..., and the loop stops once j(k) >= n
=> j(k) = j(k-1)^2 = j(k-2)^4 = ... = j(1)^(2^(k-1)) = 2^(2^(k-1))
=> 2^(2^(k-1)) >= n gives k >= log2(log2(n)) + 1, i.e. k = O(log2(log2(n)))
For example, with n = 65536 = 2^16, j takes the successive values 2, 4, 16, 256, 65536 and stops, so k = 5 = log2(log2(65536)) + 1.
Since the number of operations in the inner loops is independent of the outer loop index, we can multiply the complexities:
T(n) = O(log3(n) * (n + log2(log2(n))))
     = O(n * log3(n))
because log2(log2(n)) << n as n -> +Inf.
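A rough empirical check of this bound (a minimal sketch; the class, main, operation counter, and range of n are assumptions, and the constant factor differs from n*log3(n) because the first inner loop steps by 2; only the growth rate is being checked):
public class TestPaperLoops {
    public static void main(String[] args) {
        for (int n = 1 << 10; n <= 1 << 16; n *= 4) {
            long ops = 0;
            for (int k = n; k > 0; k /= 3) {
                for (int i = 0; i < n; i += 2) {
                    ops++;                           // one constant-cost block
                }
                for (long j = 2; j < n; j = j * j) {
                    ops++;                           // one constant-cost block
                }
            }
            double nLog3n = n * (Math.log(n) / Math.log(3));
            System.out.println("n=" + n + " ops=" + ops + " n*log3(n)=" + (long) nLog3n);
        }
    }
}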

Explain the answer of this Big O complexity code

for (int bound = 1; bound <= n; bound *= 2) {
    for (int i = 0; i < bound; i++) {
        for (int j = 0; j < n; j += 2) {
            ... // constant number of operations
        }
        for (int j = 1; j < n; j *= 2) {
            ... // constant number of operations
        }
    }
}
The correct answer is O(n^2). As I understand it, the first two for loops are nested: bound goes from 1 to n, and i goes up to bound. The third and fourth loops are not nested within each other.
The third for loop has complexity O(n/2) and the fourth for loop has complexity O(log n).
Someone told me that since n^2 > n/2 + log n, the algorithm is O(n^2). (I'm really confused about this step.)
I thought you were supposed to multiply the loops, so shouldn't it be log n (outer loop) * n (second loop) * n (third) * log n (fourth)? How do I know which loops to add and which to multiply, and which loop's value to take (do I take n over log n for the first two loops because n is smaller to process)?
The complexity is O(n^2).
First consider the inner-most loops:
for (int j = 0; j < n; j += 2) {
    ... // constant number of operations
}
Index j takes values 0, 2, 4, 6, 8, ... up to n, so j can take at most n / 2 + 1 possible values, hence the complexity of this loop is O(n).
And for the other inner-most loop:
for (int j = 1; j < n; j *= 2) {
    ... // constant number of operations
}
Index j takes values 1, 2, 4, 8, 16, ... up to n, so j can take at most log(n) + 1 possible values, hence the complexity of this loop is O(log(n)).
Therefore for any fixed bound and i, the total work done by the two inner-most loops is O(n) + O(log(n)) = O(n).
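A tiny check of those two counts (a minimal sketch; the class, main, counters, and n = 1000 are assumptions added for illustration):
public class InnermostCounts {
    public static void main(String[] args) {
        int n = 1000;
        int evenSteps = 0;
        for (int j = 0; j < n; j += 2) evenSteps++;       // expect about n/2
        int doublingSteps = 0;
        for (int j = 1; j < n; j *= 2) doublingSteps++;   // expect about log2(n)
        System.out.println("n=" + n + " evenSteps=" + evenSteps
                + " doublingSteps=" + doublingSteps);
    }
}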
Now consider the two outer-most loops. Note that if the variable bound for the outer-most loop is k, then the loop indexed by i will loop exactly k times.
for (int bound = 1; bound <= n; bound *= 2) {
    for (int i = 0; i < bound; i++) {
        // work done by the inner-most loops
    }
}
Since bound takes the values 1, 2, 4, 8, ..., 2^⌊log(n)⌋, these two nested loops execute their body
1 + 2 + 4 + 8 + ... + 2^⌊log(n)⌋
times in total. This is a geometric series with common ratio 2, so the summation gives us
2^(⌊log(n)⌋+1) - 1
<= 2n - 1
= O(n)
Since the work done by the inner-most loops is independent of these two loops, the total complexity is O(n) * O(n) = O(n^2).
The first and second successive innermost loops have O(n) and O(log n) complexity, respectively. Thus, the overall complexity of the innermost part is O(n). The outermost and middle loops have at most O(log n) and O(n) iterations, so multiplying the three bounds gives the straightforward (and valid) upper bound O(n^2 * log(n)). It is not tight, however: the middle loop runs only bound times, and summed over all values of bound that is O(n) iterations in total, which brings the bound down to O(n^2).
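An empirical check of the quadratic growth (a minimal sketch; the class, main, counter, and range of n are assumptions, and each constant-cost block is counted as one operation). If the complexity is O(n^2), doubling n should roughly quadruple the count, i.e. ops/n^2 stays roughly constant:
public class BoundLoopCount {
    public static void main(String[] args) {
        for (int n = 1 << 8; n <= 1 << 12; n *= 2) {
            long ops = 0;
            for (int bound = 1; bound <= n; bound *= 2) {
                for (int i = 0; i < bound; i++) {
                    for (int j = 0; j < n; j += 2) ops++;
                    for (int j = 1; j < n; j *= 2) ops++;
                }
            }
            System.out.println("n=" + n + " ops=" + ops
                    + " ops/n^2=" + (double) ops / ((double) n * n));
        }
    }
}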

Big O Time Complexity for this code

Given the following code:
for (int i = 1; i <= N; i++)
    for (int j = 1; j <= N; j = j + i) {
        // do something
    }
I know that the outer loop runs N times, and that on the i-th iteration of the outer loop the inner loop runs approximately N/i times: N, N/2, N/3, and so on. This is just a rough calculation from which one can guess that the time complexity will be O(N log N).
How would I mathematically prove the same?
I know that on the i-th iteration, j increments by i, so the inner loop makes about ceil(N/i) steps.
The total number of iterations of the inner loop for each value of i will be
i = 1: j = 1, 2, 3, ..., n   ---> total iterations = n
i = 2: j = 1, 3, 5, ..., n   ---> total iterations ≈ n/2
i = 3: j = 1, 4, 7, ..., n   ---> total iterations ≈ n/3
...
i = m: j = 1, 1 + m, ..., n  ---> total iterations ≈ n/m
...
i = n: j = 1                 ---> total iterations = 1
So the total number of iterations is approximately n + n/2 + n/3 + ... + 1 = n(1 + 1/2 + 1/3 + ... + 1/n).
The sum in parentheses is the harmonic series, whose value is approximately ln(n) + C, so the total number of iterations is approximately n ln(n); and since all logarithms are related by a constant factor, the complexity is O(n log n).
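A quick empirical check (a minimal sketch; the class, main, counter, and range of N are assumptions added for illustration):
public class HarmonicCount {
    public static void main(String[] args) {
        for (int N = 1 << 10; N <= 1 << 16; N *= 4) {
            long iterations = 0;
            for (int i = 1; i <= N; i++)
                for (int j = 1; j <= N; j = j + i)
                    iterations++;                    // counts the "do something" executions
            double nLnN = N * Math.log(N);           // N * ln(N)
            System.out.println("N=" + N + " iterations=" + iterations
                    + " N*ln(N)=" + (long) nLnN);
        }
    }
}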

Resources