Time complexity for nested for loop

for (int i = 1; i < n; i = i * 2)
{
    for (int j = 1; j < i; j++) {
        printf("hello");
    }
}
Is the time complexity O(log n) or O(n log n)?

It's neither. It's O(n).
First, study the special cases where n = 2^k - 1, e.g. 1, 3, 7, 15, 31, 63, 127, 255, ... For such n the outer loop visits i = 1, 2, 4, ..., 2^(k-1), and the inner loop does i - 1 iterations for each of them, so the total number of prints is (1 - 1) + (2 - 1) + (4 - 1) + ... + (2^(k-1) - 1) = 2^k - 1 - k < n.
Second, for any n, prove that if 2^(k-1) <= n < 2^k then 2^k - 1 <= 2n; the work for a general n is bounded by the work for the next value of the form 2^k - 1, which is at most 2n.
That's all: the total is Θ(n), hence O(n).
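As a quick empirical check (a minimal sketch, not part of the original answer), the following counts the inner-loop iterations for a few values of n and compares them against the bound 2n:

#include <stdio.h>

int main(void) {
    /* Count how often the inner body runs for a few values of n
       and compare with the bound 2n. */
    for (long n = 10; n <= 100000; n *= 10) {
        long count = 0;
        for (long i = 1; i < n; i = i * 2)
            for (long j = 1; j < i; j++)
                count++;               /* stands in for printf("hello") */
        printf("n = %7ld  iterations = %7ld  2n = %7ld\n", n, count, 2 * n);
    }
    return 0;
}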

A common wrong answer here is O(n^2) [n squared]; that would be correct if i increased by 1 on every iteration, but because i doubles each time, the total work is only Θ(n).

Related

What can be the time complexity of this code?

I am a bit confused by this. I know that if the outer loop runs while i < n and the inner loop while j < i, the complexity is O(n^2). But if we raise the limits from n and i to n^2 and i^2 respectively, does the complexity double to O(n^4), or does it become cubic, O(n^3)?
for (long i = 1; i < n * n; i++)
{
    for (long j = 1; j < i * i; j++)
    {
        // some code
    }
}
Assuming that //some code takes O(1) operations, the time complexity is O(n^6). Since the inner loop takes i^2 - 1 iterations, you can use the sum-of-squares formula (or equivalently use Wolfram Alpha) to get
sum_{i=1}^{n^2 - 1} (i^2 - 1) = (n^2 - 1) * n^2 * (2n^2 - 1) / 6 - (n^2 - 1),
which grows like n^6 / 3, i.e. Θ(n^6).
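As a rough sanity check (a minimal sketch, not from the original answer), this counts the actual iterations for a few small n and compares them with n^6 / 3, the leading term of that sum:

#include <stdio.h>

int main(void) {
    /* Count the iterations of the nested loop for a few small n
       and compare with n^6 / 3, the leading term of the sum. */
    for (long n = 2; n <= 16; n *= 2) {
        long long count = 0;
        for (long i = 1; i < n * n; i++)
            for (long j = 1; j < i * i; j++)
                count++;
        long long n6 = (long long)n * n * n * n * n * n;
        printf("n = %2ld  iterations = %12lld  n^6/3 = %12lld\n",
               n, count, n6 / 3);
    }
    return 0;
}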

What is the time complexity of this code, in Big-O, and how is it derived?

int i, j, k = 0;
for (i = n / 2; i <= n; i++) {
    for (j = 2; j <= n; j = j * 2) {
        k = k + n / 2;
    }
}
I came across this question and this is what I think.
The outer loop will run N/2 times and the inner loop will run log N times, so it should be (N/2) * log N. But this is not the correct answer.
The correct answer is O(N log N); can anybody tell me what I am missing?
Any help would be appreciated.
Let's take a look at this block of code.
First of all, you can notice that the inner loop doesn't depend on the outer one, so its complexity is the same at every iteration.
for (j = 2; j <= n; j = j * 2) {
    k = k + n / 2;
}
I think your knowledge is enough to see that the complexity of this loop is O(log n).
Now we need to understand how many times this loop will be performed, so we should take a look at the external loop:
for (i = n/2; i <= n; i++) {
and find out that there will be about n/2 iterations, or O(n) in Big-O notation.
Combine these complexities and you'll see that the O(log n) loop is performed O(n) times, so the total complexity is O(n) * O(log n) = O(n log n). Note that the (N/2) * log N from the question is the same thing: the constant factor 1/2 is dropped in Big-O notation.
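A small counting sketch (my illustration, not part of the original answer) that tallies the executions of the O(1) body and compares them with (n/2) * log2(n):

#include <stdio.h>
#include <math.h>   /* link with -lm on some systems */

int main(void) {
    /* Count how many times the O(1) body (k = k + n/2) would run
       and compare with (n/2) * log2(n). */
    for (int n = 16; n <= 4096; n *= 4) {
        long count = 0;
        for (int i = n / 2; i <= n; i++)
            for (int j = 2; j <= n; j = j * 2)
                count++;
        printf("n = %5d  iterations = %8ld  (n/2)*log2(n) = %8.0f\n",
               n, count, (n / 2.0) * log2((double)n));
    }
    return 0;
}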

Complexity of a nested for loop

I would like to know if my solution for the complexity of this code is correct:
for (j = 2^N; j > 1; j = j / 2) {   /* 2^N and log N are meant mathematically (pseudocode) */
    h = h * 2;
    for (i = 1; i < j; i = i * 2)
        for (k = 2; k < log N; k++)
            cont++;
}
In my opinion, the last (innermost) loop has complexity log N,
the first (outermost) loop has complexity N,
and the second loop has complexity log N,
so the total complexity is N log N.
Best Regards
You have three loops here:
The first is linear in N (logarithmic in 2^N): j is halved from 2^N down to 2, which takes N steps.
The second is at most linear in N (logarithmic in j <= 2^N): i doubles from 1 up to j.
The third is logarithmic in N.
So the whole code is O(N * N * log N) = O(N^2 log N).
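A rough empirical sketch (my addition, with 2^N written as a shift and log N as log2()) that counts the increments of cont and juxtaposes them with the leading term N^2 * log2(N) / 2:

#include <stdio.h>
#include <math.h>   /* link with -lm on some systems */

int main(void) {
    /* Count the increments of cont; the O(1) update h = h * 2 is
       omitted since it does not affect the count. */
    for (int N = 8; N <= 20; N += 4) {
        long cont = 0;
        for (long j = 1L << N; j > 1; j = j / 2)
            for (long i = 1; i < j; i = i * 2)
                for (int k = 2; k < log2((double)N); k++)
                    cont++;
        printf("N = %2d  cont = %8ld  N^2*log2(N)/2 = %8.0f\n",
               N, cont, N * (double)N * log2((double)N) / 2.0);
    }
    return 0;
}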

How is the time complexity of these simple loops calculated?

I understand that for:
for (int i = 0; i < n; i++)
the time complexity is O(n).
for (int i = 0; i < n; i++)
    for (int j = 0; j < n; j++)
        for (int k = 0; k < n; k++)
This is O(n^3), right?
i = 1;
do {
    // ......
    i++;
} while (i * 2 < n);
Is this O(n)? Or is it exactly O(n/2)?
O(n/2) is just O(n) with a constant coefficient of 1/2. The coefficient could be 10 billion and it would still be O(n), and not e.g. O(n^1.0001), which is a different complexity class.
The first one has complexity O(n^3), correct.
The second one is O(cn) with c constant. No matter how huge c is, according to the definition of big-O, the complexity is still O(n).
However, O-notation is sometimes considered harmful precisely because it hides constant factors like these.
The first one is O(n^3), you're right.
Your second algorithm is O(n/2) = O(Cn) = O(n); 1/2 is a constant, so we can safely discard it.
This fragment of code:
i = 1;
do {
    // ......
    i++;
} while (i * 2 < n);
is equivalent to this one:
for (i = 1; i < n / 2; ++i) { /* ...... */ }
Roughly speaking, this runs n/2 times, so it is O(n).
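To make the equivalence concrete, here is a small counting sketch (my addition) that tallies the do-while iterations and compares them with n/2:

#include <stdio.h>

int main(void) {
    /* Count how many times the do-while body runs and compare with n/2. */
    for (int n = 10; n <= 10000; n *= 10) {
        int count = 0;
        int i = 1;
        do {
            count++;      /* stands in for the elided body */
            i++;
        } while (i * 2 < n);
        printf("n = %6d  iterations = %6d  n/2 = %6d\n", n, count, n / 2);
    }
    return 0;
}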

Big-O complexity of nested for loops

I'm confused about the complexity of the following (the operation performed inside the inner loop is in constant time):
for (int i = 0; i < n; i++)
    for (int j = i; j < n; j++)
Is this O(n^2) or O(n)? I figure O(n^2). Any ideas?
also the following makes me curious:
for (int i = 0; i < n; i++)
    for (int j = 0; j < i; j++)
Definitely O(n^2), of course. Summary explanation for both cases: 1 + 2 + ... + n is n(n+1)/2, that is, (n^2 + n)/2 (and in big-O we drop the second, lesser part, so we're left with n^2/2, which is of course O(n^2)).
You are correct, those nested loops are still O(n^2). The actual number of operations is something close to (n^2)/2, which, after discarding the constant 1/2 factor, is O(n^2).
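For completeness, a small counting sketch (my addition) showing that both variants perform about n^2/2 inner iterations:

#include <stdio.h>

int main(void) {
    /* Count the inner-loop iterations of both variants and compare
       with n(n+1)/2, which is about n^2/2. */
    for (long n = 10; n <= 10000; n *= 10) {
        long a = 0, b = 0;
        for (long i = 0; i < n; i++)
            for (long j = i; j < n; j++)
                a++;                       /* first variant: j starts at i */
        for (long i = 0; i < n; i++)
            for (long j = 0; j < i; j++)
                b++;                       /* second variant: j stops at i */
        printf("n = %5ld  first = %9ld  second = %9ld  n(n+1)/2 = %9ld\n",
               n, a, b, n * (n + 1) / 2);
    }
    return 0;
}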
