Algorithm Complexity: Nested For Loops

for (j = n; j > 1; j /= 2)
    ++p;
for (k = 1; k < p; k *= 2)
    ++q;
In the first code sample, the variable p belongs to the first loop. So even though the loops are not nested, will the second loop also come out as log(n)? Or, in total, O(log(log(n)))?
for (i = n; i > 0; i--) {
    for (j = 1; j < n; j *= 2) {
        for (k = 0; k < j; k++) {
            // statements - O(1)
        }
    }
}
And in this one the loops are nested, but the bound of the k loop comes from the second loop's variable j. So should it be analysed like the first sample? Is it O(n^2 * log(n)) or O(n * log^2(n))?

Algorithm:
The first loop takes log(n) time. The second loop takes log(log(n)) time, since it counts up to p = log(n) by doubling. Because the loops run one after the other, you add the terms: log(n) + log(log(n)). The first term dominates, so it's O(log(n)).
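As a sanity check, here is a minimal, self-contained Java sketch (the class name and the choice n = 2^20 are mine, not from the question) that just counts the iterations of both sequential loops:

public class SequentialLoopCount {
    public static void main(String[] args) {
        int n = 1 << 20;                 // n = 1,048,576 (my own test value)
        int p = 0, q = 0;

        for (int j = n; j > 1; j /= 2)   // halves j each step -> about log2(n) iterations
            ++p;
        for (int k = 1; k < p; k *= 2)   // doubles k up to p -> about log2(p) = log2(log2(n)) iterations
            ++q;

        // Prints p = 20 (= log2(n)) and q = 5 (about log2(log2(n)))
        System.out.println("p = " + p + ", q = " + q);
    }
}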
Algorithm: At first glance, analysing only the two outer for loops, you would get n log(n). But the inner third for loop makes it more complex.
The third for loop counts from 0 up to j, where j takes the values 2^m for m = 0, 1, ..., log(n)-1. So you have to sum 2^m for m from 0 to log(n)-1, which is n-1. So n-1 is the run time of the two inner for loops together. Now multiply that by the linear run time of the outer for loop: (n-1)*n = n² - n. So you have O(n²) for the three loops.
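Again as a rough numerical check, a small sketch (the harness and the value n = 1024 are my own) that counts how many times the innermost O(1) statement executes:

public class TripleLoopCount {
    public static void main(String[] args) {
        int n = 1024;                    // a power of two, chosen for a clean count
        long ops = 0;
        for (int i = n; i > 0; i--) {
            for (int j = 1; j < n; j *= 2) {
                for (int k = 0; k < j; k++) {
                    ops++;               // stands in for the O(1) statements
                }
            }
        }
        // For n a power of two: ops = n * (n - 1) = 1024 * 1023 = 1047552
        System.out.println("ops = " + ops + ", n*(n-1) = " + (long) n * (n - 1));
    }
}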

Related

Time Complexity of this nested for-loop algorithm?

I'm having some trouble calculating the big-O of this algorithm:
public void foo(int[] arr){
    int count = 0;
    for (int i = 0; i < arr.length; i++){
        for (int j = i; j > 0; j--){
            count++;
        }
    }
}
I know the first for loop is O(n) time, but I can't figure out what the nested loop is. I was thinking O(log n), but I don't have solid reasoning. I'm sure I'm missing something pretty easy, but some help would be nice.
Let n denote the length of the array.
If you consider the second loop alone, it is just a function of i: since it iterates over all values from i down to 1, its complexity is O(i). Since i < n, you can say that it is O(n). However, there is no logarithm involved: in the worst case, i.e. when i is close to n, you perform about n iterations.
As for evaluating the complexity of both loops, observe that for each value of i, the second loop goes through i iterations, so the total number of iterations is
1 + 2 + ... + (n-1) = n*(n-1)/2 = (1/2)*(n^2 - n)
which is O(n^2).
If we denote by c the total number of times count is incremented in the inner loop, then c can be written as the sum below:
c = 0 + 1 + 2 + ... + (n-1) = n*(n-1)/2
As you can see, the total time complexity of the algorithm is O(n^2).
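To see the formula in action, here is a hedged sketch (the class name and n = 1000 are hypothetical, not from the question) that runs the same two loops and compares the final count with n*(n-1)/2:

public class PairCountCheck {
    static long foo(int[] arr) {
        long count = 0;
        for (int i = 0; i < arr.length; i++) {
            for (int j = i; j > 0; j--) {
                count++;
            }
        }
        return count;
    }

    public static void main(String[] args) {
        int n = 1000;
        long counted = foo(new int[n]);
        long formula = (long) n * (n - 1) / 2;            // 0 + 1 + 2 + ... + (n-1)
        System.out.println(counted + " vs " + formula);   // both print 499500
    }
}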

Time complexity of the code containing condition

void foo(int n)
{
    int s = 0;
    for (int i = 1; i <= n; i++)
        for (int j = 1; j <= i * i; j++)
            if (j % i == 0)
                for (int k = 1; k <= j; k++)
                    s++;
}
What is the time complexity of the above code?
I am getting it as O(n^5) but it is not correct.
The complexity is O(n^4).
The innermost loop is entered i times for each value of i (there are exactly i multiples of i among 1..i*i). The pattern of the j values looks like this:
j = 1 2 ... i ... 2*i ... 3*i ... ... i*i
            x       x       x          x
Each x marks a value of j (a multiple of i) for which the innermost for loop actually executes, doing j iterations. For every other value of j, only the test j % i == 0 is done and it fails.
Now look at the segments between consecutive multiples: the j-th segment ends with the innermost loop doing i*j iterations (j = 1, 2, 3, ..., i), and each segment contains i condition checks.
There are exactly i such segments.
So the total work for a fixed i is
i*(1 + 1 + 1 + ... + 1) + i*(1 + 2 + 3 + ... + i) = i*i + i*i*(i+1)/2 ~ i^3
With the outer loop it will be n^4.
What does this mean? For each j = 1, 2, ..., i, the work can be divided up as
O(i*j + i)
where the i*j part is the innermost loop being executed (at the j-th multiple of i), and the i part covers the other cases, when the test fails and the loop is simply skipped.
Now if we sum this over j, it has complexity O(n^3) for a fixed i.
With the added external loop over i, it will be O(n^4).
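If you want to sanity-check these counts numerically, a rough Java sketch follows (the harness, counters and test values of n are my own); the s increments grow roughly like n^4/8 and the condition tests like n^3/3:

public class ConditionLoopCount {
    // Returns {number of s increments, number of j % i == 0 tests} for a given n.
    static long[] count(int n) {
        long increments = 0, tests = 0;
        for (int i = 1; i <= n; i++)
            for (int j = 1; j <= i * i; j++) {
                tests++;                         // the condition is evaluated for every j
                if (j % i == 0)
                    for (int k = 1; k <= j; k++)
                        increments++;            // stands in for s++
            }
        return new long[]{increments, tests};
    }

    public static void main(String[] args) {
        for (int n : new int[]{50, 100, 200}) {
            long[] c = count(n);
            System.out.printf("n=%d  s increments=%d (~n^4/8)  tests=%d (~n^3/3)%n",
                    n, c[0], c[1]);
        }
    }
}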
Your function computes 4-dimensional pyramidal numbers (A001296). The number of increments to s can be computed using this formula:
a(n) = n*(1+n)*(2+n)*(1+3*n)/24
Therefore, the complexity of your function is O(n^4).
The reason it is not O(n^5) is that if (j%i == 0) proceeds with the "payload" loop only for multiples of i, of which we have exactly i among all j's in the range from 1 to i^2, inclusive.
Hence, we add one for the outermost loop, one for the loop in the middle, and two for the innermost loop, because it iterates up to i^2, for a total of 4.
Why only one for the middle loop (j)? It runs up to i^2, right?
Perhaps it would be easier to see if we rewrite the code to exclude the condition:
int s = 0;
for (int i = 1; i <= n; i++)
    for (int j = 1; j <= i; j++)
        for (int k = 1; k <= i * j; k++)
            s++;
return s;
This code produces the same number of "payload loop" iterations, but rather than "filtering out" the iterations that skip the inner loop it removes them from consideration by computing the terminal value of k in the innermost loop as i*j.
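For a concrete confirmation, here is a small sketch (the test harness and class name are mine) that evaluates the original condition-based code, the condition-free rewrite, and the closed-form A001296 formula, and checks that they agree:

public class PyramidalCheck {
    static long original(int n) {
        long s = 0;
        for (int i = 1; i <= n; i++)
            for (int j = 1; j <= i * i; j++)
                if (j % i == 0)
                    for (int k = 1; k <= j; k++)
                        s++;
        return s;
    }

    static long rewritten(int n) {
        long s = 0;
        for (int i = 1; i <= n; i++)
            for (int j = 1; j <= i; j++)
                for (int k = 1; k <= i * j; k++)
                    s++;
        return s;
    }

    static long formula(long n) {
        return n * (1 + n) * (2 + n) * (1 + 3 * n) / 24;   // A001296
    }

    public static void main(String[] args) {
        int n = 60;
        // All three print 1711355, a quantity that grows like n^4.
        System.out.println(original(n) + " " + rewritten(n) + " " + formula(n));
    }
}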

Why doesn't the while loop contribute O(n) to the total time complexity? [duplicate]

I got this problem from Interview Bit
Problem
int j = 0;
for (i = 0; i < n; ++i) {
    while (j < n && arr[i] < arr[j]) {
        j++;
    }
}
Question
The total number of comparisons that the while loop does is about n (probably less than or equal to n, depending on arr). The outer for loop runs n times. Shouldn't the time complexity be O(n^2)?
One of the conditions in the while loop is j < n. That means that, in the worst case, the code will only ever iterate in the while loop n times in total, regardless of how many iterations the outer for loop does (j starts at zero and only increases; it is never reset to zero or decreased). Since the for loop also runs n times, the big-O is O(n + n) => O(n).
NOTE: You can safely ignore the other condition arr[i] < arr[j], since we're just considering what the runtime would be in the worst-case.
This code looks like it was purposefully designed to be misleading. The while loop only runs once from 0 to n, and does not reset for every iteration of the outer for loop.
You need to count the total number of times the statements in the innermost loop get executed.
The nested while loop does not contribute an extra factor to the complexity because it steps through the values from 0 to n-1 only once, even though those steps may be distributed among different iterations of the outer loop.
The innermost "payload" of the loop, i.e. arr[i] < arr[j] and j++, will execute at most n times, because incrementing j is a "one-way street": its value is never reset back to zero, so once j reaches n, the body of the loop no longer executes.
Actually the inner loop does not depend on i, so it will run at most n times in total as i goes from 0 to n-1.
The complexity would be O(n^2) if j were re-initialised to 0 before the while loop on every iteration of the for loop: then, in the worst case (array elements in descending order), the while loop would execute 1 + 2 + 3 + ... + (n-2) + (n-1) times over the successive values of i, which is O(n^2).
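Here is a small sketch (the descending test array and the counter are my own additions) that instruments the loop and shows the total number of j++ increments never exceeds n, even though the outer for loop itself runs n times:

public class AmortizedWhileCount {
    public static void main(String[] args) {
        int n = 10000;
        int[] arr = new int[n];
        for (int i = 0; i < n; i++) arr[i] = n - i;   // descending: n, n-1, ..., 1

        long increments = 0;                          // total work done by the while loop
        int j = 0;
        for (int i = 0; i < n; i++) {
            while (j < n && arr[i] < arr[j]) {
                j++;
                increments++;
            }
        }
        // j is never reset, so increments <= n; here it prints 9999.
        System.out.println("j increments = " + increments + " (n = " + n + ")");
    }
}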

Why is this algorithm O(nlogn)?

I'm reading a book on algorithm analysis and have found an algorithm which I don't know how to get the time complexity of, although the book says that it is O(nlogn).
Here is the algorithm:
sum1 = 0;
for (k = 1; k <= n; k *= 2)
    for (j = 1; j <= n; j++)
        sum1++;
Perhaps the easiest way to convince yourself of the O(n*lgn) running time is to run the algorithm on a sheet of paper. Consider what happens when n is 64. Then the outer loop variable k would take the following values:
1 2 4 8 16 32 64
log_2(64) is 6, and the number of terms above is 6 + 1 = 7, i.e. log_2(n) + 1. You can continue this line of reasoning to conclude that the outer loop will take O(lg n) running time.
The inner loop, which is completely independent of the outer loop, is O(n). Multiplying these two terms together yields O(lgn*n).
In your first loop for(k=1; k<=n; k*=2), variable k reaches the value of n in log n steps since you're doubling the value in each step.
The second loop for(j=1; j<=n; j++) is just a linear cycle, so requires n steps.
Therefore, total time is O(nlogn) since the loops are nested.
To add a bit of mathematical detail...
Let a be the number of times the outer loop for(k=1; k<=n; k*=2) runs. After a iterations, k has reached the value 2^a (note the loop update k*=2), and the loop stops once k exceeds n, so roughly n = 2^a. Solve for a by taking the base-2 log on both sides, and you get a = log_2(n).
Since the inner loop runs n times, total is O(nlog_2(n)).
To add to #qwerty's answer, if a is the number of times the outer loop runs:
    k takes values 1, 2, 4, ..., 2^a and 2^a <= n
    Taking log on both sides: log_2(2^a) <= log_2(n), i.e. a <= log_2(n)
So the outer loop has an upper bound of log_2(n), i.e. it cannot run more than log_2(n) times.
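A quick empirical check (my own harness, with n = 2^16) that counts the increments of sum1 and compares the total against n * (log2(n) + 1):

public class NLogNCount {
    public static void main(String[] args) {
        int n = 1 << 16;                              // 65536
        long sum1 = 0;
        for (int k = 1; k <= n; k *= 2)
            for (int j = 1; j <= n; j++)
                sum1++;

        int log2n = 31 - Integer.numberOfLeadingZeros(n);           // floor(log2(n)) = 16
        System.out.println(sum1 + " = " + (long) n * (log2n + 1));  // 1114112 = 1114112
    }
}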

Big O sum of integers runtime

I am trying to learn Big O and am confused about an algorithm I just came across. The algorithm is:
void pairs(int[] array){
    for (int i = 0; i < array.length; i++){
        for (int j = i + 1; j < array.length; j++){
            System.out.println(array[i] + "," + array[j]);
        }
    }
}
I think the first for loop is O(n), and I know the second for loop is O(1/2*n*(n+1)). The answer to the problem was that the run time for the function is O(n^2). I simplified O(1/2*n*(n+1)) to O(1/2*(n^2+n)). So I'm confused, because I thought that you needed to multiply the two run-time terms since the for loops are nested, which would give O(n) * O(1/2*(n^2+n)). I simplified this to O(1/2*n^3 + 1/2*n^2). From what I understand of Big O, you only keep the largest term, so this would reduce to O(n^3). Can anyone help me out with where I went wrong? I'm not sure how the answer is O(n^2) instead of O(n^3).
When you say the inner loop is O(1/2*n(n+1)), you are actually describing the big-O complexity of both loops.
To say that the outer loop has complexity O(N) basically means its body runs N times. But for your calculation of the inner loop's complexity, you already took account of all iterations of the outer loop, because you added up the number of times the inner loop runs over all iterations of the outer loop. If you multiply again by N, you would be saying that the outer loop itself is re-run another N times.
Put another way, what your analysis shows is that the inner loop body (the System.out.println call) runs 1/2*n(n+1) times overall. That means the overall complexity of the two-loop combination is O(1/2*n(n+1)) = O(n^2). The overall complexity of the two-loop combination describes how many times the innermost code is run.
Your mistake is counting the second loop as O(1/2*n^2) and then multiplying by the first loop again...
First, you can clearly see the inner loop is capped at N-1 iterations (when i = 0).
The first loop is clearly N.
The second loop is at most N-1.
Therefore, O(N^2).
If we examine it a little more:
the second loop will run N-1 times when i = 0,
then N-2 times for i = 1,
and a single time for i = n-2 (and zero times for i = n-1).
That is 1/2*n*(n-1) = 1/2*n^2 - 1/2*n = O(n^2).
Notice this already includes all iterations of the outer loop too!
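To confirm the count, here is a counter-based variant of pairs (the class name, the counter, and n = 2000 are my own) showing that the innermost body runs exactly n*(n-1)/2 times, which is O(n^2), not O(n^3):

public class PairsCount {
    static long countPairs(int[] array) {
        long calls = 0;                                   // stands in for System.out.println
        for (int i = 0; i < array.length; i++) {
            for (int j = i + 1; j < array.length; j++) {
                calls++;
            }
        }
        return calls;
    }

    public static void main(String[] args) {
        int n = 2000;
        // Both print 1999000, i.e. n*(n-1)/2.
        System.out.println(countPairs(new int[n]) + " vs " + (long) n * (n - 1) / 2);
    }
}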

Resources