time complexity calculation for two for loops with connecting variables - algorithm

What would be the time complexity of this:
for (k = 1; k <= n; k *= 2)
    for (j = 1; j <= k; j++)
        sum++;
For this I reasoned as follows:
1. The outer loop will run log n times.
2. The inner loop will also run log n times, because I think the inner loop variable j is related to k, so however many times the outer loop runs, the inner loop runs the same. So total = O(log n * log n).
But the text gives total = O(2n - 1).
Can you please explain?

When k is 1, sum++ runs 1 time.
When k is 2, sum++ runs 2 times.
When k is 4, sum++ runs 4 times.
When k reaches n = 2^m, sum++ runs 2^m times.
So we must calculate
1 + 2 + 4 + ... + 2^m = 2^0 + 2^1 + 2^2 + ... + 2^m = (1 - 2^(m+1)) / (1 - 2) = 2^(m+1) - 1
Because we put n = 2^m:
m = log(n)
2^(log(n)) = n
2 * 2^m - 1 = 2 * n - 1

This problem is most easily interpreted by first looking at the total work done by the inner loop. Suppose that the outer loop runs 'M' times... then the total number of 'sum++' operations will be,
1 + 2 + 4 + ... + 2^(M-1)
This sum can be reduced to '2^M - 1' by noticing that it is a binary number composed of all 1's. Now the question is: what is M? You've already figured this out: M = log(n) + 1 (the +1 accounts for the very first pass, where k = 2^0 = 1). Plugging this into the total leaves us with,
2^(log(n)+1)-1 = 2*n - 1.
Thus the entire loop scales as O(2n - 1), i.e. O(n). Hope this helps!
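As a quick sanity check (my own addition, not part of the original answers), here is a small C program that counts the sum++ operations directly and compares the count with 2n - 1; the two agree whenever n is a power of two, and 2n - 1 is an upper bound otherwise:

#include <stdio.h>

/* Count the sum++ operations directly and compare with 2n - 1. */
int main(void) {
    for (int n = 1; n <= 1024; n *= 2) {
        long sum = 0;
        for (long k = 1; k <= n; k *= 2)
            for (long j = 1; j <= k; j++)
                sum++;
        printf("n = %4d  sum = %6ld  2n - 1 = %6d\n", n, sum, 2 * n - 1);
    }
    return 0;
}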

Related

How to do this nested for loop time complexity?

I'm trying to figure out this time complexity:
for (i = 0; i <= (n/2) - 1; i++) {
    for (j = i + 1; j <= (n/2) - 1; j++) {
        --some O(1) thing--
    }
}
The outer loop I understand to be O(n/2) on its own. However, with the inner loop as well, I can't wrap my brain around how to break down how many times the O(1) part executes.
If the inner one started at j = 0 I could do n/2 (inner) * n/2 (outer) = O(n^2) time complexity, right? However, since j depends on i, I'm thinking some type of summation is involved from i + 1 to n/2, but I can't figure out how to set it up...
Basically I need help visualizing how many times it loops, and how to set up the summation. Thank you! :)
Assuming that m = n/2, you will see that the inner loop iterates m-1 times, then m-2, m-3, ..., 1, then 0 as i increases. Summing all of that gives 1 + 2 + ... + (m-1) = (m-1)*m/2 = O(m^2) = O(n^2).
Premise
For simplicity, let us call m = n / 2 - 1. The outer loop runs from 0 to m. The inner loop from i + 1 to m.
Iteration counting
We need to count how often the inner statement which you labeled O(1) is executed. That is, how often the inner loop runs in total, as executed by the outer loop. So let us take a look.
The first iteration of the outer loop generates m iterations of the inner loop. The second generates m - 1, then m - 2, m - 3, ..., 2, 1, 0.
That means that the O(1) statement is, in total, executed:
m + (m - 1) + (m - 2) + ... + 2 + 1 + 0
That is the sum from 0 up to m,
sum_{i = 0}^{m} i
which can be simplified to
m * (m + 1) / 2
Substitute back
Let us now substitute back m = n / 2 - 1, and we get
((n / 2 - 1) * (n / 2)) / 2
After simplifying, this is
n^2/8 - n/4
Big-O
For Big-O, we observe that it is smaller than
n^2 + 0
= n^2
Which, by definition is O(n^2).
As you can see, this bound is also tight, so we also get Omega(n^2), which gives Theta(n^2).
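If it helps to see the numbers, here is a small C sketch (my own addition, not from the original answers) that counts how often the O(1) body runs and compares it with the closed form m * (m + 1) / 2 for m = n/2 - 1:

#include <stdio.h>

/* Count how many times the O(1) body runs and compare with m*(m+1)/2,
   where m = n/2 - 1 (a quick sanity check). */
int main(void) {
    for (int n = 4; n <= 64; n *= 2) {
        long count = 0;
        for (int i = 0; i <= n / 2 - 1; i++)
            for (int j = i + 1; j <= n / 2 - 1; j++)
                count++;                          /* the O(1) thing */
        long m = n / 2 - 1;
        printf("n = %2d  count = %4ld  m*(m+1)/2 = %4ld\n",
               n, count, m * (m + 1) / 2);
    }
    return 0;
}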

time complexity (with respect to input n)

I was asked what the time complexity of this is:
What is the time complexity (with respect to n) of this algorithm:
k=0
k = 0
for (i = n / 2; i < n; i++) {
    for (j = 0; j < i; j++)
        k = k + n / 2
}
The choices were: a. O(n), b. O(n/2), c. O(n log(n)), and d. O(n^2).
It can have multiple answers.
I know the algorithm above is d. O(n^2), but I also came up with a. O(n) since it is asking for the complexity with respect to n only.
If you were given this question, how would you answer it? I'm very curious about the answer.
The answer is O(n²).
This is easy to understand. I will try to explain it.
See, the outer for loop block is executed n - n/2 = n/2 times.
Of course, it depends on whether the number n is even or odd. If it's even, the outer loop executes n/2 times; if it's odd, it executes (n-1)/2 times.
But for time complexity, we don't consider this. We just assume that the outer for loop is executed n/2 times where i starts from n/2 and ends at n - 1 (because the terminating condition is i < n and not i <= n).
For each iteration of the outer loop, the inner loop executes i times.
In every iteration, the inner loop runs from j = 0 to j = i - 1. This means it executes i times (not i - 1 times, because j starts from 0 and not from 1).
Therefore, for the 1st iteration the inner loop executes i = n/2 times, for the 2nd iteration i = n/2 + 1 times, and so on up to i = n - 1 times.
Now, the total number of times the inner loop executes is n/2 + (n/2 + 1) + (n/2 + 2) + ... + (n - 2) + (n - 1). This is an arithmetic series with n/2 terms, and simple math shows it sums up to (3n² - 2n)/8 times.
So, the time complexity becomes O((3n² - 2n)/8).
But we drop the lower-order n term (because n² dominates n) and the constant factors (because they stay the same for every n).
Therefore, the final time complexity is O(n²).
Hope this helps you understand.
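Here is a small C check (my own sketch, assuming n is even as in the analysis above) that counts the inner-loop executions and compares them with (3n² - 2n)/8:

#include <stdio.h>

/* Count how often the inner statement runs and compare with (3n^2 - 2n)/8,
   which is exact for even n. */
int main(void) {
    for (int n = 2; n <= 64; n *= 2) {
        long runs = 0;
        for (int i = n / 2; i < n; i++)
            for (int j = 0; j < i; j++)
                runs++;                           /* stands in for k = k + n / 2 */
        printf("n = %2d  inner runs = %5ld  (3n^2 - 2n)/8 = %5ld\n",
               n, runs, (3L * n * n - 2 * n) / 8);
    }
    return 0;
}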

Determining the time complexity

Given the following pseudo code for an array A
x = 0
for i = 0 to n - 1
    for j = i to n - 1
        if A[i] > A[j]:
            x = x + 1
return x
how do I determine the running time?
I would say it's (n - 1) * (n - 1) = n^2 - 2n + 1 = O(n^2).
I'm not quite sure how to work with the if loop, though.
Yes, O(n^2). Just sum the number of iterations of the inner loop:
n + (n - 1) + (n - 2) + ... + 1 = [ n x (n + 1) ] / 2
And if is not a loop, it is a control structure. Generally you just count the number of times the condition is checked, without evaluating the condition itself. The condition may matter if there is another loop in the if body.
how to count the iterations of the inner loop:
when i = 0 the inner loop runs n times, then ends
then i = 1 the inner loop runs n - 1 times, then ends
then i = 2 the inner loop runs n - 2 times, then ends
...
when i = n - 2 the inner loop runs 2 times
when i = n - 1 the inner loop runs 1 time
so all we need to do is to add the number of iterations of the inner loop:
n + (n - 1) + (n - 2) + ... 1 = [ n x (n + 1) ] / 2
#perreal is totally right about the order:
n*(n+1)/2 => O(n^2)
About the "if" part, it doesn't really matter. (I write this to answer to this part)
Lets say doing checking if takes c1 time, and doing the x=x+1 takes c2 time. You will have
(c1 | c1+c2)* n*(n+1)/2
And since you can ignore the constants from the order, it is from
O(n^2)
Actually, saying "this algorithm has O(n^2) time complexity" implicitly suggests that it's a worst-case complexity.
In a naive way, you can just count the number of times each instruction is executed in the worst case. The test if A[i] > A[j]: may be counted as an instruction as well, so at first you don't necessarily have to ask yourself when the condition is true.
2 * n * n is a majorant of the number of instructions executed in the innermost loop, and more precisely:
2(n + (n-1) + ... + 1) = n(n+1) = O(n^2)
Even if it's not important for the O-notation, there are several arrays for which the condition is true on every check with j > i (thus these are worst cases for this algorithm). For example, the decreasing array:
n n-1 n-2 ... 2 1
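For anyone who wants to experiment, here is a C translation of the pseudocode (my own sketch, with an assumed array size N = 8), instrumented to count how many times the condition is checked; it uses the decreasing array mentioned above as a worst case:

#include <stdio.h>

#define N 8

int main(void) {
    int A[N];
    int x = 0;
    long checks = 0;

    for (int i = 0; i < N; i++)
        A[i] = N - i;                 /* a decreasing array: N, N-1, ..., 2, 1 */

    for (int i = 0; i < N; i++)
        for (int j = i; j < N; j++) {
            checks++;                 /* the condition is checked once per inner iteration */
            if (A[i] > A[j])
                x = x + 1;
        }

    printf("condition checked %ld times (N*(N+1)/2 = %d), x = %d\n",
           checks, N * (N + 1) / 2, x);
    return 0;
}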

Big-O complexity of a piece of code

I have a question in algorithm design about complexity. In this question a piece of code is given and I should calculate this code's complexity.
The pseudo-code is:
for (i = 1; i <= n; i++) {
    j = i
    do {
        k = j;
        j = j / 2;
    } while (k is even);
}
I tried this algorithm for some numbers and got different results. For example, if n = 6, the inner loop counts look like this:
i = 1 -> executes 1 time
i = 2 -> executes 2 times
i = 3 -> executes 1 time
i = 4 -> executes 3 times
i = 5 -> executes 1 time
i = 6 -> executes 2 times
It doesn't follow an obvious pattern, so how should I calculate this?
The upper bound given by the other answers is actually too high. This algorithm has an O(n) runtime, which is a tighter upper bound than O(n log n).
Proof: Let's count how many total iterations the inner loop will perform.
The outer loop runs n times. The inner loop runs at least once for each of those.
For even i, the inner loop runs at least twice. This happens n/2 times.
For i divisible by 4, the inner loop runs at least three times. This happens n/4 times.
For i divisible by 8, the inner loop runs at least four times. This happens n/8 times.
...
So the total amount of times the inner loop runs is:
n + n/2 + n/4 + n/8 + n/16 + ... <= 2n
The total amount of inner loop iterations is between n and 2n, i.e. it's Θ(n).
You always assume you get the worst scenario at each level.
Now, you iterate over N values of i, so we start with O(N) already.
Now let's say your i always equals some X and X is always even (remember, worst case every time). How many times do you need to divide X by 2 to get to 1? (For an even number the division only stops once it reaches an odd value, which for a power of two is 1.)
In other words, we need to solve the equation
X / 2^k = 1, which is X = 2^k, so k = log_2(X).
This makes our algorithm take O(n log_2(X)) steps, which can easily be written as O(n log n).
For such a loop, we cannot count the inner loop and the outer loop separately: the variables are tied together!
We thus have to count all steps.
In fact, for each iteration of outer loop (on i), we will have
1 + v_2(i) steps
where v_2 is the 2-adic valuation (see for example: http://planetmath.org/padicvaluation), which is the power of 2 in the prime factorization of i.
So if we add steps for all i we get a total number of steps of :
n_steps = \sum_{i=1}^{n} (1 + v_2(i))
= n + v_2(n!) // since v_2(i) + v_2(j) = v_2(i*j)
= 2n - s_2(n) // from Legendre formula (see http://en.wikipedia.org/wiki/Legendre%27s_formula with `p = 2`)
We then see that the number of steps is exactly :
n_steps = 2n - s_2(n)
As s_2(n) is the sum of the digits of n in base 2, it is negligible compared to n: each base-2 digit is 0 or 1, and there are at most about log_2(n) digits, so s_2(n) is at most log_2(n) + 1.
So the complexity of your algorithm is equivalent to n:
n_steps = O(n)
which is not the O(nlog(n)) stated in many other solutions but a smaller quantity!
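Here is a small C check of that exact formula (my own sketch, not part of the answer): it counts the total do-while iterations and compares them with 2n - s_2(n):

#include <stdio.h>

/* s_2(n): the number of 1-bits in the binary representation of n */
static int s2(unsigned v) {
    int c = 0;
    while (v) { c += v & 1u; v >>= 1; }
    return c;
}

int main(void) {
    for (int n = 1; n <= 64; n *= 2) {
        long steps = 0;
        for (int i = 1; i <= n; i++) {
            int j = i, k;
            do {                      /* the inner do-while from the question */
                k = j;
                j = j / 2;
                steps++;
            } while (k % 2 == 0);
        }
        printf("n = %2d  steps = %4ld  2n - s_2(n) = %4d\n",
               n, steps, 2 * n - s2((unsigned) n));
    }
    return 0;
}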
Let's start with the worst case:
if you keep dividing by 2 (integer division), you don't stop until you
get down to 1, basically making the number of steps dependent on the bit width,
something you find out using the base-2 logarithm. So the inner part is log n.
The outer part is obviously n, so N log N total.
The do loop halves j until k becomes odd. k is initially a copy of j, which is a copy of i, so the do loop runs 1 + (the power of 2 that divides i) times:
i = 1 is odd, so it makes 1 pass through the do loop,
i = 2 divides by 2 once, so 1 + 1,
i = 4 divides by 2 twice, so 1 + 2, etc.
That makes at most 1 + log(i) do-loop executions (logarithm with base 2).
The for loop iterates i from 1 through n, so the upper bound is n times (1 + log n), which is O(n log n).

Analyzing worst case order-of-growth

I'm trying to analyze the worst case order of growth as a function of N for this algorithm:
for (int i = N*N; i > 1; i = i/2)
    for (int j = 0; j < i; j++) {
        total++;
    }
What I'm trying to do is analyze how many times the line total++ will run by looking at the inner and outer loops. The inner loop should run (N^2)/2 times. The outer loop I don't know. Could anyone point me in the right direction?
The statement total++; shall run the following number of times:
= N^2 + N^2 / 2 + N^2 / 4 ... N^2 / 2^k
= N^2 * ( 1 + 1/2 + 1/4 + ... 1/2^k )
The number of terms in the above expression = log(N^2) = 2log(N).
Hence the sum of the series = N^2 * (1 - 1/2^(2 log N)) / (1/2)
= N^2 * (1 - 1/N^2) / (1/2) = 2N^2 - 2.
Hence, according to me, the order of complexity = O(N^2).
The outer loop runs with O(log N) complexity, as the value halves on every iteration, like in a binary search.
The outer loop runs about 2*log_2(N) + 1 times (converting to an integer and dropping the decimal places). As you can see, the value decreases like N^2, N^2/2, N^2/4, ..., 1.
So the total number of times total++ runs is
sum_{x = 0}^{int(2*log_2(N) + 1)} N^2 / 2^x
For this question, since the inner loop depends on a variable that is changed by the outer loop, you can't solve it simply by multiplying the iteration counts of the inner and outer loops. You have to start writing out the values, try to figure out the series, and then solve the series to get the answer.
Like in your question, total++ will run
N^2 + N^2/2 + N^2/2^2 + N^2/2^3 + ...
Then, taking N^2 as a common factor, we get
N^2 [ 1 + 1/2 + 1/2^2 + 1/2^3 + ... ]
Solve this geometric series to get the answer: the bracketed sum is less than 2, so the total is at most 2N^2, which is O(N^2).
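As a rough check (my own sketch, not from the original answers), this C program counts total++ directly and compares it with the 2*N*N bound that the geometric series gives:

#include <stdio.h>

/* Count total++ directly and compare it with the 2*N*N upper bound. */
int main(void) {
    for (long N = 2; N <= 64; N *= 2) {
        long total = 0;
        for (long i = N * N; i > 1; i = i / 2)
            for (long j = 0; j < i; j++)
                total++;
        printf("N = %2ld  total = %6ld  2*N*N = %6ld\n", N, total, 2 * N * N);
    }
    return 0;
}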
