Analyzing worst case order-of-growth - algorithm

I'm trying to analyze the worst case order of growth as a function of N for this algorithm:
for (int i = N*N; i > 1; i = i/2)
    for (int j = 0; j < i; j++) {
        total++;
    }
I'm trying to work out how many times the line total++ will run by looking at the inner and outer loops. I think the inner loop should run (N^2)/2 times, but I don't know what to do with the outer loop. Could anyone point me in the right direction?

The statement total++; shall run the following number of times:
= N^2 + N^2/2 + N^2/4 + ... + N^2/2^k
= N^2 * (1 + 1/2 + 1/4 + ... + 1/2^k)
The number of terms in the above expression is k + 1, where k = log2(N^2) = 2*log2(N).
Hence the sum of the series = N^2 * (1 - 1/2^(2*log2(N) + 1)) / (1/2)
= N^2 * (1 - 1/(2*N^2)) * 2
= 2*N^2 - 1.
Hence according to me the order of complexity = O(N^2)

The outer loop runs with a complexity of O(log N), as its counter is halved on every iteration, just as in a binary search. (Since the counter starts at N^2, it runs about log(N^2) = 2*log(N) times.)

The outer loop runs approximately 2*log2(N) times, since i starts at N*N and is halved on each iteration (integer division discards the fractional part). The value decreases like N^2, N^2/2, N^2/4, ..., 1.
So the total number of times total++ runs is
Summation (x from 0 to 2*log2(N)) of N^2 / 2^x

For this question, the inner loop depends on a variable that is changed by the outer loop, so you can't solve it simply by multiplying the iteration counts of the inner and outer loops. You have to write out the values iteration by iteration, figure out the series, and then solve the series to get the answer.
In your question, total++ will run
n^2 + n^2/2 + n^2/2^2 + n^2/2^3 + ...
Then, taking n^2 common, we get
n^2 * [ 1 + 1/2 + 1/2^2 + 1/2^3 + ... ]
Solve this geometric series to get the answer.

Related

What is the complexity of a nested loop where the inner loop starts from i and halves each time?

for (i = 1; i <= n; i++) {
    for (j = i; j >= 1; j = j / 2) {
        // some code
    }
}
Assume that the body of the inner loop is constant-time. I thought it's O(n^2).
Here is my opinion regarding this question:
I think the run time of the inner loop is log(i) + 1, so I got the formula (log 1 + 1) + (log 2 + 1) + ... + (log n + 1), and from that I concluded O(n^2).
But I saw another solution that counts the inner loop as log(i) and gets the answer O(n log n).
I'm confused about which one is correct. I think I'm right, but I'm not sure, so if I'm wrong, please convince me why the second is correct.
I know that the difference between the two is the number of times the inner loop executes.
The inner loop has time complexity O(log(i)), and the outer loop has time complexity O(n). The overall time complexity is O(n log(n)).
We can also get there with your calculation:
(log 1 + 1) + (log 2 + 1) + ... + (log n + 1) <= (log n + 1) + (log n + 1) + ... + (log n + 1) = n * log(n) + n = O(n log(n)). We can discard the n term because n * log(n) dominates it. So your sum also works out to O(n log(n)), not O(n^2).

Time complexity of the inner loop

Can someone help me with calculating the time complexity of the inner loop? As far as I understand, the outer one will be O(n). But I have no idea how to calculate what happens inside the second one.
for (int i = 2; i < n; i++) {
    for (int j = 2; i * j < n; j++) {
    }
}
For every iteration of the outer loop, the inner loop runs about n/i times.
So, total complexity of this will be given by:
n/2 + n/3 + n/4 + ...
= n * (1/2 + 1/3 + 1/4 ...)
For the right term above, upper bound is ln(n)
Hence, complexity of this code is O(n log n).
The inner loop runs j from 2 up to but not including n/i, so it runs roughly n/i - 2 times.
Summing over the n - 2 iterations of the outer loop, we get the following summation:
(n/2 - 2) + (n/3 - 2) + ...
The n/i part is a harmonic-type series whose sum is about n * ln(n). So in terms of time complexity, this becomes O(n log n).
The loop exits as soon as i * j ≥ n, i.e. when j reaches ceiling(n / i) ~ n / i. As j starts from 2, the number of iterations is max(0, ceiling(n / i) - 2).

Complexity when loop runs log times

If we're finding the number of factors of a number n, we can use the efficient loop for (i = 1; i <= sqrt(n); i++). This loop has a complexity of O(sqrt(n)).
What would be the time complexity of the code snippet below? (Assume that log(x) returns the log value in base 2.) Is it O(n^2) or O(n log n)? (I assume that log n is the complexity when a loop divides by two, i.e. i /= 2.)
void fun()
{
    int i, j;
    for (i = 1; i <= n; i++)
        for (j = 1; j <= log(i); j++)
            printf("hello world");
}
The actual number of "hello world" prints in your code is
log(1) + log(2) + ... + log(n) = log(n!)
You can then use the Srinivasa Ramanujan approximation of log(n!),
log(n!) ≈ n*log(n) - n + O(log(n)),
to get the actual complexity of the whole code, which is O(n log n).
The inner loop calls printf approximately log(i) times, for i in range [1..n]. The total number of calls is approximately
log(1)+log(2)+log(3)+...log(n) = log(n!)
Now, the Stirling asymptotic formula will give you the solution.
For the base-2 logarithm, the exact count is given by
0 + 1 + 1 + 2 + 2 + 2 + 2 + 3 + 3 + 3 + 3 + 3 + 3 + 3 + 3 + ... + floor(Lg(n))
or, grouping the equal terms (each value k appears 2^k times),
1*0 + 2*1 + 4*2 + 8*3 + ... + 2^(m-1)*(m-1)
For convenience, assume that n is of the form n = 2^m - 1, so that the last run is complete.
Now take the sum of x^k for k from 0 to m-1, which equals (x^m - 1)/(x - 1), and differentiate with respect to x to get the sum of k*x^(k-1). Evaluating at x = 2 and multiplying by 2 to recover the sum of k*2^k, you get
s = m*2^m - 2^(m+1) + 2 = (n+1)*Lg(n+1) - 2n
For other n, you need to add a correction term for the last partial run. With m = floor(Lg(n+1)):
t = m*(n + 1 - 2^m)
An upper bound of O(n*Log(n)) can be proven without any math.
void fun()
{
    int i, j;
    for (i = 1; i <= n; i++)
        for (j = 1; j <= log(n); j++) // << notice I changed "i" to "n"
            printf("hello world");
}
The above function runs the inner loop n times, and the inner loop runs log(n) times.
Hence, the function prints approximately n*log(n) times.
Since this function
(log(n) + log(n) + ... + log(n)) // n times
is larger than the OP version
(log(1) + log(2) + ... + log(n))
it is an upper bound of the original version:
<= n*log(n), i.e. O(n log(n))
Note also that
(log(n) + log(n) + ... + log(n)) // n times
= log(n^n)
= n*log(n)
The bound of j depends on i, so unroll the dependency and analyze in terms of i only:
if i=1 ----> inner loop executes log(1) times
if i=2 ----> inner loop executes log(2) times
if i=3 ----> inner loop executes log(3) times
.
.
if i=n ----> inner loop executes log(n) times.
Combining them: log(1) + log(2) + ... + log(n) = log(1*2*3*...*n) = log(n!), which is Θ(n log(n)).

time complexity calculation for two for loops with connecting variables

What would be the time complexity of this:
for (k = 1; k <= n; k *= 2)
    for (j = 1; j <= k; j++)
        sum++;
My thinking was:
1. The outer loop will run log n times.
2. The inner loop will also run log n times, because j is bounded by k, so however long k runs, j runs the same. So total = O(log n * log n).
But in the text they have given total = O(2n - 1).
Can you please explain?
When k is 1, (sum++) runs 1 time.
When k is 2, (sum++) runs 2 times.
When k is 4, (sum++) runs 4 times.
When k is n = 2^m, (sum++) runs 2^m times.
So we must calculate
1 + 2 + 4 + ... + 2^m = 2^0 + 2^1 + 2^2 + ... + 2^m = (1 - 2^(m+1))/(1 - 2) = 2*2^m - 1
Because we put n = 2^m (i.e. m = log2(n)):
2*2^m - 1 = 2n - 1
This problem is most easily interpreted by forgetting that the outer loop is there and first looking at the inner loop. Suppose that the outer loop runs M times... then the total number of 'sum++' operations will be
1 + 2 + 4 + ... + 2^(M-1)
This sum reduces to 2^M - 1; notice that it is a binary number composed of all 1's. Now the question is: what is M? You've already figured this out: M = log2(n) + 1 (the +1 is because the loop must run at least once). Plugging this into the sum leaves us with
2^(log2(n) + 1) - 1 = 2n - 1.
Thus the entire loop scales as O(n). Hope this helps!

What will be the time complexity of the following algorithm?

for (i = 0; i < m; i++)
{
    for (j = i + 1; j < m; j++)
    {
        for (k = 0; k < n; k++)
        {
            for (l = 0; l < n; l++)
            {
                if (condition) do something
            }
        }
    }
}
In detail:
The first two loops will result in (m-1) + (m-2) + (m-3) + ... + 1 repetitions of their body, which is equal to m*(m-1)/2. As for the second two loops, they run from 0 to n-1, so they need n^2 iterations.
As you have no clue whether the condition will be fulfilled or not, take the worst case, which is it being always fulfilled.
Then the number of iterations is:
m*(m-1)/2 * n^2 * NumberOfIterations(Something)
In O notation, the constants and lower-order terms are dropped, so the complexity is:
O(m^2*n^2)*O(Something)
for (i = 0; i < m; i++)
{
    for (j = i + 1; j < m; j++)
    {
The inner loop will run ((m-1) + (m-2) + ... 1) times
= 1 + 2 + 3 + ...m-1
= m * (m - 1) / 2
for (k = 0; k < n; k++)
{
    for (l = 0; l < n; l++)
    {
In this case, the inner loop clearly runs n * n times
So clearly, the number of iterations is exactly
(m * (m - 1) / 2) * (n * n)
= O(m^2 * n^2)
Obviously, this assumes that
if(condition) do something
runs in constant time.
Looks like O(m^2 n^2) to me, assuming the "something" is constant-time.
Although the j loop starts from a different point with each step, the combined effect of the i and j loops is still an m^2 factor.
Evaluating the unstated condition itself would normally be (at least) a constant time operation, so certainly the loop cannot be faster than O(m^2 n^2) - unless, of course, the "something" includes a break, goto, exception throw or whatever that exits out of one or more of the loops early.
All bets are off if, for any reason, either n or m isn't constant throughout.
I assume the time complexity of "do something" is O(S).
Let's start with the innermost loop: its time complexity is O(n*S) because it does something n times. The loop that wraps it has a time complexity of O(n)*O(n*S) = O(n^2*S), because it runs the inner loop n times.
The loop that wraps the second-innermost loop has a time complexity of O(m-i)*O(n^2*S), because it performs an O(n^2*S) operation about m-i times.
Now for the harder part: for each i in the range 0...m-1 we do an (m-i)*O(n^2*S) operation. How long does that take in total? (1 + 2 + 3 + ... + m)*O(n^2*S).
But 1 + 2 + ... + m is the sum of an arithmetic sequence, which equals m*(m+1)/2 = O(m^2).
Conclusion: We do an O(n^2*S) operation about m^2 times. The time complexity of the whole thing is O(m^2*n^2*S).
O(m^2*n^2*(complexity of something)). If the condition and "something" are executed in constant time, then O(m^2*n^2).
O(m^2*n^2) * complexity of "something"
