Time complexity analysis inconsistency - algorithm

I have this code:
int fun(int n)
{
    int count = 0;
    for (int i = n; i > 0; i /= 2)
        for (int j = 0; j < i; j++)
            count += 1;
    return count;
}
The time complexity of this code can be thought of as O(n), because O(n + n/2 + n/4 + ...) = O(n).
By that logic, the time complexity of this snippet could also be argued to be O(n):
for (i = 1; i < n; i *= 2)
    // O(1) statements
since O(1 + 2 + 4 + ... + n/4 + n/2) = O(n). But since the loop runs log(n) times, it could be log(n) too.
Why isn't the first one log(n) iterations of the outer loop times log(n) iterations of the inner loop, i.e. log(n)*log(n)?
What am I doing wrong?

In the first snippet, the outer loop executes O(log n) times, and on each of those iterations the inner loop executes O(i) times. If you sum any number of terms of the form n / 2^k, you get O(n).
The second piece of code has O(log n) iterations of O(1) operations, and a sum of a logarithmic number of constants is still logarithmic.
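If you want a numerical sanity check, here is a minimal sketch (my own illustration, not part of the original answer; the driver loop and test sizes are arbitrary) that counts the inner-loop work of the first snippet and compares it with the 2n bound:

#include <stdio.h>

int main(void)
{
    for (int n = 1; n <= 1000000; n *= 10) {
        long count = 0;
        /* the first snippet, instrumented */
        for (int i = n; i > 0; i /= 2)
            for (int j = 0; j < i; j++)
                count += 1;
        printf("n = %7d  count = %8ld  2n = %8ld\n", n, count, 2L * n);
    }
    return 0;
}

The count never exceeds 2n, which is why the nested loop is O(n) rather than O(n log n).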

In the first example, you don't have an O(1) statement inside your loop; you have for (int j = 0; j < i; j++) count += 1. If you put that same inner loop into your second example, you are back to the same complexity. The first loop is not O(n*log(n)); this is easy to demonstrate because you can find an upper bound of 2n, which is equivalent to O(n).

The time complexity of the second one should not be calculated as the series O(1+2+4+...+n/4+n/2) = O(n), because it is not that series.
Notice the first one. It is calculated as a series because you count the number of times the inner for loop executes and then add all of those counts (the series) to get the final time complexity:
When i = n, the inner for loop executes n times.
When i = n/2, the inner for loop executes n/2 times.
When i = n/4, the inner for loop executes n/4 times.
And so on.
But in the second one, there is no series to add. It just comes down to the equation 2^x = n, which gives x = log n.
The equation 2^x = n comes from noticing that i starts at 1 and is multiplied by 2 on every iteration until it reaches n.
So you need to find out how many doublings it takes to get from 1 to n.
That gives the equation 2^x = n, which you then solve for x.
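To see that the second loop really has no series to sum, here is a small sketch (illustrative only, not from the original post; the test values of n are arbitrary) that counts iterations of the doubling loop and compares the count with log2(n):

#include <stdio.h>
#include <math.h>

int main(void)
{
    for (int n = 16; n <= (1 << 20); n <<= 4) {
        int iterations = 0;
        for (int i = 1; i < n; i *= 2)
            iterations++;              /* the O(1) body */
        printf("n = %8d  iterations = %2d  log2(n) = %.1f\n",
               n, iterations, log2((double)n));
    }
    return 0;
}

The body runs once per doubling, so the count tracks log2(n) exactly, with no summation involved.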

Related

When do you multiply by the average Big O of nested loop?

When do we multiply the Big O of the outer loop by the average Big O of the inner loop?
for (int i = n; i > 0; i /= 2) {
    for (int j = 0; j < i; j++) {
        // statement
    }
}
Answer: O(n)
In this question, the outer loop is O(log n). Since the inner loop executes a number of times that depends on 'i', the average Big O is taken. This gives the summation n + n/2 + n/4 + ... = 2n, which is then divided by the number of outer iterations to get the average: O(2n / log n).
Hence O(log n) * O(2n / log n) = O(n).
But then why don't we need to take the average for this question?
for (i = 0; i < n; i++) {
    for (j = 0; j < i * i; j++) {
    }
}
Answer: O(n^3)
The outer loop is O(n). The inner loop is O(n^2). But why?
Doesn't the inner loop execute based on a value of 'i' in the outer loop? So why isn't the average taken like the question above? - resulting in something like O(n^2 / n) for the inner loop.
The complexity of an algorithm is determined by counting how many operations it performs when executed. For a loop which iterates k times, where m is the mean number of operations on each iteration, the total number of operations performed is always total = k * m, simply because the mean is defined to be m = total / k.
It also makes sense to do this in big O notation: if a loop iterates O(k) times, and performs a mean of O(m) operations on each iteration, then the total number of operations is O(k * m).
The problem is that in your second example, you are not calculating the mean correctly. When you say "The inner loop is O(n^2)", this is already the mean; you are considering the number of iterations of the inner loop for one iteration of the outer loop. The total number of iterations is O(n^3), and the number of iterations of the outer loop is n, so the mean is O(n^3 / n) = O(n^2) as expected.
It is more natural to do this calculation the other way around: after observing that the mean is O(n^2), and the outer loop iterates O(n) times, the total is O(n^2 * n) = O(n^3).
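As a concrete check of this total-versus-mean argument (a sketch of my own, not part of the answer; n = 500 is arbitrary), the exact total for the i*i example is the sum of i^2 for i = 0..n-1, which is roughly n^3/3, and dividing by the n outer iterations gives a mean of roughly n^2/3:

#include <stdio.h>

int main(void)
{
    int n = 500;
    long long total = 0;
    for (int i = 0; i < n; i++)
        for (long long j = 0; j < (long long)i * i; j++)
            total++;                   /* one inner-loop iteration */
    printf("total = %lld (about n^3/3 = %lld)\n",
           total, (long long)n * n * n / 3);
    printf("mean per outer iteration = %lld (about n^2/3 = %lld)\n",
           total / n, (long long)n * n / 3);
    return 0;
}

Both views agree: a mean of O(n^2) over O(n) outer iterations, or a total of O(n^3) computed directly.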

How can I find the time complexity of the following code?

for (i = 0; i < n; i++)                // time complexity n+1
{
    k = 1;                             // time complexity n
    while (k <= n)                     // time complexity n*(n+1)
    {
        for (j = 0; j < k; j++)        // time complexity ??
            printf("the sum of %d and %d is: %d\n", j, k, j + k); // time complexity ??
        k++;
    }
}
What is the time complexity of the above code? I'm stuck on the second for loop, and I don't know how to find its time complexity because j is less than k, not less than n.
I always have problems related to time complexity. Do you have a good article on it, especially about step counting and loops?
From the question:
because j is less than k and not less than n.
This is just plain wrong, and I guess that's the assumption that got you stuck. We know what values k can take. In your code, it ranges from 1 to n (inclusive). Thus, if j is less than k, it is also less than n.
From the comments:
I know the only input is n, but the second for loop depends on k and not on n.
If a variable depends on anything, it is ultimately on the input: j depends on k, which itself depends on n, so j depends on n.
However, this is not enough to deduce the complexity. In the end, what you need to know is how many times printf is called.
The outer for loop is executed n times no matter what. We can factor this out.
The number of executions of the inner for loop depends on k, which is modified within the while loop. We know k takes every value from 1 to n exactly once. That means the inner for loop will first be executed once, then twice, then three times and so on, up until n times.
Thus, discarding the outer for loop, printf is called 1+2+3+...+n times. That sum is very well known and easy to calculate: 1+2+3+...+n = n*(n+1)/2 = (n^2 + n)/2.
Finally, the total number of calls to printf is n * (n^2 + n)/2 = n^3/2 + n^2/2 = O(n^3). That's your time complexity.
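Here is a quick way to verify that count (a sketch of mine, not from the answer; the counter stands in for the printf call and n = 200 is arbitrary):

#include <stdio.h>

int main(void)
{
    int n = 200;
    long long calls = 0;
    for (int i = 0; i < n; i++) {
        int k = 1;
        while (k <= n) {
            for (int j = 0; j < k; j++)
                calls++;               /* stands in for the printf call */
            k++;
        }
    }
    printf("calls = %lld, n*(n^2 + n)/2 = %lld\n",
           calls, (long long)n * (n * n + n) / 2);
    return 0;
}

Both numbers come out equal, confirming the n * (n^2 + n)/2 count.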
A final note about this kind of code. Once you have seen the same patterns a few times, you quickly start to recognize the kind of complexity involved. Then, when you see nested loops with dependent bounds like these, you immediately know that each loop contributes a linear factor.
For instance, in the following, f is called n*(n+1)*(n+2)/6 = O(n^3) times.
for (i = 1; i <= n; ++i) {
    for (j = 1; j <= i; ++j) {
        for (k = 1; k <= j; ++k) {
            f();
        }
    }
}
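The same counting check works here (an illustrative sketch, not from the original answer; the counter stands in for f() and n = 100 is arbitrary):

#include <stdio.h>

int main(void)
{
    int n = 100;
    long long calls = 0;
    for (int i = 1; i <= n; ++i)
        for (int j = 1; j <= i; ++j)
            for (int k = 1; k <= j; ++k)
                calls++;               /* stands in for f() */
    printf("calls = %lld, n(n+1)(n+2)/6 = %lld\n",
           calls, (long long)n * (n + 1) * (n + 2) / 6);
    return 0;
}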
First, simplify the code to show the main loops. So, we have a structure of:
for (int i = 0; i < n; i++) {
    for (int k = 1; k <= n; k++) {
        for (int j = 0; j < k; j++) {
        }
    }
}
The two outer loops run n * n times between them, but that alone doesn't tell you much, because the cost of the inner loop changes depending on which iteration of the outer loops you are on. So it's not as simple as calculating the number of times the outer loops run and multiplying by some other value.
Instead, I would find it easier to start with the inner-loop, and then add the outer-loops from the inner-most to outer-most.
The complexity of the inner-most loop is k.
With the middle loop, it's the sum of k (the complexity above) where k = 1 to n. So 1 + 2 + ... + n = (n^2 + n) / 2.
With the outer loop, it's done n times so another multiplication by n. So n * (n^2 + n) / 2.
After simplifying, we get a total of O(n^3).
The time complexity of the above code is n * n * n + 1 + 1 = n^3 + 2: n^3 for the three loops plus two constants. Since n^3 has the fastest-growing rate, the constant values can be ignored, so the time complexity is n^3.
Note: take each loop as (n) and, to obtain the total time, multiply the (n) values of the loops.
Hope this helps!

Finding Big-O of this code

I need some help finding the complexity, or Big-O, of this code. If someone could explain what the Big-O of every loop would be, that would be great. I think the outer loop would just be O(n), but I'm not sure about the inner loop; how does the *= 2 affect it?
k = 1;
do
{
    j = 1;
    do
    {
        ...
        j *= 2;
    } while (j < n);
    k++;
} while (k < n);
The outer loop runs O(n) times, since k starts at 1 and needs to be incremented n-1 times to become equal to n.
The inner loop runs O(lg(n)) times, because on the m-th execution of the loop body, j = 0.5 * 2^m.
The loop breaks once j reaches n, i.e. roughly when 0.5 * 2^m = n. Rearranging that, we get m = lg(2n) = O(lg(n)).
Putting the two loops together, the total code complexity is O(n lg(n)).
Logarithms can be tricky, but generally, whenever you see something repeatedly being multiplied or divided by a constant factor, you can guess that the complexity of your algorithm involves a term that is either logarithmic or exponential.
That's why binary search, which repeatedly divides the size of the list it searches in half, is also O(lg(n)).
The inner loop always runs from j=1 to j=n.
For simplicity, let's assume that n is a power of 2 and that the inner loop runs k times.
The values of j for each of the k iterations are,
j = 1
j = 2
j = 4
j = 8
....
j = n
// breaks from the loop
which means that 2^k = n, or k = lg(n).
So each time, the inner loop runs O(lg(n)) times.
Now, the outer loop is executed O(n) times, starting from k=1 to k=n.
Therefore, every time k is incremented, the inner loop runs O(lg(n)) times.
k=1: inner loop runs lg(n) times
k=2: inner loop runs lg(n) times
k=3: inner loop runs lg(n) times
...
k=n: inner loop runs lg(n) times
// breaks from the loop
Therefore, the total time taken is n*lg(n).
Thus, the time complexity of this is O(nlg(n))
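A short sketch (my own, not from either answer; the test sizes are arbitrary) that counts how often the inner body runs and compares it with (n-1)*ceil(lg(n)) makes the O(n lg(n)) bound concrete:

#include <stdio.h>
#include <math.h>

int main(void)
{
    for (int n = 16; n <= 65536; n *= 16) {
        long long body = 0;
        int k = 1;
        do {
            int j = 1;
            do {
                body++;                /* the "..." in the original loop */
                j *= 2;
            } while (j < n);
            k++;
        } while (k < n);
        long long expected = (long long)(n - 1) * (long long)ceil(log2((double)n));
        printf("n = %6d  body runs = %9lld  (n-1)*lg(n) = %9lld\n",
               n, body, expected);
    }
    return 0;
}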

Loop Analysis - Analysis of Algorithms

This question is based on this resource: http://algs4.cs.princeton.edu/14analysis.
Can someone break down why Exercise 6 letter b is linear? The outer loop seems to change n by a factor of 2 each time, so I would assume it was logarithmic...
From the link:
int sum = 0;
for (int n = N; n > 0; n /= 2)
    for (int i = 0; i < n; i++)
        sum++;
This is a geometric series.
The inner loop runs n iterations per iteration of the outer loop, and n is halved each time.
So, summing it up gives you:
N + N/2 + N/4 + ... + 1
This is a geometric series with r = 1/2 and a = N, which converges to a/(1-r) = N/(1/2) = 2N, so:
T(N) <= 2N
And since 2N is in O(N), the algorithm runs in linear time.
This is a perfect example showing that complexity is NOT obtained by multiplying the complexity of each nested loop (that would have given you O(N log N)), but by actually analyzing how many iterations are needed.
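Printing the terms of that series directly (a small illustration of mine; N = 1024 is arbitrary) shows the running total staying below 2N:

#include <stdio.h>

int main(void)
{
    int N = 1024;
    int total = 0;
    for (int n = N; n > 0; n /= 2) {
        total += n;                    /* work done by the inner loop for this n */
        printf("term = %5d  running total = %5d\n", n, total);
    }
    printf("final total = %d  <=  2N = %d\n", total, 2 * N);
    return 0;
}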
Yes, it's simple.
The value of n is halved each time, and the inner loop runs n times for each of those values.
So the first time i goes from 0 to n,
the next time from 0 to n/2,
and hence from 0 to n/2^k on the k-th term.
The outer loop therefore runs log(n) times in total.
So the number of times i runs forms a GP with terms
n, n/2, n/4, n/8, ..., 1.
We can find the sum of that GP:
(2^(log(n)+1) - 1) / (2 - 1), and since 2^(log(n)+1) = 2n,
the sum is 2n - 1, hence O(n).

Presumingly simple runtime analysis, need explanation

According to today's lecture, the first loop has a runtime of the order O(n), while the second loop has a runtime of the order O(log(n)).
for (int i = 0; i < n; i++) { // O(n)
    stuff(); // O(1)
}

for (int i = 1; i < n; i *= 4) { // O(log(n))
    stuff(); // O(1)
}
Could someone please elaborate on why?
The first loop will do a constant time operation exactly n times. Therefore it is O(n).
The second loop (starting from i = 1 not i = 0, you had a typo that I fixed) executes its body for i set to 1, 4, 16, 64, ... that is, 4^0, 4^1, 4^2, 4^3, ... up until n.
4^k < n when k < log_4(n). Therefore the body of the second loop executes O(log(n)) times, because log base 4 and log base e differ by only a constant coefficient.
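To make the log base 4 concrete, here is a minimal sketch (mine, for illustration; the test values and the counter standing in for stuff() are arbitrary) that counts the iterations and compares them with log(n)/log(4):

#include <stdio.h>
#include <math.h>

int main(void)
{
    for (int n = 16; n <= (1 << 24); n <<= 4) {
        int iterations = 0;
        for (int i = 1; i < n; i *= 4)
            iterations++;              /* stands in for stuff() */
        printf("n = %9d  iterations = %2d  log4(n) = %.2f\n",
               n, iterations, log((double)n) / log(4.0));
    }
    return 0;
}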
Time complexity is calculated in terms of how many times each unit-time statement in the code is executed, as a function of n (the input size).
The first for loop runs O(n) times and stuff() is O(1), hence overall O(n).
The second for loop runs while 4^(k-1) < n, where k is the number of iterations. Taking the log of both sides of the inequality, we get (k-1)*log 4 < log n, so k < log n / log 4 + 1, and k = O(log n) because log 4 is a constant. stuff() runs O(log n) times at O(1) each, hence overall O(log n).
