Big O for while loops

I had this question on my assignment the other day, but I'm still unsure whether I'm right.
for (int i = 1; i < n; i++) // n is some size
{
    for (int j = 1; j < i; j++)
    {
        int k = 1;
        while (k < n)
        {
            k = k + C; // where C is a constant and >= 2
        }
    }
}
I know the nested for loops are O(n^2), but I wasn't sure about the while loop. I assumed the whole code would be O(n^3).

The innermost while loop runs about n/C times, which is O(n/C) = O(n) since C is a constant. So yes, overall it's O(n^3) (the j loop is bounded above by n iterations).

int k = 1;
while (k < n) {
    k = k + C; // where C is a constant and >= 2
}
This will take roughly (n-1)/C steps (exactly ceil((n-1)/C)): write u = (k-1)/C. Then k = C*u + 1, and the loop becomes
u = 0;
while (u < (n-1)/C) {
    u = u + 1;
}
Hence the while loop is O(n) (since C is a constant).
EDIT: let me try to explain it the other way around.
Start with a dummy variable u. The loop
u = 0;
while (u < MAX) {
    u = u + 1;
}
runs MAX times.
When you let MAX = (n-1) / C, the loop is
u = 0;
while (u < (n-1)/C) {
    u = u + 1;
}
And that runs (n-1)/C times.
Now, the check u < (n-1)/C is equivalent to C*u < n-1 or C*u + 1 < n, so the loop is equivalent to
u = 0;
while (C*u + 1 < n) {
    u = u + 1;
}
Now, suppose that we rewrote this loop in terms of a variable k = C * u + 1. Then,
u=0;
k=1; // C * 0 + 1 = 1
Then the condition C*u + 1 < n is exactly k < n, and the update u = u + 1 corresponds to k = k + C, because C*(u + 1) + 1 = (C*u + 1) + C = old_k + C.
Putting it all together:
int k = 1;
while (k < n) {
    k = k + C;
}
takes (n-1)/C steps.

Formally, you may proceed using Sigma notation (the derivation, reconstructed from the code above, is):

T(n) = Σ_{i=1}^{n-1} Σ_{j=1}^{i-1} ceil((n-1)/C) * a = a * ceil((n-1)/C) * (n-1)(n-2)/2 = O(n^3)

where a symbolizes the number of constant operations inside the innermost loop (a = 1 if you want to count the exact number of iterations).

Well, you would need to look at how many times the while loop body is run for a given value of n and C. For example n is 10 and C is 3. The body would run 3 times: k = 1, k = 4, k = 7. For n = 100 and C = 2, the body would run 50 times: k = 1,3,5,7,9,...,91,93,95,97,99. It is a matter of counting by C until n. You should be able to calculate the Big-O complexity from that clue.

Related

How to calculate time complexity of a recursive function?

What is the time complexity of the following code?
My guess:
The for loop runs a constant number of times (3), and the function calls itself with n/3. So n shrinks by a factor of 3 on every call, and the time complexity is O(log3(n))?
void function(int n) {
    if (n <= 1)   // base case (n <= 1 also guards against n == 0 recursing forever)
        return;
    for (int i = 0; i < 3; i++) {
        cout << "Hello";
    }
    function(n / 3);
}
Yes, it's O(log3N). Call the amount of work done by the loop C. The first few calls will go:
f(n) = C + f(n/3) = C + C + f(n/9) = C + ... + C + f(1)
The number of times C appears will be the number of times you can divide n by 3 before it gets to 1, which is exactly log3n. So the total work is C*log3n, or O(log3N).

Can't understand big-O of a nested loop

I am having trouble understanding the answer to the following question about analyzing the two algorithms below.
for (int i = n; i >= 1; i = i/2) {
    for (int j = 1; j <= n; j++) {
        //do something
    }
}
The algorithm above has complexity O(n) according to the answers. Shouldn't it be lower, since the outer loop keeps halving the amount we have to go through? I thought it should be something along the lines of O(n/2 * )?
for (int j = 1; j <= n; j++) {
    for (int i = n; i >= j; i = i / 2) {
        //do something
    }
}
This one is O(n log n) if I am correct?
The first iteration will execute n steps, the second will execute n/2, the third will execute n/4, and so on.
If you compute the sum of n/(2^i) for i=0..log n you will get roughly 2n and that is why it is O(n).
If you take n out of the summation and sum only the 1/(2^i) part, you will get 2. Take a look at an example:
1 + 1/2 + 1/4 + 1/8 + 1/16 + ... = 1 + 0.5 + 0.25 + 0.125 + 0.0625 + ... = 2
Each term is half the previous one, so the sum never exceeds 2.
You are right with the second nested loop example - it is O(n log n).
EDIT:
After the comment from ringø, I re-read the question, and in fact the algorithm is different from what I understood. ringø is right: the algorithm as written in the question is O(n log n). However, judging from the context, I think the OP meant an algorithm where the inner loop is bounded by i, not n.
This answer relates to the following algorithm:
for (int i = n; i >= 1; i = i/2) {
    for (int j = 1; j <= i; j++) {
        //do something
    }
}

Trouble figuring out these tough Big-O examples

I'm trying to study for an upcoming quiz about Big-O notation. I've got a few examples here but they're giving me trouble. They seem a little too advanced for a lot of the basic examples you find online to help. Here are the problems I'm stuck on.
1.
for (i = 1; i <= n/2; i = i * 2) {
    sum = sum + product;
    for (j = 1; j < i*i*i; j = j + 2) {
        sum++;
        product += sum;
    }
}
For this one, the i = i * 2 in the outer loop implies O(log(n)), and I don't think the i <= n/2 condition changes anything, because of how we ignore constant factors. So the outer loop stays O(log(n)). The inner loop's condition j < i*i*i confuses me because it's in terms of i and not n. Would the Big-O of this inner loop then be O(i^3), and thus the Big-O for the entire problem O(i^3 * log(n))?
2.
for (i = n; i >= 1; i = i / 2) {
    sum = sum + product;
    for (j = 1; j < i*i; j = j + 2) {
        sum++;
        for (k = 1; k < i*i*j; k++)
            product *= i * j;
    }
}
For this one, the outermost loop implies O(log(n)). The middle loop implies, again I'm unsure, O(i^2)? And the innermost loop implies O(i^2 * j)? I've never seen examples like this before, so I'm almost guessing at this point. Would the Big-O notation for this problem be O(i^4 * n * j)?
3.
for (i = 1; i < n*n; i = i*2) {
    for (j = 0; j < i*i; j++) {
        sum++;
        for (k = i*j; k > 0; k = k - 2)
            product *= i * j;
    }
}
The outermost loop for this one has an n^2 bound but a doubling increment, which I think cancels out to just plain O(n). The middle loop is O(i^2), and the innermost loop I think is just O(n) and trying to trick you. So for this problem the Big-O notation would be O(n^2 * i^2)?
4.
int i = 1, j = 2;
while (i <= n) {
    sum += 1;
    i = i * j;
    j = j * 2;
}
For this one I did a few iterations to better see what was happening:
i = 1, j = 2
i = 2, j = 4
i = 8, j = 8
i = 64, j = 16
i = 1024, j = 32
So clearly, 'i' grows very quickly, and thus the condition is met very quickly. However I'm not sure just what kind of Big-O notation this is.
Any pointers or hints you can give are greatly appreciated, thanks guys.
You can't leave i or j in the O-notation; everything must be converted to n.
For the first one:
Let k = log2(i), so that i = 2^k.
Then the inner loop runs about i^3 / 2 = 2^(3k - 1) times for each iteration of the outer loop,
and k goes from 1 to log2(n).
So the total number of iterations is
the sum of 2^(3k - 1) for k from 1 to log2(n), which is (4/7)(n^3 - 1) according to Wolfram Alpha, and that is O(n^3).
For the last one, i = j1 * j2 * j3 * ... * jk, where jm = 2^m, so
i = 2^1 * 2^2 * ... * 2^k = 2^(1 + 2 + ... + k).
The loop stops once i reaches n, i.e. when
1 + 2 + 3 + ... + k = k(k + 1)/2 = log2(n),
which gives k = O(sqrt(log n)).
By the way, for problem 3: log(n^2) is 2 * log(n), not n, so that outer loop is O(log n), not O(n).
This question is better asked on the Computer Science site than here.

Someone please explain the Big-O complexity of these two Java snippets? I'm so confused.

for (int i = n; i > 0; i /= 2) {
    for (int j = 1; j < n; j *= 2) {
        for (int k = 0; k < n; k += 2) {
        } // not nested any further
    }
}
Answer: O(n (log n)^2) (the 2 is an exponent: log n squared).
The two outer loops are both log n, because one is halving i and the other doubling j, and the inner one is n because k steps up by 2, right?
For this code, the correct answer is O(n^2). I understand the outer loop is n and the middle loop is log n, and the inner loop should be n too, so why is the answer not n * n * log n?
for (int i = n; i > 0; i--) {
    for (int j = 1; j < n; j *= 2) {
        for (int k = 0; k < j; k++) {
            // constant number C of operations
        }
    }
}
Finally, how do I know when to add or when to multiply loop counts? If two loops are nested, I just multiply them, right? And when do I take the greatest n term over the other loops?
Here is #2, formatted for readability:
for (int i = n; i > 0; i--)
    for (int j = 1; j < n; j *= 2)
        for (int k = 0; k < j; k++)
            action
Forget the i-loop; we know it multiplies the inner bits by N.
The number of times action gets done by the nested j- and k-loops is then
1 + 2 + 4 + 8 + ... + N/2, the powers of two below N. (If N is not a power of 2, replace it with the next lower power of 2.)
Put this in binary and sum it. For my example, let's let N be 16, but you can easily generalize.
0001
0010
0100
1000
which sums to
1111
which is N - 1, or O(N).
Multiplying that by the i-loop range of N gives us O(N^2).
Interesting problem!
For the first snippet, the outer and middle loops each execute about log2(n) times, and the inner loop executes n/2 times. Since they're nested, yes, you multiply them together, for about (n/2) * log2(n) * log2(n) critical-step executions, which is O(n (log n)^2). As for your last question: you don't just take the greatest term over the other loops. Counts of nested loops multiply, counts of sequential loops add, and only at the end do you keep the asymptotically dominant term.

running time of algorithm does not match the reality

I have the following algorithm (shown as an image in the original post; it is the four-nested-loop code reproduced in the answer below).
I analyzed this algorithm as follows:
the outer loop on i iterates at most n times,
and the loop on j iterates from i to n, which is again at most n times.
Doing the same for the whole algorithm, we have 4 nested for loops, so the running time would be O(n^4).
But when I run this code for different input sizes, the measured times (shown as a table in the original post) grow much closer to n^3. Can anyone explain why this happens, or what is wrong with my analysis that gives me such a loose bound?
Formally, you may proceed like the following, using Sigma notation, to obtain the order of growth of your algorithm (derivation reconstructed from the code below):

T(n) = Σ_{i=0}^{n-1} Σ_{j=i}^{n-1} Σ_{k=0}^{j-1} Σ_{h=0}^{i-1} 1 = n(n-1)(n+1)(3n-2)/24 = Θ(n^4)

Moreover, the closed form gives the exact number of iterations executed inside the innermost loop:
int sum = 0;
for (i = 0; i < n; i++)
    for (j = i; j < n; j++)
        for (k = 0; k < j; k++)
            for (h = 0; h < i; h++)
                sum++;
printf("\nsum = %d", sum);
When n = 10, T(10) = 1155, and the program prints sum = 1155 as well.
I'm sure there's a conceptual way to see why, but you can prove by induction that the version below, with h starting at k, executes (n + 2) * (n + 1) * n * (n - 1) / 24 loops. Proof left to the reader.
In other words, it is indeed O(n^4).
Edit: Your count increases too frequently. Simply try this code to count the number of loop iterations:
for (int n = 0; n < 30; n++) {
    int sum = 0;
    for (int i = 0; i < n; i++) {
        for (int j = i; j < n; j++) {
            for (int k = 0; k < j; k++) {
                for (int h = k; h < i; h++) {
                    sum++;
                }
            }
        }
    }
    System.out.println(n + ": " + sum + " = " + (n + 2) * (n + 1) * n * (n - 1) / 24);
}
You have a rather complex algorithm. The number of operations is clearly less than n^4, but it isn't at all obvious how much less, nor whether it is O(n^3) or not.
Checking the values n = 1 to 9 and guessing from the results is rather pointless.
To get a slightly better idea, assume that the number of steps is either c * n^3 or d * n^4, and make a table of the values c and d for 1 <= n <= 1,000. That might give you a better idea. It's not a foolproof method; there are algorithms that change their behaviour dramatically well beyond n = 1,000.
The best method is of course a proof. Just remember that O(n^4) doesn't mean "approximately n^4 operations"; it means "at most c * n^4 operations, for some constant c". Sometimes c is small.
