Analyzing programs - algorithm

I was wondering if anyone could explain to me the semantics of analyzing programs. I get how to do simple ones, but there are some more complicated ones I am not sure how to do. For example here is a question from my book. We are given 6 little segments of code and told to analyze them:
(1). sum = 0;
     for( i = 0; i < n; ++i )
         ++sum;
(2). sum = 0;
     for( i = 0; i < n; ++i )
         for( j = 0; j < n; ++j )
             ++sum;
(3). sum = 0;
     for( i = 0; i < n; ++i )
         for( j = 0; j < n * n; ++j )
             ++sum;
(4). sum = 0;
     for( i = 0; i < n; ++i )
         for( j = 0; j < i; ++j )
             ++sum;
(5). sum = 0;
     for( i = 0; i < n; ++i )
         for( j = 0; j < i * i; ++j )
             for( k = 0; k < j; ++k )
                 ++sum;
(6). sum = 0;
     for( i = 1; i < n; ++i )
         for( j = 1; j < i * i; ++j )
             if( j % i == 0 )
                 for( k = 0; k < j; ++k )
                     ++sum;
I understand 1, 2, and 4. Those ones are easy. The ones I don't get are 3, 5, and 6.
1 runs n times, so that one is in Big-Oh(n). 2 has two for loops which each run n times, so that one is Big-Oh(n^2). 4 is something I have seen before: the inner loop runs as many times as the value of i, so if i = 1 the loop runs once, if i = 2 the loop runs twice, giving the pattern 1 + 2 + 3 + ... + n, which sums to n(n + 1) / 2, which means the whole thing is in Big-Oh(n^2). I am unsure how to go about 3 with the n * n in the condition. That is also the reason I am not sure how to go about 5, with the i * i in there as well. As for 6, not only do we have the i * i but there is also an if statement that may or may not run. What do I do with that? Can anyone help explain how to do those ones? Thanks.
UPDATE I had an idea about 3. The outer for loop in that one runs n times, the inner for loop runs n^2 times. So for that one, would we have n * n^2 which is n^3? So would that one be in Big-Oh(n^3)?
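(Just to sanity-check that idea I also wrote a tiny brute-force counter; this program is my own addition, not from the book, and it only confirms the count empirically rather than proving anything. For fragment 3 the count comes out as exactly n^3.)

#include <stdio.h>

int main(void) {
    /* brute-force check of fragment (3): sum should equal n^3 exactly */
    for (int n = 50; n <= 400; n *= 2) {
        long long sum = 0;
        for (int i = 0; i < n; ++i)
            for (int j = 0; j < n * n; ++j)
                ++sum;
        printf("n = %3d  sum = %10lld  n^3 = %10lld\n",
               n, sum, (long long)n * n * n);
    }
    return 0;
}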

You figured the third one out. The same reasoning applies to all of them: your real aim is to find the final value of sum in terms of n.
For the fifth one, that means finding the sum ∑_{i=0}^{n} ( ∑_{j=0}^{i^2} j ).
The inner summation is just (i^2 * (i^2 + 1)) / 2, which follows from the simple summation formula.
Then you have ∑_{i=0}^{n} (i^2 * (i^2 + 1)) / 2, which is equal to [ ∑_{i=0}^{n} i^4 + ∑_{i=0}^{n} i^2 ] / 2.
This is O(n^5), looking at the dominating term ∑_{i=0}^{n} i^4, which you can think of as an integral: you just increase the exponent and get n^5. The sixth one is left as an exercise; just imagine what sum will be in the end.
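If you want to convince yourself numerically, here is a small brute-force counter (just a quick sketch for a sanity check, not a proof): for an O(n^5) count, doubling n should multiply sum by a factor approaching 2^5 = 32 as n grows.

#include <stdio.h>

int main(void) {
    /* brute-force count of ++sum in fragment (5) */
    long long prev = 0;
    for (int n = 8; n <= 64; n *= 2) {
        long long sum = 0;
        for (int i = 0; i < n; ++i)
            for (int j = 0; j < i * i; ++j)
                for (int k = 0; k < j; ++k)
                    ++sum;
        if (prev)
            printf("n = %2d  sum = %12lld  ratio to previous = %.1f\n",
                   n, sum, (double)sum / (double)prev);
        else
            printf("n = %2d  sum = %12lld\n", n, sum);
        prev = sum;
    }
    return 0;
}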

Related

Finding Big O Notation of For loop which is inside an If condition

sum = 0;
for( i = 1; i < n; ++i )
    for( j = 1; j < i * i; ++j )
        if( j % i == 0 )
            for( k = 0; k < j; ++k )
                ++sum;
How do I find the Big O notation for this code? I'm new to this Big O notation thing, so I'd appreciate it if someone could explain it to me simply and in detail. Thank you!
Big O is an asymptotic upper bound of a function. So in your case the for loops take the most time. If the if condition always evaluated to true, you could just assume that it does and get a correct upper bound, which may not be tight. But there are a lot of cases where you cannot do better than this.
In some cases you can try to remove the if while roughly preserving the number of operations. E.g. in your case you could replace j = 1 by j = i and ++j by j += i. This does not change the algorithm; it only changes the way you look at it for the complexity analysis. You still have to remember that the middle for loop itself takes i * i steps. Now you have this:
sum = 0;
for( i = 1; i < n; ++i )
    // O(i * i) operations
    for( j = i; j < i * i; j += i )
        for( k = 0; k < j; ++k )
            ++sum;
You can also assume that the if condition is always false. This way you get a lower bound. In some cases the upper and lower bounds match, meaning that the part you had trouble analyzing is actually irrelevant for the overall complexity.
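As a quick check of how tight the upper bound is here, you can simply count the ++sum executions of the original if-guarded code and divide by n^4; the sketch below (my own, using only the code from the question) shows that ratio settling toward a constant, i.e. the count really does grow like n^4.

#include <stdio.h>

int main(void) {
    /* count ++sum in the original if-guarded code and compare with n^4 */
    for (int n = 16; n <= 128; n *= 2) {
        long long sum = 0;
        for (int i = 1; i < n; ++i)
            for (int j = 1; j < i * i; ++j)
                if (j % i == 0)
                    for (int k = 0; k < j; ++k)
                        ++sum;
        printf("n = %3d  sum = %12lld  sum / n^4 = %.3f\n",
               n, sum, (double)sum / ((double)n * n * n * n));
    }
    return 0;
}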

Having trouble finding the Big-Oh value

Please refer to the code fragment below:
sum = 0;
for( i = 0; i < n; i++ )
    for( j = 0; j < i * i; j++ )
        for( k = 0; k < j; k++ )
            sum++;
If one was to analyze the run-time of this fragment in Big-Oh Notation, what would be the most appropriate value? Would it be O(N^4)?
I have to say O(n^5). I may be wrong. I am assuming n is the length of the array in this question. If you substitute it, i will stop when i < n, j will stop when j < n * n, and k will stop when k < n * n. Since these are nested for loops, the run time is O(n^5).
I think it's O(n^5)
We know that
1 + 2 + 3 + ... + n = n(n+1) / 2 = O(n^2)
1^2 + 2^2 + ... + n^2 = n(n+1)(2n+1) / 6 = O(n^3)
I think that a similar statement is true for sums of bigger powers.
Let's think about these two lines
for( j = 0; j < i * i; j++ )
for( k = 0; k < j; k++ )
This loop's time complexity for fixed i is
0 + 1 + 2 + ... + (i^2 - 1) = (i^2 - 1) * i^2 / 2 = O(i^4)
But in our code i changes from 0 to n-1, so we are essentially adding fourth powers, and as I said before
1^k + 2^k + ... + n^k = O(n^(k+1))
That's why time complexity is O(n^5)
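The power-sum rule used above, 1^k + 2^k + ... + n^k = O(n^(k+1)), is easy to check numerically for k = 4: the ratio of the sum to n^5 settles near a constant (about 1/5, matching the integral of x^4). The snippet below is only a quick illustration of that rule, not part of the original answer.

#include <stdio.h>

int main(void) {
    /* check 1^4 + 2^4 + ... + n^4 against n^5: the ratio approaches 1/5 */
    for (int n = 10; n <= 10000; n *= 10) {
        double sum = 0.0;
        for (int i = 1; i <= n; ++i)
            sum += (double)i * i * i * i;
        double n5 = (double)n * n * n * n * n;
        printf("n = %6d  sum / n^5 = %.4f\n", n, sum / n5);
    }
    return 0;
}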
Given this code, consider each loop separately.
sum = 0;
for( i = 0; i < n; i++ )          //L1
    for( j = 0; j < i * i; j++ )  //L2
        for( k = 0; k < j; k++ )  //L3
            sum++;
Consider the outermost loop (L1) alone. Clearly this loop is O(n).
for( i = 0; i < n; i++ )  //L1
    L2(i);
Consider the next loop (L2). This one is harder, but the loop is O(n^2).
See this SO answer for why the loop is O(n^2): Big O, how do you calculate/approximate it?
for( j = 0; j < i * i; j++ )  //L2
    L3(j);
Consider the next loop (L3). Since you know that the L2 loop is O(N^2), what is this loop? (Leaving the reader to complete their homework.)
for( k = 0; k < j; k++ )  //L3
    sum++;
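One way to make this loop-by-loop decomposition concrete is to turn each labelled loop into its own function and count the innermost statement; the L1/L2/L3 function names below simply mirror the labels in the code above, and the program is only an illustrative sketch.

#include <stdio.h>

static long long count;              /* counts executions of sum++ */

static void L3(int j) {              /* innermost loop: O(j) */
    for (int k = 0; k < j; k++)
        count++;
}

static void L2(int i) {              /* middle loop: calls L3 for j = 0 .. i*i - 1 */
    for (int j = 0; j < i * i; j++)
        L3(j);
}

static void L1(int n) {              /* outer loop: calls L2 for i = 0 .. n - 1 */
    for (int i = 0; i < n; i++)
        L2(i);
}

int main(void) {
    for (int n = 8; n <= 64; n *= 2) {
        count = 0;
        L1(n);
        printf("n = %2d  count = %12lld\n", n, count);
    }
    return 0;
}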

Determine Big-O runtime for these loops

I am trying to determine the Big-O runtimes for these loops. I believe the answers I have are correct, but I would like to check with the community.
int sum = 0;
for (int i = 1; i <= n*2; i++ )
    sum++;
My answer is O(n)
This is because the loop iterates 2n times. We drop the constant factor of 2 and are left with n. Therefore O(n).
int sum = 0;
for ( int i = 1; i <= n; i++)
    for ( int j = n; j > 0; j /= 2)
        sum++;
My answer is O(n lgn)
The outer loop iterates n times. The inner loop starts j at n and halves it each time, so it runs about log base 2 of n times; since log bases differ only by a constant factor, we just write log n. The inner loop (log n) times the outer loop (n) gives us O(n lg n).
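(To double-check myself I also counted the inner statement directly and compared it against n * log2(n); this little program is just my own sanity check, with log2 coming from math.h, and the two columns track each other closely.)

#include <stdio.h>
#include <math.h>

int main(void) {
    /* count sum++ directly and compare with n * log2(n) */
    for (int n = 1024; n <= 262144; n *= 16) {
        long long sum = 0;
        for (int i = 1; i <= n; i++)
            for (int j = n; j > 0; j /= 2)
                sum++;
        printf("n = %8d  sum = %10lld  n*log2(n) = %10.0f\n",
               n, sum, (double)n * log2((double)n));
    }
    return 0;
}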
int sum = 0;
for ( int i = 1; i <= n; i++)
    for ( int j = i; j <= n; j += 2)
        sum++;
My answer is O(n^2)
This one is easy. The outer loop iterates n times, and the inner loop iterates at most about n/2 times (it steps by 2), which is still O(n). n x n = n^2, therefore O(n^2).
int sum = 0;
for ( int i = 1; i <= n * n; i++)
    for ( int j = 1; j < i; j++ )
        sum++;
My answer is O(n^3)
Are my answers correct? If not, how can I correct them?
Only the last one is wrong. It should be O(n⁴).
You can see it this way: substitute n * n with x. The number of operations is then the usual x * (x + 1) / 2 = O(x^2). Now substitute n * n back in for x.
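If it helps to see it numerically, the sketch below (my own addition, assuming I have read the loop bounds right) counts sum++ for that last fragment and compares it with x * (x - 1) / 2 for x = n * n, which matches the count exactly; either way it is on the order of x^2 = n^4.

#include <stdio.h>

int main(void) {
    /* count sum++ for the last fragment and compare with x*(x-1)/2, x = n*n */
    for (int n = 4; n <= 64; n *= 2) {
        long long sum = 0;
        for (long long i = 1; i <= (long long)n * n; i++)
            for (long long j = 1; j < i; j++)
                sum++;
        long long x = (long long)n * n;
        printf("n = %2d  sum = %14lld  x*(x-1)/2 = %14lld\n",
               n, sum, x * (x - 1) / 2);
    }
    return 0;
}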
Extra challenge for you.
You correctly said that:
int sum = 0;
for ( int i = 1; i <= n; i++)
    for ( int j = n; j > 0; j /= 2)
        sum++;
Is O(n log n). But what about this one:
int sum = 0;
for ( int i = 1; i <= n; i++)
    for ( int j = i; j > 0; j /= 2)
        sum++;
I only changed j = n to j = i.
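If you want to test your answer to the challenge before committing to it, a small counting harness like this one (my own sketch, not part of the challenge) prints the iteration counts of the original and the modified version side by side so you can compare their growth.

#include <stdio.h>

int main(void) {
    for (int n = 1024; n <= 262144; n *= 8) {
        long long a = 0, b = 0;
        for (int i = 1; i <= n; i++)
            for (int j = n; j > 0; j /= 2)   /* original: j starts at n */
                a++;
        for (int i = 1; i <= n; i++)
            for (int j = i; j > 0; j /= 2)   /* challenge: j starts at i */
                b++;
        printf("n = %6d  original = %10lld  challenge = %10lld\n", n, a, b);
    }
    return 0;
}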

Nested loops, how many times run and complexity

I have these two pieces of code. The question is to find how many times x = x + 1 will run in each case, where T1(n) stands for code 1 and T2(n) stands for code 2. Then I have to find the Big O of each one. I know how to do that part; the thing is, I get stuck on finding how many times (in terms of n, of course) x = x + 1 will run.
CODE 1:
for( i = 1; i <= n; i++)
{
    for( j = 1; j <= sqrt(i); j++)
    {
        for( k = 1; k <= n - j + 1; k++)
        {
            x = x + 1;
        }
    }
}
CODE 2:
for( j = 1; j <= n; j++)
{
    h = n;
    while( h > 0 )
    {
        for( i = 1; i <= sqrt(n); i++)
        {
            x = x + 1;
        }
        h = h / 2;
    }
}
I am really stuck and have already read a lot, so I am asking if someone can help me; please explain it to me analytically.
PS: I think that in code 2 this for (i = 1; i <= sqrt(n); i++) will run n*log(n) times, right? Then what?
For code 1, the number of calls of x = x + 1 is
T1(n) = ∑_{i=1}^{n} ∑_{j=1}^{sqrt(i)} (n - j + 1) <= ∑_{i=1}^{n} sqrt(i) * n = n * (1 + sqrt(2) + ... + sqrt(n)) <= n * n*sqrt(n) = O(n^(5/2)).
Here we bounded 1 + sqrt(2) + ... + sqrt(n) by n*sqrt(n), and used the fact that in (n - j + 1) the leading term is n.
For code 2 the calculations are simpler:
T2(n) = ∑_{j=1}^{n} ∑_{t=0}^{log n} ∑_{i=1}^{sqrt(n)} 1 = n * (log n + 1) * sqrt(n) = O(n^(3/2) * log n).
The second loop actually goes from h = n down to 0 by iterating h = h/2, but you can see that this is the same as letting t go from 0 to log n. What we used is the fact that j, t and i are mutually independent (analogously, just like we can write that the sum from 1 to n of f(n) is just n*f(n)).
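To double-check those bounds numerically, here is a small counting program (my own addition, not part of the original answer); if the bounds are right, T1 / n^2.5 and T2 / (n^1.5 * log2(n)) should both level off near constants as n grows.

#include <stdio.h>
#include <math.h>

int main(void) {
    for (int n = 128; n <= 2048; n *= 4) {
        long long t1 = 0, t2 = 0;

        /* CODE 1 */
        for (int i = 1; i <= n; i++)
            for (int j = 1; j <= (int)sqrt((double)i); j++)
                for (int k = 1; k <= n - j + 1; k++)
                    t1++;

        /* CODE 2 */
        for (int j = 1; j <= n; j++) {
            int h = n;
            while (h > 0) {
                for (int i = 1; i <= (int)sqrt((double)n); i++)
                    t2++;
                h = h / 2;
            }
        }

        printf("n = %4d  T1 = %11lld  T1/n^2.5 = %.3f  T2 = %10lld  T2/(n^1.5*log2(n)) = %.3f\n",
               n, t1, (double)t1 / pow((double)n, 2.5),
               t2, (double)t2 / (pow((double)n, 1.5) * log2((double)n)));
    }
    return 0;
}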

Big-O analysis for a loop

I've got to analyze this loop, among others, and determine its running time using Big-O notation.
for ( int i = 0; i < n; i += 4 )
    for ( int j = 0; j < n; j++ )
        for ( int k = 1; k < j*j; k *= 2 )
Here's what I have so far:
for ( int i = 0; i < n; i += 4 ) = n
for ( int j = 0; j < n; j++ ) = n
for ( int k = 1; k < j*j; k *= 2 ) = log^2 n
Now the problem I'm coming to is the final running time of the loop. My best guess is O(n^2), however I am uncertain if this is correct. Can anyone help?
Edit: sorry about the Oh -> O thing. My textbook uses "Big-Oh"
First note that the outer loop is independent of the remaining two - it simply adds an (n/4) multiplier. We will consider that later.
Now let's consider the complexity of
for ( int j = 0; j < n; j++ )
    for ( int k = 1; k < j*j; k *= 2 )
We have the following sum:
0 + log2(1) + log2(2 * 2) + ... + log2(n*n)
It is good to note that log2(n^2) = 2 * log2(n). Thus we re-factor the sum to:
2 * (0 + log2(1) + log2(2) + ... + log2(n))
It is not very easy to analyze this sum, but take a look at this post. Using Stirling's approximation one can show that it belongs to O(n*log(n)). Thus the overall complexity is O((n/4) * 2 * n * log(n)) = O(n^2 * log(n)).
In terms of j, the inner loop takes O(log_2(j^2)) time, but since
log_2(j^2) = 2*log(j), it is actually O(log(j)).
For each iteration of the middle loop, it takes O(log(j)) time (to do the
inner loop), so we need to sum:
sum { log(j) | j=1,...,n-1 } = log(1) + log(2) + ... + log(n-1) = log((n-1)!)
And since log((n-1)!) is in O((n-1)*log(n-1)) = O(n*log(n)), we can conclude that the middle loop takes O(n*log(n)) operations.
Note that both the middle and inner loops are independent of i, so to
get the total complexity, we can just multiply n/4 (the number of
repetitions of the outer loop) by the complexity of the middle loop, and get:
O(n/4 * n*log(n)) = O(n^2*log(n))
So, the total complexity of this code is O(n^2 * log(n)).
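Both derivations predict the same growth, and you can confirm it empirically: with the quick counter below (my own sketch; it counts how many times the innermost loop body would execute), count / (n^2 * log2(n)) settles toward a constant.

#include <stdio.h>
#include <math.h>

int main(void) {
    for (int n = 64; n <= 1024; n *= 4) {
        long long count = 0;
        for (int i = 0; i < n; i += 4)
            for (int j = 0; j < n; j++)
                for (int k = 1; k < j * j; k *= 2)
                    count++;
        printf("n = %4d  count = %12lld  count/(n^2*log2(n)) = %.3f\n",
               n, count, (double)count / ((double)n * n * log2((double)n)));
    }
    return 0;
}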
Time complexity of a loop is considered O(n) if the loop variable is incremented / decremented by a constant amount (which is c in the examples below):
for (int i = 1; i <= n; i += c) {
    // some O(1) expressions
}
for (int i = n; i > 0; i -= c) {
    // some O(1) expressions
}
Time complexity of nested loops is equal to the number of times the innermost statement is executed. For example the following sample loops have O(n²) time complexity:
for (int i = 1; i <= n; i += c) {
    for (int j = 1; j <= n; j += c) {
        // some O(1) expressions
    }
}
for (int i = n; i > 0; i -= c) {
    for (int j = i+1; j <= n; j += c) {
        // some O(1) expressions
    }
}
Time complexity of a loop is considered O(logn) if the loop variable is divided / multiplied by a constant amount:
for (int i = 1; i <= n; i *= c) {
    // some O(1) expressions
}
for (int i = n; i > 0; i /= c) {
    // some O(1) expressions
}
Now we have:
for ( int i = 0; i < n; i += 4 )            <----- runs n/4 times, which is O(n)
    for ( int j = 0; j < n; j++ )           <----- for every i, runs n times
        for ( int k = 1; k < j*j; k *= 2 )  <----- for every j, runs a logarithmic number of times
So the complexity is O(n^2 * log m) where m is n^2, which can be simplified to O(n^2 * log n), because n^2 * log m = n^2 * log(n^2) = n^2 * 2 * log n ~ n^2 * log n.
