For loop - big-O

I am trying to do this problem out of a book and am struggling to understand the answer.
for (i = 0; i < N; ++i) {
    if ((i % 2) == 0) {
        outVal[i] = inVals[i] * i;
    }
}
Here's how I was breaking it down:
i = 0 -> executes 1 time
i < N and ++i each execute once every iteration, so 1n + 1n = 2n.
The if statement contains 2 operations (the modulo and the comparison), so now we are at 4n + 1.
The contents of the if statement only execute n/2 times, so we are at 4n + 1 + n/2.
However, big O drops those lower-order terms and constants, leaving us with O(N) as the answer.
Here's what I don't get: the explanation for the answer to my problem says this:
outVal[i] = inVals[i] * i; executes every other loop iteration, so the total number of operations includes: 1 operation at the start of the loop, 3 operations every loop iteration, 1 operation every other loop iteration, and 1 operation for the final loop condition check.
How are there only 3 operations per loop iteration? I counted 4, as stated above. Please let me know the rationale behind this.

The complexity is measured by the time/space you take to accomplish a task. The operations i < N and ++i do not take time that depends on your size variable N (the length of the loop).
You should not count how many times each operation is done and sum them all up; instead, pick the one that takes the most time or space, as that is the algorithm's bottleneck. In a loop, most of the operations run the same number of times, so we use the length of the loop as its space or time complexity.
The loop will run N times, so that's its complexity -> O(n)
Inside the loop, the if block will run N/2 times, as you correctly said -> O(n/2)
But those runs are already counted in the loop's iterations. You do not add them again, since there are no extra iterations.
So, the complexity of the algorithm is O(n).
Regarding the operations, the 3 are:
checking i < N
incrementing i (++i)
evaluating the if condition
All of them are done in every iteration.
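If it helps to see where the book's total of 1 + 3N + N/2 + 1 comes from, here is a rough instrumented sketch (the counter ops and the array names are mine, not from the book) that tallies those operations alongside the real loop:

public class LoopOps {
    public static void main(String[] args) {
        int N = 10;
        int[] inVals = new int[N];   // placeholder data, assumed for this sketch
        int[] outVal = new int[N];
        long ops = 0;

        ops++;                              // i = 0 runs once at the start
        for (int i = 0; i < N; ++i) {
            ops += 3;                       // i < N, ++i, and the if test: 3 per iteration
            if ((i % 2) == 0) {
                ops++;                      // the assignment runs every other iteration
                outVal[i] = inVals[i] * i;
            }
        }
        ops++;                              // the final, failing i < N check

        // total = 1 + 3N + N/2 + 1, which is O(N)
        System.out.println("N = " + N + ", counted operations = " + ops);
    }
}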

Related

How is this nested loop algorithm analyzed?

So I have this algorithm analysis from my lecturer and I need some help: why is the outer loop n - 1, shouldn't it be n - 2? And shouldn't the inner loop be log3(n) instead of log3(n) + 1?
for(int a = 3; a <= n; a++)          n + 1 - 2 = n - 1
    for(int a = 1; a < n; a = a*3)   log3(n) + 1
        System.out.println(a);       log3(n)

Total = (n - 1) * (log3(n) + 1 + log3(n))
      = (n - 1) * (2 log3(n) + 1)
      = 2 n log3(n) + n - 1 - 2 log3(n)
      = n log3(n) + n - log3(n)
Is this the correct answer for the algorithm analysis? That's what my lecturer showed me. Can anyone explain it to me?
If you want a very precise count of the operations, it's important to specify: what operations are you counting? Number of increments a++? Number of comparisons a<=n? Number of executions of the loop body?
If you don't specify which operation you're counting, then there is not much sense in worrying about an extra +1 or -1.
Note that the variable used as counter for the outer loop is called a and the variable used as counter for the inner loop is also called a. While this is possible to do in C++, I strongly recommend not doing it. It's confusing and a good source of errors.
The outer loop is going to run for n-2 iterations. The inner loop is going to run for ceil(log3(n)) iterations (where ceil is the ceiling function). The line System.out.println(a) is not a loop, it's just one line of code, so you could write "1" on that line if you wanted; there is not much sense in writing "log3(n)" here.
The total number of times the line System.out.println(a) is executed is thus:
(n - 2) ceil(log3(n)).
It is possible that you might want to count the exact number of characters written. Again, we're coming back to the fact that you didn't specify what it is that you were counting. The number of characters depends on the exact value of a, so it changes at each iteration of the loop. But all in all, each call to System.out.println(a) prints about log10(n) characters, since we're writing in decimal.
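If you want a quick empirical check of that count, here is a small sketch (my own; the inner loop counter is renamed to b, as recommended above) that counts how often the println line would execute and compares it to (n - 2) * ceil(log3(n)):

public class NestedLogCount {
    public static void main(String[] args) {
        for (int n = 3; n <= 2000; n *= 2) {
            long counted = 0;
            for (int a = 3; a <= n; a++) {            // outer loop: n - 2 iterations
                for (int b = 1; b < n; b = b * 3) {   // inner loop: ceil(log3(n)) iterations
                    counted++;                        // stands in for System.out.println(a)
                }
            }
            long predicted = (n - 2) * (long) Math.ceil(Math.log(n) / Math.log(3));
            System.out.println("n=" + n + " counted=" + counted + " predicted=" + predicted);
        }
    }
}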

What are the number of operations in this method?

I am currently learning about asymptotic analysis, however I am unsure of how many operations are in these nested loops. Would somebody be able to help me understand how to approach them? Also what would the Big-O notation be? (This is also my first time on stack overflow so please forgive any formatting errors in the code).
public static void primeFactors(int n){
    while (n % 2 == 0) {
        System.out.print(2 + " ");
        n /= 2;
    }
    for (int i = 3; i <= Math.sqrt(n); i += 2) {
        while (n % i == 0) {
            System.out.print(i + " ");
            n /= i;
        }
    }
}
In the worst case, the first loop (the while) is O(log(n)). In the second part, the outer loop runs O(sqrt(n)) times and the inner loop runs O(log_i(n)) times. Hence, the time complexity of the second part (inner and outer together) is:
O(sum_{i = 3}^{sqrt(n)} log_i(n))
Therefore, the time complexity of the mentioned algorithm is O(sqrt(n) log(n)).
Notice that if you take into account that n is modified inside the inner loop, which shrinks the sqrt(n) bound checked by the outer loop, then the complexity of the second part is O(sqrt(n)). Under this assumption, the time complexity of the algorithm is O(sqrt(n)) + O(log(n)) = O(sqrt(n)).
First, we see that the first loop is really a special case of the inner loop that occurs in the other loop, with i equal to two. It was separated as a special case in order to be able to increase i with steps of 2 instead of 1. But from the point of view of asymptotic complexity the step by 2 makes no difference: it represents a constant coefficient, which we can ignore. And so for our analysis we can just rewrite the code to this:
public static void primeFactors(int n){
    for (int i = 2; i <= Math.sqrt(n); i += 1) { // note the change in start and increment value
        while (n % i == 0) {
            System.out.print(i + " ");
            n /= i;
        }
    }
}
The number of times that n /= i is executed corresponds to the number of non-distinct prime divisors a number has. According to this Q&A, that number is O(log log n). It is not straightforward to derive this, so I had to look it up.
We should also consider the number of times the for loop iterates. The Math.sqrt(n) boundary for i can decrease as the for loop iterates: the more divisions take place, the (much) fewer iterations the for loop has to make.
We can see that at the time that the loop exits, i has surpassed the square root of the greatest prime divisor of n. In the worst case that greatest prime divisor is n itself (when n is prime). So the for loop can iterate up to the square root of n, so O(√n) times. In that case the inner loop never iterates (no divisions).
We should thus see which is more determining, and we get O(√n + loglogn). This is O(√n).
The first loop divides n by 2 until it is no longer divisible by 2. The maximum number of times this can happen is log2(n).
The second loop, at first sight, seems to run the inner loop sqrt(n) times, where the inner loop is itself O(log(n)), but that is actually not the case. Every time the condition of the inner while loop is satisfied, n decreases drastically, and since the condition of a for loop is evaluated on each iteration, sqrt(n) decreases as well. The worst case actually happens if the condition of the while loop is never satisfied, i.e. if n is prime.
Hence the overall time complexity is O(sqrt(n)).
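To see this empirically, here is a rough instrumentation sketch (the counters trials and divisions are mine, and the printing of the factors is dropped): for a prime n the inner while body never runs and the work is dominated by the roughly sqrt(n)/2 odd trial divisors, while for a power of two only the first loop does any work.

public class PrimeFactorOps {
    static long divisions;   // how many times n /= 2 or n /= i executes
    static long trials;      // how many divisor candidates the for loop checks

    static void primeFactors(int n) {
        while (n % 2 == 0) {
            divisions++;
            n /= 2;
        }
        for (int i = 3; i <= Math.sqrt(n); i += 2) {
            trials++;
            while (n % i == 0) {
                divisions++;
                n /= i;
            }
        }
    }

    public static void main(String[] args) {
        divisions = 0; trials = 0;
        primeFactors(1_000_003);       // a prime: worst case, ~sqrt(n)/2 trials, 0 divisions
        System.out.println("prime: trials=" + trials + " divisions=" + divisions);

        divisions = 0; trials = 0;
        primeFactors(1 << 20);         // 2^20: 20 divisions by 2, then no trials at all
        System.out.println("2^20:  trials=" + trials + " divisions=" + divisions);
    }
}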

Big O sum of integers runtime

I am trying to learn Big O and am confused about an algorithm I just came across. The algorithm is:
void pairs(int[] array){
    for (int i = 0; i < array.length; i++){
        for (int j = i + 1; j < array.length; j++){
            System.out.println(array[i] + "," + array[j]);
        }
    }
}
I think the first for loop is O(n) and I know the second for loop is O(1/2*n(n+1)). The answer to the problem was that the run time for the function is O(n^2). I simplified O(1/2*n(n+1)) to O(1/2*(n^2+n)). So I'm confused because I thought that you needed to multiply the two run time terms since the for loop is nested, which would give O(n) * O(1/2*(n^2+n)). I simplified this to O(1/2n^3 + 1/2n^2). From what I understand of Big O, you only keep the largest term so this would reduce to O(n^3). Can anyone help me out with where I went wrong? Not sure how the answer is O(n^2) instead of O(n^3).
When you say the inner loop is O(1/2*n(n+1)), you are actually describing the big-O complexity of both loops.
To say that the outer loop has complexity O(N) basically means its body runs N times. But for your calculation of the inner loop's complexity, you already took account of all iterations of the outer loop, because you added up the number of times the inner loop runs over all iterations of the outer loop. If you multiply again by N, you would be saying that the outer loop itself is re-run another N times.
Put another way, what your analysis shows is that the inner loop body (the System.out.println call) runs 1/2*n(n+1) times overall. That means the overall complexity of the two-loop combination is O(1/2*n(n+1)) = O(n^2). The overall complexity of the two-loop combination describes how many times the innermost code is run.
Your mistake is counting the second loop as O(1/2*n^2) and then multiplying by N again...
First, you can clearly see the inner loop is capped at N-1 iterations (when i = 0).
The first loop is clearly N.
The second loop is at most N-1...
Therefore, O(N^2)...
If we examine it a little more closely,
the second loop will run N-1 times when i = 0,
then N-2 times for i = 1,
and ONE single time for i = n-1.
This is 1/2 n(n-1) = 1/2 n^2 - 1/2 n = O(n^2).
Notice this already includes all iterations of the outer loop!
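Here is a small sketch (the helper countPairs is mine) that confirms the inner statement runs exactly n(n-1)/2 times, which is why the whole thing is O(n^2):

public class PairsCount {
    static long countPairs(int n) {
        long count = 0;
        for (int i = 0; i < n; i++) {
            for (int j = i + 1; j < n; j++) {
                count++;               // stands in for the println call
            }
        }
        return count;
    }

    public static void main(String[] args) {
        for (int n : new int[]{1, 2, 5, 10, 100}) {
            System.out.println("n=" + n
                    + " counted=" + countPairs(n)
                    + " formula=" + (long) n * (n - 1) / 2);
        }
    }
}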

While loop Time complexity

I have an algorithm exam, and I am not great with loop time complexity; I just started to get the basics of it.
I have this while loop
i = 2
while (i < n)
{
    i = i * i
    x = x + 1
}
I believe that the solution must be something like:
i will run from 2 to 2^k, where k = 2^i,
and every time it executes the statement 1 time,
so 1 + 1 + 1 + ..., which means 1 * 2^k,
and from here I can't continue.
The second question: please recommend a site or something where I can practice more of these; I searched but didn't find one.
The loop runs as long as i is less than n. Since i starts at 2 and is squared on each pass, after k iterations i = 2^(2^k). In other words, you need to find the smallest k such that 2^(2^k) >= n, which gives k = log2(log2(n)). So the loop iterates log2(log2(n)) times, and at each iteration it performs 2 operations (a multiplication and an addition); these operations take O(1) time.
==> The time to execute the loop is log2(log2(n)) * O(1) ==> the total complexity is O(log log n).
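As a quick sanity check, this sketch (my own) counts how many times the body of the squaring loop runs and prints log2(log2(n)) next to it:

public class SquaringLoop {
    public static void main(String[] args) {
        for (long n : new long[]{10, 100, 10_000, 1_000_000_000L}) {
            long i = 2, iterations = 0;
            while (i < n) {
                i = i * i;          // i is squared, so after k steps i = 2^(2^k)
                iterations++;
            }
            double loglog = Math.log(Math.log(n) / Math.log(2)) / Math.log(2);
            System.out.printf("n=%d iterations=%d log2(log2(n))=%.2f%n", n, iterations, loglog);
        }
    }
}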

How to calculate worst case analysis of this algorithm?

sum = 0;
for (int i = 0; i < N; i++)
    for (int j = i; j >= 0; j--)
        sum++;
From what I understand, the first line is 1 operation, 2nd line is (i+1) operations, 3rd line is (i-1) operations, and 4th line is n operations. Does this mean that the running time would be 1 + (i+1)(i-1) + n? It's just these last steps that confuse me.
To analyze the algorithm you don't want to go line by line asking "how much time does this particular line contribute?" The reason is that each line doesn't execute the same number of times. For example, the innermost line is executed a whole bunch of times, compared to the first line which is run just once.
To analyze an algorithm like this, try identifying some quantity whose value is within a constant factor of the total runtime of the algorithm. In this case, that quantity would probably be "how many times does the line sum++ execute?", since if we know this value, we know the total amount of time that's spent by the two loops in the algorithm. To figure this out, let's trace out what happens with these loops. On the first iteration of the outer loop, i == 0 and so the inner loop will execute exactly once (counting down from 0 to 0). On the second iteration of the outer loop, i == 1 and the inner loop executes exactly twice (once with j == 1, once with j == 0). More generally, on the iteration of the outer loop where i == k, the inner loop executes k + 1 times. This means that the total number of iterations of the innermost loop is given by
1 + 2 + 3 + ... + N
This quantity can be shown to be equal to
N(N + 1) / 2 = (N^2 + N) / 2 = N^2/2 + N/2
Of these two terms, the N^2 / 2 term is the dominant growth term, and so if we ignore its constant factors we get a runtime of O(N^2).
Don't look at this answer as something you should memorize - think of all of the steps required to get to the answer. We started by finding some quantity to count, and then saw how that quantity was influenced by the execution of the loops. From this, we were able to derive a mathematical expression for that quantity, which we then simplified. Finally, we took the resulting expression and determined the dominant term, which serves as the big-O for the overall function.
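If you want to verify that count, here is a short sketch (my own) that runs the two loops and checks that sum comes out to N(N + 1)/2:

public class TriangularCount {
    public static void main(String[] args) {
        for (int N : new int[]{1, 2, 10, 1000}) {
            long sum = 0;
            for (int i = 0; i < N; i++)
                for (int j = i; j >= 0; j--)
                    sum++;
            System.out.println("N=" + N + " sum=" + sum
                    + " N(N+1)/2=" + (long) N * (N + 1) / 2);
        }
    }
}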
Work from inside-out.
sum++
This is a single operation on its own, as it doesn't repeat.
for(int j = i; j >= 0; j--)
This loops i+1 times. There are several operations in there, but you probably don't mean to count the number of asm instructions, so for this question I'll treat it as a multiplier of i+1. Since the loop's contents are a single operation, the loop and its block perform i+1 operations.
for(int i = 0; i < N; i++)
This loops N times, so as before it is a multiplier of N. Since the block performs i+1 operations, and summing i+1 over i = 0 to N-1 gives 1 + 2 + ... + N, this loop performs N(N+1)/2 operations in total. And that's your answer! If you want to consider big-O complexity, then this simplifies to O(N^2).
It's not additive: the inner loop happens once for EACH iteration of the outer loop. So it's O(n^2).
By the way, this is a good example of why we use asymptotic notation for this kind of thing -- depending on the definition of "operation" the exact details of the count could vary pretty widely. (Like, is sum++ a single operation, or is it add sum to 1 giving temp; load temp to sum?) But since we know that all that can be hidden in a constant factor, it's still going to be O(n^2).
No; you don't count a specific number of operations for each line and then add them up. The entire point of constructions like 'for' is to make it possible for a given line of code to run more than once. You're supposed to use thinking and logic skills to figure out how many times the line 'sum++' will run, as a function of N. Hint: it runs once for every time that the third line is encountered.
How many times is the second line encountered?
Each time the second line is encountered, the value of 'i' is set. How many times does the third line run with that value of i? Therefore, how many times will it run overall? (Hint: if I give you a different amount of money on several different occasions, how do you find out the total amount I gave you?)
Each time the third line is encountered, the fourth line happens once.
Which line happens most often? How often does it happen, in terms of N?
So what interests you is sum++ and how many times you execute it.
The final value of sum gives you that answer.
Actually your loop is just:
Sigma(i), with i going from 1 to N,
which is equal to N*(N+1) / 2. This gives you, in big-O notation, O(N^2).
Also, despite the name of your question, there is no worst case in your algorithm.
Or you could say that the worst case is when N goes to infinity.
Using Sigma notation to represent your loops: sum_{i=0}^{N-1} sum_{j=0}^{i} 1 = sum_{i=0}^{N-1} (i + 1) = N(N+1)/2, which is O(N^2).

Resources