What is the O notation of this loop?

I understand that this is O(N^2):
Loop from i=1 to N
    Loop from j=1 to N
        Do something with i,j
But what about this?
Loop from i=1 to N
    Loop from j=1 to i
        Do something with i,j
Is it still O(N^2) or O(N log N)? I don't really understand how to tell.

This is also O(N^2).
The total count is N(N+1)/2 ~ O(N^2):
i = 1: j = 1
i = 2: j = 1 to 2
i = 3: j = 1 to 3
i = 4: j = 1 to 4
...
i = N: j = 1 to N
So the total is 1 + 2 + 3 + 4 + ... + N = (N * (N+1))/2 ~ O(N^2).
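If you want to see the count rather than derive it, here is a minimal C sketch (my own illustration; the loop bounds are arbitrary test values) that tallies the iterations of the triangular loop against the closed form N(N+1)/2:

#include <stdio.h>

int main(void) {
    for (int N = 10; N <= 1000; N *= 10) {
        long count = 0;
        for (int i = 1; i <= N; i++)
            for (int j = 1; j <= i; j++)
                count++;                        /* stands in for "do something with i,j" */
        /* compare against the closed form derived above */
        printf("N=%4d  count=%7ld  N(N+1)/2=%7ld\n",
               N, count, (long)N * (N + 1) / 2);
    }
    return 0;
}

Both columns agree exactly, and the quadratic growth is easy to see as N increases.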

For the second problem, the running time is roughly (1/2)N^2, which becomes O(N^2), as we don't care about constant factors in O notation. A log N factor usually appears when an algorithm cuts the problem down to half its size in each iteration. Take merge sort, for example: each level of recursion divides the array in half.
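To see where such a log N comes from, here is a tiny C sketch (my own illustration, with arbitrary test sizes) of the halving pattern this answer refers to: a loop that halves its problem size runs about log2(N) times.

#include <stdio.h>
#include <math.h>

int main(void) {
    /* a loop that halves the problem size each step runs ~log2(N) times */
    for (long N = 16; N <= 1048576; N *= 32) {
        int steps = 0;
        for (long size = N; size > 1; size /= 2)
            steps++;
        printf("N=%8ld  halving steps=%2d  log2(N)=%4.1f\n",
               N, steps, log2((double)N));
    }
    return 0;
}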

Also O(n^2).
You have to look at the worst case of how long your code will run.
The first loop runs from 1 to N.
For each iteration of that loop there is a second loop, which runs from 1 to i.
Since i reaches N on the last iteration, an upper bound is O(N*N), which is O(N^2).
We ignore constants in big-O notation.
If these concepts are difficult, look up some tutorials and worked examples. All you need is some practice, and you will get it.

Related

Run-Time complexities of the following functions

I need some help with these functions and whether my run-time complexities for them are correct. I'm currently learning the concepts in class, and I've looked at videos and such, but I can't find any explaining these tougher ones, so I'm hoping someone here can check whether I'm doing them right.
sum = 0
for i = 1 to n*n
    for j = 1 to i * i * i
        sum++
For this one I am thinking the answer is O(n^5), because the outer loop runs n^2 times while the inner loop runs n^3 times, and together that makes n^5.
sum = 0
for i = 1 to n^2            // O(n^2) times
    j = i
    while j > 0             // O(n+1), since the while loop checks the condition one more time
        sum++
        j = (j div 5)
For this run time I'm assuming it's going to run O(n^3 + 1) times, since the outer loop runs n^2 times and the while loop runs n+1 times, and together that's n^3 + 1.
for i = 1 to n              // n times
    for j = 1 to n {        // n^2 times
        C[i,j] = 0
        for k = 1 to n      // n^3 times?
            C[i,j] = C[i,j] + A[i,k]*B[k,j]
    }
So for this one I'm thinking it's O(n^6), but I'm really iffy on it. I have seen examples online where people work a loop out to O(n log n), but I'm totally lost on how that is found. Any help would be greatly appreciated!
Unfortunately, all three of your analyses are off. For the first, the inner loop runs i^3 times (not n^3), and i goes up to n^2, so the total is sum[i = 1..n^2] i^3 = (n^2 (n^2 + 1) / 2)^2 = Θ(n^8), not O(n^5). The third is the classic matrix-multiplication triple loop: each of the three nested loops runs n times, so the innermost statement executes n * n * n = n^3 times and the complexity is Θ(n^3), not O(n^6); the counts in your comments are running totals and must not be multiplied together again. For the second, the inner loop
while j > 0                 // O(n+1), since the while loop checks the condition one more time
    sum++
    j = (j div 5)
starts with j equal to i and divides j by 5 at each iteration, so it runs log(i) times (log base 5, but the base only changes a constant factor). In turn, i varies from 1 to n^2, and the total execution time is
sum[i = 1..n^2] log(i)
By the properties of logarithms this sum equals log((n^2)!). Using Stirling's approximation for the factorial, one obtains the time complexity O(n^2 log(n^2)) = O(2 n^2 log(n)) = O(n^2 log(n)).
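To make the log(i) inner loop concrete, here is a small C sketch (my addition; the sizes are arbitrary test values) that counts the total steps and divides by n^2 * log(n); the ratio hovers near a constant, as a Θ(n^2 log n) total predicts:

#include <stdio.h>
#include <math.h>

int main(void) {
    for (long n = 10; n <= 1000; n *= 10) {
        long steps = 0;
        for (long i = 1; i <= n * n; i++)
            for (long j = i; j > 0; j /= 5)     /* runs about log5(i) times */
                steps++;
        /* a Theta(n^2 log n) total keeps this ratio near a constant */
        printf("n=%5ld  steps=%10ld  steps/(n^2*log n)=%.3f\n",
               n, steps, steps / ((double)n * n * log((double)n)));
    }
    return 0;
}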

Complexity Analysis of the following loops

I have some exercises on the complexity analysis of double loops, and I don't know whether I'm doing them correctly.
for i = 1 to n do
    j = i
    while j < n do
        j = 2*j
    end while
end for
My answer for this one is O(n^2), because the first loop runs O(n) times and the inner one does O(n/2) iterations in the "worst" iteration of the outer loop. So O(n) * O(n/2) = O(n^2).
Looking a bit further, I think I can also say that the inner loop is doing a partial sum that is O(n/2) + O(n-1) + ... + O(1), and that this is also O(n).
for i = 1 to n do
    j = n
    while i*i < j do
        j = j - 1
    end while
end for
Again the outer loop is O(n), and the inner loop does O(sqrt(n)) work in the worst iteration, so here I think it's O(n*sqrt(n)), but I'm unsure about this one.
for i = 1 to n do
    j = 2
    while j < i do
        j = j*j
    end while
end for
Here the outer loop is O(n) and the inner loop does O(log n) work in the worst case. Hence I think this is O(n log n).
i = 2
while (i*i < n) and (n mod i != 0) do
    i = i + 1
end while
Finally, I don't know how to make sense of this one because of the modulus operator.
My questions are:
Did I do anything wrong in the first 3 examples?
Is the "worst-case approach" for the inner loops I'm doing correct?
How should I approach the last exercise?
First Question:
The inner loop takes log(n/i) time, since j doubles from i until it passes n. Bounding every iteration by O(log(n)) gives an upper bound of O(n*log(n)), but this bound is not tight. The exact total is sum[i = 1..n] log(n/i) = log(n^n / n!), and by Stirling's approximation n! ~ (n/e)^n, so the sum is about log(e^n) = n*log(e) = Θ(n). Intuitively, about half of the outer iterations (those with i > n/2) cost only one inner step, a quarter cost two steps, and so on, and sum[k >= 1] k/2^k converges to 2, so the total work is Θ(n).
Second Question:
The inner loop takes n - i^2 time (and O(1) if i^2 >= n). Notice that for i >= sqrt(n) the inner loop takes O(1) time, so we can run the outer loop only for i in 1..sqrt(n) and add O(n) to the result. An upper bound of n per inner loop gives a total time of O(n * sqrt(n) + n) = O(n^(3/2)). For a lower bound, take 3n/4 per inner loop and sum only over i up to sqrt(n)/2 (so that i^2 < n/4 and n - i^2 > 3n/4), which gives Ω(sqrt(n)/2 * 3n/4 + n) = Ω(n^(3/2)); thus the bound O(n * sqrt(n)) is indeed tight.
Third Question:
In this one j starts from 2 and we square it until it reaches i. After t steps of the inner loop, j equals 2^(2^t). We reach i when j = 2^(log(i)) = 2^(2^(log(log(i)))), i.e., after t = log(log(i)) steps. We can again give an upper bound and a lower bound as in the previous questions, and we get the tight bound Θ(n * log(log(n))).
Fourth Question:
The complexity can vary between 2 = O(1) and sqrt(n), depending on the factorization of n. The loop stops as soon as it finds a divisor of n or i*i reaches n, so the worst case is an n with no divisor below its square root, e.g., a prime (or the square of a prime), giving a complexity of O(sqrt(n)).
To answer your questions at the end:
1. Yes, you have done some things wrong. You reached wrong answers in 1 and 3, and in 2 your result is right but the reasoning is flawed: the inner loop is not O(sqrt(n)), as you can see in my analysis.
2. Considering the "worst case" of the inner loop is good, as it gives you an upper bound (which is often accurate in this kind of question), but to establish a tight bound you must also show a lower bound, usually by keeping only the larger terms and lowering each of them to the smallest, as I did in some of the examples. Another way to prove tight bounds is to use formulas for known series, such as 1 + ... + n = n * (n + 1) / 2, which gives an immediate Θ(n^2) instead of deriving the lower bound via 1 + ... + n >= n/2 + ... + n >= n/2 + ... + n/2 = n/2 * n/2 = n^2/4 = Ω(n^2).
3. Answered above.
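Since the Θ(n) total for the first loop is the counterintuitive part, a quick empirical check helps (a C sketch I added; the sizes are arbitrary). If the total were Θ(n log n), steps/n would keep growing with n; instead it settles near 2:

#include <stdio.h>

int main(void) {
    for (long n = 1000; n <= 1000000; n *= 10) {
        long steps = 0;
        for (long i = 1; i <= n; i++)
            for (long j = i; j < n; j *= 2)     /* the doubling inner loop */
                steps++;
        /* steps/n approaches a constant, confirming Theta(n) */
        printf("n=%8ld  steps=%9ld  steps/n=%.3f\n",
               n, steps, (double)steps / n);
    }
    return 0;
}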
For the first one, the inner loop runs through the values
i, 2*i, 4*i, ..., (2^k)*i, where (2^k)*i < n. So k < log(n) - log(i). The outer loop, as you said, repeats n times. In total we have the sum
sum[i = 1..n] (log(n) - log(i))
which equals
n*log(n) - log(n!)
By Stirling's approximation, log(n!) = n*log(n) - Θ(n), so this difference is Θ(n); O(n*log(n)) is a valid but loose upper bound.
For the second one we have
sum[i = 1..sqrt(n)] (n - i^2) = Θ(n^(3/2))
For the third one
sum[i = 1..n] log(log(i)) <= sum[i = 1..n] log(i) = log(n!)
so I think it should be O(log(n!)) = O(n*log(n)), although the tight bound, as shown above, is Θ(n*log(log(n))).
For the last one, if n is even it will be O(1), because n mod 2 = 0 and we never enter the loop. But the worst case is when n has no divisor up to its square root (for example, when n is prime); then the loop runs until i*i >= n, so it should be O(sqrt(n)).
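To make the last exercise concrete, here is a small C sketch (my addition; the test values are my own, 999983 being a prime and 994009 = 997^2): the loop is trial division, so an even n exits immediately, while a prime n drives it through about sqrt(n) iterations.

#include <stdio.h>

/* counts the iterations of: i = 2; while (i*i < n and n mod i != 0) i = i + 1 */
long iterations(long n) {
    long count = 0;
    for (long i = 2; i * i < n && n % i != 0; i++)
        count++;
    return count;
}

int main(void) {
    printf("n=1000000 (even):    %ld iterations\n", iterations(1000000)); /* 0 */
    printf("n=999983  (prime):   %ld iterations\n", iterations(999983));  /* ~sqrt(n) */
    printf("n=994009  (= 997^2): %ld iterations\n", iterations(994009));  /* ~sqrt(n) */
    return 0;
}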

time complexity (with respect to the input n)

I was asked what the time complexity of this is:
What is the time complexity (with respect to n) of this algorithm?
k = 0
for (i = n / 2; i < n; i++) {
    for (j = 0; j < i; j++)
        k = k + n / 2
}
The choices were: a. O(n), b. O(n/2), c. O(n log(n)), and d. O(n^2).
It can have multiple answers.
I know the algorithm above is d. O(n^2), but I came up with a. O(n), since it asks for the complexity with respect to n only.
If you were given this question, how would you answer it? I'm so curious about the answer.
The answer is O(n²).
This is easy to understand; let me walk you through it.
See, the outer for loop block is executed n - n/2 = n/2 times.
Of course, this depends on whether n is even or odd: if it's even, the outer loop executes n/2 times; if it's odd, (n-1)/2 times.
But for time complexity we don't consider this. We just take the outer for loop as executing n/2 times, with i starting at n/2 and ending at n - 1 (because the terminating condition is i < n, not i <= n).
For each iteration of the outer loop, the inner loop executes i times.
For every iteration, the inner loop goes from j = 0 to j = i - 1, which means it executes i times (not i - 1 times, because j starts from 0 and not from 1).
Therefore, for the 1st iteration of the outer loop the inner loop executes i = n/2 times, for the 2nd iteration i = n/2 + 1 times, and so on, up to i = n - 1 times.
Now, the total number of times the inner loop executes is n/2 + (n/2 + 1) + (n/2 + 2) + ... + (n - 2) + (n - 1). This is an arithmetic series with n/2 terms whose average is (n/2 + n - 1)/2, so it sums to (n/2) * (3n/2 - 1) / 2 = (3n^2 - 2n)/8.
So the time complexity becomes O((3n^2 - 2n)/8).
But we ignore the n term, because n^2 dominates it, and we ignore the constant factors, because they do not change with n.
Therefore, the final time complexity is O(n²).
Hope this helps you understand.
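If you want to verify the count, here is a quick C sketch (my addition, assuming even n so the closed form is exact) that tallies the inner-loop executions against (3n^2 - 2n)/8:

#include <stdio.h>

int main(void) {
    for (long n = 10; n <= 10000; n *= 10) {
        long count = 0;
        for (long i = n / 2; i < n; i++)
            for (long j = 0; j < i; j++)
                count++;                        /* stands in for k = k + n/2 */
        /* closed form for even n: sum of i from n/2 to n-1 */
        printf("n=%6ld  count=%9ld  (3n^2-2n)/8=%9ld\n",
               n, count, (3 * n * n - 2 * n) / 8);
    }
    return 0;
}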

How do you find the asymptotic time complexity of these loops?

How do you find the asymptotic time complexity of this code?
for i = 1 to n do
    j = 2
    while j < i do
        j = j*j
In my notes I have the answer O(n*log(logn)) but without an explanation.
The outer for loop runs n times. The inner loop squares j at every step and so takes O(log(log n)) time, giving a total complexity of O(n*log(log n)).
To understand why repeated squaring takes O(log(log n)) time, see it this way:
Suppose n is a large number such as 2^16.
Initially: j = 2
1st step : j = 2^2
2nd step : j = 2^4
3rd step : j = 2^8
4th step : j = 2^16.
Hence it takes only 4 steps, which is loglog(2^16) = log(16) = 4.
So for any n = 2^k: you start with 2, and every time you square. There can be at most O(log k) squarings before j reaches n. Since n = 2^k, we have k = log(n), and hence O(log k) is the same as O(log(log n)).
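Here is a tiny C illustration of that argument (a sketch I added; the three values of n are arbitrary powers of two): it counts the squaring steps needed to reach n and prints log2(log2(n)) alongside.

#include <stdio.h>
#include <math.h>

int main(void) {
    /* n = 2^16, 2^32, 2^64, stored as doubles so the squaring cannot overflow */
    double ns[] = { 65536.0, 4294967296.0, 18446744073709551616.0 };
    for (int t = 0; t < 3; t++) {
        double n = ns[t], j = 2.0;
        int steps = 0;
        while (j < n) {
            j = j * j;                          /* square at every step */
            steps++;
        }
        printf("n=2^%-2.0f  steps=%d  log2(log2(n))=%.0f\n",
               log2(n), steps, log2(log2(n)));
    }
    return 0;
}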

derivation of algorithm complexity

Brushing up on algorithm complexity, I was looking at this example:
int x = 0;
for ( int j = 1; j <= n; j++ )
    for ( int k = 1; k < 3*j; k++ )
        x = x + j;
I know this loop ends up being O(n^2). I believe the inner loop executes at most 3*n times per outer iteration (3(1+2+...+n) in total), and the outer loop executes n times. So, O(3n*n) = O(3n^2) = O(n^2).
However, the source I'm looking at expands the execution count of the inner loop to 3(1+2+3+...+n) = 3n^2/2 + 3n/2. Can anyone explain the 3n^2/2 + 3n/2 count?
For each j you execute about 3*j iterations of the internal loop, so the command x = x + j is executed roughly 3 * (1 + 2 + 3 + ... + n) times. The sum of this arithmetic progression is n*(n+1)/2, so the command is executed:
3 * n * (n+1)/2, which equals (3*n^2)/2 + (3*n)/2
But big O is not about the exact iteration count; it is an asymptotic measure. So in the expression 3*n*(n+1)/2 we drop the constants (set each to 0 or 1), giving 1*n*(n+0)/1 = n^2.
A small intuition for the big-O calculation in this case: to turn 3n(n+1)/2 into big O, imagine n going to infinity, so that
infinity + 1 = infinity
3*infinity = infinity
infinity/2 = infinity
infinity*infinity = infinity^2
After this you are left with n^2.
The sum of integers from 1 to m is m*(m+1)/2. In the given problem, j goes from 1 to n, and k goes from 1 to 3*j. So the inner loop on k is executed 3*(1+2+3+4+5+...+n) times, with each term in that series representing one value of j. That gives 3n(n+1)/2. If you expand that, you get 3n^2/2+3n/2. The whole thing is still O(n^2), though. You don't care if your execution time is going up both quadratically and linearly, since the linear gets swamped by the quadratic.
Big O notation gives an upper bound on the asymptotic running time of an algorithm. It does not take lower-order terms or constant factors into account. Therefore O(10n^2) and O(1000n^2 + 4n + 56) are both still O(n^2).
What you are doing is trying to count the exact number of operations in your algorithm. However, big O does not say anything about the exact number of operations. It simply provides an upper bound on the worst-case running time that may occur with an unfavorable input.
The exact operation count can be found using sigma notation:
sum[j = 1..n] sum[k = 1..3j-1] 1 = sum[j = 1..n] (3j - 1) = 3n(n+1)/2 - n = (3n^2 + n)/2
It's been empirically verified.
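For completeness, a short C check (my addition, with arbitrary test sizes) counts the exact executions of x = x + j and compares them with the sigma result above:

#include <stdio.h>

int main(void) {
    for (long n = 10; n <= 1000; n *= 10) {
        long x = 0, count = 0;
        for (long j = 1; j <= n; j++)
            for (long k = 1; k < 3 * j; k++) {
                x = x + j;
                count++;
            }
        /* the inner loop runs 3j - 1 times, so the total is (3n^2 + n)/2 */
        printf("n=%5ld  count=%8ld  (3n^2+n)/2=%8ld\n",
               n, count, (3 * n * n + n) / 2);
    }
    return 0;
}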
