I'm wondering what the big-O notation is for each of the statements below:
Sum = 0;
for i = 1 to N^2 do:
    for j = 1 to N do:
        Sum += 1;
Sum = 0 is O(1) for sure, because it will only be executed once.
But I'm confused by the second statement: should it be O(N) because it's the first loop, or O(N^2) because N^2 is a quadratic function of N?
The first loop is O(N^2) because it executes N^2 steps. Each of those steps executes the inner loop, which involves N steps, so there are N^2 * N, or N^3, steps, and the algorithm is O(N^3).
You'll be looping through N three times over, so I say O(n^3).
Algorithm:
Sum = 0;                  ~ 1
for i = 1 to N^2 do:      ~ 1 + 2N^2
    for j = 1 to N do:    ~ (1 + 2N) * N^2
        Sum += 1;         ~ 1 * N * N^2
Time Complexity:
Time = 1 + 1+2N^2 + (1+2N)*N^2 + 1 * N * N^2
Time = 2 + 2N^2 + N^2 + 2N^3 + N^3
Time = 2 + 3N^2 + 3N^3
O(Time) = O(2 + 3N^2 + 3N^3) ~ O(N^3)
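To sanity-check the arithmetic, here is a minimal sketch of my own (Python) that counts how many times the innermost statement runs and compares it with N^3:

def count_steps(n):
    # Count executions of "Sum += 1" for the loops above.
    total = 0
    for i in range(1, n * n + 1):  # for i = 1 to N^2
        for j in range(1, n + 1):  # for j = 1 to N
            total += 1             # Sum += 1
    return total

for n in (5, 10, 20):
    print(n, count_steps(n), n ** 3)  # the two counts match exactly

This only counts body executions; the loop-overhead terms (the 2 + 3N^2 part) disappear into the constant factors anyway.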
Related
I have an algorithm; the pseudocode is below:
def foo(n):
    if n == 0:
        return
    # the loop below takes O(n)
    for i in range(n):
        ...
    foo(n - 1)
The idea is that each recursion takes n time, and there are n recursions.
The total time should be 1 + 2 + 3 + 4 + 5 + ... + n.
Can it be proved to be O(n*n)?
Yes, it is O(n^2).
The sum of the first n natural numbers is n * (n+1) / 2, which is within a constant factor of n^2, so O(n * (n+1) / 2) == O(n^2).
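As a quick empirical check (a sketch of mine, not from the original post), you can count the loop iterations performed across all recursive calls and compare them with n(n+1)/2:

def foo_iterations(n):
    # Total loop iterations done by foo(n) plus all its recursive calls.
    if n == 0:
        return 0
    return n + foo_iterations(n - 1)  # n iterations here, then recurse on n-1

for n in (5, 10, 100):
    print(n, foo_iterations(n), n * (n + 1) // 2)  # the two columns agree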
First, you have n iterations in the for loop, then the function will repeat with n-1, n-2, ..., 0.
It's easy to see that n + (n-1) + (n-2) + ... + 1 = (n+1) * n/2 = (n^2 + n)/2 = O(n^2).
To evaluate big O, remember you have to ignore all the coefficients, constants, and lower-power terms:
(n^2 + n)/2 = (1/2) * (n^2 + n)
O( (1/2) * (n^2 + n) ) = O(n^2 + n) = O(n^2)
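To see concretely why the coefficient and the lower-power term stop mattering, note that (n^2 + n)/2 divided by n^2 settles at the constant 1/2 as n grows; a rough check of my own:

for n in (10, 1000, 100000):
    exact = (n * n + n) / 2
    print(n, exact / (n * n))  # prints 0.55, 0.5005, 0.500005 -> tends to 1/2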
This is a test I failed: for f(n) = a_1*1 + a_2*2 + ... + a_n*n, where each a_i is a positive constant, I thought the complexity would be O(n), but it appears I'm wrong and it's O(n^2). Why not O(n)?
First, notice that the question does not ask for the time complexity of a function calculating f(n), but rather the complexity of the function f(n) itself. You can think of f(n) as the time complexity of some other algorithm if you are more comfortable talking about time complexity.
This is indeed O(n^2), when the sequence a_i is bounded by a constant and each a_i is at least 1.
By the assumption, for all i, a_i <= c for some constant c.
Hence, a_1*1+...+a_n*n <= c * (1 + 2 + ... + n). Now we need to show that 1 + 2 +... + n = O(n^2) to complete the proof.
1 + 2 + ... + n <= n + n + ... + n = n * n = n ^ 2
and
1 + 2 + ... + n >= n / 2 + (n / 2 + 1) + ... + n >= (n / 2) * (n / 2) = n^2/4
So the complexity is actually Theta(n^2).
Note that if a_i were not bounded by a constant, e.g., a_i = i, then the result would not hold: in that case, f(n) = 1^2 + 2^2 + ... + n^2, and you can easily show (using the same method as before) that f(n) = Omega(n^3), which means it is not O(n^2).
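Both cases are easy to check numerically; here is a small sketch of mine, using an arbitrary constant c = 3 for the bounded case:

def f(a):
    # f(n) = a_1*1 + a_2*2 + ... + a_n*n for a given sequence a.
    return sum(ai * i for i, ai in enumerate(a, start=1))

for n in (100, 1000, 10000):
    bounded = f([3] * n)          # a_i = 3, bounded by a constant
    growing = f(range(1, n + 1))  # a_i = i, not bounded
    print(n, bounded / n**2, growing / n**3)  # both ratios level off

The first ratio levels off near 1.5 (so Theta(n^2)) and the second near 1/3 (so Theta(n^3)).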
Preface: I'm not super great with complexity theory, but I'll take a stab.
I think what is confusing is that it's not a time-complexity problem, but rather a question about the function's own complexity.
For the easy part, i just goes up to n, i.e., 1, 2, 3, ..., n; then for a_i, all entries must be above 0, meaning a could be something like 2, 5, 1, ... for n entries. Multiplying the two together gives n*n = O(n^2).
Even in the best case, where every a_i is 1, the sum 1 + 2 + ... + n is still on the order of n^2; the bounded a_i only change the constant.
Unless it's mentioned that a[i] itself grows with n, it's definitely O(n^2).
Here is another attempt to achieve O(n*n) if the sum should be returned as the result.
int sum = 0;
for (int i = 0; i <= n; i++) {
    for (int j = 0; j <= n; j++) {
        if (i == j) {
            sum += A[i] * j;
        }
    }
}
return sum;
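Note that the if (i == j) guard fires only when the indices coincide, so this computes the sum of A[i] * i while spending (n+1)^2 iterations to do it. A quick sketch of mine (with a made-up array) confirming the equivalence:

def nested_sum(A):
    # O(n^2) version: visit all (i, j) pairs, act only on the diagonal.
    n = len(A) - 1
    total = 0
    for i in range(n + 1):
        for j in range(n + 1):
            if i == j:
                total += A[i] * j
    return total

A = [7, 2, 9, 4]  # hypothetical input
print(nested_sum(A), sum(A[i] * i for i in range(len(A))))  # same value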
I'm trying to figure out the exact big-O value of algorithms. I'll provide an example:
for (int i = 0; i < n; i++)      // 2N + 2
{
    for (int x = i; x < n; x++)  // N * 2N + 2 ?
    {
        sum += i;                // N
    }
}                                // Extra N?
So if I break some of this down, int i = 0 would be O(1), i < n is N+1, i++ is N, multiply the inner loop by N:
2N + 2 + N(1 + N + 1 + N) = 2N^2 + 2N + 2N + 2 = 2N^2 + 4N + 2
Add an N for the loop termination and the sum constant, = 3N^2 + 5N + 2...
Basically, I'm not 100% sure how to calculate the exact O notation for an algorithm, my guess is O(3N^2 + 5N + 2).
What do you mean by exact? Big O is an asymptotic upper bound, so it's by definition not exact.
Thinking about i=0 as O(1) and i<n as O(N+1) is not correct. Instead, think of the outer loop as doing something n times, and for every iteration of the outer loop, the inner loop is executed at most n times. The calculation inside the loop takes constant time (the calculation is not getting more complex as n gets bigger). So you end up with O(n*n*1) = O(n^2), quadratic complexity.
When asking about "exact", you're running the inner loop from 0 to n, then from 1 to n, then from 2 to n, ... , from (n-1) to n, each time doing a constant time operation. So you do n + (n-1) + (n-2) + ... + 1 = n*(n+1)/2 = n^2/2 + n/2 iterations. To get from the exact number of calculations to big O notation, omit constants and lower-order terms, and you'll end up with O(n^2) (the 1/2 and +n/2 are omitted).
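If you want to see the exact count rather than derive it, here is a minimal sketch (mine, assuming the loops exactly as written above):

def count_inner(n):
    # Count inner-loop iterations for i = 0..n-1, x = i..n-1.
    count = 0
    for i in range(n):
        for x in range(i, n):
            count += 1
    return count

for n in (4, 10, 50):
    print(n, count_inner(n), n * (n + 1) // 2)  # exact match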
Big O gives an asymptotic upper bound, which people typically use to describe the worst case.
And here the worst case occurs when both loops run for n iterations each, i.e., n*n.
So the complexity is O(n^2).
What is the time complexity for the following function?
for (int i = 0; i < a.size; i++) {
    for (int j = i; j < a.size; j++) {
        // ...
    }
}
I think it is less than O(n^2) because we aren't iterating over all of the elements in the second for loop. I believe the time complexity comes out to be something like this:
n[ (n) + (n-1) + (n-2) + ... + (n-n) ]
But when I solve this formula it comes out to be
n^2 - n + n^2 - 2n + n^2 - 3n + ... + n^2 - n^2
Which doesn't seem correct at all. Can somebody tell me exactly how to solve this problem, and where I went wrong?
That is O(n^2). If you consider the iteration where i = a.size - 1 and work your way backwards (i = a.size - 2, i = a.size - 3, etc.), you are looking at the following sum of iteration counts, where n = a.size.
1 + 2 + 3 + 4 + ... + n
The sum of this series is n(n+1)/2, which is O(n^2). Note that big-O notation ignores constants and takes the highest polynomial power when it is applied to a polynomial function.
It will run for:
1 + 2 + 3 + .. + n
Which is 1/2 n(n+1), which gives us O(n^2).
Big-O notation keeps only the dominant term, neglecting constants too.
Big O can only distinguish two algorithms for the same variation of a problem, analyzed under the same cost model, when their dominant terms are different.
If the dominant terms are the same, you need to compare big Theta or the exact time functions, which will show the minor differences.
Example
A
for i = 1 .. n
    for j = i .. n
        ..
B
for i = 1 .. n
    for j = 1 .. n
        ..
We have
Time(A) = 1/2 n(n+1) ~ O(n^2)
Time(B) = n^2 ~ O(n^2)
O(A) = O(B)
T(A) < T(B)
Analysis
To visualize how we got 1 + 2 + 3 + .. n:
for i = 1 .. n:
    print "(1 + "
    sum = 0
    for j = i .. n:
        sum++
    print sum ") + "
will print the following:
(1+n) + (1+(n-1)) + .. + (1+3) + (1+2) + (1+1) + (1+0)
n+1 + n + n-1 + .. + 3 + 2 + 1
1 + 2 + 3 + .. + n + n+1
1/2 n(n+1) + (n+1)
1/2 n^2 + 1/2 n + n + 1
1/2 n^2 + 3/2 n + 1
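The difference between the two variants is also easy to see by counting iterations directly; a runnable sketch of mine mirroring the pseudocode above:

def time_A(n):
    count = 0
    for i in range(1, n + 1):
        for j in range(i, n + 1):   # j = i .. n
            count += 1
    return count

def time_B(n):
    count = 0
    for i in range(1, n + 1):
        for j in range(1, n + 1):   # j = 1 .. n
            count += 1
    return count

for n in (10, 100):
    print(n, time_A(n), time_B(n))  # A: n(n+1)/2, B: n^2 -- same O, T(A) < T(B)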
Yes, the number of iterations is strictly less than n^2, but it's still Θ(n^2). It will eventually be greater than n^k for any k<2, and it will eventually be less than n^k for any k>2.
(As a side note, computer scientists often say big-O when they really mean big-theta (Θ). It's technically correct to say that almost every algorithm you've seen has O(n!) running time; all reasonable algorithms have running times that grow no more quickly than n!. But it's not really useful to say that the complexity is O(n!) if it's also O(n log n), so by some kind of Gricean maxim we assume that when someone says an algorithm's complexity is O(f(x)), f(x) is as small as possible.)
The question is to set up a recurrence relation to find the value computed by the algorithm. The answer should be in Theta() terms.
foo = 0;
for i = 1 to n do
    for j = ceiling(sqrt(i)) to n do
        for k = 1 to ceiling(log(i+j)) do
            foo++
Not entirely sure but here goes.
For each i, the second loop executes n - ceiling(sqrt(i)) + 1 times, so in total it runs (n - sqrt(1)) + (n - sqrt(2)) + ... + (n - sqrt(n)) ≈ n^2 - n^1.5 times, which is Θ(n^2). (This uses the fact that sqrt(1) + sqrt(2) + ... + sqrt(n) = Θ(n^1.5).)
We've established that the third loop will get fired Θ(n^2) times. So the algorithm is asymptotically equivalent to something like this:
for i = 1 to n do
    for j = 1 to n do
        for k = 1 to log(i+j) do
            ++foo
This leads to the sum log(1+1) + log(1+2) + ... + log(1+n) + ... + log(n+n). log(1+1) + log(1+2) + ... + log(1+n) = log(2*3*...*(n+1)) = O(n log n). This gets multiplied by n, resulting in O(n^2 log n).
So your algorithm is also O(n^2 log n), and also Theta(n^2 log n) if I'm not mistaken.
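An empirical check of the Θ(n^2 log n) claim (a sketch of mine; the log base only changes the constant factor, so natural log is fine here):

import math

def count_foo(n):
    # Count foo++ executions for the triple loop above.
    count = 0
    for i in range(1, n + 1):
        for j in range(math.ceil(math.sqrt(i)), n + 1):
            count += math.ceil(math.log(i + j))  # length of the k loop
    return count

for n in (50, 100, 200):
    print(n, count_foo(n) / (n * n * math.log(n)))  # ratio roughly levels off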