I have an algorithm; the pseudocode is below:
def foo(n):
    if n == 0:
        return
    # the loop below takes O(n)
    for i in range(n):
        ...
    foo(n - 1)
The idea is that each level of recursion takes n time, and there are n levels.
The total time should be 1 + 2 + 3 + 4 + 5 + ... + n.
Can it be proved to be O(n*n)?
Yes, it is O(n^2).
The sum of the first n natural numbers is n * (n+1) / 2, which differs from n^2 only by a constant factor and a lower-order term, so O(n * (n+1) / 2) == O(n^2).
First, you have n iterations in the for loop, then the function will repeat with n-1, n-2, ..., 0.
It's easy to see that n + (n-1) + (n-2) + ... + 1 = (n+1) * n/2 = (n^2 + n)/2 = O(n^2).
To evaluate Big O, that is, an asymptotic upper bound on the running time, remember that you have to ignore all the coefficients, constants, and lower-order terms:
(n^2 + n)/2 = (1/2) * (n^2 + n)
O( (1/2) * (n^2 + n) ) = O(n^2 + n) = O(n^2)
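To sanity-check this, here is a minimal Python sketch (my own, not from the answers) that counts the loop iterations foo(n) performs and compares the count against n(n+1)/2:

def count_ops(n):
    # Total loop iterations performed by foo(n): n + (n-1) + ... + 1
    if n == 0:
        return 0
    return n + count_ops(n - 1)

for n in (10, 100, 500):
    print(n, count_ops(n), n * (n + 1) // 2)  # the two counts match exactly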
I want to prove that the complexity of the recursive Laplace-expansion algorithm for the determinant is n!. Can anyone prove it for me? I don't see how it can be n!, given that the recurrence T(n) = n T(n-1) + 3n - 1 only involves multiplication and addition.
Try to expand:
T(n) = n T(n-1) + 3n-1 =
n ((n-1)T(n-2) + 3(n-1)-1) + 3n-1 =
n (n-1) T(n-2) + 3 n (n-1) - n + (3n - 1)
Now by induction you can show that (if T(1) = 1):
T(n) = n (n-1) (n-2) ... 1
       + 3 (n + n (n-1) + ... + n!)
       - (1 + n + n (n-1) + ... + n (n-1) ... 3)
     = Theta(n!)
Each of the sums is dominated by its largest term, which is on the order of n!, so the whole expression is Theta(n!).
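As a quick numerical check (my own sketch, not part of the original answer), you can evaluate the recurrence directly in Python and watch T(n)/n! settle toward a constant:

from math import factorial

def T(n):
    # Recurrence from the question: T(n) = n*T(n-1) + 3n - 1, with T(1) = 1
    if n == 1:
        return 1
    return n * T(n - 1) + 3 * n - 1

for n in range(1, 11):
    print(n, T(n), T(n) / factorial(n))  # the ratio converges, so T(n) = Theta(n!)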
I have solved all of them; however, I have been told there are some mistakes. Can somebody please help me?
n^4 - 10^3 n^3 + n^2 + 4n + 10^6 = O(n^4)
10^5 n^3 + 10^n = O(10^n)
10 n^2 + n log n + 30 √n = O(n^2)
25^n = O(1)
n^2+ n log n + 7 n = O(n^2)
(n^3 + 10) (n log n+ 1) / 3 = O(n^4 log n)
20 n^10 + 4^n = O(4^n)
n^2 log n^3 + 10 n^2 = O(n^2 log n)
10^20 = O(1)
n^2 log (6^2)n = O(n^2 log n)
n log(2n) = O(n log n)
30 n + 100 n log n + 10 = O(n log n)
(n+√n) log n^3 = O(n+√n log n)
n (n + 1) + log log n = O(n^2)
4n log 5^(n+1) = O(n log 5^n)
3^(n+4) = O(3^n)
n^2 log n^2 + 100 n^3 = O(n^3)
(n log n) / (n + 10) = O(n^2 log n)
5n + 8 n log(n) + 10n^2 = O(n^2)
2n^3 + 2n^4 + 2^n + n^10 = O(2^n)
Hints:
if you have n on the left, you should have it on the right
there should not be any + operations on the right
log(x^y) can be simplified
Most of your answers look correct, but 25^n = O(1) looks wrong (unless it's 0.25^n), and (n log n) / (n + 10) = O(n^2 log n) does not look like the tightest possible bound (I'm assuming you want the tightest possible upper bound). Also, you should never need to add functions inside your big-O, unless your original function takes the sum or max of two functions whose growth rates criss-cross at different values of n as n goes to infinity, and that very rarely happens.
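For the (n log n) / (n + 10) case, a small Python sketch (mine, assuming the tightest bound is what's wanted) shows that the expression tracks log n, so O(log n) is the tight bound:

import math

for n in (10, 10**3, 10**6, 10**9):
    f = n * math.log(n) / (n + 10)
    print(n, f, math.log(n))  # f stays within a vanishing distance of log n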
I'm wondering what the big-O notation is for each of the statements below:
Sum = 0;
for i = 1 to N^2 do:
    for j = 1 to N do:
        Sum += 1;
Sum = 0 is O(1) for sure, because it will only be executed once.
But I'm confused by the second statement: should it be O(N) because it's the first loop, or O(N^2) because N^2 is a quadratic function of the variable N?
The first loop is O(N^2) because it executes N^2 steps. Each of those steps executes the inner loop, which involves N steps, so there are N^2 * N = N^3 steps, and the algorithm is O(N^3).
You'll effectively be looping over N three times, so I say: O(N^3).
Algorithm:
Sum = 0;                  ~ 1
for i=1 to N^2 do:        ~ 1 + 2N^2
    for j=1 to N do:      ~ (1 + 2N) * N^2
        Sum += 1;         ~ 1 * N * N^2
Time Complexity:
Time = 1 + 1+2N^2 + (1+2N)*N^2 + 1 * N * N^2
Time = 2 + 2N^2 + N^2 + 2N^3 + N^3
Time = 2 + 3N^2 + 3N^3
O(Time) = O(2 + 3N^2 + 3N^3) ~ O(N^3)
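If you want to verify the count empirically, here is a small Python sketch (mine, not from the answer) that runs the two loops and compares the number of increments with N^3:

def run(N):
    total = 0
    for i in range(N * N):   # outer loop: N^2 iterations
        for j in range(N):   # inner loop: N iterations per outer step
            total += 1
    return total

for N in (4, 8, 16):
    print(N, run(N), N**3)  # the increment count is exactly N^3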
What time complexity will the following code have with respect to the parameter size? Motivate your answer.
// Process(A, N) is O(sqrt(N)).
Function Complex(array[], size){
    if(size == 1) return 1;
    if(rand() / float(RAND_MAX) < 0.1){
        return Process(array, size*size)
             + Complex(array, size/2)
             + Process(array, size*size);
    }
}
I think it is O(N), because if Process(A, N) is O(sqrt(N)), then Process(A, N*N) should be O(N), and Complex(array, size/2) is O(log(N)) because it halves the size every time it runs. So one run takes O(N) + O(log(N)) + O(N) = O(N).
Please correct me and give me some hints on how I should think about and proceed with an assignment like this.
I appreciate all help and thanks in advance.
The time complexity of the algorithm is O(N) indeed, but for a different reason.
The complexity of the function can be denoted as T(n) where:
T(n) = T(n/2) + 2*n

where the T(n/2) term is the recursive invocation, and the 2*n term accounts for the two calls to Process(arr, n*n), each of which is O(n).
This recursion is well known to be O(n):
T(n) = T(n/2) + 2*n
     = T(n/4) + 2*n/2 + 2*n
     = T(n/8) + 2*n/4 + 2*n/2 + 2*n
     = ...
     = 2*n/2^(log n) + ... + 2*n/2 + 2*n
     < 4n

which is in O(n).
Let's prove it formally; we will use mathematical induction:
Base: T(1) < 4 (check)
Hypothesis: for every k < n, the claim T(k) < 4k holds true.
For n:
T(n) = T(n/2) + 2*n
     < 2*n + 2*n     (*)
     = 4n
Conclusion: T(n) is in O(n)
(*) From the induction hypothesis
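Here is a minimal Python sketch (my own, assuming integer halving at each recursive step) that evaluates the recurrence and confirms T(n) stays below the 4n bound:

def T(n):
    # Recurrence from the answer: T(n) = T(n/2) + 2n, with T(1) = 1
    if n <= 1:
        return 1
    return T(n // 2) + 2 * n

for n in (10, 100, 10**4, 10**6):
    print(n, T(n), 4 * n)  # T(n) < 4n at every scale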
The question is to set up a recurrence relation to find the value computed by the algorithm. The answer should be in Theta() terms.
foo = 0;
for i = 1 to n do
    for j = ceiling(sqrt(i)) to n do
        for k = 1 to ceiling(log(i+j)) do
            foo++
Not entirely sure but here goes.
The second loop executes (n - sqrt(1) + 1) + (n - sqrt(2) + 1) + ... + (n - sqrt(n) + 1) = n^2 + n - (sqrt(1) + ... + sqrt(n)) times, and since sqrt(1) + ... + sqrt(n) = Theta(n^1.5), that is O(n^2) times.
We've established that the third loop will get fired O(n^2) times. So the algorithm is asymptotically equivalent to something like this:
for i = 1 to n do
    for j = 1 to n do
        for k = 1 to log(i+j) do
            ++foo
This leads to the sum log(1+1) + log(1+2) + ... + log(1+n) + ... + log(n+n). The first row alone is log(1+1) + log(1+2) + ... + log(1+n) = log(2*3*...*(n+1)) = log((n+1)!) = O(n log n). This gets multiplied by n (one such row per value of i, each term being at most log(2n)), resulting in O(n^2 log n).
So your algorithm is also O(n^2 log n), and also Theta(n^2 log n) if I'm not mistaken.
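To make the bound concrete, here is a short Python sketch (my own, not from the answer) that counts the foo increments of the original triple loop and compares them against n^2 log n:

import math

def count(n):
    foo = 0
    for i in range(1, n + 1):
        for j in range(math.ceil(math.sqrt(i)), n + 1):
            foo += math.ceil(math.log(i + j))  # work done by the k loop
    return foo

for n in (10, 50, 200):
    print(n, count(n), n * n * math.log(n))  # the ratio stabilizes as n grows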