What is the tightest asymptotic growth rate - algorithm

I have solved all of them, but I have been told there are some mistakes. Can somebody please help me?
n^4 - 10^3 n^3 + n^2 + 4n + 10^6 = O(n^4)
10^5 n^3 + 10^n = O(10^n)
10 n^2 + n log n + 30 √n = O(n^2)
25^n = O(1)
n^2 + n log n + 7n = O(n^2)
(n^3 + 10) (n log n + 1) / 3 = O(n^4 log n)
20 n^10 + 4^n = O(4^n)
n^2 log n^3 + 10 n^2 = O(n^2 log n)
10^20 = O(1)
n^2 log (6^2)n = O(n^2 log n)
n log(2n) = O(n log n)
30 n + 100 n log n + 10 = O(n log n)
(n+√n) log n^3 = O(n+√n log n)
n (n + 1) + log log n = O(n^2)
4n log 5^(n+1) = O(n log 5^n)
3^(n+4) = O(3^n)
n^2 log n^2 + 100 n^3 = O(n^3)
(n log n) / (n + 10) = O(n^2 log n)
5n + 8 n log(n) + 10n^2 = O(n^2)
2n^3 + 2n^4 + 2^n + n^10 = O(2^n)

Hints:
If you have n on the left, you should have n on the right.
There should not be any + operations on the right.
log(x^y) can be simplified, since log(x^y) = y log x.

Most of your answers look correct, but 25^n = O(1) looks wrong (unless it is actually 0.25^n), and (n log n) / (n + 10) = O(n^2 log n) does not look like the tightest possible bound (I'm assuming you want the tightest possible upper bound). Also, you should never have to add functions inside your big-O unless the original function is the sum or max of two functions whose growth rates criss-cross at different values of n as n goes to infinity, and that very rarely happens.
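A quick way to sanity-check a proposed tight bound g(n) is to tabulate f(n)/g(n) for growing n: if the ratio blows up, g is not even an upper bound, and if it shrinks toward 0, a tighter bound exists. A minimal Python sketch for the two flagged cases (the sample values are arbitrary):

import math

def ratios(f, g, ns):
    # Tabulate f(n)/g(n); a bounded, non-vanishing ratio suggests a tight bound.
    return [(n, f(n) / g(n)) for n in ns]

# 25^n against O(1): the ratio explodes, so O(1) is not even an upper bound.
print(ratios(lambda n: 25.0 ** n, lambda n: 1.0, [1, 5, 10]))

# (n log n) / (n + 10) against the tighter guess O(log n): the ratio tends to 1.
print(ratios(lambda n: n * math.log(n) / (n + 10), lambda n: math.log(n), [10, 1000, 10 ** 6]))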

Related

Time complexity of a recursive algorithm where each recursion takes O(N)

I have an algorithm; the pseudocode is below:
def foo(n):
    if n == 0:
        return
    # The loop below takes O(n)
    for i in range(n):
        ...  # constant work per iteration
    foo(n - 1)
The idea is that each recursive call takes n time, and there are n recursive calls.
The total time should be something like 1 + 2 + 3 + 4 + 5 + ... + n.
Can it be proved to be O(n*n)?
Yes, it is O(n^2).
The sum of the first n natural numbers is n * (n+1) / 2, which is within a constant factor of n^2, so O(n * (n+1) / 2) == O(n^2).
First, you have n iterations in the for loop, then the function will repeat with n-1, n-2, ..., 0.
It's easy to see that n + (n-1) + (n-2) + ... + 1 = (n+1) * n/2 = (n^2 + n)/2 = O(n^2).
To evaluate the big-O bound, that is, the complexity of the worst case, remember you have to ignore all the coefficients, constants, and lower-order terms:
(n^2 + n)/2 = (1/2) * (n^2 + n)
O( (1/2) * (n^2 + n) ) = O(n^2 + n) = O(n^2)
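As a quick sanity check, you can count the loop iterations directly and compare against n(n+1)/2; a small Python sketch, assuming the loop body is O(1):

def count_ops(n):
    # Iterations performed by foo(n): n + (n-1) + ... + 1.
    if n == 0:
        return 0
    return n + count_ops(n - 1)

for n in (5, 10, 100):
    assert count_ops(n) == n * (n + 1) // 2
    print(n, count_ops(n), n * (n + 1) // 2)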

Is O(K + (N-K)logK) equivalent to O(K + N log K)?

Can we say O(K + (N-K) log K) is equivalent to O(K + N log K) for 1 <= K <= N?
The short answer is that they are not equivalent, and it depends on the value of K. If K is equal to N, then the first complexity is O(N) and the second is O(N + N log N), which is equivalent to O(N log N); however, O(N) is not equivalent to O(N log N).
Moreover, any function that is in O(K + (N-K) log K) is also in O(K + N log K) (for every positive K), and the proof of this is straightforward.
Yes, because (N-K) log K is at most N log K given your constraint 1 <= K <= N.
Not exactly.
If they are equivalent, then every function in O(k + (n-k)log k) is also in O(k + n log k) and vice-versa.
Let f(n,k) = n log k
This function is certainly in O(k + n log k), but not in O(k + (n-k)log k).
Let g(n,k) = k + (n-k)log k
Then as x approaches infinity, f(x,x)/g(x,x) grows without bound, since:
f(x,x) / g(x,x)
= (x log x) / x
= log x
See the definition of big-O notation for multiple variables: http://mathwiki.cs.ut.ee/asymptotics/04_multiple_variables
Wikipedia provides the same information, but in less accessible notation:
https://en.wikipedia.org/wiki/Big_O_notation#Multiple_variables
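To make the counterexample concrete, you can evaluate both expressions along the diagonal k = n; a short Python sketch (the sample values are arbitrary):

import math

def f(n, k):
    return n * math.log(k)

def g(n, k):
    return k + (n - k) * math.log(k)

# Along k = n, g(x, x) = x while f(x, x) = x log x, so the ratio is log x.
for x in (10, 1000, 10 ** 6):
    print(x, f(x, x) / g(x, x), math.log(x))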

Get the running time T(n)

An algorithm's core is like the following:
CHECK_VALUE_IN_ARRAY(array, n, value)
    for i = 1 to n
        binary_search(array, i, n, value)
We already know that binary_search(array, 1, n, value) has T(n) = Theta(lg n).
How do I get T(n) for the whole algorithm?
PS:
My steps:
T(n) = t(n) + t(n-1) + ... + t(1)
     = lg(n) + lg(n-1) + ... + lg(1)
     = lg(n!)
Is this right?
If binary_search(array, i, n, value) searches elements i ... n of the array for value using binary search, then yes, your analysis is correct. The runtime will be
Θ(log 1 + log 2 + log 3 + ... + log n) = Θ(log n!)
Note that by Stirling's approximation, log n! = Θ(n log n), so the total runtime would be Θ(n log n).
Hope this helps!
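If you want to see Stirling's approximation kick in numerically, you can sum the per-call costs directly; a small Python sketch (it only adds up log i rather than running an actual binary search, and the ratio approaches 1 quite slowly):

import math

def total_cost(n):
    # Sum of the Theta(log i) costs of the n binary searches, i.e. log(n!).
    return sum(math.log(i) for i in range(1, n + 1))

for n in (10, 1000, 10 ** 5):
    print(n, total_cost(n), n * math.log(n), total_cost(n) / (n * math.log(n)))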

Asymptotic run time complexity of an expression

Can I say that:
log n + log (n-1) + log (n-2) + ... + log (n-k) = theta(k * log n)?
Formal way to write the above:
Sigma (i runs from 0 to k) log (n-i) = theta (k* log n)?
If the above statement is right, how can I prove it?
If it is wrong, how can I express it (the left side of the equation, of course) as an asymptotic run time function of n and k?
Thanks.
Denote:
LHS = log(n) + log(n-1) + ... + log(n-k)
RHS = k * log n
Note that:
LHS = log(n*(n-1)*...*(n-k)) = log(polynomial of (k+1)th order)
It follows that this is equal to:
(k+1) * log(n * (1 + terms that go to 0 in the limit))
If we consider the ratio
(k+1) * log(n * (1 + terms that go to 0 in the limit)) / RHS
then in the limit we get:
(k+1)/k = 1 + 1/k
So if k is a constant, both sides grow equally fast, and therefore LHS = theta(RHS).
Wolfram Alpha seems to agree.
When n is constant, the terms that previously went to 0 in the limit no longer disappear; instead you get:
((k+1) * some constant number) / (k * some other constant number)
which is:
(1 + 1/k) * (another constant number), so again LHS = theta(RHS).
When proving Θ, you want to prove O and Ω.
The upper bound is proven easily:
log(n(n-1)...(n-k)) ≤ log(n^(k+1)) = (k+1) log n = O(k log n)
For the lower bound, if k ≥ n/2,
then the product contains about n/2 terms greater than n/2:
log(n(n-1)...(n-k)) ≥ (n/2) log(n/2) = Ω(n log n) ≥ Ω(k log n) (since k ≤ n),
and if k ≤ n/2, all terms are at least n/2:
log(n(n-1)...(n-k)) ≥ log((n/2)^k) = k log(n/2) = Ω(k log n)
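As a quick numeric check of the Θ(k log n) claim, you can evaluate both sides for a few (n, k) pairs; a small Python sketch (the sample pairs are arbitrary):

import math

def lhs(n, k):
    # log(n) + log(n-1) + ... + log(n-k)
    return sum(math.log(n - i) for i in range(k + 1))

for n, k in [(100, 10), (10 ** 4, 100), (10 ** 4, 5000)]:
    print(n, k, lhs(n, k), k * math.log(n), lhs(n, k) / (k * math.log(n)))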

Recurrence Relation for a loop

The question is to set up a recurrence relation to find the value computed by the algorithm. The answer should be in Theta() terms.
foo = 0
for i = 1 to n do
    for j = ceiling(sqrt(i)) to n do
        for k = 1 to ceiling(log(i+j)) do
            foo++
Not entirely sure, but here goes.
For each i, the second loop (over j) executes n - ceiling(sqrt(i)) + 1 times, so in total it executes about (n - sqrt(1)) + (n - sqrt(2)) + ... + (n - sqrt(n)) = n^2 - Θ(n^1.5) times, which is Θ(n^2) times. See here for a discussion that sqrt(1) + ... + sqrt(n) = Θ(n^1.5).
We've established that the third loop will get fired O(n^2) times. So the algorithm is asymptotically equivalent to something like this:
for i = 1 to n do
    for j = 1 to n do
        for k = 1 to log(i+j) do
            ++foo
This leads to the sum log(1+1) + log(1+2) + ... + log(1+n) + ... + log(n+n). We have log(1+1) + log(1+2) + ... + log(1+n) = log(2*3*...*(n+1)) = O(n log n). This gets multiplied by n, resulting in O(n^2 log n).
So your algorithm is also O(n^2 log n), and also Theta(n^2 log n) if I'm not mistaken.
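If you want an empirical check of the Θ(n^2 log n) claim, you can run the loops for a few small n and divide the count by n^2 log n; a rough Python sketch (natural log is assumed for log, which only changes constants):

import math

def count_foo(n):
    foo = 0
    for i in range(1, n + 1):
        for j in range(math.ceil(math.sqrt(i)), n + 1):
            for k in range(1, math.ceil(math.log(i + j)) + 1):
                foo += 1
    return foo

for n in (50, 100, 200):
    print(n, count_foo(n), count_foo(n) / (n * n * math.log(n)))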
