What is the time complexity of the following expression?

T(n) = 16T(n/4) + n!
I know it can be solved using Master theorem, but I don't know how to handle
f(n) = n!

This is case three of the Master Theorem.
Since T(n) = 16T(n/4) + n!, here f(n) = n!.
a = 16 and b = 4, so log_b(a) = log_4(16) = 2.
The Master Theorem states that T(n) = Θ(f(n)) if f(n) ∈ Ω(n^(log_b(a) + ε)) for some ε > 0 and the regularity condition a·f(n/b) ≤ c·f(n) holds for some c < 1 and all sufficiently large n.
Since n! eventually exceeds n^c for every constant c, we have f(n) = n! ∈ Ω(n^(2 + ε)). The regularity condition also holds, because n! grows far faster than (n/4)!, so 16·(n/4)! ≤ c·n! for large n. Hence, by the third case of the Master Theorem, T(n) = Θ(f(n)) = Θ(n!).
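As a sanity check, the Θ(n!) result can be probed numerically: since the n! term dominates, the ratio T(n)/n! should approach 1. A minimal sketch (the base case T(n) = 1 for n &lt; 4 is an assumption; any constant base case gives the same asymptotics):

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 16*T(n/4) + n!, with an assumed constant base case.
    if n < 4:
        return 1
    return 16 * T(n // 4) + math.factorial(n)

# The additive n! term dominates: T(n)/n! tends to 1 as n grows.
for n in (16, 64, 256):
    print(n, T(n) / math.factorial(n))
```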


Determine the time complexity using master theorem

How would I determine the time complexity using the Master Theorem for the given problem?
T(n) = aT(n/b) + O(n^d)
T(n) = 4 T(n/2) + n*log(n)
a = 4, b = 2, d = 1
1. O(n^d) if d > logb(a)
2. O(n^d logn) if d = logb(a)
3. O(n^logb(a)) if d < logb(a)
In my case:
log2(4) = 2 --> d < log2(4)
T(n) = O(n^logb(a))
= O(n^2)
Is this correct?
edit:
I have now tried a different approach, following my professor's instructions; it gives the same result as before, though.
T(n) = a*T(n/b) + f(n)
T(n) = 4 T(n/2) + n*log(n)
        { 1. Θ(n^(log_b(a))),         if f(n) ∈ O(n^(log_b(a) − ε))
T(n) =  { 2. Θ(n^(log_b(a)) · log n), if f(n) ∈ Θ(n^(log_b(a)))
        { 3. Θ(f(n)),                 if f(n) ∈ Ω(n^(log_b(a) + ε))
First I look at the 2nd case:
f(n) = n·log(n)
log_b(a) = log_2(4) = 2
n^(log_2(4)) = n²
The second case does not apply because n·log(n) ∉ Θ(n^(log_b(a))): n·log(n) grows more slowly than n².
Look at the first case, since f(n) grows more slowly than n²:
1. Θ(n^(log_b(a))), if f(n) ∈ O(n^(log_b(a) − ε))
O(n^(log_b(a) − ε)) = O(n^(log_2(4) − ε)) = O(n^(2 − ε))
n·log(n) ∈ O(n^(2 − ε)) for any 0 < ε < 1, so the condition holds.
Since f(n) has O(n^(2 − ε)) as an upper bound, the first case of the Master Theorem applies:
T(n) = Θ(n^(log_2(4))) = Θ(n²)
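As a numerical sanity check of the Θ(n²) result: if it is right, the ratio T(n)/n² should settle toward a constant as n grows. A sketch (the base case T(1) = 1 is an assumption):

```python
import math

def T(n):
    # T(n) = 4*T(n/2) + n*log2(n), assumed base case T(1) = 1.
    if n <= 1:
        return 1
    return 4 * T(n // 2) + n * math.log2(n)

# If T(n) = Θ(n²), these ratios converge to a constant.
ratios = [T(2**k) / (2**k) ** 2 for k in range(5, 15)]
print(ratios)
```

The successive differences of the ratios shrink geometrically, consistent with convergence to a constant.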
The version of the master theorem that you've stated here specifically assumes that the additive term has the form O(n^d). In this case, the additive term is of the form n log n, which is not O(n^1). This means that you cannot apply the master theorem as you have above.
A useful technique when working with recurrences like these is to sandwich the recurrence between two others. Notice, for example, that T(n) is bounded by these recurrences:
S(n) = 4S(n / 2) + n
R(n) = 4R(n / 2) + n^(1+ε), where ε is some small positive number.
Try solving each of these recurrences using the master theorem and see what you get, keeping in mind that your recurrence solves to a value sandwiched between them.
Hope this helps!

Master theorem with f(n)=n!?

How do I solve the following? As far as I know, f(n) = n! does not fit any of the cases of the master theorem.
T (n) = 16T (n/4) + n!
David Eisenstat is partially correct. Case 3 does apply, and the bound is tight: T(n) = Θ(n!), not merely O(n!).
T(n) = 16T(n/4) + n!
Case 3 of the Master Theorem (AKA Master Method) applies. a = 16, b = 4, f(n) = n!. n^(log_b(a)) = n^(log_4(16)) = n². Since f(n) = n! ∈ Ω(n^(2 + ε)) for any ε > 0, AND the regularity condition a·f(n/b) ≤ c·f(n), i.e. 16·(n/4)! ≤ c·n!, holds for some c < 1 and all large n, T(n) is Θ(n!).
For reference, consult #10 here : http://www.csd.uwo.ca/~moreno/CS433-CS9624/Resources/master.pdf
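The regularity condition for case 3 can itself be checked numerically for f(n) = n! (a sketch; the choice c = 1/2 is arbitrary, since any constant below 1 works for large enough n):

```python
import math

a, b = 16, 4
c = 0.5  # arbitrary constant < 1; case 3 needs some such c for large n

# Regularity condition: a*f(n/b) <= c*f(n) for all sufficiently large n.
for n in (8, 16, 32, 64):
    lhs = a * math.factorial(n // b)
    rhs = c * math.factorial(n)
    print(n, lhs <= rhs)
```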

Comparing complexities

I have these three questions for an exam review:
If f(n) = 2n - 3 give two different functions g(n) and h(n) (so g(n) doesn't equal h(n)) such that f(n) = O(g(n)) and f(n) = O(h(n))
Now do the same again with functions g'(n) and h'(n), but this time the function should be of the form
g'(n) = Ɵ(f(n)) and f(n) = o(h'(n))
Is it possible for a function f(n) = O(g(n)) and f(n) = Ω(g(n))?
I know that a function is O of another if it is (asymptotically) less than or equal to the other function. So I think for 1. I could use g(n) = 2n² − 3 and h(n) = 2n² − 10.
I also know that a function is Θ of another if it is basically equal to the other function (we can ignore constants), and o of it if it is strictly less than the function, so for 2. I think you could have g'(n) = 2n − 15 and h'(n) = 2n.
For 3.: It is possible for a function to be both O(g(n)) and Ω(g(n)), because both allow the function to be the same as the given function, so you could have a function g(n) that equals f(n) and satisfies the rules for being both O and Ω.
Can someone please tell me if this is correct?
Your answers are mostly right. But I would like to add some points:
Given is f(n) = 2n - 3
With g(n) = 2n² − 3 and h(n) = 2n² − 10, f(n) is in O(g(n)) and in O(h(n)). But your g(n) and h(n) are basically the same; at least they are both in Θ(n²). There exist many other functions that would also work, e.g.:
f(n) ∈ O(n) ⇒ g(n) = n
f(n) ∈ O(n^k) ⇒ g(n) = n^k for all k ≥ 1
f(n) ∈ O(2ⁿ) ⇒ g(n) = 2ⁿ
g'(n) = 2n − 15 reduces to g'(n) = n if we think in complexities, and this is right. In fact, up to constant factors it is the only possible answer.
But f(n) ∈ o(h'(n)) does not hold for h'(n) = 2n. Little-o means that
lim_{n → ∞} |f(n)/g(n)| = 0 ⇔ f(n) ∈ o(g(n))
So you can choose h'(n) = n², or more generally h'(n) = n^k for any k > 1, or h'(n) = cⁿ for a constant c > 1.
Yes it is possible and you can take it also as a definition for Θ(g(n)):
f(n) ∈ Θ(g(n)) ⇔ f(n) ∈ O(g(n)) and f(n) ∈ Ω(g(n))
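The limit definition of little-o above can be checked directly for f(n) = 2n − 3 against a candidate h'(n) = n² (a sketch; the sample points 10^k are an arbitrary choice):

```python
def f(n):
    return 2 * n - 3

def h(n):
    return n ** 2

# f ∈ o(h) iff f(n)/h(n) → 0: the ratios should shrink toward zero.
ratios = [f(10**k) / h(10**k) for k in range(1, 7)]
print(ratios)

# By contrast, f(n)/(2n) → 1, which is why f is NOT o(2n).
```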

Recurrence relation: T(n/16) + n log n

Can the master theorem be applied?
Or say for T (n) = 2T (n/16) + n log n, how is the master theorem applied here?
I get a = 2, b = 16 and I am not sure about c and k.
To solve such a recurrence relation T(n) = a⋅T(n/b) + f(n), you have to calculate e = logb(a).
Then (for an ε > 0):
f(n) ∈ O(n^(e − ε)) ⇒ T(n) ∈ Θ(n^e)
f(n) ∈ Θ(n^e) ⇒ T(n) ∈ Θ(n^e · log(n))
f(n) ∈ Ω(n^(e + ε)) ⇒ T(n) ∈ Θ(f(n))
For more details, see the Master Theorem.
So in your case: a = 2, b = 16 ⇒ e = log_16(2) = 0.25. Since f(n) = n log n ∈ Ω(n^(0.25 + ε)) (take ε = 0.5, say), case 3 applies,
so T(n) is in Θ(n log n).
Even if the log(n) term were not there, the work done at the top level would dominate, because the total work per level shrinks as we descend (b is much larger than a). Hence, in my opinion, the complexity is dictated by the work done at the highest level, viz. O(n log n).
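The Θ(n log n) conclusion can be sanity-checked numerically: the ratio T(n)/(n log n) should stay bounded and settle toward a constant. A sketch (the base case T(n) = 1 for n &lt; 16 is an assumption):

```python
import math

def T(n):
    # T(n) = 2*T(n/16) + n*log2(n), assumed constant base case.
    if n < 16:
        return 1
    return 2 * T(n // 16) + n * math.log2(n)

# In case 3 the driving term f(n) dominates: T(n)/(n log n) stays bounded.
for k in range(1, 6):
    n = 16 ** k
    print(n, T(n) / (n * math.log2(n)))
```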

Find Closed End Formula for Recurrence equation by master theorem

Can we solve the recurrence equation
T(n) = 2T(n/2) + n lg n
with the master theorem? I am coming from a link where the author states that we can't apply the master theorem here because it doesn't satisfy any of the three case conditions. On the other hand, he has taken another example,
T(n) = 27T(n/3) + Θ(n³ lg n),
and found the closed solution Θ(n³ lg² n). For solving this he used the (extended) 2nd case of the master theorem: if f(n) = Θ(n^(log_b(a)) (lg n)^k) for some k ≥ 0, then T(n) ∈ Θ(n^(log_b(a)) (lg n)^(k+1)). Here my confusion arises: why can't we apply the 2nd case to the first recurrence as well, when it fits it completely?
My thought: a = 2, b = 2; let k = 1. Then f(n) = Θ(n^(log_2(2)) lg n), so by that rule T(n) = Θ(n lg² n). But he has mentioned that we can't apply the master theorem to this one, and I am confused why not.
Note: it comes down to f(n): in T(n) = 2T(n/2) + n lg n we have f(n) = n lg n, while in T(n) = 27T(n/3) + Θ(n³ lg n) we have f(n) = Θ(n³ lg n). Please correct me if I am wrong here.
Using case 2 of the master theorem, I find that
T(n) = Θ(n log²(n))
Your link states that case 2 of the theorem is:
f(n) = Θ(n^(log_b(a)))
while from several other links, like the one from MIT, the case is the extended form:
f(n) = Θ(n^(log_b(a)) (log n)^k) for some k ≥ 0
The basic form does not cover f(n) = n log n, but the extended form does with k = 1, giving T(n) = Θ(n^(log_2(2)) (log n)²) = Θ(n log² n).