Master theorem - second case issue - data-structures

Given the following recurrence equations:
T(n) = 5T(n/5)+(5sin^5(5n^5)+5)*n
T(n) = T(n/4)+2sin^2(n^4)
I can easily see that both equations fit the 2nd case of the Master theorem,
but since sin is a periodic function, a large enough n can bring f(n) arbitrarily close to zero.
So for any two constants c1, c2 and threshold N0 (from the Theta definition), we will always be able to find an n > N0 that violates the bounds, which seems to disprove it.
Is it really possible to solve these with the Master theorem?
thanks

I think you're right, the Master Theorem does not apply here. The reason for this is that the difference between f(n) and n^(log_b(a)) has to be polynomial. (See Master Theorem Recurrences: What is exactly polynomial difference?)
In your case:
((5sin^5(5n^5)+5)*n) / (n^(log_5(5))) = 5sin^5(5n^5)+5, and
(2sin^2(n^4)) / (n^(log_4(1))) = 2sin^2(n^4),
neither of which is polynomial, so the Master Theorem is invalid in this case.
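A quick numeric illustration (a sketch, not part of the original answer): sampling the ratio f(n)/n^(log_b(a)) for the first recurrence shows it oscillating over (0, 10) rather than settling toward any polynomial:

```python
import math

# For T(n) = 5T(n/5) + (5sin^5(5n^5) + 5)*n we have a = b = 5,
# so n^(log_b(a)) = n and the ratio f(n)/n^(log_b(a)) is
# 5*sin(5n^5)^5 + 5, which lies in [0, 10].
def ratio(n):
    return 5 * math.sin(5 * n**5) ** 5 + 5

samples = [ratio(n) for n in range(1, 2001)]

# The ratio never leaves [0, 10] ...
assert all(0.0 <= r <= 10.0 for r in samples)
# ... but it never settles down either: its spread stays large,
# so it is not Theta(1), nor Theta of any polynomial.
print(min(samples), max(samples))
```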

Related

How to solve the recurrence relation T(n)=T(n/3)+T(n/6)+1

I've been trying to solve the following recurrence relation: T(n) = T(n/3) + T(n/6) + 1.
I don't know where to start. I thought about drawing a recursion tree first and then solving it, but I don't know if that's the right approach.
Can someone please help me with this? Thanks
You could use the Akra-Bazzi method, which is a generalization of the Master Theorem for solving divide-and-conquer recurrences with sub-problems of different sizes. It applies to recurrences of the form:
T(n) = g(n) + sum_{i=1..k} a_i * T(b_i*n + h_i(n))
for positive constants a_i, constants b_i in (0,1), g(n) in O(n^c), and h_i(n) in O(n/log^2(n)).
In this case:
a_1 = a_2 = 1, b_1 = 1/3, b_2 = 1/6, g(n) = 1.
Following the method, we need the value p such that:
(1/3)^p + (1/6)^p = 1
Solving this equation numerically for p gives p = 0.48954...
The Akra-Bazzi theorem says that the complexity of the algorithm is then:
T(n) = Theta(n^p * (1 + integral from 1 to n of g(u)/u^(p+1) du))
which, when solved with g(u) = 1, yields:
T(n) = Theta(n^p) = Theta(n^0.48954...)
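As a sketch of the numeric step (the equation and the value of p above are the answer's), the characteristic exponent can be found by simple bisection:

```python
# Akra-Bazzi: find p with (1/3)^p + (1/6)^p = 1 by bisection.
# A sketch; any root finder would do.
def f(p):
    return (1 / 3) ** p + (1 / 6) ** p - 1

lo, hi = 0.0, 1.0          # f(0) = 1 > 0, f(1) = -0.5 < 0
for _ in range(60):
    mid = (lo + hi) / 2
    if f(mid) > 0:         # f is decreasing, so the root is above mid
        lo = mid
    else:
        hi = mid

p = (lo + hi) / 2
print(p)  # ~0.48954, so T(n) = Theta(n^0.48954...)
```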

Complexity class of recurrence equation

It's been a while for me since my undergraduate class in algorithms. Could you please help me solve this recurrence equation?
T(0)=14
T(n)=4*T(n/2)+n^2 for n>0
I'm interested in an upper bound that is as low as possible.
The exact solution of this equation is difficult to compute, but according to the Master theorem its asymptotic bound is Θ(n^2 log n).
EDIT 1:
Actually it is possible to compute the exact solution; for n > 0 it is
n^2 * (228*log(2) + 4*log(n)) / log(16)
(I obtained this solution by adding constants wherever possible to the Master theorem solution and solving a system of 5 equations with a computer algebra application.)
When n is a power of 2 and n > 0, the following expression gives the solution:
(57 + log_2(n)) * n^2
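A small check (a sketch, not part of the original answer) that this closed form matches the recurrence at powers of two:

```python
from functools import lru_cache

# Check the closed form against T(0) = 14, T(n) = 4*T(n/2) + n^2,
# evaluated at powers of two (so T(1) = 4*T(0) + 1 = 57).
@lru_cache(maxsize=None)
def T(n):
    if n == 0:
        return 14
    return 4 * T(n // 2) + n * n

for k in range(20):
    n = 2 ** k
    assert T(n) == (57 + k) * n * n   # (57 + log_2(n)) * n^2
```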

What is the asymptotic complexity of T(n) = 2T(n/3) +n/ (logn) ^2?

The Master theorem seems to fail, so I tried a recursion tree, change of variable, repeated substitution, etc.
I can't handle the sum "sum from i = 0 to log_3(n) of (2/3)^i * n/(log(n/3^i))^2" that occurs;
I used logarithmic properties to expand it, but it's still a dead end for me.
According to this page, a logarithm grows more slowly than any polynomial term: log(n) = o(n^e) for every e > 0.
Applying this result to the Master theorem, we find that case 3 applies: here n^(log_3(2)) ≈ n^0.631, and f(n) = n/(log(n))^2 = Omega(n^(log_3(2)+e)) for any small e > 0, since (log(n))^2 = o(n^d) for every d > 0; the regularity condition 2*f(n/3) <= c*f(n) also holds for large n with, e.g., c = 3/4.
Hence the complexity is just Theta(n/(log(n))^2).
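As a numeric sanity check (a sketch; the base-case value is an arbitrary assumption), evaluating the recurrence along powers of 3 shows the ratio to n/(log n)^2 staying bounded:

```python
import math
from functools import lru_cache

# T(n) = 2*T(n/3) + n/(log n)^2, evaluated at n = 3^k.
@lru_cache(maxsize=None)
def T(k):
    if k <= 1:
        return 1.0              # arbitrary base value
    n = 3.0 ** k
    return 2 * T(k - 1) + n / math.log(n) ** 2

# The ratio T(n) / (n / (log n)^2) stays bounded,
# consistent with T(n) = Theta(n / (log n)^2).
ratios = []
for k in range(10, 26):
    n = 3.0 ** k
    ratios.append(T(k) / (n / math.log(n) ** 2))
print(ratios[0], ratios[-1])
```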

Algorithms : Master Theorem

Master theorem can be used to solve recurrence relations like
T(n)= aT(n/b)+f(n).
So, if f(n) = O(n) or if f(n) = cn, are both solutions the same?
Can I use the Master theorem for f(n) = cn as well?
Assuming that c is a constant and that I understand your question correctly, the solution will be the same for both f(n) = O(n) and f(n) = cn, since cn = O(n), and thus the Master theorem can be applied to solve the recurrence.
If I understood the question correctly: f(n) = cn (where c is a constant) is in O(n), so the Master theorem can be applied.
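A quick numeric demonstration (a sketch with an arbitrary sample recurrence, not from the original answers): for T(n) = 2T(n/2) + c*n, case 2 gives Theta(n log n) for every constant c; only the hidden constant changes.

```python
import math
from functools import lru_cache

# T(n) = 2*T(n/2) + c*n with a = b = 2 and f(n) = cn = O(n):
# case 2 of the Master theorem gives Theta(n log n) for any c.
def make_T(c):
    @lru_cache(maxsize=None)
    def T(n):
        if n <= 1:
            return 0
        return 2 * T(n // 2) + c * n
    return T

for c in (1, 3, 7):
    T = make_T(c)
    n = 2 ** 20
    ratio = T(n) / (n * math.log2(n))
    # At powers of two T(n) = c*n*log2(n) exactly, so the ratio is c:
    # the constant c scales the bound but never changes the class.
    print(c, ratio)
```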

Run-time of these recurrence relations

How do you calculate a tight bound run time for these relations?
T(n)=T(n-3)+n^2
T(n) = 4T(n/4)+log^3(n)
For the first one I used the substitution method, which gave me n^2, but that wasn't right; for the second one I used the Master theorem and got n*log^4(n), which also wasn't right. A thorough explanation would be helpful. Thanks!
For the first recurrence, we can solve it by the recursion tree method:
T(n) = T(n-3) + n^2
a) Here the number of subproblems is n/3 (each step subtracts 3 from n, so after n/3 steps we reach the base case).
b) At each level the cost is at most n^2.
Therefore the total time is roughly (n/3)*n^2 = (n^3)/3, which is O(n^3); it is in fact Theta(n^3), since each of the first n/6 levels already costs at least (n/2)^2.
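A quick numeric check of the Theta(n^3) bound (a sketch, assuming T(n) = 0 for n <= 0):

```python
# Evaluate T(n) = T(n-3) + n^2 by unrolling the recursion,
# then compare against n^3.
def T(n):
    total = 0
    while n > 0:
        total += n * n
        n -= 3
    return total

n = 9999
ratio = T(n) / n ** 3
# The sum of (n - 3i)^2 over i behaves like n^3/9, so the ratio is
# near 1/9: T(n) = Theta(n^3), matching the bound above.
print(ratio)
```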
Coming to the second recurrence relation:
T(n) = 4T(n/4) + log^3(n)
Here n^(log_4(4)) = n, and log^3(n) = O(n^(1-e)) for any e in (0,1), because any power of a logarithm grows more slowly than any polynomial. So case 1 of the Master theorem does apply and gives T(n) = Theta(n).
(The corollary for strictly logarithmic bounds would be needed for something like f(n) = n*log^3(n), which exceeds n^(log_4(4)) by only a polylog factor; a bare log^3(n) is polynomially smaller than n.)
correct me if i am wrong here
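A numeric check (a sketch; the base value is arbitrary, and natural log is used since the base only changes constants) that T(n)/n settles to a constant, consistent with Theta(n):

```python
import math
from functools import lru_cache

# T(n) = 4*T(n/4) + log^3(n), evaluated at n = 4^k.
@lru_cache(maxsize=None)
def T(k):
    if k == 0:
        return 1.0               # arbitrary base value
    return 4 * T(k - 1) + math.log(4.0 ** k) ** 3

# T(n)/n converges: the added log^3 terms form a convergent series
# against the geometric 4^k growth.
r15 = T(15) / 4 ** 15
r25 = T(25) / 4 ** 25
print(r15, r25)
```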
