Please explain how to proceed with these types of questions.
T(n) = 4T(n/3) + n
Can we get the complexity of this relation without using the Master theorem? If so, how?
Please explain.
Also, how should we go about finding the runtime complexity of any code?
Thank You.
The time complexity is Θ(n^(log_3 4)) ≈ Θ(n^1.26), not O(n·log n).
Why?
The relation is recursive, and it keeps recurring until n < 3 (assuming that is the base case).
At each step the problem splits into 4 subproblems of size n/3, and a loop worth O(n) gets executed. So level i of the recursion tree contains 4^i nodes, each doing n/3^i work, for a per-level cost of n·(4/3)^i. These costs grow geometrically, so the total is dominated by the bottom level: there are 4^(log_3 n) = n^(log_3 4) leaves, giving Θ(n^(log_3 4)).
Here is the recursion tree, level by level:
level 0: 1 node, work n each -> total n
level 1: 4 nodes, work n/3 each -> total (4/3)·n
level 2: 16 nodes, work n/9 each -> total (4/3)^2·n
...
level log_3(n): 4^(log_3 n) = n^(log_3 4) leaves, O(1) each -> total Θ(n^(log_3 4))
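As a sanity check, here is a minimal Python sketch (mine, not from the original answer) that evaluates the recurrence directly, assuming a unit-cost base case for n < 3, and divides by n^(log_3 4); the ratio settling near a constant supports the bound:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # Assumed base case: constant work once n < 3.
    if n < 3:
        return 1
    # Four subproblems of size n/3, plus the O(n) loop.
    return 4 * T(n // 3) + n

for k in (6, 9, 12):
    n = 3 ** k
    # If T(n) = Theta(n^(log_3 4)), this ratio approaches a constant.
    print(n, T(n) / n ** math.log(4, 3))
```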
More generally, the time complexity for T(n) = c·T(n/3) + n^2 will be Θ((n^2)·log n) if log_3(c) == 2, i.e. c == 9: that is the case where every level of the tree does equal work.
I ran into an exercise from a book on "algorithms and data structures" that is giving me some trouble.
I need to write the pseudo-code of a recursive algorithm governed by the recurrence relation
T(n) = T(n-1)·T(n-2) + T(n-3) + O(1) for n > 10
without solving the relation.
I suspect there is no such algorithm, but I am unsure.
In my attempts to find a solution, I evaluated k = T(n-1) and called the algorithm on n-2, k times. Reasoning this way is not correct, because I need to add the cost of estimating T(n-1) to the relation (for instance, I can estimate that cost iteratively in O(n), or I can call the algorithm on n-1 if the algorithm returns its own cost; the latter would add a T(n-1) term to the recurrence). A sketch of this attempt appears below.
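For concreteness, here is a minimal Python sketch of that flawed attempt (the function name, base case, and cost-counting convention are my assumptions, not the book's). The function returns its own operation count so the caller can use it as k, and that is exactly what introduces the extra T(n-1) term:

```python
def f(n):
    # Assumed base case, suggested by the "for n > 10" condition.
    if n <= 10:
        return 1
    cost = 1                # the O(1) term
    k = f(n - 1)            # computing k itself already costs T(n-1)
    for _ in range(k):      # then T(n-1) recursive calls on n-2
        cost += f(n - 2)
    cost += f(n - 3)        # plus one call on n-3
    # Total work: T(n-1) + T(n-1)*T(n-2) + T(n-3) + O(1),
    # i.e. an extra T(n-1) compared to the target recurrence.
    return cost + k

print(f(13))  # grows explosively for larger n
```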
I'd be thankful if someone could give me a hint and show me where my reasoning is wrong.
In general, how should an algorithm be structured so that its number of recursive calls equals T(n-1)·T(n-2)?
Thanks
How do you calculate a tight bound on the running time for these relations?
T(n)=T(n-3)+n^2
T(n) = 4T(n/4)+log^3(n)
For the first one I used the substitution method, which gave me n^2, but that wasn't right; for the second one I used the Master theorem and got n·log^4(n), which also wasn't right. A thorough explanation would be helpful. Thanks!
For the first recurrence, we can solve it by the recursion tree method:
T(n) = T(n-3) + n^2
a) Here we see that the number of subproblems is n/3 (each step subtracts 3 from n, so after n/3 steps we reach the base case).
b) At each level the cost is at most n^2.
Therefore the time complexity is roughly (n/3)·n^2 = (n^3)/3, which is O(n^3); the sum below shows the bound is tight, so it is in fact Θ(n^3).
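Unrolling the recurrence makes this explicit (assuming a constant-cost base case):

$$ T(n) \;=\; \sum_{i=0}^{\lfloor n/3\rfloor} (n-3i)^2 \;\approx\; \frac{n^3}{9} \;=\; \Theta(n^3) $$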
Coming to the second recurrence relation:
T(n) = 4T(n/4) + log^3(n)
Here the Master theorem actually does apply, via case 1: n^(log_4 4) = n, and f(n) = log^3(n) is polynomially smaller than n (e.g. log^3(n) = O(n^(1/2))), so T(n) = Θ(n).
The corollary for logarithmic factors (which yields Θ(n^(log_b a)·log^(k+1) n)) only applies when f(n) has the form n^(log_b a)·log^k(n), e.g. n·log^3(n) here; plugging log^3(n) into that corollary is what produces the wrong n·log^4(n) answer. Correct me if I am wrong here.
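To spell out the case-1 computation:

$$ a = 4,\quad b = 4,\quad n^{\log_b a} = n, \qquad \log^3 n = O\!\left(n^{1-\varepsilon}\right)\ \text{for any } \varepsilon \in (0,1) \;\Longrightarrow\; T(n) = \Theta(n) $$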
The divide-and-conquer algorithm for the closest pair problem has running time T(n) = 2T(n/2) + O(n). I understand that the 2T(n/2) comes from the fact that the algorithm is applied to 2 sets of half the original's size, but why does the rest come out to O(n)? Thanks.
Check out http://en.wikipedia.org/wiki/Closest_pair_of_points_problem which explains clearly where the O(n) comes from (planar case).
Any divide-and-conquer algorithm will consist of a recursive 'divide' component and a 'merge' component where the recursed results are put together. The linear O(n) component in closest pair comes from merging the results of the 'divide' step into a combined answer: only points within distance d of the dividing line need checking, and, with that strip sorted by y, each point needs comparing against only a constant number of its neighbors.
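A minimal Python sketch of just that merge step, assuming `strip` already holds the points lying within distance `d` of the dividing line, sorted by y-coordinate (names and conventions are illustrative, not from any specific source):

```python
import math

# Merge step of closest pair: `strip` is sorted by y-coordinate,
# `d` is the best distance found by the two recursive calls.
def strip_closest(strip, d):
    best = d
    for i in range(len(strip)):
        j = i + 1
        # Packing arguments guarantee only a constant number of points
        # can follow strip[i] within `best` vertically, so this inner
        # loop is O(1) per i and the whole pass is O(n).
        while j < len(strip) and strip[j][1] - strip[i][1] < best:
            best = min(best, math.dist(strip[i], strip[j]))
            j += 1
    return best
```

If the recursion keeps its point lists y-sorted (merging them as in merge sort), building the strip is also O(n), so the whole combine step stays linear and T(n) = 2T(n/2) + O(n) = O(n log n).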
If we know the lower bound for the time complexity of a problem is Ω(n^2), am I correct in thinking it is not possible to have an algorithm with worst-case time complexity O(n log n)?
If the lower bound for the time complexity of a problem is Ω(n^2), then every algorithm solving this problem has to take at least C·n^2 time on worst-case inputs, for some constant C > 0 and all sufficiently large n.
On the other hand, you have an algorithm that takes at most K·n·log n time.
Since n·log n grows strictly slower than n^2, for large enough n the algorithm would finish in less than C·n^2 time, which the lower bound forbids.
Therefore it is impossible for this algorithm to solve this problem. You are correct.
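The contradiction in one line:

$$ \lim_{n\to\infty}\frac{K\,n\log n}{C\,n^{2}} = 0 \quad\Longrightarrow\quad K\,n\log n < C\,n^{2}\ \text{for all sufficiently large } n $$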
How do you calculate the time complexity of recursive algorithms?
For example: T(n) = T(3n/2) + O(1) (heapsort)
Use the Master Theorem.
Anyway, your equation looks broken: the recursive call has a higher input value than the caller's, so as written the recursion never terminates and the complexity is "O(infinity)". You probably meant T(n) = T(2n/3) + O(1), the usual bound for sift-down (MAX-HEAPIFY).
Please fix it.
The Master theorem is the quick and short way. But since you are trying to learn the complexity of recursive functions in general, I would suggest learning how the recursion tree method works, since it forms the foundation of the Master theorem. This link explains it in detail. Rather than applying the Master theorem blindly, learn this for a better understanding in the future! This link about recursion trees is a good read too.
Usually you can guess the answer and use induction to prove it.
But there is a theorem that resolves a lot of situations, heapsort among them, called the Master theorem:
http://en.wikipedia.org/wiki/Master_theorem
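For instance, if the intended recurrence was T(n) = T(2n/3) + O(1) (the usual sift-down bound; an assumption on my part, since the posted equation is broken), guess-and-induction looks like this: guess T(n) ≤ c·log n, then

$$ T(n) \;\le\; c\log\frac{2n}{3} + d \;=\; c\log n - c\log\frac{3}{2} + d \;\le\; c\log n \quad \text{whenever } c \ge \frac{d}{\log(3/2)}, $$

so T(n) = O(log n).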