How do you calculate a tight bound run time for these relations?
T(n)=T(n-3)+n^2
T(n) = 4T(n/4)+log^3(n)
For the first one I used the substitution method, which gave me n^2, but that wasn't right. For the second one I used the Master Theorem and got n*log^4(n), which also wasn't right. A thorough explanation would be helpful. Thanks!
For the first recurrence, we can solve it with the recursion tree method.
T(n) = T(n-3) + n^2
a) Here the tree is just a chain: each call produces one subproblem with n reduced by 3, so there are about n/3 levels before we reach the base case.
b) At each level the cost is at most n^2.
Therefore the total cost is roughly (n/3)*n^2 = (n^3)/3, which is O(n^3), and in fact the bound is tight: T(n) = Θ(n^3).
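If you want the sum written out explicitly (same recursion tree, just made precise):

T(n) = n^2 + (n-3)^2 + (n-6)^2 + ... + Θ(1)
     = sum over i from 0 to about n/3 of (n-3i)^2
     ≈ (n^3)/9

About half of the terms are at least (n/2)^2, so the sum is also Ω(n^3), which is why the bound is tight.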
Coming to the second recurrence relation:
T(n) = 4T(n/4) + log^3(n)
Here the Master Theorem does apply, just not in the way that gives n*log^4(n). The corollary for extra logarithmic factors would give n*log^4(n) only if the additive term were something like n*log^3(n), i.e. n^(log_4(4)) times a polylog. In this recurrence the additive term is only log^3(n), which is polynomially smaller than n^(log_4(4)) = n, so plain Case 1 of the theorem applies and the tight bound is T(n) = Θ(n).
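Written out against the standard statement of the theorem:

a = 4, b = 4, so n^(log_b(a)) = n^(log_4(4)) = n
f(n) = log^3(n) = O(n^(1-ε)) for ε = 1/2, since log^3(n) = O(sqrt(n))

So Case 1 applies and T(n) = Θ(n^(log_4(4))) = Θ(n).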
Correct me if I am wrong here.
I ran into an exercise in a book on "algorithms and data structures" that is giving me some trouble.
I need to write the pseudo-code of a recursive algorithm governed by the recurrence relation:
T(n) = T(n-1)*T(n-2) + T(n-3) + O(1) for n>10
without solving the relation.
I suspect there is no such algorithm, but I am unsure.
In my attempts to find a solution, I evaluated k = T(n-1) and called the algorithm on n-2 k times. Reasoning this way is not correct, because I need to add the cost of estimating T(n-1) to the relation (for instance, I can estimate that cost iteratively in O(n), or I can call the algorithm on n-1 if the algorithm returns its own cost; the latter would add an extra T(n-1) term to the recurrence relation).
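Roughly, this is the shape of what I tried (just a sketch, not a correct answer to the exercise; it uses the second variant, where the algorithm returns its own call count, so the extra T(n-1) term mentioned above shows up):

```python
def f(n):
    # Sketch of the attempt described above. The function returns the number of
    # calls it performed so that the caller can reuse that count, but obtaining
    # it already costs an extra T(n-1).
    if n <= 10:
        return 1                 # base case: one call, O(1) work
    calls = 1
    k = f(n - 1)                 # extra call made only to learn T(n-1)
    calls += k
    for _ in range(k):           # these calls contribute the T(n-1)*T(n-2) term
        calls += f(n - 2)
    calls += f(n - 3)            # the + T(n-3) term
    return calls
```

(It blows up extremely quickly, so it is only meaningful for n barely above the base case.)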
I'd be thankful if someone could give me a hint and show me where my reasoning goes wrong.
In general, how should an algorithm whose number of recursive calls equals T(n-1)*T(n-2) be structured?
Tks
Can you point me to an example of a divide-and-conquer algorithm that runs in CONSTANT time? I'm in an "OMG! I cannot think of any such thing" kind of situation. Point me to something, please. Thanks!
I know that an algorithm that follows the recurrence T(n) = 2T(n/2) + n would be merge sort: we're dividing the problem into 2 subproblems, each of size n/2, and then taking n time to combine everything back into one sorted array.
I also know that T(n) = T(n/2) + 1 would be binary search.
But what is T(n) = 1?
For a divide-and-conquer algorithm to run in constant time, it needs to do no more than a fixed amount of work on any input. Therefore, it can make at most a fixed number of recursive calls on any input, since if the number of calls was unbounded, the total work done would not be a constant. Moreover, it needs to do a constant amount of work across all those recursive calls.
This eliminates basically any reasonable-looking recurrence relation. Anything of the form
T(n) = aT(n/b) + O(n^k)
is immediately out of the question, because the number of recursive calls would grow as a function of the input n.
You could make some highly contrived divide-and-conquer algorithms that run in constant time. For example, consider this problem:
Return the first element of the input array.
This could technically be solved with divide-and-conquer by noting that
The first element of a one-element array is equal to itself.
The first element of an n-element array is the first element of the subarray of just the first element.
The recurrence is then
T(n) = T(1) + O(1)
T(1) = 1
As you can see, this is a very odd-looking recurrence, but it does work out.
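A tiny sketch of that contrived algorithm (Python, just to make the recurrence concrete):

```python
def first_element(arr):
    # Base case: the first element of a one-element array is that element.
    if len(arr) == 1:
        return arr[0]
    # "Divide": recurse on the subarray made of just the first element.
    # A single recursive call on a size-1 input, so T(n) = T(1) + O(1) = O(1).
    return first_element(arr[:1])
```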
I've never heard of anything like this coming up in practice, but if I think of anything I'll try to update this answer with details. (A note: I'm not expecting to ever update this answer. ^_^)
Hope this helps!
Please explain how to proceed with these types of questions.
T(n)=4T(n/3)+n;
Can we get the complexity of this relation without using the Master Theorem? If so, how?
Please explain.
Also, how should we go about finding the run time complexity of any code?
Thank You.
You can get the complexity with a recursion tree, no Master Theorem needed.
Why?
The relation keeps recurring until n < 3 (assuming that is the base case). Each call of size n spawns 4 subproblems of size n/3 and does O(n) work of its own, so level i of the tree has 4^i nodes, each costing about n/3^i. The cost per level therefore grows geometrically (by a factor of 4/3 per level), and the bottom level dominates: there are 4^(log_3(n)) = n^(log_3(4)) leaves, so T(n) = Θ(n^(log_3(4))) ≈ Θ(n^1.26).
As a side note, for T(n) = c*T(n/3) + n^2 the answer would be Θ((n^2)*log(n)) when log_3(c) = 2 (i.e. c = 9), because then every one of the log_3(n) levels costs about n^2.
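Summing the levels of the tree explicitly (a quick check of the claim above):

level i: 4^i nodes, each of size n/3^i, so the level costs 4^i * (n/3^i) = n*(4/3)^i
T(n) = n * [ 1 + (4/3) + (4/3)^2 + ... + (4/3)^(log_3(n)) ]
     = Θ( n * (4/3)^(log_3(n)) )
     = Θ( 4^(log_3(n)) )
     = Θ( n^(log_3(4)) ) ≈ Θ(n^1.26)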
When computing the median with the median-of-medians selection algorithm, we know that if we break the input array into subgroups of five and solve recursively, we get O(n) complexity, but if we break the array into groups of 3, we do not get O(n).
Does anyone know how to prove it?
It's going to be n*lg(n).
Try drawing its recursion tree: the total cost of each level is about n, and the depth of the tree is Θ(lg(n)).
Note: strictly, the depth is between log base 3 of n and log base 3/2 of n, but since logarithms to different bases differ only by a constant factor, we can just call it lg(n).
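For reference, here are the recurrences behind both claims; this is the standard analysis of the median-of-medians selection algorithm (constants stated approximately, ignoring floors and ceilings):

Groups of 5: T(n) <= T(n/5) + T(7n/10) + O(n). Since n/5 + 7n/10 = 9n/10 < n, the work per level shrinks geometrically and T(n) = O(n).
Groups of 3: T(n) <= T(n/3) + T(2n/3) + O(n). Since n/3 + 2n/3 = n, every level of the recursion tree costs about n, there are Θ(log n) levels, and the recurrence solves to Θ(n log n).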
It looks like the recurrence in the title is wrong, but I think the Master Theorem for solving recurrences will be handy. You can show that changing the group size changes the denominators in the recurrence, which puts you into a different case and takes you from O(n) to something worse.
In T(n) = 2T(n/2) + M(n), where does the 2 in front of T come from? The n/2 is there because we are dividing the problem, and M(n) is the linear merge cost, but I can't figure out what the 2 is for.
2, because you are performing the operation on the two subsets. See the Master Theorem.
The recurrence relation is similar to what you get in merge sort. The time complexity would be O(n log n).
This says that the time cost of the problem of size n comes from dividing the problem in half (i.e., T(n/2)) and solving it for both halves (2 T(n/2)) plus some fix-up cost (i.e., M(n)).
So, the 2 is because you divide the problem in half and solve both halves.
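To make that concrete, here is a minimal merge-sort sketch (Python, written only to show where the 2, the n/2, and the M(n) in the recurrence come from):

```python
def merge(left, right):
    """Standard linear-time merge of two sorted lists: this is the M(n) cost."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out

def merge_sort(a):
    if len(a) <= 1:
        return a                    # base case: O(1)
    mid = len(a) // 2
    left = merge_sort(a[:mid])      # 1st of the 2 recursive calls: T(n/2)
    right = merge_sort(a[mid:])     # 2nd of the 2 recursive calls: T(n/2)
    return merge(left, right)       # combine step: M(n) = O(n)
```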
The 2 represents how many times you're going to call the recursive function.
For example, if your recursion tree had 4 children per node, you would expect a 4 in that position. In this case, you're recursing twice.