Solve recurrences in Big-Theta notation - algorithm

Can someone help me with (b) and (c)? I really don't know how to solve them, whether using the substitution or the recursion tree method. The Master theorem doesn't seem to work here either. Thanks.

Related

Is Rice's Theorem equivalent to the Halting problem?

As I understand it, Rice's Theorem seems to imply the undecidability of the Halting problem; that is, with Rice's Theorem we can prove that the Halting problem is undecidable. However, it seems to me that one could also write a proof in the other direction, using the undecidability of the Halting problem to show Rice's Theorem. I'm not exactly sure how one would go about proving such a thing (though proving it by contradiction seems like the natural approach), but it feels to me that it should be possible.
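It is possible; the usual textbook proof of Rice's Theorem is exactly such a contradiction argument built on the Halting problem. A sketch of the standard construction:

Suppose D decides some non-trivial property P of recursively enumerable languages, and assume (taking the complement of P if necessary) that the empty language does not have P. Since P is non-trivial, there is some machine M_yes whose language has P. Given a Halting-problem instance <M, w>, build M' which, on input x:

1. simulates M on w (running forever if M does not halt on w), then
2. simulates M_yes on x and accepts iff M_yes accepts.

Then L(M') = L(M_yes), which has P, when M halts on w, and L(M') is the empty language, which does not have P, otherwise. So D(M') would decide the Halting problem, a contradiction.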

Solving this recurrence without the master theorem. Backtracking Algorithm

I've made a backtracking algorithm.
I've been asked to state the complexity of this algorithm.
I know that the recurrence is T(n) = 2T(n-1) + 3*n_hat, where n_hat is the initial n, meaning it does not decrease at each step.
The thing is, I'm getting quite lost calculating this. I believe it's around 2^n times something, but my calculations are a bit confusing. Can you help me, please? Thanks!
Let's expand this formula repeatedly by substituting into itself:
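A sketch of that expansion, assuming a constant-cost base case T(0) (and remembering that n_hat stays fixed throughout):

T(n) = 2*T(n-1) + 3*n_hat
     = 4*T(n-2) + 3*n_hat*(1 + 2)
     = 8*T(n-3) + 3*n_hat*(1 + 2 + 4)
     ...
     = 2^k * T(n-k) + 3*n_hat*(2^k - 1)

Setting k = n to reach the base case gives T(n) = 2^n * T(0) + 3*n_hat*(2^n - 1), and since n_hat is just n at the top level, that is Theta(n * 2^n): your "2^n times something", with the something being n.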

Solving a complex recurrence relation for the Traveling Salesman

I need to work out the exact time complexity of the brute-force version of the Traveling Salesman problem using a recurrence relation.
I've worked out the recurrence relation to be as follows:
T(n) = (n-1)*T(n-1) + 1
But I'm having trouble reducing that to a closed form, and thus getting the exact time complexity. Not for lack of trying, either. It looks like it's coming out as a binomial sequence, but my algebra is a bit rusty.
If anyone could help or point me on the right path I would appreciate it.
Thanks!
Here are a few hints (a worked sketch follows them):
define R(n) = T(n)/(n-1)!
solve the recurrence for R(n)
express T(n) as a function of R(n)
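Following those hints, one way the sketch works out (assuming T(1) is a constant):

T(n) = (n-1)*T(n-1) + 1
Dividing both sides by (n-1)! and using (n-1)! = (n-1)*(n-2)!:
T(n)/(n-1)! = T(n-1)/(n-2)! + 1/(n-1)!
so R(n) = R(n-1) + 1/(n-1)!, which telescopes to
R(n) = R(1) + 1/1! + 1/2! + ... + 1/(n-1)!

The sum is bounded (it converges to e - 1), so R(n) = Theta(1) and therefore T(n) = (n-1)! * R(n) = Theta((n-1)!) = Theta(n!/n).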

Calculating time complexity in case of recursion algorithms?

How do you calculate the time complexity of recursive algorithms?
For example, T(n) = T(3n/2) + O(1) (heapsort).
Use the Master Theorem.
Anyway, your equation looks broken: the recursive call has a larger input than its caller, so the recursion never bottoms out and your complexity is O(infinity).
Please fix it.
The Master theorem is the quick and short way. But since you are trying to learn the complexity of recursive functions in general, I would suggest learning how the recursion tree works, as it forms the foundation of the Master theorem. This link goes on to explain it in detail. Rather than using the Master theorem blindly, learn this for your better understanding in the future! This link about recursion trees is a good read too.
Usually you can guess the answer and use induction to prove it.
But there is a theorem which handles a lot of situations, such as heapsort, called the Master theorem:
http://en.wikipedia.org/wiki/Master_theorem
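For example, if the recurrence was meant to be T(n) = T(2n/3) + O(1) (the CLRS recurrence for MAX-HEAPIFY, with the fraction the other way around), the Master theorem applies directly:

a = 1, b = 3/2, f(n) = Theta(1)
n^(log_b a) = n^0 = 1, which matches f(n), so case 2 applies:
T(n) = Theta(log n)

Running that sift-down O(n) times during heapsort, plus the O(n) heap build, gives the familiar O(n log n).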
Complexity of Heapsort

Recursion Tree, Solving Recurrence Equations

As far as I know, there are four ways to solve recurrence equations:
1 - Recursion trees
2 - Substitution
3 - Iteration
4 - Derivative
We are asked to use substitution, for which we will need to guess a formula for the output. I read in the CLRS book that there is no magic way to do this; I was curious whether there are any heuristics for it.
I can certainly get an idea by drawing a recursion tree or using iteration, but because the output will be in Big-O or Theta form, the formulas don't necessarily match.
Does anyone have any recommendations for solving recurrence equations using substitution?
Please note that the list of possible ways to solve recurrence equations is definitely not complete; it's merely the set of tools taught to computer scientists, because it will most likely solve most of your problems.
For exact solutions of recurrence equations, mathematicians use a tool called generating functions. Generating functions give you exact solutions, and in general are more powerful than the Master theorem.
There is a great resource online to learn about them here: http://www.math.upenn.edu/~wilf/DownldGF.html
If you go through the first couple examples you should get the hang of it in no time.
You need some math background and an understanding of rudimentary Taylor series: http://en.wikipedia.org/wiki/Taylor_series
Generating functions are also extremely useful in probability.
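A tiny worked example of the idea (a standard one, not taken from that book): solve a_n = 2*a_(n-1) + 1 with a_0 = 0. Let A(x) = sum over n >= 0 of a_n * x^n. Multiplying the recurrence by x^n and summing over n >= 1:

A(x) - a_0 = 2x*A(x) + x/(1 - x)
A(x) = x / ((1 - x)*(1 - 2x)) = 1/(1 - 2x) - 1/(1 - x)

Reading off the coefficients of the two geometric series gives the exact solution a_n = 2^n - 1.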
For simple ones, just take a "reasonable" guess.
For more complicated ones, I would go ahead and use a recurrence tree — it seems to me to be the easiest "algorithm" for generating a guess. Note that it can be difficult to use a recurrence tree to prove a bound (the details are tough to get right). Recurrence trees are highly useful for forming guesses which are then proven by substitution.
I'm not sure why you're saying the formulas won't match the output in Big-O or Theta form. They typically don't match exactly, but that's part of the point of Big-O. Part of the trick of going back to substitution is knowing how to plug the Big-O solution back in to make the substitution algebra work out. IIRC, CLRS works out an example or two of this.
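A compact illustration of that workflow on the standard example T(n) = 2T(n/2) + n (a sketch; floors and the base case are glossed over):

Recursion tree, to form the guess: level i has 2^i subproblems of size n/2^i, each costing n/2^i, so every level costs n; there are about lg n levels, so guess T(n) = O(n lg n).

Substitution, to prove it: assume T(k) <= c*k*lg k for all k < n. Then

T(n) <= 2*c*(n/2)*lg(n/2) + n
      = c*n*lg n - c*n + n
      <= c*n*lg n            (for any c >= 1)

The "-c*n + n" slack is exactly the algebra that makes plugging the Big-O guess back in work out.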
