Algorithms and Recursion Help? - algorithm

I have the following recurrence: T(n) = 2*T(n/4) + T(n/2) + n, and I need to know its exact solution. I know the Master theorem won't help me, and the iteration method seems to go wrong...
Please tell me how to do it in general for such recurrences.
Thanks in advance.
Hey all, thanks for replying. I need the complexity, and I need to understand how to solve such problems.

T(n) = O(nlogn) and Omega(nlogn)
To prove that, by definition of O, we need to find constants n0 and c such that:
for every n>=n0, T(n)<=cnlogn.
We will use induction on n to prove that T(n)<=cnlogn for all n>=n0
Let's skip the base case for now... (we'll return later)
Hypothesis: We assume that for every k<n, T(k)<=cklogk
Thesis: We want to prove that T(n)<=cnlogn
But, T(n)=2T(n/4)+T(n/2)+n
Using the hypothesis we get (taking logs base 2, so log(n/4) = logn - 2 and log(n/2) = logn - 1):
T(n) <= 2(c(n/4)log(n/4)) + c(n/2)log(n/2) + n = (cn/2)(logn - 2) + (cn/2)(logn - 1) + n = cnlogn + n(1 - 3c/2)
So, taking c >= 2/3 would prove our thesis, because then 1 - 3c/2 <= 0 and therefore T(n) <= cnlogn.
Now we need to prove the base case:
We will take n0=2 because if we take n0=1, the logn would be 0 and that wouldn't work with our thesis. So our base cases would be n=2,3,4. We need the following propositions to be true:
T(2) <= 2clog2
T(3) <= 3clog3
T(4) <= 4clog4
So, by taking c = max{2/3, T(2)/2, T(3)/(3log3), T(4)/8} and n0=2, we would be finding constants c and n0 such that for every natural n>=n0, T(n)<=cnlogn.
The proof that T(n) = Omega(nlogn) is analogous.
So basically, in these cases where you can't use the Master Theorem, you need to 'guess' the result and prove it by induction (the substitution method).
For more information on this kind of proof, refer to 'Introduction to Algorithms'.
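If you want to sanity check the Theta(nlogn) guess numerically, here is a small Python sketch (the base case T(n) = 1 for n < 2 is just an assumption for illustration; any constant base case gives the same asymptotics):

from functools import lru_cache
from math import log2

@lru_cache(maxsize=None)
def T(n):
    # assumed base case: constant work on tiny inputs
    if n < 2:
        return 1
    # the recurrence from the question, with integer (floor) division
    return 2 * T(n // 4) + T(n // 2) + n

for n in [2**10, 2**16, 2**22, 2**28]:
    # the ratio should drift slowly toward a constant (roughly 2/3)
    print(n, T(n) / (n * log2(n)))

The ratio settling toward a constant is consistent with T(n) = Theta(nlogn), and with the c >= 2/3 that shows up in the induction above.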

First of all, you need to define some limits on this, otherwise it will never end and you will end up with an OverflowException.
Something like: n is an integer and its minimal value is 0.
Could you please add more details to your question along these lines?

This won't help you figure out how to do it necessarily, but apparently Wolfram Alpha can get the right answer. Perhaps you can look for documentation or have Mathematica show you the steps it takes in solving this:
Wolfram Alpha: T(n)=2*T(n/4)+T(n/2)+n
To put crude upper and lower bounds on the search space, you could have recognized that your T(n) is bounded above by 3T(n/2) + n and below by 2T(n/4) + n... so O(n^(log2 3)) and Omega(n), by the master theorem.
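For reference, here is how those two master theorem applications work out:
For 3T(n/2) + n: a = 3, b = 2, so n^(log2 3) ≈ n^1.585; f(n) = n = O(n^(log2 3 - eps)), so case 1 gives Theta(n^(log2 3)).
For 2T(n/4) + n: a = 2, b = 4, so n^(log4 2) = n^(1/2); f(n) = n = Omega(n^(1/2 + eps)) and the regularity condition 2*(n/4) = n/2 <= (1/2)*n holds, so case 3 gives Theta(n).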
In general, solving recurrence relations is a hard problem.

Related

Proof of Master theorem for Case-1: How these steps are mathematically derived?

I was reading Thomas H. Cormen's book to understand the proof of the Master theorem. However, I am stuck at proving case 1. Please help me understand the mathematical proof with an easier derivation of the steps in the following image:
Thanks
For the first question:
b^{\log_b(a)} = a
(Don't we have TeX in SO?)
This is because the logarithm to base b is the inverse of exponentiation with base b.
Then a / a = 1, thus only b^epsilon remains.
Second and third question: This is the geometric series, you can find it here: https://en.wikipedia.org/wiki/Geometric_series#Formula
For the infinite-series formula the common ratio b^epsilon would have to satisfy |b^epsilon| < 1; in the Master theorem proof the sum is finite, so the finite geometric series formula applies for any ratio other than 1.
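To spell out how the finite sum is evaluated in the case 1 proof (just a sketch of the step, using the finite geometric series formula):
sum_{j=0}^{log_b(n)-1} (b^epsilon)^j = (b^{epsilon*log_b(n)} - 1)/(b^epsilon - 1) = (n^epsilon - 1)/(b^epsilon - 1) = O(n^epsilon)
Multiplying by the n^{log_b(a) - epsilon} factor in front then gives O(n^{log_b(a)}), which is exactly what case 1 claims.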

Summations - Closed Form - Where to Start

I am struggling to understand basics as it related to forming a closed form expression from a summation. I understand the goal at hand, but do not understand the process for which to follow in order to accomplish the goal.
Find a closed form for the sum k + 2k + 3k + ... + k^2. Prove your claim.
My first approach was to turn it into a recurrence relation, which did not work cleanly. After that I would attempt to turn from a recurrence relation into a closed form, but I am unsuccessful in getting there.
Does anyone know of a strong approach for solving such problems? Or any simplistic tutorials that can be provided? The material I find online does not help, and causes further confusion.
Thanks
No one gave the mathematical approach, so I am adding the mathematical approach to this AP problem.
The given series is 1k + 2k + 3k + ... + k*k (i.e., k^2).
This means there are altogether k terms in the given series.
Next, each consecutive term is greater than the previous one by a constant common difference, i.e., k.
So, this is an Arithmetic Progression.
Now, to calculate the general summation, the formula is given by :-
S(n) = (n/2)*(a(1) + a(n))
where,S(n) is the summation of series upto n terms
n is the number of terms in the series,
a(1) is the first term of the series, and
a(n) is the last(n th) term of the series.
Here, fitting the terms of the given series into the summation formula, we get:
S(n) = (k/2)*(1k + k*k) = (k/2)*(k + k^2) = (k^2)/2 + (k^3)/2.
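If you want a quick sanity check of that closed form, here is a small Python sketch (the helper name closed_form is just for illustration):

def closed_form(k):
    # k + 2k + 3k + ... + k*k = (k/2)*(k + k^2) = (k^3 + k^2)/2
    return (k**3 + k**2) // 2

for k in range(1, 20):
    assert sum(i * k for i in range(1, k + 1)) == closed_form(k)
print("closed form matches the sum for k = 1..19")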
If you are interested in a general algorithm to compute sums like these (and more complicated ones) I can't recommend the book A=B enough.
The authors have been so kind to make the pdf freely available:
http://www.math.upenn.edu/~wilf/AeqB.html
Enjoy!
Asad has explained a mathematical approach to solving this in the comments.
If you are interested in a programming approach that works for more complicated expressions, then you can use Sympy in Python.
For example:
import sympy

x, k = sympy.symbols('x k')
# sum of x*k for x = 1..k
print(sympy.summation(x*k, (x, 1, k)))
prints:
k*(k/2 + k**2/2)
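If you prefer the factored form, wrapping the result in sympy.factor should give k**2*(k + 1)/2 (the exact ordering of terms in the printed output can vary between SymPy versions), which matches the closed form derived above.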

Prove, using only the definition of O(), that 2^sqrt(x) is not O(x^10)

Prove, using only the definition of O(), that 2^sqrt(x) is not O(x^10).
I have been doing a few exercises on Big O and this is the first time I have encountered a variable in the exponent. I was wondering how to disprove this. Any help would be appreciated.
You can use limits to prove it:
lim (2^sqrt(x))/(x^10) as x -> infinity is infinity, which means that 2^sqrt(x) is not O(x^10).
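Since the exercise asks for a proof using only the definition, here is a sketch of how to turn that limit observation into a definition-style argument:
Suppose, for contradiction, that there are constants c > 0 and x0 with 2^sqrt(x) <= c*x^10 for all x >= x0. Taking log base 2 of both sides gives sqrt(x) <= log2(c) + 10*log2(x). But sqrt(x)/log2(x) -> infinity (for example, at x = 2^(2t) it equals 2^t/(2t)), so sqrt(x) eventually exceeds log2(c) + 10*log2(x), contradicting the assumption. Hence 2^sqrt(x) is not O(x^10).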

What is the bound or T(n) of this algorithm?

def unknownsort(A[], x, y):
    if x == y + 1:
        if A[x] > A[y]:
            switch A[x] and A[y]
    elif y > x + 1:
        z = (y - x + 1)/3
        unknownsort(A[], x, y - z)
        unknownsort(A[], x + z, y)
        unknownsort(A[], x, y - z)
Is there a name for this equation? For T(n), what I have is
T(n) = 3T(n) + Theta(n). Is this right? I was planning to use the Master Theorem but I'm not sure exactly if this is right. Also, what do you call this process of finding T(n)?
I was thinking unknownsort is called three times, so T(n) = 3T(n), but it has a base case depending on the size of the input, so T(n) = 3T(n) + Theta(n). Now I was wondering if this equation would be wrong because of "z", since z manipulates the size of my array.
Somehow I've come up with this: T(n) = 3T(n/3) + 1. Is this correct now?
Ok, homework, let's stick to hints then.
In 3T(n), the 3 is correct since there are 3 recursive calls, but the T(n) is not - the n should be (in the form n/c) the size which the next recursive calls work with, currently you're saying the size is the same.
The Theta(n) is incorrect - apart from the recursive calls, how much work is done in the function? Does the amount of work depend on x and y (you should probably assume that any arithmetic operation always takes a constant amount of time, although this isn't strictly true)?
Did you give us the whole function? If so, I'm not convinced that that algorithm of yours does anything particularly useful (while it looks like it is sorting, I'm not convinced it is, but I could be wrong), and thus probably doesn't have a name.
The T(n) equation is called a recurrence relation, so the process of finding it would simply be called the process of finding the recurrence relation (I'm not aware of a single term to denote this).
(One more hint on the updated equation: with z = (y-x+1)/3, each recursive call works on a range of size n - z = 2n/3, so the recurrence comes out as T(n) = 3T(2n/3) + Theta(1).)
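For reference, assuming that corrected recurrence T(n) = 3T(2n/3) + Theta(1), the master theorem applies as follows: a = 3, b = 3/2, so n^(log_{3/2} 3) ≈ n^2.71; f(n) = Theta(1) = O(n^(log_{3/2} 3 - eps)), so case 1 gives T(n) = Theta(n^(log_{3/2} 3)) ≈ Theta(n^2.71), which matches the well-known stooge sort bound.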

Introduction to Algorithm 3rd edition, Exercise 4.3-6

4.3-6
Show that the solution to T(n)=2T(n/2 + 17) + n is O(nlgn).
Using substitution method, I tried to solve this question by assuming
T(n/2+17) <= C(n/2+17)lg(n/2+17)
However, I cannot work it out. Any suggestions?
Let f(n) = T(n+34). Then f(n) = T(n+34) = 2T(n/2+34) + n + 34 = 2f(n/2) + n + 34; try to solve f(n), then you get T(n). In fact, for T(n) = 2T(n/2+c) + n, as far as the master theorem is concerned you can ignore the constant c.
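To finish that off, a sketch: since n + 34 <= 2n for n >= 34, f(n) = 2f(n/2) + Theta(n), which solves to f(n) = O(n lg n) (master theorem case 2, or the same substitution-method induction the exercise asks for). Then T(n) = f(n - 34) = O((n - 34) lg(n - 34)) = O(n lg n).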
