I am wondering what is the time complexity of this recurrence relation.
Sorry for the notation, but I needed to include some math:
T(n) = T(n-1) + f(n)
means
T(n) = T(0) + f(1) + f(2) + ... + f(n)
In your case, that's:
T(n) = T(0) + 0^2 + 1^2 + 2^2 + ... + (n-1)^2
If it isn't immediately obvious from discrete calculus that the sum comes out to Θ(n^3), notice that there are n terms, each at most (n-1)^2, so the sum is O(n^3); and more than n/3 of the terms are at least (n/2)^2, so the sum is also Ω(n^3).
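To make that concrete, here is a quick numeric sketch (my own addition, assuming T(0) = 0) that unrolls the sum and compares it against n^3:

def T(n):
    """Unrolled sum: T(n) = T(0) + 0^2 + 1^2 + ... + (n-1)^2, assuming T(0) = 0."""
    total = 0
    for i in range(n):
        total += i * i
    return total

for n in (10, 100, 1000):
    # The ratio T(n) / n^3 settles near 1/3, consistent with the Theta(n^3) bound.
    print(n, T(n) / n**3)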
I am trying to find the time complexity (Big-Θ) of this algorithm:
Recursion(n):
    while n > 1:
        n = floor(n/2)
        Recursion(n)
I have found an upper bound of O(n) by considering the worst case which is when n is a power of 2.
However, I am having trouble finding a lower bound (Big-Ω) for this. My intuition is that this is Ω(n) as well, but I am not sure how to show this with the floor function in the way.
Any suggestions? Thank you!
EDIT: the main thing I'm not convinced of is that T(n/2) is equivalent to T(floor(n/2)). How would one prove this for this algorithm?
The floor function performs its operation in constant time, O(1), so you can ignore it / treat it as a constant. Let's analyze the time complexity of the algorithm below:
T(n) = T(n/2) + 1 (floor operation)
T(n/2) = T(n/4) + 1
...
T(2) = T(1) + 1 --> T(1) = constant
T(n) = T(n/4) + 2
T(n) = T(n/8) + 3
...
T(n) = T(n/2^k) + k
Setting n/2^k = 1, i.e. 2^k = n, gives k = log(n).
T(n) = T(1) + log(n)
T(n) = log(n)
We can conclude that T(n) ∈ θ(log(n)).
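As a sanity check (a sketch I'm adding that covers only the halving chain T(n) = T(floor(n/2)) + 1 analyzed above, not the full recursive algorithm), counting the halvings matches floor(log2(n)):

import math

def halving_steps(n):
    """Count how many times n can be halved (with floor) before reaching 1."""
    steps = 0
    while n > 1:
        n = n // 2  # floor(n/2)
        steps += 1
    return steps

for n in (8, 100, 10**6):
    print(n, halving_steps(n), math.floor(math.log2(n)))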
public static int recurC(int n) {
    if (n <= 1)   // base case; covers n == 0 as well, so recurC(2) terminates
        return 1;
    return n + recurC(n - 1) + recurC(n - 2);
}
So I need to find the formal equation for T(n). I set it up as a recurrence relation with T(n) = C + T(n-1) + T(n-2). However, when I tried to evaluate it out, I got nowhere. The relation with each subsequent recursive call isn't entirely clear to me. Any help would be appreciated, thanks!
Let's analyze the recurrence relation below:
T(n) = T(n-1) + T(n-2) + C
T(0) = T(1) = 1
Notice that T(n-1) ≈ T(n-2), that is to say:
The number of operations performed for input size n - 1 is approximately equal to the number of operations performed for input size n - 2.
Therefore, we can show that:
T(n) = T(n-1) + T(n-2) + C
T(n) ≈ 2T(n-2) + C
T(n) ≈ 4T(n-4) + 3C
T(n) ≈ 8T(n-6) + 7C
...
T(n) ≈ 2^k * T(n-2k) + (2^k - 1)*C
n - 2k = 0 --> k = n/2
T(n) ≈ 2^(n/2) + (2^(n/2) - 1)*C   (using T(0) = 1)
T(n) ≈ (1 + C) * 2^(n/2) - C
Since T(n-1) was replaced by the smaller term T(n-2), this estimate undercounts the work, so what it really establishes is a lower bound: T(n) ∈ Ω(2^(n/2)). The exact growth rate is Θ(phi^n), where phi = (1 + sqrt(5))/2 ≈ 1.618 is the golden ratio.
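A quick numeric sketch (my own addition, with C = 1 and the base cases T(0) = T(1) = 1 from above) shows the actual values sitting above 2^(n/2) while tracking phi^n:

from functools import lru_cache

C = 1  # assumed constant cost per call
PHI = (1 + 5 ** 0.5) / 2

@lru_cache(maxsize=None)
def T(n):
    """T(n) = T(n-1) + T(n-2) + C with T(0) = T(1) = 1."""
    if n <= 1:
        return 1
    return T(n - 1) + T(n - 2) + C

for n in (10, 20, 30):
    # T(n) outgrows 2^(n/2) but stays within a constant factor of phi^n.
    print(n, T(n), round(2 ** (n / 2)), round(PHI ** n))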
I watched a video where they prove T(n)= T(n-1) + n is O(n^2)
I have the following expressions which are:
T(1) = 4
T(N) = T(N – 1) + N + 3, N > 1
My question is: is the expression above solved the same way, even though there is a +3 after N?
The question is a bit messed up, but I hope you get the point. If there are questions I will try to explain better.
In a word: is T(N) = T(N - 1) + N + 3 = O(n^2)?
T(n) = T(n-1) + (n-1) + 4   (the given equation, rewritten by adding and subtracting 1)
T(n) = T(n-1) + (n-1) + T(1) ...(1)
Now, T(1) = constant.
Therefore, from eq(1),
T(n) = T(n-1) + (n-1) ...(2)
Eq(2) reduces to T(n) = T(n-k) + n*k - k*(k+1)/2 ...(3)
Upon substituting (n-k)=1 or k=(n-1) in eq(3),
we get,
T(n) = T(1) + n*(n-1) - (n-1)(n)/2
T(n) = n*(n-1)/2 => O(n^2)
PS: If we don't neglect T(1) in eq(1), the final equation we get is T(n) = n*(n-1)/2 + T(1) + 4*k => T(n) = n*(n-1)/2 + 4 + 4*(n-1), which still gives O(n^2) as the final answer.
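For a quick sanity check (a sketch I'm adding), the exact closed form from the PS can be compared against the recurrence directly:

def T_recursive(n):
    """T(1) = 4, T(N) = T(N-1) + N + 3 for N > 1, computed iteratively."""
    t = 4
    for i in range(2, n + 1):
        t = t + i + 3
    return t

def T_closed(n):
    """Closed form from the PS: n*(n-1)/2 + 4 + 4*(n-1)."""
    return n * (n - 1) // 2 + 4 + 4 * (n - 1)

for n in (1, 2, 10, 100):
    # The two agree, and the n^2/2 term dominates, hence O(n^2).
    print(n, T_recursive(n), T_closed(n))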
I have been trying to solve a recurrence relation.
The recurrence is T(n) = T(n/3)+T(2n/3)+n^2
I solved the recurrence and got T(n) = n*T(1) + [ (9/5)(n^2)( (5/9)^(log n) ) ].
Can anyone tell me the runtime of this expression?
I think this recurrence works out to Θ(n^2). To see this, we'll show that T(n) = Ω(n^2) and that T(n) = O(n^2).
Showing that T(n) = Ω(n^2) is pretty straightforward - since T(n) has an n^2 term in it, it's certainly Ω(n^2).
Let's now show that T(n) = O(n^2). We have that
T(n) = T(n/3) + T(2n/3) + n^2
Consider this other recurrence:
S(n) = S(2n/3) + S(2n/3) + n^2 = 2S(2n/3) + n^2
Since T(n) is increasing and T(n) ≤ S(n), any upper bound for S(n) should also be an upper bound for T(n).
Using the Master Theorem on S(n), we have that a = 2, b = 3/2, and c = 2. Since log_b a = log_{3/2} 2 = 1.709511291... < c, the Master Theorem says that this will solve to O(n^2). Since S(n) = O(n^2), we also know that T(n) = O(n^2).
We've shown that T(n) = Ω(n^2) and that T(n) = O(n^2), so T(n) = Θ(n^2), as required.
Hope this helps!
(By the way - (5/9)^(log n) = (2^(log 5/9))^(log n) = 2^(log n * log 5/9) = (2^(log n))^(log 5/9) = n^(log 5/9). That makes it a bit easier to reason about.)
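As a numeric illustration (a sketch I'm adding, using floor division and assuming T(n) = 1 for n <= 1 as the base case), the ratio T(n)/n^2 stays bounded by a constant:

from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(n) = T(n/3) + T(2n/3) + n^2, with floors and T(n) = 1 for n <= 1 assumed."""
    if n <= 1:
        return 1
    return T(n // 3) + T(2 * n // 3) + n * n

for n in (10, 1000, 100000):
    # The ratio stays bounded by about 9/4, consistent with Theta(n^2).
    print(n, T(n) / (n * n))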
One can't tell the actual runtime from T(n) or from the time complexity! It is simply an estimate of the running time in terms of the order of the input size n.
One thing which I'd like to add is:
I haven't solved your recurrence relation, but assuming your derived relation is correct, putting n = 1 into the given recurrence relation gives
T(1) = T(1/3) + T(2/3) + 1
So either you'll be given the values of T(1/3) and T(2/3) in your question, or you have to work out the base case from the problem statement, like deciding what T(1) should be for the Tower of Hanoi problem.
For a recurrence, the base-case is T(1), now by definition its value is as following:
T(1) = T(1/3) + T(2/3) + 1
Now, since T(n) denotes the runtime function, the runtime of any input that will not be processed is 0; this includes all terms below the base case, so we have:
T(X < 1) = 0
T(1/3) = 0
T(2/3) = 0
T(1) = T(1/3) + T(2/3) + 1^2
T(1) = 0 + 0 + 1
T(1) = 1
Then we can substitute the value:
T(n) = n T(1) + [ (9/5)(n^2)( (5/9)^(log n) ) ]
T(n) = n + ( 9/5 n^2 (5/9)^(log n) )
T(n) = n^2 (9/5)^(1-log(n)) + n
For an asymptotic upper bound we can replace (9/5)^(1-log(n)) by 9/5, since (9/5)^(1-log(n)) <= 9/5 for n >= 1:
T(n) ~ 9/5 n^2 + n
O(T(n)) = O(n^2)
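Plugging in numbers makes the bound visible (a sketch I'm adding, taking the logarithm to base 2 and evaluating the expression derived above):

import math

def closed_form(n):
    """n + (9/5) * n^2 * (5/9)^(log2 n), the expression from the substitution above."""
    return n + (9 / 5) * n * n * (5 / 9) ** math.log2(n)

for n in (4, 64, 1024):
    # closed_form(n) / n^2 stays below 9/5, so the expression is O(n^2).
    print(n, closed_form(n) / (n * n))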
I want to understand how to arrive at the complexity of the below recurrence relation.
T(n) = T(n-1) + T(n-2) + C
Given T(1) = C and T(2) = 2C;
Generally for equations like T(n) = 2T(n/2) + C (Given T(1) = C), I use the following method.
T(n) = 2T(n/2) + C
=> T(n) = 4T(n/4) + 3C
=> T(n) = 8T(n/8) + 7C
=> ...
=> T(n) = 2^k T (n/2^k) + (2^k - 1) c
Now when n/2^k = 1 => k = log(n) (to the base 2)
T(n) = n T(1) + (n-1)C
= (2n -1) C
= O(n)
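As a quick check of that unrolling (a sketch with C = 1 and n a power of two), the closed form (2n - 1)*C matches a direct evaluation:

C = 1  # pick a concrete constant

def T(n):
    """T(n) = 2*T(n/2) + C with T(1) = C, for n a power of two."""
    if n == 1:
        return C
    return 2 * T(n // 2) + C

for n in (2, 8, 64):
    # Matches the closed form (2n - 1) * C derived above.
    print(n, T(n), (2 * n - 1) * C)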
But, I'm not able to come up with similar approach for the problem I have in question. Please correct me if my approach is incorrect.
The complexity is related to the input size: each call produces a binary tree of calls,
where T(n) makes up to 2^n calls in total.
T(n) = T(n-1) + T(n-2) + C
T(n) = O(2^(n-1)) + O(2^(n-2)) + O(1)
= O(2^n)
In the same fashion, you can generalize your recursive function in terms of Fibonacci numbers:
T(n) = F(n) + (C * 2^n)
Next, you can use a direct formula instead of the recursive definition, namely the closed-form expression known as Binet's formula.
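For reference, here is a small sketch (my own addition) of Binet's formula for the standard Fibonacci numbers:

import math

def fib_binet(n):
    """Binet's formula: F(n) = (phi^n - psi^n) / sqrt(5), with psi = 1 - phi = -1/phi."""
    sqrt5 = math.sqrt(5)
    phi = (1 + sqrt5) / 2
    psi = (1 - sqrt5) / 2
    return round((phi ** n - psi ** n) / sqrt5)

print([fib_binet(n) for n in range(1, 11)])  # 1, 1, 2, 3, 5, 8, 13, 21, 34, 55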
You can use the general approach described here. Please ask if you have more questions.
If you were also interested in finding an explicit formula for T(n) this may help.
We know that T(1) = c and T(2) = 2c and T(n) = T(n-1) + T(n-2) + c.
So just write T(n) and start expanding.
T(n) = T(n-1) + T(n-2) + c
T(n) = 2*T(n-2) + T(n-3) + 2c
T(n) = 3*T(n-3) + 2*T(n-4) + 4c
T(n) = 5*T(n-4) + 3*T(n-5) + 7c
and so on.
You see the coefficients are Fibonacci numbers themselves!
Call F(n) the nth Fibonacci number. By Binet's formula, F(n) = (phi^n - psi^n)/sqrt(5), where phi = (1+sqrt(5))/2 and psi = -1/phi. Then for n >= 2 we have (taking F(0) = 0):
T(n) = 2c*F(n-1) + c*F(n-2) + (F(n)-1)*c
Here is some quick code to demonstrate:
def fib_gen(n):
    """Generates the first n Fibonacci numbers (kept exact to avoid rounding errors)."""
    fibs = [1, 1]
    for i in range(n - 2):
        fibs.append(fibs[i] + fibs[i + 1])
    return fibs

F = fib_gen(50)  # just an example
c = 1

def T(n):
    """The recursive definition."""
    if n == 1:
        return c
    if n == 2:
        return 2 * c
    return T(n - 1) + T(n - 2) + c

def our_T(n):
    """Our found relation."""
    n = n - 2  # shift the index because the initial conditions are T(1) and T(2)
    return F[n] * 2 * c + F[n - 1] * c + (F[n + 1] - 1) * c
and
>>> T(24)
121392
>>> our_T(24)
121392
Is "worse than exponential" accurate enough for your purposes? The special case C=0 defines http://en.wikipedia.org/wiki/Fibonacci_number, which you can see from the article is exponential. Assuming C is positive, your series will be growing faster than this. In fact, your series will lie between the Fibonacci series and a variant of the Fibonacci series in which the golden ratio is replaced by something very slightly larger.
This type of recurrence is called a non-homogeneous recurrence relation, and to solve it you first solve the corresponding homogeneous recurrence (the one without the constant at the end). If you are interested, read the math behind it.
I will show you an easy way: just type your equation into Wolfram Alpha and it will give you a closed form in terms of Lucas and Fibonacci numbers.
So the complexity grows in the same way as the Lucas and Fibonacci numbers (whichever is bigger).
But both of them have the same growth rate, proportional to powers of the golden ratio phi = (1 + sqrt(5))/2,
and therefore your growth rate is an exponential of the golden ratio: O(phi^n)
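As a quick empirical check (a sketch I'm adding, with C = 1 and the base cases T(1) = C, T(2) = 2C from the question), the ratio of consecutive values converges to the golden ratio:

C = 1  # pick a concrete constant

def terms(count):
    """Generate T(1), T(2), ... for T(n) = T(n-1) + T(n-2) + C."""
    a, b = C, 2 * C  # T(1) = C, T(2) = 2C
    out = [a, b]
    for _ in range(count - 2):
        a, b = b, a + b + C
        out.append(b)
    return out

t = terms(40)
print(t[-1] / t[-2])       # ratio of consecutive terms, approaches ~1.618
print((1 + 5 ** 0.5) / 2)  # the golden ratio phi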