Confusion with recursion

Suppose I have a recurrence with the equation T(n) = T(n-2) + c. This means we shrink the problem size by 2 at each step, and the order of this algorithm is O(n), which is right! Now suppose my equation becomes T(n) = T(n-2) + cn. Why does the order become n² ? I don't want a recursion-tree proof or any other method to show it becomes n². Just tell me: what difference do c and cn make here?

Just tell me: what difference do c and cn make here?
It means that with the cn term the additional work itself keeps growing: each time the problem size grows, the extra work is one more c than before (or two more c in the case of T(n-2) + cn, since n increases by 2 between calls). Compare:
T(n) = T(n-1) + c
If the problem size increases by one, the additional work you need to put in is c, which is constant.
T(n) = T(n-1) + cn
If the problem size increases by one, the additional work you need to put in is one more c than when you last increased the problem size by one.
I.e. suppose you increased the problem size from n to n + 1, which added 10c of additional work. When you now increase the problem size from n + 1 to n + 2, you will need to add an additional 11c of work.
We end up with this series:
d + c + 2c + 3c + 4c + 5c + ... + nc = d + c·n(n+1)/2
which is Θ(n²). (Here d is the cost of the base case.)
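If you want to see the difference numerically, here is a small Python sketch for illustration (c = 1 and a base cost of 1 are arbitrary assumptions, not from the question) that unrolls both recurrences and prints the totals:

# Unroll T(n) = T(n-2) + c and T(n) = T(n-2) + c*n iteratively.
# c = 1 and a base cost of 1 are arbitrary choices for illustration.
def t_const(n, c=1):
    total = 1                 # base case
    while n > 1:
        total += c            # constant extra work per step
        n -= 2
    return total

def t_linear(n, c=1):
    total = 1                 # base case
    while n > 1:
        total += c * n        # extra work grows with n
        n -= 2
    return total

for n in (10, 100, 1000):
    print(n, t_const(n), t_linear(n))

t_const(n) grows like n/2, i.e. O(n), while t_linear(n) grows like n²/4, i.e. O(n²).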

Related

Solving recurrence: T(n)=sqrt(2)T(n/2)+log(n)

Given the equation T(n) = sqrt(2)·T(n/2) + log(n).
The solution points to Case 1 of the Master Theorem, with a complexity class of O(sqrt(n)). However, to my understanding, log(n) is polynomially greater than sqrt(n). Am I missing something?
I used the definition as follows: e = log_b(a) with a = sqrt(2) and b = 2, which gives e = 1/2 < 1. And log(n), I thought, is obviously polynomially greater than n^e.
No. log_x(n) is not greater than √n.
Consider n = 256: √n = 16, while log_2(256) = 8 (let us assume base x = 2, as with many computational problems).
In your recurrence
T(n) = √2 T(n/2) + log(n)
we have a = √2, b = 2 and f(n) = log(n), so
log_b(a) = log_2(√2) = 1/2.
Since log(n) < n^a for every a > 0 (for large enough n), f(n) = log(n) = O(n^(1/2 − ε)) for, say, ε = 1/4, and we have Case 1 of the Master Theorem.
Therefore T(n) = Θ(√n).
Using the Master theorem you get a = sqrt(2), b = 2, and therefore c = log_b(a) = 1/2. Your f(n) = log(n) is polynomially smaller than n^c = n^(1/2), and therefore you fall into the first case.
So your complexity is O(sqrt(n)).
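For a quick numerical sanity check, here is a small Python sketch for illustration (T(1) = 1 is an assumed base case, which the question doesn't specify): iterate the recurrence on powers of two and watch T(n)/sqrt(n) settle toward a constant, exactly as Θ(√n) predicts:

import math

# T(n) = sqrt(2)*T(n/2) + log(n), with T(1) = 1 assumed.
t = 1.0
for k in range(1, 41):
    n = 2 ** k
    t = math.sqrt(2) * t + math.log(n)
    # If T(n) = Theta(sqrt(n)), this ratio approaches a constant.
    print(f"n = 2**{k}, T(n)/sqrt(n) = {t / math.sqrt(n):.4f}")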

Solving a recurrence T(n) = 2T(n/2) + sqrt(n)

Need a little help! This is what I have so far using backward substitution:
T(n) = 2T(n/2) + sqrt(n), where T(1) = 1, and n = 2^k
T(n) = 2[2T(n/4) + sqrt(n/2)] + sqrt(n) = 2^2T(n/4) + 2sqrt(n/2) + sqrt(n)
T(n) = 2^2[2T(n/8) + sqrt(n/4)] + 2sqrt(n/2) + sqrt(n)
= 2^3T(n/8) + 2^2sqrt(n/4) + 2sqrt(n/2) + sqrt(n)
In general
T(n) = 2^kT(1) + 2^(k-1) x sqrt(2^1) + 2^(k-2) x sqrt(2^2) + ... + 2^1 x sqrt(2^(k-1)) + sqrt(2^k)
Is this right so far? If it is, I can not figure out how to simplify it and reduce it down to a general formula.
I'm guessing something like this? Combining the terms:
= 2^k + 2^(k-(1/2)) + 2^(k-(2/2)) + 2^(k-(3/2)) + ... + 2^((k+1)/2) + 2^(k/2)
And this is where I'm stuck. Maybe a way to factor out a 2^k?
Any help would be great, thanks!
You're halfway there.
The expression is a geometric series with ratio 1/√2, and it can be simplified to this:
T(n) = 2^k · ∑(j=0 to k) (1/√2)^j = 2^k · (1 − (1/√2)^(k+1)) / (1 − 1/√2) = n + (√2 + 1)(n − √n)
If you want just a big-O solution, then Master Theorem is just fine.
If you want an exact equation for this, a recursion tree is a good tool. At level h of the tree there are 2^h subproblems, each costing sqrt(n/2^h), so the cost of every level has the general form 2^h · sqrt(n/2^h) = sqrt((2^h) · n). Then sum up the level costs and you get T(n).
According to the Master Theorem this is Case 1 (a = 2, b = 2, f(n) = √n = n^(1/2), and log_2(2) = 1 > 1/2), so T(n) = Θ(n).
According to the recursion tree, the exact form is sqrt(n)·(sqrt(2n) − 1)·(sqrt(2) + 1), which expands to (2 + √2)·n − (√2 + 1)·√n and so agrees with the Θ(n) bound.
EDIT:
The recursion tree is just a visualized form of the so-called backward substitution. If you sum up the right-hand side, i.e. the cost, you get the general form of T(n). All of these methods can be found in Introduction to Algorithms (CLRS).
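If you want to check the exact form against the recurrence, here is a short Python sketch for illustration (it assumes T(1) = 1 and n a power of two, as in the question):

import math

def t(n):
    # T(n) = 2*T(n/2) + sqrt(n), with T(1) = 1, n a power of two.
    return 1.0 if n == 1 else 2 * t(n // 2) + math.sqrt(n)

def exact(n):
    # Closed form from the recursion tree.
    return math.sqrt(n) * (math.sqrt(2 * n) - 1) * (math.sqrt(2) + 1)

for k in range(11):
    n = 2 ** k
    print(n, t(n), exact(n))   # the two columns agree, up to float error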

What's the number of steps this algorithm will take for the general case where the input is of size n?

Here's the algorithm:
Let a = 30, i = 1
While i < n
    For j = i+1 to n
        If ABS[(j*i)+20] < a then a = ABS[(j*i)+20]
    i = i + 1
Return k
What's the number of steps this algorithm will take for the general case where the input is of size n? How do you work that out?
Also, does this algorithm come under the quadratic complexity class?
I think this is O(n²).
We have
n + (n−1) + (n−2) + (n−3) + ... + 3 + 2 + 1   [n terms in total]
and if we calculate it, it comes to
0.5(n² + n), i.e. a constant times (n² + n),
so it is in the quadratic complexity class.
Let f(i) denote the number of times the inner for loop runs, given that j goes from i+1 to n. For example f(5) = n − 6 + 1 = n − 5, since j goes through 6, 7, ..., n. So we want f(1) + f(2) + f(3) + ... + f(n − 1). Compute each f(i) and then sum them to see the exact answer.
In general there is an outer loop that runs n times, then the inner loop runs at most n times, for a complexity upper bounded by ???
If I were a compiler, I would notice that this code only changes i, j, and a, which are local variables, and the only variable whose value is subsequently used is k. So I would gradually optimize away everything but this:
Return k
and the computation would be all constant time, just a few machine instructions. Constant time is, of course, also within quadratic time.
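To see the quadratic step count directly, here is a Python transcription added for illustration; it counts inner-loop iterations (the undefined k of the pseudocode is left out, since it never affects the count):

def count_steps(n):
    # Transcription of the pseudocode, counting inner-loop iterations.
    a, i, steps = 30, 1, 0
    while i < n:
        for j in range(i + 1, n + 1):   # For j = i+1 to n
            steps += 1
            if abs(j * i + 20) < a:
                a = abs(j * i + 20)
        i += 1
    return steps

for n in (10, 100, 1000):
    # Matches (n-1) + (n-2) + ... + 1 = n*(n-1)/2, i.e. Theta(n^2).
    print(n, count_steps(n), n * (n - 1) // 2)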

Asymptotic complexity of T(n)=T(n-1)+1/n

There is an algorithm which has the time complexity
T(n) = T(n−1) + 1/n   if n > 1
T(n) = 1              otherwise
I am solving for its asymptotic complexity and getting order n, but the answer given is log n. Is that correct? If it is log n, then why?
It can be easily seen (or proven formally with induction) that T(n) is the sum of 1/k for the values of k from 1 to n. This is the nth harmonic number, Hn = 1 + 1/2 + 1/3 + ... + 1/n.
Asymptotically, the harmonic numbers grow on the order of log(n). This is because the sum is close in value to the integral of 1/x from 1 to n, which is equal to the natural logarithm of n. In fact, Hn = ln(n) + γ + O(1/n), where γ ≈ 0.5772 is the Euler–Mascheroni constant. From this, it is easy to show that T(n) = Θ(log(n)).
For more details:
With H(N) = 1 + 1/2 + 1/3 + ... + 1/N, the function x ↦ 1/x is decreasing, so for every k ≥ 1:
∫ from k to k+1 of dx/x  ≤  1/k  ≤  ∫ from k−1 to k of dx/x
We sum the left part from 1 to N, and for the right part we sum from 2 to N and add 1; we get:
ln(N+1) ≤ H(N) ≤ 1 + ln(N)
This implies H(N)/ln(N) → 1, hence H(N) = Θ(log(N)).
(from http://fr.wikipedia.org/wiki/S%C3%A9rie_harmonique#.C3.89quivalent_de_Hn)
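To see numerically how tightly H(N) tracks ln(N), here is a small Python sketch added for illustration (γ ≈ 0.5772 is the Euler–Mascheroni constant mentioned above):

import math

GAMMA = 0.5772156649   # Euler-Mascheroni constant

h = 0.0
for n in range(1, 10**6 + 1):
    h += 1.0 / n
    if n in (10, 1000, 10**6):
        # H(n) - ln(n) approaches gamma, so H(n) = Theta(log n).
        print(n, h, math.log(n) + GAMMA)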

Solution for recurrence T(n)=T(n/2)+sqrt(n) using induction only

I am looking to prove that T(n)=T(n/2)+sqrt(n) is O(sqrt(n)) given T(1)=1
using only induction.
It is easy to solve using the Master theorem but this is not the case.
I tried to assume
T(n/2) < c*sqrt(n/2)
but didn't get very far with the rest of the proof.
Thank you all in advance for your answers.
Edit:
my line of solution (after the assumption above) is:
T(n) <= c*sqrt(n/2)+sqrt(n) = sqrt(n)(c/sqrt(2)+1) <= sqrt(n)(c+1)
I don't know how to move from this to the required
T(n)<=c*sqrt(n)
OK, you're close. As I mentioned in the comment, the base case is simple. For the induction step, you want to show that T(n) is O(sqrt(n)) given that T(n/2) is O(sqrt(n/2)).
So, it goes like this:
T(n) = T(n/2) + sqrt(n)       ; this is just your recurrence
     < c sqrt(n/2) + sqrt(n)  ; since T(n/2) < c sqrt(n/2), the inductive hypothesis
                              ; wlog here, assume c >= 4
     = c sqrt(n) / sqrt(2) + sqrt(n)
     = (c/sqrt(2) + 1) sqrt(n)
observe that for c >= 4, c / sqrt(2) + 1 < c, so
(c/sqrt(2) + 1) sqrt(n) < c sqrt(n)
so
T(n) < c sqrt(n)
Therefore, T(n) is O(sqrt(n)).
So there are a couple of key points here that you missed.
The first is that you can always increase c to whatever value you want. This is because big-O only requires the bound to hold for some constant: if T(n) < c f(n), then T(n) < d f(n) for any d > c.
The second is to note that the line f(c) = c/sqrt(2) + 1 intersects the line f(c) = c at c = sqrt(2)/(sqrt(2) − 1) = 2 + sqrt(2) ≈ 3.4142, so all you have to do is force c above this value in order to get (c/sqrt(2) + 1) < c. 4 certainly works, and that's where the 4 comes from.
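To see where that threshold sits, here is a tiny Python check added for illustration: c/sqrt(2) + 1 = c exactly at c = sqrt(2)/(sqrt(2) − 1) = 2 + sqrt(2) ≈ 3.4142, so any c at or above that (4, say) closes the induction:

import math

c_star = math.sqrt(2) / (math.sqrt(2) - 1)   # = 2 + sqrt(2) = 3.4142...
print(c_star)
for c in (3, 3.5, 4, 10):
    # The induction step closes exactly when c/sqrt(2) + 1 < c.
    print(c, c / math.sqrt(2) + 1, c / math.sqrt(2) + 1 < c)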
In retrospect, I should have given the key points as hints. My fault. Sorry!
One line of thinking which may help is to expand the recurrence recursively. You get
T(n) = sqrt(n) + sqrt(n/2) + sqrt(n/4) + ... + sqrt(n/(2^k)) + ... + sqrt(1)
= sqrt(n) + sqrt(n)/sqrt(2) + sqrt(n)/sqrt(4) + ... + sqrt(n)/sqrt(2^k) + ... + sqrt(1)
= sqrt(n) * (1 + sqrt(1/2) + sqrt(1/2)^2 + ... + sqrt(1/2)^k + ...)
<= sqrt(n) * ∑(k=0 to ∞) sqrt(1/2)^k
= sqrt(n) * 1/(1 - sqrt(1/2))
Since 1/(1-sqrt(1/2)) is a finite constant (it's about 3.4), T(n) must be O(sqrt(n)). You can use this information to prove it using standard induction.
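Here is a minimal numeric check of that bound, added as a sketch (assuming T(1) = 1 and n a power of two, as in the question): compute T(n) and compare the ratio T(n)/sqrt(n) against c = 1/(1 − sqrt(1/2)) ≈ 3.414:

import math

C = 1.0 / (1.0 - math.sqrt(0.5))   # about 3.414

def t(n):
    # T(n) = T(n/2) + sqrt(n), with T(1) = 1, n a power of two.
    return 1.0 if n == 1 else t(n // 2) + math.sqrt(n)

for k in range(1, 21):
    n = 2 ** k
    print(n, round(t(n) / math.sqrt(n), 4), "<", round(C, 4))   # always true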
