Solving Recursion in Fibonacci numbers - algorithm

I don't understand the mathematics behind this algorithm and could use some help.
Algorithm:
if n < 2 then
    return n
else
    return fibonacci(n-1) + fibonacci(n-2)
Statement                        Time (n < 2)   Time (n >= 2)
if n < 2 then                    O(1)           O(1)
return n                         O(1)           -
return fib(n-1) + fib(n-2)       -              T(n-1) + T(n-2) + O(1)
Total                            O(1)           T(n-1) + T(n-2) + O(1)
T(n) = O(1) if n < 2
T(n) = T(n-1) + T(n-2) + O(1) if n>=2

I think you're supposed to notice that the recurrence relation for this function is awfully familiar. You can learn exactly how fast this awfully familiar recurrence grows by looking it up by name.
However, if you fail to make the intuitive leap, you can try to bound the runtime using a simplified problem. Essentially, you modify the algorithm in ways guaranteed to increase the runtime while making it simpler. Then you figure out the runtime of the new algorithm, which gives you an upper bound.
For example, this algorithm must take longer and is simpler to analyze:
F(n): if n<2 then return n else return F(n-1) + F(n-1)
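If it helps to make that bound concrete, here is a small Python sketch (my own illustration, not part of the original answer) that counts the calls made by the simplified algorithm; the count works out to exactly 2^n - 1, safely inside O(2^n):

def calls_simplified(n):
    # Calls made by F(n): if n < 2 return n, else F(n-1) + F(n-1).
    if n < 2:
        return 1                            # one call, no recursion
    return 1 + 2 * calls_simplified(n - 1)  # both branches recurse on n-1

for n in range(1, 11):
    print(n, calls_simplified(n), 2 ** n)   # count stays below 2^n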

By induction: if the calculation of fib(k) takes less than C*2^k for all k < n, then for the calculation time of fib(n) we get
T(n) = T(n-1) + T(n-2) + K < C*2^(n-1) + C*2^(n-2) + K
= 0.75*C*2^n + K < C*2^n
for sufficiently big C (for C > K/0.25 = 4K, since 2^n >= 1). This proves that T(n) < C*2^n, i.e. T(n) = O(2^n).
(Here T(n) is the time for the calculation of fib(n), and K is the constant time needed to combine fib(n-1) and fib(n-2) once both are already calculated.)
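To make the induction tangible, here is a quick Python check (my own sketch, not from the answer): count the calls the naive recursive fib actually makes and verify the count stays below C*2^n, here already with C = 1:

def calls(n):
    # Number of calls performed by the naive recursive fib(n).
    if n < 2:
        return 1
    return 1 + calls(n - 1) + calls(n - 2)

for n in range(5, 26, 5):
    print(n, calls(n), 2 ** n, calls(n) < 2 ** n)  # prints True every time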

You need to solve the recurrence equation:
T(0) = 1
T(1) = 1
T(n) = T(n-1) + T(n-2), for all n > 1
Its solution is (a shifted copy of) the Fibonacci sequence itself, so T(n) = Θ(φ^n), where φ = (1+√5)/2 ≈ 1.618.
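As a quick numerical illustration (my own addition), you can iterate the recurrence and watch the ratio T(n)/T(n-1) settle toward the golden ratio φ ≈ 1.618, which is exactly what Θ(φ^n) growth predicts:

def T(n):
    a, b = 1, 1          # T(0) = T(1) = 1
    for _ in range(n - 1):
        a, b = b, a + b  # T(k) = T(k-1) + T(k-2)
    return b if n >= 1 else a

for n in (10, 20, 30):
    print(n, T(n) / T(n - 1))  # -> 1.6180... as n grows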

Related

Finding time complexity Big-Θ of this recursive formula involving floor function

I am trying to find the time complexity (Big-Θ) of this algorithm:
Recursion(n):
    while n > 1:
        n = floor(n/2)
        Recursion(n)
I have found an upper bound of O(n) by considering the worst case which is when n is a power of 2.
However, I am having trouble finding a lower bound (Big-Ω) for this. My intuition is that this is Ω(n) as well, but I am not sure how to show this with the floor function in the way.
Any suggestions? Thank you!
EDIT: the main thing I'm not convinced of is that T(n/2) is equivalent to T(floor(n/2)). How would one prove this for this algorithm?
The floor function itself runs in constant time, O(1), so computing it just adds to the constant work per step. As for the floor inside the argument (your EDIT): T is non-decreasing, and T(n/2 - 1) <= T(floor(n/2)) <= T(n/2), so replacing floor(n/2) by n/2 changes the result by at most an additive constant and does not affect the asymptotics. Let's analyze the time complexity of the algorithm below:
T(n) = T(n/2) + 1 (constant work per level, including the floor)
T(n/2) = T(n/4) + 1
...
T(2) = T(1) + 1 --> T(1) = constant
Substituting back:
T(n) = T(n/4) + 2
T(n) = T(n/8) + 3
...
T(n) = T(n/2^k) + k, and setting 2^k = n gives k = log(n)
T(n) = T(1) + log(n) = log(n) + constant
We can conclude that T(n) ∈ θ(log(n)).
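Here is a small Python sketch (my addition, mirroring the T(n) = T(n/2) + 1 reading used above): count how many times the argument is halved before reaching 1, and compare with log2(n):

import math

def depth(n):
    count = 0
    while n > 1:
        n = n // 2   # floor(n/2)
        count += 1
    return count

for n in (10, 1000, 10**6):
    print(n, depth(n), math.log2(n))  # depth(n) == floor(log2(n))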

How to solve recurrence T(n) = T(n-c) + T(c) + n^2

I have the recurrence
T(n) = T(n-c) + T(c) + n²
Can you explain how I can calculate the height of the recurrence tree when:
c has an indefinite value
c = n/3 => T(n) = T(n - n/3) + T(n/3) + n²
I think in the first case T(n) costs θ(n³) and in the second case θ(n²). Is that right?
1) If c is a constant, then T(c) is itself a constant and can be ignored; unrolling gives about n/c levels, each contributing up to n², so it will indeed be θ(n³).
2) When c is n/3 or some other constant fraction, look for the T() term with the largest coefficient on n - this leads to the longest branch. Then an upper bound on the time complexity is given by replacing all other T terms with this one.
Example: T(2n/3) + T(n/3) + n² <= 2T(2n/3) + n², and by the Master theorem (case 3) the right-hand side is θ(n²); since the n² term alone already gives a matching lower bound, the original recurrence is indeed θ(n²).
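A rough numerical check (my own sketch, with T(n) = 1 for small n and integer division as assumptions) supports both claims: T(n)/n³ levels off for constant c, and T(n)/n² levels off for the n/3 split:

from functools import lru_cache

C = 5  # an arbitrary constant for case 1

@lru_cache(maxsize=None)
def t_const(n):
    # T(n) = T(n-c) + T(c) + n^2 with constant c.
    if n <= C:
        return 1
    return t_const(n - C) + t_const(C) + n * n

@lru_cache(maxsize=None)
def t_split(n):
    # T(n) = T(2n/3) + T(n/3) + n^2 (integer division).
    if n <= 1:
        return 1
    return t_split(2 * n // 3) + t_split(n // 3) + n * n

for n in (300, 600, 1200):
    print(n, t_const(n) / n**3, t_split(n) / n**2)  # both ratios level off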

Finding these three algorithms' run times

Hi, I am having a tough time showing the run time of these three algorithms for T(n). Assumptions include T(0) = 0.
1) This one I know is close to Fibonacci, so I know it's close to O(n) time, but I'm having trouble showing that:
T(n) = T(n-1) + T(n-2) + 1
2) This one I am stumped on, but think it's roughly about O(log log n):
T(n) = T([sqrt(n)]) + n, for n >= 1, where [sqrt(n)] denotes sqrt(n) rounded down.
3) I believe this one is roughly O(n * log log n):
T(n) = 2T(n/2) + (n/(log n)) + n.
Thanks for the help in advance.
T(n) = T(n-1) + T(n-2) + 1
Assuming T(0) = 0 and T(1) = a, for some constant a, we notice that T(n) - T(n-1) = T(n-2) + 1. That is, the growth rate of the function is given by the function itself, which suggests this function has exponential growth.
Let T'(n) = T(n) + 1. Then T'(n) = T'(n-1) + T'(n-2), by the above recurrence relation, and we have eliminated the troublesome constant term. T(n) and T'(n) differ by an additive constant of 1, so assuming they are both non-decreasing (they are), they will have the same asymptotic complexity, albeit with different constants n0.
To show T'(n) has asymptotic growth of O(b^n), we need some base cases, then the hypothesis that the condition holds for all n up to, say, k - 1, and then we need to show it for k; that is, c*b^(n-2) + c*b^(n-1) <= c*b^n. We can divide through by c*b^(n-2) to simplify this to 1 + b <= b^2. Rearranging, we get b^2 - b - 1 >= 0; the roots are (1 ± sqrt(5))/2, and we must discard the negative one since we cannot use a negative number as the base of our exponent. So for b >= (1+sqrt(5))/2, T'(n) may be O(b^n). A similar thought experiment will show that for b <= (1+sqrt(5))/2, T'(n) may be Omega(b^n). Thus, for b = (1+sqrt(5))/2 only, T'(n) may be Theta(b^n).
Completing the proof by induction that T(n) = O(b^n) is left as an exercise.
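As a quick sanity check (my own sketch), you can iterate T(n) = T(n-1) + T(n-2) + 1 directly and watch T(n)/φ^n approach a constant, as Θ(φ^n) predicts (taking a = 1 for the constant in the answer):

PHI = (1 + 5 ** 0.5) / 2

def T(n):
    a, b = 0, 1  # T(0) = 0, T(1) = 1
    for _ in range(n - 1):
        a, b = b, a + b + 1  # T(k) = T(k-1) + T(k-2) + 1
    return b if n >= 1 else a

for n in (10, 20, 30):
    print(n, T(n) / PHI ** n)  # ratio levels off at a constant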
T(n) = T([sqrt(n)]) + n
Obviously, T(n) is at least linear, assuming the boundary conditions require T(n) to be nonnegative. We might guess that T(n) is Theta(n) and try to prove it. Base case: let T(0) = a and T(1) = b. Then T(2) = b + 2 and T(4) = b + 6. In both cases, a choice of c >= 1.5 will work to make T(n) <= c*n. Suppose that whatever our fixed value of c is works for all n up to and including k. We must show that T([sqrt(k+1)]) + (k+1) <= c*(k+1). We know that T([sqrt(k+1)]) <= c*[sqrt(k+1)] <= c*sqrt(k+1) from the induction hypothesis. So T([sqrt(k+1)]) + (k+1) <= c*sqrt(k+1) + (k+1), and c*sqrt(k+1) + (k+1) <= c*(k+1) can be rewritten as c*x + x^2 <= c*x^2 (with x = sqrt(k+1)); dividing through by x (OK since k > 1) we get c + x <= c*x, and solving this for c we get c >= x/(x-1) = sqrt(k+1)/(sqrt(k+1)-1). This approaches 1 as k grows, so for large enough n, any constant c > 1 will work.
Making this proof totally rigorous by fixing the following points is left as an exercise:
making sure enough base cases are proven so that all assumptions hold
distinguishing the cases where (a) k + 1 is a perfect square (hence [sqrt(k+1)] = sqrt(k+1)) and (b) k + 1 is not a perfect square (hence sqrt(k+1) - 1 < [sqrt(k+1)] < sqrt(k+1)).
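A numeric check (my own sketch, using T(0) = T(1) = 1 as arbitrary base values): evaluate T(n) = T([sqrt(n)]) + n directly and confirm T(n)/n approaches 1, matching Theta(n):

import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    if n <= 1:
        return 1
    return T(math.isqrt(n)) + n  # isqrt is the floor of the square root

for n in (100, 10_000, 10**8):
    print(n, T(n) / n)  # -> 1.0 plus a vanishing correction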
T(n) = 2T(n/2) + (n/(log n)) + n
This T(n) >= 2T(n/2) + n, which we know is the recurrence relation for the runtime of Mergesort, which by the Master theorem is Theta(n log n), so we know our complexity is no less than that.
Indeed, by the Master theorem: T(n) = 2T(n/2) + (n/(log n)) + n = 2T(n/2) + n(1 + 1/(log n)), so
a = 2
b = 2
f(n) = n(1 + 1/(log n)) is Theta(n) (for n > 2 it is always between n and 2n)
f(n) = Theta(n) = Theta(n^(log_2 2) * log^0 n)
We're in case 2 of the Master theorem, so the asymptotic bound is the same as for Mergesort, Theta(n log n).
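The same kind of numeric check works here (my own sketch, with integer halving and T(n) = 1 for n <= 2 as assumptions): T(n)/(n * log2(n)) levels off, matching Theta(n log n):

import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    if n <= 2:
        return 1
    return 2 * T(n // 2) + n / math.log2(n) + n

for n in (2**10, 2**15, 2**20):
    print(n, T(n) / (n * math.log2(n)))  # ratio slowly approaches 1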

Calculating Big O complexity of Recursive Algorithms

Somehow, I find that it is much harder to derive Big O complexities for recursive algorithms than for iterative ones. Could you provide some insight into how I should go about solving these two questions?
*assume that submethod has linear complexity
def myMethod(n)
  if (n > 0)
    submethod(n)
    myMethod(n/2)
  end
end

def myMethod(k, n)
  if (n > 0)
    submethod(k)
    myMethod(k, n/2)
  end
end
For your first problem, the recurrence will be:
T(n) = n + T(n/2)
T(n/2) = n/2 + T(n/4)
...
T(2) = 2 + T(1)
T(1) = 1 + T(0) // assuming 1/2 equals 0 (integer division)
Adding these up, we get:
T(n) = n + n/2 + n/4 + n/8 + ... + 1 + T(0)
     = n(1 + 1/2 + 1/4 + 1/8 + ...) + k // where k = T(0)
     <= n * 1/(1 - 1/2) + k // the geometric series sums to a/(1-r) as the number of terms tends to infinity
     = 2n + k
Therefore, T(n) = O(n). Note that I have bounded the finite series by its infinite sum, which is safe because every term is positive; this is the usual move in asymptotic analysis.
For your second problem it's easy to see that we perform k primitive operations each time until n becomes 0. This happens log(n) times. Therefore, T(n) = O(k * log(n)).
All you need to do is count how many times a basic operation is executed. This is true for analysing any kind of algorithm. In your case, we will count the number of times submethod is called.
You can break down the running time of the call myMethod(n) as 1 + myMethod(n/2), which you can further break down to 1 + (1 + myMethod(n/4)). At some point you will reach the base case, in the log(n)-th step. That gives you log(n) calls to submethod.
The second one is no different: since k is constant all the time, it will again take log(n) steps, assuming submethod takes constant time regardless of its input.
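To see both counts concretely, here is a Python sketch (my own, modeling submethod(n) as costing n units per the question's linearity assumption):

def cost_first(n):
    # Total submethod cost for myMethod(n): n + n/2 + n/4 + ... -> O(n).
    total = 0
    while n > 0:
        total += n  # submethod(n) costs n
        n //= 2
    return total

def cost_second(k, n):
    # Total cost for myMethod(k, n): k per level, log(n) levels -> O(k log n).
    total = 0
    while n > 0:
        total += k  # submethod(k) costs k
        n //= 2
    return total

print(cost_first(1024))        # 2047, roughly 2 * 1024
print(cost_second(100, 1024))  # 1100 = 100 * 11 levels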

Time-complexity analysis of a function, Big O

What time-complexity will the following code have in respect to the parameter size? Motivate.
// Process(A, N) is O(sqrt(N)).
Function Complex(array[], size){
    if(size == 1) return 1;
    if(rand() / float(RAND_MAX) < 0.1){
        return Process(array, size*size)
             + Complex(array, size/2)
             + Process(array, size*size);
    }
}
I think it is O(N), because if Process(A, N) is O(sqrt(N)), then Process(A, N*N) should be O(N), and Complex(array, size/2) is O(log(N)) because it halves the size every time it runs. So one run takes O(N) + O(log(N)) + O(N) = O(N).
Please correct me and give me some hints on how I should think / proceed an assignment like this.
I appreciate all help and thanks in advance.
The time complexity of the algorithm is O(N) indeed, but for a different reason.
The complexity of the function can be denoted as T(n) where:
T(n) = T(n/2) + 2*n
       ^^^^^^   ^^^
       recursive invocation; the 2*n accounts for the two calls to
       Process(arr, n*n), each of which is O(n)
This recursion is well known to be O(n):
T(n) = T(n/2) + 2*n =
     = T(n/4) + 2*n/2 + 2*n =
     = T(n/8) + 2*n/4 + 2*n/2 + 2*n
     = ...
     = 2*n/2^(log n) + ... + 2*n/2 + 2*n
     < 4*n
which is in O(n).
Let's prove it formally; we will use mathematical induction:
Base: T(1) < 4 (check)
Hypothesis: for every k < n, the claim T(k) < 4k holds true.
Step, for n:
T(n) = T(n/2) + 2*n
     < 2*n + 2*n   (*)
     = 4*n
Conclusion: T(n) is in O(n).
(*) By the induction hypothesis, T(n/2) < 4*(n/2) = 2*n.
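A worst-case cost sketch in Python (my own addition): assume the 10% branch is always taken, charge n units for each Process(arr, n*n) call (it is O(sqrt(n^2)) = O(n)), and confirm the total stays below 4n:

def worst_case_cost(n):
    # T(n) = T(n/2) + 2n, T(1) = 1.
    if n <= 1:
        return 1
    return worst_case_cost(n // 2) + 2 * n

for n in (2**6, 2**10, 2**16):
    print(n, worst_case_cost(n), 4 * n)  # cost < 4n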
