Recurrence relation of a recursive call inside another - algorithm

I'm trying to analyze the cost of this code:
static int funct1(int x) {
    if (x <= 1) return x;
    return funct1(funct1(x / 2)) + 1;
}
How can I solve the recurrence relation T(n) = T(T(n/2)) + 1?

Your analysis is not correct. The cost recurrence is T(n) = T(n/2) + T(f(n/2)) + 1, not T(T(n/2)) + 1: you have missed the T(n/2) term, because the program must first compute funct1(x/2), which costs T(n/2), and only then run funct1 on its result, which costs T(f(n/2)).
Here f(n) denotes the value returned by funct1(n), which is a different quantity from the cost T(n).
Suppose n = 2^k. Then f(2^k) = f(f(2^(k-1))) + 1 = f(f(f(2^(k-2))) + 1) + 1 = ... Computing small cases: f(1) = 1, f(2) = 2, f(3) = 2, f(4) = 3, f(8) = f(f(4)) + 1 = f(3) + 1 = 3, f(16) = f(f(8)) + 1 = f(3) + 1 = 3, and f(32) = f(f(16)) + 1 = f(3) + 1 = 3. As you can see, the value stabilizes: f(2^k) = 3 for all k >= 2.
Therefore, for large n, T(n) = T(n/2) + T(3) + 1 = T(n/2) + O(1), which gives T(n) = O(log n).
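To see both quantities concretely, here is a minimal C++ sketch (my own addition, not part of the answer) that tabulates the return value f(n) and the number of calls made for n = 2^k; the value settles at 3 from k = 2 on, while the call count grows only by a constant per doubling of n, matching the O(log n) bound.
#include <cstdio>

static long long calls = 0;          // counts every invocation of funct1

static int funct1(int x) {
    ++calls;
    if (x <= 1) return x;
    return funct1(funct1(x / 2)) + 1;
}

int main() {
    for (int k = 0; k <= 20; ++k) {
        int n = 1 << k;
        calls = 0;
        int value = funct1(n);
        // value is 3 for k >= 2; calls grow by a constant (4) per doubling of n
        std::printf("n = 2^%-2d  f(n) = %d  calls = %lld\n", k, value, calls);
    }
    return 0;
}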

Related

Time complexity of recursive function with a loop

What would be the time complexity of the below function?
const f = (n) => {
  if (n === 1) {
    return 1;
  }
  for (let i = 0; i < n; i++) {
    console.log(i);
  }
  return f(n - 1) + f(n - 1);
};
This is what the recursion tree looks like where i denotes the depth:
                   f(4)                       // i = 0
         /                   \
       f(3)                 f(3)              // i = 1
      /    \               /    \
   f(2)    f(2)         f(2)    f(2)          // i = 2
   /  \    /  \         /  \    /  \
f(1) f(1) f(1) f(1)  f(1) f(1) f(1) f(1)      // i = 3
Had there been no loop inside the function, the time complexity would have been O(2^n). But now there's a loop so:
At depth i there are 2^i function calls, and each of them runs n - i loop iterations, so level i performs (2^i) * (n - i) operations. The total work is therefore:
T(n) = [(2^0) * (n - 0)] + [(2^1) * (n - 1)] + [(2^2) * (n - 2)] + ... + [(2^(n-1)) * (n - (n - 1))]
So am I right on this and what could possibly be the final time complexity?
You are right. The total work (most of it being the console.log calls) is driven by the recurrence
T(n) = C1 n + C2 + 2 T(n-1)
with T(1)=C3.
We can guess that the solution has the form
T(n) = A 2^n + B n + C
and by identification (substituting this form into the recurrence and matching the coefficients of n and the constant terms),
B = -C1 and C = -2 C1 - C2, i.e. T(n) = A 2^n - C1 n - 2 C1 - C2.
Finally, the initial condition
T(1) = 2A - 3 C1 - C2 = C3
gives A = (C3 + 3 C1 + C2) / 2. Since A > 0, the total work is Theta(2^n).
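For a concrete check, here is a minimal C++ sketch (my own addition, not part of the answer) that instruments the function with counters for calls and loop iterations; both counters roughly double every time n increases by 1, which is the Theta(2^n) behaviour the recurrence predicts.
#include <cstdio>

static long long loopIterations = 0;  // stands in for the console.log calls
static long long callCount = 0;       // number of invocations of f

static int f(int n) {
    ++callCount;
    if (n == 1) return 1;
    for (int i = 0; i < n; ++i) ++loopIterations;
    return f(n - 1) + f(n - 1);
}

int main() {
    for (int n = 1; n <= 20; ++n) {
        loopIterations = 0;
        callCount = 0;
        f(n);
        // both counts roughly double as n grows by 1 => Theta(2^n) total work
        std::printf("n = %2d  calls = %8lld  loop iterations = %9lld\n",
                    n, callCount, loopIterations);
    }
    return 0;
}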

Is there any solution for recurrence relation T(n) = T(T(n - 1)) + 1?

Is there any solution for this recurrence relation
T(n) = T(T(n - 1)) + 1
derived from the following code in C-like syntax?
#include <stdio.h>

int Algo(int n)
{
    printf("%d ->", n);
    return (n > 1) ? Algo(Algo(n - 1)) + 1 : 1;
}

int main(void)
{
    printf("\tEnd: %d", Algo(3));
    return 0;
}
Result :
3->2->1->1->2->1->1 End: 3
How do I find the time complexity for this relation?
Thank you
Is there any solution
Depending on the value of T(0) (call it X), and assuming n = 0, 1, 2, ..., there is a solution:
T(n) = T(0) + n
Proof sketch (each row uses T(n) = T(n-1) + 1, i.e. it treats the inner T(n-1) as evaluating to n-1, which is what the program actually returns, as its printed output shows):
n    T(n)
0    T(0) = X
1    T(1) = T(0) + 1 = X + 1
2    T(2) = T(1) + 1 = (X + 1) + 1 = X + 2
3    T(3) = T(2) + 1 = (X + 2) + 1 = X + 3
4    T(4) = T(3) + 1 = X + 4
5    T(5) = T(4) + 1 = X + 5
6    T(6) = T(5) + 1 = X + 6
7    T(7) = T(6) + 1 = X + 7
8    T(8) = T(7) + 1 = X + 8
Please share if the solution/assumption is acceptable. ( :
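As a quick numeric check, here is a minimal C++ sketch (my own addition, not part of the answer) that fills in the relation bottom-up, assuming the base value T(0) = 0; since T(n-1) <= n-1, the inner index is always already computed, and the output is T(n) = T(0) + n as claimed.
#include <cstdio>
#include <vector>

int main() {
    const int N = 20;
    std::vector<int> T(N + 1);
    T[0] = 0;                        // assumed base value T(0) = 0
    for (int n = 1; n <= N; ++n)
        T[n] = T[T[n - 1]] + 1;      // T(n) = T(T(n-1)) + 1
    for (int n = 0; n <= N; ++n)
        std::printf("T(%d) = %d\n", n, T[n]);   // prints T(n) = n = T(0) + n
    return 0;
}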

time complexity of recursion function

I don't know where to start calculating the time complexity of this function. What is the big-O time complexity of this function? I learned that the answer is O(3^n).
int f3(int n) {
    if (n < 100)
        return 1;
    return n * f3(n - 1) * f3(n - 2) * f3(n - 3);
}
Count the constant amount of work done in a single call:
1 comparison (the if statement)
3 multiplications (the * operators)
3 recursive calls
for a total of 7 units of work per call. So we have:
T(n) = T(n-1) + T(n-2) + T(n-3) + 7
T(99)=1
T(98)=1
...
T(1)=1
To bound T(n) from above, note that T(n-3) <= T(n-2) <= T(n-1), therefore:
T(n) <= T(n-1) + T(n-1) + T(n-1) + 7
T(n) <= 3T(n-1) + 7
So we can solve the upper-bound recurrence T(n) = 3T(n-1) + 7 (for large n, T(n-1), T(n-2) and T(n-3) are all of the same order).
Then we can calculate T(n) by the following expansion:
T(n) = 3 * T(n-1) + 7
     = 3 * (3 * T(n-2) + 7) + 7 = 3^2 * T(n-2) + 7 * (3^1 + 3^0)
     = 3^2 * (3 * T(n-3) + 7) + ... = 3^3 * T(n-3) + 7 * (3^2 + 3^1 + 3^0)
     = 3^4 * T(n-4) + 7 * (3^3 + 3^2 + 3^1 + 3^0)
     ...
     = 3^(n-99) * T(99) + 7 * (3^(n-100) + 3^(n-101) + ... + 3^1 + 3^0)
     = 3^(n-99) + 7 * (3^(n-100) + ... + 3^1 + 3^0)
Now use 1 + 3 + 3^2 + ... + 3^(k-1) = (3^k - 1) / 2 with k = n - 99:
T(n) = 3^(n-99) + 7 * (3^(n-99) - 1) / 2 = (9/2) * 3^(n-99) - 7/2
     = (9 / (2 * 3^99)) * 3^n - 7/2
Finally: the big O is O(3^n).
If you just have T(1)=1, the answer is the same.
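To double-check the bound, here is a small C++ sketch (my own addition) that evaluates the exact recurrence T(n) = T(n-1) + T(n-2) + T(n-3) + 7 alongside the upper-bound recurrence B(n) = 3*B(n-1) + 7 and its closed form (9/2) * 3^(n-99) - 7/2, and verifies T(n) <= B(n):
#include <cstdio>
#include <cmath>
#include <vector>

int main() {
    const int N = 130;                        // small enough that doubles stay exact
    std::vector<double> T(N + 1, 0), B(N + 1, 0);
    for (int n = 1; n <= 99; ++n) T[n] = 1;   // base cases T(1..99) = 1
    B[99] = 1;
    for (int n = 100; n <= N; ++n) {
        T[n] = T[n - 1] + T[n - 2] + T[n - 3] + 7;   // exact recurrence
        B[n] = 3 * B[n - 1] + 7;                     // upper-bound recurrence
        double closed = 4.5 * std::pow(3.0, n - 99) - 3.5;
        std::printf("n=%3d  T=%.0f  B=%.0f  closed=%.0f  T<=B: %s\n",
                    n, T[n], B[n], closed, T[n] <= B[n] ? "yes" : "no");
    }
    return 0;
}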

Number of ways to divide a number

Given a number N, print in how many ways it can be represented as
N = a + b + c + d
with
1 <= a <= b <= c <= d; 1 <= N <= M
My observation:
For N = 4: Only 1 way - 1 + 1 + 1 + 1
For N = 5: Only 1 way - 1 + 1 + 1 + 2
For N = 6: 2 ways - 1 + 1 + 1 + 3
1 + 1 + 2 + 2
For N = 7: 3 ways - 1 + 1 + 1 + 4
1 + 1 + 2 + 3
1 + 2 + 2 + 2
For N = 8: 5 ways - 1 + 1 + 1 + 5
1 + 1 + 2 + 4
1 + 1 + 3 + 3
1 + 2 + 2 + 3
2 + 2 + 2 + 2
So I have reduced it to a DP solution as follows:
DP[4] = 1, DP[5] = 1;
for(int i = 6; i <= M; i++)
DP[i] = DP[i-1] + DP[i-2];
Is my observation correct or am I missing anything? I don't have any test cases to run it on, so please let me know if the approach is correct or wrong.
It's not correct. Here is the correct one:
Let DP[n,k] be the number of ways to represent n as a sum of k numbers.
Then you are looking for DP[n,4].
DP[n,1] = 1
DP[n,2] = DP[n-2, 2] + DP[n-1,1] = n / 2
DP[n,3] = DP[n-3, 3] + DP[n-1,2]
DP[n,4] = DP[n-4, 4] + DP[n-1,3]
I will only explain the last line; you can see right away why the others are true.
Let's take one case of n=a+b+c+d.
If a > 1, then n-4 = (a-1)+(b-1)+(c-1)+(d-1) is a valid sum for DP[n-4,4].
If a = 1, then n-1 = b+c+d is a valid sum for DP[n-1,3].
Also in reverse:
For each valid n-4 = x+y+z+t we have a valid n=(x+1)+(y+1)+(z+1)+(t+1).
For each valid n-1 = x+y+z we have a valid n=1+x+y+z.
Unfortunately, your recurrence is wrong, because for n = 9, the solution is 6, not 8.
If p(n,k) is the number of ways to partition n into k non-zero integer parts, then we have
p(0,0) = 1
p(n,k) = 0 if k > n or (n > 0 and k = 0)
p(n,k) = p(n-k, k) + p(n-1, k-1)
Because there is either a part of value 1 (in which case taking this part away yields a partition of n-1 into k-1 parts), or else every part is at least 2 and you can subtract 1 from each part, yielding a partition of n-k into k parts. It's easy to show that this process is a bijection, hence the recurrence.
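Here is a minimal C++ sketch of this p(n, k) recurrence (my own addition, not part of the answer), filled in bottom-up; for n = 9, k = 4 it prints 6, matching the count above.
#include <cstdio>
#include <vector>

// p(n, k): number of partitions of n into exactly k non-zero parts,
// using p(n, k) = p(n-k, k) + p(n-1, k-1) with p(0, 0) = 1.
long long partitions(int N, int K) {
    std::vector<std::vector<long long>> p(N + 1, std::vector<long long>(K + 1, 0));
    p[0][0] = 1;
    for (int n = 1; n <= N; ++n)
        for (int k = 1; k <= K && k <= n; ++k)
            p[n][k] = p[n - k][k] + p[n - 1][k - 1];
    return p[N][K];
}

int main() {
    for (int n = 4; n <= 12; ++n)
        std::printf("p(%d, 4) = %lld\n", n, partitions(n, 4));
    return 0;
}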
UPDATE:
For the specific case k = 4, OEIS tells us that there is another linear recurrence that depends only on n:
a(n) = 1 + a(n-2) + a(n-3) + a(n-4) - a(n-5) - a(n-6) - a(n-7) + a(n-9)
This recurrence can be solved via standard methods to get an explicit formula. I wrote a small SAGE script to solve it and got the following formula:
a(n) = 1/144*n^3 + 1/32*(-1)^n*n + 1/48*n^2 - 1/54*(1/2*I*sqrt(3) - 1/2)^n*(I*sqrt(3) + 3) - 1/54*(-1/2*I*sqrt(3) - 1/2)^n*(-I*sqrt(3) + 3) + 1/16*I^n + 1/16*(-I)^n + 1/32*(-1)^n - 1/32*n - 13/288
OEIS also gives the following simplification:
a(n) = round((n^3 + 3*n^2 -9*n*(n % 2))/144)
Which I have not verified.
#include <iostream>
using namespace std;

int func_count(int n, int m)
{
    if (n == m)
        return 1;
    if (n < m)
        return 0;
    if (m == 1)
        return 1;
    if (m == 2)
        return func_count(n - 2, 2) + func_count(n - 1, 1);
    if (m == 3)
        return func_count(n - 3, 3) + func_count(n - 1, 2);
    return func_count(n - 1, 3) + func_count(n - 4, 4);
}

int main()
{
    int t;
    cin >> t;
    cout << func_count(t, 4);
    return 0;
}
I think that defining a function f(N, m, n), where N is the sum we want to produce, m is the maximum value allowed for each term, and n is the number of terms in the sum, should work.
For n = 1, f(N, m, 1) is defined to be 1 if 1 <= N <= m, and 0 otherwise.
For n > 1, f(N, m, n) is the sum, over all t from 1 to min(m, N), of f(N - t, t, n - 1).
This represents setting each term, right to left (largest term first); passing t as the new maximum keeps the terms non-increasing.
You can then solve the problem using this relationship, probably using memoization.
For a maximum of n = 4 terms and N = 5000 (and implemented cleverly so that cases with 0 possibilities are detected quickly), I think this is computable quickly enough for most purposes.
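A minimal memoized C++ sketch of this idea (my own addition; the function name and memo layout are hypothetical, not from the answer). It agrees with the p(n, k) values above, e.g. f(9, 9, 4) = 6.
#include <cstdio>
#include <map>
#include <tuple>

// f(N, m, n): ways to write N as n terms, each between 1 and m, chosen in
// non-increasing order (the largest term t is fixed first, then we recurse
// with t as the new maximum).
static std::map<std::tuple<int, int, int>, long long> memo;

long long f(int N, int m, int n) {
    if (n == 1) return (N >= 1 && N <= m) ? 1 : 0;
    auto key = std::make_tuple(N, m, n);
    auto it = memo.find(key);
    if (it != memo.end()) return it->second;
    long long ways = 0;
    int top = (m < N) ? m : N;
    for (int t = 1; t <= top; ++t)      // t is the current (largest remaining) term
        ways += f(N - t, t, n - 1);
    return memo[key] = ways;
}

int main() {
    for (int N = 4; N <= 12; ++N)
        std::printf("f(%d, %d, 4) = %lld\n", N, N, f(N, N, 4));
    return 0;
}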

Recurrences using Substitution Method

Determine positive constants c and n0 for the following recurrences (using the substitution method):
T(n) = T(ceiling(n/2)) + 1 ... Guess is Big-Oh(log base 2 of n)
T(n) = 3T(floor(n/3)) + n ... Guess is Big-Omega (n * log base 3 of n)
T(n) = 2T(floor(n/2) + 17) + n ... Guess is Big-Oh(n * log base 2 of n).
I am giving my solution for Problem 1:
Our guess is: T(n) = O(log_2(n)).
By the induction hypothesis, assume T(k) <= c * log_2(k) for all k < n, where c is a constant and c > 0.
T(n) = T(ceiling(n/2)) + 1
    <= c*log_2(ceiling(n/2)) + 1
    <= c*{log_2(n/2) + 1} + 1
     = c*log_2(n/2) + c + 1
     = c*{log_2(n) - log_2(2)} + c + 1
     = c*log_2(n) - c + c + 1
     = c*log_2(n) + 1
so T(n) is not <= c*log_2(n), because c*log_2(n) + 1 is not <= c*log_2(n).
To fix this, the following trick is used:
T(n) = T(ceiling(n/2)) + 1
    <= c*log_2(ceiling(n/2)) + 1
    <= c*{log_2(n/2) + b} + 1    where 0 <= b < 1
     = c*{log_2(n) - log_2(2) + b} + 1
     = c*{log_2(n) - 1 + b} + 1
     = c*log_2(n) - c + bc + 1
     = c*log_2(n) - (c - bc - 1)
    <= c*log_2(n)    if c - bc - 1 >= 0, i.e. c >= 1 / (1 - b)
so T(n) = O(log_2(n)).
This solution seems correct to me. My question is: is this the proper approach?
Thanks to all of you.
For the first exercise:
We want to show by induction that T(n) <= ceiling(log(n)) + 1.
Let's assume that T(1) = 1; then T(1) = 1 <= ceiling(log(1)) + 1 = 1, and the base of the induction is proved.
Now assume that for every 1 <= i < n it holds that T(i) <= ceiling(log(i)) + 1.
For the inductive step we have to distinguish the cases where n is even and where n is odd.
If n is even: T(n) = T(ceiling(n/2)) + 1 = T(n/2) + 1 <= ceiling(log(n/2)) + 1 + 1 = ceiling(log(n) - 1) + 1 + 1 = ceiling(log(n)) + 1.
If n is odd: T(n) = T(ceiling(n/2)) + 1 = T((n+1)/2) + 1 <= ceiling(log((n+1)/2)) + 1 + 1 = ceiling(log(n+1) - 1) + 1 + 1 = ceiling(log(n+1)) + 1 = ceiling(log(n)) + 1
The last step is the tricky one, but it is possible because n is odd and therefore cannot be a power of 2, so ceiling(log(n+1)) = ceiling(log(n)).
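A quick numeric check of this bound (my own sketch, not part of the answer): compute T(n) = T(ceiling(n/2)) + 1 with T(1) = 1 and compare it with ceiling(log_2(n)) + 1.
#include <cstdio>
#include <vector>

int main() {
    const int N = 1000000;
    std::vector<int> T(N + 1, 0);
    T[1] = 1;
    for (int n = 2; n <= N; ++n)
        T[n] = T[(n + 1) / 2] + 1;            // T(n) = T(ceiling(n/2)) + 1
    for (int n = 1; n <= N; ++n) {
        int k = 0;                            // k = ceiling(log_2(n))
        while ((1 << k) < n) ++k;
        if (T[n] > k + 1) {
            std::printf("bound violated at n = %d\n", n);
            return 1;
        }
    }
    std::printf("T(n) <= ceiling(log_2(n)) + 1 holds for all n up to %d\n", N);
    return 0;
}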
Problem #1:
T(1) = t0
T(2) = T(1) + 1 = t0 + 1
T(4) = T(2) + 1 = t0 + 2
T(8) = T(4) + 1 = t0 + 3
...
T(2^(m+1)) = T(2^m) + 1 = t0 + (m + 1)
Letting n = 2^(m+1), we get that T(n) = t0 + log_2(n) = O(log_2(n))
Problem #2:
T(1) = t0
T(3) = 3T(1) + 3 = 3t0 + 3
T(9) = 3T(3) + 9 = 3(3t0 + 3) + 9 = 9t0 + 18
T(27) = 3T(9) + 27 = 3(9t0 + 18) + 27 = 27t0 + 81
...
T(3^(m+1)) = 3T(3^m) + 3^(m+1) = (3^(m+1))t0 + (3^(m+1))(m+1)
Letting n = 3^(m+1), we get that T(n) = nt0 + nlog_3(n) = O(nlog_3(n)).
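Both expansions are easy to verify numerically; here is a small C++ sketch (my own addition) that iterates the two recurrences along powers of 2 and 3 and compares them with the closed forms t0 + m and (3^m)t0 + m*3^m, assuming t0 = 1.
#include <cstdio>

int main() {
    const long long t0 = 1;                    // assumed initial value T(1) = t0

    // Problem #1: T(n) = T(n/2) + 1 along n = 2^m; closed form t0 + m
    long long T1 = t0;
    for (int m = 1; m <= 20; ++m) {
        T1 = T1 + 1;                           // T(2^m) = T(2^(m-1)) + 1
        std::printf("P1: T(2^%d) = %lld, closed form = %lld\n", m, T1, t0 + m);
    }

    // Problem #2: T(n) = 3T(n/3) + n along n = 3^m; closed form (3^m)t0 + m*3^m
    long long T2 = t0, pow3 = 1;
    for (int m = 1; m <= 20; ++m) {
        pow3 *= 3;                             // pow3 = 3^m
        T2 = 3 * T2 + pow3;                    // T(3^m) = 3T(3^(m-1)) + 3^m
        std::printf("P2: T(3^%d) = %lld, closed form = %lld\n",
                    m, T2, pow3 * t0 + m * pow3);
    }
    return 0;
}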
Problem #3:
Consider n = 34. T(34) = 2T(floor(34/2) + 17) + 34 = 2T(34) + 34. We can solve this to find that T(34) = -34. We can also see that for odd n, T(n) = 1 + T(n - 1). We continue to find which values are fixed:
T(0) = 2T(17) + 0 = 2T(17)
T(17) = 1 + T(16)
T(16) = 2T(25) + 16
T(25) = T(24) + 1
T(24) = 2T(29) + 24
T(29) = T(28) + 1
T(28) = 2T(31) + 28
T(31) = T(30) + 1
T(30) = 2T(32) + 30
T(32) = 2T(33) + 32
T(33) = T(32) + 1
We get T(32) = 2T(33) + 32 = 2T(32) + 34, meaning that T(32) = -34. Working backward, we get
T(32) = -34
T(33) = -33
T(30) = -38
T(31) = -37
T(28) = -46
T(29) = -45
T(24) = -66
T(25) = -65
T(16) = -114
T(17) = -113
T(0) = -226
As you can see, this recurrence is a little more complicated than the others, and as such, you should probably take a hard look at this one. If I get any other ideas, I'll come back; otherwise, you're on your own.
EDIT:
After looking at #3 some more, it looks like you're right in your assessment that it's O(nlog_2(n)). So you can try listing a bunch of numbers - I did it from n=0 to n=45. You notice a pattern: it goes from negative numbers to positive numbers around n=43,44. To get the next even-index element of the sequence, you add powers of two, in the following order: 4, 8, 4, 16, 4, 8, 4, 32, 4, 8, 4, 16, 4, 8, 4, 64, 4, 8, 4, 16, 4, 8, 4, 32, ...
These numbers are essentially where you'd mark an arbitrary-length ruler: quarters, halves, eighths, sixteenths, etc. As such, we can solve the equivalent problem of finding the order of the sum 1 + 2 + 1 + 4 + 1 + 2 + 1 + 8 + ... (the same as ours divided by 4, and ours is shifted, but the order still works out). Observing that among the first n terms (n a power of 2) the value 2^k occurs about n/2^(k+1) times, each k from 0 to log_2(n) contributes about (n/2^(k+1)) * 2^k = n/2 to the sum, so the sum of the first n terms is about (n/2) * log_2(n). Multiply by 4 to get ours, shift n to the right by 34, and perhaps add a constant value to the result. So we're playing around with something like y = 2n*log_2(n) + k' for some constant k'.
Phew. That was a tricky one. Note that this recurrence does not admit arbitrary "initial conditions"; in other words, the recurrence does not describe a family of sequences, but one specific one, with no parameterization.
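To see the ruler pattern concretely, here is a small C++ sketch (my own addition) that sums the sequence 1, 2, 1, 4, 1, 2, 1, 8, ... (the lowest set bit of i) and compares the prefix sums at powers of two with (n/2) * log_2(n); the two grow at the same rate, up to lower-order terms.
#include <cstdio>

int main() {
    long long sum = 0;
    for (long long i = 1, next = 2; i <= (1LL << 22); ++i) {
        sum += i & -i;                        // lowest set bit: 1, 2, 1, 4, 1, 2, 1, 8, ...
        if (i == next) {                      // report at n = 2, 4, 8, ...
            long long m = 0;
            while ((1LL << m) < i) ++m;       // m = log_2(n)
            std::printf("n = 2^%lld  sum = %lld  (n/2)*log_2(n) = %lld\n",
                        m, sum, (i / 2) * m);
            next *= 2;
        }
    }
    return 0;
}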

Resources