time complexity of recursion function - algorithm

I don't know where to start when calculating the time complexity of this function. What is its big-O time complexity? I was told the answer is 3^n.
int f3(int n) {
    if (n < 100)
        return 1;
    return n * f3(n-1) * f3(n-2) * f3(n-3);
}

We have:
1 : if statement
3 : * operators
3 : function calls
Total: 7
So we have:
T(n) = T(n-1) + T(n-2) + T(n-3) + 7
T(99) = 1
T(98) = 1
...
T(1) = 1
Then, to bound T(n), note that T(n-3) <= T(n-2) <= T(n-1), therefore:
T(n) <= T(n-1) + T(n-1) + T(n-1) + 7
T(n) <= 3T(n-1) + 7
So we can solve T(n) = 3T(n-1) + 7 instead (this gives an upper bound; for large n, T(n-1), T(n-2), ... are of the same order).
Then we can calculate T(n) by the following expansion:
T(n) = 3*T(n-1) + 7
     = 3*(3*T(n-2) + 7) + 7 = 3^2*T(n-2) + 7*(3^1 + 3^0)
     = 3^2*(3*T(n-3) + 7) + ... = 3^3*T(n-3) + 7*(3^2 + 3^1 + 3^0)
     = 3^4*T(n-4) + 7*(3^3 + 3^2 + 3^1 + 3^0)
     ...
     = 3^(n-99)*T(99) + 7*(3^(n-100) + 3^(n-101) + ... + 3^1 + 3^0)
     = 3^(n-99) + 7*(3^(n-100) + 3^(n-101) + ... + 3^1 + 3^0)
Now use the fact that 1 + 3 + 3^2 + ... + 3^(k-1) = (3^k - 1)/2, here with k = n - 99.
Therefore we reach:
T(n) = 3^(n-99) + 7*(3^(n-99) - 1)/2 = (9/2)*3^(n-99) - 7/2
     = (9/(2*3^99))*3^n - 7/2
Finally: the big O is O(3^n).
If the base case is instead T(1) = 1, the answer is the same.
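As a quick sanity check of the algebra, here is a small sketch that iterates the simplified recurrence T(n) = 3*T(n-1) + 7 from T(99) = 1 and compares it with the closed form above (the program itself is an addition for illustration, not part of the original question):

#include <stdio.h>

/* Sketch: iterate the simplified recurrence T(n) = 3*T(n-1) + 7 with T(99) = 1
   and compare it with the closed form (9*3^(n-99) - 7) / 2 derived above. */
int main(void) {
    long long t = 1;     /* T(99) */
    long long pow3 = 1;  /* 3^(n-99) */
    for (int n = 100; n <= 120; n++) {
        t = 3 * t + 7;
        pow3 *= 3;
        long long closed = (9 * pow3 - 7) / 2;
        printf("n=%d  T(n)=%lld  closed form=%lld  %s\n",
               n, t, closed, t == closed ? "ok" : "MISMATCH");
    }
    return 0;
}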

Related

Time complexity for a recursive function

I'm a computer science student, I have my final exam next week, and I got confused trying to find the time complexity of the following function. Can you explain it to me?
int bss(int n){
    if(n <= 1)
        return n;
    return bss(n/2) + bss(n/2);
}
For problems like this, you should figure out the recurrence relation first (by looking at the code), then solve the recurrence relation (using mathematics).
To do step 1, we need to look at each line and see what it contributes to the overall running time T(n) of our algorithm:
int bss(int n){
    if(n <= 1)                   // contributes a constant time a
        return n;                // contributes a constant time b in the base case only
    return bss(n/2) + bss(n/2);  // contributes a constant time c
                                 // for the two divisions and one addition,
                                 // plus 2T(n/2)
}
Adding up, we get two cases:
n <= 1: T(n) = a + b
n > 1: T(n) = a + c + 2T(n/2)
To solve this system, we can start writing out terms for values of n. Because we divide n by 2, we might as well choose even n only. Also, it would be nice to have already calculated T(n/2) to calculate T(n); so, we can double our test values of n each time.
n T(n)
---------
1 a + b
2 a + c + 2T(1) = a + c + 2a + 2b = 3a + 2b + c
4 a + c + 2T(2) = a + c + 6a + 4b + 2c = 7a + 4b + 3c
8 a + c + 2T(4) = a + c + 14a + 8b + 6c = 15a + 8b + 7c
16 a + c + 2T(8) = a + c + 30a + 16b + 14c = 31a + 16b + 15c
...
k (2k - 1)a + kb + (k - 1)c
Based on the pattern we saw, it seems as though the solution for n = k is (2k - 1)a + kb + (k - 1)c. We can try to verify this by plugging it into our equations:
k = 1: (2k - 1)a + kb + (k - 1)c = a + b = T(1) ... correct
k > 1:
(2k - 1)a + kb + (k - 1)c ?= a + c + 2[(2k/2 - 1)a + (k/2)b + (k/2 - 1)c]
?= a + c + (2k - 2)a + kb + (k - 2)c
?= a + c + 2ka - 2a + kb + kc - 2c
?= -a -c + 2ka + kb + kc
?= (2k - 1)a + kb + (k - 1)c ... correct
So, we have found a valid solution to our recurrence relation. The solution is:
T(n) = (2n - 1)a + nb + (n - 1)c
Rearranging:
T(n) = (2a + b + c)n - (a + c)
T(n) is the equation of a line, so the running time is linear, i.e., Theta(n).
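If you want to double-check the closed form numerically, here is a minimal sketch that evaluates the recurrence for powers of two with arbitrary sample constants a, b, c (the concrete values are placeholders chosen only for the check):

#include <stdio.h>

/* Sketch: evaluate the recurrence T(1) = a + b, T(n) = a + c + 2*T(n/2)
   for powers of two and compare it with the closed form
   (2n - 1)a + nb + (n - 1)c. The values of a, b, c are arbitrary. */
static long long T(long long n, long long a, long long b, long long c) {
    if (n <= 1) return a + b;
    return a + c + 2 * T(n / 2, a, b, c);
}

int main(void) {
    const long long a = 3, b = 5, c = 7;  /* arbitrary constants */
    for (long long n = 1; n <= 1024; n *= 2) {
        long long closed = (2 * n - 1) * a + n * b + (n - 1) * c;
        long long rec = T(n, a, b, c);
        printf("n=%lld  recurrence=%lld  closed form=%lld  %s\n",
               n, rec, closed, rec == closed ? "ok" : "MISMATCH");
    }
    return 0;
}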

Recurrence relation of a recursive call inside another

I'm trying to analyze the cost of this code:
static int funct1(int x) {
    if (x <= 1) return x;
    return funct1(funct1(x/2)) + 1;
}
How can I solve the recurrence relation T(n) = T(T(n/2)) + 1?
Your analysis is not correct: the recurrence is T(n) = T(f(n/2)) + T(n/2) + 1, not T(T(n/2)) + 1. You've missed the T(n/2) term: the program must first compute funct1(x/2), which costs T(n/2), and only then call funct1 on that return value, which costs T(f(n/2)).
Here f(n) denotes the value returned by funct1(n).
Suppose n = 2^k. Then f(2^k) = f(f(2^(k-1))) + 1 = f(f(f(2^(k-2))) + 1) + 1 = ... = f(f(f(...f(1) + 1...) + 1) + 1) + 1. Computing small values: f(1) = 1, f(2) = 2, f(3) = 2, f(4) = 3, f(8) = f(f(4)) + 1 = f(3) + 1 = 3, f(16) = f(f(8)) + 1 = f(3) + 1 = 3, and f(32) = f(f(16)) + 1 = f(3) + 1 = 3. As you can see, the value stabilizes: f(2^k) = 3 for all k >= 2.
Therefore, for large n, T(n) = T(3) + T(n/2) + 1 = T(n/2) + O(1), which gives T(n) = O(log n).
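A quick way to see both facts empirically is to instrument funct1 with a call counter (the counter is a hypothetical addition, not part of the question) and print the return value and call count for n = 2^k:

#include <stdio.h>

/* Sketch: an instrumented copy of funct1 with a call counter. The return
   value stabilizes at 3 while the call count grows roughly like log n. */
static long long calls = 0;

static int funct1(int x) {
    calls++;
    if (x <= 1) return x;
    return funct1(funct1(x / 2)) + 1;
}

int main(void) {
    for (int k = 1; k <= 20; k++) {
        int n = 1 << k;  /* n = 2^k */
        calls = 0;
        int value = funct1(n);
        printf("n=%d  funct1(n)=%d  calls=%lld\n", n, value, calls);
    }
    return 0;
}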

The sum from 1 to n in Theta(log n)

Is there any way to calculate the sum from 1 to n in Theta(log n)?
Of course, the obvious way to do it is sum = n*(n+1)/2.
However, for practice, I want to calculate it in Theta(log n).
For example,
sum = 0; for (int i = 1; i <= n; i++) { sum += i; }
this code calculates it in Theta(n).
The "fair" way (without using closed-form formulas) requires summing all n values directly, so there is no way to avoid O(n) behavior.
If you want some artificial approach to get O(log(N)) time, consider, for example, using powers of two, knowing that Sum(1..2^k) = 2^(k-1) + 2^(2k-1); for example, Sum(8) = 4 + 32 = 36. Pseudocode:
function Sum(n)
    if n < 2
        return n
    p = 1    // 2^(k-1)
    p2 = 2   // 2^(2k-1)
    while p * 4 < n:
        p = p * 2
        p2 = p2 * 4
    return p + p2 +               // sum of 1..2^k
           2 * p * (n - 2 * p) +  // (n - 2*p) summands above 2^k include 2^k each
           Sum(n - 2 * p)         // sum of the rest above 2^k
Here 2*p = 2^k is the largest power of two strictly below n (for n >= 3). Example:
Sum(7) = Sum(4) + 5 + 6 + 7 =
Sum(4) + (4 + 1) + (4 + 2) + (4 + 3) =
Sum(4) + 3 * 4 + Sum(3) =
Sum(4) + 3 * 4 + Sum(2) + 1 * 2 + Sum(1) =
2 + 8 + 12 + 1 + 2 + 2 + 1 = 28
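Here is a direct C translation of the pseudocode above, checked against n*(n+1)/2 for small n; it is only a sketch of the idea, not a tuned implementation:

#include <stdio.h>

/* Sketch: direct C translation of the pseudocode above, checked against
   the closed form n*(n+1)/2 for small n. */
static long long Sum(long long n) {
    if (n < 2) return n;
    long long p = 1;   /* 2^(k-1) */
    long long p2 = 2;  /* 2^(2k-1) */
    while (p * 4 < n) {
        p *= 2;
        p2 *= 4;
    }
    return p + p2                  /* sum of 1..2^k, where 2^k = 2*p     */
         + 2 * p * (n - 2 * p)     /* the 2^k part of each term above 2^k */
         + Sum(n - 2 * p);         /* sum of the remainders above 2^k    */
}

int main(void) {
    for (long long n = 0; n <= 1000; n++) {
        if (Sum(n) != n * (n + 1) / 2) {
            printf("mismatch at n=%lld\n", n);
            return 1;
        }
    }
    printf("Sum(n) matches n*(n+1)/2 for n = 0..1000\n");
    return 0;
}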

Find the complexity of a loop

What is the time complexity T(n) of the following portion of code? For simplicity you may assume that n is a power of 2, that is, n = 2^k for some positive integer k.
for (i = 1; i <= n; i++)
    for (j = n/2; j < n; j++)
        print x;
choose the correct answer:
T(n)=n+1
T(n)=n^2+2n
T(n)=n
T(n)=n^3
T(n)=n(n/2 +1)
So let's write out some iterations:
i j
1 0 to n-1
2 1 to n-1
3 1 to n-1
4 2 to n-1
5 2 to n-1
.
.
.
n-2 (n-2)/2 to n-1
n-1 (n-2)/2 to n-1
n n/2 to n-1
Now let's set aside the 1st and nth rows (to add back later) and calculate the sum of the range sizes (end - start + 1) from i = 2 to i = n-1.
S = (n-1 - 1 + 1) + (n-1 - 1 + 1) + (n-1 - 2 + 1) + (n-1 - 2 + 1) + ... + (n-1 - (n-2)/2 + 1) + (n-1 - (n-2)/2 + 1)
...
S = (n - 1) + (n - 1) + (n - 2) + (n - 2) + ... + (n - (n-2)/2) + (n - (n-2)/2)
...
S = 2*(n - 1) + 2*(n - 2) + ... + 2*(n - (n - 2)/2)
...
S = 2*((n - 1) + (n - 2) + ... + (n - (n - 2)/2))
...
S = 2*(n*(n-2)/2 - (1 + 2 + 3 + ... + (n - 2)/2 ))
...
The last bracket is an arithmetic progression, which is calculated by the formula:
N*(A1 + A2)/2
where:
N = (n - 2)/2, A1 = 1, A2 = (n - 2)/2
So we end up with:
1 + 2 + 3 + ... + (n - 2)/2 = n*(n - 2)/8
...
S = n^2 - 2*n - 2*(n*(n - 2)/8)
...
S = n^2 - 2*n - n*(n - 2)/4
Now let's add back the 2 values that we set aside:
S = n + n/2 + n^2 - 2*n - n*(n - 2)/4
...
S = (4*n + 2*n + 4*n^2 - 8*n)/4 - n*(n - 2)/4
...
S = (4*n + 2*n + 4*n^2 - 8*n - n^2 + 2*n)/4
...
S = (3/4)*n^2
I get none of the above. I have checked and rechecked but cannot find the error.
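One way to hunt for the discrepancy is to brute-force the iteration count for the loops exactly as written (inner loop starting at j = n/2) and compare it with the answer choice n*(n/2 + 1) and with the hand-derived (3/4)*n^2; a minimal sketch:

#include <stdio.h>

/* Sketch: brute-force the iteration count of the loops exactly as written
   (inner loop starting at j = n/2) and compare it with the answer choice
   n*(n/2 + 1) and with the hand-derived (3/4)*n^2. */
int main(void) {
    for (long long n = 4; n <= 64; n *= 2) {
        long long count = 0;
        for (long long i = 1; i <= n; i++)
            for (long long j = n / 2; j < n; j++)
                count++;  /* stands in for "print x" */
        printf("n=%lld  actual=%lld  n*(n/2+1)=%lld  (3/4)*n^2=%lld\n",
               n, count, n * (n / 2 + 1), 3 * n * n / 4);
    }
    return 0;
}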

Recurrences using Substitution Method

Determine positive numbers c and n0 for the following recurrences (using the substitution method):
T(n) = T(ceiling(n/2)) + 1 ... guess is O(log_2(n))
T(n) = 3T(floor(n/3)) + n ... guess is Omega(n * log_3(n))
T(n) = 2T(floor(n/2) + 17) + n ... guess is O(n * log_2(n)).
Here is my solution for problem 1:
Our guess is: T(n) = O(log_2(n)).
By the induction hypothesis, assume T(k) <= c * log_2(k) for all k < n, where c is a constant and c > 0.
T(n) = T(ceiling(n/2)) + 1
     <= c*log_2(ceiling(n/2)) + 1
     <= c*{log_2(n/2) + 1} + 1
      = c*log_2(n/2) + c + 1
      = c*{log_2(n) - log_2(2)} + c + 1
      = c*log_2(n) - c + c + 1
      = c*log_2(n) + 1
which does not give T(n) <= c*log_2(n), because c*log_2(n) + 1 is not <= c*log_2(n).
To remedy this, a trick is used as follows:
T(n) = T(ceiling(n/2)) + 1
     <= c*log_2(ceiling(n/2)) + 1
     <= c*{log_2(n/2) + b} + 1, where 0 <= b < 1
      = c*{log_2(n) - log_2(2) + b} + 1
      = c*{log_2(n) - 1 + b} + 1
      = c*log_2(n) - c + bc + 1
      = c*log_2(n) - (c - bc - 1)
     <= c*log_2(n), provided c - bc - 1 >= 0, i.e., c >= 1 / (1 - b).
So T(n) <= c*log_2(n) for c >= 1 / (1 - b), and therefore T(n) = O(log_2(n)).
This solution seems correct to me... My question is: is this the proper approach?
Thanks to all of you.
For the first exercise:
We want to show by induction that T(n) <= ceiling(log(n)) + 1.
Let's assume that T(1) = 1; then T(1) = 1 <= ceiling(log(1)) + 1 = 1, and the base case of the induction is proved.
Now, we assume that for every 1 <= i < n it holds that T(i) <= ceiling(log(i)) + 1.
For the inductive step we have to distinguish the cases where n is even and where n is odd.
If n is even: T(n) = T(ceiling(n/2)) + 1 = T(n/2) + 1 <= ceiling(log(n/2)) + 1 + 1 = ceiling(log(n) - 1) + 1 + 1 = ceiling(log(n)) + 1.
If n is odd: T(n) = T(ceiling(n/2)) + 1 = T((n+1)/2) + 1 <= ceiling(log((n+1)/2)) + 1 + 1 = ceiling(log(n+1) - 1) + 1 + 1 = ceiling(log(n+1)) + 1 = ceiling(log(n)) + 1
The last step is tricky, but it is possible because n is odd and therefore cannot be a power of 2.
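If you want a numerical sanity check to go with the induction, here is a small sketch that tabulates T(n) = T(ceiling(n/2)) + 1 with T(1) = 1 and verifies the bound for a range of n (a check, not a proof):

#include <stdio.h>

/* Sketch: tabulate T(n) = T(ceil(n/2)) + 1 with T(1) = 1 and check the
   claimed bound T(n) <= ceil(log2(n)) + 1 numerically. */
static int ceil_log2(long long n) {  /* smallest k with 2^k >= n */
    int k = 0;
    long long p = 1;
    while (p < n) { p <<= 1; k++; }
    return k;
}

int main(void) {
    enum { N = 1 << 20 };
    static long long T[N + 1];
    T[1] = 1;
    for (long long n = 2; n <= N; n++)
        T[n] = T[(n + 1) / 2] + 1;  /* ceil(n/2) = (n+1)/2 for integers */
    for (long long n = 1; n <= N; n++)
        if (T[n] > ceil_log2(n) + 1) {
            printf("bound violated at n=%lld\n", n);
            return 1;
        }
    printf("T(n) <= ceil(log2(n)) + 1 holds for n = 1..%d\n", N);
    return 0;
}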
Problem #1:
T(1) = t0
T(2) = T(1) + 1 = t0 + 1
T(4) = T(2) + 1 = t0 + 2
T(8) = T(4) + 1 = t0 + 3
...
T(2^(m+1)) = T(2^m) + 1 = t0 + (m + 1)
Letting n = 2^(m+1), we get that T(n) = t0 + log_2(n) = O(log_2(n))
Problem #2:
T(1) = t0
T(3) = 3T(1) + 3 = 3t0 + 3
T(9) = 3T(3) + 9 = 3(3t0 + 3) + 9 = 9t0 + 18
T(27) = 3T(9) + 27 = 3(9t0 + 18) + 27 = 27t0 + 81
...
T(3^(m+1)) = 3T(3^m) + 3^(m+1) = (3^(m+1))t0 + (3^(m+1))(m+1)
Letting n = 3^(m+1), we get that T(n) = nt0 + nlog_3(n) = O(nlog_3(n)).
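As with problem #1, the unrolled formula can be checked numerically; this sketch iterates the recurrence along n = 3^m from an arbitrary T(1) = t0:

#include <stdio.h>

/* Sketch: iterate T(n) = 3*T(n/3) + n along n = 3^m from an arbitrary
   T(1) = t0 and compare it with the unrolled formula 3^m*t0 + m*3^m. */
int main(void) {
    const long long t0 = 5;  /* arbitrary base value */
    long long t = t0, n = 1;
    for (int m = 1; m <= 15; m++) {
        n *= 3;
        t = 3 * t + n;
        long long closed = n * t0 + (long long)m * n;
        printf("m=%d  n=%lld  T(n)=%lld  closed form=%lld  %s\n",
               m, n, t, closed, t == closed ? "ok" : "MISMATCH");
    }
    return 0;
}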
Problem #3:
Consider n = 34. T(34) = 2T(17+17) + 34 = 2T(34) + 34. We can solve this to find that T(34) = -34. We can also see that for odd n, T(n) = 1 + T(n - 1). We continue to find what values are fixed:
T(0) = 2T(17) + 0 = 2T(17)
T(17) = 1 + T(16)
T(16) = 2T(25) + 16
T(25) = T(24) + 1
T(24) = 2T(29) + 24
T(29) = T(28) + 1
T(28) = 2T(31) + 28
T(31) = T(30) + 1
T(30) = 2T(32) + 30
T(32) = 2T(33) + 32
T(33) = T(32) + 1
We get T(32) = 2T(33) + 32 = 2T(32) + 34, meaning that T(32) = -34. Working backward, we get
T(32) = -34
T(33) = -33
T(30) = -38
T(31) = -37
T(28) = -46
T(29) = -45
T(24) = -66
T(25) = -65
T(16) = -114
T(17) = -113
T(0) = -226
As you can see, this recurrence is a little more complicated than the others, and as such, you should probably take a hard look at this one. If I get any other ideas, I'll come back; otherwise, you're on your own.
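For what it's worth, the backward substitution can be re-checked mechanically; this sketch recomputes the chain above starting from the fixed point T(32) = -34:

#include <stdio.h>

/* Sketch: recompute the backward-substitution chain above, starting from
   the fixed point T(32) = -34, and re-check the fixed-point equation. */
int main(void) {
    long long T32 = -34;      /* from T(32) = 2*T(32) + 34 */
    long long T33 = T32 + 1;  /* odd n: T(n) = T(n-1) + 1  */
    long long T30 = 2 * T32 + 30;
    long long T31 = T30 + 1;
    long long T28 = 2 * T31 + 28;
    long long T29 = T28 + 1;
    long long T24 = 2 * T29 + 24;
    long long T25 = T24 + 1;
    long long T16 = 2 * T25 + 16;
    long long T17 = T16 + 1;
    long long T0  = 2 * T17;
    printf("T(32)=%lld T(33)=%lld T(30)=%lld T(31)=%lld T(28)=%lld T(29)=%lld\n",
           T32, T33, T30, T31, T28, T29);
    printf("T(24)=%lld T(25)=%lld T(16)=%lld T(17)=%lld T(0)=%lld\n",
           T24, T25, T16, T17, T0);
    printf("check fixed point: 2*T(33)+32 = %lld (expected %lld)\n",
           2 * T33 + 32, T32);
    return 0;
}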
EDIT:
After looking at #3 some more, it looks like you're right in your assessment that it's O(n*log_2(n)). So you can try listing a bunch of numbers - I did it from n=0 to n=45. You notice a pattern: it goes from negative numbers to positive numbers around n=43,44. To get the next even-index element of the sequence, you add powers of two, in the following order: 4, 8, 4, 16, 4, 8, 4, 32, 4, 8, 4, 16, 4, 8, 4, 64, 4, 8, 4, 16, 4, 8, 4, 32, ...
These numbers are essentially where you'd mark an arbitrary-length ruler... quarters, halves, eighths, sixteenths, etc. As such, we can solve the equivalent problem of finding the order of the sum 1 + 2 + 1 + 4 + 1 + 2 + 1 + 8 + ... (the same as ours divided by 4, and ours is shifted, but the order will still work). Observing that among the first n terms roughly n/2^(k+1) of them equal 2^k, each k contributes about n/2 to the sum, so summing over k = 0 to log_2(n) gives roughly (n/2)*log_2(n). Multiply by 4 to get ours, shift x to the right by 34, and perhaps add a constant to the result. So we're playing around with y = 2*x*log_2(x) + k' for some constant k'.
Phew. That was a tricky one. Note that this recurrence does not admit arbitrary "initial conditions"; in other words, the recurrence does not describe a family of sequences, but one specific sequence, with no parameterization.
