Time Complexity of a recursive function calling itself thrice - algorithm

I'm working on my DSA. I came across a question for which the recursive func looks something like this:
private int func(int currentIndex, int[] arr, int[] memo) {
    if (currentIndex >= arr.length)
        return 0;
    if (memo[currentIndex] > -1)
        return memo[currentIndex];
    int sum = 0;
    int max = Integer.MIN_VALUE;
    for (int i = currentIndex; i < currentIndex + 3 && i < arr.length; i++) {
        sum += arr[i];
        max = Math.max(max, sum - func(i + 1, arr, memo));
    }
    memo[currentIndex] = max;
    return memo[currentIndex];
}
If I'm not using memoization, then intuitively at every step I have 3 choices, so the complexity should be 3^n. But how do I prove it mathematically?
So far I could come up with this: T(n) = T(n-1) + T(n-2) + T(n-3) + c
Also, what should be the complexity if I use memoization? I'm completely blank here.

The recurrence relation without memoization is:
      𝑇(𝑛) = 𝑇(𝑛-1) + 𝑇(𝑛-2) + 𝑇(𝑛-3) + 𝑐
This is similar to the Tribonacci sequence, which corresponds to a complexity of about O(1.84ⁿ).
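For the record, ignoring the constant term, the homogeneous recurrence 𝑇(𝑛) = 𝑇(𝑛-1) + 𝑇(𝑛-2) + 𝑇(𝑛-3) has the characteristic equation
      x^3 = x^2 + x + 1
whose dominant real root is about 1.8393 (the tribonacci constant), which is where the 1.84 comes from.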
With memoization it becomes a lot easier, as then the function runs in constant time when it is called with an argument that already has the result memoized. In practice this means that when one particular execution of the for loop has executed the first recursive call, the two remaining recursive calls will run in constant time, and so the recurrence relation simplifies to:
      𝑇(𝑛) = 𝑇(𝑛-1) + 𝑐
...which is O(𝑛).
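To make the O(𝑛) bound concrete, here is a minimal bottom-up sketch of the same recurrence (the name funcBottomUp and the dp array are my own additions, assuming the same arr semantics as the code above): each of the n indices is filled exactly once, with at most 3 units of work per index.

private int funcBottomUp(int[] arr) {
    int n = arr.length;
    int[] dp = new int[n + 1];   // dp[i] plays the role of memo[i]; dp[n] = 0 is the base case
    for (int i = n - 1; i >= 0; i--) {
        int sum = 0;
        int max = Integer.MIN_VALUE;
        for (int j = i; j < i + 3 && j < n; j++) {   // at most 3 iterations per index
            sum += arr[j];
            max = Math.max(max, sum - dp[j + 1]);
        }
        dp[i] = max;
    }
    return dp[0];
}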

Related

How do you get the complexity of a sequence alignment algorithm?

// m, n, x[] and a three-way min() are assumed to be defined elsewhere
int opt(int i, int j)
{
    if (i == m)
        return 2 * (n - j);
    else if (j == n)
        return 2 * (m - i);
    else {
        int penalty = (x[i] == x[j]) ? 0 : 1;
        return min(opt(i + 1, j + 1) + penalty,
                   opt(i + 1, j) + 2,
                   opt(i, j + 1) + 2);
    }
}
Why is the complexity of this algorithm 3^n ?
Analyze the time complexity of Algorithm opt.
You should first specify how you are calling this function.
Regarding the Big-O analysis, you can get it by drawing the recursion tree. Do this for small samples of n and you will notice that the height of the tree is n. You will also notice that each instance of the function calls the same function 3 more times, so the tree expands by a factor of 3 at every level. Hence your complexity is O(3^n).
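To put a number on that: a recursion tree of depth n in which every node makes 3 calls has at most
      1 + 3 + 3^2 + ... + 3^n = (3^(n+1) - 1) / 2
nodes, which is O(3^n).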
Bonus: Analogy with Fibonacci
Check the basic (without memoization) recursive version of the Fibonacci algorithm and you will see a similar structure, except that only 2 calls are made each time, and hence the complexity is O(2^n).
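For comparison, a minimal sketch of that naive Fibonacci (the standard textbook version, not code taken from this question):

static long fib(int n) {
    // two recursive calls per invocation and no memoization,
    // so the call tree roughly doubles at every level: O(2^n) calls
    if (n <= 1) return n;
    return fib(n - 1) + fib(n - 2);
}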

Selection Sort Recurrence Relation

Up front, this is a homework question, but I am having a difficult time understanding recurrence relations. I've scoured the internet for examples and they are very vague to me. I understand that recurrence relations for recursive algorithms don't have a single set way of being handled, but I am lost as to how to understand these. Here's the algorithm I have to work with:
void selectionSort(int array[]) {
    sort(array, 0);
}

void sort(int[] array, int i) {
    if (i < array.length - 1)
    {
        int j = smallest(array, i);   // T(n)
        int temp = array[i];
        array[i] = array[j];
        array[j] = temp;
        sort(array, i + 1);           // T(n)
    }
}

int smallest(int[] array, int j)      // T(n - k)
{
    if (j == array.length - 1)
        return array.length - 1;
    int k = smallest(array, j + 1);
    return array[j] < array[k] ? j : k;
}
So from what I understand, this is what I'm coming up with: T(n) = T(n - 1) + cn + c

The T(n-1) represents the recursive call in sort, and the added cn represents the recursive function smallest, which should decrease as n decreases since it is only called on the part of the array that remains each time. The constant multiplied by n is the time to run the additional code in smallest, and the additional constant is the time to run the additional code in sort. Is this right? Am I completely off? Am I not explaining it correctly?

Also, the next step is to create a recursion tree out of this, but I don't see how my equation fits the form T(n) = aT(n/b) + c, which is the form needed for the tree if I understand this right. Also, I don't see how my recurrence relation would get to n^2 if it is correct. This is my first post too, so I apologize if I did something incorrect here. Thanks for the help!
The easiest way to compute the time complexity is to model the time complexity of each function with a separate recurrence relation.
We can model the time complexity of the function smallest with the recurrence relation S(n) = S(n-1)+O(1), S(1)=O(1). This obviously solves to S(n)=O(n).
We can model the time complexity of the sort function with T(n) = T(n-1) + S(n) + O(1), T(1)=O(1). The S(n) term comes in because we call smallest within the function sort. Because we know that S(n)=O(n) we can write T(n) = T(n-1) + O(n), and writing out the recurrence we get T(n)=O(n)+O(n-1)+...+O(1)=O(n^2).
So the total running time is O(n^2), as expected.
In the selection sort algorithm, the outer loop runs n - 1 times (where n is the length of the array), so n - 1 passes are made, and in each pass the current element is compared with the remaining elements, giving at most n - 1 comparisons per pass:
T(n) = T(n-1) + n - 1
which can be shown to be O(n^2) by solving this relation.
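Writing that relation out term by term (taking T(1) = 0) gives the exact count:
      T(n) = (n-1) + (n-2) + ... + 1 = n(n-1)/2
which is Θ(n^2).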

What is the complexity of these functions with explanation?

I would like to know how to find the complexity of these functions using T(n) and similar techniques, because right now I can only guess.
First Function :
int f(int n)
{
    if (n == 1)
        return 1;
    return 1 + f(f(n - 1));
}
Time & space complexity?
Second function:
Time & space complexity of the function f()?
#include <stdio.h>

void g(int n);   /* forward declaration, since f() calls g() */

void f(int n)
{
    int i;
    if (n < 2) return;
    for (i = 0; i < n / 2; i += 5)
        printf("*");
    g(n / 3);
    g(n / 3);
}

void g(int n)
{
    int i;
    for (i = 0; i < n; i++)
        printf("?");
    f(3 * n / 2);
}
Many Thanks :)
It may surprise you, but the second one is easier to start with.
Second function:
g(n) = n + f(3n/2) and f(n) = n/10 + 2g(n/3). Substituting g(n/3) = n/3 + f(n/2) gives f(n) = 23n/30 + 2f(n/2).
Substitute n = 2^m; therefore f(2^m) = (23/30)·2^m + 2f(2^(m-1)) = 2·(23/30)·2^m + 4f(2^(m-2)), etc...
The first term sums to m·(23/30)·2^m, which may be obvious to you.
The second term (with the f()) grows geometrically; now f(1) = 1 (as there is only 1 operation), so if you expand down to the last term you will find this term is 2^m · f(1) = 2^m. Therefore the total cost of f is f(2^m) = m·(23/30)·2^m + 2^m, or f(n) = n·((23/30)·log(n) + 1), where log is the base-2 logarithm.
Thus f(n) is O(n log(n)).
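As a cross-check (via the Master theorem, which the answer above does not invoke): the combined recurrence has the form
      f(n) = 2·f(n/2) + Θ(n)
with a = 2 and b = 2, so n^(log_2 2) = n matches the driving term, and case 2 of the theorem gives f(n) = Θ(n·log n), agreeing with the expansion above.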
First function:
OK, I'll be honest: I didn't know how to start, but I tested the code in C++ and the result is exactly f(n) = n.
Proof by induction:
Suppose f(n) = n; then f(n + 1) = 1 + f(f(n)) = 1 + f(n) = n + 1. Thus if it is true for n, it is also true for n + 1.
Now f(1) = 1 obviously. Therefore it's true for 2, and for 3, 4, 5 ... and so on.
Therefore by mathematical induction, f(n) = n.
Now for the time complexity bit. Since f(n) returns n, the outer call in the nested f(f(n-1)) is effectively a second call with the same argument: f(n-1); f(n-1);. Thus T(n) = 2T(n-1) + c, and therefore T(n) = O(2^n).
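If you want to see the O(2^n) growth empirically, here is a small Java harness (the class name, the calls counter and the main method are my own additions for illustration, not part of the original question):

class CountCalls {
    static long calls = 0;

    static int f(int n) {
        calls++;
        if (n == 1) return 1;
        return 1 + f(f(n - 1));   // the inner call returns n - 1, so the outer call repeats f(n - 1)
    }

    public static void main(String[] args) {
        for (int n = 1; n <= 20; n++) {
            calls = 0;
            f(n);
            // the count is 2^n - 1, i.e. it roughly doubles each time n goes up by 1
            System.out.println("n = " + n + "  calls = " + calls);
        }
    }
}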

Recursion, inner loop and time complexity

Consider the following function:
int testFunc(int n) {
    if (n < 3) return 0;
    int num = 7;
    for (int j = 1; j <= n; j *= 2) num++;
    for (int k = n; k > 1; k--) num++;
    return testFunc(n / 3) + num;
}
I get that the first loop is O(log n) while the second loop gives O(n), which gives a time complexity of O(n) in total. But due to the recursive calls I thought the time complexity would be O(n log n), but apparently it is only O(n). Can anyone explain why?
The recursive call pretty much gives the following for the complexity(denoting the complexity for input n by T(n)):
T(n) = log(n) + n + T(n/3)
The first observation, as you correctly noted, is that you can ignore the logarithm, as it is dominated by n. Now we are only left with T(n) = n + T(n/3). Try expanding this down to the base case. We have:
T(n) = n + n/3 + n/9+....
You can easily prove that the above sum is always less than 2n. In fact, tighter bounds can be proven, but this one is enough to state that the overall complexity is O(n).
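Concretely, the tail is a geometric series:
      T(n) = n + n/3 + n/9 + ... <= n · 1/(1 - 1/3) = 3n/2 < 2n
so T(n) = O(n).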
For procedures using a recursive algorithm such as the following:
procedure T( n : size of problem ) defined as:
    if n < base_case then exit
    Do work of amount f(n)   // In this case, the O(n) for loop
    T(n/b)
    T(n/b)
    ... a times...           // In this case, b = 3, and a = 1
    T(n/b)
end procedure
Applying the Master theorem to find the time complexity, the f(n) in this case is O(n) (due to the second for loop, like you said). This makes c = 1.
Now, log_b(a) = log_3(1) = 0, making this the 3rd case of the theorem, according to which the time complexity is T(n) = Θ(f(n)) = Θ(n).
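Spelled out, assuming the recurrence T(n) = T(n/3) + Θ(n) from the answer above:
      a = 1, b = 3, f(n) = Θ(n), n^(log_3 1) = n^0 = 1
f(n) = Ω(n^(0 + ε)) for, say, ε = 1, and the regularity condition a·f(n/b) = n/3 ≤ (1/3)·f(n) holds, so case 3 applies and T(n) = Θ(f(n)) = Θ(n).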

Mergesort recurrence formulas - reconciling reality with textbooks

I think this is more programming than math, so I posted here.
All the java algorithms in my question come from here.
We have an iterative and recursive merge sort. Both using the same merge function.
The professor teaching this lecture says that the critical operation for merge sort is the comparison.
So I came up with this formula for merge() based on compares:
      3n + 2
3: worst-case compares in each pass through the loop.
n: the number of times the loop iterates.
2: the two "test" compares.
The recursiveMergesort() has the base case compare plus the recursive calls for a total of:
      T(n/2) + 1 + 3n + 2 = T(n/2) + 3n + 3
The iterativeMergesort() simply has one loop that runs n/2 times with a nested loop that runs n times. That leads me to this formula (but I think it's wrong):
      (n/2) * n + 3n + 2 = (n^2)/2 + 3n + 2
The books say that the recurrence formula for recursive mergesort is
      T(n) = 2T(n/2) + theta(n)
which solves with the master method to
      theta(n log n)
Question 1:
How are the formulas I created simplified to
      T(n/2) + theta(n)
Question 2:
Can I use any of these formulas (the ones I created, the textbook formula, or the time complexity theta(n log n)) to predict the number of compares when running this particular algorithm on an array of size n?
Question 3:
For the bonus: Is my formula for the iterative method correct?
Merge:
private static void merge(int[] a, int[] aux, int lo, int mid, int hi) {
    // DK: add two tests to first verify "mid" and "hi" are in range
    if (mid >= a.length) return;
    if (hi > a.length) hi = a.length;
    int i = lo, j = mid;
    for (int k = lo; k < hi; k++) {
        if (i == mid) aux[k] = a[j++];
        else if (j == hi) aux[k] = a[i++];
        else if (a[j] < a[i]) aux[k] = a[j++];
        else aux[k] = a[i++];
    }
    // copy back
    for (int k = lo; k < hi; k++)
        a[k] = aux[k];
}
Recursive Merge sort:
public static void recursiveMergesort(int[] a, int[] aux, int lo, int hi) {
    // base case
    if (hi - lo <= 1) return;
    // sort each half, recursively
    int mid = lo + (hi - lo) / 2;
    recursiveMergesort(a, aux, lo, mid);
    recursiveMergesort(a, aux, mid, hi);
    // merge back together
    merge(a, aux, lo, mid, hi);
}

public static void recursiveMergesort(int[] a) {
    int n = a.length;
    int[] aux = new int[n];
    recursiveMergesort(a, aux, 0, n);
}
Iterative merge sort:
public static void iterativeMergesort(int[] a) {
    int[] aux = new int[a.length];
    for (int blockSize = 1; blockSize < a.length; blockSize *= 2)
        for (int start = 0; start < a.length; start += 2 * blockSize)
            merge(a, aux, start, start + blockSize, start + 2 * blockSize);
}
Wow, you made it all the way down here. Thanks!
Question 1:
Where are you getting your facts? To obtain theta(n log n) complexity you need
T(n) = a T(n/b) + f(n), where a > 1, b > 1 and f(n) = cn + d. c != 0
Note: There are additional constraints, dictated by the Master theorem
You cannot derive theta(n log n) from a recurrence of the form T(n) = T(n/2) + 3n + 3. You probably forgot that the cost of sorting an array of size n is the cost of the merge plus twice the cost of sorting each half. So rather:
T(n) = 2T(n/2) + 3n + 3
Question 2:
You cannot use theta, Big O or Big Omega to predict the number of compares when running this particular algorithm on an array of size n, because they are asymptotic expressions. You need to solve the relation above, assuming it is correct.
For instance T(n) = 2T(n/2) + 3n + 3 has the solution
T(n) = 3n·log2(n) + ((c + 6)/2)·n - 3, with c a constant determined by the base case
Even so, this only counts the comparisons of the abstract algorithm; the optimizations and constraints of a real program are not taken into account.
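For completeness, one way to see where the 3n·log2(n) term comes from (assuming n is a power of two and the recursion bottoms out at size 1): level k of the recursion tree has 2^k subproblems of size n/2^k, so the cost of level k is
      2^k · (3·(n/2^k) + 3) = 3n + 3·2^k
and summing over the log2(n) levels, plus the leaves, gives T(n) = 3n·log2(n) + Θ(n).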
Question 3:
No: the outer loop of iterativeMergesort runs about log2(n) times (blockSize doubles on each pass), not n/2 times, and each pass merges a total of n elements, so the iterative version also does on the order of n·log(n) compares rather than (n^2)/2.
