Analyzing recursion in algorithms - algorithm

This is an old homework problem from my algorithms class. I have the solution to the problem, but even after repeated attempts, I fail to understand how to think in the right direction to arrive at the solution.
function h(N) {
    if (N == 1) return 3;
    else {
        sum = 1;
        i = 0;
        while (i < h(N-1)) {
            sum = sum + i;
            i = i + 1;
        }
        return sum;
    }
}
As I understand it, since h(N-1) is re-evaluated in the while-loop condition, the loop should run as many times as the value returned by h(N-1), and the call to h(N-1) will also be made that many times. So I would expect something like this:
T(N) = T(N-1)*H(N-1) + C*H(N-1) + D
where
1. T(N) is the running time,
2. T(N-1)*H(N-1), because a single recursive call to h(N-1) takes T(N-1), and since h(N-1) is recomputed every time the comparison is made, it is called H(N-1) times (where H(N-1) is the value returned by the call),
3. and C*H(N-1) is the running time of the statements inside the while loop (since the loop runs H(N-1) times), with D the constant amount of work done outside the loop.
I did not get a satisfactory answer from my professor, and I would appreciate it if someone could help me understand this.
Thank you!

Try understanding this in two steps. First, consider this simpler function, where we replace the while loop with an if.
function g(N) {
    if (N == 1) return 3;
    else {
        sum = 1;
        i = 0;
        if (i < g(N-1)) {
            sum = sum + i;
            i = i + 1;
        }
        return sum;
    }
}
Here, we get the recurrence:
G(N) = G(N-1) + O(1)
So far, so good? Here, the work to compute g(N) involves solving the smaller problem g(N-1) plus a constant amount of work.
Now, let's go back to the original function h(N). What has changed? Now, the work to compute h(N) involves solving the subproblem h(N-1) once per loop iteration, and the loop runs H(N-1) times, where H(N-1) is the value returned by h(N-1). In each of those iterations we also do a constant amount of work, and there is another constant amount of work done only once in h(N), outside the while loop. Writing T(N) for the running time, we essentially get:
T(N) = H(N - 1) * (T(N - 1) + O(1)) + O(1)
which is exactly your recurrence T(N) = T(N-1)*H(N-1) + C*H(N-1) + D. Since T(N-1) is at least a constant, the H(N-1)*O(1) term is absorbed into H(N-1)*T(N-1), so up to constant factors:
T(N) = H(N - 1) * T(N - 1) + O(1)

Assume that in executing h(N), the value of h(N-1) is recomputed at each iteration of the loop (which is probably the case for most languages and most compilers)
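To get a feel for how fast this blows up, here is a small instrumented sketch (the class name and the calls counter are my additions; the logic mirrors the pseudocode, with h(N-1) recomputed on every test of the loop condition):

public class HCount {
    static long calls = 0; // number of invocations of h

    static long h(long n) {
        calls++;
        if (n == 1) return 3;
        long sum = 1;
        long i = 0;
        // h(n-1) is re-evaluated on every test of the loop condition
        while (i < h(n - 1)) {
            sum = sum + i;
            i = i + 1;
        }
        return sum;
    }

    public static void main(String[] args) {
        for (int n = 1; n <= 6; n++) {
            calls = 0;
            long value = h(n);
            System.out.println("n=" + n + "  h(n)=" + value + "  calls=" + calls);
        }
    }
}

The call counts (1, 5, 26, 209, 4808, ...) grow roughly like the product T(N-1)*H(N-1) in your recurrence, which is exactly why that term dominates.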

Related

How to calculate time complexity of this recursion?

I'm interested in calculating the following code's time and space complexity but seem to be struggling a lot. I know that the deepest the recursion can reach is n, so the space should be O(n). However, I have no idea how to calculate the time complexity; I don't know how to write the recurrence when the recursion has a form like f(f(n-1)).
If it were something like return f3(n-1) + f3(n-1), then I know it would be O(2^n) since T(n) = 2T(n-1), correct?
Here's the code:
int f3(int n)
{
    if (n <= 2)
        return 1;
    f3(1 + f3(n-2));
    return n - 1;
}
Thank you for your help!
Notice that f3(n) = n - 1 for all n ≥ 2. So in the line f3(1 + f3(n-2)), first f3(n-2) is computed, which returns n - 3, and then f3(1 + n - 3) = f3(n-2) is computed again!
So, f3(n) computes f3(n-2) twice, along with some O(1) overhead.
This gives the recurrence T(n) = 2T(n-2) + c for some constant c, where T(n) is the running time of f3(n).
Solving the recurrence, we get T(n) = O(2^(n/2)).
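If you want to sanity-check the O(2^(n/2)) bound, a quick instrumented version (the counter and class name are mine; the function body is unchanged) shows the number of calls roughly doubling every time n grows by 2:

public class F3Count {
    static long calls = 0; // number of invocations of f3

    static int f3(int n) {
        calls++;
        if (n <= 2)
            return 1;
        f3(1 + f3(n - 2)); // computes f3(n-2), then f3(n-2) again
        return n - 1;
    }

    public static void main(String[] args) {
        for (int n = 4; n <= 30; n += 2) {
            calls = 0;
            f3(n);
            System.out.println("n=" + n + "  calls=" + calls);
        }
    }
}

The counts come out as 3, 7, 15, 31, ... (i.e. 2^(n/2) - 1), matching T(n) = 2T(n-2) + c.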

Complexity Algorithm Analysis with if

I have the following code. What time complexity does it have?
I have tried to write a recurrence relation for it, but I can't understand when the algorithm will add 1 to n and when it will divide n by 4.
void T(int n) {
    for (i = 0; i < n; i++);
    if (n == 1 || n == 0)
        return;
    else if (n % 2 == 1)
        T(n + 1);
    else if (n % 2 == 0)
        T(n / 4);
}
You can view it like this: you always divide by four; the only difference is that when n is odd, you add 1 to n before the division. So you should count how many times 1 is added. If there are no increments, you have log_4(n) recursive calls. Now let's assume that you always have to add 1 before dividing. Then you can rewrite it like this:
void T(int n) {
    for (i = 0; i < n; i++);
    if (n == 1 || n == 0)
        return;
    else if (n % 2 == 0)
        T(n / 4 + 1);
}
But n/4 + 1 <= n/2 (for n >= 4), and a recursion that goes from n to n/2 makes O(log_2(n)) calls instead of O(log_4(n)); the base of the logarithm doesn't affect the running time in big-O notation, since it is just a constant factor. So the number of recursive calls is O(log(n)).
EDIT:
As ALB pointed out in a comment, there is a loop of length n in every call. So, in accordance with the master theorem, the running time is Theta(n). You can also see it as the sum n * (1 + 1/2 + 1/4 + 1/8 + ...) = 2 * n, bounding each recursive argument by n/2.
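To see the Theta(n) total concretely, here is a small sketch (the iteration counter is mine; the control flow is the same as the original) that adds up how many times the for loop runs across all recursive calls:

public class TCount {
    static long iterations = 0; // total for-loop iterations over all recursive calls

    static void T(int n) {
        for (int i = 0; i < n; i++)
            iterations++; // stands in for the original empty loop body
        if (n == 1 || n == 0)
            return;
        else if (n % 2 == 1)
            T(n + 1);
        else
            T(n / 4);
    }

    public static void main(String[] args) {
        for (int n = 1000; n <= 1000000; n *= 10) {
            iterations = 0;
            T(n);
            System.out.println("n=" + n + "  iterations=" + iterations);
        }
    }
}

The totals stay within a small constant factor of n (roughly n * (1 + 1/4 + 1/16 + ...), plus the occasional +1 steps), which is why the loop dominates and the whole thing is Theta(n).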
Interesting question. Be aware that even though your for loop does nothing, since this is not an optimized solution (see Dukeling's comment), it will still count toward your time complexity, as if it took time to iterate through it.
First part
The first section is definitely O(n).
Second part
Let's suppose, for the sake of simplicity, that half the time n will be odd and the other half of the time it will be even. Hence, the recursion continues with (n+1) half the time and with (n/4) the other half.
Conclusion
Each time T(n) is called, the function implicitly loops n times. Hence, half the time we get a complexity of n * (n+1) = n^2 + n, and the other half of the time n * (n/4) = (1/4)n^2.
For big-O notation, we care about an upper bound rather than the precise behavior. Hence, your algorithm would be bounded by O(n^2).

Is my understanding of big-O for these Java functions correct?

My approach (which might be incorrect) is formulaic: if there is a loop, then (n+1); if there is a nested loop, (n^2); if a statement, O(1); if division, log(n).
Here are some examples and my reasoning in solving them. I'm not at all sure whether this approach is problematic or whether any of them are correct. I need help with this.
Example1:
i = n; // I think O(1) because it's a statement
while (i > 1) // I think O(n) because it's a loop
    i = i/4; // O(n) because it's in a loop and log_4(n) b/c division
// I think overall if we combine the O(n) from earlier and the log_4(n)
// Therefore, I think overall O(nlog(n))
Example2:
for (i = 1; i < n; i = i + i) // I think this is O(n+1) thus, O(n)
    System.out.println("Hello World"); // O(n) because it's in a loop
// Therefore, overall I think O(n)
Example3:
for (i = 0; i < n; i = i + 1) // I think O(n+1), thus O(n)
    for (j = 1; j < n; j++) // I think O(n^2) because in a nested loop
        System.out.println("Hello Universe!"); // O(n^2) because in a nested loop
// Therefore, overall I think O(n^2)
Example4:
for (i = 1; i < (n * n + 3 * n + 17) / 4; i = i + 1) // O((20+n^3)/4) thus, O(n^3)
    System.out.println("Hello Again!"); // O(n) because it's in a loop
// Therefore, overall I think O(n^3) because largest Big-O in the code
Thank you
Example 1: Your result is wrong, because the loop runs log_4(n) times and each division takes O(1) (division by 4 is just a bitwise shift). Thus the overall time is only O(log(n)).
Example 2: It's wrong too. In each iteration you double the loop variable, so the loop runs O(log(n)) times. The print command takes O(1), and the total time is O(log(n)).
Example 3: Your answer is correct, because you have two nested O(n) loops. Note that these loops behave differently from the two previous examples.
Example 4: I think you made a writing mistake. ((n * n + 3 * n + 17) / 4) is not O((20+n^3)/4); it is O(n^2). Thus, following the previous explanations, the overall time is O(n^2).
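If it helps, you can convince yourself of the log(n) answers for Examples 1 and 2 by just counting iterations (a throwaway sketch; the class and variable names are mine):

public class LoopCounts {
    public static void main(String[] args) {
        for (int n = 10; n <= 1000000; n *= 10) {
            // Example 1: i starts at n and is divided by 4 each time -> about log_4(n) iterations
            int count1 = 0;
            for (int i = n; i > 1; i = i / 4)
                count1++;

            // Example 2: i starts at 1 and doubles each time -> about log_2(n) iterations
            int count2 = 0;
            for (int i = 1; i < n; i = i + i)
                count2++;

            System.out.println("n=" + n + "  example1=" + count1 + "  example2=" + count2);
        }
    }
}

Both counts grow by a constant amount every time n grows by a factor of 10, which is the signature of logarithmic rather than linear growth.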

Selection Sort Recurrence Relation

Up front, this is a homework question, but I am having a difficult time understanding recurrence relations. I've scoured the internet for examples, and they are very vague to me. I understand that there is no single set way of handling recurrence relations for recursive algorithms, but I am lost as to how to approach them. Here's the algorithm I have to work with:
void selectionSort(int array[]) {
    sort(array, 0);
}

void sort(int[] array, int i) {
    if (i < array.length - 1)
    {
        int j = smallest(array, i);   // T(n)
        int temp = array[i];
        array[i] = array[j];
        array[j] = temp;
        sort(array, i + 1);           // T(n)
    }
}

int smallest(int[] array, int j)      // T(n - k)
{
    if (j == array.length - 1)
        return array.length - 1;
    int k = smallest(array, j + 1);
    return array[j] < array[k] ? j : k;
}
So from what I understand, this is what I'm coming up with: T(n) = T(n - 1) + cn + c
The T(n-1) represents the recursive call to sort, and the cn term represents the recursive function smallest, which should shrink as n decreases since it is called only on the part of the array that remains each time. The constant multiplied by n is the time to run the additional code in smallest, and the extra constant is the time to run the additional code in sort. Is this right? Am I completely off? Am I not explaining it correctly? Also, the next step is to create a recursion tree out of this, but I don't see how this equation fits the form T(n) = aT(n/b) + c, which is the form needed for the tree if I understand it right. I also don't see how my recurrence relation would lead to n^2 if it is correct. This is my first post, so I apologize if I did something incorrectly here. Thanks for the help!
The easiest way to compute the time complexity is to model the time complexity of each function with a separate recurrence relation.
We can model the time complexity of the function smallest with the recurrence relation S(n) = S(n-1)+O(1), S(1)=O(1). This obviously solves to S(n)=O(n).
We can model the time complexity of the sort function with T(n) = T(n-1) + S(n) + O(1), T(1)=O(1). The S(n) term comes in because we call smallest within the function sort. Because we know that S(n)=O(n) we can write T(n) = T(n-1) + O(n), and writing out the recurrence we get T(n)=O(n)+O(n-1)+...+O(1)=O(n^2).
So the total running time is O(n^2), as expected.
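One way to check the O(n^2) result against the code itself is to count the comparisons made in smallest (a sketch; the comparisons counter and the random test arrays are my additions, the sorting logic is unchanged):

public class SelectionSortCount {
    static long comparisons = 0; // comparisons performed inside smallest

    static void selectionSort(int[] array) {
        sort(array, 0);
    }

    static void sort(int[] array, int i) {
        if (i < array.length - 1) {
            int j = smallest(array, i);
            int temp = array[i];
            array[i] = array[j];
            array[j] = temp;
            sort(array, i + 1);
        }
    }

    static int smallest(int[] array, int j) {
        if (j == array.length - 1)
            return array.length - 1;
        int k = smallest(array, j + 1);
        comparisons++; // one comparison per recursive step of smallest
        return array[j] < array[k] ? j : k;
    }

    public static void main(String[] args) {
        for (int n = 10; n <= 1000; n *= 10) {
            int[] a = new java.util.Random(1).ints(n).toArray();
            comparisons = 0;
            selectionSort(a);
            System.out.println("n=" + n + "  comparisons=" + comparisons
                    + "  n(n-1)/2=" + (long) n * (n - 1) / 2);
        }
    }
}

The measured count matches n(n-1)/2 exactly, which is the Theta(n^2) the recurrence predicts.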
In the selection sort algorithm, the outer loop runs n - 1 times (where n is the length of the array), so n - 1 passes are made, and in each pass the current element is compared with the remaining elements, giving at most n - 1 comparisons:
T(n) = T(n-1) + (n - 1)
which can be shown to be O(n^2) by solving this recurrence.

Building a recurrence relation for this code?

I need to build a recurrence relation for the following algorithm (T(n) stands for the number of elementary operations) and find its time complexity:
Alg(n)
{
    if (n < 3) return;
    for i = 1 to n
    {
        for j = i to 2i
        {
            for k = j-i to j-i+100
                write(i, j, k);
        }
    }
    for i = 1 to 7
        Alg(n-2);
}
I came to this recurrence relation (I don't know if it's right):
T(n) = 1                  if n < 3
T(n) = 7T(n-2) + 100n^2   otherwise.
I don't know how to get the time complexity, though.
Is my recurrence correct? What's the time complexity of this code?
Let's take a look at the code to see what the recurrence should be.
First, let's look at the loop:
for i=1 to n
{
    for j=i to 2i
    {
        for k=j-i to j-i+100
            write (i, j, k);
    }
}
How much work does this do? Well, let's begin by simplifying it. Rather than having j count up from i to 2i, let's define a new variable j' that counts up from 0 to i. This means that j' = j - i, and so we get this:
for i=1 to n
{
    for j' = 0 to i
    {
        for k=j' to j'+100
            write (i, j' + i, k);
    }
}
Ah, that's much better! Now, let's also rewrite k as k', where k' ranges from 1 to 100:
for i=1 to n
{
    for j' = 0 to i
    {
        for k' = 1 to 100
            write (i, j' + i, k' + j');
    }
}
From this, it's easier to see that this loop has time complexity Θ(n^2), since the innermost loop does O(1) work, and the middle loop will run 1 + 2 + 3 + 4 + ... + n = Θ(n^2) times. Notice that it's not exactly 100n^2 because the summation isn't exactly n^2, but it is close.
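If you want to pin the count down exactly, here is a quick throwaway count of the write calls made by a single (non-recursive) pass of the loops, assuming the for ... to bounds are inclusive (the class and variable names are just mine):

public class LoopWrites {
    public static void main(String[] args) {
        for (int n = 10; n <= 1000; n *= 10) {
            long writes = 0;
            for (int i = 1; i <= n; i++)
                for (int j = i; j <= 2 * i; j++)
                    for (int k = j - i; k <= j - i + 100; k++)
                        writes++;
            // closed form: 101 * (n*(n+1)/2 + n)
            System.out.println("n=" + n + "  writes=" + writes
                    + "  formula=" + 101L * ((long) n * (n + 1) / 2 + n));
        }
    }
}

The 101 factor comes from the inclusive bounds on k, and 101 * (n*(n+1)/2 + n) is Θ(n^2), as claimed.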
Now, let's look at the recursive part:
for i=1 to 7
Alg(n-2);
For starters, this is just plain silly! There's no reason you'd ever want to do something like this. But, that said, we can say that this is 7 calls to the algorithm on an input of size n - 2.
Accordingly, we get this recurrence relation:
T(n) = 7T(n - 2) + Θ(n^2) [if n ≥ 3]
T(n) = Θ(1) [otherwise]
Now that we have the recurrence, we can start to work out the time complexity. That ends up being a little bit tricky. If you think about how much work we'll end up doing, we'll get that
There's 1 call of size n.
There are 7 calls of size n - 2.
There are 49 calls of size n - 4.
There are 343 calls of size n - 6.
...
There are 7^k calls of size n - 2k.
From this, we immediately get a lower bound of Ω(7^(n/2)), since that's the number of calls that will get made. Each call does O(n^2) work, so we get an upper bound of O(n^2 * 7^(n/2)). The true value lies somewhere in there, though I honestly don't know how to figure out exactly what it is. Sorry about that!
Hope this helps!
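As a rough check on the Ω(7^(n/2)) lower bound, you can also count recursive invocations directly (a sketch with the triple loop stubbed out, since only the call structure matters for the count; the counter is my addition):

public class AlgCalls {
    static long calls = 0; // number of invocations of alg

    static void alg(int n) {
        calls++;
        if (n < 3) return;
        // the triple loop is omitted here; it does not affect the number of calls
        for (int i = 1; i <= 7; i++)
            alg(n - 2);
    }

    public static void main(String[] args) {
        for (int n = 3; n <= 15; n += 2) {
            calls = 0;
            alg(n);
            System.out.println("n=" + n + "  calls=" + calls);
        }
    }
}

Each step of 2 in n multiplies the count by roughly 7, so the number of calls alone already grows like 7^(n/2).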
The prevailing order of growth can also be inferred intuitively from the source code, by looking at the number of recursive calls. An algorithm making 2 recursive calls, each on an input smaller by a constant, has complexity on the order of 2^Θ(n); with 3 such calls, 3^Θ(n); and so on. Here, the 7 recursive calls on inputs of size n - 2 give on the order of 7^(n/2) calls.
