Design and Analysis of Algorithms: Recurrence Relation - algorithm

I have a serious doubt about solving this recurrence relation. Can anyone provide a solution?
The relation:
T(n) = T(1) + T(2) + ... + T(n-1) + 1
What is the big-O order?

Taking the first-order difference lets you get rid of the summation:
T(n) - T(n-1) = (Sum(1<=i<n: T(i)) + 1) - (Sum(1<=i<n-1: T(i)) + 1) = T(n-1)
Hence
T(n) = 2·T(n-1)
which unrolls to T(n) = 2^(n-1)·T(1), so the order is O(2^n).
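As a quick sanity check (my own sketch, not part of the original answer, assuming a base case of T(1) = 1), you can compute T(n) directly from the summation definition and compare it against the closed form 2^(n-1):

public class RecurrenceCheck {
    public static void main(String[] args) {
        long[] T = new long[21];
        T[1] = 1; // assumed base case
        for (int n = 2; n <= 20; n++) {
            long sum = 1; // the "+ 1" term
            for (int i = 1; i < n; i++) {
                sum += T[i]; // T(1) + T(2) + ... + T(n-1)
            }
            T[n] = sum;
        }
        for (int n = 1; n <= 20; n++) {
            System.out.println("T(" + n + ") = " + T[n]
                    + ", 2^(n-1) = " + (1L << (n - 1)));
        }
    }
}

Both columns agree for every n, confirming T(n) = 2^(n-1) and hence O(2^n).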

A recurrence relation describes a sequence of numbers. Early terms are specified explicitly, and later terms are expressed as a function of their predecessors. As a trivial example, the running time of this function satisfies a recurrence describing the sequence 1, 2, 3, etc.:
void Sample(int n)
{
    if (n > 0) {
        Sample(n - 1);
        System.out.println("here");
    } else {
        return;
    }
}

Sample(3);
Here, the first term is defined to be 1 and each subsequent term is one more than its predecessor. In order to analyze a recurrence relation, we need to know the execution time of each line of code. In the above example:
void Sample(int n)
{
    if (n > 0) {
        Sample(n - 1);              // T(n-1)
        System.out.println("here"); // 1
    } else {
        return;                     // 1
    }
}
We define T(n) as:
T(n) = T(n-1) + 1, with T(0) = 1
To solve T(n) = T(n-1) + 1: if we know what T(n-1) is, we can substitute it in and get the answer. Substituting n-1 for n, we have:
T(n-1) = T(n-1-1) + 1 = T(n-2) + 1
// substituting back
T(n) = [T(n-2) + 1] + 1 = T(n-2) + 2
// and again
T(n) = [T(n-3) + 1] + 2 = T(n-3) + 3
.
.
.
// after repeating k times
T(n) = T(n-k) + k
As you can see, the added constant grows by 1 at each step. After k steps, T(n) = T(n-k) + k. Now we need the smallest argument value, the point where the function stops (there must always be a stopping point). In this problem, zero is the bottom of the recursion stack. Since we assumed it takes k steps to reach zero, the solution is:
// if n-k is the final step
n - k = 0 => k = n
// replace k with n
T(n) = T(n-n) + n => T(n) = T(0) + n
// we know
T(0) = 1
T(n) = n + 1 => O(n)
The big-O is n: in the worst case this recursive algorithm makes on the order of n calls.
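To see this concretely, here is a small instrumented version (my own sketch, not part of the answer) that counts how many times the function is entered; the count comes out to n + 1, matching T(n) = n + 1:

public class SampleCount {
    static int calls = 0;

    static void sample(int n) {
        calls++; // one unit of work per invocation
        if (n > 0) {
            sample(n - 1);
            System.out.println("here");
        }
    }

    public static void main(String[] args) {
        sample(3);
        System.out.println("calls = " + calls); // prints 4, i.e. n + 1
    }
}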

Related

Finding the linear recurrence (of a recursive algorithm)

I need help with finding the complexity of a recursive algorithm. I know that in order to solve this I have to find the linear recurrence, then apply the Master Theorem. To my knowledge, finding the recurrence is straightforward when only one parameter is involved; in this case there are two parameters (i, j). Consider the function below, called as stuff(A, 1, n):
integer stuff(integer[] A, integer i, integer j) {
    if i ≥ j then return i − j
    integer h ← 0
    for integer k ← 1 to floor((j − i + 1)/3) do {
        h ← h + 1
        return stuff(A, i, i + h) + stuff(A, j − h, j) − stuff(A, i + h + 1, j − h − 1)
    }
}
Assuming various things, I guessed the relation to be:
T(1) = k
T(n) = T(n/3) + T(n/3) + T(n/3) + n/3 = 3T(n/3) + n/3
I assumed that because it looks like the function is called on 3 parts, each of which is one third of n, with h = Θ(n/3):
First call: (i + h) − i = h ≈ n/3
Second call: j − (j − h) = h ≈ n/3
Third call: (j − h − 1) − (i + h + 1) = j − i − 2h − 2 ≈ n/3 (which I only assumed)
Even though I can try to guess the relation and make sense of it, I don't know how to formally prove it.
If my guess is correct, how do you get to that conclusion? If not, what am I missing?
Sorry for the long question; thanks in advance.
Since you return inside the for loop, the loop body only ever executes once: on the first iteration the function evaluates the return statement and finishes, so the loop contributes just constant work per call rather than the Θ(n) term you assumed.
Also, the proof of the recurrence relation comes directly from your analysis: if you apply a basic counting argument from combinatorics to the sizes of the three subranges, the final result is proved.
Moreover, if you correct the pseudocode and put the return at the end of the function, the complexity is T(n) = 3T(n/3) + Θ(n), as you analyzed. Then, by the Master Theorem (case 2), T(n) = Θ(n log n).
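For reference, here is one way the corrected version (return moved to the end of the function) might look in Java; this is my own sketch, translated directly from the pseudocode above:

static int stuff(int[] A, int i, int j) {
    if (i >= j) return i - j;
    int h = 0;
    for (int k = 1; k <= (j - i + 1) / 3; k++) {
        h = h + 1; // Theta(n) loop; h ends up ~ n/3
    }
    return stuff(A, i, i + h)              // subrange of size ~ n/3
         + stuff(A, j - h, j)              // subrange of size ~ n/3
         - stuff(A, i + h + 1, j - h - 1); // subrange of size ~ n/3
}

With the return outside the loop, each call does Θ(n) loop work plus three recursive calls on ranges of roughly n/3 elements, which is exactly T(n) = 3T(n/3) + Θ(n).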

How to calculate time complexity of this recursion?

I'm interested in calculating the following code's time and space complexity but seem to be struggling a lot. I know that the deepest the recursion can reach is n, so the space should be O(n). However, I have no idea how to calculate the time complexity: I don't know how to write the formula for recursions of the form f(f(n-1)).
If it were something like return f3(n-1) + f3(n-1), then I know it would be O(2^n), since T(n) = 2T(n-1). Correct?
Here's the code:
int f3(int n)
{
    if (n <= 2)
        return 1;
    f3(1 + f3(n - 2)); // return value is discarded
    return n - 1;
}
Thank you for your help!
Notice that f3(n) = n − 1 for every n ≥ 2, so in the line f3(1 + f3(n-2)), first f3(n-2) is computed, which returns n − 3, and then f3(1 + n − 3) = f3(n-2) is computed again!
So f3(n) computes f3(n-2) twice, along with some O(1) overhead.
This gives the recurrence T(n) = 2T(n-2) + c for some constant c, where T(n) is the running time of f3(n).
Solving the recurrence, we get T(n) = O(2^(n/2)).
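If you want to convince yourself empirically (my own sketch, not part of the answer), count how many times f3 is entered; the count roughly doubles every time n grows by 2, tracking 2^(n/2):

public class F3Count {
    static long calls = 0;

    static int f3(int n) {
        calls++;
        if (n <= 2) return 1;
        f3(1 + f3(n - 2)); // f3(n-2) ends up being evaluated twice
        return n - 1;
    }

    public static void main(String[] args) {
        for (int n = 4; n <= 24; n += 4) {
            calls = 0;
            f3(n);
            System.out.println("n = " + n + ", calls = " + calls
                    + ", 2^(n/2) = " + (1L << (n / 2)));
        }
    }
}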

Selection Sort Recurrence Relation

Up front, this is a homework question, but I am having a difficult time understanding recurrence relations. I've scoured the internet for examples and they are very vague to me. I understand that there is no single fixed recipe for the recurrence relation of a recursive algorithm, but I am lost on how to derive one. Here's the algorithm I have to work with:
void selectionSort(int[] array) {
    sort(array, 0);
}

void sort(int[] array, int i) {
    if (i < array.length - 1)
    {
        int j = smallest(array, i); // T(n)
        int temp = array[i];
        array[i] = array[j];
        array[j] = temp;
        sort(array, i + 1);         // T(n)
    }
}

int smallest(int[] array, int j)    // T(n - k)
{
    if (j == array.length - 1)
        return array.length - 1;
    int k = smallest(array, j + 1);
    return array[j] < array[k] ? j : k;
}
So from what I understand, this is what I'm coming up with: T(n) = T(n-1) + cn + c

The T(n-1) represents the recursive call in sort, and the cn represents the recursive function smallest, which should shrink as n decreases, since it is called only on the part of the array that remains each time. The constant multiplied by n is the time to run the additional code in smallest, and the extra constant is the time to run the additional code in sort. Is this right? Am I completely off? Am I not explaining it correctly?

Also, the next step is to create a recursion tree out of this, but I don't see my equation in the form T(n) = aT(n/b) + c, which is the form needed for the tree if I understand correctly. And I don't see how my recurrence relation would get to n^2 if it is correct. This is my first post, so I apologize if I did something incorrect here. Thanks for the help!
The easiest way to compute the time complexity is to model the time complexity of each function with a separate recurrence relation.
We can model the time complexity of the function smallest with the recurrence relation S(n) = S(n-1)+O(1), S(1)=O(1). This obviously solves to S(n)=O(n).
We can model the time complexity of the sort function with T(n) = T(n-1) + S(n) + O(1), T(1)=O(1). The S(n) term comes in because we call smallest within the function sort. Because we know that S(n)=O(n) we can write T(n) = T(n-1) + O(n), and writing out the recurrence we get T(n)=O(n)+O(n-1)+...+O(1)=O(n^2).
So the total running time is O(n^2), as expected.
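To check this against the code empirically (a quick sketch of my own, reusing the question's methods with a comparison counter), the total number of element comparisons should come out to n(n-1)/2, which is Θ(n²):

public class SelectionSortCount {
    static long comparisons = 0;

    static int smallest(int[] array, int j) {
        if (j == array.length - 1) return array.length - 1;
        int k = smallest(array, j + 1);
        comparisons++; // one comparison per recursive step
        return array[j] < array[k] ? j : k;
    }

    static void sort(int[] array, int i) {
        if (i < array.length - 1) {
            int j = smallest(array, i);
            int temp = array[i];
            array[i] = array[j];
            array[j] = temp;
            sort(array, i + 1);
        }
    }

    public static void main(String[] args) {
        int n = 100;
        int[] a = new java.util.Random(42).ints(n, 0, 1000).toArray();
        sort(a, 0);
        System.out.println("comparisons = " + comparisons
                + ", n(n-1)/2 = " + (n * (n - 1) / 2)); // both print 4950
    }
}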
In the selection sort algorithm, the outer loop runs n − 1 times (where n is the length of the array), so n − 1 passes are made, and in each pass the current element is compared with the remaining elements, which costs up to n − 1 comparisons:
T(n) = T(n-1) + (n - 1)
which can be shown to be O(n^2) by solving the recurrence.

Recurrence Relation based off Pseudo Code (Time complexity)

Consider the element uniqueness problem, in which we are given a range, i, i + 1, . . . , j, of indices for an array, A, and we want to determine if the elements of this range, A[i], A[i+1], . . . , A[j], are all unique, that is, there is no repeated element in this group of array entries. Consider the following (inefficient) recursive algorithm.
public static boolean isUnique(int[] A, int start, int end) {
    if (start >= end) return true;     // the range is too small for repeats
    // check recursively if the first part of array A is unique
    if (!isUnique(A, start, end - 1))  // there is a duplicate in A[start],...,A[end-1]
        return false;
    // check recursively if the second part of array A is unique
    if (!isUnique(A, start + 1, end))  // there is a duplicate in A[start+1],...,A[end]
        return false;
    return A[start] != A[end];         // check if first and last are different
}
Let n denote the number of entries under consideration, that is, let n = end − start + 1. What is an upper bound on the asymptotic running time of this code fragment for large n? Provide a brief and precise explanation.
(You lose marks if you do not explain.) To begin your explanation, you may state how many recursive calls the algorithm makes before it terminates and analyze the number of operations per invocation. Alternatively, you may provide the recurrence characterizing the running time of this algorithm and then solve it using the iterative substitution technique.
This question is from a sample practice exam for an algorithms class. This is my current answer; can someone please help verify whether I'm on the right track?
Answer:
The recurrence equation:
T(n) = 1 if n = 1
T(n) = 2T(n-1) if n > 1
After solving using iterative substitution I got 2^k · T(n-k), which I solved to O(2^(n-1)) and simplified to O(2^n).
Your recurrence relation should be T(n) = 2T(n-1) + O(1) with T(1) = O(1). However, this doesn't change the asymptotics; the solution is still T(n) = O(2^n). To see this, you can expand the recurrence to get T(n) = O(1) + 2(O(1) + 2(O(1) + ...)), so T(n) = O(1) · (1 + 2 + 4 + ... + 2^n) = O(1) · (2^(n+1) − 1) = O(2^n).
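A quick empirical confirmation (my own sketch): count the invocations of isUnique on an all-unique array, which is the worst case since neither recursive call returns early; the count is exactly 2^n − 1:

public class UniqueCount {
    static long calls = 0;

    static boolean isUnique(int[] A, int start, int end) {
        calls++;
        if (start >= end) return true;
        if (!isUnique(A, start, end - 1)) return false;
        if (!isUnique(A, start + 1, end)) return false;
        return A[start] != A[end];
    }

    public static void main(String[] args) {
        for (int n = 2; n <= 20; n += 2) {
            int[] A = new int[n];
            for (int i = 0; i < n; i++) A[i] = i; // all distinct: worst case
            calls = 0;
            isUnique(A, 0, n - 1);
            System.out.println("n = " + n + ", calls = " + calls
                    + ", 2^n - 1 = " + ((1L << n) - 1));
        }
    }
}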

Time complexity of recursive algorithm

I have a grid of x-sided fields. Every field holds a link to its x surrounding fields (x is a constant).
I have an algorithm that runs over this grid (and which can probably be optimized):
[Java-like pseudocode]
public ArrayList getAllFields(ArrayList list) {
    list.add(this);
    for each side {
        if (!list.contains(neighbour) && constantTimeConditionsAreMet()) {
            neighbour.getAllFields(list); // recursive call
        }
    }
    return list;
}
I'm having trouble finding the time complexity.
ArrayList#contains(Object) runs in linear time.
How do I find the time complexity? My approach is this:
T(n) = O(1) + T(n-1) + c·(nbOfFieldsInArray − n)   [the time to check the ever-filling ArrayList]
T(n) = O(1) + T(n-1) + c·nbOfFieldsInArray − c·n
Does this give me T(n) = T(n-1) + O(n)?
Since you're using a linear search (ArrayList.contains) for every potential addition to the list, the complexity will be Ω(n²).
Your recurrence seems correct: T(n) = T(n-1) + Θ(n).
If you draw the recursion tree you'll notice you have a single branch with the per-level costs Θ(n), Θ(n-1), ..., Θ(2), Θ(1). If you add up all the levels you get the arithmetic series
S1 = 1 + 2 + 3 + ... + n
If you define
S2 = n + ... + 3 + 2 + 1
and then calculate S1 + S2, you get
S1 + S2 = 2·S1 = (n+1) + (n+1) + ... + (n+1) = n(n+1)
therefore
2·S1 = n(n+1) => S1 = n(n+1)/2
which means T(n) = Θ(n(n+1)/2) = Θ(n²).
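As a side note on the optimization hinted at in the question: the Θ(n²) comes entirely from the linear ArrayList.contains. A sketch of my own (the field and method names here are assumed for illustration) that tracks visited fields in a HashSet makes each membership test expected O(1), so the whole traversal becomes O(n):

import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

class Field {
    List<Field> neighbours = new ArrayList<>(); // the x linked fields

    // Same traversal, but with a HashSet for the visited check:
    // Set.contains runs in expected O(1) instead of ArrayList's O(n).
    List<Field> getAllFields(List<Field> list, Set<Field> visited) {
        list.add(this);
        visited.add(this);
        for (Field neighbour : neighbours) {
            if (!visited.contains(neighbour) && constantTimeConditionsAreMet()) {
                neighbour.getAllFields(list, visited);
            }
        }
        return list;
    }

    boolean constantTimeConditionsAreMet() { return true; } // placeholder
}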
