Time complexity of recursive algorithm

I have a grid with x-sided fields in it. Every field contains a link to its x surrounding fields. [x is constant]
I have an algorithm which is implemented on these fields (it can probably be optimized):
[java-like pseudocode]
public ArrayList<Field> getAllFields(ArrayList<Field> list) {
    list.add(this);
    for (Field neighbour : neighbours) {  // one neighbour per side
        if (!list.contains(neighbour) && constantTimeConditionsAreMet()) {
            neighbour.getAllFields(list);  // recursive call
        }
    }
    return list;
}
I'm having trouble finding the time complexity. Note that ArrayList#contains(Object) runs in linear time. How do I find the time complexity? My approach is this:
T(n) = O(1) + T(n-1) + c*(nbOfFieldsInArray - n)   [the time to scan the ever-filling ArrayList]
T(n) = O(1) + T(n-1) + c*nbOfFieldsInArray - c*n
Does this give me T(n) = T(n-1) + O(n)?

The comment you added to your code is not helpful. What does getContinent do?
In any case, since you're using a linear search (ArrayList.contains) for every potential addition to the list, it looks like the complexity will be Omega(n^2).
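If the quadratic factor matters, a common fix is to track visited fields in a HashSet, whose contains is expected O(1). A minimal sketch, following the question's pseudocode (Field, neighbours and constantTimeConditionsAreMet are the question's assumed names, not a definitive implementation):

// Inside the hypothetical Field class; requires java.util.ArrayList and java.util.HashSet.
public ArrayList<Field> getAllFields(ArrayList<Field> list, HashSet<Field> visited) {
    list.add(this);
    visited.add(this);  // expected O(1), vs. O(n) for ArrayList.contains
    for (Field neighbour : neighbours) {
        if (!visited.contains(neighbour) && constantTimeConditionsAreMet()) {
            neighbour.getAllFields(list, visited);  // recursive call
        }
    }
    return list;
}

With this change each field is visited once and each visited-check is expected O(1), so the whole traversal becomes O(n).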

Your recurrence seems correct: T(n) = T(n-1) + Theta(n).
If you draw the recursion tree you'll notice you have a single branch with the per-level costs Theta(n), Theta(n-1), ..., Theta(2), Theta(1); if you add up all the levels you get the arithmetic series 1+2+3+...+n
S1 = 1+2+3+...+n
If you define
S2 = n+...+3+2+1
and then calculate S1+S2 you get
S1 + S2 = 2*S1 = (n+1) + (n+1) + ... + (n+1) = n(n+1)
therefore
2*S1 = n(n+1) => S1 = n(n+1)/2
which means T(n) = Theta(n(n+1)/2) = Theta(n^2)
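As a quick numeric sanity check, one can accumulate the per-call scan cost directly (a minimal sketch; the loop just adds the cost i of scanning a list that holds i entries at step i):

public class QuadraticCheck {
    public static void main(String[] args) {
        long n = 100_000;
        long cost = 0;
        for (long i = 1; i <= n; i++) {
            cost += i;  // the list holds i entries when it is scanned
        }
        // prints ~1.0: total cost is n(n+1)/2 ~ n^2/2, i.e. Theta(n^2)
        System.out.println(cost / (0.5 * n * n));
    }
}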

Related

Calculate the time complexity of recurrence relation f(n) = f(n/2) + f(n/3)

How to calculate the time complexity of the recurrence relation f(n) = f(n/2) + f(n/3)? We have base cases at n=1 and n=0.
How to calculate the time complexity for the general case, i.e. f(n) = f(n/x) + f(n/y), where x<n and y<n?
Edit 1 (after first answer was posted): every number considered is an integer.
Edit 2 (after first answer was posted): I like the answer given by MBo, but is it possible to answer this without using any fancy theorem like the Master theorem, e.g. by drawing a tree?
However, users are free to answer the way they like and I will try to understand.
In "layman terms" you can get dependence with larger coefficient:
T(n) = T(n/2) + T(n/2) + O(1)
build call tree for n=2^k and see that the last tree level contains 2^k items, higher level 2^k-1 items, next one 2^k-2 and so on. Sum of sequence (geometric progression)
2^k + 2^k-1 + 2^k-2 + ... + 1 = 2^(k+1) = 2*n
so complexity for this dependence is linear too.
Now get dependence with smaller (zero) second coefficient:
T(n) = T(n/2) + O(1)
and ensure in linear complexity too.
Seems clear that complexity of recurrence in question lies between complexities for these simpler examples, and is linear.
In the general case, recurrences with complex branching can be solved with the Akra-Bazzi method (a more general approach than the Master theorem).
I assume that the dependence is
T(n) = T(n/2) + T(n/3) + O(1)
In this case g=1; to find p we should numerically solve
(1/2)^p + (1/3)^p = 1
and get p ~ 0.79, then integrate
T(x) = Theta(x^0.79 * (1 + Int[1..x]((1/u^1.79)*du))) =
Theta(x^0.79 * (1 + 1.27 - 1.27*x^(-0.79))) =
Theta(x^0.79)
So the complexity is Theta(n^0.79): sublinear, consistent with the bounds above.
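To sanity-check that exponent empirically, one can count the calls directly (a minimal sketch; countCalls is a hypothetical helper that counts the nodes of the call tree, using integer division):

public class RecurrenceGrowth {
    // C(n) = C(n/2) + C(n/3) + 1 with C(0) = C(1) = 1:
    // the number of calls made when computing f(n) recursively.
    static long countCalls(long n) {
        if (n <= 1) return 1;
        return countCalls(n / 2) + countCalls(n / 3) + 1;
    }

    public static void main(String[] args) {
        for (long n = 1_000; n <= 100_000_000L; n *= 10) {
            double perPower = countCalls(n) / Math.pow(n, 0.79);
            double perN = (double) countCalls(n) / n;
            // calls/n^0.79 stays roughly constant while calls/n keeps shrinking
            System.out.printf("n=%d calls/n^0.79=%.3f calls/n=%.4f%n", n, perPower, perN);
        }
    }
}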

How to calculate time complexity of this recursion?

I'm interested in calculating the following code's time and space complexity but seem to be struggling a lot. I know that the deepest the recursion could reach is n, so the space should be O(n). However, I have no idea how to calculate the time complexity; I don't know how to write the formula when the recursion has a form like f(f(n-1)).
If it were something like return f3(n-1) + f3(n-1), then I know it would be O(2^n), since T(n) = 2T(n-1), correct?
Here's the code:
int f3(int n)
{
    if (n <= 2)
        return 1;
    f3(1 + f3(n - 2));  // the result is discarded; only the calls cost time
    return n - 1;
}
Thank you for your help!
Notice that f3(n) = n - 1 for all n >= 2, so in the line f3(1 + f3(n-2)), first f3(n-2) is computed, which returns n - 3, and then f3(1 + n - 3) = f3(n-2) is computed again!
So f3(n) computes f3(n-2) twice, along with some O(1) overhead.
We get the recurrence T(n) = 2T(n-2) + c for some constant c, where T(n) is the running time of f3(n).
Unrolling the recurrence gives T(n) = 2^k * T(n-2k) + (2^k - 1)c; the base case is reached after k ~ n/2 steps, so T(n) = O(2^(n/2)).
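A quick empirical check confirms the doubling each time n grows by 2 (a minimal sketch; the static calls counter is added purely for illustration):

public class F3Count {
    static long calls = 0;  // counts invocations of f3

    static int f3(int n) {
        calls++;
        if (n <= 2) return 1;
        f3(1 + f3(n - 2));
        return n - 1;
    }

    public static void main(String[] args) {
        long prev = 1;
        for (int n = 4; n <= 30; n += 2) {
            calls = 0;
            f3(n);
            // the ratio approaches 2: the call count grows like 2^(n/2)
            System.out.printf("n=%d calls=%d ratio=%.2f%n", n, calls, (double) calls / prev);
            prev = calls;
        }
    }
}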

What is the time complexity of T(n) = 2T(n/2) + O(1)?

I want to know the time complexity of my recursive method:
T(n) = 2T(n/2) + O(1)
I saw a result that says it is O(n), but I don't know why. I solved it like this:
T(n) = 2T(n/2) + 1
T(n-1) = 4T((n-1)/4) + 3
T(n-2) = 8T((n-2)/8) + 7
...
T(n) = 2^(n+1) * T(n/2^(n+1)) + (2^(n+1) - 1)
I think you have got the wrong idea about recursive relations. You can think of it as follows:
If T(n) represents the value of the function T() at input n, then the relation says that the output is one more than double the value at half of the current input. So for input n-1 the output, i.e. T(n-1), will be one more than double the value at half of that input, that is T(n-1) = 2*T((n-1)/2) + 1.
Recursive relations of the above kind should be solved as answered by Yves Daoust. For more examples of recursive relations, you can refer to this.
Consider that n = 2^m, which allows you to write
T(2^m) = 2T(2^(m-1)) + O(1)
or, by denoting S(m) := T(2^m),
S(m) = 2 S(m-1) + O(1),
2^(-m) S(m) = 2^(-(m-1)) S(m-1) + 2^(-m) O(1),
and finally, with R(m) := 2^(-m) S(m),
R(m) = R(m-1) + 2^(-m) O(1).
Now by induction,
R(m) = R(0) + (1 - 2^(-m)) O(1),
so that
T(n) = S(m) = 2^m R(m) = 2^m T(1) + (2^m - 1) O(1) = n T(1) + (n - 1) O(1) = O(n).
There are a couple of rules that you need to remember; if you can remember these easy rules, then the Master theorem makes recurrence equations very easy to solve. The basic rules are:
case 1) If n^(log_b a) << f(n) then T(n) = Theta(f(n))
case 2) If n^(log_b a) = f(n) then T(n) = Theta(f(n) * log n)
case 3) If n^(log_b a) >> f(n) then T(n) = Theta(n^(log_b a))
Now, let's solve the recurrence using the above rules.
a = 2, b = 2, f(n) = O(1)
n^(log_b a) = n^(log_2 2) = n
This is case 3) above, hence T(n) = Theta(n^(log_b a)) = Theta(n).
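To see the Theta(n) bound concretely, one can count the nodes of the call tree, since each node contributes O(1) work (a minimal sketch; nodes is a hypothetical helper, not part of the question):

public class MasterCase3 {
    // Number of nodes in the call tree of T(n) = 2T(n/2) + O(1).
    static long nodes(long n) {
        if (n <= 1) return 1;
        return 1 + nodes(n / 2) + nodes(n / 2);
    }

    public static void main(String[] args) {
        for (long n = 1 << 10; n <= 1 << 20; n <<= 5) {
            // prints 2n - 1 for powers of two: total work is linear in n
            System.out.println("n=" + n + " nodes=" + nodes(n));
        }
    }
}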

What's the best bound for T(n) = T(0.8n) + n? I suppose it's O(n), but I'm not sure

I don't think I have fully understood recurrences in algorithms.
The n in the recurrence can also be changed into n^2 or n^3. Are those just the same as the n case?
If applicable, what's the typical method for finding the best bounds on running time?
I also figured out that T(n) = T(0.8n) + n = O(n).
When solving recurrence relations, the most common way is to repeatedly replace functions by their expressions, so that T(n) = T(0.8n) + n = T(0.64n) + 0.8n + n = ... = (1 + 0.8 + 0.64 + 0.512 + ...)n. This is a typical infinite geometric progression. By applying basic calculus, we easily get T(n) = 5n = O(n).
When we change the n in the original expression to n^x, where x is an arbitrary positive constant, we can let t = n^x; the ratio of the geometric progression becomes 0.8^x < 1, the same argument goes through unchanged, and T as a function of t is O(t). So n^x is the same case as n: T(n^x) = O(n^x).
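A numeric unrolling shows the constant 5 emerging for the original toll (a minimal sketch; it just adds the tolls while shrinking n until it drops below 1):

public class GeometricToll {
    public static void main(String[] args) {
        double n = 1_000_000;
        double total = 0;
        // T(n) = T(0.8n) + n: add the toll m at each level, then shrink it
        for (double m = n; m >= 1; m *= 0.8) {
            total += m;
        }
        // prints ~5.0: T(n) ~ (1 + 0.8 + 0.64 + ...) * n = 5n
        System.out.println(total / n);
    }
}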

Design and Analysis of Algorithms: Recursive Relation

I have a great doubt about solving this recursive relation. Can anyone provide me a solution?
The relation:
T(n) = Sum(1 <= i < n: T(i)) + 1
What is the big-O order?
Taking the first-order difference allows you to get rid of the summation:
T(n) - T(n-1) = (Sum(1<=i<n: T(i)) + 1) - (Sum(1<=i<n-1: T(i)) + 1) = T(n-1)
Hence
T(n) = 2*T(n-1)
so each term doubles the previous one and T(n) = O(2^n).
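A short check computes the sum definition directly and confirms the doubling (a minimal sketch; the base case T(1) = 1 is an assumption, since the question does not state one):

public class SumRecurrence {
    public static void main(String[] args) {
        int n = 20;
        long[] t = new long[n + 1];
        t[1] = 1;  // assumed base case
        for (int k = 2; k <= n; k++) {
            long sum = 1;  // the "+ 1" term
            for (int i = 1; i < k; i++) sum += t[i];  // T(k) = sum of T(i) for i < k, plus 1
            t[k] = sum;
        }
        for (int k = 2; k <= n; k++) {
            // every ratio prints 2.0: T(n) = 2^(n-1), i.e. O(2^n)
            System.out.println("T(" + k + ") = " + t[k] + " ratio = " + (double) t[k] / t[k - 1]);
        }
    }
}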
A recurrence relation describes a sequence of numbers. Early terms are specified explicitly and later terms are expressed as a function of their predecessors. As a trivial example, this recurrence describes the sequence 1, 2, 3, etc.:
void Sample(int n)
{
    if (n > 0) {
        Sample(n - 1);
        System.out.println("here");
    } else {
        return;
    }
}
Sample(3);
Here, the first term is defined to be 1 and each subsequent term is one more than its predecessor. In order to analyze a recurrence relation, we need to know the execution time of each line of code. In the above example:
void Sample(int n)
{
    if (n > 0) {
        Sample(n - 1);               // T(n-1)
        System.out.println("here");  // 1
    } else {
        return;                      // base case: T(0) = 1
    }
}
We define T(n) as:
T(n) = T(n-1) + 1, with T(0) = 1.
To solve T(n) = T(n-1) + 1: if we know T(n-1), we can substitute it and get the answer. Substituting n-1 into the relation gives:
T(n-1) = T(n-1-1) + 1 = T(n-2) + 1
// substituting back
T(n) = [T(n-2) + 1] + 1 = T(n-2) + 2
// and again
T(n) = [T(n-3) + 2] + 1 = T(n-3) + 3
.
.
.
// after repeating this k times
T(n) = T(n-k) + k
As you can see, it increases by 1 at each step: if we expand k times, then T(n) = T(n-k) + k. Now we need to find the smallest value, i.e. the point where the recursion stops (there must always be such a point). In this problem, zero ends the recursion stack. Since we assume we expand k times to reach zero, the solution is:
// n-k is the final step
n - k = 0 => n = k
// replace k with n
T(n) = T(n-n) + n => T(n) = T(0) + n
// we know
T(0) = 1
T(n) = 1 + n => O(n)
The big O is n: in the worst case, this recursive algorithm makes n calls.
