Determine C(N) and the time complexity for the recursive function - algorithm

public static int test(int N) {
    if (N == 1) return 1;
    return 3 * (test(N/2) + test(N/2)) + f(N);
}

public static int f(int a) {    // must return an int so the expression in test compiles
    for (int i = 1; i <= a; i++)
        System.out.println("algo rocks");
    return a;
}
I was trying to determine C(N) and the complexity for the code above, and I came to this conclusion:
C(1) = 0 --> the terminating condition
C(N) = 2 C(N/2) + N
I was lost with the 3 that the recursive calls are multiplied by. Can you please check my work and point out where it went wrong?

You're wrong in claiming that C(1) = 0; it is actually 1, since the base case still performs the comparison and the return.
So, C(1) = 1.
Also, the time complexity of the function f() in the worst case comes out to be O(N), since N is passed to it and its loop runs N times.
The 3 you were puzzled by is just a constant-time multiplication applied to the results of the two recursive calls; it does not create any additional recursive calls, so it does not appear in the recurrence. Your recurrence relation therefore turns out to be:
T(N) = 2 T(N/2) + O(N).
I am leaving the recurrence relation for you to solve. It's easy. If you're unable to calculate it, ping below this answer after trying at least once.
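If you want to sanity-check the recurrence empirically before solving it, here is a minimal sketch of my own (not part of the original answer): it replaces the println in f with a work counter and, as an assumption, makes f return its argument so the expression in test compiles. The ratio work / (N log2 N) settles near a constant, which is consistent with the Θ(N log N) solution the Master Theorem gives for T(N) = 2T(N/2) + O(N).

public class RecurrenceCheck {
    // 'work' counts one unit per call of test plus one unit per loop iteration in f.
    static long work = 0;

    static int test(int N) {
        work++;                           // constant work in this call
        if (N == 1) return 1;
        return 3 * (test(N / 2) + test(N / 2)) + f(N);
    }

    static int f(int a) {
        for (int i = 1; i <= a; i++) {
            work++;                       // stands in for each println in the original
        }
        return a;                         // the original f is void; returning a is an assumption
    }

    public static void main(String[] args) {
        for (int N = 2; N <= (1 << 20); N <<= 2) {
            work = 0;
            test(N);
            double nLogN = N * (Math.log(N) / Math.log(2));
            System.out.printf("N=%d  work=%d  work/(N log2 N)=%.3f%n", N, work, work / nLogN);
        }
    }
}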

Related

complexity of nT(n/2)+n

What is the complexity of this code? I know the relation is T(n) = nT(n/2) + n.
Code
void methode(int n) {
    for (int i = 1; i <= n; i++) {
        ifs1 = ifs1 + 1;        // ifs1 is assumed to be a global counter
        if (n >= 1)
            methode(n / 2);
    }
}
This method has a very unusual recurrence relation, T(n) = n(T(n/2) + 1) = nT(n/2) + n. I believe the asymptotic solution is
T(n) ∈ Θ(n^((log2(n) + 1)/2))
Let me know if you need me to spell it out in more detail.
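Spelling it out a bit (a sketch added here, not part of the answer): for n = 2^k, expanding T(n) = nT(n/2) + n gives the dominant product 2^k * 2^(k-1) * ... * 2 = 2^(k(k+1)/2) = n^((log2(n)+1)/2). The following snippet iterates the recurrence numerically and shows the ratio to that closed form converging to a constant.

public class UnusualRecurrence {
    public static void main(String[] args) {
        double t = 1.0;                                     // T(1) = 1
        for (int k = 1; k <= 40; k++) {
            double n = Math.pow(2, k);
            t = n * t + n;                                  // T(n) = n*T(n/2) + n
            double closedForm = Math.pow(n, (k + 1) / 2.0); // n^((log2(n)+1)/2)
            System.out.printf("n=2^%d  T(n)/n^((log2(n)+1)/2) = %.4f%n", k, t / closedForm);
        }
    }
}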

Recurrence relation for running time of recursive algorithm for finding factorial(n)

Try to write down the recurrence relation for the running time of the recursive algorithm for finding factorial(n).
Suggest the base cases for that recurrence relation.
I could not solve this problem. Can someone explain it?
Factorial can be determined by this algorithm:
1. Read number n.
2. Initialize i and fact to 1.
3. Repeat steps 4 and 5 while i is less than or equal to n.
4. fact <- fact * i
5. i <- i + 1
6. Return fact
In short, if we write this down recursively, it will be:
int Fact(int num) {
    if (num <= 1) {
        return 1;
    }
    else {
        return num * Fact(num - 1);
    }
}
Basically, the factorial of a number num is num * Fact(num - 1), so the running time satisfies T(num) = T(num - 1) + O(1) with the base case T(1) = O(1), which solves to O(num).
We can also compute it iteratively like this:
int fact[num + 1];
fact[0] = 1;
for (int i = 1; i <= num; i++) {
    fact[i] = fact[i - 1] * i; // recurrence relation is here
}
printf("%d\n", fact[num]);
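To make the running-time recurrence the question asks for concrete: each call to Fact does a constant amount of work plus one recursive call, so C(num) = C(num - 1) + 1 with C(1) = 0, which solves to num - 1 multiplications, i.e. O(num). Here is a small sketch of my own (the multiplications counter is added for illustration):

public class FactCount {
    static int multiplications = 0;          // counts the constant work per call

    static long fact(int num) {
        if (num <= 1) return 1;              // base case: C(1) = 0 multiplications
        multiplications++;                   // one multiplication per level: C(num) = C(num - 1) + 1
        return num * fact(num - 1);
    }

    public static void main(String[] args) {
        for (int n = 1; n <= 10; n++) {
            multiplications = 0;
            fact(n);
            System.out.println("n=" + n + "  multiplications=" + multiplications); // prints n - 1
        }
    }
}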

Big O notations - Recursive functions

I need to find the complexity of these recursive algorithms. I have 3 recursive algorithms and simply want to know the Big O notation for them. I think I have the solution for 2 of them and just want to check with the community.
int f1(int n)
{
    if (n <= 1)
        return (1);
    else
        return (n * f1(n - 1));
}
I think the solution for this is O(n).
int f2(int n)
{
    if (n <= 1)
        return (1);
    else
        return (n * f2(n / 2));
}
I think the solution for this is O(log2(n)).
int f3(int n)
{
    int x, i;
    if (n <= 1)
        return 1;
    else
    {
        x = f3(n / 2);
        for (i = 1; i <= n; i++)
            x++;
        return x;
    }
}
What is the complexity of this recursive algorithm? I don't have the solution for it. Can you help me?
Your first two answers are correct.
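As a quick check of those two (a sketch of my own, not part of the original answer), you can simply count the recursive calls: f1 shrinks n by 1 each time, so it makes about n calls, while f2 halves n each time, so it makes about log2(n) calls.

public class CallCounter {
    static long calls1 = 0, calls2 = 0;      // illustrative counters

    static long f1(long n) {                 // return value overflows for large n; only the call count matters here
        calls1++;
        if (n <= 1) return 1;
        return n * f1(n - 1);
    }

    static long f2(long n) {
        calls2++;
        if (n <= 1) return 1;
        return n * f2(n / 2);
    }

    public static void main(String[] args) {
        int n = 1000;
        f1(n);
        f2(n);
        System.out.println("f1 calls: " + calls1 + "  (n = " + n + ")");        // about n
        System.out.println("f2 calls: " + calls2 + "  (log2(n) is about 10)");  // about log2(n)
    }
}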
Let's do the analysis for your third problem.
On each call, n is divided by 2 and we increment x n times, so the total work is
1*n + 1*(n/2) + 1*(n/4) + ... + 1 = n(1 + 1/2 + 1/4 + ...) <= 2n = O(n)
The answer by #codecrazers already covers how to calculate the complexity step by step. But in general, the Master Theorem makes the problem a lot simpler.
To start, let's transform this code
int f3(int n)
{
    int x, i;
    if (n <= 1)
        return 1;
    else
    {
        x = f3(n / 2);
        for (i = 1; i <= n; i++)
            x++;
        return x;
    }
}
Into a recurrence:
int f(int n)
{
    if (n <= 1)
        1
    else
        f(n / 2) + Θ(n)
}
So
T(n) = T(n/2) + Θ(n)
T(n) = 1 for n <= 1
which is case 3 of the Master Theorem, thus yielding
T(n) = Θ(n)
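To see the Θ(n) bound concretely, here is a sketch added for illustration (the increments counter is my own addition) that counts how many times x++ executes across all recursive calls; the total stays below 2n, matching the geometric series n + n/2 + n/4 + ... above.

public class F3WorkCount {
    static long increments = 0;              // counts loop iterations across all recursive calls

    static int f3(int n) {
        if (n <= 1) return 1;
        int x = f3(n / 2);
        for (int i = 1; i <= n; i++) {
            x++;
            increments++;
        }
        return x;
    }

    public static void main(String[] args) {
        for (int n = 1 << 10; n <= 1 << 20; n <<= 5) {
            increments = 0;
            f3(n);
            System.out.printf("n=%d  increments=%d  increments/n=%.3f%n",
                              n, increments, (double) increments / n);
        }
    }
}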

Confused about these asymptotic notations and their runtimes

public static Asymptotic f3_notation = Asymptotic.BIG_THETA;
public static Runtime f3_runtime = Runtime.LINEAR;

/* When f3 is first called, start will be 0 and end will be the length of the array - 1 */
public int f3(char[] array, int start, int end) {
    if (array.length <= 1 || end <= start) {
        return 1;
    }
    int mid = start + ((end - start) / 2);
    return f3(array, start, mid) + f3(array, mid + 1, end);
}
public static Asymptotic f4_notation = Asymptotic.BIG_THETA;
public static Runtime f4_runtime = Runtime.LINEARITHMIC;

/* When f4 is first called, start will be 0 and end will be the length of the array - 1 */
public int f4(char[] array, int start, int end) {
    if (array.length <= 1 || end <= start) return 1;
    int counter = 0;
    for (int i = start; i < end; i++) {
        if (array[i] == 'a') counter++;
    }
    int mid = start + ((end - start) / 2);
    return counter + f4(array, start, mid) + f4(array, mid + 1, end);
}
So I have these two methods. What I don't understand is why, even though both are recursive, the first one is linear and the second one is linearithmic.
I was told that if the problem size is divided or multiplied, the runtime is usually log n. The first method does divide, yet it is still considered linear, while the second is not.
The more I read, the more it confuses me and makes me feel like I know nothing.
The formula for the first method is:
T(n) = 2T(n/2) + O(1)
If you draw the recursion tree for this formula, you will see that the amount of work is proportional to the number of nodes in the tree, which is O(n). You could also use the Master Method to solve this.
But for the second it is:
T(n) = 2T(n/2) + O(n)
What happens here is that your tree will (just like in the first method) have O(log n) levels, but at each level you spend O(n) time, which results in O(n log n) time complexity. Again, the Master Theorem works for this. Note that in the first case, although the tree also has O(log n) levels, at each level you spend time proportional to the number of nodes on that level, not O(n).
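One way to make the difference tangible is to count the basic operations each method performs. This is a sketch I'm adding (the f3Work and f4Work counter fields and the static rewrite are my own, for illustration): the constant-per-call work of f3 sums to about 2n, while the per-call loop of f4 sums to about n log2 n.

public class DivideAndCountDemo {
    static long f3Work = 0, f4Work = 0;      // illustrative counters

    static int f3(char[] array, int start, int end) {
        f3Work++;                            // O(1) work per call
        if (array.length <= 1 || end <= start) return 1;
        int mid = start + ((end - start) / 2);
        return f3(array, start, mid) + f3(array, mid + 1, end);
    }

    static int f4(char[] array, int start, int end) {
        if (array.length <= 1 || end <= start) return 1;
        int counter = 0;
        for (int i = start; i < end; i++) {  // O(n) work per call
            f4Work++;
            if (array[i] == 'a') counter++;
        }
        int mid = start + ((end - start) / 2);
        return counter + f4(array, start, mid) + f4(array, mid + 1, end);
    }

    public static void main(String[] args) {
        for (int n = 1 << 10; n <= 1 << 20; n <<= 5) {
            char[] array = new char[n];
            f3Work = 0;
            f4Work = 0;
            f3(array, 0, n - 1);
            f4(array, 0, n - 1);
            double nLogN = n * (Math.log(n) / Math.log(2));
            System.out.printf("n=%d  f3Work/n=%.2f  f4Work/(n log2 n)=%.2f%n",
                    n, (double) f3Work / n, f4Work / nLogN);
        }
    }
}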

What is the time complexity of a function with two parameters?

What is the time complexity of the following method? The two parameters confuse me. Thanks in advance.
public int count(int m, int n) {
    if (m == 1 || n == 1) return 1;
    return count(m - 1, n) + count(m, n - 1);
}
This is in O(2^(n+m)).
It can be proven by induction, where the induction step is:
T(n, m) = T(n-1, m) + T(n, m-1) <=(*) 2^(n+m-1) + 2^(n+m-1) = 2 * 2^(n+m-1) = 2^(n+m)
where (*) applies the induction hypothesis.
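If you want to see that growth, here is a small sketch of my own (the calls counter is added for illustration) that counts how many calls count(m, n) makes; the count grows exponentially with m + n and stays below 2^(m+n), consistent with the bound above.

public class TwoParamCount {
    static long calls = 0;                   // illustrative call counter

    static int count(int m, int n) {
        calls++;
        if (m == 1 || n == 1) return 1;
        return count(m - 1, n) + count(m, n - 1);
    }

    public static void main(String[] args) {
        for (int k = 2; k <= 12; k++) {
            calls = 0;
            count(k, k);
            System.out.printf("m=n=%d  calls=%d  2^(m+n)=%d%n", k, calls, 1L << (2 * k));
        }
    }
}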

Resources