What is the time complexity of the following method? The two parameters are confusing me a lot. Thanks in advance.
public int count(int m, int n) {
    if (m == 1 || n == 1) return 1;
    return count(m-1, n) + count(m, n-1);
}
This is in O(2^(n+m)).
It can be proven using induction, where the induction step is:
T(n,m) = T(n-1,m) + T(n,m-1) ≤(*) 2^(n+m-1) + 2^(n+m-1) = 2*2^(n+m-1) = 2^(n+m)
where (*) uses the induction hypothesis.
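If you want to see that bound in action, here is a small instrumented version (my own sketch, not part of the original post) that simply counts the calls and prints how quickly the count grows:

public class CountCalls {
    static long calls = 0;

    static int count(int m, int n) {
        calls++; // count every invocation
        if (m == 1 || n == 1) return 1;
        return count(m - 1, n) + count(m, n - 1);
    }

    public static void main(String[] args) {
        for (int k = 2; k <= 12; k++) {
            calls = 0;
            count(k, k);
            System.out.println("m = n = " + k + " -> " + calls + " calls");
        }
    }
}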
I did this as a solution to one of the LeetCode problems, but I'm not sure what the complexity of my algorithm is.
public String countAndSay(int n) {
    if (n == 1) return "1";
    String pre = countAndSay(n - 1);
    char[] prev = pre.toCharArray();
    int len = prev.length;
    if (len == 1 && prev[0] == '1') return "11";
    int idx = 1;
    int rep = 1;
    String res = "";
    while (idx <= len) {
        if (idx == len) {
            res += (Integer.toString(rep) + prev[idx - 1]);
            break;
        }
        if (prev[idx - 1] == prev[idx]) rep++;
        else {
            res += (Integer.toString(rep) + prev[idx - 1]);
            rep = 1;
        }
        idx++;
    }
    return res;
}
Since the recursion takes place n times and the loop is O(n), I feel like it should be O(n^2). Is that correct? If not, can you please explain why?
Here are a few facts:
The method calls itself recursively on input n-1.
The method produces the sequence known as the look-and-say sequence.
The length of the resulting string grows exponentially with base λ, where λ = 1.303577269034... is Conway's constant, so the length is O(λ^n).
The loop is quadratic on the length of the string (because of the repeated string concatenations), so we have O((λ^n)^2) = O((λ^2)^n) for the loop.
Hence we can derive the following recurrence relation:
T(1) = 1
T(n) = T(n-1) + O((λ^2)^n)
The asymptotic behaviour of this relation is obtained by summing a geometric series whose last term dominates:
T(n) ∈ Θ((λ^2)^n) = Θ(1.6993^n)
If you use a StringBuilder instead of doing the evil repeated string concatenations, you can get it down to Θ(λ^n) which would also be asymptotically optimal for this problem.
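For example, here is a sketch of such a StringBuilder rewrite (my own code, not the original solution, keeping the same recursive structure):

public String countAndSay(int n) {
    if (n == 1) return "1";
    String prev = countAndSay(n - 1);
    StringBuilder res = new StringBuilder();
    int i = 0;
    while (i < prev.length()) {
        char c = prev.charAt(i);
        int run = 0;
        // count the run of equal digits starting at position i
        while (i < prev.length() && prev.charAt(i) == c) {
            run++;
            i++;
        }
        // "say" the run: its length followed by the digit itself
        res.append(run).append(c);
    }
    return res.toString();
}

Each character of the previous term is now touched a constant number of times, so building term n costs Θ(λ^n) rather than Θ((λ^2)^n).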
I need to find the complexity of these recursive algorithms. I have 3 recursive algorithms and simply want to know the Big O notation for them. I think I have the solution for 2 of them and just want to check with the community.
int f1(int n)
{
    if (n <= 1)
        return 1;
    else
        return n * f1(n - 1);
}
I think the solution of this is O(n).
int f2(int n)
{
    if (n <= 1)
        return 1;
    else
        return n * f2(n / 2);
}
I think the solution of this is O(log2(n)).
int f3(int n)
{
    int x, i;
    if (n <= 1)
        return 1;
    else
    {
        x = f3(n / 2);
        for (i = 1; i <= n; i++)
            x++;
        return x;
    }
}
What is the complexity of this recursive algorithm? I don't have the solution for this one. Can you help me?
Your first two answers are correct.
Let's do the analysis for your third problem.
At each level of the recursion, n is divided by 2, and the loop increments x n times,
so the total work is
n + n/2 + n/4 + ... + 1 = n(1 + 1/2 + 1/4 + ...) ≤ 2n = O(n)
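If you want to convince yourself empirically, here is a small instrumented Java sketch (my own, not from the original answer) that counts the loop increments and compares them against the 2n bound:

public class F3Work {
    static long increments = 0;

    static int f3(int n) {
        if (n <= 1) return 1;
        int x = f3(n / 2);
        for (int i = 1; i <= n; i++) {
            x++;
            increments++; // one unit of work per loop iteration
        }
        return x;
    }

    public static void main(String[] args) {
        for (int n = 1; n <= (1 << 20); n *= 2) {
            increments = 0;
            f3(n);
            System.out.println("n = " + n + ": " + increments + " increments (2n = " + (2L * n) + ")");
        }
    }
}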
#codecrazers' answer already covers how to calculate the complexity step by step. But in general the Master Theorem makes the problem a lot simpler.
To start, let's transform this code
int f3(int n)
{
    int x, i;
    if (n <= 1)
        return 1;
    else
    {
        x = f3(n / 2);
        for (i = 1; i <= n; i++)
            x++;
        return x;
    }
}
Into a recurrence:
int f(int n)
{
    if (n <= 1)
        1
    else
        f(n / 2) + θ(n)
}
So
T(n) = T(n / 2) + θ(n)
T(n <= 1) = 1
Which is case 3, thus yielding
T(n) = θ(n)
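Spelling out the bookkeeping behind that step (my own addition, using the standard statement of the Master Theorem):

T(n) = a\,T(n/b) + f(n), \qquad a = 1,\ b = 2,\ f(n) = \Theta(n)
n^{\log_b a} = n^{\log_2 1} = n^0 = 1
f(n) = \Omega(n^{0 + \varepsilon}) \text{ with } \varepsilon = 1, \qquad a\,f(n/b) = n/2 \le \tfrac{1}{2}\,f(n)
\Rightarrow \text{(case 3)}\quad T(n) = \Theta(f(n)) = \Theta(n)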
I've been trying to calculate the complexity of an algorithm, but I'm not sure how to do that. I know how to handle simple algorithms, but I'm struggling with recursion.
Here is the recursive code:
static int F(int m, int n)
{
    if (n == 0)
        return m;
    if (m == 0 && n > 0)
        return n;
    else
        return Math.Min((1 + F(m - 1, n)), Math.Min((1 + F(m, n - 1)), (D(m - 1, n - 1) + F(m - 1, n - 1))));
}
Can someone explain this to me or help me calculate the complexity of this function? I've tried googling it, but I can only find simple examples. (Maybe my code is also simple?)
Thank you!
I don't know what the D function is in your first piece of code. I'll assume it takes constant time.
The first piece of your code is equivalent to the following one.
static int F(int m, int n)
{
    if (n == 0 || m == 0)
        return n + m;
    else
        return Math.Min((1 + F(m - 1, n)), Math.Min((1 + F(m, n - 1)), (D(m - 1, n - 1) + F(m - 1, n - 1))));
}
It's a little difficult to calculate the time complexity of a recursive function with two parameters, but we can estimate it roughly. We have the following equation.
T(n, m) = T(n-1, m) + T(n, m-1) + T(n-1, m-1)
We can see that this equation is quite similar to the recurrence for the binomial coefficients, but with an even larger result. This tells us that the time complexity of the algorithm is exponential, which is quite slow.
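To make that comparison concrete, here is a rough lower bound (my own sketch: drop the third term and compare with the binomial recurrence):

T(n, m) \ge T(n-1, m) + T(n, m-1), \qquad T(n, 0) \ge 1,\ T(0, m) \ge 1
\Rightarrow\ T(n, m) \ge \binom{n+m}{n} \ge 2^{\min(n, m)}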
But actually, you can use memoization to reduce its time complexity to O(mn).
static bool[,] calculated = new bool[MAX_M, MAX_N];
static int[,] answer = new int[MAX_M, MAX_N];

static int F(int m, int n)
{
    if (n == 0 || m == 0)
        return n + m;
    else
    {
        if (calculated[m, n] == false)
        {
            // compute the value once and cache it
            answer[m, n] = Math.Min((1 + F(m - 1, n)), Math.Min((1 + F(m, n - 1)), (D(m - 1, n - 1) + F(m - 1, n - 1))));
            calculated[m, n] = true;
        }
        return answer[m, n];
    }
}
I can't quite understand what the second piece of code is supposed to do, as there are several functions not provided in your code. Maybe you can explain that a bit?
public static int test(int N) {
    if (N == 1) return 1;
    return (3 * (test(N/2) + test(N/2)) + f(N));
}

public static void f(int a) {
    for (int i = 1; i <= a; i++)
        System.out.println("algo rocks");
}
I was trying to determine C(N) and the complexity for the code above.
I came to this conclusion:
C(1) = 0 --> the terminating condition
C(N) = 2 C(N/2) + N
I was lost with the 3 that the recursive calls are multiplied by. Can you please check my work and guide me where it is wrong?
You're wrong in claiming that C(1) = 0; it is actually 1, since the terminating call still does constant work.
So, C(1) = 1.
Also, the time complexity of the function f() in the worst case comes out to be O(N), since N is passed to it and its loop runs N times.
The 3 only multiplies the values returned by the two recursive calls, which is constant-time work; it does not multiply the number of calls, and there are exactly two of them. So your recurrence relation turns out to be:
T(N) = 2 T(N/2) + O(N)
I am leaving the recurrence relation for you to solve; it's easy. If you're unable to solve it, ping below this answer after trying at least once.
I came across an interesting puzzle in a previous interview.
You need to implement a function that satisfies the following conditions:
m, n are positive integers (m, n > 0)
F(m, n) = F(m-1, n-1) + F(m, n-1)
F(1, n) = 1
F(m, 1) = 1
Obviously you can write the recursive implementation:
int F(int m, int n)
{
    if (m == 1) return 1;
    if (n == 1) return 1;
    return F(m-1, n-1) + F(m, n-1);
}
But for inputs equal to one billion it will run for a very long time, because it will take about 2^1000000000 recursive calls :)
Does anybody have any ideas how to optimize this solution?
function F(m, n)
    v = 1            -- v holds the current binomial coefficient C(n-1, k), starting with C(n-1, 0) = 1
    s = 1            -- running sum of the coefficients
    k = 1
    while k < m do
        v = v * (n-k) / k
        s = s + v
        k = k + 1
    end
    return s
end
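For reference, here is a direct Java translation of that pseudocode (my own sketch, not part of the original answer). The loop maintains v = C(n-1, k), so the function computes C(n-1, 0) + C(n-1, 1) + ... + C(n-1, m-1), which one can check by induction satisfies the stated recurrence. Note that for inputs anywhere near a billion these values overflow any fixed-width integer, so real code would need BigInteger or modular arithmetic, which the puzzle statement leaves open.

public class Puzzle {
    // Closed-form evaluation of F(m, n) in O(m) arithmetic operations.
    static long F(long m, long n) {
        long v = 1; // v = C(n-1, k) for the current k, starting with C(n-1, 0) = 1
        long s = 1; // running sum of the coefficients
        for (long k = 1; k < m; k++) {
            v = v * (n - k) / k; // exact: v * (n - k) is always divisible by k at this point
            s = s + v;
        }
        return s;
    }

    public static void main(String[] args) {
        // Small sanity check against the recursive definition: F(3, 4) = 7.
        System.out.println(F(3, 4));
    }
}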