What is the complexity of this code? I know the recurrence relation is T(n) = n·T(n/2) + n.
Code
void methode(int n) {
    for (int i = 1; i <= n; i++) {
        ifs1 = ifs1 + 1;
        if (n >= 1)
            methode(n / 2);
    }
}
This method has a rather unusual recurrence relation, T(n) = n·T(n/2) + n. I believe the asymptotic solution is
T(n) ∈ Θ(n^((log2(n)+1)/2))
Let me know if you need me to spell it out in more detail.
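As a quick numerical sanity check (my own sketch, assuming a base case of T(1) = 1, which the question doesn't state):

```python
def T(n):
    # T(n) = n*T(n/2) + n for powers of two, with assumed base case T(1) = 1
    if n <= 1:
        return 1
    return n * T(n // 2) + n

# For n = 2^k the claimed bound n^((log2(n)+1)/2) equals 2^(k(k+1)/2).
# If the Theta bound is right, the ratio of T(n) to that bound should
# settle near a constant as k grows.
for k in range(1, 11):
    n = 2 ** k
    print(k, T(n) / 2 ** (k * (k + 1) // 2))
```

The ratio climbs from 2 toward roughly 2.64 and flattens out, which is consistent with the Θ bound.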
I'd be very grateful if someone explained to me how to analyse the time complexity of this loop:
int t;
for (i = 1; i <= n; i++){
t = i;
while(t > 0){
t = t/2;
}
}
I'm inclined to think it is O(n·log(n)) since it's very similar to:
int t;
for (i = 1; i <= n; i++){
t = n;
while(t > 0){
t = t/2;
}
}
but it does fewer assignments to the variable t. Is that the case?
If so, how could I arrive at this conclusion more rigorously?
For the first snippet, the inner loop runs log(1) + log(2) + ... + log(n) times in total, which is the same as log(1 · 2 · ... · n) = log(n!), and by Stirling's approximation that is Θ(n log(n)). Your second snippet has the same complexity. Even if the first happens to do fewer assignments, we only care about the overall growth; in this case both are linearithmic.
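To make the counting concrete, here is a small Python sketch (my own) that tallies the inner-loop iterations of the first snippet:

```python
import math

def count_iterations(n):
    # Tally how many times the inner while-loop body runs in the first snippet
    total = 0
    for i in range(1, n + 1):
        t = i
        while t > 0:
            t //= 2
            total += 1
    return total

n = 1000
c = count_iterations(n)
# Halving t = i down to 0 takes exactly floor(log2(i)) + 1 steps,
# which is i.bit_length() in Python
assert c == sum(i.bit_length() for i in range(1, n + 1))
print(c, n * math.log2(n))  # 8987 vs ~9966: same order of growth
```

The exact count is a bit below n·log2(n) because of the floors, but the growth rate is the same.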
I have a Computer Science assignment to determine the recurrence of the algorithm given as:
public void foobar(int n) {
    if (n < 0)
        return;
    foobar(n / 3);
    for (int i = 1; i <= n; i = i + 1)
        for (int j = n; j >= 1; j = j / 2)
            print(i + j);
    foobar(2 * n / 3);
    for (int i = 1; i >= 5000; i = i + 1)
        print(i);
}
I know about basic relation like:
T(n) = T(n/3) + T(2n/3) + time for other stuff
I am unable to determine the running time of the loops at each recursive call. Any help would be much appreciated and would greatly help my studies. Thank you!
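For what it's worth, the non-recursive work per call can be counted directly. Here is a Python sketch of my own, with print replaced by a counter:

```python
import math

def loop_work(n):
    """Count body executions of the loops in one call of foobar(n),
    ignoring the recursive calls."""
    count = 0
    for i in range(1, n + 1):          # for (i = 1; i <= n; i++)
        j = n
        while j >= 1:                  # for (j = n; j >= 1; j = j/2)
            count += 1
            j //= 2
    # The nested pair runs n * (floor(log2(n)) + 1) times in total:
    assert count == n * n.bit_length()
    # The final loop (i = 1; i >= 5000) never runs as written, since
    # 1 >= 5000 is false on entry, so it contributes only O(1).
    return count

for n in (16, 100, 1024):
    print(n, loop_work(n), n * math.log2(n))
```

So each call does Θ(n log n) work outside the recursion, giving T(n) = T(n/3) + T(2n/3) + Θ(n log n), which (e.g. by Akra–Bazzi or a recursion-tree argument) works out to Θ(n log² n), if I haven't slipped.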
I need to find the complexity of these recursive algorithms. I have 3 recursive algorithms and simply want to know the Big O notation for each. I think I have the solution for 2 of them and just want to check with the community.
int f1(int n)
{
if ( n<= 1)
return (1);
else
return (n * f1(n-1));
}
I think the solution for this one is O(n).
int f2 (int n)
{
if(n<=1)
return(1);
else
return(n * f2(n / 2));
}
I think the solution for this one is O(log2(n)).
int f3(int n)
{
int x, i;
if( n <= 1)
return 1;
else
{
x = f3 (n / 2);
for( i = 1 ; i <= n ; i++)
x++;
return x;
}
}
What is the complexity of this recursive algorithm? I don't have the solution for this one. Can you help me?
Your first two answers are correct.
Let's do the analysis for your third problem.
At each recursive call, n is divided by 2 and the loop increments x n times,
so the total number of increments is
n + n/2 + n/4 + ... + 1 = n(1 + 1/2 + 1/4 + ...) ≤ 2n = O(n)
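The geometric series can be checked empirically with a short Python sketch (my own, assuming integer division for n / 2, matching C semantics):

```python
def f3_increments(n):
    # Count how many times f3 increments x, following the recursion
    if n <= 1:
        return 0
    return f3_increments(n // 2) + n

# The series n + n/2 + n/4 + ... stays below 2n:
for n in (10, 1000, 10 ** 6):
    total = f3_increments(n)
    assert total < 2 * n
    print(n, total)
```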
#codecrazers' answer already covers how to calculate the complexity step by step, but in general the Master Theorem makes the problem a lot simpler.
To start, let's transform this code
int f3 (int n)
{
int x, i;
if( n <= 1)
return 1;
else
{
x = f3 (n / 2);
for( i = 1 ; i <= n ; i++)
x++;
return x;
}
}
Into a recurrence:
int f(int n)
{
if( n <= 1)
1
else
f(n / 2) + θ(n)
}
So
T(n) = T(n / 2) + θ(n)
T(n <= 1) = 1
Which is case 3 of the Master Theorem (a = 1, b = 2, so n^(log_b a) = n^0 = 1, and f(n) = Θ(n) dominates polynomially), thus yielding
T(n) = θ(n)
I am not sure of the rules/procedure to follow when determining the asymptotic complexity of methods. I know a single declaration is O(1), a single loop is O(n), and nested loops are O(n^2). Loop variables that are doubled give log_2(n), and ones that are quartered give log_4(n). And if we have a loop that's O(n) with something inside it that contributes log(n), the result is n·log(n). But I am still unsure how to put all that together. Do we focus on the user-supplied 'n' value to determine asymptotic complexity, or on the incrementing variable 'i'?
Can someone please walk me through these examples and show how it's done.
Example1:
for (k = 0; k < n; k = k + 3)
for (p = n; p > 6; p--)
System.out.println(p % 2);
T(n) = ?
Asymptotic complexity + ?
Example2:
for (k = 0; k <= n/8; k++)
System.out.println(k);
System.out.println("Next");
for (p = n; p >= 1; p--)
System.out.println(p % 2);
T(n) = ?
Asymptotic complexity = ?
Example3:
for (i = n - 3; i <= n - 1; i++)
System.out.println(i);
for (k = 1; k <= n; k++)
System.out.println(i + k);
T(n) = ?
Asymptotic complexity = ?
Example4:
for (a = 1; a <= n/3; a++)
for (b = 1; b <= 2 * n; b++)
System.out.println(a * b);
T(n) = ?
Asymptotic complexity = ?
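As an illustration of the kind of counting these examples call for, here is a Python sketch (my own) that tallies how many times Example 1's innermost statement executes:

```python
import math

def example1_count(n):
    # Count executions of the print in Example 1
    count = 0
    k = 0
    while k < n:           # for (k = 0; k < n; k = k + 3)
        p = n
        while p > 6:       # for (p = n; p > 6; p--)
            count += 1
            p -= 1
        k += 3
    return count

n = 30
# Outer loop: ceil(n/3) iterations; inner loop: n - 6 iterations (for n > 6)
assert example1_count(n) == math.ceil(n / 3) * (n - 6)
```

So Example 1 executes its body ⌈n/3⌉·(n − 6) times, i.e. Θ(n²). By the same style of counting, Example 2 is Θ(n), Example 3 is Θ(n) if its two loops are sequential as indented, and Example 4 runs (n/3)·(2n) bodies, which is again Θ(n²).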
public static int test(int N) {
    if (N == 1) return 1;
    return 3 * (test(N/2) + test(N/2)) + f(N);
}

public static int f(int a) {
    for (int i = 1; i <= a; i++)
        System.out.println("algo rocks");
    return 0;
}
I was trying to determine C(N) and the complexity for the code above.
I came to this conclusion:
C(1) = 0 --> the terminating condition
C(N) = 2 C(N/2) + N
I was lost with the 3 that the function results are multiplied by. Can you please check my work and guide me where it is wrong?
You're wrong in claiming that C(1) = 0; it is actually 1, since the base case still performs a comparison.
So, C(1) = 1.
Also, the time complexity of the function f() is O(N), since it loops from 1 up to the N that is passed in.
As for the 3: it only multiplies the returned value. That is a single O(1) arithmetic operation, so it does not multiply the running time and does not appear in the recurrence.
Your recurrence relation therefore stays:
T(N) = 2 T(N/2) + O(N).
I am leaving the recurrence relation for you to solve; it's easy. If you're unable to work it out, ping below this answer after trying at least once.
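For what it's worth, the recurrence can be checked numerically (a Python sketch of mine, counting one unit for the base case and N units for f):

```python
def C(n):
    # Cost recurrence for test(N): two recursive calls plus O(N) work in f
    if n == 1:
        return 1
    return 2 * C(n // 2) + n

# For n = 2^k, unrolling gives C(n) = n + n*log2(n) = n * (log2(n) + 1)
for k in (4, 8, 10):
    n = 2 ** k
    assert C(n) == n * (k + 1)
    print(n, C(n))
```

So for powers of two C(N) = N·(log2(N) + 1), i.e. test runs in Θ(N log N).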