I have a Computer Science assignment to determine the recurrence of the algorithm given as:
public void foobar(int n)
    if n < 0
        return
    foobar(n/3)
    for (int i = 1; i <= n; i = i+1)
        for (int j = n; j >= 1; j = j/2)
            print(i+j)
    foobar(2n/3)
    for (int i = 1; i >= 5000; i = i+1)
        print(i)
I know the basic relation:
T(n) = T(n/3) + T(2n/3) + time for other stuff
but I am unable to determine the running time of the loops at each recursive call. Any help would be much appreciated and would greatly help my studies. Thank you!
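Not the assignment's answer, but one way to measure the "time for other stuff": this Python sketch (my own port, assuming j = j/2 is integer division) counts the prints done by the loops in a single call of foobar, without recursing.

```python
def loop_work(n):
    """Count the print operations the loops do in ONE call of foobar(n),
    ignoring the recursive calls."""
    count = 0
    # for (int i = 1; i <= n; i = i+1)
    for i in range(1, n + 1):
        # for (int j = n; j >= 1; j = j/2)  -- j is halved, so ~log2(n) steps
        j = n
        while j >= 1:
            count += 1
            j //= 2
    # for (int i = 1; i >= 5000; i = i+1) -- the condition 1 >= 5000 is false
    # immediately, so this loop contributes nothing.
    return count
```

For n = 8 this returns 8 * 4 = 32: the outer loop runs n times and the inner loop floor(log2 n) + 1 times, so the non-recursive work per call is Θ(n log n), which is the "other stuff" term in the recurrence above.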
Related
Can anyone help me identify the steps for the following example, and give more explanation of how to determine its Big-O notation? Is it O(2^n)?
int i, j = 1;
for (i = 1; i <= n; i++)
{
    j = j * 2;
}
for (i = 1; i <= j; i++)
{
    cout << j << "\n";
}
Thank you in advance.
The first loop has n iterations and assigns 2^n to j.
The second loop has j = 2^n iterations.
The cout has time complexity O(log j) = O(n).
Hence the overall complexity is O(n * 2^n), which is strictly larger than O(2^n).
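To make the counts above concrete, here is a small Python sketch (hypothetical helper name) that counts the iterations of both loops; the cost of printing j itself is not modeled:

```python
def count_ops(n):
    """Return (first-loop iterations, final j, second-loop iterations)."""
    j = 1
    first = 0
    for i in range(1, n + 1):   # first loop: exactly n iterations
        j = j * 2
        first += 1
    second = 0
    i = 1
    while i <= j:               # second loop: j = 2^n iterations
        second += 1
        i += 1
    return first, j, second
```

count_ops(5) gives (5, 32, 32), matching n iterations for the first loop and 2^n for the second.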
function sum(arr) {
    let count = 0;
    let count1 = 0;
    for (let i = 0; i < arr.length; ++i) {
        count = arr[i] + count;
    }
    for (let j = 0; j < arr.length; ++j) {
        if (j === arr.length - 1) {
            break;
        }
        count1 = arr[j] * arr[j + 1] + count1;
    }
    count = count + count1;
    return count;
}
I implemented the code above to calculate the sum of the numbers in an array plus the products of each adjacent pair. For example, if array = [1,2,3], it would be 1+2+3+1*2+2*3 = 14. However, writing two separate for loops seems silly to me; is there a more elegant way to do this? In addition, I'm stuck on proving this algorithm's correctness and its running time, though it looks like O(n) + O(n) = 2·O(n) = O(n) to me. For correctness, I think induction is the way to go, but I'm currently having trouble using induction to prove an algorithm.
Simple modification:
function sum(arr) {
    let count = arr[0];
    for (let i = 1; i < arr.length; ++i) {
        count = count + arr[i] * (arr[i - 1] + 1);
    }
    return count;
}
And yes, you are right that O(n) + O(n) = 2·O(n) = O(n), because constant factors are ignored.
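As a sanity check (a Python port, not the original JavaScript), the two versions can be compared on a few inputs; note the single-loop rewrite reads arr[0], so the empty array is handled separately here, whereas the original JS would return undefined:

```python
def sum_two_loops(arr):
    """Port of the original two-loop version: sum of elements plus the
    sum of products of adjacent pairs."""
    count = sum(arr)
    count1 = sum(arr[j] * arr[j + 1] for j in range(len(arr) - 1))
    return count + count1

def sum_one_loop(arr):
    """Port of the single-loop rewrite: count += arr[i] * (arr[i-1] + 1)
    adds arr[i]*arr[i-1] (the pair product) and arr[i] (the element) at once."""
    if not arr:
        return 0   # choice made here; the original JS returns undefined
    count = arr[0]
    for i in range(1, len(arr)):
        count = count + arr[i] * (arr[i - 1] + 1)
    return count
```

For [1, 2, 3] both return 14. The identity arr[i] * (arr[i-1] + 1) = arr[i]*arr[i-1] + arr[i] is also the key step of an induction proof: if count is correct for the first i elements, adding that term keeps it correct for the first i+1.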
I'm kind of confused about how to conclude the Big-O notation for this while loop, where N is the input size:
array[0][(N-1)/2] = 1;   /* array is declared in the enclosing function */
int key = 2, k, l;
int i = 0;
int j = (N-1)/2;
while (key <= N*N)
{
    if (i <= 0)
        k = N-1;
    else
        k = i-1;
    if (j <= 0)
        l = N-1;
    else
        l = j-1;
    if (array[k][l])
        i = (i+1)%N;
    else
    {
        i = k;
        j = l;
    }
    array[i][j] = key;
    key++;
}
I concluded it is O(N²), because when N = 5 it iterates until key reaches N*N, i.e. 5*5 = 25, so about 25 times. But I'm still confused about the rest of the code inside the loop. I would really appreciate a step-by-step explanation of it; this loop is just part of a bigger function which has 4 more loops, which I understood, but not this one.
What you should actually care about is how key changes. It grows by one in each iteration, and there are no shortcuts here, so the body runs once for each key = 2, 3, …, N², i.e. N² − 1 times, with O(1) work per iteration.
So it's just O(N²).
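For the step-by-step picture: the loop looks like a variant of the Siamese method for building an odd-order magic square (move up-left with wraparound; on collision, drop down one row). A Python simulation of the same logic (a sketch, not the original C) confirms the body runs exactly N² − 1 times:

```python
def fill_square(N):
    """Simulate the while loop for odd N and count its iterations.
    1 starts in the middle of the top row; each next key goes one cell
    up-left (wrapping at the edges), or down one row if that cell is taken."""
    array = [[0] * N for _ in range(N)]
    array[0][(N - 1) // 2] = 1
    i, j = 0, (N - 1) // 2
    key = 2
    iterations = 0
    while key <= N * N:
        k = N - 1 if i <= 0 else i - 1   # row above, wrapping to the bottom
        l = N - 1 if j <= 0 else j - 1   # column to the left, wrapping right
        if array[k][l]:                  # occupied: move down instead
            i = (i + 1) % N
        else:
            i, j = k, l
        array[i][j] = key
        key += 1
        iterations += 1
    return array, iterations
```

fill_square(3) iterates 8 = 3² − 1 times and every row of the result sums to 15, the 3×3 magic constant, which supports reading the loop as a magic-square builder.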
I am not sure of the rules/procedure to follow when determining the asymptotic complexity of methods. I know single statements are O(1), single loops are O(n), and nested loops are O(n²). Counters that are doubled give log₂(n), and ones that are quartered give log₄(n). And if we have a loop that's O(n) with something inside it that contributes log(n), then that's n·log(n). But I am still unsure how to figure all that out. Do we focus on the user-inputted value n to determine asymptotic complexity, or on the incrementing loop variable i?
Can someone please walk me through these examples and show how it's done?
Example1:
for (k = 0; k < n; k = k + 3)
    for (p = n; p > 6; p--)
        System.out.println(p % 2);
T(n) = ?
Asymptotic complexity = ?
Example2:
for (k = 0; k <= n/8; k++)
    System.out.println(k);
System.out.println("Next");
for (p = n; p >= 1; p--)
    System.out.println(p % 2);
T(n) = ?
Asymptotic complexity = ?
Example3:
for (i = n - 3; i <= n - 1; i++)
    System.out.println(i);
for (k = 1; k <= n; k++)
    System.out.println(i + k);
T(n) = ?
Asymptotic complexity = ?
Example4:
for (a = 1; a <= n/3; a++)
    for (b = 1; b <= 2 * n; b++)
        System.out.println(a * b);
T(n) = ?
Asymptotic complexity = ?
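Not a full walkthrough, but the general recipe is: multiply the iteration counts of nested loops, add the counts of sequential loops, and drop constants. A Python sketch (hypothetical helper names) counting the body executions of Examples 1 and 4 shows both are quadratic:

```python
def example1_count(n):
    """Example 1: outer loop ~n/3 times, inner loop n - 6 times (for n > 6)."""
    count = 0
    for k in range(0, n, 3):           # k = 0, 3, 6, ... < n
        for p in range(n, 6, -1):      # p = n, n-1, ..., 7
            count += 1
    return count

def example4_count(n):
    """Example 4: outer loop n/3 times, inner loop 2n times."""
    count = 0
    for a in range(1, n // 3 + 1):
        for b in range(1, 2 * n + 1):
            count += 1
    return count
```

example1_count(9) is 3 * 3 = 9 and example4_count(9) is 3 * 18 = 54; in general (n/3)(n − 6) and (n/3)(2n) are both Θ(n²). Examples 2 and 3 are sequential rather than nested, giving roughly n/8 + 1 + n and 3 + n body executions respectively, both Θ(n).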
I have these 2 pieces of code, and the question is to find how many times x = x + 1 will run in each, where T1(n) stands for code 1 and T2(n) for code 2. Then I have to find the Big-O of each one, which I know how to do; the thing is, I get stuck on finding how many times (as a function of n, of course) x = x + 1 will run.
CODE 1:
for (i = 1; i <= n; i++)
{
    for (j = 1; j <= sqrt(i); j++)
    {
        for (k = 1; k <= n - j + 1; k++)
        {
            x = x + 1;
        }
    }
}
CODE 2:
for (j = 1; j <= n; j++)
{
    h = n;
    while (h > 0)
    {
        for (i = 1; i <= sqrt(n); i++)
        {
            x = x + 1;
        }
        h = h / 2;
    }
}
I am really stuck and have already read a lot, so I am asking if someone can help me; please explain it to me analytically.
PS: I think that in code 2 the loop for (i = 1; i <= sqrt(n); i++) will be entered n·log(n) times, right? Then what?
For code 1, the number of executions of x = x + 1 is

T1(n) = Σ_{i=1}^{n} Σ_{j=1}^{⌊√i⌋} (n − j + 1) ≤ Σ_{i=1}^{n} n·√i = n·(1 + √2 + … + √n) ≤ n · n√n = n^{5/2}.

Here we bounded 1 + √2 + … + √n by n√n and kept only the leading term, so T1(n) = O(n^{5/2}) (in fact Θ(n^{5/2}), since the sum of square roots is itself Θ(n^{3/2})).
For code 2 the calculation is simpler:

T2(n) = Σ_{j=1}^{n} Σ_{t=1}^{~log₂ n} Σ_{i=1}^{⌊√n⌋} 1 = n · log₂ n · √n = Θ(n^{3/2} log n).

The middle loop actually goes from h = n down to 0 by h = h/2, but you can see this is the same as a counter t going from 1 to about log₂ n. We used the fact that j, t, and i are mutually independent, so the three counts simply multiply (analogously to how a sum from 1 to n of a term that does not depend on the index is just n times that term).
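To sanity-check both bounds, here is a Python count (my own sketch; the bound j <= sqrt(i) is modeled with the integer square root, which matches the number of iterations exactly):

```python
from math import isqrt

def t1(n):
    """Count executions of x = x + 1 in code 1."""
    count = 0
    for i in range(1, n + 1):
        for j in range(1, isqrt(i) + 1):   # j = 1 .. floor(sqrt(i))
            count += n - j + 1             # innermost k-loop length
    return count

def t2(n):
    """Count executions of x = x + 1 in code 2."""
    count = 0
    for j in range(1, n + 1):
        h = n
        while h > 0:                       # floor(log2 n) + 1 halvings
            count += isqrt(n)              # inner for: floor(sqrt(n)) times
            h = h // 2
    return count
```

For example, t2(4) = 4 * 3 * 2 = 24, i.e. exactly n * (floor(log₂ n) + 1) * floor(√n), while t1 grows like n^{5/2}.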