How is the complexity of the following algorithm calculated?

int find_c(int n)
int i,j,c
for(i=1; i < 2*n; i=i*3)
for(j=400; j > 1; j--)
c++
for(i=c; i > 0; i--)
if(even(i/3))
for(j=n; j < n*n; j++)
c++
else
for(j=1; j < 900; j=j*3)
c++
return c
I have this algorithm, which is written in Java, and I need to find its complexity in terms of the input parameter n in the worst case. I understand the basics of this procedure, in which each loop has to be analyzed separately. The complexity of this algorithm should be O(n^2 log n), but I can't see how that is derived.
Can someone please explain this to me?

Since this is some kind of exercise, I will just show some parts and leave the rest to you. In particular, look at this code:
if(even(i/3))
for(j=n; j < n*n; j++)
c++
else
for(j=1; j < 900; j=j*3)
c++
From the enclosing loop we know that even(i/3) will be true in approximately half of the iterations, so the then part and the else part contribute equally to the runtime.
Let's look at the then part first: j runs from n to n*n, so the loop executes on the order of n*n times. The statements in the loop body run in O(1), so all together this is O(n * n).
The loop in the else part, on the other hand, executes log3(900) times, which is a constant, and the runtime of its body is constant as well. So this loop contributes only O(1) in total.
In total we therefore get, for the whole if, a runtime of O(n * n) + O(1), which is just O(n * n).
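To check the overall bound empirically, here is a hypothetical Java transcription of the pseudocode from the question (the class and method names are mine, and I assume even(x) means x % 2 == 0). The returned c counts every increment, i.e. the total work done:

```java
// Hypothetical Java transcription of the pseudocode above (class and
// method names are mine; I assume even(x) means x % 2 == 0).
// The returned c counts every increment, i.e. the total work done.
class FindC {
    static long findC(int n) {
        long c = 0;
        for (int i = 1; i < 2 * n; i *= 3)              // O(log n) iterations
            for (int j = 400; j > 1; j--)               // 399 iterations, constant
                c++;
        // at this point c is Theta(log n)
        for (long i = c; i > 0; i--) {                  // Theta(log n) iterations
            if ((i / 3) % 2 == 0)                       // even(i/3)
                for (long j = n; j < (long) n * n; j++) // Theta(n^2) iterations
                    c++;
            else
                for (int j = 1; j < 900; j *= 3)        // log_3(900) iterations, constant
                    c++;
        }
        return c;
    }
}
```

Doubling n multiplies the result by a bit more than 4 (the n^2 factor contributes 4, the log factor a little more), which is consistent with O(n^2 log n).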

Related

Find the Big O time complexity of the code

I am fairly familiar with constant, linear, and quadratic time complexities. In simple code segments like:
int i = 0;
i + 1;
This is constant. So O(1). And in:
for (i = 0; i < N; i++)
This is linear since it iterates n+1 times, but for Big O time complexities we remove the constant, so just O(N). In nested for loops:
for (i = 0; i < N; i++)
for (j = 0; j < N; j++)
I get how we multiply n+1 by n and reach a time complexity of O(N^2). My issue is with slightly more complex versions of this. So, for example:
S = 0;
for (i = 0; i < N; i++)
for (j = 0; j < N*N; j++)
S++;
In such a case, would I be multiplying n+1 by the inner loop's time complexity, which I presume is n^2? So the time complexity would be O(n^3)?
Another example is:
S = 0;
for (i = 0; i < N; i++)
for (j = 0; j < i*i; j++)
for (k = 0; k < j; k++)
S++;
In this case, I expanded it and wrote it out, and realized that the middle for loop seems to run at an n*n rate, and the innermost for loop at the pace of j, which is also n*n. So in that case, would I be multiplying n+1 by n^2 by n^2, which would give me O(n^5)?
Also, I am still struggling to understand what kind of code has logarithmic time complexity. If someone could give me an algorithm or segment of code that performs at log(n) or n log(n) time complexity, and explain it, that would be much appreciated.
All of your answers are correct.
Logarithmic time complexity typically occurs when you're reducing the size of the problem by a constant factor on every iteration.
Here's an example:
for (int i = N; i > 0; i /= 2) { .. do something ... }
In this for-loop, we divide the problem size by 2 on every iteration, so we need approximately log_2(n) iterations before terminating. (Note the condition is i > 0; with i >= 0 the loop would never end, since once i reaches 0, 0 / 2 is still 0.) Hence, the algorithm runs in O(log(n)) time.
Another common example is the binary search algorithm, which searches a sorted interval for a value. In this procedure, we remove half of the values on each iteration (once again, we're reducing the size of the problem by a constant factor of 2). Hence, the runtime is O(log(n)).
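For completeness, here is a minimal sketch of that binary search in Java (the class name, method name, and array contents are just for illustration):

```java
// Minimal iterative binary search sketch (names are illustrative).
// Each pass halves the interval [lo, hi], so at most about
// log2(n) + 1 comparisons are made.
class BinarySearchDemo {
    static int binarySearch(int[] sorted, int target) {
        int lo = 0, hi = sorted.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;   // avoids int overflow of lo + hi
            if (sorted[mid] == target) return mid;
            if (sorted[mid] < target) lo = mid + 1;   // discard left half
            else hi = mid - 1;                        // discard right half
        }
        return -1;  // not present
    }
}
```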

what is the time complexity of this code and how? in Big-O

int i, j, k = 0;
for (i = n/2; i <= n; i++) {
for (j = 2; j <= n; j = j * 2) {
k = k + n/2;
}
}
I came across this question and this is what I think.
The outer loop will run N/2 times and the inner loop will run log N times, so I thought it should be (N/2) * log N. But this is not the correct answer.
The correct answer is O(N log N). Can anybody tell me what I am missing?
Any help would be appreciated.
Let's take a look at this block of code.
First of all, notice that the inner loop doesn't depend on the outer one, so its complexity is the same on every iteration.
for (j = 2; j <= n; j = j * 2) {
k = k + n/2;
}
Your knowledge should be enough to see that the complexity of this loop is O(log n).
Now we need to work out how many times this loop is performed, so let's look at the external loop
for (i = n/2; i <= n; i++) {
and observe that there will be n/2 iterations, which is O(n) in Big-O notation.
Combine these complexities and you'll see that your O(log n) loop is performed O(n) times, so the total complexity is O(n) * O(log n) = O(n log n).
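A quick way to convince yourself is to count the inner-loop body executions directly; a small sketch (the class and method names are mine):

```java
// Count the inner-loop body executions of the snippet above:
// the outer loop runs n/2 + 1 times, the inner one floor(log2(n)) times.
class NLogNCount {
    static long countOps(int n) {
        long ops = 0;
        for (int i = n / 2; i <= n; i++)     // n/2 + 1 iterations
            for (int j = 2; j <= n; j *= 2)  // floor(log2(n)) iterations
                ops++;
        return ops;
    }
}
```

For n = 16 this gives 9 * 4 = 36 executions, i.e. (n/2 + 1) * log2(n), which is O(n log n).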

What is the asymptotic running time of the following piece of code?

What is the asymptotic running time of the following piece of code?
if (N % 2 == 0) // N is even
for (int i = 0; i < N; i = i+1)
for (int j = 0; j < N; j = j+1)
A[i] = j;
else // N is odd
for (int i = 0; i < N; i = i+1)
A[i] = i;
If N is even the running time is O(n^2); when N is odd the running time is O(n). But I can't determine the overall asymptotic running time.
The possible answers are:
~ O(n)
~ O(n^2)
~ O(N * sqrt(N))
~ O(n log n)
There isn't a simple function you can use to asymptotically tightly bound the runtime. As you noted, the runtime oscillates between linear and quadratic at each step. You can say that the runtime is O(n^2) and Ω(n), but without defining a piecewise function you can't give a Θ bound here.
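You can see the oscillation by counting the operations for consecutive values of N; a small illustrative sketch:

```java
// Illustrative sketch: the exact work alternates between N^2 (even N)
// and N (odd N), so no single Theta bound exists for the snippet above.
class ParityOps {
    static long ops(int n) {
        long count = 0;
        if (n % 2 == 0)
            for (int i = 0; i < n; i++)
                for (int j = 0; j < n; j++)
                    count++;               // quadratic branch (N even)
        else
            for (int i = 0; i < n; i++)
                count++;                   // linear branch (N odd)
        return count;
    }
}
```

For N = 10 this does 100 operations, but for N = 11 only 11, so the count keeps jumping between the two growth rates.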

Time complexity of Special Double For-Loop?

So I was just asked this question in an exam, and it's driving me crazy. The question is this:
What is the time complexity for the following code in terms of n:
int count = 0;
for(int i = 0; i < n; i++) {
for(int j = 1; j < n; j = j * 2) {
count++;
}
}
a) O(n log(n))
b) O(n^2)
I firmly believe the answer to be n(log(n)), because the inner loop only runs k times, where k^2 <= n, which is only log2(n), for which the time complexity is log(n). However, everyone I have talked to who was also in the exam thinks the answer to be n^2. Can anyone give me a firm reasoning for either way? Thank you!
Your reasoning is correct, with the minor edit below, and the answer is O(n log n). There is no way the answer can be O(n^2).
The inner loop doubles j until it reaches n; if 2^k = n, then k = log2(n), so:
The inner loop will run log n times.
The outer loop will run n times.
So this is O(n log n) time complexity.
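If in doubt, counting the iterations settles it; a small sketch of the same loop structure (the class and method names are mine):

```java
// Count the inner-loop body executions of the exam snippet:
// the inner loop doubles j, so it runs about log2(n) times per outer pass.
class DoublingCount {
    static long count(int n) {
        long c = 0;
        for (int i = 0; i < n; i++)        // n passes
            for (int j = 1; j < n; j *= 2) // ~log2(n) passes each
                c++;
        return c;
    }
}
```

For n = 8 the inner loop runs 3 times (j = 1, 2, 4), giving 8 * 3 = 24 total: n * log2(n), nowhere near n^2 = 64.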

Time Complexity of an Algorithm

Here is a problem in which we have to calculate the time complexity of the given function
f(i) = 2*f(i+1) + 3*f(i+2)
for (int i=0; i < n; i++)
f[i] = 2*f[i+1]
What I think is that the complexity of this algorithm is O(2^n) + O(n), which is ultimately O(2^n).
Please correct me if I am wrong.
Firstly, all the information you require to work these out in future is here.
To answer your question: because you have not provided a definition of f(i) in terms of i itself, it is impossible to determine the actual complexity from what you have written above. However, in general, loops like
for (i = 0; i < N; i++) {
sequence of statements
}
execute N times, so the sequence of statements also executes N times. If we assume the statements are O(1), the total time for the for loop is N * O(1), which is O(N) overall. In your case above, if I take the liberty of re-writing it as
f(0) = 0;
f(1) = 1;
f(i+2) = 2*f(i+1) + 3*f(i)
for (int i=0; i < n; i++)
f[i] = 2*f[i+2]
Then we have a well defined sequence of operations and it should be clear that the complexity for the n operations is, like the example I have given above, n * O(1), which is O(n).
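As a sketch of that point (with my assumed base cases f(0) = 0 and f(1) = 1, since the question doesn't give any): filling the recurrence bottom-up costs O(1) per index, hence O(n) for n indices.

```java
// Sketch under assumed base cases f(0) = 0, f(1) = 1: filling the
// recurrence f(i) = 2*f(i-1) + 3*f(i-2) iteratively does O(1) work
// per index, so O(n) overall.
class RecurrenceFill {
    static long[] fill(int n) {
        long[] f = new long[n + 1];
        if (n >= 1) f[1] = 1;                    // f[0] = 0 by default
        for (int i = 2; i <= n; i++)             // n - 1 iterations, O(1) each
            f[i] = 2 * f[i - 1] + 3 * f[i - 2];
        return f;
    }
}
```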
I hope this helps.
