Time Complexity of an Algorithm

Here is a problem in which we have to calculate the time complexity of the given function:
f(i) = 2*f(i+1) + 3*f(i+2)
for (int i = 0; i < n; i++)
    f[i] = 2*f[i+1]
I think the complexity of this algorithm is O(2^n) + O(n), which is ultimately O(2^n).
Please correct me if I am wrong.

Firstly, all the information you require to work these out in the future is here.
To answer your question: because you have not provided a definition of f(i) in terms of i itself, it is impossible to determine the actual complexity from what you have written above. However, in general, a for loop like
for (i = 0; i < N; i++) {
    sequence of statements
}
executes N times, so the sequence of statements also executes N times. If we assume the statements are O(1), the total time for the for loop is N * O(1), which is O(N) overall. In your case above, if I take the liberty of re-writing it as
f(0) = 0;
f(1) = 1;
f(i+2) = 2*f(i+1) + 3*f(i)
for (int i = 0; i < n; i++)
    f[i] = 2*f[i+2]
Then we have a well-defined sequence of operations, and it should be clear that the complexity of the n operations is, like the example I have given above, n * O(1), which is O(n).
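To make that concrete, here is a minimal C sketch of the idea, assuming the rewritten recurrence above (the array size and the printed term are my own choices for illustration): each iteration does a constant amount of work, so computing n terms costs n * O(1) = O(n).
#include <stdio.h>

int main(void) {
    int n = 10;                          /* assumed input size, for illustration */
    long f[n + 1];                       /* one slot per term (C99 VLA) */
    f[0] = 0;
    f[1] = 1;
    for (int i = 2; i <= n; i++)         /* n - 1 iterations, each O(1) */
        f[i] = 2 * f[i - 1] + 3 * f[i - 2];
    printf("f(%d) = %ld\n", n, f[n]);
    return 0;
}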
I hope this helps.


Find the Big O time complexity of the code

I am fairly familiar with simple time complexity regarding constant, linear, and quadratic time complexities. In simple code segments like:
int i = 0;
i = i + 1;
This is constant, so O(1). And in:
for (i = 0; i < N; i++)
This is linear: the condition is checked N+1 times and the body runs N times, but for Big O time complexities we drop the constant, so it is just O(N). In nested for loops:
for (i = 0; i < N; i++)
for (j = 0; j < N; j++)
I get how we multiply n+1 by n and reach a time complexity of O(N^2). My issue is with slightly more complex versions of this. So, for example:
S = 0;
for (i = 0; i < N; i++)
    for (j = 0; j < N*N; j++)
        S++;
In such a case, would I be multiplying n+1 by the inner for loop's time complexity, which I presume is n^2? So the time complexity would be O(n^3)?
Another example is:
S = 0;
for (i = 0; i < N; i++)
    for (j = 0; j < i*i; j++)
        for (k = 0; k < j; k++)
            S++;
In this case, I expanded it and wrote it out, and realized that the middle for loop runs up to n*n iterations, and the innermost for loop runs at the pace of j, which is also up to n*n. So in that case, would I be multiplying (n+1) * n^2 * n^2, which would give me O(n^5)?
Also, I am still struggling to understand what kind of code has logarithmic time complexity. If someone could give me an algorithm or segment of code that performs at log(n) or n log(n) time complexity, and explain it, that would be much appreciated.
All of your answers are correct.
For the last example, you can check it by summing: the innermost loop runs j times, so the total count is the sum over i of 0 + 1 + ... + (i^2 - 1), which is about i^4 / 2 for each i; summing i^4 / 2 over i = 0 .. N-1 gives roughly N^5 / 10, which is O(N^5).
Logarithmic time complexity typically occurs when you're reducing the size of the problem by a constant factor on every iteration.
Here's an example:
for (int i = N; i > 0; i /= 2) { ... do something ... }
In this for-loop, we're dividing the problem size by 2 on every iteration. We'll need approximately log_2(n) iterations prior to terminating. Hence, the algorithm runs in O(log(n)) time.
Another common example is the binary search algorithm, which searches a sorted interval for a value. In this procedure, we remove half of the values on each iteration (once again, we're reducing the size of the problem by a constant factor of 2). Hence, the runtime is O(log(n)).
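For concreteness, here is a minimal binary search sketch in C (the function name and signature are mine, not from the question): every pass discards half of the remaining interval, so at most about log_2(n) passes are needed.
/* Search the sorted array a of length n for key; return its index or -1.
   Each iteration halves the interval, so the loop runs O(log n) times. */
int binary_search(const int *a, int n, int key) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;   /* avoids overflow of (lo + hi) / 2 */
        if (a[mid] == key)
            return mid;
        else if (a[mid] < key)
            lo = mid + 1;               /* discard the lower half */
        else
            hi = mid - 1;               /* discard the upper half */
    }
    return -1;                          /* not found */
}
Running one such O(log n) search for each of n keys is a typical way an O(n log n) running time arises.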

Time complexity of this algorithm with constant loop

I know that:
for (int i = 0; i < 1337; i++) {
    printf("\n");
}
is O(1), since the bound 1337 is a constant, irrespective of n.
But what about:
char *s = "abcdef";
for (int i = 0; i < 1337; i++) {
    strlen(s);
}
Is this O(N) now? Can anyone explain?
If you have a constant input, the complexity of the algorithm will always be O(1), since you will perform the same number of operations every time.
If, however, s is an input to your algorithm, the complexity becomes O(len(s)). Note that O(N) has no meaning in this example, as you have not defined what N is.
You need to define what N is. Your first example is O(1) no matter how you view it. Your second example is O(N) in the length of s, because strlen scans the whole string and is therefore O(N) per call.
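A small C sketch of the distinction, assuming s is made a parameter (my own framing, not from the question): with m = strlen(s), each call scans all m characters, and the constant 1337 drops out.
#include <stdio.h>
#include <string.h>

/* 1337 calls, each O(m) where m = strlen(s): O(1337 * m) = O(m) overall. */
void demo(const char *s) {
    size_t total = 0;
    for (int i = 0; i < 1337; i++)
        total += strlen(s);          /* strlen walks all m characters */
    printf("%zu\n", total);
}

int main(void) {
    demo("abcdef");                  /* constant input: effectively O(1) */
    return 0;
}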

How to calculate time complexity?

I've gone through some basic concepts of calculating time complexities. I would like to know the time complexity of the code that follows.
I think the time complexity would be O(n^2 * log_3(n)). It may be wrong; I want to know the exact answer and how to arrive at it. Thank you :)
void function(int n) {
    if (n <= 1) return;   /* base case; n == 1 alone would not stop every chain of n-3 calls */
    for (int i = 1; i <= n; i++)
        for (int j = 1; j <= n; j++)
            printf("*");
    function(n - 3);
}
Two nested loops with n iterations each give O(n^2) work per call. The recursion decreases n by the constant 3, so the function is called n/3 + 1 = O(n) times. The total work is therefore n^2 + (n-3)^2 + (n-6)^2 + ..., which is roughly n^3/9, i.e. O(n^3).
The logarithm in your result would arise if the function were called with n/3 instead of n - 3.
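As a quick empirical check (a harness of my own, not part of the question; the counter stands in for printf), the ratio of operations to n^3 settles near 1/9, consistent with the analysis above:
#include <stdio.h>

static long count = 0;

void function(int n) {
    if (n <= 1) return;
    for (int i = 1; i <= n; i++)
        for (int j = 1; j <= n; j++)
            count++;                 /* one unit of work per '*' */
    function(n - 3);
}

int main(void) {
    for (int n = 30; n <= 240; n *= 2) {
        count = 0;
        function(n);
        printf("n=%4d  ops=%10ld  ops/n^3=%.4f\n",
               n, count, (double)count / ((double)n * n * n));
    }
    return 0;
}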

Logarithmic complexity of an algorithm

It's difficult for me to understand the logarithmic complexity of an algorithm.
For example
for(int j=1; j<=n; j*=2){
...
}
Its complexity is O(log_2 N).
So what if it is j *= 3? Will the complexity then be O(log_3 N)?
You could say yes, as long as the loop body is O(1).
However, note that log_3 N = log_2 N / log_2 3, so it is also O(log_2 N), since the constant factor does not matter.
It is also apparent from this argument that for any fixed constant k, O(log_k N) is O(log_2 N), since you can substitute k for 3.
Basically, yes.
Let's assume that your for loop looks like this:
for (int j = 1; j < n; j *= a) {...}
where a is some constant.
If the for loop executes k times, then when it terminates, j will be equal to a^k. Since N = O(j) and j = O(a^k), we have N = O(a^k). It follows that k = O(log_a N). Once again, the for loop executes k times, so the time complexity of this algorithm is O(k) = O(log_a N).
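A tiny C check of this (my own harness; the variable names are illustrative): count the iterations for a few bases a and compare with log_a(n). Compile with -lm for the math library.
#include <stdio.h>
#include <math.h>

int main(void) {
    long n = 1000000;
    for (int a = 2; a <= 4; a++) {
        int k = 0;
        for (long j = 1; j < n; j *= a)
            k++;                          /* one unit of O(1) work */
        printf("a=%d  iterations=%d  log_a(n)=%.2f\n",
               a, k, log((double)n) / log((double)a));
    }
    return 0;
}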

Complexity of algorithm

The complexity given for the following problem is O(n). Shouldn't it be O(n^2)? The outer loop is O(n) and the inner loop is also O(n), so n * n = O(n^2).
The answer sheet for this question states that the answer is O(n). How is that possible?
public static void q1d(int n) {
    int count = 0;
    for (int i = 0; i < n; i++) {
        count++;
        for (int j = 0; j < n; j++) {
            count++;
        }
    }
}
The complexity given for the following problem is O(n^2). How do you obtain that? Could someone please elaborate?
public static void q1E(int n) {
    int count = 0;
    for (int i = 0; i < n; i++) {
        count++;
        for (int j = 0; j < n/2; j++) {
            count++;
        }
    }
}
Thanks
The first example is O(n^2), so it seems they have made a mistake. To calculate (informally) the second example, we can do n * (n/2) = (n^2)/2 = O(n^2). If this doesn't make sense, you need to go and brush up on what it means for something to be O(n^k).
The complexity of both pieces of code is O(n*n)
FIRST
The outer loop runs n times, and on each of those passes the inner loop runs n times (j goes from 0 to n-1)
so
total = n + n + n + ... + n (n terms)
which sums to n * n, which is O(n*n)
SECOND
The outer loop runs n times, and on each of those passes the inner loop runs n/2 times (j goes from 0 to n/2 - 1)
so
total = n/2 + n/2 + ... + n/2 (n terms)
which sums to n * n / 2, which is also O(n*n)
The first case is definitely O(n^2).
The second is O(n^2) as well, because you omit constant factors when calculating big O.
Your answer sheet is wrong; the first algorithm is clearly O(n^2).
When calculating a Big-Oh value we generally ignore multiplications and divisions by constants, because Big-Oh describes how the running time grows as n gets large, not the exact operation count.
That being said, your second example is also O(n^2): although the inner loop runs "only" n/2 times per pass, n is still the bounding factor. In practice the second algorithm performs about half as many operations, but the exact count is ignored in favor of how the algorithm behaves as n approaches infinity.
Both are O(n^2). Your answer is wrong. Or you may have written the question incorrectly.
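To see the counts directly, here is a small C sketch (my own harness, mirroring q1d and q1E rather than taken from them): both totals grow as n^2, and the 1/2 in the second is exactly the dropped constant.
#include <stdio.h>

int main(void) {
    int n = 1000;
    long d = 0, e = 0;
    for (int i = 0; i < n; i++)          /* q1d: inner loop runs n times */
        for (int j = 0; j < n; j++)
            d++;
    for (int i = 0; i < n; i++)          /* q1E: inner loop runs n/2 times */
        for (int j = 0; j < n / 2; j++)
            e++;
    printf("q1d: %ld = n^2\nq1E: %ld = n^2 / 2\n", d, e);
    return 0;
}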
