Time complexity of this algorithm with constant loop

I know that:
for(int i = 0; i < 1337; i++){
printf("\n");
}
is O(1), since the loop runs 1337 times irrespective of n.
But what about:
char * s = "abcdef";
for(int i = 0; i < 1337; i++){
strlen(s);
}
Is this O(N) now? Could anyone please explain?

If you have a constant input, the complexity of the algorithm will always be O(1), as you will perform the same number of operations.
If, however, s is an input to your algorithm, the complexity becomes O(len(s)). Note that O(N) has no meaning in this example, as you have not defined what N is.

You need to define what N is. Your first example is O(1) no matter how you view it. Your second example is O(N) in the length of s because strlen is O(N).
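To make the difference concrete, here is a minimal C sketch (the function names are mine, purely for illustration). The loop bound 1337 is constant, but each strlen(s) call scans every character of s, so when s is an input the second function does roughly 1337 * len(s) character reads, i.e. O(len(s)):

#include <stdio.h>
#include <string.h>

/* Constant amount of work: 1337 iterations regardless of any input, so O(1). */
void print_blanks(void) {
    for (int i = 0; i < 1337; i++) {
        printf("\n");
    }
}

/* Still 1337 iterations, but each strlen(s) walks the whole string,
 * so the total work is 1337 * len(s) = O(len(s)) when s is an input. */
size_t repeated_strlen(const char *s) {
    size_t len = 0;
    for (int i = 0; i < 1337; i++) {
        len = strlen(s);
    }
    return len;
}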

Related

Big-O & Exact Runtime

I am learning about Big-O and although I started to understand things, I still am not able to correctly measure the Big-O of an algorithm.
I've got a code:
int n = 10;
int count = 0;
int k = 0;
for (int i = 1; i < n; i++)
{
    for (int p = 200; p > 2*i; p--)
    {
        int j = i;
        while (j < n)
        {
            do
            {
                count++;
                k = count * j;
            } while (k > j);
            j++;
        }
    }
}
which I have to measure the Big-O and Exact Runtime.
Let me start: the first for loop is O(n) because it depends on the variable n.
The second for loop is nested, which makes the Big-O so far O(n^2).
So how do we account for the while (j < n) (three loops so far), and how do we account for the do ... while (k > j) when it appears and makes four loops, as in this case?
A comprehensive explanation would be really helpful.
Thank you.
Unless I'm much mistaken, this program has an infinite loop, and therefore its time complexity cannot usefully be analyzed.
In particular
do
{
count++;
k = count * j;
} while (k > j);
As soon as this loop is entered for the second time and count reaches 2, k will be set greater than j and will remain so indefinitely (ignoring integer overflow, which will happen fairly quickly).
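To see it concretely, here is a small standalone sketch (my own, with an iteration cap so the demo terminates) that reproduces the state at the loop's second entry, j = 2 and count = 1: k = count * j is already greater than j after the first increment, so the exit condition k > j never becomes false:

#include <stdio.h>

int main(void) {
    int j = 2, count = 1, k = 0;   /* state at the loop's second entry */
    int cap = 5;                   /* cap so this demonstration terminates */
    do {
        count++;
        k = count * j;
        printf("count=%d k=%d j=%d -> k > j is %s\n",
               count, k, j, (k > j) ? "true, loop again" : "false, exit");
    } while (k > j && --cap > 0);
    return 0;
}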
I understand that you're learning Big-Oh notation, but creating toy examples like this probably isn't the best way to understand Big-Oh. I would recommend reading a well-known algorithms textbook where they walk you through new algorithms, explaining and analyzing the time and space complexity as they do so.
I am assuming that the while loop should be:
while (k < j)
Now, in this case, the first for loop would take O(n) time. The second loop would take O(p) time.
Now, for the third loop,
int j = i;
while (j < n){
...
j++;
}
could be rewritten as
for(j=i;j<n;j++)
meaning it shall take O(n) time.
For the last loop, the value of k increases exponentially.
Consider it to be the same as
for (k = count*j; k < j; j++, count++)
Hence it shall take O(log n) time.
The total time complexity is O(n^2 * p * log n).
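If you want to sanity-check the growth empirically under that same assumption (condition k < j), a small instrumented version that just counts innermost iterations for a few values of n is easy to write; this is only a sketch and the helper name is mine:

#include <stdio.h>

static long count_inner_iterations(int n) {
    long iters = 0;
    int count = 0, k = 0;
    for (int i = 1; i < n; i++)
        for (int p = 200; p > 2 * i; p--)
            for (int j = i; j < n; j++)       /* rewritten form of the while loop */
                do {
                    count++;
                    k = count * j;
                    iters++;
                } while (k < j);              /* assumed fix of the original condition */
    return iters;
}

int main(void) {
    for (int n = 10; n <= 80; n *= 2)
        printf("n=%d  innermost iterations=%ld\n", n, count_inner_iterations(n));
    return 0;
}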

What is the time complexity of these loops 1 and 2

I was going through an article on the analysis of time complexity of loops at a very popular website (link given below), and according to that article the time complexities of the 1st and 2nd loops below are O(1) and O(n) respectively.
But I think the time complexity of both loops is the same: O(n).
for (int i = 1; i <= c; i++) {
// some O(1) expressions
}
My reasoning: c*n = O(n)
After going through the answers below, my reasoning is wrong, as there is no varying input n. The loop runs a fixed number of times, c, so irrespective of the input value n the loop runs in constant time, giving O(1) complexity.
for (int i = 1; i <= n; i += c) {
// some O(1) expressions
}
My reasoning: c*n = O(n)
Am I missing something? I would be grateful if someone could help and explain.
This is the link of the article : http://www.geeksforgeeks.org/analysis-of-algorithms-set-4-analysis-of-loops/
A loop or recursion that runs a constant number of times is also
considered as O(1).
Here c is a constant value, so basically you are performing a constant number of operations irrespective of the value of n.
// Here c is a constant
for (int i = 1; i <= c; i++) {
// some O(1) expressions
}
Also, in the second loop:
for (int i = 1; i <= n; i += c) {
// some O(1) expressions
}
Your reasoning c*n = O(n) is not correct. Here i increments by c, so for n elements the loop runs n/c times, which is asymptotically O(n/c) ~ O(n).
for (int i = 1; i <= c; i++) { // some O(1) expressions }
Here c is a constant. So basically, you are performing a constant number of operations irrespective of the value of n. That is why it is considered constant complexity, O(1).
for (int i = 1; i <= n; i += c) { // some O(1) expressions }
You are looping with an input value n, which varies with the input given to the program or algorithm. Here c is again a constant, which remains the same for all the different values of n. The complexity is considered O(n).
for (int i = 1; i <= n; i++) { // some O(1) expressions }
This is the same as above, just with c = 1.
All the complexities are expressed in asymptotic notation. Constant factors are dropped because they are the same irrespective of the value of n.
1) There is no n in the picture; I don't see why you think it is O(n). The loop goes from 1 to c, so it is O(c), and as c is a constant, the complexity is O(1).
2) The loop starts from 1 and goes up to n, incrementing by c at every step. Clearly the complexity is O(n/c), which asymptotically is O(n).
O(1): The complexity of this loop is O(1), as it runs a constant number of times, c.
// Here c is a constant
for (int i = 1; i <= c; i++) {
// some O(1) expressions
}
O(n): The complexity of a loop is O(n) if its loop variable is incremented or decremented by a constant amount. For example, these loops have O(n) time complexity.
// Here c is a positive integer constant
for (int i = 1; i <= n; i += c) {
// some O(1) expressions
}
for (int i = n; i > 0; i -= c) {
// some O(1) expressions
}
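If the asymptotics still feel abstract, counting iterations directly makes the difference visible. This is just an illustrative sketch of the two loop shapes above: the first count never changes as n grows, while the second grows roughly like n/c, i.e. linearly in n:

#include <stdio.h>

int main(void) {
    const int c = 5;                      /* any positive constant */
    for (int n = 100; n <= 800; n *= 2) {
        long constant_count = 0, linear_count = 0;

        for (int i = 1; i <= c; i++)      /* runs c times, independent of n: O(1) */
            constant_count++;

        for (int i = 1; i <= n; i += c)   /* runs about n/c times: O(n) */
            linear_count++;

        printf("n=%d  constant loop=%ld  linear loop=%ld\n",
               n, constant_count, linear_count);
    }
    return 0;
}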

Why is this time complexity O(n)?

Why does the below function have a time complexity of O(n)? I can't figure it out for the life of me.
void setUpperTriangular(
    int intMatrix[0,…,n-1][0,…,n-1]) {
    for (int i = 1; i < n; i++) {
        for (int j = 0; j < i; j++) {
            intMatrix[i][j] = 0;
        }
    }
}
I keep getting the final time complexity as O(n^2) because:
i: execute n times {             // time complexity = n * (n * 1)
    j: execute n times {         // time complexity = n * 1
        intMatrix[i][j] = 0;     // time complexity = 1
    }
}
The code iterates through n^2/2 (half a square matrix) locations in the array, so its time complexity is O(n^2)
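To see where that comes from: for each i the inner loop runs i times, so the total number of assignments is 1 + 2 + ... + (n-1) = n(n-1)/2, which is O(n^2). A small counting sketch (mine, not the original function) confirms the formula:

#include <stdio.h>

int main(void) {
    for (int n = 10; n <= 80; n *= 2) {
        long assignments = 0;
        for (int i = 1; i < n; i++)          /* same loop structure as setUpperTriangular */
            for (int j = 0; j < i; j++)
                assignments++;
        printf("n=%d  assignments=%ld  n*(n-1)/2=%ld\n",
               n, assignments, (long)n * (n - 1) / 2);
    }
    return 0;
}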
This is the same as insertion sort's for loop. The time complexity of insertion sort is O(n^2).
So, the CS department head explained it a different way. He said that since the second loop doesn't iterate n times, it iterates n! times, so technically it is O(n).
At most it can be considered O(n*m), which finally comes down to O(n*n), or O(n^2).

Time Complexity of an Algorithm

Here is a problem in which we have to calculate the time complexity of the given function:
f(i) = 2*f(i+1) + 3*f(i+2)
For (int i=0; i < n; i++)
F[i] = 2*f[i+1]
What I think is that the complexity of this algorithm is O(2^n) + O(n), which ultimately is O(2^n).
Please correct me if I am wrong.
Firstly, all the information you need to work these out in future is here.
To answer your question: because you have not provided a definition of f(i) in terms of i itself, it is impossible to determine the actual complexity from what you have written above. However, in general, a loop like
for (i = 0; i < N; i++) {
sequence of statements
}
executes N times, so the sequence of statements also executes N times. If we assume the statements are O(1), the total time for the for loop is N * O(1), which is O(N) overall. In your case above, if I take the liberty of re-writing it as
f(0) = 0;
f(1) = 1;
f(i+2) = 2*f(i+1) + 3*f(i)
for (int i=0; i < n; i++)
f[i] = 2*f[i+2]
Then we have a well defined sequence of operations and it should be clear that the complexity for the n operations is, like the example I have given above, n * O(1), which is O(n).
I hope this helps.
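For concreteness, here is a runnable version of that rewritten loop (the array size and the base values 0 and 1 are my assumptions): each iteration does a constant amount of work and the loop runs n times, so the whole computation is O(n).

#include <stdio.h>

int main(void) {
    const int n = 10;
    long f[13];                          /* n + 3 slots, enough for this demo */
    f[0] = 0;
    f[1] = 1;
    for (int i = 0; i < n; i++)          /* n iterations, O(1) work each: O(n) */
        f[i + 2] = 2 * f[i + 1] + 3 * f[i];
    printf("f[%d] = %ld\n", n + 1, f[n + 1]);
    return 0;
}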

Logarithmic complexity of an algorithm

It's difficult for me to understand the logarithmic complexity of an algorithm.
For example
for(int j=1; j<=n; j*=2){
...
}
Its complexity is O(log_2 N).
So what if it is j *= 3? Will the complexity then be O(log_3 N)?
You could say yes, as long as the loop body is O(1).
However, note that log_3 N = log_2 N / log_2 3, so it is also O(log_2 N), since the constant factor does not matter.
Also note that, by the same argument, for any fixed constant k, O(log_k N) is also O(log_2 N), since you could substitute k for 3.
Basically, yes.
Let's assume that your for loop looks like this:
for (int j = 1; j < n; j *= a) {...}
where a is some const.
If the for loop executes k times, then in the last iteration j will be equal to a^k. And since N = O(j) and j = O(a^k), we have N = O(a^k). It follows that k = O(log_a N). Once again, the for loop executes k times, so the time complexity of this algorithm is O(k) = O(log_a N).
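A small counter (my own sketch) shows the same thing empirically: the number of iterations grows like log_a(n), and switching a from 2 to 3 only changes the count by the constant factor log_2(3), so both are O(log n):

#include <stdio.h>

static int iterations(long long n, int a) {
    int k = 0;
    for (long long j = 1; j < n; j *= a)   /* same shape as the loops above */
        k++;
    return k;
}

int main(void) {
    for (long long n = 1000; n <= 1000000000LL; n *= 1000)
        printf("n=%lld  a=2: %d iterations  a=3: %d iterations\n",
               n, iterations(n, 2), iterations(n, 3));
    return 0;
}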
