Complexity of an else block containing a loop, inside a loop - algorithm

I'm trying to find the frequency of each statement and the big-O of this method, but I'm struggling with the else part. I know that we take the worse complexity of the if and else branches.
But logically, in this case, do I have to multiply the frequency of the outer loop (n) by the frequency of the else loop (n+1)? I know that the else block will be executed only once, when i = 0,
but if we follow the rules we have to multiply them.
So I'm stuck here and I don't know what to do in this case. I hope you can help me.
Thanks!
int i, j, sum = 0;
for (i = 0; i < n; i++)
    if (i != 0)
        sum += i;
    else
        for (j = 0; j < n; j++)
            sum += j;

So, you have an O(n) loop with an O(n) else branch inside it.
But in your code, we only go through the else branch once.
So you have a for loop; inside it, the if branch does O(1) work (n - 1) times, and once (when i == 0, the only time the else branch is reached) you do O(n) work.
So the total cost is (n - 1) * O(1) + O(n) ~ 2n, which is O(n), because we drop constant factors when doing asymptotic analysis.

Related

Time complexity of bubble sort with single for loop

I studied that bubble sort is an O(n^2) algorithm, but I designed an algorithm which looks like O(n). The following is my code:
void bubbleSort(int[] arr) {
    int counter = 1, i = 0;
    int N = arr.length - counter;
    for (i = 0; i < N; ) {
        if (arr[i] > arr[i+1]) {
            int temp = arr[i];
            arr[i] = arr[i+1];
            arr[i+1] = temp;
        }
        i++;
        if (i == N) {
            N = arr.length - (++counter);
            i = 0;
        }
    }
}
There is one for loop, and the counter i is reset when it becomes equal to N. According to me the loop is O(n) and it resets n-1 times, so it becomes O(n) + O(n-1) = O(n).
Am I correct? If not, what should the complexity of this code be?
No, you are not correct. Having one loop doesn't mean it's O(n); you have to take into consideration how many steps are executed in total.
Your loop gets re-initialized when i == N. You are right that the loop gets re-initialized (n-1) times, but each pass the loop executes the then-current value of N times. So it's not O(n) + O(n-1); rather it's on the order of n*(n-1), which eventually leads to O(n^2).
For example -
At the first pass, the loop will be executed N times; then N will be re-initialized to (N-1).
At the second pass, the loop will be executed (N-1) times; then N will be re-initialized to (N-2).
...
...
This will go on for a total of (n-1) passes.
So it will be O(N + (N-1) + (N-2) + ... + 1), which evaluates to O(n^2).
For experimental purposes you can add a counter and check how many steps your program actually executes:
void bubbleSort(int[] arr) {
    int counter = 1, i = 0;
    int total = 0;
    int N = arr.length - counter;
    for (i = 0; i < N; ) {
        total++;
        if (arr[i] > arr[i+1]) {
            int temp = arr[i];
            arr[i] = arr[i+1];
            arr[i+1] = temp;
        }
        i++;
        if (i == N) {
            N = arr.length - (++counter);
            i = 0;
        }
    }
    System.out.println(total); // total number of steps executed; check it for various values of n
}
Your loop is re-initialized inside itself n-1 times, so the passes execute like a nested loop:
an inner loop of up to n-1 iterations for each of roughly n iterations of an outer loop, i.e. about n * (n-1) ≈ n^2 steps, making its complexity O(n^2).
The answer lies within your own statement:
There is one for loop and the counter i is set when it is equal to N. According to me the loop is O(n) and it resets for n-1 times. So, it becomes O(n) + O(n-1) = O(n). Am I correct? If not what should be the complexity of this code.
There is only one loop. Correct.
The time complexity of that loop is O(n). Correct.
The loop resets n-1 times. Correct.
But the calculation is wrong.
If the loop resets for n-1 times, the complexity becomes:
O(n)+O(n-1)+O(n-2)+....+O(1) which essentially is O(n^2).

Find an upper and lower bound of following code

I need to find the closest upper and lower bounds for the following code. I am a beginner at this, so sorry for my mistakes.
The upper bound for p() is O(log(n)), and the lower bound is O(1).
The upper bound for notp() is O(log(n)), and the lower bound is O(1).
I think the lower bound is O(1) because if I have n = 4, I get into the loop, and since n % i == 0 I call p(), which notices in O(1) that n is not a prime number; then, since the break fires at i = 2, notp() would not be executed. That is the best-case scenario.
In the worst-case scenario I go through the whole loop, which I thought was log(n), and execute p() whose upper bound is O(log(n)), so I get O(log(n)^2). But I am not sure this is right; please tell me where I made a mistake.
int i;
for (i = 2; i*i <= n; i++) {
    if (n % i == 0) {
        p();
        break;
    }
}
if (i*i > n)
    notp();
For order calculations, you would normally multiply them together when there is a loop.
for( i = 0; i < n ; i++ ){
DoSomething(i);
}
That would be O(n), as the loop body runs n times.
A nested loop would be
for( i = 0; i < n ; i++ ){
for( j =0; j < n; j++ ){
DoSomething(i,j);
}
}
That becomes O( n^2 ): the counts multiply, since the inner loop runs n times for every iteration of the outer loop.
If you have non-nested loops...
for( i = 0; i < n ; i++ ){
DoSomething(i);
}
for( j = 0; j < sqrt(n) ; j++ ){
DoSomething(j);
}
Then the overall O( x ) is the largest term. That is because for very large n, the smaller terms don't matter.
So you have a loop in your case, which is O( sqrt( n ) ). That is because it is limited by i *i being less than n.
Then either one of p() or notp() will be called.
(I think p() and notp() are the wrong way round).
Re-factoring your code....
int i;
for (i = 2; i*i <= n; i++) {
if (n % i == 0) {
/* p(); */
break;
}
}
if(i*i > n)
notp();
else
p();
So we have the O( sqrt(n) ) time plus either of the p / notp functions which are O( log(n) )
O( sqrt(n ) + log(n) )
As sqrt(n) grows faster than log(n), it dominates the sum (see the Wikipedia article on Big O notation), leaving that as the final value.
O( sqrt(n) )
As the termination condition of the loop is i*i <= n, the largest possible number of iterations of the loop is sqrt(n). Because of the break when n % i == 0, the worst case of this part plus one call to p() or notp() is sqrt(n) + log(n), which is O(sqrt(n)). So whether or not the second condition is true, and even though the worst case of notp() is O(log(n)), the worst case of the algorithm in total is O(sqrt(n)).

How can I find the time complexity of the following code?

for (i = 0; i < n; i++)       // time complexity: n+1 checks
{
    k = 1;                    // time complexity: n
    while (k <= n)            // time complexity: n*(n+1) checks
    {
        for (j = 0; j < k; j++)                                    // time complexity: ??
            printf("the sum of %d and %d is: %d\n", j, k, j+k);    // time complexity: ??
        k++;
    }
}
What is the time complexity of the above code? I'm stuck on the second for loop and I don't know how to find its time complexity, because j is less than k and not less than n.
I always have problems related to time complexity. Do you have any good articles on it,
especially about step counts and loops?
From the question :
because j is less than k and not less than n.
This is just plain wrong, and I guess that's the assumption that got you stuck. We know what values k can take. In your code, it ranges from 1 to n (included). Thus, if j is less than k, it is also less than n.
From the comments :
I know the only input is n, but the second for depends on k and not on n.
If a variable depends on anything, it's on the input. j depends on k that itself depends on n, which means j depends on n.
However, this is not enough to deduce the complexity. In the end, what you need to know is how many times printf is called.
The outer for loop is executed n times no matter what. We can factor this out.
The number of executions of the inner for loop depends on k, which is modified within the while loop. We know k takes every value from 1 to n exactly once. That means the inner for loop will first be executed once, then twice, then three times and so on, up until n times.
Thus, discarding the outer for loop, printf is called 1+2+3+...+n times. That sum is very well known and easy to calculate : 1+2+3+...+n = n*(n+1)/2 = (n^2 + n)/2.
Finally, the total number of calls to printf is n * (n^2 + n)/2 = n^3/2 + n^2/2 = O(n^3). That's your time complexity.
A final note about this kind of code. Once you have seen the same patterns a few times, you quickly start to recognize the kind of complexity involved. Then, when you see that kind of nested loops with dependent variables, you immediately know that each loop contributes a linear factor.
For instance, in the following, f is called n*(n+1)*(n+2)/6 = O(n^3) times.
for (i = 1; i <= n; ++i) {
for (j = 1; j <= i; ++j) {
for (k = 1; k <= j; ++k) {
f();
}
}
}
First, simplify the code to show the main loops. So, we have a structure of:
for(int i = 0; i < n; i++) {
for(int k = 1; k <= n; k++) {
for(int j = 0; j < k; j++) {
}
}
}
The outer loops run n * n times, but there's not much you can do with this information, because the complexity of the inner loop changes based on which iteration of the outer loops you're on. So it's not as simple as calculating the number of times the outer loops run and multiplying by some other value.
Instead, I would find it easier to start with the inner-loop, and then add the outer-loops from the inner-most to outer-most.
The complexity of the inner-most loop is k.
With the middle loop, it's the sum of k (the complexity above) where k = 1 to n. So 1 + 2 + ... + n = (n^2 + n) / 2.
With the outer loop, it's done n times so another multiplication by n. So n * (n^2 + n) / 2.
After simplifying, we get a total of O(n^3)
The time complexity of the above code is on the order of n x n x n = n^3 steps for the three loops, plus a couple of constant-time statements. Since n^3 has the fastest growth rate, the constants can be ignored, so the time complexity is O(n^3).
Note: treating each loop as contributing a factor of (n) and multiplying the factors gives a quick upper bound on the total time.
Hope this helps!

How to calculate time complexity?

I've gone through some basic concepts of calculating time complexities. I would like to know the time complexity of the code that follows.
I think the time complexity would be O(log_3(n) * n^2). It may still be wrong; I want to know the exact answer and how to arrive at it. Thank you :)
void function(int n) {
    if (n == 1) return;
    for (int i = 1; i <= n; i++)
        for (int j = 1; j <= n; j++)
            printf("*");
    function(n-3);
}
Two nested loops with n iterations each give O(n^2). The recursion decreases n by the constant 3, so the function is called n/3 + 1 = O(n) times. In total, it's O(n^3).
The logarithm in your result would appear if the function were instead called with the value n/3.

Iterative function

Stuck with my homework - I need to find the complexity:
time=0;
for (i=n; i>=1; i = sqrt(i))
for (j=1; j<=i; j++)
time++;
What I did - the first loop's counter goes like this:
i = n, n^(1/2), n^(1/4), ..., 1
Then we get:
n^(1/2^k) = 1
but if I take the log of both sides, one side becomes 0... what should I do?
I suppose there is a typo somewhere, because otherwise it's Θ(∞) whenever the input n is not smaller than 1. (For i == 1, the update i = sqrt(i) doesn't change i, so that's an infinite loop.)
So let us suppose it's actually
time = 0;
for (i = n; i > 1; i = sqrt(i))
for (j = 1; j <= i; j++)
time++;
Then, to get the complexity of nested loops, you need to sum the complexity of the inner loop for each iteration of the outer loop. Here, the inner loop runs i times, obviously, so we need to sum the values i runs through in the outer loop. These values are n, n^0.5, n^0.25, ..., n^(1/2^k), where k is characterised by
n^(1/2^(k+1)) < 2 <= n^(1/2^k)
or, equivalently,
2^(2^k) <= n < 2^(2^(k+1))
2^k <= lg n < 2^(k+1)
k <= lg (lg n) < k+1
k = floor(lg(lg n))
Now it remains to estimate the sum from above and below to get the Θ of the algorithm. This estimate is very easy if you start writing down the sums for a few (large) values of n.
