Calculating time complexity of inner loops - algorithm

So I am learning how to calculate time complexities of algorithms from Introduction to Algorithms by Cormen. The example given in the book is insertion sort:
1. for i from 2 to length[A]
2.     value := A[i]
3.     j := i-1
4.     while j > 0 and A[j] > value
5.         A[j+1] := A[j]
6.         j := j-1
7.     A[j+1] := value
Line 1. executes n times.
Line 4., according to the book, executes sum_{i=2}^{n} t_i times, where t_i is the number of times the while loop test is executed for that value of i.
So, as a general rule, can the running time of all inner loops be represented by a summation?

Anecdotally, most loops can be represented by a summation. In general this isn't true; for example, I could create the loop
for (int i = n; i > 1; i = (i % 2 == 0) ? i / 2 : 3 * i + 1)
i.e. initialize i to n; on each iteration, if i is even then divide it by two, else multiply i by three and add one; halt when i <= 1. What's the big-O for this? Nobody knows: this is the Collatz iteration, and whether the loop even terminates for every n is an open problem.
Obviously I've never seen a real-life program using such a weird loop, but I have seen while loops that either increase or decrease the counter depending on some arbitrary condition that might change from iteration to iteration.
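For illustration, here is a minimal C sketch (the helper name collatz_steps and the test range are mine) that just counts the loop's iterations empirically; it says nothing about an asymptotic bound, since none is known:

#include <stdio.h>

/* Count the iterations of the Collatz-style loop above for a given n.
   Uses long long so 3*i + 1 doesn't overflow for moderate n. */
static int collatz_steps(long long n) {
    int steps = 0;
    for (long long i = n; i > 1; i = (i % 2 == 0) ? i / 2 : 3 * i + 1)
        steps++;
    return steps;
}

int main(void) {
    for (long long n = 2; n <= 10; n++)
        printf("n = %lld: %d iterations\n", n, collatz_steps(n));
    return 0;
}

The iteration counts jump around unpredictably (27 famously takes over a hundred steps), which is exactly why no summation describes this loop.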

When the inner loop's iteration count depends on a condition, the number of times the program runs through it (t_i) can differ in each iteration of the outer loop. That's why the total is equal to the summation of all the t_i's.
However, if the inner loop is a for loop whose bounds don't depend on the outer loop variable, the number of iterations is the same every time, so you just multiply the iteration count of the outer loop by that of the inner loop,
like this:
for i=1 to n
    for j=1 to m
        s++
The complexity here is O(n*m).
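To make the contrast concrete, here is a small C sketch (variable names are mine) counting the body executions in both cases: a fixed inner bound gives n*m, while an inner bound that depends on the outer variable gives the summation 1 + 2 + ... + n = n(n+1)/2:

#include <stdio.h>

int main(void) {
    int n = 6, m = 4;

    /* Fixed inner bound: the body runs exactly n * m times. */
    int fixed = 0;
    for (int i = 1; i <= n; i++)
        for (int j = 1; j <= m; j++)
            fixed++;

    /* Inner bound depends on i: the body runs sum_{i=1}^{n} i times. */
    int summed = 0;
    for (int i = 1; i <= n; i++)
        for (int j = 1; j <= i; j++)
            summed++;

    printf("fixed  = %d (n*m      = %d)\n", fixed, n * m);
    printf("summed = %d (n(n+1)/2 = %d)\n", summed, n * (n + 1) / 2);
    return 0;
}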

Related

Time complexity of two separate nested while loops

I just started my data structures course and I'm having trouble figuring out the time complexity of the following code:
{
    int j, y;
    for (j = n; j >= 1; j--)
    {
        y = 1;
        while (y < j)
            y *= 2;
        while (y > 2)
            y = sqrt(y);
    }
}
The outer 'for' loop runs n times on every run of the code, and the first 'while' loop runs about log2(j) times, if I'm not mistaken.
I'm not sure about the second 'while' loop and how to determine the overall time complexity of the code.
My initial thoughts were to determine which 'while' loop would "cost" more in each iteration of the 'for' loop, consider only the higher of the two and sum it up but obviously it didn't lead me to an answer.
Would appreciate any help, especially with the general process and approach for computing the complexity of code such as this.
You are right that the first while loop has time complexity O(log(j)). The second while loop repeatedly executes square root on y until it's less than 2.
Since y ends up approximately at j (it's between j and 2j), the question is: how often can you take the square root of j until you get a number less than or equal to 2? Equivalently: how often can you square 2 until you get a number greater than or equal to j? Or as an equation:
(((2^2)^2)^...)^2 >= j   // k repeated squarings
<=> 2^(2^k) >= j
<=> k >= log2(log2(j))
So the second while loop has time complexity O(log(log(j))). That's negligible compared to O(log(j)). Now since j <= n and the loop is iterated n times, we get the overall complexity O(n log(n)).
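If you want to convince yourself empirically, here is an instrumented C version of the loop (I made y a double so that sqrt isn't truncated, and the two counters are my additions) tallying the total number of doubling and square-root steps:

#include <stdio.h>
#include <math.h>

int main(void) {
    int n = 1000000;
    long doubling = 0, rooting = 0;
    for (int j = n; j >= 1; j--) {
        double y = 1;
        while (y < j) {     /* O(log j) doublings */
            y *= 2;
            doubling++;
        }
        while (y > 2) {     /* O(log log j) square roots */
            y = sqrt(y);
            rooting++;
        }
    }
    printf("doubling steps: %ld, sqrt steps: %ld\n", doubling, rooting);
    return 0;
}

The sqrt counter comes out several times smaller than the doubling counter, matching the claim that the O(log(log(j))) term is dominated by the O(log(j)) one.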

What is the complexity of this pseudo-code

I found this exercise in an exam and encountered difficulty solving the problem.
I can assume for sure the algorithm will take at least O(n) due to the for loop, but I don't know how to approach the while loop. I know that in this case I have to evaluate the worst if-else branch, and for sure it is the second one.
for i=1...n do
    j=n
    while i<j do
        if j mod 2 = 0 then j=j-1
        else j=i
Intuitively I think the total cost is O(nlogn) = O(n)*O(logn).
In short: the while loop will each time run at most two iterations. This makes the algorithm O(n).
The while loop will iterate at most two times. Indeed, let us take a look at the while loop:
while i < j do
    if j mod 2 = 0 then
        j=j-1
    else
        j=i
It is clear that we will only execute the body of the while loop if i < j. Furthermore, if j mod 2 == 1 (so j is odd), then it will set j = i, and thus the while loop will no longer run.
If on the other hand j mod 2 == 0 (so j is even), then we decrement j. Now two things can happen: either i == j, in which case we do not perform an extra iteration, or we perform one extra iteration in which the if condition fails, since decrementing an even number yields an odd number, so j is set to i and the loop stops. Since j is reset to n at the start of every outer iteration, whether one or two iterations occur is determined by the parity of n.
This thus means that regardless of what the values for i and j are, the body of the while loop will be performed at most two times.
Since we reach the while loop exactly n times, it thus means that the algorithm is still O(n). We here make the assumption that we can check the parity of a number and decrement a number in constant time.
Does mod refer to modulo? In that case, the while loop will run at most twice: once to decrement j, after which j mod 2 will be 1, so j is set to i and i < j becomes false. The iteration count here oscillates between one and two rather than growing with the input, so this loop contributes only O(1) per outer iteration.
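A quick brute-force check (plain C; the iteration counter is my addition) confirms the two-iteration bound:

#include <stdio.h>

int main(void) {
    int n = 1000;  /* even n exercises the worst case of two iterations */
    int max_iters = 0;
    for (int i = 1; i <= n; i++) {
        int j = n, iters = 0;
        while (i < j) {
            if (j % 2 == 0) j = j - 1;
            else            j = i;
            iters++;
        }
        if (iters > max_iters) max_iters = iters;
    }
    printf("most iterations of the while loop: %d\n", max_iters);  /* prints 2 */
    return 0;
}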

Triple Nested For-loop with two independent outer loops and one dependent inner loop

I have the following sequence of loops.
TripleLoop(int n)
    for i <- 1 to n
        for j <- 1 to n
            for k <- j to n
                do num <- j + i
    return num
I know the two outer loops run "n" times.
Thus, we have n * n = n^2 for the two outer loops.
However, the third inner loop depends on the variable "j".
How do I begin to solve these types of nested dependent for-loops?
I'm not sure whether I should multiply or add the third inner loop's count to that of the two outer loops.
Can someone help me out here?
Well, the inner loop (the one with k as iterator) executes n-j+1 times, since k starts at j and ends at n.
The total number of steps the middle for loop thus performs is the sum of steps per iteration for j, so that means that the total number of times we run the body of the inner for loop is:
sum_{j=1}^{n} (n - j + 1) = n * (n + 1) / 2
so after one iteration of the outer loop (the one with i as an iterator), we have n*(n+1)/2 steps.
In total, our algorithm will thus run the body of the inner loop n * n * (n+1)/2 times, since the outer loop (the one with i as an iterator) runs n times and the number of steps per outer iteration does not depend on the value of i itself.
If we consider the num <- j + i part to run in constant time (strictly speaking, summing up huge numbers cannot be done in constant time), then this is thus an O(n^3) algorithm.
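As a sanity check, this small C program (the counter is my addition) counts how many times the innermost body runs and compares the result against n * n * (n+1)/2:

#include <stdio.h>

int main(void) {
    int n = 20;
    long count = 0, num = 0;
    for (int i = 1; i <= n; i++)
        for (int j = 1; j <= n; j++)
            for (int k = j; k <= n; k++) {
                num = j + i;
                count++;
            }
    printf("count = %ld, n*n*(n+1)/2 = %ld, num = %ld\n",
           count, (long)n * n * (n + 1) / 2, num);
    return 0;
}

For n = 20 both counts come out to 4200.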

Why doesn't the while loop contribute O(n) to the total time complexity? [duplicate]

I got this problem from Interview Bit
Problem
int j = 0;
for (int i = 0; i < n; ++i) {
    while (j < n && arr[i] < arr[j]) {
        j++;
    }
}
Question
The total number of comparisons that the while loop does is about n (possibly less, depending on arr). The outer for loop also runs n times. Shouldn't the time complexity be O(n^2)?
One of the conditions in the while loop is j < n. That means that in the worst case, the code will loop in the while loop only n times in total, regardless of how many iterations the outer for loop does (j starts at zero and only increases; it is never reset to zero and never decreases). Since the for loop also runs n times, the big-O is O(n+n) => O(n).
NOTE: You can safely ignore the other condition arr[i] < arr[j], since we're just considering what the runtime would be in the worst-case.
This code looks like it was purposefully designed to be misleading. The while loop only runs once from 0 to n, and does not reset for every iteration of the outer for loop.
You need to count the total number of times the statements in the innermost loop get executed.
The nested while loop does not contribute to the complexity because it steps through the values between 0 and n-1 only once, even though those steps may be distributed among different iterations of the outer loop.
The innermost "payload" of the loop, i.e. arr[i] < arr[j] and j++, will execute at most n times, because incrementing j is a "one-way street": its value is never reset back to zero, so once j reaches n, the body of the loop no longer executes.
Actually, the inner loop does not depend on 'i', so it runs at most n times in total while 'i' goes from 0 to n-1.
The complexity would be O(n^2) if 'j' were reset to 0 before the while loop: in that case, with the array elements in descending order, the while loop would execute 1 + 2 + 3 + ... + (n-1) times over the successive values of 'i', which is O(n^2).
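Here is the loop instrumented in C (the descending array and the counter are my additions) so you can see that j is incremented at most n times in total across all iterations of the for loop:

#include <stdio.h>

int main(void) {
    int n = 8;
    int arr[] = {8, 7, 6, 5, 4, 3, 2, 1};  /* descending order */
    int j = 0, increments = 0;
    for (int i = 0; i < n; ++i) {
        while (j < n && arr[i] < arr[j]) {
            j++;           /* j only ever moves forward */
            increments++;
        }
    }
    printf("total j increments: %d (never more than n = %d)\n", increments, n);
    return 0;
}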

Why is this algorithm O(nlogn)?

I'm reading a book on algorithm analysis and have found an algorithm which I don't know how to get the time complexity of, although the book says that it is O(nlogn).
Here is the algorithm:
sum1 = 0;
for (k = 1; k <= n; k *= 2)
    for (j = 1; j <= n; j++)
        sum1++;
Perhaps the easiest way to convince yourself of the O(n*lgn) running time is to run the algorithm on a sheet of paper. Consider what happens when n is 64. Then the outer loop variable k would take the following values:
1 2 4 8 16 32 64
log_2(64) is 6, so there are log_2(64) + 1 = 7 terms above. You can continue this line of reasoning to conclude that the outer loop takes O(lgn) time.
The inner loop, which is completely independent of the outer loop, is O(n). Multiplying these two terms together yields O(lgn*n).
In your first loop for(k=1; k<=n; k*=2), variable k reaches the value of n in log n steps since you're doubling the value in each step.
The second loop for(j=1; j<=n; j++) is just a linear cycle, so requires n steps.
Therefore, total time is O(nlogn) since the loops are nested.
To add a bit of mathematical detail...
Let a be the number of times the outer loop for(k=1; k<=n; k*=2) runs. Since k doubles on each iteration (note the loop increment k*=2), after a iterations k has reached about 2^a, so the loop stops when 2^a = n. Solve for a by taking the base-2 log on both sides and you get a = log_2(n).
Since the inner loop runs n times, total is O(nlog_2(n)).
To add to #qwerty's answer, if a is the number of times the outer loop runs:
    k takes values 1, 2, 4, ..., 2^a and 2^a <= n
    Taking log on both sides: log_2(2^a) <= log_2(n), i.e. a <= log_2(n)
So the outer loop has an upper bound of log_2(n), i.e. it cannot run more than log_2(n) times.
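A quick instrumented run (plain C; the outer counter is my addition) shows both counts for n = 64:

#include <stdio.h>

int main(void) {
    int n = 64;
    long sum1 = 0, outer = 0;
    for (long k = 1; k <= n; k *= 2) {
        outer++;                      /* runs floor(log2(n)) + 1 times */
        for (int j = 1; j <= n; j++)
            sum1++;
    }
    printf("outer = %ld, sum1 = %ld\n", outer, sum1);
    return 0;
}

For n = 64 this prints outer = 7 and sum1 = 448, i.e. (log_2(n) + 1) * n, which grows as O(nlogn).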
