Nested for loops that can be separated - time complexity

I'm learning about time complexities and I'm wondering how to go about calculating the time complexity of a function like foo1:
foo2(j):
    for i = j down to 0:
        print(i)

foo1(N):
    for j = 1 to N:
        foo2(j)
Would I simply calculate it by reasoning that foo2 runs 1 time on the first iteration of foo1, 2 times on the 2nd iteration, and so on? So,
1 + 2 + ... + n = n*(n+1)/2 = O(n^2)
Is it always the inner-most loop that decides the total time complexity?
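Here is a small Python sketch I used to check that reasoning (my own translation of the pseudocode, not part of the question); note that "for i = j down to 0" actually runs j + 1 times, which only adds a lower-order term and doesn't change the O(n^2) result:

def count_iterations(N):
    # Count how many times foo2's print statement would execute when foo1(N) runs.
    # "for i = j down to 0" is j + 1 iterations, not j.
    count = 0
    for j in range(1, N + 1):            # foo1's loop: j = 1 to N
        for i in range(j, -1, -1):       # foo2's loop: i = j down to 0
            count += 1
    return count

for N in (10, 100, 1000):
    c = count_iterations(N)
    print(N, c, c / N**2)                # the ratio settles near 0.5, i.e. O(N^2)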

Related

Confused about the time complexity of the following function. A good explanation would be helpful

The first loop runs n+1 times.
The second loop runs n(n+1) times.
How many times will the third loop run? I guess it has some n^2+1 relation with the second loop, but what about with the first one?
somefunction(n) {
    c = 0
    for (i = 1 to n*n)
        for (j = 1 to n)
            for (k = 1 to 2*j)
                c = c+1
    return c
}
The first loop has O(n^2) iterations.
The second loop has O(n) iterations.
The third loop has O(n) iterations as well, since j is steadily increasing towards n.
(It's a little easier to see if you sum up the number of times c = c + 1 executes for the two inner loops combined. The inner loop runs 2 times for j = 1, 4 times for j = 2, ..., and 2*n times for j = n, and 2 + 4 + ... + 2*n = O(n^2).)
You can then (loosely speaking) multiply the three values together to get a total bound of O(n^4).
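If it helps to check that bound numerically, here is a small Python transcription of somefunction (my own, assuming "to" is inclusive) compared against the exact closed form n^3*(n+1), which is Theta(n^4):

def somefunction(n):
    # Direct transcription of the pseudocode, assuming "to" is inclusive.
    c = 0
    for i in range(1, n * n + 1):
        for j in range(1, n + 1):
            for k in range(1, 2 * j + 1):
                c = c + 1
    return c

for n in (2, 4, 8):
    # Each pass of the outer loop adds 2 + 4 + ... + 2n = n(n+1),
    # and the outer loop runs n^2 times, so c = n^3 * (n + 1) = Theta(n^4).
    print(n, somefunction(n), n**3 * (n + 1))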

Compute time complexity of code [duplicate]

Can someone help me analyze the run-time of the pseudocode given below?
for i = 1 to n
    k[i] = 0
for i = 1 to n
    for j = i to n
        k[i] = k[i] + j
I guess that its time complexity is O(n^2). Please correct me if I am wrong.
The for loop consists of three elements: the assignment, the conditional branch, and the increment operation. If you can quantify the execution time of each line, you can calculate the overall time to execute. So, for example, call the k[i] = 0 operation a, the k[i] = k[i] + j operation b, the for-loop assignment operation c, and the for-loop increment and conditional branch operation d.
This would yield:
(sum of (n - i) for i = 1 to n)*(b + d) + (2 + n)*c + n*a,
which I think would simplify to approximately (b + d)*(n^2)/2 for very large values of n. So I would agree its complexity is O(n^2).
The way to analyze the complexity of these nested loops is to start from the deepest loop.
for i = 1 to n
    k[i] = 0
for i = 1 to n
    for j = i to n
        k[i] = k[i] + j
For the first loop it is very easy to see that the operation k[i] = 0 will be performed n times, so its order is O(N).
Now for the nested loop: the loop over j starts from i, where i itself runs from 1 to n, and continues until n.
So the key question to ask is how many times the inner loop body executes:
when i = 1 it executes N times
when i = 2 it executes N-1 times
...
when i = N it executes 1 time
If you sum them all up you get N + (N-1) + ... + 1 = N(N+1)/2 = N^2/2 + N/2.
So the order of the nested loop is O(N^2/2 + N/2) = O(N^2).
Also, for the first loop the order is O(N),
so the total time complexity is O(N) + O(N^2) = O(N^2).
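To double-check that count, here is a minimal Python sketch of the pseudocode (my own transcription, not from either answer, assuming both loop bounds are inclusive) that counts how many times k[i] = k[i] + j runs and compares it with N(N+1)/2:

def inner_executions(n):
    # Count how many times "k[i] = k[i] + j" runs, assuming
    # "for j = i to n" includes both endpoints.
    k = [0] * (n + 1)
    count = 0
    for i in range(1, n + 1):
        k[i] = 0
    for i in range(1, n + 1):
        for j in range(i, n + 1):
            k[i] = k[i] + j
            count += 1
    return count

for n in (10, 100, 1000):
    print(n, inner_executions(n), n * (n + 1) // 2)   # the two counts agree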

Bubble sort algorithm: find time complexity

I am trying to find the time complexity of this bubble sort:
n = length[A]
for j <- n-1 to 1
    for i <- 0 to j-1
        if A[i] > A[i+1]
            temp = A[i]
            A[i] = A[i+1]
            A[i+1] = temp
return A
please any one can help thanks
In line 1 we assign the length of the array to n, which takes constant time.
In line 2 we have a for loop that decrements j by 1 every iteration until j = 1, and in total it will iterate n-2 times.
Inside the first for loop we have a second for loop that increments i by 1 every iteration up to i = j-1 and will iterate j-1 times. On each iteration of the inner for loop we have lines 4, 5, 6 and 7, which are all just assignments and array accesses and cost, in total, constant time.
We can think about the two for loops in the following way: For every iteration of the outer for loop, the inner for loop will iterate j-1 times.
Therefore on the first iteration of the outer for loop we have j = n-1, which means the inner for loop will iterate (n-1)-1 = (n-2) times. On the second iteration of the outer for loop we have j = n-2, so the inner for loop will iterate (n-2)-1 = (n-3) times, and so on, until j = 1.
We then have the sum (n-2) + (n-3) + ... + 2 + 1, which is the total number of times the inner loop iterates over the whole run of the algorithm. We know that 1 + 2 + ... + (n-1) + n = n(n+1)/2, so our expression simplifies to n(n+1)/2 - (n-1) - n = n(n+1)/2 - 2n + 1 = (n-1)(n-2)/2 = O(n^2).
Since the inner for loop iterates O(n^2) times in total, and each iteration does constant work, the runtime is O(c*n^2), where c is the constant amount of work done by lines 4, 5, 6 and 7. Combining O(c*n^2) with line 1, which is O(1), we get O(c*n^2) + O(1), which is just O(n^2).
Therefore runtime of BubbleSort is O(n^2).
If you are still confused then maybe this will help: https://www.youtube.com/watch?v=Jdtq5uKz-w4
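To see that quadratic count concretely, here is a Python transcription of the pseudocode (my own, not part of the original answer; reading both loop bounds as inclusive, which makes the exact comparison count n(n-1)/2 rather than the (n-1)(n-2)/2 derived above, but either way it is Theta(n^2)):

import random

def bubble_sort(A):
    # Python transcription of the pseudocode, with a counter on the
    # comparison performed by the if in the inner loop.
    n = len(A)
    comparisons = 0
    for j in range(n - 1, 0, -1):        # j <- n-1 down to 1
        for i in range(0, j):            # i <- 0 to j-1
            comparisons += 1
            if A[i] > A[i + 1]:
                A[i], A[i + 1] = A[i + 1], A[i]
    return A, comparisons

for n in (10, 100, 1000):
    data = [random.randrange(n) for _ in range(n)]
    _, comps = bubble_sort(data)
    print(n, comps, n * (n - 1) // 2)    # the comparison count is exactly n(n-1)/2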

What is the O notation of this loop?

I understand that this is O(N^2):
Loop from i=1 to N
    Loop from j=1 to N
        Do something with i,j
But what about this?
Loop from i=1 to N
    Loop from j=1 to i
        Do something with i,j
Is it still O(N^2) or O(N log N)? I don't really understand how to tell.
This is also O(N^2).
The total number of inner iterations is N(N+1)/2 ~ O(N^2):
i = 1, then j = 1
i = 2, then j = 1 to 2
i = 3, then j = 1 to 3
i = 4, then j = 1 to 4
...
i = N, then j = 1 to N
So the total is 1 + 2 + 3 + 4 + ... + N = (N * (N+1))/2 ~ O(N^2).
For the second problem, the running time is roughly (1/2)*N^2, which becomes O(N^2), since we don't care about constant factors in O notation. A log N factor usually appears when an algorithm cuts the problem down to half its size at each step. Take merge sort, for example: at every level it divides the array in half.
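To make that distinction concrete, here is a small Python illustration (my own, not from the question or answer): the triangular double loop does about N^2/2 units of work, while a halving loop of the kind described above only takes about log2(N) steps:

import math

def triangular_count(N):
    # The loop from the question: for each i, j runs from 1 to i.
    count = 0
    for i in range(1, N + 1):
        for j in range(1, i + 1):
            count += 1
    return count                          # equals N(N+1)/2

def halving_count(N):
    # The kind of loop that *does* produce a log factor:
    # the problem size is cut in half at every step.
    count = 0
    size = N
    while size > 1:
        size //= 2
        count += 1
    return count                          # log2(N) for N a power of two

for N in (16, 256, 2048):
    print(N, triangular_count(N), N * (N + 1) // 2,
          halving_count(N), int(math.log2(N)))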
Also O(n^2).
You have to look at the worst case of how long your code will run.
So the first loop runs from 1 to N.
For each iteration of that loop there is a second loop, which runs from 1 to i.
Since i reaches N on the last iteration, the inner loop does at most N iterations per outer iteration, so the whole thing is bounded by O(N*N), which is O(N^2).
We ignore constants in big-O notations.
If these concepts are difficult, try googling some tutorials and examples. All you need is some practice, and you will get it.

What is the running time complexity of this algorithm

What is the time complexity of this algorithm:
sum = 0
i = 1
while (i < n) {
    for j = 1 to i {
        sum = sum + 1
    }
    i = i*2;
}
return sum
I know that the while loop is O(log n), but what is the complexity of the for loop? Is it O(n) or O(log n)?
One way to analyze this would be to count up the number of iterations of the inner loop. On the first iteration of the outer loop, the inner loop runs one time. On the second iteration, it runs two times. It runs four times on the third iteration, eight times on the fourth iteration, and more generally 2^(k-1) times on the kth iteration. This means that the total number of iterations of the inner loop is given by
1 + 2 + 4 + 8 + ... + 2^r = 2^(r+1) - 1
where 2^r is the size of the last inner loop. As you noted, the outer loop runs roughly log n times, so r is roughly log n, meaning that this summation works out to (approximately)
2^(log n + 1) - 1 = 2*(2^(log n)) - 1 = 2n - 1
Consequently, the total work done by the inner loop across all iterations is O(n). Since the program does a total of O(log n) work running the outer loop, the total runtime of this algorithm is O(n + log n) = O(n). Note that we don't multiply these terms together, since the O(log n) term is the total amount of work done purely in the maintenance of the outer loop and the O(n) term is the total amount of work done purely by the inner loop.
Hope this helps!
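A quick way to confirm the O(n) total (a minimal Python transcription, not part of the original answer): count how many times the inner loop body runs and check that it never exceeds 2n - 1.

def count_inner(n):
    # Direct transcription of the pseudocode: i doubles on every pass of the
    # while loop, and the for loop adds 1 to the total i times per pass.
    total = 0
    i = 1
    while i < n:
        for j in range(1, i + 1):
            total += 1
        i = i * 2
    return total

for n in (10, 100, 1000, 10**6):
    t = count_inner(n)
    print(n, t, t <= 2 * n - 1)           # total = 2^ceil(log2 n) - 1, which is Theta(n)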
