for i = 1 to n
    for j = 1 to i - 1
Is the runtime of this O(n^2)?
Is there a good way to visualize things when approaching these types of problems to find the right answer?
The inner loop executes
1 + 2 + 3 + ... + (n - 1) = n*(n - 1)/2 times
using the formula for the sum of an arithmetic progression, so the overall complexity is O(n^2).
Each for loop runs O(n) times, so two nested for loops give O(n)*O(n) = O(n^2).
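If you want to sanity-check the closed form, a quick counter (my own sketch, not from the original answer) reproduces n*(n-1)/2:

```python
def count_inner_iterations(n):
    """Count how often the body of the inner loop runs."""
    count = 0
    for i in range(1, n + 1):      # for i = 1 to n
        for j in range(1, i):      # for j = 1 to i - 1
            count += 1
    return count

# matches the closed form n*(n-1)/2
print(count_inner_iterations(10))  # → 45, i.e. 10*9/2
```

Plotting the count against n (or just checking that doubling n roughly quadruples it) is a simple way to visualize quadratic growth.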
Check this link out. The author explains good ways to approach figuring out runtimes.
I have the following code, and I want to find the complexity:
analizz(int n)
    c = 1
    k = n*n
    while k > 1 do k = k - 2
    for i = 0 to 1 do
        if n > 1 then analizz(n/2)
The problem is that the code is written this way, and as I'm trying to understand it, the for loop is inside the while loop, so the cost should be O(n^2), with one recursive call if n > 1, i.e. T(n/2).
The answer is supposed to be T(n) = 2T(n/2) + cn^2, but I cannot understand where the 2T(n/2) comes from if there is only one recursive call.
PS: I don't know which title would best describe my problem.
The code is written poorly, but if the answer is correct then the for loop is not inside the while loop, and the if is inside the for loop. The while loop gives the cn^2 term, and the two recursive calls come from the for loop: it runs its body twice, and each iteration makes one recursive call on n/2.
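Under that reading (while loop first, then a for loop that makes two recursive calls), you can count the basic steps directly. This is my own sketch, not from the original post:

```python
def analizz_ops(n):
    """Count the basic steps analizz(n) performs under the reading above."""
    # while loop: k = n*n is decremented by 2 until k <= 1,
    # i.e. about n*n/2 iterations -> the cn^2 term
    ops = n * n // 2
    # for i = 0 to 1: two iterations, each recursing on n/2 -> the 2T(n/2) term
    if n > 1:
        ops += 2 * analizz_ops(n // 2)
    return ops

# T(n) = 2T(n/2) + c*n^2 solves to Θ(n^2) (master theorem, third case),
# so doubling n should roughly quadruple the count:
ratio = analizz_ops(4096) / analizz_ops(2048)
```

The ratio comes out very close to 4, which is what a Θ(n^2) recurrence predicts.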
I have a question. I've got an algorithm:
procedure summation(A[1...n])
    s = 0
    for i = 1 to n do
        j = min{max{i, A[i]}, n^3}
        s = s + j
    return s
And I want to find the best- and worst-case running time of this algorithm using asymptotic Θ notation.
Any ideas on how to do that? What do I have to look at in an algorithm to understand its complexity?
If you want to know how big O notation and time complexity work, you might want to look at the following post: What is a plain English explanation of "Big O" notation?
For the pseudocode that you showed, the complexity is O(n), where n is the length of the array.
Often you can determine the complexity just by looking at how many nested loops the algorithm has. Of course this is not always the case, but it can be used as a rule of thumb.
In the following example:
procedure summation(A[1...n, 1...m])
    s = 0
    for i = 1 to n do
        for j = 1 to m do
            t = min{max{i, A[i,j]}, n^3}
            s = s + t
    return s
the complexity would be O(n*m), where m is the length of each inner array. (Note the inner expression must use a fresh variable such as t, not j, or it would clobber the loop counter.)
best or worst case
For the algorithm that you showed, there is no best or worst case: it always runs in the same time for the same array, and the only influence on the running time is the length of the array.
An example where there could be a best and a worst case is the following:
let's say you need to find the location of a specific number in an array.
If your method is to go through the array from start to end, the best case is that the number is at the start, and the worst case is that the number is at the end (or not present at all).
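A minimal sketch of that search (my own illustration), with a step counter so the two cases are visible:

```python
def find_index(arr, target):
    """Linear scan; returns (index, steps taken) to make best/worst case visible."""
    steps = 0
    for i, x in enumerate(arr):
        steps += 1
        if x == target:
            return i, steps
    return -1, steps          # not found: we paid the full n steps

arr = [7, 3, 9, 1, 5]
# best case:  target is the first element -> 1 step
# worst case: target is last or absent    -> len(arr) steps
```

Both cases are O(n) in big-O terms, but the constant work actually done differs, which is exactly what best/worst-case analysis captures.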
For a more detailed explanation look at the link.
Cheers.
The best and the worst case are the same because the algorithm will run the "same way" every time no matter the input. So based on that we will calculate the time complexity of the algorithm using math:
T(n) = 1 + sum_{i=1}^{n} (3 + 2) + 1
T(n) = 2 + 5 * sum_{i=1}^{n} 1
T(n) = 2 + 5 * (n - 1 + 1)
T(n) = 5n + 2
The summand (3+2) is due to the fact that inside the loop we have 5 distinct and measurable actions:
j = min{max{i, A[i]}, n^3} counts as three actions, because we have 2 comparisons and a value assignment to the variable j.
s = s + j counts as 2 actions, because we have one addition and a value assignment to the variable s.
Asymptotically: Θ(n)
How we calculate Θ(n):
We look at the result, which is 5n + 2, drop the constant factors and lower-order terms, and keep the dominant term, which is n.
Other examples:
8n^3 + 5n + 2 -> Θ(n^3)
10*log n + n^4 + 7 -> Θ(n^4)
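One way to see why only the dominant term survives (a small check of my own): the ratio of the full expression to its dominant term approaches a constant, which is exactly what Θ allows.

```python
def f(n):
    return 8 * n**3 + 5 * n + 2

# f(n) / n^3 approaches the constant 8 as n grows,
# so f(n) = Θ(n^3): the 5n + 2 part becomes negligible.
for n in (10, 100, 1000):
    print(n, f(n) / n**3)
```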
More info: http://bigocheatsheet.com/
This question already has answers here:
What is the Big-O of a nested loop, where number of iterations in the inner loop is determined by the current iteration of the outer loop?
I have a piece of code which executes three nested for loops in the following way (written as language agnostic as possible):
for i = 0 to n - 1
    for j = 0 to n - 1
        for k = 0 to n - 1
            [do computation in linear time]
My intuition says that this should result in N^3 complexity. I'd like to compare its complexity to the following nested loops:
for i = 0 to n - 1
    for j = i + 1 to n - 1
        for k = j + 1 to n - 1
            [do computation in linear time]
Logically, it should be faster, because as i increases, the inner loops run fewer iterations; however, having seen countless other threads on the site explaining that the latter is also N^3, I'm confused.
Is my assumption of N^3 for both correct, and if so, how do I quantify the differences, assuming there are any?
With big O notation, only the leading polynomial term is listed (i.e., in this case, N^3). The iteration count of the latter example works out to N-choose-3 = N(N-1)(N-2)/6 = N^3/6 - N^2/2 + N/3, i.e. a*N^3 - b*N^2 + c*N with constant coefficients, so the latter can be significantly quicker in practice. However, because you have 3 nested loops whose bounds all grow with N, the big-O classification is still N^3 for both.
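You can quantify the difference by just counting iterations of both loop shapes (my own sketch):

```python
def full_count(n):
    """Iterations of the full triple loop: exactly n^3."""
    return sum(1 for i in range(n)
                 for j in range(n)
                 for k in range(n))

def triangular_count(n):
    """Iterations of the i < j < k variant: n-choose-3 = n(n-1)(n-2)/6."""
    return sum(1 for i in range(n)
                 for j in range(i + 1, n)
                 for k in range(j + 1, n))

n = 20
print(full_count(n))        # → 8000  (n^3)
print(triangular_count(n))  # → 1140  (20*19*18/6)
```

Both counts are Θ(n^3); the triangular version is smaller by a constant factor approaching 1/6, which big-O deliberately ignores.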
The time complexity of both of the above code fragments is of order n^3, i.e. Big O(n^3). By the definition of Big O notation, T(n) ≤ k·f(n) for some constant k, so the second fragment's smaller constant factor is absorbed into k and does not change the classification.
If you had an algorithm with a loop that executed n steps the first time through,then n − 2 the second time, n − 4 the next time, and kept repeating until the last time through the loop it executed 2 steps, what would be the complexity measure of this loop?
I believe this exhibits O(n^2) complexity, as the total number of steps executed grows quadratically. I am having a hard time visualizing the loop itself, though, which makes me less confident about my answer.
Any kind of help/second opinion is greatly appreciated :)
You are correct that the complexity is Θ(n^2). This is because what you describe is an arithmetic progression:
n + (n - 2) + (n - 4) + ... + 2 (or an odd number at the end)
(which is, obviously, 2 + 4 + 6 + ... + n or the odd-beginning equivalent, BTW).
Using the formula for the sum, it is the average of the first and last elements, times the number of elements. Each of those factors is Θ(n), and their product is Θ(n^2).
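Simulating the loop (my own sketch) makes the sum concrete and lets you check the closed form:

```python
def total_steps(n):
    """Total work of a loop doing n, then n-2, then n-4, ... steps, down to 2."""
    steps, k = 0, n
    while k >= 2:
        steps += k
        k -= 2
    return steps

# for even n the closed form is (number of terms) * (average term):
#   (n/2) * (n + 2)/2 = n(n + 2)/4, which is Θ(n^2)
print(total_steps(100))  # → 2550, and 100*102/4 = 2550
```

Doubling n roughly quadruples the total, the signature of quadratic growth.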
Could someone help me out with the time complexity analysis on this pseudocode?
I'm looking for the worst-case complexity here, and I can't figure out if it's O(n^4), O(n^5) or something else entirely. If you could go into detail into how you solved it exactly, it would be much appreciated.
sum = 0
for i = 1 to n do
    for j = 1 to i*i do
        if j mod i == 0 then
            for k = 1 to j do
                sum = sum + 1
First loop: runs n times.
Second loop: for each i it runs i*i times; summed over all i, the j-loop alone contributes Σ i² = Θ(n³).
The condition j mod i == 0 holds exactly i times per outer iteration (at j = i, 2i, ..., i*i), and when it holds with j = m*i, the third loop runs j steps. So the innermost line executes Σ_{i=1}^{n} (i + 2i + ... + i*i) = Σ_{i=1}^{n} i²(i+1)/2 = Θ(n⁴) times.
The total is Θ(n³) + Θ(n⁴) = Θ(n⁴), so the worst-case complexity is O(n⁴). You can confirm this estimate by running the code yourself with an operation counter.
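A sketch of such a counter (mine, not from the answer): it counts only the innermost increments, which dominate the total work.

```python
def count_increments(n):
    """Count how many times `sum = sum + 1` executes in the pseudocode."""
    total = 0
    for i in range(1, n + 1):
        for j in range(1, i * i + 1):
            if j % i == 0:
                total += j        # the k-loop adds 1 exactly j times
    return total

# Θ(n^4): doubling n should multiply the count by roughly 16
# (a bit less for small n because of lower-order terms)
ratio = count_increments(64) / count_increments(32)
```

The measured ratio sits near 16 and creeps closer as n grows, consistent with Θ(n⁴).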