Algorithm Time Complexity Analysis (three nested for loops)

Here's the code I've implemented, in a nutshell. The two inner for loops should have a complexity of O(n²), where n = vertices. I just can't figure out the overall time complexity including the outer for loop. I think it's going to be O(E * n²), where E is the number of edges and n is the number of vertices.
int vertices;
for (int edge = 0; edge < vertices - 1; edge++)
{
    for (int i = 0; i < vertices; i++)
        for (int j = 0; j < vertices; j++)
        {
            // TASKS
        }
}
This code is for Prim's algorithm. I can post the whole code if you want. Thanks :)

Ahh!!! What is so typical about it!
In your inner loops you have variables named i and j, so you figured out the complexity easily. With the addition of the edge variable, which is not at all different from the other two, you got confused! Just check the number of iterations:
The outer loop runs VERTICES-1 iterations.
Therefore, total iterations = (VERTICES-1) * (VERTICES) * (VERTICES) = VERTICES^3 - VERTICES^2.
The program's complexity is O(VERTICES^3), or O(n^3) where n = vertices.

Sigma notation makes things clear:

sum_{e=1}^{V-1} ( sum_{i=1}^{V} ( sum_{j=1}^{V} 1 ) ) = (V-1) * V * V = V^3 - V^2 = O(V^3)
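A quick way to sanity-check that count is to instrument the loops. Here is a minimal Python sketch of the same loop structure, with the `// TASKS` body replaced by a counter (the function name is illustrative):

```python
def count_iterations(vertices):
    """Count how many times the innermost body runs."""
    count = 0
    for edge in range(vertices - 1):   # outer loop: V-1 iterations
        for i in range(vertices):      # V iterations
            for j in range(vertices):  # V iterations
                count += 1             # stands in for TASKS
    return count

# (V-1) * V * V:
# count_iterations(10) == 9 * 10 * 10 == 900
```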


How complex is this algorithm

I am having a really hard time getting my head around algorithm complexity analysis for university. My professor is giving us simple bits of code to work out the complexity of and this is one of them:
double minValue(double* pd, int& p, int N)
{
    double minV = pd[0];
    for (int i = 1; i < N; i++)
        if (minV > pd[i]) {
            minV = pd[i];
            p = i;
        }
    return minV;
}
Can someone tell me what the complexity of this is? My guess is O(N²)
The only repetitive portion of this function is the following for loop:
for (int i = 1; i < N; i++)
This loop is just O(N), so that is also the overall complexity of the function.
Since the inputs to the function are an array and two scalars, the array is what drives the time complexity. You go through the array exactly once in the for loop. Increasing the size of the input increases the running time linearly, since the loop visits every element once. The time complexity is therefore O(N).
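The same single-pass scan is easy to express in Python (a sketch, not the original C++; instead of an out-parameter it returns the minimum together with its index):

```python
def min_value(pd):
    """Single pass over the list: O(N) time, one comparison per element."""
    min_v, p = pd[0], 0
    for i in range(1, len(pd)):  # N-1 comparisons
        if min_v > pd[i]:
            min_v, p = pd[i], i
    return min_v, p

# min_value([3.0, 1.0, 2.0]) returns (1.0, 1)
```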

Time Complexity of 3 nested for loops

I found pretty good replies to similar questions; however, I want the time complexity equation for the following piece of code:
sum = 0;                          // c1 = 11
for (i = 0; i < N; i++)           // c2 = 6
    for (j = 0; j < N; j++)       // c2 = 6
        for (j = 0; j < N; j++)   // c2 = 7
            sum += arr[i][j];     // c3 = 2
While each statement has a cost associated with it, I need the complete time-complexity equation and its result.
Regards
The comments section got quite long so I am going to write up an answer summarizing everything.
Measuring Time Complexity
In Computer Science, we measure time complexity by the number of steps/iterations your algorithm takes to evaluate.
So if you have a simple array of length n and you go through this array only once, say to print all the elements, we say that this algorithm is O(n), because the time it takes to run grows proportionally to the size of the array, thus n.
You can think of Big-O O(..) as a higher-order function that compares other functions. If we say f(x) = O(n), it means that your function grows at most as fast as y = n, thus linearly. This means that if you were to plot these functions on a graph, there would be a point x = c after which the graph of n is always on top of f(x) for any x > c. Big-O signifies an upper bound of a function in terms of another function.
So let's look at your original question and what it means to be constant time. Say we have this function
def printFirst5(arr: Array[Int]) = {
  for (i <- 0 until 5) {
    println(arr(i))
  }
}
This is what we call a constant-time algorithm. Can you see why? Because no matter what array you pass in (as long as it has at least 5 elements), it will only go through the first 5 elements. You can pass it an array of length 100000000000 or an array of length 10; it doesn't matter. In each case it will only look at the first 5 elements, meaning this function printFirst5 will never go above the line y = 5 in terms of time complexity. These kinds of functions are denoted O(1), for constant time.
Now, finally, let's look at your edited version. (I am not sure what you are trying to do in your example because it is syntactically wrong, so I will write my own example.)
def groupAllBy3(array: Array[Int]) = {
  for (i <- array.indices) {
    for (j <- array.indices) {
      for (k <- array.indices) {
        println(s"${array(i)}, ${array(j)}, ${array(k)}")
      }
    }
  }
}
This function's time complexity is O(N^3). Why? Let's take a look.
The innermost loop will go through N elements for every j
How many js are there? Well there will be N js for every i.
How many is are there? N many.
So in total we get numberof-i * numberof-j * numberof-k = N * N * N = O(N^3)
Just to make sure you understand this correctly, let's take a look at another example. What would happen if these loops weren't nested? If we had:
def printAllx3(array: Array[Int]) = {
  for (i <- array.indices) {
    println(s"${array(i)}")
  }
  for (j <- array.indices) {
    println(s"${array(j)}")
  }
  for (k <- array.indices) {
    println(s"${array(k)}")
  }
}
What is the case here?
The first loop goes through N elements, the second loop goes through N elements, the third loop goes through N elements. But they don't depend on each other in terms of iterations so we get N + N + N = 3N = O(N)
Do you see the difference?
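The contrast is easy to verify by counting body executions directly (a Python sketch of both loop shapes; the function names are illustrative):

```python
def nested_count(n):
    """Three nested loops: the body runs N * N * N times."""
    count = 0
    for i in range(n):
        for j in range(n):
            for k in range(n):
                count += 1
    return count

def sequential_count(n):
    """Three independent loops: the body runs N + N + N = 3N times."""
    count = 0
    for i in range(n):
        count += 1
    for j in range(n):
        count += 1
    for k in range(n):
        count += 1
    return count

# nested_count(7) == 343, but sequential_count(7) == 21
```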
With all due respect, I believe you are missing some of the fundamentals of what time complexity is & how we measure it. There is only so much I can explain here, I highly recommend you do some reading on the subject and ask any further questions you don't understand.
Hope this helps

Time Complexity of this nested for-loop algorithm?

I'm having some trouble calculating the bigO of this algorithm:
public void foo(int[] arr){
    int count = 0;
    for(int i = 0; i < arr.length; i++){
        for(int j = i; j > 0; j--){
            count++;
        }
    }
}
I know the first for loop is O(n) time, but I can't figure out what the nested loop is. I was thinking O(log n), but I do not have solid reasoning. I'm sure I'm missing something pretty easy, but some help would be nice.
Let n denote the length of the array.
If you consider the second loop alone, it is just a function f(i), and since it iterates over all values from i down to 1, its complexity is O(i). Since you know that i < n, you can say it is O(n). However, there is no logarithm involved: in the worst case, i.e. i = n-1, you will perform n-1 iterations.
As for evaluating the complexity of both loops, observe that for each value of i, the second loop goes throught i iterations, so the total number of iterations is
1+2+...+(n-1)= n*(n-1)/2=(1/2)*(n^2-n)
which is O(n^2).
If we consider c as the number of times count is incremented in the inner loop, then the total number of increments can be represented by the formula below:

c = 0 + 1 + 2 + ... + (n-1) = n*(n-1)/2
As you can see, the total time complexity of the algorithm is O(n^2).
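You can confirm the closed form empirically with a Python sketch that mirrors the Java loops (the function name is illustrative):

```python
def foo_count(n):
    """Same loop structure as foo(), counting inner-body executions."""
    count = 0
    for i in range(n):
        for j in range(i, 0, -1):  # j = i, i-1, ..., 1: i iterations
            count += 1
    return count

# matches 0 + 1 + ... + (n-1) = n*(n-1)/2
# foo_count(10) == 45
```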

How to determine computational complexity for algorithms with nested loops?

After looking at this question, this article, and several other questions, I have still not been able to find a general way to determine the computational complexity of algorithms with looping variables dependent on the parent loop's variable. For example,
for (int i = 0; i < n; i++) {
for (int j = i; j < n; j++) {
for (int k = i; k < j; k++) {
//one statement
}
}
}
I know that the first loop has a complexity of n, but the inner loops are confusing me. The second loop seems to be executed n-i times and the third loop seems to be executed j-i times. However, I'm not sure how to turn this into a regular Big-O statement. I don't think I can say O(n(n-i)(j-i)), so how can I get rid of the i and j variables here?
I know this is something on the order of n^3, but how can I show this? Do I need to use series?
Thanks for your help!
(If you were wondering, this is from a brute force implementation of the maximum sum contiguous subsequence problem.)
The first loop runs N times.
The second loop runs N/2 times on average.
The third loop runs N/4 times on average.
O(N * N/2 * N/4) is about O(N^3 / 8), which is O(N^3).
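Series give the exact count, and a brute-force check (Python sketch) confirms it: the body runs n*(n^2 - 1)/6 times, which is Θ(n^3) even though the exact constant differs from the N^3/8 estimate above:

```python
def exact_count(n):
    """Count body executions of the i/j/k loops from the question."""
    count = 0
    for i in range(n):
        for j in range(i, n):       # n - i iterations
            for k in range(i, j):   # j - i iterations
                count += 1
    return count

# closed form: sum_i (n-1-i)*(n-i)/2 = n*(n^2 - 1)/6
# exact_count(10) == 165 == 10 * 99 // 6
```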

Time Complexity - Calculating Worst Case For Algorithms

I am reading some information on time complexity and I'm quite confused as to how the following time complexities are achieved and if there is a particular set of rules or methods for working this out?
1)
Input: int n
for(int i = 0; i < n; i++){
print("Hello World, ");
}
for(int j = n; j > 0; j--){
print("Hello World");
}
Tight: 6n + 5
Big O: O(n)
2)
Input: l = array of comparable items
Output: l = array of sorted items
Sort:
for(int i = 0; i < l.length; i++){
    for(int j = 0; j < l.length; j++){
        if(l[i] > l[j]){
            Swap(l[i], l[j]);
        }
    }
}
return l;
Worst Case Time Complexity: 4n² + 3n + 2 = O(n²)
For a given algorithm, the time complexity or Big O is a way to provide a fair estimate of the total number of elementary operations performed by the algorithm in relation to the given input size n.
Type-1
Lets say you have an algo like this:
a=n+1;
b=a*n;
There are 2 elementary operations in the above code. No matter how big your n is, a computer will always perform 2 operations for this code, as the algorithm does not depend on the size of the input, so the Big-O of the above code is O(1).
Type-2
For this code:
for(int i = 0; i < n; i++){
a=a+i;
}
I hope you can see that the Big-O here is O(n), as the elementary operation count depends directly on the size of n.
Type-3
Now what about this code:
//Loop-1
for(int i = 0; i < n; i++){
print("Hello World, ");
}
//Loop-2
for(int i = 0; i < n; i++){
for(int j = 0; j < n; j++) {
x=x+j;
}
}
As you can see, loop-1 is O(n) and loop-2 is O(n^2). So it feels like the total complexity should be O(n) + O(n^2). But no: the time complexity of the above code is O(n^2). Why? Because we are trying to get a fair count of the elementary operations performed for a given input size n, one that is easy for another person to understand. By this logic, O(n) + O(n^2) becomes O(n^2), and O(n^2) + O(n^3) + O(n^4) becomes O(n^4)!
Again, you may ask: how do all the lower powers of Big-O become so insignificant, when added to a higher power, that we can completely omit them when describing the complexity of our algorithm to another person?
I will try to show the reason for this case: O(n) + O(n^2) = O(n^2).
Let's say n = 1000. Then the exact count for the O(n) part is 1,000 operations, and the exact count for the O(n^2) part is 1000 * 1000 = 1,000,000, so the O(n^2) part is 1000 times bigger than the O(n) part. This means your program will spend most of its execution time in the O(n^2) part, so it is not worth mentioning that your algorithm also does some O(n) work.
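The n = 1000 case above can be checked concretely in a couple of lines of Python:

```python
n = 1000
linear = n         # operations done by the O(n) part
quadratic = n * n  # operations done by the O(n^2) part
total = linear + quadratic

# the quadratic part is 1000 times the linear part,
# so the linear part is only ~0.1% of the total work
share = linear / total
```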
PS. Pardon my English :)
In the first example, you iterate n times, twice: the first loop counts i from 0 up to n, and the second counts j from n down to 0. So, to simplify, we can say it took you 2n steps. When dealing with Big O notation, you should keep in mind that we care about the bounds:
As a result, O(2n)=O(n)
and O(an+b)=O(n)
Input: int n // operation 1
for(int i = 0; i < n; i++){ // operation 2
print("Hello World, "); // Operation 3
}
for(int j = n; j > 0; j--) // Operation 4
{
print("Hello World"); //Operation 5
}
As you can see, we have a total of 5 operations outside the loops.
Inside the first loop, we do three operations per iteration: checking whether i is less than n, printing "Hello World", and incrementing i.
Inside the second loop, we also have three internal operations.
So, the total number of operations we need is: 3n (first loop) + 3n (second loop) + 5 (operations outside the loops). As a result, the total number of steps required is 6n + 5 (that is your tight bound).
As I mentioned before, O(an + b) = O(n), because once an algorithm is linear, a and b do not have much impact when n is very large.
So, your time complexity becomes: O(6n + 5) = O(n).
You can use the same logic for the second example keeping in mind that two nested loops take n² instead of n.
I will slightly modify John's answer. Defining n is one constant operation; defining integer i and assigning it 0 is 2 constant operations; defining integer j and assigning it n is another 2 constant operations. Checking the loop conditions for i and j, the increments, and the print statements all depend on n, so the total is 3n + 3n + 5, which equals 6n + 5. Since we cannot skip any statements during execution, the average-case running time equals the worst-case running time, which is O(n).
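Under that same cost model (5 setup operations plus 3 operations per iteration of each loop), a step counter reproduces the 6n + 5 bound. A minimal Python sketch, with the per-iteration cost as an assumption taken from the accounting above:

```python
def step_count(n):
    """Tally elementary operations for the two Hello World loops."""
    steps = 5                   # operations outside the loops
    for i in range(n):
        steps += 3              # compare, print, increment (first loop)
    for j in range(n, 0, -1):
        steps += 3              # compare, print, decrement (second loop)
    return steps

# step_count(n) == 6*n + 5, e.g. step_count(10) == 65
```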
