This is my first question, and I'm pretty sure I'll also receive my first answer :P
I have to make an asymptotic analysis of an algorithm that, from an array A[1...n], computes a matrix M[n][n] whose entries M[i][j] are given by:
M[i][j] = (A[i] + ... + A[j])/(j-i+1), if i<=j
and
M[i][j] = 0, if i>j
for i=1 to N do                    [N]
    for j=1 to N do                [N]
        if(i<=j)                   [cost]
            start=i                [cost]
            end=j                  [cost]
            sum=0                  [cost]
            for a=start to end do  [??]
                sum += A[a]        [cost]
            M[i][j]=sum/(j-i+1)    [cost]
        else                       [cost]
            M[i][j]=0              [cost]
Given the first two for loops, I expect a running time of at least O(n^2); with the third, inner for loop I will get something like O(N*N*[??]).
The third for loop executes j-i+1 operations each time, and only when i<=j.
The matrix will have its first row filled with computed averages; in the second row the first element is 0, followed by the computed averages, and so on.
The final matrix will be a little more than half filled (but not exactly N/2 per row), so the value for the third loop is not simply [N/2].
How can I compute the running time for the innermost For and also the running time for the whole algorithm?
You could try to just calculate the number of times the inner loop statement gets executed.
You loop over i from 1 to N. You only run the inner part (with the sum) when j is larger than or equal to i, so j loops from i to N. Summing the j-i+1 terms then takes j-i additions. In all you get (in Maple syntax)
sum(sum(j-i,j=i..N), i=1..N)= 1/6*N^3-1/6*N
You then need to take into account the assignment of M[i][j] which is done N^2 times.
The previous was the total number of instructions. However, if you're just looking for the overall complexity, then you should just see that the innermost loop is bounded by N, which gives you O(N^3) overall complexity.
Do note that the code could avoid the innermost loop entirely by precomputing the running sums of A once at the start (prefix sums) instead of recalculating them every time.
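As a sketch of that prefix-sum idea (the class name and 0-based indexing are my own, not from the question), each range sum becomes an O(1) lookup and the whole computation drops to O(n^2):

```java
public class AverageMatrix {
    // Fill M[i][j] with the average of A[i..j]; M[i][j] stays 0 for i > j.
    // prefix[k] holds A[0] + ... + A[k-1], so sum(A[i..j]) = prefix[j+1] - prefix[i].
    static double[][] averages(int[] A) {
        int n = A.length;
        long[] prefix = new long[n + 1];
        for (int k = 0; k < n; k++) {
            prefix[k + 1] = prefix[k] + A[k];
        }
        double[][] M = new double[n][n];   // Java initializes entries to 0
        for (int i = 0; i < n; i++) {
            for (int j = i; j < n; j++) {
                M[i][j] = (double) (prefix[j + 1] - prefix[i]) / (j - i + 1);
            }
        }
        return M;
    }
}
```

One O(n) pass builds the prefix array; the O(n^2) fill then does constant work per cell.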
I am currently learning about asymptotic analysis, however I am unsure of how many operations are in these nested loops. Would somebody be able to help me understand how to approach them? Also what would the Big-O notation be? (This is also my first time on stack overflow so please forgive any formatting errors in the code).
public static void primeFactors(int n){
    while( n % 2 == 0){
        System.out.print(2 + " ");
        n /= 2;
    }
    for(int i = 3; i <= Math.sqrt(n); i += 2){
        while(n % i == 0){
            System.out.print(i + " ");
            n /= i;
        }
    }
}
In the worst case, the first loop (while) is O(log(n)). In the second loop, the outer loop runs in O(sqrt(n)) and the inner loop runs in O(log_i(n)). Hence, the time complexity of the second loop (inner and outer in total) is:
O(sum_{i = 3}^{sqrt(n)} log_i(n))
Bounding each term by log(n), this gives O(sqrt(n) log(n)) for the mentioned algorithm.
Notice, however, that n is modified inside the inner loop, and this shrinks the sqrt(n) bound in the outer loop's condition, so the complexity of the second loop is O(sqrt(n)). Under this assumption, the time complexity of the algorithm will be O(sqrt(n)) + O(log(n)) = O(sqrt(n)).
First, we see that the first loop is really a special case of the inner loop that occurs in the other loop, with i equal to two. It was separated as a special case in order to be able to increase i with steps of 2 instead of 1. But from the point of view of asymptotic complexity the step by 2 makes no difference: it represents a constant coefficient, which we can ignore. And so for our analysis we can just rewrite the code to this:
public static void primeFactors(int n){
    for(int i = 2; i <= Math.sqrt(n); i += 1){ // note the change in start and increment value
        while(n % i == 0){
            System.out.print(i + " ");
            n /= i;
        }
    }
}
The number of times that n /= i is executed corresponds to the number of prime factors of n counted with multiplicity (non-distinct prime divisors). According to this Q&A, that number is O(loglogn). It is not straightforward to derive, so I had to look it up.
We should also consider how many times the for loop iterates. The Math.sqrt(n) boundary for i can decrease as the for loop iterates: the more divisions take place, the (much) fewer iterations the for loop has to make.
We can see that at the time that the loop exits, i has surpassed the square root of the greatest prime divisor of n. In the worst case that greatest prime divisor is n itself (when n is prime). So the for loop can iterate up to the square root of n, so O(√n) times. In that case the inner loop never iterates (no divisions).
We should thus see which is more determining, and we get O(√n + loglogn). This is O(√n).
The first loop divides n by 2 until it is no longer divisible by 2. The maximum number of times this can happen is log2(n).
The second loop, at first sight, seems to run the inner loop sqrt(n) times, where the inner loop is also O(log(n)), but that is actually not the case. Every time the while condition of the second loop is satisfied, n decreases drastically, and since the condition of a for loop is evaluated on each iteration, sqrt(n) also decreases. The worst case actually happens when the while condition is never satisfied, i.e. when n is prime.
Hence the overall time complexity is O(sqrt(n)).
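To make that worst case concrete, here is a small counting harness (my own sketch, not from the question) that tallies the loop-body executions of the algorithm above. For a prime n the count is dominated by the roughly √n/2 outer-loop iterations; for a power of two it is just the log2(n) halvings:

```java
public class PrimeFactorSteps {
    // Count the division steps and odd-candidate tests of primeFactors(n).
    static long countSteps(int n) {
        long steps = 0;
        while (n % 2 == 0) {          // at most log2(n) halvings
            steps++;
            n /= 2;
        }
        for (int i = 3; i <= Math.sqrt(n); i += 2) {
            steps++;                  // one step per odd candidate divisor
            while (n % i == 0) {
                steps++;
                n /= i;
            }
        }
        return steps;
    }
}
```

For n = 2^20 the first loop accounts for all 20 steps; for the prime 9973 the outer loop tests every odd i from 3 to 99 and the inner while never fires.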
I'm trying to understand the time complexity of insertion sort. I got stuck at the while loop: I'm unable to understand how many times it executes.
InsertionSort(A)
    for j = 2 to A.length
        key = A[j]
        i = j - 1
        while i > 0 and A[i] > key
            A[i+1] = A[i]
            i = i - 1
        A[i+1] = key
I know that the for loop executes n+1 times and every statement in the loop executes n times, and that the while loop also executes n times.
But what I don't understand is: how many times do the statements under the while loop execute, in both the worst and best cases?
In the worst case, A is sorted in descending order, which means that for the j'th entry, the inner loop will run j times (give or take a "+1" or "-1"...). Happily, there is a formula for that: as Gauss famously found out spontaneously and under duress, summing up all numbers from 1 to n yields a result of n*(n+1)/2.
As we only care about complexity and not actual values, we can leave the constant and multiplicative factors off and end up with O(n^2).
Tongue-in-cheek aside, the fact that there is a loop within a loop is a strong indication for O(n^2) when the inner loop count is bounded linearly - which it is here.
In the best case, with A already sorted in ascending order, the inner loop is never entered, and the overall complexity is O(n).
The average case depends heavily on what your expected "unorderedness" looks like. For example, the sort will perform very well if your list is basically already sorted, with only a few very local switchups.
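For reference, a direct Java translation of the pseudocode above (0-based indexing and naming are my own); the inner while loop is what drives the O(n^2) worst case:

```java
public class InsertionSortDemo {
    // Worst case (descending input): the while loop shifts j elements per pass -> O(n^2).
    // Best case (already ascending): the while body never runs -> O(n).
    static void insertionSort(int[] a) {
        for (int j = 1; j < a.length; j++) {
            int key = a[j];
            int i = j - 1;
            while (i >= 0 && a[i] > key) {
                a[i + 1] = a[i];   // shift the larger element one slot right
                i = i - 1;
            }
            a[i + 1] = key;
        }
    }
}
```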
I'm reading a book on algorithm analysis and have found an algorithm which I don't know how to get the time complexity of, although the book says that it is O(nlogn).
Here is the algorithm:
sum1 = 0;
for (k = 1; k <= n; k *= 2)
    for (j = 1; j <= n; j++)
        sum1++;
Perhaps the easiest way to convince yourself of the O(n*lgn) running time is to run the algorithm on a sheet of paper. Consider what happens when n is 64. Then the outer loop variable k would take the following values:
1 2 4 8 16 32 64
log_2(64) is 6, which is the number of terms above minus one. You can continue this line of reasoning to conclude that the outer loop takes O(lgn) running time.
The inner loop, which is completely independent of the outer loop, is O(n). Multiplying these two terms together yields O(lgn*n).
In your first loop for(k=1; k<=n; k*=2), variable k reaches the value of n in log n steps since you're doubling the value in each step.
The second loop for(j=1; j<=n; j++) is just a linear cycle, so requires n steps.
Therefore, total time is O(nlogn) since the loops are nested.
To add a bit of mathematical detail...
Let a be the number of times the outer loop for(k=1; k<=n; k*=2) runs. After a iterations, k will have the value 2^a (note the loop increment k*=2). The loop stops roughly when k reaches n, so n = 2^a. Solve for a by taking the base-2 log on both sides, and you get a = log_2(n).
Since the inner loop runs n times, total is O(nlog_2(n)).
To add to #qwerty's answer, if a is the number of times the outer loop runs:
k takes values 1, 2, 4, ..., 2^a and 2^a <= n
Taking log on both sides: log_2(2^a) <= log_2(n), i.e. a <= log_2(n)
So the outer loop has an upper bound of log_2(n), i.e. it cannot run more than log_2(n) times.
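A quick way to check these bounds (my own harness, not from the answers) is to count the sum1++ executions directly: the total is n * (⌊log2 n⌋ + 1), which is Θ(n log n):

```java
public class NestedLoopCount {
    // Returns how many times sum1++ runs: n times for each of the
    // floor(log2(n)) + 1 values taken by k (1, 2, 4, ..., up to n).
    static long count(int n) {
        long sum1 = 0;
        for (long k = 1; k <= n; k *= 2)
            for (long j = 1; j <= n; j++)
                sum1++;
        return sum1;
    }
}
```

For n = 64, k takes the 7 values listed above and the total is 64 * 7 = 448.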
I don't mean to be asking for help with something simple, but I can't seem to figure out how to answer this question.
Compute the time complexity of the following program fragment:
sum = 0;
for i=1 to n do
    for j=1 to i do
        k = n*2
        while k>0 do
            sum = sum + 1;
            k = k div 2;
I recognize that what is inside the while loop takes O(1) and that the while loop itself takes O(logn), but I don't follow how that connects to the nested for loops, since I am used to just writing nested sigma notations for for loops.
Thanks!
A formal demonstration which shows, step by step, the order of growth of your algorithm: [derivation posted as an image, omitted here]
Here are some hints on to break down this function's complexity:
Look at the inner loop, where k = n*2. Let's assume n = 8, so k = 16; k keeps being divided by 2 until it reaches 0 (integer division, so 1 div 2 = 0). So the series describing k until the end of the loop will be 16, 8, 4, 2, 1, 0. Try to think what function describes the number of elements in this series, given the first value k.
You have two nested for loops. The first loop just iterates n times; the second (inner) loop iterates until it reaches the iteration number of the first loop (represented by i), which means it iterates once, then twice, and so on up to n. So the number of iterations performed by the second loop is described by the series 1, 2, 3, ..., n. This is a very simple arithmetic progression, and the sum of this series gives you the total number of iterations of the inner for loop. This is also the number of times you run the while loop (whose cost is not affected by the current iteration, as k depends on n, which is constant, and not on i or j).
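Putting the two hints together (a sketch with my own naming): the for loops start the while loop n(n+1)/2 times, and each while run performs ⌊log2(2n)⌋ + 1 halvings, for Θ(n^2 log n) overall. Counting directly:

```java
public class FragmentCount {
    // Count executions of sum = sum + 1 in the fragment above.
    static long count(int n) {
        long sum = 0;
        for (int i = 1; i <= n; i++)
            for (int j = 1; j <= i; j++) {   // body runs n*(n+1)/2 times in total
                int k = n * 2;
                while (k > 0) {              // floor(log2(2n)) + 1 halvings
                    sum++;
                    k = k / 2;               // k div 2
                }
            }
        return sum;
    }
}
```

For n = 8, the for loops fire 36 times and each while run takes 5 steps, for 180 increments in total.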
How do I find an algorithm for calculating the sum of the values in an array?
Is it something like this?
Algorithm Array Sum
Input: nonnegative integer N, and array A[1],A[2],...,A[N]
Output: sum of the N integers in array A
Algorithm Body:
    j:=1
    sum:=0
    while j<N
        sum := sum + A[j]
        j := j+1
    end while
end Algorithm Array Sum
And how can I relate it to the running time of the algorithm using O-notation?
This is from a past year's exam and I need to revise for my exam.
Question
An array A[] holding n integer values is given.
1. Give an algorithm for calculating the sum of all the values in the array.
2. Find the simplest and best O-notation for the running time of the algorithm.
The question is to find the sum of all the values, so iterate through each element in the array and add each element to a temporary sum value.
temp_sum = 0
for i in 1 ... array.length
    temp_sum = temp_sum + array[i]
Since you need to go through all the elements in the array, this program depends linearly on the number of elements. If you have 10 elements, you iterate through 10; if you have a million, you have no choice other than to go through all million elements and add each of them. Thus the time complexity is Θ(n).
If you are finding the sum of all the elements and you don't know anything about the data, then you need to look at every element at least once; thus n is a lower bound. You also need not look at any element more than once, so n is also an upper bound. Hence the complexity is Θ(n).
However, if you know something about the elements (say you are given the first n natural numbers), you can do it in constant time using the formula n(n+1)/2. If the data you get is arbitrary, then you have no choice but to use the linear-time algorithm above.
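As a sketch of that special case (assuming the array holds exactly 1, 2, ..., n), the closed-form answer needs no loop at all:

```java
public class GaussSum {
    // Sum of the first n natural numbers in O(1) via n*(n+1)/2.
    static long sumFirstN(long n) {
        return n * (n + 1) / 2;
    }
}
```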
Since n is the size of the array and all you have to do is iterate from beginning to end, the Big-O notation is O(n).
integer N = Size_array;
array a[N]
j = 1
sum = 0
while j <= N
    sum += a[j]
    j++
end while
I think that you meant "while j <= N"; you need to specify this.
The running time shall be O(n), I think, as you have only one loop.
To calculate O for this algorithm you need to count the number of times each line of code executes. Later on you will count only the fundamental operations, but start by counting all of them.
So how many times will the j := 1 line run? How many times will the sum := 0 run?
How many times will the while loop's condition execute? The statements inside the while loop?
Sum these all up. You will notice that the value you get is something like 1 + 1 + (n+1) + n + n = 3n + 3 (the loop condition is tested one extra time when it finally fails). Thus you can conclude that it is a linear function of n.
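As a sketch of that counting exercise (using the corrected `j <= N` condition; the harness and names are my own), we can tally each executed line and see the linear total directly:

```java
public class OpCount {
    // Count every executed line of the summation loop. Including the
    // final failed test, the total is 1 + 1 + (n+1) + n + n = 3n + 3.
    static long countOps(int n) {
        long ops = 0;
        int j = 1;                 ops++;   // j := 1
        long sum = 0;              ops++;   // sum := 0
        while (true) {
            ops++;                          // one "while j <= N" test
            if (j > n) break;
            sum = sum + j;         ops++;   // sum := sum + a[j]
            j = j + 1;             ops++;   // j := j + 1
        }
        return ops;
    }
}
```

Whether you land on 3n + 2 or 3n + 3 depends only on how you count the final test; either way the function is linear, so the running time is O(n).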