What is the asymptotic running time for variance?

As you can see, I'm still pretty new to all these running time analyses and want to make sure each step I'm calculating is right. Also, I hate writing in pseudocode, so I did this in Python instead. Here goes:
def mean(n):
    sum = 0                 # cost = 1
    for i in n:             # cost = n
        sum += i            # cost = n
    return sum / len(n)     # cost = 1
Therefore the overall running time for mean (correct me if I'm wrong) = O(1) + O(n) + O(n) + O(1) = O(n).
def variance(n):
    var = 0                          # cost = 1
    for i in n:                      # cost = n
        var += (i - mean(n)) ** 2    # cost = n*n or n+n ??
    return var / len(n)              # cost = 1
The question is: what is the overall running time for variance? Can you show all the working, please?

O(N^2), since you're performing an O(N) operation (the call to mean(n)) N times.
In general, loop counts multiply when determining runtime. Had your variance loop been "for i in lg(n)", your runtime would be O(N * lg(N)), since you'd be performing an O(N) operation lg(N) times; likewise, had your inner operation been O(2^N) with an outer loop of "for i in n", your runtime would be O(N * 2^N).
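Incidentally, the quadratic cost comes entirely from recomputing the mean inside the loop. As a sketch (my own illustration, not part of the original question), hoisting the mean out of the loop brings variance down to O(n):

def mean(values):
    total = 0                     # O(1)
    for v in values:              # n iterations, O(1) body
        total += v
    return total / len(values)   # O(1)

def variance(values):
    m = mean(values)              # computed once: O(n)
    var = 0
    for v in values:              # n iterations, O(1) body since m is precomputed
        var += (v - m) ** 2
    return var / len(values)

Overall: O(n) + O(n) = O(n).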
Another common loop format is
for (int i = 0; i < N; i++) {
    for (int j = i; j < N; j++) {
        // O(1) operation
    }
}
This is still O(N^2), but the analysis is a bit trickier: you need the sum of the arithmetic series "1 + 2 + 3 + ... + (n-1) + n", which is n * (n + 1) / 2, i.e. O(N^2).
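If you want to convince yourself of the n * (n + 1) / 2 count empirically, here is a quick sketch (in Python, matching the question's language; the helper name is mine):

def count_triangular(n):
    count = 0
    for i in range(n):
        for j in range(i, n):    # the inner loop runs n - i times
            count += 1
    return count

for n in (10, 100, 1000):
    assert count_triangular(n) == n * (n + 1) // 2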

Related

What's the complexity of sum i=0 -> n (a_i * i)?

This is from a test I failed: I thought this complexity would be O(n), but it appears I'm wrong and it's O(n^2). Why not O(n)?
First, notice that the question does not ask for the time complexity of a function calculating f(n), but rather for the complexity of the function f(n) itself. You can think of f(n) as the time complexity of some other algorithm if you are more comfortable talking about time complexity.
This is indeed O(n^2), when the sequence a_i is bounded by a constant and each a_i is at least 1.
By the assumption, for all i, a_i <= c for some constant c.
Hence, a_1*1 + ... + a_n*n <= c * (1 + 2 + ... + n). Now we need to show that 1 + 2 + ... + n = Theta(n^2) to complete the proof.
1 + 2 + ... + n <= n + n + ... + n = n * n = n^2
and
1 + 2 + ... + n >= n/2 + (n/2 + 1) + ... + n >= (n/2) * (n/2) = n^2 / 4
So the complexity is actually Theta(n^2).
Note that if a_i is not bounded by a constant, e.g. a_i = i, then the result does not hold: in that case, f(n) = 1^2 + 2^2 + ... + n^2, and you can easily show (using the same method as before) that f(n) = Omega(n^3), which means it is not O(n^2).
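As a quick numerical sanity check (my own sketch, not part of the original answer), compare f(n) for a bounded sequence against a_i = i and watch the ratios settle near constants:

def f(a, n):
    # f(n) = a(1)*1 + a(2)*2 + ... + a(n)*n
    return sum(a(i) * i for i in range(1, n + 1))

for n in (100, 1000, 10000):
    bounded = f(lambda i: 1, n)   # a_i = 1, bounded by a constant
    growing = f(lambda i: i, n)   # a_i = i, unbounded
    print(n, bounded / n**2, growing / n**3)   # ratios approach 1/2 and 1/3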
Preface: I'm not super great with complexity theory, but I'll take a stab.
I think what is confusing is that it's not a time-complexity problem, but rather the function's complexity.
For the easy part, i just goes up to n, i.e. 1, 2, 3, ..., n; for a_i, all entries must be above 0, so a could be something like 2, 5, 1, ... for n entries. n terms, each multiplied by an index that goes up to n, gives O(n^2).
Even in the best case, where a is 1, 1, 1, ..., the sum is 1 + 2 + ... + n, which is still Theta(n^2), as the proof above shows.
Unless a_i itself grows with n, the function is Theta(n^2), not O(n).
Here is another attempt that takes O(n*n) time if the sum should be returned as the result:
int sum = 0;
for (int i = 0; i <= n; i++) {
    for (int j = 0; j <= n; j++) {
        if (i == j) {
            sum += A[i] * j;
        }
    }
}
return sum;

Time complexity of nested for loop with inner iteration variable dependent on the outer one

This is the loop structure:
for (int i = 1; i < n; i++) {
    for (int j = 0; j < n; j += i) {
        // do stuff
    }
}
My guess was O(n log n): it clearly cannot be O(n^2), since the increment in j keeps growing, and it clearly cannot be O(n sqrt(n)), since the increment is not that high. But I have no idea how to prove it formally.
For a given i, the inner loop runs about n/i times. Hence, the total time is n + n/2 + n/3 + ... + n/n = n(1 + 1/2 + 1/3 + ... + 1/n).
As we know, 1 + 1/2 + 1/3 + ... + 1/n is the harmonic series, which is asymptotically log(n). Hence, the algorithm runs in O(n log(n)).
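To see the n log(n) behaviour numerically, here is a small sketch (my own, not from the answer), using the fact that the harmonic sum is close to the natural logarithm:

from math import log

def count_iterations(n):
    count = 0
    for i in range(1, n):
        for j in range(0, n, i):   # about n / i iterations
            count += 1
    return count

for n in (100, 1000, 10000):
    print(n, count_iterations(n) / (n * log(n)))   # ratio settles near 1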

How do I find the worst-case time-complexity of an algorithm with an inner loop? [duplicate]

Suppose I have code with two for loops:
int sum = 0;
for (int i = 1; i < N; i *= 2)
    for (int j = 0; j < i; j++)
        sum++;
How would I find the worst-case time complexity of this code? I've looked at many tutorials on finding time complexity and I understood them. But this one just seems a bit different from those in the tutorials.
Let's do some simple maths.
The value of i in each iteration goes 1, 2, 4, 8, ..., which is (log N + 1) terms. In each iteration the inner loop runs i times, so adding up the values of i gives:
T(N) = 1 + 2 + 4 + ..., a geometric progression with a = 1, r = 2 and n = (log N + 1) terms
T(N) = a * (r^n - 1) / (r - 1)
     = (2^(log N + 1) - 1) / (2 - 1)
     = 2N - 1
     = O(N)
So the complexity is O(N) in all scenarios.
First consider the simpler case, when N = 2^k + 1 for some integer k. The outer loop then runs for i = 1, 2, 4, ..., 2^k, i.e. k + 1 iterations, and the overall number of operations is
T(N) = 1 + 2 + 4 + ... + 2^k = 2^(k+1) - 1 = 2N - 3.
So for these values of N, T(N) = Theta(N).
Now suppose 2^k + 1 < N <= 2^(k+1) for some k. Then
T(N) = 1 + 2 + 4 + ... + 2^k = 2^(k+1) - 1 < 2N.
Therefore T(N) = O(N).
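A quick empirical check of the 2N bound (a sketch of mine, not from the answer):

def count_ops(n):
    count = 0
    i = 1
    while i < n:          # i = 1, 2, 4, ..., so O(log n) outer iterations
        count += i        # the inner loop contributes i increments
        i *= 2
    return count

for n in (10, 100, 1000, 4096):
    assert count_ops(n) < 2 * n   # total is 2^(k+1) - 1 < 2N
    print(n, count_ops(n))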

Big-O complexity of algorithms

I'm trying to figure out the exact big-O value of algorithms. I'll provide an example:
for (int i = 0; i < n; i++)       // 2N + 2
{
    for (int x = i; x < n; x++)   // N * 2N + 2 ?
    {
        sum += i;                 // N
    }
}                                 // Extra N?
So if I break some of this down, int i = 0 would be O(1), i < n is N+1, i++ is N, multiply the inner loop by N:
2N + 2 + N(1 + N + 1 + N) = 2N^2 + 2N + 2N + 2 = 2N^2 + 4N + 2
Add an N for the loop termination and the sum constant, = 3N^2 + 5N + 2...
Basically, I'm not 100% sure how to calculate the exact O notation for an algorithm; my guess is O(3N^2 + 5N + 2).
What do you mean by exact? Big O is an asymptotic upper bound, so it's by definition not exact.
Thinking about i=0 as O(1) and i<n as O(N+1) is not correct. Instead, think of the outer loop as doing something n times, and for every iteration of the outer loop, the inner loop is executed at most n times. The calculation inside the loop takes constant time (the calculation is not getting more complex as n gets bigger). So you end up with O(n*n*1) = O(n^2), quadratic complexity.
When asking about "exact", you're running the inner loop from 0 to n, then from 1 to n, then from 2 to n, ... , from (n-1) to n, each time doing a constant time operation. So you do n + (n-1) + (n-2) + ... + 1 = n*(n+1)/2 = n^2/2 + n/2 iterations. To get from the exact number of calculations to big O notation, omit constants and lower-order terms, and you'll end up with O(n^2) (the 1/2 and +n/2 are omitted).
Big O gives an upper bound, which here is attained in the worst case: both loops run up to n times, i.e. n * n iterations.
So the complexity is O(n^2).

Recursion, inner loop and time complexity

Consider the following function:
int testFunc(int n) {
    if (n < 3) return 0;
    int num = 7;
    for (int j = 1; j <= n; j *= 2) num++;
    for (int k = n; k > 1; k--) num++;
    return testFunc(n / 3) + num;
}
I get that the first loop is O(log n) while the second loop is O(n), which gives a time complexity of O(n) in total. But because of the recursive calls I thought the time complexity would be O(n log n), yet apparently it is only O(n). Can anyone explain why?
The recursive call pretty much gives the following for the complexity (denoting the complexity for input n by T(n)):
T(n) = log(n) + n + T(n/3)
The first observation, as you correctly noted, is that the logarithm is dominated by n and can be ignored. Now we are left with T(n) = n + T(n/3). Try expanding this recurrence down to the base case. We have:
T(n) = n + n/3 + n/9 + ...
You can easily prove that this sum is always less than 2*n. In fact tighter bounds can be proven, but this one is enough to show that the overall complexity is O(n).
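A sketch that tallies the dominant work of the recurrence and checks it against the 2*n bound (my own illustration, not part of the answer):

def work(n):
    # models T(n) = n + T(n / 3), counting only the dominant O(n) loop
    if n < 3:                 # base case, mirroring testFunc
        return 0
    return n + work(n // 3)

for n in (10, 1000, 100000):
    print(n, work(n), work(n) < 2 * n)   # n + n/3 + n/9 + ... < 2n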
For procedures using a recursive algorithm such as the following:
procedure T(n : size of problem) defined as:
    if n < base_case then exit
    Do work of amount f(n)    // In this case, the O(n) for loop
    T(n/b)
    T(n/b)
    ... a times ...           // In this case, b = 3 and a = 1
    T(n/b)
end procedure
Applying the Master theorem to find the time complexity: f(n) here is O(n) (due to the second for loop, as you said), which makes c = 1. Now, log_b(a) = log_3(1) = 0, so c > log_b(a), which puts us in the third case of the theorem; hence the time complexity is T(n) = Theta(f(n)) = Theta(n).
