Big-O complexity in a loop - algorithm

If you had an algorithm with a loop that executed n steps the first time through, then n − 2 the second time, n − 4 the next time, and kept repeating until the last time through the loop it executed 2 steps, what would be the complexity measure of this loop?
I believe this exhibits O(n^2) complexity, as the total number of steps executed grows quadratically. I am having a hard time visualizing the loop itself, though, which makes me less confident about my answer.
Any kind of help/second opinion is greatly appreciated :)

You are correct that the complexity is Θ(n^2). This is because what you describe is an arithmetic progression:
n + (n - 2) + (n - 4) + ... + 2 (or an odd number at the end)
(which is, obviously, 2 + 4 + 6 + ... + n written in reverse, or the odd equivalent, BTW).
Using the formula for the sum, it is the average of the first and last terms times the number of terms. There are about n/2 terms with an average of about n/2, so the sum is roughly n^2/4. Each of these factors is Θ(n), and their product is Θ(n^2).
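A quick empirical check (a sketch in Python; the loop below is my own illustration of the pattern the question describes, not code from the question):
# Sketch: count the steps of a loop that does n, n-2, n-4, ..., 2 units
# of work per pass, and compare the total against n^2/4.
def count_steps(n):
    total = 0
    work = n
    while work >= 2:
        total += work  # this pass "executes `work` steps"
        work -= 2      # two fewer steps on the next pass
    return total

for n in (10, 100, 1000):
    print(n, count_steps(n), n * n / 4)  # the two counts converge in ratio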

Related

Asymptotic analysis: n log n vs. n

[image of the code being analyzed for asymptotic runtime]
I was given the problem of analyzing the runtime of this code, and I came up with N log N, since the outer loop is log N and the inner loop is N, so the loops multiply out to N log N. However, the solution says this is incorrect: the actual runtime is O(N) because the inner loop runs in O(N), but for some reason the loops don't multiply, and the log N is dropped as a lower-order term?
Can someone help me understand this? Why is it log N + N instead of log N * N?
Okay, let's count how many times the instruction sum++ is triggered.
In the first iteration of the while loop it is triggered 2^0 = 1 time,
in the second one 2^1 = 2 times, in the third 2^2 = 4 times, and so on;
in iteration number var it is triggered 2^var times.
So the overall count is
2^0 + 2^1 + 2^2 + ... + 2^var, which is equal to 2^(var + 1) - 1.
Now our problem is to find the value of var; it is obviously log(N).
So the overall count is 2^(log(N) + 1) - 1, which is O(N).
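The image of the analyzed code is not reproduced above, but a loop consistent with this analysis looks roughly like the following (a hypothetical reconstruction in Python, not the exact code from the question):
# Hypothetical reconstruction: an outer loop that doubles i (about log2(N)
# iterations) and an inner loop that runs i times, incrementing sum.
def count_sum_increments(n):
    total = 0
    i = 1
    while i <= n:           # iteration number var has i = 2^var
        for _ in range(i):  # sum++ is triggered 2^var times here
            total += 1
        i *= 2
    return total

for n in (16, 1024):
    print(n, count_sum_increments(n))  # 2n - 1 when n is a power of two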

Big-O complexity of a piece of code

I have a question from algorithm design about complexity: a piece of code is given and I should calculate its complexity.
The pseudo-code is:
for (i = 1; i <= n; i++) {
    j = i
    do {
        k = j;
        j = j / 2;
    } while (k is even);
}
I tried this algorithm for some numbers and got different results. For example, if n = 6 this algorithm's output is as below:
i = 1 -> executes 1 time
i = 2 -> executes 2 times
i = 3 -> executes 1 time
i = 4 -> executes 3 times
i = 5 -> executes 1 time
i = 6 -> executes 2 times
It doesn't follow a regular pattern; how should I calculate this?
The upper bound given by the other answers is actually too high. This algorithm has an O(n) runtime, which is a tighter upper bound than O(n log n).
Proof: Let's count how many total iterations the inner loop will perform.
The outer loop runs n times. The inner loop runs at least once for each of those.
For even i, the inner loop runs at least twice. This happens n/2 times.
For i divisible by 4, the inner loop runs at least three times. This happens n/4 times.
For i divisible by 8, the inner loop runs at least four times. This happens n/8 times.
...
So the total number of times the inner loop runs is:
n + n/2 + n/4 + n/8 + n/16 + ... <= 2n
The total number of inner-loop iterations is therefore between n and 2n, i.e. it's Θ(n).
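You can check this empirically with a direct translation of the pseudocode (a Python sketch):
# Direct translation of the pseudocode above, counting do-while passes.
def total_inner_iterations(n):
    count = 0
    for i in range(1, n + 1):
        j = i
        while True:        # do { ... } while (k is even)
            k = j
            j = j // 2
            count += 1
            if k % 2 != 0:
                break
    return count

for n in (6, 100, 10000):
    print(n, total_inner_iterations(n))  # always between n and 2n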
You always assume you get the worst scenario at each level.
Now, you iterate over an array with N elements, so we start with O(N) already.
Now let's say your i always equals X and X is always even (remember, worst case every time). How many times do you need to divide X by 2 to get 1? (Reaching 1 is the only way the division stops for even numbers.)
In other words, we need to solve the equation
X / 2^k = 1, which gives X = 2^k and k = log_2(X).
This makes our algorithm take O(n log_2(X)) steps, which can easily be written as O(n log n).
For such a loop, we cannot count the inner loop and the outer loop separately: the variables are tied together!
We thus have to count all steps.
In fact, for each iteration of the outer loop (on i), we will have
1 + v_2(i) steps,
where v_2 is the 2-adic valuation (see for example: http://planetmath.org/padicvaluation), which corresponds to the power of 2 in the prime factorization of i.
So if we add the steps for all i, we get a total number of steps of:
n_steps = \sum_{i=1}^{n} (1 + v_2(i))
        = n + v_2(n!)    // since v_2(i) + v_2(j) = v_2(i*j)
        = 2n - s_2(n)    // by Legendre's formula (see http://en.wikipedia.org/wiki/Legendre%27s_formula with `p = 2`)
We then see that the number of steps is exactly :
n_steps = 2n - s_2(n)
As s_2(n) is the sum of the digits of n in base 2, it is negligible compared to n: it is at most log_2(n), since each base-2 digit is 0 or 1 and there are at most log_2(n) digits.
So the complexity of your algorithm is equivalent to n:
n_steps = O(n)
which is not the O(n log n) stated in many other solutions but a smaller quantity!
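A small numerical check of that closed form (a sketch; s_2(n) is computed as the number of 1-bits of n):
# Each outer iteration on i costs 1 + v_2(i) steps; verify that the total
# equals 2n - s_2(n), where s_2(n) is the number of 1-bits of n.
def v2(i):
    e = 0
    while i % 2 == 0:
        i //= 2
        e += 1
    return e

for n in (6, 37, 1000):
    n_steps = sum(1 + v2(i) for i in range(1, n + 1))
    assert n_steps == 2 * n - bin(n).count("1")
print("n_steps = 2n - s_2(n) holds for all tested n")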
Let's start with the worst case:
if you keep dividing by 2 (integer division), you don't stop until you
get to 1, which makes the number of steps depend on the bit width,
something you find using the base-2 logarithm. So the inner part is log n.
The outer part is obviously n, so n log n total.
The do loop halves j until k becomes odd. k is initially a copy of j, which is a copy of i, so the do loop makes 1 + e passes, where 2^e is the largest power of 2 dividing i:
i = 1 is odd, so it makes 1 pass through the do loop,
i = 2 divides by 2 once, so 1 + 1 passes,
i = 4 divides by 2 twice, so 1 + 2 passes, etc.
That makes at most 1 + log(i) executions of the do loop (logarithm with base 2).
The for loop iterates i from 1 through n, so the upper bound is n times (1 + log n), which is O(n log n).

Time complexity of the following algorithm?

I'm learning Big-O notation right now and stumbled across this small algorithm in another thread:
i = n
while (i >= 1)
{
    for j = 1 to i    // NOTE: i instead of n here!
    {
        x = x + 1
    }
    i = i / 2
}
According to the author of the post, the complexity is Θ(n), but I can't figure out how. I think the while loop's complexity is Θ(log n), and I thought the for loop's complexity would also be Θ(log n) because the number of iterations is halved each time.
So wouldn't the complexity of the whole thing be Θ(log(n) * log(n)), or am I doing something wrong?
Edit: the segment is in the best answer of this question: https://stackoverflow.com/questions/9556782/find-theta-notation-of-the-following-while-loop#=
Imagine for simplicity that n = 2^k. How many times does x get incremented? It is easy to see that this is a geometric series:
2^k + 2^(k - 1) + 2^(k - 2) + ... + 1 = 2^(k + 1) - 1 = 2 * n - 1
So this part is Θ(n). Also, i gets halved k = log n times, which has no asymptotic effect on Θ(n).
The value of i for each iteration of the while loop, which is also the number of iterations of the for loop, is n, n/2, n/4, ..., and the overall complexity is the sum of those. That puts it at roughly 2n, which gets you your Θ(n).
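Translating the snippet and counting how often x = x + 1 runs confirms this (a sketch):
# Count the increments of x; expect roughly 2n (exactly 2n - 1 when n is
# a power of two).
def count_increments(n):
    x = 0
    i = n
    while i >= 1:
        for _ in range(i):  # for j = 1 to i
            x += 1
        i //= 2
    return x

for n in (8, 1000, 2**20):
    print(n, count_increments(n))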

Resolving the running time of a recursive relation

My algorithm is given below. I know that this algorithm has an exponential running time, but I don't know how to show that mathematically. Does anyone have any idea on this?
if (n = 1 or n = 2) then return n
else
    return 2 * RecursiveMNum(n - 1) * RecursiveMNum(n - 2)
As you can see, the complexity depends on the call 2 * RecursiveMNum(n - 1) * RecursiveMNum(n - 2), as everything else is computed in O(1).
So you can solve this using substitution:
T(n) = T(n-1) + T(n-2) < 2T(n-1)
and now
2T(n-1) < 2(2T(n-2)) < 2(2(2T(n-3))) < ... < 2^k T(n-k) < ... < 2^n T(0) = O(2^n)
T(0) = Θ(1) (base case)
So you can say it has an O(2^n) complexity generally.
As you said, for n = 1 one call to the function is made to get the answer, and the same for n = 2. The number of calls made for n = 3 is the number of calls made for n = 1 plus the number of calls made for n = 2. Thus the sequence of call counts is 1, 1, 2, 3, 5, ... This is indeed the Fibonacci sequence, and the ratio of two successive numbers in this series approaches about 1.618, the golden ratio.
Now, the ratio of successive terms in an exponential series such as a^n is a. In our case a is roughly 1.6, so the growth is indeed exponential, i.e. you can say it is O(a^n) where a is about 1.6. You can also have a look at the closed form of the nth Fibonacci number here: Nth Fibonacci, which justifies the argument as it has an exponential form.
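You can see both points, the Fibonacci-like growth of the call counts and the ~1.6 ratio, by counting the calls directly (a sketch; note this count includes the enclosing call itself, so the raw numbers differ from the 1, 1, 2, 3, ... sequence above, but they obey the same kind of recurrence):
# Count the total number of calls made by RecursiveMNum(n). The counts
# satisfy calls(n) = 1 + calls(n-1) + calls(n-2), a Fibonacci-like
# recurrence, so successive ratios approach the golden ratio ~1.618.
def calls(n):
    if n == 1 or n == 2:
        return 1
    return 1 + calls(n - 1) + calls(n - 2)

prev = calls(2)
for n in range(3, 20):
    cur = calls(n)
    print(n, cur, round(cur / prev, 3))  # ratio tends toward ~1.618
    prev = cur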

What is the complexity of an arithmetic progression?

I don't really understand how to calculate the complexity of code. I was told that I need to look at the number of actions performed on each item in my code. So when I have a loop that runs over an array based on the idea of an arithmetic progression (I want to calculate the sum from every index to the end of the array), which means the first time I pass over n cells, the second time n-1 cells, and so on, why is the complexity considered O(n^2) and not O(n)?
As I see it, n + (n-1) + (n-2) + ... + (n-c) is xn - c, in other words O(n). SO WHY am I wrong?
> As I see it, n + (n-1) + (n-2) + ... + (n-c) is xn - c, in other words O(n). SO WHY am I wrong?
Actually, that is not true. The sum of this arithmetic progression is n(n+1)/2 = O(n^2).
P.S. I have read your task: you only need one loop over the array, reusing the previous results, so you can solve it with O(n) complexity:
result[0] = 0
for i = 1 to n
    result[i] = a[i] + result[i-1]
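For the asker's actual task (the sum from every index to the end of the array), the same single-pass idea runs from the right; a sketch in Python:
# Suffix sums in one O(n) pass: result[i] = a[i] + a[i+1] + ... + a[-1].
def suffix_sums(a):
    result = [0] * len(a)
    running = 0
    for i in range(len(a) - 1, -1, -1):  # walk from the end, reusing the previous result
        running += a[i]
        result[i] = running
    return result

print(suffix_sums([3, 1, 4, 1, 5]))  # [14, 11, 10, 6, 5]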
What your code is doing is the following:
traverse the array from 1 to n,
traverse the array from 2 to n,
... and so on, until after a total of n-1 iterations
you traverse only the array's nth element.
Notice that the number of cells traversed decreases by 1 each time.
Each traversal is driven by an inner loop that starts at i, and the whole thing is wrapped in an outer loop over n.
The concrete idea for the number of actions performed on each item of the array is:
for (i = 1 to n)
    for (j = i to n)
        traverse array[j];
Hence, the complexity of your code is O(n^2), and the counts clearly form an AP: the series n + (n-1) + ... + 1 with a common difference of 1.
I hope it is clear...
The time complexity is: 1 + 2 + ... + n.
This is equal to n(n+1)/2.
For example, for n = 3: 1 + 2 + 3 = 6
and 3(4)/2 = 12/2 = 6
n(n+1)/2 = (n^2 + n) / 2 which is O(n^2) because we can remove constant factors and lower order terms.
As an arithmetic progression has a closed-form solution, computing its sum is O(1): the computation time does not depend on the number of elements.
If you were to use a loop, then it would be O(n), as the execution time would be linear in the number of elements.
You're adding up n numbers whose average value is (n/2) because they range from 1 to n. Thus n times (n/2) = n^2 / 2. We don't care about the constant multiple, so O(n^2).
You are getting it wrong somewhere! The sum of an arithmetic progression is of the order of n^2.
To clear your doubts on arithmetic progression, visit this link: http://www.mathsisfun.com/algebra/sequences-sums-arithmetic.html
And since you said you have difficulty finding the complexity of code, you can read these two links:
http://discrete.gr/complexity/
http://www.cs.cmu.edu/~adamchik/15-121/lectures/Algorithmic%20Complexity/complexity.html
They are good enough to get you going and help you understand how to find the complexity of most algorithms.
