Time complexity of the inner loop - algorithm

Can someone help me with calculating the time complexity of the inner loop? As far as I understand, the outer one will be O(n). But I have no idea how to calculate what happens inside the second one.
for (int i = 2; i < n; i++) {
    for (int j = 2; i * j < n; j++) {
    }
}

For every iteration of the outer loop, the inner loop runs about n/i times.
So the total complexity is given by:
n/2 + n/3 + n/4 + ...
= n * (1/2 + 1/3 + 1/4 + ...)
The series in parentheses is the harmonic series, whose partial sum is bounded above by ln(n).
Hence the complexity of this code is O(n log n).
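As a sanity check on the bound above, here is a small counting harness (the class and method names are mine, not from the question) that tallies the inner-loop iterations of the original snippet and compares the total against the n * ln(n) upper bound:

```java
// Empirically count the inner-loop iterations of the nested loop
// and compare the total against the n * ln(n) bound derived above.
public class HarmonicLoop {
    static long countIterations(int n) {
        long count = 0;
        for (int i = 2; i < n; i++) {
            for (int j = 2; (long) i * j < n; j++) {
                count++; // one unit of inner-loop work
            }
        }
        return count;
    }

    public static void main(String[] args) {
        int n = 1_000_000;
        long actual = countIterations(n);
        double bound = n * Math.log(n); // the O(n log n) upper bound
        System.out.println(actual + " <= " + (long) bound);
    }
}
```

For any concrete n the measured count stays below n * ln(n), consistent with the harmonic-series argument.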

The inner loop runs j from 2 up to, but not including, n/i, so it executes roughly n/i - 2 times.
Summing over the n - 2 iterations of the outer loop, we get:
(n/2 - 2) + (n/3 - 2) + ... + (3 - 2)
The dominant part is the harmonic series scaled by n, which sums to approximately n * log_e(n). So in terms of time complexity, this becomes O(n log n).

The loop exits as soon as i * j ≥ n, i.e. when j = ceiling(n / i) ≈ n / i. As it starts from j = 2, the number of iterations is ceiling(n / i) - 2.


Time Complexity - While loop divided by 2 with for loop nested

I am stuck on a review question for my upcoming midterms, and any help is greatly appreciated.
Please see function below:
void george(int n) {
int m = n; //c1 - 1 step
while (m > 1) //c2 - log(n) steps
{
for (int i = 1; i < m; i++) //c3 - log(n)*<Stuck here>
int S = 1; //c4 - log(n)*<Stuck here>
m = m / 2; //c5 - (1)log(n) steps
}
}
I am stuck on the inner for loop since i is incrementing and m is being divided by 2 after every iteration.
If m = 100:
1st iteration m = 100: loop would run 100, i iterates 100 times + 1 for last check
2nd iteration m = 50: loop would run 50 times, i iterates 50 times + 1 for last check
..... and so on
Would this also be considered log(n) since m is being divided by 2?
The external loop executes log(n) times.
The internal loop executes n + n/2 + n/4 + ... + 1 ≈ 2n times in total (a geometric progression sum).
Overall time is O(n + log(n)) = O(n).
Note: if we replace i < m with i < n in the inner loop, we obtain O(n*log(n)) complexity, because in that case the inner loops perform n + n + n + ... + n operations, where the number of summands is log(n).
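To see the geometric-sum bound concretely, here is a sketch (the class name and counting harness are mine) that counts how many times the inner-loop body of george() executes:

```java
// Count how many times the inner-loop body of george() executes,
// to check the geometric-sum argument n + n/2 + n/4 + ... < 2n.
public class GeorgeCount {
    static long innerSteps(int n) {
        long steps = 0;
        int m = n;
        while (m > 1) {
            for (int i = 1; i < m; i++) {
                steps++; // corresponds to "int S = 1;"
            }
            m = m / 2;
        }
        return steps;
    }

    public static void main(String[] args) {
        System.out.println(innerSteps(100));          // 99 + 49 + 24 + 11 + 5 + 2 = 190
        System.out.println(innerSteps(1 << 20) < 2L * (1 << 20)); // bounded by 2n
    }
}
```

For n = 100 this matches the hand trace in the question (99, then 49, then 24, ...), and the total never exceeds 2n.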

Is this loop O(nlog(n))?

I have a nested for loop that I am trying to analyze the efficiency of. The loop looks like this:
int n = 1000;
for (int i = 0; i < n; i++) {
for (int j = 0; j < i; j++) {
System.out.print("*");
}
}
I don't believe that this algorithm is O(n^2) because the inner loop does not run n times, it only runs i times. However, it certainly is not O(n). So I hypothesize that it must be between the two efficiencies, which gives O(nlog(n)). Is this accurate or is it really a O(n^2) algorithm and I'm misunderstanding the effect the inner loop has on the efficiency?
Your algorithm will run a triangular number of times:
n * (n + 1) / 2
In the above case, n = 999 because the first j loop doesn't run:
(999 * 1000) / 2 = 499500
It is lower than n^2, but it is still O(n^2), because n * (n + 1) / 2 = n^2/2 + n/2. When n is large, n/2 is negligible compared to n^2/2, and the constant 1/2 factor can also be ignored.
I understand your doubts, but try to think of it this way: what value will i have in the worst case? The answer is n - 1, right? Since complexity is evaluated by considering the worst case, it turns out to be O(n^2), as n * (n - 1) ≈ n^2.
The number of iterations is sum from i=0 to n-1 (sum from j=0 to i-1 (1)). The inner sum is obviously equal to i. sum from i=0 to n-1 (i) = n * (n-1) / 2 = O(n^2) is well known.
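A quick way to confirm the triangular count is to replace the print with a counter (the class name is mine) and compare against n * (n - 1) / 2:

```java
// Count the iterations of the nested loop and compare with n * (n - 1) / 2.
public class TriangularCount {
    static long countStars(int n) {
        long count = 0;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < i; j++) {
                count++; // stands in for System.out.print("*")
            }
        }
        return count;
    }

    public static void main(String[] args) {
        int n = 1000;
        System.out.println(countStars(n)); // 999 * 1000 / 2 = 499500
    }
}
```

The count grows quadratically in n, which is exactly the O(n^2) behavior described above.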

Time complexity of following code with explanation?

What is the time complexity of this algorithm, and why?
int count = 0;
for (int i = N; i > 0; i /= 2) {
for (int j = 0; j < i; j++) {
count += 1;
}
}
The correct answer is O(n), but I am getting O(nlogn). Can anyone tell me why it's O(n)?
On the first iteration, the inner loop executes N times, then N/2 times, then N/4 times, and so on. The total is bounded above by the infinite sum:
N + N/2 + N/4 + N/8 + N/16 + N/32 + ...
If you factor out the N from each term, you get:
N * (1 + 1/2 + 1/4 + 1/8 + 1/16 + 1/32 + ...)
The infinite sequence in parentheses converges on the value 2 (more info on Wikipedia). Therefore, the number of operations can be simplified to:
N * 2
In terms of Big-O, the asymptotic value is within:
O(N)
You can check this by observing that the relationship in the output between N and count is linear: Ideone Demo
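In the same spirit as the linked demo, here is a self-contained version (the class name is mine) that reproduces the loop and checks the 2N bound:

```java
// Reproduce the halving loop and verify that count stays below 2 * N,
// as predicted by the geometric series N + N/2 + N/4 + ... < 2N.
public class HalvingCount {
    static long run(int n) {
        long count = 0;
        for (int i = n; i > 0; i /= 2) {
            for (int j = 0; j < i; j++) {
                count += 1;
            }
        }
        return count;
    }

    public static void main(String[] args) {
        int n = 1_000_000;
        System.out.println(run(n) < 2L * n); // geometric sum < 2N
    }
}
```

Doubling N roughly doubles the count, which is the linear relationship the answer describes.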

Big Theta complexity for simple algorithm

I have the following code and have to determine the big theta complexity
for i =1 to n do
for j = 1 to n do
k=j
while k<= n do
k = k*3
end while
end for
end for
It's easy to see that the first two for-loops run n times each, but the while loop is throwing my off. The first time it runs log3(n) times, but after that i can't really tell.
Anyone who can help?
Let T be the run time. It is clear that T is Ω(n²). Using Stirling's approximation, ln n! = n ln n - n + O(ln n), we can expand the sum:
T = ∑i ∑j ⌈log₃(n/j)⌉ = n * O(∑j (ln n - ln j + 1)) = n * O(n ln n - ln n! + n) = n * O(n ln n - (n ln n - n + O(ln n)) + n) = O(n²)
Thus T = Θ(n²)
Solution without using heavy-weight math:
Turn the problem on its head: instead of thinking about the first time the inner loop runs, think about the last time: it runs only once. In fact, the innermost loop runs only once for most values of j.
It runs once when j > n/3, that is, for 2n/3 values of j
It runs twice when n/9 < j <= n/3, that is, for 2n/9 values of j
It runs 3 times when n/27 < j <= n/9, that is, for 2n/27 values of j
It runs 4 times when n/81 < j <= n/27, that is, for 2n/81 values of j
...
The total number of times the innermost loop runs is going to be
1 * 2n/3 + 2 * 2n/9 + 3 * 2n/27 + 4 * 2n/81 + ...
= 2n(1/3 + 2/9 + 3/27 + ... )
< 2n Sum[k/3^k, for k=1 to infinity]
It's easy to see that the series Sum[k/3^k] converges (ratio test). Therefore the j-loop runs in O(n) time, and the entire thing in O(n²) time.
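To check this without the math, here is a counting harness (the class name and constants are mine) that tallies the while-loop steps and compares the total against a small constant times n²:

```java
// Count the total number of while-loop steps in the triple loop and
// verify the total stays within a constant factor of n^2.
public class ThetaN2 {
    static long totalSteps(int n) {
        long steps = 0;
        for (int i = 1; i <= n; i++) {
            for (int j = 1; j <= n; j++) {
                long k = j;
                while (k <= n) {
                    k = k * 3;
                    steps++; // one execution of the while-loop body
                }
            }
        }
        return steps;
    }

    public static void main(String[] args) {
        int n = 2000;
        // The ratio steps / n^2 should stay below the series bound (~1.5).
        System.out.println((double) totalSteps(n) / ((double) n * n));
    }
}
```

The ratio settles around the constant given by the series 2 * Sum[k/3^k] = 1.5, confirming Θ(n²).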

Analyzing worst case order-of-growth

I'm trying to analyze the worst case order of growth as a function of N for this algorithm:
for (int i = N*N; i > 1; i = i/2)
for (int j = 0; j < i; j++) {
total++;
}
What I'm trying is to analyze how many times the line total++ will run by looking at the inner and outer loops. The inner loop should run (N^2)/2 times. The outer loop I don't know. Could anyone point me in the right direction?
The statement total++; runs the following number of times:
= N^2 + N^2 / 2 + N^2 / 4 + ... + N^2 / 2^k
= N^2 * ( 1 + 1/2 + 1/4 + ... + 1/2^k )
The number of terms in the above expression is log(N^2) = 2 log(N).
Hence the sum of the series = N^2 * (1 - 1/2^(2 log N)) / (1/2)
= 2 * N^2 * (1 - 1/N^2) ≈ 2 * N^2.
Hence the order of complexity is O(N^2).
The outer loop runs with a complexity of log(N), as the value is halved on every iteration, as in a binary search.
The outer loop runs exactly 2*log₂(N) + 1 times (truncated to an integer), since the value of i decreases as N^2, N^2/2, N^2/4, ..., 1.
So the total number of times total++ runs is:
Summation (x from 0 to int(2*log₂(N) + 1)) of N^2 / 2^x
For this question, the inner loop depends on a variable that is changed by the outer loop, so you can't solve it simply by multiplying the iteration counts of the inner and outer loops. You have to write out the values, figure out the series, and then solve the series to get the answer.
In your question, total++ will run
n^2 + n^2/2 + n^2/2^2 + n^2/2^3 + ...
times. Taking n^2 common, we get
n^2 * [ 1 + 1/2 + 1/2^2 + 1/2^3 + ... ]
The bracketed geometric series sums to at most 2, so the total is O(n^2).
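The series can be checked empirically with a small harness (the class name is mine) that counts the total++ executions and tests the 2*N^2 bound:

```java
// Count how many times total++ executes and check the geometric-series
// bound N^2 + N^2/2 + N^2/4 + ... < 2 * N^2.
public class WorstCaseGrowth {
    static long countTotal(int n) {
        long total = 0;
        for (long i = (long) n * n; i > 1; i = i / 2) {
            for (long j = 0; j < i; j++) {
                total++;
            }
        }
        return total;
    }

    public static void main(String[] args) {
        int n = 1000;
        System.out.println(countTotal(n) < 2L * n * n); // geometric sum < 2 * N^2
    }
}
```

The total approaches 2 * N^2 but never exceeds it, matching the series analysis above.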
