What is the time complexity of the following nested dependent loops?

for (int i = 2; i < N; i++)
    for (int j = 1; j < N; j = j * i)
        sum += 1;
I got a total running time of the sum of logᵢ(N) over i from 2 to N-1. Can we generalize it further?

Using the change-of-base identity for logarithms, logᵢ(N) = log N / log i, we can take log N out as a common factor, and the summation becomes a sum of 1/log i. Approximating this sum by the integral of 1/log x (the logarithmic integral), it is asymptotically Θ(N/log N), per Wikipedia. Multiplying back the factor of log N we took out gives a final result of O(N).
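As a sanity check (a small counting harness I added; it is not part of the original question or answer), we can count the inner-loop executions directly and watch count/N settle near a constant, which is what O(N) predicts:
#include <stdio.h>

int main(void) {
    /* Count how often the inner loop body of the snippet above executes. */
    for (long N = 1000; N <= 1000000; N *= 10) {
        long count = 0;
        for (long i = 2; i < N; i++)
            for (long j = 1; j < N; j = j * i)
                count++;
        /* If the O(N) analysis is right, count/N stays roughly constant. */
        printf("N = %8ld  count = %9ld  count/N = %.3f\n",
               N, count, (double)count / N);
    }
    return 0;
}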

Order of growth for loop worst-case running time

I am having trouble with this. The inner loop depends on the outer loop, and from trying out values of n, the loop body runs 1 + 2 + 4 + ... + sqrt(n) times. Any help would be greatly appreciated!
int sum = 0;
for (int k = 1; k*k <= n; k = k*2)
    for (int j = 0; j < k; j++)
        sum++;
If K is the largest power of 2 with K*K <= n, then your sum is 1 + 2 + 4 + 8 + ... + K = 2K - 1.
K is clearly less than or equal to sqrt(n), but it's also greater than sqrt(n)/4 (because if it weren't, then 2K*2K would be less than or equal to n, contradicting the fact that K is the largest power of 2 with K*K <= n).
So sqrt(n)/4 < K <= sqrt(n), and your runtime (2K - 1) is between sqrt(n)/2 - 1 and 2*sqrt(n) - 1, and thus the complexity is Θ(sqrt(n)).
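A quick empirical check (my own sketch, not part of the original answer) confirms the bound; count/sqrt(n) stays boxed between small constants, as derived above:
#include <stdio.h>
#include <math.h>

int main(void) {
    for (long n = 100; n <= 100000000; n *= 10) {
        /* Count the executions of sum++ in the loop pair above. */
        long count = 0;
        for (long k = 1; k * k <= n; k = k * 2)
            for (long j = 0; j < k; j++)
                count++;
        /* Theta(sqrt(n)) predicts this ratio stays within constant bounds. */
        printf("n = %10ld  count = %7ld  count/sqrt(n) = %.3f\n",
               n, count, count / sqrt((double)n));
    }
    return 0;
}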

Find the Big O time complexity of the code

I am fairly familiar with simple constant, linear, and quadratic time complexities. In a simple code segment like:
int i = 0;
i = i + 1;
this is constant time, so O(1). And in:
for (i = 0; i < N; i++)
This is linear: the loop condition is evaluated N+1 times and the body runs N times, but for Big-O time complexities we drop the constant, so it's just O(N). In nested for loops:
for (i = 0; i < N; i++)
    for (j = 0; j < N; j++)
I get how we multiply n+1 by n and reach a time complexity of O(N^2). My issue is with slightly more complex versions of this. So, for example:
S = 0;
for (i = 0; i < N; i++)
    for (j = 0; j < N*N; j++)
        S++;
In such a case, would I be multiplying n+1 by the inner loop's time complexity, which I presume is n^2? So the time complexity would be O(n^3)?
Another example is:
S = 0;
for (i = 0; i < N; i++)
    for (j = 0; j < i*i; j++)
        for (k = 0; k < j; k++)
            S++;
In this case, I expanded it and wrote it out, and realized that the middle loop seems to run on the order of n*n iterations, and the innermost loop runs up to j times, where j is also on the order of n*n. So in that case, would I be multiplying (n+1) * n^2 * n^2, which would give me O(n^5)?
Also, I am still struggling to understand what kind of code has logarithmic time complexity. If someone could give me an algorithm or segment of code that runs in log(n) or n log(n) time and explain it, that would be much appreciated.
All of your answers are correct.
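For instance, a quick counting harness (a hypothetical check I added, not from the original posts) confirms both answers empirically; s1 equals N^3 exactly, and s2/N^5 approaches the constant 1/10:
#include <stdio.h>

int main(void) {
    for (long N = 10; N <= 40; N *= 2) {
        /* First example: S is exactly N * N^2 = N^3. */
        long s1 = 0;
        for (long i = 0; i < N; i++)
            for (long j = 0; j < N * N; j++)
                s1++;

        /* Second example: S grows as Theta(N^5). */
        long s2 = 0;
        for (long i = 0; i < N; i++)
            for (long j = 0; j < i * i; j++)
                for (long k = 0; k < j; k++)
                    s2++;

        printf("N = %3ld  s1/N^3 = %.3f  s2/N^5 = %.4f\n",
               N, (double)s1 / ((double)N * N * N),
               (double)s2 / ((double)N * N * N * N * N));
    }
    return 0;
}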
Logarithmic time complexity typically occurs when you're reducing the size of the problem by a constant factor on every iteration.
Here's an example:
for (int i = N; i > 0; i /= 2) { /* ... do something ... */ }
In this for-loop, we're dividing the problem size by 2 on every iteration. We'll need approximately log_2(n) iterations prior to terminating. Hence, the algorithm runs in O(log(n)) time.
Another common example is the binary search algorithm, which searches a sorted interval for a value. In this procedure, we remove half of the values on each iteration (once again, we're reducing the size of the problem by a constant factor of 2). Hence, the runtime is O(log(n)).
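For concreteness, here is a sketch of an iterative binary search in C (a standard formulation; the function name and test values are my own, not from the original answer):
#include <stdio.h>

/* Returns the index of target in the sorted array a[0..n-1], or -1 if absent.
   Each iteration halves the remaining interval, so it runs in O(log n). */
int binary_search(const int a[], int n, int target) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;   /* avoids overflow of (lo + hi) / 2 */
        if (a[mid] == target)
            return mid;
        else if (a[mid] < target)
            lo = mid + 1;               /* discard the lower half */
        else
            hi = mid - 1;               /* discard the upper half */
    }
    return -1;
}

int main(void) {
    int a[] = {1, 3, 5, 7, 9, 11};
    printf("%d\n", binary_search(a, 6, 7));   /* prints 3 */
    printf("%d\n", binary_search(a, 6, 4));   /* prints -1 */
    return 0;
}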

What is the time complexity of this code in Big-O, and how is it derived?

int i, j, k = 0;
for (i = n/2; i <= n; i++) {
    for (j = 2; j <= n; j = j * 2) {
        k = k + n/2;
    }
}
I came across this question and this is what I think.
The outer loop will run N/2 times and the inner loop will run log N times, so it should be (N/2) * log N. But this is not the correct answer.
The correct answer is O(N log N); can anybody tell me what I am missing?
Any help would be appreciated.
Let's take a look at this block of code.
First of all, notice that the inner loop doesn't depend on the outer one, so its complexity is the same on every iteration of the outer loop.
for (j = 2; j <= n; j = j * 2) {
    k = k + n/2;
}
The complexity of this loop on its own is O(log n), since j doubles on every step until it exceeds n.
Now we need to work out how many times this loop is performed, so let's take a look at the outer loop:
for (i = n/2; i <= n; i++) {
and find that there will be about n/2 iterations, which is O(n) in Big-O notation.
Combine these complexities and you'll see that the O(log n) loop is performed O(n) times, so the total complexity is O(n) * O(log n) = O(n log n).
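To make the combination concrete, here is a counting harness (my sketch, not from the original answer); the body executes roughly (n/2) * log2(n) times, so the printed ratio hovers near 0.5:
#include <stdio.h>
#include <math.h>

int main(void) {
    for (long n = 1000; n <= 1000000; n *= 10) {
        /* Count how often k = k + n/2 executes in the code above. */
        long count = 0;
        for (long i = n / 2; i <= n; i++)
            for (long j = 2; j <= n; j = j * 2)
                count++;
        printf("n = %8ld  count = %9ld  count/(n log2 n) = %.3f\n",
               n, count, count / (n * log2((double)n)));
    }
    return 0;
}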

What is the asymptotic running time of the following piece of code?

if (N % 2 == 0)   // N is even
    for (int i = 0; i < N; i = i+1)
        for (int j = 0; j < N; j = j+1)
            A[i] = j;
else              // N is odd
    for (int i = 0; i < N; i = i+1)
        A[i] = i;
If N is even, we see the running time is O(n^2); when N is odd, the running time is O(n). But I can't determine what the overall asymptotic running time is.
The possible answers are:
~ O(n)
~ O(n^2)
~ O(N * sqrt(N))
~ O(n log n)
There isn't a simple function you can use to asymptotically tightly bound the runtime. As you noted, the runtime oscillates between linear and quadratic at each step. You can say that the runtime is O(n^2) and Ω(n), but without defining a piecewise function you can't give a Θ bound here.

Theta runtime of a triple loop that seemingly does much less than n^3 work

I was looking at a programming question today and had an issue finding its theta runtime. Basically, within my question, I form the following loop structure:
for (int i = 0; i < n; i++)
    for (int j = i + 1; j < n; j++)
        for (int k = j + 1; k < n; k++)
            // check some condition
By obvious inspection, it is O(n^3). More accurately, it is o(n^3). However, I want to know the theta runtime of this. If you examine this loop, the actual number of times the inner condition executes is n!/(3!(n-3)!), since it evaluates all combinations of 3 of the n indices without repetition.
Is there a way to express the theta runtime in polynomial form other than n choose r?
For example, the runtime of selection sort (similar, but with only two for loops) can be evaluated by looking at the number of instructions executed: n + (n-1) + (n-2) + ... + 1, which simplifies to n(n+1)/2.
n!/(3!(n-3)!) = n(n-1)(n-2)/3! = (n^2 - n)(n - 2)/6 = (n^3 - 2n^2 - n^2 + 2n)/6 = (n^3 - 3n^2 + 2n)/6
You can show easily (1) that for large enough values of n:
1/12 n^3 < (n^3 - 3n^2 + 2n)/6 < 2n^3
So when it comes to asymptotic notation, it is in Theta(n^3), and NOT in o(n^3).
(1) One way to show it is:
lim (1/12 n^3) / ((n^3 - 3n^2 + 2n)/6) as n -> infinity = 1/2 < infinity
And similarly for the other inequality.
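As an empirical cross-check (my addition, not part of the original answer), the loop body executes exactly n choose 3 times, and count/n^3 approaches the constant 1/6:
#include <stdio.h>

int main(void) {
    for (long n = 10; n <= 1000; n *= 10) {
        long count = 0;
        for (long i = 0; i < n; i++)
            for (long j = i + 1; j < n; j++)
                for (long k = j + 1; k < n; k++)
                    count++;
        long expected = n * (n - 1) * (n - 2) / 6;   /* n choose 3 */
        printf("n = %5ld  count = %12ld  expected = %12ld  count/n^3 = %.4f\n",
               n, count, expected, (double)count / ((double)n * n * n));
    }
    return 0;
}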
