What can be the time complexity of this code?

I am kind of confused by this. I know that if the outer loop runs while i < n and the inner loop while j < i, the complexity is O(n^2). But if we raise the limits from n and i to n^2 and i^2 respectively, does the complexity double to O(n^4), or does it become cubic, O(n^3)?
for (long i = 1; i < n*n; i++)
{
    for (long j = 1; j < i * i; j++)
    {
        //some code
    }
}

Assuming that //some code takes O(1) operations, the time complexity is O(n^6). Since the inner loop performs i^2 - 1 iterations, you can use the sum-of-squares formula (or equivalently use Wolfram Alpha) to get the total: the sum of (i^2 - 1) for i from 1 to n^2 - 1, which is Θ((n^2)^3) = Θ(n^6).


Time complexity of three nested loops?

What is the time complexity for this piece of code?
for (int i = n; i > 0; i--) {
    for (int j = 1; j < n; j *= 2) {
        for (int k = 0; k < j; k++) {
            ... // constant number C of operations
        }
    }
}
I find that the two inner loops create a time complexity of O(n log n) (?). Together with the outer loop, this gives a time complexity of O(n^2 log n) for the whole piece of code (?).
According to the answer, the result should be O(n^2) and not O(n^2 * logn).
Can someone help me understand why?
You're right that the first for loop runs O(N) times. Let's look at the inner two.
Looking at j, it takes the values 1, 2, 4, 8, 16, ..., up to n. This means the inner k loop will run
1 + 2 + 4 + ... + n
times in total, or written another way
2^0 + 2^1 + 2^2 + ... + 2^log(n)
This geometric sum comes to 2^(log(n) + 1) - 1, which is O(n).
So multiplying the inner pair of loops by the outer loop, we get O(n) * O(n) = O(n^2).

Find the Big O time complexity of the code

I am fairly familiar with simple time complexity regarding constant, linear, and quadratic time complexities. In simple code segments like:
int i = 0;
i + 1;
This is constant. So O(1). And in:
for (i = 0; i < N; i++)
This is linear: the loop body executes n times (the condition is checked n+1 times), but for Big-O we drop the constants, so it is just O(N). In nested for loops:
for (i = 0; i < N; i++)
for (j = 0; j < N; j++)
I get how we multiply n+1 by n and reach a time complexity of O(N^2). My issue is with slightly more complex versions of this. So, for example:
S = 0;
for (i = 0; i < N; i++)
    for (j = 0; j < N*N; j++)
        S++;
In such a case, would I be multiplying the outer loop's n iterations by the inner loop's time complexity, which I presume is n^2? So the time complexity would be O(n^3)?
Another example is:
S = 0;
for (i = 0; i < N; i++)
    for (j = 0; j < i*i; j++)
        for (k = 0; k < j; k++)
            S++;
In this case, I expanded it and wrote it out, and realized that the middle for loop seems to run with about n*n iterations, and the innermost for loop at the pace of j, which is also up to n*n. So in that case, would I be multiplying n by n^2 by n^2, which would give me O(n^5)?
Also, I am still struggling to understand what kind of code has logarithmic time complexity. If someone could give me an algorithm or segment of code that performs at log(n) or n log(n) time complexity, and explain it, that would be much appreciated.
All of your answers are correct.
Logarithmic time complexity typically occurs when you're reducing the size of the problem by a constant factor on every iteration.
Here's an example:
for (int i = N; i > 0; i /= 2) { ... do something ... }
In this for-loop, we're dividing the problem size by 2 on every iteration. We'll need approximately log_2(n) iterations prior to terminating. Hence, the algorithm runs in O(log(n)) time.
Another common example is the binary search algorithm, which searches a sorted interval for a value. In this procedure, we remove half of the values on each iteration (once again, we're reducing the size of the problem by a constant factor of 2). Hence, the runtime is O(log(n)).

What is the time complexity of this code, and how is it derived? (Big-O)

int i, j, k = 0;
for (i = n/2; i <= n; i++) {
    for (j = 2; j <= n; j = j * 2) {
        k = k + n/2;
    }
}
I came across this question and this is what I think.
The outer loop will run N/2 times and the inner loop will run log N times, so it should be N/2 * log N. But this is not the correct answer.
The correct answer is O(NlogN), can anybody tell me what I am missing?
Any help would be appreciated.
Let's take a look at this block of code.
First of all, notice that the inner loop doesn't depend on the outer one, so its complexity is the same on every iteration.
for (j = 2; j <= n; j = j * 2) {
    k = k + n/2;
}
Since j doubles until it exceeds n, the complexity of this loop is O(log n).
Now we need to work out how many times this loop is performed, so we look at the external loop
for (i = n/2; i <= n; i++) {
and find that it makes about n/2 iterations, or O(n) in Big-O notation.
Combine these complexities: the O(log n) loop is performed O(n) times, so the total complexity is O(n) * O(log n) = O(n log n).

Complexity of two nested loops

I have this algorithm (in c code for convenience):
int algo(int *T, int size)
{
    int n = 0;
    for (int i = 1; i < size; i++) {
        for (int j = i; j < size; j++) {
            n += (T[i] * T[j]);
        }
    }
    return n;
}
What is this algorithm's time complexity?
My bet is that it is n * log(n), since we have two nested iterations, one over the whole length size and the second over size - i, but I am not sure.
A formal proof of the complexity is welcome!
This is an O(N^2) algorithm.
The first iteration of the outer loop runs the inner loop N-1 times.
The second iteration of the outer loop runs it N-2 times.
The third iteration of the outer loop runs it N-3 times.
...
The last iteration of the outer loop runs it 1 time.
The total number of iterations is (N-1)+(N-2)+...+1, which is the sum of an arithmetic progression. The formula for computing this sum is N*(N-1)/2, which is O(N^2).

What is the time complexity of the following method in terms of Big-O notation?

Help me work out the time complexity of the method below:
void method(int n, int[] array)
{
    int i = 0, j = 0;
    for (; i < n; ++i)
    {
        while (j < n && array[i] < array[j])
        {
            j++;
        }
    }
}
The runtime is O(n).
In some iterations of the outer loop the inner loop may advance several times, and in others not at all, but since j is never reset, it is incremented at most n times in total. Because this is independent of when (for which values of i) it happens, you can count O(n) for the outer loop plus O(n) for the at most n increments of j: O(n) + O(n) = O(n).
This is in contrast to the typical "loop inside a loop", which performs n iterations of the inner loop for every iteration of the outer loop and is therefore O(n) * O(n) = O(n^2).
