Big-Oh Notation problem

I think the Big-O complexity is O(n^2), but I'm not too sure.
for (int i = 0; i < n - 1; i++) {
    for (int j = 0; j < n - 1; j++)
        if (x[j] > x[j+1]) {
            temp = x[j];
            x[j] = x[j+1];
            x[j+1] = temp;
        }
}

You are doing roughly N * (N * 4) operations (the inner body does about four operations per iteration), which is O(N^2).

Yes, it's O(n^2). Ignoring the constants: the outer loop runs n - 1 times, and the inner loop runs n - 1 times for each iteration of the outer loop.
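As a quick sanity check, here is a minimal sketch of my own (assuming a plain C++ reading of the snippet, with x holding n ints) that counts the comparisons; the count comes out to (n-1)*(n-1), which grows as n^2.

#include <iostream>
#include <vector>

int main() {
    for (int n : {8, 64, 512}) {
        std::vector<int> x(n);
        for (int i = 0; i < n; i++) x[i] = n - i;     // reverse order: worst case for swaps
        long long comparisons = 0;
        for (int i = 0; i < n - 1; i++) {
            for (int j = 0; j < n - 1; j++) {
                comparisons++;                         // one comparison per inner iteration
                if (x[j] > x[j + 1]) {
                    int temp = x[j];
                    x[j] = x[j + 1];
                    x[j + 1] = temp;
                }
            }
        }
        std::cout << "n=" << n << "  comparisons=" << comparisons
                  << "  (n-1)^2=" << 1LL * (n - 1) * (n - 1) << '\n';
    }
}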

Related

What's the time complexity of this function with a nested 'for' loop?

Consider:
int bar(int a, int n) {
    for (int i = 0; i < n; ++i)
        for (int j = 0; j < i; ++j)
            a = a * (i + j);
    return a;
}
Find the time complexity for the above function.
The time complexity is O(n^2).
The outer loop runs for i = 0 to n-1, and for each i the inner loop runs i times, so the body executes 0 + 1 + 2 + ... + (n-1) = n*(n-1)/2 times, which is O(n^2).
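A small sketch of my own (with n passed in explicitly, since the original pseudocode leaves it free) that counts the inner-loop iterations confirms the n*(n-1)/2 figure:

#include <iostream>

// Counts how many times the body "a = a * (i + j)" would execute.
long long barIterations(int n) {
    long long count = 0;
    for (int i = 0; i < n; ++i)
        for (int j = 0; j < i; ++j)
            ++count;
    return count;
}

int main() {
    for (int n : {10, 100, 1000})
        std::cout << "n=" << n << "  iterations=" << barIterations(n)
                  << "  n*(n-1)/2=" << 1LL * n * (n - 1) / 2 << '\n';
}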

Time complexity question: 3 loops + if statement

I have some trouble finding the time complexity of the code below. I figured that the if statement will run approximately n times; however, I could not manage to describe it mathematically. Thanks in advance.
int sum = 0;
for (int i = 1; i < n; i++) {
    for (int j = 1; j < i*i; j++) {
        if (j % i == 0) {
            for (int k = 0; k < j; k++) {
                sum++;
            }
        }
    }
}
Outer loop
Well, it's clear that it's O(n) because i is bounded by n.
Inner loops
If we take a look at the second loop alone, then it looks as follows:
...
for (int j = 1; j < i*i; j++) {
...
j is bounded by i*i, which is at most n^2.
However, the innermost loop is not executed for every j, only for the values of j that are divisible by i, because that is what the condition j % i == 0 checks. Since j ranges up to i*i, there are only about i such values. For each of them the innermost loop runs j times, so the two inner loops together do about i + 2i + ... + (i-1)*i ≈ i^3/2 iterations, i.e. the work is bounded by i^3, or simply n^3.
Result
Summing this over the outer loop gives roughly 1^3 + 2^3 + ... + n^3, which is Θ(n^4). Hence, the overall time complexity is O(n^4).
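To see the n^4 growth empirically, here is a sketch of my own (plain C++) that counts the sum++ executions and compares them against n^4; the ratio settles near a constant (about 1/8), which is consistent with O(n^4):

#include <iostream>

int main() {
    for (int n : {50, 100, 200}) {
        long long sum = 0;
        for (int i = 1; i < n; i++)
            for (int j = 1; j < i * i; j++)
                if (j % i == 0)
                    for (int k = 0; k < j; k++)
                        sum++;
        double ratio = double(sum) / (double(n) * n * n * n);
        std::cout << "n=" << n << "  sum=" << sum
                  << "  sum/n^4=" << ratio << '\n';   // approaches ~1/8
    }
}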

Worst-case running time using Big-Θ notation

I understand that the innermost for loop is Θ(log n), and that the two outermost for loops together are Θ(n^2) because they form an arithmetic sum. The if statement is my main problem. Does anyone know how to solve this?
int tally = 0;
for (int i = 1; i < n; i++)
{
    for (int j = i; j < n; j++)
    {
        if (j % i == 0)
        {
            for (int k = 1; k < n; k *= 2)
            {
                tally++;
            }
        }
    }
}
Edit:
Now I noticed the loop order: i before j.
In this case, for a given value of i, j varies from i to n, and there are about n/i successful if-conditions.
So the program enters the innermost loop about
n/1 + n/2 + n/3 + ... + n/n
times. This is n times the harmonic sum, which grows like n*ln(n).
Each entry runs the innermost loop about log(n) times, so the innermost body executes roughly n*log(n)*log(n) = n*log^2(n) times.
As you wrote, the two outermost loops contribute O(n^2), so the overall complexity is O(n^2 + n*log^2(n)); the first term dominates the second, and the overall complexity is therefore quadratic.
int tally = 0;
for (int i = 1; i < n; i++)
{
    // the outer loop body runs n times
    for (int j = i; j < n; j++)
    {
        // the j-loop body runs ~n*n/2 times in total
        if (j % i == 0)
        {
            // the condition succeeds ~n*log(n) times in total
            for (int k = 1; k < n; k *= 2)
            {
                // the innermost body runs ~n*log(n)*log(n) times in total
                tally++;
            }
        }
    }
}
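For what it's worth, here is a small sketch of my own that measures both the j-loop iterations (the quadratic part) and the final tally (the n*log^2(n) part), so the two terms in O(n^2 + n*log^2(n)) can be compared directly:

#include <cmath>
#include <iostream>

int main() {
    for (int n : {1000, 4000, 16000}) {
        long long jIterations = 0;   // total iterations of the two outer loops
        long long tally = 0;         // executions of the innermost body
        for (int i = 1; i < n; i++)
            for (int j = i; j < n; j++) {
                jIterations++;
                if (j % i == 0)
                    for (int k = 1; k < n; k *= 2)
                        tally++;
            }
        double estimate = n * std::log(double(n)) * std::log2(double(n));
        std::cout << "n=" << n
                  << "  outer-loop work=" << jIterations        // ~n^2/2
                  << "  tally=" << tally
                  << "  ~n*ln(n)*log2(n)=" << (long long)estimate << '\n';
    }
}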
Old answer (wrong)
This complexity is linked with the sum of the sigma0(n) function (the number-of-divisors function), represented as sequence A006218 (the Dirichlet divisor problem).
The approximation for the sum of divisor counts for values up to n is
n * ( log(n) + 2*gamma - 1 ) + O(sqrt(n))
so the average number of successful if-conditions for a loop counter j is ~log(j).
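As a side note on the old answer, the divisor-sum approximation itself is easy to check numerically. A sketch of my own (using the identity that i divides exactly n/i of the numbers 1..n; euler_gamma is the Euler–Mascheroni constant ≈ 0.5772):

#include <cmath>
#include <iostream>

int main() {
    const double euler_gamma = 0.5772156649015329;   // Euler–Mascheroni constant
    for (long long n : {1000LL, 100000LL, 10000000LL}) {
        long long divisorSum = 0;                    // sum of sigma0(k) for k = 1..n
        for (long long i = 1; i <= n; i++)
            divisorSum += n / i;                     // i divides exactly n/i of the numbers 1..n
        double approx = n * (std::log(double(n)) + 2 * euler_gamma - 1);
        std::cout << "n=" << n << "  divisor sum=" << divisorSum
                  << "  n*(ln n + 2*gamma - 1)=" << (long long)approx << '\n';
    }
}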

Big O notation of nested sequential loops

I have been searching through the forums on big O notation and have learned quite a bit. My problem is pretty specific, and I think a unique case will better help me understand big O. I am ignoring constants.
To my understanding, if a loop goes through all n elements, then it's O(n).
for(int i = 0; i < n; i++)
{
}
If a loop that goes through all of n is nested inside another loop that goes through all of n, the counts multiply: n * n = n^2.
for(int i = 0; i < n; i++)
{
for(int j = 0; j < n; j++)
{
}
}
Lastly, if a loop is followed by another loop that goes through all elements, the counts add: n + n = 2n.
for(int j = 0; j < n; j++)
{
}
for(int k = 0; k < n; k++)
{
}
My question concerns these lines of code:
for (int i = 0; i < n; i++)
{
    for (int j = 0; j < n; j++)
    {
    }
    for (int k = 0; k < n; k++)
    {
    }
    for (int l = 0; l < n; l++)
    {
        for (int m = 0; m < n; m++)
        {
        }
    }
}
So, based on the rules above, I calculate the count to be n * (n + n + n * n), which is n^3 + 2n^2. Would that make my big O O(n^3), or would it be O(n^3 + 2n^2)? Am I going about this all wrong, or am I somewhere close in the ballpark? Mainly I'm trying to figure out whether these loops would be less than O(n^4). Thanks in advance.
Big-O notation is used to characterize the asymptotic behavior of an algorithm depending on some value n that indicates the data volume, independent of constant factors such as processor speed.
In your example, n^3 grows faster than 2n^2, i.e., for large n, 2n^2 can be neglected compared to n^3. The asymptotic behavior of your nested loops is therefore O(n^3).
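If it helps to see this concretely, here is a sketch of my own that counts the loop-body executions of the combined loops and compares the count with n^3 + 2n^2; the n^3 term clearly dominates as n grows:

#include <iostream>

int main() {
    for (int n : {10, 100, 500}) {
        long long count = 0;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) count++;        // n per outer iteration
            for (int k = 0; k < n; k++) count++;        // n per outer iteration
            for (int l = 0; l < n; l++)
                for (int m = 0; m < n; m++) count++;    // n*n per outer iteration
        }
        std::cout << "n=" << n << "  count=" << count
                  << "  n^3 + 2n^2=" << 1LL * n * n * n + 2LL * n * n << '\n';
    }
}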

Show that the time complexity of this double loop is O(n)

How would you show that this entire sequence of loops is O(n)? Or is it O(n)? At first glance, just looking at the nested loops, one might think it's O(n^2), but I don't think it is...
int i = 0;
int j = 0;
int arr[N];
int idx = 0;
for (i = 0; i < 2; i++)
{
    for (j = 0; j < N/2; j++)
    {
        idx = (i * N/2) + j;
        foo(arr[idx]);
    }
}
The outer loop runs a constant number of times (twice), so the code is the same as doing:
for (j = 0; j < N/2; j++)
{
    idx = (0 * N/2) + j;
    foo(arr[idx]);
    idx = (1 * N/2) + j;
    foo(arr[idx]);
}
This also makes the intent of the code a lot clearer. As you can see, the number of operations scales linearly with N, so it's O(N) complexity rather than quadratic growth.
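A quick sketch of my own (with a stub foo that only counts its calls, and even values of N so that 2 * (N/2) == N) confirms that foo runs exactly N times, i.e. linear in N:

#include <iostream>
#include <vector>

static long long calls = 0;
void foo(int /*value*/) { calls++; }    // stub: only counts how often it is called

int main() {
    for (int N : {10, 1000, 100000}) {  // even N, so 2 * (N/2) == N
        calls = 0;
        std::vector<int> arr(N);
        for (int i = 0; i < 2; i++)
            for (int j = 0; j < N / 2; j++) {
                int idx = (i * N / 2) + j;
                foo(arr[idx]);
            }
        std::cout << "N=" << N << "  calls to foo=" << calls << '\n';
    }
}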
