Understanding the asymptotic complexity of these code fragments - Big-O

I am not sure of the rules/procedure to follow when determining the asymptotic complexity of methods. I know a single statement is O(1), a single loop is O(n), and nested loops are O(n^2). Loop variables that are doubled each time give log_2(n); variables that are quartered give log_4(n). And if an O(n) loop contains something that is O(log n), the result is O(n log n). But I am still unsure how to put this all together. Do we focus on the user-supplied value n to determine the asymptotic complexity, or on the loop counter i?
Can someone please walk me through these examples and show how it's done.
Example1:
for (k = 0; k < n; k = k + 3)
    for (p = n; p > 6; p--)
        System.out.println(p % 2);
T(n) = ?
Asymptotic complexity = ?
Example2:
for (k = 0; k <= n/8; k++)
    System.out.println(k);
System.out.println("Next");
for (p = n; p >= 1; p--)
    System.out.println(p % 2);
T(n) = ?
Asymptotic complexity = ?
Example3:
for (i = n - 3; i <= n - 1; i++)
    System.out.println(i);
for (k = 1; k <= n; k++)
    System.out.println(i + k);
T(n) = ?
Asymptotic complexity = ?
Example4:
for (a = 1; a <= n/3; a++)
    for (b = 1; b <= 2 * n; b++)
        System.out.println(a * b);
T(n) = ?
Asymptotic complexity = ?
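One way to check a guess about complexity is to count iterations instead of printing. Here is a minimal sketch for Example 1 (the class name `IterationCount` and the sample value of n are my own, not part of the original question):

```java
public class IterationCount {
    // Counts how many times the innermost statement of Example 1 executes.
    static long example1(int n) {
        long count = 0;
        for (int k = 0; k < n; k = k + 3)
            for (int p = n; p > 6; p--)
                count++;               // stands in for the println
        return count;
    }

    public static void main(String[] args) {
        // ceil(n/3) outer iterations, each doing (n - 6) inner iterations,
        // so T(n) = ceil(n/3) * (n - 6), which is O(n^2).
        System.out.println(example1(30)); // 10 * 24 = 240
    }
}
```

Comparing the printed count against the formula you derived for T(n) for a few values of n is a good way to catch off-by-one mistakes before taking the asymptotic limit.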

Related

Time Complexity log(n) vs Big O (root n)

Trying to analyze the below code snippet.
For the code below, can the time complexity be O(log n)? I am new to asymptotic analysis. The tutorial says it is O(sqrt(n)).
int p = 0;
for (int i = 1; p <= n; i++) {
    p = p + i;
}
Variable p is going to take the successive values 1, 1+2, 1+2+3, etc.
This sequence is called the sequence of triangular numbers; you can read more about it on Wikipedia or OEIS.
One thing to be noted is the formula:
1 + 2 + ... + i = i*(i+1)/2
Hence your code could be rewritten under the somewhat equivalent form:
int p = 0;
for (int i = 1; p <= n; i++)
{
    p = i * (i + 1) / 2;
}
Or, getting rid of p entirely:
for (int i = 1; (i - 1) * i / 2 <= n; i++)
{
}
Hence your code runs while (i-1)*i <= 2n. You can make the approximation (i-1)*i ≈ i^2 to see that the loop runs for about sqrt(2n) operations.
If you are not satisfied with this approximation, you can solve for i the quadratic equation:
i^2 - i - 2n == 0
You will find that the loop runs while:
i <= (1 + sqrt(1 + 8n)) / 2 == 0.5 + sqrt(2n + 0.25)
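To sanity-check the sqrt(2n) estimate, you can count the iterations directly. A minimal sketch (the class name and the sample value n = 1_000_000 are my own choices):

```java
public class TriangularLoop {
    // Counts iterations of: for (int i = 1; p <= n; i++) p += i;
    static int iterations(int n) {
        int p = 0, count = 0;
        for (int i = 1; p <= n; i++) {
            p = p + i;     // p becomes the count-th triangular number
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        // For n = 1_000_000, sqrt(2n) is about 1414.2, and the loop
        // indeed runs 1414 times, matching the O(sqrt(n)) analysis.
        System.out.println(iterations(1_000_000));
    }
}
```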

Big notation for an algorithm efficiency

Can anyone help me identify the steps for the following example and give more explanation of why its Big-O notation is O(2^n)?
int i, j = 1;
for (i = 1; i <= n; i++)
{
    j = j * 2;
}
for (i = 1; i <= j; i++)
{
    cout << j << "\n";
}
Thank you in advance.
The first loop has n iterations and assigns 2^n to j.
The second loop has j = 2^n iterations.
The cout has time complexity O(log j) = O(n).
Hence the overall complexity is O(n * 2^n), which is strictly larger than O(2^n).
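Counting the loop-body executions alone makes the exponential growth visible. A small sketch (class name and sample n are my own; it counts n + 2^n iterations, and the extra factor of n in O(n * 2^n) comes from the cost of printing the ~n-bit number j):

```java
public class DoublingLoop {
    // Counts the total number of loop-body executions for a given n.
    // Only sensible for small n, since the count grows like 2^n.
    static long work(int n) {
        long j = 1, count = 0;
        for (int i = 1; i <= n; i++) {  // n iterations, j ends at 2^n
            j = j * 2;
            count++;
        }
        for (long i = 1; i <= j; i++)   // 2^n iterations
            count++;
        return count;
    }

    public static void main(String[] args) {
        // n + 2^n iterations: for n = 10 that is 10 + 1024 = 1034.
        System.out.println(work(10));
    }
}
```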

What is the runtime of this pseudo-code

i = 1;
while (i <= n)
    j = i;
    x = x + A[i];
    while (j > 0)
        y = x / (2 * j);
        j = j / 2; // Assume here that this returns the floor of the quotient
    i = 2 * i;
return y;
I'm not sure about my answer; I got O(n^2).
Let's remove the variables x and y, because they don't affect the complexity.
i = 1;
while (i <= n)
    j = i;
    while (j > 0)
        j = j / 2;
    i = 2 * i;
In the inner loop you divide j by 2 every time, so it is not linear; it is O(log n). For example, when j is 16 you will do 5 steps (log_2(16) + 1): 8, 4, 2, 1, 0.
In the outer loop you multiply i by 2 every time, so it is also O(log n).
So the overall complexity is O(log n * log n).
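The same counting trick verifies this. A minimal Java sketch of the simplified pseudocode (class name and sample n are my own):

```java
public class LogLogLoops {
    // Counts inner-loop iterations of the simplified pseudocode.
    static int operations(int n) {
        int count = 0;
        int i = 1;
        while (i <= n) {          // i = 1, 2, 4, ...: ~log2(n)+1 passes
            int j = i;
            while (j > 0) {       // j halves: ~log2(i)+1 passes
                j = j / 2;
                count++;
            }
            i = 2 * i;
        }
        return count;
    }

    public static void main(String[] args) {
        // For n = 16: inner passes of 1 + 2 + 3 + 4 + 5 = 15 operations,
        // a sum of 1..log2(n)+1, hence O(log n * log n) overall.
        System.out.println(operations(16));
    }
}
```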

What is the Big-O of this section of code?

x = 0;
for (i = 1; i <= n; i = i * 3) // Is big(O) of this statement O(log base 3 n)?
{
if (i % 2 != 0)
for (j = 0; j < i; j++) // What will be big(O) of this loop and how?
x++;
}
I read an answer saying the overall Big-O is O(n). If that is true, how is the Big-O of this code O(n)?
The outer loop variable i takes the values 1, 3, 9, ..., i.e. the powers of 3 up to n, so the outer loop runs about log_3(n) + 1 times. Every power of 3 is odd, so the if condition always holds, and the inner loop executes i operations. The total work is therefore the geometric sum 1 + 3 + 9 + ... + 3^floor(log_3 n) = (3^(floor(log_3 n) + 1) - 1)/2 <= (3n - 1)/2, which is O(n). (Summing the loop *index* k from 0 to log_3 n instead would give O([log n]^2), but that is not what the inner loop costs.)
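A quick empirical check of that geometric sum (the class name and the sample n = 100 are my own):

```java
public class PowersOfThree {
    // Returns the final value of x, i.e. the total inner-loop work.
    static int operations(int n) {
        int x = 0;
        for (int i = 1; i <= n; i = i * 3)
            if (i % 2 != 0)                // always true: powers of 3 are odd
                for (int j = 0; j < i; j++)
                    x++;
        return x;
    }

    public static void main(String[] args) {
        // For n = 100: 1 + 3 + 9 + 27 + 81 = 121 <= (3 * 100 - 1) / 2,
        // so x grows linearly in n, not like (log n)^2.
        System.out.println(operations(100));
    }
}
```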

Big-Oh Notation problem

I think the Big-O notation is n^2, but I'm not too sure.
for (int i = 0; i < n - 1; i++) {
    for (int j = 0; j < n - 1; j++)
        if (x[j] > x[j + 1]) {
            temp = x[j];
            x[j] = x[j + 1];
            x[j + 1] = temp;
        }
}
You are doing roughly N * (N * 4) operations = O(N^2).
Yes, it's O(n^2). Ignoring the constants: the outer loop runs n times, and the inner loop runs n times for each outer iteration.
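Note that this is one full bubble-sort-style pass structure, and the comparison count does not depend on the data. A small sketch counting comparisons (class name and sample array are my own):

```java
public class BubblePass {
    // Counts comparisons made by the nested loops (swapping as it goes).
    static int comparisons(int[] x) {
        int n = x.length, count = 0;
        for (int i = 0; i < n - 1; i++)
            for (int j = 0; j < n - 1; j++) {
                count++;
                if (x[j] > x[j + 1]) {
                    int temp = x[j];
                    x[j] = x[j + 1];
                    x[j + 1] = temp;
                }
            }
        return count;
    }

    public static void main(String[] args) {
        // (n - 1) * (n - 1) comparisons regardless of input order: O(n^2).
        System.out.println(comparisons(new int[]{5, 4, 3, 2, 1})); // 16
    }
}
```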
