Can you please explain what O(n^2 * log n) can look like? I do understand O(n * log n) :
s=0
for(i=0; i<n; i++)
{
for (j=1; j<n; j *= 2)
{
s=s+i*j;
}
s=s+1
}
The outer loop runs n times, which is O(n), and the inner loop repeats log(n) times per outer iteration because of j *= 2. I also understand what O(n^2) means (performance is directly proportional to the square of the size of the input data), e.g.:
s=0
for(i=0; i<n; i++)
{
for (j=0; j<n; j++)
{
s=s+i*j;
}
s=s+1
}
but what is O(n^2 * log n)? Can you please give an example.
You only need to add one more for loop on the outside. That makes it O(n^2 * log n), because you repeat the O(n * log n) work n times.
for(int k=0; k<n; k++)
{
s=0
for(i=0; i<n; i++)
{
for (j=1; j<n; j *= 2)
{
s=s+i*j;
}
s=s+1
}
}
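If it helps to see the growth concretely, here is a minimal sketch of my own (not part of the original answer; the class name and test sizes are just placeholders) that counts how often the innermost statement executes and compares it with n^2 * log2(n):

public class NSquaredLogNDemo {
    public static void main(String[] args) {
        for (int n : new int[]{64, 256, 1024}) {       // powers of two keep log2(n) exact
            long count = 0;
            long s = 0;
            for (int k = 0; k < n; k++) {              // extra outer loop: factor n
                for (int i = 0; i < n; i++) {          // factor n
                    for (int j = 1; j < n; j *= 2) {   // factor log2(n)
                        s = s + (long) i * j;
                        count++;                       // count innermost executions
                    }
                }
            }
            long expected = (long) n * n * Integer.numberOfTrailingZeros(n); // n*n*log2(n) for powers of two
            System.out.println("n=" + n + " actual=" + count + " expected=" + expected);
        }
    }
}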
for(i=0; i<n; i++)
{
for (int ij = 0; ij < n; ++ij) {
for (j=1; j < ij; j *= 2)
{
s=s+i*j;
}
}
s=s+1
}
Think about it like this:
My input is n.
In the first loop I'm traversing each of the n elements, so my complexity is n.
The second loop also traverses n elements (ij) for every iteration of the first loop, so my complexity becomes n*n.
In the third loop I'm doing a logarithmic traversal which visits log n elements. But each logarithmic traversal is executed n*n times by the previous loops, so it becomes n*n*log n.
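As a small sanity check (my own worked count, taking n = 8):
outer i loop:   8 iterations
middle ij loop: 8 iterations per outer iteration
inner j loop:   at most log2(8) = 3 iterations (j = 1, 2, 4)
total inner steps <= 8 * 8 * 3 = 192 = n^2 * log n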
Related
01.) Considering the growth rate: when the inner loop of a nested loop never runs because its condition is false, is the growth rate O(n)?
ex:
for (int i=0; i<n; i++){
for (int j=0; j>n; i++){
//some code
}
}
02.) Considering the growth rate: when the inner loop of a nested loop is bounded by the length of another array, m, is the growth rate O(n*m)?
ex:
for (int i=0; i<n; i++){
for (int j=0; j<m; i++){
//some code
}
}
What is the running time of the following code? (please explain with steps)
for (int i = 0; i < n; i++) {
for (int j = 0; j > n; j++) {
for (int k = 0; k > n; k++) {
System.out.println("*");
}
}
}
Thank You.
Yes. If it is absolutely clear that the inner loop will never be executed, because its condition is false regardless of the input size, then the time complexity is O(n).
Yes, a detailed analysis can be found below. However, I assume you incorrectly incremented i in your inner loop.
for (int i=0; i<n; i++){
for (int j=0; j<m; j++){ // j++
//some code
}
}
number of operations performed by inner loop in each iteration of outer loop : m
number of operations performed by outer loop : n
Total number of operations : n*m
Time complexity f(n) ∈ O(n*m)
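To make that count concrete, here is a small sketch of my own (the sizes 300 and 70 are arbitrary examples, and the class name is a placeholder):

public class NestedLoopCount {
    public static void main(String[] args) {
        int n = 300, m = 70;
        long operations = 0;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < m; j++) {
                operations++;              // stands in for the O(1) "some code"
            }
        }
        System.out.println(operations);    // 21000
        System.out.println((long) n * m);  // also 21000, i.e. exactly n*m
    }
}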
If the code you posted is not incorrect, then the time complexity analysis is:
for (int i=0; i<n; i++){
for (int j=0; j<m; i++){
//some code
}
}
The inner loop starts at j=0 and checks j<m, but it increments i instead of j.
If m > 0, this is an infinite loop, because j never changes; it is pointless to do a time complexity analysis.
If m <= 0, the inner loop is never executed, so the time complexity is O(n).
Detailed analysis (assuming the inner comparisons were meant to be j < n and k < n; as posted, with j > n and k > n, the inner loops never run and the whole thing is O(n)):
for (int i = 0; i < n; i++) { // n many times
for (int j = 0; j < n; j++) { // n many times
for (int k = 0; k < n; k++) { // n many times
System.out.println("*");
}
}
}
All three loops iterate n times. The innermost loop performs n operations; the middle loop repeats that n times, which makes n*n; and the outermost loop repeats that n times again, which makes n*n*n in total.
Time complexity f(n) ∈ O(n^3)
Question 1.
My Answer: 1 + n^2 * n = n^3
Correct Answer: O(n)
int f(int n) {
if (n<1000000)
return 0;
for (int i=0; i<(n*n); i++)
return f(n-1);
}
Question 2.
My Answer: n * n * logn = n^2*logn
Correct Answer: O(n^3)
int f(int n) {
for (int i=1; i< (n/2); i++)
for(double j =0; j <100; j+= (100.0/n))
doSomething (n/2); //doSomething(m) has time complexity of O(m)
return 0;
}
Question 3.
My Answer: 1 + n * (logn + 1) = O(nlogn)
Correct Answer: O(logn)
void f(int n) {
if (n<1) return;
g(n-1);
}
void g(int n) {
f(n/2);
doOne(); // doOne has time complexity O(1)
}
Question 1
int f(int n) {
if (n<1000000)
return 0;
for (int i=0; i<(n*n); i++)
return f(n-1);
}
The for loop is not looping at all, because its body is a return statement, so you have at most one iteration. This means you can simplify this code to:
int f(int n) {
if (n<=0)
return 0;
return f(n-1);
}
(simplified with regard to the O(n) analysis)
Here you can see why it is O(n): the function counts down by one until it hits the recursion stop condition. The fact that there is a "high" cutoff check for n<1000000 doesn't matter when you call it with something like f(5*10^300);
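A quick sketch of my own that counts the recursive calls of the original (unsimplified) function; the class name and the test value 1000050 are illustrative, and I widened the loop counter to long so n*n doesn't overflow:

public class RecursionCallCount {
    static long calls = 0;

    static int f(int n) {
        calls++;
        if (n < 1000000)
            return 0;
        for (long i = 0; i < (long) n * n; i++)  // returns on the first pass anyway
            return f(n - 1);
        return 0;  // never reached, but required by the compiler
    }

    public static void main(String[] args) {
        f(1000050);
        System.out.println(calls);   // prints 52: roughly n - 1000000 recursive steps, i.e. O(n)
    }
}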
Question 2
int f(int n) {
for (int i=1; i< (n/2); i++)
for(double j =0; j <100; j+= (100.0/n))
doSomething (n/2); //doSomething(m) has time complexity of O(m)
return 0;
}
With regard to the O(n) analysis, you can simplify some lines:
for (int i=1; i< (n/2); i++)
This can be simplified to:
for (int i=1; i<n; i++)
Therefore it's O(n) as already identified by you.
for(double j =0; j <100; j+= (100.0/n))
This can be simplified as:
for(double j =0; j <1; j+= (1.0/n)) (divided by 100)
for(double j =0; j <n; j+= 1.0) (multiplied by n)
And again, a simple O(n) loop.
doSomething (n/2);
That is by definition an O(n) statement.
So in total you have O(n)*O(n)*O(n), which is O(n^3).
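As a rough count of my own (constants ignored, which is all O-notation cares about):
outer loop:        about n/2 iterations            -> O(n)
inner loop:        100 / (100/n) = n iterations    -> O(n)
doSomething(n/2):  about n/2 operations per call   -> O(n)
total:             (n/2) * n * (n/2) = n^3/4       -> O(n^3)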
Question 3
void f(int n) {
if (n<1) return;
g(n-1);
}
void g(int n) {
f(n/2);
doOne(); // doOne has time complexity O(1)
}
Not sure how you got O(n*log(n)) here, because not every value of n is visited. You start at n and the successive arguments are roughly n/2, n/4, n/8, n/16, n/32, n/64, n/128, ..., and at some point you reach the termination condition if (n<1). This is a simple O(log(n)) recursion.
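Written as a recurrence (my own summary of the argument above):
T(n) = T((n-1)/2) + O(1) ≈ T(n/2) + O(1)
arguments: n, ~n/2, ~n/4, ..., 1   ->  about log2(n) calls  ->  O(log n)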
Do the following algorithms run in O(n) time?
1
s=0
for(i=0; i<n; i++)
{
for (j=0; j<n; j++)
{
s=s+i*j;
}
s=s+1
}
This is O(n^2), since the performance is directly proportional to the square of the size of the input data set N.
2
s=0
for(i=0; i<n; i++)
{
if (i>20)
for (j=0; j<n; j++)
{
s=s+i*j;
}
s=s+1
}
3
s=0
for(i=0; i<n; i++)
{
if(i<4)
for (j=0; j<n; j++)
{
s=s+i*j;
}
s=s+1
}
Can you please explain how the if statement affects O(n)? In both cases (#2 and #3) the first loop is O(n), and the second loop only runs when i > 20 or i < 4 respectively. But how does this affect the complexity? Will these still be O(n^2), just with i > 20 doing about 20^2 fewer operations and i < 4 doing about 4^2 fewer? Also, is it right that Big O assumes N goes to infinity?
2
Still O(N^2). The loop runs a total of
20 + (N - 20) * N iterations (and each iteration is constant) ==> O (N^2)
3
O(N). The loop runs a total of
N * 4 + (N - 4) iterations (and each iteration is constant) ==> O(N)
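Expanding those counts (my own arithmetic) makes it clear which term dominates:
20 + (N - 20) * N = N^2 - 20N + 20   ->  the N^2 term dominates  ->  O(N^2)
N * 4 + (N - 4)   = 5N - 4           ->  linear in N             ->  O(N)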
What is the time complexity for the below function?
void function(int n){
for(int i=0; i<n; i++){ //n times
for(int j=i ; j< i*i; j++){ // n*n times
if(j%i == 0){
for(int k=0; k<j; k++){ // j times = n * n times.
++counter;
}
}
}
}
}
The book says O(n^5), but I couldn't understand why the book didn't take j%i into account; the k loop only runs when that condition holds. Can you clarify?
Let's consider the part of the code you have doubts about:
for(int j=i ; j< i*i; j++)
{
if(j%i == 0)
{
for(int p=0; p<j; p++){
++counter;
}
}
}
Let's just analyse this part of the code.
For every i, j%i==0 will be true whenever j is a multiple of i, i.e.
i, 2i, 3i, ..., up to i*i (exclusive, since j<i*i)
Now, for every such j, the inner loop's complexity is O(j).
So the complexity for this part of the code is:
i + 2i + 3i + ... ≈ i * (1 + 2 + ... + i) = O(i^3), and since i < n this is O(n^3).
So, Overall Complexity = O(n*n^3) = O(n^4)
In fact the function runs in time O(n^4), since the if condition is true only about once every i iterations of the j loop (and i is at most n). See below for a precise analysis.
void function (int n) {
for(int i=0; i<n; i++) {
// Runs a total of n times
for(int j=i; j< i*i; j++) {
// Runs a total of (n^3 - n)/3 = O(n^3) times
if(j%i == 0) {
// Runs a total of (n^2 - n)/2 = O(n^2) times
for(int k=0; k<j; k++) {
// Runs a total of (3n^4 - 10n^3 + 9n^2 - 2n)/24 = O(n^4) times
++counter;
}
}
}
}
}
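For a rough empirical check (a sketch of mine, not part of the original answer), you can count the increments and compare against n^4; the ratio should creep toward the leading coefficient 3/24 = 0.125 from the formula above:

public class CounterGrowthCheck {
    static long counter = 0;

    static void function(int n) {
        for (int i = 0; i < n; i++) {
            for (int j = i; j < i * i; j++) {   // never runs for i = 0 or 1, so j % i is safe
                if (j % i == 0) {
                    for (int k = 0; k < j; k++) {
                        ++counter;
                    }
                }
            }
        }
    }

    public static void main(String[] args) {
        for (int n : new int[]{50, 100, 150}) {
            counter = 0;
            function(n);
            System.out.println("n=" + n + " counter=" + counter
                    + " counter/n^4=" + counter / Math.pow(n, 4));
        }
    }
}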
I have a series of questions in which I need feedback and answers. I will comment as to what I think, this is not a homework assignment but rather preparation for my exam.
My main problem is determining the number of iterations of a loop in different cases. How would I go about figuring that out?
Evaluate Running time.
Q2.
for(int i = 0; i <= n; i++) // runs n times
for(int j =1; j<= i * i; j++) // same reasoning as 1. n^2
if (j % i == 0)
for(int k = 0; k<j; k++) // runs n^2 times? <- same reasoning as above.
sum++;
Correct Answer: N × N^2 × N = O(N^4)
For the following Questions below, I do not have the correct answers.
Q3. a)
int x=0; //constant
for(int i=4*n; i>=1; i--) //runs n times, disregard the constant
x=x+2*i;
My Answer: O(n)
b) Assume for simplicity that n = 3^k
int z=0;
int x=0;
for (int i=1; i<=n; i=i*3){ // runs n/3 times? how does it effect final answer?
z = z+5;
z++;
x = 2*x;
}
My Answer: O(n)
c) Assume for simplicity that n = k^2,
int y=0;
for(int j=1; j*j<=n; j++) //runs O(logn)? j <= (n)^1/2
y++; //constant
My Answer: O(logn)
d)
int b=0; //constant
for(int i=n; i>0; i--) //n times
for(int j=0; j<i; j++) // runs n+ n-1 +...+ 1. O(n^2)
b=b+5;
My Answer: O(n^3)
(e)
int y=1;
int j=0;
for(j=1; j<=2*n; j=j+2) //runs n times
y=y+i;
int s=0;
for(i=1; i<=j; i++) // runs n times
s++;
My Answer: O(n)
(f)
int b=0;
for(int i=0; i<n; i++) //runs n times
for(int j=0; j<i*n; j++) //runs n^2 x n times?
b=b+5;
My Answer: O(n^4)
(g) Assume for simplicity that n = 3^k, for some positive integer k.
int x=0;
for(int i=1; i<=n; i=i*3){ //runs 1, 3, 9, 27...for values of i.
if(i%2 != 0) //will always be true for values above
for(int j=0; j<i; j++) // runs n times
x++;
}
My Answer: O (n x log base 3 n? )
(h) Assume for simplicity that n = k^2, for some positive integer k.
int t=0;
for(int i=1; i<=n; i++) //runs n times
for(int j=0; j*j<4*n; j++) //runs O(logn)
for(int k=1; k*k<=9*n; k++) //runs O(logn)
t++;
My Answer: n x log n x log n = O(n (log n)^2)
(i) Assume for simplicity that n = 2^s, for some positive integer s.
int a = 0;
int k = n*n;
while(k > 1) //runs n^2
{
for (int j=0; j<n*n; j++) //runs n^2
{ a++; }
k = k/2;
}
My Answer: O(n^4)
(j)
int i=0, j=0, y=0, s=0;
for(j=0; j<n+1; j++) //runs n times
y=y+j; //y equals n(n+1)/2 ~ O(n^2)
for(i=1; i<=y; i++) // runs n^2 times
s++;
My Answer: O(n^3)
(k)
int i=1, z=0;
while( z < n*(n+1)/2 ){ //arithmetic series, runs n times
z+=i; i++;
}
My Answer: O(n)
(m) Assume for simplicity that n = 2^s, for some positive integer s.
int a = 0;
int k = n*n*n;
while(k > 1) //runs O(logn) complexity
{
for (int j=0; j<k; j++) //runs n^3 times
{ a--; }
k = k/2;
}
My Answer: O(n^3 log n)
Question 4
a) True - since its bounded below by n^2
b) False - f(n) not strictly smaller than g(n)
c) True
d) True -bounded by n^10
e) False - f(n) not strictly smaller than g(n)
f) True
g) True
h) false - since does not equal O(nlogn)
i) true
j) not sure
k) not sure
l) not sure - how should I even attempt these?
Let's go through these one at a time.
Part (a)
int x=0; //constant
for(int i=4*n; i>=1; i--) //runs n times, disregard the constant
x=x+2*i;
My Answer: O(n)
Yep! That's correct. The loop runs O(n) times and does O(1) work per iteration.
Part (b)
int z=0;
int x=0;
for (int i=1; i<=n; i=i*3){ // runs n/3 times? how does it effect final answer?
z = z+5;
z++;
x = 2*x;
}
My Answer: O(n)
Not quite. Think about the values of i as the loop progresses. It will take on the series of values 1, 3, 9, 27, 81, 243, ..., 3^k. Since i is tripling on each iteration, it takes on successive powers of three.
The loop clearly only does O(1) work per iteration, so the main question here is how many total iterations there will be. The loop will stop when i > n. If we let k be some arbitrary iteration of the loop, the value of i on iteration k will be 3^k. The loop stops when 3^k > n, which happens when k > log_3 n. Therefore, the number of iterations is only O(log n), so the total complexity is O(log n).
Part (c)
int y=0;
for(int j=1; j*j<=n; j++) //runs O(logn)? j <= (n)^1/2
y++; //constant
My Answer: O(logn)
Not quite. Notice that j is still growing linearly, but the loop runs as long as j^2 ≤ n. This means that as soon as j exceeds √n, the loop will stop. Therefore, there will only be O(√n) iterations of the loop, and since each one does O(1) work, the total work done is O(√n).
Part (d)
int b=0; //constant
for(int i=n; i>0; i--) //n times
for(int j=0; j<i; j++) // runs n+ n-1 +...+ 1. O(n^2)
b=b+5;
My Answer: O(n^3)
Not quite. You're actually double-counting a lot of the work. You're correct that the inner loop will run n + (n-1) + (n-2) + ... + 1 times, which is O(n^2) times, but that sum is already taken across all iterations of the outer loop. You don't need to multiply that value by O(n) one more time. The most accurate answer would be O(n^2).
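The sum itself, written out with the standard arithmetic-series formula:
n + (n-1) + (n-2) + ... + 1 = n(n+1)/2 = O(n^2)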
Part (e)
int y=1;
int j=0;
for(j=1; j<=2*n; j=j+2) //runs n times
y=y+i;
int s=0;
for(i=1; i<=j; i++) // runs n times
s++;
My Answer: O(n)
Yep! Exactly right.
Part (f)
int b=0;
for(int i=0; i<n; i++) //runs n times
for(int j=0; j<i*n; j++) //runs n^2 x n times?
b=b+5;
My Answer: O(n^4)
Again, I believe you're overcounting. The inner loop will run 0 + n + 2n + 3n + 4n + ... + n(n-1) = n(0 + 1 + 2 + ... + (n-1)) times, so the total work done is O(n^3). You shouldn't multiply by the number of times the outer loop runs, because you're already summing across all of its iterations. The most accurate runtime would be O(n^3).
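Finishing that sum explicitly (standard formula again):
n(0 + 1 + 2 + ... + (n-1)) = n * n(n-1)/2 = (n^3 - n^2)/2 = O(n^3)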
Part (g)
int x=0;
for(int i=1; i<=n; i=i*3){ //runs 1, 3, 9, 27...for values of i.
if(i%2 != 0) //will always be true for values above
for(int j=0; j<i; j++) // runs n times
x++;
}
My Answer: O (n x log base 3 n? )
The outer loop here will indeed run O(log n) times, but let's see how much work the inner loop does. You're correct that the if statement always evaluates to true. This means that the inner loop will do 1 + 3 + 9 + 27 + ... + 3^(log_3 n) work. This summation, however, works out to (3^(log_3 n + 1) - 1) / 2 = (3n - 1) / 2. Therefore, the total work done here is just O(n).
Part (h)
int t=0;
for(int i=1; i<=n; i++) //runs n times
for(int j=0; j*j<4*n; j++) //runs O(logn)
for(int k=1; k*k<=9*n; k++) //runs O(logn)
t++;
My Answer: n x log n x log n = O(n (log n)^2)
Not quite. Look at the second loop. This actually runs O(√n) times using the same logic as one of the earlier parts. That third inner loop also runs O(√n) times, and so the total work done will be O(n^2).
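Multiplying the three bounds out (my own arithmetic; the constants come from the 4n and 9n limits):
j loop:  j*j < 4*n   ->  about 2√n iterations
k loop:  k*k <= 9*n  ->  about 3√n iterations
total:   n * 2√n * 3√n = 6n^2 = O(n^2)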
Part (i)
int a = 0;
int k = n*n;
while(k > 1) //runs n^2
{
for (int j=0; j<n*n; j++) //runs n^2
{ a++; }
k = k/2;
}
My Answer: O(n^4)
Not quite. The outer loop starts with k initialized to n^2, but notice that k is halved on each iteration. This means that the number of iterations of the outer loop will be log(n^2) = 2 log n = O(log n), so the outer loop runs only O(log n) times. That inner loop does do O(n^2) work, so the total runtime is O(n^2 log n).
Part (j)
int i=0, j=0, y=0, s=0;
for(j=0; j<n+1; j++) //runs n times
y=y+j; //y equals n(n+1)/2 ~ O(n^2)
for(i=1; i<=y; i++) // runs n^2 times
s++;
My Answer: O(n^3)
Close, but not quite! The first loop runs in time O(n), and by the time it's done, the value of y is Θ(n^2). This means that the second loop runs for time Θ(n^2), so the total time spent is Θ(n^2).
Part (k)
int i=1, z=0;
while( z < n*(n+1)/2 )//arithmetic series, runs n times
{
z+=i; i++;
}
My Answer: O(n)
That's correct!
Part (l)
That's odd, there is no part (l).
Part (m)
int a = 0;
int k = n*n*n;
while(k > 1) //runs O(logn) complexity
{
for (int j=0; j<k; j++) //runs n^3 times
{ a--; }
k = k/2;
}
My Answer: O(n^3 log n)
Close, but not quite. You're right that the outer loop runs O(log n) times and that the inner loop does O(n^3) work on the first iteration. However, look at the number of iterations of the inner loop more closely:
n^3 + n^3/2 + n^3/4 + n^3/8 + ...
= n^3 (1 + 1/2 + 1/4 + 1/8 + ...)
≤ 2n^3
So the total work done here is actually only O(n^3), even though there are log n iterations.
Question 4
Your answers are all correct except for these:
f) True
This is actually false. The expression on the left is
(3/2)n^(3/2) + 5n^2 + lg n
which is Θ(n^2), and that is not Ω(n^2 √n) = Ω(n^(5/2)).
For (j), note that log n^6 = 6 log n. Does that help?
For (k), ask whether both sides are O and Ω of one another. What do you find?
For (l), use the fact that a^(log_b c) = c^(log_b a). Does that help?
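Since the statements for (j) through (l) aren't reproduced here, a generic worked example of that last identity (my own, with a = 4, b = 2, c = n):
4^(log_2 n) = n^(log_2 4) = n^2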
Hope this helps!