This question is unlikely to help any future visitors; it is only relevant to a small geographic area, a specific moment in time, or an extraordinarily narrow situation that is not generally applicable to the worldwide audience of the internet. For help making this question more broadly applicable, visit the help center.
Closed 10 years ago.
A homework question asks me to analyse the following code fragment:
for (int i = N; i > 0; i--)
    for (int j = 0; j < i; j++)
        // constant-time body
I think the inner loop runs the following number of times:
N + (N-1) + (N-2) + ... + (N - N + 1)
However, I'm having trouble converting that into O() notation.
Could someone point me in the right direction?
By observation, the inner loop runs N + (N-1) + ... + 2 + 1 times in total. That sum is exactly N(N+1)/2, the formula for the triangular numbers.

First, recall the definition of big-O: f is O(g) if |f/g| is bounded for all large enough N. So, for example, this count is O(exp(N)), and it's also O(N^3). It's even O(N(N+1)/2). But your teacher is probably expecting the answer O(N^2).

How does one show that it is O(N^2)? Well, (N(N+1)/2) / N^2 = 1/2 + 1/(2N), which is bounded by 1 for all N > 0.
This question already has answers here:
Time complexity of nested for-loop (10 answers)
Is Big O(logn) log base e? (7 answers)
Closed 5 years ago.
In many definitions of O(n log n), I typically see the requirement that each subproblem must be half the size of the original. However, I have seen that O(log n) only requires the problem size to be reduced at each step.

Do we necessarily need to divide the problem into halves to get O(n log n)? Or is it enough that the problem merely shrinks each iteration, like this:
for (i = 0; i < A.length(); i++)
{
for (j = i; j < A.length(); j++)
{
//do something
}
}
Would something like this also be categorized as O(n log n)? Or is it closer to O(n^2)?
Dividing by any constant (not just two) would also give you log(n) complexity. That's because you can convert between log bases, and the conversion constant drops out when you are interested in big-O:

http://www.purplemath.com/modules/logrules5.htm

You'll note that the denominator in the change-of-base formula, log_b(a) = log_c(a) / log_c(b), is a constant.
The code you have shown, however, is O(n^2). The outer loop determines the number of iterations of the inner loop:

n + (n-1) + (n-2) + (n-3) + ... + 1 = n(n+1)/2 = O(n^2)

To get a complexity of log(n), each iteration needs to get rid of a constant fraction c*n of the remaining elements, where 0 < c < 1; shrinking the range by only one element per iteration is not enough.
I would like to understand asymptotic analysis better, since I don't believe I have a solid grasp of it. I would appreciate it if someone could outline a better approach. Here are two examples.
for (int i = 1; i <= n; i *= 2) {
for (int j = 0; j < n; j++) {
count++;
}
}
This question is from a quiz, and its answer is O(n log n).

I watched a Stanford University lecture, and its example is below:
for i = 1 to n
    for j = i + 1 to n
        if A[i] == A[j] then return TRUE
return FALSE
The asymptotic analysis for the second problem gives quadratic time, O(n^2).

How can I tell when it is O(n log n) and when it is O(n^2), given that both have nested for loops? Any answer is highly appreciated. Thanks in advance.
The first example is O(n log n) because the outer loop repeats log(n) times, and each of its repeats requires O(n) iterations of the inner loop, for a total of O(log(n) * n) = O(n log n).

In the second example, however, the outer loop requires O(n) iterations, and each iteration i requires O(n - i) iterations of the inner loop. This means it takes n + (n-1) + (n-2) + ... + 2 + 1 total time. This is an arithmetic progression, and its sum is in O(n^2).
There is no "easy way" to know the complexity without understanding some of what happens - complexity analysis is case dependent.
However, there are some hints that might help you. For example, if the loop counter is multiplied by a constant each iteration, that is a strong indication a logarithm will be involved in the complexity function, as in your first example.
Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 8 years ago.
Here's the algorithm:
Let a = 30, i = 1
While i < n
    For j = i + 1 to n
        If ABS[(j*i) + 20] < a then a = ABS[(j*i) + 20]
    i = i + 1
Return k
What's the number of steps this algorithm will take in the general case, where the input is of size n? How do you work that out? Also, does this algorithm fall into the quadratic complexity class?
I think this is O(n^2). The outer loop runs for i = 1 to n-1, and for each i the inner loop runs n - i times, so we have

(n-1) + (n-2) + (n-3) + ... + 2 + 1

and if we calculate it, it comes to

0.5(n^2 - n) = C(n^2 - n)

which is in the quadratic complexity class.
Let f(i) denote the number of times the inner for loop runs for a given i, where j goes from i+1 to n. For example, f(5) = n - 5, since j runs through 6, 7, ..., n. So we want f(1) + f(2) + f(3) + ... + f(n-1). Compute what each f(i) is, and then sum them to see the exact answer.

In general, there is an outer loop that runs at most n times, and the inner loop runs at most n times per outer iteration, for a complexity upper bounded by O(n * n) = O(n^2).
If I were a compiler, I would notice that this code only changes i, j, and a (local variables), and that the only variable whose value is subsequently used is k. So I would gradually optimize away everything but this:

Return k

and the computation would be all constant time, just a few machine instructions. That is, of course, also within quadratic time.
Closed 9 years ago.
This question is for revision from a past test paper; I'm just wondering if I am doing it right.

Work out the time complexity T(n) of the following piece of code, in terms of the number of operations, for a given integer n:
for ( int i = 1; i < n*n*n; i *= n ) {
for ( int j = 0; j < n; j += 2 ) {
for ( int k = 1; k < n; k *= 3 ) {
// constant number C of elementary operations
}
}
}
So far I've come up with n^3 * n * log n = O(n^4 log n).
I'll have a go.
The first loop is O(1), i.e. constant, since (for n > 1) it always runs 3 iterations: i takes the values 1, n, and n^2, and the next value, n^3, fails the test i < n*n*n.
for ( int i = 1; i < n*n*n; i *= n )
The second loop is O(0.5n) = O(n).
for ( int j = 0; j < n; j += 2 )
The third loop is O(log n).
for ( int k = 1; k < n; k *= 3 )
Therefore the time complexity of the algorithm is O(n log n).
I think you're missing the key point. I don't see the question asking you to work out the complexity in terms of big-O anywhere. Instead, it's asking for the number of operations for a given integer n.
Here is my solution,
For a given n, the inner loop variable successively takes the values k = 3^0, 3^1, 3^2, ..., 3^(m-1), where 3^(m-1) is the largest power of 3 below n. Therefore the inner loop performs C*log3(n) operations for each pair of values of the variables j and i.

The middle loop variable j takes n/2 values, and the outer loop variable i takes three values, 1, n, and n^2, for a given n.

Therefore the time complexity of the whole piece of code is T(n) = 3C(n/2)log3(n) = 1.5*C*n*log3(n).
You may want to check this, but this is my interpretation of your question.
Closed 10 years ago.
What is the complexity of this loop? I can't wrap my head around it.
for (i = 0; i < n; ++i) {
for (j = i; j < n; ++j) {
for (k = 0; k < j; ++k) {
// Do something
}
}
}
O(n^3), I believe. See "square pyramidal number".

The i loop has n iterations.

The j loop runs n + (n-1) + ... + 1 times in total: n iterations on the first pass of the i loop, finishing with 1 on the last.

The k loop runs j times for each (i, j) pair, so the dominant term behaves like the sum of squares 1^2 + 2^2 + ... + n^2.

And finally: that sum is the square pyramidal number n(n+1)(2n+1)/6, which is O(n^3).