What is the complexity of this loop? I can't wrap my head around it.
for (i = 0; i < n; ++i) {
    for (j = i; j < n; ++j) {
        for (k = 0; k < j; ++k) {
            // Do something
        }
    }
}
O(n^3), I believe. See Square pyramidal number.
The i loop runs n times.
The j loop runs n - i times for each i (n iterations when i = 0, down to 1 when i = n - 1), i.e. 1 + 2 + ... + n iterations in total.
The k loop runs j times for each value of j; since each value of j is reached from every i = 0, 1, ..., j, the total work is roughly 1² + 2² + ... + n².
And finally: 1² + 2² + ... + n² = n(n+1)(2n+1)/6, the n-th square pyramidal number, which is O(n^3).
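If it helps to see the cubic growth concretely, here is a small C sketch (my own illustration, not from the original post) that counts the innermost iterations and prints n^3 / 3, the leading term of the square pyramidal number, next to the count:

#include <stdio.h>

/* Count how many times the innermost body runs and compare the
   result with n^3 / 3, the leading term of 1^2 + 2^2 + ... + n^2.
   Illustrative sketch only. */
int main(void)
{
    for (int n = 100; n <= 800; n *= 2) {
        long long ops = 0;
        for (int i = 0; i < n; ++i)
            for (int j = i; j < n; ++j)
                for (int k = 0; k < j; ++k)
                    ++ops;
        printf("n = %4d  ops = %12lld  n^3/3 = %12lld\n",
               n, ops, (long long)n * n * n / 3);
    }
    return 0;
}

The ratio ops / (n^3 / 3) settles close to 1 as n grows, which is what the square pyramidal formula predicts.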
I have a question about calculating time complexity with O-notation. We are given this code:
int a = 0;
for (int j = 0; j < n; j++) {
    for (int i = 0; i * i < j; i++) {
        a++;
    }
}
I think the solution is O(n^2): for the first for loop we need n, and for the second we also need n... but when I answered this on the exam I got zero points for it.
... Also, for another piece of code:
int g(int y) {
    if (y < 10) {
        return 1;
    } else {
        int a = 0;
        for (int i = 0; i < n; i++) {   // n is presumably defined elsewhere; the original said "j++" here, which looks like a typo for "i++"
            a++;
        }
        return a + g(2 * (y / 3) + 1) + g(2 * (y / 3) + 2) + g(2 * (y / 3) + 3);
    }
}
I think the solution is O(n), since the simple variable assignments don't add to the count... the if statement takes constant time O(1) and is dominated by the for loop, and the for loop is O(n).
.... Also, do you have any advice or resources that explain how the running time of a program is calculated? Thank you :)
For the first code, you have:
T(n) = 1 + sqrt(2) + ... + sqrt(n) = Theta(n * sqrt(n)),
since i*i < j means i < sqrt(j). For the second, you can use the Akra–Bazzi theorem:
T(n) = T(2n/3 + 1) + T(2n/3 + 2) + T(2n/3 + 3) + n
and simplify it to T(n) = 3T(2n/3) + n to apply the master theorem, which gives Theta(n^(log_{3/2} 3)) ≈ O(n^2.71).
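To sanity-check the Theta(n * sqrt(n)) bound for the first snippet, here is a minimal C sketch (my own, not from the question) that counts how many times a++ runs and prints (2/3) * n * sqrt(n), the leading term of 1 + sqrt(2) + ... + sqrt(n), next to it:

#include <math.h>
#include <stdio.h>

/* Count the a++ executions of the first snippet and compare with
   (2/3) * n * sqrt(n). Illustrative sketch only. */
int main(void)
{
    for (long long n = 1000; n <= 100000; n *= 10) {
        long long a = 0;
        for (long long j = 0; j < n; j++)
            for (long long i = 0; i * i < j; i++)
                a++;
        printf("n = %7lld  count = %10lld  (2/3)*n*sqrt(n) = %10.0f\n",
               n, a, (2.0 / 3.0) * n * sqrt((double)n));
    }
    return 0;
}

The two columns track each other closely, confirming that the growth sits between O(n) and O(n^2).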
I am learning about Big-O and although I started to understand things, I still am not able to correctly measure the Big-O of an algorithm.
I've got a code:
int n = 10;
int count = 0;
int k = 0;
for (int i = 1; i < n; i++)
{
for (int p = 200; p > 2*i; p--)
{
int j = i;
while (j < n)
{
do
{
count++;
k = count * j;
} while (k > j);
j++;
}
}
}
for which I have to determine the Big-O and the exact runtime.
Let me start: the first for loop is O(n) because it depends on the variable n.
The second for loop is nested, so the Big-O so far is O(n^2).
But how do we account for the while (j < n) (three loops so far), and how do we account for the do ... while (k > j), which makes four loops, as in this case?
A comprehensive explanation would be really helpful.
Thank you.
Unless I'm much mistaken, this program has an infinite loop, and therefore its time complexity cannot usefully be analyzed.
In particular
do
{
    count++;
    k = count * j;
} while (k > j);
As soon as this loop is entered for the second time, count reaches 2, so k = count * j is set greater than j and remains so indefinitely (ignoring integer overflow, which will happen pretty quickly).
I understand that you're learning Big-Oh notation, but creating toy examples like this probably isn't the best way to understand Big-Oh. I would recommend reading a well-known algorithms textbook where they walk you through new algorithms, explaining and analyzing the time and space complexity as they do so.
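To see the non-termination concretely, here is a bounded trace (my own sketch, with an artificial cap so the demo halts) of the do/while with j = 2 and count picking up at 1, its value after the first, terminating pass:

#include <stdio.h>

/* Bounded trace of the suspect do/while. Once count >= 2 and j >= 1,
   k = count * j exceeds j on every pass, so the real loop (without
   the cap) never exits. The cap exists only so this demo terminates. */
int main(void)
{
    int j = 2, count = 1, k = 0;
    int cap = 8;                       /* safety cap for the demo */
    do {
        count++;
        k = count * j;
        printf("count = %d  k = %d  j = %d  (k > j is %s)\n",
               count, k, j, k > j ? "true" : "false");
    } while (k > j && --cap > 0);
    return 0;
}

Every line prints "k > j is true", which is exactly why the original loop never gets out.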
I am assuming that the while loop should be:
while (k < j)
Now, in this case, the first for loop would take O(n) time. The second loop would take O(p) time, treating its starting value 200 as a parameter p.
Now, for the third loop,
int j = i;
while (j < n) {
    ...
    j++;
}
could be rewritten as
for (j = i; j < n; j++)
meaning it shall take O(n) time.
For the last loop, the value of k increases exponentially.
Consider it to be the same as
for (k = count * j; k < j; j++, count++)
Hence it shall take O(log n) time.
The total time complexity is O(n^2 * p * log n).
This question is for revision from a past test paper; I'm just wondering if I am doing it right.
Work out the time complexity T(n) of the following piece of code in terms of the number of operations for a given integer n:
for ( int i = 1; i < n*n*n; i *= n ) {
    for ( int j = 0; j < n; j += 2 ) {
        for ( int k = 1; k < n; k *= 3 ) {
            // constant number C of elementary operations
        }
    }
}
So far I've come up with n^3 * n * log n = O(n^4 log n).
I'll have a go.
The first loop is O(1), i.e. constant, since it always runs exactly 3 iterations: i takes the values 1, n, and n^2 before reaching n*n*n.
for ( int i = 1; i < n*n*n; i *= n )
The second loop is O(0.5n) = O(n).
for ( int j = 0; j < n; j += 2 )
The third loop is O(log n).
for ( int k = 1; k < n; k *= 3 )
Therefore the time complexity of the algorithm is O(n log n).
I think you're missing the key point. I don't see the question asking you to work out complexity in terms of Big-O anywhere; instead it's asking for the number of operations for a given integer n.
Here is my solution,
For a given n, the inner loop variable successively takes the values
k = 1 = 3^0, 3, 3^2, ..., 3^(m-1), where 3^(m-1) is the last power of 3 below n.
Therefore, the inner loop performs about C * log3(n) operations for each pair of values of the
variables j and i.
The middle loop variable j takes n/2 values,
and the outer loop variable i takes three values, 1, n, and n^2, for a given n.
Therefore the time complexity of the whole piece of code is
T(n) = 3 * C * (n/2) * log3(n) = 1.5 * C * n * log3(n).
You may want to check this, but this is my interpretation of your question.
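If you want to check this closed form numerically, here is a small C sketch (my addition, not part of the original question) that counts the constant-time bodies directly and prints 1.5 * n * log3(n) next to the count:

#include <math.h>
#include <stdio.h>

/* Count the constant-time bodies executed by the nested loops and
   compare with 1.5 * n * log3(n). long long is used so that i and
   n*n*n do not overflow for the n values tried here. */
int main(void)
{
    for (long long n = 10; n <= 100000; n *= 10) {
        long long ops = 0;
        for (long long i = 1; i < n * n * n; i *= n)
            for (long long j = 0; j < n; j += 2)
                for (long long k = 1; k < n; k *= 3)
                    ops++;
        printf("n = %7lld  ops = %10lld  1.5*n*log3(n) = %10.0f\n",
               n, ops, 1.5 * n * (log((double)n) / log(3.0)));
    }
    return 0;
}

The two columns agree up to the usual floor/ceiling effects (the inner loop actually runs floor(log3(n-1)) + 1 times and the middle one ceil(n/2) times), which supports the 1.5 * C * n * log3(n) answer.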
A homework question asks me to analyse the following code fragment:
for (int i = N; i > 0; i--)
    for (int j = 0; j < i; j++)
I think the inner loop runs the following number of times:
N + (N-1) + (N-2) + ... + (N - N + 1)
However, I'm having trouble converting that into O() notation.
Could someone point me in the right direction?
By observation, the inner loop runs 1 + 2 + ... + N times in total. That's exactly N(N+1)/2, which is the formula for triangular numbers.

First, remember the definition of big-O: f is O(g) if |f/g| is bounded for large enough N. So, for example, this is O(exp(N)) and it's also O(N^3). It's also O(N(N+1)/2), but your teacher is probably expecting the answer O(N^2).

How does one show that this is O(N^2)? Well, (N(N+1)/2) / N^2 = 1/2 + 1/(2N), which is bounded by 1 for N > 0.
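For anyone who wants to see the triangular-number count and the bounding ratio numerically, here is a small C sketch (my own illustration, not from the answer):

#include <stdio.h>

/* Count how often the inner loop body runs, compare with N(N+1)/2,
   and print the ratio to N^2, which tends to 1/2. */
int main(void)
{
    for (int N = 10; N <= 10000; N *= 10) {
        long long runs = 0;
        for (int i = N; i > 0; i--)
            for (int j = 0; j < i; j++)
                runs++;
        printf("N = %6d  runs = %10lld  N(N+1)/2 = %10lld  runs/N^2 = %.4f\n",
               N, runs, (long long)N * (N + 1) / 2,
               (double)runs / ((double)N * N));
    }
    return 0;
}

The last column stays below 1 (and approaches 1/2), which is exactly the boundedness that the big-O argument above formalizes.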
What is the complexity of:
int f4(int n)
{
    int i, j, k = 1, count = 0;
    for (i = 0; i < n; i++)
    {
        k *= 3;
        for (j = k; j; j /= 2)
            count++;
    }
    return count;
}
I know it is O(n^2), but how do you calculate this? And why isn't it n*log n?
The outer loop runs n times. After the i-th pass of the outer loop (counting from 1), k = 3^i. The inner loop then runs about log2(k) times (because we halve j on each iteration).
log2(3^i) = log3(3^i) / log3(2) = i / log3(2), i.e. i divided by a constant.
So the work of the inner loop on the i-th pass is proportional to i. In other words, this program has the same complexity (but not the exact same number of iterations) as
int f4changed(int n)
{
    int i, j, count = 0;
    for (i = 0; i < n; i++)
    {
        for (j = 0; j < i; j++)
        {
            count++;
        }
    }
    return count;
}
This is O(n^2), as you've already seen.
Pass 1 (k = 3): j takes the values 3, 1 before hitting 0, so the inner body runs 2 times.
Pass 2 (k = 9): j = 9, 4, 2, 1, so 4 more times (running total 6).
Pass 3 (k = 27): j = 27, 13, 6, 3, 1, so 5 more times (running total 11).
The per-pass count grows roughly linearly, by about log2(3) ≈ 1.58 per pass, so what you have approximates an arithmetic series, which is O(n^2).
For completeness (and to explain why the exact number of iterations doesn't matter), refer to the Plain English explanation of Big-O (this is more for other readers than for you, the poster, since you seem to know what's up).
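For readers who want to check the arithmetic, here is a small C sketch (my addition, not part of the answer) that runs the original f4 and prints its return value, i.e. the total inner-loop count, next to (log2(3)/2) * n^2, the leading term suggested by the analysis above:

#include <math.h>
#include <stdio.h>

/* The original f4 plus a driver comparing its return value with
   (log2(3)/2) * n^2. n is kept small because k = 3^n overflows
   a 32-bit int around n = 20. */
static int f4(int n)
{
    int i, j, k = 1, count = 0;
    for (i = 0; i < n; i++)
    {
        k *= 3;
        for (j = k; j; j /= 2)
            count++;
    }
    return count;
}

int main(void)
{
    for (int n = 4; n <= 16; n += 4)
        printf("n = %2d  f4(n) = %4d  (log2(3)/2)*n^2 = %7.1f\n",
               n, f4(n), (log(3.0) / log(2.0)) / 2.0 * n * n);
    return 0;
}

The two columns grow together, which is the quadratic behaviour described above.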
The complexity of Log(Pow(3, n)) is ~O(n).
If the loop had used k *= 2 instead, then the number of iterations would also have been n.
When calculating big-O, the highest-order term is used and the others are neglected. Log(Pow(3, n)) can be bounded as:
Log(Pow(2,n)) <= Log(Pow(3,n)) <= Log(Pow(4,n))
Now Log(Pow(4,n)) = 2*Log(Pow(2,n)).
The highest-order term here is n (since the 2 is a constant factor).