What is the complexity of:
int f4(int n)
{
    int i, j, k = 1, count = 0;
    for (i = 0; i < n; i++)
    {
        k *= 3;
        for (j = k; j; j /= 2)
            count++;
    }
    return count;
}
I know it is O(n^2), but how do you calculate this? And why isn't it O(n log n)?
There are n passes through the outer loop. After i passes, k = 3^i. The inner loop runs about log2(k) times (because we halve j on each iteration).
log2(3^i) = log3(3^i) / log3(2) = i / log3(2) = i * (a constant)
So the complexity of the inner loop is i. In other words, this program has the same complexity (but not the exact same number of iterations) as
int f4changed(int n)
{
    int i, j, count = 0;
    for (i = 0; i < n; i++)
    {
        for (j = 0; j < i; j++)
        {
            count++;
        }
    }
    return count;
}
This is O(n^2), as you've already seen, since 0 + 1 + ... + (n - 1) = n(n - 1)/2.
On the first pass through the outer loop, k = 3 and the inner loop runs 2 times (j = 3, 1).
On the second pass, k = 9 and it runs 4 times (j = 9, 4, 2, 1).
On the third pass, k = 27 and it runs 5 times (j = 27, 13, 6, 3, 1).
The per-pass count grows by roughly log2(3) ≈ 1.58 per step, so what you have approximates an arithmetic series, which is O(n^2).
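If you want to convince yourself numerically, here is a small self-contained check (my own sketch, not part of the original question) that counts the inner-loop iterations of f4 and compares them with the arithmetic-series estimate log2(3) * n * (n + 1) / 2. n is kept small because k = 3^n overflows 64 bits past n = 39:

#include <stdio.h>
#include <math.h>

int main(void)
{
    int n = 35;
    long long k = 1, count = 0;
    for (int i = 0; i < n; i++) {
        k *= 3;                        /* k = 3^(i+1), needs 64 bits */
        for (long long j = k; j; j /= 2)
            count++;                   /* one inner-loop iteration */
    }
    /* The estimate is the arithmetic series sum of log2(3^i). */
    double estimate = log2(3.0) * n * (n + 1) / 2.0;
    printf("count = %lld, estimate = %.0f\n", count, estimate);
    return 0;
}

The two numbers agree to within a few percent; the small gap comes from the floors in the exact iteration counts.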
For completeness (and to explain why the exact number of iterations doesn't matter), refer to the "Plain English explanation of Big O" (this is more for other readers than for you, the poster, since you seem to know what's up).
The complexity of log(3^n) is O(n).
If the multiplier had been k *= 2 instead of k *= 3, the number of inner iterations would also have been about n.
For Big-O, only the highest-order term is kept and the rest is neglected. log(3^n) can be bounded as:
log(2^n) <= log(3^n) <= log(4^n)
Now log(4^n) = 2 * log(2^n), i.e. 2n when the logs are taken base 2.
The highest-order term here is n (the factor 2 is a constant).
I have a question about calculating time complexity with O-notation. We are given this code:
int a = 0;
for (int j = 0; j < n; j++) {
    for (int i = 0; i * i < j; i++) {
        a++;
    }
}
I think the solution is O(n^2): for the first for loop we need n, and for the second we need n as well... but when I answered this on the exam I got zero points for it.
Also, for another piece of code:
int g(int y) {
    if (y < 10) {
        return 1;
    } else {
        int a = 0;
        for (int i = 0; i < n; i++) {
            a++;
        }
        return a + g(2 * (y / 3) + 1) + g(2 * (y / 3) + 2) + g(2 * (y / 3) + 3);
    }
}
I think the solution is O(n): the variable declarations don't add to the running time, the if statement takes constant time O(1) and is dominated by the for loop, and the for loop is O(n).
Also, any advice or resources that explain how a program's running time is calculated? Thank you :)
For the first code, you have:
T(n) = 1 + sqrt(2) + ... + sqrt(n) = Theta(n * sqrt(n))
since i * i < j means i < sqrt(j), and the sum 1 + sqrt(2) + ... + sqrt(n) behaves like the integral of sqrt(x) from 0 to n, which is (2/3) * n^(3/2). For the second, you can use the Akra-Bazzi theorem:
T(n) = T(2n/3+1) + T(2n/3+2) + T(2n/3+3) + n
and reduce it to T(n) = 3T(2n/3) + n to use the master theorem, which gives roughly O(n^2.7) (the exponent is log base 3/2 of 3 ≈ 2.71).
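For reference, here is a tiny check of that exponent (my own sketch; the 1.5 is just the 3/2 from the recurrence):

#include <stdio.h>
#include <math.h>

int main(void)
{
    /* Master theorem exponent for T(n) = 3*T(2n/3) + n:
       p = log base 3/2 of 3; since p > 1, T(n) = Theta(n^p). */
    double p = log(3.0) / log(1.5);
    printf("log_{3/2}(3) = %.4f\n", p);   /* about 2.7095 */
    return 0;
}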
I'm trying to solve the complexity of this loop
for (int i = 0; i < n; i++) {
    int c = i;
    while (c > 1) {
        // O(1) work here
        c = c / 2;
    }
}
As the starting value of the while loop changes on every iteration of the for loop, I don't know how to calculate that strange series.
I mean, if the loop were
for (int i = 0; i < n; i++) {
    int c = n;
    while (c > 1) {
        // O(1) work here
        c = c / 2;
    }
}
I know the while loop has a complexity of O(log n) and it repeats n times, so the complexity would be O(n log n).
The problem I have with the previous loop is "c = i". Since c = i, the first time (c = 0) the while loop runs 0 times, when c = 1 it runs 0 times again, when c = 2 or c = 3 it runs 1 time, when c = 4 it runs 2 times, and the series continues: 0, 0, 1, 1, 2, 2, 2, 2, 3, ... (the number of while iterations on each pass of the for loop).
So it is not an O(log n) loop repeated n times; it repeats a number of times I can't pin down, so I don't know how to solve it.
This needs a bit of math. Given that, for positive a and b:
log(a) + log(b) = log(ab)
Here you have
log(1) + log(2) + ... + log(n) = log(1 * 2 * ... * n) = log(n!)
There is a mathematical approximation for log(n!), namely
log(n!) ~ n log(n) - n + 1
which reveals O(log(n!)) = O(n log(n)).
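If it helps, here is a small self-contained counter (my own sketch, not from the question) that tallies the while-loop iterations of the original snippet and compares the total with n * log2(n); the two are the same order of magnitude, with the gap coming from the floors and the "- n" term in the approximation above:

#include <stdio.h>
#include <math.h>

int main(void)
{
    int n = 1000000;
    long long steps = 0;
    for (int i = 0; i < n; i++) {
        int c = i;
        while (c > 1) {
            steps++;          /* one execution of the O(1) body */
            c = c / 2;
        }
    }
    printf("steps = %lld, n*log2(n) = %.0f\n",
           steps, (double)n * log2((double)n));
    return 0;
}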
I am learning about Big-O and although I started to understand things, I still am not able to correctly measure the Big-O of an algorithm.
I've got this code:
int n = 10;
int count = 0;
int k = 0;
for (int i = 1; i < n; i++)
{
    for (int p = 200; p > 2 * i; p--)
    {
        int j = i;
        while (j < n)
        {
            do
            {
                count++;
                k = count * j;
            } while (k > j);
            j++;
        }
    }
}
for which I have to determine the Big-O and the exact runtime.
Let me start: the first for loop is O(n) because it depends on the variable n.
The second for loop is nested, which makes the Big-O so far O(n^2).
So how do we account for the while (j < n) (three loops so far), and how do we account for the do ... while (k > j) when it appears and makes four loops, as in this case?
A comprehensive explanation would be really helpful.
Thank you.
Unless I'm much mistaken, this program has an infinite loop and therefore its time complexity cannot usefully be analyzed.
In particular
do
{
    count++;
    k = count * j;
} while (k > j);
As soon as this loop is entered with count >= 2 (which happens the second time it is reached), k is set to a value greater than j and stays greater on every later pass, so the loop never exits (ignoring integer overflow, which will happen pretty quickly).
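To make that concrete, here is a hand trace of the do-while the second time it is reached (j = 2); the trace is my own illustration, not part of the original post:

count = 2: k = 2 * 2 = 4, and 4 > 2, so the loop repeats
count = 3: k = 3 * 2 = 6, and 6 > 2, so the loop repeats
count = 4: k = 4 * 2 = 8, and 8 > 2, so the loop repeats
... count keeps growing, so k = count * j stays greater than j and the condition never fails.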
I understand that you're learning Big-Oh notation, but creating toy examples like this probably isn't the best way to understand Big-Oh. I would recommend reading a well-known algorithms textbook where they walk you through new algorithms, explaining and analyzing the time and space complexity as they do so.
I am assuming that the while loop should be:
while (k < j)
Now, in this case, the first for loop would take O(n) time. The second loop would take O(p) time.
Now, for the third loop,
int j = i;
while (j < n) {
    ...
    j++;
}
could be rewritten as
for (j = i; j < n; j++)
meaning it shall take O(n) time.
For the last loop, the value of k increases exponentially.
Consider it to be the same as
for (k = count * j; k < j; j++, count++)
Hence it shall take O(log n) time.
The total time complexity is O(n^2 * p * log n).
int i, j, k = 0;
for (i = n/2; i <= n; i++) {
    for (j = 2; j <= n; j = j * 2) {
        k = k + n/2;
    }
}
I came across this question and this is what I think.
The outer loop will run N/2 times and the inner loop will run log N times, so it should be (N/2) * log N. But this is not the correct answer.
The correct answer is O(N log N); can anybody tell me what I am missing?
Any help would be appreciated.
Let's take a look at this block of code.
First of all, notice that the inner loop doesn't depend on the outer one, so its complexity is the same on every iteration.
for (j = 2; j <= n; j = j * 2) {
    k = k + n/2;
}
Your knowledge should be enough to see that the complexity of this loop is O(log n): j doubles on every iteration, so it takes about log2(n) steps to exceed n.
Now we need to understand how many times this loop is performed, so let's look at the outer loop:
for (i = n/2; i <= n; i++) {
and find that there will be about n/2 iterations, which is O(n) in Big-O notation.
Combine these complexities and you'll see that your O(log n) loop is performed O(n) times, so the total complexity is O(n) * O(log n) = O(n log n). The point you were missing is that the constant factor 1/2 is dropped: (N/2) * log N is still O(N log N).
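If you want to see the count directly, here is a small self-contained version (a sketch of my own, not from the original post) that tallies the inner-loop iterations and compares them with (n/2) * log2(n):

#include <stdio.h>
#include <math.h>

int main(void)
{
    long long n = 1 << 20;
    long long k = 0, inner = 0;
    for (long long i = n / 2; i <= n; i++) {
        for (long long j = 2; j <= n; j = j * 2) {
            k = k + n / 2;
            inner++;          /* one pass of the inner loop */
        }
    }
    printf("inner = %lld, (n/2)*log2(n) = %.0f\n",
           inner, (n / 2.0) * log2((double)n));
    return 0;
}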
This question is revision from a past test paper; I am just wondering if I am doing it right.
Work out the time complexity T(n) of the following piece of code in terms of the number of operations for a given integer n:
for (int i = 1; i < n*n*n; i *= n) {
    for (int j = 0; j < n; j += 2) {
        for (int k = 1; k < n; k *= 3) {
            // constant number C of elementary operations
        }
    }
}
So far I've come up with n^3 * n * log n = O(n^4 log n).
I'll have a go.
The first loop is O(1), i.e. constant, since it always runs 3 iterations: i takes the values 1, n, and n^2, and the next value, n*n*n, fails the condition.
for ( int i = 1; i < n*n*n; i *= n )
The second loop is O(0.5n) = O(n).
for ( int j = 0; j < n; j += 2 )
The third loop is O(log n).
for ( int k = 1; k < n; k *= 3 )
Therefore the time complexity of the algorithm is O(n log n).
I think you're missing the key point. I don't see the question asking anywhere to work out the complexity in terms of Big-O. Instead it's asking for the number of operations for a given integer n.
Here is my solution.
For a given n, the inner loop variable successively takes the following
values: k = 1 = 3^0, 3, 3^2, ..., 3^(m-1), where m is about log3(n).
Therefore, the inner loop performs about C * log3(n) operations for each pair of values of the
variables j and i.
The middle loop variable j takes n/2 values,
and the outer loop variable i takes three values, 1, n, and n^2, for a given n.
Therefore the time complexity of the whole piece of code is equal to T(n) =
3 * C * (n/2) * log3(n) = 1.5 * C * n * log3(n).
You may want to check this, but this is my interpretation of your question.
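A quick way to check that count empirically (my own sketch, not part of the original answer) is to run the loops and tally the innermost passes; the result comes out close to 3 * (n/2) * log3(n), with the difference due to the ceilings on the loop counts:

#include <stdio.h>
#include <math.h>

int main(void)
{
    long long n = 1000;
    long long ops = 0;
    for (long long i = 1; i < n * n * n; i *= n) {
        for (long long j = 0; j < n; j += 2) {
            for (long long k = 1; k < n; k *= 3) {
                ops++;        /* one batch of C elementary operations */
            }
        }
    }
    printf("ops = %lld, 3*(n/2)*log3(n) = %.0f\n",
           ops, 3.0 * (n / 2.0) * (log((double)n) / log(3.0)));
    return 0;
}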