What is the time complexity of the following code?
My guess:
The for loop runs a constant number of times (3), and the function then calls itself with n/3. So n shrinks by a factor of 3 on every call, and the time complexity is O(log3 n)?
void function(int n) {
    if (n <= 1)
        return;   // base case (n <= 1 also covers inputs that skip past 1)
    for (int i = 0; i < 3; i++) {
        cout << "Hello";
    }
    function(n / 3);
}
Yes, it's O(log3 n) (and since logarithms of different bases differ only by a constant factor, this is the same as O(log n)). Call the amount of work done by the loop C. The first few calls go:
f(n) = C + f(n/3) = C + C + f(n/9) = C + ... + C + f(1)
The number of times C appears is the number of times you can divide n by 3 before it reaches 1, which is exactly log3(n). So the total work is C * log3(n), or O(log3 n).
Related
I have been doing some examples of calculating the cost of algorithms with function calls inside loops, and this one has me wondering how I could calculate it.
double example(int max) {
    int i = 1;
    double x = 0.0;
    while (i <= max) {
        x = calculate(x, i);
        i = 2 * i;
    }
    return x;
}
We know that calculate(x, i) is O(i), and that the order of example should be expressed in terms of max.
An easier example would be the same code with a for(int i = 1; i <= max; i++) loop, which would make the order of example O(max^2); but here i is doubled on each iteration. How can the cost be calculated in this case?
The while loop runs log(max) times, and the iteration with i = 2^k costs O(2^k). Hence the total time complexity is:
T(max) = O(1 + 2 + 2^2 + ... + 2^{log(max)}) = O(2^{log(max) + 1} - 1)
Since 2^{log(max)} = max, T(max) = Theta(max).
When I submit to LeetCode it passes 500 of 502 cases but fails on the input 1808548329. But when I run it on my own Mac, it gives the same answer as the accepted solution.
My code:
int trailingZeroes(int n) {
    int count = 0;
    int tmp = 0; // check every number in [1, n]
    for (int i = 1; i <= n; i++) {
        tmp = i;
        while (tmp % 5 == 0) {
            count++;
            tmp /= 5;
        }
    }
    return count;
}
and the accepted answer:
int trailingZeroes2(int n) {
    return n == 0 ? 0 : n / 5 + trailingZeroes2(n / 5);
}
They give the same result on my Mac:
std::cout << trailingZeroes(1808548329) << std::endl; //452137076
std::cout << trailingZeroes2(1808548329) << std::endl; //452137076
Is the first solution not accepted because of its time complexity? (After all, running it on my own Mac gives the same answer the accepted one gave.)
How can I calculate the time complexity of the first solution? Is it O(n log n)? I am not sure. Can you do me a favor? :-)
Your solution is O(n).
The inner loop repeats at least once every 5 items
The inner loop repeats at least twice every 25 items
...
The inner loop repeats at least k times every 5^k items.
Summing these together, the inner loop runs a total of:
n/5 + n/25 + n/125 + ... <= n * (1/5 + 1/25 + 1/125 + ...) = n/4
This is the sum of a geometric series, so the inner loop contributes O(n) in total.
In addition, the outer loop itself has O(n) iterations, each of constant cost if we ignore the inner loop, so the total remains O(n).
The alternative solution, however, runs in O(log n), which is significantly more efficient.
What is the order of growth of the worst case running time of the following code fragment as a function of N?
int cnt = 0;
for (int i = 1; i <= N; i = i*4)
    for (int j = 0; j < i; j++)
        cnt++;
I know, for example, that the first loop executes ~log4(N) times and the second loop executes at most ~N times. But how do I combine this knowledge to find the answer?
What is the general way to find this kind of complexity?
Presumably we need to know how many times the body of the inner loop is executed?
For example 1 + 4 + 16 + 64 + ... + N.
This is a geometric progression: with ratio x, the sum of m + 1 terms is (x^(m+1) - 1)/(x - 1). Here x = 4 and m = log4(N), so the result is
(4^(log4(N) + 1) - 1)/(4 - 1) = (4N - 1)/3
Let N belong to the interval [4^k, 4^(k+1)). Then the total count is
sum 4^i, i = 0..k = (4^(k+1) - 1)/3 = O(N)
I have the following algorithm:
I analyzed this algorithm as follows:
Since the outer for loop runs up to n, it iterates at most n times,
and the loop on j runs from i to n, which is again at most n times;
doing the same for all four nested for loops gives a running time of O(n^4).
But when I run this code for different input sizes, the measured counts look much closer to n^3. Can anyone explain why this happens, or what is wrong with my analysis that it gives such a loose bound?
Formally, you may proceed like the following, using Sigma notation, to obtain the order of growth of your algorithm:
T(n) = sum_{i=0}^{n-1} sum_{j=i}^{n-1} sum_{k=0}^{j-1} sum_{h=0}^{i-1} 1 = sum_{i=0}^{n-1} i * sum_{j=i}^{n-1} j
Moreover, the equation obtained gives the exact number of iterations executed inside the innermost loop:
int sum = 0;
for (i = 0; i < n; i++)
    for (j = i; j < n; j++)
        for (k = 0; k < j; k++)
            for (h = 0; h < i; h++)
                sum++;
printf("\nsum = %d", sum);
When T(10) = 1155, sum = 1155 as well.
I'm sure there's a conceptual way to see why, but for the adjusted loop in the edit below you can prove by induction that it performs (n + 2)(n + 1)n(n - 1)/24 iterations (495 for n = 10). Proof left to the reader.
In other words, it is indeed O(n^4).
Edit: Your count increases too frequently. Simply try this code to count the number of iterations:
for (int n = 0; n < 30; n++) {
    int sum = 0;
    for (int i = 0; i < n; i++) {
        for (int j = i; j < n; j++) {
            for (int k = 0; k < j; k++) {
                for (int h = k; h < i; h++) {
                    sum++;
                }
            }
        }
    }
    System.out.println(n + ": " + sum + " = " + (n + 2) * (n + 1) * n * (n - 1) / 24);
}
You have a rather complex algorithm. The number of operations is clearly less than n^4, but it isn't at all obvious how much less, and whether it is O(n^3) or not.
Checking the values n = 1 to 9 and guessing from the results is rather pointless.
To get a slightly better idea, assume that the number of steps is either c * n^3 or d * n^4, and make a table of the values c and d for 1 <= n <= 1,000. That might give you a better idea. It's not a foolproof method; some algorithms change their behaviour dramatically much later than n = 1,000.
The best method is of course a proof. Just remember that O(n^4) doesn't mean "approximately n^4 operations"; it means "at most c * n^4 operations, for some c". Sometimes c is small.
I had this question for my assignment the other day, but I was still unsure if I'm right.
for (int i = 1; i < n; i++) // n is some size
{
    for (int j = 1; j < i; j++)
    {
        int k = 1;
        while (k < n)
        {
            k = k + C; // where C is a constant >= 2
        }
    }
}
I know the nested for loops are O(n^2), but I wasn't sure about the while loop. I assumed the whole code would be O(n^3).
The inner while loop runs O(n/C) = O(n) times, since C is a constant, so yes, overall it's O(n^3) (the j loop has an upper bound of O(n)).
int k=1;
while (k<n){
k=k+C //where C is a constant and >=2
}
This will take about (n-1)/C steps: write u = (k-1)/C. Then k = Cu + 1, and the loop becomes
u=0;
while(u < (n-1)/C) {
u=u+1
}
Hence the while loop is O(n) (since C is constant)
EDIT: let me try to explain it the other way around.
Start with a dummy variable u. The loop
u=0;
while(u < MAX) {
u = u+1
}
runs MAX times.
When you let MAX = (n-1) / C, the loop is
u=0;
while(u < (n-1)/C) {
u=u+1
}
And that runs (n-1)/C times.
Now, the check u < (n-1)/C is equivalent to C*u < n-1 or C*u + 1 < n, so the loop is equivalent to
u=0;
while(C*u + 1 < n) {
u=u+1
}
Now, suppose we rewrite this loop in terms of a variable k = C * u + 1. The initialization
u=0;
becomes
k=1; // C * 0 + 1 = 1
the loop condition
while(C*u + 1 < n) {
becomes
while(k < n) {
and the body update
u=u+1
becomes
k=k+C // C * (u+1) + 1 = (C * u + 1) + C = old_k + C
Putting it all together:
int k=1;
while (k<n){
k=k+C
}
takes (n-1)/C steps.
Formally, you may proceed using the following methodology (Sigma notation):
T(n) = sum_{i=1}^{n-1} sum_{j=1}^{i-1} ceil((n-1)/C) * a = a * (n-1)(n-2)/2 * ceil((n-1)/C) = O(n^3)
Where a symbolizes the number of constant operations inside the innermost loop (a = 1 if you want to count the exact number of iterations).
Well, you would need to look at how many times the while loop body runs for given values of n and C. For example, for n = 10 and C = 3 the body runs 3 times: k = 1, 4, 7. For n = 100 and C = 2, the body runs 50 times: k = 1, 3, 5, 7, 9, ..., 97, 99. It is a matter of counting by C up to n. You should be able to calculate the Big-O complexity from that clue.