This question already has answers here:
How can I find the time complexity of an algorithm?
(10 answers)
Closed 2 years ago.
I have a question about calculating time complexity with Big-O notation. We are given this code:
int a = 0;
for (int j = 0; j < n; j++) {
    for (int i = 0; i * i < j; i++) {
        a++;
    }
}
I think the solution is O(n^2): for the first for loop we need n, and for the second we also need n... but when I answered this on the exam, I got zero points for it.
Also, for another piece of code:
int g(int y) {
    if (y < 10) {
        return 1;
    } else {
        int a = 0;
        for (int i = 0; i < n; i++) {
            a++;
        }
        return a + g(2 * (y / 3) + 1) + g(2 * (y / 3) + 2) + g(2 * (y / 3) + 3);
    }
}
I think the solution is O(n), since the variable declarations take constant time; the if statement has constant time O(1) and is dominated by the for loop, which is O(n).
Also, any advice or resources that explain how a program's running time is calculated? Thank you :)
For the first code, you have:
T(n) = 1 + sqrt(2) + ... + sqrt(n) = Theta(n*sqrt(n))
since i*i < j means i < sqrt(j). For the second, you can use the Akra-Bazzi theorem:
T(n) = T(2n/3+1) + T(2n/3+2) + T(2n/3+3) + n
and simplify it to T(n) = 3T(2n/3) + n to apply the master theorem (~O(n^2.7)).
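A quick way to sanity-check the Theta(n*sqrt(n)) bound for the first code is to count the increments directly. The sketch below (class and method names are my own, hypothetical) counts how often a++ runs and prints the count next to n^1.5:

```java
public class InnerSqrtLoop {
    // Count how many times a++ executes for a given n.
    static long count(int n) {
        long a = 0;
        for (int j = 0; j < n; j++) {
            // i runs while i*i < j, i.e. roughly sqrt(j) times.
            for (int i = 0; i * i < j; i++) {
                a++;
            }
        }
        return a;
    }

    public static void main(String[] args) {
        // The count grows like (2/3) * n^1.5: far below quadratic growth.
        for (int n : new int[]{1_000, 10_000, 100_000}) {
            System.out.printf("n=%d count=%d n^1.5=%.0f%n", n, count(n), Math.pow(n, 1.5));
        }
    }
}
```

The total is sum over j of about sqrt(j), which integrates to roughly (2/3) n^1.5, matching the Theta(n*sqrt(n)) answer.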
This question already has answers here:
Big O, how do you calculate/approximate it?
(24 answers)
Closed 1 year ago.
What is the Big O for the series below?
1 + 2 + 3 + 4 + ... + N
If I have to write code for the series, it will be like:
public void sum(int n) {
    int sum = 0;
    for (int i = 1; i <= n; i++) {
        sum += i;
    }
    System.out.println(sum);
}
Based on the above code, it's O(N).
Somewhere (in a Udemy course) I read that the order of the series is O(N^2). Why?
The code below has runtime O(N).
public void sum(int n) {
    int sum = 0;
    for (int i = 1; i <= n; i++) {
        sum += i;
    }
    System.out.println(sum);
}
However,
O(1+2+3+...+N) is O(N^2), since 1+2+3+...+N = N(N+1)/2 = O(N^2).
I am guessing you read the second statement and are confusing the two.
You are confusing the complexity of computing 1 + 2 + ... + N (by summing) with the result of computing it.
Consider the cost function f(N) = 1 + 2 + ... + N. That simplifies to N(N + 1)/2, and has the complexity O(N^2).
(I expect that you learned that sum in your high school maths course. They may even have taught you how to prove it ... by induction.)
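For reference, that induction proof is short; a sketch in standard notation:

```latex
\textbf{Claim: } \sum_{k=1}^{N} k = \frac{N(N+1)}{2} \text{ for all } N \ge 1.

\textbf{Base case: } N = 1:\quad 1 = \frac{1 \cdot 2}{2}.

\textbf{Inductive step: } \text{assume the claim holds for } N. \text{ Then}
\sum_{k=1}^{N+1} k = \frac{N(N+1)}{2} + (N+1) = \frac{(N+1)(N+2)}{2},
\text{which is the claim for } N+1.
```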
On the other hand, the algorithm
public void sum(int n) {
    int sum = 0;
    for (int i = 1; i <= n; i++) {
        sum += i;
    }
    System.out.println(sum);
}
computes 1 + 2 + ... + N by performing N additions. When you do a full analysis of the algorithm, taking all of the computation into account, the cost function turns out to be in the complexity class O(N).
But I can also compute 1 + 2 + ... + N with a simpler algorithm that makes use of our math knowledge:
public void sum(int n) {
    System.out.println(n * (n + 1) / 2);
}
This alternative algorithm's cost function is O(1)!
Lesson: don't confuse an algorithm's cost function with its result.
This question already has an answer here:
How the time complexity of the following code is O(n)?
(1 answer)
Closed 6 years ago.
I have come across this question, which asks you to find the time complexity.
int count = 0;
for (int i = N; i > 0; i /= 2) {
    for (int j = 0; j < i; j++) {
        count += 1;
    }
}
It says that its time complexity is O(n). Shouldn't it be O(n log n), as the first loop is log n and the second is n?
The inner loop's bound depends on the outer loop's variable, so your claim is not valid.
And += (the addition assignment operator) has complexity O(1).
For first iteration of the outer-loop, the inner-loop will execute for N times.
For second iteration of the outer-loop, the inner-loop will execute for N/2 times.
And, so on...
Therefore, the total number of execution steps is
N + N/2 + N/4 + ... + 1,
a geometric progression with about log2(N) terms. Bounding it from above by the infinite GP sum (even though the series stops at 1) gives approximately
N / (1 - 1/2) = 2N.
Therefore, the time complexity of the code comes out to be O(N).
So, the answer given is correct.
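To see the 2N bound concretely, here is a small sketch (class and method names are my own, hypothetical) that counts the inner-loop steps and prints them next to 2N:

```java
public class HalvingLoop {
    // Total inner-loop iterations when the outer counter halves each time.
    static long steps(int n) {
        long count = 0;
        for (int i = n; i > 0; i /= 2) {
            for (int j = 0; j < i; j++) {
                count++;
            }
        }
        return count;
    }

    public static void main(String[] args) {
        // N + N/2 + N/4 + ... + 1 stays below 2N, hence O(N) rather than O(N log N).
        for (int n : new int[]{1 << 10, 1 << 16, 1 << 20}) {
            System.out.println("N=" + n + " steps=" + steps(n) + " 2N=" + 2L * n);
        }
    }
}
```

For N a power of two the count is exactly 2N - 1, and for every N it stays strictly under 2N.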
I have the following code and I want to find the Big O. I wrote my answers as comments and would like to check my answers for each sentence and the final result.
public static void findBigO(int[] x, int n)
{                                  // 1 time
    for (int i = 0; i < n; i += 2) // n time
        x[i] += 2;                 // n+1 time
    int i = 1;                     // 1 time
    while (i <= n/2)               // n time
    {
        x[i] += x[i+1];            // n+1 time
        i++;                       // n+1 time
    }
}                                  // 0
//result: 1 + n + n+1 + n + n+1 + n+1 = O(n)
First of all: simple sums and increments are O(1); they are done in constant time. So x[i] += 2; is constant, since array indexing is also O(1), and the same is true for i++ and the like.
Second: the complexity of a function is measured relative to its input size, so strictly speaking this function's time complexity is only pseudo-polynomial.
Since n is an integer, the loop takes about n/2 iterations, which is linear in the value of n but not in the size of n (4 bytes, or log(n) bits).
So this algorithm is in fact exponential in the size of n.
for (int i = 0; i < n; i += 2) // O(n)
    x[i] += 2;
int i = 1;
while (i <= n/2) // O(n/2)
{
    x[i] += x[i+1];
    i++;
}
O(n) + O(n/2) = O(n) in terms of Big O.
You have to watch out for nested loops that depend on n; if (as I first thought, thanks to the double use of i) the loops had been nested, you would have had O(n) * O(n/2), which is O(n^2). Here it is in fact about 1.5n + C, but that form is never used to describe a Big-O class.
With Big O you push the values towards infinity: no matter how large C is, it eventually becomes negligible, just as 1,000,000n and n end up in the same class. The constant prefix will eventually be irrelevant.
That being said, the constant factors on n do matter in practice, just not in a Big-O context.
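As a concrete check of the O(n) + O(n/2) = O(n) claim, this sketch (class and method names are my own, hypothetical) counts the loop iterations of findBigO without touching an array:

```java
public class FindBigOCount {
    // Count the loop iterations performed by findBigO for a given n.
    static long operations(int n) {
        long ops = 0;
        for (int i = 0; i < n; i += 2) {
            ops++; // first loop: ceil(n/2) iterations
        }
        for (int i = 1; i <= n / 2; i++) {
            ops++; // while loop: floor(n/2) iterations
        }
        return ops;
    }

    public static void main(String[] args) {
        // ceil(n/2) + floor(n/2) = n: linear, and the constant factor drops out in Big O.
        System.out.println(operations(1000));
    }
}
```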
This question already has answers here:
Running time complexity of double for-loops
(3 answers)
Closed 7 years ago.
Take N = 5 as an example. The number of operations in the two loops is 5 + 4 + 3 + 2 + 1 = 15, which seems to be N + N-1 + N-2 + N-3 + ... + 1. Can I say this is N^2?
for (int i = 0; i < N; i++) {
    for (int j = i; j < N; j++) {
        ... //
    }
}
As mentioned, the complexity of this is (most likely) O(N²), but only if the work inside the loop is O(1). If not, the overall complexity could be different.
Also, the best and worst cases can differ if there is a break condition in the loop.
Example:
for (int i = 0; i < N; i++) {
    for (int j = i; j < N; j++) {
        if (N % 2 == 1)
            break;
    }
}
The best case is Θ(N) (all odd N), the worst case is Θ(N²) (all even N).
If N = 2n + 1 (for any positive integer n, i.e. N is odd), the condition N % 2 == 1 will be true, so the inner loop is exited after the first run. So the complexity is O(N ⋅ 1) = O(N).
If N = 2⋅n (for any positive integer n, i.e. N is even), the condition is never true, so the inner loop is executed completely. So the complexity is
O(N + N-1 + ... + 2 + 1) = O(N⋅(N+1)/2) = O(N²).
If the code in the inner loop needs more than O(1), the overall complexity will not fit into O(N²).
It is N(N+1)/2, which in Big-O notation still grows as N^2, but is fairly good as N^2 algorithms go.
Using Sigma notation, you obtain this:
Σ (i = 0 .. N-1) Σ (j = i .. N-1) 1 = Σ (i = 0 .. N-1) (N - i) = N(N+1)/2
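A quick sketch (class and method names are my own, hypothetical) confirming the triangular count for these nested loops:

```java
public class TriangularLoops {
    // Count how many times the inner body runs when j starts at i.
    static long bodyRuns(int n) {
        long runs = 0;
        for (int i = 0; i < n; i++) {
            for (int j = i; j < n; j++) {
                runs++;
            }
        }
        return runs;
    }

    public static void main(String[] args) {
        // For N = 5: 5 + 4 + 3 + 2 + 1 = 15 = N(N+1)/2, which grows as N^2.
        int n = 5;
        System.out.println(bodyRuns(n) + " == " + n * (n + 1) / 2);
    }
}
```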
The complexity given for the following problem is O(n). Shouldn't it be O(n^2)? After all, the outer loop is O(n) and the inner loop is also O(n), therefore n*n = O(n^2).
The answer sheet for this question states that the answer is O(n). How is that possible?
public static void q1d(int n) {
    int count = 0;
    for (int i = 0; i < n; i++) {
        count++;
        for (int j = 0; j < n; j++) {
            count++;
        }
    }
}
The complexity given for the following problem is O(n^2); how can you obtain that? Can someone please elaborate?
public static void q1E(int n) {
    int count = 0;
    for (int i = 0; i < n; i++) {
        count++;
        for (int j = 0; j < n/2; j++) {
            count++;
        }
    }
}
Thanks
The first example is O(n^2), so it seems they've made a mistake. To calculate (informally) the second example, we can do n * (n/2) = (n^2)/2 = O(n^2). If this doesn't make sense, you need to go and brush up on what it means for something to be O(n^k).
The complexity of both pieces of code is O(n*n).
FIRST
The outer loop runs n times, and the inner loop runs n times on every iteration of the outer loop, so
total = n + n + ... + n (n times) = n * n
which is O(n*n).
SECOND
The outer loop runs n times, and the inner loop runs n/2 times on every iteration of the outer loop, so
total = n/2 + n/2 + ... + n/2 (n times) = n * n / 2
which is also O(n*n).
The first case is definitely O(n^2).
The second is O(n^2) as well, because you omit constants when calculating Big O.
Your answer sheet is wrong, the first algorithm is clearly O(n^2).
Big-Oh notation describes an upper bound, so when calculating the Big-Oh value we generally ignore multiplications/divisions by constants.
That being said, your second example is also O(n^2) because, although the inner loop is "only" n/2 iterations, n is the clear bounding factor. In practice the second algorithm will perform fewer operations, but Big-Oh is intended as a maximal bounding measurement, so the exact number of operations is ignored in favor of focusing on how the algorithm behaves as n approaches infinity.
Both are O(n^2). Your answer is wrong. Or you may have written the question incorrectly.
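To make the comparison concrete, this sketch (class name is my own, hypothetical) reproduces both methods but returns the counts, so the quadratic growth can be inspected directly:

```java
public class QuadraticCounts {
    // q1d: inner loop always runs n times -> n + n*n increments of count.
    static long q1d(int n) {
        long count = 0;
        for (int i = 0; i < n; i++) {
            count++;
            for (int j = 0; j < n; j++) {
                count++;
            }
        }
        return count;
    }

    // q1e: inner loop runs n/2 times -> n + n*(n/2) increments of count.
    static long q1e(int n) {
        long count = 0;
        for (int i = 0; i < n; i++) {
            count++;
            for (int j = 0; j < n / 2; j++) {
                count++;
            }
        }
        return count;
    }

    public static void main(String[] args) {
        // Both counts are quadratic in n; the factor 1/2 disappears in Big O.
        int n = 1000;
        System.out.println("q1d=" + q1d(n) + " q1e=" + q1e(n));
    }
}
```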