I need to find the closest upper and lower bounds for the following code. I am a beginner at this, so sorry for my mistakes.
The upper bound for p() is O(log(n)), and the lower bound is O(1).
The upper bound for notp() is O(log(n)), and the lower bound is O(1).
I think the lower bound is O(1) because if I have n = 4, then I get into the loop, and since n % i == 0 I call p(), which notices that the number is not prime, so O(1); then, since i = 2, notp() would not be executed. That is the best-case scenario.
The worst-case scenario is that I go through the whole loop, so that is log(n), and execute p(), whose upper bound is O(log(n)), so it is O(log(n)^2). But I am not so sure that this is right; please tell me where I made a mistake.
int i;
for (i = 2; i*i <= n; i++) {
    if (n % i == 0) {
        p();
        break;
    }
}
if (i*i > n)
    notp();
For order calculations, you normally multiply the number of iterations of a loop by the cost of the work inside it.
for( i = 0; i < n ; i++ ){
    DoSomething(i);
}
That would be O(n), as it is a single, un-nested loop.
A nested loop would be
for( i = 0; i < n ; i++ ){
    for( j = 0; j < n; j++ ){
        DoSomething(i,j);
    }
}
That becomes O( n^2 ), as the iteration counts multiply.
If you have non-nested loops...
for( i = 0; i < n ; i++ ){
    DoSomething(i);
}
for( j = 0; j < sqrt(n) ; j++ ){
    DoSomething(j);
}
Then the O( x ) is the largest term. That is because for very large n, the smaller terms don't matter.
So you have a loop in your case, which is O( sqrt(n) ). That is because it only runs while i*i <= n, i.e. while i <= sqrt(n).
Then either one of p() or notp() will be called.
(I think p() and notp() are the wrong way round).
Re-factoring your code....
int i;
for (i = 2; i*i <= n; i++) {
    if (n % i == 0) {
        /* p(); */
        break;
    }
}
if (i*i > n)
    notp();
else
    p();
So we have the O( sqrt(n) ) loop plus one call to either p() or notp(), each of which is O( log(n) ):
O( sqrt(n) + log(n) )
As sqrt(n) grows faster than log(n), it overpowers the log(n) term (see Big O notation on Wikipedia), leaving the final value:
O( sqrt(n) )
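As a quick sanity check, here is a minimal counting sketch of my own (not part of the original question) that tallies how many times the loop body runs and compares it against sqrt(n):
#include <stdio.h>
#include <math.h>

int main(void) {
    /* For a few sample values of n, count the trial-division
       iterations and compare with sqrt(n). 999983 is prime, so
       it exercises the worst case (no early break). */
    long samples[] = { 100, 10000, 1000000, 999983 };
    for (int k = 0; k < 4; k++) {
        long n = samples[k], count = 0, i;
        for (i = 2; i*i <= n; i++) {
            count++;
            if (n % i == 0)
                break;      /* this is where p() would run */
        }
        printf("n = %7ld  iterations = %4ld  sqrt(n) = %.0f\n",
               n, count, sqrt((double)n));
    }
    return 0;
}
The worst case (a prime n) does roughly sqrt(n) iterations; a composite n with a small factor breaks out almost immediately, which is the best case the question describes.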
As the termination condition of the loop is i*i <= n, the maximum possible number of iterations of the loop is sqrt(n). As there is a break when n % i == 0, the worst case of this part is sqrt(n) + log(n) (the loop plus one call to p()), which is O(sqrt(n)). Also, whether or not the second condition is true, as the worst case of notp() is O(log(n)), the worst case of the algorithm in total is O(sqrt(n)).
I am kind of confused by this. I know that if the outer loop runs while i < n and the inner loop while j < i, the complexity is O(n^2). But if we raise the limits from n and i to n^2 and i^2 respectively, does the complexity double to O(n^4), or does it become cubic, O(n^3)?
for (long i = 1; i < n*n; i++)
{
    for (long j = 1; j < i * i; j++)
    {
        //some code
    }
}
Assuming that //some code takes O(1) operations, the time complexity is O(n^6). Since the inner loop takes i^2 - 1 iterations, you can use the sum of squares formula (or equivalently use Wolfram Alpha) to get sum_{i=1}^{n^2-1} (i^2 - 1), whose leading term is (n^2)^3 / 3 = n^6 / 3, which is Θ(n^6).
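A small counting sketch of my own (not from the original answer) confirms the n^6 / 3 growth empirically:
#include <stdio.h>

int main(void) {
    /* Count the exact number of inner-loop executions and compare
       with the leading term n^6 / 3 from the sum-of-squares formula. */
    for (long n = 5; n <= 20; n += 5) {
        long long count = 0;
        for (long i = 1; i < n * n; i++)
            for (long j = 1; j < i * i; j++)
                count++;
        double predicted = (double)n * n * n * n * n * n / 3.0;
        printf("n = %2ld  count = %12lld  n^6/3 = %12.0f  ratio = %.3f\n",
               n, count, predicted, count / predicted);
    }
    return 0;
}
The printed ratio approaches 1 as n grows, matching the Θ(n^6) bound.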
I'm trying to find the frequency of each statement and the big O of this method.
But I'm struggling with the else part. I know that we take the worse complexity of the if and the else.
But logically, in this case, do I have to multiply the frequency of the outer loop (n) by the frequency of the else loop (n+1)? I know that the else block will be executed only one time, when i = 0,
but when we follow the rules we have to multiply them.
So I'm stuck here and I don't know what to do in this case. I hope you guys can help me.
Thanks!
int i, j, sum = 0;
for (i = 0; i < n; i++)
    if (i != 0)
        sum += i;
    else
        for (j = 0; j < n; j++)
            sum += j;
So, you have an O(n) loop with an O(n) else branch inside it.
But in your code, we only go through the else branch once.
So you have a for loop; inside it, the if does O(1) work (n - 1) times, and once (when i == 0) the else branch does O(n) work.
So the complexity would be (n - 1) * O(1) + O(n) ~ O(2n), which is also O(n), because we drop the constants when doing asymptotic analysis.
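To make the (n - 1) * O(1) + O(n) breakdown concrete, here is a minimal instrumented sketch of my own (with Sum renamed to sum so it compiles):
#include <stdio.h>

int main(void) {
    long n = 1000, sum = 0;
    long if_count = 0, else_count = 0;   /* how often each branch body runs */
    for (long i = 0; i < n; i++) {
        if (i != 0) {
            sum += i;                    /* O(1) work, executed n - 1 times */
            if_count++;
        } else {
            for (long j = 0; j < n; j++) {
                sum += j;                /* O(n) work, reached only once */
                else_count++;
            }
        }
    }
    /* Total statement executions: (n - 1) + n = 2n - 1, i.e. O(n). */
    printf("if body: %ld, else inner body: %ld, total: %ld\n",
           if_count, else_count, if_count + else_count);
    return 0;
}
For n = 1000 it prints 999 and 1000, i.e. 2n - 1 statement executions in total, confirming the O(n) result.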
This is the loop structure:
for (int i = 1 ; i < n ; i++) {
    for (int j = 0 ; j < n ; j += i) {
        // do stuff
    }
}
My guess was O(n log n): it clearly cannot be O(n^2), since the increment in j keeps growing, and it cannot be O(n sqrt(n)) either, since the increment is not that high. But I have no idea how to prove it formally.
The time complexity of the inner loop depends on the value of i: it runs about n/i times. Hence, the total time is n + n/2 + n/3 + ... + n/n = n(1 + 1/2 + 1/3 + ... + 1/n).
As we know, 1 + 1/2 + 1/3 + ... + 1/n is the harmonic series, which is asymptotically log(n). Hence, the algorithm runs in O(n log(n)).
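For a quick empirical confirmation, here is a counting sketch of my own that compares the exact iteration count with n * ln(n):
#include <stdio.h>
#include <math.h>

int main(void) {
    /* Count inner-loop executions and compare with n * ln(n),
       the asymptotic value of n * (1 + 1/2 + ... + 1/n). */
    for (long n = 1000; n <= 1000000; n *= 10) {
        long long count = 0;
        for (long i = 1; i < n; i++)
            for (long j = 0; j < n; j += i)
                count++;
        printf("n = %7ld  count = %10lld  n*ln(n) = %10.0f\n",
               n, count, n * log((double)n));
    }
    return 0;
}
The count grows like n * ln(n) plus lower-order terms, exactly as the harmonic-series argument predicts.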
I have the following code and I want to find the Big O. I wrote my answers as comments and would like to check them for each statement and for the final result.
public static void findBigO(int [] x, int n)
{
    //1 time
    for (int i = 0; i < n; i += 2) //n time
        x[i] += 2; //n+1 time
    int i = 1; //1 time
    while (i <= n/2) // n time
    {
        x[i] += x[i+1]; // n+1 time
        i++; //n+1 time
    }
} //0
//result: 1 + n + n+1 + n + n+1 + n+1 = O(n)
First of all: simple sums and increments are O(1); they run in constant time. So x[i] += 2; is constant (array indexing is also O(1)), and the same is true for i++ and the like.
Second: the complexity of a function is measured relative to its input size, so in fact this function's time complexity is only pseudo-polynomial.
Since n is an integer, the loop takes about n/2 iterations, which is linear in the value of n but not in the size of n (4 bytes, or about log(n) bits).
So this algorithm is in fact exponential in the size of n.
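To make the "exponential in the size of n" point concrete, here is a small sketch of my own illustration: for an input n = 2^b with b bits, the while loop makes about n/2 = 2^(b-1) passes, so each extra bit of input doubles the work.
#include <stdio.h>

int main(void) {
    /* For n = 2^b, "while (i <= n/2)" runs n/2 = 2^(b-1) times:
       linear in the VALUE of n, exponential in its bit-length b. */
    for (int b = 4; b <= 24; b += 4) {
        long n = 1L << b;         /* n = 2^b, about b bits long */
        long iterations = n / 2;  /* one pass per i = 1 .. n/2 */
        printf("b = %2d bits  n = %9ld  iterations = %9ld\n",
               b, n, iterations);
    }
    return 0;
}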
for (int i = 0; i < n; i += 2) // O(n)
    x[i] += 2;
int i = 1;
while (i <= n/2) // O(n/2)
{
    x[i] += x[i+1];
    i++;
}
O(n) + O(n/2) = O(n) in terms of Big O.
You have to watch out for nested loops that depend on n: if (as I first thought, thanks to the double usage of i) the loops had been nested, you would have had O(n) * O(n/2), which is O(n^2). In this case it is in fact about 1.5n + C operations; however, that is never used to describe an Ordo.
With Big O you push the values towards infinity: no matter how large a constant C you have, it will in the end be negligible, just as 1,000,000n and n are both O(n). The constant factor eventually stops mattering.
That being said, the modifiers of n as well as the constants do matter, just not in an Ordo context.
for( int bound = 1; bound <= n; bound *= 2 ) {
    for( int i = 0; i < bound; i++ ) {
        for( int j = 0; j < n; j += 2 ) {
            ... // constant number of operations
        }
        for( int j = 1; j < n; j *= 2 ) {
            ... // constant number of operations
        }
    }
}
The correct answer is O(n^2). As I understand it, the first two for loops are "nested", since bound goes from 1 to n and i goes up to bound. The third and fourth loops are not nested.
The third for loop has complexity O(n/2), and the fourth for loop has complexity O(log n).
Someone told me that since n^2 > n/2 + log n, the algorithm is O(n^2). (I'm really confused about this step.)
I thought you were supposed to multiply each loop, so shouldn't it be log n (outer loop) * n (second outer loop) * n (third) * log n (fourth)? How do I know which loops I need to add and which to multiply, and which loop's value to take (do I take n over log n for the first two loops)?
The complexity should be O(n^2).
First consider the inner-most loops:
for( int j = 0; j < n; j += 2 ) {
    ... // constant number of operations
}
Index j takes values 0, 2, 4, 6, 8, ... up to n, so j can take at most n / 2 + 1 possible values, hence the complexity of this loop is O(n).
And for another inner-most loop:
for( int j = 1; j < n; j *= 2 ) {
    ... // constant number of operations
}
Index j takes values 1, 2, 4, 8, 16, ... up to n, so j can take at most log(n) + 1 possible values, hence the complexity of this loop is O(log(n)).
Therefore for any fixed bound and i, the total work done by the two inner-most loops is O(n) + O(log(n)) = O(n).
Now consider the two outer-most loops. Note that if the variable bound for the outer-most loop is k, then the loop indexed by i will loop exactly k times.
for( int bound = 1; bound <= n; bound *= 2 ) {
    for( int i = 0; i < bound; i++ ) {
        // work done by the inner-most loops
    }
}
Since bound takes values 1, 2, 4, 8, ..., these two nested loops execute the inner-most part a total of:
1 + 2 + 4 + 8 + ... + 2^⌊log(n)⌋
times. Note that this is just a geometric series with common ratio 2, so the summation gives us:
2^(⌊log(n)⌋+1) - 1
= O(2^log(n))
= O(n)
Since the work done by the inner-most loops is independent of these two loops, the total complexity is given by O(n) * O(n) = O(n^2).
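As a sanity check, here is a counting sketch of my own for the full loop nest; doubling n should roughly quadruple the count if the total is Θ(n^2):
#include <stdio.h>

int main(void) {
    /* Count every execution of the two inner-most loop bodies. */
    for (long n = 1024; n <= 8192; n *= 2) {
        long long count = 0;
        for (long bound = 1; bound <= n; bound *= 2)
            for (long i = 0; i < bound; i++) {
                for (long j = 0; j < n; j += 2) count++;
                for (long j = 1; j < n; j *= 2) count++;
            }
        printf("n = %5ld  count = %12lld  count/n^2 = %.3f\n",
               n, count, (double)count / ((double)n * n));
    }
    return 0;
}
The count/n^2 column stays near a constant as n doubles, supporting the quadratic bound.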
The first and second successive innermost loops have O(n) and O(log n) complexity, respectively. Thus, the overall complexity of the innermost part is O(n). The outermost and middle loops have complexity O(log n) and O(n), so a straightforward (and valid, though not tight) upper bound is O(n^2 * log(n)); the tight bound, as computed above, is O(n^2).