What is this function's time complexity?

Is this function's complexity O(n log n)?
find(n) {
    for (i = 0; i <= n-1; i++) {
        for (j = 1; j <= i; j = j*2)
            print("*");
    }
}

The complexity of the inner loop is O(log2(i)). The outer loop runs i from 0 to n-1, so the total should be log2(1) + log2(2) + ... + log2(n-1) = log2((n-1)!).
So the complexity is O(log2(n!)).
This is because log(a) + log(b) + log(c)... = log(abc...).
In your case it's log(1) + log(2) + log(3)... + log(n-1) = log(1*2*3*...*(n-1)) = log((n-1)!).
O(log n!) is asymptotically the same as O(n log n) (Stirling's approximation), so your complexity is O(n log n).
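
Not part of the original post, but as a quick sanity check here is a minimal Java sketch that counts the inner-loop print calls directly and compares the total with n*log2(n) (the counting function and test sizes are my own):

public class StarCount {
    // Counts how many times print("*") would execute in find(n).
    static long countStars(int n) {
        long count = 0;
        for (int i = 0; i <= n - 1; i++) {
            for (int j = 1; j <= i; j *= 2) {
                count++;              // one print per inner-loop iteration
            }
        }
        return count;
    }

    public static void main(String[] args) {
        for (int n : new int[]{1_000, 10_000, 100_000}) {
            double nLogN = n * (Math.log(n) / Math.log(2));
            System.out.printf("n=%d  stars=%d  n*log2(n)=%.0f%n",
                              n, countStars(n), nLogN);
        }
    }
}

The counts grow proportionally to n*log2(n), matching the O(n log n) analysis above.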

Related

Big O complexity of nested loop

What could be the big O of this code?
I thought it would be n + n/2 + n/3 + ... + 1, which is just n, but it also looks like it could be O(n^2):
public int sums(int n) {
    int sum = 0;
    for (int i = 1; i < n; i++) {
        for (int j = 0; j < n/i; j++) {
            sum++;
        }
    }
    return sum;
}
This will be O(n log n).
n + n/2 + n/3 + ... + 1 = n (1 + 1/2 + 1/3 + ... + 1/n),
where (1 + 1/2 + 1/3 + ... + 1/n) is the harmonic series, which is Θ(log n).
Reference: https://www.quora.com/What-is-the-sum-of-the-series-1-frac-1-2-frac-1-3-frac-1-4-frac-1-5-up-to-infinity-How-can-it-be-calculated/answer/Avinash-Sahu-7
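
For completeness, the harmonic bound can be derived by comparing the sum with an integral; a short sketch in LaTeX notation (this derivation is mine, not from the original answer):

% Integral bounds for the harmonic number H_n = 1 + 1/2 + ... + 1/n:
\[
  \ln(n+1) = \int_{1}^{n+1} \frac{dx}{x}
  \;\le\; \sum_{i=1}^{n} \frac{1}{i}
  \;\le\; 1 + \int_{1}^{n} \frac{dx}{x} = 1 + \ln n
\]
% Hence H_n = \Theta(\log n), and n * H_n = n + n/2 + ... + n/n = \Theta(n \log n).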
In general, nested loops where each loop runs O(n) times fall into the O(n)*O(n) = O(n^2) class: one loop takes O(n), and a second loop nested inside it multiplies that cost by another O(n).
Similarly, if the inner loop runs m times for each of the n iterations of the outer loop, the order is O(n*m), which is still a polynomial time complexity. Here, however, the inner bound n/i shrinks as i grows, so you have to sum the work directly, which is what gives the tighter O(n log n) above.
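
To see the O(n log n) growth concretely, here is a small Java check of my own (not part of either answer) that compares sums(n) with n*ln(n):

public class HarmonicSum {
    // Same loop structure as the question's sums(n), with a long accumulator.
    static long sums(int n) {
        long sum = 0;
        for (int i = 1; i < n; i++) {
            for (int j = 0; j < n / i; j++) {
                sum++;
            }
        }
        return sum;
    }

    public static void main(String[] args) {
        for (int n : new int[]{1_000, 100_000, 10_000_000}) {
            System.out.printf("n=%d  sums(n)=%d  n*ln(n)=%.0f%n",
                              n, sums(n), n * Math.log(n));
        }
    }
}

The ratio sums(n) / (n*ln(n)) stays close to a constant as n grows, which is the O(n log n) behaviour.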

Find time complexity of a given nested loop

What is the time complexity for this loop below?
int sum = 0;
for (int n = N; n > 0; n /= 2)
    for (int i = 0; i < n; i++)
        sum++;
I figured out that the inner loop runs N + N/2 + N/4 + N/8 + ... + 1 times, but now I don't know how to proceed. How do I get the tilde or big-O from this loop?
Thanks in advance.
The big-O for this loop is O(N), since the dependence of the number of iterations on N is linear.
About 1/2 + 1/4 + 1/8 + ... https://en.wikipedia.org/wiki/1/2_%2B_1/4_%2B_1/8_%2B_1/16_%2B_%E2%8B%AF
About Big-O https://www.khanacademy.org/computing/computer-science/algorithms/asymptotic-notation/a/big-o-notation
N + N/2 + N/4 + N/8 + ... + 1 forms a GP (geometric progression) whose sum is bounded by 2N. When expressing time complexity in big O, we discard the constant factor, so the time complexity of your loop is O(N).
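
As a quick check of the 2N bound, here is a small Java sketch of my own that counts how many times sum++ executes and compares it with 2N:

public class HalvingLoop {
    // Counts the total number of inner-loop iterations for a given N.
    static long count(int N) {
        long sum = 0;
        for (int n = N; n > 0; n /= 2) {      // outer loop halves n each time
            for (int i = 0; i < n; i++) {     // inner loop runs n times
                sum++;
            }
        }
        return sum;
    }

    public static void main(String[] args) {
        for (int N : new int[]{1_000, 1_000_000, 10_000_000}) {
            System.out.printf("N=%d  count=%d  2N=%d%n", N, count(N), 2L * N);
        }
    }
}

The count never exceeds 2N, so the loop is O(N), and in tilde notation roughly ~2N.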

Why isn't heapsort lg(n!)?

I'm reading CLRS and it says heapsort is
HEAPSORT(A):
    BUILD-MAX-HEAP(A);
    for (i = A.length; i >= 2; i--)
    {
        exchange A[1] with A[i];
        A.heap-size = A.heap-size - 1;
        MAX-HEAPIFY(A, 1);
    }
MAX-HEAPIFY is O(lg n).
The book says it runs MAX-HEAPIFY n times, so heapsort is O(n lg n) time.
But if the heap is shrinking by 1 each iteration, shouldn't it be O(lg(n!))?
It would be lg 1 + lg 2 + ... + lg(n-1) + lg n = lg(n!), right?
Actually, both are the same bound. By Stirling's approximation,
O(log(n!)) = O(n log n),
so counting the shrinking heap exactly gives lg(n!), which is still Θ(n lg n).
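
To spell this out, here is the standard sandwich argument behind Stirling's approximation in big-O form (my own sketch, in LaTeX notation):

% Upper bound: each of the n factors of n! is at most n.
\[
  \lg(n!) = \sum_{i=1}^{n} \lg i \;\le\; n \lg n
\]
% Lower bound: the largest n/2 factors are each at least n/2.
\[
  \lg(n!) \;\ge\; \sum_{i=\lceil n/2 \rceil}^{n} \lg i \;\ge\; \frac{n}{2} \lg \frac{n}{2} = \Omega(n \lg n)
\]
% Therefore \lg(n!) = \Theta(n \lg n), so O(\lg(n!)) and O(n \lg n) are the same bound.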

Big O, algorithm analysis

I'm wondering what the big-O notation is for each of the statements below:
Sum = 0;
for i = 1 to N^2 do:
    for j = 1 to N do:
        Sum += 1;
Sum = 0 is O(1) for sure, because it will only be executed once.
But I'm confused by the second statement: should it be O(N) because it's the first loop, or O(N^2) because N^2 is a quadratic function of the variable N?
The outer loop is O(N^2) because it executes N^2 steps. Each of those steps executes the inner loop, which involves N steps, so there are N^2 * N = N^3 steps, and the algorithm is O(N^3).
You'll effectively be looping over N three times, so I'd say O(N^3).
Algorithm:
Sum = 0;                  ~ 1
for i = 1 to N^2 do:      ~ 1 + 2N^2
    for j = 1 to N do:    ~ (1 + 2N) * N^2
        Sum += 1;         ~ 1 * N * N^2
Time Complexity:
Time = 1 + (1 + 2N^2) + (1 + 2N)*N^2 + 1*N*N^2
Time = 2 + 2N^2 + N^2 + 2N^3 + N^3
Time = 2 + 3N^2 + 3N^3
O(Time) = O(2 + 3N^2 + 3N^3) = O(N^3)
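
A tiny Java check of the cubic growth (my own sketch, not from the answers): the innermost statement executes exactly N^2 * N = N^3 times.

public class CubicLoop {
    static long run(int N) {
        long sum = 0;
        for (long i = 1; i <= (long) N * N; i++) {   // outer loop: N^2 iterations
            for (int j = 1; j <= N; j++) {           // inner loop: N iterations
                sum++;                               // executes N^3 times in total
            }
        }
        return sum;
    }

    public static void main(String[] args) {
        for (int N : new int[]{10, 100, 500}) {
            System.out.printf("N=%d  sum=%d  N^3=%d%n",
                              N, run(N), (long) N * N * N);
        }
    }
}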

Timecomplexity analysis of function, Big O

What time complexity will the following code have with respect to the parameter size? Motivate your answer.
// Process(A, N) is O(sqrt(N)).
Function Complex(array[], size){
    if (size == 1) return 1;
    if (rand() / float(RAND_MAX) < 0.1){
        return Process(array, size*size)
             + Complex(array, size/2)
             + Process(array, size*size);
    }
}
I think it is O(N), because if Process(A, N) is O(sqrt(N)), then Process(A, N*N) should be O(N), and Complex(array, size/2) is O(log(n)) because it halves the size every time it runs. So on one run it takes O(N) + O(log(N)) + O(N) = O(N).
Please correct me and give me some hints on how I should think about / proceed with an assignment like this.
I appreciate all help and thanks in advance.
The time complexity of the algorithm is O(N) indeed, but for a different reason.
The complexity of the function can be denoted as T(n), where:
T(n) = T(n/2) + 2*n
Here T(n/2) is the recursive invocation of Complex(array, size/2), and 2*n accounts for the two calls to Process(arr, n*n), each of which is O(n).
This recurrence is well known to be O(n):
T(n) = T(n/2) + 2*n
     = T(n/4) + 2*n/2 + 2*n
     = T(n/8) + 2*n/4 + 2*n/2 + 2*n
     = ...
     = 2*n/2^(log n) + ... + 2*n/2 + 2*n
     < 4*n
which is in O(n).
Let's prove it formally; we will use mathematical induction:
Base: T(1) < 4. (check)
Hypothesis: for every k < n, the claim T(k) < 4k holds.
Step: for n,
T(n) = T(n/2) + 2*n
     < 4*(n/2) + 2*n     (*)
     = 2*n + 2*n
     = 4*n
Conclusion: T(n) is in O(n).
(*) follows from the induction hypothesis.
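
As a numeric sanity check on the bound, here is a short Java sketch of my own, assuming the recurrence T(n) = T(n/2) + 2n with T(1) = 1 and integer halving:

public class Recurrence {
    // T(n) = T(n/2) + 2n, T(1) = 1 (integer division for n/2).
    static long T(long n) {
        if (n <= 1) return 1;
        return T(n / 2) + 2 * n;
    }

    public static void main(String[] args) {
        for (long n : new long[]{10, 1_000, 1_000_000, 1_000_000_000L}) {
            System.out.printf("n=%d  T(n)=%d  4n=%d%n", n, T(n), 4 * n);
        }
    }
}

For every n tested, T(n) stays below 4n, consistent with the induction proof above.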

Resources