I want to find the overall time complexity of this:
O(n log(log n)) + n * O(L)
where n is the number of objects and each object has a string with length L.
L is constant, so you can rewrite it as
O(n log(log n)) + O(n)
and since n grows more slowly than n log(log n), the result is
O(n log(log n))
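A quick numerical sanity check of that dominance (a minimal sketch; the base of the log only changes a constant factor):

```python
import math

# The ratio (n log log n) / n = log log n grows without bound,
# so the O(n) term is absorbed by O(n log(log n)).
for n in [10**2, 10**4, 10**8, 10**16]:
    ratio = math.log(math.log(n))
    print(f"n = {n:>20}: (n log log n) / n = {ratio:.3f}")
```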
Can we say O(K + (N-K) log K) is equivalent to O(K + N log K) for 1 <= K <= N?
The short answer is that they are not equivalent; it depends on the value of K. If K equals N, then the first complexity is O(N), while the second is O(N + N log N), which is equivalent to O(N log N). However, O(N) is not equivalent to O(N log N).
Moreover, any function in O(K + (N-K) log K) is also in O(K + N log K) (for every K with 1 <= K <= N), and the proof is straightforward: (N-K) log K <= N log K whenever log K >= 0.
Yes, because in the worst case (N-K) log K is at most N log K given your constraint 1 <= K <= N.
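A minimal spot check of that inequality (assuming K >= 1, so that log K is non-negative):

```python
import math

# For K >= 1, log K >= 0, so (N - K) * log K <= N * log K whenever K <= N.
for n, k in [(10, 2), (100, 50), (1000, 1000)]:
    assert (n - k) * math.log(k) <= n * math.log(k)
print("inequality holds on all samples")
```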
Not exactly.
If they were equivalent, then every function in O(k + (n-k) log k) would also be in O(k + n log k) and vice versa.
Let f(n,k) = n log k
This function is certainly in O(k + n log k), but not in O(k + (n-k) log k).
Let g(n,k) = k + (n-k)log k
Then as x approaches infinity, f(x,x)/g(x,x) grows without bound, since:
f(x,x) / g(x,x)
= (x log x) / x
= log x
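A small numerical sketch of that ratio (using natural logs; any fixed base gives the same conclusion):

```python
import math

def f(n, k):
    return n * math.log(k)               # f(n, k) = n log k

def g(n, k):
    return k + (n - k) * math.log(k)     # g(n, k) = k + (n - k) log k

# Along the diagonal n = k = x we get g(x, x) = x, so f/g = log x,
# which grows without bound.
for x in [10, 10**3, 10**6, 10**9]:
    print(f"x = {x:>12}: f(x,x) / g(x,x) = {f(x, x) / g(x, x):.3f}")
```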
See the definition of big-O notation for multiple variables: http://mathwiki.cs.ut.ee/asymptotics/04_multiple_variables
Wikipedia provides the same information, but in less accessible notation:
https://en.wikipedia.org/wiki/Big_O_notation#Multiple_variables
To find their relation, I substituted x = log n and used log(n!) ≈ n log n, so with base a, O(log(n!)) became a^x * x and (log n)! became x(x-1)(x-2)···.
Now I think the first one grows faster, but can you help me relate them to O(n^2)?
Actually x(x-1)(x-2)··· is roughly x^x, because the product has x factors, each at most x (Stirling's approximation gives Θ((x/e)^x), whose logarithm is still Θ(x log x)). This means that O((log n)!) has the higher growth rate.
Also, if log(n) := x, then n = 2^x and n^2 becomes (2^x)^2 = 2^(2x), which grows more slowly than x^x = 2^(x log x), since x log x outgrows 2x.
Summary
O(log(n!)) < O(n^2) < O((log n)!)
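A numerical sketch of this summary (comparing natural logs of the three functions via math.lgamma, where lgamma(n + 1) = ln(n!), so nothing overflows; the asymptotic order only becomes visible for fairly large n):

```python
import math

# Compare ln of each function; the ordering of the logs matches the
# ordering of the functions themselves.
for n in [10**12, 10**18, 10**24]:
    ln_log_factorial  = math.log(math.lgamma(n + 1))   # ln(log(n!))
    ln_n_squared      = 2 * math.log(n)                # ln(n^2)
    ln_logn_factorial = math.lgamma(math.log(n) + 1)   # ln((log n)!)
    print(f"n = {n:.0e}: ln(log(n!)) = {ln_log_factorial:.1f}, "
          f"ln(n^2) = {ln_n_squared:.1f}, "
          f"ln((log n)!) = {ln_logn_factorial:.1f}")
```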
We have two sets A and B and we want to compute the set difference A - B. First we sort the elements of B with quicksort, which has average complexity O(n * log n), and then we search for each element of A in B with binary search, which costs O(log n) per lookup. What complexity does the whole set-difference algorithm have, given that we use quicksort and binary search? I tried to compute it this way: O(n * log n) + O(log n) = O(n * log n + log n) = O(log n * (n + 1)) = O((n + 1) * log n). Is that correct?
First, a constant does not count in O notation next to a term that grows without bound, so the 1 is absorbed by n, which means O((n + 1) * log n) is just O(n * log n).
Now the important issue: suppose A has m elements. You need to do m binary searches, each with complexity O(log n). So in total, the complexity should be O(n * log n) + O(m * log n) = O((n + m) * log n).
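A minimal sketch of the algorithm in Python (using the built-in sort, which is Timsort rather than quicksort but has the same O(n log n) average bound, and the bisect module for binary search):

```python
import bisect

def set_difference(a, b):
    """Elements of a that are not in b: O((n + m) log n) overall."""
    b_sorted = sorted(b)                      # O(n log n) comparison sort

    def contains(x):                          # one binary search: O(log n)
        i = bisect.bisect_left(b_sorted, x)
        return i < len(b_sorted) and b_sorted[i] == x

    return [x for x in a if not contains(x)]  # m lookups: O(m log n)

print(set_difference([1, 2, 3, 4, 5], [2, 4, 9]))  # -> [1, 3, 5]
```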
O(n * log n) + O(log n) = O(n * log n)
http://en.wikipedia.org/wiki/Big_O_notation#Properties
If a function may be bounded by a polynomial in n, then as n tends to infinity, one may disregard lower-order terms of the polynomial.
I'm trying to order the following functions in terms of Big O complexity from low complexity to high complexity: 4^(log(N)), 2N, 3^100, log(log(N)), 5N, N!, (log(N))^2
This:
3^100
log(log(N))
2N
5N
(log(N))^2
4^(log(N))
N!
I figured this out just by using the chart given on wikipedia. Is there a way of verifying the answer?
3^100 = O(1)
log log N = O(log log N)
(log N)^2 = O((log N)^2)
N, 2N, 5N = O(N)
4^(log N) = O(N^2) if the log is base 2 (in general, 4^(log N) = N^(log 4), a polynomial with exponent greater than 1)
N! = O(N!)
You made just one small mistake: (log N)^2 grows more slowly than any linear function, so it comes before 2N and 5N. The order above is the right one.
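One way to verify the order empirically: compare the natural log of each growing function at increasing N (ln preserves the ordering and, via math.lgamma, keeps N! from overflowing; log is taken base 2 here). 3^100 is left out of the loop because, being a constant, it is O(1) and sits below every growing function asymptotically, even though its numeric value (about 5e47) is astronomically large.

```python
import math

# ln of each growing function, listed in the claimed order; at each N the
# values should increase left to right. math.lgamma(n + 1) = ln(n!).
functions = [
    ("log log N", lambda n: math.log(math.log2(math.log2(n)))),
    ("(log N)^2", lambda n: math.log(math.log2(n) ** 2)),
    ("2N",        lambda n: math.log(2 * n)),
    ("5N",        lambda n: math.log(5 * n)),
    ("4^(log N)", lambda n: math.log2(n) * math.log(4)),  # 4^(log2 N) = N^2
    ("N!",        lambda n: math.lgamma(n + 1)),
]

for exp in (20, 40, 80):
    n = 2.0 ** exp
    values = ", ".join(f"{name} -> {fn(n):.1f}" for name, fn in functions)
    print(f"N = 2^{exp}: {values}")
```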
I need to design an algorithm that performs some calculations within a given O notation. It has been some time since I last worked with O notation, and I am a bit confused about how to add different O notations together.
O(n) * O(log n) = O(n log n)
O(n) + O(n) = O(2n) = O(n)
O(n) * O(log n) + O(n log n) = O(n log n) + O(n log n) = O(n log n)
Are these correct? What other rules have I overlooked?
The rule for multiplication is really simple:
O(f) * O(g) = O(f * g)
The sum of two O terms is harder to calculate if you want it to work for arbitrary functions.
However, if f ∈ O(g), then f + g ∈ O(g).
Therefore, your calculations are correct, but your original title is not: since log n ∈ O(n),
O(n) + O(log n) = O(n)
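A quick numerical illustration of that last identity (the ratio tends to 1, so n + log n is bounded by a constant times n):

```python
import math

# With f = log n and g = n we have f in O(g), so f + g is in O(g):
# the ratio (n + log n) / n tends to 1, i.e. n + log n is O(n).
for n in [10**2, 10**6, 10**12]:
    print(f"n = {n:>14}: (n + log n) / n = {(n + math.log(n)) / n:.6f}")
```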