If I have an algorithm that runs in log(n^(5/4)!) time, how can I express this in terms of log(n)? I know that log(n!) is asymptotically equal to n log(n), but does the 5/4 change anything, and if so, how?
Good question! As you noted, log(n!) = O(n log n). From this it follows that
log(n^{5/4}!) = O(n^{5/4} log n^{5/4}) = O(n^{5/4} log n)
The last equality follows because log n^{5/4} = (5/4)*log n.
So you can simplify the expression to O(n^{5/4} log n).
The answer is yes, the factor 5/4 in the exponent matters: the function n^{5/4} grows asymptotically faster than n, so you can't ignore it. (This follows from the fact that n^{5/4}/n = n^{1/4}, for example.)
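To see this concretely, here is a small Python sketch (my own illustration, not part of the answer above) that evaluates ln((n^{5/4})!) through the log-gamma function and compares it with n^{5/4} ln n; the ratio settles near the constant 5/4, which is what the O(n^{5/4} log n) bound predicts.

    import math

    # ln(x!) = lgamma(x + 1), so we can evaluate ln((n^{5/4})!) directly
    for n in (10**3, 10**6, 10**9):
        m = n ** 1.25                   # n^{5/4}
        log_fact = math.lgamma(m + 1)   # ln(m!) via the log-gamma function
        bound = m * math.log(n)         # the claimed n^{5/4} * ln(n) term
        print(n, log_fact / bound)      # ratio creeps up toward 5/4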
Is O(log n) = O(2^O(log log n))?
I tried to take the log of both sides:
log log n = log(2^(log log n))
log log n = (log log n)(log 2)
We can find a constant C > log 2 s.t. C log log n > (log log n)(log 2)
So they are equal to each other. Am I right?
I think what you want to ask is whether log n = O(2^(log log n)).
Think of O (big-O) as a <= operator, but the comparison is made asymptotically.
Now, to answer your question, we have to compare log n and 2^(log log n).
We use asymptotic notation when we want to see how an algorithm scales as the input grows very large.
log n is a logarithmic function.
2^(log log n) looks exponential, but the exponent is only log log n. With base-2 logarithms (the usual convention in this context), 2^(log log n) is exactly log n, since 2^(log_2 x) = x for any x > 0.
So the two functions grow at the same rate; if you want to see this, try computing both of them for very large values of n (like 10000 or 100000000).
So it can be safely inferred that log n = O(2^(log log n)).
NOTE: We do not compare asymptotic notations like you asked (O(logn) = O(2^O(log logn))). We compare functions (like log n) using these notations.
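To complement the answer, here is a tiny Python check (my own sketch, taking every logarithm base 2, as is usual in this context): 2^(log log n) collapses back to log n, so the bound log n = O(2^(log log n)) holds.

    import math

    for n in (2**10, 2**20, 2**40):
        lhs = math.log2(n)                     # log n
        rhs = 2 ** math.log2(math.log2(n))     # 2^(log log n), base-2 logs
        print(n, lhs, rhs)                     # the two columns agree (up to float error)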
I have been asked the following question by one of my fellow mates.
Which of the following expressions is not sublinear?
O(log log n)
O(n)
O(logn)
O(root(n))
I have gone through https://en.wikipedia.org/wiki/Time_complexity#Sub-linear_time, but I am not sure that I have understood it completely. Could someone point me in the right direction?
A function f(x) is said to grow at least as fast as another function g(x) if the limit of the ratio f(x)/g(x) as x approaches infinity is some positive number or infinity; it grows strictly faster when that limit is infinity.
In the case of sublinear, we want to show that a function grows slower than c*n, where c is some positive number.
Thus, for each function f(n) in your list, we take the limit of the ratio f(n)/(c*n). If the limit is 0, the function f(n) is sublinear; otherwise it grows at (roughly) the same speed as n or faster.
lim n->inf (log log n)/(c*n) = 0 (via l'Hopital's rule), so it is sublinear.
lim n->inf (n)/(c*n) = 1/c != 0, so it is linear.
lim n->inf (log n)/(c*n) = 0 (via l'Hopital's rule), so it is sublinear.
lim n->inf (sqrt(n))/(c*n) = 0, so it is sublinear.
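As a numerical companion to the limits above, here is a short Python sketch (my own, with c fixed at 1): the three sublinear ratios shrink toward 0 as n grows, while n/(c*n) stays at 1.

    import math

    c = 1
    for n in (10**3, 10**6, 10**9, 10**12):
        print(n,
              math.log2(math.log2(n)) / (c * n),   # log log n / (c*n) -> 0, sublinear
              n / (c * n),                         # n / (c*n)        -> 1, linear
              math.log2(n) / (c * n),              # log n / (c*n)    -> 0, sublinear
              math.sqrt(n) / (c * n))              # sqrt(n) / (c*n)  -> 0, sublinear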
I think I understand why you're confused: the Wikipedia page you link uses little-o notation:
Sub-linear time
An algorithm is said to run in sub-linear time (often spelled sublinear time) if T(n) = o(n)
Beware that T(n) = o(n) is a stronger requirement than saying T(n) = O(n).
In particular, for a function in O(n) you can't always have the inequality
f(x) < k g(x) for all x > a
satisfied for every k you choose: f(x) = x with g(x) = x and k = 1 already proves you wrong, and little-o notation requires the inequality to hold for every positive k.
Not every O(n) function is also in o(n). Thus the expression in your list that is not sublinear is O(n).
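To make the quantifier difference concrete, here is a tiny Python check (my own sketch, using the strict inequality from above with g(x) = x and one large fixed x): f(x) = x stops satisfying f(x) < k*g(x) as soon as k is 1 or smaller, so x is O(x) but not o(x), whereas f(x) = sqrt(x) satisfies it for every k tried, which is the behaviour little-o demands.

    import math

    x = 10**9
    for k in (2, 1, 0.5, 0.01):
        print(k,
              x < k * x,               # f(x) = x: False once k <= 1
              math.sqrt(x) < k * x)    # f(x) = sqrt(x): True for every k at this x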
I recommend reading this answer to continue your studies
In big-O notation, is O((log n)^k) = O(log n) true, where k is some constant (e.g. the number of nested logarithmic for loops)?
I was told by my professor that this statement is true; however, he said it would be proved later in the course. I was wondering if any of you could demonstrate its validity, or point me to a link where I could confirm whether it is true.
(1) It is true that O(log(n^k)) = O(log n).
(2) It is false that O(log^k(n)) (also written O((log n)^k)) = O(log n).
Observation: (1) has been proven by nmjohn.
Exercise: prove (2). (Hint: f(n) = log^2 n is O(log^2 n). Is it O(log n)? Can you find a constant c such that, for all n greater than some n0, c log n > log^2 n?)
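As a numerical nudge toward exercise (2) (my own sketch, not the proof itself): the ratio log^2 n / log n is simply log n, so it eventually exceeds any fixed constant c, which is why no choice of c can make c log n dominate log^2 n.

    import math

    for n in (10**3, 10**9, 10**27, 10**81):
        ratio = math.log(n) ** 2 / math.log(n)   # equals log n
        print(n, ratio)                          # roughly 6.9, 20.7, 62.2, 186.5 and climbing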
EDIT:
On a related note, anybody who finds this question helpful and/or interesting should go show some love for the new "Computer Science" StackExchange site. Here's a link. Go make this new place a reality!
http://area51.stackexchange.com/proposals/35636/computer-science-non-programming?referrer=rpnXA1_2BNYzXN85c5ibxQ2
Are you sure he didn't mean O(log n^k)? Because that equals O(k*log n) = k*O(log n) = O(log n).
O(log n) is a class of functions. You cannot perform computations such as ^k on it. Thus, the term O(log n)^k does not even look sensible to me.
So, clearly, log(n) is O(n). But what about (log(n))^2? What about sqrt(n) or log(n): what bounds what?
There's a family of comparisons like this:
nᵃ (vs.) (log(n))ᵇ
I run into these comparisons a lot, and I've never come up with a good way to solve them. Hints for tactics for solving the general case?
[EDIT: I'm not talking about the computational complexity of calculating the values of these functions. I'm talking about the functions themselves. E.g., f(n) = n is an upper bound on g(n) = log(n) because g(n) ≤ c f(n) for c = 1 and n₀ > 0.]
log(n)^a is always O(n^b), for any positive constants a, b.
Are you looking for a proof? All such problems can be reduced to seeing that log(n) is O(n), by the following trick:
log(n)^a = O(n^b) is equivalent to:
log(n) = O(n^{b/a}), since raising to the 1/a power is an increasing function.
This is equivalent to
log(m^{a/b}) = O(m), by setting m = n^{b/a}.
This is equivalent to log(m) = O(m), since log(m^{a/b}) = (a/b)*log(m).
You can prove that log(n) = O(n) by induction, focusing on the case where n is a power of 2.
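Here is a quick numerical sanity check of the claim (my own sketch, with a = 3 and b = 1/2 chosen arbitrarily): the ratio log(n)^a / n^b heads toward 0, so log(n)^a = O(n^b) even with a fairly large a and a small b.

    import math

    a, b = 3, 0.5
    for n in (10**2, 10**6, 10**12, 10**24):
        print(n, math.log(n) ** a / n ** b)   # tends to 0 as n grows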
log n -- O(log n)
sqrt n -- O(sqrt n)
n^2 -- O(n^2)
(log n)^2 -- O((log n)^2)
n^a versus (log(n))^b
You need to make either the bases or the powers the same. So use algebra to rewrite n^a in terms of log(n) (with whatever exponent that requires), or rewrite (log(n))^b with a matching exponent (whatever base that requires). There is no general case.
I run into these comparisons a lot (...)
Hints for tactics for solving the general case?
Since you ask about the general case and run into such questions a lot, here is what I recommend:
Use the limit test for big-O notation, once you know that:
if the limit (as n approaches +inf) of f(n)/g(n) exists and is not +inf, then f(n) = O(g(n))
You can use a computer algebra system, for example the open-source Maxima; here is the Maxima documentation about limits.
For more detailed info and an example, check out this answer.
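In the same spirit, here is a one-liner using SymPy rather than Maxima (my own sketch; it assumes the sympy package is installed), applied to the comparison (log n)^2 versus sqrt(n):

    import sympy as sp

    n = sp.symbols('n', positive=True)
    # limit of (log n)^2 / sqrt(n) as n -> oo; it prints 0, so (log n)^2 = O(sqrt(n))
    print(sp.limit(sp.log(n) ** 2 / sp.sqrt(n), n, sp.oo))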
If T(n) is O(n), then is it also correct to say T(n) is O(n^2)?
Yes, because O(n) is a subset of O(n^2).
Assuming
T(n) = O(n), n > 0
Then both of the following are true
T(n) = O(2n)
T(n) = O(n^2)
This is because both 2n and n^2 grow as quickly as or more quickly than just plain n. EDIT: As Philip correctly notes in the comments, even a value smaller than 1 can be the multiplier of n, since constant factors may be dropped (they become insignificant for large values of n; EDIT 2: as Oli says, all constant factors are insignificant per the definition of O). Thus the following is also true:
T(n) = O(0.2n)
In fact, n^2 grows so much faster than n that you can also say
T(n) = o(n^2)
But not
T(n) = Θ(n^2)
because O gives only an asymptotic upper bound, not an asymptotically tight bound: knowing T(n) = O(n) tells you T(n) cannot keep pace with n^2.
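As a concrete illustration (my own sketch, with a made-up running time T(n) = 3n + 5, which is O(n)): the ratio T(n)/n settles at the constant 3, while T(n)/n^2 drains away to 0, matching the O(n), O(n^2) and o(n^2) claims above.

    def T(n):
        return 3 * n + 5    # hypothetical running time that is O(n)

    for n in (10, 10**3, 10**6, 10**9):
        print(n, T(n) / n, T(n) / n**2)   # first ratio -> 3, second ratio -> 0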
If you mean O(2*N), then yes, O(n) == O(2n): the time taken is a linear function of the input size in both cases.
I disagree with the other answer that says O(N) = O(N*N). It is true that the O(N) function will finish in less time than the O(N*N) one, but the completion time is not a function of n*n, so it really isn't true.
I suppose the answer depends on why you are asking the question.
O, also known as big-O, is an upper bound: T(n) = O(g(n)) means there exists a constant C such that, for all n > N, T(n) < C g(n).
So as long as T(n), ignoring its constant coefficients, is bounded above by g(n) for large n, that statement is valid.