Big-O notation

Hey, I have a question.
Say t(n) = O(n log n), and you know that this is true.
Then you're given these statements and told to say whether each must be true or false: t(n) = n^4 and t(n) = O(n^4).
The statement t(n) = n^4 is false, while the statement t(n) = O(n^4) is true. Why?

You have to remember that when you write t(n) = O(n log n) or t(n) = O(n^4), what it actually means is that t(n) is *in* O(...), not that it is equal to it (O(...) is a set of functions, and a function cannot be equal to a set of functions). However, when you write t(n) = n^4, that means t(n) is equal to n^4.
Now if t(n) is in O(n log n), it is also in O(n^4), because O(n^4) is a superset of O(n log n). However, t(n) cannot be equal to n^4, because n^4 is not in O(n log n).
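To make the superset claim concrete, here is a minimal numeric sketch in Python (the witness constant c = 1 and the sample points are assumptions for illustration): any function bounded by n log n is also bounded by n^4.

```python
import math

# O(n^4) is a superset of O(n log n): with witness constant c = 1,
# n * log2(n) <= n**4 holds for every sampled n >= 2.
for n in [2, 10, 100, 1000]:
    assert n * math.log2(n) <= n**4
print("n log n <= n^4 on all sampled n")
```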

The idea of Big-O notation is that it represents an abstraction of the running time: it focuses on the dominant part of your algorithm and ignores details that affect the execution time (i.e. t(n)) but don't actually make a significant difference asymptotically.
For example, if your function works on a set of items of size n and just loops through them performing some calculations, then you'd say t(n) = O(n). Say you performed some operation only on a few of the elements according to some criterion; you would still say t(n) = O(n), but the actual time taken t(n) would not be a direct function of n, so an exact equation like t(n) = n·x (for some fixed per-item cost x) would not hold. See the sketch below.
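As a minimal sketch of this point (the filter condition and the per-element work here are hypothetical choices), the function below is O(n) even though the exact running time depends on how many elements pass the test:

```python
def process(items):
    """O(n): one pass over the input, even though the inner work
    only runs for some elements (here: the even ones)."""
    total = 0
    for x in items:          # n iterations -> linear in len(items)
        if x % 2 == 0:       # only some elements trigger the extra work
            total += x * x
    return total

# The loop always visits all n items, so t(n) = O(n),
# but the exact time is not a fixed multiple of n.
print(process(range(10)))
```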

Look at the second equation in this. From it, t(n) = n^4 = O(n^4) is obvious.
t(n) = O(n log n) would then be false, because ∀M > 0 and ∀x, ∃n > x such that t(n) = n^4 > M(n log n).
(If n > M, then n^4 > M·n^3 = M(n · n^2) ≥ M(n · log n) = M(n log n), since n^2 ≥ log n for every n ≥ 1.)
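The quantifier argument can be mirrored numerically (a sketch; the sampled constants M are arbitrary): for any fixed M, doubling n quickly finds a point where n^4 exceeds M · n log n.

```python
import math

# For each constant M, find an n with n^4 > M * n * log2(n),
# witnessing that n^4 is not in O(n log n).
for M in [10, 10**3, 10**6, 10**9]:
    n = 2
    while n**4 <= M * n * math.log2(n):
        n *= 2
    print(f"M = {M}: n^4 > M*n*log2(n) at n = {n}")
```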

Related

Does g(n) = O(f(n)) imply g(n^2) = O(f(n^2))?

I have been stuck on a problem for a while, and I would like to know: if g(n) = O(log n), does g(n^2) = O(log(n^2))? I would like to know whether I can apply this to solve the problem below.
The problem in question (for context only):
If f(n) = n * g(n^2) and g(n) = O(log n), prove that f(n) = O(n log n).
The answer is yes: n is just a number, and so is n^2. Imagine you set a = n^2; then isn't g(a) still O(log a)?
Your confusion comes from the fact that O(log(n^2)) = O(2 log n) = O(log n). This follows from the properties of the logarithm.
To illustrate, for your specific problem, we have that: If
g(n) = O(log n)
then
g(n^2) = O(log(n^2)) = O(2 log n) = O(log n).
And since
f(n) = n * g(n^2)
then clearly
f(n) = O(n log n). □
(The amount of rigor your proof requires would depend on the context of your course or book.)
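A quick numeric sanity check of the proof (a sketch, taking g(n) = log2(n) as one concrete representative of O(log n)) confirms that f(n) = n · g(n^2) stays within a constant factor of n log n:

```python
import math

def g(n):
    return math.log2(n)      # a concrete g with g(n) = O(log n)

def f(n):
    return n * g(n**2)       # f(n) = n * g(n^2) = 2 * n * log2(n)

# If f(n) = O(n log n), this ratio stays bounded (here it is exactly 2).
for n in [10, 100, 10**4, 10**6]:
    print(n, f(n) / (n * math.log2(n)))
```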

Big O complexity: T(n) = O(f(n)), G(n) = O(h(n)). Is T(G(n)) = O(h(f(n)))?

T(n) = O(f(n)), G(n) = O(h(n))
How would I prove or disprove:
T(G(n)) = O(h(f(n)))
I think this is false, because it should be O(f(h(n))) instead of O(h(f(n))), since G is applied before T. I tried substituting polynomial functions for T and G, and I think the order matters: (n^2)! is not equal to (n!)^2. But I am not sure if this reasoning is correct?
You are correct that this is false; however, I did not quite understand your counterexample.
A counterexample is to take a function and its inverse, while keeping h asymptotically very small:
T(n) = 2^n, f(n) = 2^n. This is consistent with the given fact, since 2^n = O(2^n).
And
G(n) = lg(n!), h(n) = n lg n. This is also consistent with the given fact, since lg(n!) = O(lg(n^n)) = O(n lg n).
However, T(G(n)) = 2^(lg(n!)) = n!, while h(f(n)) = 2^n · lg(2^n) = n·2^n. But n! ≠ O(n·2^n), as we have a factorial function versus an exponential function (times a linear factor), so the statement is disproved.
The reason n! ≠ O(n·2^n) is that n·2^n < 2^n · 2^n = 4^n (since n < 2^n for all n ≥ 1), and we know that a factorial eventually 'beats' any exponential.
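A small numeric check (a sketch; the sample points are arbitrary) shows the ratio n! / (n · 2^n) blowing up, consistent with n! not being in O(n · 2^n):

```python
import math

# If n! were O(n * 2^n), this ratio would stay bounded by some constant.
for n in [5, 10, 15, 20, 25]:
    print(n, math.factorial(n) / (n * 2**n))
```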

How to prove or disprove a statement - Time Complexity

For all functions f, log_2(f(n)) + O(n) = O(n).
I have tried disproving it by taking the limit, but got infinity as a result. Is that right?
The statement is not true. As a counterexample, take f(n) = n^n. Then log_2(f(n)) = n log_2(n), and n log n + O(n) is not in O(n).
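To see the counterexample numerically (a sketch, with the O(n) term taken as exactly n for concreteness), the ratio (n log2 n + n) / n grows without bound:

```python
import math

# With f(n) = n^n, log2(f(n)) = n * log2(n).
# If n*log2(n) + n were O(n), this ratio would stay bounded.
for n in [10, 100, 10**4, 10**6]:
    print(n, (n * math.log2(n) + n) / n)
```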

Algorithm domination

Studying for a test and getting this question:
Comparing two algorithms with asymptotic complexities O(n) and O(n + log(n)),
which one of the following is true?
A) O(n + log(n)) dominates O(n)
B) O(n) dominates O(n + log(n))
C) Neither algorithm dominates the other.
O(n) dominates log(n), correct? So in this case do we just take O(n) from both and deduce that neither dominates?
[C] is true, because of the summation property of Big-O
Summation
O(f(n)) + O(g(n)) -> O(max(f(n), g(n)))
For example: O(n^2) + O(n) = O(n^2)
In Big-O, you only care about the fastest-growing term and ignore all the other additive terms.
Edit: originally I put [A] as the answer; I just didn't pay much attention to all the options and misinterpreted option [A]. Here is a more formal proof:
O(n) ~ O(n + log(n)) <=>
O(n) ~ O(n) + O(log(n)) <=>
O(n) ~ O(n).
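As a quick numeric illustration of that chain (a sketch), the ratio (n + log2 n) / n tends to 1, so the two bounds describe the same growth class:

```python
import math

# (n + log n) / n -> 1, so n + log n and n have the same asymptotic order.
for n in [10, 100, 10**4, 10**8]:
    print(n, (n + math.log2(n)) / n)
```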
Yes, that's correct. If the runtime is the sum of several runtimes, the term with the largest order of magnitude dominates.
Assuming that big-O notation is used here in the sense of an asymptotically tight bound, which really should be denoted with a big Theta, I would answer C), because Theta(n) = Theta(n + log(n)) (log(n) is dominated by n).
If I am being formally (mathematically) correct, then I would say that none of these answers is correct, because O(n) and O(n + log(n)) only give upper bounds, not lower bounds, on the asymptotic behaviour:
Let f(n) be in O(n) and g(n) be in O(n + log(n)). Then there are the following counterexamples:
For A): Let f(n) = n in O(n) and g(n) = 1 in O(n + log(n)). Then g(n) does not dominate f(n).
For B): Let f(n) = 1 in O(n) and g(n) = n in O(n + log(n)). Then f(n) does not dominate g(n).
For C): Let f(n) = 1 in O(n) and g(n) = n in O(n + log(n)). Then g(n) does dominate f(n).
As this would be a very tricky question, I assume the more common, sloppier definition is intended, which gives answer C). (But you might want to check your definitions of big-O.)
If my answer confuses you, then you probably aren't using the formal definition, and you should probably ignore my answer...

Big-Oh Notation

If T(n) is O(n), then is it also correct to say T(n) is O(n^2)?
Yes, because O(n) is a subset of O(n^2).
Assuming
T(n) = O(n), n > 0
Then both of the following are true
T(n) = O(2n)
T(n) = O(n^2)
This is because both 2n and n^2 grow as quickly as or more quickly than plain n. EDIT: As Philip correctly notes in the comments, even a value smaller than 1 can be the multiplier of n, since constant factors may be dropped (they become insignificant for large values of n; EDIT 2: as Oli says, all constants are insignificant per the definition of O). Thus the following is also true:
T(n) = O(0.2n)
In fact, n^2 grows so quickly that you can also say
T(n) = o(n^2)
But not
T(n) = Θ(n^2)
because the functions given provide an asymptotic upper bound, not an asymptotically tight bound.
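As a numeric sketch of the distinction (assuming T(n) = n for concreteness): T(n)/n^2 tends to 0, which is the little-o condition, while T(n)/n stays at 1, which is why the tight bound is Θ(n) rather than Θ(n^2):

```python
# With T(n) = n: T(n)/n**2 -> 0 (so T(n) = o(n^2)),
# while T(n)/n stays at 1 (so the tight bound is Theta(n)).
for n in [10, 100, 10**4, 10**6]:
    print(n, n / n**2, n / n)
```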
If you mean O(2·n), then yes, O(n) == O(2n); the time taken is a linear function of the input size in both cases.
I disagree with the other answer that says O(N) = O(N·N). It is true that an O(N) function will finish in less time than an O(N·N) one, but the completion time is not a function of n·n, so it really isn't true.
I suppose the answer depends on why you are asking the question.
O, also known as Big-Oh, is an upper bound. We can say that there exists a constant C such that, for all n > N, T(n) < C·g(n).
So as long as the highest-order term of T(n) is bounded by a constant multiple of g(n), the statement is valid.
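Tying this together, here is a minimal sketch of that definition as code (the function name `holds_big_o`, the witness constants C and N, and the doubling sample are all illustrative choices, not a library API):

```python
def holds_big_o(T, g, C, N, upto=10**6):
    """Numerically spot-check the Big-O definition:
    T(n) <= C * g(n) for all sampled n > N."""
    n = N + 1
    while n <= upto:
        if T(n) > C * g(n):
            return False
        n *= 2
    return True

# T(n) = 3n + 7 is in O(n): the witness C = 4, N = 7 works on the sample.
print(holds_big_o(lambda n: 3*n + 7, lambda n: n, C=4, N=7))    # True
# T(n) = n^2 is not in O(n): no fixed C works for large n.
print(holds_big_o(lambda n: n*n, lambda n: n, C=1000, N=1))     # False
```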
