Does g(n) = O(f(n)) imply g(n^2) = O(f(n^2))? - big-o

I have been stuck on a problem for a while, and I would like to know: if g(n) = O(log n), does g(n^2) = O(log(n^2))? I would like to know whether I can apply this method to solve the aforementioned problem.
The problem in question (for context only):
If f(n) = n * g(n^2) and g(n) = O(log n), prove that f(n) = O(n log n).

The answer is yes: n is just a number, and so is n^2. If you set a = n^2, then g(a) is still O(log a).
The fact you may be missing is that O(log(n^2)) = O(2 log n) = O(log n). This follows from the properties of the logarithm.
To illustrate, for your specific problem, we have that: If
g(n) = O(log n)
then
g(n^2) = O(log(n^2)) = O(2 log n) = O(log n).
And since
f(n) = n * g(n^2)
then clearly
f(n) = O(n log n). □
(The amount of rigor your proof requires would depend on the context of your course or book.)
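The log(n^2) = 2 log n identity behind this answer can be checked numerically. A minimal sketch, assuming a hypothetical g(n) = 3 log n + 5 as a stand-in for any function that is O(log n):

```python
import math

# Hypothetical g(n) satisfying g(n) = O(log n); the choice 3*log(n) + 5 is an
# assumption for illustration -- any function bounded by c*log(n) + d works.
def g(n):
    return 3 * math.log(n) + 5

# Since log(n^2) = 2*log(n), we get g(n^2) <= 6*log(n) + 5, still O(log n).
for n in [10, 1_000, 1_000_000]:
    print(n, g(n ** 2) / math.log(n))  # ratio stays bounded (approaches 6)
```

The ratio g(n^2) / log(n) settling toward a constant is exactly the bounded-ratio condition that O(log n) membership requires.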

Related

Big O complexity: T(n) = O(f(n)), G(n) = O(h(n)). Is T(G(n)) = O(h(f(n)))?

T(n)= O(f(n)), G(n)= O(h(n))
How would I prove or disprove:
T(G(n)) = O(h(f(n)))
I think this is false: it should be O(f(h(n))) instead of O(h(f(n))), since G is applied before T. I tried substituting polynomial functions for T and G, and I think the order matters: (n^2)! is not equal to (n!)^2. But I am not sure if this reasoning is correct?
You are correct that this is false, though I did not quite follow your counterexample.
A counterexample would be to take a function and its inverse, while keeping h very small asymptotically:
T(n) = 2^n, f(n) = 2^n. This is consistent with the given fact that 2^n = O(2^n).
And
G(n) = lg(n!), h(n) = n lg(n). This is also consistent with the given fact that lg(n!) = O(lg(n^n)) = O(n lg n).
However, T(G(n)) = 2^(lg(n!)) = n!, while h(f(n)) = 2^n * lg(2^n) = n * 2^n. But n! != O(n * 2^n), since we have a factorial function versus an exponential (times a linear factor), so the claim is disproved.
The reason n! != O(n * 2^n) is that n * 2^n < 2^n * 2^n = 4^n, and a factorial function 'beats' any exponential.
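The factorial-beats-exponential claim is easy to see numerically; the ratio n! / (n * 2^n) has no upper bound:

```python
import math

# The ratio n! / (n * 2^n) grows without bound, so n! is not O(n * 2^n).
ratios = [math.factorial(n) / (n * 2 ** n) for n in (5, 10, 20, 30)]
print(ratios)  # strictly increasing, already enormous by n = 20
```

If n! were O(n * 2^n), these ratios would have to stay below some fixed constant; instead they blow up.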

Where do the functions 2n^2 , 100n log n, and (log n) ^3 fit in the big-O hierarchy?

The big-O hierarchy, for any constants a, b > 0 and C > 1, is:
O(a) ⊂ O(log n) ⊂ O(n^b) ⊂ O(C^n).
I need some explanations, thanks.
Leading constants don't matter, so O(2n^2) = O(n^2) and O(100 n log n) = O (n log n).
If f and g are functions then O(f * g) = O(f) * O(g). Now, apparently you are ok accepting that O(log n) < O(n). Multiply both sides by O(n) and you get O(n) * O(log n) = O(n * log n) < O(n * n) = O(n^2).
To see that O((log n)^3) is less than O(n^a) for any positive a is a little trickier, but if you are willing to accept that O(log n) is less than O(n^a) for any positive a, then you can see it by taking the third root of O((log n)^3) and O(n^a). You get O(log n) on the one side, and O(n^(a/3)) on the other side, and the inequality you are looking for is easy to deduce from this.
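The (log n)^3 versus n^a comparison can be checked numerically. A minimal sketch, with a = 0.5 as an arbitrary choice of positive exponent:

```python
import math

# (log n)^3 / n^0.5 tends to 0 for large n, illustrating that
# (log n)^3 = O(n^a) for a = 0.5 (the exponent is an arbitrary choice;
# note the ratio only starts shrinking once n is fairly large).
ratios = [math.log(n) ** 3 / n ** 0.5 for n in (10**6, 10**12, 10**18)]
print(ratios)  # decreasing toward 0
```

This also shows why small inputs can mislead: at n = 10^6 the ratio is still above 1, and only for much larger n does the polynomial term dominate.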
You can also think about the number a function produces for a given input: roughly, the smaller the number, the faster the algorithm; the larger the number, the slower it is. For large enough n,
log n < n^b < C^n.

When c > 0, log(n) = O(n^c)? Not sure why the answer isn't O(n log n)

In my homework, the question asks to determine the asymptotic complexity of n^0.99999 * log(n). I figured that it would be closer to O(n log n), but the answer key suggests that when c > 0, log n = O(n^c). I'm not quite sure why that is; could someone provide an explanation?
It's also true that lg n = O(n^k) (in fact, it is o(n^k); did the hint actually say that, perhaps?) for any constant k > 0, not just 1. Now consider k = 0.00001. Then n^0.99999 lg n = O(n^0.99999 * n^0.00001) = O(n). Note that this bound is not tight, since I could choose an even smaller k, so it's perfectly fine to say that n^0.99999 lg n is O(n^0.99999 lg n), just as we say n lg n is O(n lg n).
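The same argument can be observed numerically, though the exponent 0.00001 is too extreme to see at practical input sizes, so this sketch uses the split n^0.9 * n^0.1 instead:

```python
import math

# n^0.9 * log n = O(n^0.9 * n^0.1) = O(n), by the same argument as above
# but with a less extreme exponent (0.1 instead of 0.00001, so the trend
# is visible at practical n).  The ratio (n^0.9 * log n) / n = log n / n^0.1:
ratios = [math.log(n) / n ** 0.1 for n in (10**6, 10**12, 10**18)]
print(ratios)  # decreasing, heading toward 0
```

With k = 0.00001 the crossover point is astronomically large, which is why the bound feels counterintuitive even though it is asymptotically correct.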

Big-O notation calculation, O(n) + O(log n) = O(n log n)?

I need to design an algorithm that is able to do some calculations in given O notation. It has been some time since I last calculated with O notation and I am a bit confused on how to add different O notations together.
O(n) * O(log n) = O(n log n)
O(n) + O(n) = O(2n) = O(n)
O(n) * O(log n) + O(n log n) = O(n log n) + O(n log n) = O(n log n)
Are these correct? What other rules have I overlooked?
The rule for multiplication is really simple:
O(f) * O(g) = O(f * g)
The sum of two O terms is harder to calculate if you want it to work for arbitrary functions.
However, if f ∈ O(g), then f + g ∈ O(g).
Therefore, your calculations are correct, but your original title is not:
O(n) + O(log n) = O(n)
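The sum rule f ∈ O(g) ⟹ f + g ∈ O(g) is easy to sanity-check. A minimal sketch with f = log n and g = n:

```python
import math

# With f = log n and g = n, we have f in O(g); the ratio (f + g) / g
# stays bounded (it tends to 1), so f + g is in O(g):
# O(n) + O(log n) = O(n).
for n in (10, 10**4, 10**8):
    print(n, (n + math.log(n)) / n)  # bounded, approaching 1
```

Intuitively, the log n term becomes negligible next to n, so the sum inherits the growth rate of the larger term.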

Big-O notation

Hey, I have a question.
Say t(n) = O(n log(n)), and you know that this is true.
You're then given these statements and told to say whether they must be true or false: t(n) = n^4 and t(n) = O(n^4).
The statement t(n) = n^4 is false while the statement t(n) = O(n^4) is true. Why?
You have to remember that when you write t(n) = O(n log(n)) and t(n) = O(n^4), what it actually means is that t(n) is in O(...), not that it's equal to it (O(...) is a set of functions, and a function cannot be equal to a set of functions). However, when you write t(n) = n^4, that means that t(n) is equal to n^4.
Now, if t(n) is in O(n log n), it is also in O(n^4), because O(n^4) is a superset of O(n log n). However, it cannot be equal to n^4, because n^4 is not in O(n log n).
The idea of Big-O notation is that it represents an abstracted bound on running time: it focuses on the dominant part of your algorithm and ignores factors that affect the actual execution time t(n) but don't change the growth rate.
For example, if your function works on a set of n items and just loops through them performing some calculations, then you'd say t(n) = O(n). If you performed some operation only on a few of the elements according to some criteria, you would still say t(n) = O(n), but the actual time taken t(n) would not be a direct function of n, hence an exact equation like t(n) = nx would not hold true.
Look at the formal definition of Big-O. From that definition, t(n) = n^4 = O(n^4) is obvious.
t(n) = O(n log n) is false, because ∀M > 0 and ∀x, ∃n > x with t(n) = n^4 > M(n log n).
(If n > log n and n > M, then n^4 > M * n^3 = M(n * n^2) > M(n * log n) = M(n log n); and n > log n holds for all n ≥ 1.)
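The unboundedness claim in this proof is easy to confirm numerically; the ratio n^4 / (n log n) exceeds any fixed M eventually:

```python
import math

# n^4 / (n * log n) grows without bound, so n^4 is not O(n log n):
# no constant M can bound the ratio for all large n.
ratios = [n ** 4 / (n * math.log(n)) for n in (10, 10**3, 10**5)]
print(ratios)  # strictly increasing, unbounded
```

Any candidate constant M is overtaken as soon as n is large enough, which is exactly the ∀M, ∃n condition in the proof above.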
