Algorithm's complexity: Big O notation - complexity-theory

O( sqrt(n) ) = O(n) ?
We need to find c and n0 such that:
sqrt(n) <= c*n for all n > n0, with c > 0.
How do I find c and n0, or should I try a different approach?
Thanks

For n > 1, we have √n > 1, hence we have the following inequality:
√n < √n * √n = n, for any n > 1.
So we can take c = 1 and n0 = 2 to prove that √n = O(n).
Remark
Strictly speaking, you should avoid writing down something like O(√n) = O(n). Big-O notation is for describing the asymptotic upper bound of a function, but O(√n) is not a function.
O(√n) = O(n) is an abuse of notation, it really means the following:
If f is a function such that f(n) = O(√n), then f(n) = O(n).
In our case, if f is a function with f(n) = O(√n), then f(n) ≤ c * √n for some c > 0 and all sufficiently large n; since √n < n for any n > 1, we get f(n) ≤ c * √n < c * n for those n, and hence f(n) = O(n).
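As a quick sanity check (not part of the proof), a short Python sketch can verify the witnesses c = 1 and n0 = 2 over a range of values:

    import math

    # Numerically spot-check sqrt(n) <= c * n for the witnesses c = 1, n0 = 2.
    # This only tests finitely many n, so it is an illustration, not a proof.
    c, n0 = 1, 2
    assert all(math.sqrt(n) <= c * n for n in range(n0 + 1, 100_000))
    print("sqrt(n) <= n holds for every tested n >", n0)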

Related

If log(n^2) is big-Theta of log n, is (log n)^2 also big-Theta of log n?

log(n^2) is equivalent to 2 log n, which grows at the same rate as log n once I disregard constant factors. But if I square the whole term so that I end up with (log n)^2, is that also big-Theta of log n?
No. If f is any unbounded function then f(n)^2 is not O(f).
Because f(n)^2 = O(f) means there are c and N such that n > N implies f(n)^2 <= c*f(n), which in turn implies f(n) <= c, so f would be bounded.
log(n) is unbounded, so log(n)^2 is not O(log(n)).
log (n^2) = 2 log(n)
and, as you know, x^2 is not in Theta(x).
Think of it this way: let N = log(n) and compare f1(N) = N^2 with f2(N) = N. Obviously
N = o(N^2), so N != Theta(N^2), i.e., log(n) = o((log(n))^2) != Theta((log(n))^2).
Also, lim {n->inf} f2(n) / f1(n) = lim {n->inf} 1 / log(n) = 0; by the definition of little-o (https://en.wikipedia.org/wiki/Big_O_notation) this implies f2(n) = o(f1(n)).
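To build intuition, here is a small Python illustration (not part of either answer): the ratio log(n^2)/log(n) stays fixed at 2, while (log n)^2 / log(n) = log n grows without bound.

    import math

    # Compare log(n^2) and (log n)^2 against log n as n grows.
    for n in [10**2, 10**4, 10**8, 10**16]:
        log_n = math.log(n)
        print(f"n = 10^{round(math.log10(n))}: "
              f"log(n^2)/log(n) = {math.log(n**2) / log_n:.2f}, "
              f"(log n)^2/log(n) = {log_n**2 / log_n:.2f}")
    # The first ratio is constant (2), so log(n^2) = Theta(log n);
    # the second ratio keeps growing, so (log n)^2 is not O(log n).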

What is the complexity of function f(n) with n=f(n).log(f(n))

What is the complexity of the function f(n), preferably in Big-O notation, where f(n) satisfies the condition n = f(n)*log(f(n)) and f(n) > 1? Assume the log is base 2.
I tried to isolate f(n) from the condition but could not get it done.
After using Excel to graph the function f(n), it seems that f(n) = O(n^2), but I can't figure out how to derive it.
I think the complexity is even lower than O(n) - namely, O(n/ln(n)). Semi-proof:
(substituting n/ln(n) for f(n))
RHS = n/ln(n) * ln(n/ln(n)) = n/ln(n) * (ln(n) -ln(ln(n))) =
= n - n * ln(ln(n))/ln(n) = n * (1-ln(ln(n))/ln(n)) = n*Theta(1) = Theta(n) = LHS
For clarity, I skipped Theta notation almost everywhere.
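As a numerical illustration of this claim (a rough sketch, not a proof), one can solve n = f*log2(f) for f by bisection, using a small hypothetical helper, and compare the result with n/ln(n):

    import math

    def solve_f(n):
        """Solve n = f * log2(f) for f > 1 by bisection (illustrative helper)."""
        lo, hi = 1.0 + 1e-9, float(n) + 2
        for _ in range(200):
            mid = (lo + hi) / 2
            if mid * math.log2(mid) < n:
                lo = mid
            else:
                hi = mid
        return lo

    # The ratio f(n) / (n / ln n) should stay within a constant band,
    # consistent with f(n) = Theta(n / ln n).
    for n in [10**3, 10**6, 10**9]:
        f = solve_f(n)
        print(f"n = 10^{round(math.log10(n))}: f(n) = {f:.1f}, "
              f"n/ln(n) = {n / math.log(n):.1f}, "
              f"ratio = {f / (n / math.log(n)):.3f}")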
Comments are getting long, so here's a sketch of a proof for you. This is probably homework, so please make sure you learn something instead of just copying it down.
In order to show that f is O(n), you have to show that there is an M and n1 where f(n) < M|n| for all n > n1.
We know that n = f(n) log(f(n)), so M |n| = M |f(n)| |log(f(n))|.
So what we are trying to find is an M and n1 for which
f(n) < M |n| = M |f(n)| |log(f(n))|
for n > n1.
n, f, and log f are all positive, so we can drop the |.| to get
f(n) < M f(n) log(f(n)) = M |n|
Our goal is to find an M and n1 for which
f(n) < M f(n) log(f(n)) = M |n|
is true for all n > n1. Pick M = 1, n1 = 10, then
f(n) < f(n) log(f(10)) <= f(n) log(f(n)) = |n| (where M is now set to 1)
for n > n1. f(n) log(f(10)) <= f(n) log(f(n)) is true because log(f(n)) is monotonic for n>n1 (homework exercise: show that this is true). f(n) < f(n) log(f(10)) is trivially true because log(f(10)) > 1.
This shows, then, that f(n) is O(n).
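If you want to sanity-check the witnesses M = 1 and n1 = 10 numerically (again just an illustration, with a hypothetical bisection helper to recover f from n = f*log2(f)):

    import math

    def solve_f(n):
        """Recover f from n = f * log2(f), f > 1, by bisection (illustrative)."""
        lo, hi = 1.0 + 1e-9, float(n) + 2
        for _ in range(200):
            mid = (lo + hi) / 2
            if mid * math.log2(mid) < n:
                lo = mid
            else:
                hi = mid
        return lo

    assert math.log2(solve_f(10)) > 1                        # log(f(10)) > 1
    assert all(solve_f(n) <= 1 * n for n in range(11, 2000)) # f(n) <= M*n with M = 1
    print("f(n) <= n holds for every tested n > 10")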

Can O(log(n*log n)) be considered as O(log n)?

Suppose I get f(n) = log(n*log n). Should I say that it's O(log(n*log n))?
Or should I write log(n*log n) = log n + log(log n) and then say that the function f(n) is O(log n)?
First of all, as you have observed:
log(n*log n) = log(n) + log(log(n))
but think about log(log N) as N gets large (as Floris suggests).
For example, let N = 1000; then log N = 3 (base 10 here, so a small number) and log(3) is even smaller,
and this gap only widens as N gets huge, i.e. far beyond the number of instructions your code could ever execute.
Thus, informally, O(log(n * log n)) = O(log n + log(log n)) = O(log n), since the log(log n) term is dominated by log n.
Another way to look at this is that n * log n << n^2, so in the worst case:
O(log(n^2)) >= O(log(n * log n))
Since log(n^2) = 2 log(n), 2*log(n) is an upper bound, and O(log(n * log n)) = O(log n)
Use the definition. If f(n) = O(log(n*log(n))), then there must exist a positive constant M and real n0 such that:
|f(n)| ≤ M |log(n*log(n))|
for all n > n0.
Now let's assume (without loss of generality) that n0 > 1. Then
log(n) ≥ log(log(n))
for all n > n0.
From this, we have:
log(n*log(n)) = log(n) + log(log(n)) ≤ 2 * log(n)
Substituting, we find that
|f(n)| ≤ 2*M*|log(n)| for all n > n0
Since 2*M is also a positive constant, it immediately follows that f(n) = O(log(n)).
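A quick Python spot check of the key inequality used above (illustrative only, natural log; the base only changes constant factors):

    import math

    # Verify log(n * log n) <= 2 * log n for a range of n >= 2,
    # which is where log(n) >= log(log(n)) holds.
    assert all(math.log(n * math.log(n)) <= 2 * math.log(n)
               for n in range(2, 100_000))
    print("log(n*log n) <= 2*log n holds for every tested n >= 2")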
Of course in this case simple transformations show both functions differ by a constant factor asymptotically, as shown.
However, I feel it is worthwhile to recall a classic test for analyzing how two functions relate to each other asymptotically. So here's a slightly more formal proof.
You can check how f(x) relates to g(x) by analyzing lim f(x)/g(x) as x -> infinity.
There are 3 cases:
lim = infinity <=> O(f(x)) > O(g(x))
inf > lim > 0 <=> O(f(x)) = O(g(x))
lim = 0 <=> O(f(x)) < O(g(x))
So
lim ( log( n * log(n) ) / log n ) =
lim ( (log n + log log(n)) / log n ) =
lim ( 1 + log log(n) / log n ) =
1 + 0 = 1
Note: I took lim log log(n) / log n = 0 as known, but you can derive it with l'Hôpital's rule.
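The limit can also be watched numerically (a small sketch using natural log; the base cancels out of the ratio):

    import math

    # Watch log(n * log n) / log n approach 1 as n grows.
    for exp in [3, 6, 12, 24]:
        n = 10.0 ** exp
        ratio = math.log(n * math.log(n)) / math.log(n)
        print(f"n = 10^{exp}: ratio = {ratio:.4f}")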

Prove 2^(n+a) = O(2^n)?

How can I prove that 2^(n+a) is O(2^n)? The only thing I can think of is that n in 2^n is an arbitrary value, therefore n+a is just as arbitrary, so n+a = n. Alternatively, 2^(n+a) = 2^n * 2^a. 2^n is obviously O(2^n), and a exists as an arbitrary value in 2^a, so 2^a = 2^n = O(2^n). Is there a clearer/more formal way to prove this?
For the formal definition of big-O, there must exist an M and n0 such that 2^(n+a) <= M*2^n for all n > n0.
If we choose M = 2^a and n0 = 0, then we can see that 2^(n+a) = 2^a * 2^n = M*2^n, which is <= M*2^n for all n > n0. Therefore, 2^(n+a) is O(2^n).
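For a concrete spot check (with a = 5 chosen arbitrarily for illustration), the inequality holds with M = 2^a for every n:

    # Spot-check 2^(n+a) <= M * 2^n with M = 2^a; here a = 5 is arbitrary.
    a = 5
    M = 2 ** a
    assert all(2 ** (n + a) <= M * 2 ** n for n in range(0, 200))
    print("2^(n+a) <= 2^a * 2^n holds for every tested n")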
See the definition of big-O notation. Think about whether you can find a constant M as in the definition.
In general, to prove that f(n) is O(g(n)), you must find a constant M > 0 and a positive integer N such that f(n) <= M*g(n) for all n >= N.

How to Prove Asymptotic Notations

I want to prove the following statement
2^(⌊lg n⌋+⌈lg n⌉)∕n ∈ Θ(n)
I know that to prove it, we have to find the constants c1>0, c2>0, and n0>0 such that
c1.g(n) <= f(n) <= c2.g(n) for all n >= n0
In other words, we have to prove f(n) <= c2*g(n) and f(n) >= c1*g(n).
The problem is how to bound the function on the left-hand side, 2^(⌊lg n⌋+⌈lg n⌉)/n.
Thank you
You can start by expanding the exponential: the expression equals n1*n2/n, where n1 = 2^⌊lg n⌋ and n2 = 2^⌈lg n⌉, so n1 <= n <= n2, 2*n1 > n, and 2*n > n2. The rest should be easy.
Here's a derivation for the upper bound:
2^(⌊lg n⌋+⌈lg n⌉)/n
<= 2^(2⌊lg n⌋+1)/n      (since ⌈lg n⌉ <= ⌊lg n⌋ + 1)
<= 2^(2 lg n + 1)/n
= 2^(2 lg n) 2^(1) / n
= 2 n^2 / n
= 2 n
= O(n)
So we know your function can be bounded above by 2*n. Now we do the lower bound:
2^(⌊lg n⌋+⌈lg n⌉)/n
>= 2^(2⌈lg n⌉ - 1) / n      (since ⌊lg n⌋ >= ⌈lg n⌉ - 1)
>= 2^(2 lg n - 1)/n
= 2^(2 lg n) 2^(-1) / n
= 1/2 n^2 / n
= 1/2 n
= Ω(n)
We now know that your function can be bounded below by n/2.
Checked on gnuplot; these bounds look good and tight. This is a purely algebraic solution using the definitions of the floor() and ceiling() functions.
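The same check is easy to reproduce in Python instead of gnuplot (a small illustrative script, not part of the original answer):

    import math

    # Check n/2 <= 2^(floor(lg n) + ceil(lg n)) / n <= 2n over a range of n.
    for n in range(1, 100_000):
        lg = math.log2(n)
        val = 2 ** (math.floor(lg) + math.ceil(lg)) / n
        assert n / 2 <= val <= 2 * n, n
    print("n/2 <= 2^(floor(lg n)+ceil(lg n))/n <= 2n holds for every tested n")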
