time complexity - algorithm

Hi
I have a question: suppose I have T(n) = m * n^2, where n < m.
Is it correct to write T(n) = O(m)? My reasoning: T(n) = m*n*n, and since n < m, we get T(n) = O(m).
Thanks

No, the best you can do is to write T(n,m) = O(m^3). n < m is a very weak condition: it basically just tells you that n is O(m). For example, n could always be m-1.
Edit: My first answer was imprecise, as T was written as a function of n alone. If m is a constant, the answer still holds, but then O(m^3) is the same as O(1).
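Spelling out the chain of inequalities behind that bound (it only uses n < m):

    T(n, m) = m \cdot n^2 < m \cdot m^2 = m^3 \quad\Rightarrow\quad T(n, m) = O(m^3)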

I believe in this case T(n) = O(n^2), treating m as a constant.
The formal definition of big-O:
f(x) = O(g(x)) if and only if there exists a positive real number M and a real number x0 such that |f(x)| ≤ M|g(x)| for all x > x0.
In your case, T(n) will always be less than or equal to m · n^2, with m playing the role of the constant M.
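Writing out one possible choice of witnesses (the choice M = m, x0 = 0 assumes m is a fixed constant and is only for illustration):

    |T(n)| = m \cdot n^2 \le M \cdot |n^2| \quad \text{for all } n > x_0, \qquad M = m,\ x_0 = 0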

Related

Algorithm complexity, solving recursive equation

I'm taking Data Structures and Algorithm course and I'm stuck at this recursive equation:
T(n) = log(n) * T(log(n)) + n
obviously this can't be handled with the use of the Master Theorem, so I was wondering if anybody has any ideas for solving this recursive equation. I'm pretty sure that it should be solved with a change in the parameters, like considering n to be 2^m , but I couldn't manage to find any good fix.
The answer is Theta(n). To prove something is Theta(n), you have to show that it is both Omega(n) and O(n). Omega(n) is obvious here, because T(n) >= n. To show that T(n) = O(n), first:
Pick a large finite value N such that log(n)^2 < n/100 for all n > N. This is possible because log(n)^2 = o(n).
Pick a constant C > 100 such that T(n) < Cn for all n <= N. This is possible because N is finite.
We will show inductively that T(n) < Cn for all n > N. Since log(n) < n, the induction hypothesis gives:
T(n) < n + log(n) * C * log(n)
     = n + C * log(n)^2
     < n + (C/100) * n          (by the choice of N)
     = C * (1/100 + 1/C) * n
     < (C/50) * n               (since C > 100)
     < C * n
In fact, for this function it is even possible to show that T(n) = n + o(n) using a similar argument.
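As a quick numerical sanity check (not a proof), here is a small Python sketch of the recurrence; the base case T(n) = n for n <= 2 and the base-2 logarithm are assumptions made only for the sketch, and the ratio T(n)/n drifting toward 1 matches the T(n) = n + o(n) claim:

    import math

    def T(n):
        # Assumed base case for the sketch; any positive base case
        # only changes constants, not the asymptotics.
        if n <= 2:
            return n
        log_n = math.log2(n)  # the base of the log only shifts constants
        return log_n * T(log_n) + n

    for n in [10**3, 10**6, 10**9, 10**12, 10**15]:
        print(f"n = 10^{round(math.log10(n)):<2}  T(n)/n = {T(n) / n:.6f}")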
This is by no means an official proof but I think it goes like this.
The key is the + n part. Because of it, T is bounded below by n, so T(n) = Omega(n). So let's assume that T(n) = O(n) and check that the assumption is consistent.
Substitute into the original relation
T(n) = log(n) * O(log(n)) + n
     = O(log^2(n)) + O(n)
     = O(n)
So it still holds.

Big-O with positive constants

In this Big-O / computational complexity problem, I am given that a and b are positive constants greater than 1 and that n is a variable parameter.
I assumed that a^(n+1) = O(a^n), a^(bn) = O(a^n), and a^(n+b) = O(a^n).
First I need to know whether I am correct in assuming this.
If so, how would I prove that f(n) = O(f(n))?
Recall the definition of big-O:
f(n) ∈ O(g(n)) means there are positive constants c and k, such that 0 ≤ f(n) ≤ cg(n) for all n ≥ k. The values of c and k must be fixed for the function f and must not depend on n.
Let g = f, c = 1 and k = 0; then you have a trivial demonstration of f(n) ∈ O(f(n)).
Similarly, from a^(n+1) = a · a^n, let f(n) = a^(n+1), g(n) = a^n, c = a, k = 0; again the proof of O(a^(n+1)) = O(a^n) is trivial. The proof for O(a^(n+b)) = O(a^n) is identical, with c = a^b.
O(a^(bn)) is not equal to O(a^n) when a, b > 1; see Exponential growth in big-o notation.
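A short LaTeX sketch of why no constant can work there (it only uses a, b > 1):

    \frac{a^{bn}}{a^n} = a^{(b-1)n} \to \infty \text{ as } n \to \infty,
    \quad\text{so no constant } c \text{ satisfies } a^{bn} \le c \cdot a^n \text{ for all large } n.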

Prove 2^(n+a) = O(2^n)?

How can I prove that 2^(n+a) is O(2^n)? The only thing I can think of is that n in 2^n is an arbitrary value, therefore n+a is just as arbitrary, so n+a = n. Alternatively, 2^(n+a) = 2^n * 2^a. 2^n is obviously O(2^n), and a exists as an arbitrary value in 2^a, so 2^a = 2^n = O(2^n). Is there a clearer/more formal way to prove this?
For the formal definition of big-O, there must exist an M and n0 such that 2^(n+a) <= M*2^n for all n > n0.
If we choose M = 2^a and n0 = 0, then 2^(n+a) = 2^a * 2^n = M * 2^n, so the bound holds (with equality) for all n > n0. Therefore, 2^(n+a) is O(2^n).
See the definition of the big-O notation here. Think about whether you can find a constant M as in the definition.
In general, to prove that f(n) is O(g(n)), you must find a positive constant c and an integer N such that for all n >= N, f(n) <= c * g(n).
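A tiny Python check of the witnesses from the answer above (M = 2^a, n0 = 0); the value a = 5 is an arbitrary choice made for the sketch:

    a = 5          # any fixed constant
    M = 2 ** a     # the witness constant from the proof
    for n in range(1, 50):
        # 2^(n+a) = 2^a * 2^n = M * 2^n, so the bound holds with equality
        assert 2 ** (n + a) <= M * 2 ** n
    print("2^(n+a) <= M * 2^n holds for all sampled n with M =", M)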

Proving a given function equals o(N)

I am trying to prove that for any constant k, log^k N = o(N) (little-o of N).
All I know about little-o is that T(n) = o(p(n)) means T(n) grows at a rate strictly slower than p(n). I also can't take a limit and use L'Hôpital's rule, because I don't know what f(n) and g(n) are. Can someone please help me get started?
You need to show that
lim_{N -> infinity} (log^k N) / N = 0
Write N = e^x (so x -> infinity as N -> infinity), and that becomes
lim_{x -> infinity} x^k / e^x = 0
Now, use the Power series expansion of the exponential function,
e^x = ∑ (x^n)/n!
to get an estimate for that quotient.
Spoiler: picking the term with n = k+1 gives e^x > x^(k+1)/(k+1)!, from which (x^k)/(e^x) < (k+1)!/x follows easily.
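A quick numeric illustration of that estimate in Python (not a proof); k = 5 is an arbitrary choice, and the last column is the (k+1)!/x bound from the spoiler:

    import math

    k = 5
    for N in [10**2, 10**4, 10**8, 10**16]:
        x = math.log(N)                      # N = e^x
        ratio = x**k / N                     # (log^k N) / N
        bound = math.factorial(k + 1) / x    # the (k+1)!/x estimate
        print(f"N = 10^{round(math.log10(N)):<3} ratio = {ratio:.3e}  bound = {bound:.3e}")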

Big-Oh Notation

If T(n) is O(n), is it then also correct to say T(n) is O(n^2)?
Yes; because O(n) is a subset of O(n^2).
Assuming
T(n) = O(n), n > 0
Then both of the following are true
T(n) = O(2n)
T(n) = O(n^2)
This is because both 2n and n^2 grow at least as quickly as plain n. EDIT: As Philip correctly notes in the comments, the multiplier of n can even be smaller than 1, since constant factors may be dropped (they become insignificant for large values of n; EDIT 2: as Oli says, all constants are insignificant per the definition of O). Thus the following is also true:
T(n) = O(0.2n)
In fact, n^2 grows so quickly that you can also say
T(n) = o(n^2)
But not
T(n) = Θ(n^2)
because the functions given provide an asymptotic upper bound, not an asymptotically tight bound.
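For completeness, the containment can be written as a one-line chain in LaTeX, reusing the witnesses c and N assumed from T(n) = O(n):

    T(n) \le c \cdot n \le c \cdot n^2 \quad \text{for all } n \ge \max(N, 1), \quad\text{hence } T(n) = O(n^2)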
If you mean O(2 * N), then yes, O(n) == O(2n): the time taken is a linear function of the input size in both cases.
I disagree with the other answer that says O(N) = O(N*N). It is true that an O(N) function will finish in less time than an O(N*N) one, but the completion time is not a function of n*n, so it really isn't true.
I suppose the answer depends on why you are asking the question.
O, also known as Big-Oh, is an upper bound. We say T(n) = O(g(n)) if there exists a constant C such that, for all n > N, T(n) <= C * g(n).
So as long as T(n) is bounded above by a constant multiple of g(n) for large n, the statement is valid.
