Recurrence of T(n) = T(n/3) + T(2n/3) [closed]

I've searched online for this, but I only seem to find answers for a similar recurrence:
T(n) = T(n/3) + T(2n/3) + cn
But the one I'm trying to solve is:
T(n) = T(n/3) + T(2n/3)
Base case: We can assume T(a) = Theta(1) for any constant a.
I've succeeded in proving (by induction) that T(n) = O(n*log(n)). I thought the answer should be Theta(n*log(n)), but I cannot prove that T(n) = Omega(n*log(n)).
So my question is: am I correct that the answer is O(n*log(n)) and not Theta(n*log(n))? If that's true, that would really be great...
If I'm wrong I will of course explain where I'm stuck in the induction process...
Thanks!
P.S. If you need to, please try to explain using induction, because I haven't learned all methods for solving these problems yet.

You can't prove that it's Omega(n log n), because T(n) = n satisfies both the base case and the recurrence: n/3 + 2n/3 = n, and T(a) = a = Theta(1) for any constant a. So the recurrence admits a solution that is only Theta(n).
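As a quick numerical sanity check, here is a sketch that evaluates the recurrence directly, assuming a concrete base case of T(n) = 1 for n < 3 and integer division (the question only requires T(a) = Theta(1)): the ratio T(n)/n settles near a constant, while T(n)/(n*log(n)) keeps shrinking.

    from functools import lru_cache
    import math

    # Sketch: evaluate T(n) = T(n/3) + T(2n/3) numerically.
    # The base case (T(n) = 1 for n < 3) and the use of integer
    # division are illustrative assumptions, not part of the question.
    @lru_cache(maxsize=None)
    def T(n):
        if n < 3:
            return 1
        return T(n // 3) + T(2 * n // 3)

    for n in [10**3, 10**4, 10**5, 10**6]:
        # T(n)/n hovers near a constant; T(n)/(n*log n) keeps decreasing.
        print(n, T(n) / n, T(n) / (n * math.log(n)))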

Related

Prove or disprove either t(n) ∈ O(g(n)), or t(n) ∈ Ω(g(n)), or both [closed]

Does anyone know how to prove or disprove the following:
For any two nonnegative functions t(n) and g(n) defined on the set of nonnegative
integers, either t(n) ∈ O(g(n)), or t(n) ∈ Ω(g(n)), or both.
I found an answer to this question on Chegg, but it doesn't make sense to me, since it just showed that t(n) = g(n) when n = 1. I think that answer is wrong, because the assertion would still be true in that case: it says "both", which covers the case t(n) = g(n).
I hope someone can tell me whether this assertion is true or false, with a proof.
It's false. For example, let f(n) = 1 if n is a multiple of 3, and n otherwise; let g(n) = 1 if n is a multiple of 2, and n otherwise.
Then f is neither bounded above nor bounded below by any constant multiple of g: on even n that are not multiples of 3, f(n)/g(n) = n, which is unbounded, and on odd multiples of 3, f(n)/g(n) = 1/n, which tends to 0.
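Here is a small sketch that tabulates the ratio f(n)/g(n) for the two kinds of inputs, just to make the oscillation visible:

    # Sketch of the counterexample above: f(n)/g(n) is unbounded on even n
    # that are not multiples of 3, and tends to 0 on odd multiples of 3,
    # so f is neither O(g) nor Omega(g).
    def f(n):
        return 1 if n % 3 == 0 else n

    def g(n):
        return 1 if n % 2 == 0 else n

    for n in [4, 8, 16, 32, 9, 27, 81, 243]:
        print(n, f(n) / g(n))   # large on 4, 8, 16, 32; tiny on 9, 27, 81, 243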

How to Solve: T(n) = 2 T(n/4) + T(n/4) + T(n/4) + 311 via The Master Theorem? [closed]

I really couldn't solve this one; it's tricky to me as someone new to the Master theorem. Any help would be greatly appreciated!
I've never heard of the master theorem before. But I read this when I googled it:
T(n) = aT(n/b) + f(n)
where T(n) has the following asymptotic bounds:
1. If f(n) = O(n^(log_b a - ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a)).
First note that 2T(n/4) + T(n/4) + T(n/4) = 4T(n/4), so the recurrence is T(n) = 4T(n/4) + 311. Here a = 4 and b = 4, so log_b(a) = 1.
f(n) = 311 = O(1) = O(n^0), so case 1 applies with ε = 1 (since log_b(a) - ε = 1 - 1 = 0 and n^0 = 1).
So T(n) = Θ(n).
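As a sanity check, the recurrence can be evaluated numerically; this is a sketch assuming a base case of T(n) = 1 for n < 4 (not stated in the question), and it shows T(n)/n settling at a constant, consistent with Theta(n):

    from functools import lru_cache

    # Sketch: evaluate T(n) = 4*T(n/4) + 311 at powers of 4.
    # The base case T(n) = 1 for n < 4 is an illustrative assumption.
    @lru_cache(maxsize=None)
    def T(n):
        if n < 4:
            return 1
        return 4 * T(n // 4) + 311

    for k in [5, 8, 10, 12]:
        n = 4**k
        print(n, T(n) / n)   # converges to 1 + 311/3, a constant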

How to prove this: log n = O(n^c) [closed]

In a data structures textbook, the author uses this fact to argue that an O(log^c(n)) algorithm is efficient because its complexity is very close to constant. I don't quite understand the inequality.
The intuitive reason this is true is that log is the inverse of e^x. Just as the exponential function grows faster than x^k for any k, its inverse must grow slower than x^(1/k) for any k. (Draw the graphs and flip the x and y axes to get this intuition.)
However, intuition does not amount to a formal proof.
So first, convince yourself that log(log(n)) = o(log(n)).
From that, for any given c > 0, there is an N such that for all n > N, log(log(n)) < c*log(n). Now take e^x of both sides: for sufficiently large n, log(n) < e^(c*log(n)) = n^c. Therefore log(n) = O(n^c) for any given c > 0.
But that is big-O, and we wanted little-o. Well, log(n) = O(n^(c/2)), and since n^(c/2) = o(n^c), it follows that log(n) is actually in o(n^c). And now we're done.
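Here is a numerical sketch of the limit, with an arbitrary small exponent c = 0.1: the ratio log(n)/n^c peaks near n = e^(1/c) and then falls toward 0.

    import math

    # Sketch: log(n) / n^c tends to 0 for any fixed c > 0.
    # c = 0.1 is an arbitrary small example exponent.
    c = 0.1
    for n in [10**6, 10**12, 10**24, 10**48]:
        print(n, math.log(n) / n**c)   # decreases toward 0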

Solve Recurrence Relation by Master theorem? [closed]

Can someone please clarify this solution a little more?
T(n) = 2T(n^1/2) + log n
Solution:
Let k = log_2(n), so that n = 2^k and n^(1/2) = 2^(k/2). Then
T(n) = T(2^k) = 2T(2^(k/2)) + k
Defining S(k) = T(2^k), this becomes
S(k) = 2S(k/2) + k
This recurrence now fits the master theorem (a = 2, b = 2, f(k) = k), which gives S(k) = O(k log k). Substituting back, T(n) is O(log n * log log n).
How many times can you keep dividing n by 2? log_2(n) times, because log_2(n) is the power to which you must raise 2 to get n.
Similarly, log(log(n)) is how many times you can take the square root of n before reaching a constant, so the substitution isn't strictly necessary if you already know this.
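The result can also be checked numerically; this is a sketch assuming a base case of T(n) = 1 for n < 4 (the problem doesn't specify one), and it shows T(n)/(log(n)*log(log(n))) staying bounded:

    import math

    # Sketch: evaluate T(n) = 2*T(sqrt(n)) + log(n) directly.
    # The base case T(n) = 1 for n < 4 is an illustrative assumption.
    def T(n):
        if n < 4:
            return 1.0
        return 2 * T(math.sqrt(n)) + math.log(n)

    for n in [10**4, 10**8, 10**16, 10**32, 10**64]:
        # ratio stays bounded, consistent with O(log n * log log n)
        print(n, T(n) / (math.log(n) * math.log(math.log(n))))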

Prove n^2 + 5 log(n) = O(n^2) [closed]

I am trying to prove that n^2 + 5 log(n) = O(n^2), O representing big-O notation. I am not great with proofs and any help would be appreciated.
Informally, big-O is governed by the fastest-growing term as n grows arbitrarily large. Since n^2 grows much faster than log(n), the bound should be clear.
More formally, f(n) = O(g(n)) means there are constants c and n0 such that f(n) <= c*g(n) for all n >= n0. Here c = 6 and n0 = 1 work: log(n) <= n <= n^2 for n >= 1, so n^2 + 5*log(n) <= 6*n^2. Equivalently, it suffices that the ratio of the two functions stays bounded; in fact lim(n->inf)((n^2 + 5*log(n))/n^2) = 1, which is even stronger (the two functions are asymptotically equivalent).
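A quick sketch checking both routes: the ratio tends to 1, and the witness inequality with c = 6, n0 = 1 holds.

    import math

    # Sketch: the ratio (n^2 + 5*log(n)) / n^2 tends to 1, and the
    # witness inequality n^2 + 5*log(n) <= 6*n^2 holds for all n >= 1.
    for n in [10, 10**3, 10**6]:
        f = n**2 + 5 * math.log(n)
        print(n, f / n**2, f <= 6 * n**2)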
