Assume f1(n) is O(g1(n)) and f2(n) is O(g2(n)); show that f1(n)/f2(n) is not necessarily O(g1(n)/g2(n)).
I have actually worked this out to
f1/f2 = c1/c2
but how does this show they are not equal? I am having a problem with that.
It is not true that f1/f2 = c1/c2. As a hint: if you want to prove the claim is false, it is enough to show a counterexample. Think about when division can make the result bigger than the thing we divide.
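For a concrete instance (one possible counterexample; your exercise may intend another): take f1(n) = n, g1(n) = n, f2(n) = 1, g2(n) = n. The assumptions hold, since n is O(n) and 1 is O(n), but f1(n)/f2(n) = n while g1(n)/g2(n) = 1, and n is certainly not O(1). Dividing g1 by g2 shrank the bound a lot, while dividing f1 by f2 did not shrink the function at all.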
I'm working on a practice exam and came across this problem:
True or false: 2^(O(log n)) = O(n).
I'm not really sure how to figure this out.
I wanted to try to apply the definition of big-O, but I'm not sure how that works here because of the power of two.
The problem is the constant hidden in the exponent: f(n) = 2*log2(n) is O(log n), yet 2^(2*log2(n)) = n^2, which is not O(n). Therefore, the claim cannot be true.
Mathematically this is false: 2^(O(log n)) means 2^(c*log n) for some constant c, and with base-2 logs that is n^c, which is polynomial but not O(n) once c > 1. (Switching bases, e.g. e^x vs 2^x or ln vs log2, only rescales the constant c, so it does not change the answer.) The special case 2^(log2 n) = n, i.e. c = 1, is why the statement feels almost true in informal time-complexity discussions, but the constant hidden inside the O breaks it in general.
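A quick numeric illustration (a sketch, not a proof; the constant c = 2 inside the O(.) is my arbitrary choice):

    import math

    # With c = 2 hidden inside the O(.), 2^(2*log2(n)) equals n^2,
    # so the ratio against n grows without bound.
    for n in [2**4, 2**8, 2**12]:
        f = 2 ** (2 * math.log2(n))
        print(n, f, f / n)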
I just wanted to double-check my intuition. I suspect a polylog dominates a log, so log(n) is O(log(n)^p). I read somewhere that powers of logs sometimes get thrown away like constants, which is why I want to make sure.
Note that log(x^a) = a*log(x), so a power inside a log is just a constant factor. In your case you cannot simplify the power away, since it is outside the log. I think your intuition is right.
EDIT: Furthermore, log(n)/log(n)^p = 1/log(n)^(p-1), and for p > 1 this expression tends to 0 as n grows. This confirms your intuition (along with some basic notions of asymptotic comparison).
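A quick numeric check of that limit (a sanity check, not a proof; p = 2 is an arbitrary choice of mine):

    import math

    # log(n) / log(n)^p shrinks toward 0 for p > 1, so log(n) is
    # O(log(n)^p) but log(n)^p is not O(log(n)).
    p = 2
    for n in [10**2, 10**6, 10**12, 10**24]:
        print(n, math.log(n) / math.log(n) ** p)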
def unknownsort(A, x, y):
    # Base case: a two-element range; put the pair in order.
    if y == x + 1:
        if A[x] > A[y]:
            A[x], A[y] = A[y], A[x]
    # Recursive case: three or more elements.
    elif y > x + 1:
        z = (y - x + 1) // 3          # a third of the range length
        unknownsort(A, x, y - z)      # sort the first two thirds
        unknownsort(A, x + z, y)      # sort the last two thirds
        unknownsort(A, x, y - z)      # sort the first two thirds again
Is there a name for this equation? For T(n), what I have is
T(n) = 3T(n) + Theta(n). Is this right? I was planning to use the Master Theorem, but I'm not sure whether this is right. Also, what do you call this process of finding T(n)?
I was thinking that unknownsort is called three times, so T(n) = 3T(n), but it has a base case depending on the size of the input, so T(n) = 3T(n) + Theta(n). Now I am wondering whether this equation is wrong because of "z", since z manipulates the size of my array.
Somehow I've come up with this: T(n) = 3T(2n/3) + 1. Is this correct now?
Ok, homework, let's stick to hints then.
In 3T(n), the 3 is correct since there are 3 recursive calls, but the T(n) is not - the n should be (in the form n/c) the size which the next recursive calls work with, currently you're saying the size is the same.
The Theta(n) is incorrect - apart from the recursive calls, how much work is done in the function? Does the amount of work depend on x and y (you should probably assume that any arithmetic operation always takes a constant amount of time, although this isn't strictly true)?
Did you give us the whole function? If so, I'm not convinced that that algorithm of yours does anything particularly useful (while it looks like it is sorting, I'm not convinced it is, but I could be wrong), and thus probably doesn't have a name.
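If you want evidence either way, one quick check is to run it against Python's sorted() on random inputs (this assumes the cleaned-up version of the function above; passing such a test is evidence, not a proof):

    import random

    # Compare unknownsort against sorted() on many small random arrays.
    for _ in range(1000):
        A = [random.randrange(50) for _ in range(random.randrange(1, 12))]
        expected = sorted(A)
        unknownsort(A, 0, len(A) - 1)
        assert A == expected, A
    print("all random trials came back sorted")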
The T(n) equation is called a recurrence relation, so the process of finding it would simply be called the process of finding the recurrence relation (I'm not aware of a single term to denote this).
(The updated equation you edited into your question is correct)
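For what it's worth, once the recurrence is T(n) = 3T(2n/3) + 1, the Master Theorem does apply: a = 3, b = 3/2, and the non-recursive work is constant, so T(n) = Theta(n^(log_{3/2} 3)), which is roughly Theta(n^2.71).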
First, yes, it's my homework, and I find it hard, so I'd really appreciate some guidance.
I need to prove that for the denominations 1, x, x^2, ..., x^n with x >= 1, the greedy algorithm for the coin problem always works:
we always reach the required amount with the minimal number of coins by repeatedly picking the largest coin that is no larger than the remaining amount.
Thank you.
As this is your homework I will not provide a complete answer but will rather try to guide you:
First, as often happens with problems of this type, try to prove for yourself that the statement is true for the first few natural numbers. Then summarize what you used to make those proofs work; this usually points you toward the correct approach.
I would use induction for this one.
Another option that might help you: represent all the numbers in the positional numeral system with base x. This should make it clearer why the statement is true.
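If it helps to see the strategy concretely, here is a minimal sketch of the greedy algorithm under discussion (the function name and the assumption that x and the amount are non-negative integers are mine):

    def greedy_coins(amount, x, n):
        # Pay `amount` with denominations 1, x, x^2, ..., x^n,
        # always taking as many of the largest coin as possible.
        coins = []
        for k in range(n, -1, -1):
            denom = x ** k
            count, amount = divmod(amount, denom)  # base-x digit extraction
            coins += [denom] * count
        return coins  # amount is 0 here, since denomination 1 exists

Notice that the divmod loop is literally extracting the digits of the amount in base x, which is exactly the connection the base-x hint is pointing at.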
Hope this helps you.
I have the following recurrence: T(n) = 2T(n/4) + T(n/2) + n, and I need to know its exact solution. I know the Master Theorem won't help me, and iteration seems to go wrong...
Please tell me how to do it in general for such recurrences.
Thanks in advance.
Hey all, thanks for replying. I need the complexity, and I need to understand how to solve such problems.
T(n) = O(nlogn) and Omega(nlogn).
To prove that, by definition of O, we need to find constants n0 and c such that:
for every n>=n0, T(n)<=cnlogn.
We will use induction on n to prove that T(n)<=cnlogn for all n>=n0
Let's skip the base case for now... (we'll return later)
Hypothesis: We assume that for every k<n, T(k)<=cklogk
Thesis: We want to prove that T(n)<=cnlogn
But, T(n)=2T(n/4)+T(n/2)+n
Using the hypothesis we get:
T(n)<=2(c(n/4)log(n/4))+c(n/2)log(n/2)+n=cnlogn + n(1-3c/2)
So, taking c>=2/3 would prove our thesis, because then T(n)<=cnlogn
Now we need to prove the base case:
We will take n0=2 because if we take n0=1, the logn would be 0 and that wouldn't work with our thesis. So our base cases would be n=2,3,4. We need the following propositions to be true:
T(2) <= 2clog2
T(3) <= 3clog3
T(4) <= 4clog4
So, by taking c = max{2/3, T(2)/2, T(3)/(3log3), T(4)/8} and n0 = 2, we have found constants c and n0 such that for every natural n >= n0, T(n) <= cnlogn.
The demonstration that T(n) = Omega(nlogn) is analogous: the same algebra gives T(n) >= cnlogn + n(1-3c/2), so any c <= 2/3 works for the lower bound.
So basically, in these cases where you can't use the Master Theorem, you need to 'guess' the result and prove it by induction.
For more information on these kinds of demonstrations, refer to 'Introduction to Algorithms'.
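As a quick numeric illustration of the Theta(nlogn) result (a sanity check, not a proof; the base values T(1..4) = 1 are an arbitrary assumption of mine):

    import math
    from functools import lru_cache

    # Memoized evaluation of T(n) = 2T(n/4) + T(n/2) + n with integer
    # division; the base values for n <= 4 are an arbitrary assumption.
    @lru_cache(maxsize=None)
    def T(n):
        if n <= 4:
            return 1
        return 2 * T(n // 4) + T(n // 2) + n

    for n in [2**6, 2**10, 2**14, 2**18]:
        # the ratio tends toward a constant, consistent with Theta(nlogn)
        print(n, T(n) / (n * math.log2(n)))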
First of all you need to define some limits on this, otherwise the recursion will never end and you will end up with a stack overflow.
Something like: n is an integer and the minimal value is 0.
Could you please add more details to your question along these lines?
This won't necessarily help you figure out how to do it yourself, but apparently Wolfram Alpha can get the right answer. Perhaps you can look for documentation, or have Mathematica show you the steps it takes in solving this:
Wolfram Alpha: T(n)=2*T(n/4)+T(n/2)+n
To put crude upper and lower bounds on the answer, you could have recognized that your T(n) is bounded above by 3T(n/2) + n and below by 2T(n/4) + n. By the Master Theorem, 3T(n/2) + n gives O(n^(log2 3)) ≈ O(n^1.59) (a = 3, b = 2, log2 3 > 1), and 2T(n/4) + n gives Omega(n) (a = 2, b = 4, log4 2 = 1/2 < 1).
In general, solving recurrence relations is a hard problem.
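For what it's worth, recurrences whose subproblems have different sizes, like T(n) = 2T(n/4) + T(n/2) + n, can be handled in general with the Akra-Bazzi method: find the exponent p satisfying 2*(1/4)^p + (1/2)^p = 1 (here p = 1, since 1/2 + 1/2 = 1), and the method then yields Theta(n log n) for this recurrence, matching the induction proof above.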