Analysis of algorithm about log log n [closed] - complexity-theory

Consider f(n) = log log(n) and g(n) = 10^10^10^10^10^10. Then f(n) is O(g(n)).
Is the above statement true or false? Please give the reasons behind your answer.

f(n) grows without bound (albeit very slowly), while g(n) is a constant.
lim n->inf f(n)/g(n) = infinity, not any finite constant, so the statement is false: f(n) is not O(g(n)).
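To make the "log log n eventually exceeds any constant" argument concrete, here is a minimal Python sketch. The constant c = 3 below is an illustrative stand-in; the actual constant in the question is far too large to test numerically, but the same threshold n0 = e^(e^c) works for any c.

```python
import math

# A minimal sketch: for ANY constant c, log log n > c once n > e^(e^c).
# c = 3 is an illustrative stand-in (the question's constant is far too
# large to evaluate numerically, but the same threshold works for it).
c = 3.0
n0 = math.exp(math.exp(c))            # threshold: log log n0 == c
for n in (n0 / 10, n0 * 10):
    print(f"n = {n:.3e}: log log n = {math.log(math.log(n)):.3f}  (c = {c})")
```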

Related

Solve Recurrence Relation by Master theorem? [closed]

Can someone please clarify this solution a little more?
T(n) = 2T(n^1/2) + log n
Solution:
Let k = log n, so n = 2^k and n^(1/2) = 2^(k/2). Then
T(n) = T(2^k) = 2T(2^(k/2)) + k.
Defining S(k) = T(2^k) and substituting, we get
S(k) = 2S(k/2) + k.
This recurrence fits the master theorem (a = 2, b = 2, f(k) = k = Θ(k^(log_b a)), so case 2 applies), which gives
S(k) = O(k log k). Substituting back k = log n implies T(n) is O(log n log log n).
How many times can you keep dividing n by 2? log_2(n) times, because log_2(n) is the power to which you need to raise 2 to get n.
Similarly, log log(n) is (roughly) how many times you can take the square root of n before it drops to a constant, so the recurrence has about log log n levels of recursion; if you know this, the substitution isn't strictly necessary.
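As a numeric sanity check (not a proof), here is a small Python sketch of the recurrence, assuming the base case T(n) = 1 for n <= 2; the ratio to log n * log log n settles near a constant, consistent with the O(log n log log n) bound.

```python
import math

# Sketch: evaluate T(n) = 2*T(sqrt(n)) + log2(n), with an assumed base
# case T(n) = 1 for n <= 2, and compare against log2(n) * log2(log2(n)).
def T(n):
    if n <= 2:
        return 1.0
    return 2 * T(math.sqrt(n)) + math.log2(n)

# Evaluate at n = 2^(2^i) so repeated square roots land exactly on the base case.
for i in range(2, 10):
    n = 2.0 ** (2 ** i)
    bound = math.log2(n) * math.log2(math.log2(n))   # log n * log log n
    print(f"n = 2^{2**i:3d}: T(n) / (log n * log log n) = {T(n) / bound:.3f}")
```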

Prove n^2 + 5 log(n) = O(n^2) [closed]

I am trying to prove that n^2 + 5 log(n) = O(n^2), O representing big-O notation. I am not great with proofs and any help would be appreciated.
Informally, big-O keeps only the fastest-growing term as n grows arbitrarily large. Since n^2 grows much faster than 5 log(n), the result should be clear.
More formally, f(n) = O(g(n)) when the ratio f(n)/g(n) stays bounded by a constant as n approaches infinity (the limit does not have to be 1; a limit of 1 is the stronger statement that the two functions are asymptotically equivalent). Here, lim(n->inf)((n^2 + 5 log(n))/n^2) = 1, which is certainly bounded, so n^2 + 5 log(n) = O(n^2).
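If you prefer a proof straight from the definition (find constants c and n0 such that f(n) <= c*g(n) for all n >= n0), here is one possible choice of witnesses; c = 6 and n0 = 1 are just one convenient pick:

```latex
\text{For } n \ge 1:\quad 5\log(n) \le 5n \le 5n^2,
\qquad\text{hence}\qquad
n^2 + 5\log(n) \;\le\; n^2 + 5n^2 \;=\; 6n^2 .
```

So with c = 6 and n0 = 1 we have n^2 + 5 log(n) <= c*n^2 for all n >= n0, which is exactly the definition of n^2 + 5 log(n) = O(n^2).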

What is time-complexity of T(N)=4T(N/2)+(N^2)/logN [closed]

This question was given in an MIT video on the analysis of algorithms.
According to the video, this recurrence cannot be solved using the master method, but it can be solved using a recursion tree.
Can anyone please tell me the solution?
The claim that the master theorem does not apply is in fact correct. With a = 4 and b = 2 we have N^(log_b a) = N^2, and f(N) = N^2/log N is smaller than N^2 but not polynomially smaller (it is not O(N^(2-e)) for any e > 0), so none of the three cases of the standard master theorem fits.
A recursion tree does work: level i of the tree costs 4^i * (N/2^i)^2 / log(N/2^i) = N^2 / (log N - i). Summing over the roughly log N levels gives N^2 * (1 + 1/2 + ... + 1/log N) = Θ(N^2 * log log N) by the harmonic series, so T(N) = Θ(N^2 log log N).
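Here is a rough numeric check of the recursion-tree result (a sketch, not a proof), assuming a base case T(2) = 1: iterating the recurrence at n = 2^k, the ratio T(n) / (n^2 * ln(log2 n)) stays roughly constant, consistent with Θ(N^2 log log N).

```python
import math

# A rough numeric check (a sketch, not a proof), assuming base case T(2) = 1:
# iterate T(n) = 4*T(n/2) + n^2/log2(n) at n = 2^k and compare against
# n^2 * ln(log2 n). The ratio staying roughly constant is consistent with
# T(N) = Theta(N^2 * log log N).
T = 1.0                              # assumed base case T(2)
for k in range(2, 61):
    n = 2.0 ** k
    T = 4 * T + n * n / k            # log2(n) = k at this step
    if k % 10 == 0:
        ratio = T / (n * n * math.log(k))
        print(f"k = {k:2d}: T(n) / (n^2 * ln(log2 n)) = {ratio:.3f}")
```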

Is (log n)^k = O(n^1/2)? For k greater or equal to 0 [closed]

In big-O notation, is O((log n)^k) = O(log n), where k is some constant? And what happens with (log n)^k when k >= 0?
Perhaps this might be the source of the misunderstanding?
log(n^k) = k * log(n), but no such simplification works for log(n)^k = (log(n))^k.
O(k * (log n)) == O(log n), because constant factors are absorbed by big-O; I believe that constant-multiplication rule is what you're thinking of. But (log n)^k is definitely not the same thing: raising f(n) to a power changes the growth rate, just as O(n) is different from O(n^2). That said, the answer to the title question is still yes: for any fixed k >= 0, (log n)^k = O(n^(1/2)), because every polylogarithm grows more slowly than any fixed positive power of n.
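A minimal numeric illustration (not a proof) of that last point, for a sample exponent k = 5: the ratio (log n)^k / sqrt(n) can be large for small n, but it heads to 0 as n grows, which is what (log n)^k = O(n^(1/2)) requires.

```python
import math

# Sample exponent k = 5: (log n)^k / sqrt(n) tends to 0 as n grows,
# even though it can exceed 1 for small n.
k = 5
for p in (10, 40, 70, 100):          # n = 2^10, 2^40, 2^70, 2^100
    n = 2.0 ** p
    ratio = math.log(n) ** k / math.sqrt(n)
    print(f"n = 2^{p:3d}: (log n)^{k} / sqrt(n) = {ratio:.3e}")
```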

Method to solve the stated recurrence? [closed]

Need help finding a method for solving the following:
Given f(n) = 9f(n/3) + n^2 * log_3(n) for all n > 1.
And given f(1)=1.
Solve for f(n)
I tried the master theorem, but none of the three cases seemed to fit here. My guess would be to use the substitution method, but I am not sure how to apply it.
Use the substitution f(n) = n^2 g(n).
This gives us g(n) = g(n/3) + log n.
And so g(n) = Θ(log^2 n) and f(n) = Θ(n^2 log^2 n).
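To spell out the substitution step behind the answer above, here is a sketch of the algebra, using g(1) = f(1) = 1 as the base case:

```latex
f(n) = n^2\, g(n)
\;\Longrightarrow\;
n^2 g(n) = 9\Big(\tfrac{n}{3}\Big)^{2} g\big(\tfrac{n}{3}\big) + n^2 \log_3 n
         = n^2\, g\big(\tfrac{n}{3}\big) + n^2 \log_3 n,
\quad\text{so}\quad
g(n) = g\big(\tfrac{n}{3}\big) + \log_3 n.

\text{Unrolling down to } g(1) = f(1) = 1:\quad
g(n) = \sum_{j=1}^{\log_3 n} j + 1 = \Theta(\log^2 n),
\qquad
f(n) = n^2 g(n) = \Theta(n^2 \log^2 n).
```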
