Solving a recurrence relation with square roots - algorithm

I hope someone can help me with this:
I need to determine the runtime complexity of the algorithm.
I tried to set n = 2^m and still failed.
Thanks

Let n = 2^x.
Then T(2^x) = T(2^√x) + 1.
Let U(x) = T(2^x), so:
U(x) = U(√x) + 1
We assume there is a base case, so U(x) ∈ O(log log x) (each square root halves log x, so it takes about log log x steps to reach the base case).
Substituting back:
T(2^x) ∈ O(log log x)
Since x = log n, T(n) ∈ O(log log log n).
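As a sanity check, here is a minimal Python sketch (mine, not from the question) that iterates U(x) = U(√x) + 1 with an assumed base case of x ≤ 2 and compares the step count against log2(log2 x):

    import math

    def U(x):
        # Count steps of U(x) = U(sqrt(x)) + 1; the base case x <= 2 is an assumption.
        steps = 0
        while x > 2:
            x = math.sqrt(x)
            steps += 1
        return steps

    # U(x) tracks log2(log2(x)); with x = log2(n), T(n) grows like log log log n.
    for x in [2**4, 2**16, 2**64, 2**256]:
        print(U(x), math.log2(math.log2(x)))

For these tower-of-two inputs the two columns agree exactly, matching the O(log log x) bound on U.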

Related

Does g(n) = O(f(n)) imply g(n^2) = O(f(n^2))?

I have been stuck on a problem for a while, and I would like to know: if g(n) = O(log n), does g(n^2) = O(log n^2)? I would also like to know whether I can apply this method to solve the problem below.
The problem in question (for context only):
If f(n) = n * g(n^2) and g(n) = O(log n), prove that f(n) = O(n log n).
The answer is yes: n is just a number, and so is n^2. If you call a = n^2, then isn't g(a) still O(log(a))?
The source of your confusion is that, in fact, O(log(n^2)) = O(2 log n) = O(log n). This follows from the properties of the logarithm.
To illustrate, for your specific problem, we have that: If
g(n) = O(log n)
then
g(n^2) = O(log(n^2)) = O(2 log n) = O(log n).
And since
f(n) = n * g(n^2)
then clearly
f(n) = O(n log n). □
(The amount of rigor your proof requires would depend on the context of your course or book.)
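As a concrete illustration (a sketch of my own, using g = log itself as the witness), you can check numerically that log(n^2) = 2 log n and that f(n)/(n log n) stays bounded by the constant 2:

    import math

    def g(n):
        # A concrete function with g(n) = O(log n).
        return math.log(n)

    for n in [10, 1000, 10**6]:
        # log(n^2) equals 2*log(n) exactly, so g(n^2) = O(log n),
        # and f(n) = n*g(n^2) is at most 2*n*log(n).
        print(math.isclose(g(n**2), 2 * math.log(n)), n * g(n**2) / (n * math.log(n)))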

Recurrence equation for a new factorial algorithm

I am looking for a recursive algorithm to evaluate what I call Factorial(m, n) = m * (m+1) * ... * n, for every m ≤ n.
I appreciate any help.
What is the complexity of this algorithm?
Let T(n, m) be the time complexity of Factorial(n, m).
Let g(n) = Factorial(n, 1) = n! and let T'(n) be the time complexity of g(n). Then:
T(n, m) <= T'(n) + T'(m - 1) for any n, m (compute n! and (m - 1)! separately and divide),
and T'(n) = T'(n - 1) + O(1), which is O(n).
To sum up, T(n, m) = O(n) + O(m - 1) = O(n + m).
It will have the recurrence equation T(n) = T(n - 1) + 2 in the case of the function call Factorial(n, 1).
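For concreteness, here is one way the recursion described in the question might look in Python (the name factorial_range and the empty-product base case are my own choices):

    def factorial_range(m, n):
        # Product m * (m+1) * ... * n. Each call does O(1) work and
        # shortens the range by one element, so the recurrence is
        # T(k) = T(k-1) + O(1) = O(k) for a range of length k = n - m + 1.
        if m > n:
            return 1  # empty product
        return m * factorial_range(m + 1, n)

    print(factorial_range(1, 5))  # 120, i.e. 5!
    print(factorial_range(3, 6))  # 3*4*5*6 = 360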

Complexity of a particular divide and conquer algorithm

An algorithm decomposes (divides) a problem of size n into b sub-problems each of size n/b, where b is an integer. The cost of decomposition is n, and C(1) = 1. Show, using repeated substitution, that for all values of b ≥ 2, the complexity of the algorithm is O(n lg n).
This is what I use for my initial equation C(n) = C(n/b) + n
and after k steps of substitution I get C(n) = C(n/b^k) + n * [sum from i = 0 to k - 1 of (1/b)^i], with
k = log_b n.
I'm not sure I'm getting all of this right, because when I finish doing this I don't get n lg n. Can anybody help me figure out what to do?
I think your recurrence is wrong. Since there are b separate subproblems of size n/b, there should be a coefficient of b in front of the C(n / b) term. The recurrence should be
C(1) = 1
C(n) = b C(n/b) + O(n).
Using the Master Theorem, this solves to O(n log n). Another way to see this is that after expanding the recurrence k times, we get
C(n) = b^k C(n / b^k) + kn
This terminates when k = log_b n, at which point b^k = n, so C(n) = n C(1) + n log_b n, which is O(n log n).
Hope this helps!
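As a quick numeric check of the corrected recurrence (a sketch assuming n is a power of b, with b = 2 chosen arbitrarily), C(n) matches the closed form n log_b n + n exactly:

    import math

    def C(n, b=2):
        # Corrected recurrence C(n) = b*C(n/b) + n with C(1) = 1,
        # evaluated at exact powers of b.
        if n <= 1:
            return 1
        return b * C(n // b, b) + n

    for k in [4, 8, 16, 20]:
        n = 2**k
        print(n, C(n), n * math.log2(n) + n)  # closed form: n*log_b(n) + n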

Solving the recurrence T(n) = 2T(sqrt(n))

I would like to solve the following recurrence relation:
T(n) = 2T(√n);
I'm guessing that T(n) = O(log log n), but I'm not sure how to prove this. How would I show that this recurrence solves to O(log log n)?
One idea would be to simplify the recurrence by introducing a new variable k such that 2^k = n. Then, the recurrence relation works out to
T(2^k) = 2T(2^(k/2))
If you then let S(k) = T(2^k), you get the recurrence
S(k) = 2S(k / 2)
Note that this is equivalent to
S(k) = 2S(k / 2) + O(1)
since 0 = O(1). Therefore, by the Master Theorem, we get that S(k) = Θ(k), since we have a = 2, b = 2, and d = 0, with log_b a = 1 > d.
Since S(k) = Θ(k) and S(k) = T(2^k) = T(n), we get that T(n) = Θ(k). Since we picked 2^k = n, this means that k = log n, so T(n) = Θ(log n). This means that your initial guess of O(log log n) is incorrect and that the runtime is only logarithmic, not doubly logarithmic. If there were only one recursive call being made, though, you would be right that the runtime would be O(log log n).
Hope this helps!
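To see the Θ(log n) behavior concretely, here is a minimal sketch (the base case T(n) = 1 for n ≤ 2 is my assumption) that evaluates the recurrence directly:

    import math

    def T(n):
        # T(n) = 2*T(sqrt(n)); the base case T(n) = 1 for n <= 2 is assumed.
        if n <= 2:
            return 1
        return 2 * T(math.sqrt(n))

    for n in [2**4, 2**16, 2**64]:
        print(T(n), math.log2(n))  # the columns match: T(n) = Theta(log n)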
You can solve this easily by unrolling the recursion:
T(n) = 2T(n^(1/2)) = 4T(n^(1/4)) = ... = 2^k * T(n^(1/2^k))
The recursion finishes when it reaches a base case T(a), and you can find the appropriate a. When a = 0 or 1 the base case does not make sense, but with a = 2 you get n^(1/2^k) = 2, i.e. k = log log n.
Substituting that k into the last part of the first equation gives 2^(log log n) = log n, so the complexity is O(log(n)).
Check other similar recursions here:
T(n) = 2T(n^(1/2)) + log n
T(n) = T(n^(1/2)) + Θ(lg lg n)
T(n) = T(n^(1/2)) + 1

How to calculate T(n) = 3T(n/3) + O(lg n)

I know what O(lg n) and T(n) mean, but in algorithm analysis I don't know how to calculate T(n) = 3T(n/3) + O(lg n). Should I expand it?
Just like:
T(n) = 3^2 * T(n/3^2) + 3*O(lg(n/3)) + O(lg n), and so on...
then I get
T(n) = 3^(log_3 n) * T(1) + 3^(log_3 n - 1) * lg(n / 3^(log_3 n - 1)) + ... + 3*O(lg(n/3)) + O(lg n)
But how can I get the right answer? Is there an easier way to find it?
I think you can use the Master Theorem.
T(n) = aT(n/b) + f(n)
Here a = 3, b = 3 and f(n) = O(log n).
Since n^(log_b a) = n^(log_3 3) = n, and f(n) = O(log n) = O(n^(1-ε)) for any 0 < ε < 1, case 1 of the theorem applies,
which implies the answer is Θ(n).
For the Master Theorem's formula please see Wikipedia. There are three cases and they are quite simple.
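To back up the Θ(n) answer numerically, here is a small sketch (with an assumed base case T(1) = 1, evaluated at powers of 3) showing that T(n)/n levels off at a constant:

    import math

    def T(n):
        # T(n) = 3*T(n/3) + lg n, with the base case T(1) = 1 assumed.
        if n <= 1:
            return 1
        return 3 * T(n // 3) + math.log2(n)

    for k in [3, 6, 9, 12]:
        n = 3**k
        print(n, round(T(n) / n, 4))  # the ratio converges: T(n) = Theta(n)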
