Solving recurrence: T(n) = sqrt(2)·T(n/2) + log(n) [closed]

Given the recurrence T(n) = sqrt(2)·T(n/2) + log(n).
The solution points to case 1 of the Master Theorem, with a complexity class of O(sqrt(n)). However, to my understanding, log(n) is polynomially greater than sqrt(n). Am I missing something?
I used the definition as follows: compare f(n) against n^e, where e = log_b(a), a = sqrt(2) and b = 2. This gives e = 1/2 < 1, and log n is, as I understand it, polynomially greater than n^e.

No. log_x(n) is not greater than √n.
Consider n = 256:
√n = 16,
and
log_2(256) = 8 (taking base x = 2, as with many computational problems).
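A quick numeric comparison (a minimal Python sketch; the sample values are arbitrary) shows that √n pulls ahead of log_2(n) early and the gap keeps widening:

    import math

    # sqrt(n) overtakes log2(n) quickly and the gap keeps widening
    for n in (16, 256, 4096, 65536):
        print(n, math.sqrt(n), math.log2(n))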
In your recurrence,
T(n) = √2 · T(n/2) + log(n)
a = √2, b = 2 and f(n) = log(n),
so log_b(a) = log_2(√2) = 1/2.
Since log n = O(n^ε) for every ε > 0 (for example log n = O(n^(1/4)), and 1/4 < 1/2), f(n) is polynomially smaller than n^(log_b(a)), which is Case 1 of the Master Theorem.
Therefore T(n) = Θ(√n).

Using the master theorem you get a = sqrt(2), b = 2, and therefore c = log_b(a) = 1/2. Your f(n) = log(n) grows more slowly than n^(1/2), so you fall into the first case.
So your complexity is O(sqrt(n)).
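If you want to convince yourself numerically, here is a minimal Python sketch (the base case T(1) = 1 is an arbitrary assumption) that evaluates the recurrence at powers of 2; the ratio T(n)/√n settles toward a constant, as Θ(√n) predicts:

    import math

    def T(n):
        # T(n) = sqrt(2)*T(n/2) + log2(n); T(1) = 1 is an assumed base case
        if n <= 1:
            return 1.0
        return math.sqrt(2) * T(n / 2) + math.log2(n)

    for k in (8, 16, 24, 32):
        n = 2 ** k
        print(n, T(n) / math.sqrt(n))  # the ratio levels off at a constant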

Related

What is the complexity of T(n) = T(3n/4) + 2T(n/2) + n? [closed]

I am having trouble solving this problem. Expanding it out, I figured it ends up as roughly n·7^(log_{3/4} n), but I am not sure. What is the answer?
You can rewrite the recurrence like the following:
T(n) = T(n/(4/3)) + 2T(n/2) + n
Since n/2 < n/(4/3) and T is non-decreasing, we can say T(n) < 3T(n/(4/3)) + n. Then using the master theorem, T(n) = O(n^(log_{4/3}(3))) = O(n^3.82). On the other hand, T(n) > 3T(n/2) + n, so T(n) = Omega(n^(log_2(3))) = Omega(n^1.58).
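These bounds are easy to check empirically. Below is a minimal Python sketch (the base case T(n) = 1 for n <= 1 is an assumption) that evaluates the recurrence with memoization and estimates the growth exponent, which should land between 1.58 and 3.82, near the Akra-Bazzi value derived next:

    import math
    from functools import lru_cache

    @lru_cache(maxsize=None)
    def T(n):
        # T(n) = T(3n/4) + 2T(n/2) + n; T(n) = 1 for n <= 1 is an assumed base case
        if n <= 1:
            return 1.0
        return T(3 * n // 4) + 2 * T(n // 2) + n

    for n in (1 << 10, 1 << 14, 1 << 18):
        # empirical exponent log2(T(2n)/T(n)) should approach p ~ 2.13
        print(n, math.log2(T(2 * n) / T(n)))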
To find the exact growth rate, you can use the Akra-Bazzi theorem as well:
a1 = 1, b1 = 3/4
a2 = 2, b2 = 1/2
Find a p such that:
(3/4)^p + 2 (1/2)^p = 1
which gives p ≈ 2.13. So T(n) = Θ(n^p (1 + ∫_1^n u/u^(p+1) du)) = Θ(n^p), since ∫_1^n u/u^(p+1) du = ∫_1^n u^(-p) du = O(1) for p > 1.
In sum:
T(n) = Θ(n^2.13)
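As a sanity check on that exponent, a short bisection in Python (the bracket [2, 3] and the function name are my own choices) solves the characteristic equation:

    def g(p):
        # Akra-Bazzi characteristic equation: (3/4)^p + 2*(1/2)^p = 1
        return (3 / 4) ** p + 2 * (1 / 2) ** p - 1

    # g is strictly decreasing with g(2) > 0 > g(3), so bisect on [2, 3]
    lo, hi = 2.0, 3.0
    for _ in range(60):
        mid = (lo + hi) / 2
        if g(mid) > 0:
            lo = mid
        else:
            hi = mid
    print(lo)  # ~2.13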

What kind of algorithm does this recurrence relation represent? [closed]

Take a look at this recurrence relation for algorithm complexity:
T(n) = 2 T(n-1) - 1
What kind of algorithm does this recurrence relation represent? Notice that there's a minus instead of a plus, so it can't be a divide-and-conquer algorithm.
What kind of algorithm will have complexity with this as its recurrence relation?
T(n) = 2 T(n-1)-1
T(n) = 4 T(n-2)-3
T(n) = 8 T(n-3)-7
T(n) = 16 T(n-4)-15
...
T(n) = 2^k T(n-k) - (2^k - 1)
If, for example, T(1) = O(1), then with k = n-1:
T(n) = 2^(n-1) O(1) - (2^(n-1) - 1) = O(2^(n-1)) = O(2^n)
which is exponential growth.
Now let's see that O(1) - 1 = O(1). From CLRS:
O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 <= f(n) <= c g(n) for all n >= n0 }
Thus, to remove the effect of the -1 we just need to increase the hidden constant c by one.
So as long as your base case has a positive cost like O(1) or O(n), you shouldn't care about the -1. In other words, if your base case makes the recurrence T(n) = 2T(n-1) at least exponential in n, you don't need to care about this -1.
Example: imagine you are asked to tell whether a string S of n characters contains a specified character, and you proceed like this: run the algorithm recursively on S[0..n-2] and S[1..n-1], stopping the recursion when the string is one character long, at which point you just check that character.
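As a minimal Python sketch of that example (the names are mine, and the slicing overhead is ignored), the runtime satisfies T(n) = 2T(n-1) + O(1), hence the exponential blow-up:

    def contains(s, ch):
        # naive recursion: search the two overlapping substrings of length n-1
        if len(s) == 1:
            return s == ch
        return contains(s[:-1], ch) or contains(s[1:], ch)

    print(contains("hello", "e"))  # True, but Theta(2^n) time when ch is absent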
Based on the time complexity given, it is an exponential algorithm.
Reducing the size by 1 multiplies the time by 2 (approximately).
So it does not fall under any polynomial-time algorithmic paradigm such as divide and conquer, dynamic programming, ...

Big O notation of simple expressions [closed]

Why is
2n^2 = O(n^3)?
The definition says f(n) = O(g(n)) if f(n) <= c·g(n) for some constant c > 0 and all n > n0. Since there can be many upper bounds, is any other, larger upper bound also valid?
Definition from Skiena:
f(n)=O(g(n)) means c·g(n) is an upper bound on f(n). Thus there exists
some constant c such that always f(n) ≤ c·g(n), for large enough n
(i.e. , n ≥ n0 for some constant n0).
Here f(n) = 2n^2, g(n) = n^3
Here f(n) = 2n^2, g(n) = n^3. Take the constant c = 2: then 2n^2 <= 2n^3 for all n >= 1, so the statement is true.
Of course, you can show in the same way that it is O(n^2), with the same c = 2.
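A trivial Python check of both inequalities with c = 2 (a sketch, nothing more):

    for n in range(1, 6):
        f = 2 * n ** 2
        # with c = 2: f(n) <= 2*n^3 shows O(n^3); f(n) <= 2*n^2 shows O(n^2)
        print(n, f <= 2 * n ** 3, f <= 2 * n ** 2)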
From wiki:
A description of a function in terms of big O notation usually only
provides an upper bound on the growth rate of the function.
The big O notation only provides an upper bound, so...
2n² = O(n²), 2n² = O(n³), ..., 2n² = O(anything that grows at least as fast as n²)

What is the time complexity of the following codes in Big O notation? [closed]

T(n) = 8*T(n/2) + n*n
T(n) = 3*T(n/4) + n
I want to calculate the time complexity in Big O notation. What is the answer (without using the master theorem)?
The master theorem applies to any recurrence of the form T(n) = a·T(n/b) + n^c. It compares the two parts of the recurrence:
1) The work done at the current level outside the recursive calls (n^c)
2) The number and size of the recursive calls (a and b)
From here, we compare log_b(a) to c. There are three possibilities:
log_b(a) > c -> T(n) is O(n^(log_b(a)))
log_b(a) < c -> T(n) is O(n^c)
log_b(a) = c -> T(n) is O(n^c log(n))
So for your two examples...
T(n) = 8*T(n/2) + n*n, so a = 8, b = 2, c = 2; log_2(8) = 3 > 2, therefore T(n) is O(n^(log_2(8))) = O(n^3)
T(n) = 3*T(n/4) + n, so a = 3, b = 4, c = 1; log_4(3) < 1, therefore T(n) is O(n^c) = O(n)
A fuller explanation on Wikipedia
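The comparison is mechanical enough to script. Here is a minimal Python sketch (the function name and output format are my own) that classifies T(n) = a·T(n/b) + n^c into the three cases:

    import math

    def master(a, b, c):
        # classify T(n) = a*T(n/b) + n^c by comparing log_b(a) with c
        crit = math.log(a, b)
        if math.isclose(crit, c):
            return f"O(n^{c} log n)"
        if crit > c:
            return f"O(n^{crit:g})"
        return f"O(n^{c})"

    print(master(8, 2, 2))  # T(n) = 8T(n/2) + n^2 -> O(n^3)
    print(master(3, 4, 1))  # T(n) = 3T(n/4) + n   -> O(n^1), i.e. O(n)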
For the first relation, you can also expand it directly without the master theorem. Unrolling the recursion, level i of the tree contains 8^i subproblems of size n/2^i, each doing (n/2^i)^2 work, so level i costs 8^i · n²/4^i = 2^i n². Summing over the log_2(n) levels gives n²(1 + 2 + 4 + ... + 2^(log_2 n)) ≈ 2n² · 2^(log_2 n) = 2n³, i.e. T(n) = O(n³).

Asymptotic complexity of T(n)=T(n-1)+1/n [closed]

There is an algorithm with the time complexity
T(n) = T(n-1) + 1/n   if n > 1
T(n) = 1              otherwise
I am solving for its asymptotic complexity and getting order 'n', but the given answer is 'log n'. Is that correct? If it is log n, then why?
It can be easily seen (or proven formally with induction) that T(n) is the sum of 1/k for the values of k from 1 to n. This is the nth harmonic number, H_n = 1 + 1/2 + 1/3 + ... + 1/n.
Asymptotically, the harmonic numbers grow on the order of log(n), because the sum is close in value to the integral of 1/x from 1 to n, which equals the natural logarithm of n. In fact, H_n = ln(n) + γ + O(1/n), where γ ≈ 0.577 is the Euler-Mascheroni constant. From this it is easy to show that T(n) = Θ(log(n)).
For more details:
With H(N) = 1 + 1/2 + 1/3 + ... + 1/N:
the function x ↦ 1/x is decreasing, so for every k >= 1,
∫_k^(k+1) dx/x <= 1/k <= ∫_(k-1)^k dx/x.
Summing the left-hand inequality for k from 1 to N, and the right-hand one for k from 2 to N (adding 1 for the first term), gives
∫_1^(N+1) dx/x <= H(N) <= 1 + ∫_1^N dx/x.
Evaluating the integrals on both sides: ln(N+1) <= H(N) <= 1 + ln(N).
This implies H(N)/ln(N) -> 1, hence H(N) = Θ(log(N)).
(from http://fr.wikipedia.org/wiki/S%C3%A9rie_harmonique#.C3.89quivalent_de_Hn)
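A direct Python computation of the recurrence confirms the logarithmic growth (0.5772 below approximates the Euler-Mascheroni constant γ):

    import math

    def T(n):
        # T(n) = T(n-1) + 1/n with T(1) = 1: exactly the n-th harmonic number
        total = 1.0
        for k in range(2, n + 1):
            total += 1.0 / k
        return total

    for n in (10, 1000, 100000):
        print(n, T(n), math.log(n) + 0.5772)  # H(n) ~ ln(n) + gamma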
