Time complexity of a modified quicksort algorithm

Consider a function QUARTERSORT which takes an array of length n and sorts it in the following way:
If n < 100, it uses the regular QUICKSORT.
Otherwise, we split the array into A1 = A[1,...,n/4] and A2 = A[(n/4)+1,...,n].
Then we call QUARTERSORT twice: B1 = QUARTERSORT(A1) and B2 = QUARTERSORT(A2).
Finally, we merge B1 and B2.
Now, why is the recurrence T(n) = T(0.25n) + T(0.75n) + O(n) and not T(n) = T(0.25n) + T(0.75n) + O(n log n)?

Intuitively, you can ignore the part about quicksort, because it only happens for small n, and the big-O notation only talks about values of n that are "big enough". So the parts of the algorithm are:
Recursive invocation on 1/4 of input: T(1/4 * n)
Recursive invocation on 3/4 of input: T(3/4 * n)
Merging: O(n)
A bit more formally: the quicksort call only ever runs on arrays of size less than 100, so its cost is bounded by a constant, i.e. O(1). This term can safely be ignored, because larger terms in the running time, like O(n), dominate it.

The recurrence is T(n) = T(0.25n) + T(0.75n) + O(n) because every non-recursive step of the algorithm is O(n) on its own: splitting the array into two parts is O(n), and merging the two sorted parts is O(n). That gives a total of T(n) = T(0.25n) + T(0.75n) + O(n), as expected.

Splitting takes O(n), just as quicksort takes O(n) to place its pivot (and once placed, the pivot never moves). The sizes of the two subproblems are n/4 and 3n/4, so the recurrence is
T(n) = T(0.25n) + T(0.75n) + O(n)
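A minimal Python sketch of QUARTERSORT as described above. The function names and the use of sorted() as a stand-in for the regular QUICKSORT are assumptions, not taken from the question; only the cutoff of 100 and the n/4 vs 3n/4 split come from the source.

```python
# Hypothetical sketch of QUARTERSORT. sorted() stands in for the
# regular QUICKSORT used below the cutoff.

def quartersort(a):
    n = len(a)
    if n < 100:
        return sorted(a)              # small input: regular quicksort, O(1) for bounded n
    quarter = n // 4
    b1 = quartersort(a[:quarter])     # first n/4 elements  -> T(n/4)
    b2 = quartersort(a[quarter:])     # remaining 3n/4      -> T(3n/4)
    return merge(b1, b2)              # O(n) merge

def merge(left, right):
    # Standard O(n) two-way merge of two sorted lists.
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out
```

Each level of recursion does O(n) non-recursive work (the split and the merge), which is exactly the O(n) additive term in the recurrence.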

Recurrence Relation for Divide and Conquer

Describe the recurrence for the running time T(n) on an input of size n.
A divide and conquer algorithm takes an array of n elements and divides it into three subarrays of size n/4 each, taking Θ(n) time to do the subdivision. The time taken to combine the outputs of the subproblems is Θ(1).
I came up with this recurrence relation, but it's not correct:
T(n) = 3T(n/4) + Θ(1)
Can someone tell me what I am doing wrong?
You missed the Θ(n) time taken to do the subdivision.
The relation should include subdivision + work on the smaller parts + combining:
T(n) = Θ(n) + 3T(n/4) + Θ(1) = 3T(n/4) + Θ(n)
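This is consistent with the Master theorem: since log_4(3) ≈ 0.79 < 1, the Θ(n) subdivision work dominates and T(n) = Θ(n). A quick numeric check (constants assumed to be 1, n restricted to powers of 4 so the division is exact) shows the ratio T(n)/n settling toward a constant:

```python
from functools import lru_cache

# Numeric check of T(n) = 3*T(n/4) + n with T(1) = 1.
# Since log_4(3) < 1, T(n) = Theta(n): T(n)/n approaches a constant (4).

@lru_cache(maxsize=None)
def T(n):
    if n <= 1:
        return 1
    return 3 * T(n // 4) + n

# T(4) = 7, T(16) = 37, and T(n)/n -> 4 as n grows.
```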

Recurrence relations, analyze algorithm

Alright, I have some difficulty completely understanding recurrence relations.
So, for example, if I'm analyzing quicksort in the worst case using Θ-notation, given an array of unsorted positive numbers:
In the base case n <= 3, I sort the small array using insertion sort.
Time: O(1) or O(n^2)?
I use linear search to set the pivot as the median of all elements.
Time: Θ(n)
Partitioning left and right of pivot and performing recursion.
Time: 2 * T(n/2) I think
would the recurrence be:
T(n) = O(1) + Θ(n) + 2 * T(n/2) then?
Something tells me the base case would instead take O(n^2) time, because for a small enough input, that would be insertion sort's worst case.
Would that then give me the recurrence:
T(n) = O(n^2) + Θ(n) + 2 * T(n/2)?
When base case n <= 3 I sort the small array using insertionsort.
Always O(1): insertion sort on at most 3 elements does a bounded amount of work.
I use linear search to set the pivot as the median of all elements.
You may want to clarify this more: finding the median as the pivot is not a simple matter of linear search. The usual ways are i) the quickselect algorithm (average O(N)), ii) sorting the subpartition, O(N log N), or iii) the median-of-medians algorithm, O(N).
Partitioning left and right of pivot and performing recursion.
2F(N/2) + N
Putting it all together (assuming the pivot is always the median and that you take O(N) to find the pivot each time):
Best case = worst case (since the pivot is always the median)
F(3) = F(2) = F(1) = 1
F(N) = 2F(N/2) + 2N          (N to find the median + N to partition)
     = 2(2F(N/2^2) + 2(N/2)) + 2N
     = 2^2 F(N/2^2) + 2N + 2N
     = 2^3 F(N/2^3) + 3(2N)
     ...
     = 2^(log N) F(1) + (2N) log N
     = N + 2N log N
     = O(N log N)
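For concreteness, here is a hedged Python sketch of the variant being analyzed: a base case for n <= 3, an exact median as pivot, then recursion on both sides. statistics.median_low is used as a stand-in for a linear-time selection such as median-of-medians; the structure, not the pivot-finding method, is the point.

```python
import statistics

# Sketch (not the asker's code): quicksort with the exact median as pivot.

def median_pivot_quicksort(a):
    if len(a) <= 3:
        return sorted(a)                  # insertion-sort-style base case, O(1)
    pivot = statistics.median_low(a)      # stand-in for O(N) selection
    left = [x for x in a if x < pivot]    # partition: O(N)
    mid = [x for x in a if x == pivot]
    right = [x for x in a if x > pivot]
    return median_pivot_quicksort(left) + mid + median_pivot_quicksort(right)
```

Because the pivot is the median, both recursive calls get about N/2 elements, which is what makes the best and worst cases coincide at O(N log N).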

Are my big-O approximations correct?

I have an algorithm to determine if two strings are permutations of each other. The code can be found here: https://jsfiddle.net/bxohcgjn/
N = length of string A
M = length of string B
For time complexity, I have: O(N(N log N + M log M))
For space complexity, I have: O(N + M)
N log N for sorting A
M log M for sorting B
I understand that the browser's implementation for sort will change it, but I am assuming quicksort.
Just want to see if my thinking is correct on this.
About your time complexity: the for loop (the linear part) must not be multiplied by the sum of both sorts.
If an algorithm consists of n steps, the order of the algorithm is the sum of their orders:
O(alg) = O(step_1) + O(step_2) + ... + O(step_n)
In your case, n = 3 (both sorts and the for):
O(is_permutation) = O(sort_A) + O(sort_B) + O(for)
= O(n logn) + O(m logm) + O(n)
Which is the maximum of them:
O(is_permutation) = max(O(n logn), O(m logm), O(n))
But since you check that both strings have the same length before applying the sorts, in the worst case (which is what you are analyzing) there is only one size, so the expression reduces to:
O(is_permutation) = max(O(n logn), O(n logn), O(n))
= max(O(n logn), O(n))
= O(n logn)
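The linked jsfiddle is JavaScript; as an assumed Python analogue, the sort-based check can be sketched with the O(1) length test done first, so both sorts run on inputs of the same size:

```python
# Assumed analogue of the fiddle's algorithm, not the original code.

def is_permutation(a: str, b: str) -> bool:
    if len(a) != len(b):           # O(1): different lengths can't be permutations
        return False
    return sorted(a) == sorted(b)  # two O(n log n) sorts + one O(n) comparison
```

A counting approach (e.g. comparing collections.Counter(a) to Counter(b)) would bring the time down to O(n) at the same O(n) space.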

Expected running time value of a random algorithm

Given this algorithm, I am required to :
Find the recursion formula of the expected value of the running time.
Find the closest upper bound as possible.
I am actually a bit lost so if someone could help...
Recursive formula for worst case: T(n) = T(n/2) + n
Recursive formula for best case: T(n) = T(1) + n
Recursive formula for expected case: T(n) = T(n/4) + n
Worst case: 2n = O(n)
Best case: n = O(n)
Expected case: 4n/3 = O(n)
Some people here seem to be confused about the log(n) factor. A log(n) factor would only be required if T(n) = 2T(n/2) + n, i.e., if the function called itself TWICE recursively with half the input.
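The expected-case value 4n/3 quoted above comes from unrolling T(n) = T(n/4) + n into a geometric series, n + n/4 + n/16 + ... = n / (1 - 1/4). A small numeric check (treating n as a real number and stopping below 1):

```python
# Unrolling T(n) = T(n/4) + n: the total work is a geometric series
# summing to 4n/3, so the expected running time is O(n).

def expected_T(n):
    total = 0.0
    while n >= 1:
        total += n    # work at this level
        n /= 4        # subproblem shrinks to a quarter
    return total

# expected_T(n) / n approaches 4/3 as n grows.
```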

Time complexity of one recursive algorithm

Here we have an algorithm with running time
T(n) = n-1 + T(i-1) + T(n-i)
T(1) = 1
where i is between 1 and n. How do I calculate its time complexity?
I recognise this as the quicksort recurrence (randomized quicksort); I am fairly sure the question simply omitted the summation over the pivot positions.
You can use the substitution method here: check with O(n^2) and you will see that O(n^2) is the worst-case time complexity.
The average case is a bit trickier, since the pivot can be any element from 1 to n. Analysing that case, substitution with T(n) = O(n log n) works.
I think we should solve it like this:
If i = 2 on every call, we have
T(n) = n + T(n-2) = Θ(n^2)
If i = n/2 on every call, we have
T(n) = n-1 + T(n/2 - 1) + T(n/2) = Θ(n log n)
So the upper bound is O(n^2), and the algorithm is O(n^2) in the worst case.
