Merge sort's running time in the recursive version

I learned that the time function of merge sort is the following:
T(n) = 2T(n/2) + Θ(n) for n > 1
I understand why T(n) = 2T(n/2) + A, but why is A = Θ(n)?
I think A might be the time for the dividing step, but I don't understand why it would be expressed as Θ(n).
Please help!

No, A is not the dividing step. A is the merging step, which is linear.
void merge(int a[], int b[], int p, int q, int c[])
/* Merge the sorted arrays a[0..p) and b[0..q) into array c[0..p+q) */
{
    int i = 0, j = 0, k = 0;
    while (i < p && j < q) {
        if (a[i] <= b[j]) {
            c[k] = a[i];
            i++;
        } else {
            c[k] = b[j];
            j++;
        }
        k++;
    }
    while (i < p) {
        c[k] = a[i];
        i++;
        k++;
    }
    while (j < q) {
        c[k] = b[j];
        j++;
        k++;
    }
}
This merging step takes O(p + q) time, where p and q are the subarray lengths; here p + q = n.
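As a sanity check, here is the same merge transcribed to Java with a tiny driver (a sketch; the class name and example arrays are mine, not from the question). Each loop iteration writes exactly one element of c, so the total work is proportional to p + q.

```java
public class MergeDemo {
    // Merge sorted a[0..p) and sorted b[0..q) into c[0..p+q).
    // Every iteration of every loop writes exactly one element of c,
    // so the merge does p + q writes in total: linear in n = p + q.
    static void merge(int[] a, int[] b, int p, int q, int[] c) {
        int i = 0, j = 0, k = 0;
        while (i < p && j < q) {
            if (a[i] <= b[j]) c[k++] = a[i++];
            else              c[k++] = b[j++];
        }
        while (i < p) c[k++] = a[i++];
        while (j < q) c[k++] = b[j++];
    }

    public static void main(String[] args) {
        int[] a = {1, 3, 5};
        int[] b = {2, 4, 6, 8};
        int[] c = new int[a.length + b.length];
        merge(a, b, a.length, b.length, c);
        System.out.println(java.util.Arrays.toString(c)); // [1, 2, 3, 4, 5, 6, 8]
    }
}
```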


How to calculate O(nlogn) iterative merge sort time complexity step by step using sigma notation

void merge(int arr[], int l, int m, int r);

int min(int x, int y) { return (x < y) ? x : y; }

void mergeSort(int arr[], int n)
{
    int curr_size;
    int left_start;
    for (curr_size = 1; curr_size <= n - 1; curr_size = 2 * curr_size)
    {
        for (left_start = 0; left_start < n - 1; left_start += 2 * curr_size)
        {
            int mid = min(left_start + curr_size - 1, n - 1);
            int right_end = min(left_start + 2 * curr_size - 1, n - 1);
            merge(arr, left_start, mid, right_end);
        }
    }
}
void merge(int arr[], int l, int m, int r)
{
    int i, j, k;
    int n1 = m - l + 1;
    int n2 = r - m;
    int L[n1], R[n2];
    for (i = 0; i < n1; i++)
        L[i] = arr[l + i];
    for (j = 0; j < n2; j++)
        R[j] = arr[m + 1 + j];
    i = 0;
    j = 0;
    k = l;
    while (i < n1 && j < n2)
    {
        if (L[i] <= R[j])
        {
            arr[k] = L[i];
            i++;
        }
        else
        {
            arr[k] = R[j];
            j++;
        }
        k++;
    }
    while (i < n1)
    {
        arr[k] = L[i];
        i++;
        k++;
    }
    while (j < n2)
    {
        arr[k] = R[j];
        j++;
        k++;
    }
}
I couldn't figure out the O(n log n) time complexity of the iterative merge sort.
I need to work it out myself, step by step, the way time complexities are normally derived. Derivations for the recursive merge sort are easy to find, but I couldn't find or work out one for the iterative version.
Please help.
Assuming n is a power of 2, there are n/2 merges producing runs of size 2, then n/4 merges producing runs of size 4, and so on, down to 1 merge producing a run of size n.
The cost of a merge of size k is linear in k.
So the total cost is n/2 * 2 + n/4 * 4 + ... + 1 * n: each of the log_2 n passes does n work, for a total of n log_2 n.
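That per-pass accounting can be sketched in code (a sketch, assuming n is a power of two as in the argument; the class name is mine): every pass touches all n elements once, and the run width doubles each pass until it reaches n.

```java
public class BottomUpCost {
    // Total elements touched by all merges, counted pass by pass:
    // each pass over the array costs n, and the run width doubles
    // each pass, so for n a power of two there are log2(n) passes.
    static long totalMergeCost(int n) {
        long cost = 0;
        for (int width = 1; width < n; width *= 2) {
            cost += n; // every element takes part in exactly one merge this pass
        }
        return cost;
    }

    public static void main(String[] args) {
        int n = 1024; // a power of two, as the argument above assumes
        System.out.println(totalMergeCost(n)); // n * log2(n) = 1024 * 10 = 10240
    }
}
```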

Time complexity of this algorithm? How to analysis?

int fun(int n)
{
    int i = 0, j = 0, m = 0;
    for (i = n; i > 0; i /= 2)
    {
        for (j = 0; j < i; j++)
        {
            m += 1;
        }
    }
    return m;
}
Your running time is Sum_{i=0}^{log n} n/2^i. The leading term is n, so the algorithm is O(n); in fact the sum never exceeds 2n.
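A quick empirical check of that bound (fun is transcribed from the question; the wrapper class is mine):

```java
public class GeometricBound {
    // Same loops as fun(n) in the question: the inner loop runs i times
    // for i = n, n/2, n/4, ..., 1, so m = n + n/2 + n/4 + ... <= 2n.
    static int fun(int n) {
        int m = 0;
        for (int i = n; i > 0; i /= 2)
            for (int j = 0; j < i; j++)
                m += 1;
        return m;
    }

    public static void main(String[] args) {
        // Verify m <= 2n over a range of inputs.
        for (int n = 1; n <= 1000; n++) {
            if (fun(n) > 2 * n) throw new AssertionError("bound violated at n=" + n);
        }
        System.out.println(fun(1000)); // 1000 + 500 + 250 + ... + 1 = 1994
    }
}
```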

Kth largest number: why is the runtime O(n) and not O(n log n)?

I came across the kth largest number problem on LeetCode.
Input: [3,2,1,5,6,4] and k = 2, Output: 5
Suggested Solution:
public int findKthLargest(int[] nums, int k) {
    shuffle(nums);
    k = nums.length - k;
    int lo = 0;
    int hi = nums.length - 1;
    while (lo < hi) {
        final int j = partition(nums, lo, hi);
        if (j < k) {
            lo = j + 1;
        } else if (j > k) {
            hi = j - 1;
        } else {
            break;
        }
    }
    return nums[k];
}

private int partition(int[] a, int lo, int hi) {
    int i = lo;
    int j = hi + 1;
    while (true) {
        while (i < hi && less(a[++i], a[lo]));
        while (j > lo && less(a[lo], a[--j]));
        if (i >= j) {
            break;
        }
        exch(a, i, j);
    }
    exch(a, lo, j);
    return j;
}

private void exch(int[] a, int i, int j) {
    final int tmp = a[i];
    a[i] = a[j];
    a[j] = tmp;
}

private boolean less(int v, int w) {
    return v < w;
}
Doesn't partition take O(n), and doesn't the while loop in the main function run O(log n) times, so the total should be O(n log n)? This looks like quicksort, and quicksort's runtime is O(n log n). If quicksort took O(n) this would make sense, but it doesn't. Please help me understand what is going on.
This is a randomized algorithm with average/expected O(n) runtime. After randomly shuffling the input, the pivots are typically good enough that each call to partition (when it doesn't hit the target index exactly) roughly halves the portion of the list still to be searched. So even when partition has to be called over and over, the work shrinks geometrically, and O(n) + O(n/2) + O(n/4) + ... + O(1) is still O(n).
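For reference, the suggested solution runs as-is once a shuffle is supplied; here is a self-contained sketch with a plain Fisher-Yates shuffle filled in (the shuffle implementation is my assumption, since it isn't shown above):

```java
import java.util.Random;

public class KthLargest {
    public static int findKthLargest(int[] nums, int k) {
        shuffle(nums);
        k = nums.length - k; // k-th largest = (n-k)-th smallest, 0-based
        int lo = 0, hi = nums.length - 1;
        while (lo < hi) {
            int j = partition(nums, lo, hi);
            if (j < k) lo = j + 1;
            else if (j > k) hi = j - 1;
            else break;
        }
        return nums[k];
    }

    private static int partition(int[] a, int lo, int hi) {
        int i = lo, j = hi + 1;
        while (true) {
            while (i < hi && a[++i] < a[lo]) ;
            while (j > lo && a[lo] < a[--j]) ;
            if (i >= j) break;
            exch(a, i, j);
        }
        exch(a, lo, j);
        return j;
    }

    private static void exch(int[] a, int i, int j) {
        int tmp = a[i]; a[i] = a[j]; a[j] = tmp;
    }

    // Fisher-Yates shuffle: makes the expected O(n) bound hold
    // regardless of the input order.
    private static void shuffle(int[] a) {
        Random rnd = new Random();
        for (int i = a.length - 1; i > 0; i--) {
            exch(a, i, rnd.nextInt(i + 1));
        }
    }

    public static void main(String[] args) {
        System.out.println(findKthLargest(new int[]{3, 2, 1, 5, 6, 4}, 2)); // 5
    }
}
```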

Recurrence equation - Recursion inside for loop

I was trying to solve a university problem about recurrence equations and computational complexity, but I can't figure out how to set up the recurrence.
static void comb(int[] a, int i, int max) {
    if (i < 0) {
        for (int h = 0; h < a.length; h++)
            System.out.print((char)('a' + a[h]));
        System.out.print("\n");
        return;
    }
    for (int v = max; v >= i; v--) {
        a[i] = v;
        comb(a, i - 1, v - 1);
    }
}

static void comb(int[] a, int n) { // a.length <= n
    comb(a, a.length - 1, n - 1);
}
I tried to set up the following recurrence:

T(n, i, j) = O(n) + c                  if i < 0
T(n, i, j) = (j-i) T(n, i-1, j-1)      otherwise

Solving:

T(n, i, j) = (j-i) T(n, i-1, j-1)
           = (j-i) ((j-1)-(i-1)) T(n, i-2, j-2)
           = (j-i)^2 T(n, i-2, j-2)
           = ...
           = (j-i)^k T(n, i-k, j-k)
At this point I'm stuck and cannot figure out how to proceed.
Thanks, and sorry for my bad English.
Luigi
With your derivation
T(n, i, j) = ... = (j-i)^k T(n, i-k, j-k)
you are almost done! Just set k = i+1 to reach the base case, and you get:
T(n, i, j) = (j-i)^(i+1) T(n, -1, j-i-1) = (j-i)^(i+1) O(n) = O(n (j-i)^(i+1))
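As a cross-check on that bound (a sketch; printing is replaced by a counter, and the wrapper class is mine): the recursion builds strictly decreasing sequences a[i] > a[i-1] > ... > a[0] with values in {0, ..., n-1}, so the number of lines printed is the binomial coefficient C(n, m) for m = a.length, which O(n (j-i)^(i+1)) over-approximates.

```java
public class CombCount {
    // Same recursion as comb in the question, but counting the
    // base-case prints instead of printing characters.
    static long count(int[] a, int i, int max) {
        if (i < 0) return 1; // one combination emitted
        long total = 0;
        for (int v = max; v >= i; v--) {
            a[i] = v;
            total += count(a, i - 1, v - 1);
        }
        return total;
    }

    static long count(int m, int n) { // m = a.length, values drawn from {0..n-1}
        return count(new int[m], m - 1, n - 1);
    }

    public static void main(String[] args) {
        System.out.println(count(2, 4)); // C(4,2) = 6
        System.out.println(count(3, 5)); // C(5,3) = 10
    }
}
```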

3SUM With a twist

I was asked this question in an interview and was not sure how to answer. It is the regular 3SUM problem, and we all know the O(n^2) answer. The question goes this way: you have three unsorted arrays a, b, c. Find three elements such that a[i] + b[j] + c[k] = 0. You are not allowed to use hashing, and the solution must be O(n^2) or better.
Here is my answer, and yes, unfortunately it is still O(n^3):
public static void get3Sum(int[] a, int[] b, int[] c) {
    int i = 0, j = 0, k = 0;
    int lengthOfArrayA = a.length, lengthOfArrayB = b.length, lengthOfArrayC = c.length;
    for (i = 0; i < lengthOfArrayA; i++) {
        j = k = 0;
        while (j < lengthOfArrayB) {
            if (k >= lengthOfArrayC) {
                j++;
                continue;
            } else if (a[i] + b[j] + c[k] == 0) {
                // found it: so print
                System.out.println(a[i] + " " + b[j] + " " + c[k]);
                k++;
                if (j > lengthOfArrayB - 1)
                    break;
            } else {
                k++;
                if (k >= lengthOfArrayC) {
                    j++;
                    k = 0;
                }
            }
        }
    }
}
Does anyone have any brilliant ideas to solve this in O(n^2) or less?
Thanks!
Sort A and sort B.
Once they are sorted, given any target S we can find i, j such that A[i] + B[j] = S in O(n) time.
Maintain two pointers a and b, with a starting at the smallest element of A and b at the largest element of B; then advance a or retreat b depending on how A[a] + B[b] compares with S.
For your problem, run this O(n) procedure n times (so O(n^2) total), taking S to be each of the values -C[k].
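That strategy can be sketched directly (the class and method names are mine): sort a and b once up front, then for each k do a two-pointer scan for a pair summing to -c[k]. Sorting is O(n log n), and the n scans dominate at O(n^2), with no hashing anywhere.

```java
import java.util.Arrays;

public class ThreeArraySum {
    // For each k, look for i, j with a[i] + b[j] == -c[k] using the
    // classic two-pointer scan over sorted copies of a and b:
    // O(n) per k, O(n^2) overall.
    static int[] find(int[] a, int[] b, int[] c) {
        a = a.clone(); b = b.clone(); // leave the caller's arrays untouched
        Arrays.sort(a);
        Arrays.sort(b);
        for (int k = 0; k < c.length; k++) {
            int target = -c[k];
            int i = 0, j = b.length - 1;
            while (i < a.length && j >= 0) {
                int sum = a[i] + b[j];
                if (sum == target) return new int[]{a[i], b[j], c[k]};
                else if (sum < target) i++; // need a bigger sum
                else j--;                   // need a smaller sum
            }
        }
        return null; // no triple sums to zero
    }

    public static void main(String[] args) {
        int[] t = find(new int[]{5, -2, 9}, new int[]{3, 7, 1}, new int[]{-8, 0, 2});
        System.out.println(Arrays.toString(t)); // [5, 3, -8]
    }
}
```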
