Assume we have a set of n jobs to execute, each of which takes unit time. At any time we can serve exactly one job. Job i, 1<=i<=n, earns us a profit if and only if it is executed no later than its deadline.
We call a set of jobs feasible if there exists at least one sequence that allows each job in the set to be performed no later than its deadline; equivalently, a set is feasible if scheduling it in "earliest deadline first" order meets every deadline.
Show that the following greedy algorithm is optimal: at every step, add the job with the highest profit among those not yet considered, provided that the chosen set of jobs remains feasible.
MUST DO THIS FIRST: show first that it is always possible to re-schedule two feasible sequences (one of them computed by the greedy algorithm) in such a way that every job common to both sequences is scheduled at the same time. These new sequences might contain gaps.
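To make the algorithm concrete, here is a minimal Java sketch of that greedy for the unit-time setting (class names and the sample data are just illustrative); the feasibility test simply checks the chosen set in earliest-deadline-first order:

import java.util.*;

public class GreedyJobSelection {
    // Each job takes unit time; job i has profit[i] and deadline[i],
    // meaning it must run in some slot <= deadline[i] (slots are 1, 2, 3, ...).
    static List<Integer> select(int[] profit, int[] deadline) {
        int n = profit.length;
        Integer[] order = new Integer[n];
        for (int i = 0; i < n; i++) order[i] = i;
        // consider jobs by decreasing profit
        Arrays.sort(order, (a, b) -> Integer.compare(profit[b], profit[a]));

        List<Integer> chosen = new ArrayList<>();
        for (int job : order) {
            chosen.add(job);                                        // tentatively add
            if (!feasible(chosen, deadline)) chosen.remove(chosen.size() - 1);
        }
        return chosen;
    }

    // A set of unit-time jobs is feasible iff scheduling it in
    // earliest-deadline-first order meets every deadline.
    static boolean feasible(List<Integer> jobs, int[] deadline) {
        List<Integer> sorted = new ArrayList<>(jobs);
        sorted.sort((a, b) -> Integer.compare(deadline[a], deadline[b]));
        for (int slot = 1; slot <= sorted.size(); slot++) {
            if (deadline[sorted.get(slot - 1)] < slot) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        int[] profit = {40, 30, 20, 10};   // illustrative data only
        int[] deadline = {1, 1, 4, 1};
        System.out.println(select(profit, deadline)); // [0, 2]: the jobs with profit 40 and 20
    }
}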
UPDATE
I created an example that seems to disprove the algorithm:
Assume 4 jobs:
Job A has profit 1, time duration 2, deadline before day 3;
Job B has profit 4, time duration 1, deadline before day 4;
Job C has profit 3, time duration 1, deadline before day 3;
Job D has profit 2, time duration 1, deadline before day 2.
If we use the greedy algorithm with highest profit first, then we only get jobs B & C. However, if we go deadline first, then we can fit three of the four jobs, and the order is C, D, B.
Not sure if I am approaching this question in the right way, since I created an example that seems to disprove what the question asks me to prove.
This problem looks like Job Shop Scheduling, which is NP-complete (meaning no efficient algorithm is known that always finds the optimum, greedy or otherwise, despite experts searching for one since the '70s). Here's a video on a more advanced form of that use case being solved with a greedy algorithm followed by Local Search.
If we presume your use case can indeed be relaxed to Job Shop Scheduling, then there are many optimization algorithms that can help, such as Metaheuristics (including Local Search variants such as Tabu Search and Simulated Annealing), (M)IP, Dynamic Programming, Constraint Programming, ... The reason there are so many choices is that none of them is perfect. I prefer Metaheuristics, as they out-scale the others in all the research challenges I've seen.
In fact, none of "earliest deadline first", "highest profit first", or "highest profit/duration first" is a correct algorithm...
Assume 2 jobs:
Job A has profit 1, time duration 1, deadline before day 1;
Job B has profit 2, time duration 2, deadline before day 2;
Then "earliest deadline first" fails to get correct answer. Correct answer is B.
Assume another 5 jobs:
Job A has profit 2, time duration 3, deadline before day 3;
Job B has profit 1, time duration 1, deadline before day 1;
Job C has profit 1, time duration 1, deadline before day 2;
Job D has profit 1, time duration 1, deadline before day 3;
Job E has profit 1, time duration 1, deadline before day 4;
Then "highest profit first" fails to get correct answer. Correct answer is BCDE.
Assume another 4 jobs:
Job A has profit 6, time duration 4, deadline before day 6;
Job B has profit 4, time duration 3, deadline before day 6;
Job C has profit 4, time duration 3, deadline before day 6;
Job D has profit 0.0001, time duration 2, deadline before day 6;
Then "highest profit/duration first" fails to get correct answer. Correct answer is BC (Thanks for #dognose's counter-example, see comment).
One correct algorithm is Dynamic Programming:
First order the jobs by deadline, ascending. dp[i][j] = k means that, using only the first i jobs and exactly j time units, the highest profit we can get is k. Initially dp[0][0] = 0.
Job info is stored in 3 arrays: profits in profit[i], durations in time[i], and deadlines in deadline[i], 1<=i<=n.
// sort by deadline in ascending order
...
// initially the 2-dimensional dp array is all -1; -1 means the state is unreachable
...
dp[0][0] = 0;
int maxDeadline = max(deadline); // max value of deadline
for (int i = 0; i < n; i++) {
    for (int j = 0; j <= maxDeadline; j++) {
        if (dp[i][j] == -1) continue; // skip unreachable states
        // option 1: skip job i+1, carrying the current best profit forward
        dp[i+1][j] = max(dp[i+1][j], dp[i][j]);
        // option 2: do job i+1 if it still finishes by its deadline
        if (j + time[i+1] <= deadline[i+1]) {
            dp[i+1][j+time[i+1]] = max(dp[i+1][j+time[i+1]], dp[i][j] + profit[i+1]);
        }
    }
}
// the max total profit is the max value in the 2-dimensional dp array
The time/space complexity of the DP algorithm is O(n*m), where n is the job count and m is the maximum deadline, so it depends heavily on how many jobs there are and how large the deadlines can get. If n and/or m is very large it may be difficult to get an answer, but for common inputs it works well.
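For completeness, here is a self-contained Java version of the DP above (class and method names are mine, arrays are 0-indexed); it makes the "skip this job" transition explicit and, run on the asker's four jobs, prints 9 (jobs D, C and B):

import java.util.*;

public class JobSequencingDP {
    // jobs: profit[i], time[i], deadline[i] for i = 0..n-1
    static int maxProfit(int[] profit, int[] time, int[] deadline) {
        int n = profit.length;
        // sort job indices by deadline, ascending
        Integer[] idx = new Integer[n];
        for (int i = 0; i < n; i++) idx[i] = i;
        Arrays.sort(idx, (a, b) -> Integer.compare(deadline[a], deadline[b]));

        int maxDeadline = 0;
        for (int d : deadline) maxDeadline = Math.max(maxDeadline, d);

        int[][] dp = new int[n + 1][maxDeadline + 1];
        for (int[] row : dp) Arrays.fill(row, -1);   // -1 = unreachable state
        dp[0][0] = 0;

        for (int i = 0; i < n; i++) {
            int job = idx[i];
            for (int j = 0; j <= maxDeadline; j++) {
                if (dp[i][j] == -1) continue;
                // option 1: skip this job
                dp[i + 1][j] = Math.max(dp[i + 1][j], dp[i][j]);
                // option 2: do this job if it still meets its deadline
                if (j + time[job] <= deadline[job]) {
                    dp[i + 1][j + time[job]] =
                        Math.max(dp[i + 1][j + time[job]], dp[i][j] + profit[job]);
                }
            }
        }
        int best = 0;
        for (int j = 0; j <= maxDeadline; j++) best = Math.max(best, dp[n][j]);
        return best;
    }

    public static void main(String[] args) {
        // the asker's four jobs: profits, durations, deadlines (A, B, C, D)
        int[] profit = {1, 4, 3, 2}, time = {2, 1, 1, 1}, deadline = {3, 4, 3, 2};
        System.out.println(maxProfit(profit, time, deadline)); // prints 9 (jobs D, C, B)
    }
}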
The problem is called Job sequencing with deadlines, and can be solved by two algorithms based on a greedy strategy:
Sort the input jobs by decreasing profit. For every job, insert it into the solution list, which is kept sorted by increasing deadline. If, after including a job, some job in the solution ends up at a position greater than its deadline, do not include the new job.
Sort the input jobs by decreasing profit. For every job, put it into the solution list at the latest possible free slot. If there is no free slot at or before the job's deadline, do not include the job.
public class JOB {
    public static void main(String[] args) {
        // jobs are already sorted by profit in descending order
        char name[] = {'1', '2', '3', '4'};
        int dl[] = {1, 1, 4, 1};            // deadlines (slot numbers, 1-based)
        int profit[] = {40, 30, 20, 10};
        // cap[t] holds the job scheduled in slot t; here only 2 slots are available
        char cap[] = new char[2];
        for (int i = 0; i < 2; i++) {
            cap[i] = '\0';                  // '\0' marks a free slot
        }
        int j;
        int i = 0;
        j = dl[i] - 1;                      // latest slot index job i may occupy
        while (i < 4) {
            if (j < 0) {                    // no free slot before the deadline: skip job i
                i++;
                if (i < 4)
                    j = dl[i] - 1;
            } else if (j < 2 && cap[j] == '\0') {
                cap[j] = name[i];           // place job i in the latest free slot
                i++;
                if (i < 4)
                    j = dl[i] - 1;
            } else {
                j = j - 1;                  // slot taken (or beyond capacity): try an earlier one
            }
        }
        for (int i1 = 0; i1 < 2; i1++)
            System.out.println(cap[i1]);    // prints the scheduled jobs: 1 and 3
    }
}
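For reference, here is a more general sketch of the second strategy (place each job in the latest free slot at or before its deadline), not hard-coded to 4 jobs and 2 slots; on the same data it schedules job 1 in slot 1 and job 3 in slot 4:

import java.util.*;

public class JobSequencing {
    // Consider jobs by decreasing profit and put each one into the latest
    // still-free slot at or before its deadline, or skip it.
    static char[] schedule(char[] name, int[] deadline, int[] profit) {
        int n = name.length;
        Integer[] order = new Integer[n];
        for (int i = 0; i < n; i++) order[i] = i;
        Arrays.sort(order, (a, b) -> Integer.compare(profit[b], profit[a])); // profit descending

        int maxDeadline = 0;
        for (int d : deadline) maxDeadline = Math.max(maxDeadline, d);
        char[] slot = new char[maxDeadline];     // slot[t] = job run in time unit t+1, '\0' = free
        for (int job : order) {
            for (int t = deadline[job] - 1; t >= 0; t--) {
                if (slot[t] == '\0') { slot[t] = name[job]; break; }
            }
        }
        return slot;
    }

    public static void main(String[] args) {
        char[] name = {'1', '2', '3', '4'};
        int[] deadline = {1, 1, 4, 1};
        int[] profit = {40, 30, 20, 10};
        char[] slots = schedule(name, deadline, profit);
        for (int t = 0; t < slots.length; t++)
            if (slots[t] != '\0')
                System.out.println("slot " + (t + 1) + ": job " + slots[t]);
        // prints "slot 1: job 1" and "slot 4: job 3"
    }
}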
Related
I am looking for an algorithm that can help me find the best way to assign tasks to my team.
So here is the problem.
I have n team members (for example n=2) and I have to complete m tasks (for example m=4), and for every task each team member has their own time/effort needed to complete it. Let's say:
One condition: the tasks can only be assigned as continuous blocks, and the output should be the minimum effort.
In the above example the output would be 8: either assign task1 & task2 to member1 and task3 & task4 to member2,
OR task1 to member1 and the rest to member2, OR all the tasks to member2.
I know Stack Overflow helps developers resolve errors, but I don't understand how to build the logic for the above problem.
Thanks in advance for any suggestion of an algorithm to solve this problem.
output: 6
My understanding of this problem is that we have a list of members, a list of tasks, and each member has a cost for each task. Starting at the beginning of the tasks we assign some to member 1, then the next block to member 2, then the next block to member 3 and so on. The order is fixed.
We are trying to minimize the time taken by whichever member takes the longest.
Is that true?
If so, then I recommend doing a binary search on the length of time that the longest member takes. Any time you don't complete everything within that time, you have to increase it. Any time you do complete everything, you decrease it. The twist is that you need to keep track of the time budgets at which you would have made different choices.
The heart of the algorithm is a function like this pseudocode:
try_time_period(members, costs, max_cost_per_member):
    # min_same_result: smallest budget that still allows the assignment found here
    # min_change_result: smallest budget at which some member could take one more task
    min_same_result = 0
    min_change_result = infinity
    i = 0                                    # index of the next unassigned task
    for each member:
        cost = 0
        while cost < max_cost_per_member:
            this_cost = costs[member][i]
            if cost + this_cost <= max_cost_per_member:
                cost += this_cost            # member takes task i
                i++
                if len(costs[member]) <= i:  # every task has been assigned
                    return (True, max(min_same_result, cost), min_change_result)
            else:
                # task i does not fit; remember the two thresholds and move on
                min_same_result = max(min_same_result, cost)
                min_change_result = min(min_change_result, cost + this_cost)
                next member
    return (False, min_same_result, min_change_result)
And with that, we can build a binary search for the max_cost_per_member.
lower = 0
upper = time for the first member to do everything
while lower < upper:
    mid = (lower + upper) / 2
    (success, min_same_result, min_change_result) = try_time_period(members, costs, mid)
    if success:
        upper = min_same_result
    else:
        lower = min_change_result
And lower and upper will converge to the lowest time in which everything can be completed. Now you just reconstruct the assignment for that budget (the same greedy fill), and you're done.
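If you only need the final number, here is a simplified Java sketch: it drops the min_same/min_change bookkeeping and binary-searches directly over integer budgets, using the same greedy fill as the feasibility test. The effort table in main is hypothetical, since the original table isn't shown above:

public class ContinuousAssignment {
    // Can the tasks be split into consecutive blocks, one block per member in order,
    // so that no member's total effort exceeds 'budget'?
    static boolean feasible(int[][] costs, int budget) {
        int task = 0;
        for (int member = 0; member < costs.length; member++) {
            int load = 0;
            while (task < costs[member].length && load + costs[member][task] <= budget) {
                load += costs[member][task];   // this member takes the next task
                task++;
            }
        }
        return task == costs[0].length;        // true iff every task was assigned
    }

    static int minimumMakespan(int[][] costs) {
        int lower = 0, upper = 0;
        for (int c : costs[0]) upper += c;     // worst case: member 1 does everything
        while (lower < upper) {                // plain integer binary search
            int mid = (lower + upper) / 2;
            if (feasible(costs, mid)) upper = mid; else lower = mid + 1;
        }
        return lower;
    }

    public static void main(String[] args) {
        // hypothetical effort table: costs[member][task]
        int[][] costs = {
            {3, 4, 5, 2},   // member 1's effort for tasks 1..4
            {3, 4, 5, 2},   // member 2's effort for tasks 1..4
        };
        System.out.println(minimumMakespan(costs)); // prints 7: {3,4} for member 1, {5,2} for member 2
    }
}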
Given 2 lists that indicate the arriving time and leaving time of each guest at a party, how can I find the largest number of guests (and which guests they are) that hang out together for at least minTime seconds?
Example:
Input:
List<float> ArrivingList = {3,2,8,12,5};
List<float> LevingList = {17,7,19,15,11};
int minTime = 4;
Meaning that the first guest arrives at time 3 and leaves at time 17, the second guest arrives at time 2 and leaves at time 7, etc.
Output: {0,1}; // or {0,4}, both are correct for this example.
I know how to solve it without the minTime requirement, but I just couldn't figure out this version.
EDIT: Please note that my question is NOT a duplicate of this one.
I'm looking for the maximum number of guests that DO overlap, AND do so for at least a defined period of time.
Edit 2: My goal is to get the largest subset of the guests that spends at least minTime together.
Example 2:
Input:
List<float> ArrivingList = {1,2,3};
List<float> LevingList = {4,5,6};
int minTime = 3;
Consider the interval (2,5). Even though there is an overlap of 3 seconds in total, it is not continuous for the same guests; it switches between guest #0 and guest #2.
Output: {0}; // or {1} or {2}, because each guest spends at least minTime at the party, but never together with another guest
I guess you can use the following algorithm:
Init answer as empty array
For each pair of guests i,j:
    overlapTime = min(leaving(i), leaving(j)) - max(arriving(i), arriving(j))
    if overlapTime >= minTime:
        push (i,j) to the answer array
This will be O(n^2).
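A direct Java rendering of that pseudocode (it prints every qualifying pair rather than picking the single largest group):

import java.util.*;

public class GuestOverlap {
    public static void main(String[] args) {
        float[] arriving = {3, 2, 8, 12, 5};
        float[] leaving  = {17, 7, 19, 15, 11};
        float minTime = 4;

        List<int[]> answer = new ArrayList<>();
        for (int i = 0; i < arriving.length; i++) {
            for (int j = i + 1; j < arriving.length; j++) {
                // time guests i and j spend together
                float overlap = Math.min(leaving[i], leaving[j]) - Math.max(arriving[i], arriving[j]);
                if (overlap >= minTime) answer.add(new int[]{i, j});
            }
        }
        for (int[] pair : answer) System.out.println(Arrays.toString(pair));
        // prints [0, 1], [0, 2] and [0, 4]
    }
}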
We are given two arrays M (money) and E (experience) of integers, each of size 50 at most. After Bob does job i, two things happen:
(Let TE be Bob's total experience, initialized to 0.)
Bob's experience (i.e. TE) is incremented by E[i]
Then, he will receive money equal to TE*M[i]
What is the maximum profit Bob can make if he does the jobs in the best possible order?
For any i we know:
1 <= E[i] <= 10^5
1 <= M[i] <= 10
Example:
M[] = { 20, 30, 100 }
E[] = { 1, 1, 6 }
Answer: 980, doing the jobs in the order 3-1-2: 6*100 + 7*20 + 8*30 = 980
I think the problem can be solved by a greedy algorithm (which can be seen as a special case of DP), as described below:
Sort the jobs by the ratio Exp/Money in descending order.
If there is a tie, sort the tied jobs by Money in ascending order.
The resulting sequence is the order in which to do the jobs to get the optimal solution.
My reasoning is as follows: the ratio Exp/Money can be interpreted as how much Exp you can buy with 1 money, so it is always better to choose the job with the higher ratio first, as this increases the experience available for later jobs.
In the tie case, choose the job with the smaller money reward first, so that the job with the higher money reward can be multiplied by a larger experience factor later on.
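One way to sanity-check the ordering rule: take two adjacent jobs i and j with total experience P accumulated before them. Doing i first earns M[i]*(P+E[i]) + M[j]*(P+E[i]+E[j]); doing j first earns M[j]*(P+E[j]) + M[i]*(P+E[j]+E[i]). The difference (i first minus j first) is M[j]*E[i] - M[i]*E[j], so i should come before j exactly when E[i]/M[i] > E[j]/M[j], which is the ratio rule above. When the ratios are exactly equal the difference is 0, so either order of the tied jobs gives the same total.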
For example:
E = {2,1,6,1}
M = {40,20,100,10}
Sorted job = { job4, job3, job2, job1 }
= 1*10 + 7*100 + 8*20 + 10*40 = 1270
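Here is a runnable Java sketch of this greedy (class and method names are mine); it yields 980 for the first example above and 1270 for the second:

import java.util.*;

public class BobJobs {
    // Sort jobs by E/M descending (ties: smaller M first),
    // then do them in that order and accumulate TE * M[i] for each job.
    static long maxProfit(int[] m, int[] e) {
        int n = m.length;
        Integer[] order = new Integer[n];
        for (int i = 0; i < n; i++) order[i] = i;
        Arrays.sort(order, (a, b) -> {
            // compare e[a]/m[a] vs e[b]/m[b] without floating point: cross-multiply
            long diff = (long) e[b] * m[a] - (long) e[a] * m[b];
            if (diff != 0) return diff > 0 ? 1 : -1;      // higher ratio first
            return Integer.compare(m[a], m[b]);           // tie: smaller money first
        });
        long totalExperience = 0, totalMoney = 0;
        for (int job : order) {
            totalExperience += e[job];                    // experience is gained first
            totalMoney += totalExperience * m[job];       // then money = TE * M[i]
        }
        return totalMoney;
    }

    public static void main(String[] args) {
        System.out.println(maxProfit(new int[]{20, 30, 100}, new int[]{1, 1, 6}));      // 980
        System.out.println(maxProfit(new int[]{40, 20, 100, 10}, new int[]{2, 1, 6, 1})); // 1270
    }
}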
This was asked by a friend of mine. I had no previous context, so I want to know what type of algorithm this problem belongs to. Any hint or suggestion will do.
Suppose we have a group of N workers working on a car assembly line. Each worker can do 3 types of work, with their skill at each rated from 1 to 10. For example, Worker1's "paint surface" efficiency is rated 8, but their "assemble engine" efficiency is only rated 5.
The manager has a list of M jobs, each defined by start time, duration, job type, and importance (rated from 0 to 1). Each worker can only work on 1 job at a time, and each job can be worked on by only 1 worker. How can the manager assign the jobs to get the maximum output?
The maximum output for a job = worker skill rating * job importance * duration.
For example, we have workers {w1, w2}
w1: paint_skill = 9, engine_skill = 8
w2: paint_skill = 10,engine_skill = 5
We have jobs {j1, j2}
j1: paint job, start_time = 0, duration = 10, importance = 0.5
j2: engine job, start_time = 3, duration = 10, importance = 0.9
We should assign w1 to j2, and w2 to j1. output = 8 * 10 * 0.9 + 10 * 10 * 0.5 = 72 + 50 = 122
A greedy solution that matches the next available worker with the next job is clearly sub-optimal: in the example it would match w1 to j1, which is not optimal.
An exhaustive brute-force solution would guarantee the best output, but would take exponentially more time to compute with large job lists.
How can this problem be approached?
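To make the brute force mentioned above concrete, here is a minimal sketch (class name, data layout and helper names are mine): it tries, for each job, every free worker or skipping the job, backtracks, and keeps the best total output. On the example it returns 122:

import java.util.*;

public class AssemblyLineBruteForce {
    // skill[w][t] = worker w's rating for job type t
    // jobs[j] = {type, start, duration, importance}
    // busy.get(w) = time intervals already assigned to worker w
    static double best(double[][] skill, double[][] jobs, int j, List<List<double[]>> busy) {
        if (j == jobs.length) return 0;
        double[] job = jobs[j];
        double result = best(skill, jobs, j + 1, busy);          // option: leave this job undone
        for (int w = 0; w < skill.length; w++) {                 // option: give it to a free worker
            double start = job[1], end = job[1] + job[2];
            if (isFree(busy.get(w), start, end)) {
                busy.get(w).add(new double[]{start, end});
                double value = skill[w][(int) job[0]] * job[3] * job[2]; // skill * importance * duration
                result = Math.max(result, value + best(skill, jobs, j + 1, busy));
                busy.get(w).remove(busy.get(w).size() - 1);      // undo for backtracking
            }
        }
        return result;
    }

    static boolean isFree(List<double[]> intervals, double start, double end) {
        for (double[] iv : intervals)
            if (start < iv[1] && iv[0] < end) return false;      // overlaps an existing assignment
        return true;
    }

    public static void main(String[] args) {
        double[][] skill = {{9, 8}, {10, 5}};                    // w1 and w2: {paint, engine}
        double[][] jobs = {{0, 0, 10, 0.5}, {1, 3, 10, 0.9}};    // j1 (paint), j2 (engine)
        List<List<double[]>> busy = new ArrayList<>();
        for (int w = 0; w < skill.length; w++) busy.add(new ArrayList<>());
        System.out.println(best(skill, jobs, 0, busy));          // prints 122.0
    }
}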
This is an interview question:
4 men- each can cross a bridge in 1,3, 7, and 10 min. Only 2 people
can walk the bridge at a time. How many minutes would they take
to cross the bridge?
I can manually think of a solution: 10 and 7 go together; as soon as 7 reaches the destination, 3 hops on, and 10 and 3 finish together. Now 1 goes by itself, and the total time taken is 11. So, {10, 7} followed by {10, 3} followed by {1}.
I am unable to work out how to turn this idea into a general algorithm. Can someone help me identify how to go about converting this idea into real code?
The problem you describe is not subset sum.
Yet you can:
order the array a by descending time
int time1 = 0; // total time taken by the first lane
int time2 = 0; // total time taken by the second lane
for i : 0..n-1
    if (time1 < time2)   // add the time to the most "available" lane
        time1 += a[i];
    else
        time2 += a[i];
    endif
endfor
return max(time1, time2);
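The same idea in Java (sort descending, always feed whichever lane currently has the smaller total):

import java.util.*;

public class BridgeCrossing {
    public static void main(String[] args) {
        int[] times = {1, 3, 7, 10};
        Arrays.sort(times);                              // ascending
        int lane1 = 0, lane2 = 0;
        for (int i = times.length - 1; i >= 0; i--) {    // slowest person first
            if (lane1 < lane2) lane1 += times[i];        // feed the lane with the smaller total
            else lane2 += times[i];
        }
        System.out.println(Math.max(lane1, lane2));      // prints 11
    }
}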
This is not a subset sum problem but a job shop scheduling problem. See the Wikipedia entry on Job shop scheduling. You have four "jobs", taking 1, 3, 7 and 10 minutes respectively, and two "lanes" for running them, corresponding to the bridge's capacity of 2. Calculating an exact solution for job shop scheduling in general is hard.