Allocate Table seating to Guests - algorithm

These are the tables at a wedding reception. I want to reserve these tables for different groups, but the allocation should be optimal.
I want to calculate the optimal minimum seat allocation for guests.
NB: a group can take more than one table at a time, but sharing a table between groups is not allowed. For example, if a group of 6 persons wants to sit at the tables, how can we calculate the optimal seating?

Your question is probably NP-complete.
Consider 1 team with K persons and N tables: finding the optimal allocation is equivalent to the subset sum problem -> https://en.wikipedia.org/wiki/Subset_sum_problem
Because a solution of that subset sum instance is also an optimal solution in this case.
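To make the connection concrete, here is a minimal sketch (assuming a single group of k people and tables of known sizes): the standard subset-sum dynamic program decides whether some subset of tables seats the group exactly, with no wasted seats.

```python
def exact_fit(table_sizes, k):
    """Return True if some subset of the tables seats exactly k people.

    Classic subset-sum dynamic programming: reachable[s] is True when
    some subset of the tables seen so far sums to exactly s seats.
    Runs in O(len(table_sizes) * k) time.
    """
    reachable = [False] * (k + 1)
    reachable[0] = True
    for size in table_sizes:
        # Iterate downwards so each table is used at most once.
        for s in range(k, size - 1, -1):
            if reachable[s - size]:
                reachable[s] = True
    return reachable[k]
```

For example, exact_fit([4, 6, 8], 10) finds the subset {4, 6}, while no subset of those tables sums to 5.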

Related

Optimize event seat assignments with Corona restrictions

Problem:
Given a set of group registrations, each for a varying number of people (1-7),
and a set of seating groups (immutable, at least 2m apart) varying from 1-4 seats,
I'd like to find the optimal assignment of people groups to seating groups:
People groups may be split among several seating groups (though preferably not)
Seating groups may not be shared by different people groups
(optional) the assignment should minimize the number of 'wasted' seats, i.e. maximize the number of seats in empty seating groups
(ideally it should run from within a Google Apps script, so memory and computational complexity should be as small as possible)
First attempt:
I'm interested in the decision problem (is it feasible?) as well as the optimization problem (see optional optimization function). I've modeled it as a SAT problem, but this does not find an optimal solution.
For this reason, I've tried to model it as an optimization problem. I'm thinking along the lines of a (remote) variation of multiple-knapsack, but I haven't been able to name it yet:
items: seating groups (size -> weight)
knapsacks: people groups (size -> container size)
constraint: combined item weight >= container size
optimization: minimize the number of items
As you can see, the constraint and optimization are inverted compared to the standard problem. So my question is: Am I on the right track here or would you go about it another way? If it's correct, does this optimization problem have a name?
You could approach this as an Integer Linear Programming Problem, defined as follows:
let P = the set of people groups, people group i consists of p_i people;
let T = the set of tables, table j has t_j places;
let x_ij be 1 if people from people group i are placed at table j, 0 otherwise
let M be a large penalty factor for empty seats
let N be a large penalty factor for splitting groups
// # of free seats = # of seats at occupied tables - # of people
// every time a group uses more than one table,
// a penalty of N * (#tables - 1) is incurred
min M * [SUM_j(SUM_i[x_ij] * t_j) - SUM_i(p_i)] + N * SUM_i[(SUM_j(x_ij) - 1)]
// at most one group per table
s.t. SUM_i(x_ij) <= 1 for all j
// every group has enough seats
SUM_j(x_ij * t_j) >= p_i for all i
x_ij in {0, 1}
Although this minimises the number of empty seats, it does not minimise the number of tables used or maximise the number of groups admitted. If you'd like to do that, you could expand the objective function by adding a penalty for every group turned away.
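For intuition, the formulation can be brute-forced on a toy instance in pure Python (exponential in the number of tables; in practice you would hand the ILP to a solver such as CBC or SCIP). The function name and penalty defaults below are illustrative, not part of the original answer.

```python
from itertools import product

def solve_toy(p, t, M=1000, N=10):
    """Brute-force the ILP above on a toy instance.

    p: people-group sizes; t: table sizes.  An assignment maps each
    table to one group index or None, so 'at most one group per table'
    holds by construction.  Cost = M * empty seats + N * extra tables.
    """
    best, best_cost = None, None
    for assign in product([None] + list(range(len(p))), repeat=len(t)):
        seats = [0] * len(p)        # seats allocated to each group
        tables_used = [0] * len(p)  # tables allocated to each group
        for j, i in enumerate(assign):
            if i is not None:
                seats[i] += t[j]
                tables_used[i] += 1
        # feasibility: every group has enough seats
        if any(seats[i] < p[i] for i in range(len(p))):
            continue
        empty = sum(seats) - sum(p)
        splits = sum(tables_used[i] - 1 for i in range(len(p)))
        cost = M * empty + N * splits
        if best_cost is None or cost < best_cost:
            best, best_cost = assign, cost
    return best, best_cost
```

With groups of 3 and 2 people and tables of 2, 2 and 3 seats, the optimum assigns the 3-seat table to the first group and one 2-seat table to the second, for a cost of 0.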
ILPs are NP-hard, so without the right solvers it might not be possible to make this run within Google Apps Script. I have no experience with that, so I'm afraid I can't help you there. But there are some methods to reduce your search space.
One would be through something called column generation. Here, the problem is split into two parts. The complex master problem is your main research question, but instead of searching the entire solution space, it tries to find the optimum among different candidate assignments (or columns).
The goal is then to define a subproblem that recommends new potential solutions, which are then incorporated into the master problem. The power of a good subproblem is that it should be reducible to a simpler model, like knapsack or Dijkstra.

Algorithm to group people based on their preference

I need an algorithm to group people at tables based on their preferences. Each person votes by sorting the tables from favorite to worst.
For example, if there are 4 tables in total, one person's vote looks like:
Alice { table1 => 2, table2 => 4, table3 => 1, table4 => 3 }
which means she would like to be put at table3 and really dislikes table2.
The conditions are:
Everyone must be in a group
All groups must have the same number of people (tolerance of 1)
Maximize the global 'happiness'
Trying to sort this out, I defined happiness as points: each person has happiness 10 if they are put at their favorite table, 6 at their second choice, 4 at the third and 1 at the last.
happiness[10, 6, 4, 1]
The global happiness is the sum of each person's happiness.
One way to solve this is to use integer linear programming.
There are many solvers out there for ILP, for example SCIP (http://scip.zib.de/).
You would have a binary variable for each assignment, i.e.
x_ij = 1 if person i is assigned to table j (and 0 if it is not).
Your goal is to maximize total happiness, i.e. the sum of the happiness weights multiplied by x_ij.
Now you have to write some constraints to ensure that:
each person is assigned to exactly one table, i.e. the sum of x_ij over j is equal to one for each i;
all tables have a similar number of persons (you can determine possible ranges for the number of persons beforehand), i.e. the sum of x_ij over i is within the defined range for each j.
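For a small instance, this model can be brute-forced directly; for anything realistic you would hand the ILP to a solver like SCIP. The function name and the 'tolerance of 1' encoding below (table sizes between floor and ceiling of n people / n tables) are illustrative assumptions.

```python
from itertools import product

def best_seating(prefs, n_tables, happiness=(10, 6, 4, 1)):
    """Brute-force the assignment model above on a small instance.

    prefs[i][j] is person i's rank of table j (1 = favorite).
    Each person goes to exactly one table; table sizes may differ by
    at most one (the 'tolerance of 1' condition).
    """
    n = len(prefs)
    lo, hi = n // n_tables, -(-n // n_tables)  # floor and ceiling
    best, best_score = None, -1
    for assign in product(range(n_tables), repeat=n):
        counts = [assign.count(j) for j in range(n_tables)]
        if any(c < lo or c > hi for c in counts):
            continue
        score = sum(happiness[prefs[i][assign[i]] - 1] for i in range(n))
        if score > best_score:
            best, best_score = assign, score
    return best, best_score
```

With two people who each favor a different one of two tables, the optimum seats each at their favorite, for a global happiness of 20.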

Most efficient seating arrangement

There are n (n < 1000) groups of friends, with the size of each group characterized by an array A[] (2 <= A[i] < 1000). Tables are present such that they can accommodate r (r > 2) people at a time. What is the minimum number of tables needed to seat everyone, subject to the constraint that for every person there should be another person from his/her group sitting at his/her table?
The approach I was thinking was to break every group into sizes of twos and threes and try to solve this problem, but there are many ways of dividing a number n into groups of twos and threes and not all of them may be optimal.
Does a Mixed Integer Programming model count?
Some notes on this formulation:
I used random data to form the groups.
x(i,j) is the number of people of group i sitting at table j.
x(i,j) is a semi-integer variable, that is: an integer variable with value zero or between LO and UP. Not all MIP solvers offer semi-continuous and semi-integer variables, but they can come in handy. Here I use it to enforce that at least 2 persons from the same group sit at a table. If a solver does not offer this type of variable, we can formulate the construct using additional binary variables as well.
y(j) is a binary variable (0 or 1) indicating if a table is used.
the capacity equation is somewhat smart: if a table is not used (y(j)=0) its capacity is reduced to zero.
the option optcr=0 indicates we want to solve to optimality. For large, difficult problems we may want to stop say at 5%.
the order equation makes sure we start filling tables from table 1. This also reduces the symmetry of the problem and may speed up solution times.
the above model (with 200 groups and 200 potentially used tables) generates a MIP problem with 600 equations (rows) and 40k variables (columns). There are 37k integer variables. With a good MIP solver we find the proven optimal solution (with 150 tables used) in less than a minute.
Notice this is certainly not a knapsack problem (as suggested in another answer -- a knapsack problem has just one constraint) but it resembles a bin-packing problem.
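For intuition, a tiny pure-Python brute force over the x(i,j) values (each 0 or at least 2, mimicking the semi-integer condition) reproduces the model's behavior on toy data; a real MIP solver is required for the 200-group instance mentioned above. The function name is illustrative.

```python
from itertools import product

def min_tables(group_sizes, r, max_tables):
    """Tiny brute-force analogue of the MIP above: x[i][j] people of
    group i sit at table j, with x[i][j] = 0 or >= 2 (so everyone has
    a group-mate at their table).  Exponential -- toy instances only.
    """
    n = len(group_sizes)
    cells = [(i, j) for i in range(n) for j in range(max_tables)]
    # per-cell candidate values: 0, or between 2 and min(group size, r)
    choices = [[0] + list(range(2, min(group_sizes[i], r) + 1))
               for i, j in cells]
    best = None
    for vals in product(*choices):
        x = dict(zip(cells, vals))
        # every group is fully seated
        if any(sum(x[i, j] for j in range(max_tables)) != group_sizes[i]
               for i in range(n)):
            continue
        # no table exceeds its capacity r
        if any(sum(x[i, j] for i in range(n)) > r
               for j in range(max_tables)):
            continue
        used = sum(1 for j in range(max_tables)
                   if any(x[i, j] for i in range(n)))
        if best is None or used < best:
            best = used
    return best
```

For instance, a single group of 5 with tables of capacity 4 must split as 2 + 3 across two tables.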
It is in essence the bin packing problem, a relative of the knapsack problem, and is NP-hard (see https://en.wikipedia.org/wiki/Bin_packing_problem). So finding the optimal solution is pretty hard.
A heuristic that works most of the time:
Sort the groups in decreasing order of size.
Put each group at the table that has the least remaining space while still being able to accommodate the group.
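This is the classic best-fit-decreasing heuristic, which can be sketched as follows. It assumes every group fits whole at a single table; groups larger than r would first need splitting, which this sketch does not handle.

```python
def best_fit_decreasing(group_sizes, r):
    """Best-fit-decreasing heuristic: largest groups first, each into
    the tightest table that still fits it; open a new table otherwise.

    Returns the list of remaining free seats per table opened, so its
    length is the number of tables used by the heuristic.
    """
    tables = []  # remaining free seats per opened table
    for size in sorted(group_sizes, reverse=True):
        # tables that can still take this group
        candidates = [i for i, free in enumerate(tables) if free >= size]
        if candidates:
            # tightest fit: the candidate with the fewest free seats
            i = min(candidates, key=lambda i: tables[i])
            tables[i] -= size
        else:
            tables.append(r - size)
    return tables
```

For groups of 5, 4, 3 and 2 at tables of capacity 8, the heuristic packs them as {5, 3} and {4, 2}, using two tables.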
Your approach is workable. If a solution exists for a given number of tables, then a solution exists where every group has been split into some number of twos and threes. First, split a three off of every group of odd size; you're left with groups of even size. Next, split twos off of each group until its size is divisible by six, and split what remains into groups of six.
At this point, you have split all of your groups into some number of twos, some number of threes, and some number of sixes. Give each table of odd size one three, splitting sixes as necessary; now all tables have even size. All remaining sixes can now be split into twos and seated arbitrarily.
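The first step of that argument, splitting a single group into twos and threes (one three if the size is odd), can be sketched as follows; it assumes group sizes of at least 2, as the problem statement guarantees.

```python
def split_group(n):
    """Split a group of n >= 2 people into parts of size 2 and 3,
    as in the argument above: one 3 if n is odd, the rest 2s.
    """
    parts = []
    if n % 2 == 1:
        parts.append(3)
        n -= 3
    parts.extend([2] * (n // 2))
    return parts
```

For example, a group of 7 becomes [3, 2, 2], and a group of 6 becomes [2, 2, 2] (which the argument later regroups into flexible sixes).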

What Sort Of Algorithm Should I Use To Sort Students?

In a program that generates random groups of students, I give the user the option to force specific students to be grouped together and also block students from being paired. I have tried for two days to make my own algorithm for accomplishing this, but I get lost in all of the recursion. I'm creating the program in Lua, but I'll be able to comprehend any sort of pseudo code. Here's an example of how the students are sorted:
students = {
Student1 = {name="Student1", force={"Student2"}, deny={}};
Student2 = {name="Student2", force={"Student1","Student3"}, deny={}};
Student3 = {name="Student3", force={"Student2"}, deny={}};
} -- A second name property is given in case the student table needs to be accessed by students[num] to retrieve a name
I then create temporary tables:
forced = {}--Every student who has at least 1 entry in their force table is placed here, even if they have 1 or more in the deny table
denied = {}--Every student with 1 entry for the deny table and none in the force table is placed here
leftovers = {}--Every student that doesn't have any entries in the force nor deny tables is placed here
unsortable = {}--None are placed here yet -- this is to store students that cannot be placed according to the set rules (i.e. a student being forced to be paired with someone in a group that also contains a student they can't be paired with)
SortStudentsIntoGroups()--predefined; sorts students into above groups
After every student is placed in those groups(note that they also remain in the students table still), I begin by inserting the students who are forced to be paired together in groups(well, I have tried to), insert students who have one or more entries in the deny table into groups where they are able to be placed, and just fill the remaining groups with the leftovers.
There are a couple of things that will be of some use:
numGroups--predefined number of groups
maxGroupSize--automatically calculated; quick reference to largest amount of students allowed in a group
groups = {}--number of entries is equivalent to numGroups(i.e. groups = {{},{},{}} in the case of three groups). This is for storing students in so that the groups may be displayed to the end user after the algorithm is complete.
sortGroups()--predefined function that sorts the groups from smallest to largest (will sort largest to smallest if supplied a true boolean as a parameter)
As I stated before, I have no clue how to set up a recursive algorithm for this. Every time I try and insert the forced students together, I end up getting the same student in multiple groups, forced pairs not being paired together, etc. Also note the formats. In each student's force/deny table, the name of the target student is given -- not a direct reference to the student. What sort of algorithm should I use(if one exists for this case)? Any help is greatly appreciated.
Seems to me like you are facing an NP-hard problem here.
It is equivalent to the graph coloring problem with k colors, where the edges are given by the denial lists.
Graph Coloring:
Given a graph G=(V,E), and an integer `k`, create coloring function f:V->{1,2,..,k} such that:
f(v) = f(u) -> (v,u) is NOT in E
The reduction from graph coloring to your problem:
Given a graph coloring problem (G,k) where G=(V,E), create an instance of your problem with:
students = V
for each student: student.deny = { student' | (student, student') is an edge in E }
#groups = k
Intuitively, each vertex is represented by a student, and a student denies all students that there is an edge between the vertices representing them.
The number of groups is the given number of colors.
Now, given a solution to your problem - we get k groups that if student u denies student v - they are not in the same group, but this is the same as coloring u and v in different colors, so for each edge (u,v) in the original graph, u and v are in different colors.
The other way around is similar
So we have a polynomial reduction from the graph coloring problem, and thus finding an optimal solution to your problem is NP-hard: there is no known efficient solution, and most believe none exists.
Some alternatives are heuristics such as genetic algorithms, which do not guarantee an optimal solution, or time-consuming brute-force approaches (not feasible for a large number of students).
The brute force simply generates all possible splits into k groups, checks each for feasibility, and at the end the best solution found is chosen.
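That brute force can be sketched as follows, in Python rather than Lua for brevity. It is exponential in the number of students, so it is only viable for small classes; the function name and the max_size parameter are illustrative.

```python
from itertools import product

def brute_force_grouping(students, k, max_size):
    """Exhaustively search all splits of the students into k groups.

    students: dict name -> {"force": [names], "deny": [names]}.
    Returns one feasible assignment (name -> group index) respecting
    all force/deny rules and the group-size cap, or None if none exists.
    """
    names = list(students)
    for assign in product(range(k), repeat=len(names)):
        group_of = dict(zip(names, assign))
        # no group may exceed the size cap
        if max(assign.count(g) for g in range(k)) > max_size:
            continue
        ok = all(
            all(group_of[o] == group_of[s] for o in students[s]["force"])
            and all(group_of[o] != group_of[s] for o in students[s]["deny"])
            for s in names
        )
        if ok:
            return group_of
    return None
```

For instance, with A forced with B and B denied with C, any returned assignment puts A and B together and C elsewhere.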

Does a greedy approach work here?

Suppose there are N groups of people and M tables. We know the size of each group and the capacity of each table. How do we match the people to the tables such that no two persons of the same group sit at the same table?
Does a greedy approach work for this problem? (The greedy approach works as follows: for each table, try to "fill" it with people from different groups.)
Assuming the groups and tables can be of unequal size, I don't think the greedy approach as described works (at least not without additional specification). Suppose you have a table T1 with 2 seats and a table T2 with 3 seats, and 3 groups {A1}, {B1,B2} and {C1,C2}. Following your algorithm, T1 receives {A1,B1}, and you are left with T2 and {B2,C1,C2}, which doesn't work. Yet there is a solution: T1 {B1,C1}, T2 {A1,B2,C2}.
I suspect the following greedy approach works: starting with the largest group, take each group and allocate one person of that group per table, picking first tables with the most free seats.
Mathias:
I suspect the following greedy approach works: starting with the largest group, take each group and allocate one person of that group per table, picking first tables with the most free seats.
Indeed. And a small variation of tkleczek's argument proves it.
Suppose there is a solution. We have to prove that the algorithm finds a solution in this case.
This is vacuously true if the number of groups is 0.
For the induction step, we have to show that if there is any solution, there is one where one member of the largest group sits at each of the (size of largest group) largest tables.
Condition L: For all pairs (T1,T2) of tables, if T1 < T2 and a member of the largest group sits at T1, then another member of the largest group sits at T2.
Let S1 be a solution. If S1 fulfills L we're done. Otherwise there is a pair (T1,T2) of tables with T1 < T2 such that a member of the largest group sits at T1 but no member of the largest group sits at T2.
Since T2 > T1, there is a group which has a member sitting at T2, but none at T1 (or there is a free place at T2). So these two can swap seats (or the member of the largest group can move to the free place at T2) and we obtain a solution S2 with fewer pairs of tables violating L. Since there's only a finite number of tables, after finitely many steps we have found a solution Sk satisfying L.
Induction hypothesis: For all constellations of N groups and all numbers M of tables, if there is a solution, the algorithm will find a solution.
Now consider a constellation of (N+1) groups and M tables where a solution exists. By the above, there is also a solution where the members of the largest group are placed according to the algorithm. Place them so. This reduces the problem to a solvable constellation of N groups and M' tables, which is solved by the algorithm per the induction hypothesis.
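Mathias's greedy, which this induction justifies, can be sketched as follows (a pure-Python illustration; the function name is mine, and it returns None when the greedy cannot seat everyone).

```python
def seat_groups(group_sizes, table_sizes):
    """Greedy from above: take groups largest-first and give each
    group one seat per table, always at the tables with the most
    free seats, so no two members of a group share a table.

    Returns seating[j] = list of group indices at table j, or None
    when some group member cannot be seated.
    """
    free = list(table_sizes)
    seating = [[] for _ in table_sizes]
    for i in sorted(range(len(group_sizes)),
                    key=lambda i: group_sizes[i], reverse=True):
        # the group's members go to the group_sizes[i] emptiest tables
        order = sorted(range(len(free)), key=lambda j: free[j],
                       reverse=True)
        chosen = order[:group_sizes[i]]
        if len(chosen) < group_sizes[i] or any(free[j] == 0 for j in chosen):
            return None  # not enough distinct tables with a free seat
        for j in chosen:
            free[j] -= 1
            seating[j].append(i)
    return seating
```

On the earlier counterexample (tables of 2 and 3 seats, groups of sizes 1, 2 and 2) it finds a valid seating where no table holds two members of the same group.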
The following greedy approach works:
Repeat the following steps until there is no seat left:
Pick the largest group and the largest table
Match one person from the chosen group to the chosen table
Reduce group size and table size by 1.
Proof:
We just have to prove that after performing one step we still can reach optimal solution.
Let's call any member of the largest group a cool guy.
Suppose that there is a different optimal solution in which no cool guy sits at the largest table. Let's pick any person sitting at the largest table in this solution and call him the lame guy.
He must belong to a group no larger than the cool group. So there is another table at which a cool guy sits but no member of the lame guy's group. We can then safely swap the seats of the lame guy and that cool guy, which also results in an optimal solution.
