This is an extension of the maximum-cardinality bipartite matching problem we introduced earlier. As before, we are given a bipartite graph G = (V,E), in which the vertices can be separated into two disjoint sets such that no edges run between vertices of the same set. However, instead of the unweighted edges of the previous problem, we are now faced with weighted edges. The problem now is to match the vertices such that the total weight of the matched edges is as small (or as large) as possible whilst still retaining a maximum-cardinality matching. The figure below depicts this situation:
Applications:
Like the unweighted version, this lets us compute optimal pairings between two disjoint sets. For example, let X be a set of workers and Y a set of jobs. An edge (u,v) in E from worker u in X to job v in Y has weight W, the productivity of worker u on job v. If we were tasked to assign workers to jobs such that each worker is given exactly 1 job and we were also told to maximise the total productivity of the assignments, then the answer is simply the maximum weighted matching in the bipartite graph. We can also define the edge weights to mean the converse: for example, the weight of edge (u,v) could mean how difficult worker u would find job v, in which case our task could be to minimise the total difficulty whilst ensuring all workers are assigned exactly 1 job. This does not complicate things, as we simply reverse our operations to derive the opposite result.
Algorithm:
We need a new approach to tackle this problem: augmenting paths alone are not enough, as they do not take the edge weights into account. One idea is to arbitrarily assign a maximum matching without regard to the weights and then successively improve (i.e. reduce) its cost by finding a negative cycle and augmenting the matching along that cycle. To represent this we compute a modified graph G', with edge weights representing the cost difference we will gain by matching a vertex x with another vertex y. In other words, for any u and v, we put a weight W on the edge from u to v, where W is defined as the cost change if we gave the current partner of v to u, i.e. W = cost(u, matching[v]) – cost(u, matching[u]). Now we simply look for a negative cycle in G' and swap the partners along the cycle to produce a better matching. Why does this work? If we can find a negative cycle from a vertex u back to itself, it means we can rotate partners along the cycle to reduce the total cost whilst keeping the same number of matched pairs.
Consider the following bipartite graph:
And the matchings below:
The matching on the left has a weight of 15; the optimal minimum matching has a weight of 8 (shown on the right). Algorithmically, label the top-left and bottom-left vertices A and B respectively, and the top-right and bottom-right vertices C and D respectively. We compute the modified G' edge weights of (A,B) and (B,A) – the cost of switching the partners around (note that their matched counterparts on the right side are accounted for implicitly). Proceeding to do this we yield:
W(A,B) = cost(A, matching[B]) – cost(A, matching[A]) = 6 – 5 = 1
W(B,A) = cost(B, matching[A]) – cost(B, matching[B]) = 2 – 10 = -8
Therefore a possible cycle is A -> matching[B] -> B -> matching[A] -> A. This yields a cost of W(A,B) + W(B,A) = 1 + (-8) = -7. Since this is a negative cycle, we can save a total cost of 7 (you can verify this by hand) if we augment along this cycle. From this, A becomes matched to matching[B] (i.e. D) and B becomes matched to matching[A] (i.e. C). The matching on the right-hand side diagram highlights this minimal matching. To actually find a negative cycle, an easy way is to use the Floyd-Warshall algorithm, as illustrated in our implementation below. Additional book-keeping is required to keep track of the parents so we can backtrack and augment along the negative cycle path.
To summarise, our algorithm is as follows:
- Start with an initial perfect matching M
- While there is a negative cycle C in G', augment along C to produce a better matching M'
- Return the matching M
Other Notes:
This problem can also be solved using the Hungarian algorithm (a famous combinatorial optimisation algorithm) or using minimum-cost flows. In fact, our cycle-cancelling implementation is a special case of the minimum-cost flow algorithm. The difference between our cycle-cancelling algorithm and the one used for the general minimum-cost flow problem is that the general version must first run a maximum-flow algorithm to derive an initial flow network; here we simply assign the vertices to each other, as a perfect matching is guaranteed because the graph is a complete bipartite graph.
There are many variations of maximum (or minimum) weighted bipartite matching. The first variation is called the assignment problem, where both sides have an equal number of vertices and the graph itself is complete (i.e. every vertex on one side links to every vertex on the other). In the second variation the sides have unequal numbers of vertices but the graph is still complete; here we reduce the problem to the first variation by adding dummy vertices to even out the sides. Any edges incident to these dummy vertices have a weight of 0 and hence do not affect the final answer. Another variation has an unequal number of vertices and an incomplete graph, so a perfect matching may not be possible; this is where the general minimum-cost flow algorithm comes in, which we will go over later.
Applying the Algorithm:
We briefly demonstrate how the algorithm is used to solve two algorithmic problems.
ACM ICPC Pacific Northwest Regionals 2004 (GoingHome)
URL: http://acm.tju.edu.cn/toj/showp1636.html
The problem can be easily reduced to the assignment problem by constructing a bipartite graph of houses and people. The edge weights are simply the Manhattan distance between house i and person j. We then run our algorithm over the graph to obtain our answer!
#include <iostream>
#include <cstring>
#include <cstdlib>
#include <string>
#include <vector>
#include <algorithm>
using namespace std;

int cost[111][111];        // cost of assigning house i to person j
int curMatchings[111];     // house i matched to person curMatchings[i]
int delta[111][111];       // the delta/cost graph
int parent[111][111];      // used for backtracking the cycle
int cycle[111];            // path of the cycle
int totalLen;              // length of the cycle

bool augmentingCycle(int N) {
    memset(delta, 0, sizeof(delta));
    memset(parent, 0, sizeof(parent));
    memset(cycle, -1, sizeof(cycle));
    // derive the delta graph
    for (int i = 0; i < N; i++) {
        for (int j = 0; j < N; j++) {
            delta[i][j] = cost[i][curMatchings[j]] - cost[i][curMatchings[i]];
            parent[i][j] = j;
        }
    }
    for (int k = 0; k < N; k++) {
        for (int i = 0; i < N; i++) {
            for (int j = 0; j < N; j++) {
                if (delta[i][k] + delta[k][j] < delta[i][j]) {
                    // minimum cost matching
                    delta[i][j] = delta[i][k] + delta[k][j];
                    // book-keep the optimal path so far for i->j
                    parent[i][j] = parent[i][k];
                    // detect a cycle (vertex i and j are the same vertex)
                    if (i == j) {
                        totalLen = 0;
                        // backtrack and construct the cycle path
                        do {
                            cycle[totalLen++] = i;
                            i = parent[i][j];
                        } while (i != j);
                        return true;
                    }
                }
            }
        }
    }
    // did not find a negative cycle
    return false;
}

int cycleCancel(int N) {
    int res = 0;
    // pseudo max-flow for the assignment problem
    for (int i = 0; i < N; i++) curMatchings[i] = i;
    while (augmentingCycle(N)) {
        // augment along the negative cycle
        int r = curMatchings[cycle[0]];
        for (int i = 0; i < totalLen - 1; i++) {
            curMatchings[cycle[i]] = curMatchings[cycle[i+1]];
        }
        curMatchings[cycle[totalLen-1]] = r;
    }
    // compute the final cost
    for (int i = 0; i < N; i++) {
        res += cost[i][curMatchings[i]];
    }
    return res;
}

int main() {
    int N, M;
    while (cin >> N >> M) {
        if (N == 0 && M == 0) break;
        char ch;
        vector<pair<int,int> > houses;
        vector<pair<int,int> > people;
        for (int i = 0; i < N; i++) {
            for (int j = 0; j < M; j++) {
                cin >> ch;
                if (ch == 'H') houses.push_back(make_pair(i,j));
                else if (ch == 'm') people.push_back(make_pair(i,j));
            }
        }
        // compute the weights associated with matching house i to person j
        for (int i = 0; i < (int)houses.size(); i++) {
            for (int j = 0; j < (int)people.size(); j++) {
                cost[i][j] = abs(houses[i].first - people[j].first)
                           + abs(houses[i].second - people[j].second);
            }
        }
        cout << cycleCancel(houses.size()) << "\n";
    }
    return 0;
}
TC SRM387 D2 1000 (MarblesRegroupingHard)
URL: http://www.topcoder.com/stat?c=problem_statement&pm=8538
This is a revisit of a previous problem we solved using DP. The reduction is slightly less obvious than in the previous problem: we first compute the total number of colours present in the whole dataset. Then we draw an edge from box i to colour j with weight equal to the total number of j-coloured marbles in the other boxes, i.e. the overall count minus the ones already in box i (as they don't need to be moved). Note that the number of colours may differ from the number of boxes, but we are always guaranteed that the number of colours is less than or equal to the number of boxes, hence a matching covering every colour is always possible. We can add dummy colours to the graph to reduce it to the assignment problem – which is what is done below:
#include <cstring>
#include <sstream>
#include <string>
#include <vector>
using namespace std;

class MarblesRegroupingHard {
public:
    int minMoves(vector<string>);
};

int totColours[15];
int B[51][15];
int cost[55][55];          // cost of assigning box i to colour j
int curMatchings[55];      // box i matched to colour curMatchings[i]
int delta[55][55];         // the delta/cost graph
int parent[55][55];        // used for backtracking the cycle
int cycle[55];             // path of the cycle
int totalLen;              // length of the cycle

bool augmentingCycle(int N) {
    memset(delta, 0, sizeof(delta));
    memset(parent, 0, sizeof(parent));
    memset(cycle, -1, sizeof(cycle));
    // derive the delta graph
    for (int i = 0; i < N; i++) {
        for (int j = 0; j < N; j++) {
            delta[i][j] = cost[i][curMatchings[j]] - cost[i][curMatchings[i]];
            parent[i][j] = j;
        }
    }
    for (int k = 0; k < N; k++) {
        for (int i = 0; i < N; i++) {
            for (int j = 0; j < N; j++) {
                if (delta[i][k] + delta[k][j] < delta[i][j]) {
                    // minimum cost matching
                    delta[i][j] = delta[i][k] + delta[k][j];
                    // book-keep the optimal path so far for i->j
                    parent[i][j] = parent[i][k];
                    // detect a cycle (vertex i and j are the same vertex)
                    if (i == j) {
                        totalLen = 0;
                        // backtrack and construct the cycle path
                        do {
                            cycle[totalLen++] = i;
                            i = parent[i][j];
                        } while (i != j);
                        return true;
                    }
                }
            }
        }
    }
    // did not find a negative cycle
    return false;
}

int cycleCancel(int N) {
    int res = 0;
    // pseudo max-flow for the assignment problem
    for (int i = 0; i < N; i++) curMatchings[i] = i;
    while (augmentingCycle(N)) {
        // augment along the negative cycle
        int r = curMatchings[cycle[0]];
        for (int i = 0; i < totalLen - 1; i++) {
            curMatchings[cycle[i]] = curMatchings[cycle[i+1]];
        }
        curMatchings[cycle[totalLen-1]] = r;
    }
    // compute the final cost
    for (int i = 0; i < N; i++) {
        res += cost[i][curMatchings[i]];
    }
    return res;
}

int MarblesRegroupingHard::minMoves(vector<string> boxes) {
    memset(totColours, 0, sizeof(totColours));
    memset(B, 0, sizeof(B));
    int numColours = 0;
    for (int i = 0; i < (int)boxes.size(); i++) {
        istringstream iss(boxes[i]);
        int colours;
        int k = 0;
        while (iss >> colours) {
            totColours[k] += colours;
            B[i][k++] = colours;
        }
        numColours = k;
    }
    // build the weighted bipartite graph via the cost matrix;
    // colours beyond numColours are zero-weight dummies
    for (int i = 0; i < (int)boxes.size(); i++) {
        for (int j = 0; j < (int)boxes.size(); j++) {
            if (j >= numColours) cost[i][j] = 0;
            else cost[i][j] = totColours[j] - B[i][j];
        }
    }
    return cycleCancel(boxes.size());
}