Greedy Algorithms

(Algorithms chapter 5)

Greedy strategies take a very natural approach to solving a problem: given the choices we have made so far, make the choice that provides the most benefit going forward. We can think of a greedy algorithm as taking the choice that looks locally optimal. However, our goal is generally to find a global optimum. When do these locally optimal moves produce a global optimum? We start by looking at some examples.

Def 1. (Matching) Given a graph G=(V, E), a subset of edges M \subseteq E is a matching if every two edges in M have disjoint endpoints.
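As a quick illustration of the definition, here is a small Python check (an illustrative helper, not part of the original notes): a set of edges is a matching exactly when no vertex appears as an endpoint of more than one edge.

```python
def is_matching(edges):
    """Return True if `edges` forms a matching, i.e. no vertex is an
    endpoint of more than one edge. (Illustrative helper; the edge-list
    format is an assumption.)"""
    seen = set()
    for u, v in edges:
        if u in seen or v in seen:
            return False
        seen.update((u, v))
    return True

print(is_matching([("a", "b"), ("c", "d")]))  # True
print(is_matching([("a", "b"), ("b", "c")]))  # False: b is an endpoint twice
```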

Maximum Matching Problem

Given a graph G=(V, E), find a matching of maximum cardinality.

A greedy approach:  Pick any available edge that extends the current matching.

Does the above algorithm always output a maximum matching of the graph? No. Consider the example below, which is a path of length three. If the green (middle) edge is chosen as the first edge by the algorithm, the algorithm stops there. The optimal solution, though, is a matching of size 2 (the red edges).

[Figure: a path of length three; the single green (middle) edge versus the two red (outer) edges.]
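A small Python sketch of the greedy rule on this example (illustrative code; the vertex names a, b, c, d are my own labelling of the path): if the middle edge is considered first, the greedy matching has size 1, while the optimal matching has size 2.

```python
def greedy_matching(edges):
    """Greedily add any edge whose endpoints are both still unmatched."""
    matched = set()
    matching = []
    for u, v in edges:
        if u not in matched and v not in matched:
            matching.append((u, v))
            matched.update((u, v))
    return matching

# Path of length three: a - b - c - d.
# If the middle edge (b, c) is considered first, the greedy matching has size 1 ...
print(greedy_matching([("b", "c"), ("a", "b"), ("c", "d")]))  # [('b', 'c')]
# ... while the optimal matching {(a, b), (c, d)} has size 2.
print(greedy_matching([("a", "b"), ("c", "d"), ("b", "c")]))  # [('a', 'b'), ('c', 'd')]
```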

Shortest Path Problem

Given a graph G=(V, E) with weights on the edges, and two vertices s and t, find a path between s and t with minimum total weight.

A greedy approach: Start at s and, at each step, move to the neighbour whose distance to t is shortest.

Does the above algorithm always find a shortest path? No. Consider the example below. At the first step, the algorithm chooses the neighbour c, because the distance from c to t is 6, versus the distance from a to t, which is 8. Therefore the greedy algorithm outputs the path (s, c, d, t). However, the shortest path is (s, a, b, t).

[Figure: a weighted graph on vertices s, a, b, c, d, t; the greedy path (s, c, d, t) versus the shortest path (s, a, b, t).]
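The figure is not reproduced here, so the Python snippet below uses a hypothetical weight assignment that is consistent with the description (distance 6 from c to t, distance 8 from a to t); the exact weights are my own choice, not taken from the notes.

```python
# Hypothetical edge weights consistent with the text above:
# dist(c, t) = 3 + 3 = 6 and dist(a, t) = 4 + 4 = 8, yet the path through a wins.
weights = {
    ("s", "a"): 1, ("a", "b"): 4, ("b", "t"): 4,  # total weight 9  (shortest path)
    ("s", "c"): 6, ("c", "d"): 3, ("d", "t"): 3,  # total weight 12 (greedy path)
}

def path_weight(path):
    return sum(weights[(u, v)] for u, v in zip(path, path[1:]))

greedy_path = ["s", "c", "d", "t"]     # chosen because dist(c, t) = 6 < dist(a, t) = 8
shortest_path = ["s", "a", "b", "t"]
print(path_weight(greedy_path), path_weight(shortest_path))  # 12 9
```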

Minimum Spanning Tree (MST) Problem

The next example, computing the minimum spanning tree of a graph, is a problem where a greedy algorithm does find the optimal solution. Before getting into the problem, we need to introduce some definitions for a graph G=(V,E).

Def 2. (Connected Component) A connected component is a maximal subset of vertices M \subseteq V such that there is a path between any two vertices in M.

Def 3. (Spanning Subgraph) A spanning subgraph is a subgraph that includes every vertex in V.

Def 4. (Tree) A tree is a connected subgraph of G with no cycles.

Claim 5: A minimum-weight spanning connected subgraph F \subseteq E (assuming positive edge weights) is a spanning tree. That is,

  1. F has no cycles
  2. F has |V|-1 = n-1 edges, where n = |V|; i.e. |F| = n-1.

Proof: (1) Suppose F had a cycle. Then we could remove any edge on the cycle while maintaining connectivity between every pair of vertices, and doing so would decrease the total weight. Therefore F can have no cycles if it has minimum weight.

(2) We prove by induction on n that |F| = n-1. It is clearly true for the base cases n=1,2. Now, for n>2, remove any edge e of F. Then F \backslash \{e\} has exactly two connected components (if F \backslash \{e\} were still connected, removing e would give a lighter spanning connected subgraph, contradicting minimality). Say the components have n_1 and n_2 vertices, with n_1, n_2 < n and n_1 + n_2 = n. Each component is itself a minimal spanning connected subgraph on its vertex set, so by induction,

|F| = (n_1-1) + (n_2-1) + 1 = n-1.

Minimum Spanning Tree (MST) Problem. Given a graph G=(V, E) and weights on the edges, find a spanning tree of G with minimum total weight.

A greedy approach: One possible approach is to repeatedly select the lightest remaining edge. Locally, this seems like the best edge to select. However, there is a potential problem: what if the lightest remaining edge creates a cycle? Well, then we simply skip that edge and proceed to the next lightest edge. We have essentially described Kruskal’s Algorithm:

Kruskal’s Algorithm for solving MST:

KRUSKAL(V,E,w):

  • Sort edges in increasing order of weight; so have an ordering e_1, \ldots, e_m such that w(e_i) \le w(e_{i+1}).
  • Set T = \{\}
  • for i=1 to m
    • Add e_i to T if it doesn’t create a cycle.
  • Return T.

[Figure: an example run of Kruskal’s algorithm on a weighted graph.]

Theorem 6 (Kruskal): Kruskal’s algorithm produces a minimum weight spanning tree of G=(V,E).

Before proving this theorem, we need to first introduce some definitions and lemmas.

Def 7. (Cut) A cut (S,\bar S) is the set of edges between a subset of vertices S \subseteq V and its complement \bar S = V \backslash S. That is,

(S,\bar S) = \{(u, v) \in E | u \in S, v \in \bar S\}.

Lemma 8. Fix any edge e whose weight is the minimum in some cut (S,\bar S). Then, there exists an MST T containing e.

Proof: Consider an MST T. If e \in T we are done, so suppose e \notin T and add e to T. Then e induces a cycle C (since T \cup \{e\} has two paths between the endpoints of e).

Consider the cut (S,\bar S) for which e is a minimum-weight edge. Then C must cross (S,\bar S) at some other edge f, with weight(e) \le weight(f).

Set T' = T \cup \{e\} \backslash \{f\}. Note that T' is still a spanning tree and weight(T') \le weight(T), which means T' is an MST containing e.

Lemma 9 (cut property). Let X be a subset of the edges of an MST of a graph G. For any edge e such that e is a minimum-weight edge of a cut (S, \bar{S}) with X \cap (S, \bar{S}) = \emptyset, there is an MST T s.t. X \cup \{e\} \subseteq T.

Proof. Take some MST T that contains X. Let (S, \bar{S}) be a cut in which e is a minimum-weight edge and assume that e \notin T (otherwise we are done). Adding e to T creates a cycle, and at least one other edge e' of this cycle must lie in (S, \bar S). Since e is a minimum-weight edge of the cut, weight(e) \le weight(e'), so T' = T \cup \{e\} \setminus \{e'\} is a spanning tree with weight(T') \le weight(T). Hence T' is an MST, and since e' \notin X (no edge of X crosses the cut), T' contains X \cup \{e\}.

Proof (of Kruskal Theorem): Using Lemma 9, we show that the set X of edges added so far is always contained in some MST; this clearly holds when X is empty.

When the next edge e = (u, v) is added, let S be the vertex set of the component of X containing u. No edge of X crosses the cut (S, \bar S), and e is a minimum-weight edge of this cut: every edge considered earlier was either added to X or skipped because its endpoints already lay in the same component, so in either case it does not cross (S, \bar S); every edge crossing the cut is therefore considered no earlier than e and weighs at least as much. By Lemma 9, X \cup \{e\} is contained in some MST, and the invariant is maintained. When the algorithm terminates, T is a spanning tree contained in an MST, and hence is itself an MST.

Running time of Kruskal’s algorithm: O(m \log n).

Sorting the edges once up front takes O(m \log n).

However, we also have to keep track of whether an added edge creates a cycle, i.e. when an edge connects two vertices in the same connected component. We can do this via a simple data structure.

For each vertex, introduce a component number, which is initially just the index of that vertex. When Kruskal’s algorithm considers an edge connecting two vertices with the same component number, we skip that edge because it would create a cycle. If the component numbers of the two vertices are different, we add the edge and relabel the vertices in the smaller component with the label of the larger component. Since we relabel the smaller component, the size of a vertex’s component at least doubles each time it is relabeled. Therefore a vertex’s label changes at most O(\log n) times, and the total cost of this data structure is O(n \log n).
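To make this concrete, here is a short Python sketch of Kruskal’s algorithm using exactly the component-labelling scheme just described (the function name and graph representation are my own choices, not from the notes).

```python
def kruskal(n, edges):
    """Kruskal's algorithm for an n-vertex graph.

    `edges` is a list of (weight, u, v) triples with vertices 0..n-1.
    Cycle detection uses the simple scheme described above: each vertex
    carries a component label, and on a merge the smaller component is
    relabelled, so each vertex is relabelled O(log n) times.
    """
    label = list(range(n))                 # component label of each vertex
    members = {i: [i] for i in range(n)}   # vertices carrying each label
    tree = []

    for w, u, v in sorted(edges):          # consider edges in increasing weight order
        if label[u] == label[v]:
            continue                       # same component: this edge would create a cycle
        # Relabel the smaller component with the label of the larger one.
        small, big = sorted((label[u], label[v]), key=lambda c: len(members[c]))
        for x in members[small]:
            label[x] = big
        members[big].extend(members.pop(small))
        tree.append((u, v, w))
    return tree

# Example: a square with one diagonal; the heavier edges are skipped.
print(kruskal(4, [(1, 0, 1), (2, 1, 2), (3, 2, 3), (4, 0, 3), (5, 0, 2)]))
# [(0, 1, 1), (1, 2, 2), (2, 3, 3)]
```

Sorting dominates, so the overall running time matches the O(m \log n) bound above.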

Prim’s Algorithm:

Another algorithm to find the MST of a graph is Prim’s algorithm.

  • Start with any vertex, and mark it as visited.
  • Add the lightest weight edge from some visited vertex to an unvisited vertex; repeat until all vertices are visited.

As with Kruskal’s algorithm, its correctness follows from the cut property given above.

Running time of Prim’s algorithm: O(m \log n) (using a binary heap as the priority queue).
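A heap-based sketch of Prim’s algorithm (again illustrative code; the adjacency-list format is an assumption): each edge enters the heap at most once, so there are O(m) heap operations of O(\log n) each.

```python
import heapq

def prim(adj, start=0):
    """Prim's algorithm on an adjacency list: adj[u] = [(weight, v), ...].

    Returns a list of MST edges (u, v, weight). Assumes the graph is connected.
    """
    visited = {start}
    heap = [(w, start, v) for w, v in adj[start]]
    heapq.heapify(heap)
    tree = []
    while heap and len(visited) < len(adj):
        w, u, v = heapq.heappop(heap)      # lightest edge leaving the visited set
        if v in visited:
            continue
        visited.add(v)
        tree.append((u, v, w))
        for w2, x in adj[v]:
            if x not in visited:
                heapq.heappush(heap, (w2, v, x))
    return tree

# The same example graph as in the Kruskal sketch above, as an adjacency list.
adj = {
    0: [(1, 1), (4, 3), (5, 2)],
    1: [(1, 0), (2, 2)],
    2: [(2, 1), (3, 3), (5, 0)],
    3: [(3, 2), (4, 0)],
}
print(prim(adj))  # [(0, 1, 1), (1, 2, 2), (2, 3, 3)]
```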

Matroids

Can we define a more general class of problems that could be solved by the above greedy algorithm?

Def 10. (Matroid) A matroid consists of a universe of elements U and a collection I of independent subsets of U with the following properties

  • A \subseteq B, B\in I \Rightarrow A \in I
  • |A|<|B|, A,B \in I \Rightarrow \exists e \in B s.t. A\cup \{e\} \in I

A greedy approach. Sort the elements of U in decreasing order of weight. Add the next element in the list if doing so preserves independence.

Theorem 11. The above greedy algorithm finds the maximum weight independent set.

Proof. We prove it by contradiction. Let x_1 \geq x_2 \geq x_3 \geq \ldots be the elements output by the greedy algorithm, in descending weight order, and let y_1 \geq y_2 \geq \ldots be the elements of a maximum-weight independent set, also in descending weight order. Suppose the total weight of the optimal solution is higher than that of the greedy solution, and let k be the first index such that w(x_k) < w(y_k) (such a k must exist). Then |\{x_1, \ldots, x_{k-1}\}| < |\{y_1, \ldots, y_k\}| and both sets are independent. By the matroid exchange property, \exists y_j \in \{ y_1, \ldots , y_k \} such that \{ x_1, \ldots, x_{k-1} \} \cup \{y_j\} is an independent set. But w(y_j) \geq w(y_k) > w(x_k), so y_j appears before x_k in the sorted order, and this contradicts the fact that the greedy algorithm chose x_k and not y_j.

Observation 12. Forests form a matroid (the graphic matroid), whose bases are the spanning trees. Let’s define U = E and I as the collection of edge subsets with no cycles (forests). It is simple to see that this satisfies the two matroid properties. Given this, Theorem 11 states that the greedy algorithm stated above finds the maximum-weight spanning tree. This can be used to solve the MST problem by simply defining new weights M_{ij} = W - w_{ij} for a sufficiently large constant W (say, W \ge \max_{ij} w_{ij}). Then finding an MST on a graph with weights w is equivalent to finding the maximum spanning tree on the same graph with weights M.
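Here is a sketch of the generic matroid greedy algorithm, instantiated with the graphic matroid via a cycle-free independence oracle; the code and the small example graph are illustrative, not taken from the notes.

```python
def matroid_greedy(elements, weight, independent):
    """Generic matroid greedy: scan elements in decreasing weight order and
    keep each one that preserves independence. `independent` is an oracle."""
    chosen = []
    for e in sorted(elements, key=weight, reverse=True):
        if independent(chosen + [e]):
            chosen.append(e)
    return chosen

def is_forest(edges):
    """Independence oracle for the graphic matroid: True iff `edges` is acyclic."""
    parent = {}
    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            x = parent[x]
        return x
    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru == rv:
            return False               # u and v already connected: adding (u, v) closes a cycle
        parent[ru] = rv
    return True

# Maximum spanning tree of a square-with-diagonal graph, via Theorem 11.
w = {("a", "b"): 1, ("b", "c"): 2, ("c", "d"): 3, ("a", "d"): 4, ("a", "c"): 5}
print(matroid_greedy(list(w), w.get, is_forest))
# [('a', 'c'), ('a', 'd'), ('b', 'c')]  -- total weight 5 + 4 + 2 = 11
# To get a *minimum* spanning tree instead, run the same greedy on W - w_{ij}
# for a constant W >= max weight, as in Observation 12.
```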
