Notes for CSC 162, 1 Apr. 2010 ff

Chapter 6

Fifth project due Tues. Apr. 13, noon.
Final project will do interactive graphics and event handling.

========================================
Graphs

terminology:
    node / vertex
    edge
    path
    directed v. undirected
    complete
    (strongly) connected

Nodes may represent anything that has some notion of "connection".
    Can be physical things
        cities on a map, with roads (or airline flights) between them
    Can be virtual things
        web pages, with links (directed)
    Can be abstract things
        "is a", "has a" relationships
    Can be _states_ of a system
        Cf. jugs of water problem, or graph 4-coloring project

Can have O(n^2) edges.  Often have significantly fewer -- O(n log n)
or even O(n).  Planar graphs, e.g., have O(n) edges.

----------------------------------------
representations

    adjacency matrix
        good for dense graphs (only)
    adjacency lists
        more common

    << see graph.py >>

Usually have labels on nodes; usually want to be able to find the node
with a given label.  Book therefore suggests using a Python dictionary
for the root index of the adjacency list structure.

    +-----+---+
    | key | --+--> (ref) ---> (ref) ... (ref)-\
    +-----+---+
    | key | --+--> (ref) ---> (ref) ... (ref)-\
    +-----+---+
    | key | --+--> (ref) ---> (ref) ... (ref)-\
    +-----+---+
      ...
    +-----+---+
    | key | --+--> (ref) ---> (ref) ... (ref)-\
    +-----+---+

========================================
Searching graphs

Cf. backtracking search -- e.g., the water jugs problem or the graph
4-coloring project.

----------------------------------------
Breadth-first search

    requires an explicit queue

    << see bfs.py >>

    Word ladder problem
        build the graph (see the sketch below)
            put each word of length k in k bins, one for every choice
                of which letter to make a wild card
            create a node for every word; connect it to every word
                with which it shares a bin
        use BFS to find shortest path between A and B
    Similar to jugs of water problem where we want to find the minimum
    number of moves.

    << see jugs.py >>
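    A minimal sketch of the wild-card bin construction, assuming a
    plain dictionary-of-sets adjacency structure rather than the Graph
    class in graph.py (buildGraph and wordList are illustrative names):

    from collections import defaultdict

    def buildGraph(wordList):
        buckets = defaultdict(list)     # wild-card pattern -> words in that bin
        graph = defaultdict(set)        # word -> set of neighboring words
        for word in wordList:
            for i in range(len(word)):
                buckets[word[:i] + '_' + word[i+1:]].append(word)
        for words in buckets.values():  # words sharing a bin differ in one letter
            for w1 in words:
                for w2 in words:
                    if w1 != w2:
                        graph[w1].add(w2)
        return graph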
----------------------------------------
Depth-first search

    Easiest way to find reachable nodes -- to explore a graph via the
    edges.  Can be written as a simple recursive routine.

    << see dfs.py >>

    (Book does it with an explicit stack.)

    Connected components of undirected graph (workshop).

    Knight's tour (book).
        Strong similarity to the 4-coloring problem:
            modest number of chess squares / map regions
            exponential number of system states
                (visited squares / colored regions)
            solved by depth-first search with heuristic:
                explore the square with the minimal number of outgoing
                moves / the region with the maximal number of neighbors
        generalization of Knight's Tour is Hamiltonian cycle: NP-hard
            graph 4-coloring is also NP-hard

        << see knights_tour.py >>

----------------------------------------
Topological sort

    Assign post-order numbers during DFS; these constitute a topo. sort.

    << see topo_sort.py >>

----------------------------------------
Transitive closure (workshop)

========================================
Strongly-connected components

    example: WWW meta structure

    DFS G.
    DFS G-transpose, but pick, whenever we have a choice, the node
        with the highest finish time in the original DFS.
    The trees of this second search are the strongly connected
        components.

    Why?  If C1 and C2 are distinct strongly connected components, and
    there is an edge from C1 to C2, then the largest finish time in C1
    has to be bigger than the largest finish time in C2.  (Otherwise we
    would have gotten back into C1 without finishing C2, which would
    mean they're not distinct components.)  So in the 2nd DFS, by
    always choosing nodes with the highest finish times, we visit the
    strongly connected components in a topological sort; but since
    we've reversed the edges, we're actually visiting in a _reverse_
    topo sort of the (reversed) components, so we can't get from one
    component to another, and they end up as separate trees.

    << see sc_components.py >>

========================================
Shortest paths (in directed graph)

Dijkstra's algorithm: single-source shortest path

    << see dijkstra_prim.py >>

    def dijkstra(G, start):
        PQ = minHeap()
        for v in G:                             # O(n)
            v.setDistance(sys.maxint)
            v.setPred(None)
        start.setDistance(0)
        for v in G:
            v.pqn = pqNode(v)
            PQ.insert(v.pqn)                    # O(n log n)
        while not PQ.isEmpty():
            w = PQ.deleteMin().vertex
            for v in w.getAdj():
                newDist = w.getDistance() + w.getCost(v)
                if newDist < v.getDistance():
                    v.setDistance(newDist)
                    v.setPred(w)
                    PQ.percolateUp(v.pqn.index) # O(m log n)

All-pairs shortest path is trivially O(mn log n).  If the graph is
sparse, that's optimal.  If the graph is dense, that's O(n**3 log n);
Floyd's alg. is O(n**3), which is better -- but only if the graph is
dense.

========================================
Minimum spanning tree (for undirected graph)

Prim's alg. (covered in the book): O(m log n)

    Similar to Dijkstra's alg. but NOT THE SAME:
    at each step, instead of finding the node with the next-shortest
    path back to the starting node, we find the node with the shortest
    connection to the growing tree.

    While Dijkstra's alg. does produce a spanning tree, it isn't
    necessarily minimal, and an MST does not necessarily give you the
    cheapest path from A to B for any given pair of nodes.

    << see dijkstra_prim.py >>

    def prim(G, start):
        PQ = minHeap()
        for v in G:                             # O(n)
            v.setDistance(sys.maxint)
            v.setPred(None)
        start.setDistance(0)
        for v in G:
            v.pqn = pqNode(v)
            PQ.insert(v.pqn)                    # O(n log n)
        while not PQ.isEmpty():
            w = PQ.deleteMin().vertex
            for v in w.getAdj():
                if v.pqn in PQ.contents and w.cost[v] < v.getDistance():
                    v.setDistance(w.cost[v])
                    v.setPred(w)
                    PQ.percolateUp(v.pqn.index) # O(m log n)

Kruskal's alg.: also O(m log n), but arguably simpler (workshop)
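    A rough sketch of Kruskal's algorithm -- not the workshop solution.
    Assumes the graph's edges are available as (cost, u, v) tuples; the
    inline union-find helpers are illustrative, not taken from course code.

    def kruskal(vertices, edges):
        parent = {}                             # union-find forest: vertex -> parent
        for v in vertices:
            parent[v] = v

        def find(v):                            # representative of v's component
            while parent[v] != v:
                parent[v] = parent[parent[v]]   # path compression
                v = parent[v]
            return v

        tree = []
        for (cost, u, v) in sorted(edges, key=lambda e: e[0]):  # cheapest edges first: O(m log n)
            ru, rv = find(u), find(v)
            if ru != rv:                        # edge joins two different components
                parent[ru] = rv                 # union the components
                tree.append((u, v, cost))
        return tree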