
## Basic implementation of Dijkstra’s algorithm

Dijkstra’s algorithm as presented in Algorithm 2.5, page 75 of the book Algorithmic Graph Theory is meant to be a general template. Many details have been left out; one in particular is how to implement line 6 of the algorithm. This one line of Dijkstra’s algorithm has been the subject of numerous research papers on how to efficiently search for the unvisited vertex of minimum distance. A simple search technique is linear search, where you scan some array or list from start to finish. A more efficient search technique for line 6 is a binary heap. To implement infinity as stated on line 1 of Algorithm 2.5, you can simply let a very large number represent infinity. This number should ideally be larger than the weight of any path in the graph you are searching.
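To make the binary heap idea concrete, here is a sketch in plain Python (not from the book; the function name dijkstra_heap and the adjacency-dictionary input format are my own), where the linear search of line 6 is replaced by popping from a heapq-based priority queue:

```python
import heapq

def dijkstra_heap(adj, s):
    """Sketch of Dijkstra's algorithm with a binary heap.

    adj -- adjacency dictionary {u: {v: weight, ...}, ...}
    s   -- the source vertex

    Returns a dictionary of shortest distances from s.
    """
    INFINITY = 10**9          # larger than any path weight we expect
    dist = {v: INFINITY for v in adj}
    dist[s] = 0
    heap = [(0, s)]           # (distance, vertex) pairs
    while heap:
        d, v = heapq.heappop(heap)
        if d > dist[v]:
            continue          # stale heap entry; v was already settled
        for u, w in adj[v].items():
            if dist[v] + w < dist[u]:
                dist[u] = dist[v] + w
                heapq.heappush(heap, (dist[u], u))
    return dist
```

Instead of deleting outdated heap entries, this sketch simply skips them when popped, which keeps the code short at the cost of a slightly larger heap.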

Below is a basic implementation of Dijkstra’s algorithm following the general template of Algorithm 2.5. This implementation is intended for searching in simple, unweighted, undirected, connected graphs. Because the graph to search is assumed to be unweighted, I simply let each edge have unit weight and represent infinity as the integer $10^9$. The implementation should provide a basis on which to implement Dijkstra’s algorithm for, say, weighted graphs and other types of graphs. To use it, save it to a Python file, load the file into a Sage session using load(), and call the function dijkstra().

def dijkstra(G, s):
    """
    Shortest paths in a graph using Dijkstra's algorithm.

    INPUT:

    - G -- a simple, unweighted, undirected, connected graph.  Thus each
      edge has unit weight.
    - s -- a vertex in G from which to start the search.

    OUTPUT:

    A list D of distances such that D[v] is the distance of a shortest path
    from s to v.  A dictionary P of vertex parents such that P[v] is the
    parent of v.

    EXAMPLES:

    sage: G = graphs.PetersenGraph()
    sage: dijkstra(G, 0)
    ([0, 1, 2, 2, 1, 1, 2, 2, 2, 2], {1: 0, 2: 1, 3: 4, 4: 0, 5: 0, 6: 1, 7: 5, 8: 5, 9: 4})

    sage: G = Graph({0:{1:1, 3:1}, 1:{2:1, 3:1, 4:1}, 2:{4:1}})
    sage: dijkstra(G, 0)
    ([0, 1, 2, 1, 2], {1: 0, 2: 1, 3: 0, 4: 1})
    """
    n = G.order()  # how many vertices
    m = G.size()   # how many edges
    D = [1000000000 for _ in range(n)]  # largest weights; represent +infinity
    D[s] = 0       # distance from vertex to itself is zero
    P = {}         # a dictionary for fast look-up
    Q = set(G.vertices())
    while len(Q) > 0:
        v = mindist(D, Q)
        Q.remove(v)
        for u in G.neighbors(v):
            if D[u] > D[v] + 1:  # each edge has unit weight, so add 1
                D[u] = D[v] + 1
                P[u] = v         # the parent of u is v
    return D, P

def mindist(D, Q):
    """
    Choose a vertex in Q such that it has minimal distance.

    INPUT:

    - D -- a list of vertices with corresponding distances.  Each distance
      D[v] corresponding to a vertex v means that v is that much further
      away from a source vertex.
    - Q -- all vertices to consider.

    OUTPUT:

    A vertex with minimum distance.
    """
    v = None          # start the search here
    low = 1000000000  # the running minimum distance; represent +infinity
    for u in Q:
        if D[u] < low:
            v = u
            low = D[u]
    return v
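As a sketch of how the template above might be adapted to weighted graphs (a plain Python adjacency dictionary stands in for a Sage graph here, and the function name is my own), the only substantial change is to add the edge weight rather than 1 when relaxing an edge:

```python
def dijkstra_weighted(adj, s):
    """Sketch: Dijkstra's algorithm on a weighted graph given as an
    adjacency dictionary adj = {u: {v: weight, ...}, ...}.

    Returns (D, P): a dictionary D of shortest distances from s, and
    a dictionary P of vertex parents.
    """
    INFINITY = 10**9           # larger than any path weight we expect
    D = {v: INFINITY for v in adj}
    D[s] = 0                   # distance from the source to itself is zero
    P = {}
    Q = set(adj)
    while Q:
        # linear search for the closest vertex still in Q
        v = min(Q, key=lambda u: D[u])
        Q.remove(v)
        for u, w in adj[v].items():
            if D[u] > D[v] + w:  # relax using the actual edge weight
                D[u] = D[v] + w
                P[u] = v         # the parent of u is v
    return D, P
```

Here the helper mindist is replaced by Python's built-in min() with a key function, which performs the same linear search in one line.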

Categories: graph theory, mathematics, Sage

## DaMN book now on Softpedia

My book-in-progress Algorithmic Graph Theory, co-authored with David Joyner and Nathann Cohen, is now listed on Softpedia. These days I prefer to refer to the book as the DaMN book, a name taken from the first letter of the first name of each author.

## Typeset trees using TikZ/PGF

Some combinatorial trees typeset using TikZ/PGF. The Linux filesystem hierarchy:

\begin{tikzpicture}
[-,thick]
\footnotesize
\node {\texttt{/}} [edge from parent fork down]
child {node {\texttt{bin}}}
child {node {\texttt{etc}}}
child {node {\texttt{home}}
child {node {\texttt{anne}}}
child {node {\texttt{sam}}}
child {node {$\dots$}}
}
child {node {\texttt{lib}}}
child {node {\texttt{opt}}}
child {node {\texttt{proc}}}
child {node {\texttt{tmp}}}
child {node {\texttt{usr}}
child {node {\texttt{bin}}
child {node {\texttt{acyclic}}}
child {node {\texttt{diff}}}
child {node {\texttt{dot}}}
child {node {\texttt{gc}}}
child {node {\texttt{neato}}}
child {node {$\dots$}}
}
child {node {\texttt{include}}}
child {node {\texttt{local}}}
child {node {\texttt{share}}}
child {node {\texttt{src}}}
child {node {$\dots$}}
}
child {node {$\dots$}};
\end{tikzpicture}


Classification tree of organisms:

\begin{tikzpicture}
[sibling distance=6cm,-,thick]
\footnotesize
\node {organism}
child {node {plant}
[sibling distance=2cm]
child {node {tree}
child {node {deciduous}}
child {node {evergreen}}
}
child {node {flower}}
}
child {node {animal}
[sibling distance=2.5cm]
child {node {invertebrate}}
child {node {vertebrate}
[sibling distance=4.7cm]
child {node {bird}
[sibling distance=1.5cm]
child {node {finch}}
child {node {rosella}}
child {node {sparrow}}
}
child {node {mammal}
[sibling distance=1.5cm]
child {node {dolphin}}
child {node {human}}
child {node {whale}}
}
}
};
\end{tikzpicture}


The Bernoulli family tree of mathematicians:

\begin{tikzpicture}
[-,thick,%
every node/.style={shape=rectangle,inner sep=3pt,draw,thick}]
\footnotesize
\node {Nikolaus senior} [edge from parent fork down]
[sibling distance=4cm]
child {node {Jacob}}
child {node {Nicolaus}
child {node {Nicolaus I}}
}
child {node {Johann}
[sibling distance=2cm]
child {node {Nicolaus II}}
child {node {Daniel}}
child {node {Johann II}
child {node {Johann III}}
child {node {Daniel II}}
child {node {Jakob II}}
}
};
\end{tikzpicture}


An expression tree:

\begin{tikzpicture}
[-,thick]
\node {$+$}
[sibling distance=2.5cm]
child {node {$\times$}
[sibling distance=1cm]
child {node {$a$}}
child {node {$a$}}
}
child {node {$\times$}
[sibling distance=1cm]
child {node {$2$}}
child {node {$a$}}
child {node {$b$}}
}
child {node {$\times$}
[sibling distance=1cm]
child {node {$b$}}
child {node {$b$}}
};
\end{tikzpicture}