Archive for the ‘mathematics’ Category

Primality test in Haskell

14 March 2012

Problem: Implement a primality test in Haskell. Below is the code. The first implementation of the function divides uses a homemade function, remainder, that computes the remainder when one integer is divided by another via repeated subtraction.

-- The remainder when a is divided by b.                                        
remainder :: Integer -> Integer -> Integer
remainder a b | a < b = a
              | a == b = 0
              | otherwise = remainder (a - b) b

-- Whether d divides n.                                                         
divides :: Integer -> Integer -> Bool                                        
divides d n = remainder n d == 0
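
A few example evaluations in GHCi, assuming the two definitions above have been loaded:

ghci> remainder 17 5
2
ghci> divides 3 12
True
ghci> divides 5 12
False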

The second implementation of divides uses the built-in function rem to find the remainder upon division by an integer.

-- Whether d divides n.  A more efficient version that uses the built-in        
-- function rem.                                                                
divides :: Integer -> Integer -> Bool
divides d n = rem n d == 0

The full primality test follows:

-- Whether d divides n.  A more efficient version that uses the built-in        
-- function rem.                                                                
divides :: Integer -> Integer -> Bool
divides d n = rem n d == 0

-- The least divisor of n that is at least k.                                   
ldf :: Integer -> Integer -> Integer
ldf k n | divides k n = k
        | k^2 > n = n
        | otherwise = ldf (k + 1) n

-- The least divisor of n.                                                      
ld :: Integer -> Integer
ld n = ldf 2 n

-- Primality test.                                                              
prime :: Integer -> Bool
prime n | n < 1 = error "must be a positive integer"
        | n == 1 = False
        | otherwise = ld n == n
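
As a quick sanity check, the full listing can be loaded into GHCi and exercised as follows:

ghci> prime 1
False
ghci> prime 97
True
ghci> prime 91
False
ghci> filter prime [1..30]
[2,3,5,7,11,13,17,19,23,29]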

Basic implementation of Dijkstra’s algorithm

25 February 2012

Dijkstra’s algorithm, as presented in Algorithm 2.5 on page 75 of the book Algorithmic Graph Theory, is meant to be a general template. Many details have been left out; one in particular is how to implement line 6 of the algorithm. That one line has been the subject of numerous research papers on efficient search techniques for Dijkstra’s algorithm. A simple technique is linear search, where you scan an array or list from start to finish. A more efficient technique for line 6 is a binary heap. To implement infinity as stated on line 1 of Algorithm 2.5, you simply let a very large number represent infinity. Ideally this number should be larger than any path weight that can occur in the graph you are searching.
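
For line 6, the implementation further below sticks to a plain linear search. For contrast, here is a minimal sketch of how the extract-min of line 6 could instead be driven by Python’s heapq module; the function name dijkstra_heap, the adjacency-dictionary input, and the lazy skipping of stale heap entries are illustrative assumptions on my part, not something prescribed by Algorithm 2.5.

import heapq

def dijkstra_heap(adj, s):
    """
    Sketch of Dijkstra's algorithm with line 6 served by a binary heap.

    Here adj is a dictionary mapping each vertex to a dictionary of its
    neighbours and the corresponding edge weights, and s is the source
    vertex.  Vertices are assumed to be integers.
    """
    INF = 10**9                  # a large value stands in for +infinity
    D = {v: INF for v in adj}    # tentative distances from s
    D[s] = 0
    H = [(0, s)]                 # binary heap of (distance, vertex) pairs
    done = set()                 # vertices whose distance is settled
    while H:
        d, v = heapq.heappop(H)  # line 6: extract a vertex of minimum distance
        if v in done:
            continue             # stale heap entry; v was settled earlier
        done.add(v)
        for u, w in adj[v].items():
            if D[u] > d + w:     # relax the edge (v, u)
                D[u] = d + w
                heapq.heappush(H, (D[u], u))
    return D

With unit edge weights this computes the same distances as the linear-search implementation below, but each extraction costs time logarithmic in the heap size instead of a scan of the whole vertex set.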

Below is a basic implementation of Dijkstra’s algorithm following the general template of Algorithm 2.5. It is meant for searching simple, unweighted, undirected, connected graphs. Because the graph to search is assumed to be unweighted, I simply let each edge have unit weight and represent infinity as the integer 10^9. The code should provide a basis on which to implement Dijkstra’s algorithm for, say, weighted graphs and other types of graphs. To use it, save it to a Python file, load the file into a Sage session using load(), and call the function dijkstra().

def dijkstra(G, s):
    """
    Shortest paths in a graph using Dijkstra's algorithm.

    INPUT:

    - G -- a simple, unweighted, undirected, connected graph. Thus each edge
      has unit weight.
    - s -- a vertex in G from which to start the search.

    OUTPUT:

    A list D of distances such that D[v] is the distance of a shortest path
    from s to v.  A dictionary P of vertex parents such that P[v] is the
    parent of v.

    EXAMPLES:

    sage: G = graphs.PetersenGraph()
    sage: dijkstra(G, 0)
    ([0, 1, 2, 2, 1, 1, 2, 2, 2, 2], {1: 0, 2: 1, 3: 4, 4: 0, 5: 0, 6: 1, 7: 5, 8: 5, 9: 4})

    sage: G = Graph({0:{1:1, 3:1}, 1:{2:1, 3:1, 4:1}, 2:{4:1}})
    sage: dijkstra(G, 0)
    ([0, 1, 2, 1, 2], {1: 0, 2: 1, 3: 0, 4: 1})
    """
    n = G.order()  # how many vertices
    m = G.size()   # how many edges
    D = [1000000000 for _ in range(n)]  # a large value stands in for +infinity; vertices assumed to be 0, ..., n - 1
    D[s] = 0       # the distance from a vertex to itself is zero
    P = {}         # parent of each vertex on a shortest path from s
    Q = set(G.vertices())
    while len(Q) > 0:
        v = mindist(D, Q)
        Q.remove(v)
        Adj = set(G.neighbors(v))
        for u in Adj.intersection(Q):
            if D[u] > D[v] + 1:     # each edge has unit weight, so add 1
                D[u] = D[v] + 1
                P[u] = v            # record v as the parent of u
    return D, P

def mindist(D, Q):
    """
    Choose a vertex in Q such that it has minimal distance.

    INPUT:

    - D -- a list of distances indexed by vertex, so that D[v] is the current
      distance of vertex v from the source vertex.
    - Q -- the set of vertices still to be considered.

    OUTPUT:

    A vertex with minimum distance.
    """
    v = None          # no vertex chosen yet
    low = 1000000000  # the running minimum distance; a large value stands in for +infinity
    for u in Q:
        if D[u] < low:
            v = u
            low = D[v]
    return v

Documenta Mathematica now mirrored at sagemath.org

26 June 2011

I’m delighted to announce that the journal Documenta Mathematica is now mirrored on the Sage website at http://www.sagemath.org/documenta/. Documenta Mathematica is an open access journal covering all fields of mathematics. Articles are refereed under the traditional anonymous peer-review model, as any respectable journal does.