In a previous post I have explained how to extend arbitrary instances of `MonadPlus` with constraint solving capabilities. Now, I’ll focus on how to use this functionality to implement lazy non-deterministic computations on top of a `MonadPlus`.

You can fetch the latest developments based on the idea presented in this post from my git repositories constraint-monad.git and lazynd.git.

## Eager Non-Determinism

Haskell programmers usually use the `MonadPlus` type class to model non-deterministic computations. Let’s define an operation that inserts an element at an arbitrary position in a list using this style.

``````
import Control.Monad

insert :: MonadPlus m => a -> [a] -> m [a]
insert x xs = return (x:xs)
      `mplus` case xs of
                []     -> mzero
                (y:ys) -> liftM (y:) (insert x ys)
``````
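Instantiating `m` with the list monad makes the non-determinism concrete: each alternative becomes one list element. Here is a small, self-contained sketch of that instantiation (the `main` wrapper is only there to make the snippet a complete program):

``````
import Control.Monad

insert :: MonadPlus m => a -> [a] -> m [a]
insert x xs = return (x:xs)
      `mplus` case xs of
                []     -> mzero
                (y:ys) -> liftM (y:) (insert x ys)

main :: IO ()
main = print (insert 0 [1,2] :: [[Int]])
-- prints [[0,1,2],[1,0,2],[1,2,0]]
``````

Each result corresponds to one position at which `0` can be inserted into `[1,2]`.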

If we compute the head of inserting zero into the list of natural numbers, we could expect to get two results: zero and one. With this implementation, however, the second result `1` is duplicated infinitely often.

``````
> take 10 (insert 0 [1..] >>= return . head)
[0,1,1,1,1,1,1,1,1,1]
``````

The reason is that, in the last line of `insert`, the result of the recursive call is evaluated before returning the list extended by `y` — even if it is not demanded.

In a lazy monad — i.e., in a monad where `undefined >> x` equals `x` — the result `1` would not be duplicated. However, I haven’t seen a lazy monad for non-determinism yet and couldn’t implement one either.
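One can check directly that the list monad is not lazy in this sense: in the list monad, `undefined >> x` equals `concatMap (const x) undefined`, which diverges. The sketch below demonstrates this; the `diverges` helper is my own scaffolding for the demonstration, not part of the original post.

``````
import Control.Exception (SomeException, evaluate, try)

-- Returns True if forcing the given Int raises an exception (e.g. from undefined).
diverges :: Int -> IO Bool
diverges n = do
  r <- try (evaluate n)
  return (case r of
            Left e  -> const True (e :: SomeException)
            Right _ -> False)

main :: IO ()
main = do
  -- In the list monad, undefined >> [1] is undefined, so forcing its length diverges.
  lazyLost <- diverges (length (undefined >> [1 :: Int]))
  print lazyLost
-- prints True
``````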

## Call-Time Choice

The problem with expressing non-determinism as shown above is that arguments of functions are evaluated before the function is applied. In a demand-driven language we would expect arguments to be evaluated only if they are demanded by the computation. To avoid evaluating arguments prematurely, we can make the arguments of functions monadic too. But then argument variables are placeholders for possibly non-deterministic computations rather than for values, which changes the semantics of the program. For example, the call `double coin :: [Int]` yields `[0,1,1,2]` with the following program.

``````
import Control.Monad

coin :: MonadPlus m => m Int
coin = return 0 `mplus` return 1

double :: MonadPlus m => m Int -> m Int
double n = liftM2 (+) n n
``````

Intuitively, a shared variable denotes the same value at every occurrence, but in the example above it denotes a computation that yields `0` or `1` non-deterministically. The results of the two arguments of `(+)` may differ although they are shared by the variable `n`.
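In the list monad this run-time choice is directly observable; the following self-contained program reproduces the `[0,1,1,2]` result claimed above:

``````
import Control.Monad

coin :: MonadPlus m => m Int
coin = return 0 `mplus` return 1

double :: MonadPlus m => m Int -> m Int
double n = liftM2 (+) n n

main :: IO ()
main = print (double coin :: [Int])
-- prints [0,1,1,2]
``````

Both occurrences of `n` run `coin` independently, which is why the odd sums `0+1` and `1+0` appear.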

In purely functional programs, demand-driven evaluation produces the same results as eager evaluation whenever the latter terminates. This property is desirable for non-deterministic programs as well, and the functional-logic programming community has coined a name for it: call-time choice. Variables are placeholders for values, and lazy non-deterministic computations should have the same results as corresponding eager ones.

Without extensions, `MonadPlus` allows us to model either call-time choice (by resorting to eager evaluation) or laziness (sacrificing call-time choice). We cannot obtain both.
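To illustrate the eager half of this trade-off: binding the argument with `>>=` before duplicating it restores call-time choice, because the shared variable then stands for a value rather than a computation, but the argument is now evaluated whether or not its result is demanded. The name `doubleShared` below is mine, not one from the original post.

``````
import Control.Monad

coin :: MonadPlus m => m Int
coin = return 0 `mplus` return 1

-- Eager variant: the argument computation runs once, and its value is shared.
doubleShared :: MonadPlus m => m Int -> m Int
doubleShared n = n >>= \v -> return (v + v)

main :: IO ()
main = print (doubleShared coin :: [Int])
-- prints [0,2]
``````

Only even sums remain, matching the intuition that a shared variable denotes the same value at every occurrence.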

## Constraining Alternatives of Choices

In a previous post, I showed how to add constraints to any instance of `MonadPlus`. A preliminary implementation of lazy functional-logic programming that uses choice constraints to ensure call-time choice is given in these files:

The file permsort.hs shows how to express lazy FLP using monadic computations with choice constraints. For example, you can evaluate the call

``````
evalPrint id (permute (cons true (cons false nil)))
``````

to compute the solutions of permuting the list `[True,False]`.