A Markov model is an internal representation of transition probabilities. It can be used for many different applications; the easiest to explain are random word generators (see the Markov.String module) and random sentence generators. These are created by training the model on a corpus of material from a particular genre. You can then generate new words or sentences in the style of that corpus.
This is the main data structure for the Markov models. It internally uses a sparse representation of transition probabilities, which is performant when you have many different possible states that don't all connect to each other. If each state can transition to almost every other state, this implementation will not be as performant as a dense one could be.
Elements are needed to define the beginning and end of a chain. It is not enough to know the transition probabilities from one element to another; we also need to know the probability that a chain starts with a given element, and the probability that it ends with one.
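The constructors matched in the charComparable example elsewhere in these docs suggest the type is roughly the following sketch (the exact definition may differ):

```elm
type Element a
    = Start -- virtual element marking the beginning of a chain
    | Element a -- an actual element of the sequence
    | End -- virtual element marking the end of a chain
```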
empty : (Element a -> comparable) -> Markov comparable a
Create an empty Markov chain with no elements in it. Elements can then be added to it to train the model. In order to store the transition probabilities, we need a way of serializing your data into a comparable value.
charComparable : Element Char -> Int
charComparable element =
    case element of
        Start ->
            -2

        Element c ->
            Char.toCode c

        End ->
            -1
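With a serialization function like charComparable defined, an empty model over Char elements can be created; a minimal sketch, assuming the module is imported as Markov:

```elm
import Markov exposing (Element(..), Markov)

emptyCharModel : Markov Int Char
emptyCharModel =
    Markov.empty charComparable
```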
alphabet : Markov comparable a -> List a
Get the alphabet of the current Markov model.
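A minimal sketch of retrieving the alphabet, assuming a trained Markov Int Char value named model (hypothetical):

```elm
-- All distinct Char states the model has seen during training
letters : List Char
letters =
    Markov.alphabet model
```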
probability : Element a -> Element a -> Markov comparable a -> Basics.Float
Get the probability of a particular transition from one element to another.
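For example, the chance that 'a' is followed by 'b' could be queried like this; model is an assumed, already-trained Markov Int Char:

```elm
aToB : Float
aToB =
    Markov.probability (Element 'a') (Element 'b') model
```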
transitionProbabilities : Element a -> Markov comparable a -> List ( Element a, Basics.Float )
For a particular element, get the probabilities of transitioning from the input element to every other element reachable from it. Only elements with non-zero probabilities are returned, and all probabilities are normalized to the (0, 1] range.
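A sketch of querying every outgoing transition from 'a', again assuming a trained model value:

```elm
-- Each pair is a reachable element and its normalized probability,
-- e.g. something like [ ( Element 'n', 0.5 ), ( End, 0.5 ) ]
fromA : List ( Element Char, Float )
fromA =
    Markov.transitionProbabilities (Element 'a') model
```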
add : Element a -> Element a -> Markov comparable a -> Markov comparable a
Add a single transition to the Markov graph. If the character is not an uppercase or lowercase letter or a digit, the transition is not added.
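A sketch of recording one transition by hand; model is an assumed existing Markov Int Char:

```elm
withExtraTransition : Markov Int Char
withExtraTransition =
    Markov.add (Element 'a') (Element 'b') model
```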
train : List a -> Markov comparable a -> Markov comparable a
Add a sequence of transitions. This function adds Start and End to the list so that the model is trained with starting and ending probabilities as well.
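Training on a single word might look like the following sketch; Start and End are added by train itself:

```elm
-- Records Start -> 'a', 'a' -> 'n', 'n' -> 't', and 't' -> End
trainedOnAnt : Markov Int Char
trainedOnAnt =
    Markov.train [ 'a', 'n', 't' ] (Markov.empty charComparable)
```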
trainList : List (List a) -> Markov comparable a -> Markov comparable a
Train the markov model on multiple sequences at once.
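A sketch of training on a small corpus of words at once:

```elm
corpus : List (List Char)
corpus =
    List.map String.toList [ "ant", "bee", "cat" ]

trainedOnCorpus : Markov Int Char
trainedOnCorpus =
    Markov.trainList corpus (Markov.empty charComparable)
```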
{ maxLength : Basics.Int }

The parameters used for generateSequence. These are set up in record format so that the function can be extended later.
generateSequence : SequenceSettings -> Markov comparable a -> Random.Generator (List a)
Generate a sequence of elements using the probabilities within the Markov transition graph. The sequence comes out at a random length and takes the Start -> a and a -> End transition probabilities into account, so the starting and ending weights influence the result as well.
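A sketch of generating a word from a trained model; the generator must still be run through elm/random (e.g. with Random.generate in your update function). trainedModel is an assumed, already-trained Markov Int Char:

```elm
import Random

wordGenerator : Random.Generator (List Char)
wordGenerator =
    Markov.generateSequence { maxLength = 10 } trainedModel
```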
encode : (a -> String) -> Markov comparable a -> Json.Encode.Value
Encode a Markov graph into a JSON object. In order to encode a Markov graph, we need to know how to encode your object into JSON data. Unfortunately, because of the current implementation, the object needs to be converted into a string so that it can be used as the JSON key for storing transition probabilities.
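For a Char model, String.fromChar can serve as the string conversion; a minimal sketch, assuming a trained model value:

```elm
import Json.Encode

encodedModel : Json.Encode.Value
encodedModel =
    Markov.encode String.fromChar trainedModel
```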
decode : (String -> a) -> (Element a -> comparable) -> Json.Decode.Decoder (Markov comparable a)
Decode a JSON object into a Markov graph. In order to decode the model, you need to have the inverse of the encoding function: a function that converts the string you created back into your object. You also need to pass the same function that you used to create the Markov graph with Markov.empty, which is what was used to store each element in the transition graph.
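A sketch of the matching decoder for a Char model: the string-to-Char function inverts String.fromChar (falling back to a space for malformed keys, an arbitrary choice), and charComparable is the same function passed to Markov.empty:

```elm
import Json.Decode

modelDecoder : Json.Decode.Decoder (Markov Int Char)
modelDecoder =
    Markov.decode
        (\s ->
            String.uncons s
                |> Maybe.map Tuple.first
                |> Maybe.withDefault ' '
        )
        charComparable
```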