Infinite Monkeys v1.61
miniature ends state explanations Series sick
starts side exchange
Source eventually Your revolves scribes
nowhere earns scattered
rid disk kingdom; tyranny You undoubtedly
you until likely
Same| mysterious struggle
launderers star rectangle emits Source eRoGK7; albeit
selecting guiding grasping getting go otter-view
otherwise effect thought throwing Gnoetry; lying ghosts
albeit temple emits sent there earns superstition
son News search help playing gertbot|
NOT tools spectacle everybody Yeailliam men
system markov Voyage epy Yeats Sidenote:
GRASS staples statement tat
l leaf-mould divided drifting grafting guided
This is an unsupervised generation done with Infinite Monkeys. Believe it or not, it was made entirely from n-grams gathered from our newest release, the Gnoetry Daily chapbook. Normally, n-grams are biased in a linear fashion, i.e., the word that comes after a given word becomes one of its links. In this case, the bias was constructed out of some basic rules. Can you figure out the rule that wrote this poem?
The next word in the line’s sequence must begin with the same letter that ended the last word. For example: spam manners slap patterns.
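As a minimal sketch (not the actual Infinite Monkeys code), the letter-chaining rule can be checked like this; the function name is illustrative:

```python
def follows_rule(line):
    """Return True if each word begins with the last letter of the previous word."""
    words = line.lower().split()
    return all(prev[-1] == nxt[0] for prev, nxt in zip(words, words[1:]))

# The example from the text:
print(follows_rule("spam manners slap patterns"))  # True
```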
This is one of two new ways to bias the n-grams data, the other being simple (naive) alliteration (which doesn’t take phonetic components into account).
Check it out:
*Please also note:
Firstly, with the bias in place you will find that the “rule” applies basically without fail; however, in the case that a word has no links, another random word will be drawn.
Also, if you clear the bias and generate a line out of the data, then since there are no links, every word will be drawn at random.
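The biased draw with its random fallback might be sketched as follows; `generate_line` and the flat word list are assumptions for illustration, not the program's real data structures:

```python
import random

def generate_line(vocab, length, seed=None):
    """Sketch of the letter-chaining bias with a random fallback.

    vocab: a flat list of words standing in for the n-gram data.
    Each next word must begin with the last letter of the previous
    word; if no word in vocab links, one is drawn at random instead.
    """
    rng = random.Random(seed)
    line = [rng.choice(vocab)]
    for _ in range(length - 1):
        needed = line[-1][-1]
        links = [w for w in vocab if w[0] == needed]
        # Fallback: no links -> draw any word at random.
        line.append(rng.choice(links) if links else rng.choice(vocab))
    return " ".join(line)
```

With a vocabulary in which every last letter has at least one matching first letter, every generated line obeys the rule; clearing the bias would amount to skipping the `links` filter entirely, so every word is random.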