nightmare, a sonnet

September 29, 2010

nightmare nightmare nightmare nightmare nightmare
nightmare nightmare nightmare nightmare nightmare
nightmare nightmare nightmare nightmare nightmare
nightmare nightmare nightmare nightmare nightmare
nightmare nightmare nightmare nightmare nightmare
nightmare nightmare nightmare nightmare nightmare
nightmare nightmare nightmare nightmare nightmare
nightmare nightmare nightmare nightmare nightmare
nightmare nightmare nightmare nightmare nightmare
nightmare nightmare nightmare nightmare nightmare
nightmare nightmare nightmare nightmare nightmare
nightmare nightmare nightmare nightmare nightmare
nightmare nightmare nightmare nightmare nightmare
nightmare nightmare nightmare nightmare nightmare

18 Sept, bigram generation, Suicidal Tendencies first album lyrics, epogees.

I get it now! Flarf!

No seriously, this poem is actually an illustration of what happens when your bigram model contains a pair (a, b) where a = b, where a has only one observed successor, and where b has more than one observed predecessor. For example, the Suicidal Tendencies song “Subliminal” includes the chorus:

“Danger – Nightmare
Doomsday – Nightmare
Murder – Nightmare
Nightmare, Nightmare”

Note that there’s a dash in each of the first three lines. So basically, when you’re generating from your language model, any time you pick one of the words (danger, doomsday, murder), the next token will be “-” with some non-zero probability. After that, the next word will be “nightmare” with some non-zero probability. Now, for this session I’d set the tokenizer to remove commas ’cause I thought it was appropriate for the subject material. So it happens that in a bigram model of the Suicidal Tendencies’ first album’s lyrics, the only successor of the token “nightmare” is “nightmare” itself, from the last line shown above. So once “nightmare” is selected, the only possible successor is “nightmare” itself, and so on ad infinitum.
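Here’s a little sketch of the mechanism, using the chorus above as a stand-in for the whole album (and treating each line as a separate sequence, which is an assumption about the tokenizer, not necessarily what I actually ran):

```python
import random
from collections import defaultdict

# Toy corpus: the "Subliminal" chorus, commas stripped, dashes kept as tokens.
lines = [
    "danger - nightmare",
    "doomsday - nightmare",
    "murder - nightmare",
    "nightmare nightmare",
]

# successors[a] lists every observed b in a bigram (a, b).
successors = defaultdict(list)
for line in lines:
    toks = line.split()
    for a, b in zip(toks, toks[1:]):
        successors[a].append(b)

# "nightmare" only ever precedes "nightmare": an absorbing state.
print(successors["nightmare"])  # ['nightmare']

def generate(start, length):
    word, out = start, [start]
    for _ in range(length - 1):
        word = random.choice(successors[word])
        out.append(word)
    return out

print(" ".join(generate("danger", 8)))
# danger - nightmare nightmare nightmare nightmare nightmare nightmare
```

In this toy corpus every token has exactly one successor, so the walk is fully deterministic; in the full album there’s real branching before the chain falls into the “nightmare” trap.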

For me, though, what really made it interesting was the subjective experience of authoring it. Generating poetry is kind of like using a petri dish to grow micro-organisms. You choose your technique, you set your parameters, and you see what happens. Every once in a while I’d see “nightmare” occur here and there, and then inevitably “nightmare” would consume the whole population of words like uncontrollable bacteria.

ok, cute, so what? well, the funny thing about these language models is that any model containing a “cycle” (a series of seen bigram pairs (a, b), (b, c), …, (*, a)) will have an infinite set of possible poems. HOWEVER, by introducing poetic forms, we constrain the number of poems that can be generated! for any nontrivial language model there are still going to be approximately nine bajillion possible poems (i.e. too many to examine exhaustively), but we can still try partitioning the space of possible poems. for example: if you’re generating an infinite bigram chain, tokenized without punctuation, on the lyrics of the Suicidal Tendencies’ first album, the chain will tend towards a state where the only token being generated is “nightmare.” QED!
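You can even check a model for this kind of trap mechanically. A sketch (same toy chorus corpus as a stand-in, helper names are mine): a token is “absorbing” when every successor ever observed for it is the token itself, so any chain that reaches it degenerates into a self-loop.

```python
from collections import defaultdict

def bigram_successors(lines):
    """Map each token to the list of tokens observed to follow it."""
    succ = defaultdict(list)
    for line in lines:
        toks = line.split()
        for a, b in zip(toks, toks[1:]):
            succ[a].append(b)
    return succ

def absorbing_states(succ):
    """Tokens whose only observed successor is themselves: once the
    chain picks one, every later token is forced."""
    return {a for a, bs in succ.items() if set(bs) == {a}}

succ = bigram_successors([
    "danger - nightmare",
    "doomsday - nightmare",
    "murder - nightmare",
    "nightmare nightmare",
])
print(absorbing_states(succ))  # {'nightmare'}
```

Running this over a real lyric corpus before generating would tell you up front which words are going to eat the poem.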

p.s. I went to a poetry reading today! It was exactly like I’d always imagined an AA meeting would be!!! Jesus, if that’s the state of modern poetry then we’re better off just being kooks on blogs!

I don’t understand why you would want to have a poetry reading without music. People will listen to pretty much anything you say if you put it to a beat.
