
Sketch: ClassNPhone

February 14, 2010

poets so public, hung to write thine body rebel love
bright knowledge adonis minds fearing neglect
imprison time from lawful bell, falsely filching the physician
blood and wood quick darkly

mistress both untrue and melancholy
have virgin hands lacking neglect
love gentle in accidents outgoing badness
thy children your impregnable legions

1:42am, jan 6 2010
supervised generation, phonemically evaluated, sampling from a class-based bigram model of shakespeare’s sonnets

I totally dig Pablo Gervás, head of a team that does some pretty sophisticated poetry generation research. But I just came across a paper of his from e-poetry 2007 with an unfortunate quote, given my current focus:

“Most people prefer a completely mundane sentence that makes sense to a line of alliterative drivel.”

Oh man… and just when I’d given up on coherence to focus on phonemics…

Coherence, as I say, is a tricky thing. For the moment I wanted to focus on the sounds of poetry. Sure, you can have Gnoetry generate bigrams until they sound good, but I wanted a program that at least attempted to sound good automatically. But how do you evaluate ‘sounds good’? If you used human annotators you’d need reasonable agreement metrics, and I suspect you’d never get two people to agree reliably unless you restricted the ratings to something trivial. But if you use an authorial approach to evaluation, you can say that ‘sounds good’ is whatever the author says it is!

So anyway, I built an interface to allow the author to define the kind of phonemics wanted, and generate poetry accordingly:

I wanted to make it look like a soundboard, you know, like in recording studios. Well it does, a little, but the poetry sucks. Luckily I made the output completely editable. So you could cut out, for example, everything except “increase the contrary” and “evermore feeling”, re-generate, and manually edit again. The source text is the complete sonnets of Shakespeare, by the way. I used class-based n-grams instead of the straight type-based n-grams used in Gnoetry. Originally I thought they would provide more variation, but I think they just removed whatever little coherence adjacency brings… Oh well, that’s why I consider this a ‘sketch’.
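(If you’re wondering what I mean by class-based bigrams, here is a minimal sketch of the idea, not the applet’s actual code: it assumes a corpus that has already been POS-tagged into word/TAG tokens, learns bigram transitions over the tag classes, and samples each emitted word from whatever class comes next. All the names below are illustrative.)

```java
import java.util.*;

/**
 * Minimal, illustrative sketch of class-based bigram generation:
 * bigram transitions are learned over POS classes, and each class
 * emits words sampled from that class's observed word list.
 * Names and structure are hypothetical, not the applet's actual code.
 */
public class ClassBigramSketch {
    private final Map<String, List<String>> classTransitions = new HashMap<>(); // tag -> next tags seen
    private final Map<String, List<String>> classEmissions = new HashMap<>();   // tag -> words seen
    private final Random rng = new Random();

    /** Train on a tagged corpus where each token looks like "word/TAG". */
    public void train(List<String> taggedTokens) {
        String prevTag = "<s>";
        for (String token : taggedTokens) {
            int slash = token.lastIndexOf('/');
            String word = token.substring(0, slash);
            String tag = token.substring(slash + 1);
            classTransitions.computeIfAbsent(prevTag, k -> new ArrayList<>()).add(tag);
            classEmissions.computeIfAbsent(tag, k -> new ArrayList<>()).add(word);
            prevTag = tag;
        }
    }

    /** Sample a line of up to the given length by walking the class bigram chain. */
    public String sampleLine(int length) {
        StringBuilder line = new StringBuilder();
        String tag = "<s>";
        for (int i = 0; i < length; i++) {
            List<String> nextTags = classTransitions.get(tag);
            if (nextTags == null || nextTags.isEmpty()) break;
            // Picking uniformly from the list of observed items is the same as
            // sampling in proportion to bigram counts (plain maximum likelihood).
            tag = nextTags.get(rng.nextInt(nextTags.size()));
            List<String> words = classEmissions.get(tag);
            String word = words.get(rng.nextInt(words.size()));
            if (line.length() > 0) line.append(' ');
            line.append(word);
        }
        return line.toString();
    }

    public static void main(String[] args) {
        ClassBigramSketch model = new ClassBigramSketch();
        // A tiny hand-tagged fragment, standing in for the tagged sonnets.
        model.train(Arrays.asList("from/IN", "fairest/JJS", "creatures/NNS",
                                  "we/PRP", "desire/VBP", "increase/NN"));
        System.out.println(model.sampleLine(6));
    }
}
```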

Anyway, it was inspired by the textbook example of n-gram poetry generation in Jurafsky and Martin (2000), as well as by Gnoetry (Trowbridge, 2007) and Manurung (2003). I used the Stanford Part-Of-Speech tagger on a copy of Shakespeare’s sonnets, though any tagger (or even manual annotation) would do, and generated through sampling. Pressing the ‘generate – line’ button generates 10 lines, the best of which is picked by an evaluation function that measures a candidate line’s phonemic similarity to the phonemics desired. The CMU pronouncing dictionary (partially supplemented by manually-annotated words) provides the phonemes that make up each line. Increasing the value of Consonance or Assonance gives higher evaluation scores to lines that contain multiple instances of ANY consonants or vowels, respectively. Increasing the value of an individual consonant or vowel phoneme likewise gives higher scores to lines that contain that particular phoneme. Examining the Java console (which may be accessible through the browser, or through a toolbar widget) shows the candidate lines and their scores.
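To make the evaluation step a bit more concrete, here is a hedged sketch of what such a scorer could look like: it counts repeated consonant and vowel phonemes (for consonance and assonance), adds bonuses for individually weighted phonemes, and picks the best of a batch of candidate lines. The tiny pronunciation table stands in for the CMU dictionary, and the weights, class name, and method names are hypothetical rather than what the applet actually uses.

```java
import java.util.*;

/**
 * Rough sketch of a phonemic line evaluator: the score rewards repeated
 * consonant phonemes (consonance), repeated vowel phonemes (assonance),
 * and occurrences of individually weighted target phonemes.
 * Weights and the tiny lookup table are illustrative only.
 */
public class PhonemicScorerSketch {
    // word -> ARPAbet phonemes, standing in for the CMU pronouncing dictionary
    private final Map<String, String[]> pronunciations;
    private final double consonanceWeight;
    private final double assonanceWeight;
    private final Map<String, Double> phonemeWeights; // e.g. "S" -> 2.0

    public PhonemicScorerSketch(Map<String, String[]> pronunciations,
                                double consonanceWeight, double assonanceWeight,
                                Map<String, Double> phonemeWeights) {
        this.pronunciations = pronunciations;
        this.consonanceWeight = consonanceWeight;
        this.assonanceWeight = assonanceWeight;
        this.phonemeWeights = phonemeWeights;
    }

    private static boolean isVowel(String phoneme) {
        // CMU vowel phonemes carry a stress digit (e.g. "IY1"); consonants do not.
        return Character.isDigit(phoneme.charAt(phoneme.length() - 1));
    }

    private static String stripStress(String phoneme) {
        return phoneme.replaceAll("\\d", "");
    }

    /** Score one candidate line. */
    public double score(String line) {
        Map<String, Integer> counts = new HashMap<>();
        for (String word : line.toLowerCase().split("\\s+")) {
            String[] phones = pronunciations.get(word);
            if (phones == null) continue; // unknown word: skip (the real dictionary is supplemented by hand)
            for (String p : phones) counts.merge(p, 1, Integer::sum);
        }
        double score = 0.0;
        for (Map.Entry<String, Integer> e : counts.entrySet()) {
            int repeats = e.getValue() - 1; // only repetitions count toward consonance/assonance
            score += (isVowel(e.getKey()) ? assonanceWeight : consonanceWeight) * repeats;
            score += phonemeWeights.getOrDefault(stripStress(e.getKey()), 0.0) * e.getValue();
        }
        return score;
    }

    /** Generate-and-test: return the best of several candidate lines. */
    public String best(List<String> candidates) {
        return candidates.stream().max(Comparator.comparingDouble(this::score)).orElse("");
    }

    public static void main(String[] args) {
        Map<String, String[]> dict = new HashMap<>();
        dict.put("summer", new String[]{"S", "AH1", "M", "ER0"});
        dict.put("sweet",  new String[]{"S", "W", "IY1", "T"});
        dict.put("silent", new String[]{"S", "AY1", "L", "AH0", "N", "T"});
        dict.put("thought", new String[]{"TH", "AO1", "T"});
        PhonemicScorerSketch scorer = new PhonemicScorerSketch(
                dict, 1.0, 1.0, Map.of("S", 2.0)); // favour sibilants
        System.out.println(scorer.best(Arrays.asList("sweet silent thought", "summer thought")));
    }
}
```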

I made it a Java applet so people could check it out without having to install a Python module or something, but apparently WordPress.com doesn’t allow applets in blog posts. So until I score some web space, you can check it out by downloading it here (UPDATE here), renaming the extension to .zip, unzipping it, and double-clicking on the web page therein. The latest version is now online here!

Later!

Works Cited

Pablo Gervás, Raquel Hervás, and Jason R. Robinson, “Difficulties and Challenges in Automatic Poem Generation: Five Years of Research at UCM”, e-poetry 2007, Université Paris 8.

Daniel Jurafsky and James Martin, Speech and Language Processing, 1st edition, 2000.

Hisar Maruli Manurung (2003), “An evolutionary algorithm approach to poetry generation”, PhD dissertation, University of Edinburgh.

Jon Trowbridge (2007?), “How Gnoetry 0.2 Works”, mid)rib, Issue 2.

3 Comments
  1. Eric Elshtain
    February 14, 2010 6:27 pm

    Welcome! Thanks for the fantastic work–and for adding to our collection of poetry machines.

    Mr. Gervas’ statement is a common one and begs the question–are those really the only alternatives? Sense versus absolute nonsense (“drivel”). He also says in the essay that “Poems usually tell a story.” Pace, Gervas, but really? Usually? That seems to me to be a very short-sighted, if not naive idea to assert about poetry.

    Machines like Gnoetry butt heads with the arbiters (I’m not including Gervas here) of a totalist view of how poems can mean.

  2. Matthew
    February 16, 2010 10:37 am

    This is fantastic! I’m always excited to play with new word toys, and I’m already having fun with this one.

    My only question would be how one might add more source texts. I’m no programmer, so I have no idea if that’s a silly question or if it would be more than I could handle, but I’d love to see what the program would be like with more sources at its disposal.

    Again, though, great work.

    • February 18, 2010 7:10 pm

      > My only question would be how one might add more source texts.

      Not easily at the moment. Maybe someday. There are a couple other things I want to do first.
