
Accidental, as in what the lion says (Stein Poem)

June 29, 2015

If a lion could talk, we would not understand him.
– Ludwig Wittgenstein (tr. G.E.M. Anscombe)

She cannot be blue.  There is no blue in general.
Just try it.  So rosy and pink with yourself.
So let us think for now only of meaning.
Dear me the thing is incomprehensible!
Also maybe you are incompatible.
You should never go into long descriptions.
If you hear clearly what the lion says.
Why a lizard or a man may be plainly anxious.
And wild animals have nothing to say to you.
Your imagination full of cuts and trembling.
You are indistinguishable from it.
Just past and above it is by accident.
The definition must be accidental.
The edges of the same place cannot burn twice.

Composed with Gnoetry 0.2 and the following texts:
Gertrude Stein, The World Is Round
Woods Hutchinson, The Child’s Day
Gertrude Stein, Corrected Stanzas in Meditation
Bertrand Russell, Mysticism and Logic and Other Essays

More Free Grass (9 “Haiku”)

June 29, 2015

Only text.  Have we
not stood here like trees in the
eyes of publishers?

I myself do not
drive an engine of free speech
and variety.

It cannot fail.  Code
becomes law; code extends the
control of others.

O manhood, tangled
in the administration
of democracy.

The soul as softness,
possibilities.  Can we
get service at night?

There is no escape
for you and me, but we do
not need the future.

It’s crazy to me
to believe that property
is the dream of life.

I swear I never
had any limit.  Yet there
is no need for me.

More formalities…
if you do not buy it, you
are a miracle.

Composed in collaboration with Gnoetry 0.2 and these texts:
Lawrence Lessig, Free Culture
Walt Whitman, Leaves of Grass

Careless, as in a little lonesome Rose (Stein Poem)

June 18, 2015

After a long break, I’ve come back to the Stein Poems series, eventually to become the book same: a Stein wreader. This poem uses Stein’s book, The World Is Round, as its principal source text (weighted 50%). It is a delightful book written for children, although some find it too “difficult” for their reading level (I hate when people use this word to speak about poetry).

Anyways, enjoy.

Well well Rose is a cowboy, too.
With all the legs of little lonesome,
The stars were big and round with glasses.
Almost every school will send for a wagon,
Drawn by the time you see it.
The grass looked like measles on buckskin,
The stars were not always sold there but they still flew.
Draw a picture of its legs,
Janet said, it will stand there.
I am talking about a sick pony in school,
Or it might be called a receiver, you see.
Did you see it there was an O and animals.
Did she see it.  The hours spin away,
For he is careless in winter.
It makes Rose cry and thunder so.

Composed with Gnoetry 0.2 and the following source texts:
Woods Hutchinson, The Child’s Day
Gertrude Stein, The World Is Round
Howard R. Garis, The Curlytops at Uncle Frank’s Farm

Install Gnoetry 0.2 with the Xubuntu VirtualBox Appliance

June 12, 2015

In an effort to spread the good news of Gnoetry more easily to the masses of Mac and Windows users, I’ve taken on the project this year to prepare an alternative method of getting Gnoetry up and running with minimal obstacles. For Ubuntu or Debian users, things have been simplified now by using git to clone michel-slm’s bug-fix branch of the Gnoetry source files from GitHub, installing a few dependencies and building from the included Makefile (I’ll put together a short set of instructions for that process later this summer). For those not interested in installing Linux on their computers, there is now an alternative.
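For the git route, the steps might look roughly like this — the repository path, branch, and package names below are guesses on my part, so check michel-slm’s GitHub page for the real ones (full instructions to come later this summer, as noted above):

```shell
# Rough sketch of the Ubuntu/Debian build route described above.
# Repo URL, branch name, and package names are assumptions — verify
# them against michel-slm's GitHub page before running.
sudo apt-get install git build-essential python-gtk2
git clone https://github.com/michel-slm/gnoetry.git
cd gnoetry
make    # builds from the included Makefile
```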

I have put together an Xubuntu virtual machine which has Gnoetry 0.2 already installed and ready to launch from an icon on the desktop. VirtualBox allows for virtual machines to be exported and shared as “appliances.” You can download the VirtualBox appliance file (2.3 GB) and the PDF installation guide from my main website, here:

Gnoetry 0.2 Virtual Appliance Installation Guide and User Manual

Also included is a user manual for Gnoetry 0.2. Aside from the simple installation of Oracle VirtualBox, an extension pack, and the appliance itself, the only tricky part is optimizing the virtual machine for your system. A complete guide through this process is included in the installation guide.

Test it out yourselves and let me know how it works.


February 4, 2015

Memoria was fabricated from two separate ‘biases’ or saved n-gram states. Pages 3 through 10 (which exclude the title page and preface) are generated from the KJV translation of the Bible. All poems thereafter are a mixture of Revelation, Daniel, Genesis, and The Best Women’s Erotica of 2001.

The Augmented Imagination Project

September 21, 2014

Hi all, and thanks for inviting me to join this blog.

I’ve been engaged in an ongoing computer poetry experiment I call ‘The Augmented Imagination Project’ or ‘AI Project’(!) for short. I see the project as a creative experiment open to numerous interpretations, and no more susceptible to a definitive summary than a poem. But there are some things I can say by way of introduction to the project:

In the AI project, algorithms randomly generate words, which are shown in a randomized fashion in a digital display. Alphabetical constraints can be put on the words to be generated, for example, the constraint that only words containing the letter ‘d’ are displayed. I am currently working on adding further constraints, for instance to limit the number of syllables or to require particular rhyme endings.
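As a rough illustration (not the AI Project’s actual code), the letter-constraint idea can be sketched in a few lines of Python; the word pool here is a made-up stand-in for a real lexicon:

```python
import random

# Sketch of the constraint idea: draw random words from a pool,
# keeping only those that satisfy an alphabetical constraint
# (here, "must contain a given letter").
WORD_POOL = ["display", "poem", "random", "word", "cloud", "drift", "idea"]

def constrained_words(pool, must_contain, count, seed=None):
    """Return `count` random words that contain the letter `must_contain`."""
    rng = random.Random(seed)
    candidates = [w for w in pool if must_contain in w]
    return [rng.choice(candidates) for _ in range(count)]

print(constrained_words(WORD_POOL, "d", 3, seed=1))
```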

My interest in the project centers on the possibility of exploring the impact of computers on the creative process itself. Unlike most other computerised poetry generators, the AI project is not designed to produce finished poems, but to offer an interface for human-computer interaction whose promise is an enhancement or ‘augmentation’ of the poetic imagination, specifically with respect to its word-generating function.

As I argue in an essay on my website, it is possible to locate this project within a subtradition of computer poetry I call ‘computer-assisted poetry’ (as distinct from ‘computer-generated poetry’) that would include some poetic experiments by Charles Hartman, and Ray Kurzweil’s Cybernetic Poet software.

I think of myself as having a pluralistic poetics encompassing both avant-garde and more traditional perspectives. Consequently, I have personally used the AI project to assist in writing poetry in a variety of styles, including styles that may not appear in the least computer-generated. (An example can be found here.) This neutrality between a variety of approaches to poetry and poetics is another thing that excites me about the project.

To see the program in action, and to read my essay ‘Seven ways of looking at the Augmented Imagination Project’, visit my website.

I’ve also created a resource site for digital poetics. It’s still in its early stages, and I’d be grateful for any suggestions of links to add to it.

Thanks again. I welcome comments on the project, and will keep you posted on updates.


Infinite Monkeys 3.4.0

July 18, 2014

Hello All!

I have finally gotten around to finishing and updating IM3’s new scripting language (JSB).


JSB is a major upgrade for a couple of different reasons. First, the base variable type is based on JavaScript Object Notation (JSON), which makes it easier to import or export information to or from JSB/IM3. Second, the language is more powerful and better constructed, and because it has a more JavaScript-like syntax it should be easier for people to pick up than SMUP, which was like Lisp, except crappier.


Here is an example of JSB (with output):

var s = "";

s += JJ + " " + NNS + newline;
s += JJ + " " + NNS + newline;
s += "the " + NN + " of " + NNS;

print( s );
out( s );



limpid environics
brandname corticosteroids
the pollio of housings
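For readers unfamiliar with Penn Treebank tags: JJ is an adjective, NN a singular noun, NNS a plural noun. A rough Python analogue of the JSB snippet above (the word lists are invented stand-ins for IM3’s real lexicon) might look like:

```python
import random

# Rough analogue of the JSB example: each placeholder (JJ, NN, NNS)
# is filled with a random word drawn from a lexicon bucketed by
# part-of-speech tag. These tiny word lists are stand-ins.
LEXICON = {
    "JJ":  ["limpid", "brandname", "vague"],
    "NN":  ["pollio", "engine", "monkey"],
    "NNS": ["environics", "corticosteroids", "housings"],
}

def pick(tag, rng):
    return rng.choice(LEXICON[tag])

def generate(rng=None):
    rng = rng or random.Random()
    lines = [
        pick("JJ", rng) + " " + pick("NNS", rng),
        pick("JJ", rng) + " " + pick("NNS", rng),
        "the " + pick("NN", rng) + " of " + pick("NNS", rng),
    ]
    return "\n".join(lines)

print(generate(random.Random(0)))
```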


news early July

July 3, 2014
  • killer article on interactive poetry generators: Reading the Drones at eliterature 2014. a strong focus on JanusNode and other generators, says: “Some of the most interesting contemporary poetry is written in collaboration with [Interactive Poetry Generators] and is yet to receive the critical attention it deserves – see, for example, the Gnoetry Daily group blog and Eric Elshtain’s Beard of Bees press.” word! (thanks for pointing it out, Nick!)
  • a Turing test for poetry! of particular amusement is the fact that in their leaderboard, 4 of the top 5 poems in the “Most computer-like computer poems” category (i.e. computer-generated poems that were correctly identified as computer-generated) were produced by jGnoetry. Of the “Most human-like computer poems”, 2 of the top 5 are by JanusNode. But see, that’s why I don’t like that whole “try to fool people into thinking a computer-generated poem is actually human-generated” angle. The whole point of jGnoetry is that you spend time during interactive generation to make it interesting… that way it’s yours, but it isn’t as computer-generated. With JanusNode, all the work is done beforehand to make it human-like, but it isn’t really yours… and since it was so carefully human-authored, is it really solely computer-generated? Still, the website is a neat idea, check it out!
  • Yet again, I missed the Computational Linguistics for Literature workshop. There seemed to be fewer talks this year, though, and nothing really on poetry or generation…
  • the code poetry slam isn’t completely about poetry generation, but it’s still kind of cool…
  • an interesting story about text generation with Markov chains from the writer of the infamous Scientific American “Computer Recreations” column that introduced the idea to North America
  • no more ads on Gnoetry Daily! when did they start adding ads anyway? screw you!
  • the always-interesting poetry with mathematics blog had some interesting posts over the past year, including oulipo-esque symmetric squares and growing lines, as well as poetry with mathematics readings!
  • in the ever-growing tradition of twitter bots… @pentametron!
  • just so you know… Gnoetry Daily Volume 2 is in the works!!1!!
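The Markov-chain technique mentioned in the list above can be sketched in a few lines of Python; the corpus here is a throwaway example:

```python
import random
from collections import defaultdict

# A minimal word-level Markov chain generator: record which words
# follow each word in the corpus, then walk the chain from a start
# word, sampling successors at random.
def build_chain(text):
    chain = defaultdict(list)
    words = text.split()
    for cur, nxt in zip(words, words[1:]):
        chain[cur].append(nxt)
    return chain

def generate(chain, start, length, seed=None):
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        successors = chain.get(out[-1])
        if not successors:
            break  # dead end: the last word never had a successor
        out.append(rng.choice(successors))
    return " ".join(out)

corpus = "the lion says the lion sleeps and the poet says nothing"
chain = build_chain(corpus)
print(generate(chain, "the", 8, seed=3))
```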


Thoughts on NaNoGenMo pt. 2

June 17, 2014

Continuing my overview of National Novel Generation Month, in which novel-length texts were computationally generated…

4. jiko

jiko’s project “Gen Austen” (hah!) produced several novels derived from Jane Austen (or in one case, Austen-related fanfic).

This is not jiko

One uses trigrams with some POSing, one uses some numerological approach, two of them use dialogue-replacement algorithms (replacing the dialogue of one novel with the dialogue of another), and one is passed through an anagram generator to produce, basically, a list of anagrams. jiko also provides a list of resources he’s worked on (or just finds useful?).

5. nickmontfort

Nick’s novel World Clock consists of a series of template-generated paragraphs which describe a time in a location, a person in a place, and an action performed by that person.

This is not nickmontfort

For an added bit of class, the script outputs TeX, for easy pdf-ing.
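The template approach can be sketched roughly as follows — an illustration only, not Nick’s actual code; the names, places, and actions here are invented:

```python
import random

# Sketch of a template-generated paragraph: fill fixed slots with a
# random time, place, person, and action. The word lists are stand-ins.
PLACES = ["Samarkand", "Reykjavik", "Lima"]
NAMES = ["Gang", "Miriam", "Tano"]
ACTIONS = ["reads the newspaper", "winds a watch", "closes the shutters"]

def paragraph(rng):
    return "It is now exactly {}:{:02d} in {}. {} {}.".format(
        rng.randrange(24), rng.randrange(60),
        rng.choice(PLACES), rng.choice(NAMES), rng.choice(ACTIONS))

rng = random.Random(7)
print("\n\n".join(paragraph(rng) for _ in range(3)))
```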

6. catseye

Next up is catseye, who developed a story engine that also powered dariusk’s entry.

This is not catseye

catseye’s novel Dial S for Swallows discusses the interactions of two characters, an object, and a location. It reminds me a bit of the time I copy-pasted the output of a MUD (a text “multi-user dungeon”, kids) repeatedly until it was novel-length. Skimming the source code, it looks like it is indeed a high-level simulator running several agents through a world representation. Another thing worth looking through. catseye also posted several very interesting thoughts on the process that merit further study.

7. elib

Looks like elib wrote a twitter scraper that collated all lines it found that began with “The way that…” Not bad. Works best as conceptual writing.

This is not elib

8. MichaelPaulukonis

Looks like MichaelPaulukonis did several NLP-based transformations on texts, including named entity recognition swaps between texts. It looks like something more is going on, but it’s not clear what; need to look through the final text and the source code a bit more.

This is not MichaelPaulukonis

9. ianrenton

ianrenton is apparently using some kind of spam-generation technique called Bayesian Poisoning (a technique that adds common non-spam words to spam in an attempt to have it classified as legitimate, thereby rendering the classifier unreliable). It’s a great idea, since it’s likely to add to a text the kind of words you’d expect to see there (i.e. not spam-like). ianrenton produced a text using fanfic from as a basis. I haven’t looked too closely through the source code, but it seems to work by collating sentences from different stories before performing any Bayesian Poisoning techniques.

This is not ianrenton
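The poisoning idea can be sketched as follows — a toy illustration, not ianrenton’s code; the word lists and the naive word-frequency scorer are invented:

```python
import random

# Toy sketch of Bayesian poisoning: pad a message with common
# non-spam ("ham") words so a naive word-frequency spam score drops.
# Word lists and the scorer are invented for illustration.
HAM_WORDS = ["meeting", "thanks", "report", "weather", "family", "lunch"]
SPAM_WORDS = {"viagra", "winner", "prize", "free"}

def spam_score(text):
    """Fraction of words that are known spam words."""
    words = text.lower().split()
    return sum(w in SPAM_WORDS for w in words) / len(words)

def poison(text, n_ham, seed=None):
    """Append n_ham random innocuous words to dilute the spam score."""
    rng = random.Random(seed)
    padding = [rng.choice(HAM_WORDS) for _ in range(n_ham)]
    return text + " " + " ".join(padding)

msg = "claim your free prize winner"
print(spam_score(msg))                       # high ratio of spam words
print(spam_score(poison(msg, 20, seed=1)))   # diluted score
```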

10. lilinx

lilinx is pulling all sentences with the word hit/hurt from a French translation of Homer’s Iliad, and reversing their order. My French isn’t very good, but it’s a great idea. Script, notes, and output.

This is not lilinx

11. ikarth

Then there’s ikarth’s “Gutenberg shuffle” approach. Described as: “Take a collection of texts from Gutenberg, generate a list of all of the names you find. Swap the names around. Shuffle the sentences. Shuffle and redeal for every paragraph.” Here’s the code, which allegedly uses OpenNLP, and the resulting novel.

This is not ikarth
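The described pipeline (swap names around, shuffle sentences per paragraph) can be sketched without OpenNLP by using a fixed name list in place of real named-entity recognition:

```python
import random
import re

def swap_names(text, mapping):
    # Single-pass regex substitution, so a swapped-in name is not
    # swapped again (Emma -> Darcy must not then become Darcy -> Emma).
    pattern = re.compile("|".join(map(re.escape, mapping)))
    return pattern.sub(lambda m: mapping[m.group(0)], text)

def gutenberg_shuffle(paragraphs, names, seed=None):
    """Shuffle the names among themselves, then the sentences of each paragraph."""
    rng = random.Random(seed)
    shuffled = names[:]
    rng.shuffle(shuffled)
    mapping = dict(zip(names, shuffled))
    out = []
    for para in paragraphs:
        sentences = [s.strip() for s in para.split(".") if s.strip()]
        rng.shuffle(sentences)
        out.append(swap_names(". ".join(sentences) + ".", mapping))
    return out

paras = ["Emma met Darcy. Darcy bowed. Emma laughed."]
print(gutenberg_shuffle(paras, ["Emma", "Darcy"], seed=2))
```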

12. Beyamor

Beyamor’s novel “The Lives of Writers” is made by taking bits of writers’ biographies and collating them together. Great idea, with a lot of potential. Looks like it’s written in Python.

This is not Beyamor

13. Strand

Strand’s “simple solution” is

puts "Buffalo buffalo Buffalo buffalo buffalo buffalo Buffalo buffalo. " * 6250

ah ha ha ha ha…. more conceptualism. or is it flarf? or just good ol’ fashioned lulz? I like the sentence, but… I dunno…

This is not strand

14. jonkagstrom

jonkagstrom’s Abscission is deep. Apparently, the approach is to part-of-speech tag Kafka’s Metamorphosis, then modify each paragraph according to a series of transformations directed by WordNet. Pretty awesome on a number of levels. Here’s the output and the code.

This is not jonkagstrom
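The WordNet-directed substitution idea can be sketched with a hand-made synonym table standing in for WordNet — an illustration only, not jonkagstrom’s code:

```python
import random

# Toy sketch of dictionary-directed word substitution: replace words
# with synonyms from a lookup table. This tiny hand-made dict stands
# in for WordNet.
SYNONYMS = {
    "woke": ["awakened", "stirred"],
    "insect": ["vermin", "bug"],
    "dreams": ["visions", "reveries"],
}

def transform(sentence, seed=None):
    rng = random.Random(seed)
    return " ".join(
        rng.choice(SYNONYMS[w]) if w in SYNONYMS else w
        for w in sentence.split())

print(transform("Gregor woke from uneasy dreams as a monstrous insect", seed=5))
```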