
Infinite Monkeys 3.4.0

July 18, 2014

Hello All!

I have finally gotten around to finishing and updating IM3’s new scripting language (JSB).

https://sourceforge.net/projects/infinitemonkeys616/


JSB is a major upgrade for a couple of reasons. First, its base variable type is built on JavaScript Object Notation (JSON), which makes it easier to import or export information to and from JSB/IM3. Second, the language is more powerful and better constructed, and because it has a more JavaScript-like syntax it should be easier for people to pick up than SMUP, which was like Lisp except crappier.


Here is an example of JSB (with its output below). JJ, NNS, and NN are Penn Treebank part-of-speech tags (adjective, plural noun, singular noun), here used as slots filled with random dictionary words, and newline is a line break.

var s = "";

s += JJ + " " + NNS + newline;
s += JJ + " " + NNS + newline;
s += "the " + NN + " of " + NNS;

print( s );
out( s );


limpid environics
brandname corticosteroids
the pollio of housings


news early July

July 3, 2014
  • killer article on interactive poetry generators: Reading the Drones at eliterature 2014. It has a strong focus on JanusNode and other generators, and says: “Some of the most interesting contemporary poetry is written in collaboration with [Interactive Poetry Generators] and is yet to receive the critical attention it deserves – see, for example, the Gnoetry Daily group blog and Eric Elshtain’s Beard of Bees press.” word! (thanks for pointing it out, Nick!)
  • a Turing test for poetry: botpoet.com! of particular amusement is the fact that in their leaderboard, 4 of the top 5 poems in the “Most computer-like computer poems” (i.e. computer-generated poems that were correctly identified as computer-generated) were produced by jGnoetry. Of the “Most human-like computer poems”, 2 of the top 5 are by JanusNode. But see, that’s why I don’t like that whole “try to fool people into thinking a computer-generated poem is actually human-generated” angle. The whole point of jGnoetry is that you spend time during interactive generation to make it interesting… that way it’s yours, but it isn’t as computer-generated. With JanusNode, all the work is done beforehand to make it human-like, but it isn’t really yours… and since it was so carefully human-authored, is it really solely computer-generated? Still, the website is a neat idea, check it out!
  • Yet again, I missed the Computational Linguistics for Literature workshop. There seemed to be fewer talks this year, though, and nothing really on poetry or generation…
  • the code poetry slam isn’t completely about poetry generation, but it’s still kind of cool…
  • an interesting story about text generation with Markov chains from the writer of the infamous Scientific American “Computer Recreations” column that introduced the idea to North America
  • no more ads on Gnoetry Daily! when did they start adding ads anyway? screw you, wordpress.com!
  • the always-interesting poetry with mathematics blog had some notable posts over the past year, including Oulipo-esque symmetric squares and growing lines, as well as poetry with mathematics readings!
  • in the ever-growing tradition of twitter bots… @pentametron!
  • just so you know… Gnoetry Daily Volume 2 is in the works!!1!!


Thoughts on NaNoGenMo pt. 2

June 17, 2014

Continuing my overview of National Novel Generation Month, in which novel-length texts were computationally generated…

4. jiko

jiko’s project “Gen Austen” (hah!) produced several novels derived from Jane Austen (or in one case, Austen-related fanfic).

This is not jiko

One uses trigrams with some POSing, one uses a numerological approach, two use dialogue-replacement algorithms (replacing the dialogue of one novel with the dialogue of another), and one is passed through an anagram generator to produce, basically, a list of anagrams. jiko also provides a list of resources he’s worked on (or just finds useful?).
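For reference, the trigram trick (a second-order Markov chain over words) looks roughly like the following Python sketch. This is a generic illustration, not jiko’s actual code.

import random
from collections import defaultdict

def build_trigram_model(words):
    # Map each pair of consecutive words to the words seen after that pair.
    model = defaultdict(list)
    for a, b, c in zip(words, words[1:], words[2:]):
        model[(a, b)].append(c)
    return model

def generate(model, length=30):
    pair = random.choice(list(model))
    out = list(pair)
    for _ in range(length):
        followers = model.get(tuple(out[-2:]))
        if not followers:
            break
        out.append(random.choice(followers))
    return " ".join(out)

text = ("it is a truth universally acknowledged that a single man in "
        "possession of a good fortune must be in want of a wife").split()
print(generate(build_trigram_model(text)))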

5. nickmontfort

Nick’s novel World Clock consists of a series of template-generated paragraphs, each describing a time, a location, a person in that place, and an action performed by that person.

This is not nickmontfort

For an added bit of class, the script outputs TeX, for easy pdf-ing.
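To make the template idea concrete, here is a toy Python version of a “time, place, person, action” paragraph. The names and phrasing are placeholders of my own, not Nick’s actual code (which, as mentioned, also emits TeX).

import random

PLACES = ["Samarkand", "Reykjavik", "Ulan Bator"]
NAMES = ["Tao", "Gang", "Mei"]
ACTIONS = ["reads an old letter", "winds a stopped clock", "stares at the ceiling"]

def paragraph(hour, minute):
    # Fill a fixed sentence template with a random place, person, and action.
    place = random.choice(PLACES)
    name = random.choice(NAMES)
    action = random.choice(ACTIONS)
    return f"It is now exactly {hour:02d}:{minute:02d} in {place}. {name} {action}."

for _ in range(3):
    print(paragraph(random.randrange(24), random.randrange(60)))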

6. catseye

Next up is catseye, who developed a story engine that also powered dariusk’s entry.

This is not catseye

catseye’s novel Dial S for Swallows discusses the interactions of two characters, an object, and a location. It reminds me a bit of the time I copy-pasted the output of a mud (a text “multi-user dungeon”, kids) repeatedly until it was novel-length. Skimming the source code, it looks like it is indeed a high-level simulator running several agents through a world representation. Another thing worth looking through. catseye also has several very interesting thoughts on the process worth further study.
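A high-level simulator in that spirit can be surprisingly small. Here is a generic Python sketch of agents wandering a world representation; it is only meant to illustrate the idea, not catseye’s actual engine.

import random

# A minimal world model: rooms contain objects, characters move and observe.
ROOMS = {"kitchen": ["a dented kettle"], "hallway": ["a grandfather clock"]}
people = {"Alice": "kitchen", "Bob": "hallway"}

def step():
    lines = []
    for person, room in people.items():
        if random.random() < 0.5:
            # Move to a random room.
            room = random.choice(list(ROOMS))
            people[person] = room
            lines.append(f"{person} walked into the {room}.")
        else:
            thing = random.choice(ROOMS[room])
            lines.append(f"{person} looked at {thing}.")
    return " ".join(lines)

for _ in range(5):
    print(step())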

7. elib

Looks like elib wrote a twitter scraper that collated all lines it found that began with “The way that…” Not bad. Works best as conceptual writing.

This is not elib
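The collation part is simple enough; here’s a generic Python sketch (elib’s real scraper pulls its lines from twitter, which I’m skipping here):

# Given a pile of scraped tweets (hard-coded stand-ins here), keep only
# the ones that start with "The way that" and stack them into one text.
tweets = [
    "The way that the bus was late again today",
    "good morning everyone",
    "The way that nobody warned me about this",
]
lines = [t for t in tweets if t.startswith("The way that")]
print("\n".join(lines))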

8. MichaelPaulukonis

Looks like MichaelPaulukonis did several NLP-based transformations on texts, including named entity recognition swaps between texts. It seems like something more is going on, but it’s not clear what; I need to look through the final text and the source code a bit more.

This is not MichaelPaulukonis
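Named-entity swaps between texts can be done with an off-the-shelf NER model. Here is a rough Python sketch using spaCy (assuming the small English model is installed); it’s my reconstruction of the general idea, not MichaelPaulukonis’s code.

import random
import spacy

nlp = spacy.load("en_core_web_sm")

def swap_people(target_text, donor_text):
    # Replace each PERSON entity in the target with a random PERSON
    # entity harvested from the donor text.
    donors = [ent.text for ent in nlp(donor_text).ents if ent.label_ == "PERSON"]
    out = target_text
    for ent in nlp(target_text).ents:
        if ent.label_ == "PERSON" and donors:
            out = out.replace(ent.text, random.choice(donors))
    return out

print(swap_people("Elizabeth walked to Netherfield with Jane.",
                  "Sherlock Holmes nodded at Watson."))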

9. ianrenton

ianrenton is apparently using a spam-generation technique called Bayesian poisoning, which adds common non-spam words to spam in an attempt to slip it past the classifier as non-spam (and to degrade the classifier’s reliability over time). It’s a great idea here, since it’s likely to add to a text the kind of words you’d expect to see there (i.e. not spam-like). ianrenton produced a text using fanfic from fanfiction.net as a basis. I haven’t looked too closely through the source code, but it seems to work by collating sentences from different stories before applying any Bayesian poisoning.

This is not ianrenton
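At its simplest, the poisoning step just sprinkles common, innocuous words into the collated sentences. A toy Python sketch (not ianrenton’s code):

import random

sentences = ["She opened the door slowly.",
             "The ship drifted toward the harbour.",
             "Nobody answered the phone."]
common_words = ["meeting", "weather", "family", "coffee", "morning"]

def poison(sentence, rate=0.3):
    # After each word, occasionally insert a random "innocent" word,
    # the way a spammer pads spam to fool a Bayesian filter.
    out = []
    for w in sentence.split():
        out.append(w)
        if random.random() < rate:
            out.append(random.choice(common_words))
    return " ".join(out)

print(" ".join(poison(s) for s in sentences))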

10. lilinx

lilinx is pulling all sentences with the word hit/hurt from a French translation of Homer’s Iliad, and reversing their order. My French isn’t very good, but it’s a great idea. Script, notes, and output.

This is not lilinx
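The extraction and reversal is easy to picture; here’s a generic Python sketch (not lilinx’s script, and using English keywords rather than the French ones for clarity):

import re

def hit_hurt_reversed(text):
    # Split into sentences, keep those containing "hit" or "hurt",
    # then reverse their order.
    sentences = re.split(r"(?<=[.!?])\s+", text)
    keepers = [s for s in sentences if re.search(r"\b(hit|hurt)\b", s, re.I)]
    return "\n".join(reversed(keepers))

sample = ("Achilles hit the shield. The sea was calm. "
          "Hector hurt his hand. The ships waited.")
print(hit_hurt_reversed(sample))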

11. ikarth

Then there’s ikarth’s “Gutenberg shuffle” approach. Described as: “Take a collection of texts from Gutenberg, generate a list of all of the names you find. Swap the names around. Shuffle the sentences. Shuffle and redeal for every paragraph.” Here’s the code, which allegedly uses OpenNLP, and the resulting novel.

This is not ikarth
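The described pipeline (find the names, swap them around, shuffle the sentences per paragraph) boils down to something like this Python sketch; the real code apparently uses OpenNLP, so this is just an illustration.

import random
import re

def gutenberg_shuffle(paragraphs, names):
    # Build a shuffled one-to-one name mapping, apply it in a single regex
    # pass (so swaps can't cascade), then shuffle sentences per paragraph.
    shuffled = names[:]
    random.shuffle(shuffled)
    mapping = dict(zip(names, shuffled))
    pattern = re.compile(r"\b(" + "|".join(map(re.escape, names)) + r")\b")
    out = []
    for para in paragraphs:
        para = pattern.sub(lambda m: mapping[m.group(0)], para)
        sentences = re.split(r"(?<=[.!?])\s+", para)
        random.shuffle(sentences)
        out.append(" ".join(sentences))
    return "\n\n".join(out)

paras = ["Emma smiled at Knightley. He said nothing.",
         "Darcy bowed. Emma looked away."]
print(gutenberg_shuffle(paras, ["Emma", "Knightley", "Darcy"]))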

12. Beyamor

Beyamor’s novel “The Lives of Writers” is made by taking bits of writers’ biographies and collating them together. Great idea, with a lot of potential. Looks like it’s written in Python.

This is not Beyamor

13. Strand

Strand’s “simple solution” is

puts "Buffalo buffalo Buffalo buffalo buffalo buffalo Buffalo buffalo. " * 6250

ah ha ha ha ha…. more conceptualism. or is it flarf? or just good ol’ fashioned lulz? I like the sentence, but… I dunno…

This is not strand

14. jonkagstrom

jonkagstrom’s Abscission is deep. Apparently, the approach is to part-of-speech tag Kafka’s Metamorphosis, then modify each paragraph according to a series of transformations directed by WordNet. Pretty awesome on a number of levels. Here’s the output and the code.

This is not jonkagstrom
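The tag-then-WordNet-substitute move can be sketched with NLTK (assuming the tokenizer, tagger, and WordNet data have been downloaded). This is my rough reconstruction of the idea, not jonkagstrom’s code, and it only touches nouns.

import random
import nltk
from nltk.corpus import wordnet as wn

def mutate_nouns(sentence):
    # POS-tag the sentence, then replace each noun with a random lemma
    # drawn from one of its WordNet synsets (when any exist).
    out = []
    for word, tag in nltk.pos_tag(nltk.word_tokenize(sentence)):
        if tag.startswith("NN"):
            synsets = wn.synsets(word, pos=wn.NOUN)
            if synsets:
                word = random.choice(random.choice(synsets).lemma_names()).replace("_", " ")
        out.append(word)
    return " ".join(out)

print(mutate_nouns("Gregor Samsa woke from troubled dreams in his bed."))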

News for Infinite Monkeys 3

June 15, 2014

I am currently working on a new scripting language for im3 which has a more BASIC-like syntax but also uses JSON objects as its base variable type. Variable declarations, math and string operators, basic math and string functions, FOR and WHILE loops, array plucking and assignment, and if/else procedures have all been added, and I think I finally have the parser working properly. SMUP was clunky and difficult to work with, so I decided to create a language that I felt was more enjoyable to play with and more powerful, functionality-wise. And importantly, I’m having a great deal of fun actually programming it.

I have also compiled my dictionary into a JSON object. It contains around 50k words, with CMU’s Pronouncing Dictionary appended, plus part-of-speech information, keyword tagging, and stress information. It can be downloaded here:

https://sourceforge.net/projects/infinitemonkeys616/files/master.dictionary.json/download

im3 now makes it easier to add words en masse and any dictionary can now be exported as a JSON object, so I’m super excited about the project.
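For anyone who wants to poke at the exported JSON from Python, loading it is straightforward. Note that the structure shown below (a dict keyed by word, with "pos" and "stress" fields) is only a guess for illustration; check the actual file for the real layout.

import json

# Load the exported dictionary (field names below are illustrative guesses).
with open("master.dictionary.json") as f:
    dictionary = json.load(f)

print(len(dictionary), "entries loaded")

entry = dictionary.get("monkey", {})
print(entry.get("pos"), entry.get("stress"))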

Also, my EEG reader from NeuroSky was delivered today; the developer’s kit is free and written in Objective-C, which I can work with. So there are plans in the works to start developing for the headset, which I think will be super fun.

J!

New Version of prosaic

June 9, 2014

prosaic has evolved from unmaintainable piles of Perl and then JavaScript to the pristine world of symbolic evaluation, with a rewrite in a relatively young Lisp-like language called Hy. Along with a cleaner and saner codebase, prosaic now has a short explanatory readme, its first documentation ever.

One can check out the new release on Github. If you’re more interested in poetry, here’s a series created using the previous version of prosaic, based on thirty-one cyberpunk novels.

Hy compiles to Python, making prosaic importable and programmatically callable from either Hy or Python. It also supports a simple command line interface:

    hy __init__.hy load some_text0.txt some_mongo_db_name
    hy __init__.hy load some_text1.txt some_mongo_db_name
    hy __init__.hy load some_text2.txt some_mongo_db_name

    hy __init__.hy create templates/haiku.json some_mongo_db_name

The above example adds three text files (presumably containing prose or poetry in English) to a MongoDB database and then asks prosaic to generate a poem using the haiku template, which looks like this:

[{"syllables":5},
 {"syllables":7},
 {"syllables":5}]

Templates support the following keys:

  • syllables: number of desired syllables for the given line
  • keyword: line that matches string keyword
  • fuzzy: line that occurs near a line with string keyword
  • rhyme: letter indicating line’s place within a rhyme scheme.

A naive sonnet might look like this:

[{"rhyme": "A", "keyword":"vamplate"},
 {"rhyme": "B", "keyword":"stiletto"},
 {"rhyme": "A"},
 {"rhyme": "B", "fuzzy": "steed"},
 {"rhyme": "C", "fuzzy": "steed"},
 {"rhyme": "D"},
 {"rhyme": "C"},
 {"rhyme": "D", "keyword": "buckler"},
 {"rhyme": "E", "fuzzy": "buckler"},
 {"rhyme": "F"},
 {"rhyme": "E"},
 {"rhyme": "F"},
 {"rhyme": "G", "syllables": 10, "keyword": "giant"},
 {"rhyme": "G", "syllables": 10, "keyword": "dragon"}]

Keys can be mixed and matched ad nauseam for a given line.
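To give a flavor of what a template drives, here is a toy Python line-picker. prosaic itself is written in Hy and pulls lines from the MongoDB corpus; this sketch just uses a crude vowel-group heuristic for syllables and ignores every other key.

import json
import re

def rough_syllables(line):
    # Crude heuristic: count groups of vowels as syllables.
    return max(1, len(re.findall(r"[aeiouy]+", line.lower())))

def fill_template(template, corpus_lines):
    poem = []
    for rule in template:
        want = rule.get("syllables")
        match = next((l for l in corpus_lines
                      if want is None or rough_syllables(l) == want), "")
        if match:
            corpus_lines.remove(match)
        poem.append(match)
    return "\n".join(poem)

haiku = json.loads('[{"syllables":5},{"syllables":7},{"syllables":5}]')
corpus = ["a cold wind passes",
          "the old dog sleeps by the door",
          "the rain ends at dawn"]
print(fill_template(haiku, corpus))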

The goal of this rewrite was to be able to run prosaic using threads to improve performance, and to allow prosaic to be embedded within other, potentially web-based, applications.

Ultimately, prosaic is still a hacked-up art project with a user base countable on one set of fingers. Don’t expect much in the way of support or usability, but do feel free to open issues on Github or send me an email if you have questions about it.

As always with open source software, patches are welcome.


my personal blog http://chiptheglasses.com has more poetry (non-cyberpunk cut-up and fully human poetry). i say inane things at @nate_smith.

Thoughts on NaNoGenMo pt. 1

June 6, 2014

Man, I’m still glum that I missed National Novel Generation Month. I mean, my state of mind was such that I couldn’ta done anything worthwhile, but, you know… Anyway, I guess I can live vicariously by looking at the different generation methods people used.

So the goal is “Spend the month of November writing code that generates a novel of 50k+ words,” and “The only rule is that you share at least one novel and also your source code at the end.” If I remember NaNoWriMo correctly, it’s not really a competition, more like a challenge to encourage people to do something, and this is more or less the same. It’s a little hard to tell how this could be judged anyway, since the books aren’t really anything you’d read through closely.

1. dariusk

Looking through the completed issues page, it looks like the first person to finish was dariusk, who organized the thing.

This is not dariusk

The book is called “Teens Wander Around a House.” (4MB pdf) The “action” was generated by a text adventure generator written by someone called catseye. dariusk used text from dreambank.net, which is apparently somewhere that people post dreams. In this case he used a bunch of descriptions of dreams by teenage girls, with some dreams by a child molester halfway through. For dialogue, he grabbed text from twitter, mapping text that followed specific question templates (such as “why do you…”) to the first turn of a dialogue pair, and text with related templates (such as “because…”) to the second turn of a dialogue pair. There are some great ideas there, especially the text sources. It’s a little unclear exactly how they are put together. The text generator seems to drown out the other two sources, but you can kind of see their effect.
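The question/answer pairing described above would look something like this Python sketch (my own reconstruction for illustration, not dariusk’s actual code, with hard-coded stand-ins in place of scraped tweets):

import random

questions = ["why do you always leave the lights on",
             "why do you even care about this"]
answers = ["because nobody ever listens",
           "because someone has to"]

def dialogue_pair():
    # Pair a "why do you..." tweet with a "because..." tweet to fake
    # a two-turn exchange.
    q = random.choice(questions)
    a = random.choice(answers)
    return f'"{q.capitalize()}?"\n"{a.capitalize()}."'

print(dialogue_pair())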

2. erkyrath

Next up is erkyrath’s offering “Redwreath and Goldstar Have Traveled to Deathsgate“, which is pure dialogue and seems to be based on some kind of in-joke related to a fantasy novel series.

This is not erkyrath

The generator looks like some kind of grammar that uses templates to recursively build a set of dialogue pairs. It definitely deserves closer study. Here are some more details.
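A recursive template grammar of that sort can be surprisingly compact. Here is a generic Python sketch (not erkyrath’s actual generator) in which nonterminals expand until only lines of dialogue remain:

import random

GRAMMAR = {
    "EXCHANGE": [["LINE", "LINE"], ["LINE", "EXCHANGE"]],
    "LINE": [['"Are we near SUBJECT yet?"'], ['"I fear SUBJECT, truly."']],
}
WORDS = {"SUBJECT": ["Deathsgate", "the river", "the mountains"]}

def expand(symbol, depth=0):
    if symbol in GRAMMAR:
        # Past the depth limit, force the first (non-recursive) production.
        options = GRAMMAR[symbol] if depth < 6 else GRAMMAR[symbol][:1]
        return "\n".join(expand(s, depth + 1) for s in random.choice(options))
    # Terminal: fill in word slots.
    line = symbol
    for slot, choices in WORDS.items():
        line = line.replace(slot, random.choice(choices))
    return line

print(expand("EXCHANGE"))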

3. juhana

Then there’s juhana’s shakestweetes, which is a set of plays with Shakespeare-like titles. Looks like the script grabs a bunch of tweets and collates them. Straightforward, but effective.

This is not juhana

Gotta wait til later to look through some more…
