
Haikuing the News

September 21, 2010

Tue September 21 16:58:23 2010

The physician who
lives in the soil made the corn
stalks puny.  As dirt.

16:59:25

Investors also
understand that land is a
sense of deference.

17:00:06

Pharos financial
group, grains, the author of the
biggest deals.  She said.

17:00:51

For some of them are
plowing their money into
parcels of silence.

17:01:36

If not, magistrates
are expected to increase
fivefold by cattle.

Text:
Google, Aggregated News, 21 Sept 2010

10 Comments
  1. eddeaddad permalink*
    September 26, 2010 1:10 pm

    More enlightening than the text it came from! The timestamps add to the effect.

    • Eric Elshtain permalink
      September 26, 2010 10:51 pm

      I’ve decided from now on to read my news only after it has been filtered through computational poetry engines…

  2. October 16, 2010 3:29 am

    I believe Gnoetry is bound to have tremendous implications for linguistics, psychology, sociology, … even religion.

    • eddeaddad permalink*
      October 16, 2010 10:54 pm

      > I believe Gnoetry is bound to have tremendous implications
      > for linguistics, psychology, sociology, … even religion.

      ooo, I like this guy! so don’t leave us hanging, tell us why / how !?

      • October 29, 2010 11:31 am

        I have read J. Trowbridge’s comments on how Gnoetry works, and Eric Elshtain’s on its implications (http://www.womenwriters.net/digitaleves/gnoetry.html). I must admit that I was really excited by the performance of Gnoetry 0.2. It makes a few syntactical mistakes (e.g. unexpected changes of subject within the same sentence) but does not make grammatical mistakes (except in complex/compound sentences, where tenses may conflict: past and present in the same sentence). But these are minor ‘glitches’ and I have concluded that they should not be frequent. Trowbridge says: “The software does not contain any a priori knowledge about grammar, and the computer has no idea about parts of speech.”
        Indeed, I have run Gnoetry 0.2 to create a dozen gnoems so far and am satisfied with its grammatical performance, though not so much with its ‘use’ of punctuation, which can be mediated by the user. The software’s grammatical reliability is the result of statistical analysis of source texts. The linguistic implication here challenges Chomsky’s idea of Universal Grammar (innate faculty / nature). In other words, it now looks awkward to accept the idea that grammar is hard-wired in the human brain. It seems to me that if anything is ‘hard-wired’ in the brain, it is not grammar but something like the statistical analysis Gnoetry performs (an unconscious process following years of practice). In this context I am more prepared to accept the Behaviorists’ interpretation of language acquisition, one which speaks of imitation and the effects of the environment (nurture). Much as I ‘hate’ to admit it, I agree with Eric [at least for now ;)] that “Any poem is the product of the re-arrangement of pre-extant human utterances; in short all poets are always borrowing from and differentiating from other human’s texts and other human’s speech, and Gnoetry is no different.”
        In view of the above, and at the risk of sounding contradictory, I believe that both nature and nurture are at work with respect not only to language but also to other human faculties. That which links the two apparently opposing mechanisms is evolution. Following years of practice, a skill vital for adaptation/survival is ultimately ingrained in the genes and passed on to descendants.
        Gnoetry 0.2 produces meaningful strings of words from observed ‘patterns of word choices’, which it reproduces according to certain criteria.
        Suppose JT and EE released Gnoetry 7.0, representing years of refinement of the software’s statistical-analysis capabilities. Suppose also that a first-time user ran it according to instructions and got a poem like E. A. Poe’s ‘A Dream Within a Dream’. Would you say that our user ‘hit the jackpot’? I doubt it, considering the performance of Gnoetry 0.2. You might say, ‘Well, that’s all hypothetical’, or that ‘Gnoetry 2.0 is a fat chance, let alone version 7.0’. You may be right, but Gnoetry opens up new horizons, first by redefining the terms ‘poet’, ‘author’ and ‘maker’. ‘Who is the poet?’ is now a legitimate question. It should not be difficult to imagine that discussions about the legitimacy of the title ‘author’ or ‘poet’ with respect to anyone who claims it might be extended to apply to the idea of God, raising questions like ‘Did God create the universe, or did a machine?’ or ‘Did God create the universe out of nothing, or did He use some pre-extant matter and the help of some machine?’ Such questions, however, may be taken to undermine established ideas about God (who is considered the ‘poet’ or ‘maker’ or ‘author’ of the universe, etc.). In that case we are going to attract the fire of deists, but I will not pursue it further.
        Another thing is that Gnoetry functions like an oracle: the fact that it de-contextualizes words gives readers the opportunity to look for meaning elsewhere (where perhaps there is none) or forces them to reconsider their perception of the world in search of alternative meanings (indeed a journey into the unknown).
        Closing this explanation, I should say a big thank-you to J. Trowbridge and Eric Elshtain for making Gnoetry possible. It is a big inspiration for me (like the starship Enterprise). I’m looking forward to future versions.

  3. eddeaddad permalink*
    October 30, 2010 12:29 am

    Mr. Trialonis (may I call you George?), I nominate the above for comment of the year. Let me see if I can engage with parts of it:

    First, let’s distinguish between 1) computer-generated poetry, 2) n-gram generation, and 3) interactive poetry generation.
    1) Computer-generated poetry has been going on for a while; Chris Funkhouser is probably the best cataloger of this; see his book “Prehistoric Digital Poetry” (which I haven’t tracked down myself yet!) and this timeline:
    http://web.njit.edu/~funkhous/2003/brasil/creativetime.html
    So: there are a variety of ways you can use computers to generate poetry.
    2) n-gram generation (building an n-gram language model is the “statistical analysis” that Gnoetry does) has been going on since at least 1972; see the MIT AI lab’s “HAKMEM” item described here:
    http://en.wikipedia.org/wiki/Dissociated_press
    What is n-gram generation? I tried to write an intuitive overview here (and I’ll paste a toy code sketch at the end of this list):
    http://netpoetic.com/2010/06/computer-science-for-poets-n-gram-language-models/
    When you’re building a language model, n-grams are only one of many possible options. For example, you could build Part-Of-Speech templates to generate from (there’s a toy slot-filler sketch further down):
    https://gnoetrydaily.wordpress.com/2010/08/23/break-bear-presenteth-pos-sonnet-line-templates/
    or build a statistical grammar using a tool such as the Stanford Parser:
    http://nlp.stanford.edu/software/lex-parser.shtml
    and generate from that.
    3) There are a variety of ways that humans can use interactive poetry generation tools to generate poetry. If you look at some of the most commonly used generators, described here:
    http://netpoetic.com/2010/10/interactive-poetry-generation-systems-an-illustrated-overview/
    you’ll see that I split them up into categories based on whether the computer is doing most of the work, or the human doing most of the work, or whether they are working together collaboratively in some constrained way.
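
    To make the “statistical analysis” in (2) concrete, here’s a toy bigram generator in Python. To be clear, this is not Gnoetry’s actual code, just a minimal sketch of the idea, with a made-up two-clause source string: tally which words follow which in the source, then take a weighted random walk through the tallies.

    ```python
    import random
    from collections import defaultdict

    def build_bigram_model(text):
        """For each word, record every word that follows it in the source."""
        words = text.split()
        model = defaultdict(list)
        for current, nxt in zip(words, words[1:]):
            model[current].append(nxt)
        return model

    def generate(model, seed, max_words=12):
        """Random-walk the follower lists, starting from a seed word."""
        out = [seed]
        while len(out) < max_words:
            followers = model.get(out[-1])
            if not followers:  # dead end: we hit the text's final word
                break
            out.append(random.choice(followers))
        return " ".join(out)

    # Invented miniature source, echoing the post above.
    source = ("investors are plowing their money into parcels of land "
              "and investors understand that land is a store of value")
    model = build_bigram_model(source)
    print(generate(model, "investors"))
    # e.g. "investors understand that land is a store of value"
    ```

    Because every output word was actually observed following its predecessor somewhere in the source, local syntax comes along for free; that is the whole trick of an n-gram model.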

    So: Gnoetry is part of the poetry-generation tradition described in (1), and uses n-gram language models as described in (2). The innovation of Gnoetry in terms of (3) is in the way it constrains the human-computer interaction, allowing the human to improve upon the poem (within the constraints of the n-gram language model) that was randomly generated (from the constraints of the n-gram language model).
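
    And since I mentioned Part-Of-Speech templates in (2): here’s the same kind of toy, a slot-filler over a hand-tagged lexicon. The lexicon and template are invented for the example; a real system would derive them by POS-tagging a source text.

    ```python
    import random

    # Hand-tagged toy lexicon (invented for the example); a real
    # system would build this by POS-tagging a source text.
    lexicon = {
        "DET":  ["the", "a", "some"],
        "ADJ":  ["puny", "silent", "biggest"],
        "NOUN": ["land", "corn", "money", "silence"],
        "VERB": ["plows", "understands", "increases"],
    }

    # A line template is a sequence of POS slots; fill each at random.
    template = ["DET", "ADJ", "NOUN", "VERB", "DET", "NOUN"]
    print(" ".join(random.choice(lexicon[tag]) for tag in template))
    # e.g. "the puny corn understands some silence"
    ```

    The templates themselves can be harvested from lines you like, which is the idea behind the POS sonnet line templates linked above.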

    gotta go, more later!

  4. eddeaddad permalink*
    October 30, 2010 2:44 am

    Second, regarding the syntactical performance of Gnoetry: if you look at exactly what an n-gram language model is doing, you’ll see that it gets its coherence from the authoring of the source text: the author guaranteed that adjacent words are syntactically coherent (unless it’s a Dada source text or something…). As you point out, this breaks down in longer sentences. However, this can be used for interesting effects; in eRoGK7’s “A Vibrator on the Inner History of Satisfaction”, the segment that contains the title sentence is interesting (to me, at least) precisely because it parses in such an unusual way. And because a human is involved, they can tailor the output to their needs.

    I’m not sure exactly how to quantify the syntactic coherence of an n-gram model. I’m guessing you could model it mathematically, and it’s possible someone has already done it – it probably depends on the value of n, the source text, the grammar to which you’re comparing the output, and more. You could explore it empirically by generating a number of n-gram sentences and rating how grammatical you judge them to be, but that seems like a waste of time if it can be done mathematically. I have to admit that I’m not familiar enough with the literature and methodology of Markov models to do such an analysis myself, though.
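
    One crude, self-contained empirical check, reusing the toy bigram setup from my comment above (the miniature source text is again invented for the example): every adjacent pair in the output occurs verbatim in the source by construction, while three-word spans can be novel recombinations, and those recombination points are exactly where tense agreement and the like can break.

    ```python
    import random
    from collections import defaultdict

    # Invented miniature source; a few clauses standing in for a corpus.
    source = ("the land is a store of value . investors are plowing "
              "their money into parcels of land . the corn is puny")
    words = source.split()

    # Bigram model: every word that follows each word in the source.
    model = defaultdict(list)
    for cur, nxt in zip(words, words[1:]):
        model[cur].append(nxt)

    # Generate a short random walk.
    out = ["investors"]
    while len(out) < 12 and model.get(out[-1]):
        out.append(random.choice(model[out[-1]]))

    # Local coherence for free: every adjacent output pair is in the source.
    assert set(zip(out, out[1:])) <= set(zip(words, words[1:]))

    # But longer spans need not be: count output trigrams that never
    # occur contiguously in the source.
    src_tris = set(zip(words, words[1:], words[2:]))
    out_tris = set(zip(out, out[1:], out[2:]))
    print(len(out_tris - src_tris), "novel trigrams in the output")
    ```

    Scaling n up buys fluency at the cost of novelty; quantifying that tradeoff properly is the mathematical question, and this is just the duct-tape check.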

  5. eddeaddad permalink*
    October 30, 2010 3:18 am

    As far as the nature of human grammatical processing… This isn’t really my area, but I spent a couple days reading up on neurolinguistics after a computational neuroscience class a couple years ago, and my understanding is that the human brain’s language capabilities were adapted from cognitive skills that evolved to solve non-linguistic problems. For example, pattern-matching, identifying sequences, generalization, etc. evolved to assist humans as hunter-gatherers. As humans identified the need to use language, they adapted these cognitive abilities to the task. So in this context, it seems likely that when human children learn language, they are learning how to adapt the cognitive abilities they have inherited from millennia of evolution. I have to admit I’m not terribly familiar with Chomsky and the Behaviorists, so I’m not sure how this approach compares to their ideas.

    So I guess I agree that when humans learn grammar, it’s largely a matter of training cognitive systems, which is analogous in some ways to building a language model for generation. But my additional hypothesis is that societal and symbol grounding also play a large part in meaningful language use, which is why language learning is a heavily supervised event… but that’s another conversation…

    (btw, Chris Westbury, who developed JanusNode, is a neuroscientist; maybe we oughta ask him!)

  6. eddeaddad permalink*
    October 30, 2010 3:52 am

    Actually, back to societal grounding for a second… when you quote Elshtain as saying: “Any poem is the product of the re-arrangement of pre-extant human utterances; in short all poets are always borrowing from and differentiating from other human’s texts and other human’s speech…” I guess I don’t disagree with that, but I think the situation is a bit more complex. Researchers from various disciplines are starting to realize the importance of modeling how language-users coordinate their language use: Natural Language Processing researchers use the term “semantic alignment”, psycholinguists use the term “grounding,” there’s a related term in philosophy that I’m forgetting at the moment, and I think there’s a concept in linguistics too, but the term I like best is from AI: “societal grounding.” Basically, the question is: in what ways do you need to coordinate your language use with other language users, for that language to be inherently meaningful? Because this constrains the types of language you can use, the types of concepts you can model, and the ways you can refer to objects and events in the world.

    Anyway, that’s all I got to say for now. I think we agree that computer poetry highlights a lot of interesting questions, and it’s a lot of fun to think them through. Later!

  7. October 31, 2010 1:18 pm

    eddeaddad (easy to spell your name: you see my father is dead – I can remember that). Yes, I ought to thank you for the very interesting information you offer above in connection with my earlier ‘layman’s’ reply. Yes, please call me George; you will be doing me a favor.
    It seems to me that Gnoetry is perhaps the most recent descendant of the Dadaist tradition. I understand there are similar efforts to bring together mathematics and poetry. Computer-generated poetry may be a serious step towards a major paradigm shift, mainly in the direction of language and meaning. This is a very interesting area of which I have very little knowledge.
    I intend to follow up the information posted here and the links you offer in the first part of your reply.
    Thank you very much.
    George
