Creativity Support for Computational Literature
By Daniel C. Howe
A dissertation submitted in partial fulfillment of the requirements for the degree ...
Otherwise, whatever gets into the poetry is determined by the generative method & lies there waiting in the source texts. [Mac Low 1998]

4.4.9 Charles O. Hartman

Like many others of his generation, poet and digital literary theorist Charles O. Hartman was very much influenced by the work of Cage and Mac Low. Writing in Virtual Muse: Experiments in Computer Poetry, he devotes almost an entire chapter to the relationship of chance, randomness, and digital literature. He begins the third chapter by reminding us that “one of the Greek oracles, the sibyl at Cumae, used to write the separate words of her prophecies on leaves and then fling them out of the mouth of her cave. It was up to the supplicants to gather the leaves and make what order they could.” He compares this with his early poetic experiment for the Sinclair ZX81, a BASIC program called RanLines that stored 20 lines in an internal array and then retrieved one randomly each time the user pressed a key [Hartman 1996].
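RanLines is simple enough to sketch in a few lines of modern code. The Python fragment below mirrors the behavior described: hold twenty lines in an array and return one at random on each keypress. The stored lines here are placeholders, not Hartman's originals.

```python
import random

# Placeholder lines standing in for Hartman's twenty stored lines.
LINES = [f"line {i} of the poem" for i in range(1, 21)]

def ranlines(rng=random):
    """Return one stored line at random, as RanLines did per keypress."""
    return rng.choice(LINES)
```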
One of the unique contributions of Virtual Muse, a required text in ‘Programming for Digital Art and Literature’, is its in-depth discussion of n-gram-based generation, which Hartman used extensively in his work Monologues of Soul and Body. The project was conceived during Hartman’s experiments with the code for Travesty, the n-gram-based generator originally published by Kenner and O’Rourke in Byte magazine. He says, “Here is language creating itself out of nothing, out of mere statistical noise. As we raise n, we can watch sense evolve and meaning stagger up onto its own miraculous feet.” [Hartman 1996] Interestingly, at the time of Hartman’s experiments with Travesty, he was also working on a poem that took Alan Turing as a subject, specifically the famous ‘Turing test’ for machine intelligence. Hartman took the poem he had written and ran it through his version of Travesty at eight different chain lengths: n=2 through n=9. The results proved evocative of his themes, as the computerized n-gram process appeared to build a sort of sense that wove through his input texts, which addressed human and computerized sensemaking. But rather than simply use the n-gram outputs, he created a dialogue between his traditionally authored text and the Travesty-generated text, the soul and the body of the title. As Hartman puts it, “In the computer output I saw the body constructing itself out of the material of soul, working step by step back to articulation and coherence. It's a very Idealist poem, and at the same time very Cartesian, and perhaps monstrous.” [Hartman 1996] Or, as Funkhouser puts it, “As the poem progresses, and the ‘body’ text is less abstract, the author succeeds in creating parallel monologues in which one (‘body’) borrows from the other.” Again we find a multi-layered text emerging from the writer’s engagement with machine processes in Hartman’s monologues.
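The Travesty idea can be sketched as a character-level n-gram generator: record, for every (n-1)-character context in the source, which characters follow it, then walk forward choosing among those recorded continuations. The following is a minimal Python rendering of that process, not Kenner and O'Rourke's published code; the restart-on-dead-end behavior is an assumption of this sketch.

```python
import random
from collections import defaultdict

def travesty(source, n=3, length=200, seed=None):
    """Character-level n-gram generation in the spirit of Travesty:
    every n-character window of the output also occurs in the source
    (except across a restart after a dead end)."""
    rng = random.Random(seed)
    table = defaultdict(list)          # (n-1)-char context -> next chars
    for i in range(len(source) - n + 1):
        table[source[i:i + n - 1]].append(source[i + n - 1])
    context = source[:n - 1]           # seed with the source's opening
    out = list(context)
    while len(out) < length:
        nexts = table.get(context)
        if not nexts:                  # dead end: restart from the opening
            context = source[:n - 1]
            out.extend(context)
            continue
        ch = rng.choice(nexts)
        out.append(ch)
        context = context[1:] + ch     # slide the context window
    return "".join(out)
```

Raising n tightens the output toward verbatim stretches of the source, which is exactly the "sense evolving" effect Hartman describes across his n=2 through n=9 runs.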
There is Markov’s original formulation of the n-gram process; Shannon’s use (three decades later) of this abstract process for text generation; Bennett, Hayes, Kenner, and O'Rourke demonstrating (later still) that Shannon’s linguistic operations could create intriguing results with literary texts; Hartman's decision to use this literary operation in his own work; and the innovations in implementation by Bennett, Hayes, Kenner and O'Rourke, and Hartman himself. Further, we have Hartman’s decisions concerning the input texts, not to mention the lines he wrote himself, and the compositional attention paid to how these were combined into the final piece. More recently, we see a range of practicing artists (John Cayley, for instance, who uses word-level bi-grams that he calls “collocations”) exerting influence on a new generation: the series of student and artist projects using the word- and sentence-level n-gram facilities built into the RiTa toolkit (see the RiTa gallery for a number of examples of such work).
This quotation, from Tristan Tzara’s 1920 “Manifesto on feeble love and bitter love” [Motherwell 1981], is one of the most commonly reprinted texts from the Dada movement.
While often associated with nihilism and portrayed as an ‘anti-art’ movement, Dada’s most significant area of artistic innovation may have been the creation of procedures like the one above [Wardrip-Fruin 2006]. In fact, as with many examples of generative art, an understanding of Dada involves, as Wardrip-Fruin argues, a reading of the processes employed rather than of individual works, perhaps even independently of any examples of work at all.
An example of contemporary algorithmic implementations of Dada processes is Florian Cramer’s 1998 reimplementation of Tzara’s newspaper poem process as a web-based CGI script. The resulting web page performs a computationally-implemented version of Tzara's process (without a sack, a hand, or paper scraps of differing sizes) and allows the page's visitor to choose the text of a newspaper (from a pull-down menu), the text of a particular web page (there is a form element for entering page addresses), or any body of text the visitor may write or paste in (there is a form element for entering text) [Wardrip-Fruin 2006].
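Tzara's procedure is trivially mechanizable, which is part of what Cramer's page demonstrates. A minimal Python sketch (not Cramer's CGI script, whose source is not reproduced here) cuts a text into words and draws them out in random order:

```python
import random

def tzara_poem(article, seed=None):
    """Cut an article into words, 'shake the bag', and draw the words
    out in random order -- a computational stand-in for Tzara's
    newspaper, scissors, and sack."""
    words = article.split()
    rng = random.Random(seed)
    rng.shuffle(words)
    return " ".join(words)
```

As in Tzara's original instructions, every word of the source survives into the poem; only the order is surrendered to chance.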
4.4.11 The Oulipo

Aphorismes, by Marcel Bénabou, appeared in “Syntexts” in 1977. This generator, written in the APL language by Kenneth Iverson, produces twenty-five aphorisms at a time in French, intended to reflect some version of profound insight. The program features a number of different slotted configurations such as, “X is in Y, not Z,” “A delivers B but C will deliver us from D,” “Q is the continuation of R by other means,” etc. A sample activation of the program in Funkhouser generates the following outputs:
Beauty is the continuation of patience by other means.
Hatred of ignorance is no other than the love of the rhythm.
Science delivers evil, but what will deliver us from the present?
The programming reflects tendencies that have existed since the outset of text-generation, though the output, due to the aptitude and choices of the programmer, also reflects a more complex effort in programming than found in many works. Beyond formulating the equations, the author must select appropriate materials to fill the slots. In a case such as “9” above, the closely connected variables call for setting up a range of language that will juxtapose effectively; the same principle is true, but less direct, in equations with more variables. The phrases are clear, grammatical aphorisms made with poetic language… Bénabou’s construction uses a finite amount of programming code to write endless aphorisms [Funkhouser 2007].
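The slotted configurations Funkhouser describes amount to sentence templates whose variables are filled from word lists. The Python sketch below illustrates only the general mechanism; the templates echo the examples above, but the filler vocabulary is invented for illustration and is not Bénabou's (whose program was in APL and in French).

```python
import random

# Illustrative fillers only; Benabou's actual French word lists
# are not reproduced here.
NOUNS = ["beauty", "patience", "science", "evil", "ignorance", "rhythm"]

TEMPLATES = [
    "{x} is in {y}, not {z}.",
    "{x} delivers {y}, but what will deliver us from {z}?",
    "{x} is the continuation of {y} by other means.",
]

def aphorism(rng=random):
    """Fill one randomly chosen template with distinct random fillers."""
    x, y, z = rng.sample(NOUNS, 3)
    return rng.choice(TEMPLATES).format(x=x, y=y, z=z).capitalize()
```

The finite code / endless output property Funkhouser notes falls directly out of the combinatorics: even this toy version yields 3 templates x 6·5·4 filler orderings, and each addition to either list multiplies the space.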
Bénabou was an original member of the important Oulipo group, and his precise formulation of the permutations to be performed is characteristic of their work. The Oulipo group (Ouvroir de Littérature Potentielle, or Workshop for Potential Literature) represents perhaps the most direct precursor to the work presented here. Although much of the group’s work did not directly involve computers (this was later taken up by a splinter group called the Alamo), their focus on procedures, constraints, and transformation in the service of experimental literature directly informed much of the computational work to come.
Founded in 1960 by Raymond Queneau and François Le Lionnais, the group came eventually to include an international roster of well-known writers and mathematicians, including Jean Lescure, Marcel Bénabou, Harry Mathews, and Italo Calvino, each of whom is introduced briefly below. For a full treatment of the group and their continuing work, there are a number of full-length books devoted to the subject [Lescure 1986; Motte 1986; Mathews and Brotchie 1998].
Unlike traditional writers (note the word potentielle, or ‘potential’, in the group’s title), Oulipans have been as concerned with abstract forms (often taking the shape of constraints on a text) as with instantiations of those forms. The Oulipan position is that all writing is constrained writing, but most writers work within constraints that are either a) traditional (e.g., the sonnet), b) so ingrained as to be almost invisible (e.g., the 'realist' novel), or c), perhaps worst, unknown to the writer (e.g., automatic writing of the type performed by the Surrealists) [Wardrip-Fruin 2006]. A primary aim of the Oulipo is to supplement or replace these potentially ‘hidden’ constraints with two types of new constraints: on one hand, the traditional constraints (e.g., poetic forms), once largely forgotten, which they hope to help revive; and on the other, constraints new to literature, many adopted from mathematics (and mathematical games) [Wardrip-Fruin 2006]. Georges Perec was an Oulipan who excelled at the use of both sorts of constraints; for example, writing a novel without the letter “e” (the “lipogram” is a constraint with a long tradition) or one structured by innovative application of “the Graeco-Latin bi-square, the Knight's Tour, and a permuting schedule of obligations” [Mathews and Brotchie 1998].
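Viewed computationally, a constraint in the Oulipan sense is a predicate over texts: a function that reports whether a given text satisfies the form, independently of any particular text that does. As a minimal illustration (a sketch, not any Oulipan's code), Perec's lipogram reduces to a one-line check:

```python
def satisfies_lipogram(text, banned="e"):
    """True if the text observes the lipogram: the banned letter never appears."""
    return banned.lower() not in text.lower()
```

Perec's La Disparition holds this predicate true for the letter “e” across an entire novel.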
Further, the ‘potentielle’ of the Oulipo highlights its relation to computational literature in general and more specifically to the work presented here. The group has insisted, since its beginning, on the distinction between 'created creations' (créations créées) and 'creations that create' (créations créantes), focusing their attention on the latter, as does RiTa.
Oulipan artists have been concerned not with literary works themselves, but with the
procedures and structures capable of producing them. As Wardrip-Fruin writes:
The Oulipo clearly has [procedures and structures] at the heart of its efforts. But this has not always been understood. After all, literary groups, of which the Oulipo is certainly one, generally produce texts, rather than structures and processes for others to use in creating texts. And certainly members of the Oulipo have produced remarkable texts using Oulipan procedures, as the novels of Perec, Calvino, and Mathews attest. But these procedures have also been used for literary works by non-Oulipans. Just as Perec made masterful use of the lipogram, so Gilbert Sorrentino, Christopher Middleton, and others have made remarkable use of 'N + 7.' But we must not, when impressed by these examples of procedures in use, allow this to cloud our vision of Oulipan potential. As Oulipo scholar Mark Wolff puts it, 'Writing is a derivative activity: the Oulipo pursue what we might call speculative or theoretical literature and leave the application of the constraints to practitioners who may (or may not) find their procedures useful'.
In addition to their work on constraints for writing, there are also two other types of Oulipan proposals that Wardrip-Fruin notes. One type is formal procedures for the transformation of text via substitution, reduction, and permutation (either of one's own text or of found text). The most famous Oulipan procedure of this sort, 'N + 7,' involves substituting all the nouns in a text with the noun found seven dictionary entries later. (This technique is employed in a number of RiTa components, including the RiLiPo object, which provides implementations of a range of Oulipian procedures like N+7, using the RiTa Lexicon as a “dictionary”.) Another is called the “chimera”, wherein one produces a new text from four source texts. From the first text the nouns, verbs, and adjectives are removed. These are then replaced with the nouns taken in order from the second text, the verbs from the third text, and the adjectives from the fourth text. Similarly, “definitional literature” replaces a text’s verbs, nouns, adjectives, and adverbs with their dictionary definitions, and can be applied recursively (replacing those words with their definitions, and so on). Interestingly, unlike Dada, Surrealism, or most other art movements of the 20th century, the Oulipo is one of the few that continues to practice to this day, nearly 50 years after its founding.
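As a concrete illustration, 'N + 7' can be sketched as a lookup in a sorted word list. This is a sketch only, not RiLiPo's implementation: the tiny dictionary and the explicit noun set below are invented for the example, where a real implementation would draw on a full lexicon and part-of-speech information to find the nouns.

```python
def n_plus_7(text, dictionary, nouns, n=7):
    """Replace each word marked as a noun with the entry n places later
    in the sorted (wrapping) dictionary -- a sketch of Oulipan 'N + 7'."""
    words = sorted(dictionary)
    index = {w: i for i, w in enumerate(words)}
    out = []
    for token in text.split():
        if token in nouns and token in index:
            out.append(words[(index[token] + n) % len(words)])
        else:
            out.append(token)
    return " ".join(out)
```

The character of the output depends heavily on the dictionary chosen, which is why the same procedure yields very different texts in different hands.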
Raymond Queneau, one of the two Oulipan founders, contributed a number of important works to the Oulipan canon. He is also known for his characterization of the Oulipo as a group of “rats who build the labyrinth from which they plan to escape” [Mathews and Brotchie 1998]. One of his earliest works, a quintessential recombinant text, is his Cent mille milliards de poèmes (CMMP), or One Hundred Thousand Billion Poems (1961), which consists of ten 14-line sonnets. Due to the unique construction of the text (each poem is set on a page cut into fourteen strips that can be turned individually), the reader can construct alternate poems by reading the first line of any of the original sonnets, followed by the second line of any other sonnet, followed by the third line of any other, and so on. The work is constructed so that any reading of this sort produces a sonnet that functions syntactically, semantically, metrically, and in its rhyme scheme. The process of creating unique poems according to this procedure exposes a vast number of possibilities to the reader.
When choosing which first line to read, there are ten possibilities. Next, one has ten choices for the second line, giving one hundred (10 * 10) possibilities for the first two lines. After reading a second line, one chooses from the ten third lines, giving a thousand (100 * 10) total possibilities for the first three lines, and so on. (The ‘chimera’ and several other related constraints are presented and described further in the Oulipo Compendium [Mathews and Brotchie 1998].)
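Continuing through all fourteen positions, ten independent choices at each position yield 10^14 readings, Queneau's "hundred thousand billion." A minimal sketch of the arithmetic and of assembling one such reading (the sonnet texts here are placeholders):

```python
import random

def cmmp_reading(sonnets, seed=None):
    """Assemble one reading of CMMP: at each line position, take that
    line from any one of the ten sonnets."""
    rng = random.Random(seed)
    return [rng.choice(sonnets)[i] for i in range(len(sonnets[0]))]

# Ten choices at each of the 14 line positions:
TOTAL_READINGS = 10 ** 14   # one hundred thousand billion
```

Queneau estimated that reading the full space of poems would take far longer than a human lifetime, which is precisely the sense in which the work is 'potential' rather than instantiated.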