
This presents an interesting tradeoff between the two paradigms. With statistical models, relatively little prior knowledge of the genre is required in the creation of a model, while a “deep” understanding of the structure of the texts in question is generally needed to construct a convincing grammar for it. Statistical models tend to work best with large amounts of textual input, as the statistical properties grow stronger (to a point) with a larger input set. Grammars, on the other hand, can be represented compactly and require no external input set for operation, a fact that recommends them for web-based projects, at least prior to the addition of the RiTaServer module. Further, the two approaches differ in that, after creating a “successful” grammar (according to whatever definition one might use), one is often in a position to generalize about the genre being examined. If the generated outputs are representative of the genre, then the grammar has generally captured some, if not all, of the relevant rules on which that genre operates. On the other hand, the success of a statistical model tends to tell us more about the particular statistical approach (and possibly the chosen input texts) than anything about the genre in question.

From a pedagogical perspective this is an important distinction. While grammars may require more knowledge and more effort to create, they tend to result in more post-process knowledge about the type of language under examination. Statistical models, on the other hand, require less overhead, tend to generate more “surprising” results, and teach students more about the “process” (statistical analysis) than the “product” text, which may or may not represent some existing genre. In the context of PDAL and RiTa, both approaches are important, and further, they shed light upon one another, especially when presented back-to-back, as was generally the case in the class. Lastly, they served as a conceptual link into a deeper understanding of the history of artificial intelligence, illustrating the historical distinction between “neat” and “scruffy” approaches [Wardrip-Fruin 2006] that has long been a subject of debate in the field. Of course the ideal approach is often, depending on the project, a mix of the neat and the scruffy. This avenue is made easily available through the RiTa tools, as when, for example, grammars make calls to statistical models to generate text items that obey both the grammatical specification and the statistical distributions of the input.
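
As a concrete (if greatly simplified) illustration of this mix, the following sketch combines a hand-written grammar, which supplies the sentence structure, with word choice delegated to frequency-weighted pools derived from an input text. It is a minimal plain-Java sketch: the grammar format, rule names, and word pools are invented for illustration and do not reproduce the RiTa grammar or RiMarkov APIs.

    import java.util.*;

    // A toy hybrid generator: a hand-written grammar supplies the sentence
    // structure (the "neat" part), while terminal words are sampled from
    // frequency-weighted pools gathered from an input text (the "scruffy"
    // part, standing in for a full statistical model).
    public class HybridSketch {

        static final Map<String, String[]> RULES = new HashMap<>();
        static final Map<String, List<String>> POOLS = new HashMap<>();
        static final Random RAND = new Random();

        static {
            // grammar: non-terminals are angle-bracketed
            RULES.put("<s>",  new String[] { "<np> <vp>" });
            RULES.put("<np>", new String[] { "the <noun>", "a <noun>" });
            RULES.put("<vp>", new String[] { "<verb> <np>" });

            // word pools; duplicates encode corpus frequency, so "cat"
            // is sampled more often than "garden"
            POOLS.put("<noun>", Arrays.asList("cat", "cat", "cat", "bird", "garden"));
            POOLS.put("<verb>", Arrays.asList("saw", "saw", "watched"));
        }

        static String expand(String symbol) {
            if (POOLS.containsKey(symbol)) {               // statistical choice
                List<String> pool = POOLS.get(symbol);
                return pool.get(RAND.nextInt(pool.size()));
            }
            if (!RULES.containsKey(symbol))                // literal token
                return symbol;
            String[] options = RULES.get(symbol);          // grammatical choice
            String choice = options[RAND.nextInt(options.length)];
            StringBuilder out = new StringBuilder();
            for (String tok : choice.split(" "))
                out.append(expand(tok)).append(' ');
            return out.toString().trim();
        }

        public static void main(String[] args) {
            for (int i = 0; i < 3; i++)
                System.out.println(expand("<s>") + ".");
        }
    }

In an actual project, the word pools would of course not be hard-coded but filled by querying a statistical object built over the input text, which is precisely the kind of call from grammar to model described above.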

While PDAL students were required to create a project employing some type of probabilistic language model, it was left to each student to decide what type of model to use.

The RiTa toolkit provides a range of objects that leverage probabilistic techniques, including n-gram-based generators (RiMarkov), Keyword-In-Context models (RiKWICker), and “maximum entropy” parsers and taggers (RiPosTagger, RiChunker, and RiParser). However, as n-grams featured prominently in one of the course texts, Charles Hartman’s Virtual Muse, they (with support via the RiMarkov object) were chosen by a majority of students for their projects. As was customary, the topic was presented with a range of reading, coding, writing, and critiquing exercises. The primary readings for this section were Hartman’s Virtual Muse and Eric Elshtain’s writings on the Gnoetry engine [Elshtain 2006]. One of the unique contributions of the former text is its rigorous discussion of n-gram-based generation in a literary context, a technique Hartman used extensively in his work Monologues of Soul and Body.
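
For readers unfamiliar with the Keyword-In-Context idea mentioned above, the following standalone sketch illustrates the basic technique: each occurrence of a query word is returned together with a small window of surrounding words. The code is plain Java written for this discussion, with an arbitrary input string, and does not show the RiKWICker interface itself.

    import java.util.*;

    // A minimal Keyword-In-Context (KWIC) index: for a query word, return
    // each occurrence together with a window of surrounding words.
    public class KwicSketch {

        public static void main(String[] args) {
            String text = "the cat sat on the mat while the dog watched the cat";
            String[] words = text.split("\\s+");
            for (String line : concordance(words, "cat", 2))
                System.out.println(line);
        }

        // Collect a window of `radius` words on either side of each match.
        static List<String> concordance(String[] words, String keyword, int radius) {
            List<String> results = new ArrayList<>();
            for (int i = 0; i < words.length; i++) {
                if (!words[i].equalsIgnoreCase(keyword)) continue;
                int start = Math.max(0, i - radius);
                int end = Math.min(words.length, i + radius + 1);
                results.add(String.join(" ",
                    Arrays.copyOfRange(words, start, end)));
            }
            return results;
        }
    }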

Elshtain’s text presents an interesting set of extensions to the basic n-gram technique.

In addition to this reading, several artworks employing n-grams were presented and critiqued, including ‘Talking Cure’ [Castiglia et al. 2002] and several of the poetic texts generated using Elshtain’s Gnoetry engine (for more information, see http://www.beardofbees.com/gnoetry.html). The introduction began with a very simple sketch (see Appendix: Examples) that generated new texts from a combined set of Wittgenstein and Kafka pieces, allowing users to interactively experiment with different n values and immediately see the effect on the output. In response to questions concerning the workings of the program, a very basic introduction/review of probability was presented, and students were directed to create a similar “mash-up” of their own to be performed in the subsequent class.
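
The classroom sketch itself was written with RiTa’s RiMarkov object inside Processing; the following standalone plain-Java version, with two short stand-in snippets in place of the actual source texts, illustrates only the underlying n-gram idea: contexts of n-1 words are mapped to the words observed after them, and generation repeatedly samples a continuation in proportion to its observed frequency.

    import java.util.*;

    // A minimal word-level n-gram "mash-up" over two combined input texts.
    public class MashupSketch {

        public static void main(String[] args) {
            int n = 3;  // the n-gram order; try different values (n >= 2)
            String text = WITTGENSTEIN + " " + KAFKA;   // combined input
            String[] words = text.toLowerCase().split("\\s+");

            // map each (n-1)-word context to the words observed after it
            Map<String, List<String>> model = new HashMap<>();
            for (int i = 0; i + n - 1 < words.length; i++) {
                String context = String.join(" ",
                    Arrays.copyOfRange(words, i, i + n - 1));
                model.computeIfAbsent(context, k -> new ArrayList<>())
                     .add(words[i + n - 1]);
            }

            // generate: start from a random context, then repeatedly sample
            // a continuation in proportion to its observed frequency
            Random rand = new Random();
            List<String> contexts = new ArrayList<>(model.keySet());
            String context = contexts.get(rand.nextInt(contexts.size()));
            StringBuilder out = new StringBuilder(context);
            for (int i = 0; i < 40; i++) {
                List<String> nexts = model.get(context);
                if (nexts == null) break;                 // dead end: stop
                String next = nexts.get(rand.nextInt(nexts.size()));
                out.append(' ').append(next);
                String[] ctx = context.split(" ");
                context = (String.join(" ",
                    Arrays.copyOfRange(ctx, 1, ctx.length)) + " " + next).trim();
            }
            System.out.println(out);
        }

        // stand-in snippets; the class sketch used much longer source texts
        static final String WITTGENSTEIN =
            "the limits of my language mean the limits of my world";
        static final String KAFKA =
            "a cage went in search of a bird";
    }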

In the following class, students presented their programs to the class for critique. A discussion ensued about limitations of the approach, and additional features of the RiTa tools were presented for those who had not discovered them on their own, either via the examples or the documentation. These included weighting of inputs, constraints on repetition, custom tokenization, feature compression (case, synonyms, etc.), and the literary extension methods discussed in the technical section (e.g., getCompletions(), getProbabilities(), and getProbabilityMap()), which allow for some degree of interactive control of the model during generation. In addition, several hybrid approaches were presented, including the use of RiMarkov on other features (provided by the RiAnalyzer) such as part-of-speech. Another approach was the combined use of a grammar for higher-level structures (e.g., section, paragraph, or even sentence) with statistical models for lower-level tasks, such as word-selection and semantic consistency. Several interesting (and publishable) projects resulted from this set of work, including a full-scale dramatic play generator complete with lighting and stage directions (see Appendix: Student Project Gallery).





Concurrently, students were given a coding assignment to build a letter-level concordance for an input text (presented alternatively as a unigram model, or an n-gram model with n = 1). This assignment led naturally into a first lesson on data structures, as students quickly realized that to store the required information, some type of dictionary-like structure would be needed. In this dictionary, given some “key” (often a single letter), one could obtain the number of times it appeared in the input without scanning the input each time. The pros and cons of various approaches were discussed, from Lists, to Hashtables, to Arrays indexed on character code, in terms of both efficiency and storage space. By the end of the session, most students seemed comfortable with the assignment itself and, more importantly, with evaluating (at least at a very basic level) the different data representation alternatives, a central topic in many introductory computer science courses. Further, this discussion was motivated directly by the materials and problems of the given context (creative text generation) rather than by an abstract problem designed to match the topic to be taught.
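
A minimal sketch of two of the representations weighed in class is given below (plain Java; the input string is an arbitrary stand-in). The dictionary version trades some per-entry overhead for flexible keys, while the array version gives constant-time lookup and compact storage but is fixed to a known character range.

    import java.util.*;

    // Two representations for the letter-level concordance assignment:
    // a dictionary (HashMap) keyed by character, and a plain int array
    // indexed by character code. Both answer "how often does this letter
    // appear?" without rescanning the input for each query.
    public class LetterConcordance {

        public static void main(String[] args) {
            String input = "the quick brown fox jumps over the lazy dog";

            // Option 1: HashMap -- flexible keys, some per-entry overhead
            Map<Character, Integer> counts = new HashMap<>();
            for (char c : input.toCharArray()) {
                if (Character.isLetter(c))
                    counts.merge(c, 1, Integer::sum);
            }
            System.out.println("'o' occurs " + counts.getOrDefault('o', 0) + " times");

            // Option 2: array indexed on character code -- constant-time
            // lookup and compact storage, but fixed to a known alphabet
            int[] table = new int[128];
            for (char c : input.toCharArray()) {
                if (c < 128 && Character.isLetter(c))
                    table[c]++;
            }
            System.out.println("'o' occurs " + table['o'] + " times");
        }
    }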

Lastly, a maximum entropy approach, as represented in the tagging, chunking, and parsing components of RiTa, was presented. Although not all students possessed the mathematical background required for full comprehension, it was a clear “next step” after the previous assignment, especially for those students intending to continue on to further computer science courses. As was generally the case with the RiTa tools, there were (at least) two levels of understanding that enabled use of the tools: a base level concerning what the various methods could do, and a deeper understanding of the inner workings of the components (always available for inspection). While the latter allowed users to take full advantage of the functionalities and extensions in the RiTa objects, it was not required for those wishing to make only simple use of the tools. One of the part-of-speech taggers included with RiTa used a maximum entropy approach (the other was a faster, but less accurate, transformation-based tagger following Brill [1992]) and was presented as an example for this section. Not only was part-of-speech tagging an easily understood example, it led (as soon as substitutions were attempted) directly into chunking, where students wished to replace noun-phrases rather than simple nouns. This led into a discussion of parse-trees and strategies for parsing (bottom-up, top-down, chart strategies, etc.), and additionally made clear the presence of recursive syntactic structures, e.g., noun-phrases containing other noun-phrases, and the need (in some cases) for a full-fledged parser (RiParser) rather than a simple chunker (RiChunker). The presence of such structures initiated a discussion of recursion itself, and a few simple recursive algorithms were presented, in combination with a more general presentation of the kind of problem for which a recursive solution is recommended, e.g., one containing sub-problems with a structure similar to the initial problem. Rather than the typical Fibonacci or factorial examples, recursively structured English sentences were presented.
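
The following sketch, with an invented vocabulary, shows the kind of recursively structured English example used in place of Fibonacci or factorial: a noun phrase that may contain another noun phrase, generated by a function that calls itself on the smaller sub-problem until a depth limit (the base case) is reached.

    import java.util.*;

    // A recursive noun-phrase generator: a noun phrase may contain another
    // noun phrase ("the cat near the bird in the garden"), so the natural
    // solution calls itself on the smaller sub-problem, with a depth limit
    // as the base case.
    public class RecursiveNP {

        static final String[] NOUNS = { "cat", "bird", "garden", "letter" };
        static final String[] PREPS = { "near", "behind", "inside" };
        static final Random RAND = new Random();

        // Base case: depth 0 yields a simple "the <noun>".
        // Recursive case: embed a smaller noun phrase after a preposition.
        static String nounPhrase(int depth) {
            String np = "the " + NOUNS[RAND.nextInt(NOUNS.length)];
            if (depth <= 0) return np;
            return np + " " + PREPS[RAND.nextInt(PREPS.length)]
                      + " " + nounPhrase(depth - 1);
        }

        public static void main(String[] args) {
            for (int d = 0; d < 4; d++)
                System.out.println(nounPhrase(d));
        }
    }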

3.6.1.3 Integrating Computational Thinking

In the assignments above we can see how, in addition to the “artistic” elements required to make a compelling work of digital art, an impressive range of core computational ideas arise naturally as a result of the material at hand. Through just the two relatively simple examples presented above, grammars and n-grams, the student will have been introduced to a large number of key computer science concepts, many of which are likely to be taught in an introductory CS sequence: from finite-state automata to context-free grammars and the language hierarchy; from elementary data structures to hashtables to the construction of parse trees; from regular expressions to recursion. Rather than appearing to students as arbitrary additions to the “real” topic at hand, the relevance of these ideas is immediately apparent in a course that focuses on creative language-driven programming projects.

3.6.2 Final Student Projects in PDAL

Typically, final projects involved both novel combinations of existing RiTa components and the creation of custom code to extend or augment existing functionality. In several cases, such extensions have been added to the core RiTa library, with authors receiving credit on the RiTa website. Because RiTa provides a core subset of the potentially daunting infrastructure generally required for language-based artworks, students are comparatively free to explore a variety of topics through both individual and collaborative projects, and are encouraged to focus on those aspects of their work they find most engaging (see Chapter 5: Evaluation for further discussion of this claim). Further, the open and community-oriented nature of the programming environment provided students with a sense that their projects were meaningful contributions, both to other RiTa users and to the larger digital art community, as opposed to just “exercises”. Several students in the courses expressed interest in incorporating elements of their projects back into the library, while others exhibited and published their projects in well-respected galleries and journals for digital literature. Still others expressed interest in creating their own libraries to support creativity in specific domains. RiTa itself (the code for which was often discussed in class) provided a helpful example in these cases of what such a library might look like, with well-thought-out interfaces, clean code structure, and thorough documentation.

In addition to source code and functioning programs, careful documentation of all aspects of students’ process was stressed, both as a method to evaluate their development in a critical/reflective manner, and to provide examples and resources for others in the RiTa/Processing community. Finally, in the last meeting of each semester, students presented their work in a live setting to a larger audience of practicing artists, researchers and educators.

Both the breadth and depth of these projects have been astonishing, and they are discussed further in Chapter 5 (Evaluation) as one measure of the tools’ ability to support a wide range of creative work. For those interested, several dozen of these projects are available in the project gallery located on the RiTa website at http://www.rednoise.org/rita/rita_gallery.htm.



4.1 Introduction

This chapter presents a summary of prior work that has influenced the theorization, design, implementation, and deployment of the RiTa toolkit. While the range of this work is broad, this is due to the fact that little, if any, existing research has targeted our specific goals. With this in mind, we focus here on related research and practice whose goals overlap with at least one of the explicit goals of this project, as laid out in the introduction. While the brief discussion of prior work in the opening chapter presents the current state of creativity support for the literary arts, the work in this section represents more direct influences on our research, and falls into the following primary categories:

• Programmatic Educational Environments
  o for Natural-Language Processing (NLTK, SimpleNLG, etc.)
  o for Procedural Literacy and Interactive Art (Processing, Max/MSP, etc.)
• Computer Science and Literary Art (Strachey, Shannon, Weizenbaum, Bringsjord)
• Computationally-augmented Literary Experiments: Tools and Practice

For reasons of economy, several areas of active research relating only tangentially to RiTa, specifically tools for interactive fiction (e.g. Inform or Curveship), games with narrative and/or conversational elements (e.g. Facade), and non-programmatic support tools (e.g. scriptwriting aids like Dramatica, argumentative-writing aids like Euclid, and collaborative writing tools like EtherPad), are not addressed here, though pointers to resources on these topics have been included where applicable.

4.2 Programmatic Educational Environments

Computer Science (CS) researchers have created a wide range of programmatic libraries that attempt to aggregate the range of tasks required to perform high-level natural language research. Recent years have also seen some interest in adapting this approach to the classroom, by providing tools that specifically address pedagogical issues that arise as new computer science students attempt to work with natural language. Similarly there has been impressive growth in both the number and quality of libraries and environments designed specifically for computational artists. As the RiTa toolkit bridges these two research areas, this section presents a review of important works in each that have informed our approach.


