Class Notes: Meeting 4
Questions Emerging from Your Environmental Scans
- What formats are best for computational research? When do formats matter historically?
- How strict is Dublin Core? What conventions does the ontology follow? (For a rough sense of how loosely unqualified Dublin Core constrains a record, see the sketch after this list.)
- How to continue with research when what you need is not in libraries or special collections?
- How to treat "the whole work" through data modelling? When and how to divide texts into parts?
- When does web ethnography become a study of classification and historical design habits?
- Where models and modelling are concerned, where to start? And how to find examples?
- How to acquire materials that are not digitized or not online?
- How to map or georeference a text, including print materials?
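On the Dublin Core question above, a minimal sketch may help. The fifteen elements of the unqualified Dublin Core Metadata Element Set are all optional and repeatable, and the base element set leaves most values as uncontrolled text unless an application profile adds constraints. The toy record below (describing a familiar short story) is my own illustration, not an authoritative profile.

    # A minimal, illustrative Dublin Core record (unqualified DCMES).
    # All fifteen elements are optional and repeatable; values are free
    # text unless a local application profile adds constraints.

    DCMES_ELEMENTS = [
        "title", "creator", "subject", "description", "publisher",
        "contributor", "date", "type", "format", "identifier",
        "source", "language", "relation", "coverage", "rights",
    ]

    record = {
        "title": ["The Yellow Wallpaper"],
        "creator": ["Gilman, Charlotte Perkins"],
        "date": ["1892"],          # no mandated date format, though
        "type": ["Text"],          # W3CDTF and the DCMI Type Vocabulary
        "language": ["en"],        # are common conventions
        "subject": ["Short stories, American"],
    }

    # "Strictness" in practice: stay within the element set, but beyond
    # that almost anything goes unless a community profile says otherwise.
    unknown = [element for element in record if element not in DCMES_ELEMENTS]
    print("non-DC elements:", unknown or "none")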
A Quick Review of Our Discussions Thus Far
A few things to keep in mind as we proceed into a discussion of data modelling, classification, and speculation:
- During our first meeting, we unpacked various ways that people can argue with computers (e.g., where the computer becomes an invisible medium, a hermeneutic agent, a resistant object, or a mechanism for networking scholarship).
- This discussion underscored the fact that computers (as techno-cultural objects) cannot be read homogeneously, as if they somehow have inherent functions and values incapable of change.
- In response to our discussion, Galloway would likely ask us not just how a computer works but what (in the first place) mediation is. Also, in the context of this seminar, a related question is how the very practice of criticism is itself a mediated act (e.g., are we "digging" into the unconscious of texts, are we reading them superficially, are we immersing ourselves in them, are we diagramming them, and so on).
- More interested in the formal properties of media, Manovich gives us the following principles of new media: numerical representation, modularity, automation, variability, and transcoding. For our purposes, we might ask how, in the practice of using computation for literary/cultural purposes, we relate to and understand such principles. In short, how do they play a mediating role in the production and interpretation of criticism? (A small illustration of numerical representation and transcoding follows this list.)
- Here, Chun would (in a gesture echoing Benjamin's critique of aestheticized politics) remind us that we should get over speed as a primary characteristic of new media and networks. Instead, memory is the fundamental characteristic we should stress, and it differs from storage in that it turns this into that (e.g., the archive requires reading, or the HTML file needs a browser to interpret and render it).
- During our next few meetings, we should keep both Chun and Galloway in mind: how do particular methods of literary and cultural criticism mediate our relationships to history, texts, and artefacts? How are such mediations enacted and communicated to audiences (e.g., should they simply be explained in writing, or should they be performed through criticism)? And how do computational methods and the new objects they examine endure? Such questions prompt us to start thinking about data modelling in the humanities.
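To ground two of the Manovich principles mentioned above, here is a small sketch of my own (not Manovich's): a line of verse already exists for the machine as numbers (numerical representation), and moving between the cultural layer of words and the computer layer of countable tokens is one everyday face of transcoding.

    # Numerical representation: to the machine, a line of verse is a
    # sequence of code points / bytes rather than "language."
    from collections import Counter

    line = "Shall I compare thee to a summer's day?"
    code_points = [ord(character) for character in line]
    print(code_points[:6])   # [83, 104, 97, 108, 108, 32]

    # Transcoding: the same object shuttles between a cultural layer
    # (words, lines) and a computer layer (a frequency table ready for
    # further automated processing).
    frequencies = Counter(line.lower().rstrip("?").split())
    print(frequencies.most_common(3))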
Responses to "Standards"
A few remarks, by Bowker and Star, to consider:
- "we stand for the most part in formal ignorance of the social and moral order created by these invisible, potent entities"
- "No one, including Foucault, has systematically tackled the question of how these properties inform social and moral order via . . . new technological and electronic infrastructures. Few have looked at the creation and maintenance of complex classifications as a kind of work practice, with its attendant financial, skill and moral dimensions."
- "If we are to understand larger-scale classifications, we also need to understand how desktop classifications link up with those that are formal, standardized, and widespread."
- "we want to look at what goes into making things work like magic: making them fit together so that we can buy a radio built by someone we have never met in Japan, plug it into a wall in Champaign, Illinois and hear the world news from the BBC."
- "a classification system exhibits the following properties: 1. There are consistent, unique classificatory principles in operation. . . . 2. The categories are mutually exclusive. . . . 3. The system is complete."
- "The term [standard] as we use it in the book has several dimensions: 1. A 'standard' is any set of agreed-upon rules for the production of (textual or material) objects. 2. A standard spans more than one community of practice (or site of activity). It has temporal reach as well, in that it persists over time. 3. Standards are deployed in making things work together over distance and heterogeneous metrics. . . . 4. Legal bodies often enforce standards. . . . 5. There is no natural law that the best standard shall win. . . . 6. Standards have significant inertia, and can be very difficult and expensive to change."
- "Classifications may or may not become standardized."
- "we define boundary objects as those objects that both inhabit several communities of practice and satisfy the informational requirements of each of them. . . . They are thus both ambiguous and constant"
Responses to "Model Of," "Model For"
A few remarks, by McCarty, to consider:
- "A model of something is an exploratory device, a more or less 'poor substitute' for the real thing"
- "a model for something is a design, exemplary ideal, archetype or other guiding preconception. Thus we construct a model of an airplane in order to see how it works; we design a model for an airplane to guide its construction. A crucial point is that both kinds are imagined, the former out of a pre-existing reality, the latter into a world that doesn't yet exist, as a plan for its realization.
- "I argue that from the research perspective of the model, in the context of the humanities, failure to give us what we expect is by far the most important result, however unwelcome surprises may be to granting agencies."
- "The model can be exported to other texts, tried out on them in a new round of recursive modeling, with the aim of producing a more inclusive model, or better questions about personification from which a better model may be constructed. This is really the normal course of modeling in the sciences as well: the working model begins to converge on the theoretical model."
- "Models-for do not have to be such conscious things. They can be the serendipitous outcome of play or accident."
Responses to "Speculative Computing"
Some remarks, by Drucker and Nowviskie, to consider:
- "Two significant challenges . . . The first is to meet requirements that humanistic thought conform to the logical systematicity required by computational methods. The second is to overcome humanists' long-standing resistance (ranging from passively ignorant to actively hostile) to visual forms of knowledge production."
- "the speculative approach is premised on the idea that a work is constituted in an interpretation enacted by an interpreter. The computational processes that serve speculative inquiry must be dynamic and constitutive in their operation, not merely procedural and mechanistic."
- "Any 'order of things' is always an expression of human foibles and quirks, however naturalized it appears at a particular cultural moment."
- "Creating programs that have emergent properties, or that bootstrap their capabilities through feedback loops or other recursive structures, is one stream of research work. Creating digital environments that engage human capacities for subjective interpretation, interpellating the subjective into computational activity, is another."
- "The speculative approach, which interjects a long-repressed streak of subjective ambiguity, threatens the idea that digital representations present a perfect match of idea and form."
- "The speculative approach is not specific to digital practices --- nor are generative methods. Both, however, are premised very differently from that of formal, rational, empirical, or classical aesthetics."
- "Only as this work progressed did researchers realize that perceptual processing of visual information had to be accompanied by higher-order cognitive representations. Merely understanding 'perception' was inadequate. Cognitive schemata possessed of the capacity for emerging complexity must also be factored into the explanation of the way vision worked."
- "The goal of 'pataphysical and speculative computing is to keep digital humanities from falling into mere technical application of standard practices (either administered/info management or engineering/statistical calculations). To do so requires finding ways to implement imaginative operations."
- "Does the computer have the capacity to generate a provocative aesthetic artifact?"
- "A model is an interpretative expression of a particular dataset. More importantly, it is what the interpreter says it is at any given point in time. We find the flexibility inherent in this mode of operation akin to the intuitive and analytical work of the traditional humanities at its best."
- "Our goal is to place the hermeneut inside a visual and algorithmic system, where his or her very presence alters an otherwise mechanistic process at the quantum level. . . . What we haven't yet tried in a rigorous and systematic way is the injection of the subjective positioning any act of interpretation both requires and embodies into a computational, self-consciously visual environment."
Key Problems These Texts Are Engaging
What are some key problems that these texts appear to be engaging?
- What is the relationship of the model to the original source? What choices are being made?
- The "model of" is a poor substitute for the original, whereas the "model for" allows someone to actualize, or make, what does not yet exist. Should there be a balance between of and for in critical practice? Or should there be a bias toward one?
- To what degree do "models of" involve interpretation? To what degree should they?
- How are models associated with efficiency, reduction, or constraints? To what effects?
- How are models standardized?
- How to apply computational methods to imaginative artefacts?
- The "it's just a computer" problem: is a computer capable of aesthetic imagination or aesthetic provocation?
- What tensions exist between speculative computing and expertise (or expectations of expertise)? Who is the audience for a speculative computing project, and why would speculative computing appeal to them?