DUST: Long-Term Thinking Activities

One of our consultants on DUST, our NSF-funded Alternate Reality Game, is Woodruff (“Woody”) Sullivan, an astrobiologist at the University of Washington. I first encountered Woody’s name around 2010 when I happened upon the eerie 1991 Department of Energy report on how to warn civilizations 10,000 years in the future about the hazards of radioactive waste. One of the project’s eleven or so expert panelists, Woody, together with his team, produced a series of extraordinary architectural landscape designs that, in the intervening years, have circulated widely on the internet and achieved a kind of cult fame.

One of the things Woody has stressed in our discussions with him about DUST is how important it is for teens to learn how to tell biographical stories–or call them “process stories”–about physical things found in the natural and artifactual worlds, such as rocks, ravines, stars, buildings, and monuments. What can we infer about the genesis, transformation, and decay–about the temporality–of the things we encounter? What clues in their current states or configurations can help us conjecture about their past or future lives?

Violet Cannon


Inspired by these conversations, we’ve introduced two long-term thinking activities into DUST: “Your House or School in 100 Years” and “Message to the Future: Across 10,000 Years.” A third activity–“Design Your Own ExoPlanet”–is coming soon. Although these are geared toward teens 13-15 years old, anyone registered for the game can submit entries (but you only have two more days left if you want your design considered for inclusion in Woody’s office!). We’ve used one of our in-game characters, Violet, to establish the fictional exigence for these activities: Violet is a precocious 15-year-old girl whose astronaut parents both collapse on board the International Space Station when the story opens. Feeling betrayed by the formal educational system that has left her and her friends unprepared for the epic challenges that now confront them, Violet decides to design an alternative curriculum for teens that will help them solve the mystery at the heart of the story: the worldwide collapse of adults who fall into a sleep-like state in the aftermath of a historic meteor shower that disperses dust into the earth’s atmosphere.

Screenshot of the 3D Panoramic model of Woody's Office


We’ll be selecting two player submissions whose work will be hung on the wall in our 3D panoramic model of Woody Sullivan’s office. One of three such game environments created using the Unity Game Engine, the office environment allows players to rummage through Woody’s papers, access his computer files, and study other artifacts in the room for clues. We’ve already included one creative reinterpretation of “Spike Field” on the wall (see the original design here for comparison), based on one of the concept designs in the Department of Energy report (the reinterpretation was done by a talented undergrad student, Terrell Carter, in our Transmedia Fictions course).


Interpretive recreation of “Spike Field” by Terrell Carter

To submit your own entry, point your browser to fallingdust.com and register for the game.


Here’s Violet’s in-game post with links to the activities:



image credit: NASA http://1.usa.gov/1MrPqtK

When I was a little girl, I used to love to sit at my Mom’s desk pretending to be a grownup, rifling through her papers and typing at her computer. For as long as I can remember, she kept a snow globe in the corner of the desk that I would shake every time I came in. It wasn’t just any old cheap plastic globe: it was Saturn suspended in there, the little white flakes swirling peacefully around its rings. My Mom used the globe as a paperweight to pin down a creased, slightly stained scrap of paper with a quote from John Updike, one of her favorite authors: “All around us, we are outlasted.” “Outlasted by what?” I asked. “The moon and the stars?” “The moon and the stars,” she said, “but also the snow globe, the books on the shelves, the chair that I’m sitting on. Little things as well as big things, human-made as well as natural.”


Image credit: Swamplot http://bit.ly/1CBrvTd


I thought about that and tried to imagine the snow globe in 50 years. It would no longer sit on a desk, I decided, but be buried in a landfill. A trash compactor would have squashed it, causing the liquid to leak out, and the fake snowflakes would have seeped into the ground. I wondered whether the liquid was poisonous. And if someone dug up the flattened globe after 100 years or 1,000, what would it look like then?

It was hard to think about something I owned surviving that long, and changing as the decades and centuries rolled by. It still is. At school Mike Sanchez recently taunted me about the moldy sandwiches that I never eat accumulating in my locker. “Clean out your disgusting lunches!!!” he wrote in a note stuck on the outside of the locker. “Love, Your Books.” The three weeks it took for those PB&Js to go from edible to puke green was the longest stretch of time I’d entertained in a while when it came to my own stuff.

And now here we are, more than three weeks into The Collapse. We still don’t know exactly what happened, or why, or how, but it seems increasingly clear the meteorite was carrying microscopic stowaways hailing from an exoplanet millions of light-years away. @Deehob has a hunch they were put there by an alien civilization, and I think he may be right. I’ve been having nightmares about the microbes. I know it’s crazy, but in my dreams they were shooting little sleep-inducing darts into the unsuspecting adults, laughing maniacally. Maybe they were feeling punchy after having weathered UV radiation and the vacuum of space for thousands of years.

Then last night the dream changed. Gone were the tranquilizer guns; instead the microbes began gently sprinkling fairy dust into the eyes of the grownups. It danced and swirled in the air, like the flakes in the snow globe, and formed the same beautiful fractal patterns we saw before. And as I watched, I thought I glimpsed something intentional in the patterns—something older than the moon and the stars. Something primordial.

All around us, we are outlasted.

Guys, if we’re going to solve this mystery and save the adults, we have to learn to think at ginormous time scales, even if our puny little brains rebel. So welcome to Vi’s Boot Camp 101: Long-Term Thinking. We’re going to ditch the lectures and do things a little differently. Let’s get started.

Exercise 1: Your House or School in 100 Years

Exercise 2: A Message to the Future: Across 10,000 Years

Exercise 3: Design Your Own ExoPlanet: Coming soon!


Bibliocircuitry and the Design of the Alien Everyday

I’ve been meaning to post a direct link to this article since it’s available open access (but hard to find via Google). Here’s the full reference + abstract:

Hancock, C., Hichar, C., Holl-Jensen, C., Kraus, K., Mozafari, C., and Skutlin, K. (2013). Bibliocircuitry and the design of the alien everyday. Textual Cultures.

This essay describes, models, and advocates for the role of reflective design in bibliography and textual studies. Popularized by Donald Norman, reflective design promotes critical inquiry over usability and exploratory prototyping over fully realized productions. We highlight four projects undertaken by the authors that embody reflective design, including three that explore the crossed codes of print and electronic books. A larger aim of the essay is to position bibliotextual scholarship and pedagogy as design-oriented practices that can be used to imagine the future as well as reconstruct the past.

New Project: Informal Learning and Transmedia Storytelling

I’m thrilled to announce a big new collaborative project on educational ARGs and transmedia storytelling funded by the National Science Foundation ($1.2 million combined for the first two years, with another $500,000 expected for Year 3). The project is a joint endeavor between Brigham Young University and the University of Maryland in partnership with NASA, the Smithsonian Institution, and the Computer History Museum, plus leading game designers, educators, scientists, and researchers. We’ll be designing, implementing, and conducting research on two large-scale games–“authentic fictions,” as Ken Eklund calls them–one focused on computational thinking, the other on deep-time sciences; the games will target youth aged 13-15, with a special emphasis on girls and other groups underrepresented in STEM. The project will iteratively design and test two distinct types of transmedia fictions (closed- and open-ended) to study their effects on learning.

One of the most exciting aspects of the project for me is the amazing team we’ve managed to assemble. Props to my collaborator and friend, Derek Hansen (BYU, leading this effort), who thought in really creative, ambitious ways right from the start about the kind of multi-institutional partnerships we’d need to put in place to make this happen. Our open-ended ARG–the one grounded in the deep-time sciences (think Astronomy and Astrobiology)–will be led by the brilliant Ken Eklund, who created World Without Oil, the game that launched the forecasting game genre in 2007. We’ll also be working closely with Jeff Sheets, Director of the Laycock Center for Creative Collaboration in the Arts at BYU. Jeff has previously overseen transmedia advertising campaigns for major companies–including Nike, Gatorade, Doritos, Taco Bell, and Ford Motor Company–with millions of viewers. He is also fluent in Spanish (as are several students who will be working on the project), allowing us to develop and distribute some of our game materials in both English and Spanish. And as mentioned above, NASA, the Computer History Museum, and the Smithsonian are all directly involved (one of our goals is to incorporate the Babbage Difference Engine into the game, a working replica of which is on display at the Computer History Museum in Mountain View, California). At UMD, our core team consists of me, June Ahn, and Beth Bonsignore. June did his PhD in Education, and Beth is a polymath with a background in English, Computer Science, and Education, who also spent much of her career as a signal analyst for the US Navy. We’ll also be working with my colleague Allison Druin in the HCI lab–drawing on her expertise in participatory design–along with a group of teen co-designers. We’ll soon be hiring both graduate and undergraduate students as well.

Having one ARG devoted to the deep-time sciences is also grist for Hopeful Monsters, my book-in-progress on long-term thinking. Deep-time sciences are those sciences that deal with processes unfolding over thousands or millions of years, such as the evolution of galaxies or the drift of the earth’s continents. This type of thinking is future-oriented as much as past-oriented, as Lorraine Daston eloquently observes:

What all sciences of the archive have in common is not past- but rather future-consciousness: they imagine the archives that they have taken such pains to amass and conserve as a bequest to their successors, to the archaeologists, astronomers, geneticists, geologists, and demographers of the future. To create and curate an archive is to assume disciplinary continuity, sometimes across centuries or even millennia (as when astronomers in the year 1900 decided to bequeath a complete photographic record of the sky to the astronomers of the year 3000). There is always a utopian element in the sciences of the archives, a vision of a community that will endure – and cherish the collections so carefully laid up as provisions for future research.

Among other things, the deep-time focus should give us the opportunity to create a number of experimental design fiction artifacts, and really think through methods and approaches for doing so. I’m also eager to gather data on what role they might play in the learning process and how they function from a cognitive perspective within the context of transmedia fiction.

When preparing the grant proposal, I reached out to Dr. Woodruff Sullivan, an astronomer at the University of Washington, to see if he would be willing to serve as one of the consultants on the project. Sullivan, notably, participated in the Department of Energy’s 1991 task force to develop a marker system to warn future civilizations for up to 10,000 years about the hazards of radioactive waste stored near Carlsbad, New Mexico. Those of you who know me and my work know how obsessed I am with the extraordinary report that resulted from that project, and so the opportunity to work with Dr. Sullivan is just incredibly exciting.

I’ll wrap up by mentioning that there’s a preservation component to the project that will extend some of our prior research on increasing the longevity of ARGs and transmedia fiction. We’ll be working with my colleague Jimmy Lin who, along with one of his students, has created a prototype implementation of the Wayback Machine on a 16-node Hadoop cluster and will soon be developing new analytical tools for scholars to browse and research large web collections.

Much more on this project in the months ahead.


Alt-Research for Humanities PhDs

The following are remarks I delivered at the 2013 Modern Language Association Convention in Boston as part of a roundtable on joint programs in Languages, Literature, and Libraries. I’ve posted a slightly expanded version of the text.

In February 2012, Dan Cohen, Director of the Center for History and New Media at George Mason University, published a blog post reflecting on some of the cognitive biases at work in the everyday act of reading, as reported in Nobel Prize-winning psychologist Daniel Kahneman’s 2011 book, Thinking, Fast and Slow. Kahneman cites a series of experiments that demonstrate how seemingly incidental details like font style and paper quality affect our views on the trustworthiness of the information we encounter in books and other printed materials. After reading Cohen’s post, I immediately took to Twitter, wistfully commenting “I wish we humanists were DESIGNING some of those experiments rather than simply reporting on them.”

Although seemingly off the cuff, that thought had been brewing in a more generalized form for several years, an outgrowth of the time I’ve spent in an Information School, or iSchool, and more specifically as a humanist within an iSchool. One hallmark of these schools is that they are methodologically diverse and adventurous. Preternaturally so. With faculty whose backgrounds cut across the full spectrum of the traditional disciplines, they are breeding grounds for research incorporating qualitative and quantitative analysis, interviews, survey instruments, laboratory experiments, field experiments, eye tracking, participatory design, usability testing, and more. Research, in other words, that comes in a tremendous variety of flavors and combinations.

That emphasis on methodology is reflected in our PhD curriculum, which requires doctoral students to take a minimum of four—yes, four—methods and design courses, plus statistics, before they can advance to candidacy. Many elect to take more than that. Given this heavy load, students must venture outside the iSchool to get their full dose of methods. They thus spread across the campus, taking classes in Anthropology, Education, Public Health, American Studies, even History and English, whose methods courses we list in our handbook.

My Twitter comment, then, was a preliminary attempt to draw attention to what I perceive as a blind spot—or at least a missed opportunity—in the current conversation around graduate education in the Digital Humanities. That blind spot, I believe, is in part a consequence of conflating two distinct if inter-related things: technology education and methodological training, both of which we array under the heading of “methodology.” In practice, of course, the two are deeply intertwined. For heuristic purposes, however, I want to separate them, if only to make the larger point that while we’ve made some progress over the last 5-10 years on integrating technology into DH curricula (e.g., introductory programming or database design), we’ve made far less headway on expanding our methodological toolkit. It’s true that we routinely invoke the importance of methodological training in DH, but more often than not we’re using it as shorthand for “technology education.”

Why should humanists be interested in the methods I’ve flagged, most of which originate in the social sciences? One reason is that by venturing beyond methods already familiar to us, we enrich our field and open up new vistas of discovery. The controlled laboratory experiments on the psychological effects of font style, to hearken back to Dan Cohen’s blog post, are of obvious import and relevance to bibliographical and textual studies, two of our oldest branches of literary study. But these experiments shouldn’t be considered in isolation.

Take Rachel Donahue [on the panel], who is writing a dissertation on understanding, managing, and preserving the records of video game developers. As part of her research, Rachel has interviewed game developers; conducted a game documentation survey on the preservation practices of the videogame industry and the player community; and is now in the process of gathering oral histories from players and developers of the now defunct Glitch, a browser-based Massively Multiplayer Online Game. Documentation Strategy, Grounded Theory, and Action Research—methods she was exposed to in an iSchool—are all part of Rachel’s repertoire.

Similarly, Amanda Visconti, who holds a Master of Science degree in Information from the University of Michigan’s iSchool, is now a PhD student in English at the University of Maryland. Her work weaves together the different strands of bibliography/textual criticism, digital humanities, and information science. Amanda is currently in the early stages of a dissertation whose products will be theoretically and historically informed digital tools that expand existing online scholarly editions to allow more participatory and experimental modeling and representation of texts. Crucially, she is drawing on her iSchool background to bring usability testing into the making of her modules. Think about that: user testing deployed within the context of textual scholarship and bibliography; user testing of a digital scholarly edition!

Both Rachel and Amanda are engaged in projects that fall broadly within the purview of the Digital Humanities. What sets them apart from many peer projects conceived and developed within a humanities department is not their technical sophistication, which has become increasingly common, but rather their methodological range. Methodology—not (just) technology—is what makes their brand of DH distinctive.

My second reason for recommending these methods speaks pressingly to our current moment: one consequence of the failure to adequately distinguish between technology and methodology is that we often end up articulating a vision for the PhD that frames it not as an advanced research degree, but as a professional degree. This occurs principally within the context of efforts to revamp doctoral education to prepare students for positions outside the academy. The logic seems to be something like this: if we infuse enough programming, database design, social media savvy, and tool use into the curriculum, then our students can find rewarding jobs as programmers, publicists, technical writers, and so forth outside the ivory tower if they aren’t lucky enough to land a coveted research position within it. The PhD begins to look a lot like a terminal Master’s degree in a professional discipline. I think there is a third way, however, that prepares students for what I’d call not alt-academic [or alt-ac] positions but alt-research positions: while alt-ac gives us a framework for thinking about alternative academic jobs that may include research, its purview is mainly academia and its environs. Research, however, also happens outside the academy. Once again iSchools provide a useful point of reference: in the iSchool environment there is often a direct pipeline between industry and academia, between the research labs at Google, Yahoo, Adobe, Microsoft, Intel, and others, on the one hand, and iSchools on the other. My colleague Allison Druin, for example, former director of the Human Computer Interaction Lab at the University of Maryland, has partnered with Google on research studies aimed at understanding how children search and retrieve information on the internet; and with Nickelodeon on studies exploring how to create better mobile interfaces for children’s media.
In these cases, the partnership crosses the industry-academia divide, but there are scores of instances where iSchool faculty have had prior careers as PhDs in these corporate research labs or spent their sabbaticals at them or found ways to creatively blend them (Danah Boyd, whose work is familiar to many humanists, is a good example). It would be fair, I think, to characterize these shifts between academia and industry as lateral career moves, insofar as both types of positions are research positions that require real methodological chops. These are also the types of positions for which Digital Humanists *should* be able to compete. We’re not there yet, but if we begin to promote advanced research methods in our curricula, we could be soon.

One final note: I’m also prepared to talk about some of the pitfalls of this approach (and I can anticipate many of them). But I also think it’s time we in the humanities cultivated a more nuanced response to the knee-jerk corporate drone critique, and began to realize there’s an opportunity to help shape industry through research collaborations, rather than simply criticizing from the wings.

Web Archiving in my Information Access in the Humanities Course

[Update: one of my students, Sarah Weissman, has published a fabulous overview of her group’s web archiving project and the problems they encountered over on the Library of Congress’s Signals blog.]

This semester we’re doing large-scale web archiving in my iSchool Information Access in the Humanities course. To get a sense of what I mean by “large scale,” our data budget using the Internet Archive’s subscription-based Archive-It software is 1,024 GB and includes up to 12,000,000 documents (we likely won’t capture nearly that many documents, but part of the learning experience is getting a sense of how much information gets captured with just a handful of seed sites per group as starting points for the crawler). A shout-out to the fabulous Lori Donovan at the Internet Archive who has been working closely with our class. Here’s a copy of the assignment.

Web Archiving Assignment
Due: 28 November
Presentations: 5 December

In this project you’ll harvest and preserve a humanities web-based collection. Your group will scope the collection, troubleshoot media file format issues, create metadata, and deal with robots.txt files and copyright issues, as well as learn about the architecture of the web. Using specialized open-source software to harvest content, you’ll create a topically based collection of websites that is then permanently hosted at the Internet Archive. The software–known as Heritrix–crawls and captures pages from the live web; the captured content is then viewable through the Wayback Machine. The service also includes specialized search tools that allow for full-text and metadata searching.
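Before settling on seeds, it can help to see whether a candidate page already has captures in the Wayback Machine. The Internet Archive exposes a public availability API (`https://archive.org/wayback/available?url=...`) that returns the closest existing snapshot as JSON. A minimal Python sketch (the response shape follows the API’s documented format; error handling is elided):

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

def closest_snapshot(payload):
    """Extract (timestamp, archive URL) for the closest archived snapshot
    from a Wayback availability-API response, or None if no capture exists."""
    snap = payload.get("archived_snapshots", {}).get("closest")
    if not snap or not snap.get("available"):
        return None
    return snap["timestamp"], snap["url"]

def check_captures(url):
    """Query the availability API for a candidate seed URL (needs network access)."""
    query = urlencode({"url": url})
    with urlopen(f"https://archive.org/wayback/available?{query}") as resp:
        return closest_snapshot(json.load(resp))

# Example: check_captures("loc.gov") returns the closest snapshot's
# timestamp and web.archive.org URL, or None for an uncaptured page.
```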

A number of other graduate library or archival programs use Archive-It in the classroom, including the University of Michigan’s iSchool, UNC-Chapel Hill’s SILS program, University of South Florida’s iSchool, Clayton State University’s Archival Studies program, and NYU’s Moving Image Archiving and Preservation Program, to name a few. Their collections are built around themes such as Alternative Energy Sources, Digital Tools for Human Rights Awareness, and the 2011 Wisconsin Union protests, among others. To get a sense of what others have done, you can search by collecting organization, collection, or specific URL.

Useful resources and links:

*Archive-It log-in page
*UMD’s Archive-It Collections
*Search page

*”Preservation is Cultural Literacy” (my Huffington Post article, which includes a discussion of web-archiving in the K-12 classroom)
*Archive-It help


1.) A 4-5 page double-spaced document (12-point Times New Roman font, with 1″ margins), plus images and appendices. In terms of genre, this document is a cross between a report and a reflective essay. You should address the following information:

*Description of and rationale for your web archive collection. What is the theme or topic of your collection, and how did you arrive at it?
*What are the 7-10 seeds that make up your collection?
*How did you scope your collection? Did you have to make any scoping adjustments along the way?
*What did you choose to capture for each site or seed: the entire site, one or more directories, or one or more subdomains? (Be sure to attend to the syntax of your seed URLs to make sure you’re capturing what you intend.)
*How did you make these decisions? Before making your final selections, please read the “appraisal and selection” section of Jinfang Niu’s “An Overview of Web Archiving” in D-Lib Magazine. Take note of the various approaches to appraisal Niu identifies: selection by domain (such as .gov or .edu), topic or event, or media type and genre. Niu also distinguishes between value-based sampling and random or statistical sampling.
*What type of content was archived in the course of your crawls? Images? Video? Form- and database-driven content? PDFs? Study your post-crawl reports to get a quantitative sense of the types and numbers of files that were captured.
*What major rendering problems did you encounter, and how did you troubleshoot them? What other technical issues did you run into (e.g., crawl traps, robots.txt files, etc.)?
*What are some of the major takeaways from this project? What did you learn, and what surprised you? Remember the Internet Archive’s motto: “The Web is a Mess.” How was the truth of this statement brought home to you?
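The scoping questions above hinge on seed-URL syntax: a seed ending in a directory path generally confines the crawl to that directory, while a bare hostname brings in the whole site. A rough Python illustration of the idea (this mimics simple prefix scoping, not Archive-It’s or Heritrix’s actual SURT-based scope rules; the URLs are hypothetical):

```python
from urllib.parse import urlparse

def in_scope(seed, candidate):
    """Return True if `candidate` falls under `seed`, treating the seed's
    path as a directory prefix. Illustrative only -- production crawlers
    apply far more elaborate scoping logic."""
    s, c = urlparse(seed), urlparse(candidate)
    if s.hostname != c.hostname:
        return False  # different host: out of scope
    prefix = s.path if s.path.endswith("/") else s.path + "/"
    return c.path.startswith(prefix) or c.path == s.path

# A directory-scoped seed excludes the rest of the site:
# in_scope("http://example.edu/library/", "http://example.edu/library/hours.html")  -> True
# in_scope("http://example.edu/library/", "http://example.edu/athletics/")          -> False
```

Attending to trailing slashes and directory levels in your seed URLs before the first production crawl saves data budget later.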

2.) Include screenshots, charts, and appendices as needed.

3.) 5-7 PowerPoint slides that summarize your thematic collection and your experience using the Archive-It service. Include screenshots, statistics, and technical issues encountered along the way. You can derive your slides directly from your written report (i.e., there may be considerable redundancy or duplication between them).

You will submit a printed report to me, but you should also cross-post as much as possible to the class blog to share with your classmates. I’d also strongly encourage you to blog about your experiences as you begin to experiment with the Archive-It software and tools.

Other requirements:

*Seed sites: 7-10 total
*Production crawls: 4-5 (think carefully about the frequency of your crawls and when you want to schedule them; please note that our data budget only permits five production crawls per group)
*Unlimited test crawls
*Dublin Core collection-level metadata: Title, Subject, Description, Creator, Date
*Dublin Core seed-level metadata: at your discretion.
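For the collection-level record, the five required Dublin Core elements map onto simple `dc:`-prefixed fields. A sketch of what such a record might look like, generated with Python’s standard library (element names follow the Dublin Core Metadata Element Set; the sample collection and values are hypothetical):

```python
import xml.etree.ElementTree as ET

DC_NS = "http://purl.org/dc/elements/1.1/"

def dc_record(title, subject, description, creator, date):
    """Build a minimal Dublin Core collection-level record as an XML string."""
    ET.register_namespace("dc", DC_NS)
    root = ET.Element("metadata")
    for name, value in [("title", title), ("subject", subject),
                        ("description", description), ("creator", creator),
                        ("date", date)]:
        ET.SubElement(root, f"{{{DC_NS}}}{name}").text = value
    return ET.tostring(root, encoding="unicode")

# Hypothetical sample collection:
record = dc_record(
    title="2011 Wisconsin Union Protests",
    subject="Labor movements -- United States",
    description="Websites documenting the 2011 protests in Madison, WI.",
    creator="Web Archiving Group 3",
    date="2012-11-28",
)
```

Seed-level metadata, at your discretion, could reuse the same element set on a per-seed basis.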

Tips and suggestions:

*Do a test crawl before your first production crawl (and subsequent test crawls as needed)
*Carefully study your post-crawl reports and learn from them
*Pay attention to the amount of data and documents you’re archiving by studying your post-crawl reports and collection home page
*Browse your archived content through the Wayback Machine and check for rendering issues
*Study the Quality Assurance (QA) post-crawl report and run patch crawls as needed
*Attend to robots.txt issues
*Study how metadata elements have been used in other archive-it collections
*Remember that you are archiving for future generations as well as users in the present. How does the question of (future) audience shape your collection description? See my Huffington Post blog entry for observations on how teachers and students in the K-12 web-archiving program have approached this task.
*When choosing metadata subject terms, research and review existing controlled vocabularies, ontologies, and classification systems for relevance (e.g., the Getty Art and Architecture Thesaurus). See what creators of other collections have done.
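Robots.txt exclusions are one of the recurring headaches flagged in the tips above, and the rules themselves are simple enough to check by hand. Python’s standard library can parse a robots.txt file and answer whether a crawler may fetch a given URL (the sample rules and URLs below are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules for a site you plan to crawl:
SAMPLE_ROBOTS = """\
User-agent: *
Disallow: /private/
Disallow: /cgi-bin/
"""

parser = RobotFileParser()
parser.parse(SAMPLE_ROBOTS.splitlines())

# Pages under /private/ are off-limits to a well-behaved crawler:
allowed = parser.can_fetch("*", "http://example.edu/exhibits/index.html")
blocked = parser.can_fetch("*", "http://example.edu/private/notes.html")
```

Running your seeds’ robots.txt files through a check like this before a production crawl can explain gaps in a post-crawl report.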

Virtual Letterpress Printing

This semester I’m teaching a revised and updated version of my Book 2.0: The History of the Book and the Future of Reading course. Last week we experimented with LetterMpress, touted as “a virtual letterpress on your iPad.” Available for both the iPad and Mac, the app, which is modeled on a 1964 Vandercook press, simulates the experience of setting type, locking it into the press bed with furniture and quoins, inking the print blocks, and pulling an impression. After playing around with both versions, I’ve developed a preference for the Mac one, which allows for finer-grained motor control over the positioning of type.

Here’s a sample of what my grad students created in class:


Hopeful Monsters

Very excited to announce that last week I signed a contract with the MIT Press for my book-in-progress, Hopeful Monsters. I’ll be saying a lot more about this in coming weeks and months, but for now thought I’d post a brief formal description of the project:

Hopeful Monsters: Computing, Counterfactuals, and the Long Now of Things is an examination of the role of conjectural methods, counterfactual reasoning, and speculative design in the humanistic disciplines. It is a contribution to the rapidly emerging literature on the “digital humanities” that takes seriously the idea that the future (as well as the past) is a viable domain for humanistic inquiry and—crucially—that it is also computationally (and materially) accessible. More than half a century ago, C. P. Snow asserted that scientists have “the future in their bones,” while humanists act as if “the future did not exist.”[1] Although few today would frame the differences in terms as stark as these, the idea that the disciplines are divided along predictive-historical lines persists. Hopeful Monsters reexamines this duality by arguing that the number and diversity of humanistic genres and fields of study prefiguring or otherwise deeply engaged with the future have, over the course of the last decade, reached critical mass. These would include, among others, possible worlds theory; imaginary media; tangible futures and design fiction; culturomics; constructed languages; environmental and sustainability studies; digital curation and preservation; and massively multiplayer forecasting games, such as World Without Oil, Urgent Evoke, and Find the Future.
(The Long Now of my title is taken from the non-profit foundation of the same name, which seeks to furnish tools and methods for reckoning with “deep time,” time measured in intervals of not only decades, but also hundreds or even thousands of years.)[2] Individual chapters develop the critical and theoretical tools for practical heuristics of conjectural thinking, ranging in scope from digital preservation techniques and the culture of game modding and emulation to the design of weird computer architectures and counterfactual machines to the creation of “artifacts-from-the-future” for transmedia storytelling and Alternate Reality Games. Simultaneously seeking both to broaden our conception of digital humanities (in particular by counterbalancing its current emphasis on “big data” with the DIY cultures of making, modding, and tinkering) and to reorient the humanities toward a more hopeful, less crisis-ridden future, Hopeful Monsters is about the strange loops and hybrid products of what-if thinking in the service of art, design, preservation, and communication.


[1] Snow, The Two Cultures (Cambridge: Cambridge UP, 1959) 11.

Digital Preservation Assignment and Workshop

In celebration of National Preservation Week, I thought I’d post a couple of relevant assignments I’ve used with my undergrads, both of which could be easily adapted for younger students. The first is a writing assignment, the second a set of lab activities.



Formal requirements are slightly different for this paper topic: six double-spaced pages, plus bibliography of at least five sources. You don’t need to quote or summarize or overtly reference these sources in the body of your essay; they should inform your work indirectly rather than directly. I will be looking for evidence that you’ve absorbed and synthesized some of the core themes of ENGL467 and are able to extend them in novel yet credible ways.

Create a hypothetical reception history for Robert Pinsky’s Mindwheel, Roberta and Ken Williams’ Mystery House, OR Sean Stewart’s Cathy’s Key that spans 25 or 50 years in the future. Think of yourself as a biographer, only you’re writing the life of an artifact, not a person. This is where the rubber meets the road: where issues of preservation, intellectual property, technology, authorship, creativity, reproduction, scholarship, and geo-politics coalesce to determine the fate of your object. Will it go viral to survive–or be locked down 200 feet below ground in a cold-storage vault? Will it remain inviolable–or be ripped, mixed, and burned so repeatedly that it morphs into something bearing no physical resemblance to what the author(s) originally created? Will it be taught in classrooms, exhibited in museums, studied by scholars, and propagated across online communities? Or will it slowly rot and decay in the trashbin of history? Will it be irradiated by the heat of a nuclear holocaust–or will humanity’s better angels prevail? Migrated across media and platforms, or permanently fixed in a material substrate? Remembered or forgotten? Lost or found?

To do well on this assignment, you will need to give considerable thought to how our various course themes interrelate: how, for example, does intellectual property affect preservation? How do media and technology affect preservation? At a more basic level, what is preservation, anyway? If a community of individuals transmits an object over time but mutates it in the process (think William Gibson on little Johnny X), does this transformission (a term coined by textual scholar Randall McLeod, which splices the words “transmission” and “transformation”) constitute a legitimate form of preservation? What if only a fragment of the original survives—is that preservation? What if the artists have created a work whose different nodes are distributed across multiple media, as in Cathy’s Key? Do some parts of this multi-unit work stand a better chance of survival than others?

Narrative point of view: lots of possibilities here. You could write from the vantage point of a third-person omniscient narrator. You could also tell the story from the point of view of the object itself; a future researcher, curator, collector, fanboy, or amateur; or all or none of these.

Vivid detail: give me granular information!

Limited space: six pages isn’t a whole lot of space in which to tell the life story of an art object. You may have to choose one or two key moments to relate rather than attempt to give an exhaustive account of the object’s transmission and reception. This is not necessarily to dissuade you from offering a more comprehensive view–rather it’s intended to get you thinking about how best to structure your writing.


(Collectively, these lab exercises should be spread out over 2-4 class sessions.)

This set of exercises has three parts. In Part I, students are introduced to Mystery House Taken Over, a crossbreed between a remix project and a preservation project. Working with the reimplemented source code of Mystery House–a classic 1980s-era game of interactive fiction–students can study, alter, and recompile the code to create their own mash-ups. In the process, they learn a lot about the history of the game (the first work of IF to include graphics) and what it means to update and re-create a vintage video game for a contemporary platform. The commented code gives insight into the creative decisions made by the programmers, who elected, among other things, not to require players to enter commands in all caps, as was the case in the original, while at the same time choosing to reproduce a programming bug.

In Part II, students run another early work of IF, Mindwheel, in an emulator. After playing the game, they obtain a hash value: a unique alphanumeric code that acts as a digital fingerprint for the file. To test the fingerprint’s sensitivity, they alter the game’s disk image in a hex editor and then obtain a new hash value, which demonstrates that the bits have changed in the interim. Digital archivists use hash functions in just this way to monitor the integrity of the digital objects in their care over time.
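The fingerprint idea can be demonstrated in miniature at the command line. This is a sketch rather than part of the lab handout: it hashes two strings that differ by a single character, using md5sum from GNU coreutils (on a Mac, the equivalent command is md5):

```shell
# A hash function as digital fingerprint: even a one-character
# difference in the input produces a completely unrelated digest.
# (md5sum ships with GNU coreutils; on macOS, use `md5` instead.)
a=$(printf 'mystery house' | md5sum | cut -d ' ' -f1)
b=$(printf 'Mystery house' | md5sum | cut -d ' ' -f1)
echo "$a"
echo "$b"
# Each digest is a 32-character hexadecimal string; the two share
# no obvious resemblance despite the tiny change in input.
```

Comparing the two printed strings makes the point of the exercise vivid before students ever open a hex editor.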

[The instructions for the hash values are adapted from a handout created by Matt Kirschenbaum and Naomi Nelson for their “Born Digital Materials: Theory and Practice” course at UVa’s Rare Book School.]

Part III, which I still need to write up, uses a KryoFlux and a disk drive to rescue bits off of old 5.25- or 3.5-inch floppy disks.

MYSTERY HOUSE (NB: I haven’t checked these links in a while, so it’s possible some are broken)

Modding the game:

Tips and Suggestions for Game Play and Game Mods

  • Begin by playing the original version of Mystery House online. If you need help, consult one of the walkthroughs by clicking on a relevant link above. Spend 15 minutes or so familiarizing yourself with the game.
  • You should already have the MHTO Occupation Kit on your desktop.
  • Consult the “Read Me” file in the MHTO occupation kit folder to help you make game modifications, recompile the code, and Blorbify it (we’ll go over this in the workshop). If you are a PC user, open the “Read Me” file and the Inform source code file in WordPad instead of Notepad for better formatting/legibility.
  • If the source code or “Read Me” file is still difficult to read because of the formatting, then copy and paste the mhto.inf file into Microsoft Word, and then copy and paste the contents back into your original file and save. Make sure you replace the old content with the new.
  • Suggestions for mods: Change the description of the house (or any other description); change the text of notes as printed on the screen; change the descriptions of the Non Player Characters (NPCs); change the number of turns that elapse before the “It is getting dark” message appears (original number is 20).
  • Guidelines for changing images: locate Flickr photographs released under CC licenses by using the CC search engine. Once you’ve downloaded an interesting image, use your favorite image editing software (default choices would be the “Paint” program on Windows or “Preview” on Mac) to change the file format from jpeg, gif (or whatever) to .png. (As a last resort, use the online image converter–see link above.) Rename the image using the exact same file name as the original image you want to replace (e.g., “Attic1.png”). Make sure you put your new image in the same folder as the original image (either “Items” or “Views”). The name of the image file for the house (front view) is “front_yard.png”.
  • If you’re feeling particularly ambitious, use the “Paint” application installed on your machine to create new notes. (NB: these notes are image files. You can save them directly in the required .png file format and then store them in the appropriate MHTO occupation kit image folder. Make sure they are named so that they replace an original image note–see previous bullet point.)
  • Before playing your modded game, make sure you’ve recompiled and reblorbified it following instructions in the “Read Me” file. Also make sure you’ve downloaded an interpreter, or reader, for the game (see “Glulx Interpreters” link above).

Experimenting with hash functions:

Windows/PC Users: start by locating and downloading two utilities:

  • Hex editor: HexEdit or FSHED (there are also others)
  • Find and download a free MD5 utility for Windows/PCs
  • Then proceed to the rest of the instructions for Mac users and adapt them for your purposes

Mac Users:

  • Copy the “Creative_Futures_Lab” folder from the USB drive directly onto your desktop (rather than into “Documents” or “Downloads” or any subdirectory).
  • Open the “Mystery_house_hexfiend” folder in the Creative_Futures_Lab folder and drag the “Mystery_House2.DSK” file out onto your desktop.
  • Open “Terminal” (the UNIX shell/command line interface) on your Mac.
  • Type in the following command (minus the quotation marks): “cd Desktop”
  • Now create an MD5 hash by typing in the following command (minus the quotation marks): “md5 Mystery_House2.DSK”
  • You should get back a long alphanumeric string that looks something like this: 2af9aeaab8d67d9d63114fabe11e5068. Copy this string into your text editor or word processor (e.g., MS Word).
  • Return to the “Mystery_house_hexfiend” folder and fire up the “Hexfiend” hex editor by double-clicking on the icon.
  • Open the “Mystery_House2.DSK” file you just hashed. Change a byte or two. Then save, and run your MD5 command again in Terminal (see above). Copy and paste this new alphanumeric string directly beneath the previous one in your text editor or word processor. What do you notice?
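For reference, the whole sequence can be collapsed into a short script. This is a sketch under stated assumptions: it substitutes a throwaway file for Mystery_House2.DSK, flips a byte with dd rather than in Hexfiend, and uses md5sum from GNU coreutils where the Mac Terminal command is md5:

```shell
# Create a stand-in for the disk image (the lab uses Mystery_House2.DSK).
printf 'stand-in for a disk image' > sample.dsk

# Hash the file before any changes.
before=$(md5sum sample.dsk | cut -d ' ' -f1)

# Simulate the hex-editor step: overwrite the first byte in place
# (conv=notrunc keeps dd from truncating the rest of the file).
printf 'X' | dd of=sample.dsk bs=1 count=1 conv=notrunc 2>/dev/null

# Hash again and compare: the fingerprints no longer match.
after=$(md5sum sample.dsk | cut -d ' ' -f1)
echo "before: $before"
echo "after:  $after"
```

Running the hash before and after the one-byte edit is exactly the fixity check archivists automate at scale.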




Title: Mindwheel: An Electronic Novel
Creator: Robert Pinsky
Creator: Steve Hales
Creator: William Mataga
Contributor: Richard Sanford
Contributors: Kazuko Foster, Richard Blair, Thom Hayward
Publisher: Synapse Software Corporation / Broderbund Software
Date: 1984
Type: Interactive Fiction / Adventure Game
Format: 5-1/4-inch floppy disk + hardbound book
Format: BTZ [Better Than Zork] parser/programming language
Description: Developed by Synapse Software and distributed by Broderbund, Mindwheel was one of five interactive electronic novels published by the company as part of its text adventure series. Initially released for IBM and Apple, the game was also adapted for the Atari and Commodore (the complete list of platforms includes the Apple II, Atari 8-bit, Atari ST, Commodore 64, and DOS).

Description: The user (or “player character”) adopts the persona of a “mind adventurer” who must travel telepathically into the past to retrieve the Wheel of Wisdom, a mysterious object upon which the fate of humanity rests. With the help of Dr. Virgil, the user travels through the minds of four deceased individuals: a rock star, a dictator, a poet, and a scientist. Along the way, she solves puzzles, answers riddles, and encounters the Cave Master, a prehistoric creature from whom she must obtain the wheel in order to avert disaster and save humankind from extinction. The book that accompanies the disk serves as both an instruction manual and fictional guide to the plot. It includes excerpts from an alleged textbook on “matrix immortality,” an interview with Dr. Virgil, an apocryphal note on the genesis of the novel, illustrations, photographs, poems, and blank pages for the user to jot down thoughts about the game.

Rights: Riverdeep, Inc.

MS-DOS version of game: http://www.igorlabs.com/etc/games/index.html

Walkthrough from GamesOver.com:

http://www.gamesover.com/walkthroughs/mindwhee.txt (This walkthrough does not specify which system it is for.)

Walkthrough from IFArchive.com: http://www.ifarchive.org/if-archive/solutions/Mindwheel.sol


  • Open the “Agrippa and Mindwheel” folder in the Creative_Futures_Lab folder
  • Drag and drop the “vMac.ROM” on top of the “Mini vMac” emulator
  • Once you get a flashing disk icon with a question mark in the emulator, drag and drop the “System701_boot.dsk” file from the folder on top of it.
  • Now drag and drop the “Mindwheel.img” file onto the emulator window and then double-click on it once it appears in the emulator.
  • To start playing Mindwheel, you’ll be prompted for a password. To obtain the password, identify the relevant page number, line number, and word number the program asks for by opening the “nw0010023.pdf” file (which is a copy of the Mindwheel game book).
  • Enter the password and start to play (see below for information on Mindwheel and how to find online walkthroughs).


  • Copy the “Creative_Futures_Lab” folder from the USB drive directly onto your desktop (rather than into “Documents” or “Downloads” or any subdirectory).
  • Open the “Agrippa and Mindwheel” folder in the Creative_Futures_Lab folder and drag the “Mindwheel2.img” out onto your desktop.
  • Open “Terminal” (the UNIX shell/command line interface) on your Mac.
  • Type in the following command (minus the quotation marks): “cd Desktop”
  • Now create an MD5 hash by typing in the following command (minus the quotation marks): “md5 Mindwheel2.img”
  • You should get back a long alphanumeric string that looks something like this: 2af9aeaab8d67d9d63114fabe11e5068. Copy this string into your text editor or word processor (e.g., MS Word).
  • Return to the “Agrippa and Mindwheel” folder and fire up “Hexfiend” by double-clicking on the icon.
  • In Hexfiend, open the “Mindwheel2.img” file you just hashed. Change a byte or two. Then save, and run your MD5 command again in Terminal (see above). Copy and paste this new alphanumeric string directly beneath the previous one in your text editor or word processor. What do you notice?



“When Data Disappears”: My NYT Op-Ed

My op-ed for the New York Times on digital preservation was published today.  A shout-out to Clay Risen, editor extraordinaire at the New York Times, for approaching me about writing it and for all his help. I first met Clay a couple of years ago when he interviewed me for an article he was writing on game preservation.

For those wanting more information on how to salvage bits from old storage media, particularly magnetic media, see Jeanne Kramer-Smyth’s excellent “Rescuing 5.25″ Floppy Disks from Oblivion” and Archive Team’s wiki page on “Rescuing Floppy Disks.”

For more on how archivists, scholars, and players are preserving vintage videogames, download the “Preserving Virtual Worlds” final report, a white paper to the Library of Congress’s National Digital Information Infrastructure and Preservation Program, co-authored with Jerome McDonough et al.

Also forthcoming is ” ‘Do You Want to Save Your Progress?’ The Role of Professional and Player Communities in Preserving Virtual Worlds,” co-authored by me and Rachel Donahue. For a preview, you can listen to my recent talk at the New York Public Library. And if you’re interested in the legal angle on videogame preservation, check out my article in the Journal of Visual Culture.

Finally, you can read a description of our current work on videogame preservation here.

Machinima Issue of the Journal of Visual Culture

The Machinima Issue of the Journal of Visual Culture is now out. To access the full text of articles, you need a subscription to the journal. Under the terms of the licensing agreement, however, authors can publish the first version of the article sent to the editors. Since my early version is nearly identical to the published version (with the important exception of images), I’ve made it available here for download and circulation.

Here’s the abstract:

Kari Kraus, “‘A Counter-Friction to the Machine’: What Game Scholars, Librarians, and Archivists Can Learn from Machinima Makers about User Activism,” Journal of Visual Culture 10 (2011): 100-112.

The author examines the legal issues associated with machinima creation in relation to archival and preservation efforts. Specifically, she argues that what makes machinima as a cultural practice particularly interesting from a legal perspective is its ability to dramatize the tension between copyright law and contract law; public rights and private rights; and the right of reproduction versus the right of adaptation. She proposes that game scholars, librarians, and archivists take a page from the playbook of machinima creators when developing their own professional approaches to user activism and digital access and preservation.