Author Archives: Lee Skallerup Bessette

How Interactive or Collaborative is Day of DH?

I wasn’t the only one who created an archive of the #dayofdh tweets; my colleague Ernesto Priego did too. And rather than do the work twice, I’ll share the visualization that he set up for the tweets of the day (you can zoom out and see just how big the day was for Twitter users). I tweeted the second-most during the day (not much of a surprise there, I guess), but what did surprise me is just how isolated we all are in this giant Twitter bubble.

TopTweeters

According to my spreadsheet totals, 603 unique Twitter handles tweeted using the #dayofdh hashtag at least once (these numbers will change as the spreadsheet continually updates). That, to me, seems like a lot of people participating. Here is a breakdown.

dayofdhsummary

That’s a lot of links and a lot of RTs. So, a lot of sharing. And almost 5,500 tweets, all told.
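For anyone curious how tallies like these can be pulled out of an exported archive, here is a minimal sketch. The row layout and sample tweets are my own invention for illustration; the real TAGS spreadsheet has more columns.

```python
from collections import Counter

def summarize(rows):
    """Tally tweets per handle, RTs, and link tweets from (handle, text) pairs."""
    handles = Counter(h.lower() for h, _ in rows)
    rts = sum(1 for _, t in rows if t.startswith("RT "))
    links = sum(1 for _, t in rows if "http" in t)
    return handles, rts, links

# Tiny made-up sample standing in for the exported spreadsheet rows
sample = [
    ("readywriting", "Archiving the day #dayofdh"),
    ("ernestopriego", "RT @readywriting: Archiving the day #dayofdh"),
    ("readywriting", "Here is a link http://example.com #dayofdh"),
]
handles, rts, links = summarize(sample)
print(len(handles), rts, links)  # unique handles, RTs, tweets with links
```

The same three counts, run over the full spreadsheet export, would reproduce the summary above.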

But what about interaction between the participants? The DayofDH account was, perhaps unsurprisingly, the most “conversational” during the day.

TopConvo

But most of us just broadcast ourselves during the day using the #dayofDH hashtag, while few of us used it to actually carry on conversations. That isn’t to say that conversations weren’t happening, but that we weren’t using the hashtag to carry them out.

I think that’s problematic. Although there is no way to force participants to tag everything they say during the Day of DH, the archive paints a really narrow picture of what DH is; and DH, at its heart, is collaborative. In my first post on Sunday night I mentioned that I am trying to do DH alone, but I neglected to mention that I am physically alone, doing DH in my small, relatively isolated geographical area. I am nonetheless part of a larger collaborative community through my social media networks, whose members support and help each other in our work.

My work yesterday in Voyant would not have been possible without the virtual assistance of Stefan Sinclair, as well as the valuable introductory tutorial shared by Brian Croxall. I only know these tools exist largely through Twitter. When I’m stuck, I tweet out my issue, and I almost always get an immediate response from someone in my network.

So I guess what I’m saying is that I’m a little disappointed by what the #dayofdh Twitter archive reveals about what we did with our day (or what we chose to highlight as important in it).

So then I decided to run the raw tweets through Voyant. Here is the Word Cloud:

dayofdhwordcloud

Sorry this isn’t an embedded live tool; I couldn’t get it to work. Anyway, we apparently need more time. Not tremendously surprising. If you want to play with the data yourself in Voyant, here is the link. To get clean data, I took out things like RT, http, and the hashtag itself. In case you’re interested in the process: I exported my Google spreadsheet, copied the text into a text file, and then uploaded it to Voyant.
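If you’d rather script that cleanup step than do it by hand, a sketch like this would do it. The patterns are my own guess at what needs stripping; adjust to taste.

```python
import re

def clean_tweet(text, hashtag="#dayofdh"):
    """Strip links, RT markers, and the hashtag before word-frequency analysis."""
    text = re.sub(r"https?://\S+", "", text)                          # remove links
    text = re.sub(r"\bRT\b", "", text)                                # remove retweet markers
    text = re.sub(re.escape(hashtag), "", text, flags=re.IGNORECASE)  # remove the hashtag
    return re.sub(r"\s+", " ", text).strip()                          # collapse whitespace

print(clean_tweet("RT @someone: we need more time http://t.co/abc #DayofDH"))
```

Running every tweet through a function like this before uploading to Voyant keeps RT, http, and the hashtag from dominating the word cloud.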

Final thoughts. I just spent an hour putting this post together, gathering the data, and making the various visualization techniques work. Why?

I did it because I believe in being a good DH (and social media) citizen. I can’t do much (yet), but what I can do (archive), I think is important to do and to share widely. When I live-tweet conferences or archive hashtags, I do it because I think it’s important that someone keeps track and publicly archives this material, making it available and usable.

I call it Twitter-Karma. I put out useful things in the hopes that useful things come back to me. Hasn’t let me down yet.

Happy #dayofdh, everyone.

Visualizing the Text

This is not Big Data. This isn’t even medium-sized data. This is two versions of the same novel, one from 1994 and the other from 2012. The book is Chronique de la dérive douce, by Dany Laferrière. I spent a month digitizing and cleaning up the text, and today I was finally able to run it through both Juxta and Voyant.

I was most interested in Juxta because it would highlight exactly what has changed and what hasn’t in the text. What shocked me, however, was just how much the new version changed. I knew it had doubled in size, so a significant portion had been added, but Laferrière also went in and tinkered with the original text, something he hadn’t done in his other “new” versions of his work.

What did I learn from Juxta? First off, the text wasn’t as clean as I thought. Because I was working on the document on different platforms (the Mac at home and a PC at work), some of the commas and quotation marks were off. Plus, in the original version of the book, certain words were broken at the ends of lines and divided by a hyphen, while the new version had none of that. I made the executive decision that formatting was less interesting to me than the words themselves, so I went back and cleaned up the text some more.
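Much of that second cleanup pass can be scripted. Here is a sketch of the two fixes described above: rejoining words hyphenated across line breaks and normalizing the curly quotes that differ between platforms. This is a hypothetical helper of my own, not part of Juxta.

```python
import re

def tidy(text):
    # Rejoin words hyphenated across line breaks: "déri-\nve" -> "dérive"
    text = re.sub(r"(\w)-\n(\w)", r"\1\2", text)
    # Normalize curly quotes and apostrophes to straight equivalents
    for curly, straight in (("\u201c", '"'), ("\u201d", '"'),
                            ("\u2018", "'"), ("\u2019", "'")):
        text = text.replace(curly, straight)
    return text

print(tidy("une déri-\nve \u201cdouce\u201d"))
```

Python 3’s `\w` matches accented letters by default, so the hyphen-rejoining works on French text without extra flags.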

What I ended up with was a great side-by-side comparison of the text. But first, how much had the text changed? A lot.

Juxta1

While it was processing the text, Juxta reported over 10,000 changes. Great. That’s just what a close-reading, textual scholar wants to hear.

Juxta2

You can click on the picture to see a larger version, but this is just how much the first page changed. The epigraph changed. The first verse changed. And I have to say that I probably wouldn’t have noticed the changes in the first verse (clearly, I noticed the epigraph) had I not been able to visualize them like this. The content of the verses is still pretty much the same, but subtly changed. It’s not a wholesale addition or subtraction. Just…different.

Now we come to the heat map to see where the changes in the text have taken place. Conclusion? EVERYWHERE.

Juxta3

The last line of the book didn’t change. There isn’t anywhere else in the book that hasn’t changed somehow. This is going to take a lot longer than I thought. But what does Voyant have to say about my piece? Quite a bit actually.

First off, the number of words? Almost literally doubled.

VoyantSummary

I was really interested in the peaks and in some of the words that appeared more frequently in one version versus the other.

VoyantTrends

Relative to the text, the words “filles” and “fille” and “femme” (girls, girl, woman) and “chambre” (room) decrease in density from the first version to the second, but the word “temps” (time) increases. Here it is put another way:

VoyantFrequency

This is actually REALLY interesting and possibly significant. I’d have to look more closely at when and where these words appear in the two texts, but this shows me that his relationships (as well as his room) become less significant, while the concept of time becomes more important. This is a suspicion I had long held about the revision, and this just confirms it.
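The relative-frequency comparison Voyant is making here can be sketched in a few lines, for anyone who wants to replicate it outside the tool. The two snippets of “text” below are invented placeholders, not Laferrière’s actual prose.

```python
from collections import Counter
import re

def rel_freq(text):
    """Word frequencies relative to the text's total word count."""
    words = re.findall(r"\w+", text.lower())
    counts = Counter(words)
    n = len(words)
    return {w: c / n for w, c in counts.items()}

# Invented stand-ins for the 1994 and 2012 versions
v1994 = "la fille entre dans la chambre"
v2012 = "le temps passe le temps revient dans la chambre"
f1, f2 = rel_freq(v1994), rel_freq(v2012)
for w in ("fille", "chambre", "temps"):
    print(w, round(f1.get(w, 0.0), 3), round(f2.get(w, 0.0), 3))
```

Dividing by the total word count is what makes the comparison fair: the 2012 edition is nearly twice as long, so raw counts alone would exaggerate every word’s increase.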

I’m pretty excited, and I am grateful to Stefan Sinclair for helping me with some of the pickier aspects of Voyant. I’m going to be doing more work here to study the two texts, but certainly, this is a great place to start.

DH and “Digital” Pedagogy

I’m a big fan of, and contributor to, the online journal Hybrid Pedagogy. I’ve contributed a few things, but one of the issues I continually write about or refer to is the digital “ethos” that things like hybrid pedagogy or even DH inspire. For me, it’s about building, creating, interacting, making, and collaborating.

Take, for example, my peer-driven learning class. I let the students decide for themselves how they want to interact with the pieces they’ve selected, as well as what they want to build for the other students to use, interact with, and engage. Today, it just so happened that the majority of the presentations were videos, one of which was a video essay and another a music video. The students reflected on their choices, on what the point of the videos was, and on how they went about making them.

Other students in the class have made analogue games; one made a Magic: The Gathering-type card game, but with scientists. Another did a Risk-type game using the different geographic regions of the US (they surveyed students and found that a good majority at our school believed we were in for another civil war). Another group created a Jeopardy-like game to expose our ignorance about poverty in the United States and globally. These, to me, all reflect a kind of DH ethos of making and remixing, even if they didn’t involve digital media.

Having said that, I am particularly inspired by this “new” book created in Scalar (which is in beta, so you should check it out; I think it’s awesome), Flows of Reading: Engaging with Texts. Because I want this class to be a collective and new experience, I’m thinking this “book” would be an excellent resource to get the students thinking differently about engaging and interacting with text. The example given is Moby-Dick, but we could easily create our own “chapters” around the readings the students have selected.

I really like Scalar because it allows text to be interactive and media-rich. And it lets you mash them together. Perhaps this book will become the bridge in my peer-driven class between staying completely analogue and going all-digital. Or at least, it will hopefully get the students to think differently about the difference.

Live Archive of #DayofDH Tweets

I’ve started a live archive of #dayofdh tweets here. It updates every hour and goes back about a week. If I had thought of starting it last night, we’d have a live archive of just the day, but I’m sure there’s a way to get around that problem once/if anyone wants to start visualizing and working with the raw data of the Twitter archive.

I am forever indebted to Mark Sample for first introducing me to this method, but we should all express a whole ton of gratitude to the creator of the script necessary to create such an archive, Martin Hawksey. If you want the spreadsheet for your own Twitter-searching purposes, you can find that post here. There is also a pretty cool visualization tool he created for the spreadsheet (which I’ll be using later today or tomorrow morning to “see” what the Day of DH was like on Twitter).

But I invite and encourage everyone to check out the work he does on his site, which is all open-access and open-source. He does A LOT of really cool data collection and visualization stuff (that goes over my head, honestly, but it’s still pretty neat).

I’ll be back later with some “numbers” regarding the Day of DH tweets. I don’t know if it’s implied or not, but feel free to do whatever you want with the archive I’ve created. Except delete it. Please don’t do that.

Why “Still Trying to do DH”?

This was cross-posted at my other blog at IHE. At least it will be sometime tonight.

This marks the second year that I’ve participated in the Day of DH, when those of us who are digital humanists (or aspiring to be) record what it is we do during the day. The first one was in 2009, organized by my alma mater, the University of Alberta. This year, the Matrix over at Michigan State University is hosting our blogs for the day. Last year, I spent the day digitizing a pile of inter-library loan books I am using for my work with the Editing Modernism in Canada group (among other things that I don’t even remember doing – I’ve been freaking busy!). For this first post, I want to explain why I named my blog this year Still Trying to do DH.

I’ve come a long way in a year. My panel on DH was accepted at MLA13 (which was one of the things I was working on for last year’s Day of DH), and I was included on another one. This year, I’ve also been accepted to speak at DH2013, which is a pretty big deal. I’ve also now attended DHSI and the inaugural DHWI, a THATCamp, and the Networked Humanities conference. But I’m still struggling with actually “doing DH.” I’ll give you an example of why that is.

I’m currently writing a traditional academic book on the Haitian-Canadian author Dany Laferrière. I’m looking specifically at how he revises, rewrites, and adapts his work. One chapter will focus on his book Chronique de la dérive douce (A Drifting Year in English translation). When he revised and expanded the original 1994 edition in 2012, it nearly doubled in size. I wanted a way to visualize the changes, to see if anything jumped out as significant. I also wanted to see more easily how much he had changed and added. This meant digitizing the two books.

I’ve been working on this for a month. First, photocopy the books. Then cut them up so that there are no page numbers or page headings. Then scan them, OCR them, turn them into txt files, and clean up the text. Oh, the cleaning of the text. Adding to the challenge is that the books are in French, with all of the accents and other symbols. Why did this take a month? Because I teach four writing-intensive classes, am still trying to write the book, and have no place on campus that offers this kind of support. I could pay for it, but I would probably have had to clean up the text anyway.
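One concrete gotcha with accented French text: OCR tools and different platforms sometimes emit accents in decomposed form (a base letter plus a combining mark), which looks identical on screen but compares unequal in any diffing tool. A small sketch of the fix using Python’s standard library:

```python
import unicodedata

def fix_accents(text):
    """Normalize to composed (NFC) form so 'é' is one code point everywhere."""
    return unicodedata.normalize("NFC", text)

decomposed = "de\u0301rive"  # 'e' followed by a combining acute accent
composed = fix_accents(decomposed)
print(composed == "dérive", len(decomposed), len(composed))
```

Running both txt files through a pass like this before comparison keeps a tool like Juxta from flagging every accented word as a change.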

I’ve written about it before, and I’m writing about it again: it is hard to actually do DH (at least the way some define it) if you are off the tenure-track at a smaller institution that does not (yet) value or support (and perhaps never will) digital humanities work (we teach students how to use proprietary tools, and “hacking” is not encouraged). I use the tools that are freely available to me because that’s all I have the time or the expertise for at the moment. It’s hard to do digital pedagogy when my students openly resist learning to use new tools.

On the actual Day of DH (as opposed to Sunday afternoon, as I am writing this), I will be in class all day, largely watching my peer-driven learning students present their projects. While most of the projects thus far have been analogue in nature, the ethos of building something has informed what we have done as a class in order to engage with the materials they chose to read. I have been live-tweeting the presentations over the past two weeks using #peerdriven and will continue today. I’ll blog a little more about that during the day, and hopefully, once my kids are asleep, I’ll be able to do a little more work on my research, feeding my lovely txt files through both Juxta Commons and Voyant.

Until then, remember the isolated DH practitioner, trying to make things work on her own.