MAIS 623 Assignment #1

Digital Humanities: An Elegy for the Capital I

The ideas of this course, and the concepts behind Digital Humanities as a whole, unite our personal experience of computer and networking technology (along with its “behind the scenes” aspects, including technical decision-making and the philosophy of computing and the Internet) with our commitment to engaging in and perpetuating the life of the Humanities disciplines in an interdisciplinary (or post-disciplinary) era. This paper will look at my own loooong background in computers through the lens of the material covered so far in MAIS 623 Digital Humanities, exploring the history of the Internet (how it got this way) and the future of Digital Humanities (where it’s going), and reaching some conclusions about whether this essay itself is an act of engagement in the process of Digital Humanities.

Personal Background

In a sense, the essay A Brief History of the Internet (Leiner et al., 2016) is the backdrop to my own digital life, beginning in the early 1960s (my own physical, biological life began in the last five days of 1969) and evolving through to the present day. It focuses more on the events unfolding in the background, setting the stage for the networking most of us do today, than on the front-end computers in common use from the mid-1970s onwards. Nonetheless, it is the story of my life in the sense that by the time users like me were ready for the technology, it was there, fully formed, with much of the groundwork laid behind the scenes by folks like Leiner, Cerf, and their colleagues.

I started using computers when they first started finding their way into schools, in the late 1970s. I’ve been using PCs, for the most part, since they first came out in 1981. I was using BBSs – the precursor to today’s networking setups – in the late 1980s, and my introduction to the Internet, mainly through email, came around 1990. I was working for tech companies in the mid-1990s, before the bubble burst.

Though I’ve never been challenged to formulate my own philosophy of computing, it would have to be “the more open, the better.” In keeping with that, I have always tried to take the path least travelled, whenever there was one, in my dealings with technology. Just a few examples…

  • Choosing Unix back when PCs couldn’t network;
  • Commodore Amigas back when PCs couldn’t do graphics;
  • Back to free Unix distros when Windows stopped doing the things I wanted as simply as I wanted to do them;
  • Using WordPerfect when Microsoft Word was starting to get big;
  • Using Paint Shop Pro when Photoshop was becoming a Thing;
  • Picking Mosaic when Internet Explorer threatened to monopolize the world;
  • Searching with AltaVista when Yahoo seemed like it would turn the Web into a hick town.

Far from sheer bad luck – always picking the loser! – these choices represented my deliberately backing the underdog in the hope that, with my help, things wouldn’t tip: that the PC wouldn’t be the only computer we had; that Windows and Microsoft Word wouldn’t be the only choice; that Photoshop wouldn’t become an all-powerful juggernaut; that IE wouldn’t rule our lives.

In Jaron Lanier’s (2011) terms, I was working in my own small way to preserve openness and diversity and prevent what he calls “lock-in,” the process by which “thought structures are solidified into effectively permanent reality” (p. 9).

Without really giving it much thought, perhaps, I also assumed – contrary to Lanier’s assertion, and Arment’s, as we will see later – that the best way to avoid lock-in was to embrace the culture of shareware and freeware. That was not an easy thing before the Internet; in those days, you had to get your software on a floppy disk. I say “without giving it much thought,” but in honesty, those of us who thought about such things in those heady early days probably believed (and many still believe) in striving together towards common goals and in the faceless open protocols on which the Internet was built. It’s hard for those who weren’t there to understand: openness was simply so refreshing after years of proprietary everything. The leap from hunting down the exact version of a program that would work on my system to websites that worked on any platform was tremendously liberating. So I love the idea of striving together, communism-style, towards a noble common goal.

And despite Lanier’s nay-saying, we have seen incredible gains because of that early cooperative spirit. I cannot tell you how easy life with computers is in some ways – at least, not without using the word “nowadays” and starting to feel like I’m going on 80 years old. But it’s true! There are so many file formats and machines that can talk to one another perfectly well. At one point, if you had a file on a disk, that disk literally, physically would not work in another type of computer, and that was that. You were stuck unless you owned some kind of pricey intermediary conversion machine.

At some point, life started to get easier. (As Leiner et al. (2016) chronicle, this was due, behind the scenes, to the hard work of committees dealing with standards for interoperability.) Files just started to work and all the computers started talking to each other, faster and faster and faster, and… it was pure magic.

We were all under the spell of that magic for a while, I think.

Massively Networked

Slowly but surely, we all became massively networked, beyond our wildest dreams. I got there earlier than most: I sent my first email back in 1990. I was at Mount Royal College, and I had a friend from Toronto. I looked him up on the Internet – something you could still do in those days, because there simply weren’t that many people on it. So I found him and sent him an email, and that was that. We were in business.

Later, there were web browsers, and even a few websites, and as Lanier says, they were almost all weird. Soon enough, anybody could have a website, and anybody could blog or share photos and do most of the things you can do now (except streaming video!) – but you needed some tech skills, and no two sites were alike. Awesome.

And just about everything was free. Not just free, but truly free. Ad-free, garbage-free. If you found a site you didn’t like or a person whose views you hated, you just stayed away.

But the thing Lanier (2011) doesn’t mention in his elegy for the strangeness of the early Internet – “Individual web pages as they first appeared in the early 1990s had the flavor of personhood” (p. 48) – is that it quickly became exhausting. After a while, there was just too much. Too much of what?

  • Too much work to post a single blog post.
  • Too much trouble to rearrange pages on your site so the most recent entry was up on top (blogging sites do this automatically, of course).
  • Too much trouble to find sites worth visiting (pre-Google).
  • Too much trouble to get online – for a while you had to have three pieces of software and a modem and a phone line, and they all had to talk to each other while you stayed off the phone line.
  • Too much work to keep the links on your own site current.

Certainly, it was all amazing, and tons of fun, but it desperately needed simplification if everybody was going to use it. This simplification is exactly what Lanier speaks out against – the transition from yucky-looking personal web pages, through the somewhat-personal MySpace, to the slick-looking but personless Facebook.

We welcomed this transition, mostly. Perhaps that was short-sighted.

Even though we were busy playing around online, we were also living in the real world, developing a sense of what good design looked like. So everybody kind of cheered, I think, when good design came to the Internet. When Google arrived and was so minimalist and clean that you had to love it. When Facebook made all the choices easy. As Arment (2011) explains in his video, which provides many useful insights into the process of software production, very few companies consulted users before creating their software – which is probably for the best. That way, they were able to roll out products that dazzled us with their design and simplicity.

So the transition to Facebook was a no-brainer, for most people. But some of us remained naïve for a while. I think we didn’t recognize it at first – the greed behind these slick new sites. The (shudder) profit motive. The sheer chutzpah! A friend of mine complained a couple of weeks ago that Facebook kept re-sorting her newsfeed into the order it wanted her to view items in. Oh, you can choose to display items chronologically, but the next time you come back to Facebook, they’ll be back in the order Facebook thinks they should be. And this is because, to Facebook, we are users but – as Arment (2011) explains – we are not customers. The customers are the people paying to buy ads. The product here is us: our eyeballs, our clicks. And as Arment says, this inevitably corrupts the motivation to create great software.

But I’m getting ahead of myself. Because I really believed at first, and I think a lot of us did, that the Internet was going to be like some kind of big hippie campground. Like Burning Man, where there would be no money, no problems, no trash – you pack it in, you pack it out, and leave no trace behind. No wonder we thought that. Computers kept getting more and more powerful, and faster and faster, and cheaper and cheaper, and people just figured that at some point it would all be incredible. Just breathtakingly lovely. And free.

That was what Facebook promised, and then (surprise!) they sold us out by actually turning a profit. It was like setting up a Starbucks at Burning Man, except all your friends were there, you’d already bought your ticket, the coffee tasted great, so what could you do?

You stayed. We all stayed on Facebook.

To his credit, Jaron Lanier doesn’t seem to have a Facebook or any other social-media presence. And his personal website still looks butt-ugly. Like, circa 1999 butt-ugly.

(Don’t believe me? Here’s a screenshot!)

[Screenshot of Jaron Lanier’s personal website]

Fig 1. Pretty Darn Ugly.

Reverse Alchemy

I have so far been unable to come up with a parallel for this period of growth, of bursting out of ourselves, that the Internet created in the 1990s and early 2000s. Has this ever happened before in human history? When they invented the printing press, nobody thought they’d just be churning out Bibles forever (not with so much erotica waiting to be published!). But printing presses had to turn a profit from Day One or the company running them would be shut down – this was the real world, after all, not some kind of dot-com bubble.

Everybody tends to make overblown claims about the Internet, so I want to avoid doing the same. But I find myself believing this much is true: the Internet has undergone a kind of reverse alchemy over the last twenty years. We have turned gold into dung; the shining city on the hill has become the decaying inner city, and every Internet user is trapped in his or her own ghetto of ordinariness. An ordinariness that, thanks to the miracle of modern technology, can now follow us around everywhere we go.

The words have lost their wonder, and we can see this in the way that all the overblown terms we created no longer warrant their capital letters. Internet, you don’t deserve a capital I. Information Superhighway, whatever were you, anyway? Nobody ever quite knew. “Net”? Sounds like something you’d get caught in if you weren’t careful. Surfing? A passing fad; here today, gone tomorrow.

You can’t step in the same internet twice.

Since I have children in their 20s, I know the sad truth – Facebook is already gone. They only use it because that’s where their parents and grandparents are. They’re on reddit, they’re on snapchat – no more capital letters for their apps, because those apps are all about glorifying the ephemeral: here today, gone tomorrow.

And now there is the internet that is everywhere – the Internet of Things – and an internet that is everywhere is also an internet that is nowhere. The Internet of my past has vanished without a trace.

Engaging with Digital Humanities

Now, the broader question I must address is this: as I pursue my studies at the Master’s level and intensify my relationship with the Humanities, what role is Digital Humanities going to play in this endeavour over the next five years?

There is no doubt that technology will be involved, but to what extent does that turn my pursuits into Digital Humanities? Do we have to harness the full potential of the medium (media) for that term to apply, or is it Digital Humanities if I use one tool to convert a course reading to text via OCR, and then a second tool to convert that text to audio, so that I can export the MP3 to my phone and listen to the course “readings” while playing Solitaire in the background? (A most pleasurable way to study.)
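
For the curious, that workflow is simple enough to sketch in a few lines of Python. This is only an illustration, not a recipe: it assumes the reading exists as scanned page images (the file names here are hypothetical), and it leans on the pytesseract OCR library and the gTTS text-to-speech library:

    # A minimal sketch of the "readings to MP3" pipeline described above.
    # Assumes the course reading is available as scanned page images.
    from PIL import Image
    import pytesseract       # OCR via the Tesseract engine
    from gtts import gTTS    # text-to-speech; saves straight to MP3

    # Step 1: OCR each scanned page into plain text.
    pages = ["reading_p1.png", "reading_p2.png"]   # hypothetical file names
    text = "\n".join(pytesseract.image_to_string(Image.open(p)) for p in pages)

    # Step 2: Convert the recovered text to speech and save an MP3
    # ready to be copied to a phone.
    gTTS(text, lang="en").save("reading.mp3")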

Do our endeavours count as Digital Humanities – and again, we’re back with the capital letters, even though I’m not at all certain they are earned in this case either – if they are only moderately digital? Or only moderately humane? Let’s take this essay as an example, since it is part of my coursework towards three credits in Digital Humanities. The fact that I am posting this essay in blog format carries very little significance here in the year 2017, when, although blogs are “public,” there is very little chance that it will be discovered and shared at all.

I’m not at all sure that the term has been adequately addressed in any of our sources. I’m not at all sure that it can be, given the limitations of our understanding.

Still, the short answer to this question is… yes. Not a resounding yes – a tentative yes. If, as Burdick et al. (2012) assert, “Digital Humanities is born of the encounter between traditional humanities and computational methods” (p. 3), then this essay, written in a fairly traditional humanities fashion – conducting research, creating a loose thesis, and then typing the whole lot up – has “encountered computational methods” simply by virtue of being typed up in Microsoft Word and posted to a blog. It is text-based, which they suggest represents the core of the humanities. And perhaps, as they also suggest, it takes advantage of “new modes of knowledge formation enabled by networked, digital environments” (p. 7).

In light of this, Digital Humanities feels like a bit of a letdown, frankly. I hope there’s more to it than this.

Hayles (2012), for one, is more optimistic. Where I see most of us lazily submitting rather traditional papers in an online fashion, to be distributed faster than ever but in a format that is mostly reminiscent of 19th- and 20th-century humanities, she believes that some will expand their horizons into a second wave, a veritable flood of “multimedia practices that explore the fusion of text-based humanities with film, sound, animation, graphics, and other multimodal practices across real, mixed, and virtual reality platforms” (p. 43). She sees this new wave as supporting and sustaining the traditional humanities into a new era.

I hope she’s right. I really hope the rest of us will start doing something creative and won’t just keep humping our ideas onto the back of the most convenient free platform provided for the purpose (thanks, WordPress!). Adding a few hyperlinks here and there isn’t going to cut it. And jazzing things up old-school, Lanier-style, isn’t going to cut it either, because few of us have the knowledge and technical skills to sit down and build a web page – nor the desire to maintain one.

Lanier (2011) writes, when he’s finished rambling about cephalopods and starts talking about solutions, that “200 percent of responsibility needs to be taken for things to improve” (p. 202). Judging from the kludge of his website and the misery of his life without social media, not to mention his activism and authorship, he’s shouldering more than his share. But that still leaves a lot of work to be done. And I’m just not that confident that we’re up to the task of preserving his vision of a unique digital presence, or Hayles’s (2012) vision of claiming and engaging with the information age “in serious, sustained, and systemic ways” (p. 61). I really think we’ll go on schlumping lazily through our digital lives unless something – though it’s hard to imagine what – comes along to shock us out of the dream and back into sobriety.

References

Arment, M. (2011). Contrary to popular belief [Video]. Webstock. https://vimeo.com/21779251

Burdick, A., Drucker, J., & Lunenfeld, P. (2012). Humanities to digital humanities. In Digital_Humanities. Cambridge, MA: The MIT Press.

Hayles, N. K. (2012). How we think: Transforming power and digital technologies. In D. Berry (Ed.), Understanding digital humanities (pp. 42-66). Thousand Oaks, CA: SAGE.

Lanier, J. (2011). You are not a gadget: A manifesto. New York: Knopf.

Leiner, B., Cerf, V., Clark, D., Kahn, R., Kleinrock, L., Lynch, D., et al. (2016). A brief history of the Internet. Internet Society. Internetsociety.org
