Poiesis and Techne


There is a desire to make sense of the whatness of technology. What are we doing with it in the humanities? What should we be doing with technology in the humanities? Is what we are doing with technology dangerous? If it is dangerous, should we proceed with caution, or throw caution to the wind and, with reckless abandon, do statistical analysis, generate word lists and concordances, digitally image and represent, shut texts up in silos, and certify authorship without let or hindrance? What are we doing and why?

It is interesting to turn to Martin Heidegger with regard to the essence of technology. Despite the fact that reading Heidegger can make anyone feel like a first-rate moron, persistence will pay off. After all, you can feel like a super-mega-hella-nerd when you are able to throw around seemingly incomprehensible phrases like, “Enframing means the gathering together of that setting-upon which sets upon man, i.e., challenges him forth, to reveal the real, in the mode of ordering, as standing-reserve. Enframing means that way of revealing which holds sway in the essence of modern technology and which is itself nothing technological.” What? Just . . . what?

Let’s get down to this. Heidegger presented a few lectures on the nature of technology. Writing after the firebombing of Dresden, the dropping of the atomic bombs on Hiroshima and Nagasaki, and the attempted cultural annihilation of Jews (and gypsies, and homosexuals, and communists, and labor union organizers, and political dissidents, and religious people of all stripes who opposed the Third Reich’s state-based religion of Nazism), Heidegger had reason to question the essence of technology. An unapologetic Nazi sympathizer, though he did call it “die größte Dummheit seines Lebens” (the biggest stupidity of his life), Heidegger saw technology put to violent uses: chemical pesticides and atomic energy used to kill humans. He also saw developments that would be more peaceable: radar, originally developed in Britain to detect the Luftwaffe, used to track weather, and Alan Turing’s “bombe,” the machine at Bletchley Park that cracked the Enigma Code, a forerunner of computers such as the one you’re reading this blog on. Heidegger’s entire philosophy rests on his annoyance that people don’t question enough. They don’t look into the quidditas of the world and of their perceptions. They’re hampered by philosophy. What is the essence of technology?

The essence of technology, Heidegger asserts, is to allow people to order nature, for people to uncover the already unhidden. After all, nuclear fission and fusion happen; they happen without human beings making them happen. Human beings can start a nuclear chain reaction, something that, thanks to physics, happens without them. All physicists have to do is put the correct things in order, order their experiments, and then, bam! The Gadget. The problem is that human beings shouldn’t be the “standing-reserve” of technology. We shouldn’t be used by technology. Technology shouldn’t be an uncontrolled nuclear reaction. We should be able to make something of this revelation.

We need to question our technē, these activities and skills of craftsmen, and bring that technē into poiēsis. Certainly, this is difficult if we are talking about atom-splitting, but the research library at Los Alamos National Labs seems to point to a way of approaching this difficulty. Using the supercomputers available through LANL and Stanford University, researchers are able to write programs, or use programs written by others, to reveal what is already present in texts that have been digitized. Research libraries like LANL’s give researchers a place to develop programs and write code to untangle the unhidden. The problem then becomes: what do we do after we have ordered this text? Are we going to order it, as the technology allows us to do, and say that’s the end of our technē? Or are we going to do something new with it?
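
To make that concrete, here is a minimal sketch, in Python rather than any particular lab’s tooling, of the sort of program a researcher might write or borrow: it produces a word-frequency list and a simple keyword-in-context concordance from a digitized text. The file name and keyword are hypothetical placeholders, not anything drawn from an actual research library.

```python
# A minimal sketch of "ordering" a digitized text: a word-frequency list and a
# simple keyword-in-context concordance. File name and keyword are placeholders.
from collections import Counter
import re


def tokenize(text):
    """Lowercase the text and split it into word tokens."""
    return re.findall(r"[a-zà-ÿ']+", text.lower())


def word_frequencies(tokens, top_n=20):
    """Return the most frequent words and their counts."""
    return Counter(tokens).most_common(top_n)


def concordance(tokens, keyword, window=5):
    """Yield each occurrence of `keyword` with `window` words of context."""
    for i, token in enumerate(tokens):
        if token == keyword:
            left = " ".join(tokens[max(0, i - window):i])
            right = " ".join(tokens[i + 1:i + 1 + window])
            yield f"{left} [{keyword}] {right}"


if __name__ == "__main__":
    with open("digitized_text.txt", encoding="utf-8") as f:  # hypothetical file
        tokens = tokenize(f.read())
    for word, count in word_frequencies(tokens):
        print(f"{word}\t{count}")
    for line in concordance(tokens, "technology"):  # hypothetical keyword
        print(line)
```

That is the whole of the “ordering”: count, sort, display. Whether anything new is done with the ordering is the question that follows.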

What is already present is the text, the author, the sentiments, the plot, the words, the affect, the themes, the book, the print, the paper. What needs to be revealed is what the human being must reveal, but, as Heidegger observed, it’s unconcealed; it just needs to be detected. Here’s a good example. Karina van Dalen-Oskam presented a paper at the 2013 Digital Humanities conference about the epistolary novels of Elisabeth Wolff and Agatha Deken. She wanted to know if one author wrote some of the letters and adopted a voice for those characters, and if the other author wrote the other characters’ letters and created their voices. Basically, her research looked into authorship attribution and the formation of character voices. Using a stylometric R script developed by Eder and Rybicki, she inquired into technē and poiēsis. What she found out was not what she was expecting. She expected to find that one author wrote one set of characters’ letters and that the other author composed the other set, and voilà! Instead she found that Wolff and Deken shared the labor of crafting letters for a number of different characters, except for one. And that character was Abraham Blankaart, whose letters have a distinctive style and voice, pointing to authorship by either Wolff or Deken, but not both.
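
The R script in question is presumably an early version of Eder and Rybicki’s stylo, which implements (among other measures) Burrows’s Delta over the frequencies of a text’s most common words. As a rough, hedged illustration of what such a measure does, here is a bare-bones Delta sketch in Python; the file names stand in for the collected letters of individual characters and are entirely hypothetical, and the real stylo pipeline is far more careful about corpus preparation and statistics.

```python
# A bare-bones sketch of Burrows's Delta over most-frequent-word profiles, the
# kind of comparison a stylometric tool like stylo performs far more carefully.
# File names are hypothetical stand-ins for individual characters' letters.
from collections import Counter
import re
import statistics

FILES = ["blankaart.txt", "character_two.txt", "character_three.txt"]  # hypothetical


def tokenize(text):
    """Lowercase the text and split it into word tokens."""
    return re.findall(r"[a-zà-ÿ']+", text.lower())


def main():
    tokens = {}
    for name in FILES:
        with open(name, encoding="utf-8") as f:
            tokens[name] = tokenize(f.read())

    # Vocabulary: the 100 most frequent words across the whole corpus.
    corpus = Counter(t for toks in tokens.values() for t in toks)
    vocab = [w for w, _ in corpus.most_common(100)]

    # Relative frequency of each vocabulary word in each text.
    freqs = {}
    for name, toks in tokens.items():
        counts = Counter(toks)
        freqs[name] = [counts[w] / len(toks) for w in vocab]

    # z-score each word's frequency against its mean and stdev across the texts.
    columns = list(zip(*freqs.values()))
    means = [statistics.mean(col) for col in columns]
    stdevs = [statistics.pstdev(col) for col in columns]
    zscores = {name: [(f - m) / s if s > 0 else 0.0
                      for f, m, s in zip(row, means, stdevs)]
               for name, row in freqs.items()}

    # Delta between two texts: mean absolute difference of their z-scores.
    def delta(a, b):
        return statistics.mean(abs(x - y) for x, y in zip(zscores[a], zscores[b]))

    for i, a in enumerate(FILES):
        for b in FILES[i + 1:]:
            print(f"Delta({a}, {b}) = {delta(a, b):.3f}")


if __name__ == "__main__":
    main()
```

The point of the sketch is only the shape of the method: profile the most frequent words, normalize, and compare, so that a set of letters written in a conspicuously different hand stands apart from the rest.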

Certainly, Elisabeth Wolff and Agatha Deken knew who wrote which letters, but they’re long since dead, and what is unconcealed, their authorship, is hidden only because they took their secrets to the grave with them. Technology has allowed van Dalen-Oskam to reveal the poiēsis of the novel, and she orders it, but she also reveals something in that ordering, and creates scholarship out of the ordering.

Technē indeed.


From van Dalen-Oskam, Karina. “Epistolary voices: The case of Elisabeth Wolff and Agatha Deken.” Digital Humanities 2013.

Forcing the Question


Willard McCarty, the Obi-Wan Kenobi of the humanities computing world, pushed at some real questions and problems about what it is that digital humanists do, and asserted that there is a profound mismatch between the algorithms created by coders and programmers and the data “normal” to the humanities: normal data like books, characters, historical events and personages, plots and themes, eras and epochs, documents and realia. Words. Where do these algorithms come from? From a symbolic language? From symbolic mathematics? Where are the data points stored? How do we liberate them from the digital silos where they are hoarded? What is the reality of the data, and what realities do the data point back to? What are some major ontological and ethical issues raised by DH?

Here is a start:

Located in northern New Mexico, Los Alamos National Laboratory (LANL) is a multi-mission national security science Laboratory, responsible for monitoring the safety and reliability of the U.S. nuclear stockpile, defense against all things nuclear, energy security, and scientific discovery. It is the senior DOE National Laboratory and the most productive scientifically. Its Research Library (SRO-RL) is a leading DOE digital library, providing state-of-the-art information technology tools to our research community and developing innovative web technologies to further information availability, accessibility, and digital preservation. We are seeking an individual with the vision, leadership, creativity, and entrepreneurial skills to direct and inspire its world-renowned Research Library team.
(Job posting is from the Digital Library Federation (DLF-Announce) email list)

I am from New Mexico. Los Alamos National Labs is the site of the Manhattan Project, and it was at Alamogordo’s Trinity Site that they blowed it up good. What does weapons research have to do with “information availability, accessibility, and digital preservation”?

New Mexico’s national labs face significant questions about funding for the coming year, newly appointed Energy Secretary Ernest Moniz told reporters during a visit to Sandia National Laboratories on Tuesday.
“We’re all suffering under a lot of uncertainty on the budgets,” Moniz said at the end of a day of visits to Sandia and Los Alamos labs.
Among the biggest unsettled questions are a proposed budget increase for the labs’ work refurbishing the nation’s B61 nuclear bombs and funding for upgrading the buildings at Los Alamos used to do the research and manufacturing work with plutonium, a radioactive metal used in nuclear bombs.

(Albuquerque Journal, 4 September 2013).

When McCarty observed of the digital humanities that “the moral seems clear enough: that computing belongs in the humanities because it accords with their final project. . . not to solve problems, but to make them worse” (1224), this sticky wicket seems to be exactly the sort of thing that makes them worse: digital humanities is a philosophical and ethical pursuit that raises the same questions as any other ethical pursuit. While the traditional humanities approach seems to be one of book, notebook, pencil (or maybe book, laptop, keyboard), the digital humanities require more. The laptop uploads programs and code to a server that runs the code, applies data sets, and generates results. Some of the supercomputers that allow for these computations are housed in research laboratories like LANL, which receive funding for the manufacture and proliferation of weapons of mass destruction that could soon send us to the post-technological, post-apocalyptic, post-nuclear world of the monks in A Canticle for Leibowitz.

Let’s trouble our own digital humanities water: the ethical problems are ones raised by our participation in a system that has its origins in DARPA, the Department of Defense, and weapons funding. The ontological questions that come into existence then are these: if the resources to do our work come from less than peaceable places, is the scholarship that we generate disrupting the reality of the weapons research labs? What is the nature of our scholarship’s existence, if its beginnings and continued existence owe less to the money generated by NCAA sports, and more to plutonium pits?

Digital humanities and humanities computing are more than coding, more than interesting plot and affect charts, more than bar and line graphs, more than codices and concordances, more than GIS and more than imaging, more than online libraries with dirty OCR and inexpertly applied TEI. Digital humanities demand a theoretical approach and a metatheory that allow us to make our problems worse and more complex, not solve them.