Three Barriers to the Development of Digital Tools in and for the Humanities.

Edward Vanhoutte

This short position paper draws attention to three barriers to the development of digital tools in and for the humanities, all of which stem from the lack of a well-defined set of fundamental methodologies for humanities computing. The paper calls for a better definition of the field, a reconsideration of the teaching curriculum, and better communication about what it is that humanities computing does. Only a clearly articulated self-awareness of the field can contribute to the funding and development of digital tools in and for the humanities.

In the concluding chapter of his book Humanities Computing, Willard McCarty lists 'Re-tooling' as the tenth point in his preliminary agenda for the field (McCarty, 2005, pp. 217-224). According to McCarty, two requirements are implied in any call for better tools: 'first, that we know what computing can do for us in our asking and answering of significant questions; second, that the benefits can be demonstrated.' I want to extend McCarty's first requirement towards the definition of the field. Indeed, the question of what humanities computing is has only been answered partially, either through more or less chronological surveys of humanities computing, which are very useful exercises in stock-taking but can never provide the field with a definition;[1] or through scarce definitions of the field such as the one written by, again, Willard McCarty in 1996.[2] His definition mentions the following activities of humanities computing: 1. the application of computing tools to arts and humanities data; 2. their use in the creation of these data; 3. the development of theoretical models of the field; 4. the study of the sociology of knowledge; 5. teaching; 6. research; and 7. service. What is missing, in my opinion, is the very development of the tools.

In the early years of humanities computing, scholars engaged in the activities sketched by McCarty's definition either programmed their own tools, and even constructed their own computers for that purpose, as Andrew D. Booth did;[3] or called upon the programming capabilities of computer scientists. Roberto Busa, for instance, relied entirely on the concordance programs developed by IBM for his work on the writings of Thomas Aquinas. The discussion of the late 1980s and early 1990s about the curriculum for computing in the humanities unveiled a clear dichotomy between what Nancy Ide has called the 'Holistic View' and the 'Expert Users View' (Ide, 1987). Supporters of the former, including Ide herself, envisioned such courses to 'provide a broad survey of the field and identify and expound the theoretical principles that inform its methodologies, in order to provide a foundation for further learning and work' (Ide, 1987, p. 210). According to adherents of the latter view, such courses should 'familiarize students with existing tools and provide sufficient skills to enable them to automate phases of fundamentally traditional humanities research' (Ide, 1987, p. 211). The former view, with its focus on the overlap between computer science and the humanities disciplines, is chiefly interested in the underlying formal structures and methods, and contrasts as such with the latter view's interest in the computer as a tool for humanities research. This instrumental approach requires the teaching of skills, programming languages, and applications that are of use to experts, and it maintains the autonomy of computer science as the discipline that provides the means and the humanities as the disciplines that set the goals.[4]
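Concordance programs of the kind Busa relied on are, at their core, small pieces of software. A minimal sketch of a keyword-in-context (KWIC) generator, the classic form of such tools, might look as follows; the sample sentence and window sizes are illustrative choices, not a reconstruction of the IBM programs:

```python
def kwic(text, keyword, width=30):
    """Return keyword-in-context lines for every occurrence of `keyword`."""
    tokens = text.split()
    lines = []
    for i, token in enumerate(tokens):
        # Compare case-insensitively, ignoring trailing punctuation.
        if token.strip('.,;:!?').lower() == keyword.lower():
            left = ' '.join(tokens[max(0, i - 5):i])
            right = ' '.join(tokens[i + 1:i + 6])
            lines.append(f"{left[-width:]:>{width}} | {token} | {right[:width]}")
    return lines

sample = "In principio erat Verbum et Verbum erat apud Deum et Deus erat Verbum."
for line in kwic(sample, "verbum"):
    print(line)
```

Each printed line aligns one occurrence of the keyword with a few words of surrounding context, which is all a basic concordance needs to offer.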

In the practice of teaching computing courses for the humanities, this dichotomy centred on the question of whether programming should be taught to humanities students, to what extent, and for what purpose. In the debate on the disciplinary status of humanities computing, the focus is on the shift of emphasis from the computer as a tool to the methodological fundamentals of the use of the computer in the humanities. The resistance of the adherents of the Holistic View, who argue for the academic autonomy of humanities computing, is directed against the conservative conviction that humanities computing is divisible into the traditional humanistic disciplines[5] on the one hand and computer science on the other. I call this conservative because it is intended to maintain the traditional academic organisation of disciplines that are identified and formed by the subject they study and not by the methodologies they employ. This reductive and deconstructive perception of humanities computing as an instrumental adjuvant service to the traditional humanities disciplines on which it thrives is still upheld by major academic administration, funding, and assessment structures, and by some members of the humanities computing community who advocate the 'Expert Users View'. An interesting middle position between the two extremes described here was defended by Mike Fraser, who argued, in the round table session on New Directions in Humanities Computing that concluded the ALLC/ACH 2002 conference in Tübingen, that if humanities computing does exist, it has more to do with service to the humanities in sharing methods and tools.[6] By addressing humanities computing as such, Fraser acknowledged the field as existing, but put it in a servile relationship to the traditional subject-based humanities disciplines. In McCarty's initial description of humanities computing (1996), service is but one of the three institutional manifestations mentioned, and one of the seven general activities.
But in my view, that description too is incomplete, for it lacks any notion of the development of computing tools for the humanities, which McCarty (1998) located in computer science. Certainly, there are humanities computing scholars who develop their own software tools and who do not consider themselves computer scientists. The XML and XSL families of languages in particular provide powerful technology for creating such tools without the need for programming. Taking this into consideration, I propose here to rewrite McCarty's first sentence as follows: 'Humanities computing is an academic field concerned with the development and/or the application of computing tools to humanities + arts data or their use in the creation of these data.'[7] Likewise, Fraser's description could be adapted to my vision by rephrasing it as 'if humanities computing does exist, it has more to do with service to the humanities in teaching and sharing methods and tools it develops and/or uses.'
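The kind of lightweight, markup-driven tool a humanities scholar might build can be sketched in a few lines. The following example counts word forms in the element content of an XML transcription; it uses Python's standard library rather than XSLT, and the markup and sentences are invented for illustration, not a validated TEI document:

```python
import re
import xml.etree.ElementTree as ET
from collections import Counter

# A tiny TEI-like transcription; element names and content are illustrative.
SOURCE = """<text>
  <body>
    <p>Humanities computing develops tools for the humanities.</p>
    <p>Tools shape the questions the humanities can ask.</p>
  </body>
</text>"""

def word_frequencies(xml_string):
    """Count word forms occurring in the element content of an XML document."""
    root = ET.fromstring(xml_string)
    words = Counter()
    for fragment in root.itertext():
        words.update(re.findall(r"[a-zA-Z]+", fragment.lower()))
    return words

freqs = word_frequencies(SOURCE)
print(freqs.most_common(3))
```

A frequency list of this kind is a building block for concordances, stylistic profiles, and indexing; the point is that a working tool need not presuppose a computer-science training.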

If this addition to the definition of humanities computing were accepted, it would mean a rapprochement of the 'Holistic View' towards the 'Expert Users View', and a reaffirmation of the common ground humanities computing shares with computer science. Consequently, curricula could be redesigned to meet this definition and perhaps develop in the direction of what Manfred Thaller likes to call 'Humanistic Computer Science' (Thaller, 2006). Offering such curricula could address the lack of enthusiasm among graduating programming talent, a problem signalled by John Unsworth (2003); and with this programming talent at our disposal, we could think more about what humanists want to do with computers, rather than what computers can do for the humanities (again a point made by John Unsworth). In his provisional list of functional primitives of scholarship, Unsworth mentions discovering, annotating, comparing, referring, sampling, illustrating, and representing. Unsworth's list runs more or less parallel with the taxonomy of computational methods developed by the UK Arts and Humanities Data Service (AHDS).[8] These two lists can provide further insight into a fruitful merger of computing and the humanities disciplines.
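At least one of Unsworth's primitives, comparing, maps directly onto well-understood computational machinery. A hedged sketch of collating two witness readings with Python's standard difflib module follows; the two "witnesses" are invented for the example:

```python
import difflib

# Two hypothetical witness readings of the same line; the texts are invented.
witness_a = "the quick brown fox leapt over the wall".split()
witness_b = "the swift brown fox leapt over a wall".split()

def collate(a, b):
    """Report where two tokenised witnesses agree and where they diverge."""
    report = []
    matcher = difflib.SequenceMatcher(a=a, b=b)
    for op, i1, i2, j1, j2 in matcher.get_opcodes():
        # op is 'equal', 'replace', 'insert', or 'delete'.
        report.append((op, ' '.join(a[i1:i2]), ' '.join(b[j1:j2])))
    return report

for op, left, right in collate(witness_a, witness_b):
    print(f"{op:8} {left!r:30} {right!r}")
```

The output lists agreements and variants side by side, which is the skeleton of a collation tool; real editorial collation would of course add normalisation, alignment of larger units, and an apparatus format.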

A third barrier to the development of tools I want to raise, one that is the consequence of the lack of a definition of humanities computing and of the inadequate curriculum, is the problem computing humanists have with communicating about their work, both internally within the community and externally to members of society. McCarty (2005) treats these problems as points four[9] (pp. 209-210) and eight[10] (pp. 213-215) respectively in his agenda, so it suffices here to re-emphasize a twofold need in connection with our communication to the outside world: the need for a popularized explication of every activity and project we undertake in humanities computing, and the need to grasp every occasion to explain and justify what we do. The explicit commitment to communicate in plain, jargon-free terms will increase the receptive potential of our activities and will eventually open up the minds of others (McCarty, 2005, p. 215).

The development of digital tools for the arts and the humanities involves the formalisation of one's methodology into computational algorithms. The description of humanities computing as a superset of disciplines rooted in the traditional humanities disciplines does not lead towards a common methodology, which may well not exist. A set of methodological commons, however, can be formulated, both among the activities inside humanities computing and between humanities computing and computer science. This could then be taken as the defining framework for the field, one that could inspire curriculum design and better communication between the field and its social partners. The incorporation of tool design and development in humanities computing is an essential step away from a culture of dependency and will further constructive dialogues involving humanities computing as a valid entity.



© 2006 Edward Vanhoutte

This position paper was read at the AHRC (Arts and Humanities Research Council) Methods Network Workgroup on digital tools for the arts and humanities, Thursday June 15th, 2006, King's College London.

Last revision: 20/06/2006
