
Putting Time Back in Manuscripts: Textual Study and Text Encoding, with Examples from Modern Manuscripts

Edward Vanhoutte

edward.vanhoutte@kantl.be


It is interesting to observe how many theorists of electronic scholarly editing have advocated the transition from uni-versional editions in print to universal electronic editions which can in theory hold all versions of a work (Dahlström 2000), but have passed over the practicalities underlying the production of such an edition. The envisioned model of the versioning edition, representing multiple texts (Reiman 1987) in facsimile as well as in machine-readable form, in concordances, stemmata, lists of variants, etc., has already been applied to editions of older literature and medieval texts (e.g. Robinson 1996, Solopova 2000), and more interesting work in several fields is currently under way (e.g. Parker 2000). At the core of the theories of such computer editions and computer-assisted editions (e.g. Gants 1994, McGann 1996, and Shillingsburg 1996) is the requirement for a platform-independent and non-proprietary markup language which can deal with both the linguistic and the bibliographic text of a work and which can guarantee maximal accessibility, longevity, and intellectual integrity (Sperberg-McQueen 1994; 1996: 41) in the encoding of texts and textual variation. The encoding schemes proposed by the TEI Guidelines for Electronic Text Encoding and Interchange (Sperberg-McQueen & Burnard 1994) have generally been accepted as the most promising solution to this requirement. The transcription of primary source material enables automatic collation, stemma (re)construction, and the creation of (cumulative) indexes and concordances.
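For the encoding of textual variation, for instance, the Guidelines provide the critical apparatus tag set. A minimal sketch in the parallel segmentation method may illustrate the principle (the witness sigla and readings are invented for illustration and are not drawn from any of the editions mentioned):

    <app>
      <rdg wit="A">the reading of the earliest witness</rdg>
      <rdg wit="B C">the reading shared by two later witnesses</rdg>
    </app>

From markup of this kind, lists of variants, collation tables, and cumulative indexes can be generated automatically.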

Although the TEI subsets for the transcription of primary source material "have not proved entirely satisfactory" for a number of problems (Driscoll 2000), they do provide an extremely rich set of mechanisms for the encoding of medieval manuscripts and of documents with a fairly "neat", static, and stable appearance, such as print editions. The real problems arise when dealing with modern manuscript material.
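What those mechanisms comfortably handle might be illustrated as follows; the elements are taken from the TEI tag set for the transcription of primary sources, but the manuscript reading itself is invented:

    <p>The scribe first wrote
      <del rend="overstrike">wynter</del>
      <add place="supralinear">somer</add>,
      followed by an <unclear reason="faded">almost</unclear>
      illegible word and a <gap reason="damage" extent="3 letters"/>.
    </p>

Markup of this kind records what stands on the page, but not when it came to stand there: <add> and <del> capture the result of revision, not its chronology.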

Whereas medieval manuscripts form part of the transmission history of a text, attested by several successive witnesses, and show the working (copying) process of a scribe and the transmission and distribution of a work, modern manuscripts are manuscripts "qui font partie d'une genèse textuelle attestée par plusieurs témoins successifs et qui manifestent le travail d'écriture d'un auteur" (Grésillon 1994): manuscripts which form part of the genesis of a text, attested by several successive witnesses, and which show the writing process of an author. The French school of Critique Génétique deals primarily with modern manuscripts, and its main aim is to study the avant-texte, not so much as the basis for editorial principles of textual representation, but as a means to understand the genesis of the literary work, or, as Daniel Ferrer puts it: "it does not aim to reconstitute the optimal text of a work; rather, it aims to reconstitute the writing process which resulted in the work, based on surviving traces, which are primarily author's draft manuscripts" (Ferrer 1995: 143).

The application of hypertext technology and the possibility of displaying digital facsimiles in electronic dossiers génétiques allow the editor to regroup a series of documents which are akin to each other, on the basis of resemblance or difference, in multiple ways. However, the experiments with proprietary software systems (HyperCard, ToolBook, Macromedia, PDF, etc.) have been too strongly oriented towards display, and often do not comply with the rule of "no digitization without transcription" (Robinson 1997).

Further, the TEI solutions for the transcription of primary source material do not cater for modern manuscripts, because the current (P4) and previous versions of the TEI Guidelines have never addressed the encoding of the time factor in text. Since a writing process by definition takes place in time, four central complications arise in connection with modern manuscripts and should thus be catered for in an encoding scheme for the transcription of modern primary source material. The complications are the following:

  1. Its beginning and end may be hard to determine and its internal composition difficult to define (document structure vs. unit of writing): authors frequently interrupt writing, leave sentences unfinished and so on.
  2. Manuscripts frequently contain items such as scriptorial pauses which are of immense importance in the analysis of the genesis of a text.
  3. Even non-verbal elements such as sketches, drawings, or doodles may be regarded as forming a component of the writing process for some analytical purposes.
  4. Below the level of the chronological act of writing, manuscripts may be segmented into units defined by thematic, syntactic, stylistic, etc. phenomena; no clear agreement exists, however, even as to the appropriate names for such segments.

These four complications are exactly the ones the TEI Guidelines cite when trying to define the complexity of speech, emphasizing that "Unlike a written text, a speech event takes place in time" (Sperberg-McQueen & Burnard 2001: 254). This may suggest that the markup solutions employed in the transcription of speech could prove useful for the transcription of modern manuscripts, in particular those in the chapter of the TEI Guidelines on Linking, Segmentation, and Alignment (esp. 14.5 Synchronization).
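A minimal sketch of what such borrowing might look like, using the <timeline> and <when> elements and the global synch attribute from that chapter (the manuscript situation, the identifiers, and the segmentation are invented; this is not an existing TEI recommendation for manuscript transcription):

    <timeline origin="w0">
      <when id="w0"/>
      <when id="w1" interval="unknown" since="w0"/>
      <when id="w2" interval="unknown" since="w1"/>
    </timeline>
    <p>
      <seg synch="w0">a first stretch of writing, broken off
        at a scriptorial pause;</seg>
      <seg synch="w1">the continuation after the pause, in which
        <del synch="w2">a later campaign of revision</del>
        deleted a phrase.</seg>
    </p>

The <timeline> orders the moments of writing even where absolute dates and the length of the pauses are unknown, while the synch attributes align stretches of text, and individual revisions, with those moments.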

Building on this assumption, this paper will address the relationship between the theory of texts and the design of electronic markup, illustrated with examples from four projects I am currently involved in at the Centrum voor Teksteditie en Bronnenstudie (Centre for Scholarly Editing and Document Studies): the transcription of James Joyce's Finnegans Wake notebooks, the electronic edition of the diaries of Daniel Robberechts, a genetic edition of a manuscript by Willem Elsschot, and the transcription of all extant witnesses of a novel by Stijn Streuvels. The paper will revisit Michael Sperberg-McQueen's "Text in the Electronic Age: Textual Study and Text Encoding, with Examples from Medieval Texts" (Sperberg-McQueen 1991) and will define a research agenda for the new TEI working group on the transcription of modern manuscripts.

© Edward Vanhoutte, 12 June 2002.
This is the abstract for a paper presented at ALLC/ACH 2002, University of Tübingen, Tübingen, 25 July 2002.

