Paperback (1st Edition)


Overview

Provocative yet sober, Digital Critical Editions examines how the transition from print to a digital milieu deeply affects how scholars deal with the work of editing critical texts. On the one hand, forces like changing technology and evolving reader expectations lead to the development of specific editorial products; on the other, they threaten traditional forms of knowledge and methods of textual scholarship.

Using the experiences of philologists, text critics, text encoders, scientific editors, and media analysts, Digital Critical Editions ranges from philology in ancient Alexandria to the vision of user-supported online critical editing, from peer-directed texts distributed to a few to community-edited products shaped by the many. The authors discuss the production and accessibility of documents, the emergence of tools used in scholarly work, new editing regimes, and how the readers' expectations evolve as they navigate digital texts. The goal: exploring questions such as, What kind of text is produced? Why is it produced in this particular way?

Digital Critical Editions provides digital editors, researchers, readers, and technological actors with insights for addressing disruptions that arise from the clash of traditional and digital cultures, while also offering a practical roadmap for processing traditional texts and collections with today's state-of-the-art editing and research techniques, thereby addressing readers' emerging reading habits.



Product Details

ISBN-13: 9780252082566
Publisher: University of Illinois Press
Publication date: 02/10/2017
Series: Topics in the Digital Humanities
Edition description: 1st Edition
Pages: 368
Product dimensions: 6.00(w) x 9.10(h) x 1.00(d)

About the Author

Daniel Apollon is an associate professor and head of the Digital Culture Research Group at the University of Bergen. Claire Bélisle is a researcher at the National Scientific Research Center at the University of Lyon. Philippe Régnier is director of research at the National Scientific Research Center at the University of Lyon.

Read an Excerpt

Digital Critical Editions


By Daniel Apollon, Claire Bélisle, Philippe Régnier

UNIVERSITY OF ILLINOIS PRESS

Copyright © 2014 Board of Trustees of the University of Illinois
All rights reserved.
ISBN: 978-0-252-03840-2



CHAPTER 1

The Digital Turn in Textual Scholarship

Historical and Typological Perspectives

ODD EINAR HAUGEN AND DANIEL APOLLON


Three Perspectives

This chapter is written under the assumption that the history of textual scholarship, from its very beginnings to the digital age, can be understood from three perspectives. These are not the perspectives of the historian who tries to grasp the development of textual scholarship, but rather those held by practitioners of the art and science of editing texts: scholars who edit, comment on, and analyze texts written by other people. This chapter assumes that editors may choose to look backward, outward, or inward.

First, looking backward means to search for the origin of the text and to trace its development through time. When dealing with classical and medieval works, the editor has to track the process of copying, starting with the original text and then moving on from one copy to the next. When dealing with post-Gutenberg texts, the editor needs to trace the development from the first drafts made by the author until the end product, usually a printed edition. Second, looking outward means to view the text as a product situated in a sociohistorical context. This implies that its contents, its use, and its organic relationship to other texts and sociocultural realities are of greater interest than its origin and material transmission. Third, looking inward implies viewing the text as an individual expression in its own right, as a self-contained document, to be read and understood on its intrinsic merits. One of the characteristics of texts and literature, highlighted by the New Criticism in literary studies in the 1950s, is the existence of multiple layers of meaning and of a wealth of interpretations.

The approach defended in this chapter reflects the belief that the self-contained nature of texts, as advocated by this New Criticism (looking inward), and the awareness of the organic relationship of texts to their world (looking outward) can benefit from a historical approach to the text and its transmission (looking backward). The purpose of this chapter is to show how these three perspectives may shape digital text scholarship.

The tripartite view outlined above is, of course, a simplification, and it is not intended to be a scheme in a Hegelian sense for the actual history and development of textual scholarship. It includes, rather, three aspects that have been differently weighted in the practice of textual scholars over the years, to the extent that these perspectives can be seen as competing but not mutually exclusive points of orientation. Any scholar who primarily looks toward the origin of a text and tries to chart its development will understand and acknowledge the fact that each stage of the text also has a contemporaneous setting and interpretation and will indeed exploit such contextual knowledge. Any scholar focusing on the setting of a text, on its Sitz im Leben (German: literally, its "setting in life"), or its uses is well aware that the text also has a material history and a physical aspect.

This chapter takes a broad look at the history of textual scholarship with these perspectives in mind. While the history of textual scholarship is commonly traced back to the birth of Western philology in the Hellenistic age, shaped and cultivated in the Library of Alexandria, the starting point here is the methodological foundation of textual scholarship in the early nineteenth century. This does not mean that the long history of textual scholarship in antiquity, in the Middle Ages, or indeed in early modern times should be disregarded as "pre-scientific," but that major approaches to textual scholarship, in print and online, can be exemplified and discussed with reference to scholars of the nineteenth and twentieth centuries.


Looking Backward: The Formalist Approach

The conception of modern textual criticism is commonly attributed to Karl Lachmann (1793–1851) and his generation of editors in the first half of the nineteenth century. Lachmann's work covered all three major fields of editorial philology—classical philology, Bible philology, and medieval philology—and thus has become a point of reference in all of these fields. Lachmann clearly expressed the basic tenets of a scientific textual criticism very early in his career, in a critical review published in 1817 of Friedrich von der Hagen's edition of Der Nibelungen Lied (1816). Lachmann claimed that the editor should search for the original version of the text, or, if that was unattainable, for as close an approximation to the original as possible: "On the basis of a sufficient number of good manuscripts, we should and we must build a text which reflects all of these, a text which either would be the original text or a text which would come very close to the original." This position found support in the contemporary historical source criticism, or Quellenkritik. In fact, textual criticism and historical criticism, following Lachmann's reasoning, should be seen as two aspects of the same approach. Only when younger and less authoritative witnesses had been removed through a strict analysis would the editor (or historian) be able to understand the text in its true context.

Later in the nineteenth century this program was enthusiastically adopted by the German Romanist Gustav Gröber (1844–1911) and the French Romanist Gaston Paris (1839–1903). The first contribution in this field was Gröber's analysis of the manuscripts of the story of the Saracen knight Fierabras (1869), but the most important and consequential work proved to be Paris's edition of the Alexis legend (1872), published together with the French scholar Léopold Pannier (1842–1875). This edition contained a thorough analysis, a recension, of all manuscripts of the Alexis legend as well as a complete text based on the results of this analysis. The introduction gave a clear and concise discussion of the principles for the recension of manuscripts, from a theoretical point of view as well as from a practical one. This edition became a paradigm for French editorial philology for more than half a century.

The position of Karl Lachmann and Gaston Paris is one of strictness and formalism. The recension of manuscripts is, in the memorable words of Paris, an almost mechanical operation, "une opération pour ainsi dire mathématique" (Paris and Pannier 1872, 13). Recension also implements a highly reconstructive approach, since it strives to trace the text to its origins. After the original author had finished his work, textual deterioration was likely to set in. The task of the textual critic was to remove as many corruptions as possible from the text in order to restore it to its former glory. This perspective is certainly not new; it was already an integral part of the Homeric scholarship of the Alexandrian age (as shown by Honigman 2003 and Niehoff 2007). The practice of indicating corruptions by a dagger sign, the obelus, was probably introduced by Zenodotus of Ephesus, the first librarian (ca. 325–ca. 234 BC), and Echtheitskritik (German: "criticism of authenticity") has been part of textual scholarship ever since. The novelty of the Lachmannian approach resides in its systematic exploitation of corruptions, or errors, as a means of establishing the filiation or derivation of a text. Looking for errors became not only a part of the examination of the text but also the very foundation of its recension and its genealogical analysis. Gaston Paris emphasized the basic fact that copyists very seldom make the same mistakes at the same places (1872, 10). From this observation it follows with logical necessity that the filiation of a text can be established on the basis of the errors in the manuscripts.
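The shared-error principle described above can be illustrated with a small sketch. This is not from the book: the manuscript sigla, error locations, and readings below are all invented for the example, and real recension involves far more judgment. The idea is simply that witnesses sharing an error at the same place are candidates for descent from a common exemplar, since independent copyists very seldom make the same mistake in the same spot.

```python
# Illustrative toy of the Lachmannian common-errors principle:
# group witnesses that share an erroneous reading at the same place.
# Sigla (A, B, C, D), locations, and readings are invented.
from collections import defaultdict

# Each manuscript maps error location -> erroneous reading found there.
errors = {
    "A": {12: "sunne", 47: "herte"},
    "B": {12: "sunne", 47: "herte", 90: "blis"},
    "C": {90: "blis"},
    "D": {},  # no shared errors in this toy tradition
}

def shared_errors(ms1, ms2):
    """Locations where two witnesses carry the same erroneous reading."""
    common = errors[ms1].keys() & errors[ms2].keys()
    return {loc for loc in common if errors[ms1][loc] == errors[ms2][loc]}

# Collect candidate families: witnesses linked by at least one shared error.
families = defaultdict(set)
sigla = sorted(errors)
for i, m1 in enumerate(sigla):
    for m2 in sigla[i + 1:]:
        shared = shared_errors(m1, m2)
        if shared:
            families[frozenset(shared)] |= {m1, m2}

for errs, members in families.items():
    print(sorted(members), "share errors at", sorted(errs))
```

In this toy, A and B fall together through the errors at locations 12 and 47, while B and C are linked through location 90; sorting out which shared errors are genealogically significant is precisely where the editor's judgment enters.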

When Joseph Bédier (1864–1938) edited the medieval text Le Lai de l'Ombre in 1890, it was, as he himself observed, under the sign of Lachmann ("sous le signe de Lachmann"), or, he might have said, under the auspices of Gaston Paris, the ever-present editor of the journal Romania. The introduction to the edition concluded with a stemma showing the relationship of the manuscripts and continued with a text constituted on the basis of this stemma. The story might have ended here, and Bédier might have moved on to other fields of study and left the Le Lai de l'Ombre edition as a youthful exercise. However, he felt uneasy with his recension and returned to the text over the years. After revising his edition in 1913, he published in 1928 an article in Romania in which he discussed the recension of Le Lai de l'Ombre and the methodological conclusions to be drawn from it. In this article Bédier draws no fewer than eleven different stemmata for the text, all equally valid and possible as explanations of the transmission of the manuscripts. Rather than moving from uncertainty to certainty, as the 1890 edition with its single stemma would seem to indicate, Bédier had indeed moved from certainty to uncertainty. Above all, he suspected any stemma construction of being heavily biased. Leafing through the volumes of Romania, he noticed that the overwhelming majority of stemmata published in this journal had only two major branches. That was indeed a strange forest, a silva portentosa, and rather than reflecting a historical fact, Bédier suspected that it reflected a weakness in the method and its usage. So after almost four decades of struggling with the manuscripts of Le Lai de l'Ombre and a growing uneasiness with the Lachmannian method, Bédier concluded that the method had to be suspect. There was simply too much bifidity in the recensions—that is, too many stemmata in which there were only two main branches.
Bédier provocatively claimed that the best answer still was that of the old humanists: to choose the best manuscript in the tradition, the codex optimus, and rely on that, save for obvious corruptions (Bédier 1928, 356). Editions of this type are usually referred to as best-manuscript editions, and in spite of the seemingly unscientific selection procedure, these are still recognized as a major type of edition (see, e.g., Foulet and Speer 1979, 38, and elsewhere in this book).

In spite of—or possibly as a consequence of—Bédier's fundamental criticism and doubt about the genealogical method, the study of manuscript traditions has continued to be the object of logical or mathematical approaches. Along this line can be placed scholars such as Henri Quentin (1872–1935), Walter Wilson Greg (1875–1959), and Dom Jacques Froger. Quentin was very critical of the use of common errors, "fautes communes," which were so central in the Lachmannian tradition. He would accept only variants and nothing but variants (1926, 37): "I do not recognize errors or common mistakes, neither good nor bad readings, only the variant forms of a text, on which I by a method of the strictest statistics first delimit the families, then the classes of manuscripts within each of these, and finally the families within these classes." In spite of this criticism, Quentin should be regarded as a formalist and thus a textual critic who basically shares Paris's conviction of the regularity and analyzability of manuscript evidence. In fact, Quentin claimed that he had been able to formulate an iron rule, "une règle de fer" (1926, 37), that would remove all subjectivism from manuscript recension.

Already in 1963, dealing with the rich inventory of New Testament manuscripts, Ernest Colwell and Ernest Tune advocated carrying out a statistical comparison of each manuscript with all other available manuscripts witnessing a given text. While it would be difficult to disagree with this program, the development of the necessary tools took some time. When the Centre National de la Recherche Scientifique (CNRS) summed up the status of modern textual criticism in the conference report "La pratique des ordinateurs dans la critique des textes" (Irigoin and Zarri 1979), there was a large array of methods available. However, considering the rapid spread of software tools and the increase of computing power, progress in this area seems to have been slower than anticipated. Indeed, the emergence of powerful methods of multivariate data analysis strengthened practitioners' awareness that universal and irrefutable methods allowing the editor to infer filiation from incomplete textual sources may not exist. Since 1979 the list of multivariate techniques has been extended with biplot methods, such as correspondence analysis (see Apollon 1985), and in more recent decades graph-oriented methods, such as phylogenetic analysis, a methodological pillar of evolutionary genetics, have emerged as a new contender. Peter Robinson and colleagues have argued that phylogenetic analysis can come a long way toward solving the chronological problem inherent in any recension (Barbrook, Blake, Howe, and Robinson 1998). Recently, a Finnish research group tested a number of quantitative techniques using an artificial data set and concluded that phylogenetic analysis is indeed a strong contender, but not the only serviceable technique nor possibly the best one in the field (Roos and Heikkilä 2009). Unfortunately, we do not have any complete manuscript filiations from antiquity or the Middle Ages; we have only fragments of unknown proportions.
The "true" filiation of a text can thus never be ascertained, only approximated. These recent methodological advances have the seemingly paradoxical side effect of confirming the position and role of the editor at the center of the editing process.
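The kind of pairwise statistical comparison advocated by Colwell and Tune can be sketched as a percentage-agreement calculation over variation units. This is an illustrative toy only, not from the book: the sigla, variation units, and readings are invented, and real apparatus data is lacunose and far messier.

```python
# Illustrative sketch of pairwise witness comparison: for every pair
# of witnesses, compute the proportion of variation units where they
# attest the same reading. Sigla and readings are invented.
from itertools import combinations

# variation unit -> reading attested by each witness
readings = {
    1: {"A": "a", "B": "a", "C": "b"},
    2: {"A": "x", "B": "x", "C": "x"},
    3: {"A": "p", "B": "q", "C": "q"},
    4: {"A": "m", "B": "m", "C": "n"},
}

def agreement(w1, w2):
    """Proportion of variation units where two witnesses agree."""
    units = [u for u in readings if w1 in readings[u] and w2 in readings[u]]
    hits = sum(readings[u][w1] == readings[u][w2] for u in units)
    return hits / len(units)

for w1, w2 in combinations("ABC", 2):
    print(f"{w1}-{w2}: {agreement(w1, w2):.0%}")
```

An agreement matrix of this kind is the raw material for the clustering and multivariate techniques discussed above; turning clusters of agreement into a directed stemma, with its diachronic axis, is the harder problem the chapter goes on to describe.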

While the basic tenets of the genealogical method have not been questioned since the time of Lachmann, and the method, ideally, should still be regarded as valid, it is not always practicable. A manuscript filiation can indeed be modeled by analyzing the distribution of errors over time, but only if the recension is uncontaminated—that is, each copy has been made from a single exemplar. If there have been multiple sources along the line, the method breaks down quickly. Against contamination no remedy has been found, Paul Maas concluded—"Gegen die Kontamination ist kein Kraut gewachsen" (1960, 30). The depressing fact is that so few traditions are uncontaminated. In classical Latin literature, Karl Stackmann, for example, believes that only the works of the most important authors were transmitted without contamination (1979, 252). In vernacular medieval literature, texts were often copied quite freely, blurring the distinction between the copyist and the redactor. In short, the genealogical method remains a valid method, but only for a minority of manuscript traditions. There is no such thing as a "genetic tracer," no textual equivalent of mitochondrial DNA, that might help the editor and readers irrefutably identify an original text.

What a mathematical analysis can offer is a way of mapping the distribution of variants of a text. While this has been envisaged for a long time (see, for example, the bold statement by Henri Quentin quoted above), it is only in the last decades that this has been made possible in practice. Mathematical models do have their limitations, however. They can undoubtedly help the critic to establish a focal text, in the sense that they can identify clusters of variants and thus the likely centers in the transmission of the text. It is less certain that they can help in establishing an archetypal text—that is, a text in which the diachronic axis has been revealed, showing which variants belong to an earlier stage of the text and which to a later one.


(Continues...)

Excerpted from Digital Critical Editions by Daniel Apollon, Claire Bélisle, Philippe Régnier. Copyright © 2014 Board of Trustees of the University of Illinois. Excerpted by permission of UNIVERSITY OF ILLINOIS PRESS.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.

Table of Contents

Preface vii

Acknowledgments ix

Introduction: As Texts Become Digital Daniel Apollon Claire Bélisle Philippe Régnier 1

Part I History, Challenges, and Emerging Contexts

1 The Digital Turn in Textual Scholarship: Historical and Typological Perspectives Odd Einar Haugen Daniel Apollon 35

2 Ongoing Challenges for Digital Critical Editions Philippe Régnier 58

3 The Digital Fate of the Critical Apparatus Daniel Apollon Claire Bélisle 81

4 What Digital Remediation Does to Critical Editions and Reading Practices Terje Hillesund Claire Bélisle 114

Part II Text Technologies

5 Markup Technology and Textual Scholarship Claus Huitfeldt 157

6 Digital Critical Editing: Separating Encoding from Presentation Alois Pichler Tone Merete Bruvik 179

Part III New Practices, New Contents, New Policies

7 The Making of an Edition: Three Crucial Dimensions Odd Einar Haugen 203

8 From Books to Collections: Critical Editions of Heterogeneous Documents Sarah Mombert 246

9 Toward a New Political Economy of Critical Editions Philippe Régnier 266

Bibliography, Online Sources, and Software Tools 297

Contributors 331

Index 335
