
Friday, February 08, 2013

Online Bible Tool with Eusebian Canon Data


Troy Griffitts at the Institute for New Testament Textual Research (INTF) in Münster, who has been involved with the CrossWire Bible Society for many years, providing free, open-source resources such as the SWORD Bible software project, has now done us another great service. In collaboration with Ulrich Schmid of the INTF, he has integrated the Eusebian Canon data into The Bible Tool. Here is a general description:
Welcome to The Bible Tool—a free, evolving open source tool for exploring the Bible and related texts online. Created by CrossWire Bible Society, the Society of Biblical Literature and the American Bible Society as the first in a number of coming Bible engagement tools using an XML standard called OSIS, we provide power searching capabilities and cutting edge tools to help you engage the Bible at a deeper level.
Just open the Bible Tool and start clicking on the Eusebian canon numbers in the left margin of the text and see what happens.


Monday, October 17, 2011

The Apocalypse Project (Wuppertal)

I am pleased to announce the latest major inquiry into the text of the Greek New Testament. At the beginning of this month, a team based at the Kirchliche Hochschule Wuppertal-Bethel started work on an Editio Critica Maior of the book of Revelation in partnership with the INTF-directed Editio Critica Maior series. The Deutsche Forschungsgemeinschaft (German Research Foundation) has funded the initiative. Martin Karrer is the principal investigator, and Ulrich Schmid is playing a leading role in deploying the latest relevant technological innovations. The project will progress in three phases, with a completed edition hoped for after approximately ten years.
At least two from our blogroll will be active in the project. For the next two and a half years, I will be editing the Sahidic text of the Apocalypse, and in a year Martin Heide will begin creating an edition of the Syriac. I am fortunate to be able to conduct my research in Münster, which is a world center for Coptology as well as New Testament textual criticism. My colleagues at the INTF have repeatedly surpassed my expectations with their kindness and hospitality, not to mention their patience with my rudimentary German.
In coming months, I will say a bit more about the project. I am also excited that Alin Suciu has discovered a new fragment of the Sahidic Apocalypse, which he has identified as deriving from the same codex as other already-known leaves.

Wednesday, August 06, 2008

Münster Colloquium on the Textual History of the Greek New Testament Day 1-2

Unfortunately, I am now under time pressure, since I have a train to catch, and it will take me over 20 hours to get home, so I will not be able to give extensive reviews of the other papers at this point. [PMH: I shall add some comments in brackets]

The next session treated “Causes and forms of variation,” with two papers presented by David Trobisch and Ulrich Schmid. In Trobisch’s paper, “What is there in a picture? Analyzing scribal practices of structuring the text,” he mainly pointed out that the documentation of the evidence has to come first, not the authoritative interpretation.

[He also argued that manuscripts are produced not only by scribes, but by a combination of author, scribe, editor, publisher and reader/corrector. Variants can originate at any of these levels, but they ought to be carefully distinguished, especially in the construction of stemmata. He argued that a critical edition ought to facilitate the reconstruction of ‘the text of the first edition’ (other methods would be required for determining the ‘authorial’ text). He urged that transparency of method, display and evaluation of material was very important, so that errors would not be hidden but would be identifiable. He began to ask the question: what if there is more than one textual archetype, e.g. the DFG text in Paul alongside the 01 ABC text? How should a critical edition handle this? But unfortunately he ran out of time and we never heard the answer.]

He posed a number of critical questions and heartily welcomed the new initiative of the Virtual Manuscript Room under development at the INTF in Münster (in fact, I think Trobisch was involved in the idea of the VMR). Read more on the INTF homepage.

Then it was time for Ulrich Schmid, “Conceptualising ‘scribal’ performances.” Schmid started by describing the default assumption: what we physically find in a MS is the work of a scribe. Everything can be used to describe the scribe; everything is a kind of scribal performance. The variants produced are the most obvious traces of scribal performance. However, not everyone who left traces in the MS was acting as a scribe; there are other roles. Schmid pointed to the need for criteria for distinguishing scribal and non-scribal activity. One important criterion is to distinguish between different scripts: book-hand (readability, regular letter forms, few abbreviations) vs. documentary hand (speed of writing, effective use of space, varying letter forms and ligatures, more abbreviations). Marginal comments and readers’ notes would probably be written in a more casual, informal hand. Schmid then presented some compelling examples of readers’ notes. This phenomenon would also provide some explanation for the introduction of certain variants into the textual tradition which may otherwise have been interpreted as theological interpretations and creations of the scribes (e.g., acting as “orthodox corruptors” - my remark).

[So Ulrich also outlined the process of literary production/reproduction in antiquity:
1. the authorial stage
2. the editorial stage (which places authored material into the public domain)
3. the manufacturing stage (the primary scribal stage)
4. the users’ stage]

[Some of this material is also covered in Ulrich’s paper at the 2007 Birmingham Colloquium, for our report, see here]

The next two papers were presented by Michael Holmes and Gerd Mink. In his paper, “Working with an open textual tradition: Challenges in theory and practice,” Holmes described the nature of a closed textual tradition (without contamination due to cross-pollination) and of an open tradition; the Greek New Testament textual tradition is an open one. Then Gerd Mink read his paper, “What does coherence tell us about contamination and coincidental emergence of variants,” which was a description of the CBGM method. Here I may refer readers to a detailed account available on-line here. There was also a discussion about the concept of the “initial text” as opposed to the archetype and the original text.

In the final session there were two more papers treating the criteria used in textual criticism: Eldon J. Epp, “Traditional ‘canons’ of New Testament textual criticism: Their value, validity, and viability (or the lack thereof),” and J. K. Elliott, “What should be in the apparatus criticus to a Greek New Testament?” [Epp ran through a survey of the development of lists of criteria and then distributed a sheet with his proposed summary of the key criteria. In his paper he proposed that, in the final analysis, exegesis would be the final arbiter, citing a study of Rom 11:31 which dealt with Paul’s logic throughout the passage as a vital contribution to the old textual problem of the NUN there.] I will only cite Elliott’s opening sentences: “We text-critics are greedy people. We want all evidence in an apparatus criticus. That is the ideal.” [The selectivity of the NA edition can be misleading, e.g. you cannot reconstruct the reading of a particular manuscript through a passage; some types of variants, e.g. spelling, are excluded; and rigorous eclecticism needs more variants, since its practitioners tend to think that the genuine reading could be preserved in any manuscript.]

I am sorry that I have no time to summarize all the papers from the first day (perhaps some of my co-bloggers could help out?). Let me just say that the conference was of major interest and importance, and very well organized. One of the most important objectives was to present the CBGM method, developed by Gerd Mink. This was the main event of day two, and I think the presentation was successful, although many questions of course remain. Unfortunately, there was little time or opportunity for me to take notes during day two, because all my brain cells were busy trying to understand more about the method.