Hermeneutics: An Introduction to the Interpretation of Tradition
Copyright (c) 1996 by Nick Szabo
Permission to redistribute without alteration hereby granted.
Contents
Introduction
Part and Whole
Application
Deconstruction
Hermeneutics of the Law
Gadamer and The Value of Tradition
Computational Hermeneutics
Evolutionary Hermeneutics
Introduction
Hermeneutics[5] derives from the Greek hermeneutika, "message analysis", or "things for interpreting": the interpretation of tradition, the messages we receive from the past. Hermeneutics is usually applied to areas where tradition is considered important in people's lives -- religious texts, legal precedents, and so on. It is usually applied to text, but Derrida has pioneered its application to other media.
Part and Whole
The part/whole debate in hermeneutics is much like the stock-market prediction problem (a message from the present!) -- how many weeks of prices, and how many different prices, should you look at to get any idea of what the market is doing, in order to predict it? Looking at one price tells you nothing, while looking at all the information that might be relevant is infeasible. The part/whole debate has never been fully resolved, but a reasonable compromise has been struck -- look at the part and the whole alternately, in a kind of iterative and cumulative process of learning. Shown as a flow chart this gives us a circular graph, the famous "hermeneutical circle".
Application
Application is where the "rubber meets the road": where the tradition demonstrates its value, or lack thereof, when applied to contemporary life. An application is the "end use" of a traditional text, such as a judge applying the law to a case, or a preacher writing a sermon based on a verse from Scripture. An application is also a new interpretation, a new construction of the tradition. For example, a judge, in the process of resolving a novel case, sets a precedent, and a preacher, in the process of applying a religious doctrine to a novel contemporary moral problem, changes the very doctrine being applied.
Deconstruction
The popular epithet "deconstruction" comes from hermeneutics. "Destruktion", as originated by Heidegger, did not, contrary to its current popular usage, mean "destructive criticism". The term was popularized by Derrida, but in a context where it was accompanied by destructive criticism. Heidegger was very interested in reading philosophy in the original Greek, and noticed that translators tended to add their own interpretations as they translated. These interpretations accumulate as "constructions", and a doctrine, whether translated or reinterpreted in some other manner (for example, a law reinterpreted by a judge), accumulates these constructions over time, becoming a new doctrine. Heidegger, desiring to unearth the original Greek thinkers, set about removing such constructions.
Deconstruction in its "postmodern" construction is usually applied to ferret out a bias one wants to remove, and has tended to get mixed up in the literature with criticism of those biases. So guess what: deconstruction has acquired a new interpretation, a new construction, "destructive criticism". But deconstruction in its original sense is not a criticism at all; it is simply a theory about how traditions evolve, namely via the accumulation of constructions, along with a methodology for ferreting out constructions that have, for some other reason, been deemed undesirable.
Of course, the above analysis is itself a deconstruction of the term "deconstruction".
To continue our reflexive deconstruction, and thereby learn some more about its method and use: Heidegger was in turn inspired by earlier hermeneutics, in particular the Reformation Biblical translators like Luther who, in our postmodern parlance, were trying to deconstruct the Catholic Church's interpretations to get back to the supposedly inspired original text. Removing Roman doctrines such as tithes, indulgences, and spiritual loyalty to Rome had economically and politically beneficial effects for un-Romanized Europe[6], so there was quite a motivation for this seemingly obscure task. Of course, modern scholars have deconstructed further and found that there was no "original" text but an evolution of texts from the Essenes, the Dead Sea Scrolls, St. Paul, and then (finally) the Gospels.
Hermeneutics of the Law
Natural law theorists are attempting a Heideggerian deconstruction when they try to find the original meaning and intent of the documents deemed to express natural law, such as codifications of English common law, the U.S. Bill of Rights, etc. For example, the question "would the Founding Fathers have intended the 1st Amendment to cover cyberspace?" is a paradigmatic hermeneutical question. Before one can answer it unbiasedly, one must discover in oneself, and put off to the side, post-Founder constructions (such as "scarcity", "public interest", and "fairness") that gave rise to FCC control over TV and radio usage and content (including severe restrictions of person-to-person "amateur" radio), and so on. Once one has dissected, via this hermeneutical/historical analysis, these constructions from the original, one can examine the value of both the original intent and the constructions. One can, if one is inclined to favor the Founders over early 20th century socialists, proceed to criticize the socialist constructions, but this is a separate task of normal criticism (which the Derrideans falsely conflate with "deconstruction"). In this case even a Derridean critique favors not a left-wing view but a libertarian/Ur-traditionalist view of cyberspace, as most of the post-Founder constructions that gave rise to radio censorship can be shown to be economically silly ("scarcity"), inapplicable (the broadcast model), and so on. This creates a clearing for the original intent to shine through and reignite itself in cyberspace.
Gadamer and The Value of Tradition
The purpose of Heidegger's deconstruction was to recover prior or primordial forms of a living tradition, and the issues that led to its formation. Heidegger's main concern was the dialogue that led to the Aristotelian static concept of "is". Heidegger enhanced the static "is" with a rich, dynamic concept of being that allows us, for example, to talk about Darwin's algorithm in philosophical circles without being accused of "tautology".
There have been several superficial (aka "radical") crypto-Marxist philosophies constructed using a gloss of Heideggerian language, such as those of Sartre, Derrida, et al., but I take these only as a cautionary tale to balance Heidegger's influence on Gadamer, who is probably the most eloquent and thoughtful defender of tradition in our time.
Gadamer[5] saw the value of his teacher Heidegger's dynamic analysis, and put it in the service of studying living traditions, that is to say traditions with useful applications, such as the law. Gadamer discussed the classical as a broad normative concept denoting that which is the basis of a liberal education. He discussed the historical process of Bewahrung, cumulative preservation, that, through constantly improving itself, allows something true to come into being. In the terms of evolutionary hermeneutics, a tradition is used and propagated because of its useful application, and its useful application constitutes its truth. Gadamer also discusses value in terms of the duration of a work's power to speak directly. This involves two dimensions: time and accessibility.
Questions of original intent vs. later construction are often more difficult than many natural law theorists are willing to admit, and more important than anti-traditionalists are willing to admit. The success of Derrida-following "deconstructionists" in defaming tradition stems in large part from traditionalists' technological inferiority, their lack of skill in using the hermeneutical tools of Heidegger and Gadamer. The good news is that anti-traditionalist hermeneutics is self-destructive; seeing no value in tradition, it will accumulate none of its own (that it will admit to). It will thus remain superficial. Their drive to work out their post-Marxist frustrations by intellectually vandalizing tradition will soon be spent, and they will move on to something else.
Gadamer provides an in-depth treatment of how to go about doing a historical/hermeneutical analysis, along with why it is important. His rich point of view has many resonances with the thoroughly rational field of algorithmic information theory, a resonance I am now happily studying. One especially strong concept from this area is sophistication, an objective measure of the computational replacement cost of a structure. Replacement cost, combined with a long history of applicability, gives us a very strong indication of a tradition's value. One practical argument that springs from this is the costliness of error: a high cost of error from straying from a tradition -- for example, the high cost of slavery to personal lives -- justifies the propagation and use of a living tradition (in this case, a law against slavery).
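For readers who want the formal version, here is a hedged sketch of one standard AIT definition (see [9]; this gloss is mine, not Gadamer's): the sophistication of a string x is the complexity of the simplest finite "structure" S that captures x's regularities, leaving only incompressible detail:

    \operatorname{soph}_c(x) = \min \{\, K(S) \;:\; x \in S,\ K(S) + \log_2 |S| \le K(x) + c \,\}

where K denotes Kolmogorov complexity and c is a small slack constant. A tradition of high sophistication is, in this analogy, one whose structure would be costly to rediscover from scratch.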
I don't think Gadamer's analysis is complete without being synthesized with Hayek; nor is it fully convincing in rational terms, nor accessible to those who demand objective or scientific reasoning, unless put in the formal terms of algorithmic information theory. Hayek's ideas of unobservable subjective and observable intersubjective value can be seen as reifying the idea of application as hermeneutical truth. The question of when intersubjective truth is preferable to objective (standard scientific) truth may be mathematically formalizable via algorithmic information theory.
Computational Hermeneutics
I suggest that the "hermeneutical circle" of part and whole can be formalized along the following lines -- the more bits of pattern, the more information we have; but it is infeasible to learn from the whole. So we need algorithms that scan larger parts for the easy regularities, and smaller parts for the difficult regularities, then compare, abstract, and synthesize what we have learned about the parts, and so on. This is a whole field full of algorithms to discover: algorithms that approximate in polynomial time the uncomputable solution to the problem of learning from the whole. Put most generally, the problem of learning the whole is formalized as a matter of finding all regularities in the whole, which is equivalent to universal compression, which is equivalent to finding the Kolmogorov complexity of the whole[9]. This formal method of analyzing messages is, not surprisingly, derived from the general mathematics of messages, namely algorithmic information theory (AIT).
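As a minimal, hedged sketch of this idea (my illustration, not an established algorithm): the snippet below uses zlib compression as a crude, computable stand-in for Kolmogorov complexity, and alternates between smaller and larger parts of a text to see how much regularity a cheap compressor finds at each scale. All names are illustrative; note that real compressors carry a fixed overhead that dominates at very small scales.

    import zlib

    def compressed_size(data: bytes) -> int:
        """Length of the zlib-compressed data: a rough, computable
        upper bound on the (uncomputable) Kolmogorov complexity."""
        return len(zlib.compress(data, 9))

    def regularity(data: bytes) -> float:
        """Fraction of the data explained by easily found patterns:
        1 - compressed/original (can go negative for tiny inputs,
        where the compressor's fixed overhead dominates)."""
        return 1.0 - compressed_size(data) / len(data) if data else 0.0

    def scan_scales(text: bytes, window_sizes=(256, 1024, 4096)):
        """One trip around a crude 'hermeneutical circle': measure
        regularity in small parts, larger parts, and the whole."""
        results = {}
        for w in window_sizes:
            windows = [text[i:i + w] for i in range(0, len(text), w)]
            results[w] = sum(regularity(win) for win in windows) / len(windows)
        results["whole"] = regularity(text)
        return results

Comparing the per-scale numbers is the point: regularities visible only at the larger scales are the "whole" speaking; those visible in small windows are the "parts".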
This formal model will apply most directly where the situation is formalizable, for example to induction from messages from the environment: scientific data. Formal models from AIT such as information distance, logical depth, etc. can also be usefully applied informally. Indeed, "distance" has long been used in hermeneutics, again showing the strong similarities between these disciplines heretofore seen as about as distant as one could imagine -- AIT at the forefront of modern computer science, and hermeneutics a seeming throwback to Reformation theology.
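For instance, the information distance of [9] has a popular computable approximation, the normalized compression distance; here is a hedged sketch, with zlib again standing in for an ideal compressor:

    import zlib

    def c(x: bytes) -> int:
        """Compressed length, approximating Kolmogorov complexity."""
        return len(zlib.compress(x, 9))

    def ncd(x: bytes, y: bytes) -> float:
        """Normalized compression distance:
        (C(xy) - min(C(x), C(y))) / max(C(x), C(y)).
        Near 0: the texts share most of their regularities;
        near 1: they are informationally unrelated."""
        cx, cy, cxy = c(x), c(y), c(x + y)
        return (cxy - min(cx, cy)) / max(cx, cy)

Applied to two editions of a text, such a distance is a crude, objective echo of the hermeneutical "distance" between a reader and a tradition.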
Evolutionary Hermeneutics
Spencer, Popper, and Hayek pioneered the study of culture developing via a process analogous to Darwinian natural selection; this study is called evolutionary epistemology[1]. Dawkins[2] introduced the notion of a "meme" as a "selfish" unit of cultural information, suggesting that at least some cultural evolution operates according to the neo-Darwinian algorithm of variation and selection between units of information. In biology this process gives rise to genes that produce organisms, as if[4] for the purpose of their own propagation. In the view of memetics, many cultural traditions can likewise be analyzed in terms of how they function in their own propagation. Heidegger[3] pioneered modern hermeneutics[5] as the study of constructions. This article suggests a synthesis of these traditions into an evolutionary hermeneutics.
Dawkins' memetic theory adds the interesting idea of the "selfishness" of particular parts. With genes this is easy -- genes close to each other in sequence are more likely to cooperate, because they are less likely to be broken up. The only directly competitive genes are alleles, the alternative encodings for each gene. With memes, things are a lot messier. But memes also tend to clump into cooperative groups, and by comparing cooperative groupings we can sometimes recognize things that look like "sites" for "alleles", e.g. the points of doctrine and practice in which Lutherans and Catholics diverge.
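As a purely illustrative sketch (the doctrinal summaries are simplified, and the encoding is mine, not the essay's), one can line up such "sites" and "alleles" explicitly:

    # Two co-descended traditions, encoded as doctrine -> position.
    lutheran = {"justification": "faith alone",
                "authority": "scripture alone",
                "sacraments": 2}
    catholic = {"justification": "faith and works",
                "authority": "scripture and magisterium",
                "sacraments": 7}

    # "Alleles": the alternative values competing at each shared site.
    alleles = {site: (lutheran[site], catholic[site])
               for site in lutheran if lutheran[site] != catholic[site]}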
Analyzing the deconstruction methodology of hermeneutics in terms of evolutionary epistemology is enlightening. We see that constructions are vaguely like "mutations", but far more sophisticated -- constructions are introduced by people attempting to solve a problem, usually either of translation or application. As noted above, in construction the judge, in the process of resolving a novel case, sets a precedent, and the preacher, in the process of applying a religious doctrine to a novel contemporary moral problem, changes the very doctrine being applied.
Thus, the Darwinian process of selection between traditions is accompanied by a Lamarckian process of accumulation and distortion of tradition in the process of solving specific problems. We might expect some constructions to advance a political ideology, or to be biased by the sexist or racist psychology of the translator or applicator, as some of Derrida's followers would have it. However, these kinds of constructions can be subsumed under two additional kinds of construction suggested by the evolutionary methodology: synthesis and biomotivation.
Synthetic construction consists of one or more of:
* the development of a new element that synthesizes traditions
* the synthesis of texts or parts of texts into new texts
* new texts which incorporate such syntheses
I should quickly note that I mean "synthesis" in the broad sense of a consistent combination of subsets of two or more traditions; I hardly suggest a simplistic dialectical evolution of the kind put forth by Hegel. A toy sketch of such a combination follows.
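A minimal sketch of that broad sense, using the same hypothetical encoding as above (traditions as doctrine -> position maps):

    def synthesize(a: dict, b: dict) -> dict:
        """Consistent combination of two traditions: keep the points
        they share or that only one asserts; drop conflicting sites,
        which would require a genuinely new synthetic element."""
        merged = {k: v for k, v in a.items() if k not in b or b[k] == v}
        merged.update({k: v for k, v in b.items() if k not in a})
        return merged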
Biomotivated constructions derive primarily from biological considerations: epigenetic motivations as studied by behavioral ecology[2, 8], or environmental contingencies of the period, such as plague, drought, etc. These may be harder to discover, given the controversies over the nature/nurture problem: for example, a construction to a text dealing with homosexuality may involve the epigenetic sexual orientation of the constructor, in interplay with one or more of the cultural modes of construction.
Selection then operates on these constructions when differential propagation occurs -- more books of one sort than another are published and read, more scientific papers cite this paper than that one, and so on. In other words, we get unsophisticated mutations, such as distortion during translation, or sophisticated Lamarckian constructions[7] -- made to solve a novel problem not directly addressed by the tradition, to synthesize co-believed traditions, and/or due to biomotivations -- followed by differential propagation, or Darwinian selection. During these stages of traditional development, we predict that traditions will be selected to behave, i.e. to motivate further propagations and constructions, as if designed[4] to propagate themselves at the expense of other traditions that perform similar applications or satisfy similar human psychological needs. However, the constructions of translation and biomotivation, and the particularly sophisticated constructions produced by application, will also often play a large role in determining the function of a tradition.
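Putting the pieces together, here is a toy simulation of the model just described -- Lamarckian construction followed by Darwinian differential propagation, in the spirit of the genetic algorithms of [7]. Every function and number is a hypothetical stand-in, not a claim about real traditions:

    import random

    def construct(tradition):
        """Lamarckian variation: a copy of the tradition with one
        problem-solving edit recorded on it."""
        op = random.choice(["translate", "apply", "synthesize",
                            "biomotivate"])
        return tradition + [op]

    def propagation_success(tradition):
        """Hypothetical stand-in for how widely a variant is copied
        (published, cited, preached, ruled upon)."""
        return random.random() * len(tradition)

    def generation(population, carrying_capacity=10):
        """One round: construction, then differential propagation."""
        variants = population + [construct(t) for t in population]
        variants.sort(key=propagation_success, reverse=True)
        return variants[:carrying_capacity]

    population = [["original text"]]
    for _ in range(20):
        population = generation(population)
    # Surviving lineages carry histories of constructions that, as if
    # by design, favored their own propagation.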
This view of tradition unifies evolutionary epistemology and hermeneutics, and suggests a broader range of analysis for the study of religion, law, and other institutions based on tradition.
Refs:
[0] Szabo, _Objective Versus Intersubjective Truths_
[1] Radnitzky & Bartley, eds., _Evolutionary Epistemology_
[2] Dawkins, _The Selfish Gene_
[3] Heidegger, _Early Greek Thinking_
[4] Of course, something need not be consciously designed for a function
to act as if it had been designed for a function. See [2]
and Dennett, _The Intentional Stance_
[5] Gadamer, _Truth and Method_ is a Heidegger-inspired survey of
hermeneutics
[6] This resonates with a curious fact of political geography: the
line between nations that settled on being either Catholic
or Protestant almost neatly follows the Rhine-Danube frontier
between the Romans and the Germans which persisted for 400
years after Julius Caesar, but had been interrupted by Christianizing
Germans 1,000 years prior to the Reformation.
[7] "Genetic operators" refer to the kinds of editing operations
performed in genetic algorithms, a practical computer implementation
of Darwinian evolution. Here the "operator set" refers to the
set of possible constructions or "editing operations" on a
traditional text: translation, application, and now synthesis
and biomotivation.
[8] Wilson, E.O. _Sociobiology_
[9] Li & Vitanyi, _An Introduction to Kolmogorov Complexity and Its Applications_