Sometimes I forget I'm using philosophical methodologies uncommon in the US/UK, and invariably I get brought up short by bewildered (even sometimes annoyed) responses when I start analyzing a text/work. Guess that means I'm overdue for providing, hrm, not a disclaimer... so much as an explanation of my process and why it takes shape the way it does.
I rely most heavily on a style of argument called the dialectical process. It begins by setting forth a thesis, which is distilled to the point that one can then posit the antithesis. Then, by studying the tension between the two, one can arrive at a synthesis. In practical application, it ends up reading like a series of swings back and forth, from the thesis/positive to its antithesis/negative, until slowly each is modified in light of the other to reach a synthesis. That final step is not necessarily a conclusion, per se, so much as an assessment of the tension that exists between the two original points.
Yes, it's very similar to the Socratic method. No, it's not really that common in argument-styles in the US/UK. Yes, it originated from philosophy, but it has been applied as a type of literary criticism; it's a malleable format. Yes, it had a huge impact on me, being introduced to me at a point in my life when I was still pretty close to a blank slate when it came to any significant analytical skills. No, I don't really use it formally, these days, but I can reread even off-the-cuff posts and see the pattern remains a strong one.
In some sense, this swinging back-and-forth is a kind of triangulation, especially when it goes hand-in-hand with deconstruction. Which, I should also explain, is not the same as common usage -- to de-construct -- because it's not simply a matter of taking apart the whole to end with the numerable parts.
Common critical analysis usually does reduce to parts, such as, "boy fish meets girl fish, boy fish loses girl fish, girl fish dies going over hydraulic dam." Reductionism then determines the parts consist of boy fish, girl fish, and hydraulic dam, and by reductionism's lights thereby concludes that the 'truth' of the whole is equal to two fish and a big honking hydraulic dam.
That's not deconstructionism, though. Deconstructionism is the step beyond reductionism: it may say, for the story to work (for parts to come together as whole), we are required to assume, or take for granted, that boy fish is X, or that girl fish is Y, or that all hydraulic dams are Z. That leads to asking: what if we instead posited that boy fish is Y? What does this shift in understanding do to the parts, and in turn to the whole?
Deconstructive philosophy, or deconstructive literary criticism, is concerned with questions such as, "what assumptions are we making? what do we have to take for granted in order to accept this argument's conclusion?" It seeks the premise, the core concept or thing that underlies the myriad parts. From there, it explores the buried cultural or philosophical or literary assumptions, to question the validity of the core/premise and possibly even the conclusions within the overall whole. Where reductionism sees the parts as equal to the whole, deconstruction is more hermeneutical; its primary questions are informed by interplay between parts and whole, as it explores and overturns hidden assumptions in the discrete parts.
I can't really explain this succinctly without jargon, honestly, so bear with me.
So, hermeneutics. It's a kind of cousin to the dialectic process; it's mostly used as a method of religious criticism/analysis. (Not always, though; my favorite philosopher, Martin Heidegger, was a big advocate for broadening hermeneutical applications, one reason I take that approach myself, I 'spect.) Here's my best shot at hermeneutics in thirty words or less: it's a way of understanding the whole via an understanding of the tension between the parts and the whole.
It's also a swing back-and-forth maneuver, only this time between the whole and its parts rather than a thing and its opposite. The key is that hermeneutics' perspective is that neither of these, the whole or its parts, can be understood in a vacuum, nor can either be understood if separated from the other. The other important detail about hermeneutics is that the textual meaning is entirely contextual, that is, grasped only via grasping the text's philosophical or cultural or literary grounding. Where deconstructionism seeks to identify the underlying assumptions in the parts, hermeneutics seeks to interpret/analyze the parts in light of a greater external context, be that cultural, philosophical, literary, etc.
So, to sum up: reductionism breaks a whole into its parts and considers these equal to the whole, interchangeable as 'bunch of parts' or as 'one big whole'. Deconstructionism explores the premise inherent in the parts as means to question -- possibly even overturn -- stale or outgrown concepts that may be preventing newer/better understandings of the whole. And finally, hermeneutics looks for contextual grounding that may clarify the parts as well as explicate the tension between the parts and the whole.
In some ways, I'm not entirely comfortable with hermeneutics; when taken to an extreme, it starts to imply author intentionality, a philosophy with which I disagree strongly. But shy of that, it is a valuable and informative method of seeking potential meanings (though rarely the meaning), by placing the thing, story, philosophy, etc, within a fuller comprehension of its original context.
One might even say that hermeneutics is what's being practiced in classrooms across the US, every time students read Huckleberry Finn: unless and until the students understand the historic and cultural framework that Twain was writing in, and against, a lot of the characterizations, motivations, even dialogue, are either going to be missed or misread so utterly as to render the text meaningless (or worse, meaningful only in a strongly negative sense). So the teacher is applying a hermeneutical approach, shifting between the text as-a-whole and the historical and cultural clues in the text-as-parts that may in turn inform the whole.
If you read my posts on a semi-regular basis, you'll probably see this strong aversion to author intentionality come up again and again. The gist goes like this: if we could just talk to the author, then we'd understand what the text means. But author intentionality is a fallacy; when the author is the sole purveyor of a text's meaning, then anyone else's interpretations are, at best, only educated guesses. (And even that much only so long as the interpretation works from a position of trying to get into the author's head and guess at the intentionality.) The position outright renders all criticisms as false on their face, with the sole exception of those provided by the author/creator.
I think that attitude isn't just hogwash, it's also a complete cop-out. It presumes that the value or meaning of a work exists solely within the authorial framework and that the reader brings nothing to the table, and can only take away what the author gave. Nonsense, total nonsense. A work, upon dispersal into the wild, becomes its own thing, independent of the author's intentions. It doesn't matter what Twain intended, at this point, if we're trying to understand variations in the interpretations. The best (and perhaps most pragmatic) approach now, I say, is to deconstruct the tension between the text and the reader, to find the basic assumptions that led to this interpretation -- and question those instead. Leave the author out of it, because the interaction is not between author and reader, but between text and reader. If you ask me, arguing otherwise is both misleading and unproductive.
There are good authors and bad, but I disagree that an author/creator always carries full blame for any fuzzy language or imagery, because I do believe all language is inherently fuzzy. Normally this isn't a problem. When I say "apple," I figure this means roughly the same to you as to me, give or take a few details, and that's good enough. But when you're going down to the molecular level to analyze the parts, getting it "approximately right" isn't good enough.
Plenty of sciences realize this; it's why we have jargon in every field. Come on! When I'm purchasing lumber, to say "a two-by-four" may suffice in casual conversation, but in application I must specify "select structural southern white pine, ten-foot, two-by-four." Each additional adjective is necessary to elucidate a precise meaning. I don't want my roof to fall in just as I finish the last skim coat of drywall, so of course I don't want the lumberyard misunderstanding me and selling me No. 2 Non-Dense southern white pine instead.
(Because the roof falling in? Bad. Very bad.)
Fuzzy language is very much a part of poetry, of literature, even of the mysteries in religions; it's fuzzy language that allows us to unconsciously reference cultural values without specifying them (which would alienate those without access). It can let in those who understand without locking out those who don't, because it leaves open the interplay between reader perspective and the authorial values hiding in the work. It's only when one analyzes, deconstructs, that language must -- per any science -- solidify into jargon, which is really just words with very precise definitions that don't always neatly match the looser common usage. And that jargon, admittedly, does lock people out, if they don't know the terms or understand the references.
That's just fine, if you ask me; philosophical discourse isn't meant to have -- and doesn't have to have -- a broad appeal, any more than medical studies are written to be published in the latest AP bulletin. The audience is different. But it's also why sometimes I get frustrated that I can't just throw out the jargon and have readers understand what I mean via a shared, specific definition; this audience is not trained to have those ears. (That's a huge part of why my posts will go on and on, as I try to compensate for the schism between the terminology in my head versus the fuzzier analogies I tend to use because I don't care for writing a massive glossary with each post.)
The fallacy of author intentionality leads me to modernism, as that fallacy is a risk of the modernist perspective, which rests on a scientific or objective take on things. Modernism posits that there can be a Truth to a thing -- and I mean that capital letter very intentionally -- as an absolute Truth, in and of itself, independent of external cultural or philosophical contexts. You can probably see how modernist approaches are in total contrast to hermeneutical methodologies, because the latter posits that the tension between the parts and whole is clarified and/or grounded in a contextual understanding. In other words, modernism would say, "to understand girl fishes and hydraulic dams, we only need to look at this girl fish, and this hydraulic dam."
Postmodernism is subjective, relativistic; it would say, "other girl fishes have survived hydraulic dams," or even "other boy fishes don't lose girl fishes in the first place." It's true that at the outer extreme of postmodernism, the philosophy can become almost parody, self-reference so far gone as to become oppressively self-conscious to the point of pretentiousness. Television shows sometimes have modernist streaks in their storytelling; one example is the now-hackneyed "flashback episode" when the now-time consists of characters discussing, and then mutually "remembering", a sequence from an earlier episode. That's referential but only towards the self (internal), thus modernism; postmodernism caps that by having characters imply or even state outright that they're aware they're characters (aka breaking the fourth wall). Both refer but where modernism points inward, postmodernism turns outward.
If the postmodern streak is taken to its extreme, and the characters go so far as to analyze their reality via reference (either internal/self or external/breaking the fourth wall), then we get into metanarrative, "a story about a story". A critical analysis may create a metanarrative without getting this label; the postmodernist aspect appears when this metanarrative is created or revealed within the text.
That's where the pretension comes in, I think: it's like someone writing their autobiography and going into depth on what their life means in an overall scope of the world-in-general. It requires you follow along as they attempt to be both object and subject, and I just don't think that always works. (I'm not saying I'm formally Cartesian, far from it, but that I still agree one cannot truly create a metanarrative of one's role from within the role, because we just can't get out of where-we-are to see anything as-it-is.)
Modernism seeks order; it marshals rationality out of chaos. So for literary or philosophical criticism, (it seems to me) limiting a work's criticism to only those parts within the work -- devoid of context -- is a way of limiting the potential fragmentation that can occur when you drag in context after context after context.
On reflection, this dismissal of (or disinterest in) contextual fragmentation is a sign that someone is thinking with a modernist cap on. That perspective is most often to blame (in my experience) for seeing textual deconstructionism as "over-analyzing" or "referencing external (and thus by definition unrelated) things". Which is not to say I'm arguing that doing so is better somehow, and neither am I criticizing in turn those folks who prefer the modernist mindset; I'm only pointing out that the two approaches are fundamentally opposite. If someone prefers the other, they're just not going to enjoy being stuck over on this side of the divide.
In modernist thought, a work can speak for itself (whether or not author-input is required); the drive to tease out every blooming possible cultural reference would be seen as a lot of noise and fury signifying nothing. I get that, I really do. But I'm not a modernist, and while it can be overwhelming to come at literary critique with a postmodernist "everything and the kitchen sink" attack, I find it far more satisfying on a personal level than limiting myself to only what's present in the work, meta-free. Postmodernism leans hard towards the absurd, and I mean that in an existential sense; it seeks the fragmentation and the chaos, and in a hermeneutical framework will often pile on more and more and more obscure cultural or philosophical contextual references.
As a last note -- because this is a perspective ingrained pretty deep in me -- existentialism is basically the position that existence precedes essence. From a religious standpoint, this does not deny a godhead nor does it posit automatic atheism, but it does question the use/value of a soul; religious existentialism takes the attitude that any 'spirit' element is shaped by the body's experiences, to the point that one might even say there was no 'soul' prior to the empirical existence.
On a more secular level, existentialism defines who-we-are as a product of what-we-have-experienced. It's really that simple. The essence of who-I-am is informed crucially, almost completely, not by some pre-determined fate but by my experiences growing up in Georgia, living in Rhode Island, owning a bookstore, and so on. Had I been exposed to different experiences, the essence of who-I-am would be so radically altered as to be a completely different person; in contrast, an essentialist believes that we are fundamentally ourselves regardless of overlaid experiences.
The connection between existentialism and postmodernism comes into play through the concept of the absurd, which isn't related to the funny except in the blackest (and if you're reading Sartre, a slightly morbid) kind of random humor. In existentialism, because our experience is what defines/shapes us, then all experience is positive (that is, positive in that it has an impact on our essence) and the lack of predetermined fate means that all possible experiences are open to us. If we choose to run off and join the circus, this is no more a valid or invalid option than if we shave our heads and join the merchant marines. Both are experiences, and experience as such is a neutral thing. The absurdity comes into play when one attempts to place a (usually moral) value on this experience over that experience.
The jargon-word is 'absurd,' but one could easily say, "that's nonsensical!" just like we might if someone announced that the next game move could be four steps to the left or four to the right, but that going to the right is a Bad Thing and thus any results of such a move would therefore also be Very Bad. When there is no assignable or predetermined validity to this option over that option, then to arbitrarily assign value to experience X over experience Y is a ridiculous, even unwarranted, justification.
Postmodernism's use of the concept parallels this: to designate that these sets of external references can be used to understand a text, while those sources cannot, is absurd in the same sense. All references are potentially valid and/or usable given our lack of empirical knowledge; we cannot know that an experience will lead to death on a hydraulic dam, though we may be able to guess. Existentialism's, and postmodernism's, application of the absurd does not deny the use of guessing to determine one's path -- "this looks like a better choice, given what I know of this kind of situation based on similar things in my past" -- but does consider absurd the idea that there is ever absolute certainty.
In pure existentialism, there is no certainty, none at all. I think sometimes that's why people confuse it with Nihilism, which is a bleaker philosophy that says "if all options are equal, then it doesn't matter what I do." Or more melodramatically, "if all options are equal, I'm screwed no matter what, so there's just no point in trying."
Existentialism doesn't play that card. The lack of value judgments on the potential is no reason to avoid experience completely. Existentialism merely says that to place value judgments -- good or nihilist-bad -- is absurd. "Nothing is true; everything is permitted" is not a source of fear per Nihilism, but a source of unmitigated freedom.
Ah, after all that, I could get into the issues of teleology, epistemology, and my fascination with social constructionism... but I won't. I'll just sum up: my style of deconstructive synthesis ultimately focuses on a post-modernist ludic ontology of strong social constructionism*. I may not use these methodologies absolutely formally, but that's the gist of my analytical process.
In ordinary language I might therefore conclude: if you think a work should stand on its own without cultural metanarrative, if you think the 'truth' of a work does not require external comparison, if you dislike the chaos of exploring a story's small details... you might, in the future, just skip my posts when I get into analysis. It's okay, really. You can take the modernist road, I'll take the postmodernist road, and I'll be in Scotland before you.
* See how much easier it is to be concise when using jargon? That took me this entire post to say lingo-free.
no subject
Date: 3 Dec 2008 11:43 pm (UTC)
I found this a constant source of frustration in grad school, because to disdain entire schools of philosophy was the opposite of open-minded, which I had assumed philosophy to be at its heart, but it, alas, is not.
no subject
Date: 3 Dec 2008 11:53 pm (UTC)
(I refrained from calling it 'continental' because I wasn't sure whether that had an obvious meaning to US folks.)
I actually never took a literary criticism class in college, hrm, except for one class on women playwrights. Everything I've ever applied to literary criticism has come from a background of studying theology, and then later, philosophy.
Also, icon lurve!
no subject
Date: 4 Dec 2008 12:25 am (UTC)
; )
no subject
Date: 4 Dec 2008 12:35 am (UTC)
Sorry, I think my brain is burnt at this point... that was a lot to try and make clear for non-philosophy majors. WAH. I am not doing another post like this one for a long, long, time. I'm frazzled now.
...although come to think of it, the philosophy professors I had were also Continental. One was French-trained (read Sartre in the original and all I could say was, "man, sorry to hear that") and the other was from Denmark. Neither of them ever gave me the least grief about the fact that by that point I was solidly in a dialectical framework for every essay and paper I wrote, but I'd never connected their own training with their acceptance of my argument style. Probably because, now that I think about it, none of my theology or philosophy professors were US-educated. Hunh!
no subject
Date: 4 Dec 2008 02:20 am (UTC)
no subject
Date: 4 Dec 2008 12:08 am (UTC)
Honestly? I'm cool with medical genetics, but this kind of thing makes my head spin.
no subject
Date: 4 Dec 2008 12:12 am (UTC)
It's okay. Medical genetics makes my head spin. (But in a very good way, honestly.)
I like it when my head spins. Fun stuff happens.
no subject
Date: 4 Dec 2008 12:30 am (UTC)
Having said that, I have a confession to make: I actually prefer some real ales warm. The good real ales have extremely complex flavours, and the subtleties of the flavours are much more apparent and available when the beer is room temperature. Cold is definitely better on a hot day, but I also drink the stuff for the taste.
Oh, yeah, and sometimes the good headspinny effects, too.
no subject
Date: 4 Dec 2008 12:38 am (UTC)
Yes, that was my UK experience: leave the UK on a morning that my sister and I are in thick tights, plaid skirts, and fluffy jackets and just right against the 50F or so morning temp... and land in Montgomery Alabama eight hours later and it's 99F with 90% humidity. The entire family spent the next month laying around in unending misery. Sigh.
no subject
Date: 4 Dec 2008 01:00 am (UTC)
Hrm, well, yes -- ALL the natives, in fact. "British" refers to Great Britain: English, Scottish, Welsh, and a number of outlying islands like the Isles of Scilly and the Hebrides. Sorry, just a niggle... did you mean that there must be a lot of English in Aberdeen?
Meh, I don't know. Where were you drinking, and when? I am extremely familiar with a large number of Aberdeen pubs, and I can't think of any which regularly serve warm beer -- although occasionally you get something just freshly on tap, which hasn't been fully chilled.
And, 70 degrees here? My god, you had a warm day! Extraordinarily warm! We generally only get a few of those per summer! Wow!
no subject
Date: 4 Dec 2008 01:12 am (UTC)
I doubt it's a pub per se, but one of those places every university has. We stayed on the university proper, since my father was presenting his doctoral dissertation. Then we headed to Edinburgh after that, and he presented it for another symposium, there.
Yes, we were told REPEATEDLY that it was "the warmest summer on record EVAH!!!eleventy-one!!111!!".
Meanwhile, we were all in sweaters and tights and wool caps and my mother was buying handknit sweaters like crazy to layer on top of what we'd brought. The locals are running around HALF-NAKED at the shore (or what passes for one) and even GETTING IN THE WATER while my sister and I stood there looking terrified. We were honestly convinced these people were INSANE.
Maybe I should mention that most of my childhood, I spent time nearly every summer on the Gulf of Mexico, down in Mississippi. When the air's 90-something and the water's 80-something, the idea of getting half-naked on a volcanic-rock black shore and going anywhere near water that's probably about 30F is just about my idea of sheer insanity.
Though we did have fun chasing golf balls.
no subject
Date: 4 Dec 2008 01:28 am (UTC)
Oh, you're kidding. University of Aberdeen? When, when????? "Warmest summer ever" -- '95?
Are you sure it wasn't a "pub proper"? Most of the University of Aberdeen drinking is done either at the Machar*, on High Street, or the Bobbin over on King Street. If there is an illicit drinking den on campus, then by damn I want to know about it, so I can check it out....
-----------------------
*The Machar is not an illicit drinking den. It is a perfectly licit drinking den. If a bit on the small side.
Edit to add: To be fair, I did spend my first year in Aberdeen wrapped in multiple layers of jumpers and huddled next to the radiator.
I did grow up in Colorado, and my mother held that it was too cold to go swimming until it hit 80F. When I tell my friends here that, they just about piss themselves laughing.
...Have you ever heard the Billy Connolly routine on swimming in the North Sea?
no subject
Date: 4 Dec 2008 02:13 am (UTC)
Talk about timing, while I was trying to reply, my dad called. He was a little startled as to why I'd be asking, but here's the answer as best he can recall: it was a faculty lounge or pub, on campus. The beer was not "warm" (though to my parents' taste buds it was), but "just not ice cold, or really all that terribly cold, but not warm." My mother's favorite after the trip was some kind of beer with a shot of Rose's lime in it -- no, not sure, didn't catch entirely what my father was saying. The lemonade drink may have been shandy? About 1% alcohol due to some beer being in there, he said.
As for temps, as soon as any of us opened our mouths, people could hear the Southern accents, so we got a lot of "how're the colonies" along with questions about just how hot it really was over there. The time I went into complete shock at seeing ICICLES under a CAR BELLY in MIDDAY was a huge amusement to a number of old men in a pub, I recall.
Oh, and it was JULY. Yeah. Icicles. Freaking ICICLES!
no subject
Date: 5 Dec 2008 01:51 pm (UTC)
Right, I wasn't in Aberdeen, then.
Oh well, I guess that means I don't have to feel bad about missing you!
Re: icicles in July -- we don't get that so much, now. What with all the climate change, we just get cold rain.
Lots of cold rain. Lots and lots of cold rain....
no subject
Date: 4 Dec 2008 03:42 am (UTC)
no subject
Date: 4 Dec 2008 04:23 am (UTC)
Heidegger's big flip was to put ontology (being) before epistemology (learning): that is, that we must exist before we can interact/learn. Further, to truly explore phenomenology (study of consciousness and conscious experience) we must first be in the world, to be conscious of it. Being and Time intimidates people, but I think they're confusing it with Being and Nothingness by Sartre, which needn't confuse people but should still be all buried ten feet deep in a great big hole and left to freaking ROT because Sartre was like the absolute emo-boy ADD posterchild, I swear, the man could NOT stay on topic for longer than thirty seconds AND he's constantly bummed about the results. He's the emo existentialist.
But anyway. Reading Heidegger is probably better than trying to "learn" dialectic, or hermeneutics, because both are best learned by following along as someone else does it. Heidegger is one of the best, and more importantly, he's very clear and concise in his language (and his translations are almost all quite well-done, too). He'll define a thing, with a concrete analogy, and then speak of it not as an abstract but using that analogy to really communicate.
Being and Time is full of examples like this one: to define 'obtrusive', which is the quality of a thing that intrudes on our awareness only once it is absent, he gives the example of the broken lightbulb. Every time you walk in the room, you flip the switch and the bulb doesn't come on -- so the broken light is obtrusive. It stands out by absence (of its light) and makes you realize it exists. Furthermore, the obtrusive element leads to additional awareness that when you walk a room, you flip that switch even though! you know already the light is broken.
I could totally visualize it. I had no problems at all with the guy. I have no idea why people say he's hard, or stuff. Sigh.
I can't recall whether Heidegger had any shorter works that might be helpful for you, but you might want to look up his lectures. He had a bunch, and I've read some of them. He had to have been an amazing lecturer, because he has a very quiet wit, very subtle, and he's very systematic, steady, and thorough in making sure what he intends to communicate, actually is. Plus, his lectures are shorter and wouldn't require you dedicate half your life to reading Being-and-Time, whew.
Btw, unless you're in Europe, it's doubtful you'd learn any of this in a Philosophy 101 class. I don't know if they'd even mention dialectic, given Masq's comments (above). A theology class might raise the topic, maybe. Literary criticism might, from what Masq says.
Alternately (since Masq is a PhD in this stuff), just reply to her post & ask her! If anyone would know the really good intro "here's what it is and how to use it" so you have tools for the rest of your major, she probably would be your best bet. My background is strongly theological, so outside of Heidegger, anyone else I'd suggest is going to drown you in, well, godstuff.
no subject
Date: 4 Dec 2008 02:05 pm (UTC)
I still do not get why no one, at any point while I have been raving about classics, has ever mentioned "HEY YOU MIGHT WANT TO LEARN SOME STUFF ABOUT METHODS OF INTERPRETATION, I DON'T KNOW, JUST MAYBE IT MIGHT BE USEFUL."
no subject
Date: 4 Dec 2008 07:04 pm (UTC)
The impression I got of Plato was that he has a big emphasis on the eternal/immortal, hrm, the unchanging-ness of things. When you head in that direction, then sure, it would make sense that under those would be a Truth-with-a-capital-T. I was already too deep into the existentialist mindset to have much truck with a strong essentialist, I suppose.
no subject
Date: 5 Dec 2008 12:31 am (UTC)
Gaaaaah I feel like my BA isn't worth the paper it's printed on now. And that I have been wastefully angry at scholars for writing so densely - I thought they wrote like that to be jerks! BUT NO, IT IS AN ENTIRE LANGUAGE, and it is time I learned it properly. What with trying to get into grad schools and all. (Pfffft I can't believe I could tell you when participles are circumstantial or go with a verb, but not the difference between nihilism and existentialism. Faaaaaaaaail.)
no subject
Date: 4 Dec 2008 04:07 am (UTC)
I took several literature classes and if I recall correctly none of them wanted us to look past the text itself if at all possible. And none of them involved philosophies of any type on what texts were. Of course, they were all undergrad low level classes - maybe it gets more fun in the upper levels.
no subject
Date: 4 Dec 2008 04:30 am (UTC)
(Later, back in the room: "hey, what does this number mean on the far left? The one that says 4-3-7?" YES SENIOR LEVEL OMFG GOING TO DIE OMG OMG OMG. Amazingly, I got an A in the class.)*
So I liked the professor and I just kept taking his classes, and I ended up defaulting to a theology/religion degree thanks to that. Eh, well. Beat getting an English degree...
But! As time passes, I realize more and more how unique that course of study may have been. Most "religion" degrees have such a broad spread that you don't really get into intense theology (unless you're at a seminary in which case, well, yeah, not going to be studying radical theology, I'd guess). And the few philosophy classes I took were advanced logic, existentialism, feminist existentialism, the scary stuff. The simpler stuff bored me, after having the crap scared out of me by a senior-level class my first semester in college. Whoops.
Hrm, now that I think of it, the two literature classes I had to take for grad-requirements didn't really bring up external information, either. Didn't stop me from bringing it up myself, but I don't recall whether the teachers got irked. (I'm pretty sure I probably didn't care, either.) In some ways my brain has always leaned towards synthesis anyway, which may've been why I took to this style so naturally.
* I should add, and head
He kinda blinked, opened his mouth, and she slid right past him with the addition of, "in thirty words or less."
He blinked again, exhaled, and began to explain. My notes are still in the library and on the very first page it says:
EXISTENTIA..??? is phil. about exist WHO IS THIS WOMAN ANYWAY DIE DIE DIE
Heh. Something like that. But we've been friends ever since. I figure another day or two and you'll see her reply crowing about the fact that SHE WAS THE ONE who started this whole mental fiasco.
no subject
Date: 4 Dec 2008 08:03 am (UTC)
no subject
Date: 4 Dec 2008 09:59 am (UTC)
Nihilism & existentialism seem to get conflated all the time, and I always end up stating yet again for the record just how they differ. When I say I'm predominantly existentialist, I've had a number of folks ask if I've read Nietzsche, even, and the answer is NO, NEVER. Fffft!
I found Heidegger incredibly accessible, to a great degree because he's a very organized thinker and writer. I've met folks who've not liked him -- most philosophers are ramblers, and I guess you get used to dealing with that, as a student -- and I was even told that the class in existentialism was considered the hardest class in the entirety of my college's offerings.
Dunno; I found Heidegger the easy part. It was Sartre that drove me up the wall. Hate, hate, hate, hate. And not entirely because Sartre is emo all over the place and bleak and depressed (and giving existentialism that bad name with nihilism not because Sartre was a nihilist so much as just constantly bleak) -- but because Sartre is one of the most disorganized writers/thinkers I've ever read. Pages upon pages and while he's using dialectic -- looping around and coming back at stuff -- he's very rambly and meandering about it. It is possible to be quite efficient with dialectic arguments; Sartre just meanders.
And he NEVER SHUTS UP about his friend who skis. I wanted to reach through the book and strangle the man while yelling GET A NEW ANALOGY LIEK NAOW PLS.
Heh.
no subject
Date: 4 Dec 2008 06:13 pm (UTC)
Hello, my name is Inigo Montoya?
Okay, I so fail to suppress, but I'm a nihilistic Philistine, even if I do lean more towards postmodernist most days; you saw that coming a mile away. And I don't think there has ever been a more appropriate post for my default icon than this one. *snickers*
no subject
Date: 4 Dec 2008 06:21 pm (UTC)
But given some rather cranky responses to some posts, I figured it was time to bring out the big guns. I've never cared for baffling with bullshit, but that's not at all the same as the joys of bewildering with REALLY BIG FREAKING WORDS.
no subject
Date: 4 Dec 2008 06:33 pm (UTC)
Really, you've been getting cranky responses? o_O If people aren't looking for in-depth, detailed analyses, wtf are they reading Kaigou posts for?
no subject
Date: 4 Dec 2008 06:58 pm (UTC)
I don't think the newcomers had much warning, so this is the obligatory "what you are in for, noob" post. Get it out of the way and over with, because I've got better things to do than educate the masses on complex theo-philosophical argument forms.
no subject
Date: 5 Dec 2008 01:02 am (UTC)
Though, I could add on thoughts of author responsibility, within the context you are defining, with addenda of genre vs literary, writerly fiction vs readerly fiction, but I won't, because we pretty much agree there already. So think about what you would think about that, and then, that's what I would have said, but prolly with less words, since I have kids, and don't get the time to write out big ones. *snerk*
Carry on....
no subject
Date: 5 Dec 2008 06:25 am (UTC)
Author responsibility? Hmm. I dunno. I do think that once the work is released, it's separated from the author and any authorial input is no better nor worse than any other in terms of authority. I had a long talk with CP about this tonight, and I'd say the only exception to that rule (for me) is when you're analyzing the contents of an interview, the author's direct words as a person -- then, sure, pull in more from other quotes from the author.
But in a complete essay or story or episode or artwork or song or other media? My focus has never been on author and work, but on the dialogue between work and viewer/reader, which leaves author out of the equation. So I'm not sure what you mean by authorial responsibility, since it doesn't really show up on my radar when it comes to analysis.
Genre vs. literary... writerly/readerly? I think I know what you mean, but again, that's something I may ponder as a writer but not as an analyst, really. With an analytical cap on, I really don't distinguish between genre/literary at all -- but then, some of our current "greatest fiction" was once dimestore pulp fiction, so who's to say that genre cannot be literary, or vice versa, in a hundred years?
I guess I'd have to say that my thoughts are, "uh, I dunno," and I'm guessing those wouldn't be your thoughts at all! Guess you'll have to kennel the rugrats and take an afternoon to POST BIG WORDS. Woot.
no subject
Date: 5 Dec 2008 11:00 am (UTC)
Agreed on the pulp beginnings of what are now classics. That's why I snort at those who dismiss genre writing. They don't seem to know literary history very well, so their opinion on that subject isn't worth listening to.
Oddly, I blogged about this 2 days ago. My sister had this to say (some won't totally make sense, since she refers to my post there, but you'll get the basics):
"Barthes discussed the difference between readerly fiction and writerly fiction. In the first type, the reader doesn't have to work toward understanding any deeper ideas in the text, because the basic premises of the accepted ideology remain unchallenged, and so it meets semiotic expectations. The second relates directly to the idea of "watching craft," as you say, in that the reader is asked to interpret cultural symbols in new ways... the writer does this by using craft to challenge expectations, which impels the reader to be more attentive to the content. This challenging of expectations can be found in every literary form (and by that I mean genre), so therefore it seems prudent as a reader to ask not if something is genre or not, but rather does it confound some aspect of the constraints put upon it while still creating a readable work."
I have more to say on this...but kids are bellowing for bath and bed.
no subject
Date: 5 Dec 2008 05:31 pm (UTC)
*rolls around in the quote in glee*
no subject
Date: 5 Dec 2008 11:11 pm (UTC)
Anyway, what I was going to sorta dig into is the idea that one way to look at works of genre, and see how a piece transcends the usual, is by means of part of your own process.
A work that seems like readerly fiction, once you know the cultural/symbolic underlayings of what the author was adding or living, suddenly has much more meaning and is actually writerly fiction.
One can read, say, a Neil Gaiman book, or even a Jasper Fforde book, and enjoy it. If one reads the books and knows what myths, novels, persons, symbolism the author is culling from, then the work takes on much deeper, more nuanced meanings. Those readers are getting more in touch with, imho, author intent (by that, I mean the idea that if they hadn't intended people to look deeper, they wouldn't have bothered to add those things in to begin with). P.D. James and John le Carre are two more genre writers who are so lush, so nuanced in what they craft, that a more savvy reader is going to be looking at a much deeper work than someone who is not paying attention.
Which rolls back around to Author Responsibility: the meat of the work, and how they set out to connect the winding paths of their story to lead the reader to the end place they hope for.
no subject
Date: 5 Dec 2008 11:21 pm (UTC)
I was in fifth grade & had just finished the book when I read that interview with Tolkien and I thought, who CARES what YOU say? You just WROTE it, but you don't control MY head, in MY head, that was an avenging angel come to save teh day if I've ever seen one.
Granted, my interpretations are a bit more layered now, but the fact is that I've also chatted with folks about stuff I've written and been startled when they point out such-and-such a metaphor or how this and that hangs together to echo each other, or a particular image that shows up a few times in a subtle background fashion... and I go, hunh? What are you talking about? You're on crack. Are you on crack? I didn't write that! (Heh.)
Thing is, some works do require the full hermeneutical treatment, where you look at them in a broader cultural sense. And there are other works that don't, not because they're not rich enough, but because they come from an author brain that put stronger limitations on external stuff, and instead went with what the author him/herself twisted around wholecloth. Although to some degree, on a thematic level, no one writes without some basic cultural context...
I think 'transcend' is the exact right verb, however. A classic piece is one that can embrace the here-and-now of its context, but also transcend that to incorporate the universal as well. Without the first, no one'll connect to it on a visceral level, I think, and without the second, no one will still be reading it in a hundred years.
Erm, maybe. But I think so, at least based on what I've seen.
no subject
Date: 6 Dec 2008 12:54 am (UTC)
That essay, if it is indeed she I am recalling, is a few decades old. I wonder if she still feels the same?