Tuesday, December 21, 2010

“Why would anyone do that?”

I happened to see a student I’d lost sometime this semester after my last class, and as we were chatting, she asked why I’m a medievalist. My glib answer is always “Because it’s hard.” More honestly, it’s because medievalists are expected to know a lot, and we get to dabble in history, philosophy, language, literature, religion, art, and so on, regardless of our specialty; those focusing on literature, it seems, have more opportunity to go into these other fields, and that's one of the (many) reasons I've chosen it. I also noted that, as an English scholar, I get to work with poems like this one. When I pointed out the poem’s double entendre, she asked, quite earnestly, “Why would anyone do that?” This got me thinking. I’m always amused by the poem – it’s a bawdy side of the Middle Ages that often gets hidden behind the shining armor, chivalric aspirations, and religious dogma that represent the period in pop culture. Even the naysayers, those who see the period as a nasty, brutish, dirty place, overlook gems like this, because there can be no light in the Dark Ages, the interval between the greatness of Rome and its return via the Humanists or the breaking of Roman Catholic power by the Protestants. This poem is, for me, humor; it’s subtlety; it’s allegory (if you want to be grotesque). Eve Salisbury’s reading* calls attention to the “continuing anxieties of husbands over issues of women's domestic power and the consequences of henpecking that ... suggest that concern over masculinity and reputation for prowess among men remains an issue as firmly entrenched as marriage itself” (p. 23). She later notes that “the lyric parodies love songs usually directed toward a lady” (pp. 271–72), but leaves the reading at that. Another editor, Thomas G. 
Duncan, collects it in a section of “Miscellaneous Lyrics,” but refuses to engage it any further than to note a resonance with Chaucer’s description of Chauntecleer in his Nun’s Priest’s Tale and to obliquely recognize “the sexual innuendo,” which is compared to a nursery rhyme, perhaps to defuse its power.** But the question remains: why would anyone do that? Why would anyone write such a poem? I think my current answer is simply this: manuscripts were precious things, and the texts contained therein were likely read and re-read, studied and reviewed in free time. Although this poem might seem fairly obvious today, and no doubt the innuendo was obvious then, it still takes a little thought to get beyond the obvious, surface meaning (describing the rooster), and the aptness of the description’s alternate meaning is sure to delight the teenager in all its readers, while the more cerebral reader might appreciate the challenge of reading some more ennobling meaning into it. Regardless, it’s a short poem that appears in a single manuscript,*** and would likely be encountered as a brief diversion between more intellectual pieces. While we can only theorize about its original purpose, it's a fun little poem, and that's worth something in itself.

*Eve Salisbury, ed., The Trials and Joys of Marriage (Kalamazoo, MI: Medieval Institute Publications, 2002).

**Duncan, ed., Medieval English Lyrics, 1200–1400 (Harmondsworth: Penguin, 1995), where the poem appears on pp. 168–69 and the notes on pp. 245–46. The nursery rhyme to which Duncan refers is “Goosey, Goosey, Gander.”

***The manuscript is London, British Library, MS Sloane 2593, fol. 10v. “Cock” is preceded by “I syng of a myden,” and is followed by “Omnes gentes plaudite” on the page.

Saturday, November 6, 2010

Too Damn Many

[Welcome back! Sorry for the extended silence — I’ve been catching up on some long-delayed work, after a summer filled with travel. I have a few topics to write about, but let’s get started with a post on vampires.]

Reflecting on horror and monsters over Halloween, I was struck by the glut of vampires we have today. And it seems that my greatest complaint is that there are just too many vampires. It seems to me that most monsters are best as single, threatening entities; while zombies work best as a horde and werewolves might run in a pack, ghosts and vampires, mummies and golems, the hydra and the Frankenstein creature work best as individuals. So the idea that there’s a subculture of vampires, one existing just outside of our own, strikes me as... well, awkward and unnecessary. The great works of horror literature avoid that. Richard Matheson’s I Am Legend works because there are ONLY vampires, but they really don’t have a society. The uniqueness argument I put forth here is turned around — it is not the monster, but the human protagonist who is alone. Bram Stoker’s Dracula, despite its length, has five vampires total: Dracula himself, three unnamed “brides” in Transylvania (who never leave the castle), and poor Lucy, who’s dispatched fairly quickly. This paucity highlights the threat Dracula himself poses. But now we have “good” vampires fighting against their crueler, animalistic fellows — and often some hierarchy in place to govern them all. Seriously? I’m not sure if I should blame Anne Rice (Interview with the Vampire and its sequels) for this subculture, or what. But it’s really getting out of hand. Let me have rogue monsters, unique beings who are horrifying because they break the (taxonomic) rules, not because they are in a category unrecognized by humans (or, perhaps, “prey” would be a better description...).

This is not to say all recent vampire fiction is bad — I really do enjoy True Blood (though I’ve not yet read any of the original Charlaine Harris novels), even if I’ve been avoiding the Twilight saga as much as possible. (Though my curiosity was piqued, it was also fairly well satisfied by “watching” Twilight. And I’ve had a number of trusted friends warn me not to read the books.) But neither True Blood nor Twilight is really a horror story — both are romances with supernatural elements.

Noël Carroll has written extensively on how horror works; that is, how rational people can be honestly scared by a fiction. Rather than detail his arguments here, I’ll give the bottom line: the monster threatens our cognitive categories (vampires are neither alive nor dead), and it presents some actual, physical threat. Having a whole society of monsters forces recognition of a new category, and with that recognition comes acceptance. And if the monster is accepted into our world, it is no longer cognitively threatening; the troll in “The Three Billy Goats Gruff” is just part of their world, and it’s not frightening for its troll-ness, but for its threat to eat the goats. Lt. Worf from Star Trek: The Next Generation is an alien, and a fairly frightening one at that, but he’s a welcome member of the crew, and his potential monstrosity is dissipated by his having a place in the categories the story has established.

The problem with these subcultures, for me, is that they dissipate the threat of the unique monster. I’m also not enough of a conspiracy theorist to accept the premise of a subculture existing just beyond our own, and that makes it hard for me to enjoy these works. Although I thoroughly enjoyed the Harry Potter series, I often wonder how the two worlds (magic and mundane) can be kept so utterly separate. J. K. Rowling has hinted that there’s a branch of the Ministry of Magic that simply charms Muggles and spins the news with false reports (think MIB using magic, rather than alien technology), but it’s always been a little problem for me. It’s never really in the forefront, the way that True Blood highlights the outing of vampires. Rowling’s wizards exist in a different world; Twilight’s vampires and shape-changers are solely in ours, though normal people are oblivious to their ancient hierarchies and long-standing hostilities. And there are too many supernatural beings for me to suspend my disbelief on that count.

Sunday, July 11, 2010

Why not superheroes?

I read comics. I have for over 25 years — or much longer, if I count reading my dad’s comics. But I’ve been buying my own since ’85 or so... with a hiatus during high school and college (undergrad), as there weren’t any comic stores nearby. A few years ago, I thought I’d like to teach a course on comics, so I really got going then, expanding my own collection (which came here from my parents’ place once we bought a house) and paying attention to criticism, trying to expand my knowledge from personal anecdote to proper scholarship.

It struck me, though, that there seem to be two tiers of scholarship about comics. One tier is really serious: scholars talk about Eisner’s Contract with God, Spiegelman’s Maus, Pekar’s American Splendor, Satrapi’s Persepolis, etc. Heavy, generally independent, “literary” comics. Then there’s a second tier, almost second-rate, where superheroes are discussed — and that’s not terribly often. I’m not entirely sure why superheroes are dismissed so blithely, but I’m going to float some ideas. Let me know what you think in the comments.

It seems to me that superheroes get little good press with scholars. There really don’t seem to be many people writing about them. Two of the most respected creators in comics writing, Alan Moore and Frank Miller, are best known outside of comics fandom for their works responding to superheroes and their mythology.* Scholars who want to talk about superheroes generally discuss the work of these two, or they take a more historical approach and discuss the rise and periodization of comics, focusing on the major figures of comics (Superman, Batman, Spider-Man, the X-Men, etc.). Some will combine the two, for instance, talking about Dennis O'Neil's 1970s Green Lantern/Green Arrow comics and the social issues addressed therein. Any deviation from some of these well-known, well-established, and well-loved figures prompts some sort of defensive or apologist stance from the author, as though there’s some agreement that other comics are unworthy of scholarly attention. Fans often take an apologist stance, too, when they present papers featuring superheroes.

This is the wrong attitude, I say. Sure, it’s the reflex of literary studies to want to discuss the whole story, from beginning to end, talking about character development and the intricacies of plot in some implicit (and sometimes explicit) worship of the genius of the author. And this just doesn’t work with comics. I don’t think anyone’s going to give Stan Lee (Fantastic Four, Spider-Man, the X-Men) an award for his prose. And, let’s face it, it’s tough to talk about character development when a character has no definite end, and origins can be tampered with. There are no conclusions to the stories, either: Superman and Batman have been in continuous publication for over 70 years, and many of Marvel’s big names have been around nearly 50; during that time, they have all appeared in multiple comics, either as headliners or as guest stars. And those comics have had multiple authors during their decades-long runs, and different stories sometimes bring the characters to different places in the same month... it can be a headache keeping track of one popular character. Add to that the current glut of cross-title “event stories,” for which the reader is expected to be familiar with every major character from decades of monthly publications, the characters’ relationships to the reader’s favorite, and the villains and supporting cast of the focus titles, and you can see that traditional scholarship, which is putatively focused on the finite productions of literary masters, simply cannot be applied to reading comics. Literary studies are almost always focused on single works or collections by a single author. Those few that do deal with larger areas (e.g., Arthurian studies) necessarily limit themselves, usually to a certain subset (e.g., Victorian Arthuriana; Sir Gawain; French tales). Comics are much more of a hydra — never-ending tales about immortal characters.

So what can be done? Do we need a new approach to comics scholarship? Is it possible to discuss individual stories in an ongoing series? Has the concept of a shared universe made things too difficult for scholars? Can comics scholars write for themselves? That is, can we expect people to be familiar with the stories, or do we need to recap and carefully discuss the characters and events before we can address any specific critical issues that have drawn our attention? I don’t know. I do think a new way of talking about the big names is not a bad idea. But there really aren’t any good models of which I am aware. And scholars certainly don’t seem to be generating a lot of new material on comics. Or if we are, it’s too disparate (and, perhaps, too desperate) to command a lot of attention and generate a specific direction for comic book studies.

Perhaps what we need is a new focus. We should look at different materials — different comics — not just the aforementioned indies and major titles. The big names (Batman, Superman, Spider-Man, etc.) are stable – they’re not the titles with the really fun stuff going on, and they really haven’t been since their first decade or so. I don’t want to suggest there are no new stories, but let’s face it – changes for these characters are either glacially slow (how long was Spider-Man in high school? In college?) or they’re overturned within a few years (consider the deaths of Superman, Captain America, and Robin – all have been undone). It’s easy to be dismissive of change, because those changes will be written out of continuity, or they’ll be undone through some deus ex machina plot to return the character to a more nostalgic presentation.

On the other hand, there are many second-tier figures, shorter-lived comics, that are worth checking out. In the 1970s, the (extremely) restrictive Comics Code was weakened, and the comics companies allowed a lot of play for their creators.** Some characters were created primarily for legal reasons (e.g., Spider-Woman and She-Hulk were created to stake out and protect franchise trademarks), but some titles have really strange things happening in them.*** They’re not the big-name titles, they’re not the summer blockbusters that get people to read comics in the first place. They’re the second-tier (or third-tier) magazines that the publishers can let slide. They’re the titles that can be weird because they don’t have to be marketable to a broad audience who won’t get the in-jokes, or who won’t appreciate the bizarre things that are happening and can only happen in this medium.

I would challenge scholars to start looking at these figures, these characters, who can play a great deal, who can show development because they are not enshrined as the main draws in the publisher’s universe. Readers know what to expect from Superman, from Batman, from Spider-Man, from the Fantastic Four, and any deviance from those expectations has to be short-lived or the readership will disappear. But characters like Howard the Duck, Swamp Thing, She-Hulk, Moon Knight, and teams like the Defenders have a lot more freedom: since they’re not as popular, they’re not as widespread, and their fans are comics fans, willing to allow their creators more room to explore the possibilities of the medium. They’re not locked into characterizations, and this allows them to be more experimental and, therefore, more exciting to scholars. Now scholars just have to find these treasures and start discussing them.

Just some thoughts — please do let me know if you agree, disagree, or simply want to weigh in! There’s lots more to say, but I need to stop somewhere, and this is it...

* For those who don’t follow comics, Moore’s Watchmen and Miller’s Dark Knight Returns are still very well-regarded, as is Miller’s run on Daredevil. Interestingly, both have used their fame to launch non-superhero projects.

** You can read the original code at this website; Amy Kiste Nyberg also wrote a very readable history of the code (Seal of Approval: The History of the Comics Code [Jackson: University Press of Mississippi, 1998]), well worth checking out for more, including the full text of all three versions of the code.

***Admittedly, this reflects the creative team – how far the authors and artists were willing to go, and how far their editors let them go. But considering the laissez-faire attitude coupled with the huge workloads the editors carried, there was a lot more experimentation than would be expected.

Wednesday, April 28, 2010

Super-size My Language!

I’ve noticed a tendency to be verbose of late, and it’s actually a fairly disturbing trend. Its most recent form, at least the one that’s been on my radar most of late, is the propensity to improperly modify words in an effort to sound more intense. For example, I recently heard “extremely critical” and often hear “more unique” (or “most unique”).* And just today I heard someone on NPR say “very ubiquitous.” Now, when I hear things like this, all I can think of is The Princess Bride: “I do not think it means what you think it means.” (Except, unlike Inigo, I know the word’s being used incorrectly. Or, at least, modified unnecessarily.) I really think a lot of people need to look up what words mean, so they can use them properly.

Another, related linguistic gripe I have is the often unnecessary repetition because people don’t know what acronyms mean: PIN number, please RSVP, ATM machine, etc. It really should stop. Take a moment to expand them: "personal identification number number," "please respond if you please" (ok, so that one takes a little translation, too...). Then again, it can be a source of endless amusement. Think about it — when people ask “Where is the ATM machine?” they’re really looking for a machine that dispenses ATMs.

Now, this is not in student papers — it’s mostly in conversation, either in the media (TV, radio) or in person, usually conversations I overhear, rather than ones I participate in. I have read many papers where the students should have their thesaurus taken away, and where they are verbose for the sake of meeting a page requirement. I know many of the reasons for this padding, even if I try to urge honesty in their papers’ voice — writing much like they talk, or at least with a similar vocabulary. But that’s a topic for another post.

Of course, I am familiar with the argument that this is just a facet of the evolution of our language, and that I should just relax and accept it. But you know what? I see it as a devolution of the ability of people to understand their own language. And when we no longer understand the words we share, when we can no longer agree on meaning, that’s a problem, because the language itself is then an impediment to communication, not a tool to facilitate it. Now, we’re far from the end of the utility of English, and I don’t want to sound like Chicken Little. The sky is not falling, and we have a long way to go before we’re not communicating through a shared language. But if “unique” no longer means “one of a kind,” when the unique item no longer has status as a distinctly singular object, then we need a new word to replace “unique.” So... what is it? I’ve heard nothing. Just that “this” is more unique than “that” — which is, according to my dictionary, impossible.

If you don’t know what these “big words” and acronyms mean, stop using them — to those who do know them, you sound like an idiot. And those who don’t are not worth amazing with such simple (and misguided) linguistic feats.

* I would like to point out that while one cannot logically increase something’s uniqueness, it can be decreased: an object can be “almost unique,” but not “more unique.” I have no problem with that.

Wednesday, April 7, 2010

What Do We Do?

At a recent party, I was asked “Why read Shakespeare? Kids [meaning high schoolers] don’t like it. They aren’t going to try. It doesn’t matter to them.” My response? Foolishly, I tried to play the canon card — I said “Ask the people you meet tomorrow if they know or have read Shakespeare. I bet all of them will know his name, and 95% will have read something he wrote.” “But,” was the reply, “People today don’t enjoy reading his work. They just have to in school.”

Ugh!

Upon later reflection, it seems to me that English departments have done a horrible job of selling themselves. And academia, both college and el-high, has moved on. What are English departments traditionally supposed to do? Study the language. Teach people how to read, understand, and interpret words, both written and spoken — and then respond to the ideas they present. And what do we do? Cultural studies. Film studies. “Fun” stuff, not “rigorous” stuff. Why? Because we need to keep up enrollment. (And, yes, we enjoy teaching it... but then, we also enjoy teaching Shakespeare and other stuff that's not "enjoyable," by my friend's implicit standards.) It seems we've lost sight of our initial mission, and we've not really defined a new one for ourselves.

But we'll never disappear — we’re the workhorses of colleges, walking all freshmen through composition for the benefit of every other program. We’re also the broom closet of the college, as a colleague argues. Our comp classes are the places where students not only learn how to produce research papers, which is fair, but also how to use the library, how to use word processing programs (usually Microsoft Word), basic grammar and punctuation, and often some introduction to public speaking. Oh, and since we have to write about something, add in some content.

There's still the idea that we’re not teaching “marketable skills” when we get to do what we want (i.e., read texts) — that is the province of other programs: science, business, etc.

Is this because we learn to read as children, and the skill we use is considered mastered before college? Not perfected, perhaps, but good enough by 6th grade to get through bestsellers? (Yeah, I’m thinking of Dan Brown, Stephenie Meyer, and J. K. Rowling, however much I enjoyed reading Harry Potter.) Is anything that demands a higher reading ability just not worth the effort? Or is it not profitable? That is, my ability to read Shakespeare (or Chaucer) is really just showing off, and won’t get me a better job because it is, somehow, not valued by society. Or by the sectors that pay well.

Yeah, I’m frustrated. But so it goes. I wouldn’t choose another discipline. I am happy with my choices, with my direction. I do like teaching English, I like teaching literature, I even enjoy teaching composition. I just don’t enjoy being the freak that didn’t pursue a career that promises big paychecks.

I’m too smart for that.

Friday, March 19, 2010

Don’t tell me!

I have noticed a strange trend lately in my students’ papers. I’m not sure if I’m just more sensitive to it, or if it’s a change in the writing that’s coming in, but here it is: students like to use “you.” Now, this doesn’t sound like a serious issue, and it certainly sounds silly when just stated that way. But here’s my issue: composition teachers have long asked (demanded?) that we be shown, not told, the evidence and conclusions. Focus on the ideas and the details that support them. We’ve also tried to kill “I” in essays. When I ask them to use “I,” especially to indicate their experiences, my students often look at me askance and say “Oh, I just can’t say ‘I’ in a formal paper! It’s just not right!” I always reassure them that yes, in the most formal academic discourse, “I” is usually shunned (as is the five-paragraph essay, but that’s another rant), but we’re writing more low-key, personal pieces at the beginning of the term, and their experiences are important — and can be “owned” with the pronoun.

So what does all this mean for “you”? I think it’s a common substitute. That is, students think “I have to show my logic, so I’ll tell my readers what to see.” And this quickly becomes “I’ll force audience agreement by prescribing reactions — what to feel, how to think.” It’s really difficult sometimes, especially when my students’ arguments are limited — or they don’t take into account multiple opinions, different experiences, or simply different conclusions based on the same information. Too often, my students will say something like “You will remember from Romeo and Juliet...” Um, hate to rain on your parade, but I’ve never read that play. (Nor have I seen a whole production, be it on stage or screen... about the closest I’ve come is West Side Story — unless sitting through Twilight counts, too.) Or better yet, “In 15 years, when you get married [I am already married], you will receive a box from your mother with your diaries in it, with a note that says ‘Remember these when you’re expecting.’” [I already have a child — and my students know both facts from casual conversation.] I pointed out to the student who set up this hypothetical situation that it’s at odds with my experience, and as such, it indicates that I’m not the target audience. I asked her to revise, removing “you,” so as to allow other people to read and follow the point she’s making, rather than excluding readers because she expects her audience to be exactly like her.*

So I ask my fellow instructors — have you noticed an encroachment of “you” in papers? What do you do about it? I’ve given over some time in my writing course to explaining why this is bad: it disrupts the relationship between the author, who is supposed to show, not dictate, the ideas, and the reader, who wants to be led, not commanded — who wants to discover ideas, not have to manufacture knowledge to meet the demands of the author. It's only a couple of minutes, but I wonder if I should be more proactive and introduce it before the first paper, rather than after the first draft, in response to my students. What do you think?

*Incidentally, she did not revise this page between drafts, and the demanding scenario remained.

Thursday, March 4, 2010

Knowing Enough

At the beginning of this semester, I covered a humanities class for a colleague who was out on medical leave. One of the icebreakers she assigned was a list of questions, and the purpose was to find people who know the answers. Since the class was fairly small, and the icebreaker is supposed to get students to meet many people, I told the students that I’d be happy to sign for one answer.

The students started circulating, and I answered a few different questions. One student walked up, with my colleague’s list in hand, and asked if I could answer any questions she hadn’t filled yet. Looking at her list, I saw that I could answer any of the questions, but I wanted to be a little cagier, perhaps to make a point about assumptions, as I had in my writing course. So I said “Yes, I can answer some. What question do you think I can answer?”

“I think you know the name of Alexander the Great’s tutor” was her reply.

Noting some conviction in her voice, I said “Yes, I do — why did you think so?”

Her reply? “Because you’re the teacher. You just know these things.”

I was rather humbled by her faith in teachers — and at the same time, I was glad I didn’t let her down. It got me wondering, though, just how much we’re expected to know, especially in the humanities. I chose my discipline, and my period, in part because they are hard — there’s nothing that can’t help illuminate readings, so everything is fair game for academic consideration. But I also wonder when we’re (theoretically) ready to teach. When do we know enough to step in front of a classroom? I don’t think there’s a good answer, or a right answer, in terms of quantity of knowledge. I think the best answer is individual — you know enough when you feel comfortable, when you feel competent, to teach that subject, be it composition, introduction to literature, or something more specific. I look forward to hearing your thoughts!

Sunday, February 7, 2010

Studying Literature: Piling Pebbles to Reach the Heights

As I was talking to some students and pondering my choice to study literature, a powerful metaphor struck me. Studying literature is hard because we pile pebbles to reach the heights of greatness. Think of it this way: science is very goal-oriented, and it has an end. It is like a ladder. (So I’m mixing metaphors and similes — at least I recognize the difference!) You start with basic information and move up the chain of difficulty to a specific goal — the point at which you have to create your own next step by doing research (experiments) that expand the realm of science. And there’s a pretty defined hierarchy of knowledge: biology starts with basic cell structure, chemistry with atoms, etc. Then you move on to more complex systems (organs, molecules), then groups of those systems (circulatory system, molecular interactions), and so forth.

Literature’s not like that. Sure, we have easy literature and hard literature, but there’s no set order, no specific books everyone has to master (or simply read) before moving on. There is no hierarchy of knowledge — instead, in literature, we reach the heights of mastery by building a mountain of piled pebbles. Each work adds something else to our mass of knowledge, but none is inherently more or less important than another. Reading the complete works of Shakespeare is impressive, but reading the complete works of Stephen King is no less impressive — it’s just less influential on Literature. Reading the complete works of Chaucer or Hemingway is, perhaps, more impressive, but none of these milestones will result in some profound wisdom in itself; none will identify the reader as a master.

This is not to equate all reading; as I said earlier, there’s a spectrum of difficulty. Some difficulty is in the language (Middle English in general), some in the style (Henry James); some authors are simple in style, but profound in ideas (Hemingway). These are the authors and works that get trotted out as classics, whose inclusion in the canon is often taken as obvious. Some works are hugely popular, but have little literary merit (Brown’s The Da Vinci Code; Meyer’s Twilight series). But they can be instructive in lacking those qualities — that is, these poor books can help define quality and genius.

I think the final idea I want to toss out is one of skill. We learn how to read very early on; some take to it quickly, others struggle to make sense of texts throughout their lives. And I wonder if this is one of the problems literary studies has. Dr. Seuss’s books are some of our earliest reading experiences, and they’re seen as simple. But they are often poetry, and although they’re short works, presented in fairly simple ways, I think they’re pretty complex books to read, in that the reader needs to construct a narrative from very short, often declarative sentences. Even Shel Silverstein’s poetry is fairly advanced wordplay. Is it E. E. Cummings? No, but what is (other than the poet himself)? Silverstein can certainly be more advanced than Shakespeare or Poe in terms of wordplay and sounds.

In any case, I start to wander. Do think about the idea that literature students have no hierarchy to climb, only a collection of disparate stories from which to build their experiences, and that the process of reading (and, if we’re lucky, re-reading) only enriches our experiences and our ability to form a coherent argument about literature. Unlike in science, the cutting edge is unique to each of us, as we read new materials and add those works to our pile of stones in the quest for the heavens.

Let me know what you think!

Saturday, January 30, 2010

First Impressions

Welcome to my blog. A (very little) bit about me: I'm an adjunct professor at a local community college, while I finish writing my dissertation at the big university. My field of expertise is the Middle Ages, but I have a lot of other interests and ideas beyond that huge area.

I quite often have complaints, and that tendency inspired the name of my blog — a reference to humoral medical theory. Perhaps I'll post more on that later, but until then, that's all you get.

For the next post, I'll have a few words on how studying literature is like a pile of pebbles, unless there's some other, more pressing issue in my life!

Thanks for your interest, and happy reading!

-KR