Sunday, March 31, 2013

This would be a good title, but I said “would.”

I recently heard an announcer identify a player by saying “You know who it is? That would be Slim Summers.”* It really touched a nerve. I know grammar isn’t a popular topic, and most people think we learn enough of it just by using English every day, but this really highlights a problem with common use: we are misusing solid constructions and impoverishing our language in the process.

“Would” is subjunctive, a little-used verbal mood. It is the mood of potential, not actuality. (The subjunctive drives Latin students nuts because of this.) Its most common use is to present something that is contrary to fact, and it’s a great way of indicating possibility. “If you came to class, you would know what’s going on.” “I would help if I could, but I’m swamped.” “He might be a father, but he sure ain’t a dad.”** “If I had a million dollars, I’d buy you a green dress.”*** All of these are good uses of the subjunctive: these are either false or questionable situations, and the grammar reflects that, whether the situation is false (he might be the biological father, but he certainly isn’t an involved parent) or unlikely (my having a million dollars).

The important thing to remember about the subjunctive is that there’s an inherent “but” modifying it. Sometimes it’s stated outright (“but I’m swamped”), sometimes not. It’s always there, though, and that is the real value of the subjunctive.

In saying “That would be Slim Summers,” though, the announcer has cast the identity into doubt. It would be him except what? It’s his twin? It’s his stunt double? It’s someone else entirely?

I heckle the announcer because that’s the most recent (and common) misuse I’ve heard, but I’ve noticed the subjunctive encroaching elsewhere. When I was hired by one employer, the human resources people running the orientation said “The papers would be on the table after the presentation.” They would, but what? But you forgot to make the copies? You forgot to bring them? The table disappeared? Or if what? If we want them? Will they disappear from the table if we don’t want them? Rather than saying “they are on the table” or “they will be on the table” (in keeping with “after the presentation”), the subjunctive establishes a contrary-to-fact statement, and I am left wondering what the reality is.

The subjunctive fell out of common use in English long ago. Why, then, is it creeping back into use? Are we unsure of our authority? Are people misusing this construction to sound more complex, more educated, more intelligent? Because, as with so many things, those who know the truth, those who can read (or listen) carefully, will hear this misappropriation, this incorrect usage, and will think less, not more, of the speaker. Or, rather than communicating well, it will spread confusion, as the careful audience will be distracted by the possibility, as I was at orientation. But ultimately, the writers who specifically choose the subjunctive for its original purpose will be misread and misunderstood, the depth of expression allowed by the subjunctive will be lost, and that is a shame.

*As I wasn’t really paying attention, I forget the player’s actual name. Extra credit to the first person to identify “Slim Summers,” though!

**Crash Test Dummies, “Androgynous”

***But not a real green dress; that’s cruel. (Thanks to Barenaked Ladies for this quote!)

Wednesday, February 13, 2013

An open letter to the Boy Scouts of America

Last week, there was some discussion about reviewing your policy barring homosexuals from joining the Boy Scouts in any capacity, as either a troop member or a leader. Although for me it’s really a non-issue, I recognize there’s a lot of concern, and, having been a Scout myself, having earned my Eagle rank, and having thought a fair amount about homosexuality, I want to weigh in.

As I understand the debate, people are uncomfortable having gay members of the Boy Scouts at any level. There’s concern that gay leaders will be predatory or simply bad role models, leading the boys to become gay themselves; there’s also concern that gay scouts will make events uncomfortable for their peers. Scouts should be a safe place for fun, exploration, and personal growth. Another argument is that the Boy Scouts promote good morals; part of the oath is to be “morally straight.” The problem with these arguments is that they’re based on faulty logic and premises.

First, let’s look at the issue of gay leaders. The fear rests on the assumption that gay men are pedophiles, and that is an unfair generalization. By that same logic, the logic in which one frightening example stands for everyone who shares a characteristic, all priests are pedophiles, and since every bishop, archbishop, cardinal, and pope is also a priest, there’s no getting away from pedophiles in the Catholic Church. Likewise, all Baptists hate homosexuals, just like the Westboro Baptist Church; all veterans have PTSD; and all bankers are greedy, money-oriented crooks. My experience is that most Baptists are welcoming people, most veterans are adjusting just fine, and the bankers I know are genuinely decent people who happen to like helping others manage their money. I also know a number of gay men who are kind, generous guys and no more attracted to children than anyone else. Excluding them from leadership roles because they might be dangerous is simply foolish stereotyping.

The fear that a leader might find some time to be alone with the boys is a reasonable one, but again, it’s not likely to happen if there’s any common sense in the troop. A single adult leading a group of boys is never a good idea. It’s not safe. I can’t think of a single event I ever attended where there weren’t at least two adults, and that includes patrol meetings and a few years in Cub Scouts. Most longer trips (day hikes and overnight camping trips) included three adult leaders. If one adult was injured, or had to leave with a sick or injured scout, others were always around. And the trips we took included scouts of many ages; the older boys often took on leadership and mentoring roles, keeping an eye on the younger boys. So there’s very little opportunity for a leader to be alone with a single boy if the troop emphasizes safety and community.

“But,” you might say, “these men are mentors. They’re role models.” Well, yes. Yes, scouts look up to the leaders. But not all of them; each of the boys in my troop had his favorites, and we admired different things about different leaders. I don’t think anyone has ever changed his sexual orientation because someone he admired was gay. However, many people have come out of the closet because someone else did, and I think that’s great. It’s worth showing boys that they don’t have to be ashamed of themselves, be it over sexual orientation, religious belief, hobbies, or anything else that’s seen as outside the norm. If we want to encourage an open society, a diverse culture where self-expression is a positive thing, then we should have different kinds of people in prominent positions. If there’s a gay assistant scoutmaster, then boys who are themselves gay will not feel ashamed of their sexual orientation. If that man’s not around to lead by example, then the boys who need that positive role model will feel ashamed, and they’ll keep a terrible secret. Further, if boys never meet a gay man, or a Muslim, or any of a number of other minorities, then they’ll never learn that these are just people, too — they’ll never know that these differences aren’t something to fear, just something different. The same can be said of gay scouts: if they’re a foreign other, never encountered, then they’ll be more frightening because of that alien nature. But if the boys see that one of their own is gay, and that boy is also good at capture the flag and fire-building, if he’s a good friend and person, then there will be little to fear. The frightening unknown will become familiar, and no longer threatening.

The trickiest argument to refute is the morality argument, because the Boy Scouts have always been interested in producing good, upstanding, moral citizens. But here’s my question: whose morality are the Boy Scouts promoting? Many troops are sponsored by Christian churches. Does this mean that the Scouts are a group just for Christians, or are Jews, Muslims, and Buddhists welcome? How about atheists? Agnostics? There is an argument that the Scouts, as a private organization, can be exclusive, and I’m OK with setting up requirements for membership. But the requirements when I was a scout were simply age and gender: you had to be an 11-year-old boy to join, and you had to transition to adult leadership at 18. That was it. Excluding anyone because of fear or misunderstanding, because of stereotyping or hate, is not something that the Boy Scouts I knew and loved would ever accept, never mind promote. This is not to advocate abandoning any moral stance, but I do think it’s important to take cues from society, and American society is becoming far more tolerant of many different groups. The Boy Scouts should similarly allow for, if not embrace, that expansion.

There have been many changes since Baden-Powell founded the Scouting movement in 1907 and the Boy Scouts of America followed in 1910. World War I saw Boy Scouts on the front lines, literally scouting during the war. Mid-century troops were uncomfortable with racial and religious integration. I think now the big change will be accepting different sexual orientations. And that development reflects changing social attitudes: as society becomes more accepting, so, too, should its organizations.

So, Boy Scouts of America, please accept change. Gay scouts and leaders are no more threatening than any other group, and their presence can be a positive influence on scouts and, by extension, society. Don’t exclude anyone from your group based on faulty logic, inappropriate beliefs, and antiquated notions, or you, too, will find your influence and importance shrinking into irrelevance, and scouting is far too important a factor for far too many boys to have that positive influence disappear.

Wednesday, August 22, 2012

He’s not a governor anymore...

There is a tendency, perhaps born of courtesy, to refer to Mitt Romney as “Governor Romney.” That’s simply incorrect. He is no longer a governor, and continuing to refer to him by that title is misleading. It’s a practice that should end.

Romney was a one-term governor of Massachusetts. He left at the end of his term in January 2007. His successor is the current governor. Moreover, there are four other living former Massachusetts governors. Is it right to call five people “Governor” for their service to the people of Massachusetts? Further, there are forty-nine other sitting governors, and many of their predecessors are still living. I don’t think it’s a stretch to estimate 200 living former governors in the US. Should they all retain the title? I mention numbers not because I am a fan of statistics, but to illustrate how many people could be called “governor” after their time in office. It’s just not that special, unlike “president.” There have only been 44 presidents in the history of the country. There have been 50 governors each year for the last 53 years.

There’s also the issue of follow-up: after being governor, there’s plenty of opportunity for another job. After being president? Not much can follow that position. It’s a well-known, final job. Looking at recent presidents, Reagan and both Bushes retired; Carter and Clinton have embarked on charitable projects. Members of their administrations have gone on to various pursuits, but the presidents are pretty much done having a role in the government (or even the private sector) when their terms end. Referring to this elite group as “president” is a deserved, certainly earned honorific that is not likely to confuse people.

Governors, though, can move on — and up — to new positions. And given the high volume of people with that title, it’s easy not to know who is a sitting governor and who’s just holding on to the title. Further, when is the title given up? Do we continue to talk about Governor Ventura (Minnesota, 1999–2003), Governor Schwarzenegger (California, 2003–11), or Governor Spitzer (New York, 2007–08, resigned)? How about Mayor Jerry Springer (Cincinnati, 1977–78)?

Does Romney merit the honorary title “Governor” because he’s still in politics? Why is that? Regardless, he’s far from the only one who can claim the title of “governor,” and he’s been out of office long enough to make two runs for president. He’s had a handful of other positions since his time in office, and he’s certainly not focused on Massachusetts politics in that time. What’s the expiration date for the title? I would expect it to be upon leaving office, but I guess I’m wrong. Why do we still call him Gov. Romney? Does it seem right to you?

Friday, March 2, 2012

Rules of Usage: Teaching the Hows and Whys

I’ve been thinking lately that language teachers are enamored of rules, but the rules don’t seem to really help the students I’ve had. And, to a certain extent, the student I was.

On NPR a couple weeks ago, I heard a show about learning a language, and the host suggested that it’s harder to learn a language as we get older. The guest, a neuroscientist, replied that once we learn our first language, others are fairly easy to learn, but that yes, it is harder to learn new languages as we age. I wondered, though, how much of that difficulty is a result of age, and how much is method. That is, the older we are, the more invested we are in the rules of grammar; as infants, we are immersed in our native tongue, and as young learners, we tend to work on fun stuff — easy reading to build confidence, and lots of opportunity for error. However, my experience in high school, where I studied both French and German, and in plenty of language courses in college, was that we were given rules, random sentences in terribly contrived situations, and lists of vocabulary. That is, I was presented with formulae and rules, words and phrases, but no good context, no community to provide models and correction — outside the classroom, of course.

I wonder, then, if that’s the problem. There’s an emphasis on rules and memorization, not usage and trial and error. And it seems to me that’s how we try to teach punctuation, too. There are all sorts of punctuation marks, and they are far too often misused. Or appropriate punctuation (semicolons and dashes come to mind) is not used at all, because people don’t want to misuse it and be wrong.

Teaching punctuation, and reading about it for my composition classes, I keep running into the same thing: here’s a list of rules to learn, with some contrived or out-of-context sentences to illustrate the rule. But that’s not how punctuation evolved, or even what I have come to think is the best way to teach its use.

The rules of language are often written after the fact; just like definitions, they are fundamentally descriptive of usage. They do not tell people what to do from the start, but reflect the use people have already agreed upon. It’s easier to teach that way, perhaps — end a sentence with a period or question mark. Use commas to separate items in a series. But why teach the rules? Why not teach the use?

I would propose a new focus for teaching grammar: make students think about what they say and how they say it. Approach punctuation as a way to control the pacing and tone, to give the words some aural component.

When I work on editing medieval texts, ones that have either no punctuation or a set of punctuation that makes no sense to my modern, English-speaking mind, I think about an oral presentation. How would this text be heard? And I try to replicate my sense of the pacing and tone by inserting punctuation. It is interpretive, but isn’t that what reading any text is, to some degree? See how punctuation makes this line come alive, how it changes the fundamental — though implied — meaning, the connotation, of the text:

To be or not to be, that is the question.
To be... or not to be... that is the question.
To be! Or not to be! That is the question!
To be or not to be? That is the question?
To be (or not to be), that is the question.

There’s no difference in the words — just the punctuation. And each line’s punctuation controls how we read Hamlet’s words. I was thinking about just that: delivery. How the punctuation controls the meaning, or at least guides it. I didn’t think about the rules of punctuation, just the way I wanted the line to sound when you read it. And I chose my punctuation to guide you.

While I’m sorry I cannot test my theory properly, as I am not teaching this spring, I have often described the comma as a slight pause and a period as a full stop in presenting ideas. And last fall I did try to introduce the idea that punctuation breathes life (that is, pacing and tone) into otherwise inert words on a page or screen. Students really started to get it when I talked about the unknown tone of email and texts, but I didn’t follow up nearly well enough at the time. But I have wondered if it would be better to explain punctuation as a means to guide a reader, rather than as a bunch of random rules and guidelines.

As to language learning, I have nothing definitive. I know that I always liked reading the actual authors when I studied foreign languages; friends and relatives who are trying to pick up languages now often label items around the house, which is a form of immersion rather than rules and lists of vocabulary. The Rosetta Stone system, as I recall, uses a similar premise of immersion, and travelers always pick up more fluency, more comfort, by immersing themselves in the language. I again have no firm conclusions about learning a language, but I do think teaching the rules of grammar and usage can only go so far as a solid pedagogical tool for making students more comfortable, accurate writers.

Friday, February 3, 2012

Returns

Dear readers,

It’s been nearly a full year since my last post. Lots has happened — most notably, I’ve finished writing and have successfully defended my dissertation. I have had a ton of tech problems along the way, including losing my desktop for almost two months, along with all the blog ideas and notes I had stored on it.

But new year, new computer, new ideas. I don’t plan to set any records, but I do plan to post a bit more. I have some comments on punctuation, some ideas about education, and some observations on horror I’d like to present here. And I’ll be happy to take suggestions, as well, if any of you want to ask for my thoughts.

Thanks for reading! I expect to have a new post up soon.

Thursday, February 17, 2011

Continuity and Change: Comments on Genre Films

A friend of mine recently posted on Facebook that she was excited about an upcoming movie, only to find out that it was by the director of one of the Twilight films. The ensuing discussion pointed out that some directors are uneven (specifically Joel Schumacher, who is still disliked for his Batman films, but praised for The Lost Boys), and that Hollywood cannot (or, perhaps, does not) do anything original. While these are good points, I think there are other, equally valid directions to consider.

First, there’s simple economics. When Hollywood does try something original, especially in a genre film, it's often ignored or panned. Think of it this way: if Twilight is an original spin on vampires and audiences complain, why should filmmakers try for originality with the next vampire film? Why not return to the tried and true to get butts in seats? [Aside: I have never read Twilight, but I don't like what I've heard. However, that's another post altogether.] Conversely, there are folks who are devotees of vampires, and they stayed away from Twilight for a host of reasons, many because they feel vampires should be frightening, not romantic.

From another angle, there's genre theory. That is, people expect certain stock situations and characters in their genre entertainment, and they enjoy seeing the modifications from that form. When the modifications work, it's a hit; when they fail, it's seen as "just another genre film" (and probably not a good one, since it’s gone too far from expectations). Think about a long-running series: is James Bond original? Friday the 13th? Star Trek? Not really. But damn, I'd say the latest Star Trek was a successful film not because it was original, but because it played with audience expectations: of the series, of the characters, and of the genre.

Now, will "Red Riding Hood" be good? Can't say. Haven't seen a trailer for it. But I don't think it's fair to cry for originality and then decry any film for not being true to genre expectations.

Thursday, January 27, 2011

Philosophy of Teaching

A position has opened up at the college where I work, and I've applied for it. Part of the application is a philosophy of teaching (or teaching statement). I've written 4 or 5 over the last decade, as I've had to submit a philosophy every time I apply for a job, but I've lost every one I have written. Here's my latest — hopefully I can hold on to this one! I'd love to hear feedback from you, my readers; I'm certain to revise this as I either apply for other positions or as I move through the ranks where I am.

Teaching Statement

I teach because I enjoy expanding students’ minds by introducing them to new ways of thinking. I also see teaching English as a means to improve quality of life. Those students who can read not only the content but also the implications of a piece of literature, those who can express their ideas clearly and effectively, are more capable of using those skills in a variety of ways. English is the best discipline for teaching the skills of communication and critical thinking, though it’s a double-edged sword: people who think critically are more likely to be frustrated by the status quo, but they are also more likely to try to improve it. Teaching for an English program, especially composition, allows me to focus on helping students identify problems, read critically, and then offer solutions. It is this opportunity to inspire students with my passion for literature and the humanities in general that I value most in teaching.

As a discipline, English offers me a challenge that I greatly enjoy. I like the variety of students that come through composition, often the only course required of all students. I studied the Middle Ages because the field combined my interests in a variety of areas, especially literature, history, and philosophy; I gravitated to literature because it seemed the most flexible in terms of making arguments. English critics read a text and find some point to make, connecting the text to modern readers’ lives. Composition adds another dimension of complexity, as the students are not unified in any quantifiable way; composition classes are often chosen based on time and location, rather than instructor or content. This allows for incredibly diverse groups, making the teaching experience more challenging and rewarding.

Despite that challenge of diversity, I find a number of ways to reach my students. Responding to some poor experiences of my own, I understand the importance of structure and preparedness. I offer a carefully crafted syllabus in the first class, and the course information sheet outlines my policies. The semester proceeds according to the syllabus, to allow the students a chance to anticipate and prepare for class. I, too, am prepared for each class, having read the assignments and done whatever homework I assign; that is, I do everything I ask of my students. One of my greatest learning experiences as a teacher happened in my first year of teaching. I did not prepare the reading I had assigned, instead relying on prior knowledge of the tale. My students asked me insightful questions for which I was utterly unprepared, and I left the classroom full of shame. I resolved never to let that happen again.

My primary goal in a class is to be a coordinator. I generally open with a brief lecture, asking my students to outline the main points of the day’s topic or reading, and then filling out their list with whatever other pertinent information I wish to include. I then ask my students to respond, opening the floor for discussion, though I always have a few discussion questions of my own, should the students not be sure where to take the discussion. I rely on my students’ input for direction, to allow them to get what they need from the class. In such a way, I find my students to be more engaged and interested in my courses, since they feel they have some real sense of ownership. The students have a clear voice in the classroom, and in connecting the content with their own lives and interests, they are more likely to retain the deeper lessons regarding expression that are, in my opinion, at the core of an English degree.

One concrete method I’ve used throughout my teaching career is to prepare written assignments with a scenario that grounds the assignment in real life. In a course on comic books, I asked my students to defend the use of a comic book in their discipline; one student wrote a great paper on using the Flash, whose main power is super speed, to discuss issues in a physics class. I also try to engage my students’ interests, especially in composition, by challenging them to connect their majors to the class, especially as material for their research papers. After reading Mary Shelley’s Frankenstein, my students have written papers on absentee parents and on social pressures to fit in; one enterprising political science student compared Yugoslavia to the creature, both beings made of disparate parts and bound for a tragic end. I like to make literature live by challenging my students not to interpret its meaning in some general, grandiose way, but in a way that holds meaning for them.

As a student, I always did best when I knew my professors; this is often not feasible in a large lecture-style science class, but it was often fostered in smaller, discussion-based literature and philosophy courses. Accordingly, I make an effort to learn all of my students’ names as soon as possible, and I am very generous with my time. Some of my most exciting moments have come when students speak to me long after the course has ended and let me know how much they enjoyed my class or how much they learned in my classroom. I don’t expect my students to retain everything I offer them in a semester, but I do hope that the content of my classes will inspire them to approach life and learning in a new way. I firmly believe studying English is about processes, not facts. I don’t think anyone has been enriched by knowing that Geoffrey Chaucer died around the year 1400. However, I do know many people who have been surprised and amused by his tales, such as the group of senior citizens who agreed with the statement that women desire sovereignty, as Chaucer’s Wife of Bath discusses in her tale. Teaching English classes, be they centered on some specific theme or the often-dreaded composition, allows me the opportunity to challenge my students not only to think more broadly, but also to interrogate their own assumptions about what matters and how to present their ideas. That is what I try to teach in my classroom: processes to better read and relay information.