Yesterday and today, the American president, Donald Trump, suggested that weapons-trained teachers should carry concealed firearms to keep the peace in schools. His proposal comes after yet another school shooting in the United States, one of a regular series of mass murders that is, or should be, a shameful embarrassment for politicians and the gun lobby in that country.
Today, Trump insisted in a tweet that “ATTACKS WOULD END” if teachers were armed, but, as I implied in a recent essay in Film-Philosophy, this is a case of insisting on a hypothesis when other countries have a proven alternative that dramatically reduces gun violence. It’s simple: far fewer available guns, especially automatic weapons of war sold to the general public. Fewer convenient weapons of carnage, fewer mass murders. Amanda Holpuch’s story in The Guardian today includes various other reasons why Trump’s plan is far-fetched, including the unlikelihood that a teacher would shoot accurately under pressure (not to mention with a pistol against a machine gun or a rifle easily modified to shoot automatically). Partly because ridding the country of most of its publicly available automatic weapons is inconceivable to its president and to many others, we hear “solutions” such as arming teachers, but what does this solution imply about Trump’s vision of education?

Although I think it presentist and ageist to disbelieve an idea simply because it is old or shared by someone old, in this case Trump’s vision should be dismissed partly because it is out of date. In my previous post, I quoted Marshall McLuhan, who wrote about the classroom as “an obsolete detention home, a feudal dungeon.” How true, when you consider Trump’s proposal, which is in effect to reinforce the idea that schools are jails presided over by armed guards. I mean, teachers. When I suggested—again, in my previous post—that we should envision the classroom as if it were the International Space Station, I tried to aim high, to the stars. Trump is aiming low, dungeon-level low.

I would add that his model of education, with its hard-to-crack security, appears to be the one that Paulo Freire described as the banking model, in which teachers simply “transfer” knowledge to students, like a bank transfer. Trump himself said today, “I want my schools protected just like I want my banks protected.” In this model, teachers have all the power, including knowledge, and they dispense it for clients who have paid their tuition, so that a diploma or degree is a commodity rather than a qualification. This model is capitalist in one of the worst senses of capitalism, the so-called neo-liberal capitalism in which even intangible "things" are monetized.

Contrary to this model, many contemporary teachers and professors believe that students need to be more in control of their educations and learn better when they are posed problems that they have to try to solve on their own, and with guidance as necessary. This alternative model puts significant authority in the hands of the students. Relatedly, some Indigenous models concentrate on shared dialogue and storytelling, and lessons are narrativized hints that have to be interpreted. Again, the students have more power over their educations, and they are therefore more likely to take responsibility for what, how, and when they learn.

To arm a teacher is to enforce the teacher’s power and authority, but it is also to suggest that the manner of teaching should be authoritarian, not merely authoritative. This notion is a serious problem when it comes from a supposedly democratic government. Democracy emerges partly out of education, which is, in some traditions, the opportunity to learn citizenship—not to be merely indoctrinated into patriotism, but to choose reasonably from varieties of government one that would represent you. 
Trump’s suggestion demonstrates to me that his vision of democracy is corrupted by authoritarianism. I feel a responsibility towards my students, but I don't want so much authority over their lives that I am responsible for their lives, too. They need to learn to care for themselves and for others as much as I do, and if looking out for each other was more a part of American culture, perhaps there would be less hatred. Canadians, with our democracy, need to remember this lesson too. The classroom should be more like democracy and less like tyranny. Works Cited
How to cite this blog in MLA format: Deshaye, Joel. “The Classroom as Prison Cell with Armed Guards.” Publicly Interested, 22 Feb. 2018, www.publiclyinterested.weebly.com.
Not many people think about Marshall McLuhan much now, but I do. He faded from being the best-known professor in North America to something like an obscure prog band still loved by only a few, such as Can—and if you’ve never heard Can's “Vitamin C,” you really should, not only for Damo Suzuki’s poetically critical lyrics but also for the jazzy-mathy Jaki Liebezeit on drums. That’s how I think of McLuhan. Like Can today, he’s “out there.”
Phrases like “out there” are metaphors, and McLuhan loved them. He was an English prof, after all. I’ve been thinking about how his use of metaphors for media might teach us about social media and their metaphors today, e.g., their flaming, their tweets, and their trolls. McLuhan seemed to be calling for new metaphors of media when he was in his heyday in the 1960s. In Understanding Media (1966) and The Medium Is the Massage (1967), McLuhan argues that the media are “extensions” of the person’s capacities, such as hearing and sight. A tweet extends our shrillest sounds. A troll lurks and angrily surprises you. Yes, you can tell that I am not a fan of social media! But a tweet can be manipulated to reject its own metaphoricity and produce better content, as Jeet Heer has done by popularizing "the Twitter essay," which also shows that the essay remains an essential form of knowledge creation and reflective communication. Metaphor is inherently reflective, because it always prompts us to wonder, well, how IS a tweet like birdsong? McLuhan implies that all media should be understood through metaphors, using a metaphorical statement to claim that a medium is either a “message” or a “massage.” (A metaphor simply says, A=B, or “this” is “that,” always a statement of shared identity.) Whenever we engage in knowledge translation, e.g., by saying that the International Space Station is about the size of a football field, we are using analogy or metaphor. Scientists do it all the time to help people relate to difficult numbers, concepts, and processes. So, McLuhan tried to find metaphors applicable to education. Fifty years ago in a book called McLuhan: Hot & Cool (1967), he suggested that education move out of the classroom: “The METROPOLIS today is a classroom; the ads are its teachers. The classroom is an obsolete detention home, a feudal dungeon.... We must invent a NEW METAPHOR” (116, his emphasis). In the same year, in the film This Is Marshall McLuhan, he said, “In the nineteenth century, the knowledge inside the school room was higher than the knowledge outside. Today it is reversed. The child knows that in going to school he is in a sense interrupting his education.” If only the classroom could be like the International Space Station! Much like prison for young people in North America, the classroom is punitive: a “dungeon” meant for “detention.” As a result of this belief, McLuhan thought that teachers would do better not to teach content (and, yes, the Internet can supply it just as well, in some cases) and to teach method instead—not what to think, but how. It’s an appealing idea, and we certainly do have major problems with education as a system and what it is teaching. Today, CBC News reported that someone filling in for a professor at the University of Guelph allegedly publicly embarrassed or traumatized a student and his aide for their behaviours in their large class of 600 students. The story itself isn’t perfectly germane to this entry on my blog, but a comment from a reader is. In the comments section, someone identified as Walter Wilkins alludes to Marshall McLuhan by remarking, “The student/teacher ratio is one of the explicit features of what’s being taught and learned; the medium isn’t only the message, it’s a problem.” He doesn’t elaborate, but the “explicit feature” that he seems to suggest is that students, when there are so many of them, are just a number, and so professors might treat students insensitively or inhumanely. 
In McLuhan’s terms, as a medium, a large class may centre a lot of attention on the professor’s power, and often the large classroom or lecture theatre is designed like an amphitheatre, focusing concentrically on the speaker at the front and centre of the room. Having taught a course in a lecture theatre, I know the feeling of power, but I also know that it can feel like you have been thrown to the tigers for the amusement of a crowd that has power in numbers. During the Maple Spring in Québec, in 2012, my lecture theatre at McGill University was occupied by a group of protesters from various universities, demonstrating how easy it is to disrupt a classroom. Arguably, the biggest disruption to the classroom today is the Internet in all its forms, but especially social media. In looking for metaphors of the Internet, I was led to Star Trek’s George Takei, who made this analogy: “Social media is like ancient Egypt: writing things on walls and worshiping cats.” The joke about the cats (which is funny 'cause it’s true, to quote The Simpsons) cues an ironic reading of the rest of the quotation: Social media as a singular entity is not all that old, it’s not all that civilized, it’s as much like graffiti as other forms of writing, and, yes, it’s where we idolize beautiful animals, including humans, or just show off all the gross shit they’re involved in. Takei was smart. In barely more than a dozen words, he offered a little lesson that expands even as it entertains. McLuhan would have liked it. In contrast, the tendencies of social media to elicit instant responses and to limit the length of responses (at least in the case of the tweet) are inherently anti-intellectual. Drawing from yet another source from the 1960s, Daniel Rigney learns from Richard Hofstadter’s Anti-Intellectualism in American Life (1963) the most dangerous kind of anti-intellectualism: unreflective instrumentalism, “the dominant ideology of advanced industrial societies and doubly dangerous because its technocratic assumptions are virtually invisible to the unreflective eye. The efficient pursuit of unexamined ends is now arguably the dominant form of anti-intellectualism” (447). Point. Click. Like. It’s quick and responsive, but we need more than that. We need the classroom of our minds to be “out there” a little farther, closer to the critical distance of the International Space Station. As a classroom the size of a football field, it's big, but there aren't a lot of people up there, so they aren't only numbers. And they have lots of time to think. Works Cited
How to cite this blog in MLA format: Deshaye, Joel. “The Classroom as International Space Station.” Publicly Interested, 17 Jan. 2018, www.publiclyinterested.weebly.com.

Dear Premier Dwight Ball and Minister Eddie Joyce,
Partly because of China's plan to stop buying recyclables from countries such as Canada by the end of 2017, there is a new and urgent need to stop wasting so much plastic and start banning single-use plastic bags. I am writing to you today to support Municipalities NL, which is calling for a ban on plastic bags because mayors around the province do not believe that a ban is possible without your help. Plastic bags, especially for groceries and other shopping, are harming us, other species, and our environments. If we can't recycle them, we have to ban them. We have proven that we can't recycle them effectively; 91% of plastic is never recycled. In some places, such as most of Canada, we try to recycle bags by collecting them and often by shipping them to China, but we leave a carbon footprint from the transportation and the energy needed to remake the plastics. This is one of the reasons why we haven't started a recycling program for plastic bags in Newfoundland and Labrador. And so we have to ban them.

Even when we try to divert the bags to a nearby landfill, we fail miserably. Images have been circulating of the "Plastic Bag Forest" near the scenic East Coast Trail and Robin Hood Bay—the trees acting as a filter to catch airborne plastic bags. Whales have been found dead with many plastic bags in their stomachs—in one case, 30 bags. For many species, like up to 90% of sea birds and presumably including people, ingesting plastic has become inevitable; this summer, a new plastic-ridden ocean zone as big as Mexico was discovered in the Pacific. You read that correctly: as big as Mexico. There are several other massive zones of floating plastic in the world's oceans. There is no explanation other than that humans are laying waste to land and sea.

We behave so abhorrently for a lot of reasons, but I refuse to believe that it's simple ignorance or a lack of conscience; I think we do it because it's traditional to a capitalist society to accept the idea of surplus value and thus, maybe illogically, of waste; and, more important, it's convenient. If you do propose a ban, many people will object for this reason alone. When the ban came into effect in California, I saw a man on the news who said that no one had the right to make his shopping more difficult. If we can't convince him to change his behaviour as a consumer, we need to change the behaviour of suppliers, such as grocery stores. We can all learn that it's easy to carry reusable bags and use them for most of our shopping. Meanwhile, I am so tired of our inaction on plastic. (Bagged! In a previous open letter, I wrote to major airlines to find out why they don't recycle plastic cups on flights into Toronto, Canada's busiest airport.)

Yet we have alternatives. I fold up a small recycled-plastic bag and put it in my knapsack for those times when I'm not planning on going to the grocery store but do anyway. We can leave fabric bags in our vehicles and bring them into stores with us. Now, an Australian initiative called Boomerang Bags has come to St. John's (and all over Australia, the United States, and elsewhere), which leaves recycled cotton bags to be borrowed and returned at many different stores, such as Food for Thought downtown. I would love us to be leaders rather than followers of Australia and innovative cities like Montreal, but there is no shame in gaining confidence from someone else's good idea. 
With the Green Party starting to find support in the Maritimes, and with several newly elected progressives on the City Council of St. John's, Newfoundland and Labrador might someday soon have more politicians who are listening to the many citizens who believe that we are failing future generations—not only people but other animals: whales, sea birds, polar bears, sea turtles, and even a lobster caught last month in New Brunswick with a Pepsi logo nearly fused into its claw. Don't we care? We need to act. Please write a new law that will ban plastic bags here too. Many will gratefully support you. Sincerely, Joel Deshaye PS. While we're at it, we should create local industries for recycling what we can't ban, such as glass—an easily reusable and recyclable material. Why can't we do that here? Works Cited
How to cite this blog in MLA format: Deshaye, Joel. “Bagged: It's a Big Job, but Someone Needs to Ban Plastic Bags.” Publicly Interested, 3 Dec. 2017, www.publiclyinterested.weebly.com. On Tuesday (coincidentally the title of one of my favourite songs by the Men Without Hats on their gorgeous synth-pop album Pop Goes the World), I was invited to The Ship pub to give a talk for the Department of Philosophy’s Public Lecture Series. The 40-minute talk was on the philosophical concept of moral luck as seen in Clint Eastwood’s Dirty Harry films. This entry of the blog is a highly compressed version of the talk, which was in itself a shortened version of an essay published earlier this year in the open-access, peer-reviewed journal Film-Philosophy.
At its simplest, moral luck is a factor when we judge people “responsible for events that are not entirely within their control” (Gregory par. 25). The philosopher Bernard Williams coined the term “moral luck” at Cambridge University in the very late 1970s, and he thought it would seem like an oxymoron or a contradiction at the time (251), but he and others have since shown that—yes—luck does matter morally. One of the classic examples is of two drivers: one passes a stop sign without stopping and nothing happens, and another also passes a stop sign without stopping—but hits and kills a pedestrian walking through the crosswalk (Nagel 25). Although the collision is bad luck, we want to judge the driver who killed the pedestrian, not the luck, because luck has no moral agency by itself, right? But luck seems to have made the difference. We also want to judge the driver who killed the pedestrian as worse than the driver who didn’t.

Williams and Thomas Nagel were writing about moral luck around the same time, but neither mentions Dirty Harry, even though the first Dirty Harry film in 1971 happens to use one of the examples that they would use later in the 1970s and early 1980s. The main character, Harry Callahan, explains that his wife died when struck by a drunk driver. He rationalizes her death with these words: “There was no reason for it, really.” From this, I assume that the death of his wife helped to create the Callahan we know by adapting him to the unpredictability of others. He is highly tolerant of luck. I think that he believes that the morality of the luck depends on others, which is why he is often casually willing to allow other men to decide whether to escalate violence.

I’m going to try to explain moral luck through a few movies in the series that many of you will have seen: the five Dirty Harry movies, starring Clint Eastwood. Because most of us probably haven’t seen Dirty Harry in a while, I’m going to remind everyone about how it goes as we start thinking about moral luck and how we know who we are, a question that invites a trio of big words: epistemology, which is the study of how we know something; ontology, the study of being, of who and what we are; and existentialism, which is a belief in being defined by our free will and responsibility. I won’t dwell on these concepts; my purpose right now is to show that the first Dirty Harry film is unexpectedly ambiguous and full of subtle hints about philosophical concepts of who we are, and how we know what we are. This ambiguity can be interpreted not as a hidden ideological message but as respect for the intelligence of the viewer. Maybe at other times I’d be less generous, but I think that Dirty Harry has, in a sense, a respect for our own free will that is both conservative and liberal: respect, as in classical liberalism, for our ability to think and interpret for ourselves. Unlike so much of today’s media, the Dirty Harry films seem like they’re in dialogue with a variety of political views.

In Dirty Harry, Callahan wants to bring criminals to justice without interference from what he perceives as an overly liberal police department and government. He seems conservative, today, but the film itself, with his name on it, seems liberal in preferring the attitudes and actions of African American criminals over those of white criminals. I’ll return to this deliberate contrast at the end of this entry, but first let me describe the most iconic scene. At the start of the film, Eastwood’s character defeats a series of bank robbers of African descent. 
The first man to shoot at Callahan (mildly hurting his leg) and to be shot by Callahan is about to retrieve his shotgun when he is targeted again, at close range. Eastwood then delivers the famous lines that I mentioned earlier: “I know what you’re thinking. Did he fire six shots or only five? Well, to tell you the truth in all this excitement I kinda lost track myself. But being this is a .44 Magnum, the most powerful handgun in the world and would blow your head clean off, you’ve got to ask yourself one question: Do I feel lucky? Well, do you, punk?” Satisfied that all is under control, Callahan begins to walk away, but the robber calls after him, “Hey! I got to know.” Notice that both the robber and Callahan say “I know” or “I got to know,” signalling epistemology or how we know. Callahan returns, aims at the man, and pulls the trigger—but the gun has no more bullets. By seeming to involve chance in the moral work of stopping a criminal, Callahan invokes moral luck, and the question of who is responsible. Whether Callahan is bluffing by saying that he “lost track” of the shots he fired is a related question. My published essay does not mention the fact that Callahan repeats this speech with a different outcome at the end of the film, but I’ll return to this repetition at the end here too. At this moment in the story, his potential bluffing can be interpreted as surprisingly epistemological and ontological, about knowing and being. It’s involved in Callahan’s moral ambiguity. The scene of the robbery offers a rather dizzying array of potential meanings, and it requires some close attention before we hear more about how Eastwood involves luck in representations of heroism. It would appear that Callahan is ready to murder the subdued man because of an implied question: “I got to know” how bad you really are, or if your gun is still loaded. But his desire “to know” has more to it than that. To know is not to be deceived. It’s shorthand for knowing the truth, and so the final pulling of the trigger is even surprisingly existential. The robber might be asking his question to know a truth about himself, in addition to the more obvious possibility of wanting to dare the policeman to kill him. If he’s asking about himself, it’s about whether he is a good and virtuous man despite the robbery. If Callahan is bluffing and knows that the gun is empty, his pulling of the trigger is, first, a sign of his merciless sense of humour. Second, it’s a judgment. Yes, we can imagine Callahan thinking, you backed down, so you’re good enough not to die right now. Callahan also implies that, unlike the robber, he knows himself to be good and would not fire a loaded gun at a defenceless man. He refers to “the truth” here in a moment that is wryly confessional, especially when he says, “I kinda lost track myself,” but he could be lying about having lost track. Callahan’s attitude and his personality do suggest that he is bluffing: he exudes self-control, or at least confidence—but then there are so many times in the Dirty Harry films when his behaviour is so reckless that he could not possibly know in advance the results of all his actions. When he asks the robber if he feels lucky, for example, Callahan knows at least that he has already won, even if he does not know the extent of the damage that he might cause in winning. What if Callahan is not bluffing? 
If Callahan really did lose track of the number of shots he fired, then he's playing a version of Russian roulette that does not risk the life of the person holding the gun. Notably, he does this only when asked, “I got to know.” Impulsive and irresponsible, he projects some of the responsibility for his action onto the robber, as if the robber’s guilt or innocence could be decided not by the robber’s violence, because that was already settled, but by his taunting or his curiosity, “I got to know.” The modified Russian roulette in Dirty Harry implies that the action of killing is the responsibility not of the policeman but of the other man, or of luck. This theme suggests that in the Western and cop movies the hero acts according to the morality or immorality of others, and that his own character is not intrinsically moral or immoral, because he applies his ethics to a limit and then he refuses to assume further responsibility. In other words, he might be saying, hey, I’m not really responsible for killing a man who dared me to pull the trigger. The potential for self-deception implied here might call to mind Jean-Paul Sartre’s concept of bad faith, which refers to self-deception or inauthenticity. If you want more on that, please read the published version of this entry. In brief, as I see it, the moral character of people with bad faith is related to their existential dilemmas of agency—and Callahan is hardly in an existential dilemma. You don’t look at him and think, here is a man who is searching his soul, wondering how to act. If he is deceiving anyone, it is another person, not himself—except that he might be mistaken about the number of bullets in his gun. Much like the gun, moral luck—to me—is more political than some of the debaters admit. Their points seem to assume, without ever saying it, that moral luck aligns with a liberal or leftist view: that criminals, like anyone, are often the result of circumstances beyond their control that may be described as bad luck. So, when the robber in Dirty Harry seems to want to know if he’s virtuous in spite of being in a robbery, the question is political: liberals tend to see past a person’s crime toward the conditions that led to it, such as poverty, whereas conservatives tend to focus on the deed itself, and then judge accordingly. These generalizations are up for debate, of course, and we could also debate whether Callahan wants to be involved in them—but I think he does. My view here is that moral luck is not all that liberal as a concept, because it enables Dirty Harry to coerce the bad guys into a mimicry of free will and responsibility, and this coercion is not a liberal style of rehabilitating criminals. Brian Rosebury at the University of Central Lancashire, who comes out of literary studies into philosophy as I do, is more worried than I am that moral luck seems to align with a liberal view. Rosebury’s concern is that “we do not choose our acts either, just because we do not choose what causes them” (508); similarly, maybe we can’t judge anyone, ever, because everyone is created “by biological luck and developed by cultural luck” (292). If this alleged moral relativism is truly a problem, and if it is politically liberal in orientation, as in Rosebury’s allusions to social constructedness, why would a figure as conservative as Callahan invite luck to determine his moral judgment or morality? 
One answer comes indirectly from Professor Claudia Card of the University of Wisconsin, who joins the debate in 1995 with her book The Unnatural Lottery: Character and Moral Luck. Her book openly acknowledges the political relevance of moral luck. Rather than put a Sartrean emphasis on free will, Card puts will into the context of political, social, and economic limitations—such as repressive sexual laws, sexism and racism, and poverty—that people must work against to be responsible. Card focuses on one of Nagel’s four related kinds of luck, one I haven’t mentioned yet, which has been called “circumstantial luck.” Her first sentence, in fact, is that “[m]uch of the luck with which this book is occupied attaches to politically disadvantageous starting points or early positionings in life” (Card ix). Partly because Card is not a relativist, Rosebury’s review of her book is positive. Card explains her own not-relativist-but-liberal position when she says she does not want “to let us off the hook morally by showing that fate determines who we become. I am no fatalist [says Card]. [She says,] I find luck influential but not ordinarily determining. It narrows and expands our possibilities, often through the agency of others over whom we have no control and often through the medium of social institutions” (x). For Card, and seemingly for Rosebury, luck can be accepted as an influence but not as the determiner of someone’s morality.

So, through Card I might answer my own question. Perhaps Callahan invites luck to determine his morality to suggest (perhaps especially to liberals) that mitigating circumstances are not as important as they might seem and can still be strictly controlled: whether or not there is a bullet left in his gun, he has demonstrated how effective a strong and punishing response to crime can be. That’s usually a conservative view.

Now, as this entry approaches a conclusion, I want to shift our attention to the men defeated by Dirty Harry, men I’m going to call, ironically, the lucky punks. There’s a pattern in how Eastwood’s characters from the late 1960s through the end of the Dirty Harry series speak with the lucky punks. They are almost always African American men who evoke American racial politics from the era of civil rights to Reaganomics. Let me remind you that, in Coogan’s Bluff, Eastwood’s character is ready to stab a man and, when he’s asked if he would have done it, he says, “I don’t know. That was up to him,” which is the prototype of Callahan’s “Do I feel lucky?” It is also the origin of a third statement, when Callahan says, “Go ahead. Make my day,” in Sudden Impact, from 1983, the fourth of the five Dirty Harry movies. In all three scenes, Callahan’s foe is a black man; each one commits a crime, but each one backs down, luckily for him and for Callahan. If you're not convinced by my argument about the first shootout, above, think about the pattern of these three scenes. I would go so far as to say that they're a stereotypical and wishful commentary on American race relations during the time of the black power movement. This movement was meant to address civil and socio-economic inequalities, such as systematic or systemic racism and its impoverishing effect on Americans of African descent. Coogan and Callahan project responsibility onto what they might assume is blind luck (a synonym for chance that, like the free market, is not supposed to be prejudiced), whereas the pattern of skin colour suggests that it is definitely not blind. 
I’m fascinated to see that Callahan is represented as poor or at least cheap throughout his first story—cheap pants, hot dogs for lunch and supper—and maybe his lack of money gives him sympathy for the black men who rob the bank. Still, Eastwood’s characters seem to be telling black men (and I’m aghast at the message), “Quit stealing—and be responsible to yourselves and to us.” Upholding the generally anti-governmental position of these films, Callahan and Coogan would probably not be willing to supply the coin to pay the cost of fairer government and justice.

Here I have to admit that Callahan uses his “Do I feel lucky” speech twice in the film, once with a black man who backs down and once with a white man who chooses to try to get his gun, and Callahan shoots him. The black criminal is a bank robber, and he is spared. The white criminal is a serial killer, kidnapper, rapist, and extortionist, a much worse criminal, and he is killed at the climax of the film because Callahan does have one more bullet in his gun the second time. We realize then that Callahan’s “Do I feel lucky” speech is a script, possibly one he has used more than once before. If he has used it more than once before, then he probably was bluffing and was in control when he stopped the bank robbers. Maybe it wasn’t moral luck, and in fact Claudia Card argues that “[t]aking responsibility [...] is likely to involve consciously developing an integrity that does not develop spontaneously” (24). I wonder, then, if making others responsible is usually going to be scripted and not “spontaneous.” Ultimately, however, I can only interpret what the film offers me, and there are only two “Do I feel lucky” speeches. The real script is the screenplay that the writers gave to Clint Eastwood—the actor, not the character—and these writers probably realized that there’s an aesthetic balance in having only two “Do I feel lucky” speeches, as well as dramatic irony, because the serial killer doesn’t know that Callahan is basically comparing him with the bank robber. I doubt that Callahan’s just repeating the same script in every showdown, going throughout the city, asking, “Do I feel lucky? Do I? Do you? How about you? Scale of one to ten…”

More important, the political commentary seems to be, on the one hand, that whiteness is associated with the worst crimes (quite a left-leaning admission in the North American context, these days); and, on the other hand, that the white criminal is not subject to luck and cannot be forced to take responsibility, but the black criminal is and can. For Eastwood’s characters, black men must be pressured to conform to expectations of non-violence and obedience. But, unlike the white criminal in the first movie, at least the black criminals have respect for their own lives and are willing to stop violence—and I want to take this detail as the film’s respect for African Americans, even though I can’t entirely. While the filmmakers represent black men with consistent symbolism related to luck throughout the Dirty Harry movies, not all of these men are stereotyped as criminals, and I have one final example to show that moral luck is connected especially to black men in these films. There are many white criminals in these films, and many of them are also stereotyped as symptoms of liberalism, as with the murderer and his girlfriend in Coogan’s Bluff. 
Sudden Impact plays on our expectations of seeing threatening black men in Dirty Harry movies, but then it introduces Horace, played by Albert Popwell, as an ally to Callahan. It’s interesting to see the sequels respond, or seem to respond, to political critiques of the earlier films, because this kind of listening suggests a style of conservatism that is still open to thoughtful debate. In Magnum Force, the second of the five, which came out in 1973, Callahan’s partner Early Smith, played by Felton Perry, is also a black man. One of their conversations suggests that Smith is aware that Callahan takes risks with people of his colour. In a scene where the two policemen are following suspects by car and beyond their jurisdiction, Callahan says that he wants to confirm a hunch and decides to antagonize the suspects—but Smith doesn’t want to be caught in the middle of a gunfight. He says, no, “I don’t want to be winning bets for anybody.” His reference to “bets” implies that he would agree with my argument that his partner uses others while depending on luck to seek justice. Callahan would disagree; he persists and says, “I’ve never been wrong yet, have I?” But later, after Callahan warns him to take care of himself, the corrupt policemen assassinate Smith because of his partnership with Callahan. Magnum Force suggests that Callahan’s hunches are never wrong, contrary to my argument about his partial uncertainty in the first Dirty Harry film, but he is unquestionably sometimes wrong: Smith dies because Callahan takes risks and cannot take responsibility for everyone; other people, including good people who are on his side, are forced to take responsibility for his actions. Because he cannot save everyone, he is not God, and if he is not God, his claims to certainty must sometimes be in error. He might be on a lucky streak, at least as far as his own survival goes. Although I’ve entertained other points of view, I’ve argued that Callahan was being honest about his partial uncertainty in the first “Do I feel lucky” showdown—though he’s probably confident that he’d win regardless. This discrepancy is ethical and political. He and his prototype Coogan have a common mission, not only to get their men regardless of the law but also to offer a final choice to the enemy, who may be punished if he continues to be violent. Their ethical shortcoming is that their respect for the African American men who confront them is limited to these men’s potential to be coerced into responsibility. In another way, however, the black robber is the most interesting character in the first Dirty Harry film, because he is the one whose unpredictability—and his potential to make a decision—is the truest unpredictability, and the truest potential. The robber is the one character who might want to know himself better. Callahan might want to know others but seems entirely confident in who he is, perhaps too confident. In contrast, we expect that the serial killer is going to try to kill again. He’s predictable. Because of this expectation, moral luck is more a factor in Callahan’s and the robber’s decisions. They are the interesting characters, and moral luck can be a plot device that creates suspense through the unpredictability of these characters. Works Cited
Here in sunny San Diego for the PCA-ACA 2017 conference, I reserved an evening to go see Lolita Chakrabarti’s play Red Velvet (2012; here and now directed by Stafford Arima) at the Old Globe Theatre in Balboa Park, and I discovered a fascinating study of dramatic irony as a parallel of the insidiousness of racialization. What I mean is that the play is about how race fools us.
Dramatic irony occurs when the audience knows something that a character doesn’t know. In this case, the character is the historical figure Ira Aldridge, an African American actor. He was the first black man to play Othello on the London stage. In Red Velvet, he’s trying to promote the movement toward naturalistic acting but is himself the over-actor incarnate. Albert Jones plays Aldridge contrary to the fictional and perhaps historical Aldridge’s preference for “domestic” or naturalistic styles of acting. Ben Brantley in The New York Times reports that Adrian Lester played the role similarly in 2014, so that the actor “exudes the scary, outsize presence of the barnstorming stardom of another time.” Aldridge’s controversial performance in London in 1833 coincided with the final major legal milestone in ending slavery in Britain and its colonies. Until then, white actors played black characters in blackface—and so, in an ironic twist (like that of Patrick Stewart playing Othello in an otherwise all-black cast), Aldridge plays King Lear in whiteface at the conclusion of the play, speaking these colour-sensitive lines from Act IV of King Lear:

. . . They flattered me like a dog; and told me I had white hairs in my beard ere the black ones were there. To say “ay” and “no” to every thing that I said! . . . Go to, they are not men o’ their words: they told me I was every thing; ’tis a lie, I am not ague-proof.

An ague is an illness, especially a fever, and so Lear is calling attention to various possibilities, including that he is confused—by the lies, by the “madness” (III,iv) that he worries is upon him. A mad character is probably always an instance of dramatic irony, at least in those moments when the character is not aware of the madness. In Red Velvet, I think the madness is the idea of race itself—but I’ll come back to that. Aldridge is also calling attention to the weaknesses of his body in lines that Chakrabarti seems to be repurposing. When Lear compares “white” and “black” hairs, he means age and how it is symbolized—here, that white hair is a symbol of wisdom, I think. When Chakrabarti’s Aldridge’s Lear says these lines, however, he signifies that race, like traits such as wisdom (which Lear did not consistently have), is not essential to anyone. Race is partly a bodily performance, especially as Red Velvet dramatizes Aldridge, and partly an attribution that can be manipulated for reasons good and bad. (Coincidentally, the San Diego Museum of Man, just steps away from the Old Globe in Balboa Park, is presently curating an exhibit called “Race: Are We So Different?”)

The crisis of Red Velvet is that Aldridge’s critics, the writers who review his play in the newspapers, echo stereotypes of black men as (often sexually) aggressive and thus a threat to white virginity and whiteness-as-property, as in the theme of inheritance suggested by the play’s ailing white father and his son. (For more on the latter, see Cheryl I. Harris’s “Whiteness as Property” essay from the Harvard Law Review.) Aldridge has already seemed to prove his critics right in advance by rehearsing and performing the strangulation of Desdemona too “realistically,” which means according to the commonly held racial stereotype and the reality presumed by the critics. 
He then attempts to strangle his French manager, an ally and friend, when the Frenchman finally concedes to public pressure to remove Aldridge from the role. Unlike most of his colleagues, Aldridge is presented as an over-actor whether on stage or behind the scenes in the dressing room, and in the program Jason Sherwood, the set designer, comments that the superimposition of Aldridge’s private life (backstage) and public life (centre stage) is crucial to his character as imagined by Stafford Arima. Indeed, Aldridge is almost entirely “public”: projecting at the top of his voice, preoccupied with gesture, vying always for position and attention. One implication of Jones’s performance is that one’s persona invades one’s private life, a commonplace that informs much of my work on celebrity. As I’ve recently written in the context of racialization in The Journal of Commonwealth Literature, it is also that one’s public face can turn an “about-face” on the self, allowing social norms to define a person.

So, when the stage’s rotating proscenium (yes, a prop that expensive) sends us back to the present near the end of Aldridge’s life, the play ends with his Lear’s exhortation against the “lie” of the public’s and the court’s (and his family’s) support for him, juxtaposed against the flashback to his manager’s withdrawal of support following the racist reviews of his Othello. The play thereby emphasizes the struggles in the historical Aldridge’s remarkably successful career, set against the backdrop of Britain’s very mixed, ambivalent movement toward abolition from the late 1700s to 1833 when, finally—about a generation after it had outlawed the slave trade—Britain abolished slavery itself. If a viewer wonders why Aldridge is presented with something less than total sympathy, it’s because the play appears to be made to dramatize the insidious effect of socialization on one’s private life. We know something that the fictional Aldridge does not know: that he is unwittingly the exaggerated product of the racism of his critics, while he believes he is being authentic. Works Cited
How to cite this blog in MLA format: Deshaye, Joel. "The Dramatic Irony of Race and Red Velvet." Publicly Interested, 16 Apr. 2017, www.publiclyinterested.weebly.com.

In the news again today, Senators are arguing about a controversial bill to change the national anthem, but the politicians and others who say that there is a grammatical problem with the proposed revision are wrong.
I don’t have any objections to the proposed revision. It would be different if the government were trying to revise one of the objectionable poems by Irving Layton. The anthem is official and meant to be sung together to encourage citizens to feel that they are part of an imagined community, so inclusive lyrics are a good idea. The proposed revision, in fact, is still too martial and religious for my taste, but that's a topic for a different post. This post is about grammar and controversy, that old pair. Here is the start of the anthem—objectionable to someone in every line, I know, but culminating in the one under debate today:

O Canada! Our home and native land!
True patriot love in all thy sons command.

I can explain the first two lines, but that’s probably not what you want right now.* You want the debate about the third one: “True patriot love in all thy sons command.” To be gender-neutral, Bill C-210 proposes this revision: “True patriot love in all of us command.” According to Senator Michael MacDonald in the CBC News story linked above and below, various people, supposedly including English and linguistics professors, have agreed with him that the revision is grammatically incorrect. It’s not. It’s a complete sentence, albeit in archaic syntax: an example of the re-ordering of words that is sometimes necessary to position rhymes at the end of lines, as with “land” and “command.” Adjusting the syntax but keeping all the same words, the line is still a grammatically correct sentence of the imperative type: “Command true patriot love in all of us.” The song begins by addressing Canada as if the country were a person (in a technique called apostrophe: "O Canada!"), and that person carries over to the third line as someone who could command someone else. It’s correct (if not politically) in the official lyrics, too: “Command true patriot love in all thy sons.”

When Senator Michael MacDonald says, “The proper and only acceptable pronoun substitution for the phrase ‘All thy sons command’ is ‘All of our command,’” he is neglecting another “proper” reading: that Canada is the commander. His interpretation is fine, more or less, and we can discern it by adjusting the syntax again: “All thy sons command true patriot love.” In this case, we can interpret the line to mean that our boys are growing into authority by telling others, or inspiring others, to care for them nationalistically. Another interpretation is that we are in control of our own love. Also in this case, however, the preposition “in” mysteriously disappears. In my view, MacDonald has to explain the use of that preposition (or why it can just vanish) before he claims that his reading is the “only” one.

* The song begins with what’s called an apostrophe—not the punctuation mark, but an address to someone or something. It’s part of a tradition of addressing sublime things, sometimes including the nation, especially if the nation is ruled by a king and the King is close to God, who (you might say) commands sublimity. It’s sometimes called the “apostrophic O,” as in Percy Bysshe Shelley’s “Ode to the West Wind." Here at home, the addressee is Canada, and the first line of the song is an incomplete sentence only if we neglect other types of sentences. The second line is the same: technically a fragment (no subject or verb) but excusable because of its exhortative, exclamatory role. If someone bumps into you at the grocery store and he’s the one to yell “Hey!”, you can’t say it’s not a grammatically correct response. 
How to cite this blog in MLA format: Deshaye, Joel. “The Grammar of the National Anthem in Canada." Publicly Interested, 4 April 2017, www.publiclyinterested.weebly.com. “Let’s have some decorum,” President Richard Pryor says in a White House press conference just before he jumps into the crowd to attack a journalist for asking a racist rhetorical question about his mother. In this 1977 sketch from the short-lived Richard Pryor Show, Pryor could well have been commenting on recent news about the relationship between the president and the media in the time of Donald Trump.* In the sketch, Pryor imagines himself as the 40th president of the United States—a position that went in fact to Trump’s touchstone, Ronald Reagan, whose so-called Reaganomics started a trend in exacerbating the American racial-economic inequalities that Pryor cited so often in his comedy routines.
When President Pryor channels generic political spin and defends the neutron bomb as “a neo-pacifist weapon,” I still hear Trump, though Trump would never use the Grecian prefix. Trump is less audible (almost an impossibility today) when Pryor’s critique of race emerges. Responding to a question about funding for the space program, Pryor says, “I feel it’s time that black people went to space. White people have been going to space for years, and spacing out on us as you might say. And I feel with the projects that we have in mind we’re going to send explorer ships to other galaxies, and no longer will they have the same type of music, Beethoven, Brahms, Tchaikovsky. Now they’ll have the Miles Davis, Charlie Parker....” If only Pryor were still alive to comment on Trump’s superficial (and faint) praise for the long-dead nineteenth-century abolitionist Frederick Douglass: “Frederick Douglass is an example of somebody who’s done an amazing job and is getting recognized more and more, I notice.”

I bring up Richard Pryor and Donald Trump because we watched the 1985 version of Brewster’s Millions last weekend, or perhaps the previous weekend (a blur in busy times), continuing a series of viewings focused on Reaganomic movies, and I’m compelled by the resonance that this movie has with the current politics of the United States. Can anyone be elected in the United States—regardless of sex, gender, race, class, age? Americans are not alone among people willing to elect the seemingly unsuitable and unqualified, but the election of Trump is nonetheless remarkable. Brewster’s Millions asks a more specific question: Would Americans elect a black millionaire who is otherwise unqualified for public office?

Brewster’s Millions is only one of many adaptations of a turn-of-the-twentieth-century novel by George Barr McCutcheon, which became a play and a series of films before Walter Hill adapted it and found Pryor and John Candy to play the leads. At least in this version, the story involves a black small-time baseball player, Montgomery Brewster (Pryor), whose elderly white relative (surprise!) dies and bequeaths him $300 million—but only if he can spend $30 million in 30 days without accumulating assets, giving more than 5% to charity, or destroying things that are “inherently valuable” such as works of art. (This unlikely plot recalls Steve Martin’s 1979 movie, The Jerk, in which Martin plays a white man who thought he was black, realizes he’s white, becomes a millionaire who squanders his money, and then re-integrates with his adoptive—but now rich—black family.) The lesson is supposed to teach Brewster to hate spending and become frugal. It’s ironic, of course: the premise that conspicuous consumption might lead away from excess to moderation.

Brewster sets to work hiring people—valuing the labour of typically under-paid people, with the exception of a few ritzy interior designers, lawyers, and money managers—but also has two inspired ideas about how to spend money without gaining anything material. First, he buys the most expensive collector’s stamp in the world, then uses it as postage on a postcard. Second, and this one is special, he decides that the best way to waste other people’s money is to run for office. His campaign for mayor is really a campaign against the establishment, so his slogan is “None of the Above,” a far cry from “Make America Great Again.” But in other ways his campaign is a lot like Trump’s was. Spin off your stardom from tabloids / Reality TV to municipal / federal politics. 
Buy votes shamelessly. Be the third way (as ironic as it is to say that in Canada where the third way is to the left). Have little respect for office and be honest about it, or seem to. Announcing his candidacy for mayor, Brewster says, “What I’m saying is, only an idiot would vote for me!” His follow-up, what he calls “the bottom line,” is that “I’m here to buy your votes.” Later, at a big rally full of supporters, he declares that he is there “to see to it that neither of my opponents, nor me, win the election! I want to ask the question: Who’s buying the booze? ... And who’s trying to buy your vote? And who are you going to vote for?” The rallying cry is “None of the Above!” But the crowd really means him, and he later drops out of the race to prevent his actual winning. So Americans would elect a black millionaire! At least as mayor. And if he's funny enough.

A few of my friends now have said they plan to weather the Trumpnado by sitting back to be entertained while waiting for him to lose an election. But that's exactly what Trump wants us to do. He hates being criticized, but he loves to entertain. Pryor’s critique of star politicians and their fans is that the masses don’t really care about the message as long as they are entertained, e.g., with “the booze.” It’s a classic—and class-based, Marxist—view of the public, one to which I will return in a moment when I ask whether the film itself undermines Pryor’s critique. First, consider that, because Brewster is entertaining, his public ignores his message of not supporting the establishment. Not supporting the establishment was perhaps the key premise of Trump’s campaign against the much better qualified Hillary Clinton. Pryor's satire here reveals that rich people like Trump are the establishment, just as much as political lineages such as the Clintons, the Bushes, and the Kennedys are. The people who vote for Brewster or Trump are the “idiot[s],” Pryor claims. Could any idiot be elected in the United States? I reserve judgment on whether Trump would qualify; my point is that any millionaire could be elected. Brewster’s Millions shows us a world in which Americans vote for the money, possibly without realizing how it is the driving force of the corrupted politics that they want to oppose.

At the conclusion of Brewster’s Millions, Brewster does claim to be sick of spending money, but he does everything he can to get the $300 million—raising my question about the coherence of this movie’s satire. It ends with Brewster a millionaire without rules on how to spend his millions. In that sense, it promotes unregulated capitalism of the type that Trump supports. It does not promote the legitimacy of black men and women as entrepreneurs or in politics. Further, this unregulated capitalism does have one apparent rule: that white men govern the black men’s money. The 1985 version of Brewster’s Millions can be seen as a pedantic, racially condescending film, because the white man has to train the black man in how to handle money. Worse, the film shows only the training, a frantic montage of conspicuous consumption akin to later hip hop videos, before the bling became satirical too. Brewster’s claim to be sick of spending money is such a passing gesture, such an ambiguity. Is the mereness of the gesture a sign that Brewster has not learned the white man’s lesson, perhaps deliberately? If he had truly learned the lesson, would he have been so desperate in the final minutes to get the $300 million? It’s a double bind. 
Either he plays by the rules of a white capitalist economy, or he remains an unemployed baseball player who has humiliated himself as entertainment before the masses. But maybe this is what Pryor intended: to show, not only in the film but in its structural relationship with the economy of the culture industry, that black men in the United States are still not taken seriously, even when they are making the most serious of jokes.

* I’ve decided not to call it “the era of Donald Trump,” preferring to allude instead to the title of a Gabriel García Márquez novel.

How to cite this blog in MLA format: Deshaye, Joel. “Presidents Pryor, Trump, and None of the Above." Publicly Interested, 19 Feb. 2017, www.publiclyinterested.weebly.com.

This morning, after yesterday’s American presidential election of the businessman Donald Trump, I went looking for perspective. I wanted to help myself understand more fully why many Americans voted for him. I found a somewhat unexpected explanation through the mathematician and philosopher David Schweickart. In the title of an essay, he claims that “Yes Virginia, There Is an Alternative” to the global capitalism represented by rich elites such as Trump. Coincidentally, my very first post on this blog was an open letter to Justin Trudeau, one that alluded to the child’s letter to Santa Claus that received the famous response from the Republican outlet the New York Sun, “Yes, Virginia… [there is a Santa Claus].” I don't believe in Santa Claus, and I don't believe in Trump, and I don't like Schweickart's newly minted socialism, which—the day after the election—feels just too close to one of Trump's very few ideas, even though it's not. And so today’s post returns to the rosy nostalgia of the Sun’s letter in the context of Trump’s blatant mischaracterization of Hillary Clinton as the rich elite and himself as the outsider to the system.
Trump himself said, and I paraphrase, that America needs not a politician but a businessman—as if there had never been a politician who was a businessman first. Many voters echoed this rationale for electing Trump: that government is corrupt and that the United States needs a leader who “isn’t owned by anybody,” and someone who will fire his underlings and thereby increase accountability. But this idealized “boss not politician” identity reveals a disheartening confusion of economy and government: the mistaken idea that capitalism is somehow more democratic than elected government. (This confusion is partly what led to the popularization of the term "neoliberalism" to describe ubiquitous capitalism, i.e., capitalism that is now inseparable from democratic governments, following I think from Margaret Thatcher’s claim that capitalism has no alternative.) Even if it were true that capitalism allows any new competitor into the market and hence provides renewal of its leadership, it would not be true that capitalism is accountable to anyone. (Exceptions are few and far between, especially among transnational capitalists. I don't have a problem with most small businesses, though they be capitalist.) If you disagree with the beliefs and actions of the chief executive officer of the biggest business in the country, you cannot vote that person out. If you think that businesses are somehow better at managing their finances than governments are with theirs, look at the huge number of businesses, including some of Trump’s, that have bankrupted themselves, with negative repercussions on investment and employment.

Americans are not entirely irrational to appreciate corporations and mistrust a government that is associated with police brutality; illegal, immoral, and costly wars; and surveillance, torture, and murder. The president is ultimately responsible for these problems, but the police, the military, and the spy agencies are not exactly “government.” I’d like us to remember the term “civil servant” when we think of government. The connotation of civility shouldn’t be forgotten, and servitude, though not a word that describes most workers in government, can at least connote a devotion to a cause. If we, anywhere, are serious about upholding democracy, good government has to be a cause, and we need to consider whether the fat cats are in government as much as in big business.

Few of us today are devoted to our corporate employers, because corporations demonstrate little fidelity to employees and often benefit from precarious (yes, sometimes unpaid) employment. Schweickart addresses this comparison in his essay, remarking that among the top 25 incomes in the United States in 2009 was that of a hedge fund manager: $900 million. To tax his income so that it would be equal to that of the president of the country, his tax rate would have to be between 99.95% and 99.99% (Schweickart 174), depending on whether we equalize before or after the president pays his taxes (a rough version of the arithmetic follows below). (It’s always his. The United States just missed its first opportunity to elect a woman and to realize, at least for another moment, equality of opportunity.) But Schweickart’s essay is weirdly neoliberal in that it accepts, completely, that capitalism should be a part of government. Or that democracy should be a part of capitalism, which is probably the more accurate way of describing Schweickart's suggestions. 
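As a rough check of Schweickart's figure, and assuming a presidential salary of $400,000 per year (my assumption here, not a number taken from his essay), the required rate lands near the bottom of the range he gives:

\[
1 - \frac{400{,}000}{900{,}000{,}000} \approx 0.99956, \quad \text{i.e., a tax rate of about } 99.96\%.
\]

Equalizing against the president's after-tax income instead of his salary would push the required rate slightly higher still.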
In his aforementioned essay and his book After Capitalism, Schweickart conceptualizes a form of corporate government called “economic democracy,” which he calls “our new socialism” (183). The innovation, Schweickart claims, would be to replace labour and capital markets (183) with capitalism by the people and for the people (i.e., profit sharing or “worker self-management of firms”) and “social control of investment” (184). As a result, his economic democracy “is also far more compatible with ecological sanity than is capitalism… Capitalist firms tend to maximize total profits. Democratic firms tend to maximize profit-per-worker” (187) and therefore would not expand unsustainably. I like most aspects of these ideas, but not the conflation of government and economy implied in “economic democracy,” and anyway these ideas will not be realized at a transformative scale without the regulatory insistence of government, notwithstanding the successes of the Mondragon Corporation, a cooperative. I used to work for both Canadian Tire and the Royal Bank of Canada, which engaged in limited profit sharing, but they were hardly democratic institutions willing to change according to the results of a vote. Trump would never do it. When political allies vote for, or work for, a politician who wants less government and more leadership by corporate fiat, they are forgetting that democratic government serves and protects them with a much higher priority than corporations do. This ignorance or selective memory has various historical dimensions that can best be explained through Trump’s slogan, “Make America Great Again.” This imperative is an order, in fact, one that both reifies his authority and delegates accountability—a big problem with corporate governance. It suggests that now, the end of the Obama administration, is a time when America is not great. Greatness is the past—perhaps the so-called Golden Age of capitalism in the two or three decades after the Second World War. (Trump might well prefer a revolutionary era.) Trump’s echo of Ronald Reagan’s slogan (“Let’s Make America Great Again”) suggests that he can remember only as far back as the late 1970s and into the 1980s, around when a potentially sustainable capitalism (Schweickart 177-178; Featherstone and Miles 126) veered off the cliffs of insanity. Trump’s remarkably short memory is a sign that we live in a time that Mark Featherstone and Malcolm Miles describe as “a permanent present” (125) on the pretense (not theirs) that no alternative to capitalism means no change and thus no future. It is also evidence of Trump’s nostalgic desire, as Svetlana Boym might describe it, “to obliterate history and turn it into a private or collective mythology” (xv). Voters buy Trump’s economic rationale because it encourages them to romanticize the past rather than believe, as Hillary Clinton asserted, that America’s best days are ahead of it (maybe four years ahead). And, in this case, it’s easy to forget. It requires no work at all. The New York Sun advised Virginia not to think so much about questionable characters like Santa Claus, and its message—though seemingly winsome—is far too close to the anti-intellectual message of Trump and his most manipulative and manipulated followers. The editors in 1897 encouraged young Virginia, eight years old, to concentrate on “faith, poetry, love, romance” rather than wonder about the truth and even begin, in her innocent way, to do some research. How sad that she put her faith in the Sun!
How ironic that Trump pointed fingers so often at the liberal bias of the media when this historical example is so apt a counterexample. How hilarious to imagine Trump expressing a thought or feeling even remotely poetic. We in (North) America cannot trust “the” government when “the” means Trump and his corporate agenda, one premised, at least in the popular imagination, on the end of the separation of government and economy. And I am simply heartbroken that so many Americans could trust someone so unwilling to allow his deals to be scrutinized for their legality. And someone so evidently racist, in his plans to ban Muslims and build a wall against Mexico.* And sexist, in his admitted sexual harassment and his repeated misogynistic slurs against one of the most accomplished diplomats in the world.
* See the It’s All Narrative blog for a convincing explanation of the relationship between economics and racism in Trump’s electoral victory.
How to cite this blog in MLA format: Deshaye, Joel. “Trump’s Appalling Economic Democracy.” Publicly Interested, 9 November 2016, www.publiclyinterested.weebly.com.
Recently, Jordan Peterson, a professor at the University of Toronto, helped to cause a minor scandal when he refused to use gender-neutral or accommodating pronouns with students who self-identify as other than “he” or “she.” The university remonstrated with him—and then Rex Murphy came to his defence a week ago in The National Post. Yesterday, the professor got a major news outlet, The Toronto Sun, to publish his own essay. That Peterson is gaining publicity for a right-wing perspective should be obvious from the stated dislike of Marxism in his essay and his nigh inexplicable claim that people who want to change pronoun usage have “an intense resentment of anyone who has become successful for any reason whatsoever.” As a more-or-less leftist liberal with only a little nostalgia for the bygone conservatism of the Red Tories, I want to use my own admittedly (and helpfully) jumbled politics, and my position as a professor of English, to ask a simple question: how can we set aside the us-and-them politics of this debate?
Before I go too far, I want to say that if a student ever came to me and said, “I prefer the pronoun ‘per’” or any other pronoun, I would use it, or, if I couldn’t remember it among all the options, I’d use the person’s name. Having some control over the words people use to define you is meaningful to your sense of identity and belonging. Here is one of my favourite poets, the insistently or at least consistently lower-case bill bissett, offering a similar opinion:
. . . . . . . . . . can b myself he she thinks thn thats the feer that th punishment will cum fr sure if he she cant leev her call her him n start packing
Here bissett is also radically objecting to the authority of standard English, while offering the he/she option that many people today would change to “they.” Who would have thought that bissett’s writing would ever be old-fashioned in the eyes of other radicals? But rather than do any research right now to answer this question, I also want to say that I note as “incorrect” the grammar of most students who use “they” when referring to singular nouns and names. When a student’s writing is already excellent, I try not to count “they” as a technical error. Most students, however, are not using “they” for political reasons. Rather, they don’t know which parts of the sentence benefit from agreement with each other. They need a lot of reminders about how parts of sentences fit together to generate and express coherent, consistent thoughts. Asking for agreement in writing is usually not as political as many students and critics think. It’s obviously political in the case of Peterson, however, with various parties attempting to convince or cow each other. In my opinion, confrontational assertiveness is no help, and a third way out of the double bind is needed. I can respect someone’s stated preference for a set of pronouns, but, given that the word “they” is plural in standard English, I’d also like people to respect my preference. It’s a part of my sense of identity and belonging as someone who loves language and has fostered that love against various stigmas that persistently degrade art and the humanities. Rather than err with “they,” I’d rather see writers use neologisms such as “per,” “pers,” and “perself,” which Marge Piercy coined in her 1976 novel Woman on the Edge of Time. (I like these ones because they remind us of the English word “person,” so they’re not only affirmative but also easy to remember and say.) To butt heads on “they” as plural or singular is to perform a script produced by a binary opposition whose politics is equally binary and thus potentially antagonistic. (“Politics is” can be correct when “politics” is used as a synonym for other singular nouns such as, in this case, “ideology.”) The third way is the neologism, which should be less contestable, in theory at least, though not in Murphy’s or Peterson’s case. Murphy’s conservatism reacts partly against the perception of these pronouns as “a set of freshly made up words,” or, in other words, what he calls “neologisms.” Notably, according to the Oxford English Dictionary, the word “neologism” itself dates to 1772, which is closer to “new” than “old” in the history of the English language. If Murphy reflected on this relativity, he would soon realize that the English language is constantly changing to reflect new realities, partly by gaining new words. I would remind Mr. Murphy of George Orwell’s coinage of “doublethink,” which I suspect Murphy himself has been glad to have in his verbal toolbox.
I love Murphy’s subjunctive and his vocabulary of “imprimatur” and, perhaps ironically, “obscurantists”—but, Mr. Murphy, to use “midwife” as a verb would surely have bothered some English professor somewhere. Maybe even me. Yes, I have been—am—a prescriptivist much of the time. In trying to improve a student’s writing, we’re trying to improve the student’s thinking. Many of us need to improve our thinking by learning how to think beyond binaries, or black and white. This lesson comes partly out of the debate over pronouns, and many of the advocates of gender-neutral pronouns identify as “non-binary.” But, still, knowing how words agree with each other is genuinely useful: it helps writers to be aware of how sentences work and how their readers might experience their sentences. There’s nothing wrong with this purpose. So I was stung when I first saw how the website Motivated Grammar attacks professors like me for prescriptivism. I’m amazed at how someone could write against prescriptivism and sound like such a bully! Check it out:
The only problem with this view [of grammatical rules as helpful] is that all you’ve managed to learn about English is how to get your brain to release some satisfying endorphins every time you blindly regurgitate some authority figure’s unjustified assertion. You’re not helping; you’re just getting someone to pretend to agree with you long enough to shut you up. Or worse, you’re scaring people into submission to a point where they feel compelled to preface their speech with apologies for any unknown violence their words are committing against the presumed propriety of the language. (par. 4)
Notably, Peterson believes that his university and his provincial government are trying to do just that: “[scare] people into submission.” He worries that the government will dramatically expand hate speech laws to punish people who misuse pronouns, which, I agree, would be scary. I know that a pronoun can be used hatefully, but there are all kinds of other words that are much worse; “hate” is a very serious word. What if you could be punished if someone overheard you misidentifying a genderfluid person who identified as “she” when you knew her, and who later flowed into “he”? Gender is too complex to regulate with such imagined laws, and one would hope that the tone of the discourse surrounding it could be less brutish. Laws can be too rigid, and other forms of power can be more flexible. I like the power of contextualization, of putting things in perspective. Motivated Grammar states that many well-respected writers throughout history have used the singular “they.” If great writers break the rules, why can’t we all? Using a claim to authority (the great writers) to deconstruct a claim to authority (grammar) is fine, but it can be interpreted as just another power play, one power against another. Recently, I heard Alan Doyle of Great Big Sea hosting his program on CBC Radio, and he said of a song he had just played, “I love it—loves it!” He corrected himself into the error of subject-verb agreement that is grammatically incorrect but culturally appreciated in Newfoundland. This example of self-policing demonstrates to me that the “grammar police” and the related discipline are not only functions of a dominant language or culture. (Read D. A. Miller’s The Novel and the Police or Michel Foucault’s Discipline and Punish for more on police and self-policing.) Dialects and subcultures have their own gatekeepers, often cultural figures such as Doyle or Murphy.
I like their respective styles of writing, but let me give my own example of a great writer. Not too long ago, I was reading Tim Ingold’s wonderful book Being Alive, specifically its chapter on landscape and weather. The blurb from Stuart McLean (not the Stuart McLean of The Vinyl Café) on the back cover claims that Ingold’s prose “is exactingly lucid and charged with poetic eloquence.” Indeed, he is a writer who can use the subjunctive perfectly: “Are pebbles, then ‘objects on the earth’? [James] Gibson would say so, and so would we, were each of us to stop to pick one up and, having examined it, to replace it where it lay” (131). But I found this sentence: “For formerly blind persons whose sight has been restored by a surgical operation, and doubtless for the newborn opening their eyes for the first time, the delirium [of seeing the world appear to be formed in the moment] can be overwhelming” (128). Here, a writer many would call “great” switches from the plural “persons” to the singular “newborn” for no apparent reason, thereafter linking “newborn” with “their” when “newborns” would agree better. Why not write “newborns”? (It’s so easy to fix these minor errors, so why not?) Did Ingold intend to refer back past “newborn” to “blind persons”? Not likely. (That’s a sentence fragment, of course, and I’ve started some sentences with conjunctions, too.) But what harm is done by agreement? And why doesn’t this usage cast doubt on the writer? The short answer is that we trust Ingold’s writing because of who he is (however questionable such authority might be) and, more important for my argument, because most of his writing really is above reproach. Readers in the academy, however, are trained (perhaps a distortion of our education) to be critical of everything, including each other. One of the recent peer reviews of one of my essays returned the feedback that my writing is too “conversational”; I had used a single contraction in 6,500 words. (The essay has since been published.) My former supervisor, in contrast, reacted to my attempt to minimize metaphor (read my book if you wonder why) by telling me my writing had become almost unbearably “stark.” Professors tend to approach everyone’s writing with a critical eye. Students, especially, are usually in the early phases of establishing credibility as thinkers and writers. If my professors over the years hadn’t noted the myriad ways in which my essays were difficult to understand, I might have improved simply by reading a lot more, but I might have needed twenty years instead of—I won’t say how many. In the end, I wish Peterson would relent and eschew his overly conservative ways, but I also wish that the more ardent prescriptivists and political correctors would calm down a little so that we can talk about writing and gender without polarizing our debates.
How to cite this blog in MLA format: Deshaye, Joel. “The Confessions of a Sisyphean Prescriptivist and bill bissett Fan.” Publicly Interested, 4 November 2016, www.publiclyinterested.weebly.com.
We often talk about how privacy is “shrinking.” Consider these pieces in The New York Times (on tiny office spaces), The Harvard Business Review (on shareable data such as body metrics), and Slate (on the secrets of corporate “people”) as examples. We use this metaphor of space, one that can shrink or grow, to conceptualize privacy, but we rarely talk about “growing” it.
How do you grow privacy? “How do you grow a prairie town?” Robert Kroetsch once asked in a poem. His simplest answer was that “the gopher was the model,” because it could pop up and just as soon vanish. And if privacy is necessarily spatial, like a town, then, yes, I suppose it can come and go quite easily—or you come and go, and it stays wherever it is, sometimes where you might not find it again. If you’re one of the many teenagers who finally get their own room, you might lose it as soon as your parents have another baby. How do you shrink a private space? Easy: grow more people. And because space is finite and we can’t “grow” the space, not exactly (perhaps with the exception of a few built islands), you need to arrange for fewer people or for people who can’t claim it—thus war, colonialism, slavery, and real-estate bubbles or unaffordable housing. To oversimplify. But is privacy necessarily spatial? Two recent essays in The Walrus have been prompting me to think about this. One, by my friend Naben Ruthnum, is about thrillers and detective fiction and how these genres “reassure us that secrets are still possible,” even in the age of social media “when we can discover the unedited, intimate contents of millions of lives online” (70). The other, by Jonathan Kay, claims: “While pop culture continues to push the narrative that privacy is disappearing, the reality is very much the opposite: privacy protection has become a huge element of both engineering design and corporate branding in the technology industry” (26). According to Kay, our privacy is much better protected than we think, because multinational corporations such as Facebook and Microsoft are convinced that their businesses will grow faster if they have robust security protocols and privacy policies that let us believe we’re in good hands. For Kay, in the real world our secrets are safe, and only in the world of fiction do we really have to worry about private detectives, spies, and cat burglars rummaging through our underwear. But in both pieces, privacy is not so much a space as a feeling of security (this being the sense of privacy articulated after slavery in Dionne Brand’s answer to One Hundred Years of Solitude, At the Full and Change of the Moon) or a right to secrecy. While I was reading and re-reading The Walrus, I also happened to be reading the wonderfully bizarre At Swim-Two-Birds, a 1939 novel by Irish author Flann O’Brien that raises some of these questions about privacy. It’s one of the tallest of tales—a whopper you might say—in which an undergraduate writer composes a novel that involves Irish legends mingling into a cowboys-and-Indians narrative that crosses the path of a devil and a fairy. Said writer often escapes from his bullying uncle into his imagination, and his writing—as escapism—is really for him an escape into privacy. This is the opening sentence: “Having placed in my mouth sufficient bread for three minutes’ chewing, I withdrew my powers of sensual perception and retired into the privacy of my mind, my eyes and face assuming a vacant and preoccupied expression.” This line is followed by many other similar “retirements.” I’m fascinated by how physical and temporal it is; he’s chewing, and it’s for “three minutes.” It’s physical, but it’s also beyond “sensual perception,” as if it were meditation, as if he were a yogi. 
His mind might be a conceptual space (as it is in Phyllis Webb’s metaphor of the “glass castle” or Simonides of Ceos’s “memory palace” and his “method of loci”), but it is also out of space and time. In theory, then, your privacy can be as big as you can imagine it. Escapism is a way of managing the intrusions of the social world, which is supposedly the real world in contrast with the world of fiction, illusion, or fantasy—whichever you prefer in this case. I don’t believe in this illusion vs. reality dichotomy. Our “real world” is absolutely full of illusion, fantasy, falsehood, deception, and error, and these make the world go round. Sometimes the only assurance is when you escape it into the mind, as when Descartes says, “I think, therefore I am.” Escapism is actually quite important, maybe more so than ever. It helps us minimize the social world, and it enables us to be a little more conscious and in control of the blend of fantasies in our lives—those of others (e.g., entertainment corporations, political parties, the “echo chambers” of social media) and our own. The social media networks offer privacy only so they can monetize your secrets for themselves. It’s your privacy but their property. Escapism can be a way out of this capitalism—if it’s not through more private property, or publishing, or buying video games or Game of Thrones seasons or any of a million other entertainments, activities, acquisitions, and options in general. Ruthnum’s essay suggests that fiction alleviates real-world anxieties, such as the homophobia surrounding the trial of Oscar Wilde, which the horror stories of his time helped to relieve (70). It doesn’t only create an anxiety for the reader’s enjoyment of suspense, and then relieve it by resolving the tensions of the plot. It doesn’t only pose a fictional problem and offer the fictional solution. Ruthnum’s most compelling observation is that many thrillers today are in fact “near-techless thrillers” (69). They are set before the Internet, or people don’t have their smartphones, or their equipment is broken. The “tech” is basically a spoiler; it stops a tense plot from developing. What if that’s the problem with our real world? The inverse of Ruthnum’s observation is that, in our tech-full lives—despite true threats such as cyberbullying—we are usually contending with our own banality. Although plenty of escapism is banal (e.g., most television, even today in its “golden age”), the thrillers that Ruthnum reads are not. The writer’s imagination in At Swim-Two-Birds is not. They are fictional solutions to real problems. A banal world is a small world, whether real or illusory, social or private. Growing our privacy might be simple: shrink the banality—the sheer boredom, the predictable behaviours, the conformism of body and mind. Set aside the phones and their clocks. Be unplugged and alone more often, but not by shrinking the world of real people. Don’t covet your neighbour’s house. Sometimes I feel that there is nothing more banal than a mortgage. Now if I could only stop binging on Game of Thrones...
How to cite this blog in MLA format: Deshaye, Joel. “How Do You Grow Privacy?” Publicly Interested, 17 August 2016, www.publiclyinterested.weebly.com.