On Tuesday (coincidentally the title of one of my favourite songs by Men Without Hats, from their gorgeous synth-pop album Pop Goes the World), I was invited to The Ship pub to give a talk for the Department of Philosophy’s Public Lecture Series. The 40-minute talk was on the philosophical concept of moral luck as seen in Clint Eastwood’s Dirty Harry films. This blog entry is a highly compressed version of the talk, which was itself a shortened version of an essay published earlier this year in the open-access, peer-reviewed journal Film-Philosophy.
At its simplest, moral luck is a factor when we judge people “responsible for events that are not entirely within their control” (Gregory par. 25). The philosopher Bernard Williams coined the term “moral luck” at Cambridge University in the very late 1970s; he expected that it would seem like an oxymoron or a contradiction at the time (251), but he and others have since shown that—yes—luck does matter morally. One of the classic examples involves two drivers: one runs a stop sign and nothing happens; another also runs a stop sign—but hits and kills a pedestrian walking through the crosswalk (Nagel 25). Although the collision is bad luck, we want to judge the driver who killed the pedestrian, not the luck, because luck has no moral agency by itself, right? But luck seems to have made the difference. We also want to judge the driver who killed the pedestrian as worse than the driver who didn’t. Williams and Thomas Nagel were writing about moral luck around the same time, and neither mentions Dirty Harry—yet the first Dirty Harry film, in 1971, happens to use one of the examples that they would turn to later in the 1970s and early 1980s. The main character, Harry Callahan, explains that his wife died when struck by a drunk driver. He rationalizes her death with these words: “There was no reason for it, really.” From this, I assume that the death of his wife helped to create the Callahan we know by adapting him to the unpredictability of others. He is highly tolerant of luck. I think that he believes that the morality of the luck depends on others, which is why he is often casually willing to allow other men to decide whether to escalate violence. I’m going to try to explain moral luck through a few movies in the series that many of you will have seen: the five Dirty Harry movies, starring Clint Eastwood. Because most of us probably haven’t seen Dirty Harry in a while, I’m going to remind everyone of how it goes as we start thinking about moral luck and how we know who we are—a question that invites a trio of big words: epistemology, the study of how we know something; ontology, the study of being, of who and what we are; and existentialism, a belief in being defined by our free will and responsibility. I won’t dwell on these concepts; my purpose right now is to show that the first Dirty Harry film is unexpectedly ambiguous and full of subtle hints about philosophical concepts of who we are and how we know what we are. This ambiguity can be interpreted not as a hidden ideological message but as respect for the intelligence of the viewer. Maybe at other times I’d be less generous, but I think that Dirty Harry has, in a sense, both a conservative and a liberal respect for our own free will—liberal as in classical liberalism, our ability to think and interpret for ourselves. Unlike so much of today’s media, the Dirty Harry films seem like they’re in dialogue with a variety of political views. In Dirty Harry, Callahan wants to bring criminals to justice without interference from what he perceives as an overly liberal police department and government. He seems conservative, today, but the film itself, with his name on it, seems liberal in preferring the attitudes and actions of African American criminals over those of white criminals. I’ll return to this deliberate contrast at the end of this entry, but first let me describe the most iconic scene. At the start of the film, Eastwood’s character defeats a series of bank robbers of African descent.
The first man to shoot at Callahan (mildly hurting his leg) and to be shot by Callahan is about to retrieve his shotgun when he is targeted again, at close range. Eastwood then delivers the famous lines that I mentioned earlier: “I know what you’re thinking. Did he fire six shots or only five? Well, to tell you the truth in all this excitement I kinda lost track myself. But being this is a .44 Magnum, the most powerful handgun in the world and would blow your head clean off, you’ve got to ask yourself one question: Do I feel lucky? Well, do you, punk?” Satisfied that all is under control, Callahan begins to walk away, but the robber calls after him, “Hey! I got to know.” Notice that both the robber and Callahan say “I know” or “I got to know,” signalling epistemology or how we know. Callahan returns, aims at the man, and pulls the trigger—but the gun has no more bullets. By seeming to involve chance in the moral work of stopping a criminal, Callahan invokes moral luck, and the question of who is responsible. Whether Callahan is bluffing by saying that he “lost track” of the shots he fired is a related question. My published essay does not mention the fact that Callahan repeats this speech with a different outcome at the end of the film, but I’ll return to this repetition at the end here too. At this moment in the story, his potential bluffing can be interpreted as surprisingly epistemological and ontological, about knowing and being. It’s involved in Callahan’s moral ambiguity. The scene of the robbery offers a rather dizzying array of potential meanings, and it requires some close attention before we hear more about how Eastwood involves luck in representations of heroism. It would appear that Callahan is ready to murder the subdued man because of an implied question: “I got to know” how bad you really are, or if your gun is still loaded. But his desire “to know” has more to it than that. To know is not to be deceived. It’s shorthand for knowing the truth, and so the final pulling of the trigger is even surprisingly existential. The robber might be asking his question to know a truth about himself, in addition to the more obvious possibility of wanting to dare the policeman to kill him. If he’s asking about himself, it’s about whether he is a good and virtuous man despite the robbery. If Callahan is bluffing and knows that the gun is empty, his pulling of the trigger is, first, a sign of his merciless sense of humour. Second, it’s a judgment. Yes, we can imagine Callahan thinking, you backed down, so you’re good enough not to die right now. Callahan also implies that, unlike the robber, he knows himself to be good and would not fire a loaded gun at a defenceless man. He refers to “the truth” here in a moment that is wryly confessional, especially when he says, “I kinda lost track myself,” but he could be lying about having lost track. Callahan’s attitude and his personality do suggest that he is bluffing: he exudes self-control, or at least confidence—but then there are so many times in the Dirty Harry films when his behaviour is so reckless that he could not possibly know in advance the results of all his actions. When he asks the robber if he feels lucky, for example, Callahan knows at least that he has already won, even if he does not know the extent of the damage that he might cause in winning. What if Callahan is not bluffing? 
If Callahan really did lose track of the number of shots he fired, then he’s playing a version of Russian roulette that does not risk the life of the person holding the gun. Notably, he does this only when prompted: “I got to know.” Impulsive and irresponsible, he projects some of the responsibility for his action onto the robber, as if the robber’s guilt or innocence could be decided not by the robber’s violence, because that was already settled, but by his taunting or his curiosity, “I got to know.” The modified Russian roulette in Dirty Harry implies that the action of killing is the responsibility not of the policeman but of the other man, or of luck. This theme suggests that in Westerns and cop movies the hero acts according to the morality or immorality of others, and that his own character is not intrinsically moral or immoral, because he applies his ethics to a limit and then refuses to assume further responsibility. In other words, he might be saying, hey, I’m not really responsible for killing a man who dared me to pull the trigger. The potential for self-deception implied here might call to mind Jean-Paul Sartre’s concept of bad faith, which refers to self-deception or inauthenticity. If you want more on that, please read the published version of this entry. In brief, as I see it, the moral character of people with bad faith is related to their existential dilemmas of agency—and Callahan is hardly in an existential dilemma. You don’t look at him and think, here is a man who is searching his soul, wondering how to act. If he is deceiving anyone, it is another person, not himself—except that he might be mistaken about the number of bullets in his gun. Much like the gun, moral luck—to me—is more political than some of the debaters admit. Their points seem to assume, without ever saying it, that moral luck aligns with a liberal or leftist view: that criminals, like anyone, are often the result of circumstances beyond their control that may be described as bad luck. So, when the robber in Dirty Harry seems to want to know if he’s virtuous in spite of being in a robbery, the question is political: liberals tend to see past a person’s crime toward the conditions that led to it, such as poverty, whereas conservatives tend to focus on the deed itself, and then judge accordingly. These generalizations are up for debate, of course, and we could also debate whether Callahan wants to be involved in them—but I think he does. My view here is that moral luck is not all that liberal as a concept, because it enables Dirty Harry to coerce the bad guys into a mimicry of free will and responsibility, and this coercion is not a liberal style of rehabilitating criminals. Brian Rosebury at the University of Central Lancashire, who comes out of literary studies into philosophy as I do, is more worried than I am that moral luck seems to align with a liberal view. Rosebury’s concern is that “we do not choose our acts either, just because we do not choose what causes them” (508); similarly, maybe we can’t judge anyone, ever, because everyone is created “by biological luck and developed by cultural luck” (292). If this alleged moral relativism is truly a problem, and if it is politically liberal in orientation, as in Rosebury’s allusions to social constructedness, why would a figure as conservative as Callahan invite luck to determine his moral judgment or morality?
One answer comes indirectly from Professor Claudia Card of the University of Wisconsin, who joins the debate in 1995 with her book The Unnatural Lottery: Character and Moral Luck. Her book openly acknowledges the political relevance of moral luck. Rather than put a Sartrean emphasis on free will, Card puts will into the context of political, social, and economic limitations—such as repressive sexual laws, sexism and racism, and poverty—that people must work against to be responsible. Card focuses on one of Nagel’s four related kinds of luck, one I haven’t mentioned yet, that has been called “circumstantial luck.” Her first sentence, in fact, is that “[m]uch of the luck with which this book is occupied attaches to politically disadvantageous starting points or early positionings in life” (Card ix). Partly because Card is not a relativist, Rosebury reviews her book positively. Card explains her own not-relativist-but-liberal position when she says she does not want “to let us off the hook morally by showing that fate determines who we become. I am no fatalist [says Card]. [She says,] I find luck influential but not ordinarily determining. It narrows and expands our possibilities, often through the agency of others over whom we have no control and often through the medium of social institutions” (x). For Card, and seemingly for Rosebury, luck can be accepted as an influence but not as the determiner of someone’s morality. So, through Card I might answer my own question. Perhaps Callahan invites luck to determine his morality to suggest (perhaps especially to liberals) that mitigating circumstances are not as important as they might seem and can still be strictly controlled: whether there is a bullet left in his gun or not, he has demonstrated how effective a strong and punishing response to crime can be. That’s usually a conservative view. Now, as this entry approaches a conclusion, I want to shift our attention to the men defeated by Dirty Harry, men I’m going to call, ironically, the lucky punks. There’s a pattern in how Eastwood’s characters from the late 1960s through the end of the Dirty Harry series speak with the lucky punks. They are almost always African American men who evoke American racial politics from the era of civil rights to Reaganomics. Let me remind you that, in Coogan’s Bluff, Eastwood’s character is ready to stab a man and, when asked whether he would have done it, says, “I don’t know. That was up to him,” which is the prototype of Callahan’s “Do I feel lucky?” It is also the origin of a third statement, Callahan’s “Go ahead. Make my day,” in Sudden Impact, from 1983, the fourth of the five Dirty Harry movies. In all three scenes, Callahan’s foe is a black man; each one commits a crime, but each one backs down, luckily for him and for Callahan. If you’re not convinced by my argument about the first shootout, above, think about the pattern of these three scenes. I would go so far as to say that they’re a stereotypical and wishful commentary on American race relations during the time of the black power movement. This movement was meant to address civil and socio-economic inequalities, such as systematic or systemic racism and its impoverishing effect on Americans of African descent. Coogan and Callahan project responsibility onto what they might assume is blind luck (a synonym for chance that, like the free market, is not supposed to be prejudiced), whereas the pattern of skin colour suggests that it is definitely not blind.
I’m fascinated to see that Callahan is represented as poor or at least cheap throughout his first story—cheap pants, hot dogs for lunch and supper—and maybe his lack of money gives him sympathy for the black men who rob the bank. Still, Eastwood’s characters seem to be telling black men (and I’m aghast at the message), “Quit stealing—and be responsible to yourselves and to us.” Upholding the generally anti-governmental position of these films, Callahan and Coogan would probably not be willing to supply the coin to pay the cost of fairer government and justice. Here I have to admit that Callahan uses his “Do I feel lucky” speech twice in the film, once with a black man who backs down and once with a white man who chooses to try to get his gun, and Callahan shoots him. The black criminal is a bank robber, and he is spared. The white criminal is a serial killer, kidnapper, rapist, and extortionist, a much worse criminal, and he is killed at the climax of the film because Callahan does have one more bullet in his gun the second time. We realize then that Callahan’s “Do I feel lucky” speech is a script, possibly one he has used more than once before. If he has used it more than once before, then he probably was bluffing and was in control when he stopped the bank robbers. Maybe it wasn’t moral luck, and in fact Claudia Card argues that “[t]aking responsibility [...] is likely to involve consciously developing an integrity that does not develop spontaneously” (24). I wonder, then, if making others responsible is usually going to be scripted and not “spontaneous.” Ultimately, however, I can only interpret what the film offers me, and it offers only two “Do I feel lucky” speeches. The real script is the screenplay that the writers gave to Clint Eastwood—the actor, not the character—and these writers probably realized that there’s an aesthetic balance in having only two “Do I feel lucky” speeches, and a dramatic irony, because the serial killer doesn’t know that Callahan is basically comparing him with the bank robber. I doubt that Callahan’s just repeating the same script in every showdown, going throughout the city, asking, “Do I feel lucky? Do I? Do you? How about you? Scale of one to ten…” More important, the political commentary seems to be, on the one hand, that whiteness is associated with the worst crimes (quite a left-leaning admission in the North American context, these days); and, on the other hand, that the white criminal is not subject to luck and cannot be forced to take responsibility, but the black criminal is and can. For Eastwood’s characters, black men must be pressured to conform to expectations of non-violence and obedience. But, unlike the white criminal in the first movie, at least the black criminals have respect for their own lives and are willing to stop violence—and I want to take this detail as the film’s respect for African Americans, even though I can’t entirely. While the filmmakers represent black men with consistent symbolism related to luck throughout the Dirty Harry movies, not all of these men are stereotyped as criminals, and I have one final example to show that moral luck is connected especially to black men in these films. There are many white criminals in these films, and many of them are also stereotyped as symptoms of liberalism, as with the murderer and his girlfriend in Coogan’s Bluff.
Sudden Impact plays on our expectations of seeing threatening black men in Dirty Harry movies, but then it introduces Horace, played by Albert Popwell, as an ally to Callahan. It’s interesting to see the sequels respond, or seem to respond, to political critiques of the earlier films, because this kind of listening suggests a style of conservatism that is still open to thoughtful debate. In Magnum Force, the second of the five, which came out in 1973, Callahan’s partner Early Smith, played by Felton Perry, is also a black man. One of their conversations suggests that Smith is aware that Callahan takes risks with people of his colour. In a scene where the two policemen are following suspects by car and beyond their jurisdiction, Callahan says that he wants to confirm a hunch and decides to antagonize the suspects—but Smith doesn’t want to be caught in the middle of a gunfight. He says, no, “I don’t want to be winning bets for anybody.” His reference to “bets” implies that he would agree with my argument that his partner uses others while depending on luck to seek justice. Callahan would disagree; he persists and says, “I’ve never been wrong yet, have I?” But later, after Callahan warns him to take care of himself, the corrupt policemen assassinate Smith because of his partnership with Callahan. Magnum Force suggests that Callahan’s hunches are never wrong, contrary to my argument about his partial uncertainty in the first Dirty Harry film, but he is unquestionably sometimes wrong: Smith dies because Callahan takes risks and cannot take responsibility for everyone; other people, including good people who are on his side, are forced to take responsibility for his actions. Because he cannot save everyone, he is not God, and if he is not God, his claims to certainty must sometimes be in error. He might be on a lucky streak, at least as far as his own survival goes. Although I’ve entertained other points of view, I’ve argued that Callahan was being honest about his partial uncertainty in the first “Do I feel lucky” showdown—though he’s probably confident that he’d win regardless. This discrepancy is ethical and political. He and his prototype Coogan have a common mission, not only to get their men regardless of the law but also to offer a final choice to the enemy, who may be punished if he continues to be violent. Their ethical shortcoming is that their respect for the African American men who confront them is limited to these men’s potential to be coerced into responsibility. In another way, however, the black robber is the most interesting character in the first Dirty Harry film, because his unpredictability—his potential to make a decision—is the truest unpredictability, and the truest potential. The robber is the one character who might want to know himself better. Callahan might want to know others but seems entirely confident in who he is, perhaps too confident. In contrast, we expect that the serial killer is going to try to kill again. He’s predictable. Because of this expectation, moral luck is more a factor in Callahan’s and the robber’s decisions. They are the interesting characters, and moral luck can be a plot device that creates suspense through the unpredictability of these characters.
Here in sunny San Diego for the PCA-ACA 2017 conference, I reserved an evening to go see Lolita Chakrabarti’s play Red Velvet (2012; this production directed by Stafford Arima) at the Old Globe Theatre in Balboa Park, and I discovered a fascinating study of dramatic irony as a parallel to the insidiousness of racialization. What I mean is that the play is about how race fools us.
Dramatic irony is when the audience knows something that a character doesn’t know. In this case, the character is the historical figure Ira Aldridge, an African American actor. He was the first black man to play Othello on the London stage. In Red Velvet, he’s trying to promote the movement toward naturalistic acting but is himself the over-actor incarnate. Albert Jones plays Aldridge contrary to the fictional and perhaps historical Aldridge’s preference for “domestic” or naturalistic styles of acting. Ben Brantley in The New York Times reports that Adrian Lester played the role similarly in 2014, so that the actor “exudes the scary, outsize presence of the barnstorming stardom of another time.” Aldridge’s controversial performance in London in 1833 coincided with the final major legal milestone in ending slavery in Britain and its colonies. Until then, white actors played black characters in blackface—and so, in an ironic twist (like that of Patrick Stewart playing Othello in an otherwise all-black cast), Aldridge plays King Lear in whiteface at the conclusion of the play, speaking these colour-sensitive lines from King Lear (IV,vi): “They flattered me like a dog; and told me I had white hairs in my beard ere the black ones were there. To say ‘ay’ and ‘no’ to every thing that I said! [. . .] Go to, they are not men o’ their words: they told me I was every thing; ’tis a lie, I am not ague-proof.” An ague is an illness, especially a fever, and so Lear is calling attention to various possibilities, including that he is confused—by the lies, by the “madness” (III,iv) that he worries is upon him. A mad character is probably always an instance of dramatic irony, at least in those moments when the character is not aware of the madness. In Red Velvet, I think the madness is the idea of race itself—but I’ll come back to that. Aldridge is also calling attention to the weaknesses of his body in lines that Chakrabarti seems to be repurposing. When Lear compares “white” and “black” hairs, he means age and how it is symbolized—here, that white hair is a symbol of wisdom, I think. When Chakrabarti’s Aldridge’s Lear says these lines, however, he signifies that race, like traits such as wisdom (which Lear did not consistently have), is not essential to anyone. Race is partly a bodily performance, especially as Red Velvet dramatizes Aldridge, and partly an attribution that can be manipulated for reasons good and bad. (Coincidentally, the San Diego Museum of Man, just steps away from the Old Globe in Balboa Park, is presently curating an exhibit called “Race: Are We So Different?”) The crisis of Red Velvet is that Aldridge’s critics, the writers who review his play in the newspapers, echo stereotypes of black men as (often sexually) aggressive and thus a threat to white virginity and whiteness-as-property, as in the theme of inheritance suggested by the play’s ailing white father and his son. (For more on the latter, see Cheryl I. Harris’s “Whiteness as Property” essay from the Harvard Law Review.) Aldridge has already seemed to prove his critics right in advance by rehearsing and performing the strangulation of Desdemona too “realistically,” which means according to the commonly held racial stereotype and the reality presumed by the critics.
He then attempts to strangle his French manager, an ally and friend, when the Frenchman finally concedes to public pressure to remove Aldridge from the role. Unlike most of his colleagues, Aldridge is presented as an over-actor whether on stage or behind the scenes in the dressing room, and in the program Jason Sherwood, the set designer, comments that the superimposition of Aldridge’s private life (backstage) and public life (centre stage) is crucial to his character as imagined by Stafford Arima. Indeed, Aldridge is almost entirely “public”: projecting from the top of his voice, preoccupied with gesture, vying always for position and attention. One implication of Jones’s performance is that one’s persona invades one’s private life, a commonplace that informs much of my work on celebrity. As I’ve recently written in the context of racialization in The Journal of Commonwealth Literature, it is also that one’s public face can turn an “about face” on the self, allowing social norms to define a person. So, when the stage’s rotating proscenium (yes, a prop that expensive) sends us back to the present near the end of Aldridge’s life, the play ends with his Lear’s exhortation against the “lie” of the public’s and the court’s (and his family’s) support for him, juxtaposed against the flashback to his manager’s withdrawal of support following the racist reviews of his Othello. The play thereby emphasizes the struggles in the historical Aldridge’s remarkably successful career, set against the backdrop of Britain’s very mixed, ambivalent movement toward abolition from the late 1700s to 1833 when, finally—about a generation after it had stopped trading in slaves in 1807—Britain abolished slavery itself. If a viewer wonders why Aldridge is presented with something less than total sympathy, it’s because the play appears designed to dramatize the insidious effect of socialization on one’s private life. We know something that the fictional Aldridge does not know: that he is unwittingly the exaggerated product of the racism of his critics, while he believes he is being authentic.
How to cite this blog in MLA format: Deshaye, Joel. “The Dramatic Irony of Race and Red Velvet.” Publicly Interested, 16 April 2017. In the news again today, Senators are arguing about a controversial bill to change the national anthem, but the politicians and others who say that there is a grammatical problem with the proposed revision are wrong.
I don’t have any objections to the proposed revision. It would be different if the government were trying to revise one of the objectionable poems by Irving Layton. The anthem is official and meant to be sung together to encourage citizens to feel that they are part of an imagined community, so inclusive lyrics are a good idea. The proposed revision, in fact, is still too martial and religious for my taste, but that’s a topic for a different post. This post is about grammar and controversy, that old pair. Here is the start of the anthem—objectionable to someone in every line, I know, but culminating in the one under debate today: O Canada! Our home and native land! True patriot love in all thy sons command. I can explain the first two lines, but that’s probably not what you want right now.* You want the debate about the third one: “True patriot love in all thy sons command.” To be gender-neutral, Bill C-210 proposes this revision: “True patriot love in all of us command.” According to Senator Michael MacDonald in the CBC News story linked above and below, various people, supposedly including English and linguistics professors, have agreed with MacDonald, who protests that the revision is grammatically incorrect. It’s not. It’s a complete sentence, albeit in archaic syntax: an example of the re-ordering of words that is sometimes necessary to position rhymes at the end of lines, as with “land” and “command.” Adjusting the syntax but keeping all the same words, the line is still a grammatically correct sentence of the imperative type: “Command true patriot love in all of us.” The song begins by addressing Canada as if the country were a person (in a technique called apostrophe: “O Canada!”), and that person carries over to the third line as someone who could command someone else. It’s correct (if not politically) in the official lyrics, too: “Command true patriot love in all thy sons.” When Senator Michael MacDonald says, “The proper and only acceptable pronoun substitution for the phrase ‘All thy sons command’ is ‘All of our command,’” he is neglecting another “proper” reading: that Canada is the commander. His interpretation is fine, more or less, and we can discern it by adjusting the syntax again: “All thy sons command true patriot love.” In this case, we can interpret the line to mean that our boys are growing into authority by telling others, or inspiring others, to care for them nationalistically. Another interpretation is that we are in control of our own love. Also in this case, however, the preposition “in” mysteriously disappears. In my view, MacDonald has to explain the use of that preposition (or why it can just vanish) before he claims that his reading is the “only” one. * The song begins with what’s called an apostrophe—not the punctuation mark, but an address to someone or something. It’s part of a tradition of addressing sublime things, sometimes including the nation, especially if the nation is ruled by a king and the King is close to God, who (you might say) commands sublimity. It’s sometimes called the “apostrophic O,” as in Percy Bysshe Shelley’s “Ode to the West Wind.” Here at home, the addressee is Canada, and the first line of the song is an incomplete sentence only if we neglect other types of sentences. The second line is the same: technically a fragment (no subject or verb) but excusable because of its exhortative, exclamatory role. If someone bumps into you at the grocery store and he’s the one to yell “Hey!”, you can’t say it’s not a grammatically correct response.
How to cite this blog in MLA format: Deshaye, Joel. “The Grammar of the National Anthem in Canada.” Publicly Interested, 4 April 2017, www.publiclyinterested.weebly.com. “Let’s have some decorum,” President Richard Pryor says in a White House press conference just before he jumps into the crowd to attack a journalist for asking a racist rhetorical question about his mother. In this 1977 sketch from the short-lived Richard Pryor Show, Pryor could well have been commenting on recent news about the relationship between the president and the media in the time of Donald Trump.* In the sketch, Pryor imagines himself as the 40th president of the United States—a position that went in fact to Trump’s touchstone, Ronald Reagan, whose so-called Reaganomics started a trend in exacerbating the American racial-economic inequalities that Pryor cited so often in his comedy routines.
When President Pryor channels generic political spin and defends the neutron bomb as “a neo-pacifist weapon,” I still hear Trump, though Trump would never use the Grecian prefix. Trump is less audible (almost an impossibility today) when Pryor’s critique of race emerges. Responding to a question about funding for the space program, Pryor says, “I feel it’s time that black people went to space. White people have been going to space for years, and spacing out on us as you might say. And I feel with the projects that we have in mind we’re going to send explorer ships to other galaxies, and no longer will they have the same type of music, Beethoven, Brahms, Tchaikovsky. Now they’ll have the Miles Davis, Charlie Parker....” If only Pryor were still alive to comment on Trump’s superficial (and faint) praise for the long-dead nineteenth-century abolitionist Frederick Douglass: “Frederick Douglass is an example of somebody who’s done an amazing job and is getting recognized more and more, I notice.” I bring up Richard Pryor and Donald Trump because we watched the 1985 version of Brewster’s Millions last weekend, or perhaps the previous weekend (a blur in busy times), continuing a series of viewings focused on Reaganomic movies, and I’m compelled by the resonance that this movie has with the current politics of the United States. Can anyone be elected in the United States—regardless of sex, gender, race, class, age? Americans are not alone among people willing to elect the seemingly unsuitable and unqualified, but the election of Trump is nonetheless remarkable. Brewster’s Millions asks a more specific question: Would Americans elect a black millionaire who is otherwise unqualified for public office? Brewster’s Millions is only one of many adaptations of a turn-of-the-twentieth-century novel by George Barr McCutcheon, which became a play and a series of films before Walter Hill adapted it and found Pryor and John Candy to play the leads. At least in this version, the story involves a black small-time baseball player, Montgomery Brewster (Pryor), whose elderly white relative (surprise!) dies and bequeaths him $300 million—but only if he can spend $30 million in 30 days without accumulating assets, giving more than 5% to charity, or destroying things that are “inherently valuable” such as works of art. (This unlikely plot recalls Steve Martin’s 1979 movie, The Jerk, in which Martin plays a white man who thought he was black, realizes he’s white, becomes a millionaire who squanders his money, and then re-integrates with his adoptive—but now rich—black family.) The lesson is supposed to teach Brewster to hate spending and become frugal. It’s ironic, of course: the premise that conspicuous consumption might lead away from excess to moderation. Brewster sets to work hiring people—valuing the labour of typically under-paid people, with the exception of a few ritzy interior designers, lawyers, and money managers—but also has two inspired moments of how to spend money without gaining anything material. First, he buys the most expensive collector’s stamp in the world, then uses it as postage on a postcard. Second, and this one is special, he decides that the best way to waste other people’s money is to run for office. His campaign for mayor is really a campaign against the establishment, so his slogan is “None of the Above,” a far cry from “Make America Great Again.” But in other ways his campaign is a lot like Trump’s was. Spin off your stardom from tabloids / Reality TV to municipal / federal politics.
Buy votes shamelessly. Be the third way (as ironic as it is to say that in Canada where the third way is to the left). Have little respect for office and be honest about it, or seem to. Announcing his candidacy for mayor, Brewster says, “What I’m saying is, only an idiot would vote for me!” His follow-up, what he calls “the bottom line,” is that “I’m here to buy your votes.” Later, at a big rally full of supporters, he declares that he is there “to see to it that neither of my opponents, nor me, win the election! I want to ask the question: Who’s buying the booze? ... And who’s trying to buy your vote? And who are you going to vote for?” The rallying cry is “None of the Above!” But the crowd really means him, and he later drops out of the race to prevent his actual winning. So Americans would elect a black millionaire! At least as mayor. And if he’s funny enough. A few of my friends have now said that they plan to weather the Trumpnado by sitting back to be entertained while waiting for him to lose an election. But that’s exactly what Trump wants us to do. He hates being criticized, but he loves to entertain. Pryor’s critique of star politicians and their fans is that the masses don’t really care about the message as long as they are entertained, e.g., with “the booze.” It’s a classic—and class-based, Marxist—view of the public, one to which I will return in a moment when I ask whether the film itself undermines Pryor’s critique. First, consider that, because Brewster is entertaining, his public ignores his message of not supporting the establishment. Not supporting the establishment was perhaps the key premise of Trump’s campaign against the much better qualified Hillary Clinton. Pryor’s satire here reveals that rich people like Trump are the establishment, just as much as political lineages such as the Clintons, the Bushes, and the Kennedys are. The people who vote for Brewster or Trump are the “idiot[s],” Pryor claims. Could any idiot be elected in the United States? I reserve judgment on whether Trump would qualify; my point is that any millionaire could be elected. Brewster’s Millions shows us a world in which Americans vote for the money, possibly without realizing how it is the driving force of the corrupted politics that they want to oppose. At the conclusion of Brewster’s Millions, Brewster does claim to be sick of spending money, but he does everything he can to get the $300 million—raising my question about the coherence of this movie’s satire. It ends with Brewster a millionaire without rules on how to spend his millions. In that sense, it promotes unregulated capitalism of the type that Trump supports. It does not promote the legitimacy of black men and women as entrepreneurs or in politics. Further, this unregulated capitalism does have one apparent rule: that white men govern the black men’s money. The 1985 version of Brewster’s Millions can be seen as a pedantic, racially condescending film, because the white man has to train the black man in how to handle money. Worse, the film shows only the training, a frantic montage of conspicuous consumption akin to later hip hop videos, before the bling became satirical too. Brewster’s claim to be sick of spending money is such a passing gesture, such an ambiguity. Is the mereness of the gesture a sign that Brewster has not learned the white man’s lesson, perhaps deliberately? If he had truly learned the lesson, would he have been so desperate in the final minutes to get the $300 million? It’s a double bind.
Either he plays by the rules of a white capitalist economy, or he remains an unemployed baseball player who has humiliated himself as entertainment before the masses. But maybe this is what Pryor intended: to show, not only in the film but in its structural relationship with the economy of the culture industry, that black men in the United States are still not taken seriously, even when they are making the most serious of jokes. * I’ve decided not to call it “the era of Donald Trump,” preferring to allude instead to the title of a Gabriel García Márquez novel. How to cite this blog in MLA format: Deshaye, Joel. “Presidents Pryor, Trump, and None of the Above.” Publicly Interested, 19 February 2017, www.publiclyinterested.weebly.com. This morning, after yesterday’s American presidential election of the businessman Donald Trump, I went looking for perspective. I wanted to help myself understand more fully why many Americans voted for him. I found a somewhat unexpected explanation through the mathematician and philosopher David Schweickart. In the title of an essay, he claims that “Yes Virginia, There Is an Alternative” to the global capitalism represented by rich elites such as Trump. Coincidentally, my very first post on this blog was an open letter to Justin Trudeau, one that alluded to the child’s letter to Santa Claus that received the famous response from the Republican outlet the New York Sun, “Yes, Virginia… [there is a Santa Claus].” I don’t believe in Santa Claus, and I don’t believe in Trump, and I don’t like Schweickart’s newly minted socialism, which—the day after the election—feels just too close to one of Trump’s very few ideas, even though it’s not. And so today’s post returns to the rosy nostalgia of the Sun’s letter in the context of Trump’s blatant mischaracterization of Hillary Clinton as the rich elite and himself as the outsider to the system.
Trump himself said, and I paraphrase, that America needs not a politician but a businessman—as if there were never a politician who was a businessman first. Many voters echoed this rationale for electing Trump: that government is corrupt and that the United States needs a leader who “isn’t owned by anybody,” and someone who will fire his underlings and thereby increase accountability. But this idealized “boss not politician” identity reveals a disheartening confusion of economy and government: the mistaken idea that capitalism is somehow more democratic than elected government. (This confusion is partly what led to the popularization of the term “neoliberalism” to describe ubiquitous capitalism, i.e., capitalism that is now inseparable from democratic governments, following I think from Margaret Thatcher’s claim that capitalism has no alternative.) Even if it were true that capitalism allows any new competitor into the market and hence provides renewal of its leadership, it would not be true that capitalism is accountable to anyone. (Exceptions are few and far between, especially among transnational capitalists. I don’t have a problem with most small businesses, though they be capitalist.) If you disagree with the beliefs and actions of the chief executive officer of the biggest business in the country, you cannot vote that person out. If you think that businesses are somehow better at managing their finances than governments are with theirs, look at the huge number of businesses, including some of Trump’s, that have bankrupted themselves, with negative repercussions on investment and employment. Americans are not entirely irrational to appreciate corporations and mistrust a government that is associated with police brutality; illegal, immoral, and costly wars; and surveillance, torture, and murder. The president is ultimately responsible for these problems, but the police, the military, and the spy agencies are not exactly “government.” I’d like us to remember the term “civil servant” when we think of government. The connotation of civility shouldn’t be forgotten, and servitude, though not a word that describes most workers in government, can at least connote a devotion to a cause. If we, anywhere, are serious about upholding democracy, good government has to be a cause, and we need to consider whether the fat cats are in government as much as in big business. Few of us today are devoted to our corporate employers, because corporations demonstrate little fidelity to employees and often benefit from precarious (yes, sometimes unpaid) employment. Schweickart addresses this comparison in his essay, remarking that among the top 25 incomes in the United States in 2009 was that of a hedge fund manager: $900 million. To tax his income so that it would be equal to that of the president of the country, his tax rate would have to be between 99.95% and 99.99% (Schweickart 174), depending on whether we equalize before or after the president pays his taxes. (It’s always his. The United States just missed its first opportunity to elect a woman and to realize, at least for another moment, equality of opportunity.) But Schweickart’s essay is weirdly neoliberal in that it accepts, completely, that capitalism should be a part of government. Or that democracy should be a part of capitalism, which is probably the more accurate way of describing Schweickart’s suggestions.
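For anyone who wants to see the arithmetic behind that startling range, here is a minimal sketch; I am assuming, for illustration, the 2009 presidential salary of $400,000 (Schweickart’s exact inputs may differ). If the hedge fund manager’s pre-tax income is I = $900 million and the goal is to tax it down to the president’s pre-tax salary p, the required rate r is, in LaTeX notation:

\[ r \;=\; 1 - \frac{p}{I} \;=\; 1 - \frac{400{,}000}{900{,}000{,}000} \;\approx\; 0.99956 \]

That is, about 99.96%, inside the cited 99.95–99.99% range; equalizing with the president’s smaller after-tax income instead shrinks p and pushes r toward the high end of the range.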
In his aforementioned essay and his book After Capitalism, Schweickart conceptualizes a form of corporate government called “economic democracy,” which he calls “our new socialism” (183). The innovation, Schweickart claims, would be to replace labour and capital markets (183) with capitalism by the people and for the people (i.e., profit sharing or “worker self-management of firms”) and “social control of investment” (184). As a result, his economic democracy “is also far more compatible with ecological sanity than is capitalism… Capitalist firms tend to maximize total profits. Democratic firms tend to maximize profit-per-worker” (187) and therefore would not expand unsustainably. I like most aspects of these ideas, but not the conflation of government and economy implied in “economic democracy,” and anyway these ideas will not be realized at a transformative scale without the regulatory insistence of government, notwithstanding the successes of the Mondragon Corporation, a cooperative. I used to work for Canadian Tire and the Royal Bank of Canada, both of which engaged in limited profit sharing, but they were hardly democratic institutions willing to change according to the results of a vote. Trump would never do it. When political allies vote for, or work toward, a politician who wants less government and more leadership by corporate fiat, they are forgetting how democratic government serves and protects them with a much higher priority than corporations do. This ignorance or selective memory has various historical dimensions that can best be explained through Trump’s slogan, “Make America Great Again.” This imperative assertion is an order, in fact, that both reifies his authority and delegates accountability—a big problem with corporate governance. It suggests that now, the end of the Obama administration, is a time when America is not great. Greatness is the past—perhaps the so-called Golden Age of capitalism in the two or three decades after the Second World War. (Trump might well prefer a revolutionary era.) Trump’s echo of Ronald Reagan’s slogan (“Let’s Make America Great Again”) suggests that he can remember only as far back as the late 1970s and into the 1980s, around when a potentially sustainable capitalism (Schweickart 177-178; Featherstone and Miles 126) veered off the cliffs of insanity. Trump’s remarkably short memory is a sign that we live in a time that Mark Featherstone and Malcolm Miles describe as “a permanent present” (125) on the pretense (not theirs) that no alternative to capitalism means no change and thus no future. It is also evidence of Trump’s nostalgic desire, as Svetlana Boym might describe it, “to obliterate history and turn it into a private or collective mythology” (xv). Voters buy Trump’s economic rationale because it encourages them to romanticize the past rather than believe, as Hillary Clinton asserted, that America’s best days are ahead of it (maybe four years ahead). And, in this case, it’s easy to forget. It requires no work at all. The New York Sun advised Virginia not to think so much about questionable characters like Santa Claus, and its message—though seemingly winsome—is far too close to the anti-intellectual message of Trump and his most manipulative and manipulated followers. The editors in 1897 encouraged young Virginia, eight years old, to concentrate on “faith, poetry, love, romance” rather than wonder about the truth and even begin, in her innocent way, to do some research. How sad that she put her faith in the Sun!
How ironic that Trump pointed fingers so often at the liberal bias of the media when this historical example is so aptly contrary. How hilarious to imagine Trump expressing a thought or feeling even remotely poetic. We in (North) America cannot trust “the” government when “the” means Trump and his corporate agenda, one premised at least in the popular imagination on the end of the separation of government and economy. And I am simply heartbroken that so many Americans could trust someone so unwilling to allow his deals to be scrutinized for their legality. And someone so evidently racist, in his plans to ban Muslims and build a wall against Mexico.* And sexist, in his admitted sexual harassment and his repeated misogynistic slurs against one of the most accomplished diplomats in the world. * See the It’s All Narrative blog for a convincing explanation of the relationship between economics and racism in Trump’s electoral victory.
How to cite this blog in MLA format: Deshaye, Joel. “Trump’s Appalling Economic Democracy.” Publicly Interested, 9 November 2016, www.publiclyinterested.weebly.com. Recently, Jordan Peterson, a professor at the University of Toronto, helped to cause a minor scandal when he refused to use gender-neutral or accommodating pronouns with students who self-identify as other than “he” or “she.” The university remonstrated with him—and then Rex Murphy came to his defence a week ago in The National Post. Yesterday, the professor got a major news outlet, The Toronto Sun, to publish his own essay. That Peterson is gaining publicity for a right-wing perspective should be obvious from the stated dislike of Marxism in his essay and his nigh inexplicable claim that people who want to change pronoun usage have “an intense resentment of anyone who has become successful for any reason whatsoever.” As a more-or-less leftist liberal with only a little nostalgia for the bygone conservatism of the Red Tories, I want to use my own admittedly (and helpfully) jumbled politics, and my position as a professor of English, to ask a simple question. How can we set aside the us-and-them politics of this debate?
Before I go too far, I want to say that if a student ever came to me and said, “I prefer the pronoun ‘per’” or any other pronoun, I would use it, or, if I couldn’t remember it among all the options, I’d use the person’s name. Having some control over the words people use to define you is meaningful to your sense of identity and belonging. Here is one of my favourite poets, the insistently or at least consistently lower-case bill bissett, offering a similar opinion: “can b myself he she thinks thn thats the feer that th punishment will cum fr sure if he she cant leev her call her him n start packing.” Here bissett is also radically objecting to the authority of standard English, while offering the he/she option that many people today would change to “they.” Who would have thought that bissett’s writing would ever be old-fashioned in the eyes of other radicals? But rather than do any research right now to answer this question, I also want to say that I note as “incorrect” the grammar of most students who use “they” when referring to singular nouns and names. When a student’s writing is already excellent, I try not to count “they” as a technical error. Most students, however, are not using “they” for political reasons. Rather, they don’t know which parts of the sentence benefit from agreement with each other. They need a lot of reminders about how parts of sentences fit together to generate and express coherent, consistent thoughts. Asking for agreement in writing is usually not as political as many students and critics think. It’s obviously political in the case of Peterson, however, with various parties attempting to convince or cow each other. In my opinion, confrontational assertiveness is no help, and a third way out of the double bind is needed. I can respect someone’s stated preference for a set of pronouns, but, if the word “they” comes from standard English and is plural in standard English, I’d also like people to respect my preference. It’s a part of my sense of identity and belonging as someone who loves language and has fostered that love against various stigmas that persistently degrade art and the humanities. Rather than err with “they,” I’d rather see writers use neologisms such as “per,” “pers,” and “perself,” which Marge Piercy coined in her 1976 novel Woman on the Edge of Time. (I like these ones because they remind us of the English word “person,” so they’re not only affirmative but also easy to remember and say.) To butt heads on “they” as plural or singular is to perform a script produced by a binary opposition whose politics is equally binary and thus potentially antagonistic. (“Politics is” can be correct when “politics” is used as a synonym for other singular nouns such as, in this case, “ideology.”) The third way is the neologism, which should be less contestable, in theory but not in Murphy’s or Peterson’s case. Murphy’s conservatism reacts partly against the perception of these pronouns as “a set of freshly made up words,” or, in other words, what he calls “neologisms.” Notably, according to the Oxford English Dictionary, the word “neologism” itself dates to 1772, which is closer to “new” than “old” in the history of the English language. If Murphy reflected on this relativity, he would soon realize that the English language is constantly changing to reflect new realities, partly by gaining new words. I would remind Mr. Murphy of George Orwell’s coinage of “doublethink,” which I suspect Murphy himself has been glad to have in his verbal toolbox.
I love Murphy’s subjunctive and his vocabulary of “imprimatur” and, perhaps ironically, “obscurantists”—but, Mr. Murphy, to use “midwife” as a verb would surely have bothered some English professor somewhere. Maybe even me. Yes, I have been—am—a prescriptivist much of the time. In trying to improve a student’s writing, we’re trying to improve the student’s thinking. Many of us need to improve our thinking by learning how to think beyond binaries, or black and white. This lesson comes partly out of the debate over pronouns, and many of the advocates of gender-neutral pronouns identify as “non-binary.” But, still, knowing how words agree with each other is really helpful: it helps writers to be aware of how sentences work and how their readers might experience their sentences. There’s nothing wrong with this purpose. So I was stung when I first saw how the website Motivated Grammar attacks professors like me for prescriptivism. I’m amazed at how someone could write against prescriptivism and sound like such a bully! Check it out: The only problem with this view [of grammatical rules as helpful] is that all you’ve managed to learn about English is how to get your brain to release some satisfying endorphins every time you blindly regurgitate some authority figure’s unjustified assertion. You’re not helping; you’re just getting someone to pretend to agree with you long enough to shut you up. Or worse, you’re scaring people into submission to a point where they feel compelled to preface their speech with apologies for any unknown violence their words are committing against the presumed propriety of the language. (par. 4) Notably, Peterson believes that his university and his provincial government are trying to do just that: “[scare] people into submission.” He worries that the government will dramatically expand hate speech laws to punish people who misuse pronouns, which, I agree, would be scary. I know that a pronoun can be used hatefully, but there are all kinds of other words that are much worse; “hate” is a very serious word. What if you could be punished if someone overheard you misidentifying a genderfluid person who identified as “she” when you knew her, and who later flowed into “he”? Gender is too complex to regulate with such imagined laws, and one would hope that the tone of the discourse surrounding it could be less brutish. Laws can be too rigid, and other forms of power can be more flexible. I like the power of contextualization, of putting things in perspective. Motivated Grammar states that many well-respected writers throughout history have used the singular “they.” If great writers break the rules, why can’t we all? Using a claim to authority (the great writers) to deconstruct a claim to authority (grammar) is fine, but it can be interpreted as just another power play, one power against another. Recently, I heard Alan Doyle of Great Big Sea hosting his program on CBC Radio, and he said of a song he had just played, “I love it—loves it!” He corrected himself into the grammatically incorrect but culturally appreciated Newfoundland style of subject-verb agreement. This example of self-policing demonstrates to me that the “grammar police” and the related discipline are not only functions of a dominant language or culture. (Read D.A. Miller’s The Novel and the Police or Michel Foucault’s Discipline and Punish for more on police and self-policing.) Dialects and subcultures have their own gatekeepers, often cultural figures such as Doyle or Murphy.
I like their respective styles of writing, but let me give my own example of a great writer. Not too long ago, I was reading Tim Ingold’s wonderful book Being Alive, specifically its chapter on landscape and weather. The blurb from Stuart McLean (not the Stuart McLean of The Vinyl Café) on the back cover claims that Ingold’s prose “is exactingly lucid and charged with poetic eloquence.” Indeed, he is a writer who can use the subjunctive perfectly: “Are pebbles, then ‘objects on the earth’? [James] Gibson would say so, and so would we, were each of us to stop to pick one up and, having examined it, to replace it where it lay” (131). But I found this sentence: “For formerly blind persons whose sight has been restored by a surgical operation, and doubtless for the newborn opening their eyes for the first time, the delirium [of seeing the world appear to be formed in the moment] can be overwhelming” (128). Here, a writer many would call “great” switches from the plural “persons” to the singular “newborn” for no apparent reason, thereafter linking “newborn” with “their” when “newborns” would agree better. Why not write “newborns”? (It’s so easy to fix these minor errors, so why not?) Did Ingold intend to refer back past “newborn” to “blind persons”? Not likely. (That’s a sentence fragment, of course, and I’ve started some sentences with conjunctions, too.) But what harm is done by agreement? And why doesn’t this usage cast doubt on the writer? The short answer is that we trust Ingold’s writing because of who he is (however questionable such authority might be) and, more important for my argument, because most of his writing really is above reproach. Readers in the academy, however, are trained (perhaps a distortion of our education) to be critical of everything, including each other. One of the recent peer reviews of one of my essays returned the feedback that my writing is too “conversational”; I had used a single contraction in 6,500 words. (The essay has since been published.) My former supervisor, in contrast, reacted to my attempt to minimize metaphor (read my book if you wonder why) by telling me my writing had become almost unbearably “stark.” Professors tend to approach everyone’s writing with a critical eye. Students, especially, are usually in the early phases of establishing credibility as thinkers and writers. If my professors over the years hadn’t noted the myriad ways in which my essays were difficult to understand, I might have improved simply by reading a lot more, but I might have needed twenty years instead of—I won’t say how many. In the end, I wish Peterson would relent and eschew his overly conservative ways, but I also wish that the more ardent prescriptivists and political correctors would calm down a little so that we can talk about writing and gender without polarizing our debates.

Works Cited
How to cite this blog in MLA format: Deshaye, Joel. “The Confessions of a Sisyphean Prescriptivist and bill bissett Fan.” Publicly Interested, 4 November 2016, www.publiclyinterested.weebly.com.

We often talk about how privacy is “shrinking.” Consider these pieces in The New York Times (on tiny office spaces), The Harvard Business Review (on shareable data such as body metrics), and Slate (on the secrets of corporate “people”) as examples. We use this metaphor of space, one that can shrink or grow, to conceptualize privacy, but we rarely talk about “growing” it.
How do you grow privacy? “How do you grow a prairie town?” Robert Kroetsch once asked in a poem. His simplest answer was that “the gopher was the model,” because it could pop up and just as soon vanish. And if privacy is necessarily spatial, like a town, then, yes, I suppose it can come and go quite easily—or you come and go, and it stays wherever it is, sometimes where you might not find it again. If you’re one of the many teenagers who finally get their own room, you might lose it as soon as your parents have another baby. How do you shrink a private space? Easy: grow more people. And because space is finite and we can’t “grow” the space, not exactly (perhaps with the exception of a few built islands), you need to arrange for fewer people or for people who can’t claim it—thus war, colonialism, slavery, and real-estate bubbles or unaffordable housing. To oversimplify. But is privacy necessarily spatial? Two recent essays in The Walrus have been prompting me to think about this. One, by my friend Naben Ruthnum, is about thrillers and detective fiction and how these genres “reassure us that secrets are still possible,” even in the age of social media “when we can discover the unedited, intimate contents of millions of lives online” (70). The other, by Jonathan Kay, claims: “While pop culture continues to push the narrative that privacy is disappearing, the reality is very much the opposite: privacy protection has become a huge element of both engineering design and corporate branding in the technology industry” (26). According to Kay, our privacy is much better protected than we think, because multinational corporations such as Facebook and Microsoft are convinced that their businesses will grow faster if they have robust security protocols and privacy policies that let us believe we’re in good hands. For Kay, in the real world our secrets are safe, and only in the world of fiction do we really have to worry about private detectives, spies, and cat burglars rummaging through our underwear. But in both pieces, privacy is not so much a space as a feeling of security (this being the sense of privacy articulated after slavery in Dionne Brand’s answer to One Hundred Years of Solitude, At the Full and Change of the Moon) or a right to secrecy. While I was reading and re-reading The Walrus, I also happened to be reading the wonderfully bizarre At Swim-Two-Birds, a 1939 novel by Irish author Flann O’Brien that raises some of these questions about privacy. It’s one of the tallest of tales—a whopper you might say—in which an undergraduate writer composes a novel that involves Irish legends mingling into a cowboys-and-Indians narrative that crosses the path of a devil and a fairy. Said writer often escapes from his bullying uncle into his imagination, and his writing—as escapism—is really for him an escape into privacy. This is the opening sentence: “Having placed in my mouth sufficient bread for three minutes’ chewing, I withdrew my powers of sensual perception and retired into the privacy of my mind, my eyes and face assuming a vacant and preoccupied expression.” This line is followed by many other similar “retirements.” I’m fascinated by how physical and temporal it is; he’s chewing, and it’s for “three minutes.” It’s physical, but it’s also beyond “sensual perception,” as if it were meditation, as if he were a yogi. 
His mind might be a conceptual space (as it is in Phyllis Webb’s metaphor of the “glass castle” or Simonides of Ceos’s “memory palace” and his “method of loci”), but it is also out of space and time. In theory, then, your privacy can be as big as you can imagine it. Escapism is a management of the intrusions of the social world, the social world that is supposedly the real world in contrast with the world of fiction, illusion, or fantasy—whichever you prefer in this case. I don’t believe in this illusion vs. reality dichotomy. Our “real world” is absolutely full of illusion, fantasy, falsehood, deception, and error, and these make the world go round. Sometimes the only assurance is when you escape the social world into the mind, as when Descartes says, “I think, therefore I am.” Escapism is actually quite important, maybe more so than ever. It helps us minimize the social world, and it enables us to be a little more conscious and in control of the blend of fantasies in our lives—those of others (e.g., entertainment corporations, political parties, the “echo chambers” of social media) and our own. The social media networks offer privacy only so they can monetize your secrets for themselves. It’s your privacy but their property. Escapism can be a way out of this capitalism—if it’s not through more private property, or publishing, or buying video games or Game of Thrones seasons or any of a million other entertainments, activities, acquisitions, and options in general. Ruthnum’s essay suggests that fiction alleviates real-world anxieties, such as the homophobia surrounding the trial of Oscar Wilde, which was alleviated by the horror stories of his time (70). It doesn’t only create an anxiety for the reader’s enjoyment of suspense, and then relieve it by resolving the tensions of the plot. It doesn’t only pose a fictional problem and offer the fictional solution. Ruthnum’s most compelling observation is that many thrillers today are in fact “near-techless thrillers” (69). They are set before the Internet, or people don’t have their smartphones, or their equipment is broken. The “tech” is basically a spoiler; it stops a tense plot from developing. What if that’s the problem with our real world? The inverse of Ruthnum’s observation is that, in our tech-full lives—despite true threats such as cyberbullying—we are usually contending with our own banality. Although plenty of escapism is banal (e.g., most television, even today in its “golden age”), the thrillers that Ruthnum reads are not. The writer’s imagination in At Swim-Two-Birds is not. They are fictional solutions to real problems. A banal world is a small world, whether real or illusory, social or private. Growing our privacy might be simple: shrink the banality—the sheer boredom, the predictable behaviours, the conformism of body and mind. Set aside the phones and their clocks. Be unplugged and alone more often, but not by shrinking the world of real people. Don’t covet your neighbour’s house. Sometimes I feel that there is nothing more banal than a mortgage. Now if I could only stop bingeing on Game of Thrones...

Works Cited
How to cite this blog in MLA format: Deshaye, Joel. “How Do You Grow Privacy?” Publicly Interested, 17 August 2016, www.publiclyinterested.weebly.com.

Right now I want to help a bigger public than usual to understand literature, rather than try to add to what other professors know about William Shakespeare, symbol, and metaphor. For you high school, college, and undergraduate students finding this blog, the easy way to cite this post and avoid plagiarism is right here:
If you say, "Bullshit, I'm not curious," you're using a metaphor to call a statement excrement. If you say, "It's too complicated, so I'll probably never get it," you're using one too: the metaphor that knowledge is something you can "get," as if it were some new shoes. And you probably shouldn't believe either of these quotations, because they don't give you much credit, and metaphor is never literally true. My specific interest today is how metaphor interacts with symbol. You know what a symbol is, but I'm going to explain a little more about it, starting with Shakespeare's character Macbeth when he says (or when Patrick Stewart says it, playing Macbeth), Life’s but a walking shadow, a poor player That struts and frets his hour upon the stage And then is heard no more… (V.v.24-26) The first line here is a metaphor: life is a walking shadow. It’s an explicit metaphor because it spells out the basic formula for metaphor, A = B (life = walking shadow), which I learned from Trevor Whittock in his book Metaphor and Film. (There’s no harm in saying where you learned something.) Shakespeare follows up with an implicit metaphor: life = an actor (the "player... upon the stage"). It’s implicit because he doesn’t say “is” in that metaphor. Why does he need two metaphors to explain life? One answer, a short one, is that life’s not easy to understand. Another is that an actor walking in the spotlight on stage will cast a walking shadow, so Shakespeare is not so much adding a metaphor as he is extending the first metaphor. Let's return, then, to "life = walking shadow," A = B. Another way of explaining the formula for metaphor is to say, “this is that” (Frye 11) which I learned (as you already saw in the parenthesis) from Northrop Frye. The equal sign from above is equivalent to “is,” and that’s why we understand metaphor as an expression of shared identity instead of similarity. You probably heard that metaphor is a comparison that doesn’t use like or as. This explanation isn’t bad, but it’s not good, because comparison is what similes assert. Metaphors assert identity: that two things are the same thing. The verbs “to be” and “is” refer to being, and being is essential to identity. The verbs and the equal sign also suggest how specific metaphor is, compared to symbols, which usually have a bigger variety of meanings. But there's also a difference between a symbol and the category of symbolism (things that stand for something else). The category includes metaphor (because the A stands for B). Symbolism includes symbol itself. If you wonder how a category can contain itself, think of your parents. They are symbolism, and you are their child, symbol. But you also have a cousin called metaphor, and another called synecdoche, and another called metonymy. They’re all in the same family and can often be mistaken for each other, but they’re all different. So, the first line I quoted is a metaphor that contains a symbol: the shadow. How you could ever be the child of your cousin is way beyond my understanding of biology, so this is probably where metaphor breaks down—where, if you push it far enough, it doesn't make sense any more. But up to that point, metaphors can make sense. Let’s just focus on the symbol of the shadow. According to an often consulted book called A Glossary of Literary Terms, symbols can be traditional (also known as public or conventional, among other synonyms) or personal (or private or invented) (“symbol” 358), and as a traditional symbol the shadow is easily understood. 
It means death, transience, guilt—usually negative things. Here’s Shakespeare’s twist on it. No one interpreting a shadow is likely to say it means “life,” but Shakespeare does—“Life’s but a walking shadow”—and metaphor is the device that enables him to be so creative with a symbol. He also makes it walk, which animates the reference to death so that the symbol is not so gloomy (unless, of course, it’s the Grim Reaper). You might argue with me here, pointing out that the soliloquy is really depressing by the end, when Macbeth says that life is “a tale / Told by an idiot, full of sound and fury, / Signifying nothing” (V.v.26-28). Agreed: that’s dark. But, guess what? Shakespeare is dead, but lots of actors have cast a shadow on the stage or movie screen since he died, and many of them are saying his lines. That’s life! And when you notice how deft Shakespeare is with symbol and metaphor, you’ll probably agree that his words are signifying something, not nothing. Still, why would he use the word “idiot” to describe actors? He implies here that not only actors but also the people who write their lines are not only stupid but also wordy. (Wordy like the preceding sentence!) He is basically criticizing himself; it’s self-deprecation. But, funnily, not many of us think that of Shakespeare. He was very smart, and he knew it. Interpreted with this self-deprecation in mind, Macbeth’s gloomy speech can also be understood as a joke. Tragedy and comedy combined! Like that idea? Here are two final, slightly more advanced, ways of thinking about it. First, the joke in this metaphor is also what could be called self-reflexive metaphor or theatrical metaphor; he wrote it about himself and his experience in theatre. I mention it because Shakespeare popularized a whole tradition of how we understand the self (the actor on the stage) through theatrical metaphors. Second, think about liking. Think about all the “likes” on websites and social media that are there because a corporation wants to track our desires and simplify our expressions and interactions. Liking is about desire and about connection; fundamentally, it is the expression of a felt similarity between you and what you like: you like panda cubs because you value cuteness. This way of thinking is what similes are for. Metaphors are for when you are so obsessed with pandas that you feel as if you share an identity with them. You want to go live in a forest in China with them and protect them from hunters; your empathy is that powerful. Shakespeare wrote so often about actors and pretending to be other people that he was probably obsessed with them and, of course, acting was part of his career. Often, we relate to others through metaphor because we identify with them, as I learned from Diana Fuss in Identification Papers, a book my friend Mike Lee recommended. When you think of it, metaphor is probably more meaningful than any of us expected. It’s about life, your identity, and how you relate to others on the “stage” of the world, which you can affect through your performances, just like your favourite actors and musicians affect the world. And when you know more about metaphor, it reveals hidden or extra meanings about writers and their literature and culture, including their language, which might also be your language. If you’re curious now, check out a book called Metaphors We Live By, by George Lakoff and Mark Johnson. You’ll be surprised by how much of what we think is metaphor at work in our minds, without our knowing it.

Works Cited
How to cite this blog in MLA format: Deshaye, Joel. “Shakespeare’s Symbol within Metaphor.” Publicly Interested, 14 July 2016, www.publiclyinterested.weebly.com. (This post is the first in which I’ve switched from previous MLA formatting guidelines to those in the 8th edition, which is, sadly, going to make me reformat this whole site.)

Mainly because of the recent provincial budget in Newfoundland, where literacy rates are the lowest among all provinces in the country, around half of the public libraries are slated to close to save money. On the show Because News, Rick Mercer called the closures “an attack on literacy.” I don’t like military metaphors, but I agree in the sense that the closures are, like most wars, stupid. They seem highly unlikely to improve our collective intelligence. Maybe, as the library board’s chairperson reportedly said, we once had even more libraries but unsatisfactory literacy nonetheless. I can accept that libraries are not magical except to those who already love to read, but I would add that they are only part of the story of literacy, just as literacy is only part of the story of libraries.
On the same show as Mercer, Aisha Alfa said that we might not really need libraries anymore, because we can get a lot of books in other ways: e-readers, radio, the Internet. But Tom Henighan explains that, unlike music playing on the radio or streaming on the Internet, “it’s impossible to put a book in the background and turn it on” (qtd. in Mackey 10). Although I agree that literacy now requires competence in multimedia, Alfa’s suggestion doesn’t account for enough of the details. Yes, we can get a lot of books on the Internet, but the libraries under consideration (or lack of consideration) tend to be in rural and often remote areas where people tend to have less money and less Internet access—except in public places such as libraries. Another rejoinder is that reading online or on screen and reading a book are not the same. When you watch people reading on a computer (e-readers being an exception), you probably don’t see it as reading. They are just “on the Internet.” In Literacies across Media, Margaret Mackey writes that, in any situation, we can’t look at readers and “see what is going on inside” these reading minds (6). (I was drawn to Mackey’s work on the weekend because she was speaking at the Newfoundland and Labrador Book History Symposium organized by my colleagues, an event at which the library closures were often discussed.) In the case of online reading, they could be looking at anything—an online book, yes, but also the image of a chair to buy or a music video or, most likely, the mosaic of distractions that is the typical webpage. The problem with literacy here and probably anywhere is that, unaware of the pleasure of concentration, too many children don’t care to read, and, obviously, it’s not much fun if you can’t do it and haven’t seen others enjoy it. The conventional wisdom now is that having books in the home and reading to your kids are big boosts to literacy. Mackey explains: “We learn how to read in and through the company of other readers” (6). I also read an essay by Daniel Coleman recently, “Beyond the Book: Reading as Public Intellectual Activity,” that prompted me to realize that reading a book shows people an intellectual activity and promotes it through familiarity. We should read books to our kids, and we should read them around our kids. Yesterday, my partner and I borrowed a car and drove up to Pouch Cove, where we sat in the park across from the library, ate a snack, and admired the amazing view of the ocean and a massive strip of fog hiding the horizon. On some concrete wedges serving as a fence just below the view, some kids had painted images of the cove, including the same kind of fog. Although the library was closed (just at that time of day), we peeked through the window and saw a long set of shelves filled with children’s books. The library also had a large health and wellness shelf, plus a big fiction section that included not only Danielle Steel but also local authors such as Elisabeth de Mariaffi. There was a desk in the centre of the room where a librarian could welcome people and help them find books and information, and—crucially—probably also serve as a liaison with the community centre upstairs. This is the crux of so many public spaces: you can go there and benefit from them without having to be screened by a bureaucrat, asked prying questions, reminded of what you need that you might not be able to supply for yourself.
On the weekend, we were having brunch with some friends, and they gave lots of examples of what libraries offer. They offer librarians (experts on how to avoid the problems of reading on the Internet, for example), warmth in winter, air conditioning in the heat of summer, safe space, public washrooms, Internet access, and books—and people from all walks of life go there, at least in my experience. When you learn at a library, you learn how to be more self-reliant with information and knowledge. You can go there with dignity, even gain dignity (e.g., a sense of pride in learning). When people rightly criticize the proposed library closures as likely to hurt poorer people more than richer people, we should all remember how it feels when our pride, self-respect—dignity—are hurt. The individual consequences, of course, are often badly rationalized by monetary savings in the grand scheme of things. The proposed library closures also coincide with a new tax on books, the only one in the country, so that the government can not only save, but also make, money. But Chad Pelley at The Overcast seems absolutely right in his prediction that the tax will deter book buyers and thereby raise little money through book sales—“10% of nothing is zero” (6)—while damaging the publishing, distribution, and local retail industries. For my own part in the grand scheme of things, I fear that I am not going to enjoy my teaching as much if or when I am teaching literature to people whose literacy has suffered partly because of the new book tax and proposed library closures. But, sad consolation, I will probably not have to teach many of them; they will be less likely to meet entrance requirements, regardless of our oft-mentioned special responsibility to the people of the province. The accessibility of the university will worsen. The university will be, for people who might once have benefited from a public library, another closed door.

Works Cited
How to cite this blog in MLA format: Deshaye, Joel. “The Dignity of the Library.” Publicly Interested. 10 May 2016. Web. [date of access]

If you were my downstairs neighbour and I put my stereo speakers on the floor, cones down, and pumped up “Smells Like Teen Spirit,” you could call the police and I could get a ticket.
If we were at a restaurant, together in the room but not “together,” and I was wearing too much cologne, there would be no similar recourse that I know of. Why not? Like music, scents can invade one’s personal space. Although you might not especially want your perfume or cologne to be smelled by everyone, most scents are intensified by synthetic chemicals so that many people can smell them whether or not they are your intimates. Perfume and cologne can expand the wearer’s personal space. They are on clothes, hair, and especially skin, and suddenly the skin can be detected anywhere in the room. By phenomenological magic, they make the wearer a giant. They are a claim on space. As with a flag planted on a hilltop, a scent says, “You will always be able to notice me.” Unlike the flag, the person wearing the scent doesn’t have to be seen to be noticed. It’s a sign that can point at (seemingly) nothing, so, in the hallways on campus or in office buildings, I can routinely smell fragrances worn by people who are no longer there. Unlike music, a fragrance can trigger asthma attacks, headaches, and dizziness (“Go” par. 7). In my view, or in my nose, it is a bad sign. Except when used sparingly, perfumes and colognes redefine public space. You can’t look away from them. They replace the discourse of speakers and listeners with nonverbal messages, each one loud, like a cry. In an Althusserian sense, a fragrance is a hail that provokes an ideological recognition or compliance. I’m serious; like a person’s fashion and couture, a fragrance has meanings related to peer groups, cultural influences, and identity politics. One way of reading the bad sign as an ideological message is that any space is or was open to colonization (a message we descendants of settlers in the West recognize instantly, if not consciously); even the air can be commodified with a branded fragrance such as “Obsession” or “Chanel No. 5.” Whatever the nuance of a perfume or cologne, it always says, “smell me.” But it’s very different from the scents in Michael Ondaatje’s poem “The Cinnamon Peeler.” In this poem, the intoxicating and sensual smells evoked are only in the imagination. They are erotic because you can’t touch the body in the poem. In real life, fragrances often over-deliver. I have been turned on by the occasional perfume, if it teases. But fragrances are often just constantly in your face. Even if you can’t touch the body, you can become numb to its attractions, and your other senses can be overwhelmed—especially taste, which relies so much on smell. Last week, I was in Montreal again, and we splurged to go to my favourite restaurant, P’tit Plateau—but the experience was not what it could have been. It was not because of the food, which was excellent as always. It was not because of the company at our table, because everyone was wonderful even when I became a grump. The problem was with some of the other company at a different table, specifically a guy who came in with his girlfriend and promptly stunk up the place. My train of thought went like this:

1. If my evil eye could kill, he’d be dead right now.
2. Relax, the scent is already gone—no, it’s back.
3. I can’t taste the celeriac. My wine is not from Cologne.
4. Maybe I could ask the waiter to talk to him.
5. Maybe I could leave with a doggy bag of food.

I stayed, but I quietly complained to my friends. Later, on the street after having chugged some more wine, I was more vocal.
My friends, one a Parisienne—the French being associated with the finest perfumes—and one a lover of his cologne, objected. So did my partner. These were their reasons:

1. Your nose is sensitive. We couldn’t smell him much.
2. He might have put on too much without time to clean up.
3. We have to tolerate people’s differences in public.

I’m not certain that I recall #3 exactly, but it was a message of tolerance, and I accept that. Mostly. I do, however, believe we are justified in being intolerant of harmful behaviours. (How intolerant? Possibly more so than I was, given that I stayed and didn’t say anything to the waiter or the cologne-wearer.) People who smell a little like the type of food that they eat should be tolerated, because food is necessary and nutritious (other edibles being worthy of disqualification as food, according to Michael Pollan). People who smell like gasoline because they work at a gas station should be tolerated because they need to work for a living. (My grandfather sometimes smelled of engine oil—and the tobacco that he called “snuff.”) I wouldn’t be bothered if they cleaned up before going fine dining, though. I’ve already suggested how, in theory, wearing a scent can be harmful, but I would like to substantiate the claim so that I don’t appear to be sticking my nose in other people’s business for no reason. First, let’s consider the potential scope of the harm. When I talk about scents, I mean not only colognes and perfumes but also all the fragrances added to moisturizers, hairspray, aftershave, candles, anti-static sheets, soaps and detergents, deodorant (an ironic term if there ever was one), etc. They are everywhere, and that’s part of the problem. Other people’s business. Here’s where perfumes, colognes, and other fragrances become really interesting. Yes, I admit that fragrances can be interesting on their own. Michelyn Camen of ÇaFleurBon blogged to lament the “repercussions of ‘anti-perfumism’ on our Art,” and while I hesitate to attribute the term “art” to everything, especially with a capital A, I have no doubt that some people—artisans, maybe even artists—can create remarkable and meaningful effects with fragrance. Mother Nature does it too, and we can accentuate nature. I marvel at the complexity of coffee, which can be grown and cultivated to enhance its enjoyable qualities. So, business. In Canada and the United States, if not elsewhere, fragrance is a big business. It’s pharmaceutical. As with any corporation that uses science to create proprietary formulas, fragrance manufacturers want to keep the secrets of their products. Governments have mainly allowed corporations to continue as is, despite the finding that “typical fragrances can contain between 100 to 350 ingredients” (“Scents” par. 7). On these long lists are substances such as “carcinogenic ‘hazardous air pollutants’ (1,4-dioxane, acetaldehyde, formaldehyde, and methylene chloride), which have no safe exposure level, according to the U.S. Environmental Protection Agency” (Steinemann par. 5). The Canadian Lung Association focuses on diethyl phthalate, which is an allergen and reproductive toxin. Anne Steinemann, contributing an essay to the David Suzuki Foundation, has expressed concern about the lack of regulation (par. 6). Why, if we are concerned about interactions of prescription drugs, are we not worried about fragrances, which are like drugs in that we ingest them through our noses and skin, and they modify our body chemistry (as food does, of course, but not harmfully)?
The situation is reminiscent of the governmental relationship with the tobacco industry before the widespread restriction of smoking in public, with the exception, of course, that fragrances have not been linked with millions of deaths. Cigarettes are obviously much worse. I remember coming home from the bar (where I was neither a smoker nor much of a drinker, in those days; I was the designated driver), and my clothes would stink up the apartment and could transfer smells to the furniture and carpet. Those days are gone, here at least, though several of my older relatives have died or are sick because of their smoking. In our newly healthier environments, Marilee Nelson calls fragrance “the new secondhand smoke.” I would not be surprised if we can one day (if not already) correlate illnesses such as cancer with low-dosage interactions between high numbers of chemicals, or simply with the carcinogens already proven to be in many fragrances. When I expressed some of these concerns to the manager of health and safety at one of the universities where I worked previously, he said that he wouldn’t support a no-scents policy because fragrances don’t bioaccumulate and are therefore not harmful, and because some people like the smell. Yes, and some people like to smoke cigarettes. And, in fact, some chemicals in fragrances do bioaccumulate, according to research in the United States (re-reported by the DSF) that found that 70% of umbilical cords contain synthetic musks (“Go” par. 8). In other words, we keep the synthetics in our bodies and can transfer them to our fetuses and children. They’re also building up in the Great Lakes and in the fish that live there. If nature is the ultimate public space, we are marking it with scents as no other animal has ever done.

Works Cited
How to cite this blog in MLA format: Deshaye, Joel. “‘Smell Me’: Scents and Public Space.” Publicly Interested. 29 Apr. 2016. Web. [date of access]