Radicalization, Responsibility, and Control: The Islamophobic Rhetoric We Use to Talk About ISIS

بسم الله الرحمن الرحيم

In the Name of Allah, The Compassionate, The Merciful

Fifteen years after the 9/11 attacks, studies show that Islamophobia is worse than ever. One reason that remains under-examined is the manner in which we use Islamophobic rhetoric to talk about ISIS. Rather than doing anything to actually “defeat ISIS” (to quote several politicians), this rhetoric serves only to normalize anti-Muslim racism.

First off, it is worth noting that Islamophobia is a confusing term. It insinuates a “phobia,” or fear, of Islam and, by extension, Muslims. However, reducing Islamophobia in such a simplistic manner does not address the systemic and historical roots of anti-Muslim aggression — namely, that it stems from colonial discourses of white supremacy. Put simply, Muslims are technically a religious demographic; but Islamophobic rhetoric has racialized Muslims as Brown people belonging to a pre-modern civilization that is inferior and subordinate to the West. This casts a wide net of who actually experiences Islamophobia: from non-Muslim Arabs, signified by the murder of Khaled Jabara by a neighbor last month, to Sikh men who don the turban for religious reasons. This is why Jaideep Singh goes so far as to argue that the term Islamo-Racism should replace Islamophobia, while others prefer the term “anti-Muslim racism.”

When this rhetoric of Muslims’ inferiority is used to shape our understanding of ISIS, all it does is instrumentalize ISIS to rationalize structural anti-Muslim racism. This was demonstrated well during the Republican and Democratic national conventions this summer, where Muslims were portrayed either as ISIS affiliates or as individuals willing to work with the state to defeat ISIS-like radicalization. In his RNC speech, for example, Donald Trump framed his ambition to “defeat the barbarians of ISIS” by strategically placing the attack in Nice, France, alongside a myriad of other examples in the U.S., including the 2001 attack on the World Trade Center and the recent Pulse nightclub shooting in Orlando, FL. In doing so, Trump spoke through a false dichotomy wherein the victims of ISIS are uniformly Western, and marked the aggression as a coherent, linear brand of “Islamic terrorism.” In proclaiming this violence as inherently and solely “Islamic,” however, his rhetoric leaves no room to address that Muslims are actually the primary victims of ISIS. July alone, for instance, witnessed one of the deadliest attacks in Baghdad, Iraq since 2003, killing over 250 (predominantly Shia) people in the Karada shopping district; one of the deadliest attacks in Kabul, Afghanistan since 2001, killing over 80 Shi’i Hazara Muslims; and an attack in Istanbul, Turkey killing over 40 people.

Donald Trump speaking at the 2016 RNC

Unlike Trump’s RNC speech, which promoted an “Islamic radicalization versus the progressive West” narrative, the DNC fueled a classical “good Muslim”/“bad Muslim” narrative in which the “good Muslim” was one eager to cooperate with the state to defeat radicalization. Bill Clinton stated this point blank by proclaiming, “if you’re a Muslim and you love America and freedom and you hate terror, stay here and help us win and make a future together.” Several Muslims immediately pointed out via social media the manner in which Muslim loyalty was being put on trial. It may also be worth noting that Hillary Clinton never mentioned the word “Muslim” in her speech, and that the only time Trump did was in reference to the Muslim Brotherhood. This means that Muslims were invisible at both conventions, unless and until terrorism and/or militarism was brought to the forefront.

What the language in the speeches demonstrates is that anti-Muslim racism in today’s ISIS-hysteria-ridden, post-9/11 context paints Muslims as if they have been infected by this thing called “Islam.” If it is not surveilled and monitored properly, the infection will turn into the virus we now call “radicalization”; with proper care, however, it will lie dormant. This was particularly clear in the fear that Trump spurred through his invocation of “radical Islam,” and in Bill Clinton’s plea that Muslims stay in the U.S. so long as they fulfill their “responsibility” to fight terrorism (insinuating that Muslims would have insight into terrorism by virtue of being Muslim).

What’s more, the entity responsible for controlling the “radicalization” outbreak is the State itself. That is, the federal government has taken it upon itself to cure the nation of this illness by enlisting the help of the U.S. public, and especially Muslims and Arabs themselves, to participate in the surveillance process. Anti-Muslim rhetoric, then, reinforces the need for control and surveillance through the fear of radicalization.

Muslims demand equal rights in a 2013 U.S. protest.

The language of fear, responsibility, and control that ran throughout the convention speeches has material consequences that structure anti-Muslim racism, particularly through state and federal policies. These include (but are not limited to):

  1. Support for surveillance programs such as the “Shared Responsibility Committees,” which ask counselors, teachers, and community leaders to help the state identify individuals who have been potentially “radicalized.” These rely heavily on the use of “suspicious activity reporting,” whose methods, some claim, are questionable and ineffective.
  2. Rationalized forms of racial profiling, particularly through the use of community informants. This, arguably, encourages Muslims and Arabs to engage in the State’s work of criminalizing their own communities. Since the informants are community members, the federal government can evade the accusation of direct racial profiling.
  3. The prevalence of anti-Shari’a legislation in nine states, even though Shari’a has never been practiced in U.S. court systems.
  4. The stigmatization of refugees seeking asylum simply on the basis of ethnic and faith background, as indicated by state governors’ attempts to deny refugees asylum — even though they do not hold that power.
  5. The emboldening of policies such as the Countering Violent Extremism Act, which may make innocuous acts of Muslim worship appear suspect, thereby further criminalizing Muslims for merely observing their faith. The CVE website “Don’t Be a Puppet” reflects the manner in which not engaging in acts of surveillance renders you “a puppet” who may, one day, be responsible for the presence of radicalization within our country.

As a necessary point of clarification, I am not trying to argue that nothing should be done or that “extremism” doesn’t exist – as a Shia Muslim woman, I belong to one of the groups most targeted by ISIS. However, the “countering extremism” measures described here promote structural anti-Muslim racism by criminalizing Muslim communities, and they render Muslim victims of both ISIS and structural anti-Muslim racism completely invisible. This invisibility is enhanced through the rhetoric of radicalization, responsibility, and control.


On Conspiracies and Cucks: The Rhetoric of the “Alt-Right”

This post was co-authored by Kevin Musgrave (UW-Madison, Communication Arts) and Jeff Tischauser (UW-Madison, Journalism and Mass Communication).

The brash new wing of the conservative movement, the so-called “Alt-Right,” has drawn public attention and ire, with Democratic Presidential nominee Hillary Clinton condemning the group in a recent campaign speech in Reno, Nevada. Despite this recent publicity, questions abound. Who exactly are the “Alt-Right”? How do they differ from other conservative groups, and what are the defining characteristics of their rhetoric?

Clinton condemns the “Alt-Right” in Reno

Though many prominent media outlets including the New York Times, Salon, the Daily Beast, as well as media watchdog groups FAIR and Media Matters have published pieces on the group, a solid conceptualization of the “Alt-Right” remains elusive.  Emerging from these pieces, however, is a list of common characteristics that may allow us to articulate the defining communicative and rhetorical norms and strategies of the “Alt-Right.”

The “Alt-Right” is often defined with and against the development of the New Right, a nebulous conservative movement represented by the rise of Barry Goldwater and Ronald Reagan. Blending fiscal and social conservatism with a strong military presence and foreign policy, the New Right offered a means of fusing what rhetorical scholar Michael Lee calls the conflicting dialects of traditionalism and libertarianism that constitute the political language of conservatism in the United States.

Yet, if Reagan has become synonymous with the fusionist message of the New Right, the “Alt-Right” has emerged within a conservative vacuum that has seen the Republican Party, post-George W. Bush, struggle to create a message capable of unifying traditionalists and libertarians alike. Indeed, what the “Alt-Right” appears to be doing in its rhetoric is actively delinking these two dialects, re-articulating an extremist traditionalist message, and separating the language of conservatism from the Republican Party.

Manifesting primarily in online forums such as 4Chan, Reddit, and RadixJournal, the “Alt-Right” consists mainly of 18- to 35-year-old white males who are, as their leader Milo Yiannopoulos claims, “young, creative and eager to commit secular heresies,” through the creation and circulation of openly racist, sexist, and nationalistic memes.

The usage of these memes is a way of signaling belonging to the group by demonstrating a fluency in the “Alt-Right” vernacular.  The memes are marked by the conspiratorial style of a white genocide narrative, an ironic deployment of racial tropes, protectionist rhetorics of white tribalism, and a European Anglo identity politics premised on racist pseudo-science.

In advancing a return to tribal politics, the “Alt-Right,” as Jack Hunter of the Daily Beast argues, defines itself against the radical individualism of the libertarian dialect as articulated by conservative firebrands such as Goldwater.  Denouncing individualism in favor of a radical traditionalism that calls for a return to communal, authoritarian, and hierarchical politics premised on racial difference, the “Alt-Right” abandons the religious metaphysics of Russell Kirk, Richard Weaver, and other traditionalists in favor of a racial science that justifies a disdain for egalitarianism and democracy.

One prime example of how those in the “Alt-Right” use memes to circulate these messages is the term its members commonly use: “cuck.” The term rhetorically defines the “Alt-Right” in opposition to establishment Republicans–“cuckservatives”–whom the “Alt-Right” maligns as neocons and liberal Republicans pushing an internationalist agenda that threatens the white race.

In the satirical depiction of The American Conservative magazine below, internet users have fashioned the publication as a cuckservative mouthpiece promulgating the death of the white race in efforts to achieve social justice. Advancing a conspiratorial narrative that views immigration, inclusion, assimilation, and diversity as affronts to white masculinity and racial purity, the cuckservative is one who has emasculated himself, become a traitor to his race, and permitted the emergence of a white minority population and a colored body politic through liberal policies that advocate for pluralism and equality.

A satirical depiction of The American Conservative as a “cuckservative” publication

Engaging in racist pseudo-science to support claims of white supremacy, the “Alt-Right” not only biologizes racial difference but also uses hereditary and cognitive science to argue against egalitarianism.  In this way, the values of the Enlightenment philosophy of classical liberalism, heralded by the libertarian right, become anathema to core “Alt-Right” tenets of communal, tribal belonging, racial hierarchy, and authoritarianism.  In redefining conservatism this way, the “Alt-Right” is imagining conservatism as an Anglo European identity politics and mainstreaming core tenets of white, authoritarian nationalism in popular discourse.

Enter Pepe the frog, a character previously associated with #gamergate, anti-Semitic attacks on journalists and activists, and the men’s rights movement. Pepe plays to members of the white in-group who understand the joke for what it really is: a call to action. In this sense, Pepe is the wink after the racist joke. The rhetorical power of Pepe, like that of the racist joke, is that it lets its purveyors escape with plausible deniability. The ironic detachment that emerges in Pepe’s history helps to conflate intention with effect, allowing users to distance themselves from his often racist connotations. Rendering Pepe in Hammerskin Nation-like attire, covered in blood, carrying guns used by Nazi SS Stormtroopers, is, in this view, not racist or disrespectful but rather an irreverent way to shock and disrupt PC culture. Sharing Pepe memes allows members of the “Alt-Right” to espouse their “fuck your feelings” politics, distancing themselves from liberals and mainstream conservatives through vitriolic rhetoric.

Trump rendered as Pepe
When Trump tweeted an image of himself as Pepe in October 2015, tagging @BreitbartNews and others, with the message, “You Can’t Stump the Trump,” he rhetorically positioned himself as the Presidential candidate of the “Alt-Right.” As a candidate whose views on American exceptionalism, immigration, and anti-PC culture resonate with the message of the “Alt-Right,” Trump stands as a figure capable of making white nationalist ideas a political reality.

Trump thus represents the power to create a sovereign nation state that protects white men from perceived economic and cultural threats. However, Trump stands more as a vehicle for the “Alt-Right” ideology than its driver. Even as Trump’s appeal appears to diminish among conservatives, his core “Alt-Right” constituency, aided by an array of “Alt-Right” media outlets and dedicated meme warriors who troll Reddit and 4Chan, suggests that the “Alt-Right” as a political force is not going away any time soon.


What’s in a Name? The Rhetoric of Bernie Sanders’s “Political Revolution”


Coming off a solid Super Tuesday, the Bernie Sanders campaign is not going down without a fight. Championing a platform premised on economic justice and opposition to corporate power, Sanders echoes the Progressive reformers of a century ago. Yet, despite these similarities, there are significant differences as well. Perhaps most notable is Sanders’s rhetorical positioning of himself as the leader of a “political revolution” rather than as a reformer. However, what about his campaign is revolutionary? Can politics be revolutionary?

If we can think of revolution as a populist, non-institutionalized movement that seeks to bring fundamental change to the current political system from outside of the established political order, then can electoral politics be revolutionary?  Can something that exists within the system bring about revolution? Or, to cast Audre Lorde’s famous words into a question, can the master’s tools bring down the master’s house? Can something that is inside also be outside?

Let’s begin by looking at Sanders’s positioning of himself as a political outsider. Many in the Clinton camp have questioned how Sanders can defend such a position. After all, Sanders has represented Vermont in Congress for nearly thirty years, and before that he was the Mayor of Burlington. Rather than standing opposed to the establishment, Sanders looks very much like part of the system he condemns.

However, refusing to take money from corporations, PACs, Super PACs, or vested political interests, Sanders has fueled his campaign primarily with small, individual donations. This is one of the most impressive facts of the campaign, and it is certainly unprecedented in an electoral system that is largely pay-to-play. By highlighting this aspect of his campaign, with its clear relation to his policy platform, Sanders justifies the outsider label, portraying himself as standing outside the political status quo.

This raises an interesting dynamic.  In this light, both positions seem to be correct to some degree. While Sanders is the consummate career politician, he is also refusing to be bound by the structures of the current campaign finance system. Thus, Sanders is both insider and outsider, being at once within and exterior to the political establishment.

Let’s dig a bit further. On February 28th of this year, U.S. Representative Tulsi Gabbard of Hawaii, the now former vice chair of the Democratic National Committee, resigned her position in order to endorse Sanders. When interviewed about her decision, Gabbard claimed that she was warned by DNC officials not to break from Clinton in support of Sanders. Her resignation, and the pressure she felt to back Clinton, raises interesting questions regarding Sanders’s outsider status and his claims of a political revolution. Not only is Sanders outside of the current campaign finance system, he also appears to exist outside of the DNC’s official position. While Sanders has long been allowed to exist as an independent voice in the Senate, there is now hesitancy on the part of the Party to allow Sanders’s particular brand of democratic socialism (a topic worthy of its own essay) to be its official platform. His views clearly seem to differ from those of a majority in the Party. This is Sanders’s establishment.

Coming to the larger question of revolution we now need to understand the attempts to rhetorically align politics proper with revolutionary aspirations. Many in the media have questioned the viability of Sanders’s “revolution,” pointing to its colorblind policies, overwhelmingly white voter base, and sexist supporters (certainly all interrelated problems) as inherent limitations to his vision of socialist politics.

While there have been ardent critics of Sanders’s relative silence on matters of racial inequality, from Black Lives Matter protestors at the Netroots conference and in Seattle to responses to the alleged “English only” comments at a recent rally, others have found in his platform room for a more inclusive revolution. Individuals such as rapper Killer Mike, Cornel West, Michelle Alexander, Ta-Nehisi Coates (who has been quite critical of Sanders), and Erica Garner have endorsed Sanders while continuing to push him on race. This support, coupled with his history of civil rights activism and his condemnation of the Bernie Bros, seems to demonstrate an attempt to craft a revolution across multiple axes of identity, including race, gender, and class.

Through the debates regarding Sanders’s platform, I think we may begin to draw out an understanding of political revolution as occupying a paradoxical position that seeks to reshape established political institutions from both within and without. I believe that it is this liminal space that is the site of Sanders’s political revolution. By fundamentally altering the current structure of campaign finance and the DNC itself, as well as seeking to craft a coalitional movement across identity and difference, Sanders seeks to change politics from within while playing outside the bounds of the norms of political culture.  Sanders seeks to be within their world but not of it.

However, this raises important questions for leftist politics. Is such a position truly viable? Does the usage of the label “revolution” do damage to more radical positions by appropriating the term? Or, to the contrary, does inoculating the public from some of its more radical connotations allow for a more nuanced discussion of non-traditional policy positions by broader publics?  What are the limitations to Sanders’s positions on race? Additionally, how do we reconcile this label with Sanders’s platform?  Espousing largely a nationalist agenda, how can we reconcile the label of revolutionary with his positions on foreign policy, specifically with regard to Palestine?

Certainly these are only some of the important questions raised by Sanders’s campaign and by the paradoxical idea of a political revolution more broadly.  The answers to these questions are not simple. They may likely exist in the liminal space of a “political revolution” itself.

 


What Were They Thinking?: Selling Rocket Mortgage in a Post-2008 Economy

“Push button. Get Mortgage.” That’s the tagline of Quicken Loans’ new product, Rocket Mortgage. The company introduced Rocket Mortgage to the massive television audience watching Super Bowl 50 in a one-minute commercial. The ad describes a simple push button app that allows people to get mortgages on their phones, which would lead to a “tidal wave of ownership [that] floods the country with new homeowners who now must own other things.”

After it aired, there was an eruption of criticism for the piece, titled “What We Were Thinking.” Here is a sampling of the next day’s headlines: Rocket Mortgage Super Bowl Ad Criticized for Encouraging Another Housing Crisis, Is that Quicken Loans Super Bowl Ad an Omen of Another Housing Crash?, Quicken’s Rocket Mortgage Super Bowl Ad Sparks Backlash, and Everything That’s Wrong With the Super Bowl’s Worst Ad. Tweeters, bloggers, journalists, and the Consumer Financial Protection Bureau noted the parallels of the promise of quick mortgages to the flurry of irresponsible mortgage-selling activity that caused the 2008 housing and financial crisis. Meanwhile, Quicken defended its product as one that ensures “full transparency.”

 


The advertisement for Rocket Mortgage is visually compelling and calls upon tried and true cultural tropes. Why then, was it not persuasive? Let’s take a closer look:

“What We Were Thinking” has a strong, consistent cadence throughout. The guitar in the background music provides a driving beat. Even when other sounds are layered on top, the underlying sound remains the same – and it’s one that creates a feeling of both urgency and forward motion.

The ad’s narrator does the same thing with her words. She strings together a chain of questions, each beginning with the word “and.” Each question appears to follow from the previous, and together they build a progress narrative:

And if it could be that easy, wouldn’t more people buy homes? And wouldn’t those buyers need to fill their homes with lamps and blenders and sectional couches with hand-lathed wooden legs? And wouldn’t that mean all sorts of wooden leg making opportunities for wooden leg makers? And wouldn’t those new leg makers own phones from which they could quickly and easily secure mortgages of their own, further stoking demand for necessary household goods as our tidal wave of ownership floods the country with new homeowners who now must own other things. And isn’t that the power of America itself?

It’s possible for someone to argue with the premises of any of these questions or with the conclusions they imply, but the narrator does not pause between them to allow this. Instead, her cadence paired with the linked questions fosters a sense of inevitability.

It might seem strange to end this series with “And isn’t that the power of America itself?” but the progress narrative is so key to the American mythos, and so reliant on notions of inevitability, that it makes sense here. In response to criticism of the ad, Quicken’s chief marketing officer said, “I think that everyone is realizing it’s time for the housing industry to advance.” Quicken Loans is not just selling houses; it’s selling a very particular understanding of progress, one rooted in consumption, accumulation, and ownership. Indeed, the American Dream, our leading progress narrative in the U.S., is often defined first in terms of ownership, via “a house with a white picket fence.”

All of this is accompanied by a sense of “no big deal.” The ad’s narrator begins with, “Here’s what we were thinking,” and ends with, “anyway, that’s what we were thinking.” This nonchalance makes the narrative sound like common sense. The mundanity of the activities depicted in the commercial contributes to this feeling: people at a movie, people doing their jobs, people at an exercise class, a woman carrying her baby in the kitchen. These are everyday activities for everyday folks. They are common. In the Quicken ad, progress is common sense. Ownership is common sense.

Given these well-worn tropes, one would expect the viewing public to be on board with Rocket Mortgage. But here’s the thing: the message doesn’t match the audience’s experience.

Early in the commercial, as the app promises to “turn an intimidating process into an easy one,” a magician appears on the screen with fireworks and a woman sawed in half. This visual misstep serves as a reminder that what appears to be easy, what is advertised as common sense, is often an illusion. There is not a single scene in which people look at each other during the ad, apart from the one containing the magician, who looks directly at the viewer. It is as though he is trying to hypnotize the audience into buying what Quicken is selling.


In all of the other scenes, people only look at their phones. The household goods that mark progress throughout the piece spin around in the air, but people do not interact with them. This tells us a lot about how Quicken sees its audience – as consumers, but not as community.

The trouble for Quicken is that most Super Bowl viewers watched their communities crumble over the course of the 2008 housing crisis and beyond. They watched as banks were bailed out while they, their friends, and their family members were left jobless and houseless. History is wont to repeat itself, but the backlash to “What We Were Thinking” shows that our collective memories last at least eight years. It may also suggest that “the people” and the banking industry now have very different ideas about what it means to progress.


Sensuous Communication and the Disciplining of the Woman’s Body


Where I work is just over a quarter mile from the bus stop I use every day. I notice roughly the same people—a mother jogging with her baby in a stroller, the man selling the local Street Pulse newspaper, the same hurried faces on their way to work as they pass the library and a Starbucks behind the bus shelter where I wait. It’s not uncommon to be smiled at or briefly talked to by these strangers—some more innocuous than others—but this Tuesday afternoon was different.

I was wearing my bulky winter coat, wrapped in a scarf, hat, and mittens. A man rushed into the bus shelter where I was sitting and spotted the only visible part of my body that could be considered remotely sexual that day.

“Can I tell you something?” he blurted out. “Those boots are beautiful. Do you know that you have some really stunning boots on right now?”

Taken aback by the peculiarity of his comment, I gazed up from my phone, looked at him and responded curtly, “Thanks.” That was the last time I looked at his face. He had red hair and wore glasses and a backpack.

“I’ve seen you before, haven’t I?” he said, smirking, with a hint of titillation.

I waited just a few seconds. “Nope,” I replied tersely.

“Oh, I’d never forget a face like that. I know I’ve seen you before.”

I huddled over my phone while staring intently into whatever happened to be on the screen at the time. My hope that he would read my disregard and lose interest in me quickly dissipated in what happened next. He followed up with one of the most bewildering, most troubling, most invasive things ever said to me by a stranger in public: “I want to fornicate with you.” That’s right, fornicate.

I sat there with two options: I could object, defend myself, and risk an altercation that might escalate into something physically dangerous, or I could continue looking at my phone, crouching my body in an attempt to ignore him while appearing unfazed by his comments. I chose the latter option. He continued to tell another bus rider, a younger man who had just approached the shelter, how he wanted to fornicate with me and that “we’d make beautiful children.” The other rider was more generous than me—acknowledging and responding somewhat apathetically to his outlandish remarks. Just before he left, he told the other bus rider to take care and that he “respected” him because that man looked him in the eye, when “this woman wouldn’t.”

Sexual violence pervades everyday experience, and on this day I encountered it in a crass yet commonplace way. When women choose to protect their bodies, they simultaneously confront the choice to silence themselves. In silencing myself, I protected my body; in silencing myself, I failed to challenge these sexually violent norms, because the risks of doing so are high.

Although the media has called attention to the problem of sexual violence and the ways it manifests in various legal and institutional contexts, including on college campuses and in workplaces and civic spaces, sexual violence persists, tainting women’s most mundane social activities and movements. Such cultural norms discipline a woman’s body to act and move in ways geared toward protecting her in public. That is, the woman’s body is trained by a sexually violent society as it moves through public, private, social, academic, and professional spaces, even places like a bus stop frequented daily.

Young women—young girls—quickly learn how non-consensual exchanges take place between their bodies and others in public. The body becomes a topic of conversation, an invitation for unsolicited “compliments,” and most unsettling, a battleground for physical trespass. This is the unspoken education, the social conditioning women receive to stare at their cell phones or avert eye contact when merely moving in public.

The outside world perceives women’s bodies and imbues them with insecurities and unwanted shame. A woman’s body is quickly read and coded as sexual, risky, available, abnormal, fat, thin, disgusting, desired, and worst of all, not her own. Because of this, women are vigilant about their bodies in public, and we operate under persistent unease about and suspicion over the body in public: Who will touch it? Who will call it out? Who will claim it? And most egregious, who will violate it? To stave off undesirable answers to these questions, women then operate under the direction of don’ts: Don’t walk alone at night. Don’t wear “slutty” clothes. Don’t make eye contact. Don’t drink too much. Don’t ask for it…because it will be your fault.

A rape culture has rendered a woman’s body in a state of constant defense; it has disciplined a militant, confused, and even shameful relationship between women and their bodies. Female bodies sensuously communicate through protective movement to ward off potential spoken and physical violence. We crouch over, position headphones in our ears, avoid eye contact, stare blankly at cell phones, navigate alternative paths, cover our breasts, legs, necks, and feet, deciding whether or not to engage when called out in an effort to police our bodies in public. These are the silent narratives whispered to women’s bodies that disgust me but ones that I follow.

 
While sexual assault implies a legal category with judicial responsibilities, sexual violence remains a cultural problem—our culturally inherited baggage that maintains a deep legacy of disbelief and blame. What happened at the bus stop is symptomatic of a larger problem of sexually violent norms; sexual violence has become so perniciously normative that it works to discipline women’s bodies. At a moment when society grapples with its legal and institutional responsibility to victims of rape, I ask us to consider, too, how movement reveals sensuous communication rhetorically inflected by a rape culture—communication that calls on women to deflect the risk of violation through bodily movement.

 


A Public Sphere Without a Public Good

Scholars have long viewed the public sphere, among other things, as a realm for coordinated action. In The Structural Transformation of the Public Sphere, Jürgen Habermas recounted how the bourgeoisie developed a sense of themselves as a collective subject and asserted the public sphere as the locus of political authority. In The Public and Its Problems, John Dewey explained that publics arise from recognition of their implication in the consequences of human activity. Publics organized to address these consequences and pursue their interests purposefully, rather than reacting haphazardly to societal developments. In her reflections on publics, Hannah Arendt called people’s coordinated activities the constitutive power of human relationships. She wrote that “power springs up between [people] when they act together and vanishes the moment they disperse.”[1]

Arendt’s reference to power “vanishing” underscores the contingency of the public sphere, intimating its embeddedness in wider systems and societies. Action in the public sphere may change society, but society may change the public sphere. These changes may include the assumptions and values that inform coordinated action in the public sphere. In this spirit, a primary assumption is the possibility of coordinated action itself. Scholars have long believed that this action is consequential because when people act together, their activities amount to more than the sum of their individual efforts. Public action transforms individual effort. Working together, people can pursue problems and possibilities that elude them individually. Public action assumes and, in turn, sustains a public good.

But what happens if people lose faith in the idea and practice of a public good? What if people doubt the possibility of a public “we” and insist instead that society consists only of an assemblage of “me”s? Under these circumstances, how, if at all, may the public sphere coordinate action?

We live in such a time. In the United States and elsewhere, public, in its multiple meanings, has become a source of skepticism and, for some, anger. Surveys suggest that people’s trust in government to serve the public interest has plummeted.[2] Across a range of different issues, from municipal trash collection to prisons, local, state, and federal governments have outsourced formerly public functions to private enterprises. Public functions that had previously been regarded as unrelated to pecuniary considerations have now become opportunities for profit maximization. These developments have included the public provision of education, which historically has been regarded as crucial for the vibrant functioning of democracy and the public sphere. Vouchers, charter schools, defunding public education, standardized testing—all of these policy initiatives recast education from a public good serving a “we” that includes students, their families, and everyone else to a private good that leaves individuals responsible for maximizing their educational opportunities and outcomes, and blameworthy if they fail.

These policies represent the enactment of a school of economic and political thought that denies the existence of publics and public goods as anything other than the aggregation of individuals and individual interest. In Capitalism and Freedom, Milton Friedman made plain his view that references to a public constituted a fiction, asserting that “a free man” rightly discerned the constitution of a country “as a collection of individuals who compose it, not something over and above them.”  A free man, he continued, “recognizes no national goal except as it is the consensus of the goals that the citizens severally serve.  He recognizes no national purpose except as it is the consensus of the purposes for which the citizens severally strive.”[3] As a leading figure in the Chicago School, Friedman outlined an approach that has become axiomatic to present-day neoliberal governing regimes: the body politic exists only as bodies that may be constituted and disciplined as individuals who are compelled to adopt a market rationality. As Wendy Brown puts it, “the body politic ceases to be a body but is, rather, a group of individual entrepreneurs and consumers.”[4] A public shaped through interaction and engagement reappears as an aggregated public.

In this moment, the public sphere appears, as Habermas has remarked in relation to a different period in history, as ideology and more than ideology. Skepticism about a public good has not stopped politicians from appealing to a public good as they pursue policies that undermine publics. For example, when asked about perceived attacks on public education in Wisconsin, the chair of the State Assembly’s Education Committee retorted that the sum of these recent changes “not only challenges the public schools to step up their game, but it also gives parents opportunities that they didn’t have before.”[5] In this response, competition fosters a public good, and choice replaces coordination as a mode of public agency. Nevertheless, this comment also indicates the continued resonance of a common good, as the committee chair asserts that everyone benefits.

An ominous development, the emergence of a public sphere without a public good need not represent a permanent condition of public life. Rather, the current situation is filled with tensions and varied possibilities that may bolster or weaken publics. At the local level, for example, communities have pushed back against attacks on public education. They have demanded a role—a collective, democratic role—in educational decision-making. As rhetoric and communication scholars, our role is to investigate the contrasting pressures on a public good and their implications for the public sphere. In this process, we may discern emancipatory possibilities.

[1] Hannah Arendt, The Human Condition (1958; Chicago: University of Chicago Press, 1989), 200.

[2] American National Election Studies, The ANES Guide to Public Opinion and Political Behavior, table 5A.1 (2010), www.electionstudies.org/nesguide/toptable/tab5a_1.htm; James A. Davis and Tom W. Smith, General Social Surveys, 1972–2008 (Chicago: National Opinion Research Center, 2009).

[3] Milton Friedman, Capitalism and Freedom (Chicago: University of Chicago Press, 1962), 1-2.

[4] Wendy Brown, “Neo-liberalism and the End of Liberal Democracy,” Theory and Event 7 (2003). Accessed online at

[5] “Public Educators Are Being Challenged, not Under Assault, Rep. Jeremy Thiesfeldt says,” Capital Times, 24 May 2015, http://host.madison.com/ct/news/local/writers/todd-milewski/public-educators-are-being-challenged-not-under-assault-rep-jeremy/article_e64116ce-2f47-5190-bb3a-1acd2d226f88.html


States of Emergency

In New York City on any given night, there are more than 59,000 people experiencing homelessness. That’s nearly three times the number of people who can fit into a sold out Madison Square Garden.

Madison Square Garden, NYC ©Diana Robinson 2014

In Los Angeles, there are approximately 26,000 people experiencing homelessness each day, a population larger than that of many small towns around the country.

Homelessness – as we know it now – emerged as a “crisis” in the United States in the 1980s when, for the first time, cities began to see large numbers of people sleeping on street grates and park benches. Despite myriad approaches to addressing the situation, the number of people experiencing homelessness in the U.S. each year remains in the millions.

Now, communities are trying a new approach, declaring homelessness a “state of emergency.”

This all started in late September, when Mayor Eric Garcetti made Los Angeles the first city in the nation to make such a declaration.  In early October, Portland, Oregon followed suit. Two days later, the governor of Hawaii declared a state-wide state of emergency over homelessness. Then, on November 2, the mayor of Seattle and the county executive of the surrounding King County joined together to declare a civil emergency regarding homelessness there.

Declaring a state of emergency, an act generally reserved for natural disasters, has at least two important functions. First, it is an instrumental move. As Portland Mayor Charlie Hales explained, “We’ve tried slow-and-steady. We’ve tried by-the-book. It’s time to add to the tools we currently lack.” These “tools” include opening up access to additional funding streams, both local and federal. In Los Angeles, the emergency declaration appears to have freed up $113 million to help address homelessness. Another “tool” provided by the declaration is the ability to suspend zoning codes that prevent communities from building or converting properties into homeless shelters.

Secondly, and most relevant to this blog’s readers, is the symbolic function of an emergency declaration. Labeling homelessness an “emergency” marks it as a pressing need, an urgent concern. It also, in the sense of the root “emergent,” helps to bring homelessness out from concealment and into greater public visibility.

© Garry Knight 2014

Seattle Mayor Ed Murray explained it this way: “We must get this issue back on the national agenda. The reality is, we are in a moment in our history where decades of service cuts, growing income inequality, and many untreated issues of mental health and drug addiction have finally resulted in a human crisis seldom seen in the history of our city.”

Homelessness is, indeed, urgent. Living without consistent shelter is life-threatening. It reduces one’s life expectancy and increases one’s vulnerability to violence, illness, and injury. But what’s interesting to me about the timing of these declarations of emergency is their tardiness.

The Oxford English Dictionary defines an emergency as “the arising, sudden, or unexpected occurrence (of a state of things, an event, etc.).” Similarly, it says an emergency is “a juncture that arises or ‘turns up’; esp. a state of things unexpectedly arising, and urgently demanding immediate action.”

At this point, homelessness is neither sudden, nor unexpected. It arose in shocking numbers starting in 1980, now 35 years ago. What, then, is the juncture that has prompted communities, just now, to seek additional resources for its amelioration? Why is it newly urgent? Murray says this has “finally resulted in a human crisis seldom seen” [emphasis mine], but it’s unclear what exactly has caused these community leaders to have reached a tipping point (or, in rhetorical parlance, a “kairotic moment”) for pursuing emergency measures.

Certainly homelessness is rising in these communities – Hawaii’s homeless population has risen 23% since 2013 – but the numbers of people living on the streets, in hotels, and in shelters in all of these communities were already in the thousands.

Maybe it’s rising death tolls. In King County, Washington, 66 people have died while homeless just this year.

Maybe it’s the elections. Increasingly, I’m seeing homelessness as a much-discussed voting issue in local politics. And with a national election on the horizon, there may be a hope that homelessness could become a more significant part of Congress and the president’s agendas.

Maybe it’s a cascade effect. Governments frequently look to other examples of what to do to address social and political problems. It certainly looks like that’s happening in this case.

Of course, it may matter little why these declarations are being made now if they succeed in persuading people that homelessness must be urgently addressed. As is typical of critics of rhetoric, some worry that “this is all simply words,” and advocates say they’re in a “wait-and-see mode” until “after the initial press coverage fades.” These folks’ emphasis on the instrumental function of states of emergency misses the opportunity of the symbolic.

Most of the measures these declarations make possible are temporary. For example, Seattle’s civil emergency opens up a one-time burst of $5.3 million, but offers no further guarantee of elevated funding levels. Even suspending zoning requirements is just that: a temporary suspension of a community’s typical approach to managing its space. But reminding people that homelessness is urgent and “demanding immediate action” has the potential to shift public attitudes about homeless people. If homelessness is an emergency, like a natural disaster, it may be harder to blame (and disregard) people who are experiencing it for their poverty. And even if it does not shift these attitudes, making homelessness an immediate priority may reduce the number of people subject to the demonization that often accompanies the condition.

Homelessness has long been an emergency for people experiencing it. Perhaps these new declarations from our governments will help more people perceive it as such.


Spot the Africa: Trevor Noah and Representational Power

And…we’re back! Sorry for the long hiatus, but we’re excited to return with all new commentary on rhetoric and current events, starting with this post by Emily Sauter.

Noah

Replacing the incredibly popular Jon Stewart, the new host of the Daily Show, Trevor Noah, has some big shoes to fill. In adjusting to the new host, audiences must get used not only to a new face but to a new accent and a wildly different background. Former host Jon Stewart is Jewish by birth and a native New Yorker, an identity that he used to great effect, whether for comedy or solidarity. Readers might remember Stewart’s emotional opening monologue after 9/11, where he said, “The view from my apartment was the World Trade Center. Now it’s gone. They attacked it. This symbol of, of American ingenuity and strength, and labor and imagination and commerce and it’s gone. But you know what the view is now? The Statue of Liberty. The view from the south of Manhattan is the Statue of Liberty. You can’t beat that.”

In that moment Stewart’s identity as an American was paramount and provided audiences with an anchor point of empathy. Trevor Noah has no such point of connection. Instead, he must use his background as a South African to build new comedic opportunities, standing outside the American institution and commenting on it. Noah made his debut as a correspondent in December of 2014 with a short segment titled “Spot the Africa.” In the segment Noah talks about life in Africa versus life in America, and in what might be a surprise to some viewers, the comparison does not work out in America’s favor. In one joke he says, “Africa’s worried about you guys. You know what African mothers tell their children? ‘Be grateful for what you have, because there are fat children starving in Mississippi.’” He then presents Stewart with a jar full of pennies and a song, “Feed America.”

During the segment Noah switches back and forth between referring to South Africa specifically and Africa at large. Considering that many American audiences have little to no knowledge about Africa or the differences between its nations, the conflation is concerning. This trend continues in one of his newest segments as host of the show, where he compares Donald Trump to an “African President,” using clips of Jacob Zuma (president of South Africa), Yahya Jammeh (president of Gambia), Robert Mugabe (president of Zimbabwe), Idi Amin (former president of Uganda), and Muammar Gaddafi (former leader of Libya). Throughout the segment an image of Trump is decked out in increasingly ludicrous faux-military medals and sashes. The ensemble is then topped with a pair of black shades—the perfect African President.

The segment is funny, no doubt about it. And there are indeed some eerie rhetorical similarities between Trump and Africa’s most notorious dictators. For example, both Trump and President Zuma of South Africa claim “most” immigrants are criminals, though not all—a rhetorical choice that Noah labels “light xenophobia with just a dash of diplomacy.” The President of Gambia claims he can cure AIDS using bananas, and Trump claims vaccinations cause autism. In perhaps the most amusing comparison of the segment, Noah links Trump’s bragging about his money and his brain to Uganda’s Idi Amin, who is shown in a series of clips to be making the same claims to wealth, popularity, and intellect.

However, Noah’s attempt to use his African heritage as a prop to mock the American presidential candidate does more harm to America’s understanding of Africa than it does to Donald Trump. At best Trump looks absurd, maybe delusional; at worst he looks crazy.

Donald Trump

Muammar Gaddafi

 

Let’s look more deeply at the comparisons Noah used, shall we? Muammar Gaddafi was condemned internationally for his egregious violations of human rights against his own people, was suspected of ordering the bombing of Pan Am Flight 103, in which almost 300 people died, and described AIDS as a “peaceful virus.” Idi Amin’s rule was characterized by human rights abuses, political repression, ethnic persecution, extrajudicial killings, nepotism, corruption, and gross economic mismanagement. International observers and human rights groups estimate that 10,000–50,000 people were killed under his regime. Robert Mugabe has led Zimbabwe since 1980, and between 1982 and 1985 at least 20,000 people died in ethnic cleansing and were buried in mass graves. Yahya Jammeh has been “elected” several times under suspicious conditions, has introduced legislation that would result in beheading for any LGBTQ citizens, has had students and journalists killed, has reportedly “disappeared” or indefinitely detained those who oppose him, and has instigated a literal witch hunt that has killed hundreds. Jacob Zuma, Noah’s own president, is certainly not considered the cleanest of leaders. He has been charged with corruption and rape and has been involved in a number of scandals.

Noah has unprecedented access to the American public and a true chance to help educate us on the many differences between African nations in general and the reality of a place like South Africa specifically. What audiences encountered during this segment was not a funny, insightful joke about Trump’s more dictatorial theatrics, but the reinforcement of harmful Western narratives—that “Africa” is violent, dictatorial, and unable to maintain any sort of true democratic government. I admit, as someone who studies South Africa, I was incredibly excited to see Noah step into the role of host of the Daily Show, hoping to see a representative of South Africa who could bring more understanding to an audience woefully lacking information on the country. If this is the caliber of comedy that I can expect from Noah, though, I think I’ll take a pass on future episodes.


Generic Expectations and Misbehaving Narratives: The UVa Rape Story

Note: This is a co-authored post by Brandi Rogers (UW-Madison Rhetoric, Politics, and Culture) and Stephanie Larson (UW-Madison Composition and Rhetoric).

In November of 2014, Rolling Stone released the article “A Rape on Campus: A Brutal Assault and Struggle for Justice at UVa.” The article details a brutal campus gang rape that took place at the University of Virginia just after the start of the 2012 fall semester. Journalist Sabrina Rubin Erdely recounted the horrifying account of Jackie (a pseudonym), who described a sexual assault carried out by seven men over the course of three hours in the Phi Kappa Psi fraternity house. Jackie was led upstairs by her date, a fellow classmate and co-worker, who instructed the men to take turns having intercourse with her, forcing oral sex, and even thrusting a beer bottle inside of her as an ostensible hazing ritual. Erdely contextualized the story within a long history of rape culture at UVa, where a legacy of sexual violence remains masked by prestige, patriarchy, and a reputation resonant of the founding fathers, all in the midst of institutional indifference.

The original story, which has since been retracted by the magazine, faced critical commentary questioning the piece’s credibility immediately after its release. After the Washington Post, among many other news outlets, chastised Erdely and the magazine for dubious journalistic ethics, Will Dana, managing editor of Rolling Stone, requested the help of Steve Coll from the Columbia School of Journalism to investigate the veracity of the reporting, editing, and fact-checking. Coll and his team received no payment, and the report—released April 5, 2015—highlighted a number of journalistic errors centering on the following main problem: Erdely remained “too accommodating” of her sole source, Jackie, and should have been “tougher” on this rape victim. Mirroring the Washington Post’s initial critiques, Coll’s team concluded that the main problems of methodology and faulty fact-checking, though the story intended to challenge institutional indifference, “may have spread the idea that many women invent rape allegations.”

As more information has trickled out about Erdely’s process of choosing a source whose rape story would anchor her larger investigation into rape culture on college campuses, it appears that Erdely began her search with a particular rape story in mind. In an interview conducted after Rolling Stone apologized for publishing the story, Will Dana admitted that “the article stemmed from a feeling he and other senior editors had over summer that the issue of unpunished campus rapes would make a compelling and important story.” Driven by the editorial staff’s conception of the problem of rape on college campuses, Erdely went in search of a story that was emblematic of women’s experiences living amid college rape culture. Jay Rosen, Professor of Journalism at NYU, remarks that “[t]he most consequential decision Rolling Stone made was made at the beginning: to settle on a narrative and go in search of the story that would work just right for that narrative.” That Rolling Stone and Erdely began the investigation with a clear idea of the type of rape story that best reflects the experiences of rape victims on college campuses prompts us, as rhetorical scholars, to ask: Have rape narratives become a unique genre of discourse in their own right, complete with recurring tropes and predictable, required characteristics that we use both to recognize them and to judge them? If so, what tropes are necessary for a rape story to be heard and believed, and what happens to stories that do not fit the generic requirements?

Before embarking on an analysis of the UVa controversy, a bit of theoretical explanation is warranted. Simply put, we understand genre as a category of stories that display similar features or tropes. Genre allows audiences to sort, arrange, and make sense of stories, and genre serves as a resource for invention when authors craft new narratives. Audiences identify and judge the success of a story based on a familiarity with common generic tropes and recurring exigencies. Put differently, if the narrative faithfully displays particular characteristics in a predictable order and number in a given cultural moment, audience expectations are met. For example, rhetorical scholar Judy Segal argues that stories of individual cancer survivorship have become a ubiquitous feature of public culture. Though breast cancer narratives work to inform the public, like other narrative genres, they also “evaluate and govern us,” writes Segal.[1] Because narratives constitute and perpetuate sets of norms and values, they influence how we think about breast cancer, including its causes, its victims, and its survivors, as well as what we do about cancer. Similar to dominant breast cancer narratives, rape narratives like Jackie’s can have a therapeutic and medical function, but they also support cultural and political interests, as activists use narratives to argue for or against kinds of political and institutional change.

Not unlike those journalists who first questioned Erdely’s reporting, we too were troubled, both by public attacks on Jackie and by an uneasiness with the narrative itself. Something about the story felt “off,” but what exactly? Since the Columbia report, we tiptoe the line of agreeing with Coll’s assessment, yet we feel uncomfortable challenging the victim’s voice. As rhetorical scholars, this internal consternation drove us to consider the Rolling Stone controversy in greater detail. Our analysis of the controversy points to three major findings: 1) Individual rape testimonies comprise a unique narrative genre constituted by and predicated upon public expectations; 2) The rape narrative genre obfuscates institutional, communal, and even social culpability because of its myopic focus on the individual; and 3) We fear testimonies that don’t behave according to generic conventions will remain silenced or, when heard, be distrusted.

Falling in line with a legacy of narratives that determine how rape stories are told, Erdely centered her investigation of rape culture on a sensationalized recounting of one victim’s rape testimony. The rape story that Erdely sought out and chose as the centerpiece of her exposé included a robust menu of generic tropes, including physical violence, intoxicants, rapist(s) born in and nurtured by misogynist institutions, and an innocent victim. Even as Jackie’s testimony met every generic requirement–the event took place on an elite college campus, at a fraternity house, under the influence of alcohol, by an acquaintance-perpetrator, who facilitated a rape by several men as an initiation ritual, all the while both her friends and the institution failed to recognize Jackie’s assault–the bloat of generic tropes weighed down the narrative, tipping it into the realm of fiction. Although the victim’s story meets all of the generic expectations of a college rape narrative, and then some, Erdely’s retelling of it still managed to arouse suspicion. If scholars such as Segal are right to submit that narratives must abide by generic conventions that appeal to audience expectations, then the suspicion that this story aroused suggests generic misbehavior; perhaps its tropes are too numerous, too sensational, and thus, too unbelievable. Unfortunately, this, coupled with Erdely’s failure to follow up with other sources, led journalists and the public to distrust the narrative, blame the victim, and question the traumatized memory.

Furthermore, we suggest that though rape narratives generally seek to prove institutional culpability, they are incapable of doing this work in light of a series of tropes that pivot back to the individual. The narrative, along with its perennial features, distracts us from locating institutional responsibility. Erdely’s article and Coll’s report remain fixated on concerns over individual blame: the victim is to blame for not being truthful, and Erdely is to blame for relying too heavily on this victim. The rape genre is entangled in a web of assessing blame that relies on individual memory, traumatic recall, and personal responsibility. Within this case, we found a couple of opportunities where advocates might have made the case against institutional indifference with respect to rape. While Erdely expresses the intention of exploring rape culture more broadly, as well as the institutions that insulate and perpetuate it, by foregrounding her investigation with a sensational rape narrative, she loses sight of her initial agenda. Ultimately, as the narrative collapses under scrutiny, so too does any larger argument about how culture, institutions, or communities enable and tolerate rape. Coll, on the other hand, is left to make sense of Erdely’s mishaps, but he, too, is unable to escape assessing the tropes of Jackie’s narrative. In doing so, his report implicitly stages an archetypal rape victim as inherently unbelievable.

The continued reliance on individual rape narratives as a method of getting at the problem of rape constrains our ability to talk about rape, and it elides other discursive approaches that might improve our understanding of rape. Erdely, the media controversy, and the Columbia report, while not intending to, inadvertently exacerbate our cultural tunnel vision on individual blame. More importantly, Erdely’s choice to focus on Jackie’s story, rather than another, less sensational victim narrative, raises important questions about the power of narrative to give voice to those who have been silenced. Ultimately, we require a new means of apprehending rape, one that circumvents the generic demands and generic surveillance concomitant with rape narratives today, one that de-sutures the binds between rape and individual blame. Having exposed how these narratives are packaged into a genre that insists on a set of conventions grounded in personal responsibility in the face of institutional bulwarks, we are left to consider how to craft new rhetorical approaches, ones mutually capable of tackling institutional culpability without also silencing victims.

[1] Judy Z. Segal, “Breast Cancer Narratives as Public Rhetoric: Genre Itself and the Maintenance of Ignorance,” Linguistics & the Human Sciences 3, no. 1 (April 2007): 3–23.


Dead Men Spinning: The Irony of Using Famed Ecologists as Metonyms for Environmental Concern

This year’s Earth Day is a tumultuous one for ecologists and nature lovers in the state of Wisconsin. In a state that lays claim to such celebrated ecological pioneers and naturalists as John Muir, Aldo Leopold, and the founder of Earth Day, Gaylord Nelson, a series of proposed cuts to public funding of natural resource management has sparked alarm and protest. During a week generally reserved for the celebration of ecology, many Wisconsin nature lovers fear these changes and are mounting efforts to defend the preserves and natural places they love.

Yet in published and online responses to the proposed cuts, opponents have adopted a familiar, recurrent, and, frankly, somewhat curious tactic: when seeking to defend the import of the natural environment, free from human interference, these advocates make public appeals not by describing natural flora, fauna, terrain, or waterways but, rather, by making offhand references to human figures.

For instance, in response to the initial announcement of the proposed changes to the Wisconsin state Department of Natural Resources, the Milwaukee Journal Sentinel staff ran a widely circulated op-ed piece that began: “Aldo Leopold didn’t just roll over in his grave Tuesday; he started spinning at accelerating speed after Gov. Scott Walker announced his proposed budget. The state’s hunters and anglers—and everyone else who loves the outdoors—should be just as shocked as the famed Wisconsin naturalist would be.” Leopold was a naturalist who worked as a professor at the University of Wisconsin, was instrumental in founding the school’s famed Arboretum, and gained worldwide acclaim with his posthumously published nature writings in A Sand County Almanac, based primarily on observations of the natural environment in south central Wisconsin.

 

Gaylord Nelson.

Aldo Leopold

Then again, this past week, when the state’s land board—led by treasurer Matt Adamczyk—barred state employees from speaking or writing on “climate change,” another paper, the Madison-based Isthmus, lambasted the action by invoking former governor, U.S. senator, and naturalist Gaylord Nelson. Granted, Adamczyk had targeted his comments at Nelson’s daughter, Tia Nelson, currently the head of the Wisconsin Board of Commissioners of Public Lands, but the article built its appeal to readers by invoking Nelson much as the Journal Sentinel had cited Leopold, stating that on the 45th anniversary of Earth Day, “Tia Nelson’s dad is rolling over in his grave.”

Even as modern ecology looks beyond the human and pushes past the “anthropocene,” it is remarkable that our public arguments in defense of the natural so frequently invoke the human. When biophysical environments or ecological policies are threatened, rather than describing the ecosystems and biological science at play, we decry these decisions by saying famed ecological thinkers are “turning in their graves.” When celebrating Earth Day and urging others to care for the planet, we frequently reference an anthropomorphized “Mother Earth.” When inspiring children and members of the general public to care for plants and animals, we frequently favor anthropomorphic characters like Smokey Bear and Hoot the Owl over actual wildlife or science. Our most popular and successful environmentalist campaigns have rallied around human figures like John Muir or Keep America Beautiful’s “Crying Indian.”

On one level, this is a savvy tactic. Humans are drawn to other humans. Anthropomorphizing any concern makes it easier for many people to understand and empathize with it.

On another level, citing an ecological hero like Leopold or Nelson is a kind of rhetorical metonymy, alluding to all of their ideas and writings and championed causes without having to repeat the arguments and ideas at length. That is, I can make reference to Leopold or Nelson rolling in their graves, or Mother Earth weeping, or John Muir and Theodore Roosevelt shaking their heads in shame, and, in doing so, present an artful, accessible, and concise way of saying the broader field of ecological expertise and tradition rejects a policy or action.

And yet, such metonyms do anthropomorphize ecological concerns all the same. They perpetuate a paradox: defending biophysical well-being on a par with human well-being by favoring and spotlighting the human! By defending ecology with reference to Leopold or Nelson, we may be metonymically referring to their larger work and arguments, but we are also, at the same time, simply deferring to the unquestioned, unproblematized authority of a single human actor (one that is, besides, college-educated, white, and male). Simple reference to a name allows, potentially, for lip-service sustainability.

As ecologists protest budget cuts and policies that place human interests, development, and profit ahead of environmental sustainability and biophysical concerns, they might recognize the irony involved and see that their own rhetorical word choices and metonyms often do the exact same thing.


April 22, 2015 · 1:00 pm