Saturday, 8 September 2018

Kwame Anthony Appiah: Our lives would all go better if we were less inclined to essentialism

Your book deals a lot with the idea that identity can be a double-edged sword, and I find nationalism to be such an interesting example of that. I remember after the travel-ban news came out I went to JFK for the protests there, and there were thousands of people there who were genuinely invested in the fight against people being excluded from the country on the basis of race or religion. What struck me, despite stereotypes about the left hating nationalism, was that it could actually be seen as a nationalistic gathering in a certain way.

Kwame Anthony Appiah, a New York University professor of philosophy and law: Yes, absolutely. I mean it wouldn’t have made sense for them to be there unless they cared, I would say, about American rightness. They were there in shock because they thought that in the name of their country something terrible was being done and they wanted to actually play a role in stopping it happening.

Right, there is a certain, unifying power to the idea that as Americans we will not stand for this. Which is probably why both liberal and conservative politicians seek to tap those sentiments.
Yes. I mean, you could’ve been there, and probably there were people there who were not American citizens. There may have been a few who were just there under the rubric of respect for rights, or hostility to Islamophobia, or a bunch of other things. But, as you say, sometimes you want people just to be brought together by the sense of “Hey, we are the people who care about rights.”

For me, another double-edged identity concept is “people of color,” which is generally used for good or benign purposes but which strikes me as sometimes problematic. Taken literally, “person of color” encompasses what, 3 or 4 billion people? Some part of me does worry that these claims often are essentializing because they gloss over so much difference among massive groups of people who are often held together by flimsy, artificial racial categories. Am I being sort of overly sensitive? Should I just let it go because it’s being used for good purposes? How do you view that kind of language?

It’s often the case that what brings people together is the hostility of some other group against them. European Jewry was extremely diverse but it was brought together by the rise of organized anti-Semitism, or the re-rise of organized anti-Semitism in the late 19th and into the 20th century. So Moses Mendelssohn and some peasant in a shtetl — these are very different kinds of people but the most important thing for both of them in the context of their lives (if they were, say, in Berlin) would be that they would both be potential objects of anti-Semitism. So I think that one of the things that actually brings people together, even if they are otherwise extremely diverse, is a form of hostility that identifies them as a group and is organized against everybody in it.

Now what we don’t want to do is forget when we start talking about people of color, that there are vast differences in experience. Differences having to do both with, as it were, what kind of color you are, what kind of nonwhite person you are, and also with other things — gender, class, and so on. The result is that there’s a risk every time you take one of these labels that you essentialize and you treat everybody as if they were the same in ways in which they’re not. It’s something to be vigilant against, but I don’t think that needs to lead us to be against using such labels providing we use them carefully. I think it’s good to have in the culture the general thought that we should worry about this kind of essentialization so that when we do appeal to these sorts of concepts we remember that there are going to be things that divide people of any identity group.

I guess the flip side of that is the way “whiteness” is often discussed in progressive spaces. I find this to be a tricky thing to talk about. But I’m a white liberal Jew living in Brooklyn and I genuinely think I have more in common with the “average” person in Brooklyn, as big and diverse as it is, than I do with either the “average” white person in Appalachia, or the “average” white Republican in a Texas suburb. I find it’s hard to find the language to express that. It seems to me that there’s something going on with the language that seems to imply the problem is white skin itself — like white skin gives rise to problematic politics and behavior, rather than the problem being reactionary politics, or racism, or whatever.

Right. Yeah, and I think that’s perfectly reasonable. We should remember that it’s not inevitable that the label white should mean very much to its bearers. Some people are thinking, “I’m a white person, so I’m going to do this,” and a lot of people don’t. Now, it would be reasonable to point out that if you’re not otherwise marked, then one reason why many liberal-minded white people don’t think about being white is that they don’t have to worry about the color of their skin because it isn’t, in the context of interactions with officials and so on, likely to be burdensome to them. But I think that it’s perfectly proper to insist that essentialism about whiteness is as absurd as essentialism about blackness or any of these other identities.

You know, on the one hand, however you feel about your whiteness will sometimes make a difference to what happens to you — and on the other hand, it may not matter very much, to you. You may not think of your whiteness as having to do with anything except regretting the role of racism and so on. So that’s to make the point that identities have both a subjective and an objective dimension in some sense. They matter to how the world treats you, but they also matter to how you feel about the world, and the very same label can have very different subjective meanings for the people who bear it, and it can also lead to very different objective results in different circumstances. And all of that’s worth remembering.

So in other words, the other side of me saying I don’t feel like I have a deep essential commonality with the Appalachian or Houstonian is that whether or not I feel that way we will all benefit from being white — we’re, in certain contexts, less likely to get pulled over or followed around a convenience store, and stuff like that. That exists independent of my own feelings of what my identity is.
Right. There are a few things to say about it. One is, it’s important to care about it because it’s a very important fact about how our society works, and something we might like to do something about as citizens. But the second thing is: you didn’t do that. You didn’t make that true. You’re probably not doing anything to keep it being true, so it would be wrong to hold you responsible for it. There’s a difference between thinking someone is privileged by some identity and thinking that they’re to be blamed for that privilege. That’s not true, in general. There’s also the kind of privilege where people are desperately trying, actually, not to take advantage of it, though it isn’t up to them whether they gain advantage from it, because as I say identities have this objective dimension, in the sense that other people will treat you in virtue of your identity in a way they decide to, and you don’t control that.

Right, and it seems like there is a little bit of moral confusion in the air that mixes up those two concepts: being the beneficiary of privilege and actively causing privilege or trying to actively maintain it.

Yes, and I think it’s strategically unfortunate because instead of getting those people on your side you get their backs up if you talk to them in that way. So I think it’s important to get this right and to see that … Look, it’s probably also important to remind people that in the context of these racial identities in our society that there are lots and lots of other kinds of unearned privilege, some of which are held by some black people. So there’s lots and lots of, as it were, unearned class privilege in the United States and upper-middle-class black people get that too. So it’s not as if some people are permanently privileged by all their identities and other people are disprivileged by all of them.
There are contexts in which being black is a terrific thing in the United States. Because I am of African descent I have been treated extremely well in many contexts by African-Americans with whom I don’t have much history in common, just because we are both black. I mean, suppose I had been white, and otherwise had all the same properties that I have — I would have been, for them, an upper middle-class Englishman, and probably upper-middle-class Englishmen don’t seem like natural allies or friends for many African-Americans. For me, in many contexts in this country, being black has been a privilege as well as, no doubt, potentially the source of abuse or discrimination.

I really like your chapter on religion. People seem to think that the rank-and-file ISIS members are all deeply devout, religious fundamentalists when in fact they’re often recruited just by the promise of a social identity and a place of belonging, and more terrestrial concerns like that.

Many of the European recruits of ISIS, of course, don’t know Arabic and can’t, therefore, read the Koran in the way in which you’re supposed to if you’re a devout person.

Even most Americans who go to some sort of church or synagogue probably intuitively understand that a lot of the time they’re just mouthing along — they’re there for the community, or comfort, or simply out of habit. But why do you think we have so much trouble extending that logic to others, either in the context of ISIS or religious conflict more generally? Why are we driven to want to think that that guy over there really does have crazy beliefs in his head rather than he’s just a member of a different tribe with slightly different rituals?

We’re just very bad at treating other people’s identities with the same care with which we’re happy to treat our own.

One of the ways in which you identify as a conservative Evangelical Christian in this country is by insisting that you believe in the literal truth of the Christian Bible. I think of that as just a performance — something that you say in order to indicate where you are on the community questions and on the moral questions. Because I simply cannot believe that anybody who has read the Bible — and these people mostly have — can actually believe, literally, everything in it. For one thing, it’s inconsistent, and you can’t literally believe both sides of the contradiction when it’s drawn to your attention. And there are just obvious problems with the stories in the Bible, starting with the fact that there are two accounts of creation in Genesis, which seem to be different from each other.
So I’m not being disrespectful … in fact, I’m being respectful! I’m saying, I can’t take you seriously when you say that because I too have read the Bible and I think that a sensible person, like you, can’t believe literally everything that’s in it. 

Anything else you want to talk about or you want people to know about the book?

The arguments in the book are meant to be offered up for consideration and conversation. I’m not claiming to have got everything right about any of these things, I’m just trying to, as it were, move the conversation along by pointing to some things I think I’ve noticed. I’m hoping to generate lively debate about these things that’s less sort of aggressive, hostile, nasty than some of the debate about identity that’s currently going on.

Do you think things on the left are getting more and more essentialist or is that an overstatement?

Um … I’m thinking about this because I don’t know that I have a lot of evidence about whether things are changing. Yet there is a lot of essentialism everywhere, on the right as well, and I’m in favor of reducing the extent to which that’s true. Since I myself am a progressive liberal person, I’m particularly worried about it when my people do it because I think it undermines the main thrust of the progressive liberal tradition, which is to aim to liberate people of all identities so that we can have meaningful lives.

So I worry, but everybody who is a citizen of this country is a fellow citizen of mine and I care about thinking of all of them, even the ones that I disagree with about policy, and so I would hope that everybody on the left and right would be willing to entertain for the moment the possibility that they may not have got this right, and that our lives would all go better if we were less inclined to essentialism.

Monday, 3 September 2018

If we want saner politics, we need to start building better foundations from the playground up

Jonathan Haidt is a social psychologist at New York University’s Stern School of Business; Greg Lukianoff is the president and chief executive of the Foundation for Individual Rights in Education
  Before he died, Senator John McCain wrote a loving farewell statement to his fellow citizens of “the world’s greatest republic, a nation of ideals, not blood and soil.” Senator McCain also described our democracy as “325 million opinionated, vociferous individuals.” How can that many individuals bind themselves together to create a great nation? What special skills do we need to develop to compensate for our lack of shared ancestry?
When Alexis de Tocqueville toured America in 1831, he concluded that one secret of our success was our ability to solve problems collectively and cooperatively. He praised our mastery of the “art of association,” which was crucial, he believed, for a self-governing people.
In recent years, however, we have become less artful, particularly about crossing party lines. It’s not just Congress that has lost the ability to cooperate. As partisan hostility has increased, Americans report feeling fear and loathing toward people on the other side and have become increasingly less willing to date or marry someone of a different party. Some restaurants won’t serve customers who work for — or even just support — the other team or its policies. Support for democracy itself is in decline.
What can we do to reverse these trends? Is there some way to teach today’s children the art of association, even when today’s adults are poor models? There is. It’s free, it’s fun and it confers so many benefits that the American Academy of Pediatrics recently urged Americans to give far more of it to their children. It’s called play — and it matters not only for the health of our children but also for the health of our democracy.
Young mammals play, and in doing so they expend energy, get injured and expose themselves to predators. Why don’t they just stay safe? Because mammals enter the world with unfinished nervous systems, and they require play — lots of it — to finish the job. The young human brain “expects” the child to engage in thousands of hours of play, including thousands of falls, scrapes, conflicts, insults, alliances, betrayals, status competitions, and even (within limits) acts of exclusion, in order to develop its full capacities.
But not all play is created equal. Peter Gray, a developmental psychologist at Boston College, studies the effects of “free play,” which he defines as “activity that is freely chosen and directed by the participants and undertaken for its own sake, not consciously pursued to achieve ends that are distinct from the activity itself.” Guitar lessons and soccer practice are not free play — they are supervised and directed by an adult. But when kids jam with friends or take part in a pickup soccer game, that’s free play.
The absence of adults forces children to practice their social skills. For a pickup soccer game, the children themselves must obtain voluntary participation from everyone, enforce the rules and resolve disputes with no help from a referee, and then vary the rules or norms of play when special situations arise, such as the need to include a much younger sibling in the game. The absence of an adult also leaves room for children to take small risks, rather than assuming that adults will always be there, like guard rails, telling them where the limits of safety lie. Outdoor free play, in mixed-age groups, is the most effective way for children to learn these essential life skills, Professor Gray says.
But during the 1980s and 1990s, children became ever more supervised, and lost opportunities to learn to deal with risk and with one another. You can see the transformation by walking through almost any residential neighborhood. Gone is the “intricate sidewalk ballet” that the urbanist Jane Jacobs described in 1961 as she navigated around children playing in her Greenwich Village neighborhood. One of us lives in that same neighborhood today. His son, at the age of 9, was reluctant to go across the street to the supermarket on his own. “People look at me funny,” he said. “There are no other kids out there without a parent.”
A study by sociologists at the University of Michigan documented this change by comparing detailed records of how kids spent several days in 1981 and 1997. The researchers found that time spent in any kind of play decreased 16 percent, and much of the play had shifted indoors, often involving a computer and no other children.
The trend has continued in the 2000s. Recess and free play time were reduced to make room for more standardized testing and academic work. Homework became common for even the youngest schoolchildren. After-school playtime morphed into structured activities overseen by adults.
Another trend that reduced outdoor free play is the grossly exaggerated fear of “stranger danger.” The spread of cable TV gave us more programming focused on rare but horrific cases of child abduction. Those fears, combined with the long, slow decline of trust in neighbors and fellow citizens, gave rise to a belief by the 1990s that persists today: Children who are not in sight of a responsible adult are at risk of abduction, so parents who allow unsupervised outdoor play are bad parents. The authorities should be notified.
The constant presence of adults is intended to keep children safe, but what are its likely effects? How might kids deprived of opportunities for free play, risk-taking and self-governance differ from previous generations when they leave the nest? We would expect two main areas of difficulty.
The first is that when these kids become adults, they are likely to be less resilient. Like the immune system, children are “antifragile,” as Nassim Taleb, a professor of risk engineering at New York University, put it in his book by the same name. The immune system requires repeated exposure to dirt and germs in order to develop its protective abilities. Children who don’t get enough exposure are more susceptible to autoimmune diseases later on.
By the same logic, if we “protect” kids from the small risks and harms of free play, we stunt their ability to handle challenges and recover from failures. When such children arrive at college, we would expect them to perceive more aspects of their new environment as threatening compared with previous generations. We would expect to see more students experiencing anxiety and depression, which is precisely what is happening, according to national surveys and surveys of student counseling centers. These large increases do not just reflect a greater willingness to seek help; there has been a corresponding rise in self-harm, suicidal thinking and suicide among American adolescents and college students.
The second predictable consequence of play deprivation is a reduction in conflict management and negotiation skills. If there is always an adult who takes over, this is likely to create a condition sociologists call “moral dependence.” Instead of learning to resolve conflicts quickly and privately, kids who learn to “tell an adult” are rewarded for making the case to authority figures that they have been mistreated.

It’s easy to see how overprotection harms individuals, but in a disturbing essay titled “Cooperation Over Coercion,” the economist Steven Horwitz made the case that play deprivation also harms liberal democracies. He noted that a defining feature of the liberal tradition is its desire to minimize coercion by the power of the state and maximize citizens’ freedom to create the lives they choose for themselves. He reviewed work by political scientists showing that self-governing communities and democracies rely heavily on conversation, informal norms and local conflict resolution procedures to manage their affairs with minimal appeal to higher authorities. He concluded that self-governance requires the very skills that Peter Gray finds are best developed in childhood free play.
Unsupervised play is the perfect apprenticeship for Tocqueville’s art of association, but this art can be lost if children are prevented from practicing it. Professor Horwitz’s essay warns us that play deprivation will likely lead to a “coarsening of social interaction” that will “create a world of more conflict and violence, and one in which people’s first instinct will be increasingly to invoke coercion by other parties to solve problems they ought to be able to solve themselves.”
This is already evident on some college campuses. Efforts to disinvite speakers, punish people who tell jokes deemed offensive and regulate everything from dining hall food and Halloween costumes to the organizations that students are permitted to join are at odds with a longstanding liberal distaste for coercion.
Play is clearly not sufficient for political cooperation — today’s political elites had plenty of free play as children. But if Professors Gray and Horwitz are right that free play is the best teacher of the art of association, and if recent campus trends are harbingers of corporate and social trends, then we can expect our political dysfunction to worsen in the coming decades. We can look forward to rising levels of conflict at work and in other places where an authority is willing to resolve disputes. The job market for lawyers will boom as civil lawsuits are increasingly used to settle interpersonal conflicts.
If such a future comes to pass, it will not be the fault of today’s young people. It will be the result of well-meaning parents, teachers and college administrators who tried to protect young people from harm without understanding that overprotection itself is harmful.
Democracy is hard. It demands teamwork, compromise, respect for rules and a willingness to engage with other opinionated, vociferous individuals. It also demands practice. The best place to get that practice may be out on the playground.

"I'm Bilal, so please don't call me Billy"

Podcaster Bilal Harry Khan discusses the challenges of growing up with his name in the UK and makes a plea for people to try harder with names they are not familiar with.

 When I was nine years old my mum took me out shopping. We ended up in a small book shop in Shepherd's Bush Market, in west London. The shop was owned by an old Rastafarian man. He came up to me and said, "What's your name?"

I promptly replied, "I don't talk to strangers."

But then my mum approached us. It felt safe, so I said, "My name is Bilal."

The man said, "That's a very strong name. Do you know about the famous Bilal?"

I said, "No."

He told me. Of Ethiopian descent, Bilal ibn Rabah was born a slave in Mecca in 580 AD. He went on to become a trusted disciple and companion of the Prophet Mohammad.

I thought, "Wow, that's a cool story."

The Rastafarian man went on to tell me about Bilal's beautiful voice and how he was chosen by the prophet himself to lead one of the first calls to prayer.

"Can you sing?" the Rastafarian man asked me.

"Er, no," I replied awkwardly, hoping he wouldn't make me.

But inside I was instantly proud. I felt finally proud that I could own my name. And embrace it. If my name came with such a strong history, I thought, then I was no longer going to shy away from it.

I was born in north-west London in the 1990s and grew up my whole life in Neasden. My mum is Jamaican of mixed East Asian and black Caribbean heritage, and my dad is Kenyan of South Asian heritage. Although he was born in East Africa, his family is from a part of Kashmir that is now Pakistan.

I guess I'm not your conventional mixed-race Brit.

Both my parents emigrated to England when they were teenagers and met in school.

My parents did a great job at showing me both sides of my culture, but in terms of the food I ate and the music I listened to, I had a stronger affinity and connection to my Jamaican side than my South Asian side.

Neasden is an ethnically mixed area in North London, and I didn't grow up around many white people. But most of my teachers at school were white and they struggled with my name. Some would outright butcher it.

I remember being annoyed that one teacher would call me Bee-laarl. I would think, "That's not my name. That's not how you pronounce it." I couldn't understand why he couldn't pronounce a name like Bilal.

At the age of about eight I remember sitting down with my nan and my cousin. I told them that I was very upset about my name. I told them that I wanted an easier name so that I could navigate life without constantly having to explain myself. I wanted an English name. I wanted the name David.

There was no particular reason why I chose the name David, I just thought it was an easy name to grasp - a nice, simple, Biblical name. But I soon started having doubts.

My family called me "B". So changing my name to David wouldn't make sense. I decided that maybe I needed an easy white person name that started with the letter B. I briefly toyed with "Ben". But, no. I didn't want to be a Ben. I didn't feel like a Ben.

There's another thing with the name Bilal. It's a Muslim name, from my father's Kashmiri ancestry, but I'd been brought up as a Christian. So I got a lot of questions from parts of the Caribbean church community.

"Where is your name really from?"

"How did you get a name like that?"

"Why would your parents call you that?"

"Isn't that a Muslim name?"

The questions were, and are, constant. It's tiring.

It felt like I was having to explain my back story all the time - I couldn't just be. That's a lot for a young boy in primary school. If you're called David or Ben in this country you don't have the burden of constant explanation.

But once I had been told the origin of my name by the Rastafarian man in Shepherd's Bush, I stopped wanting an English name.
Now it's other people who seem to wish I had a name that was more like theirs.

After university I started doing workshops and talks on improving diversity at schools.
Oddly, even at these events people sometimes seemed to struggle with my name, accidentally or on purpose. At diversity seminars.

Recently I was asked to give a talk to students at a mostly white school. I'd been in back-and-forth email contact with one of the teachers for ages. My full name, Bilal Harry Khan, comes up in email communication. I'd signed off all our emails as Bilal and introduced myself to him that way too. He had been addressing me as Bilal in these emails the entire time. But as he got up to introduce me to a whole assembly hall of teachers and students, he suddenly said, "Everyone, this is Harry."

I sat there in the front of the room, looking at him thinking, "What a mug!"

So I got up to talk and I said: "Hi everyone, I'm Bilal."

The room looked from me to him in confused silence. I started talking.

Afterwards, the teacher came up to me and said, "I'm sorry about that, I just knew they wouldn't understand. I just knew it would be difficult for them."

And I thought: "No, it wasn't difficult for the students. Many of them came up to me after the talk and said my name perfectly. It was difficult for you."

Things do appear to be changing. If you look at the list of popular baby names in the UK, in 2017 Muhammad and Mateo were in the top 50 for boys, and Aaliyah and Luna were in the top 50 for girls.

So hopefully more people are getting used to those names. But right now it feels as though some people are still trying to change my name to something they are comfortable with.

Nicknames annoy me particularly. I can't stand being called Bill or Billy by people I barely know. (It makes me think of Billy Mitchell from EastEnders.)

It may seem minor, but I'm not here to make your life easier. You should learn how to pronounce the name I've just told you is mine. You're picking a name that is closer to home for you, but not closer to home for me.

Children in the UK should be able to grow up loving and being proud of their names. You can play a part in that by learning to pronounce them properly.

It's not that hard. If you can say "Tchaikovsky", you can pronounce our names.

Saturday, 25 August 2018

Everyone has an accent

 Roberto Rey Agudo is the language program director of the department of Spanish and Portuguese at Dartmouth College

I have an accent. So do you.
I am an immigrant who has spent nearly as much time in the United States as I have in my home country, Spain. I am also the director of Dartmouth’s language programs in Spanish and Portuguese. Both facts explain, but only partly, why I feel a special fondness for the FX drama “The Americans,” in which Keri Russell and Matthew Rhys play Elizabeth and Philip Jennings, a husband-and-wife team of undercover K.G.B. agents living in suburban Washington. I can’t be the only one who nodded approvingly when they were both nominated for Emmys last week.
What interests me as a linguist is that the Jenningses are, as the pilot tells us, “supersecret spies living next door” who “speak better English than we do.” Even their neighbor, an F.B.I. agent on the counterintelligence beat, suspects nothing.
Living as I do, deeply immersed in the work of teaching and learning second languages, I found it fun to watch a TV series in which the main characters’ aptitude for them was so central to the plot. Nonetheless, the premise that you can speak a language without any accent at all is a loaded one. You can’t actually do this.
Worse, when we fetishize certain accents and disdain others, it can lead to real discrimination in job interviews, performance evaluations and access to housing, to name just a few of the areas where having or not having a certain accent has profound consequences. Too often, at the hospital or the bank, in the office or at a restaurant — even in the classroom — we embrace the idea that there is a right way for our words to sound and that the perfect accent is one that is not just inaudible, but also invisible.
If you look at the question from a sociolinguistic point of view, having no accent is plainly impossible. An accent is simply a way of speaking shaped by a combination of geography, social class, education, ethnicity and first language. I have one; you have one; everybody has one. There is no such thing as perfect, neutral or unaccented English — or Spanish, for that matter, or any other language. To say that someone does not have an accent is as believable as saying that someone does not have any facial features.
We know this, but even so, at a time when the percentage of foreign-born residents in the United States is at its highest point in a century, the distinction between “native” and “nonnative” has grown vicious, and it is worth reminding ourselves of it again and again: No one speaks without an accent.
When we say that someone speaks with an accent, we generally mean one of two things: a nonnative accent or a so-called nonstandard accent. Both can have consequences for their speakers. In other words, it is worth acknowledging that people discriminate on the basis of accent within their own language group, as well as against those perceived as language outsiders. The privileged status of the standard accent is, of course, rooted in education and socioeconomic power.
The standard accent is not necessarily the same as the highest-status accent. It is simply the dominant accent, the one you are most likely to hear in the media, the one that is considered neutral. Nonstandard native accents are also underrepresented in the media, and like nonnative accents, are likely to be stereotyped and mocked. Terms like Southern drawl, Midwestern twang or Valley Girl upspeak underscore the lower status attached to particular ways of speaking.
Such judgments are purely social — to linguists, the distinctions are arbitrary. However, the notion of the neutral, perfect accent is so pervasive that speakers with stigmatized accents often internalize the prejudice they face. The recent re-evaluation of the “Simpsons” character Apu provides an important example of how the media and popular culture use accents to make easy — and uneasy — jokes.
When you are learning a language, a marked accent is usually also accompanied by other features, like limited vocabulary or grammatical mistakes. In the classroom, we understand that this is a normal stage in the development of proficiency. My family back in Madrid would have a hard time understanding the Spanish of my English-speaking students in my first-semester classroom.
Later, these same students study abroad in Barcelona or Cuzco or Buenos Aires, and often struggle to make themselves understood. But such is the privilege of English — and this is key — that nobody hearing their American accents presumes that they are less capable, less ambitious or less honest than if their R’s had a nicer trill. Yet this is exactly the kind of assumption that a Spanish accent — and many, many others — is likely to trigger within the United States.
It’s certainly true that a marked accent can get in the way of making yourself understood. E.S.L. learners and others are well advised to work on their pronunciation. As a teacher, I do try to lead my students toward some version of that flawed ideal, the native accent. One of the ironies in this is that I — along with most of my fellow teachers from the 20 countries (not counting Puerto Rico) where Spanish is an official language — long ago shed the specific regional, class-shaped intonations and vocabulary that are, or once were, our native accents. My point is not that we need to forget the aim of easily comprehensible communication — obviously, that remains the goal. But we do need to set aside the illusion that there is a single true and authentic way to speak.
English is a global language with many native and nonnative varieties. Worldwide, nonnative speakers of English outnumber natives by a ratio of three to one. Even in the United States, which has the largest population of native English speakers, there are, according to one estimate, nearly 50 million speakers of English as a second language. What does it even mean to sound native when so many English speakers are second-language speakers? Unless you are an embedded spy like the Jenningses, it is counterproductive to hold nativelike pronunciation as the bar you have to clear.
Accent by itself is a shallow measure of language proficiency, the linguistic equivalent of judging people by their looks. Instead, we should become aware of our linguistic biases and learn to listen more deeply before forming judgments. How large and how varied is the person’s vocabulary? Can she participate in most daily interactions? How much detail can he provide when retelling something? Can she hold her own in an argument?
Language discrimination based on accent is not merely an academic idea. Experiments show that people tend to make negative stereotypical assumptions about speakers with a nonnative accent. The effect extends all the way to bias against native speakers whose name or ethnicity reads as foreign. Studies show that when nonnative speakers respond to advertisements for housing, their conversations with prospective landlords are more likely to be unsuccessful, on average, than those of callers “without accents.”
So I hope you like my accent as much as I like yours.