Saturday, 3 December 2016

* Saudade: Portugal's love affair with melancholy

BBC
Eric Weiner
As an American, I’ve been inculcated with the importance of being happy – or at least pretending to be happy – at all costs. It’s an ethos epitomized by the smiley face, said to have been invented in the US in 1963, and by empty expressions like “have a nice day”.

In Portugal, no one tells you to have a nice day. No one particularly cares if you have a nice day, because chances are they’re not having a nice day either. If you ask a Portuguese person how they’re doing, the most enthusiastic reply you can expect is mais ou menos (so so).

Portugal’s culture of melancholy is hard to miss. You see it etched on people’s sombre expressions – this is no Thailand, known as the Land of Smiles – and even in the statues that occupy prime real estate in Lisbon’s public squares. In most countries, the men (and it’s almost always men) honoured in such places are macho generals. In Portugal, it’s moody poets.

Yes, Portugal is a sad land, ranking 93rd of 157 countries (just behind Lebanon), according to the UN’s latest World Happiness Report. But don’t pity the Portuguese. They’re content with their discontentment, and, in an odd but enlightening way, actually enjoy it. It’s easy to assume that the Portuguese are masochists, but if you spend some time here, as I did recently, you quickly realize that the Portuguese have much to teach us about the hidden beauty, and joy, in sadness.

Portugal’s “joyful sadness” is encapsulated in a single word: saudade. No other language has a word quite like it. It is untranslatable, every Portuguese person assured me, before proceeding to translate it.
                   
Saudade is a longing, an ache for a person or place or experience that once brought great pleasure. It is akin to nostalgia but, unlike nostalgia, one can feel saudade for something that’s never happened, and likely never will.

At the heart of saudade lies a yawning sense of absence, of loss. Saudade, writes scholar Aubrey Bell in his book In Portugal, is “a vague and constant desire for something... other than the present.”

It is possible to feel saudade for anything, publisher Jose Prata told me over lunch one day at Lisbon’s bustling Cais do Sodre market. “You can even feel saudade for a chicken,” he said, “but it has to be the right chicken.”

What makes saudade tolerable, pleasant even, is that “it is a very sharable feeling,” Prata explained. “I’m inviting you to share at the table of my sadness.” In Portugal, that’s a big table with room for everyone. In fact, a Portuguese chef has even started a line of chocolate called “Saudade”. Naturally, it is bittersweet.

One day, while sipping an espresso at the Largo de Camões public square in central Lisbon, I met Mariana Miranda, a clinical psychologist. This was the perfect person, I realized, to explain Portugal’s joyful sadness.

Sadness is an important part of life, she told me, adding that she can’t understand why anyone would avoid it.

“I want to feel everything in every possible way. Why paint a painting with only one colour?” By avoiding sadness at all costs, she said, we diminish ourselves. “There is actually a lot of beauty in sadness.”

Another day, I met a genial police inspector named Romeu, a friend of a friend. He has happy days and sad days, he said, and he welcomes both equally. In fact, when confronted with an unhappy Portuguese person, he explained, the worst thing you can do is try to cheer him up.

“You’re sad and you want to be sad,” he said. “You’re at the office and people are trying to cheer you up, and you say ‘Don’t make me cheerful. Today is my pleasurable sadness day.’”

Several studies suggest that the Portuguese are onto something. One study, published in 2008 in the Journal of Experimental Social Psychology, found that sadness improves our memory. On gloomy, rainy days, people recalled details (of objects they had seen in a shop) more vividly than on bright sunny days, according to Australian psychologist and lead author Joseph Forgas. Another study in the same journal suggests sadness improves judgment. Participants were asked to watch videotaped statements of people accused of theft and figure out who was lying. The participants experiencing negative emotions at the time were able to more accurately identify the deceptive suspects.

Even sad music has its benefits. Researchers from the Free University of Berlin surveyed 772 people around the world and found that sad music “can actually lead to beneficial emotional effects,” according to the study, published in the journal Plos One. It does this, researchers Stefan Koelsch and Liila Taruffi believe, by enabling people to “regulate” negative moods. Sad music also fires the imagination and evokes “a wide range of complex and partially positive emotions,” they concluded.

Interestingly, the positive benefits of sad music were experienced differently among different cultures. For Europeans and North Americans, the strongest emotion that sadness induced was nostalgia, while for Asians it was peacefulness.


No one does sad music like the Portuguese. In particular, fado music is melancholy set to a melody. Fado literally means “destiny” or “fate”, and therein lies its sad beauty. We must accept our fate, even if it’s cruel, especially if it’s cruel.

The genre took root nearly two centuries ago in hardscrabble, working-class neighbourhoods of Lisbon. The first fado singers, or fadistas, were prostitutes and the wives of fishermen who might or might not return from sea. In other words, people on a first-name basis with suffering.

Today, fado is the soundtrack of life in Portugal. You hear it – and feel it – everywhere: on the radio, in concert halls and, most of all, in Lisbon’s several dozen fado houses. One evening, I dropped by one, a tiny place called Duque da Rua, tucked away in the city’s Chiado district. There's nothing slick about this sort of fado house. The singers are mostly amateurs – people like Marco Henriques, who works as an agronomist by day and tends bar in the club in the evening to help make ends meet.

Some fado singers have beautiful, angelic voices, he told me, while others do not. “You can have a bad voice and be a great fado singer,” he said, “because fado comes from the heart.”

Listening to the music, I felt an odd combination of melancholy and relief. Melancholy, because the music was undeniably morose, as were the lyrics, which a Portuguese friend translated for me. Relief, because, for once, I felt no compulsion to squelch or deny my sadness. Fado gave me permission to honour my shadow self.

A few days later, in the seaside town of Estoril, 30km southwest of Lisbon, I met Cuca Roseta, a popular fado singer who is one of the few able to earn a living from her music. She prepares for each performance with a minute of silence, a sort of prayer, “before giving myself”, she told me. “This is music where you give yourself. It’s a gift of your emotions and it’s very intimate.”

Roseta represents a new generation of fado singers. The melody is just as melancholic as traditional fado, but the lyrics are subtly optimistic. A sign perhaps that Portugal’s love affair with “joyful sadness” is beginning to wane? I sure hope not.

Monday, 21 November 2016

* Echo chamber: You are what you read

https://medium.com/
Tobias Rose-Stockwell

The thing that has become the most clear to us this election year is that we don’t agree on the fundamental truths we thought we did.

I went to college in the part of Pennsylvania that definitely flipped the state for Trump. A good number of my friends are still living there, and have posted messages from what seems at this moment in history to be a completely different country.

Over the last several weeks I have watched dozens of my friends on Facebook de-friend one another. I have seen plenty of self-righteous posts flow across my news feed, along with deeply felt messages of fear, anger and more recently — existential despair.

On the other side I see reflections of joy, levity, gratitude and optimism for the future. It could not be more stark.

The thing that both groups have in common is very apparent: A sense of profound confusion about how the other side cannot understand their perspective.

This seemed to be building on a trend in social media that hit full tilt in the lead up to the election: Political divisions between us are greater than they ever have been, and are still getting worse by the day.

I don’t believe that the Media Elite, Donald Trump or the Alt Right are to blame for the state of our politics. They peddle influence and ideas, but they don’t change the actual makeup of our country. Elected officials are still a fairly accurate representation of voters’ wishes.

I also don’t believe this is inherently a reaction to the political overreach of the status quo. This discontent is part of something felt outside of our borders too. You do not have to look far to see this rising tide of hyper-nationalism going international.

The reason is much more subversive, and something we really haven’t been able to address as humans until now. I believe that the way we consume information has literally changed the kind of people we are.

How did we get here?

For much of the 20th century and into the 21st, we had a very small handful of channels through which to consume things like the news. (Advance warning: for the sake of brevity, I’m going to gloss over a lot.)

We had the big 3 TV networks, and a number of regional papers and radio stations that pumped out the majority of what we watched, read and listened to.

When politicians did things wrong, journalists competed to ask questions and report on it — and scoop each other on The Facts. When a claim of biased reporting was leveled, it was considered a pretty big insult.

Having so few sources of news also had its drawbacks — it was basically a monopoly, which left little room for opinions that deviated from the mainstream.

This media pipeline was so important in politics that a law was passed in 1927 called the Equal-Time rule, which required broadcasters that gave one political candidate a prime-time spot on radio (and, later, TV) to offer opposing candidates equivalent airtime. Remember that.

The Invention of the Private Personal Pipeline

When the internet came along, it was heralded as a new way to democratize this traditional monopoly on The Facts. People generally thought this was a great thing, and a way to expose us to a diverse new range of opinions.

About a decade ago, a few new startups began giving us reasons to consume media by being online all the time. The ones that we know really well are Facebook and Twitter, but we’re mostly going to talk about Facebook here. It went from zero to a billion users in less than 8 years, and has essentially changed humanity’s relationship with the internet.

The most significant thing they built was your personal pipeline — the News Feed. It quickly changed from a fairly simple way to read posts from your friends to one based on a much more complicated algorithm that optimized for ‘engagement.’

As you know already, Facebook got really good at this. Their sorting algorithm became the primary method to serve us every type of content. It blew past Twitter and every other media channel (and is likely how you’re reading this article now).

Very suddenly, people realized this feed was way more important than the Big 3, newspapers, or radio ever were. A lot of people stopped buying papers or watching the news on TV. Everyone began to piggyback on this algorithm because it did such a good job of keeping people’s eyeballs online and happy.

But those eyeballs stopped caring as much about the big brand name news sites, because there were plenty of little news sites for us to read. Many of those sites had a more squishy relationship with journalism.
And as long as what the articles said made us feel pretty good and looked the same as traditional news, we kept reading them. For the first time, we suddenly had plenty of choice on The Facts.

You Are What You Read

If you are an average American with access to the internet, you consume a big portion of your news through Social Media — 62% of us get news this way. Facebook’s news feed is now the primary driver of traffic to news sites.

Most of the events that you read about will come through this feed. Most of your opinions will be shaped by it. This is a stream of information that is curated and limited to the things that will not make you uncomfortable — and certainly will not provide equal airtime to opposing viewpoints.

This news feed is a bubble, and the things that filter through are the things that do not challenge you. This is a version of what internet activist Eli Pariser called the Filter Bubble.

The Wall Street Journal recently built a tool that illustrates just how radically this has allowed us to self-select the bubbles of our facts. Red Feed Blue Feed creates two custom news feeds based on the exact same topic (say, Michelle Obama) from conservative and liberal news sites on Facebook, and displays them side by side. It shows how easily one can become insulated inside a stream of news that confirms our assumptions and suspicions about the world, just by algorithmically tailoring the people and pages we follow.

We Prefer Information Ghettos

There is a funny quirk in our nature that psychologists call Confirmation Bias. It’s a real thing, and you can see people fall into it all the time. It is the natural human tendency to interpret new information as confirming our existing beliefs or theories. When we have a choice to read news that confirms our worldview or challenges it — we almost always choose the former, regardless of the evidence.

Since we feel uncomfortable when we’re exposed to media that pushes back on our perspective (like that weird political uncle you see at a family reunion), we usually end up avoiding it. It requires a lot of effort to change opinions, and generally it feels gross to have difficult chats with people that don’t agree with us. So, we politely decline the opportunity to become their friend, buy their product, read their magazine, or watch their show.

We insulate ourselves in these ‘information ghettos’ not because we mean to, but because it’s just easier.

Our own Facebook feed is no different. It is a manifestation of who we are. It was created by us: by the things we have liked in the past, by the friends we have added along the way, and by people that tend to have opinions a lot like ours. It is made by us.

This is self-segregation, and it happens naturally. But the success of Facebook’s algorithm has effectively poured gasoline on this smoldering innate bias.

The Problem with Community

But what about community? Facebook (and the internet in general) has done an amazing job at helping people find community. It has given us a way to connect with our best-matching, most specific, perfectly fitting counterparts online. From Furby collectors to exotic mushroom cultivators to the Alt Right, there is a place for everyone.

But there is a flaw in how we see community. As humans, we evolved in small tribes and rarely saw really large groups of other people. Because of this, we are bad at instinctively understanding the difference between ‘big’ numbers and ‘huge’ numbers. In any kind of physical setting, the difference between many thousands of people and many millions of people is actually impossible for us to see.

Online this has allowed us to insulate ourselves entirely within groups that may be a tiny fraction of our nation, without ever seeing another side. We instinctively feel like this is representative of a majority.

These online communities — to us — might seem to be purveyors of truth that embody The Facts better than anywhere else. They also might feel like they are enormous — thousands of people might agree with you. But that doesn’t make them true, or a majority opinion.

Contact Increases Empathy, Insulation Kills It

In social psychology there is a framework called the Contact Hypothesis, which has shown that prejudice is reduced through extended contact with people that have different backgrounds, opinions and cultures than ourselves. Developed by psychologist Gordon Allport as a way to understand discrimination, it is widely seen as one of the most successful tools for reducing prejudice and increasing empathy. It is a measurable and time-tested way of helping people get along.

The more time you spend with others that are different from you in an environment that is mutually beneficial, the more you will understand them. The more you understand them, the less prejudice and implicit bias you will have.

This contact matters in the context of our social channels, yet they are designed to let us insulate ourselves from the people and opinions we would prefer not to see.

We must agree on The Facts in order to co-exist

Facebook has stated that their mission is to make the world a more open and connected place. And they have, by anyone’s measure, connected more humans than any company in history.

With this success, they have also created a tool that has allowed us to become more insulated in our own ideological bubbles than we ever have been before.

Because of this lack of pluralism, we are systematically losing our ability to empathize. This is what we now see in the wider world — from Brexit to Trump to hyper-nationalistic movements worldwide. People globally no longer have the same incentives to find a shared understanding. This is not just dissatisfaction with globalization or the status quo. This is how we are changing our society by not seeing each other.

The precursor to building walls around nations is building walls around ideas

A reasonable understanding of The Facts is necessary for a concept we don’t really think about very much these days: Compromise. Compromise is what leads to consensus, and consensus is what allows for democracy.

It is not always joyous or exuberant. It doesn’t always feel good to require ourselves to care about other people’s opinions, needs and desires — especially when they don’t agree with our own. But this is what democracy is: a decision to live within a shared idea of the future. A mutual attempt at the hard civility of real compromise in order to keep moving forward together.

We need a moment for catharsis. To breathe. To cry. To be relieved, or to be angry.

But we need to also remember this: if we cannot build the tools of our media to encourage empathy and consensus, we will retreat further into the toxic divisions that have come to define us today.

That careful consensus is the foundation upon which democracy is created — a sober understanding that allows for us to act as one whole. An attempt to find mutuality in our imperfections and differences, with the trust that we are together more extraordinary than our individual parts.

How we can do this better:

Ways to increase your political empathy online

  • Expose yourself to alternative opinions — Read the other side: Your news sources likely have their own bias baked right in. There is no better way of unpacking your own beliefs than exposing yourself to the news sites that disagree with you.
  • Examine the source of news for bias and factual inaccuracy before you share it — Cultivate a healthy skepticism when you see an exciting headline that comes from a website you haven’t heard of. Many of these posts are designed to appeal to hyper-partisanship in order to get you to share them.
  • Engage with people who are different from you when you can — Don’t delete the friends on Facebook that disagree with you (Trolls excepted). You will not ‘pollute’ your worldview by talking to them and trying to understand their perspective. Expend the extra effort to go through a civil discourse, build common ground and avoid a shouting match.

What Facebook can do:

(Warning — this gets nerdy)

Facebook should do more to prioritize posts that come from verified sources. It should functionally de-prioritize/flag sites that peddle fake news (easy to implement) and even hyper-partisan news from both sides (harder). This editorial process should be neutral. A news feed that is optimized for engagement is essentially the algorithmic equivalent of “if it bleeds, it leads” — this is problematic when journalistic due process is missing from a huge portion of web-based news.
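
To make the first of these ideas concrete, here is a minimal sketch in Python of how a ranker could down-weight posts from unverified or hyper-partisan sources rather than scoring on engagement alone. The source names, labels and weights are all invented for illustration; nothing here describes Facebook’s actual systems.

```python
# Hypothetical sketch only: down-weighting posts from unverified or
# hyper-partisan sources instead of ranking purely on engagement.
# Source labels, weights and posts are invented for illustration.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    source: str
    engagement: float  # baseline engagement-driven score

# Assumed editorial labels per source (not a real dataset or a real Facebook API).
SOURCE_LABELS = {
    "established-wire.example": {"verified": True, "hyper_partisan": False},
    "partisan-blog.example": {"verified": False, "hyper_partisan": True},
    "fabricated-news.example": {"verified": False, "hyper_partisan": True},
}

def adjusted_score(post: Post) -> float:
    """Penalise unverified and hyper-partisan sources."""
    label = SOURCE_LABELS.get(post.source, {"verified": False, "hyper_partisan": False})
    score = post.engagement
    if not label["verified"]:
        score *= 0.5   # unverified sources lose half their weight
    if label["hyper_partisan"]:
        score *= 0.3   # hyper-partisan sources are demoted further
    return score

feed = [
    Post("Calm policy explainer", "established-wire.example", engagement=0.6),
    Post("OUTRAGE: you won't believe this", "fabricated-news.example", engagement=0.9),
]
for post in sorted(feed, key=adjusted_score, reverse=True):
    print(f"{adjusted_score(post):.2f}  {post.title}")
```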

Consider Equal Air Time (or Equal Attention). Facebook knows exactly how long you spend consuming the media you do on their platform. They also know how partisan you are (or are likely to be), how old you are, and the kind of media you like. If the content you consume is exclusively partisan (as determined on a per-source basis above), Facebook should rank this transparently, and allow space for sources with opposing political views to enter your feed (demographically, your pool of “friends of friends” can cross into a large range of political perspectives).
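
One way to read this “Equal Attention” idea is as a bookkeeping problem: before deciding what to surface, measure how a reader’s attention is distributed across the partisan spectrum. The sketch below is a toy version of that bookkeeping in Python; the partisanship scale, the bucket thresholds and the session data are assumptions made for the example, not anything Facebook has published.

```python
# Hypothetical sketch of "equal attention" accounting: tally how much viewing
# time goes to each side of an assumed partisanship scale (-1.0 to +1.0).
# The buckets, scores and session data are all invented for illustration.
from collections import defaultdict

# (source_partisanship, seconds_viewed) pairs from an imagined browsing session.
sessions = [
    (+0.8, 120), (+0.9, 300), (+0.7, 90),  # heavily one-sided sources
    (0.0, 30),                             # a single near-median source
]

def bucket(score: float) -> str:
    if score <= -0.3:
        return "left"
    if score >= 0.3:
        return "right"
    return "median"

attention = defaultdict(int)
for score, seconds in sessions:
    attention[bucket(score)] += seconds

total = sum(attention.values())
for name in ("left", "median", "right"):
    print(f"{name:>6}: {attention[name] / total:5.1%} of attention")

# A feed aiming at "equal attention" could use these shares to decide how much
# space to open up for viewpoints the reader currently never sees.
```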

Ensure exposure to median viewpoints, not just opposing ones. The companies that dictate our diet of information must have fail-safes to keep us from isolating ourselves completely inside fully partisan ‘information ghettos’. Using the demographic information Facebook has about us, they can determine just how limited our exposure to alternative viewpoints is, and improve our access to posts outside our immediate social graph. This does require a tagging mechanism for the assumed partisanship of various news sources and articles, but it is doable.
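
As a sketch of what such a fail-safe could look like, the toy re-ranker below (again in Python) checks how one-sided a feed is on an assumed partisanship scale and, if it is too skewed, moves a couple of near-median items to the top. The scale, the threshold and the quota are hypothetical choices made for the example.

```python
# Hypothetical sketch of a "median exposure" fail-safe: if a feed skews too far
# to one side of an assumed partisanship scale, promote a few items whose
# sources sit near the median. Scores, threshold and quota are invented.
from statistics import mean

# (title, source_partisanship) pairs; partisanship runs from -1.0 to +1.0.
feed = [
    ("Rally coverage", 0.8),
    ("Opinion: the other side is wrong", 0.9),
    ("Party fundraiser recap", 0.7),
    ("Budget office report", 0.0),
    ("Local council minutes", -0.1),
]

SKEW_THRESHOLD = 0.4  # how one-sided the feed may get before intervening
MEDIAN_QUOTA = 2      # how many near-median items to move to the top

def rebalance(items):
    skew = mean(score for _, score in items)
    if abs(skew) < SKEW_THRESHOLD:
        return items  # already reasonably balanced; leave the ordering alone
    # Promote the items whose sources sit closest to the median viewpoint.
    near_median = sorted(items, key=lambda item: abs(item[1]))[:MEDIAN_QUOTA]
    rest = [item for item in items if item not in near_median]
    return near_median + rest

for title, score in rebalance(feed):
    print(f"{score:+.1f}  {title}")
```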

Finally, Facebook should be more open with how its algorithm editorializes the types of content we see. Being transparent about this methodology will reduce any claims of partisanship and bias. A solution to this, and all of these problems, can be found without compromising its IP.

Facebook’s success has turned it into one of the most powerful tools we have for connecting to other humans. Something this powerful, which has come to deserve so much of our attention, also deserves our scrutiny in equal measure.

Wednesday, 16 November 2016

* Technocracy is at the heart of the global anti-elite backlash

Book review by Samuel Moyn, professor of law and history at Harvard University

What does it mean to say knowledge is power?

Francis Bacon is alleged to have said it first. In that version, the remark is supposed to have captured the signature aspiration of modernity — to deploy knowledge for the sake of the mastery on which human progress depends. The inquiry of experts would unlock the arcana of nature, and provide a mode of beneficial rule that could escape old criticisms of the power of ill-informed and thus to some extent illegitimate monarchs. “[T]he sovereignty of man lieth hid in knowledge,” Bacon wrote,
wherein many things are reserved, which kings with their treasure cannot buy, nor with their force command; their spials and intelligencers can give no news of them, their seamen and discoverers cannot sail where they grow: now we govern nature in opinions, but we are thrall unto her in necessity . . . [but] we should command her by action.
[Cover of A World of Struggle: How Power, Law, and Expertise Shape Global Political Economy]
Expertise, that is, would offer liberation from the age-old yoke of nature by taking humanity beyond the realm of mere opinion. Kings had proved themselves powerless to lift this yoke, but experts would do so for the sake of man’s advancement and “sovereignty.” It was an optimistic, untroubled, and even visionary statement.

In the several centuries since, expert governance — rule by elite knowledge claimed to be superior to mere opinion — has fallen under suspicion. But there is a serious debate about how to diagnose its possible failings. Bacon’s own younger colleague, and sometime amanuensis, Thomas Hobbes, could not believe his predecessor’s rather optimistic views of the politics of knowledge. According to Hobbes’s radically nominalist account, there was not a world to know nor master independent of human struggle to decide how to think and even talk about that world. “[S]uch is the nature of men,” Hobbes wrote, “that howsoever they may acknowledge many others to be more witty, or more eloquent, or more learned; Yet they will hardly believe there be many so wise as themselves . . . .”

Kings were needed not because people could agree who knew most, but for precisely the opposite reason. Before human beings could decide what the world was like, they would have to find a way to settle their differences. Knowledge was not an alternative to the uninformed power of kings; rather, bitter partisanship about how to know the world provided one more reason for their pacifying authority. But after Hobbes, exactly what drives epistemic struggle — and how best to explain it — has remained persistently controversial.

For Karl Marx and his heirs, expert rule would have to be regarded as a species of ideology originating in and covering up the class domination that, in turn, followed from the mode of production of an age. The forms and workings of intelligence, for this reason, had to be traced to ultimately material factors. For twentieth-century skeptics, things seemed more complicated. While never freeing expertise from the workings of capital entirely, French sociologist Pierre Bourdieu insisted that professional fields had their own internal dynamics of struggle for prestige and status.

Yet like Marx, Bourdieu hoped to demystify these workings, for the sake of better insight and political change. Michel Foucault, in his withering portrait of “power/knowledge,” took cynicism to the breaking point. Knowledge did not merely serve power; it was power. It constituted domination in claiming to neutrally describe reality. And there was no apparent, let alone easy, alternative to subordination. Foucault went as far as possible to reverse the Baconian vision of liberation and legitimation through knowledge — studying experts was for Foucault the great device of delegitimation, with unclear consequences.

In his new book on how the world is ruled today through expert knowledge, Professor David Kennedy enters this continuing discussion in brilliant, pathbreaking, and trademark fashion. Slyly presenting himself as a disinterested observer of global governance, Kennedy eclectically draws on twentieth-century perspectives about knowledge, achieving a synthesis all his own. Presented without theoretical encumbrance or jargon, A World of Struggle is a straightforward but sophisticated account that capitalizes on prior insight to achieve a unique and powerful vantage point. The superlative book wins its distinction not only because it constructs a novel theory but also because it applies that theory to how the globe as a whole is ruled — something no one in the canon of social theory has really done.

According to Kennedy, accounts of global governance are themselves typically products of an expertise that does much of the work of immunizing a contestable world from serious critique or change. “Terribly unjust, subject to crisis, environmentally unwise, everywhere politically and economically captured by the few, and yet somehow impossible for anyone to alter or escape” is Kennedy’s description of the contemporary situation (pp. 31–32). His “hypothesis” in response is that “this stability arises from the relative invisibility and imperviousness of the world of technical management to contestation” (p. 32). To understand expertise is to grasp how the terms of debate and decision about solutions end up reinstating problems.

Much in the book is vintage Kennedy. There is a sinuous prose cast with enviable lucidity in spite of its high level of complexity. There is the structuralist vocation that, from Kennedy’s beginnings, has delighted in providing inventories of options of discourse (and charts graphically illustrating the argumentative choices). Indeed, one of the hallmarks of A World of Struggle is how heavily it focuses on the language that constitutes, in Kennedy’s account, the familiar realities of global governance, from the interstate system to the global economy. There is also the extravagant political hope that Kennedy never imposes on his readers but allows to lurk on the margin as an attractive but vague possibility.

Altogether, Kennedy’s new book reminds his old readers and instructs his new ones why he is, without doubt, the single most important innovator in international legal thought of the past several decades, a fact proved not only by his own arguments but also by his extraordinary influence. Inaugurating a “new stream” of scholarship on international law, Kennedy has brought the field out of its doctrinalism and parochialism into conversation with social thought and humanistic inquiry.

With few possible contenders, like his close associate Professor Martti Koskenniemi, Kennedy may have done the most to make the “invisible college” of international lawyers visible, or at least interesting, to those outside it in diverse fields of academic pursuit. And this book takes that remarkable achievement to a new level. As a result, this is the rare text occupied with international law that is likely to be legible by — indeed, exhilarating to — outsiders to the field, elsewhere in the legal academy and beyond.

Monday, 14 November 2016

* Western Buddhism may be an imposture

Anne Both, anthropologist

Seeing beyond appearances. That, Marion Dapsance reminds us in her essay, is “the metaphysical objective prescribed by Buddhism”. It is therefore entirely unremarkable that the Tibetan lama Sogyal Rinpoché should demand it of his Western followers. What would be far less innocuous, in her view, is how the international Rigpa network operates and what it is after.

[Cover of Les Dévots du bouddhisme, by Marion Dapsance]
To understand how Buddhist centres in France train their disciples, the anthropologist immersed herself in the courses of instruction offered by this enterprise, created in 1978 by Sogyal Rinpoché himself, author of The Tibetan Book of Living and Dying (La Table ronde, 1993), a worldwide bestseller.

Her investigations, spread over seven years, took Marion Dapsance to Nice, Paris, Monaco, Bristol, London and Levallois-Perret (Hauts-de-Seine). Through a touching gallery of portraits and rigorous descriptions of pilgrimages, retreats and classes, she retraces a twofold experience.

On one side, that of the “students”, embarked on the path of liberation from stress and Western materialism: a young psychotherapist, a retired teacher, an ageing hippie, a former Air France pilot.

On the other, that of a young researcher, curious and even highly enthusiastic at the outset, but increasingly disenchanted as her inquiry progresses.

Indeed, the transmission of universal wisdom, whose stages she records in her notebook, departs markedly from the one practised in the land of the Dalai Lama. Meditation, for example, presented to Europeans as the essence of the Tibetan Buddhist tradition, is, incredible as it may seem, entirely unknown there.

The author reminds us, on the other hand, with her understated erudition, that meditatio is a medieval practice of reading the Gospels. Likewise, she explains, the teachings within Rigpa, entirely devoid of “philosophical substance”, are in no way commentaries on sacred texts, unlike those of the lamas of traditional Tibet.

Marion Dapsance, currently in residence at Columbia University (New York), goes so far as to write that one “may justifiably conclude that the ‘masters of crazy wisdom’ are quite literally making fools of Westerners”. This oxymoronic label, “masters of crazy wisdom”, refers to Chögyam Trungpa, an iconoclastic lama and a model for Sogyal Rinpoché. The latter is described by the author as provocative, openly outrageous with his disciples, capricious, waited on in his slightest gestures, and keeping young women (the dakinis) at the service of his every need, sexual ones included.

But putting up with all this, indulging the master’s every whim, giving one’s time and a great deal of one’s money, is still a way of entering into connection with him, of feeling chosen, of acquiring prestigious status, of reaching the nirvana of consecration. The businessman, moreover, reserves this treatment for Westerners alone. For them, it is the price to pay for shedding the ego, casting off common sense and finally attaining “pure perception”.

The systematic dismantling, piece by piece, of this sacred machinery is carried out from carefully collected empirical material. The author reports what she saw and heard. And it is in a tone of utterly disconcerting neutrality that she confides it to us: one of her interviewees admitted to her that she was the reincarnation of Joan of Arc.

Without ever mocking, and without sinking into syrupy empathy, Marion Dapsance offers us an ethnography sprinkled with a saving flash of lightness, on a subject that is singularly short of it.

Her book, fleshed out with solid Tibetological knowledge, avoids the easy routes of caricature, conflation and generalisation, as its closing lines show: “This has nothing to do with the Tibetan practice of Buddhism; it is rather a perverse effect of the transformation of a hierarchical, devotional and ritualistic religion into an ersatz psychotherapy for tired and spiritually destitute Westerners.” The reader, for their part, will not have needed to reach the end of the book to grasp as much.

Philippe Cornu, an eminent specialist on Tibet, published a column against the book on 2 November on the website of Le Monde des religions, entitled “Quand le bouddhisme est attaqué…” (“When Buddhism is attacked…”). The author of this acerbic text is one of the people in charge of the Rigpa centre in Levallois-Perret. Which perhaps explains that. As for the religious congregation, recognised as such in France by a decree of 29 January 2002, it has issued a statement denouncing “accusations [that] fall into sensationalist stereotypes which can easily mislead people”. Ah, appearances...

Thursday, 10 November 2016

* There is no such thing as western civilisation

Kwame Anthony Appiah, professor of philosophy and law at New York University

Like many Englishmen who suffered from tuberculosis in the 19th century, Sir Edward Burnett Tylor went abroad on medical advice, seeking the drier air of warmer regions. Tylor came from a prosperous Quaker business family, so he had the resources for a long trip. In 1855, in his early 20s, he left for the New World, and, after befriending a Quaker archeologist he met on his travels, he ended up riding on horseback through the Mexican countryside, visiting Aztec ruins and dusty pueblos. Tylor was impressed by what he called “the evidence of an immense ancient population”.

And his Mexican sojourn fired in him an enthusiasm for the study of faraway societies, ancient and modern, that lasted for the rest of his life. In 1871, he published his masterwork, Primitive Culture, which can lay claim to being the first work of modern anthropology.

Primitive Culture was, in some respects, a quarrel with another book that had “culture” in the title: Matthew Arnold’s Culture and Anarchy, a collection that had appeared just two years earlier. For Arnold, culture was the “pursuit of our total perfection by means of getting to know, on all the matters which most concern us, the best which has been thought and said in the world”. Arnold wasn’t interested in anything as narrow as class-bound connoisseurship: he had in mind a moral and aesthetic ideal, which found expression in art and literature and music and philosophy.

But Tylor thought that the word could mean something quite different, and in part for institutional reasons, he was able to see that it did. For Tylor was eventually appointed to direct the University Museum at Oxford, and then, in 1896, he was appointed to the first chair of anthropology there. It is to Tylor more than anyone else that we owe the idea that anthropology is the study of something called “culture”, which he defined as “that complex whole which includes knowledge, belief, art, morals, law, custom, and any other capabilities and habits acquired by man as a member of society”. Civilisation, as Arnold understood it, was merely one of culture’s many modes.

Nowadays, when people speak about culture, it is usually either Tylor’s or Arnold’s notion that they have in mind. The two concepts of culture are, in some respects, antagonistic. Arnold’s ideal was “the man of culture” and he would have considered “primitive culture” an oxymoron. Tylor thought it absurd to propose that a person could lack culture. Yet these contrasting notions of culture are locked together in our concept of western culture, which many people think defines the identity of modern western people. So let me try to untangle some of our confusions about the culture, both Tylorian and Arnoldian, of what we have come to call the west.

Someone asked Mahatma Gandhi what he thought of western civilisation, and he replied: “I think it would be a very good idea.” Like many of the best stories, alas, this one is probably apocryphal; but also like many of the best stories, it has survived because it has the flavour of truth. But my own response would have been very different: I think you should give up the very idea of western civilisation. It is at best the source of a great deal of confusion, at worst an obstacle to facing some of the great political challenges of our time. I hesitate to disagree with even the Gandhi of legend, but I believe western civilisation is not at all a good idea, and western culture is no improvement.

One reason for the confusions “western culture” spawns comes from confusions about the west. We have used the expression “the west” to do very different jobs. Rudyard Kipling, England’s poet of empire, wrote, “Oh, east is east and west is west, and never the twain shall meet”, contrasting Europe and Asia, but ignoring everywhere else. During the cold war, “the west” was one side of the iron curtain; “the east” its opposite and enemy. This usage, too, effectively disregarded most of the world.

Often, in recent years, “the west” means the north Atlantic: Europe and her former colonies in North America. The opposite here is a non-western world in Africa, Asia and Latin America – now dubbed “the global south” – though many people in Latin America will claim a western inheritance, too. This way of talking notices the whole world, but lumps a whole lot of extremely different societies together, while delicately carving around Australians and New Zealanders and white South Africans, so that “western” here can look simply like a euphemism for white.

Of course, we often also talk today of the western world to contrast it not with the south but with the Muslim world. And Muslim thinkers sometimes speak in a parallel way, distinguishing between Dar al-Islam, the home of Islam, and Dar al-Kufr, the home of unbelief. I would like to explore this opposition further. Because European and American debates today about whether western culture is fundamentally Christian inherit a genealogy in which Christendom is replaced by Europe and then by the idea of the west.

This civilisational identity has roots going back nearly 1,300 years, then. But to tell the full story, we need to begin even earlier.

For the Greek historian Herodotus, writing in the fifth century BC, the world was divided into three parts. To the east was Asia, to the south was a continent he called Libya, and the rest was Europe. He knew that people and goods and ideas could travel easily between the continents: he himself travelled up the Nile as far as Aswan, and on both sides of the Hellespont, the traditional boundary between Europe and Asia. Herodotus admitted to being puzzled, in fact, as to “why the earth, which is one, has three names, all women’s”. Still, despite his puzzlement, these continents were for the Greeks and their Roman heirs the largest significant geographical divisions of the world.

But here’s the important point: it would not have occurred to Herodotus to think that these three names corresponded to three kinds of people: Europeans, Asians, and Africans. He was born at Halicarnassus – Bodrum in modern Turkey. Yet being born in Asia Minor didn’t make him an Asian; it left him a Greek. And the Celts, in the far west of Europe, were much stranger to him than the Persians or the Egyptians, about whom he knew rather a lot. Herodotus only uses the word “European” as an adjective, never as a noun. For a millennium after his day, no one else spoke of Europeans as a people, either.

Then the geography Herodotus knew was radically reshaped by the rise of Islam, which burst out of Arabia in the seventh century, spreading with astonishing rapidity north and east and west. After the prophet’s death in 632, the Arabs managed in a mere 30 years to defeat the Persian empire that reached through central Asia as far as India, and to wrest provinces from Rome’s residue in Byzantium.

The Umayyad dynasty, which began in 661, pushed on west into north Africa and east into central Asia. In early 711, it sent an army across the straits of Gibraltar into Spain, which the Arabs called al-Andalus, where it attacked the Visigoths who had ruled much of the Roman province of Hispania for two centuries. Within seven years, most of the Iberian Peninsula was under Muslim rule; not until 1492, nearly 800 years later, was the whole peninsula under Christian sovereignty again.

The Muslim conquerors of Spain had not planned to stop at the Pyrenees, and they made regular attempts in the early years to move further north. But near Tours, in 732 CE, Charles Martel, Charlemagne’s grandfather, defeated the forces of al-Andalus, and this decisive battle effectively ended the Arab attempts at the conquest of Frankish Europe. The 18th-century historian Edward Gibbon, overstating somewhat, observed that if the Arabs had won at Tours, they could have sailed up the Thames. “Perhaps,” he added, “the interpretation of the Koran would now be taught in the schools of Oxford, and her pulpits might demonstrate to a circumcised people the sanctity and truth of the revelation of Mahomet.”

What matters for our purposes is that the first recorded use of a word for Europeans as a kind of person, so far as I know, comes out of this history of conflict. In a Latin chronicle, written in 754 in Spain, the author refers to the victors of the Battle of Tours as “Europenses”, Europeans. So, simply put, the very idea of a “European” was first used to contrast Christians and Muslims. (Even this, however, is a bit of a simplification. In the middle of the eighth century much of Europe was not yet Christian.)

Now, nobody in medieval Europe would have used the word “western” for that job. For one thing, the coast of Morocco, home of the Moors, stretches west of Ireland. For another, there were Muslim rulers in the Iberian Peninsula – part of the continent that Herodotus called Europe – until nearly the 16th century. The natural contrast was not between Islam and the west, but between Christendom and Dar al‑Islam, each of which regarded the other as infidels, defined by their unbelief.

Starting in the late 14th century, the Turks who created the Ottoman empire gradually extended their rule into parts of Europe: Bulgaria, Greece, the Balkans, and Hungary. Only in 1529, with the defeat of Suleiman the Magnificent’s army at Vienna, did the reconquest of eastern Europe begin. It was a slow process. It wasn’t until 1699 that the Ottomans finally lost their Hungarian possessions; Greece became independent only in the early 19th century, Bulgaria even later.

We have, then, a clear sense of Christian Europe – Christendom – defining itself through opposition. And yet the move from “Christendom” to “western culture” isn’t straightforward.

For one thing, the educated classes of Christian Europe took many of their ideas from the pagan societies that preceded them. At the end of the 12th century, Chrétien de Troyes, born a couple of hundred kilometres south-west of Paris, celebrated these earlier roots: “Greece once had the greatest reputation for chivalry and learning,” he wrote. “Then chivalry went to Rome, and so did all of learning, which now has come to France.”

The idea that the best of the culture of Greece was passed by way of Rome into western Europe gradually became, in the middle ages, a commonplace. In fact this process had a name. It was called the “translatio studii”: the transfer of learning. And it was an astonishingly persistent idea. More than six centuries later, Georg Wilhelm Friedrich Hegel, the great German philosopher, told the students of the high school he ran in Nuremberg: “The foundation of higher study must be and remain Greek literature in the first place, Roman in the second.”

So from the late middle ages until now, people have thought of the best in the culture of Greece and Rome as a civilisational inheritance, passed on like a precious golden nugget, dug out of the earth by the Greeks, transferred, when the Roman empire conquered them, to Rome. Partitioned between the Flemish and Florentine courts and the Venetian Republic in the Renaissance, its fragments passed through cities such as Avignon, Paris, Amsterdam, Weimar, Edinburgh and London, and were finally reunited – pieced together like the broken shards of a Grecian urn – in the academies of Europe and the United States.

There are many ways of embellishing the story of the golden nugget. But they all face a historical difficulty; if, that is, you want to make the golden nugget the core of a civilisation opposed to Islam. Because the classical inheritance it identifies was shared with Muslim learning. In the Baghdad of the ninth-century Abbasid caliphate, the palace library featured the works of Plato and Aristotle, Pythagoras and Euclid, translated into Arabic. In the centuries that Petrarch called the Dark Ages, when Christian Europe made little contribution to the study of Greek classical philosophy, and many of the texts were lost, these works were preserved by Muslim scholars. Much of our modern understanding of classical philosophy among the ancient Greeks we have only because those texts were recovered from the Arabs by European scholars in the Renaissance.

In the mind of its Christian chronicler, as we saw, the battle of Tours pitted Europeans against Islam; but the Muslims of al-Andalus, bellicose as they were, did not think that fighting for territory meant that you could not share ideas. By the end of the first millennium, the cities of the Caliphate of Cordoba were marked by the cohabitation of Jews, Christians, and Muslims, of Berbers, Visigoths, Slavs and countless others.

There were no recognised rabbis or Muslim scholars at the court of Charlemagne; in the cities of al-Andalus there were bishops and synagogues. Racemondo, Catholic bishop of Elvira, was Cordoba’s ambassador to the courts of the Byzantine and the Holy Roman empires. Hasdai ibn Shaprut, leader of Cordoba’s Jewish community in the middle of the 10th century, was not only a great medical scholar, he was the chairman of the Caliph’s medical council; and when the Emperor Constantine in Byzantium sent the Caliph a copy of Dioscorides’s De Materia Medica, he took up Ibn Shaprut’s suggestion to have it translated into Arabic, and Cordoba became one of the great centres of medical knowledge in Europe. The translation into Latin of the works of Ibn Rushd, born in Cordoba in the 12th century, began the European rediscovery of Aristotle. He was known in Latin as Averroes, or more commonly just as “The Commentator”, because of his commentaries on Aristotle. So the classical traditions that are meant to distinguish western civilisation from the inheritors of the caliphates are actually a point of kinship with them.

But the golden-nugget story was bound to be beset by difficulties. It imagines western culture as the expression of an essence – a something – which has been passed from hand to hand on its historic journey. The pitfalls of this sort of essentialism are evident in a wide range of cases. Whether you are discussing religion, nationality, race or culture, people have supposed that an identity that survives through time and space must be propelled by some potent common essence. But that is simply a mistake. What was England like in the days of Chaucer, father of English literature, who died more than 600 years ago? Take whatever you think was distinctive of it, whatever combination of customs, ideas, and material things that made England characteristically English then. Whatever you choose to distinguish Englishness now, it isn’t going to be that. Rather, as time rolls on, each generation inherits the label from an earlier one; and, in each generation, the label comes with a legacy. But as the legacies are lost or exchanged for other treasures, the label keeps moving on. And so, when some of those in one generation move from the territory to which English identity was once tied – move, for example, to a New England – the label can even travel beyond the territory. Identities can be held together by narratives, in short, without essences. You don’t get to be called “English” because there’s an essence that this label follows; you’re English because our rules determine that you are entitled to the label by being somehow connected with a place called England.

So how did the people of the north Atlantic, and some of their kin around the world, get connected to a realm we call the west, and gain an identity as participants in something called western culture?

It will help to recognise that the term “western culture” is surprisingly modern – more recent certainly than the phonograph. Tylor never spoke of it. And indeed he had no reason to, since he was profoundly aware of the internal cultural diversity even of his own country. In 1871 he reported evidence of witchcraft in rural Somerset. A blast of wind in a pub had blown some roasted onions stabbed with pins out of the chimney. “One,” Tylor wrote, “had on it the name of a brother magistrate of mine, whom the wizard, who was the alehouse-keeper, held in particular hatred ... and whom apparently he designed to get rid of by stabbing and roasting an onion representing him.” Primitive culture, indeed.

So the very idea of the “west,” to name a heritage and object of study, doesn’t really emerge until the 1890s, during a heated era of imperialism, and gains broader currency only in the 20th century. When, around the time of the first world war, Oswald Spengler wrote the influential book translated as The Decline of the West – a book that introduced many readers to the concept – he scoffed at the notion that there were continuities between western culture and the classical world. During a visit to the Balkans in the late 1930s, the writer and journalist Rebecca West recounted a visitor’s sense that “it’s uncomfortably recent, the blow that would have smashed the whole of our western culture”. The “recent blow” in question was the Turkish siege of Vienna in 1683.

If the notion of Christendom was an artefact of a prolonged military struggle against Muslim forces, our modern concept of western culture largely took its present shape during the cold war. In the chill of battle, we forged a grand narrative about Athenian democracy, the Magna Carta, Copernican revolution, and so on. Plato to Nato. Western culture was, at its core, individualistic and democratic and liberty-minded and tolerant and progressive and rational and scientific. Never mind that pre-modern Europe was none of these things, and that until the past century democracy was the exception in Europe – something that few stalwarts of western thought had anything good to say about. The idea that tolerance was constitutive of something called western culture would have surprised Edward Burnett Tylor, who, as a Quaker, had been barred from attending England’s great universities. To be blunt: if western culture were real, we wouldn’t spend so much time talking it up.

Of course, once western culture could be a term of praise, it was bound to become a term of dispraise, too. Critics of western culture, producing a photonegative emphasising slavery, subjugation, racism, militarism, and genocide, were committed to the very same essentialism, even if they saw a nugget not of gold but of arsenic.

Talk of “western culture” has had a larger implausibility to overcome. It places, at the heart of identity, all manner of exalted intellectual and artistic achievements – philosophy, literature, art, music; the things Arnold prized and humanists study. But if western culture was there in Troyes in the late 12th century when Chrétien was alive, it had little to do with the lives of most of his fellow citizens, who did not know Latin or Greek, and had never heard of Plato. Today the classical heritage plays no greater role in the everyday lives of most Americans or Britons. Are these Arnoldian achievements that hold us together? Of course not. What holds us together, surely, is Tylor’s broad sense of culture: our customs of dress and greeting, the habits of behaviour that shape relations between men and women, parents and children, cops and civilians, shop assistants and consumers.

Intellectuals like me have a tendency to suppose that the things we care about are the most important things. I don’t say they don’t matter. But they matter less than the story of the golden nugget suggests.

So how have we bridged the chasm here? How have we managed to tell ourselves that we are rightful inheritors of Plato, Aquinas, and Kant, when the stuff of our existence is more Beyoncé and Burger King? Well, by fusing the Tylorian picture and the Arnoldian one, the realm of the everyday and the realm of the ideal. And the key to this was something that was already present in Tylor’s work.

Remember his famous definition: it began with culture as “that complex whole”. What you’re hearing is something we can call organicism. A vision of culture not as a loose assemblage of disparate fragments but as an organic unity, each component, like the organs in a body, carefully adapted to occupy a particular place, each part essential to the functioning of the whole. The Eurovision song contest, the cutouts of Matisse, the dialogues of Plato are all parts of a larger whole. As such, each is a holding in your cultural library, so to speak, even if you have never personally checked it out. Even if it isn’t your jam, it is still your heritage and possession. Organicism explained how our everyday selves could be dusted with gold.

Now, there are organic wholes in our cultural life: the music, the words, the set-design, the dance of an opera fit and are meant to fit together. It is, in the word Wagner invented, a Gesamtkunstwerk, a total work of art. But there isn’t one great big whole called culture that organically unites all these parts. Spain, in the heart of “the west,” resisted liberal democracy for two generations after it took off in India and Japan in “the east,” the home of Oriental despotism. Jefferson’s cultural inheritance – Athenian liberty, Anglo-Saxon freedom – did not preserve the United States from creating a slave republic. At the same time, Franz Kafka and Miles Davis can live together as easily – perhaps even more easily – than Kafka and his fellow Austro-Hungarian Johann Strauss. You will find hip-hop in the streets of Tokyo. The same is true in cuisine: Britons once swapped their fish and chips for chicken tikka masala, now, I gather, they’re all having a cheeky Nando’s.

Once we abandon organicism, we can take up the more cosmopolitan picture in which every element of culture, from philosophy or cuisine to the style of bodily movement, is separable in principle from all the others – you really can walk and talk like an African-American and think with Matthew Arnold and Immanuel Kant, as well as with Martin Luther King and Miles Davis. No Muslim essence stops the inhabitants of Dar al-Islam from taking up anything from western civilisation, including Christianity or democracy. No western essence is there to stop a New Yorker of any ancestry taking up Islam.

The stories we tell that connect Plato or Aristotle or Cicero or Saint Augustine to contemporary culture in the north Atlantic world have some truth in them, of course. We have self-conscious traditions of scholarship and argumentation. The delusion is to think that it suffices that we have access to these values, as if they are tracks on a Spotify playlist we have never quite listened to. If these thinkers are part of our Arnoldian culture, there is no guarantee that what is best in them will continue to mean something to the children of those who now look back to them, any more than the centrality of Aristotle to Muslim thought for hundreds of years guarantees him an important place in modern Muslim cultures.

Values aren’t a birthright: you need to keep caring about them. Living in the west, however you define it, being western, provides no guarantee that you will care about western civilisation. The values European humanists like to espouse belong just as easily to an African or an Asian who takes them up with enthusiasm as to a European. By that very logic, of course, they do not belong to a European who has not taken the trouble to understand and absorb them. The same, of course, is true in the other direction. The story of the golden nugget suggests that we cannot help caring about the traditions of “the west” because they are ours: in fact, the opposite is true. They are only ours if we care about them. A culture of liberty, tolerance, and rational inquiry: that would be a good idea. But these values represent choices to make, not tracks laid down by a western destiny.

In the year of Edward Burnett Tylor’s death, what we have been taught to call western civilisation stumbled into a death match with itself: the Allies and the Central Powers hurled bodies at each other, marching young men to their deaths in order to “defend civilisation”. The blood-soaked fields and gas-poisoned trenches would have shocked Tylor’s evolutionist, progressivist hopes, and confirmed Arnold’s worst fears about what civilisation really meant. Arnold and Tylor would have agreed, at least, on this: culture isn’t a box to check on the questionnaire of humanity; it is a process you join, a life lived with others.

Culture – like religion and nation and race – provides a source of identity for contemporary human beings. And, like all three, it can become a form of confinement, conceptual mistakes underwriting moral ones. Yet all of them can also give contours to our freedom. Social identities connect the small scale where we live our lives alongside our kith and kin with larger movements, causes, and concerns. They can make a wider world intelligible, alive, and urgent. They can expand our horizons to communities larger than the ones we personally inhabit. But our lives must make sense, too, at the largest of all scales. We live in an era in which our actions, in the realm of ideology as in the realm of technology, increasingly have global effects. When it comes to the compass of our concern and compassion, humanity as a whole is not too broad a horizon.

We live with seven billion fellow humans on a small, warming planet. The cosmopolitan impulse that draws on our common humanity is no longer a luxury; it has become a necessity. And in encapsulating that creed I can draw on a frequent presence in courses in western civilisation, because I don’t think I can improve on the formulation of the dramatist Terence: a former slave from Roman Africa, a Latin interpreter of Greek comedies, a writer from classical Europe who called himself Terence the African. He once wrote, “Homo sum, humani nihil a me alienum puto.” “I am human, I think nothing human alien to me.” Now there’s an identity worth holding on to.