Tuesday, 6 March 2018

Religious traditions are human affairs (Buddhism is no exception)

Dan Arnold is an associate professor of philosophy of religions at the University of Chicago Divinity School. Alicia Turner is an associate professor of humanities and religious studies at York University.

Most adherents of the world’s religions claim that their traditions place a premium on virtues like love, compassion and forgiveness, and that the state toward which they aim is one of universal peace. History has shown us, however, that religious traditions are human affairs, and that no matter how noble they may be in their aspirations, they display a full range of both human virtues and human failings.

While few sophisticated observers are shocked, then, by the occurrence of religious violence, there is one notable exception in this regard: there remains a persistent and widespread belief that Buddhist societies really are peaceful and harmonious. This presumption is evident in the reactions of astonishment many people have to events like those taking place in Myanmar. How, many wonder, could a Buddhist society — especially Buddhist monks! — have anything to do with something so monstrously violent as the ethnic cleansing now being perpetrated on Myanmar’s long-beleaguered Rohingya minority? Aren’t Buddhists supposed to be compassionate and pacifist?

While history suggests it is naïve to be surprised that Buddhists are as capable of inhuman cruelty as anyone else, such astonishment is nevertheless widespread — a fact that partly reflects the distinctive history of modern Buddhism. By “modern Buddhism,” we mean not simply Buddhism as it happens to exist in the contemporary world but rather the distinctive new form of Buddhism that emerged in the 19th and 20th centuries. In this period, Buddhist religious leaders, often living under colonial rule in the historically Buddhist countries of Asia, together with Western enthusiasts who eagerly sought their teachings, collectively produced a newly ecumenical form of Buddhism — one that often indifferently drew from the various Buddhist traditions of countries like China, Sri Lanka, Tibet, Japan and Thailand.

This modern form of Buddhism is distinguished by a novel emphasis on meditation and by a corresponding disregard for rituals, relics, rebirth and all the other peculiarly “religious” dimensions of history’s many Buddhist traditions. The widespread embrace of modern Buddhism is reflected in familiar statements insisting that Buddhism is not a religion at all but rather (take your pick) a “way of life,” a “philosophy” or (reflecting recent enthusiasm for all things cognitive-scientific) a “mind science.”

Buddhism, in such a view, is not exemplified by practices like Japanese funerary rites, Thai amulet-worship or Tibetan oracular rituals but by the blandly nonreligious mindfulness meditation now becoming even more widespread than yoga. To the extent that such deracinated expressions of Buddhist ideas are accepted as defining what Buddhism is, it can indeed be surprising to learn that the world’s Buddhists have, in both past and present, engaged in violence and destruction.

There is, however, no shortage of historical examples of violence in Buddhist societies. Sri Lanka’s long and tragic civil war (1983-2009), for example, involved a great deal of specifically Buddhist nationalism on the part of a Sinhalese majority resentful of the presence of Tamil Hindus in what the former took to be the last bastion of true Buddhism (the “island of dharma”). Political violence in modern Thailand, too, has often been inflected by Buddhist involvement, and there is a growing body of scholarly literature on the martial complicity of Buddhist institutions in World War II-era Japanese nationalism. Even the history of the Dalai Lama’s own sect of Tibetan Buddhism includes events like the razing of rival monasteries, and recent decades have seen a controversy centering on a wrathful protector deity believed by some of the Dalai Lama’s fellow religionists to heap destruction on the false teachers of rival sects.

These and other such examples have, to be sure, often involved eloquent Buddhist critics of violence — but the fact remains that the histories of Buddhist societies are as checkered as most human history.

It is important to emphasize that the current violence against the Rohingya is not a straightforwardly “religious” matter. Myanmar’s long history of exclusion and violence toward the Rohingya has typically been framed by the question of who counts as a legitimate ethnic minority and who is instead to be judged a foreigner (and thus an illegal migrant). It is also significant that the contemporary nation-state of Myanmar represents the blending of the former military dictatorship and the democratically elected National League for Democracy led by Aung San Suu Kyi; in this hybrid form of government, the mechanisms and influence of civil society and public opinion are relatively new.

Nevertheless, the violence against the Rohingya is certainly related to increasingly popular campaigns in recent years to revive Myanmar’s Buddhist tradition (understood by some to be the marker of “real” Burmese identity) and to protect it particularly against the threat that Islam is thought to represent. Popular campaigns to this effect involve the politics of monastic hierarchies, revivalist education campaigns, the advancement of laws for the “protection of race and religion” and attempts to influence the 2015 elections. While the movement is diverse, there is little doubt that it is shaped by (and that it further fuels) a strong anti-Muslim discourse.

This anti-Muslim discourse is, to be sure, exacerbated by all manner of sociopolitical considerations (in Myanmar as elsewhere there is widespread uncertainty at a time of rapid economic, social and political change), and these and other factors are used by a wide range of political actors to gain advantage in the new hybrid democracy. One notion central to this discourse, though, is the idea that Buddhism is under threat in the contemporary world — an idea that appears not only in Myanmar’s history but also in the Buddhist texts, written in the Indic language of Pali, that are taken as canonical in Myanmar. Indeed, many Buddhist traditions preserve narratives (undergirded by the cardinal doctrine of impermanence) to the effect that the Buddha’s teachings are always in decline.

Efforts to revive and preserve Buddhism against this supposed decline have driven many developments in Burmese Buddhism for at least two centuries. One such movement was the Buddhist leader Ledi Sayadaw’s colonial-era program of teaching insight meditation to Buddhist laypeople, who had not traditionally engaged in the meditative and other practices typical only of monastics. This lay meditation movement was later promoted as a practice available to an international audience — a development that is part of the history of contemporary Western fascination with mindfulness.

What is especially interesting is that Buddhist proponents of anti-Muslim discourse often assert that Myanmar is under threat from Muslims precisely because Buddhism is, they say, a uniquely peaceful and tolerant religion. In arguing that Rohingya are illegal immigrants who promote an exclusivist and proselytizing religion that is bent on geographical and cultural conquest through conversion and marriage, some Buddhist leaders in Myanmar thus exploit the very same presumption of uniform tolerance and peacefulness that makes many Westerners uniquely surprised by Buddhist violence.

There are, in fact, important historical reasons that the idea of distinctively Buddhist tolerance figures both in nationalist disparagement of Myanmar’s Rohingya and in widespread Western astonishment at the idea of Buddhists engaging in violence. Both phenomena have something to do with Myanmar’s experience under British colonial rule, during which religion came to be an important and operative aspect of Burmese identity.

In this regard, it is not self-evident that being “Buddhist” or “Muslim” should be taken as the most salient facts about people who are many other things (Burmese, shopkeepers, farmers, students) besides. Nevertheless, religious identity under British rule came to be overwhelmingly significant — significant enough that it can now be mobilized to turn large numbers of Buddhists against the Muslim neighbors with whom they have lived peacefully for generations.

The British colonial state required, for instance, that every person have a single religious identity for the purposes of personal law and administration. Such policies reflected the extent to which colonial administrators typically interpreted all of the various cultural interactions in colonial Burma through the lens of “world religions.” According to this way of seeing things, relatively distinct and static religious traditions were defined in opposition to one another, with each one thought to infuse its communities of believers with distinctive characteristics. One of the characteristics ascribed to “Buddhists,” according to this rubric, was that they are generally tolerant and pacifist. The idea of Myanmar’s Buddhists as distinctively tolerant, then, became a key mechanism for dividing Burmese Buddhists from the Indian Hindus and Muslims living alongside them.

Colonial discourse that praised Burmese Buddhists for their tolerance functioned in part to condemn the “superstitious” and “backward” practices of caste Hindus and Muslims in colonial Myanmar. This discourse was picked up by Burmese nationalists and is now invoked, tragically, to justify violence toward Rohingya Muslims.

There is a philosophically problematic presupposition that also figures in widespread surprise at the very idea of violence perpetrated by Buddhists — that there is a straightforward relationship between the beliefs people hold and the likelihood that they will behave in corresponding ways.

Even if we suppose that most Buddhists, or members of any other religious group, really do hold beliefs that are pacifist and tolerant, we have no reason to expect that they will really be pacifist and tolerant. As Immanuel Kant well understood, we are not transparent to ourselves and can never exhaustively know why we do what we do. We can never be certain whether or to what extent we have acted for the reasons we think we did (whether because, for example, “it was the right thing to do”), or whether we are under the sway of psychological, neurophysiological or socioeconomic causes that are altogether opaque to us.

That doesn’t mean that we should (or can) jettison all reference to our stated beliefs, reasons and rationality; indeed, Kant also cogently argued that despite the efforts of all manner of determinists, we cannot coherently explain these away (for any attempt to explain away our rationality would itself represent a use of that faculty). But it does mean that we cannot infer from, say, a society’s widely held belief in toleration and peace that the actions of people in that society will be strictly guided by those beliefs.

We should thus be wary of any narrative on which historical events are straightforwardly explained by the fact that the people in any society hold whatever religious beliefs they do. It just doesn’t follow from the fact that someone is admirable — or for that matter, that she is vile — that it is because of her beliefs that she is so. Given this, we should expect that even in societies where virtuous beliefs are widely held, we will find pretty much the same range of human failings evident throughout history. Buddhist societies are no different in this respect than others.

Many of history’s great Buddhist philosophers would themselves acknowledge as much. Buddhist thinkers have typically emphasized that there is a profound difference between merely assenting to a belief (for example, that all sentient beings deserve compassion) and actually living in ways informed by that belief. To be really changed by a belief regarding one’s relationship to all other beings, one must cultivate that belief — one must come to experience it as vividly real — through the disciplined practices of the Buddhist path.

The reason this is necessary, Buddhist philosophers recognized, is that all of us — even those who are Buddhists — are deeply habituated to self-centered ways of being. Indeed, if that weren’t the case, there would be no need for Buddhist practice; it is just because people everywhere (even in Tibet, Myanmar and Japan) are generally self-centered that it takes so much work — innumerable lifetimes of it, according to many Buddhists — to overcome the habituated dispositions that typically run riot over our stated beliefs.

The basic Buddhist analysis of the human predicament makes sense, as well, of the irony of colonialist conceptions of Buddhism and of the misguidedness of colonial attempts to exploit religious identities. According to a Buddhist analysis, we go through life thinking we’re advancing our own interests, while actually producing ever more suffering because we misunderstand ourselves.

Similarly, as the case of Myanmar shows, the colonial origins of the modern secular state have, in some ways, insidiously fostered the hardening of religious identities. To that extent, the violence perpetrated by Buddhists in Myanmar, astonishing though it might seem to us, may not be so far from the origins of our own ways of perceiving the world. It is clear that this violence is driven by Burmese participation in (and interpretation of) global contemporary discourses that also shape societies in Europe and North America, where the vilification of Islam and of immigrants has (not coincidentally) also been widespread.

Indeed, our own perception of Buddhism as peaceful and tolerant may itself contribute to a global discourse that has, among other things, represented Muslims as less than full citizens — indeed, less than fully human — in Myanmar as in many other places.

Wednesday, 28 February 2018

Neuroscience “will play a huge role in the future of business education”

Seb Murray
When Victoria Westerhoff decided to study for an MBA at the University of Pennsylvania’s Wharton business school in 2016, an unusual elective caught her eye: consumer neuroscience. The field was important to her — she had trained as a cognitive scientist at Yale.
As part of the Wharton course, Ms Westerhoff used an eye-tracking system to assess how much attention she paid to product placement in film clips, and she noted a lift in attention paid to products placed prominently. “The more time your eyeballs are on something, the larger the impact,” she says.
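
The underlying measurement is simple to sketch. What follows is a minimal, hypothetical example in Python, not Wharton’s actual tooling: an eye tracker emits timestamped gaze coordinates, and attention to a placement can be approximated as total dwell time inside a region of interest. The data format and coordinates here are assumptions for illustration.

# Illustrative sketch only: estimating dwell time on a product placement
# from raw eye-tracker samples. Data format and ROI are assumptions.
from typing import List, Tuple

Sample = Tuple[float, float, float]  # (timestamp in seconds, x, y in pixels)

def dwell_time(samples: List[Sample],
               roi: Tuple[float, float, float, float]) -> float:
    """Total seconds the gaze falls inside roi = (x_min, y_min, x_max, y_max)."""
    x_min, y_min, x_max, y_max = roi
    total = 0.0
    # Credit each inter-sample interval to the sample that starts it.
    for (t0, x, y), (t1, _, _) in zip(samples, samples[1:]):
        if x_min <= x <= x_max and y_min <= y <= y_max:
            total += t1 - t0
    return total

# Toy usage: only the middle sample lands inside the 100x100 ROI.
gaze = [(0.00, 50, 50), (0.25, 150, 150), (0.50, 400, 400)]
print(dwell_time(gaze, (100, 100, 200, 200)))  # -> 0.25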

That research made her realise how science can improve business. Neuroscientific consumer research can be more detailed and effective than traditional methods, such as surveys. “We can capture the emotional responses that help drive unconscious decisions, including what we buy,” Ms Westerhoff says.

Business education is not limited to accounting, strategy and finance. Future leaders are trying out “mind-scanning” electroencephalography (EEG), heart-rate monitors and meditation, as schools create courses at the nexus of business and brain science to help students improve productivity, influence decision-making and handle stress.
Neuroscience “will play a huge role in the future of business education,” says Michael Platt, a Wharton professor, because “we have reached a point where we understand so much about the human brain — how it processes information — that we can use neuroscience to do business better”.

Thomas Bonfiglio, a regional director with American Medical Response in New York, a medical transportation company, says practising guided meditation with his team at the beginning of meetings has made them more productive. Mr Bonfiglio learnt techniques on a two-day neuroscience for leadership course at MIT’s Sloan School of Management, in 2014.
“We have a lot of aggressive, alpha-type personalities,” he says. “It was often difficult to get the group to work together.” But after introducing meditation, they worked more quickly and effectively, Mr Bonfiglio says.

“At first people were sceptical because it took up time. However, I found that instead of arguments, there was more positive discussion, and the tone was more conciliatory.”
Another reason schools launch neuroscience programmes is that students demand them. At Columbia Business School, enrolment in a three-day “neuroleadership” executive course has increased by 50 per cent over the past two years.

“Demand is growing because business leaders who are ahead of the curve know that emotion can impact their performance,” says Yoshie Tomozumi Nakamura, Columbia’s director of organisational learning and research.
A moderate increase in heart rate can improve performance because it increases the amount of blood in the brain, and the neurotransmitter activity that enhances cognitive processing, according to Lee Waller, director of research at the UK’s Hult International Business School.

“We think clearly, make good decisions and learn well,” she says. But too much stress causes the opposite response as more blood flows to the limbs — known as “fight or flight”— and that reduces cognitive function.
Participants on a three-day leadership programme at Ashridge Executive Education, part of Hult, practise stressful scenarios, such as criticising an employee’s performance.

They wear heart-rate variability monitors to understand the impact of stress on their sympathetic nervous system — which activates the fight-or-flight response — and how well their bodies recover.
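
One standard statistic such monitors report is RMSSD, the root mean square of successive differences between heartbeats. The sketch below is illustrative only and is not drawn from the Ashridge programme’s actual analysis; the sample values are invented.

# Illustrative sketch, not Ashridge's actual analysis: RMSSD, a common
# heart-rate-variability statistic, computed from RR intervals (the gaps
# between successive heartbeats, in milliseconds). HRV typically drops
# under acute stress and rebounds as the body recovers.
from math import sqrt
from typing import List

def rmssd(rr_intervals_ms: List[float]) -> float:
    """Root mean square of successive differences between RR intervals."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return sqrt(sum(d * d for d in diffs) / len(diffs))

# Toy usage: a relaxed trace varies more beat to beat than a stressed one.
calm = [812, 845, 790, 860, 805]
stressed = [602, 598, 605, 600, 603]
print(rmssd(calm) > rmssd(stressed))  # -> True
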
Sport and management

Elite athletes are helping to improve managers’ performance. This year, 100 MBA and executive MBA students from Mannheim Business School will visit TSG 1899 Hoffenheim, a top-tier German football club.
They aim to learn how to use sports analytics — the application of data and analytics to performance management — in business.

TSG teamed up with SAP to place sensors on footballers to gather data from training, such as speed averages and ball possession. The data can be used to personalise training to address a player’s strengths or weaknesses, for instance by working more on acceleration, and to assess movement patterns to reduce the chance of injuries.
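
To make that kind of exercise concrete, here is a minimal, hypothetical sketch (not SAP’s or TSG Hoffenheim’s actual system) of turning raw position samples from a wearable sensor into a per-session speed metric; the field names and the benchmark are invented for illustration.

# Illustrative sketch only: deriving a simple per-session metric from
# wearable-sensor position samples. Names and thresholds are hypothetical.
from dataclasses import dataclass
from math import hypot
from typing import List

@dataclass
class TrackingPoint:
    t: float  # seconds since session start
    x: float  # metres
    y: float  # metres

def average_speed(track: List[TrackingPoint]) -> float:
    """Mean speed in metres per second across a training session."""
    dist = sum(hypot(b.x - a.x, b.y - a.y)
               for a, b in zip(track, track[1:]))
    duration = track[-1].t - track[0].t
    return dist / duration if duration > 0 else 0.0

# Hypothetical benchmark (m/s): flag candidates for extra acceleration work.
def needs_acceleration_work(track: List[TrackingPoint],
                            benchmark: float = 2.0) -> bool:
    return average_speed(track) < benchmark
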
“We will ask the students to come up with ways to use people analytics to improve organisational performance,” says Dr Sabine Staritz, Mannheim director of corporate relations.

For example, some companies track employees’ heart rate, sleep and other personal data to avoid a different type of injury: burnout.
“Practice develops ‘muscle memory’: the next time we encounter the experience, our brain remembers how we dealt with it, and we are able to cope with the stress better,” Mrs Waller says.

In an ethics course at Kellogg School at Northwestern University, MBA students are taught how neuroscience can “influence people or persuade people”, says Adam Waytz, an associate professor.
He says research suggests that people are more likely to take moral action, such as donating to disaster relief, when they feel emotion, rather than when presented with reasoning and logic. That understanding may help fundraisers design more effective campaigns, for example.

However, neuroscience presents business schools with problems. One challenge they face in teaching the subject is finding scholars who can apply neuroscientific research to business, says Patricia Riddell, professor of applied neuroscience with Henley Business School at the University of Reading.
“That expertise is in very short supply,” she says. Henley has used academics from the University of Reading’s psychology department to teach neuroscience to students on its MA in leadership course.

The greater worry is that neuroscience could be used in a way that manipulates people into buying products, for example, by using stimuli in advertising to activate parts of the brain associated with pleasure.
“We do not want students to graduate and claim that they have found the ‘buy button’ in the brain,” says Angelika Dimoka, director of the Center for Neural Decision Making at the Fox School of Business.

Fox runs a PhD programme in decision neuroscience. “We teach our students about the ethical use of neurophysiological tools in business, by organising workshops and conferences,” Dr Dimoka says. “But we can’t police every future decision they make.”

Friday, 23 February 2018

Bring on more “soy boys”* please!

Michael Ian Black, a comedian, actor and author

I used to have this one-liner: “If you want to emasculate a guy friend, when you’re at a restaurant, ask him everything that he’s going to order, and then when the waitress comes … order for him.” It’s funny because it shouldn’t be that easy to rob a man of his masculinity — but it is.

Last week, 17 people, most of them teenagers, were shot dead at a Florida school. Marjory Stoneman Douglas High School now joins the ranks of Sandy Hook, Virginia Tech, Columbine and too many other sites of American carnage. What do these shootings have in common? Guns, yes. But also, boys. Girls aren’t pulling the triggers. It’s boys. It’s almost always boys.

America’s boys are broken. And it’s killing us.

The brokenness of the country’s boys stands in contrast to its girls, who still face an abundance of obstacles but go into the world increasingly well equipped to take them on.

The past 50 years have redefined what it means to be female in America. Girls today are told that they can do anything, be anyone. They’ve absorbed the message: They’re outperforming boys in school at every level. But it isn’t just about performance. To be a girl today is to be the beneficiary of decades of conversation about the complexities of womanhood, its many forms and expressions.

Boys, though, have been left behind. No commensurate movement has emerged to help them navigate toward a full expression of their gender. It’s no longer enough to “be a man” — we no longer even know what that means.

Too many boys are trapped in the same suffocating, outdated model of masculinity, where manhood is measured in strength, where there is no way to be vulnerable without being emasculated, where manliness is about having power over others. They are trapped, and they don’t even have the language to talk about how they feel about being trapped, because the language that exists to discuss the full range of human emotion is still viewed as sensitive and feminine.

Men feel isolated, confused and conflicted about their natures. Many feel that the very qualities that used to define them — their strength, aggression and competitiveness — are no longer wanted or needed; many others never felt strong or aggressive or competitive to begin with. We don’t know how to be, and we’re terrified.

But to even admit our terror is to be reduced, because we don’t have a model of masculinity that allows for fear or grief or tenderness or the day-to-day sadness that sometimes overtakes us all.

Case in point: A few days ago, I posted a brief thread about these thoughts on Twitter, knowing I would receive hateful replies in response. I got dozens of messages impugning my manhood; the mildest of them called me a “soy boy”.

And so the man who feels lost but wishes to preserve his fully masculine self has only two choices: withdrawal or rage. We’ve seen what withdrawal and rage have the potential to do. School shootings are only the most public of tragedies. Others, on a smaller scale, take place across the country daily; another commonality among shooters is a history of abuse toward women.

To be clear, most men will never turn violent. Most men will turn out fine. Most will learn to navigate the deep waters of their feelings without ever engaging in any form of destruction. Most will grow up to be kind. But many will not.

We will probably never understand why any one young man decides to end the lives of others. But we can see at least one pattern and that pattern is glaringly obvious. It’s boys.

I believe in boys. I believe in my son. Sometimes, though, I see him, 16 years old, swallowing his frustration, burying his worry, stomping up the stairs without telling us what’s wrong, and I want to show him what it looks like to be vulnerable and open but I can’t. Because I was a boy once, too.

There has to be a way to expand what it means to be a man without losing our masculinity. I don’t know how we open ourselves to the rich complexity of our manhood. I think we would benefit from the same conversations girls and women have been having for these past 50 years.

I would like men to use feminism as an inspiration, in the same way that feminists used the civil rights movement as theirs. I’m not advocating a quick fix. There isn’t one. But we have to start the conversation. Boys are broken, and I want to help.

* a common insult among the alt-right that links soy intake to estrogen

Monday, 19 February 2018

How natural are "natural" products?

Julie Creswell

In recent years, one bright spot in an otherwise lackluster market for packaged foods, beverages and consumer products has been merchandise promoted as “natural.”

Consumers, increasingly wary of products that are overly processed or full of manufactured chemicals, are paying premium prices for natural goods, from fruit juices and cereals to shampoos and baby wipes.

But as a spate of lawsuits and consumer advocacy efforts show, one person’s “natural” is another person’s methylisothiazolinone.

The problem, consumer groups and even some manufacturers say, is that there is no legal or regulatory definition of what “natural” is.

On one side are companies eager to cash in on consumers’ willingness to pay higher prices for natural products by slapping “all natural” labels on them. At times, the claims have stretched the limits of credulity — like “All Natural” 7Up, Pop-Tarts “Baked With Real Fruit” and Crystal Light “Natural” lemonade. (Some labels like these were eventually changed.)

On the other side is a cadre of plaintiffs’ lawyers — nicknamed “the Food Bar” — who have filed more than 300 lawsuits seeking class-action status in the past three years. Those suits allege misrepresentations involving the term “natural” and other descriptions on food labels alone.

And, increasingly, the lawsuits are moving beyond food and focusing on consumer goods like baby wipes and cleaning products.

Among the brands that have faced legal challenges are several that have long promoted their use of natural ingredients: Tom’s of Maine antiperspirants and toothpastes, the Honest Company’s laundry detergent and dish soap, Annie’s Homegrown salad dressings, Breyers and Ben & Jerry’s ice cream, Aveeno face moisturizers and Seventh Generation dish soap.

“The lawsuits you see are only a fraction of the claims that are made,” said David T. Biderman, a partner at Perkins Coie who defends food companies in class-action lawsuits. Behind the scenes, Mr. Biderman said, plaintiffs’ lawyers are sending letters to companies and threatening to file lawsuits over labels they argue are misleading or violate consumer protection laws. Those letters, Mr. Biderman said, are often rejected, go away or are resolved with a small payment.

Whether the lawsuits are necessary, or a nuisance, depends on whom you ask.

Proponents say that in the absence of clear regulation, consumers have been protected by these lawsuits, pointing to a number of cases in which manufacturers have altered their labels. General Mills, which faced at least two federal lawsuits claiming that its Nature Valley granola bars contained artificial ingredients, replaced labels that once read “100% Natural” with ones that say “Made With 100% Natural Whole Grain Oats.”

Corporations, lawyers say, have been reluctant to allow a case to go to trial and risk having a legal definition of “natural” emerge — which might set standards companies would have to meet. As a result, the bulk of the lawsuits filed over the past decade have been settled, dismissed or, more recently, stayed by judges who hope regulators will step in with a definition.

Kim Richman, a plaintiffs’ lawyer, said his clients, including consumers and nonprofit groups, “engage in socially conscious litigation to level the playing field against corporate America as government oversight increasingly falls short.”

But critics say a big chunk of the settlement money lands in the pockets of the plaintiffs’ lawyers, and does not financially benefit consumers.
Critics also argue that some lawyers are robo-filing lawsuits — cut-and-pasting nearly identical allegations into complaints against multiple companies — and in the process are bringing head-scratching cases that risk undermining the credible ones.

A federal judge tossed a case a few years back, for instance, after concluding that reasonable consumers would understand that the “crunchberries” in Cap’n Crunch cereal were not real fruit. But another judge in California last year refused to dismiss a case claiming that consumers were denied the health benefits found in raspberries because Krispy Kreme’s “raspberry-filled” doughnuts did not include real fruit. The plaintiff later dropped the lawsuit.

A number of more recent cases involve allegations that products labeled natural were misleading because they contained small amounts of materials linked to genetically modified organisms. In December, a New York federal court judge dismissed a lawsuit claiming that Dannon yogurt was falsely labeled natural because the cows might have been given genetically modified feed.

The shift in litigation away from the ingredients in the food to the actual food chain — how the crop was grown or what the animals were fed — may undermine the original goal of the lawsuits, which was addressing nutritional concerns, some experts say.

“We’re really getting into splitting hairs about what is natural and what’s not,” said Maia Kats, the director of litigation for the Center for Science in the Public Interest, a public advocacy group that has been involved in a handful of lawsuits over so-called natural products.

Stuck in the middle of this natural-or-not morass are consumers. Unable to trust the labels lining store shelves, shoppers are left with little choice but to examine the small type on the back of the box and try to decipher terms like methylisothiazolinone, a synthetic preservative found in some personal- and skin-care products.

A survey of consumers in 2015 by Consumer Reports magazine showed that at least 60 percent of respondents believed “natural” on packaged and processed foods meant they contained no artificial colors or ingredients and no genetically modified materials.

“About two-thirds of consumers surveyed think that natural on a food package means no pesticides were used,” said Charlotte Vallaeys, a senior policy analyst with Consumers Union, the advocacy division of Consumer Reports. “They’re confusing it with organic,” which prohibits nearly all pesticides from use on food products.

But when it comes to commonly used terms like “natural” or even “healthy,” the various agencies that oversee food and beverages and advertising have been slow to come up with definitions.

In late 2015, the United States Food and Drug Administration sought feedback from consumers and the industry on whether it should define and regulate the word “natural” on food labeling.

More than 7,600 comments flooded in. Some consumers wanted the word banned from all food labeling. Others asked that the term be defined simply.

“We recognize that consumers are trusting in products labeled ‘natural’ without clarity around the term,” Dr. Scott Gottlieb, the commissioner of the F.D.A., said in an emailed statement. “Consumers have called upon the F.D.A. to help define the term ‘natural’ and we take the responsibility to provide this clarity seriously. We will have more to say on the issue soon.”

Most agencies argue they lack the resources to police products that, while not necessarily truthful in their labeling, do not appear to be causing harm to consumers, either. The class-action lawsuits, for the most part, seek reimbursement in the price difference between the “natural” product and its less natural competition.

Apart from food, the regulation of claims in the advertising of shampoos and laundry detergent gets even murkier.

In 2012, when the Federal Trade Commission updated its Green Guides, a road map to help marketers avoid making environmental claims that could mislead consumers, it did not define the word “natural.” Laura Koss, a lawyer in the division of enforcement, said the commission received insufficient consumer feedback about the term when it sought comment.

Still, lawyers say that until regulators come up with a definition, the not-so-natural dance among consumers, manufacturers and lawyers will continue.

“You’ve got a lot of companies, in the absence of clear standards, willing to take the class-action risk because there is so much consumer demand for products that are marketed as natural,” said Randal M. Shaheen, a lawyer with Venable who defends corporate advertising and marketing claims. “But if you’re a consumer who is really passionate about, for example, not giving your kids high-fructose corn syrup, you should read the label to see how it is sweetened.”

Tuesday, 23 January 2018

The real Adam Smith

Paul Sagar, lecturer in political theory in the Department of Political Economy, King’s College London
If you’ve heard of one economist, it’s likely to be Adam Smith. He’s the best-known of all economists, and is typically hailed as the founding father of the dismal science itself.
Furthermore, he’s usually portrayed as not only an early champion of economic theory, but of the superiority of markets over government planning. In other words, Smith is now known both as the founder of economics, and as an ideologue for the political Right.
Yet, despite being widely believed, both these claims are at best misleading, and at worst outright false.
Smith’s popular reputation as an economist is a remarkable twist of fate for a man who spent most of his life as a somewhat reclusive academic thinker. Employed as professor of moral philosophy at the University of Glasgow, Smith taught mainly ethics, politics, jurisprudence and rhetoric, and for most of his career he was known for his first book, The Theory of Moral Sentiments (1759). His professional identity was firmly that of a philosopher – not least because the discipline of ‘economics’ didn’t emerge until the 19th century, by which time Smith was long dead. (He died in July 1790, just as the French Revolution was getting into full swing.)
Admittedly, Smith’s reputation as an economist isn’t entirely mysterious. His oft-quoted An Inquiry into the Nature and Causes of the Wealth of Nations (1776) was undoubtedly important in the eventual formation – in the next century – of the discipline of economics. But even here things are not as straightforward as they appear. For The Wealth of Nations – a 1,000-page doorstopper that blends history, ethics, psychology and political philosophy – bears little resemblance to the ahistorical and highly mathematical nature of most current economic theory. If anything, Smith’s best-known book is a work of political economy, a once-prevalent field of enquiry that suffered a striking decline in the latter half of the 20th century.
Smith’s reputation, however, began to get away from him early on. Shortly after publication, The Wealth of Nations was fêted in the British Parliament by the Whig leader Charles James Fox. Ironically, Fox later admitted that he had never actually read it (few subsequent non-readers of the book have shown such candour, despite plenty of them citing it). Indeed, Smith suspected that those quickest to sing his praises had failed to understand the main arguments of his work. He later described The Wealth of Nations as a ‘very violent attack … upon the whole commercial system of Great Britain’. Despite this, his vocal political cheerleaders in Parliament continued to prop up the very system that Smith was railing against.
Yet if Smith was disappointed by his work’s immediate reception, he would likely have taken even less cheer from the future uses to which his name would be put. For it has been his fate to become associated with the strain of Right-wing politics that rose to dominance in the early 1980s, and which continues to exert a strong influence on politics and economics today. Usually known as neoliberalism, this development is most famously associated with Ronald Reagan and Margaret Thatcher. But it is in fact a movement with deep intellectual roots, in particular in the mid-century writings of the economists Friedrich Hayek and Ludwig von Mises. Later, the Chicago economist Milton Friedman and the British policy adviser Keith Joseph championed it during the 1980s, as did the extensive network of academics, think tanks, business leaders and policymakers associated with the Mont Pelerin Society.
Neoliberals often invoke Smith’s name, believing him to be an early champion of private capitalist endeavour, and a founder of the movement that seeks (as Thatcher hoped) to ‘roll back the frontiers of the state’ so as to allow the market to flourish. The fact that there is a prominent Right-wing British think tank called the Adam Smith Institute – which since the 1970s has aggressively pushed for market-led reforms, and in 2016 officially rebranded itself a ‘neoliberal’ organisation – is just one example of this tendency.
It is certainly true that there are similarities between what Smith called ‘the system of natural liberty’, and more recent calls for the state to make way for the free market. But if we dig below the surface, what emerges most strikingly are the differences between Smith’s subtle, skeptical view of the role of markets in a free society, and more recent caricatures of him as a free-market fundamentalist avant-la-lettre. For while Smith might be publicly lauded by those who put their faith in private capitalist enterprise, and who decry the state as the chief threat to liberty and prosperity, the real Adam Smith painted a rather different picture. According to Smith, the most pressing dangers came not from the state acting alone, but the state when captured by merchant elites.

The context of Smith’s intervention in The Wealth of Nations was what he called ‘the mercantile system’. By this Smith meant the network of monopolies that characterised the economic affairs of early modern Europe. Under such arrangements, private companies lobbied governments for the right to operate exclusive trade routes, or to be the only importers or exporters of goods, while closed guilds controlled the flow of products and employment within domestic markets.
As a result, Smith argued, ordinary people were forced to accept inflated prices for shoddy goods, and their employment was at the mercy of cabals of bosses. Smith saw this as a monstrous affront to liberty, and a pernicious restriction on the capacity of each nation to increase its collective wealth. Yet the mercantile system benefited the merchant elites, who had worked hard to keep it in place. Smith pulled no punches in his assessment of the bosses as working against the interests of the public. As he put it in The Wealth of Nations: ‘People of the same trade seldom meet together, even for merriment and diversion, but the conversation ends in a conspiracy against the public, or in some contrivance to raise prices.’
The merchants had spent centuries securing their position of unfair advantage. In particular, they had invented and propagated the doctrine of ‘the balance of trade’, and had succeeded in elevating it into the received wisdom of the age. The basic idea was that each nation’s wealth consisted in the amount of gold that it held. Playing on this idea, the merchants claimed that, in order to get rich, a nation had to export as much, and import as little, as possible, thus maintaining a ‘favourable’ balance. They then presented themselves as servants of the public by offering to run state-backed monopolies that would limit the inflow, and maximise the outflow, of goods, and therefore of gold. But as Smith’s lengthy analysis showed, this was pure hokum: what were needed instead were open trading arrangements, so that productivity could increase generally, and collective wealth would grow for the benefit of all.
Even worse than this, Smith thought, the merchants were the source of what his friend, the philosopher and historian David Hume, had called ‘jealousy of trade’. This was the phenomenon whereby commerce was turned into an instrument of war, rather than the bond of ‘union and friendship’ between states that it ought properly to be. By playing on jingoistic sentiments, the merchants inflamed aggressive nationalism, and blinded domestic populations to the fact that their true interests lay in forming peaceful trading relationships with their neighbours.
The peace and stability of the European continent was imperilled by the conspiracies of the merchants, who goaded politicians into fighting wars to protect home markets, or acquire foreign ones. After all, being granted militarily-backed private monopolies was far easier than having to compete on the open market by lowering prices and improving quality. The merchants in this manner constantly conspired to capture the state, defrauding the public by using political power to promote their own sectional advantage.
Indeed, Smith’s single most famous idea – that of ‘the invisible hand’ as a metaphor for uncoordinated market allocation – was invoked in precisely the context of his blistering attack on the merchant elites. It is certainly true that Smith was skeptical of politicians’ attempts to interfere with, or bypass, basic market processes, in the vain hope of trying to do a better job of allocating resources than was achievable through allowing the market to do its work. But in the passage of The Wealth of Nations where he invoked the idea of the invisible hand, the immediate context was not simply that of state intervention in general, but of state intervention undertaken at the behest of merchant elites who were furthering their own interests at the expense of the public.
It is an irony of history that Smith’s most famous idea is now usually invoked as a defence of unregulated markets in the face of state interference, so as to protect the interests of private capitalists. For this is roughly the opposite of Smith’s original intention, which was to advocate for restrictions on what groups of merchants could do. When he argued that markets worked remarkably efficiently – because, although each individual ‘intends only his own gain, and he is in this, as in many other cases, led by an invisible hand to promote an end which was no part of his intention’ – this was an appeal to free individuals from the constraints imposed upon them by the monopolies that the merchants had established, and were using state power to uphold. The invisible hand was originally invoked not to draw attention to the problem of state intervention, but of state capture.
Smith was, however, deeply pessimistic about the stranglehold that the merchants had managed to exert over European politics, and despaired of it ever being loosened. Accordingly, he labelled his preferred alternative – of liberal markets generating wealth to be passed on to all members of society – a ‘Utopia’ that would never come to pass. History has to some extent proved him wrong on this score: we now live in an era of comparative market freedom. But nobody should deny that merchant conspiracy, and the marriage of the state to what we now call corporate power, remain defining features of our present-day political and economic reality.
In any case, Smith’s hostility to the merchants is a long way removed from a Reagan-style championing of the entrepreneurial capitalist hero, she who needs only to be released from the constraints of the state to lead us to the sunlit uplands of economic growth. On the contrary, Smith’s analysis implies that a free society with a healthy economy is going to need to put fetters on economic elites if the invisible hand is to have any chance of doing its paradoxical work.
Does this, then, make Smith an early proponent of the political Left? No, and it would be a serious mistake to draw that conclusion. The truth is both more complex, and more interesting, than that.

Although Smith was deeply critical of the way that the merchants conspired to promote their own advantage at the expense of the rest of society, he was under no illusion that political actors might successfully replace private merchants as the necessary conduits of economic activity.
Certainly, when merchants were allowed to rule as sovereigns – as the British East India Company had been permitted to do in Bengal – the results were disastrous. ‘Want, famine and mortality’, themselves the results of ‘tyranny’ and ‘calamity’, had been unleashed on India, all products of an ‘oppressive authority’ based on force and injustice. Under absolutely no circumstances, Smith thought, should merchants be put in charge of politics. Their monopolistic conspiracies would be ‘destructive’ to all countries ‘which have the misfortune to fall under their government’.
Nonetheless, something like the reverse was also true: politicians made for terrible merchants, and ought not to attempt to take over the systematic running of economic affairs. This was a product of the structural predicament faced by political leaders, who, Smith claimed, have ‘scarce ever succeeded’ in becoming ‘adventurers in the common branches of trade’, despite often having been tempted to try, and often from a genuine desire to better their nation’s condition.
Politicians, according to Smith, were much poorer judges of where and how to allocate resources than the aggregated outcome of individuals spontaneously undertaking free exchange. As a result, in matters of trade it was usually folly for politicians to try to replace the vast network of buyers and sellers with any form of centralised command. This, however, included precisely those networks structured around the profit-seeking activities of merchant elites.
On Smith’s final analysis, the merchants were a potentially pernicious, but entirely necessary, part of the functioning of large-scale economies. The true ‘science of a statesman or legislator’ consisted in deciding how best to govern the merchants’ nefarious activities. Effective politicians had to strike a balance, granting economic elites the liberty to pursue legitimate commercial activities while also applying control when such activities became vehicles for exploitation. In other words, Smith was very far from asking us to put our faith in ‘entrepreneurs’, those supposed ‘wealth-creators’ whom neoliberalism looks to as drivers of economic prosperity. On the contrary, giving the entrepreneurs free rein would be rather like putting the foxes in charge of the chicken coop.

Crucially, however, Smith did not offer up any kind of premeditated plan regarding how to strike the right balance between commercial freedom and watchful political control. On the contrary, he pressed home the deep underlying difficulties of the situation that commercial societies found themselves in.
Political actors, Smith claimed, were liable to be swept up by a ‘spirit of system’, which made them fall in love with abstract plans, which they hoped would introduce sweeping beneficial reform. Usually the motivations behind these plans were perfectly noble: a genuine desire to improve society. The problem, however, was that the ‘spirit of system’ blinded individuals to the harsh complexities of real-world change. As Smith put it in The Theory of Moral Sentiments in one of his most evocative passages:
[The man of system] seems to imagine that he can arrange the different members of a great society with as much ease as the hand arranges the different pieces upon a chessboard. He does not consider that the pieces upon the chessboard have no other principle of motion besides that which the hand impresses upon them; but that, in the great chessboard of human society, every single piece has a principle of motion of its own, altogether different from that which the legislature might choose to impress upon it. If those two principles coincide and act in the same direction, the game of human society will go on easily and harmoniously, and is very likely to be happy and successful. If they are opposite or different, the game will go on miserably, and the society must be at all times in the highest degree of disorder.
Smith’s point is easily misunderstood. At first glance, it can look like a modern Right-wing injunction against socialist-style state planning. But it is much more subtle than that.
What Smith is saying is that in politics any preconceived plan – especially one that assumes that the millions of individuals composing a society will just automatically go along with it – is potentially dangerous. This is because the ‘spirit of system’ infects politicians with a messianic moral certainty that their reforms are so necessary and justified that almost any price is worth paying to achieve them.
Yet it is a short step from this to discounting the very real harm that a plan can unleash if it starts to go wrong – and especially if the ‘pieces upon the chessboard’ act in ways that resist, or subvert, or confound, the politician’s scheme. This is because the ‘spirit of system’ encourages the sort of attitude captured in such cheap sayings as ‘You can’t make an omelette without breaking eggs’. In other words, that inconvenient opponents and bystanders can be sacrificed to an overriding moral vision.
Smith was warning against all abstract plans alike. Certainly, his outlook urges skepticism about such strategies as taking over the industrial base of a state, presuming to know what goods citizens will want and need over the next five years, and thereby trying to eliminate the market as a mechanism for resource allocation. But it likewise views with deep suspicion a plan to rapidly privatise previously state-owned industries, exposing millions of citizens to the ravages of unemployment and the attendant destruction of their communities. In other words, while she certainly didn’t realise it, Thatcher’s violent restructuring of the British economy during the 1980s was as much a product of the ‘spirit of system’ as any piece of top-down Soviet industrial strategy.
The message that Smith conveys cuts across party and ideological lines, and applies to both Left and Right. It is about a pathological attitude that politicians of all stripes are prone to. If not kept in check, this can be the source not just of disruption and inefficiency but of cruelty and suffering, when those who find themselves on the wrong side of the plan’s consequences are forced by the powerful to suffer them regardless. Smith in turn urges us to recognise that real-world politics will always be too complex for any prepackaged ideology to cope with. What we need in our politicians is careful judgment and moral maturity, something that no ideology, nor any position on the political spectrum, holds a monopoly on.
In the fraught times that we now occupy, it is hard to believe that the careful and responsible political judges that Smith envisaged have much chance of emerging. (Does anybody in Western politics currently measure up?) Much more likely will be new men and women of system, with alternative abstract plans, seducing desperate electorates before attempting to impose their own forceful reforms, regardless of what the pieces on the chessboard happen to think or want.
Whether these reforms come from the Left or the Right might not, in the end, matter much. As Western economies continue to struggle, and politics becomes increasingly polarised, the results could yet be catastrophic. But if so, we should certainly not consign Smith to any parade of blame. On the contrary, he tried to warn us of the dangers that we face. It is time that we listened, a little more carefully, to what the real Adam Smith had to say.