Sunday, 18 October 2020

Why we fail to foresee and contain catastrophe

Elke U. Weber, Gerhard R. Andlinger Professor in Energy and the Environment and Professor of Psychology and Public Affairs at Princeton University

We are living in a time of crisis. From the immediate challenge of the COVID-19 pandemic to the looming existential threat of climate change, the world is grappling with massive global dangers—to say nothing of countless problems within countries, such as inequality, cyberattacks, unemployment, systemic racism, and obesity. In any given crisis, the right response is often clear. Wear a mask and keep away from other people. Burn less fossil fuel. Redistribute income. Protect digital infrastructure. The answers are out there. What’s lacking are governments that can translate them into actual policy. As a result, the crises continue: the death toll from the pandemic skyrockets, the world makes dangerously slow progress on climate change, and so on.

It’s no secret how governments should react in times of crisis. First, they need to be nimble. Nimble means moving quickly, because problems often grow at exponential rates: a contagious virus, for example, or greenhouse gas emissions. That makes early action crucial and procrastination disastrous. Nimble also means adaptive. Policymakers need to continuously adjust their responses to crises as they learn from their own experience and from the work of scientists. Second, governments need to act wisely. That means incorporating the full range of scientific knowledge available about the problem at hand. It means embracing uncertainty, rather than willfully ignoring it. And it means thinking in terms of a long time horizon, rather than merely until the next election. But so often, policymakers are anything but nimble and wise. They are slow, inflexible, uninformed, overconfident, and myopic.

Why is everyone doing so badly? Part of the explanation lies in the inherent qualities of crises. Crises typically require navigating between risks. In the COVID-19 pandemic, policymakers want to save lives and jobs. With climate change, they seek a balance between avoiding extreme weather and allowing economic growth. Such tradeoffs are hard as it is, and they are further complicated by the fact that costs and benefits are not evenly distributed among stakeholders, making conflict a seemingly unavoidable part of any policy choice. Vested interests attempt to forestall needed action, using their money to influence decision-makers and the media. To make matters worse, policymakers must pay sustained attention to multiple issues and multiple constituencies over time. They must accept large amounts of uncertainty. Often, then, the easiest response is to stick with the status quo. But that can be a singularly dangerous response to many new hazards. After all, with the pandemic, business as usual would mean no social distancing. With climate change, it would mean continuing to burn fossil fuels. 

But the explanation for humanity’s woeful response to crises goes beyond politics and incentives. To truly understand the failure to act, one must turn to human psychology. It is there that one can grasp the full impediments to proper decision-making—the cognitive biases, emotional reactions, and suboptimal shortcuts that hold policymakers back—and the tools to overcome them. 


People are singularly bad at predicting and preparing for catastrophes. Many of these events are “black swans,” rare and unpredictable occurrences that most people find difficult to imagine, seemingly falling into the realm of science fiction. Others are “gray rhinos,” large and not uncommon threats that are still neglected until they stare you in the face (such as a coronavirus outbreak). Then there are “invisible gorillas,” threats in full view that should be noticed but aren’t—so named for a psychological experiment in which subjects watching a clip of a basketball game were so fixated on the players that they missed a person in a gorilla costume walking through the frame. Even professional forecasters, including security analysts, have a poor track record when it comes to accurately anticipating events. The COVID-19 crisis, in which a dystopic science-fiction narrative came to life and took everyone by surprise, serves as a cautionary tale about humans’ inability to foresee important events. 

Not only do humans fail to anticipate crises; they also fail to respond rationally to them. At best, people display “bounded rationality,” the idea that instead of carefully considering their options and making perfectly rational decisions that optimize their preferences, humans in the real world act quickly and imperfectly, limited as they are by time and cognitive capacity. Add in the stress generated by crises, and their performance gets even worse.

Because humans don’t have enough time, information, or processing power to deliberate rationally, they have evolved easier ways of making decisions. They rely on their emotions, which serve as an early warning system of sorts: alerting people that they are in a positive context that can be explored and exploited or in a negative context where fight or flight is the appropriate response. They also rely on rules. To simplify decision-making, they might follow standard operating procedures or abide by some sort of moral code. They might decide to imitate the action taken by other people whom they trust or admire. They might follow what they perceive to be widespread norms. Out of habit, they might continue to do what they have been doing unless there is overwhelming evidence against it.

Humans evolved these shortcuts because they require little effort and work well in a broad range of situations. Without access to a real-time map of prey in different hunting grounds, for example, a prehistoric hunter might have resorted to a simple rule of thumb: look for animals where his fellow tribesmen found them yesterday. But in times of crisis, emotions and rules are not always helpful drivers of decision-making. High stakes, uncertainty, tradeoffs, and conflict—all elicit negative emotions, which can impede wise responses. Uncertainty is scary, as it signals an inability to predict what will happen, and what cannot be predicted might be deadly. The vast majority of people are already risk averse under normal circumstances. Under stress, they become even more so, and they retreat to the familiar comfort of the status quo. From gun laws to fossil fuel subsidies, once a piece of legislation is in place, it is hard to dislodge it, even when cost-benefit analysis argues for change.

Another psychological impediment to effective decision-making is people’s natural aversion to tradeoffs. They serve as a reminder that we cannot have it all, that concessions need to be made in some areas to gain in others. For that reason, people often employ decision rules that are far from optimal but minimize their awareness of the need for tradeoffs. They might successively eliminate options that do not meet certain criteria—for example, a user of a dating app might screen people based on height and then miss someone who would have been the love of his or her life but was half an inch too short. Tradeoffs between parties make for conflict, and people dislike conflict, too. They see it not as an opportunity to negotiate joint gains but as a stressful confrontation. Years of teaching negotiation have shown me that although everybody understands that negotiations are about distributing a finite pie (with unavoidable conflict), it is much harder to get across the concept that they are also often about creating solutions that make all sides better off.


A further hindrance to crisis response is the lack of an easily identified culprit. Some crises, such as military standoffs during the Cold War or, more recently, terrorist attacks, have clear causes that can be blamed and villains who can be fought. But many others—the pandemic and climate change being prime examples—do not. They are more ambiguous, as they are caused by a range of factors, some proximate, others not. They become catastrophes not because of any particular trigger or evildoer but because of the action or inaction of policymakers and the public. When it isn’t clear who is friend and who is foe, it’s difficult to see a clear and simple path of action. 

Psychologists speak of the “single-action bias,” the human tendency to consider a problem solved with a single action, at which point the sense that something is awry diminishes. For example, one study found that radiologists will stop scrutinizing an x-ray for evidence of pathology after they have identified one problem, even though multiple problems may exist. This bias suggests that humans’ preferred way of dealing with risks evolved during simpler times. To avoid being killed by lions at the watering hole, there was an easy, one-step solution: stay away from the lions. But today, many crises have no culprit. The enemy is human behavior itself, whether that be the burning of fossil fuels, the consumption of virus-infected animals, or the failure to wear masks or abide by social-distancing rules.

The solutions to these problems are often inconvenient, unpopular, and initially expensive. They involve making uncomfortable changes. When that is the case, people tend to exploit any ambiguity in the cause of the problem to support alternative explanations. When the COVID-19 pandemic began, for instance, some embraced a conspiracy theory that falsely claimed that the virus was the intentional product of a Chinese lab. For many, that idea was easier to swallow than the scientific consensus that the virus emerged from bats. Indeed, in a survey of Americans that my colleagues and I conducted in April, a mind-boggling 29 percent of respondents held this view. 

Another psychological barrier to effective governance in times of crisis relates to how people learn and revise their beliefs. If people followed the Bayesian method of inference, they would update their beliefs in the face of new information. Over time, as more and more information became available, a consensus would emerge—for example, that climate change is caused by human activity. But not everyone sees and acknowledges the same new information and integrates it in the same rational way. In practice, they give more weight to concrete personal experience than abstract statistical information. The death of a single close friend from COVID-19 is much more of a wake-up call than a news report about high infection rates. Someone who loses a house in a wildfire will grasp the risk of climate change more than someone who looks at a graph of rising temperatures. Personal experience is a powerful teacher, far more convincing than pallid statistics provided by scientific experts, even if the latter carry far greater evidentiary value.
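The Bayesian updating the author invokes can be made concrete with a short numerical sketch. The figures below are hypothetical illustrations, not from the essay: a policymaker starts with a low prior belief that a new pathogen will cause a major outbreak, then receives three warning reports, each of which is four times as likely if an outbreak is coming than if it is not.

```python
# Bayes' rule: posterior = prior * likelihood / evidence.
# All numbers here are hypothetical, chosen only to illustrate
# how rational belief revision compounds across observations.

def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return the posterior probability of hypothesis H after seeing evidence E."""
    p_evidence = prior * p_evidence_given_h + (1 - prior) * p_evidence_given_not_h
    return prior * p_evidence_given_h / p_evidence

belief = 0.05          # prior: a 5% chance of a major outbreak
for _ in range(3):     # three independent warning reports arrive
    # each report is 4x as likely under the outbreak hypothesis
    belief = bayes_update(belief, 0.8, 0.2)

print(round(belief, 3))
```

Three pieces of moderately diagnostic evidence lift a 5 percent prior past 75 percent, which is precisely the consensus-forming dynamic the paragraph describes; the essay's point is that real people, weighing vivid personal experience over pallid statistics, rarely update this cleanly.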

People vastly underestimate the likelihood of low-probability events, until they personally experience one. At that point, they react, and perhaps even overreact, for a short while, until the perceived threat recedes again. After an official is the victim of an email hack, for example, he or she may take greater cybersecurity precautions for a while but will likely become less vigilant as the months go on.

The value of personal experience is reflected in the phrase “seeing is believing.” But the opposite can also be the case: sometimes, believing is seeing. In other words, people who are committed to their beliefs, especially when those beliefs are shared by ideological allies, will pay selective attention to information that confirms their preexisting notions and fail to see evidence that contradicts them. That is why people so often grow more divided, rather than united, over the causes of and solutions to crises. Beliefs about COVID-19 and climate change have gotten more polarized over time, with Democrats more likely to subscribe to science-based explanations of both crises and express greater concern and Republicans more likely to agree with conspiracy theories that downplay the risks.


One response to all these psychological biases is for officials to change their ways and embrace more rational decision-making processes, which would lead to better policies. They would need to acknowledge the true extent of their ignorance about future events and creatively guard against probable and unpredictable high-impact surprises. (With the COVID-19 crisis, for example, they would plan for the possibility that a vaccine cannot be identified or proves to be short lived.) Policymakers would seek to guide and educate the public rather than follow it. Some might view this approach as paternalistic, but it need not be, provided that it is implemented with input from groups across society. Indeed, people regularly delegate decision-making to those with greater expertise—going to a doctor for a diagnosis, for instance, or letting a lawyer handle legal issues. In principle, at least, elected officials are supposed to take care of the big-picture strategic planning that individuals don’t have the time, attention, or foresight to do themselves.

It might seem as if the politician who deviates from public opinion to think about more long-term problems is the politician who fails to get reelected. But public opinion is malleable, and initially unpopular changes can gain support over time. In 2003, for example, New York City banned smoking in restaurants and bars. After an initial outcry and a drop in Mayor Michael Bloomberg’s popularity, the city came to see that the new policy was not as detrimental as originally thought, support for the ban rose, and Bloomberg won reelection twice. In 2008, the Canadian province of British Columbia also instituted an unpopular policy: a carbon tax on fossil fuels. Again, disapproval was followed by acceptance, and the province’s premier, Gordon Campbell, won an election the next year. Some reforms don’t poll well at first, but it would be a mistake to see failure as a foregone conclusion. Passing initially unpopular reforms may require creative policies and charismatic politicians, but eventually, the public can come around. 

Another approach to improving crisis decision-making would be to work with, rather than against, psychological barriers. In 2017, the Behavioral Science and Policy Association published a report that identified four categories of policy problems with which the insights of psychology could help: “getting people’s attention; engaging people’s desire to contribute to the social good; making complex information more accessible; and facilitating accurate assessment of risks, costs, and benefits.” The experts behind the report came up with a variety of tools to meet these objectives. One recommendation was that policymakers should set the proper default—say, automatically enrolling households in energy-reduction programs or requiring that new appliances be shipped with the energy-saving settings turned on. Another was that they should communicate risks using a more intuitive time frame, such as speaking about the probability of a flood over the course of a 30-year mortgage rather than within 100 years. 
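The report's point about intuitive time frames rests on simple arithmetic worth verifying. By the standard definition (not stated in the essay), a "100-year flood" has a 1 percent chance of occurring in any given year; over a 30-year mortgage, the chance of experiencing at least one is the complement of avoiding it every year:

```python
# Probability of at least one "100-year flood" (1% annual chance)
# over a 30-year mortgage: one minus the chance of no flood in any year.
annual_prob = 0.01
years = 30
at_least_one = 1 - (1 - annual_prob) ** years

print(f"{at_least_one:.0%}")
```

The result, roughly 26 percent, or about one in four, sounds far more urgent than "once in a hundred years," even though the two framings describe the same underlying risk.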

In the same spirit, the cognitive scientist Steven Sloman and I put together a special issue of the journal Cognition in 2019 to examine the thought processes that shape the beliefs behind political behavior. The authors identified problems, such as people’s tendency to consume news that confirms their existing beliefs and to let their partisan identities overpower their ability to evaluate probabilities rationally. But they also identified solutions, such as training people to better understand the uncertainty of their own forecasts. Policymakers need not take public opinion as an immutable barrier to progress. The more one understands how people think, feel, and react, the more one can use that information to formulate and implement better policy. 

The field of psychology has identified countless human biases, but it has also come up with ways of countering their effects. Psychologists have developed the concept of choice architecture, whereby decisions are structured in such a way as to nudge people toward good choices and away from bad choices. When companies automatically enroll their employees in retirement plans (while allowing them to opt out), the employees are more likely to save. When governments do the same with organ donation, people are more likely to donate. Psychologists also know that although playing on negative emotions, such as fear or guilt, can have undesirable consequences, eliciting positive emotions is a good way to motivate behavior. Pride, in particular, is a powerful motivator, and campaigns that appeal to it have proved effective at convincing households to recycle and coastal communities to practice sustainable fishing. All these techniques are a form of psychological jujitsu that turns vulnerabilities into strengths.

Effective public leaders understand and use the richness of human behavior. German Chancellor Angela Merkel comes to mind. Combining the rationality of the scientist she was with the human touch of the politician she is, she has proved adept at managing emergencies, from Europe’s currency crisis to its migration crisis to the current pandemic. Such leaders are evidence-based, analytic problem solvers, but they also acknowledge public fears, empathize with loss and pain, and reassure people in the face of uncertainty. They are not prisoners of psychology but masters of it.

Tuesday, 8 September 2020

The boomerang effect of the ideological pursuit of meritocracy

Julian Coman

Michael Sandel was 18 years old when he received his first significant lesson in the art of politics. The future philosopher was president of the student body at Palisades high school, California, at a time when Ronald Reagan, then governor of the state, lived in the same town. Never short of confidence, in 1971 Sandel challenged him to a debate in front of 2,400 left-leaning teenagers. It was the height of the Vietnam war, which had radicalised a generation, and student campuses of any description were hostile territory for a conservative. Somewhat to Sandel’s surprise, Reagan took up the gauntlet that had been thrown down, arriving at the school in style in a black limousine. The subsequent encounter confounded the expectations of his youthful interlocutor.

“I had prepared a long list of what I thought were very tough questions,” recalls Sandel, now 67, via video-link from his study in Boston. “On Vietnam, on the right of 18-year-olds to vote – which Reagan opposed – on the United Nations, on social security. I thought I would make short work of him in front of that audience. He responded genially, amiably and respectfully. After an hour I realised I had not prevailed in this debate, I had lost. He had won us over without persuading us with his arguments. Nine years later he would get elected to the White House in the same way.”

Undeterred by this early setback, Sandel has become one of the most famous public intellectuals and debaters in the English-speaking world, taking a berth at Harvard after receiving a doctorate as a Rhodes scholar in Oxford. He has been described as “a philosopher with the global profile of a rock star”, reaching audiences of millions online from his Harvard base. Listeners to his BBC Radio 4 series, The Public Philosopher, will have become familiar with the Socratic style of questioning, as Sandel artfully tests the assumptions in the arguments of his audience. Millions of viewers on YouTube, where his lectures on justice can be freely accessed, will be familiar with the high, serious forehead and gentle, softly spoken delivery.

Sandel’s politics are squarely on the left. In 2012, he added intellectual lustre to Ed Miliband’s renewal project for Labour, speaking to that year’s party conference on the moral limits of markets. The speech, and his book of the same year, What Money Can’t Buy, helped inspire Miliband’s critique of “predatory capitalism”, which was the Labour leader’s distinctive contribution to post-crash political debate in Britain.

What Money Can’t Buy sealed Sandel’s status as perhaps the most formidable critic of free-market orthodoxy in the English-speaking world. But as an age of violently polarised, partisan and poisonous politics has taken hold, it is that early encounter with Reagan that has begun to play on his mind. “It taught me a lot about the importance of the ability to listen attentively,” he says, “which matters as much as the rigours of the argument. It taught me about mutual respect and inclusion in the public square.”

The question of how to revive these civic virtues lies at the heart of Sandel’s new book, published this month. As American commentators warn of an “Armageddon” election in a divided country, how can a less resentful, less rancorous, more generous public life be revived? The starting point, uncomfortably, turns out to be a bonfire of the vanities that sustained a generation of progressives.

The Tyranny of Merit is Sandel’s response to Brexit and the election of Donald Trump. For figures such as Barack Obama, Hillary Clinton, Tony Blair and Gordon Brown, it will make challenging reading. By championing an “age of merit” as the solution to the challenges of globalisation, inequality and deindustrialisation, the Democratic party and its European equivalents, Sandel argues, hung the western working class and its values out to dry – with disastrous consequences for the common good.

As he talks, the tone is as modulated as ever; the phrasing characteristically elegant and fluent. But a sense of frustration is palpable, as Sandel charts the rise of what he sees as a corrosive leftwing individualism: “The solution to problems of globalisation and inequality – and we heard this on both sides of the Atlantic – was that those who work hard and play by the rules should be able to rise as far as their effort and talents will take them. This is what I call in the book the ‘rhetoric of rising’. It became an article of faith, a seemingly uncontroversial trope. We will make a truly level playing field, it was said by the centre-left, so that everyone has an equal chance. And if we do, and so far as we do, then those who rise by dint of effort, talent, hard work will deserve their place, will have earned it.”

The recommended way to “rise” has been to get a higher education. Or, as the Blair mantra had it: “Education, education, education.” Sandel homes in on a 2013 speech by Obama in which the president told students: “We live in a 21st-century global economy. And in a global economy jobs can go anywhere. Companies, they’re looking for the best-educated people wherever they live. If you don’t have a good education, then it’s going to be hard for you to find a job that pays the living wage.” For those willing to make the requisite effort, there was the promise that: “This country will always be a place where you can make it if you try.”

Sandel has two fundamental objections to this approach. First, and most obvious, the fabled “level playing field” remains a chimera. Although he says more and more of his own Harvard students are now convinced that their success is a result of their own effort, two-thirds of them come from the top fifth of the income scale. It is a pattern replicated across the Ivy League universities. The relationship between social class and SAT scores – which grade high school students ahead of college – is well attested. More generally, he notes, social mobility has been stalled for decades. “Americans born to poor parents tend to stay poor as adults.”

But the main point of The Tyranny of Merit is a different one: Sandel is determined to aim a broadside squarely at a left-liberal consensus that has reigned for 30 years. Even a perfect meritocracy, he says, would be a bad thing. “The book tries to show that there is a dark side, a demoralising side to that,” he says. “The implication is that those who do not rise will have no one to blame but themselves.” Centre-left elites abandoned old class loyalties and took on a new role as moralising life-coaches, dedicated to helping working-class individuals shape up to a world in which they were on their own. “On globalisation,” says Sandel, “these parties said the choice was no longer between left and right, but between ‘open’ and ‘closed’. Open meant free flow of capital, goods and people across borders.” Not only was this state of affairs seen as irreversible, it was also presented as laudable. “To object in any way to that was to be closed-minded, prejudiced and hostile to cosmopolitan identities.”

A relentless success ethic permeated the culture: “Those at the top deserved their place but so too did those who were left behind. They hadn’t striven as effectively. They hadn’t got a university degree and so on.” As centre-left parties and their representatives became more and more middle-class, the focus on upward mobility intensified. “They became reliant on the professional classes as their constituency, and in the US as a source of campaign finance. In 2008 Barack Obama became the first Democratic candidate for president to raise more than his Republican opponent. That was a turning point but it wasn’t noticed or highlighted at the time.”

Blue-collar workers were in effect given a double-edged invitation to “better” themselves or carry the burden of their own failure. Many took their votes elsewhere, nursing a sense of betrayal. “The populist backlash of recent years has been a revolt against the tyranny of merit, as it has been experienced by those who feel humiliated by meritocracy and by this entire political project.”

It is a withering analysis. Does he empathise, then, with Trumpism? “I have no sympathy whatsoever for Donald Trump, who is a pernicious character. But my book conveys a sympathetic understanding of the people who voted for him. For all the thousands and thousands of lies Trump tells, the one authentic thing about him is his deep sense of insecurity and resentment against elites, which he thinks have looked down upon him throughout his life. That does provide a very important clue to his political appeal.

“Am I tough on the Democrats? Yes, because it was their uncritical embrace of market assumptions and meritocracy that prepared the way for Trump. Even if Trump is defeated in the next election and is somehow extracted from the Oval Office, the Democratic party will not succeed unless it redefines its mission to be more attentive to legitimate grievances and resentment, to which progressive politics contributed during the era of globalisation.”

So much for the diagnosis. The only way out of the crisis, Sandel believes, is to dismantle the meritocratic assumptions that have morally rubber-stamped a society of winners and losers. The Covid-19 pandemic, and in particular the new appreciation of the value of supposedly unskilled, low-paid work, offers a starting point for renewal. “This is a moment to begin a debate about the dignity of work; about the rewards of work both in terms of pay but also in terms of esteem. We now realise how deeply dependent we are, not just on doctors and nurses, but delivery workers, grocery store clerks, warehouse workers, lorry drivers, home healthcare providers and childcare workers, many of them in the gig economy. We call them key workers and yet these are oftentimes not the best paid or the most honoured workers.”

There must be a radical re-evaluation of how contributions to the common good are judged and rewarded. The money to be earned in the City or on Wall Street, for example, is out of all proportion to the contribution of speculative finance to the real economy. A financial transactions tax would allow funds to be channelled more equitably. But for Sandel, the word “honour” is as important as the question of pay. There needs to be a redistribution of esteem as well as money, and more of it needs to go to the millions doing work that does not require a college degree.

“We need to rethink the role of universities as arbiters of opportunity,” he says, “which is something we have come to take for granted. Credentialism has become the last acceptable prejudice. It would be a serious mistake to leave the issue of investment in vocational training and apprenticeships to the right. Greater investment is important not only to support the ability of people without an advanced degree to make a living. The public recognition it conveys can help shift attitudes towards a better appreciation of the contribution to the common good made by people who haven’t been to university.”

A new respect and status for the non-credentialed, he says, should be accompanied by a belated humility on the part of the winners in the supposedly meritocratic race. To those who, like many of his Harvard students, believe that they are simply the deserving recipients of their own success, Sandel offers the wisdom of Ecclesiastes: “I returned, and saw under the sun, that the race is not to the swift, nor the battle to the strong, neither yet bread to the wise, nor yet riches to men of understanding… but time and chance happeneth to them all.”

“Humility is a civic virtue essential to this moment,” he says, “because it’s a necessary antidote to the meritocratic hubris that has driven us apart.”

The Tyranny of Merit is the latest salvo in Sandel’s lifelong intellectual struggle against a creeping individualism that, since the Reagan and Thatcher era, has become pervasive in western democracies. “To regard oneself as self-made and self-sufficient. This picture of the self exerts a powerful attraction because it seems on the face of it to be empowering – we can make it on our own, we can make it if we try. It’s a certain picture of freedom but it’s flawed. It leads to a competitive market meritocracy that deepens divides and corrodes solidarity.”

Sandel draws on a vocabulary that challenges liberal notions of autonomy in a way that has been unfashionable for decades. Words such as “dependency”, “indebtedness”, “mystery”, “humility” and “luck” recur in his book. The implicit claim is that vulnerability and mutual recognition can become the basis of a renewed sense of belonging and community. It is a vision of society that is the very opposite of what came to be known as Thatcherism, with its emphasis on self-reliance as a principal virtue.

There are, he believes, optimistic signs beyond the “clap for carers” moment that an ethical shift is finally taking place. “The Black Lives Matter movement has given moral energy to progressive politics. It has become a multiracial, multigenerational movement and is opening up space for a public reckoning with injustice. It shows that the remedy for inequality is not simply to remove barriers to meritocratic achievement.”

In the closing section of his book, Sandel recalls the story of Henry Aaron, the black baseball player who grew up in the segregated south and broke Babe Ruth’s record for career home runs in 1974. Aaron’s biographer wrote that hitting a baseball “represented the first meritocracy in Henry’s life”. It’s the wrong lesson to draw, says Sandel. “The moral of Henry Aaron’s story is not that we should love meritocracy but that we should despise a system of racial injustice that can only be escaped by hitting home runs.”

Fair competition does not constitute a just vision of society. Even if Trump is defeated in November’s presidential election, this is a truth, Sandel says, that Joe Biden, and his counterparts in Europe, must take on board. For inspiration, he says, they could do worse than turn to one of his intellectual heroes, the English Christian socialist RH Tawney.

“Tawney argued that equality of opportunity was at best a partial ideal. His alternative was not an oppressive equality of results. It was a broad, democratic ‘equality of condition’ that enables citizens of all walks of life to hold their heads up high and to consider themselves participants in a common venture. My book comes out of that tradition.”

Sunday, 30 August 2020

Wanna tame your biases and prejudices? Get out of your comfort zone


Tiffanie Wen

Police forces and many other institutions are turning to implicit bias training to help their staff recognise when they are relying upon racist assumptions and stereotypes. But does it really work?

The killing of George Floyd by police officers in Minneapolis three months ago and the shooting of Jacob Blake by police in Wisconsin have led the US to a period of reckoning. As thousands have marched in the streets to protest against racial inequality, many others have also been forced to ask some difficult questions about their levels of prejudice.

While some people mistake racism as being only overt prejudice, there is another crucial component that affects our decisions and actions towards others: implicit bias. An implicit bias is any prejudice that has formed unintentionally and without our direct knowledge – and it can often contradict our explicit beliefs and behaviours. Usually, it reflects a mixture of personal experience, attitudes around us as we have grown up, and our wider exposure to society and culture – including the books we read, television we watch and news we follow.

Many police departments in the US have pointed to schemes aimed at tackling implicit bias as evidence of their attempts to root out racism from their ranks. It is an appealing approach – police forces face many challenges when it comes to tackling racism among their officers. Powerful unions and state laws can protect police officers from investigations into misconduct, while officers who have been fired or have resigned in the past are often rehired by other forces that may be unaware of their career history. Dealing with these systemic issues often requires major structural and institutional change, while training individuals to recognise their own unconscious biases can seem relatively easy to implement by comparison.

While kneeling on a man’s neck until he stops breathing is an extreme act, implicit bias can lead to many forms of discrimination that often go unnoticed by those perpetrating them. It can affect how everyone in a society – not just police officers – behaves towards one another.

But the evidence for whether implicit bias training can work is mixed. Is it really possible to reduce biases we are barely aware we have? And can we really hope to rid ourselves of them completely?

The first step is recognising you have biases in the first place. One indication of implicit bias is something known as the implicit association test (IAT). I first encountered the IAT in 2004 as an undergraduate research assistant in a social cognition lab at the University of California, Davis. Developed in the mid-1990s, it is basically a categorisation game: a series of words and/or pictures flash up on a screen and have to be sorted as quickly as possible. For a racial bias IAT, you might see a jumbled series of words with positive or negative connotations mixed with a jumbled series of black or white faces. Mistakes and timing are tracked.
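The scoring idea can be sketched in a few lines. This is a simplified illustration with made-up reaction times, not Project Implicit’s actual algorithm (the published “D-score” procedure also filters outlier trials and penalises errors); the core intuition is that bias shows up as slower sorting when the pairing on screen contradicts an implicit association.

```python
from statistics import mean, stdev

def iat_d_score(congruent_ms, incongruent_ms):
    """Simplified IAT-style score: the difference in mean reaction
    times between the incongruent and congruent sorting blocks,
    scaled by the pooled standard deviation of all trials. A larger
    positive value suggests the 'congruent' pairing was easier to
    sort, i.e. a stronger implicit association."""
    pooled_sd = stdev(congruent_ms + incongruent_ms)
    return (mean(incongruent_ms) - mean(congruent_ms)) / pooled_sd

# Hypothetical reaction times in milliseconds for one participant
congruent = [620, 580, 640, 600, 590]    # e.g. white+good / black+bad pairing
incongruent = [750, 820, 780, 800, 760]  # the pairings swapped

score = iat_d_score(congruent, incongruent)
```

With these invented numbers the participant sorts the incongruent block noticeably more slowly, so the score comes out well above zero.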

Even if we don’t consider ourselves to be biased, our behaviour, tracked down to milliseconds, usually says otherwise

If you are like a lot of Americans (of all races), you will probably tend to categorise white faces with good words on one side and black faces with bad words on the other.

In 1998, Harvard University made the test available online as part of the research they conduct for Project Implicit. So far, more than 25 million people have tried the IAT there. Calvin Lai, director of research at Project Implicit, says participation has “increased dramatically with the resurgence of the Black Lives Matter movement”.

Taking the test can be an eye-opening experience. Even if we don’t consider ourselves to be biased, our behaviour, tracked down to milliseconds, usually says otherwise. Lai says that their data from 2007 to 2015 shows that 73% of white people, 34% of black people, and 64% of people of other races have a pro-white, anti-black bias. This bias is so pervasive that even children as young as four, of all races, show a bias in favour of white people.

In our evolutionary past, implicit biases may have helped us make quick assessments about other people, animals or situations so we could decide to fight or run away. Today, however, they can lead us to discrimination – or worse.

Police departments are also not alone in hoping that tackling unconscious bias can bring about change. Multinational corporations such as Starbucks have mandated implicit bias training in response to racist incidents involving their employees, and the BBC has made it mandatory for staff to go on an unconscious bias course.

The deep-rooted nature of implicit biases means they can be especially hard to overcome.

“Research shows that it’s easy to change attitudes for a short period of time, maybe a few hours, but it’s hard to change for more than a day,” says Jeffrey Sherman, a psychologist who ran the lab where I worked at the University of California, Davis.

Still, studies point to some strategies that might work.

In a 2012 study, participants were made aware of their bias by taking the IAT before viewing a presentation that discussed their own level of bias and explained how implicit bias can lead to discrimination and negative consequences for minorities. They then learned about cognitive strategies for reducing bias. Two months later, the participants had lower IAT scores and indicated they had more concern about bias and greater awareness of bias in their own behaviour.

Among the strategies used for reducing bias were exercises aimed at individuation, which involves testing the assumptions people might commonly make about someone.

Patricia Devine, a psychologist at the University of Wisconsin-Madison who led the study, poses a situation in which a tall, young black man is walking on a college campus. “A student might assume he’s on the basketball team,” she says. In this situation, Devine suggests, if people check the assumption they will likely realise there is no evidence for it other than the stereotype.

For every one person who confirms a bias, it takes three to disconfirm a stereotype to balance it out – William Cox

Devine says another approach can be to counter or replace stereotypes. But combating stereotypes is not easy. Research due to be published later this year by Devine and her colleagues Xizhou Xie and William Cox shows that encounters that contradict strongly held stereotypes are weighed less heavily in our minds. “We found that for every one person who confirms a bias, it takes three to disconfirm a stereotype to balance it out,” says Cox.

Returning to our example, that would mean the student would need to come across more than three black non-basketball players for every black basketball player they encounter to reduce their bias.

Devine isn’t sure we can ever fully get rid of our biases. She and other experts think that a better approach may be to use the strategies to better manage our behaviour.

In a 2017 follow-up to their 2012 study, she and her colleagues found that students who completed the bias habit-breaking intervention they had developed did not differ in their level of bias on the IAT from controls who did not complete the intervention – both groups decreased their bias. But they did find that, two years later, students who completed the training were more likely to publicly disagree on social media with what they thought was an online newspaper column in favour of racial stereotyping.

So while the students’ scores on the IAT did not appear to change, their long-term behaviour did, perhaps because they were more motivated to recognise and challenge bias, argue Devine and her team.

“I don’t know if we could ever get rid of those underlying associations, but we can make them less powerful in our thinking,” says Devine. “We can learn to recognise the ways in which (biases) lead us to conclusions that we understand are not appropriate or unwarranted, and then we can do something else that reflects your intentions and your values much more.”

Devine and her colleagues have also shown that interventions emphasising gender bias led to increased hiring of female faculty in science, technology, engineering, mathematics and medicine departments at the University of Wisconsin-Madison. While the percentage of women hired in the departments that did not receive the anti-bias training stayed at around 32%, departments that received the training saw the proportion of women hired over the subsequent two years jump to 47%.

Black faces looked more criminal to police officers – the more black, the more criminal

But can this sort of training be effective in the highly charged and confrontational environment of law enforcement and criminal justice?

A series of experiments at Stanford University found that priming 61 police officers from an unidentified urban department in the US with words related to crime like “violent”, “crime”, “stop”, “investigate”, “arrest” and “shoot” increased their visual attention toward black faces over white faces.

The police officers were also asked to look at pictures of black and white faces who, they were told, might be criminals (they were really pictures of Stanford University employees). They were asked to judge which ones “looked criminal”. Not only did the officers judge more black faces as criminal, but faces that had been rated as more “stereotypically black” – which included having darker skin – by another group of study participants, were considered more criminal than those deemed to be “less black”. “Black faces looked more criminal to police officers – the more black, the more criminal,” the authors write.

The findings are only those of a relatively small sample of police officers from a single police department, and so care must be taken with generalising the results to all police forces.

But other research by Jennifer Eberhardt, a psychologist at Stanford University, has also found that the more “stereotypically black” a defendant is perceived to be, the more likely they are to be found guilty and sentenced to capital punishment. Prompting police officers to think of capturing, shooting or arresting also leads their eyes to settle on black faces.

How implicit racial bias factors into the decisions taken by officers when they choose to use deadly force is still being researched. Some simulations have found that police officers are more likely to mistakenly shoot unarmed black people, while others have shown that officers are more hesitant to shoot armed black suspects than armed white suspects.

Simulations, however, are different from the real world. And analyses of real-world data have found bias against minorities. One study of all 991 incidents of people being killed by police in the US in 2015, for example, found that black people were more than twice as likely as white people to be unarmed when they were killed by police. “It seems that… officers subconsciously perceived minority civilians to have been a greater threat than they were,” the authors write.

But researchers say that it would be wrong to conclude that police officers have more bias than the wider population.

There’s lots of research with police officers that shows they don’t have more bias than non-police officers do at the implicit level – Jeffrey Sherman

“There are clearly some cops that are just racist, and they won’t be motivated to reduce their implicit bias,” says Sherman. “But more generally they don’t seem more prone than the rest of us to demonstrating bias in a short window of time. There’s lots of research with police officers that shows they don’t have more bias than non-police officers do at the implicit level.”

What is different with police officers is the situations they find themselves in, which require lightning-fast decisions and actions – and the force, both legal and physical, that they are equipped with.

“When you have that kind of time and response restriction, bias is going to show,” says Sherman. “If you’re working in a bookstore, the consequences [of having implicit bias] are far less consequential than if you’re a cop.”

What we know is that officers can be trained to improve their decision making in such situations through computer simulations, and some experts are hopeful that means they can improve in real life.

Lorie Fridell, a professor of criminology at the University of South Florida, also works as an implicit bias trainer for police departments across the United States, including the NYPD. In her training, she presents some of Eberhardt’s work to officers to illustrate how the majority of people have an implicit bias that associates certain races with crime. She then teaches them strategies for managing their biases.

“Something just really quick and easy, for instance, is what we call the ‘litmus test’,” she says. “Ask yourself, would I be stopping this person but for the fact that they are, fill in the blank: transgender, male, Hispanic, homeless, and so forth,” she says.

Another strategy she emphasises is for officers to slow down whenever possible and focus on collecting more data instead of relying on stereotypes.

Given the scarcity of research, this is not the kind of bias training or anti-racist training I would reach for in this moment – Betsy Paluck

“We also emphasise to beware of the biases of others,” she says. “We talk about ‘complicit bias’ – meaning if you detect bias in your colleague and you don't do anything about it, you're guilty of complicit bias. But we also talk about bias on the part of community members, or ‘profiling by proxy’. This is when community members call the police based on their own biases.

“But what we want to say to the officers is have your antenna up for the possibility that you're being called to this scene based on a community member’s biases and you shouldn’t just pick up their biases in your policing.”

But not everyone is convinced of the effectiveness of implicit bias training, particularly when some police departments are paying contractors millions of dollars for it.

“Some believe that if implicit bias training includes actions and ways to channel motivation to control implicit bias, it can be successful,” says Betsy Paluck, a psychology professor at Princeton University, who co-authored a forthcoming meta-analysis on the effectiveness of prejudice reduction strategies.

And there are other factors that can also influence how prone to unconscious bias and stereotyping we are. Fatigue, for example, can increase the implicit association between race and crime. It can also inhibit an officer’s ability or willingness to deescalate a situation.

“We found that officers who were sleep deprived had higher scores for implicit association between black Americans and weapons,” says James. “There are two potential possibilities. One is that sleep affects base implicit bias. Or, the more likely, fatigue affects their ability to monitor and block their implicit bias. We have also found that officers who are tired are either less able or less willing to de-escalate and tend to be quicker to shoot.”

It suggests that efforts to reduce the levels of fatigue among police officers might also indirectly reduce their reliance on implicit bias.

Long-term social contact – in person or online – with other groups is the best way to change your attitude about those groups

Some interventions designed to manage biased behaviour have already proven helpful. Eberhardt and her colleagues worked with the Oakland Police Department to reduce the number of racially motivated traffic stops by adding a simple yes-or-no question to the top of the officers’ form: “Is this intelligence-led?”

They found that in the year following the change to the form, stops of black people dropped by 43% – and far from causing the city to become more dangerous, the crime rate actually continued to drop.

But if we are determined to reduce our biases so they are more aligned with our beliefs, what else can we do beyond regulating our behaviour to improve? 

Experts suggest changing our experiences and environment will change our biases in the long run.

Fridell volunteered at a homeless shelter for seven years, which she says changed her attitudes and implicit bias toward homeless people. She shares the anecdote in her courses to emphasise the long-term commitment it takes to change biases. It’s also the logic behind community outreach initiatives like Coffee with a Cop, which allow community members to connect with police officers in a non-law-enforcement context and vice versa.

Lai agrees that long-term social contact – in person or online – with other groups is the best way to change your attitude about those groups. For police forces trying to reduce biases in their ranks, it is a valuable lesson.

“We find it is one of the most robust ways to reduce prejudice and discrimination, be it implicit or explicit,” Lai says. “The ideal form of contact is actual interpersonal contact in daily life.”

Monday, 20 July 2020

Why we need to be cautious of the romanticism associated with democracy

Varghese K.George

A political scientist explores the transition from early to modern democracy and points out that it is an experiment whose transformation is ongoing amid a fresh wave of technological progress

Debates on democracy are often noisy and inconclusive, appropriately perhaps, but there is near-universal agreement on its moral superiority over other forms of social organisation. David Stasavage, professor of politics at New York University, is a self-declared optimist on democracy but he is cautious of the romanticism associated with it. His new book, The Decline and Rise of Democracy, is a rich and coherent account of democracy’s evolution over millennia and across diverse geographical and environmental settings, “a deep history.”

“The democracy we have today is but one potential way of organizing things,” he writes, and there could be other forms too. The volume pays particular attention to autocracy, which is often considered democracy’s antithesis. There is nothing inevitable about the birth of a democracy and nothing deterministic about the course of its evolution, but a long view of history allows some generalisations.

People and rulers
Rulers listen to the people when they need to, rather than as an act of enlightenment — it could be to devise efficient mechanisms for tax collection at one point and to mobilise soldiers at another. The ruler needs a council to gather information and seek the consent of the ruled when he is weak and his powers are not far-reaching.

Places where an efficient bureaucracy took root early on turned out to be less hospitable to democracy — China and Islamic West Asia being the living examples.

The Communist Party of China or the Islamic ideology cannot be linked to the present-day organisation of these societies in any absolute terms. Islam had consultation as an elementary component of its faith but early Islamic empire builders inherited strong bureaucracies that made resource extraction and exercise of power easy in the lands they freshly conquered. They did not need councils. The CCP built on the long tradition of bureaucratic control over people in China. After the revolutionary takeover of the state, Mao Zedong declared that “our present task is to strengthen the people’s state apparatus.” Technological advancements that reinforce bureaucratic authority can be inimical to democracy in certain situations.

The historically diffused nature of its social organisation, its ‘king and council’ template, was the differentiator that made Europe fertile for the seeds of modern democracy. It is here that representative democracy takes its current form. Commercial vibrancy and democracy are not necessarily correlated, with China and Islamic empires offering illustrations. There is also no empirical evidence historically to validate the suggestion that democracy creates wealth or wealth creates democracies. When France turned into a democracy with the Third Republic in 1870, its per capita income was around the same as that of Tanzania today.

As for creating wealth, democracies and autocracies both have advantages and disadvantages. Poor countries have become democracies too, and India is a forceful example. The village council as an institution survived the many empires that occasionally created prototypes of a central authority. That resistance to centralisation persisted, and it helped the survival of Indian democracy.

What happened in America
Europeans transplanted to the Americas, where no form of state existed, built a robust form of democracy. Land was in abundance, labour was in shortage, and there was no apparatus of state control. The only means of forming a community was allowing the participation of everyone.

Classical ideas and medieval experiments in democracy in Europe found the perfect setting for growth and evolution in America. Suffrage was most expansive in America — but it was still restricted to white males. It would take several centuries before African Americans — brought as slaves to create a labour class that did not have political options — could get equal voting rights.

Chiselled and formatted in the U.S., modern democracy circulated back to Europe and other parts of the world, but this should by no means obscure the fact that democracies existed in many societies in antiquity, including what is present-day Bihar in India.

Mass redistribution
An old elite worry that democracy might force mass redistribution of wealth has turned out to be unfounded. In fact, democracy has not even resulted in any massive reduction in inequality. In recent years, representative democracy has raised fresh concerns about trust and the concentration of executive power.

Altogether, this volume is an unsentimental and rigorous analysis of democracy drawn from the author’s engagement with the topic over two decades. “In the end, China is not a deviation from the European pattern of political development; it is simply a different path that has its own logic to it and may well stay that way,” he says, in a suggestion that might not please democratic evangelists. The author is also critical of what passes for democracy these days. “Instead of only asking whether democracy will survive, we need to also ask whether we will be satisfied with the democracy that does survive.” Democracy is facing its biggest threat in history in the fresh wave of unprecedented technological progress. This volume helps us look into the future, and one might be unsettled by what can be seen.

Tuesday, 14 July 2020

Why Live Aid was the greatest show of all


Mark Beaumont
In the swarming backstage scrum of Wembley Stadium, rock’n’roll supernovas collided. A sweaty post-gig Pete Townshend embraced Elton John, en route to the stage. Freddie Mercury accosted Bono by the makeshift Hard Rock Cafe constructed for the occasion. David Bowie and Paul McCartney mock-boxed for the cameras between luxurious ferns. On one dressing room door, a sign listed the three acts scheduled to use it during the day, then the mysterious “Ensemble Male”. A space reserved, rumours abounded, in case it was needed by the surviving, reunited Beatles.

Out front, history was in the making. “It’s 12 noon in London, 7am in Philadelphia,” BBC presenter Richard Skinner had told an audience of almost 2 billion across the globe, 40 per cent of the Earth’s 1985 population. “And around the world it’s time for Live Aid.” The Coldstream Guards struck up the royal salute, Status Quo piled into “Rockin’ All Over the World” and, 35 years ago today, the biggest, most ambitious concert that had ever been staged crashed onto 500 million TV sets from New York to Tokyo, Moscow to Montreal. By the time it wrapped up 16 hours later with an emotional rendition of “We Are the World” at the JFK Stadium in Philadelphia, legends had been created and £50m raised for famine relief. When Freddie Mercury, at the climax of his famous call-and-response “Aaaaaay-o” segment, struck what would become known as The Note Heard Around the World, he couldn’t have imagined it would resonate with a global tone of such rich compassion.

With a nebula of stars queueing up to perform at two simultaneous stadium shows in London and Philadelphia, Live Aid wasn’t just the greatest gig on Earth. It was the birth of music as a formidable humanitarian and philanthropic force, a defining peak of Eighties musical pomp and splendour, and the culmination of rock’s decades-long expansion to critical mass. It was also a gigantic leap of faith, built from Bob Geldof’s determination to hustle, bully and cajole the greatest show he could imagine into reality.

Following the 3 million-selling success of Band Aid’s “Do They Know It’s Christmas?” the previous year, which became the fastest-selling UK single ever and raised £8m for Ethiopian famine aid, the perception might well have been that Geldof now possessed a golden Filofax and had the biggest names in rock at his beck and call. In fact, when Boy George suggested organising a star-studded concert after Geldof and assorted Band Aid alumni joined Culture Club for an encore of the single at Wembley Arena in December 1984, it took every ounce of Geldof’s single-minded guile and resolve to pull it off.

“He was a charismatic leader,” says Live Aid’s UK production manager Andrew Zweck today. “He was inspiring, he motivated us. The greatest legacy of Live Aid for me personally, is the example of how Bob Geldof’s leadership demonstrated the power of the individual. How the voice and action of just one person could start a movement that could make a difference.”

First stop, promoter Harvey Goldsmith’s office. “I didn’t really get a chance to say no,” Goldsmith told The Observer in 2004. “Bob arrived in my office and basically said, ‘We’re doing this’.” Back in 1986, he told Rolling Stone, “Bob said this should be the definitive statement for the music business. He said we ought to do a show in England and one in America as well. The idea was to do a worldwide television hook-up and raise money with a telethon … He asked, ‘Is it possible?’, and that’s when the [organisational] nightmare started.”

Goldsmith set about hiring Wembley Stadium and his US counterpart Bill Graham secured the 100,000-capacity JFK Stadium for the American leg of the show – all at a charity discount and with everything from hotel rooms to flights, hire cars and food donated; a show that would usually have cost $20m came in at a fifth of the price. And Geldof entered into the most high-stakes game of bluff in his life. “He’d say to Bowie, ‘Queen are doing it, Elton’s doing it’ and he was making it up,” says Zweck. “Then he’d say to Elton ‘oh yeah, Bowie’s definitely in, I’ve spoken to him’. He played that game very successfully in the end. Bob was really good at that. He was running at 100mph most days, all kinds of ideas and demands flying in all directions.”

“When I announced it, the only one who was dithering, as ever, was Bryan Ferry,” Geldof himself told The Observer. “So I just said, ‘... and Bryan Ferry.’ And he rang to say, ‘I didn’t say “yeah”.’ I said, ‘Well, say no, then. You’re the one who can announce it though.’”

Ferry, it turns out, wasn’t the only artist dithering. When Geldof officially announced his “global jukebox” at press conferences in London and New York on 10 June, he reeled off a line-up that was significantly more TBC than he made out. Bowie, Elton, The Who, Eric Clapton, U2, Madonna, McCartney, Robert Plant, Dire Straits, Phil Collins and a plethora of New Romantic chart-pop acts of the day were certainly all signed up. But hearing their names among the roll-call came as a surprise to the likes of Mick Jagger, Paul Simon, Huey Lewis, Tears For Fears and Stevie Wonder, most of whom were still undecided about appearing or, in Wonder’s case, had already declined. “Mick was a bit surprised,” Jagger associate Tony King told Rolling Stone. “But he wasn’t annoyed. He thought, ‘Okay, now I’ve actually got to do something.’” Within weeks Jagger was in Goldsmith’s office, discussing whether it would be possible for him to perform a transatlantic duet with Bowie, or for one of them to perform from space.

As the A-listers piled up, so did the offers. Although some in the UK team suspected Graham was warning off some big names, such as Paul Simon and Whitney Houston, uncertain that Geldof could pull it off, Graham soon had 100 major artists asking to perform and the world’s most famous musicians jostling for prime-time slots. Graham had to turn down arena-filling acts such as Foreigner and Yes even when he’d shortened sets and brought the US show’s start time forwards from noon to 9am to fit more performers in.

The press duly noted that, considering this was a concert for African famine relief, Geldof had struggled to find black acts for the bill. Diana Ross and the Pointer Sisters were on tour, Donna Summer studio-bound, Michael Jackson, according to his publicist Norman Winter, “immersed in a couple of heavy projects… He’d like to have done the show, but it’s impossible.” Even Live Aid’s suggestion of jetting him in for a duet with Jagger or McCartney wouldn’t fly: “He and Paul work very well together, but he has other commitments.” When Graham learnt that someone in his New York office had turned down an offer to play from Run-DMC, he swiftly rang to rectify the error.

Nonetheless, there were high-profile absentees. Prince, one of the most famous no-shows for the recording of “We Are the World”, sent a pre-recorded video to be screened in Philadelphia. Bruce Springsteen, who’d played at Wembley the week before, left his stage for the event’s use but didn’t feel he could stop his band taking a well-earned vacation. Billy Joel declined to play, fearing that a solo piano set would be lost in a screaming stadium. Huey Lewis pulled out of the Philadelphia show, citing concerns that the money raised by the associated charity singles wasn’t reaching the people who needed help. Cliff Richard later stated that he was unable to perform, although Geldof couldn’t remember asking him.

Camped out in an office at Phonogram Records in the weeks before the show, Geldof hustled, ruthlessly, right up to the last minute. He drove up the price of licensing the broadcast to ABC in America by pretending rival US channels CBS and NBC were interested (they weren’t), and threatened to pull the broadcast from Germany unless it agreed to run a telethon. A lot of tables were thumped long before he famously demanded the BBC viewing audience “give me the money”.

It got results. The night before the main event, Eric Clapton, Simple Minds’ Jim Kerr and Bryan Adams were spotted in the bars and restaurants of the Philadelphia Four Seasons hotel, where Ozzy Osbourne and Judas Priest’s Rob Halford worried, over tea, whether they might disintegrate when playing in sunlight. Meanwhile hordes of Duran Duran fans besieged the Palace Hotel across the road, which had laid on a sumptuous pool party late into the night for performers including Mick Jagger, Hall & Oates and Robert Plant. Jagger was fresh from his raunch-heavy rehearsal at the stadium with Tina Turner: “We both had to say that we wouldn’t go too far, the way we normally would at a show,” he told Rolling Stone. “MTV might stay on, but I don’t know about ABC.”

One of the few performers not at the Palace party was Bernard Watson. An amateur folk singer just out of high school, Watson had driven to Philadelphia from Miami Beach determined to play at the biggest show on earth. He hunted down Bill Graham to give him a tape and slept in his car outside the stadium for days. Graham, who had liked the tape, eventually went out to meet him. “You mind opening for Joan Baez?” he asked, and put him first on the bill, before the cameras rolled. Charity was in the air.

Geldof himself spent the night before the show sleeping on towels because of night sweats, terrified no one would show up. And, at the stadium, Zweck had his own concerns. “It was intense,” he recalls. “It was brand new. How do you put on 22 acts in one day down to the second? The BBC had written this time schedule flip-flopping with the broadcast, and it said ‘at 7.22pm The Who will leave the stage’ and we thought ‘how the hell are we gonna stick to a schedule like that?’ So there was a lot of concern. I remember one week previous being very scared and phoning every experienced rock’n’roll production manager in the world saying ‘quick, wherever you are, jump on a plane, come to Wembley, we’re out of our depth!’ None were available, they were all in Philadelphia. But we put a team of volunteers together and pulled it off.”

And how. The crowds poured in beneath Wembley’s famous twin towers, and Noel Edmonds’ helicopter company ferried the stars from Battersea to a nearby cricket ground, where the local teams continued their tournament between arrivals (Edmonds claimed that Bowie’s management insisted, as a prank, that he would only fly in a blue helicopter). 13 July arrived with the air of history in the making. Everyone from Elton to Adam Ant lined up in the banquet hall to greet Prince Charles and Diana, and Geldof, Queen and Bowie took seats with them in the royal box to watch the opening acts, basking in the phenomenal buzz of the crowd.

“There was a fantastic atmosphere, positive and joyous, everyone was supporting each other,” says Zweck, who was overseeing operations from side of stage. “The sun was shining, it was 12 noon, Status Quo started rocking and we were walking on air. You just knew that it was super special.”

Geldof took an early slot with The Boomtown Rats. “It was only when I walked on stage with the band that the romance of it and the hugeness of it got to me,” he said. “That moment when I pull up sharp on ‘I Don’t Like Mondays’ – ‘and the lesson today is how to die’ – time became elastic, like I stood there for hours and my hand just stayed in mid-air.”

Geldof stole the afternoon, and it was a tough afternoon to steal. By 3.20pm the crowd had already been graced with sets by some of the biggest chart names of the decade – Adam Ant, Ultravox, Spandau Ballet, Nik Kershaw, Sade and Elvis Costello, who led Wembley in a rendition of an “old northern English folk song”, “All You Need Is Love”. And here, while the event was still warming up, were Sting and Phil Collins tag-teaming Police hits with “Against All Odds (Take A Look At Me Now)” and “In The Air Tonight”. Collins raced straight from the stage to Edmonds’s helicopter, bound for Heathrow to catch a Concorde flight to the US, where he would join the Philadelphia leg. On board Concorde he ran into Cher, who was oblivious to the whole event. “She asked what was going on,” Collins told The Observer. “I told her about Live Aid and she asked whether I could get her on. I told her to just turn up.”

Arriving in Philadelphia, Phil Collins walked into a scene of sheer pandemonium, a tsunami of stars. Backstage, The Beach Boys and Crosby, Stills and Nash were being swarmed by photographers and TV crews. Madonna and Sean Penn, inseparable to the point of reportedly going into a one-person portable toilet together, shunned the press, preferring to hang out in their trailer with Jim Kerr and Chrissie Hynde. “This thing today is like 100,000 Ed Sullivan Shows,” Tom Petty quipped to Rolling Stone. “It was bedlam backstage,” Bryan Adams told The Observer. “I remember I walked up the stairs to the stage and Yoko Ono passed me. When I got to the top of the stairs someone said that I was to start after the gentleman introduced me. That gentleman was Jack Nicholson.”

After a guest spot with Clapton and a solo set at JFK Stadium, Collins stuck around for what he thought was going to be a low-key performance with his old friends Jimmy Page and Robert Plant, but which had been upgraded to a Led Zeppelin reunion. What followed would be described as “one of the worst rock’n’roll reunions of all time”. Plant’s voice was shot, Page’s guitar was out of tune and Collins’s drumming was unregimented at best.

“Robert told me Phil Collins wanted to play with us,” Page later explained to The Scotsman. “I told him that was all right if he knows the numbers. But at the end of the day, he didn’t know anything. We played ‘Whole Lotta Love’, and he was just there bashing away cluelessly and grinning. I thought that was really a joke.” It was, Plant said elsewhere, “a f***ing atrocity for us … It made us look like loonies.”

“It wasn’t my fault it was crap,” Collins told Ultimate Classic Rock. “If I could have walked off, I would have. But then we’d all be talking about why Phil Collins walked off Live Aid – so I just stuck it out.”

The band have refused to allow the footage to be screened since, and a jet-lagged Collins swiftly escaped to his New York hotel, arriving just in time to catch Cher joining in the final massed singalong of “We Are the World” on TV.

Back in London, Wembley wasn’t without its own hiccups. Bryan Ferry began his set to find his microphone was broken, his guest David Gilmour’s guitar wasn’t working and his drummer had destroyed a drum skin on the very first strike. The onstage traffic light system, designed to give acts a two-minute warning before the plug was pulled, got mysteriously smashed during The Who’s raucous performance. And Paul McCartney’s surprise rendition of “Let It Be” towards the end of the show was famously inaudible for the first two minutes.

“The crew got tired,” says Zweck. “They were plugging his vocal into the green channel and pulling up the fader of the blue channel so people couldn’t hear. We’d had meetings in his office, they said ‘the technical thing is really important’ and I said ‘don’t worry, we’ve got the best people in the world, it’s all gonna be great’ and sadly it wasn’t. I’ve always felt embarrassed about that.”

None of which detracted from an event of historic impact and significance, and some of the most memorable live moments of the decade. U2, then a lesser-known act, turned their set into a breakthrough, even though their closing song “Pride (In The Name Of Love)” had to be cut: Bono, sporting one of the Eighties’ lushest mullets, noticed 15-year-old Kal Khalique being suffocated as the crowd surged towards him at his beckoning, and while the band elongated “Bad” to 14 minutes he leapt off the stage to help rescue her and dance with her. Khalique later claimed Bono saved her life that day.

Bowie also cut the song “Five Years” from his set in order to screen a video of footage from the famine accompanied by The Cars’ “Drive”, a film so moving that phone donations – which had reached £300 per second when a tired and emotional Geldof had visited the BBC booth to demand viewers empty their pockets – rocketed further. Speaking to The Tube backstage after his performance, Bowie was asked about his plans for the rest of the evening. “I’m going to go home,” he said straight down the camera, “and I’m going to have a really good f***.”

It was Queen’s magical 22-minute set, however, which has come to epitomise Live Aid. Mel Smith and Griff Rhys Jones, dressed as policemen investigating a noise complaint from Belgium, introduced the band, and Mercury jogged onstage for a career-defining performance: the piano intro of “Bohemian Rhapsody” gave way to stadium-wide cult clapping for “Radio Ga Ga”, “We Are The Champions” turned Wembley into a sea of swaying arms and Mercury bestrode the event like a moustachioed Colossus with a baton-mike sceptre. “I remember a huge rush of adrenaline as I went on stage and a massive roar from the crowd,” Brian May told The Observer, “and then all of us just pitching in. Looking back, I think we were all a bit over-excited, and I remember coming off and thinking it was very scrappy. But there was a lot of very good energy too. Freddie was our secret weapon. He was able to reach out to everybody in that stadium effortlessly, and I think it was really his night.”

As for Geldof, it was a stressful, highly charged day. His mood ricocheted throughout, aggravated by pain from a sprained back that kept him slightly hunched. By the time he was gathering a stage full of stars for the finale of “Do They Know It’s Christmas?” he was thoroughly exhausted, and he was carried shoulder-high by Townshend at the show’s end towards a much-needed rest.
In Philadelphia, the party raged on. At 1am in a second-floor suite at the Palace Hotel, Keith Richards, Ronnie Wood and Bob Dylan chatted with Jimmy Page and Stephen Stills about their various onstage mishaps. “Fun?” said Dylan of his three-song set with Richards and Wood. “No, we couldn’t hear anything.”

“Would have been better if we’d gotten paid,” Richards joked to Rolling Stone. Indirectly, though, most of them did. As the CD era was dawning, sales of the acts involved with Live Aid soared. Collins, Madonna, U2 and Queen saw their records catapulted back into the charts, and one of the most immediate legacies of the show was its cementing of a top tier of heritage musicians who would hob-nob with Charles and Diana at similar events over the coming years – a rock’n’roll royalty of their own.

Financially, the success of the event would come into question. Huey Lewis was right to be concerned about how effectively the money raised was being used to help the victims of famine. In the wake of the Band Aid single, relief food was left to rot in Ethiopian docks as the country’s dictatorial leader Mengistu Haile Mariam – who had helped to bring on the famine by napalming farmland – prioritised the unloading of weapons for his four internal conflicts. The $127m raised by Live Aid helped to break the trucking cartel that was stopping relief getting into the country but, according to investigations by Spin in 1986, much of it was funnelled through Mengistu’s government, which used the money to purchase hi-tech weaponry from the Soviet Union and the food to lure his people into a brutal resettlement programme that killed hundreds of thousands. “I’ll shake hands with the devil on my left and on my right to get to the people we are meant to help,” Geldof said in response to warnings from aid group Médecins Sans Frontières. But both devils were channelling his charity away from the starving.

The beneficial legacy of Live Aid, however, cannot be overstated. In its wake, governments woke up to the swell of public support for global humanitarian relief and began to place it at the heart of foreign policy decisions. “We took an issue that was nowhere on the political agenda,” Geldof told The Guardian, “and, through the lingua franca of the planet – which is not English but rock’n’roll – we were able to address the intellectual absurdity and the moral repulsion of people dying of want in a world of surplus.” The ripple effect of Live Aid, in terms of lives indirectly saved, is incalculable.
“What I’ve seen over the 35 years,” says Zweck today, “is the awakening of the social conscience of the music industry, with artists realising they had a power and they could do good with that power. We saw after that Bono and Sting, Roger Waters, using their voice, their position and their platform to push for causes they believe in. It would change people’s perspective of charity and mobilise public opinion to such an extent that government policies in the developing world and other areas would be altered thereafter. You can look back at Live Aid and see that’s where it started. Governments now listen, and that all started with a pop concert.”