More than just sanctuary, migrants need social citizenship #seventhperson

By Nancy Berlinger, a research scholar at The Hastings Center in New York. Her most recent book is Are Workarounds Ethical? Managing Moral Problems in Health Care Systems (2016). She co-directs the Undocumented Patients project. | Via Creative Commons Attribution-No Derivatives


In 1975, the English author John Berger wrote about the political implications of immigration, at a time when one in seven workers in the factories of Germany and Britain was a male migrant – what Berger called the ‘seventh man’. Today, every seventh person in the world is a migrant.

Migrants are likely to settle in cities. In the United States, 20 cities (accounting for 36 per cent of the total US population in 2014) were home to 65 per cent of the nation’s authorised immigrants and 61 per cent of unauthorised immigrants. In Singapore, migrant workers account for 20 per cent of the city-state’s population. (Migrants continue to be a significant rural population. In the US, three-quarters of farm workers are foreign-born.)

Scholarship on migration tends to focus normative arguments on the national level, where policy concerning borders and immigration is made. Some prominent political philosophers – including David Miller at Nuffield College, Oxford, and Joseph Carens at the University of Toronto – also outline an account of ‘social membership’ in receiving societies. This process unfolds over five to 10 years of work, everyday life and the development of attachments. As Carens writes in ‘Who Should Get In?’ (2003), after a period of years, any migrant crosses a ‘threshold’ and is no longer a stranger. This human experience of socialisation holds true for low-wage and unauthorised migrants, so a receiving society should acknowledge that migrants themselves, not only their economic contributions, are part of that society.

Carens and Miller apply this argument to the moral claims of settled migrants at risk of deportation because they are unauthorised or because the terms of their presence are tightly limited by work contracts. In the US, for example, most of the estimated 11.3 million people who crossed a border without authorisation or are living outside the terms of their original visas have constituted a settled population for the past decade, with families that include an estimated 4 million children who are US citizens by birthright. In The Ethics of Immigration (2013), Carens writes that the prospect of deporting young immigrants from the place where they had lived most of their lives was especially troubling: it is ‘morally wrong to force someone to leave the place where she was raised, where she received her social formation, and where she has her most important human connections’. Miller and Carens concur with the Princeton political theorist Michael Walzer’s view of open-ended guest-worker programmes as ethically problematic. The fiction that such work is temporary and such workers remain foreign obscures the reality that these migrants are also part of the societies in which they live and work, often for many years, and where they deserve protection and opportunities for advancement.

Not all migrants will have access to a process leading to national citizenship or permanent legal residence status, whether this is because they are unauthorised, or their immigration status is unclear, or they are living in a nation that limits or discourages immigration while allowing foreign workers on renewable work permits. If we agree that migration is part of the identity of a society in which low-wage migrants live and work, whether or not this is acknowledged by non-migrants or by higher-status migrants, what would it mean to build on the idea of social membership and consider migrants as social citizens of the place in which they have settled? And what realistic work can the idea of social citizenship do in terms of improving conditions for migrants and supporting policy development?

Social citizenship is both a feeling of belonging and a definable set of commitments and obligations associated with living in a place; it is not second-class national citizenship. The place where one’s life is lived might have been chosen in a way that the nation of one’s birth was not; for a Londoner or a New Yorker, local citizenship can be a stronger identity than national citizenship. Migrants live in cities with a history of welcoming immigrants, in cities that lack this history, and also in cities where national policy discourages immigration. Considering how to ensure that social citizenship extends to migrants so that they get to belong, to contribute, and to be protected is a way to frame ethical and practical questions facing urban policymakers.

Considering migrants as social citizens of the cities in which they settle is related to but not the same as the idea of the city as a ‘sanctuary’ for migrants. Throughout the US, local officials have designated ‘sanctuary cities’ for undocumented immigrants subject to deportation under policies announced by the federal government in February 2017. This contemporary interpretation of an ancient concept refers to a policy of limited local cooperation with federal immigration officials, often associated with other policies supporting a city’s migrant population. Canadian officials use the term ‘sanctuary city’ similarly, to refer to local protections and potentially also to limited cooperation with border-control authorities. In Europe, the term ‘city of sanctuary’ tends to refer to efforts supporting local refugees and coordinated advocacy for refugee admission and rights. These local actions protecting migrants are consistent with a practical concept of social citizenship in which civic history and values, and interests such as being a welcoming, diverse or growing city, correspond to the interests of migrants. However, the idea of ‘sanctuary’ suggests crisis: an urgent need for a safe place to hide. To become social citizens, migrants need more from cities than sanctuary.

Local policies that frame social citizenship in terms that apply to settled migrants should go beyond affirming migrants’ legal rights and helping them to use these rights, although this is certainly part of a practical framework. Social citizenship, as a concept that should apply to migrants and non-migrants alike, on the basis of being settled into a society, can build on international human rights law, but can be useful in jurisdictions where human rights are not the usual reference point for considering how migrants belong to, contribute to, and are protected by a society.

What can a city expect or demand of migrants as social citizens? Mindful that the process of social integration usually takes more than one generation, it would not be fair to expect or demand that migrants integrate into a new society on an unrealistic timetable. Most migrants are adults, and opportunities to belong, to contribute, and to be protected should be available to them, as well as to the next generation. Migrants cannot be expected to take actions that could imperil them or their families. For example, while constitutionally protected civil rights in the US extend to undocumented immigrants, using these rights (by identifying themselves publicly, for example) can bring immigrants to the attention of federal authorities, a reality or fear that might constrain their ability to participate in civic life.

In his novel Exit West (2017), Mohsin Hamid offers a near-future fictional version of a political philosopher’s ‘earned amnesty’ proposal. Under the ‘time tax’, newer migrants to London pay a decreasing ‘portion of income and toil’ toward social welfare programmes for longstanding residents, and have sweat-equity opportunities to achieve home ownership by working on infrastructure construction projects (the ‘London Halo’). Today, the nonfictional citizens of Berlin are debating how to curb escalating rents so that the city remains open to lower-wage residents, including internal and transnational migrants. A robust concept of social citizenship that includes migrants who have begun the process of belonging to a city, and those who should be acknowledged as already belonging, will provide a necessary framework for understanding contemporary urban life in destination cities.

Nancy Berlinger

This article was originally published at Aeon and has been republished under Creative Commons.



American exceptionalism, from Stalin with love

By Ian Tyrrell | He is emeritus professor of history at the University of New South Wales in Sydney, Australia. His latest book is Crisis of the Wasteful Nation: Empire and Conservation in Theodore Roosevelt’s America (2015). Creative Commons Attribution-No Derivatives. Via Aeon


Every time a public figure uses the term ‘American exceptionalism’, ordinary Americans turn to my website. It’s number one for a quick answer to the question: ‘What is American exceptionalism?’ My latest benefactor was Hillary Clinton, who used the term in a speech on 31 August. My website hits spiked. Until about 2010, few Americans had heard the term. Since then, its use has expanded exponentially. It is strange that such an inelegant term should be adopted by two major political parties when so many people had not a clue what it meant. Of course, one doesn’t have to use the term to believe in the underlying concept. But the phrase has a history that helps us to understand the current hyperbolic use.

American exceptionalism is not the same as saying the United States is ‘different’ from other countries. It doesn’t just mean that the US is ‘unique’. Countries, like people, are all different and unique, even if many share some underlying characteristics. Exceptionalism requires something far more: a belief that the US follows a path of history different from the laws or norms that govern other countries. That’s the essence of American exceptionalism: the US is not just a bigger and more powerful country – but an exception. It is the bearer of freedom and liberty, and morally superior to something called ‘Europe’. Never mind the differences within Europe, or the fact that ‘the world’ is bigger than the US and Europe. The ‘Europe’ versus ‘America’ dichotomy is the crucible in which American exceptionalist thinking formed.

Some presume that the Frenchman Alexis de Tocqueville invented the term in the 1830s, but only once did de Tocqueville actually call American society ‘exceptional’. He argued that Americans lacked culture and science, but could rely on the Anglo-Saxons in Britain to supply the higher forms of civilisation. This is not what Americans mean by ‘exceptionalism’ today.

American exceptionalism is an ideology. The ‘ism’ is the giveaway. De Tocqueville examined US institutions and moral behaviours as structural tendencies of democratic societies. He did not see US democracy as an ideology. To him, the US was the harbinger of a future that involved the possible democratisation of Europe, not an unrepeatable outlier of civilisation. He studied the US as a model of democratic society, whose workings needed to be understood, because the idea was spreading.

Some think that Werner Sombart, the German socialist of the early 1900s, invented the term, but he did not. Sombart claimed only that US capitalism, and its abundance, made the country temporarily unfavourable terrain for the development of socialism. It was actually Joseph Stalin, or his minions, who, in 1929, gave the idea its name. It is surely one of the ironies of modern history that both major US political parties now compete to endorse a Stalinist term.

Orthodox communists used the term to condemn the heretical views of the American communist Jay Lovestone. In the late 1920s, Lovestone argued that the capitalist economy of the US did not promote the revolutionary moment for which all communists waited. The Communist Party expelled Lovestone, but his followers and ex-Trotskyites in the US embraced the exceptionalist epithet and, eventually, the idea that the US would permanently avoid the socialist stage of development.

After the Nazi-Soviet Pact of 1939, as well as later during the Cold War, many of these US Marxists jettisoned their old political allegiances but retained the mindset that the economic success of the US buried class struggle in their nation – permanently. As the leader of the free world, the chief victor in the Second World War over ‘totalitarian’ Germany, and by far the world’s most prosperous economy, the US seemed in all these ways an exceptional nation. Seymour Martin Lipset, the eminent Stanford political sociologist, made a career investigating the many factors that led to this American exceptionalism. Until his death in 2006, Lipset continued to hold that the US was not subject to the historical norms of all other nations.

No one did more than Ronald Reagan to amplify and popularise the US as exceptional. Refusing to accept the doldrums of the Jimmy Carter presidency or the transgressions of Richard Nixon as the best that Americans could do, Reagan promoted the image of the US as a shining ‘city upon a hill’. This reference is to a 1630 sermon by John Winthrop, the Governor of Massachusetts Bay Colony. Winthrop was calling on the Puritan settlers heading for Massachusetts to stick to the narrow path of Puritanism.

Reagan and his followers wrongly attributed American exceptionalism to this Puritan injunction, and added ‘shining’ to the original, which gave the phrase a distinctly different connotation. Nor was Winthrop referring to any nation, but rather a discrete community of English Protestant believers. Notably, Winthrop’s sermon had been neglected for centuries. It was resurrected only in the 1940s by a few Harvard academics who were engaged in an intellectual rehabilitation of Puritan thought. In a 1961 speech, John F Kennedy, who had been a Harvard student and was influenced by that university’s Americanists, used the ‘city upon a hill’ phrase. The idea of the US as a ‘city upon a hill’, however, really gained purchase in political rhetoric in the 1970s and ’80s, as Reagan sought to reinvent the country.

Without question, Reagan saw the US as an exceptional nation. The language of exceptionalism, however, derived from Marxism, not God. The idea of a morally superior and unique civilisation destined to guide the world did not come under the banner of an orthodox ‘ism’ until the 21st century. In the wake of 9/11, the speeches of George W Bush and his supporters asserted the radical distinctiveness of the US with a new belligerence. We have all heard it: it is ‘our freedoms’ that Islamic terrorists hated; they wished to kill Americans because they envied this exceptional inheritance.

The global financial crisis of 2007-10 added to the geopolitical turmoil that followed 9/11. Though the US economy expanded in the 1990s and early 2000s, economic inequality that began to grow in the Reagan era also became worse. In the post-1945 age, when academics first posed American exceptionalism as a coherent doctrine, the idea also became linked to global US military and political hegemony. In the past two generations, since the Reagan era, Americans have not prospered to the same extent, and American exceptionalism has been increasingly linked only to military hegemony.

Decline is, in fact, the midwife to the ideology of American exceptionalism. The less exceptional that circumstances in the US appear, the louder defenders of exceptionalism insist on orthodoxy. When the nation was indisputably powerful and its people prosperous, Americans did not collectively require an ‘ism’ to serve as a guiding light. In these more polarised times, when the fates of Americans become based more on their class and less on their shared nationality, the ideological orthodoxy of American exceptionalism has emerged on a political level. A previously obscure academic term became a rallying cry for a political agenda.

When Hillary Clinton joins the exceptionalist bandwagon, it reflects a political consensus that Donald Trump denies. In wanting to make America great again, Trump implicitly accepts that it is not currently ‘great’, and never was exceptional. No longer is the Republican Party the chief cheerleader of American exceptionalism. But the Democrats have picked up the mantle, and the language of exceptionalism continues to rally a party and a country.

Ian Tyrrell



Bragging rights: when beating your own drum helps (or hurts)

By Patrick Heck | He is a PhD candidate in social psychology at Brown University in Rhode Island, where he studies the Self, social judgment and decision making, and prosocial behavior. Via Creative Commons Attribution-No Derivatives

Social observers are particularly attuned to braggadocio. What do you think of a person who claims to be a better driver, performer or lover than average? Is this person better described as confident or cocky; self-important or honest? Would you put your health or safety in their hands? And what about the opposite type of person, who claims to be worse than others? Would you hire this person for a job? In what field?

Social scientists have been asking for decades whether boastful, self-aggrandising beliefs and behaviours are beneficial to those who make such claims. According to one school of thought, claiming to be better than others feels good, and when we feel good, we are happier and better adjusted. This argument suggests that bragging to others can satisfy the motive to craft and maintain a positive self-image. According to another line of research, however, consistently viewing oneself as superior entails a distortion of reality. Inaccurate individuals with low self-knowledge have weaker relationships and a tendency to make riskier decisions than their accurate, self-aware counterparts.

Together with Joachim Krueger at Brown University in Rhode Island, I recently proposed a middle ground: braggadocio could be a double-edged sword. In our paper in the journal Social Psychology, we argue that thinking you are better than average (and bragging to others about it) can damage some aspects of your reputation but boost others. Bragging can help or harm depending upon your goals – so you’d do well to know what you want to accomplish before tooting your own horn.

To test how observers respond to braggadocio and humility, we recruited nearly 400 volunteers and asked them to rate a series of target individuals along the two major dimensions of social perception: competence, including rationality, intelligence and naiveté, and morality, including ethics, trustworthiness and selfishness. Some of the targets were described as performing better or worse than average without making any claims. Some claimed to be better or worse than average without any evidence. Others made a claim about themselves (‘I did better/worse than average’) while the researchers also revealed their actual scores.

The results demonstrated several detrimental effects of boasting, although we observed some surprising benefits too. Perhaps the most interesting finding was what we call the ‘humility paradox’. In the absence of evidence (ie, a test score), bragging to be better than average boosted a target’s reputation as competent, but diminished their reputation as moral. Conversely, those who remained humble by claiming to be worse than average were rated as more moral and less competent than the braggarts. The paradox suggests that when deciding whether or not to boast about your performance, keen decision-makers might first stop to consider which aspect of reputation they are most interested in emphasising or protecting.

The results were especially nuanced when test subjects rated targets whose claims were either validated or violated by objective evidence (their actual test performance). For moral reputations, humility remained a beneficial strategy even when a target performed well. Across the board, participants rated targets who claimed to be worse than average as more moral than targets who claimed to be better than average, regardless of their actual performance. In the domain of morality, humility pays.

For perceived competence, evidence mattered. The absolute worst thing a target could do was to claim superiority (‘I am better than average’) when the evidence proved him wrong (‘Harry actually scored below average on the test’).

There was, to be sure, some strategic benefit to making a boastful claim: targets who claimed to be better than average were seen as quite competent when either:

(a) evidence supported this claim; or

(b) no evidence was available.

In other words, boasting appeared to benefit a target’s reputation as competent, so long as contradictory evidence was never revealed.

As is the case with most experiments in social psychology, these studies were conducted in a contrived laboratory setting, and carry several limitations. All our participants lived in the United States, although we know that cultural background can encourage or discourage boasting. Similarly, all the targets that our participants rated had male names in order to rule out any confounding effects of gender, even though we know that the gender of observers and targets plays an important role in social perception. Culture and gender are two variables we would like to incorporate in future studies on the nature and perception of bragging.

Despite these limitations, the results of our studies suggest a few strategies for daily life: in situations where your competence is of critical interest (such as a job interview or debate), claiming to be better than the other candidates could be beneficial, so long as contradictory evidence will never come to light. But in situations where your reputation as a warm or moral person is put to the test (say, while networking or on a date), it appears that humility is the best strategy, even if you truly have something to brag about.

Patrick Heck



Why rudeness at work is contagious and difficult to stop

By Trevor Foulk | He is a PhD candidate in business administration at the University of Florida. He is interested in negative work behaviours, team dynamics, decision-making, and depletion/recovery. Creative Commons Attribution-No Derivatives


Most people can relate to the experience of having a colleague inexplicably treat them rudely at work. You’re not invited to attend a meeting. A co-worker gets coffee – for everyone but you. Your input is laughed at or ignored. You wonder: where did this come from? Did I do something? Why would he treat me that way? It can be very distressing because it comes out of nowhere and often we just don’t understand why it happened.

A large and growing body of research suggests that such incidents, termed workplace incivility or workplace rudeness, are not only very common, but also very harmful. Workplace rudeness is not limited to one industry, but has been observed in a wide variety of settings in a variety of countries with different cultures. Defined as low-intensity deviant behaviour with ambiguous intent to harm, these behaviours – small insults, ignoring someone, taking credit for someone’s work, or excluding someone from office camaraderie – seem to be everywhere in the workplace. The problem is that, despite their ‘low-intensity’ nature, the negative outcomes associated with workplace rudeness are anything but small or trivial.

It would be easy to believe that rudeness is ‘no big deal’ and that people must just ‘get over it’, but more and more researchers are finding that this is simply not true. Experiencing rudeness at work has been associated with decreased performance, decreased creativity, and increased turnover intentions, to name just a few of the many negative outcomes of these behaviours. In certain settings, these negative outcomes can be catastrophic – for example, a recent article showed that when medical teams experienced even minor insults before performing a procedure on a baby, the rudeness decimated their performance and led to mortality (in a simulation). Knowing how harmful these behaviours can be, the question becomes: where do they come from, and why do people do them?

While there are likely many reasons people behave rudely, at least one explanation that my colleagues and I have recently explored is that rudeness seems to be ‘contagious’. That is, experiencing rudeness actually causes people to behave more rudely themselves. Lots of things can be contagious – from the common cold, to smiling, yawning and other simple motor actions, to emotions (being around a happy person typically makes you feel happy). And as it turns out, being around a rude person can actually make you rude. But how?

There are two ways in which behaviours and emotions can be contagious. One is through a conscious process of social learning. For example, if you’ve recently taken a job at a new office and you notice that everybody carries a water bottle around, it likely won’t be long until you find yourself carrying one, too. This type of contagion is typically conscious. If somebody said: ‘Why are you carrying that water bottle around?’, you would say: ‘Because I saw everybody else doing it and it seemed like a good idea.’

Another pathway to contagion is unconscious: research shows that when you see another person smiling, or tapping a pencil, for example, you will likely mimic those simple motor behaviours and smile or tap a pencil yourself. If someone were to ask why you’re smiling or tapping your pencil, you’d likely answer: ‘I have no idea.’

In a series of studies, my colleagues and I found evidence that rudeness can become contagious through a non-conscious, automatic pathway. When you experience rudeness, the part of your brain responsible for processing rudeness ‘wakes up’ a little bit, and you become a little more sensitive to rudeness. This means that you’re likely to notice more rude cues in your environment, and also to interpret ambiguous interactions as rude. For example, if someone said: ‘Hey, nice shoes!’ you might normally interpret that as a compliment. If you’ve recently experienced rudeness, you’re more likely to think that person is insulting you. That is, you ‘see’ more rudeness around you, or at least you think you do. And because you think others are being rude, you become more likely to behave rudely yourself.

You might be wondering, how long does this last? Without more research it’s impossible to say for sure, but in one of our studies we saw that experiencing rudeness caused rude behaviour up to seven days later. In this study, which took place in a negotiations course at a university, participants engaged in negotiations with different partners. We found that when participants negotiated with a rude partner, their partner in the subsequent negotiation reported that they behaved rudely. In this study, some of the negotiations took place with no time lag, sometimes there was a three-day time lag, and sometimes there was a seven-day time lag. To our surprise, we found that the time lag seemed to be unimportant, and at least within a seven-day window the effect did not appear to be wearing off.

Unfortunately, because rudeness is contagious and spreads unconsciously, it’s hard to stop. So what can be done? Our work points to a need to re-examine the types of behaviours that are tolerated at work. More severe deviant behaviours, such as abuse, aggression and violence, are not tolerated because their consequences are blatant. While the more minor nature of rudeness makes its consequences a little harder to observe, they are no less real and no less harmful, and thus it might be time to question whether we should tolerate these behaviours at work.

You might be thinking that it will be impossible to end workplace rudeness. But work cultures can change. Workers once used to smoke at their desks, and those same workers would have said it was a natural part of office life that couldn’t be removed. Yet workplace smoking is verboten everywhere now. We’ve drawn the line at smoking and discrimination – and rudeness should be the next to go.

Trevor Foulk



Moderation may be the most challenging and rewarding virtue

By Aurelian Craiutu

He is a professor of political science and adjunct professor of American studies at Indiana University, Bloomington. His most recent book is Faces of Moderation: The Art of Balance in an Age of Extremes (2016). He lives in Bloomington.

Three centuries ago, the French political philosopher Montesquieu claimed that human beings accommodate themselves better to the middle than to the extremes. Only a few decades later, George Washington begged to differ. In his Farewell Address (1796), the first president of the United States sounded a warning signal against the pernicious effects of the spirit of party and faction. The latter, he argued, has its roots in the strongest passions of the human mind and can be seen in ‘its greatest rankness’ in popular government where the competition and rivalry between factions are ‘sharpened by the spirit of revenge’ and immoderation.

Looking at our world today, we might be tempted to side with Washington over Montesquieu. Our political scene offers a clear sign of the little faith we seem to have in this virtue without which, as John Adams memorably put it in 1776, ‘every man in power becomes a ravenous beast of prey’. Although our democratic institutions depend on political actors exercising common sense, self-restraint and moderation, we live in a world dominated by hyperbole and ideological intransigence in which moderates have become a sort of endangered species in dire need of protection. Can we do something about that to save them from extinction? To answer this question, we should take a new look at moderation, which Edmund Burke regarded as a difficult virtue, proper only to noble and courageous minds. What does it mean to be a moderate voice in political and public life? What are the principles underlying moderation? What do moderates seek to achieve in society, and how do they differ from more radical or extremist minds?


Why bureaucrats matter in the fight to preserve the rule of law

By Melissa Lane
Ms. Lane is the Class of 1943 professor of politics and director of the University Center for Human Values at Princeton University. She is the author of a number of books, including Eco-Republic (2011/2012) and The Birth of Politics (2015), and has appeared often on the ‘In Our Time’ radio broadcast on BBC Radio 4. Via Creative Commons Attribution-No Derivatives

Socrates, while serving on the Athenian Council, sought to prevent it from making an illegal decision. Martin Luther, when a council convened by the Emperor Charles V in 1521 told him to recant, is said to have declared: ‘Here I stand; I can do no other.’ The United States’ attorney general Elliot Richardson and the deputy attorney general William D Ruckelshaus both chose to resign in 1973 rather than obey President Richard Nixon’s order to fire the special prosecutor investigating Watergate. More recently, the acting attorney general Sally Yates was fired after she announced that the US Department of Justice would not cooperate in enforcing President Donald Trump’s executive order against Muslim immigrants. They all said no. Each of them, for reasons of principle, opposed an order from a higher authority (or sought to prevent its issuance). They are exceptional figures, in extraordinary circumstances. Yet most of the time, the rule of law is more mundane: it depends on officials carrying out their ordinary duties within the purposes of the offices they hold, and on citizens obeying them. That is to say, the rule of law relies upon obedience by bureaucrats, and obedience to bureaucrats – but crucially, within the established norms of the state.

The ancient Greeks made no sharp distinction between political rulers and bureaucratic officials. They considered anyone in a position of constitutional authority to be the holder of an office. The ancient Greek world did not have a modern bureaucracy, but it did confront the question of respect for norms of office and of obedience to office-holders. Plato addresses these questions, in both the Republic and the Laws, in relation to the danger of the usurpation of democracy by a budding tyrant.

Of course, Plato was no democrat. But he did recognise the value of liberty – most explicitly in the Laws, where he posited liberty, wisdom and friendship as the three values that ought to guide the work of government. Plato wanted to balance liberty with what we would call the rule of law. For him, that included not only obedience to the law, but also obedience to the officials who have to carry it out. In the Republic’s portrait of democracy (in some ways a caricature, to be sure), he warns against drinking liberty unmixed with obedience, likening it to wine unmixed with water – a serious social solecism for the ancient Greeks. Doing so, he thinks, can lead to a deterioration of the norms of political office. Too much liberty might lead to the point that a city ‘insults those who obey the rulers as willing slaves and good-for-nothings, and praises and honours, both in public and in private, rulers who behave like subjects and subjects who behave like rulers’ (translation by G.M.A. Grube revised by C.D.C. Reeve, in John M. Cooper (ed.) Plato. Complete Works (Indianapolis: Hackett, 1997)).

To insult ‘those who obey the rulers’ by calling them ‘willing slaves’ is to reject the value of a norm of obedience to state office-holders. No constitution – no organisation of power into authority – can long subsist if the authority of its officials routinely merits defiance. The resister might be heroic and her actions could sometimes be necessary, but she must be an exceptional rather than an everyday case. Officials who defy illegitimate orders must logically always be the exceptions to a general rule of obeying orders, lest the very meaning of their defiance evaporate. Any conception of liberty, or any practice of government, that rejects the need for obedience to the norms of office, will destroy itself. So Plato reaffirms in the Laws that ‘complete freedom (eleutheria) from all rulers (archōn) is infinitely worse than submitting to a moderate degree of control’.

The statebuilding efforts of medieval and early modern Europe were great and complex endeavours, with their own rich histories. In relation to the rule of law and the role of bureaucrats, we can think of their papal chanceries, state treasuries and imperial ministries as a kind of foundation on which modern reformers, rulers and revolutionaries alike would build liberalism and the rule of law. These bureaucracies constituted the tools of power for rulers. In providing impartial officials, rule-of-law procedures and institutional forms of equality, bureaucracy constituted the mechanisms to safeguard people’s rights. Liberal reformers used these very mechanisms to try to extend wider rights and liberties to more and more groups.

Max Weber, the influential early 20th-century German sociologist, feared that bureaucracy would be part of the over-rationalisation that he described as a looming ‘iron cage’. He feared it would grow too powerful, choking off meaning, value and political responsibility in its means-ends instrumental rationality. If Weber had lived a few years longer (he died at only 56, in 1920) and had been asked to speak about the crisis of liberalism in the young Weimar Republic, I think he would have expressed the concern (already present in his last writings) that no sufficiently charismatic and powerful politicians would emerge who would be able to bring the bureaucracy to heel. He saw bureaucracy as a major threat to modern life. The fear that the bureaucracy itself was vulnerable to tyrannical usurpation would not likely have crossed his mind.

Today, the US faces the threat of what we can think of as the political iron cage breaking down – possibly from executive leadership ignorant or contemptuous of the purposes of the organisation. The threat did not originate with the Trump administration, though it has obviously accelerated. As president in the 1980s, Ronald Reagan pioneered the nomination of cabinet secretaries committed to abolishing or drastically curtailing the very agencies they were named to head. President George W Bush appointed administrators who lacked knowledge of their areas of responsibility, such as Michael D Brown as head of the Federal Emergency Management Agency. Brown’s eventual resignation in 2005, in the aftermath of Hurricane Katrina, betokened not heroic defiance but a reaction to the storm of criticism of his lackadaisical response to the crisis. These public officials were not committed to the basic purposes and processes of the bureaucracies they were appointed to lead or serve.

To be sure, we must not be blind to the ways in which the machinery of state will remain a major resource for parties and politicians who seek to control and to advance their own ends. My point is that, while aspects of this machinery might remain intact, challenges to evidence-based reasoning, fair procedure and impartial officialdom – to the whole apparatus of bureaucratic office and the rule of law – threaten to corrode it. Whether in the long run the machinery itself can withstand this corrosion is an open question.

There is an irony here. Weber’s fear was that the iron cage of rationalising modernity, including bureaucracy, would stifle liberty, meaning and ultimate value, squeezing out responsible, charismatic politicians. Yet today, faced with the menace of charismatic, reckless politicians, what Weber feared as an iron cage appears to us to be the building block of some of history’s most hard-won rights. Plato looks more prescient: long ago he warned of both the charismatic but irresponsible politicians, and the insouciant, irresponsible officials who serve them, who risk eroding the norms of office on which the values of the rule of law and liberty rest.

This article was originally published at Aeon and has been republished under Creative Commons.


Massimo Pigliucci: To be happier, focus on what’s within your control

by Massimo Pigliucci
(This article was originally published at Aeon and has been republished under Creative Commons)

God, grant me the serenity to accept the things I cannot change,
Courage to change the things I can,
And wisdom to know the difference.

This is the Serenity Prayer, originally written by the American theologian Reinhold Niebuhr around 1934, and commonly used by Alcoholics Anonymous and similar organisations. It is not just a key step toward recovery from addiction; it is a recipe for a happy life, meaning a life of serenity arrived at by consciously taking what life throws at us with equanimity.

The sentiment behind the prayer is very old, found in 8th-century Buddhist manuscripts, as well as in 11th-century Jewish philosophy. The oldest version I can think of, however, goes back to the Stoic philosopher Epictetus. Active in the 2nd century in Rome and then Nicopolis, in western Greece, Epictetus argued that:

We are responsible for some things, while there are others for which we cannot be held responsible. The former include our judgment, our impulse, our desire, aversion and our mental faculties in general; the latter include the body, material possessions, our reputation, status – in a word, anything not in our power to control. … [I]f you have the right idea about what really belongs to you and what does not, you will never be subject to force or hindrance, you will never blame or criticise anyone, and everything you do will be done willingly. You won’t have a single rival, no one to hurt you, because you will be proof against harm of any kind.

I call this Epictetus’ promise: if you truly understand the difference between what is and what is not under your control, and act accordingly, you will become psychologically invincible, impervious to the ups and downs of fortune.

Of course, this is far easier said than done. It requires a lot of mindful practice. But I can assure you from personal experience that it works. For instance, last year I was in Rome, working, as it happened, on a book on Stoicism. One late afternoon I headed to the subway stop near the Colosseum. As soon as I entered the crowded subway car, I felt an unusually strong resistance to moving forward. A young fellow right in front of me was blocking my way, and I couldn’t understand why. Then the realisation hit, a second too late. While my attention was focused on him, his confederate had slipped his hand in my left front pocket, seized my wallet, and was now stepping outside of the car, immediately followed by his accomplice. The doors closed, the train moved on, and I found myself with no cash, no driver’s licence, and a couple of credit cards to cancel and replace.

Before I started practising Stoicism, this would have been a pretty bad experience, and I would not have reacted well. I would have been upset, irritated and angry. This foul mood would have spilled over the rest of the evening. Moreover, the shock of the episode, as relatively mild as the attack had been, would have probably lasted for days, with a destructive alternation of anger and regret.

But I had been practising Stoicism for a couple of years. So my first thought was of Epictetus’ promise. I couldn’t control the thieves in Rome, and I couldn’t go back and change what had happened. I could, however, accept what had happened and file it away for future reference, focusing instead on having a nice time during the rest of my stay. After all, nothing tragic had happened. I thought about this. And it worked. I joined my evening company, related what had happened, and proceeded to enjoy the movie, the dinner, and the conversation. My brother was amazed that I took it all with such equanimity. But that’s precisely the power of internalising the Stoic dichotomy of control.

And its efficacy is not limited to minor life inconveniences, as in the episode just described. James Stockdale, a fighter-jet pilot during the Vietnam War, was shot down and spent seven and a half years in Hoa Lo prison, where he was tortured and often put in isolation. He credited Epictetus’ teachings with helping him survive the ordeal: by immediately applying the dichotomy of control to his extreme situation as a captive, he not only saved his own life but was able, as senior ranking officer, to coordinate the resistance from inside the prison.

Most of us don’t find ourselves in Stockdale’s predicament, but once you begin paying attention, the dichotomy of control has countless applications to everyday life, and all of them have to do with one crucial move: shifting your goals from external outcomes to internal achievements.

For example, let’s say that you are preparing your résumé for a possible job promotion. If your goal is to get the promotion, you are setting yourself up for a possible disappointment. There is no guarantee that you will get it, because the outcome is not (entirely) under your control. Sure, you can influence it, but it also depends on a number of variables that are independent of your efforts, including possible competition from other employees, or perhaps the fact that your boss, for whatever unfathomable reason, really doesn’t like you.

That’s why your goal should be internal: if you adopt the Stoic way, you would conscientiously put together the best résumé that you can, and then mentally prepare to accept whatever outcome with equanimity, knowing that sometimes the universe will favour you, and other times it will not. What do you gain by being anxious over something you don’t control? Or angry at a result that was not your doing? You are simply adding a self-inflicted injury to the situation, compromising your happiness and serenity.

This is no counsel for passive acceptance of whatever happens. After all, I just said that your goal should be to put together the best résumé possible! But it is the mark of a wise person to realise that things don’t always go the way we wish. If they don’t, the best counsel is to pick up the pieces, and move on.

Do you want to win that tennis match? It is outside of your control. But to play the best game you can is under your control. Do you want your partner to love you? It is outside of your control. But there are plenty of ways you can choose to show your love to your partner – and that is under your control. Do you want a particular political party to win the election? It is outside of your control (unless you’re Vladimir Putin!). But you can choose to engage in political activism, and you can vote. These aspects of your life are under your control. If you succeed in shifting your goals internally, you will never blame or criticise anyone, and you won’t have a single rival, because what other people do is largely beyond your control and therefore not something to get worked up about. The result will be an attitude of equanimity toward life’s ups and downs, leading to a more serene life.

Massimo Pigliucci is professor of philosophy at City College and at the Graduate Center of the City University of New York. His latest book is How to Be a Stoic: Ancient Wisdom for Modern Living (May, 2017). He lives in New York.



What know-it-alls don’t know, or the illusion of competence

by Kate Fehlhaber (This article was originally published at Aeon and has been republished under Creative Commons).


One day in 1995, a large, heavy middle-aged man robbed two Pittsburgh banks in broad daylight. He didn’t wear a mask or any sort of disguise. And he smiled at surveillance cameras before walking out of each bank. Later that night, police arrested a surprised McArthur Wheeler. When they showed him the surveillance tapes, Wheeler stared in disbelief. ‘But I wore the juice,’ he mumbled. Apparently, Wheeler thought that rubbing lemon juice on his skin would render him invisible to videotape cameras. After all, lemon juice is used as invisible ink, so as long as he didn’t come near a heat source, he should have been completely invisible.

Police concluded that Wheeler was not crazy or on drugs – just incredibly mistaken.

The saga caught the eye of the psychologist David Dunning at Cornell University, who enlisted his graduate student, Justin Kruger, to see what was going on. They reasoned that, while almost everyone holds favourable views of their abilities in various social and intellectual domains, some people mistakenly assess their abilities as being much higher than they actually are. This ‘illusion of confidence’ is now called the ‘Dunning-Kruger effect’, and describes the cognitive bias of inflated self-assessment.

To investigate this phenomenon in the lab, Dunning and Kruger designed some clever experiments. In one study, they asked undergraduate students a series of questions about grammar, logic and jokes, and then asked each student to estimate his or her score overall, as well as their relative rank compared to the other students. Interestingly, students who scored the lowest in these cognitive tasks always overestimated how well they did – by a lot. Students who scored in the bottom quartile estimated that they had performed better than two-thirds of the other students!

This ‘illusion of confidence’ extends beyond the classroom and permeates everyday life. In a follow-up study, Dunning and Kruger left the lab and went to a gun range, where they quizzed gun hobbyists about gun safety. Similar to their previous findings, those who answered the fewest questions correctly wildly overestimated their knowledge about firearms. Outside of factual knowledge, though, the Dunning-Kruger effect can also be observed in people’s self-assessment of a myriad of other personal abilities. If you watch any talent show on television today, you will see the shock on the faces of contestants who don’t make it past auditions and are rejected by the judges. While it is almost comical to us, these people are genuinely unaware of how much they have been misled by their illusory superiority.

Sure, it’s typical for people to overestimate their abilities. One study found that 80 per cent of drivers rate themselves as above average – a statistical impossibility. And similar trends have been found when people rate their relative popularity and cognitive abilities. The problem is that when people are incompetent, not only do they reach wrong conclusions and make unfortunate choices but, also, they are robbed of the ability to realise their mistakes. In a semester-long study of college students, good students could better predict their performance on future exams given feedback about their scores and relative percentile. However, the poorest performers showed no recognition, despite clear and repeated feedback that they were doing badly. Instead of being confused, perplexed or thoughtful about their erroneous ways, incompetent people insist that their ways are correct. As Charles Darwin wrote in The Descent of Man (1871): ‘Ignorance more frequently begets confidence than does knowledge.’

Interestingly, really smart people also fail to accurately self-assess their abilities. As much as D- and F-grade students overestimate their abilities, A-grade students underestimate theirs. In their classic study, Dunning and Kruger found that high-performing students, whose cognitive scores were in the top quartile, underestimated their relative competence. These students presumed that if these cognitive tasks were easy for them, then they must be just as easy or even easier for everyone else. This so-called ‘imposter syndrome’ can be likened to the inverse of the Dunning-Kruger effect, whereby high achievers fail to recognise their talents and think that others are equally competent. The difference is that competent people can and do adjust their self-assessment given appropriate feedback, while incompetent individuals cannot.

And therein lies the key to not ending up like the witless bank robber. Sometimes we try things that lead to favourable outcomes, but other times – like the lemon juice idea – our approaches are imperfect, irrational, inept or just plain stupid. The trick is to not be fooled by illusions of superiority and to learn to accurately reevaluate our competence. After all, as Confucius reportedly said, real knowledge is knowing the extent of one’s ignorance.

Kate Fehlhaber is the editor in chief of Knowing Neurons and a PhD candidate in neuroscience at the University of California, Los Angeles. She lives in Los Angeles.



Wars are not won by military genius or decisive battles

by Cathal J Nolan
(This article was originally published at Aeon and has been republished under Creative Commons).

War is the most complex, physically and morally demanding enterprise we undertake. No great art or music, no cathedral or temple or mosque, no intercontinental transport net or particle collider or space programme, no research for a cure for a mass-killing disease receives a fraction of the resources and effort we devote to making war – or to recovering from war and preparing for future wars over years, even decades, of tentative peace. War is thus far more than a strung-together tale of key battles. Yet traditional military history presented battles as fulcrum moments where empires rose or fell in a day, and most people still think that wars are won that way, in an hour or an afternoon of blood and bone. Or perhaps two or three. We must understand the deeper game, not look only to the scoring. That is hard to do because battles are so seductive.

War evokes our fascination with spectacle, and there is no greater stage or more dramatic players than on a battlefield. We are drawn to battles by a lust of the eye, thrilled by a blast from a brass horn as Roman legionaries advance in glinting armour or when a king’s wave releases mounted knights in a heavy cavalry charge. Grand battles are open theatre with a cast of many tens of thousands: samurai under signal kites, mahouts mounted on elephants, a Zulu impi rushing over lush grass toward a redcoat firing line. Battles open with armies dressed in red, blue or white, flags fluttering, fife and drums beating the advance. Or with the billowing canvas of a line of fighting sail, white pufferies erupting in broadside volleys. Or a wedge of tanks hard-charging over the Russian steppe. What comes next is harder to comprehend.

Regimental Combat Team 6, 1st Battalion, 6th Marine Regiment. DOD photo by Cpl James Clark, 20 January 2012.

The idea of the ‘decisive battle’ as the hinge of war, and wars as the gates of history, speaks to our naive desire to view modern war in heroic terms. Popular histories are written still in a drums-and-trumpets style, with vivid depictions of combat divorced from harder logistics, daily suffering, and a critical look at the societies and cultures that produced mass armies and sent them off to fight in faraway fields for causes about which the average soldier knew nothing.

Visual media especially play on what the public wants to see: raw courage and red days, the thrill of vicarious violence and spectacle. This is the world of war as callow entertainment, of Quentin Tarantino’s Inglourious Basterds (2009) or Brad Pitt in Fury (2014). It’s not the world of real Nazis or real war.

Battles also entice generals and statesmen with the idea that a hard red day can be decisive, and allow us to avoid attrition, which we all despise as morally vulgar and without redemptive heroism. We fear to find only indecision and tragedy without uplift or morality in trench mud, or in roll calls of dead accumulating over years of effort and endurance. Instead, we raise battles to summits of heroism and generals to levels of genius that history cannot support – though some historians try, celebrating even failed campaigns as glorious. Prussia is wrecked, yet Frederick is the greatest of Germans. France is beaten and an age is named for Louis XIV, another for Napoleon. Europe lies in ruin, but German generals displayed genius with Panzers.

Whether or not we agree that some wars were necessary and just, we should look straight at the grim reality that victory was most often achieved in the biggest and most important wars by attrition and mass slaughter – not by soldierly heroics or the genius of command. Winning at war is harder than that. Cannae, Tours, Leuthen, Austerlitz, Tannenberg, Kharkov – all recall sharp images in a word. Yet winning such lopsided battles did not ensure victory in war. Hannibal won at Cannae, Napoleon at Austerlitz, Hitler at Sedan and Kiev. All lost in the end, catastrophically.

There is heroism in battle but there are no geniuses in war. War is too complex for genius to control. To say otherwise is no more than armchair idolatry, divorced from real explanation of victory and defeat, both of which come from long-term preparation for war and waging war with deep national resources, bureaucracy and endurance. Only then can courage and sound generalship meet with chance in battle and prevail, joining weight of materiel to strength of will to endure terrible losses yet win long wars. Claims to genius distance our understanding from war’s immense complexity and contingency, which are its greater truths.

Modern wars are won by grinding, not by genius. Strategic depth and resolve are always more important than any commander. We saw such depth and resilience in Tsarist Russia in 1812, in France and Britain in the First World War, in the Soviet Union and the United States during the Second World War, but not in Carthage or overstretched Nazi Germany or overreaching Imperial Japan. The ability to absorb initial defeats and fight on surpassed any decision made or battle fought by Hannibal or Scipio, Lee or Grant, Manstein or Montgomery. Yes, even Napoleon was elevated as the model of battle genius by Clausewitz and in military theory ever since, despite his losing by attrition in Spain, and in the calamity of the Grande Armée’s 1812 campaign in Russia. Waterloo was not the moment of his decisive defeat, which came a year earlier. It was his anticlimax.

Losers of most major wars in modern history lost because they overestimated operational dexterity and failed to overcome the enemy’s strategic depth and capacity for endurance. Winners absorbed defeat after defeat yet kept fighting, overcoming initial surprise, terrible setbacks and the dash and daring of command ‘genius’. Celebration of genius generals encourages the delusion that modern wars will be short and won quickly, when they are most often long wars of attrition. Most people believe attrition is immoral. Yet it’s how most major wars are won, aggressors defeated, the world remade time and again. We might better accept attrition at the start, explain that to those we send to fight, and only choose to fight the wars worth that awful price. Instead, we grow restless with attrition and complain that it’s tragic and wasteful, even though it was how the Union Army defeated slavery in America, and Allied and Soviet armies defeated Nazism.

With humility and full moral awareness of its terrible costs, if we decide that a war is worth fighting, we should praise attrition more and battle less. There is as much room for courage and character in a war of attrition as in a battle. There was character aplenty and courage on all sides at Verdun and Iwo Jima, in the Hürtgen Forest, in Korea. Character counts in combat. Sacrifice by soldiers at Shiloh or the Marne or Kharkov or Juno Beach or the Ia Drang or Korengal Valley were not mean, small or morally useless acts. Victory or defeat by attrition, by high explosive and machine gun over time, does not annihilate all moral and human meaning.

The Allure of Battle: A History of How Wars Have Been Won and Lost by Cathal Nolan is out now through Oxford University Press.

Cathal J Nolan teaches military history at Boston University. He is the author of The Allure of Battle: A History of How Wars Have Been Won and Lost (2017).




What every dictator knows: young men are natural fanatics


by Joe Herbert, emeritus professor of neuroscience at the Cambridge Centre for Brain Repair at the University of Cambridge. His latest book is Testosterone: Sex, Power, and the Will to Win (2015). This article was originally published at Aeon and has been republished under Creative Commons.

Young men are particularly liable to become fanatics. Every dictator, every guru, every religious leader, knows this. Fanatics have an overwhelming sense of identity based on a cause (a religion) or a community (gang, team), and a tight and exclusive bond with other members of that group. They will risk injury, loss or even death for the sake of their group. They regard everyone else as outsiders, or even enemies. But why are so many of them young males?

In a world of nation-states, young men fought the wars that formed most countries. The same goes for tribes, villages and factions. Young males have qualities that specialise them for this essential function. They readily identify with their group. They form close bonds with its other members. They are prone to follow a strong leader. This is why young males are so vulnerable to environmental influences, such as the prevailing culture in which they happen to live, and why they are so easily attracted by charismatic leaders or lifestyles that promise membership of restricted groups with sharply defined objectives and values. They like taking risks on behalf of their group – and they usually underestimate the danger that such risks represent. If they didn’t have these properties, they would be less willing to go to war, and therefore less able to fulfil one of their essential sociobiological roles.

Why are young men like this? Part of it seems to depend on testosterone, acting on their brain during early foetal life. Exposure in the womb ‘masculinises’ the brain – giving it certain properties, including sexual identity as a male, as well as a preference for play patterns that involve physical contact and even play fights. We know this because girls exposed to abnormal levels of testosterone during this time show similar behaviour, whereas girls who are not show much less of it. At puberty, there is another surge of testosterone acting on this already-prepared brain: this not only awakens sexuality, but encourages various strategies for competing for a mate – including the use of aggression and risk-taking behaviour. But testosterone is far from the only factor in making a fanatic.

Testosterone acts on an ancient part of the brain, the limbic system. The human limbic system looks very like that in other primates, such as chimpanzees, and is even easily recognisable in rats. But this part of the human brain is regulated by a more recent addition: the frontal lobes, which lie behind your forehead. Folk usage recognises their importance: in a hangover from the age of physiognomy, we call bright people ‘highbrow’, reflecting their tall foreheads (and thus their assumed larger frontal lobes). Among their other functions, the frontal lobes are important for personality, social interactions – and restraint. Damage to them results in impaired and inappropriate social behaviour, as well as lack of judgment.

Crucially, males’ frontal lobes don’t fully mature until their late 20s, whereas those of women mature earlier. This part of the brain is highly reactive to social cues and the behaviour of other people. The stereotyped young man – loud, risky, unreasonable, aggressive (but also non-conformist and thus innovative) – might be one result. So while it’s an evolutionary advantage to the group as a whole, a combination of rampant testosterone and an immature frontal lobe also explains why young men like taking risks and why they are liable to fanaticism.

Of course, not all young men, even the fanatics, become terrorists. Young men are not all the same. Different outcomes might be due to different social factors. Many terrorists come from criminal or deprived backgrounds. We know that a neglected or abusive childhood can result in antisocial or deviant behaviour later in life. An individual’s social environment, particularly early in life, can have long-lasting behavioural implications. We are beginning to learn something about how these conditions can result in persistent or even permanent changes to the brain, but so far we cannot do much about undoing them. We call people who have disregard for normal human relationships ‘psychopaths’, implying that they have abnormal (pathological) events in their ‘psyche’ (mind). We also know that there are people who develop genetically abnormal social traits (autism is one example) irrespective of upbringing. We do not know the precise defects in the brain that are responsible. Nevertheless, their nature – abnormal social behaviour and inter-personal relationships – points towards the frontal lobes, though other areas of the brain can also be involved.

Social status is prized by the males of many animal species, including humans. Several non-human primates maintain clear-cut dominance rankings. Higher status gives increased access to food, shelter and mates. It’s mostly based on physical prowess, and males fight or threaten each other to determine their relative position.

This occurs in humans too, of course. Yet the human brain has developed other ranking systems, including those based on money, birth or technical ability. The development of projectile weapons reduced our dependence on muscular strength, but put a premium on other traits, such as ruthlessness, bravery and leadership. Within fanatical groups, there is intense competition to display the qualities that raise a member’s standing with the others. This might be particularly attractive to those who, in the rest of their lives, have little cause to think they rank highly.

Terrorist or aggressive acts, therefore, can be carried out to prove a member’s worth, and to attract a kind of attention that otherwise seems unattainable. It is a modern way of satisfying an ancient biological need: the respect that individual males crave. In summary, the propensity of the masculine brain is to form bonds with other males (street gangs, for example), to recognise and identify with groups, to defend those groups against others, and to compete with them for assets. A young male’s hormonal constitution and the way his brain matures together increase his susceptibility to fanaticism – an extreme instance of bonding – and make him prone to taking risk-laden actions on behalf of his group.

The human brain has invented further categories of identity, seemingly unknown in other species, including those based on shared beliefs or ethical points of view; today, identity is increasingly founded on belief. The huge human brain has also enabled the invention of weapons, which have given fanatics increasingly effective means of achieving the primitive aim of dominance by terrorising others. The path to fanaticism is shaped by a male’s genes, his early experiences, his hormones, the maturity or otherwise of his brain, and the social context in which he finds himself. All of these can result in a brain state we label fanaticism: a dangerous mutation of a role that is biologically essential for young men. Our task is to recognise what that brain state might be, how it arises and, if possible, how to counter it.

Joe Herbert

This article was originally published at Aeon and has been republished under Creative Commons.