Bragging rights: when beating your own drum helps (or hurts)

By Patrick Heck | He is a PhD candidate in social psychology at Brown University in Rhode Island, where he studies the Self, social judgment and decision making, and prosocial behavior.

Social observers are particularly attuned to braggadocio. What do you think of a person who claims to be a better driver, performer or lover than average? Is this person better described as confident or cocky; self-important or honest? Would you put your health or safety in their hands? And what about the opposite type of person, who claims to be worse than others? Would you hire this person for a job? In what field?

Social scientists have been asking for decades whether boastful, self-aggrandising beliefs and behaviours are beneficial to those who make such claims. According to one school of thought, claiming to be better than others feels good, and when we feel good, we are happier and better adjusted. This argument suggests that bragging to others can satisfy the motive to craft and maintain a positive self-image. According to another line of research, however, consistently viewing oneself as superior entails a distortion of reality. Inaccurate individuals with low self-knowledge have weaker relationships and a tendency to make riskier decisions than their accurate, self-aware counterparts.

Together with Joachim Krueger at Brown University in Rhode Island, I recently proposed a middle ground: braggadocio could be a double-edged sword. In our paper in the journal Social Psychology, we argue that thinking you are better than average (and bragging to others about it) can damage some aspects of your reputation but boost others. Bragging can help or harm depending upon your goals – so you’d do well to know what you want to accomplish before tooting your own horn.

To test how observers respond to braggadocio and humility, we recruited nearly 400 volunteers and asked them to rate a series of target individuals along the two major dimensions of social perception: competence, including rationality, intelligence and naiveté, and morality, including ethics, trustworthiness and selfishness. Some of the targets were described as performing better or worse than average without making any claims. Some claimed to be better or worse than average without any evidence. Others both made a claim about themselves (‘I did better/worse than average’) and had their actual scores revealed by the researchers.

The results demonstrated several detrimental effects of boasting, although we observed some surprising benefits too. Perhaps the most interesting finding was what we call the ‘humility paradox’. In the absence of evidence (ie, a test score), bragging about being better than average boosted a target’s reputation as competent, but diminished their reputation as moral. Conversely, those who remained humble by claiming to be worse than average were rated as more moral and less competent than the braggarts. The paradox suggests that when deciding whether or not to boast about your performance, keen decision-makers might first stop to consider which aspect of reputation they are most interested in emphasising or protecting.

The results were especially nuanced when test subjects rated targets whose claims were either validated or violated by objective evidence (their actual test performance). For moral reputations, humility remained a beneficial strategy even when a target performed well. Across the board, participants rated targets who claimed to be worse than average as more moral than targets who claimed to be better than average, regardless of their actual performance. In the domain of morality, humility pays.

For perceived competence, evidence mattered. The absolute worst thing a target could do was to claim superiority (‘I am better than average’) when the evidence proved him wrong (‘Harry actually scored below average on the test’).

There was, to be sure, some strategic benefit to making a boastful claim: targets who claimed to be better than average were seen as quite competent either when:

(a) evidence supported this claim; or

(b) no evidence was available.

In other words, boasting appeared to benefit a target’s reputation as competent, so long as contradictory evidence was never revealed.

As is the case with most experiments in social psychology, these studies were conducted in a contrived laboratory setting, and carry several limitations. All our participants lived in the United States, although we know that cultural background can encourage or discourage boasting. Similarly, all the targets that our participants rated had male names in order to rule out any confounding effects of gender, even though we know that the gender of observers and targets plays an important role in social perception. Culture and gender are two variables we would like to incorporate in future studies on the nature and perception of bragging.

Despite these limitations, the results of our studies suggest a few strategies for daily life: in situations where your competence is of critical interest (such as a job interview or debate), claiming to be better than the other candidates could be beneficial, so long as contradictory evidence will never come to light. But in situations where your reputation as a warm or moral person is put to the test (say, while networking or on a date), it appears that humility is the best strategy, even if you truly have something to brag about.

Patrick Heck

This article was originally published at Aeon and has been republished under Creative Commons.

#



Why rudeness at work is contagious and difficult to stop

By Trevor Foulk | He is a PhD candidate in business administration at the University of Florida. He is interested in negative work behaviours, team dynamics, decision-making, and depletion/recovery.

 

Most people can relate to the experience of having a colleague inexplicably treat them rudely at work. You’re not invited to attend a meeting. A co-worker gets coffee – for everyone but you. Your input is laughed at or ignored. You wonder: where did this come from? Did I do something? Why would he treat me that way? It can be very distressing because it comes out of nowhere and often we just don’t understand why it happened.

A large and growing body of research suggests that such incidents, termed workplace incivility or workplace rudeness, are not only very common, but also very harmful. Workplace rudeness is not limited to one industry, but has been observed in a wide variety of settings in a variety of countries with different cultures. Defined as low-intensity deviant behaviour with ambiguous intent to harm, these behaviours – small insults, ignoring someone, taking credit for someone’s work, or excluding someone from office camaraderie – seem to be everywhere in the workplace. The problem is that, despite their ‘low-intensity’ nature, the negative outcomes associated with workplace rudeness are anything but small or trivial.

It would be easy to believe that rudeness is ‘no big deal’ and that people must just ‘get over it’, but more and more researchers are finding that this is simply not true. Experiencing rudeness at work has been associated with decreased performance, decreased creativity, and increased turnover intentions, to name just a few of the many negative outcomes of these behaviours. In certain settings, these negative outcomes can be catastrophic – for example, a recent article showed that when medical teams experienced even minor insults before performing a procedure on a baby, the rudeness decimated their performance and led to patient mortality (in a simulation). Knowing how harmful these behaviours can be, the question becomes: where do they come from, and why do people do them?

While there are likely many reasons people behave rudely, at least one explanation that my colleagues and I have recently explored is that rudeness seems to be ‘contagious’. That is, experiencing rudeness actually causes people to behave more rudely themselves. Lots of things can be contagious – from the common cold, to smiling, yawning and other simple motor actions, to emotions (being around a happy person typically makes you feel happy). And as it turns out, being around a rude person can actually make you rude. But how?

There are two ways in which behaviours and emotions can be contagious. One is through a conscious process of social learning. For example, if you’ve recently taken a job at a new office and you notice that everybody carries a water bottle around, it likely won’t be long until you find yourself carrying one, too. This type of contagion is typically conscious. If somebody said: ‘Why are you carrying that water bottle around?’, you would say: ‘Because I saw everybody else doing it and it seemed like a good idea.’

Another pathway to contagion is unconscious: research shows that when you see another person smiling, or tapping a pencil, for example, most people will mimic those simple motor behaviours and smile or tap a pencil themselves. If someone were to ask why you’re smiling or tapping your pencil, you’d likely answer: ‘I have no idea.’

In a series of studies, my colleagues and I found evidence that rudeness can become contagious through a non-conscious, automatic pathway. When you experience rudeness, the part of your brain responsible for processing rudeness ‘wakes up’ a little bit, and you become a little more sensitive to rudeness. This means that you’re likely to notice more rude cues in your environment, and also to interpret ambiguous interactions as rude. For example, if someone said: ‘Hey, nice shoes!’ you might normally interpret that as a compliment. If you’ve recently experienced rudeness, you’re more likely to think that person is insulting you. That is, you ‘see’ more rudeness around you, or at least you think you do. And because you think others are being rude, you become more likely to behave rudely yourself.

You might be wondering, how long does this last? Without more research it’s impossible to say for sure, but in one of our studies we saw that experiencing rudeness caused rude behaviour up to seven days later. In this study, which took place in a negotiations course at a university, participants engaged in negotiations with different partners. We found that when participants negotiated with a rude partner, their partner in the next negotiation judged them to have behaved rudely. Some of the negotiations took place with no time lag, some with a three-day time lag, and some with a seven-day time lag. To our surprise, we found that the time lag seemed to be unimportant: at least within a seven-day window, the effect did not appear to wear off.

Unfortunately, because rudeness is contagious and unconscious, it’s hard to stop. So what can be done? Our work points to a need to re-examine the types of behaviours that are tolerated at work. More severe deviant behaviours, such as abuse, aggression and violence, are not tolerated because their consequences are blatant. Rudeness is more minor in nature and its consequences are a little harder to observe, but it is no less real and no less harmful – and thus it might be time to question whether we should tolerate these behaviours at work.

You might be thinking that it will be impossible to end workplace rudeness. But work cultures can change. Workers once used to smoke at their desks, and those same workers would have said it was a natural part of office life that couldn’t be removed. Yet workplace smoking is verboten everywhere now. We’ve drawn the line at smoking and discrimination – and rudeness should be the next to go.

Trevor Foulk

This article was originally published at Aeon and has been republished under Creative Commons.

#


Moderation may be the most challenging and rewarding virtue

By Aurelian Craiutu

He is a professor of political science and adjunct professor of American studies at Indiana University, Bloomington. His most recent book is Faces of Moderation: The Art of Balance in an Age of Extremes (2016). He lives in Bloomington.

Three centuries ago, the French political philosopher Montesquieu claimed that human beings accommodate themselves better to the middle than to the extremes. Only a few decades later, George Washington begged to differ. In his Farewell Address (1796), the first president of the United States sounded a warning signal against the pernicious effects of the spirit of party and faction. The latter, he argued, has its roots in the strongest passions of the human mind and can be seen in ‘its greatest rankness’ in popular government where the competition and rivalry between factions are ‘sharpened by the spirit of revenge’ and immoderation.

If we look at our world today, we might be tempted to side with Washington over Montesquieu. Our political scene offers a clear sign of the little faith we seem to have in this virtue without which, as John Adams memorably put it in 1776, ‘every man in power becomes a ravenous beast of prey’. Although our democratic institutions depend on political actors exercising common sense, self-restraint and moderation, we live in a world dominated by hyperbole and ideological intransigence in which moderates have become a sort of endangered species in dire need of protection. Can we do something about that to save them from extinction? To answer this question, we should take a new look at moderation, which Edmund Burke regarded as a difficult virtue, proper only to noble and courageous minds. What does it mean to be a moderate voice in political and public life? What are the principles underlying moderation? What do moderates seek to achieve in society, and how do they differ from more radical or extremist minds?

Continue reading

Why bureaucrats matter in the fight to preserve the rule of law

By Melissa Lane
Ms. Lane is the Class of 1943 professor of politics and director of the University Center for Human Values at Princeton University. She is the author of a number of books, including Eco-Republic (2011/2012) and The Birth of Politics (2015), and has appeared often on the ‘In Our Time’ radio broadcast on BBC Radio 4.

Socrates, while serving on the Athenian Council, sought to prevent it from making an illegal decision. Martin Luther, when a council convened by the Emperor Charles V in 1521 told him to recant, is said to have declared: ‘Here I stand; I can do no other.’ The United States’ attorney general Elliot Richardson and the deputy attorney general William D Ruckelshaus both chose to resign in 1973 rather than obey President Richard Nixon’s order to fire the special prosecutor investigating Watergate. More recently, the acting attorney general Sally Yates was fired after she announced that the US Department of Justice would not cooperate in enforcing President Donald Trump’s executive order against Muslim immigrants. They all said no. Each of them, for reasons of principle, opposed an order from a higher authority (or sought to prevent its issuance). They are exceptional figures, in extraordinary circumstances. Yet most of the time, the rule of law is more mundane: it depends on officials carrying out their ordinary duties within the purposes of the offices they hold, and on citizens obeying them. That is to say, the rule of law relies upon obedience by bureaucrats, and obedience of bureaucrats – but crucially, within the established norms of the state.

The ancient Greeks made no sharp distinction between political rulers and bureaucratic officials. They considered anyone in a position of constitutional authority as the holder of an office. The ancient Greek world did not have a modern bureaucracy, but the Greeks did confront the question of respect for norms of office and of obedience to office-holders. Plato addresses these questions, in both the Republic and the Laws, in relation to the danger of usurpation of democracy by a budding tyrant.

Of course, Plato was no democrat. But he did recognise the value of liberty – most explicitly in the Laws, where he posited liberty, wisdom and friendship as the three values that ought to guide the work of government. Plato wanted to balance liberty with what we would call the rule of law. For him, that included not only obedience to the law, but also obedience to the officials who have to carry it out. In the Republic’s portrait of democracy (in some ways a caricature, to be sure), he warns against drinking liberty unmixed with obedience, likening it to wine unmixed with water – a serious social solecism for the ancient Greeks. Doing so, he thinks, can lead to a deterioration of the norms of political office. Too much liberty might lead to the point that a city ‘insults those who obey the rulers as willing slaves and good-for-nothings, and praises and honours, both in public and in private, rulers who behave like subjects and subjects who behave like rulers’ (translation by G.M.A. Grube revised by C.D.C. Reeve, in John M. Cooper (ed.) Plato. Complete Works (Indianapolis: Hackett, 1997)).

To insult ‘those who obey the rulers’ by calling them ‘willing slaves’ is to reject the value of a norm of obedience to state office-holders. No constitution – no organisation of power into authority – can long subsist if the authority of its officials routinely merits defiance. The resister might be heroic and her actions could sometimes be necessary, but she must be an exceptional rather than an everyday case. Officials who defy illegitimate orders must logically always be the exceptions to a general rule of obeying orders, lest the very meaning of their defiance evaporate. Any conception of liberty, or any practice of government, that rejects the need for obedience to the norms of office, will destroy itself. So Plato reaffirms in the Laws that ‘complete freedom (eleutheria) from all rulers (archōn) is infinitely worse than submitting to a moderate degree of control’.

The statebuilding efforts of medieval and early modern Europe were great and complex endeavours, with their own rich histories. In relation to the rule of law and the role of bureaucrats, we can think of their papal chanceries, state treasuries and imperial ministries as a kind of foundation on which modern reformers and rulers and revolutionaries alike would build liberalism and the rule of law. These bureaucracies constituted the tools of power for rulers. In providing impartial officials, rule-of-law procedures and institutional forms of equality, bureaucracy also supplied the mechanisms that safeguarded people’s rights. Liberal reformers used these very mechanisms to try to extend wider rights and liberties to more and more groups.

Max Weber, the influential early 20th-century German sociologist, feared that bureaucracy would be part of the over-rationalisation that he described as a looming ‘iron cage’. He feared it would grow too powerful, choking off meaning, value and political responsibility in its means-ends instrumental rationality. If Weber had lived a few years longer (he died at only 56, in 1920) and had been asked to speak about the crisis of liberalism in the young Weimar Republic, I think he would have expressed the concern (already present in his last writings) that no sufficiently charismatic and powerful politicians would emerge who would be able to bring the bureaucracy to heel. He saw bureaucracy as a major threat to modern life. The fear that the bureaucracy itself was vulnerable to tyrannical usurpation would not likely have crossed his mind.

Today, the US faces the threat of what we can think of as the political iron cage breaking down – possibly from executive leadership ignorant or contemptuous of the purposes of the organisation. The threat is not entirely new with the Trump administration, though it has obviously accelerated. As president in the 1980s, Ronald Reagan pioneered the nomination of cabinet secretaries committed to abolishing or drastically curtailing the very agencies they were named to head. President George W Bush appointed agency administrators who lacked knowledge of their areas of responsibility, such as Michael D Brown as head of the Federal Emergency Management Agency. Brown’s eventual resignation in 2005 in the aftermath of Hurricane Katrina betokened not heroic defiance but a reaction to the storm of criticism for his lackadaisical response to the crisis. These public officials were not committed to the basic purposes and processes of the bureaucracies they were appointed to lead or serve.

To be sure, we must not be blind to the ways in which the machinery of state will remain a major resource for parties and politicians who seek to control and to advance their own ends. My point is that, while aspects of this machinery might remain intact, challenges to evidence-based reasoning, fair procedure and impartial officialdom – to the whole apparatus of bureaucratic office and the rule of law – threaten to corrode it. Whether in the long run the machinery itself can withstand this corrosion is an open question.

There is an irony here. Weber’s fear was that the iron cage of rationalising modernity, including bureaucracy, would stifle liberty, meaning and ultimate value, squeezing out responsible, charismatic politicians. Yet today, faced with the menace of charismatic, reckless politicians, what Weber feared as an iron cage appears to us to be the building block of some of history’s most hard-won rights. Plato looks more prescient: long ago he warned of both the charismatic but irresponsible politicians, and the insouciant, irresponsible officials who serve them, who risk eroding the norms of office on which the values of the rule of law and liberty rest.

Melissa Lane

This article was originally published at Aeon and has been republished under Creative Commons.

#


Massimo Pigliucci: To be happier, focus on what’s within your control

by Massimo Pigliucci
(This article was originally published at Aeon and has been republished under Creative Commons)

God, grant me the serenity to accept the things I cannot change,
Courage to change the things I can,
And wisdom to know the difference.

This is the Serenity Prayer, originally written by the American theologian Reinhold Niebuhr around 1934, and commonly used by Alcoholics Anonymous and similar organisations. It is not just a key step toward recovery from addiction; it is a recipe for a happy life, meaning a life of serenity arrived at by consciously taking what life throws at us with equanimity.

The sentiment behind the prayer is very old, found in 8th-century Buddhist manuscripts, as well as in 11th-century Jewish philosophy. The oldest version I can think of, however, goes back to the Stoic philosopher Epictetus. Active in the 2nd century in Rome and then Nicopolis, in western Greece, Epictetus argued that:

We are responsible for some things, while there are others for which we cannot be held responsible. The former include our judgment, our impulse, our desire, aversion and our mental faculties in general; the latter include the body, material possessions, our reputation, status – in a word, anything not in our power to control. … [I]f you have the right idea about what really belongs to you and what does not, you will never be subject to force or hindrance, you will never blame or criticise anyone, and everything you do will be done willingly. You won’t have a single rival, no one to hurt you, because you will be proof against harm of any kind.

I call this Epictetus’ promise: if you truly understand the difference between what is and what is not under your control, and act accordingly, you will become psychologically invincible, impervious to the ups and downs of fortune.

Of course, this is far easier said than done. It requires a lot of mindful practice. But I can assure you from personal experience that it works. For instance, last year I was in Rome, working, as it happened, on a book on Stoicism. One late afternoon I headed to the subway stop near the Colosseum. As soon as I entered the crowded subway car, I felt an unusually strong resistance to moving forward. A young fellow right in front of me was blocking my way, and I couldn’t understand why. Then the realisation hit, a second too late. While my attention was focused on him, his confederate had slipped his hand in my left front pocket, seized my wallet, and was now stepping outside of the car, immediately followed by his accomplice. The doors closed, the train moved on, and I found myself with no cash, no driver’s licence, and a couple of credit cards to cancel and replace.

Before I started practising Stoicism, this would have been a pretty bad experience, and I would not have reacted well. I would have been upset, irritated and angry. This foul mood would have spilled over the rest of the evening. Moreover, the shock of the episode, as relatively mild as the attack had been, would have probably lasted for days, with a destructive alternation of anger and regret.

But I had been practising Stoicism for a couple of years. So my first thought was of Epictetus’ promise. I couldn’t control the thieves in Rome, and I couldn’t go back and change what had happened. I could, however, accept what had happened and file it away for future reference, focusing instead on having a nice time during the rest of my stay. After all, nothing tragic had happened. I thought about this. And it worked. I joined my evening company, related what happened, and proceeded to enjoy the movie, the dinner, and the conversation. My brother was amazed that I took things with such equanimity and was so calm about it. But that’s precisely the power of internalising the Stoic dichotomy of control.

And its efficacy is not limited to minor life inconveniences, as in the episode just described. James Stockdale, a fighter-jet pilot during the Vietnam War, was shot down and spent seven and a half years in Hoa Lo prison, where he was tortured and often put in isolation. He credited Epictetus’ teachings with helping him survive the ordeal: by immediately applying the dichotomy of control to his extreme situation as a captive, he not only saved his life but was also able to coordinate the resistance from inside the prison, in his position as senior ranking officer.

Most of us don’t find ourselves in Stockdale’s predicament, but once you begin paying attention, the dichotomy of control has countless applications to everyday life, and all of them have to do with one crucial move: shifting your goals from external outcomes to internal achievements.

For example, let’s say that you are preparing your résumé for a possible job promotion. If your goal is to get the promotion, you are setting yourself up for a possible disappointment. There is no guarantee that you will get it, because the outcome is not (entirely) under your control. Sure, you can influence it, but it also depends on a number of variables that are independent of your efforts, including possible competition from other employees, or perhaps the fact that your boss, for whatever unfathomable reason, really doesn’t like you.

That’s why your goal should be internal: if you adopt the Stoic way, you would conscientiously put together the best résumé that you can, and then mentally prepare to accept whatever outcome with equanimity, knowing that sometimes the universe will favour you, and other times it will not. What do you gain by being anxious over something you don’t control? Or angry at a result that was not your doing? You are simply adding a self-inflicted injury to the situation, compromising your happiness and serenity.

This is no counsel for passive acceptance of whatever happens. After all, I just said that your goal should be to put together the best résumé possible! But it is the mark of a wise person to realise that things don’t always go the way we wish. If they don’t, the best counsel is to pick up the pieces, and move on.

Do you want to win that tennis match? It is outside of your control. But to play the best game you can is under your control. Do you want your partner to love you? It is outside of your control. But there are plenty of ways you can choose to show your love to your partner – and that is under your control. Do you want a particular political party to win the election? It is outside of your control (unless you’re Vladimir Putin!). But you can choose to engage in political activism, and you can vote. These aspects of your life are under your control. If you succeed in shifting your goals internally, you will never blame or criticise anyone, and you won’t have a single rival, because what other people do is largely beyond your control and therefore not something to get worked up about. The result will be an attitude of equanimity toward life’s ups and downs, leading to a more serene life.

Massimo Pigliucci is professor of philosophy at City College and at the Graduate Center of the City University of New York. His latest book is How to Be a Stoic: Ancient Wisdom for Modern Living (May 2017). He lives in New York.

This article was originally published at Aeon and has been republished under Creative Commons.

#

What know-it-alls don’t know, or the illusion of competence

by Kate Fehlhaber (This article was originally published at Aeon and has been republished under Creative Commons).

 

One day in 1995, a large, heavy middle-aged man robbed two Pittsburgh banks in broad daylight. He didn’t wear a mask or any sort of disguise. And he smiled at surveillance cameras before walking out of each bank. Later that night, police arrested a surprised McArthur Wheeler. When they showed him the surveillance tapes, Wheeler stared in disbelief. ‘But I wore the juice,’ he mumbled. Apparently, Wheeler thought that rubbing lemon juice on his skin would render him invisible to videotape cameras. After all, lemon juice is used as invisible ink so, as long as he didn’t come near a heat source, he should have been completely invisible.

Police concluded that Wheeler was not crazy or on drugs – just incredibly mistaken.

The saga caught the eye of the psychologist David Dunning at Cornell University, who enlisted his graduate student, Justin Kruger, to see what was going on. They reasoned that, while almost everyone holds favourable views of their abilities in various social and intellectual domains, some people mistakenly assess their abilities as being much higher than they actually are. This ‘illusion of confidence’ is now called the ‘Dunning-Kruger effect’, and describes the cognitive bias of inflated self-assessment.

To investigate this phenomenon in the lab, Dunning and Kruger designed some clever experiments. In one study, they asked undergraduate students a series of questions about grammar, logic and jokes, and then asked each student to estimate their overall score, as well as their rank relative to the other students. Interestingly, students who scored the lowest in these cognitive tasks always overestimated how well they did – by a lot. Students who scored in the bottom quartile estimated that they had performed better than two-thirds of the other students!

This ‘illusion of confidence’ extends beyond the classroom and permeates everyday life. In a follow-up study, Dunning and Kruger left the lab and went to a gun range, where they quizzed gun hobbyists about gun safety. Similar to their previous findings, those who answered the fewest questions correctly wildly overestimated their knowledge about firearms. Outside of factual knowledge, though, the Dunning-Kruger effect can also be observed in people’s self-assessment of a myriad of other personal abilities. If you watch any talent show on television today, you will see the shock on the faces of contestants who don’t make it past auditions and are rejected by the judges. While it is almost comical to us, these people are genuinely unaware of how much they have been misled by their illusory superiority.

Sure, it’s typical for people to overestimate their abilities. One study found that 80 per cent of drivers rate themselves as above average – a statistical impossibility. And similar trends have been found when people rate their relative popularity and cognitive abilities. The problem is that when people are incompetent, not only do they reach wrong conclusions and make unfortunate choices but, also, they are robbed of the ability to realise their mistakes. In a semester-long study of college students, good students could better predict their performance on future exams given feedback about their scores and relative percentile. However, the poorest performers showed no recognition, despite clear and repeated feedback that they were doing badly. Instead of being confused, perplexed or thoughtful about their erroneous ways, incompetent people insist that their ways are correct. As Charles Darwin wrote in The Descent of Man (1871): ‘Ignorance more frequently begets confidence than does knowledge.’

Interestingly, really smart people also fail to accurately self-assess their abilities. As much as D- and F-grade students overestimate their abilities, A-grade students underestimate theirs. In their classic study, Dunning and Kruger found that high-performing students, whose cognitive scores were in the top quartile, underestimated their relative competence. These students presumed that if these cognitive tasks were easy for them, then they must be just as easy or even easier for everyone else. This so-called ‘imposter syndrome’ can be likened to the inverse of the Dunning-Kruger effect, whereby high achievers fail to recognise their talents and think that others are equally competent. The difference is that competent people can and do adjust their self-assessment given appropriate feedback, while incompetent individuals cannot.

And therein lies the key to not ending up like the witless bank robber. Sometimes we try things that lead to favourable outcomes, but other times – like the lemon juice idea – our approaches are imperfect, irrational, inept or just plain stupid. The trick is to not be fooled by illusions of superiority and to learn to accurately reevaluate our competence. After all, as Confucius reportedly said, real knowledge is knowing the extent of one’s ignorance.

Kate Fehlhaber is the editor in chief of Knowing Neurons and a PhD candidate in neuroscience at the University of California, Los Angeles. She lives in Los Angeles.

This article was originally published at Aeon and has been republished under Creative Commons.

#

Wars are not won by military genius or decisive battles

by Cathal J Nolan
(This article was originally published at Aeon and has been republished under Creative Commons).

War is the most complex, physically and morally demanding enterprise we undertake. No great art or music, no cathedral or temple or mosque, no intercontinental transport net or particle collider or space programme, no research for a cure for a mass-killing disease receives a fraction of the resources and effort we devote to making war. Or to recovery from war and preparations for future wars invested over years, even decades, of tentative peace. War is thus far more than a strung-together tale of key battles. Yet, traditional military history presented battles as fulcrum moments where empires rose or fell in a day, and most people still think that wars are won that way, in an hour or an afternoon of blood and bone. Or perhaps two or three. We must understand the deeper game, not look only to the scoring. That is hard to do because battles are so seductive.

War evokes our fascination with spectacle, and there is no greater stage or more dramatic players than on a battlefield. We are drawn to battles by a lust of the eye, thrilled by a blast from a brass horn as Roman legionaries advance in glinting armour or when a king’s wave releases mounted knights in a heavy cavalry charge. Grand battles are open theatre with a cast of many tens of thousands: samurai under signal kites, mahouts mounted on elephants, a Zulu impi rushing over lush grass toward a redcoat firing line. Battles open with armies dressed in red, blue or white, flags fluttering, fife and drums beating the advance. Or with the billowing canvas of a line of fighting sail, white pufferies erupting in broadside volleys. Or a wedge of tanks hard-charging over the Russian steppe. What comes next is harder to comprehend.

Regimental Combat Team 6, 1st Battalion, 6th Marine Regiment. DOD photo by Cpl James Clark, 20 January 2012. (Photo selected by Diplopundit for this article.)

The idea of the ‘decisive battle’ as the hinge of war, and wars as the gates of history, speaks to our naive desire to view modern war in heroic terms. Popular histories are written still in a drums-and-trumpets style, with vivid depictions of combat divorced from harder logistics, daily suffering, and a critical look at the societies and cultures that produced mass armies and sent them off to fight in faraway fields for causes about which the average soldier knew nothing.

Visual media especially play on what the public wants to see: raw courage and red days, the thrill of vicarious violence and spectacle. This is the world of war as callow entertainment, of Quentin Tarantino’s Inglourious Basterds (2009) or Brad Pitt in Fury (2014). It’s not the world of real Nazis or real war.

Battles also entice generals and statesmen with the idea that a hard red day can be decisive, and allow us to avoid attrition, which we all despise as morally vulgar and without redemptive heroism. We fear to find only indecision and tragedy without uplift or morality in trench mud, or roll calls of dead accumulating over years of effort and endurance. Instead, we raise battles to summits of heroism and generals to levels of genius that history cannot support – though some historians try, celebrating even failed campaigns as glorious. Prussia is wrecked, yet Frederick is the greatest of Germans. France is beaten and an age is named for Louis XIV, another for Napoleon. Europe lies in ruin, but German generals displayed genius with Panzers.

Whether or not we agree that some wars were necessary and just, we should look straight at the grim reality that victory was most often achieved in the biggest and most important wars by attrition and mass slaughter – not by soldierly heroics or the genius of command. Winning at war is harder than that. Cannae, Tours, Leuthen, Austerlitz, Tannenberg, Kharkov – all recall sharp images in a word. Yet winning such lopsided battles did not ensure victory in war. Hannibal won at Cannae, Napoleon at Austerlitz, Hitler at Sedan and Kiev. All lost in the end, catastrophically.

There is heroism in battle but there are no geniuses in war. War is too complex for genius to control. To say otherwise is no more than armchair idolatry, divorced from real explanation of victory and defeat, both of which come from long-term preparation for war and waging war with deep national resources, bureaucracy and endurance. Only then can courage and sound generalship meet with chance in battle and prevail, joining weight of materiel to strength of will to endure terrible losses yet win long wars. Claims to genius distance our understanding from war’s immense complexity and contingency, which are its greater truths.

Modern wars are won by grinding, not by genius. Strategic depth and resolve are always more important than any commander. We saw such depth and resilience in Tsarist Russia in 1812, in France and Britain in the First World War, in the Soviet Union and the United States during the Second World War, but not in Carthage or overstretched Nazi Germany or overreaching Imperial Japan. The ability to absorb initial defeats and fight on surpassed any decision made or battle fought by Hannibal or Scipio, Lee or Grant, Manstein or Montgomery. Yes, even Napoleon was elevated as the model of battle genius by Clausewitz, and in military theory ever since, despite losing by attrition in Spain and in the calamity of the Grande Armée’s 1812 campaign in Russia. Waterloo was not the moment of his decisive defeat, which came a year earlier. It was his anticlimax.

Losers of most major wars in modern history lost because they overestimated operational dexterity and failed to overcome the enemy’s strategic depth and capacity for endurance. Winners absorbed defeat after defeat yet kept fighting, overcoming initial surprise, terrible setbacks and the dash and daring of command ‘genius’. Celebration of genius generals encourages the delusion that modern wars will be short and won quickly, when they are most often long wars of attrition. Most people believe attrition is immoral. Yet it’s how most major wars are won, aggressors defeated, the world remade time and again. We might better accept attrition at the start, explain that to those we send to fight, and only choose to fight the wars worth that awful price. Instead, we grow restless with attrition and complain that it’s tragic and wasteful, even though it was how the Union Army defeated slavery in America, and Allied and Soviet armies defeated Nazism.

With humility and full moral awareness of its terrible costs, if we decide that a war is worth fighting, we should praise attrition more and battle less. There is as much room for courage and character in a war of attrition as in a battle. There was character aplenty and courage on all sides at Verdun and Iwo Jima, in the Hürtgen Forest, in Korea. Character counts in combat. Sacrifices by soldiers at Shiloh or the Marne or Kharkov or Juno Beach or the Ia Drang or Korengal Valley were not mean, small or morally useless acts. Victory or defeat by attrition, by high explosive and machine gun over time, does not annihilate all moral and human meaning.

The Allure of Battle: A History of How Wars Have Been Won and Lost by Cathal Nolan is out now through Oxford University Press.

Cathal J Nolan teaches military history at Boston University. He is the author of The Allure of Battle: A History of How Wars Have Been Won and Lost (2017).

This article was originally published at Aeon and has been republished under Creative Commons.

 

#

What every dictator knows: young men are natural fanatics

 

by Joe Herbert, emeritus professor of neuroscience at the Cambridge Centre for Brain Repair at the University of Cambridge. His latest book is Testosterone: Sex, Power, and the Will to Win (2015). This article was originally published at Aeon and has been republished under Creative Commons.

Young men are particularly liable to become fanatics. Every dictator, every guru, every religious leader, knows this. Fanatics have an overwhelming sense of identity based on a cause (a religion) or a community (gang, team), and a tight and exclusive bond with other members of that group. They will risk injury, loss or even death for the sake of their group. They regard everyone else as outsiders, or even enemies. But why are so many of them young males?

In a world of nation-states, young men fought the wars that formed most countries. The same goes for tribes, villages and factions. Young males have qualities that specialise them for this essential function. They readily identify with their group. They form close bonds with its other members. They are prone to follow a strong leader. This is why young males are so vulnerable to environmental influences, such as the prevailing culture in which they happen to live, and why they are so easily attracted by charismatic leaders or lifestyles that promise membership of restricted groups with sharply defined objectives and values. They like taking risks on behalf of their group – and they usually underestimate the danger that such risks represent. If they didn’t have these properties, they would be less willing to go to war, and therefore less able to fulfil one of their essential sociobiological roles.

Why are young men like this? Part of it seems to depend on testosterone, acting on their brain during early foetal life. Exposure in the womb ‘masculinises’ the brain – giving it certain properties, including sexual identity as a male, as well as a preference for play patterns that involve physical contact and even play fights. We know this because girls exposed to abnormal levels of testosterone during this time show similar behaviour, while girls who are not exposed show much less of it. At puberty, there is another surge of testosterone acting on this already-prepared brain: this not only awakens sexuality, but encourages various strategies for competing for a mate – including the use of aggression and risk-taking behaviour. But testosterone is far from the only factor in making a fanatic.

Testosterone acts on an ancient part of the brain, the limbic system. The human limbic system looks very like that in other primates, such as chimpanzees, and is even easily recognisable in rats. But this part of the human brain is regulated by a more recent addition: the frontal lobes, which lie behind your forehead. Folk usage recognises their importance: in a hangover from the age of physiognomy, we call bright people ‘highbrow’, reflecting their tall foreheads (and thus their assumed larger frontal lobes). Among their other functions, the frontal lobes are important for personality, social interactions – and restraint. Damage to them results in impaired and inappropriate social behaviour, as well as lack of judgment.

Crucially, males’ frontal lobes don’t fully mature until their late 20s, whereas those of women mature earlier. This part of the brain is highly reactive to social cues and the behaviour of other people. The stereotyped young man – loud, risky, unreasonable, aggressive (but also non-conformist and thus innovative) – might be one result. So while it’s an evolutionary advantage to the group as a whole, a combination of rampant testosterone and an immature frontal lobe also explains why young men like taking risks and why they are liable to fanaticism.

Of course, not all young men, even the fanatics, become terrorists. Young men are not all the same. Different outcomes might be due to different social factors. Many terrorists come from criminal or deprived backgrounds. We know that a neglected or abusive childhood can result in antisocial or deviant behaviour later in life. An individual’s social environment, particularly early in life, can have long-lasting behavioural implications. We are beginning to learn something about how these conditions can result in persistent or even permanent changes to the brain, but so far we cannot do much about undoing them. We call people who have disregard for normal human relationships ‘psychopaths’, implying that they have abnormal (pathological) events in their ‘psyche’ (mind). We also know that there are people who develop genetically abnormal social traits (autism is one example) irrespective of upbringing. We do not know the precise defects in the brain that are responsible. Nevertheless, their nature – abnormal social behaviour and inter-personal relationships – points towards the frontal lobes, though other areas of the brain can also be involved.

Social status is prized by the males of many animal species, including humans. Several non-human primates maintain clear-cut dominance rankings. Higher status gives increased access to food, shelter and mates. It’s mostly based on physical prowess, and males fight or threaten each other to determine their relative position.

This also occurs in humans, of course. And yet the human brain has developed other ranking systems, including those based on money, birth or technical ability. The development of projectile weapons has reduced our dependence on muscular strength, but emphasised other traits, such as ruthlessness, bravery and leadership. Within fanatical groups, there is much competition to show qualities that increase a member’s standing with others in the group. This might be particularly attractive to those who, in the rest of life, have little cause to think they rank highly.

Terrorist or aggressive acts, therefore, can be carried out to prove a member’s worth, and attract the kind of attention that seems otherwise unattainable. It’s a modern way to satisfy an ancient biological need for the respect that individual males crave. In summary, the propensity of the masculine brain is to form bonds with other males (eg street gangs), to recognise and identify with groups, to defend those groups against others, and to compete with them for assets. A young male’s hormonal constitution and the way his brain matures together increase his susceptibility to fanaticism, an extreme instance of bonding, and make him prone to taking risk-laden actions on behalf of his group.

The human brain has invented additional categories of identity seemingly unknown in other species, including those based on common beliefs or ethical points of view. Today, identity is increasingly based on beliefs. The huge human brain has enabled the invention of weapons; these have given fanatics increasingly effective means of achieving the primitive aim of dominance by terrorising others. The path to fanaticism will be influenced by a male’s genes, his early experiences, his hormones, the maturity or otherwise of his brain, and the social context in which he finds himself. All these can result in a brain state we label fanaticism, a dangerous mutation of a role that is biologically essential for young men. Our task is to recognise what that brain state might be, how it arises and, if possible, to counter it.

Joe Herbert

This article was originally published at Aeon and has been republished under Creative Commons.

 

#

Stupefied: How the best and the brightest learn to switch off their brains at the office door

Posted: 11:57 am ET

 

André Spicer is professor of organisational behaviour at the Cass Business School at City, University of London, where he specialises in political dynamics, organisational culture and employee identity. His latest book, together with Mats Alvesson, is The Stupidity Paradox: The Power and Pitfalls of Functional Stupidity at Work (2016). The following is an excerpt from his piece Stupefied on how organisations enshrine collective stupidity and how employees are rewarded for checking their brains at the office door.  The article was originally published in Aeon [http://aeon.co].

Organisations hire smart people, but then positively encourage them not to use their intelligence. Asking difficult questions or thinking in greater depth is seen as a dangerous waste. Talented employees quickly learn to use their significant intellectual gifts only in the most narrow and myopic ways.

Those who learn how to switch off their brains are rewarded. By avoiding thinking too much, they are able to focus on getting things done. Escaping the kind of uncomfortable questions that thinking brings to light also allows employees to side-step conflict with co-workers. By toeing the corporate line, thoughtless employees get seen as ‘leadership material’ and promoted. Smart people quickly learn that getting ahead means switching off their brains as soon as they step into the office.

Sound familiar? For those interested in further reading, the author co-published a study, ‘A Stupidity-Based Theory of Organizations’, with Mats Alvesson in the Journal of Management Studies in 2012. The abstract is here; the full article is available for a fee here.

 #

America’s Declaration of Independence was pro-immigrant

Posted: 12:22 am EST

by Steven Pincus

Steven Pincus is professor of history at Yale University. His latest book is 1688: The First Modern Revolution (2011). He lives in New Haven, Connecticut. This article was originally published at Aeon and has been republished under Creative Commons.

The Declaration of Independence by John Trumbull, 1819. Courtesy Wikimedia


In 1776, American Patriots faced problems of crushing sovereign debt, vituperative debates about immigration, and questions about the role of foreign trade. They responded by founding a government committed to open borders and free trade. The Declaration of Independence, the country’s charter document, outlined the new republic’s fundamental economic principles, ones that Americans would be wise to remember, because they are now under threat.

Americans have long held their country’s founding document sacred. John Quincy Adams, America’s sixth president, asserted on 4 July 1821 that ‘never, never for a moment have the great principles, consecrated by the Declaration of this day, been renounced or abandoned’. In 1861, Abraham Lincoln announced that: ‘I have never had a feeling politically that did not spring from the sentiments embodied in the Declaration of Independence.’ Even this year’s Republican Platform committee agrees that the Declaration ‘sets forth the fundamental precepts of American Government’. The Declaration committed that government to reversing the oppressive policies advanced by the British monarch George III and his government. In particular, they called for the free movement of peoples and goods.

In Britain, the ministers who came to power in the 1760s and ’70s overwhelmingly believed, as do many European and North American politicians, that the only option in the face of sovereign debt is to pursue austerity measures. Like many politicians today, they were also happy to shift the tax burden onto those who had the least political capacity to object. In the 18th century, this meant taxing the under-represented manufacturing districts of England and, above all, taxing the unrepresented North Americans. Today, this often means regressive taxation: taking less from those with more.

Patriots on both sides of the Atlantic who opposed the British governments of the 1760s and ’70s did not deny that heavy national debts could be oppressive, but they insisted that the dynamic interplay of producers and consumers was the key to generating economic growth. Unlike their ministerial opponents, they believed that the best way to pay down that debt was for the government to stimulate the economy. They pointed out that the colonies represented the most dynamic sector of Britain’s imperial economy. The more the colonies grew in population and wealth, the more British manufactured goods they would consume. Since these goods were indirectly taxed, the more the Americans bought, the more they helped to lower the government’s debt. Consumption in the colonies was thus ‘the source of immense revenues to the parent state’, as the founding father Alexander Hamilton put it in 1774.

When Americans declared independence in 1776, they set forth to pursue new, independent economic policies of free trade and free immigration. The Committee of Five, including John Adams, Thomas Jefferson and Benjamin Franklin, who drew up the Declaration of Independence, condemned George III for ‘cutting off our Trade with all parts of the world’. The British government had long erected tariff and non-tariff barriers to American trade with the French and Spanish colonies in the Caribbean and South America. By doing so, they deprived Americans both of a vital outlet for their products and of access to hard currency. This was why Franklin had, in 1775, called for Britain to ‘give us the same Privileges of Trade as Scotland received at the Union [of 1707], and allow us a free Commerce with all the rest of the World’. This was why Jefferson called on the British imperial government not ‘to exclude us from going to other markets’. Freedom of commerce, admittedly one that was accompanied by state support for the development of new industries, is foundational to the United States.

The founders’ commitment to free trade stands in stark contrast with Donald Trump’s recent declaration for American ‘economic independence’. Trump insists that his economic programme echoes the wishes of the founding fathers, who ‘understood trade’. In fact, Trump’s economic principles are the reverse of those advocated by the authors of the Declaration. Like the British government of the 1760s, against which the Patriots defined themselves, Trump focuses narrowly on America’s role as a ‘dominant producer’. He is right to say that the founders encouraged manufacturing. But they did so by simultaneously supporting government subsidies for new American manufactures and advocating free trade agreements, such as the Model Treaty adopted by Congress in 1776 that sought to establish bilateral free trade. This was a far cry from Trump’s call for new ‘tariffs’.

The Declaration also condemned George III for his restrictions on immigration. Well-designed states, patriots believed, should promote immigration. This was why they denounced George III for endeavouring to ‘prevent the population of these states’. George III, the American Patriots pointed out, had reversed generations of imperial policy by ‘refusing to pass’ laws ‘to encourage … migrations hither’. Patriots, by contrast, welcomed new immigrants. They knew that British support for the immigration of Germans, Italians, Scottish Highlanders, Jews and the Irish had done a great deal to stimulate the development of British America in the 18th century. State-subsidised immigrants populated the new colony of Georgia in the 1730s. Immigrants brought with them new skills to enhance production, and they immediately proved to be good consumers. ‘The new settlers to America,’ Franklin maintained, created ‘a growing demand for our merchandise, to the greater employment of our manufacturers’.

Nothing could be further from the animating spirit of America’s charter document than closing the country’s borders. Restrictions on immigration more closely resemble British imperial policies that spurred American revolt and independence.

The Declaration of Independence was much more than a proclamation of separation from the Mother Country. It provided the blueprint, the ‘fundamental precepts’, for a new government. Americans broke away from the British Empire in the 1770s, in part, because they rejected restrictions on trade and immigration.

Steven Pincus

This article was originally published at Aeon and has been republished under Creative Commons.

#