Diplomacy: A Rusting Tool of American Statecraft

by Ambassador Chas W. Freeman, Jr. (USFS, Ret.)
Senior Fellow, the Watson Institute for International and Public Affairs, Brown University, Washington, DC and Cambridge, Massachusetts, February 2018

A Lecture to programs on Statecraft at American University, Harvard, and MIT [Republished with permission. The original text is available here]

I am here to talk about diplomacy.  This may seem an odd moment to broach the subject.  Our president has told us that it doesn’t matter that his administration is not staffed to do it, because “I’m the only one who matters.”  In other words, “l’état c’est moi.”

Now that it’s got that straight, the United States Department of State has set about dismantling itself.  Meanwhile, the Foreign Service of the United States is dejectedly withering away.  Our ever-flatulent media seem unconvinced that Americans will miss either institution.

I suspect they’re wrong about that.  Diplomacy is an instrument of statecraft that Americans have not been educated to understand and whose history they do not know.  It is not about “making nice.”  Nor is it just a delaying tactic before we send in the Marines.

Diplomacy is a political performing art that informs and determines the decisions of other states and peoples.  It shapes their perceptions and calculations so that they do what we want them to do because they come to see doing so as in their own best interest.  Diplomacy influences the policies and behavior of states and peoples through measures short of war, though it does not shrink from war as a diversion or last resort.  It is normally but not always overtly non-coercive.  It succeeds best when it embraces humility and respects and preserves the dignity of those to whom it is applied.  As the Chinese philosopher Laozi put it:  “A leader is best when people barely know he exists.  When his work is done, his aim fulfilled, they will say, we did it ourselves.”

Napoleon called diplomacy “the police in grand costume,” but it is usually not much to look at.  It seldom involves blowing things up, most of its action is unseen, and it is relatively inexpensive.  Diplomacy’s greatest triumphs tend to be preventing things from happening.  But it’s hard to prove they wouldn’t have occurred, absent diplomacy.  So diplomats are more often blamed for what did happen than credited for what didn’t.  Diplomats are even worse than sailors at marching.  Diplomacy stages no parades in which ambassadors and their political masters can strut among baton-twirling majorettes or wave to adoring crowds.  Nor, for the most part, does it justify expensive programs that generate the pork and patronage that nourish politics.

All this makes diplomacy both obscure and of little or no direct interest to the central institutions in contemporary Washington’s foreign policy.  As any foreign embassy will tell you, the U.S. Department of Defense and other elements of the military-industrial-congressional complex now dominate the policy process.  Both are heavily invested in theories of coercive interaction between states.  Both favor strategic and tactical doctrines that justify expensive weapons systems and well-paid people to use them.  Activities that cost little and lack drama do not intrigue them.  They see diplomats as the clean-up squad to be deployed after they have demolished other societies, not as peers who can help impose our will without fighting.

U.S. foreign policy is heavily militarized in theory, practice, and staffing.  No one has bankrolled the development of professional diplomatic doctrine, meaning a body of interrelated operational concepts describing how to influence the behavior of other states and people by mostly non-violent means.  So there is no diplomatic equivalent of military doctrine, the pretensions of some scholars of international relations (IR) theory notwithstanding.  This is a very big gap in American statecraft that the growing literature on conflict management has yet to fill.  The absence of diplomatic doctrine to complement military science eliminates most options short of the raw pressure of sanctions or the use of force.  It thereby increases the probability of armed conflict, with all its unpredictable human and financial consequences.

Working out a diplomatic doctrine with which to train professional diplomats could have major advantages.  Diplomatic performance might then continually improve, as military performance does, as experience emends doctrine.  But developing diplomatic doctrine would require acceptance that our country has a need for someone other than dilettantes and amateurs to conduct its foreign relations.  Our politicians, who love the spoils system, seem firmly convinced that, between them, wealthy donors and campaign gerbils can meet most of our needs in foreign affairs, with the military meeting the rest.  The Department of State, which would be the logical government agency to fund an effort at the development of tradecraft and doctrine, is usually led by diplomatic novices.  It is also the perennial runt at the federal budgetary teat.

Leadership of foreign policy by untrained neophytes was to a great extent the American norm even during the Cold War, when the United States led the world outside the Soviet camp and deployed unmatched political attractiveness and economic clout.  Now retired and active duty military officers have been added to the diplomatic management mix.  They are experts in the application of violence, not peaceable statecraft, to foreign societies.  How is this likely to work out in the new world disorder?  As the late Deng Xiaoping said, “practice is the sole criterion of truth.”  So we’ll see.  But while we wait for the outcome, there is still time to consider the potential of diplomacy as an instrument of statecraft.

The basis of diplomacy is empathy for the views of others.  It is most effective when grounded in a sophisticated understanding of another’s language, culture, feelings, and intellectual habits. Empathy inhibits killing.  It is not a character trait we expect or desire our soldiers, sailors, airmen, and marines to have.

Language and area training plus practical experience are what enable diplomats to imagine the viewpoint of foreign leaders, to see the world as they do, to analyze trends and events as they would, and to evaluate the pros and cons of actions as they might.  A competent diplomat can use such insights to make arguments that foreign leaders find persuasive.  A diplomat schooled in strategy can determine what circumstances are required to persuade foreign leaders that doing what the diplomat wants them to do is not yielding to superior power but deciding on their own to do what is in their nation’s best interest.

Empathy does not, of course, imply alignment or agreement with the viewpoints of others, just understanding of them.  It is not the same as sympathy, which identifies with others’ perspectives.  Sometimes the aim of diplomacy is to persuade a foreign country to continue to adhere to established policies, because they are beneficial.  But more commonly, it is to change the policies, behavior, and practices of other countries or individuals, not to affirm or endorse them.  To succeed, diplomats must cleave to their own side’s interests, convictions, and policy positions even as they grasp the motivations and reasoning processes of those whose positions they seek to change.  But they must also be able to see their country and its actions as others see them and accept these views as an operational reality to be acknowledged and dealt with rather than denounced as irrational or duplicitous.

To help policy-makers formulate policies and actions that have a real chance of influencing a particular foreign country’s decisions, diplomats habitually find themselves called upon to explain how and why that country’s history and circumstances make it see things and act the way it does.  In the United States, most men and women in senior foreign policy positions did not work their way up the ranks.  They are much more familiar with domestic interest groups and their views than with foreign societies and how they work.  Explanation of foreign positions is easily mistaken for advocacy of them, especially by people inclined to dismiss outlandish views that contradict their prejudices as inherently irrational or malicious.

It’s good domestic politics to pound the policy table in support of popular narratives and nationalist postures and to reject foreign positions on issues as irrational, disingenuous, or malevolent.  But diplomats can’t do that if they are to remain true to their calling.  In a policy process driven more by how things will look to potential domestic critics than by a determination actually to change the behavior of foreigners, diplomats are easily marginalized.  But when they are backed by strong-minded leaders who want results abroad, they can accomplish a great deal that military intervention cannot.

Let me give a couple of examples of how U.S. diplomacy has rearranged other states’ and people’s appraisals of their strategic circumstances and caused them to decide to adopt courses of action favored by the United States.  These examples show both the complexities with which diplomacy must deal and its limitations in terms of its ability to secure assured outcomes.

 


Ambassador Anthony Quainton: “there are more and more hammers in the policy toolbox…”

Posted: 4:04 am ET

 

From Militarization and Marginalization of American Diplomacy and Foreign Policy via American Diplomacy
Ambassador Anthony C. E. Quainton 
Former U.S. Ambassador to CAR, Nicaragua, Kuwait, Peru
Former Assistant Secretary of State for Diplomatic Security, DGHR, and CT Coordinator

“[W]e are not facing a militarization of American foreign policy but the marginalization of diplomacy as the effective alternative to military force. The denigration and dismissal of soft power, even when it is renamed smart power, has led to a perception of diplomatic weakness and the concomitant rise of military influence on the policy process. It is a sad reality that there are more and more hammers in the policy toolbox and fewer alternative weapons. The result may be that a president anxious to make America great again and to demonstrate the effectiveness of American leadership and power may look for a place of his choosing to demonstrate American power. President Trump does not seem temperamentally interested in the prolonged and protracted process of diplomacy. His recent tweet questioning the utility of Secretary Tillerson’s efforts to engage the North Koreans in dialogue is an example of this skepticism. In these circumstances we should not be surprised if the United States were to decide to choose a target of opportunity in Iran or North Korea or Syria to show off its military might. This will not reflect the institutional militarization of American foreign policy but rather the emotional need of many Americans, frustrated by our loss of global standing, to demonstrate that America can indeed be great again. Neither a resourced military nor a marginalized diplomacy should want that to happen.”

#

 

More than just sanctuary, migrants need social citizenship #seventhperson

By Nancy Berlinger | She is a research scholar at The Hastings Center in New York. Her most recent book is Are Workarounds Ethical? Managing Moral Problems in Health Care Systems (2016). She co-directs the Undocumented Patients project. | Via Creative Commons Attribution-No Derivatives

 

In 1975, the English author John Berger wrote about the political implications of immigration, at a time when one in seven workers in the factories of Germany and Britain was a male migrant – what Berger called the ‘seventh man’. Today, every seventh person in the world is a migrant.

Migrants are likely to settle in cities. In the United States, 20 cities (accounting for 36 per cent of the total US population in 2014) were home to 65 per cent of the nation’s authorised immigrants and 61 per cent of unauthorised immigrants. In Singapore, migrant workers account for 20 per cent of the city-state’s population. (Migrants continue to be a significant rural population. In the US, three-quarters of farm workers are foreign-born.)

Scholarship on migration tends to focus normative arguments on the national level, where policy concerning borders and immigration is made. Some prominent political philosophers – including David Miller at Nuffield College, Oxford, and Joseph Carens at the University of Toronto – also outline an account of ‘social membership’ in receiving societies. This process unfolds over five to 10 years of work, everyday life and the development of attachments. As Carens writes in ‘Who Should Get In?’ (2003), after a period of years, any migrant crosses a ‘threshold’ and is no longer a stranger. This human experience of socialisation holds true for low-wage and unauthorised migrants, so a receiving society should acknowledge that migrants themselves, not only their economic contributions, are part of that society.

Carens and Miller apply this argument to the moral claims of settled migrants at risk of deportation because they are unauthorised or because the terms of their presence are tightly limited by work contracts. In the US, for example, most of the estimated 11.3 million people who crossed a border without authorisation or are living outside the terms of their original visas have constituted a settled population for the past decade, with families that include an estimated 4 million children who are US citizens by birthright. In The Ethics of Immigration (2013), Carens writes that the prospect of deporting young immigrants from the place where they had lived most of their lives was especially troubling: it is ‘morally wrong to force someone to leave the place where she was raised, where she received her social formation, and where she has her most important human connections’. Miller and Carens concur with the Princeton political theorist Michael Walzer’s view of open-ended guest-worker programmes as ethically problematic. The fiction that such work is temporary and such workers remain foreign obscures the reality that these migrants are also part of the societies in which they live and work, often for many years, and where they deserve protection and opportunities for advancement.

Not all migrants will have access to a process leading to national citizenship or permanent legal residence status, whether this is because they are unauthorised, or their immigration status is unclear, or they are living in a nation that limits or discourages immigration while allowing foreign workers on renewable work permits. If we agree that migration is part of the identity of a society in which low-wage migrants live and work, whether or not this is acknowledged by non-migrants or by higher-status migrants, what would it mean to build on the idea of social membership and consider migrants as social citizens of the place in which they have settled? And what realistic work can the idea of social citizenship do in terms of improving conditions for migrants and supporting policy development?

Social citizenship is both a feeling of belonging and a definable set of commitments and obligations associated with living in a place; it is not second-class national citizenship. The place where one’s life is lived might have been chosen in a way that the nation of one’s birth was not; for a Londoner or a New Yorker, local citizenship can be a stronger identity than national citizenship. Migrants live in cities with a history of welcoming immigrants, in cities that lack this history, and also in cities where national policy discourages immigration. Considering how to ensure that social citizenship extends to migrants so that they get to belong, to contribute, and to be protected is a way to frame ethical and practical questions facing urban policymakers.

Considering migrants as social citizens of the cities in which they settle is related to but not the same as the idea of the city as a ‘sanctuary’ for migrants. Throughout the US, local officials have designated ‘sanctuary cities’ for undocumented immigrants subject to deportation under policies announced by the federal government in February 2017. This contemporary interpretation of an ancient concept refers to a policy of limited local cooperation with federal immigration officials, often associated with other policies supporting a city’s migrant population. Canadian officials use the term ‘sanctuary city’ similarly, to refer to local protections and potentially also to limited cooperation with border-control authorities. In Europe, the term ‘city of sanctuary’ tends to refer to efforts supporting local refugees and coordinated advocacy for refugee admission and rights. These local actions protecting migrants are consistent with a practical concept of social citizenship in which civic history and values, and interests such as being a welcoming, diverse or growing city, correspond to the interests of migrants. However, the idea of ‘sanctuary’ suggests crisis: an urgent need for a safe place to hide. To become social citizens, migrants need more from cities than sanctuary.

Local policies that frame social citizenship in terms that apply to settled migrants should go beyond affirming migrants’ legal rights and helping them to use these rights, although this is certainly part of a practical framework. Social citizenship, as a concept that should apply to migrants and non-migrants alike, on the basis of being settled into a society, can build on international human rights law, but can be useful in jurisdictions where human rights is not the usual reference point for considering how migrants belong to, contribute to, and are protected by a society.

What can a city expect or demand of migrants as social citizens? Mindful that the process of social integration usually takes more than one generation, it would not be fair to expect or demand that migrants integrate into a new society on an unrealistic timetable. Most migrants are adults, and opportunities to belong, to contribute, and to be protected should be available to them, as well as to the next generation. Migrants cannot be expected to take actions that could imperil them or their families. For example, while constitutionally protected civil rights in the US extend to undocumented immigrants, using these rights (by identifying themselves publicly, for example) can bring immigrants to the attention of federal authorities, a reality or fear that might constrain their ability to participate in civic life.

In his novel Exit West (2017), Mohsin Hamid offers a near-future fictional version of a political philosopher’s ‘earned amnesty’ proposal. Under the ‘time tax’, newer migrants to London pay a decreasing ‘portion of income and toil’ toward social welfare programmes for longstanding residents, and have sweat-equity opportunities to achieve home ownership by working on infrastructure construction projects (the ‘London Halo’). Today, the nonfictional citizens of Berlin are debating how to curb escalating rents so that the city remains open to lower-wage residents, including internal and transnational migrants. A robust concept of social citizenship that includes migrants who have begun the process of belonging to a city, and those who should be acknowledged as already belonging, will provide a necessary framework for understanding contemporary urban life in destination cities.

Nancy Berlinger

This article was originally published at Aeon and has been republished under Creative Commons.

#


American exceptionalism, from Stalin with love

By Ian Tyrrell | He is emeritus professor of history at the University of New South Wales in Sydney, Australia. His latest book is Crisis of the Wasteful Nation: Empire and Conservation in Theodore Roosevelt’s America (2015). Creative Commons Attribution-No Derivatives. Via Aeon

 

Every time a public figure uses the term ‘American exceptionalism’, ordinary Americans turn to my website. It’s number one for a quick answer to the question: ‘What is American exceptionalism?’ My latest benefactor was Hillary Clinton, who used the term in a speech on 31 August. My website hits spiked. Until about 2010, few Americans had heard the term. Since then, its use has expanded exponentially. It is strange that such an inelegant term should be adopted by two major political parties when so many people had not a clue what it meant. Of course, one doesn’t have to use the term to believe in the underlying concept. But the phrase has a history that helps us to understand the current hyperbolic use.

American exceptionalism is not the same as saying the United States is ‘different’ from other countries. It doesn’t just mean that the US is ‘unique’. Countries, like people, are all different and unique, even if many share some underlying characteristics. Exceptionalism requires something far more: a belief that the US follows a path of history different from the laws or norms that govern other countries. That’s the essence of American exceptionalism: the US is not just a bigger and more powerful country – but an exception. It is the bearer of freedom and liberty, and morally superior to something called ‘Europe’. Never mind the differences within Europe, or the fact that ‘the world’ is bigger than the US and Europe. The ‘Europe’ versus ‘America’ dichotomy is the crucible in which American exceptionalist thinking formed.

Some presume that the Frenchman Alexis de Tocqueville invented the term in the 1830s, but only once did de Tocqueville actually call American society ‘exceptional’. He argued that Americans lacked culture and science, but could rely on the Anglo-Saxons in Britain to supply the higher forms of civilisation. This is not what Americans mean by ‘exceptionalism’ today.

American exceptionalism is an ideology. The ‘ism’ is the giveaway. De Tocqueville examined US institutions and moral behaviours as structural tendencies of democratic societies. He did not see US democracy as an ideology. To him, the US was the harbinger of a future that involved the possible democratisation of Europe, not an unrepeatable outlier of civilisation. He studied the US as a model of democratic society, whose workings needed to be understood, because the idea was spreading.

Some think that Werner Sombart, the German socialist of the early 1900s, invented the term, but he did not. Sombart claimed only that US capitalism, and its abundance, made the country temporarily unfavourable terrain for the development of socialism. It was actually Joseph Stalin, or his minions, who, in 1929, gave the idea its name. It is surely one of the ironies of modern history that both major US political parties now compete to endorse a Stalinist term.

Orthodox communists used the term to condemn the heretical views of the American communist Jay Lovestone. In the late 1920s, Lovestone argued that the capitalist economy of the US did not promote the revolutionary moment for which all communists waited. The Communist Party expelled Lovestone, but his followers and ex-Trotskyites in the US embraced the exceptionalist epithet and, eventually, the idea that the US would permanently avoid the socialist stage of development.

After the Nazi-Soviet Pact of 1939, as well as later during the Cold War, many of these US Marxists jettisoned their old political allegiances but retained the mindset that the economic success of the US buried class struggle in their nation – permanently. As the leader of the free world, the chief victor in the Second World War over ‘totalitarian’ Germany, and by far the world’s most prosperous economy, the US seemed in all these ways an exceptional nation. Seymour Martin Lipset, the eminent Stanford political sociologist, made a career investigating the many factors that led to this American exceptionalism. Until his death in 2006, Lipset continued to hold that the US was not subject to the historical norms of all other nations.

No one did more than Ronald Reagan to amplify and popularise the US as exceptional. Refusing to accept the doldrums of the Jimmy Carter presidency or the transgressions of Richard Nixon as the best that Americans could do, Reagan promoted the image of the US as a shining ‘city upon a hill’. This reference is to a 1630 sermon by John Winthrop, the Governor of Massachusetts Bay Colony. Winthrop was calling on the new Pilgrim settlers heading for Massachusetts to stick to the narrow path of Puritanism.

Reagan and his followers wrongly attributed American exceptionalism to this Puritan injunction, and added ‘shining’ to the original, which gave the phrase a distinctly different connotation. Nor was Winthrop referring to any nation, but rather a discrete community of English Protestant believers. Notably, Winthrop’s sermon had been neglected for centuries. It was resurrected only in the 1940s by a few Harvard academics who were engaged in an intellectual rehabilitation of Puritan thought. In a 1961 speech, John F Kennedy, who had been a Harvard student and was influenced by that university’s Americanists, used the ‘city upon a hill’ phrase. The idea of the US as a ‘city upon a hill’, however, really gained purchase in political rhetoric in the 1970s and ’80s, as Reagan sought to reinvent the country.

Without question, Reagan saw the US as an exceptional nation. The language of exceptionalism, however, derived from Marxism, not God. The idea of a morally superior and unique civilisation destined to guide the world did not come under the banner of an orthodox ‘ism’ until very recently, in the 21st century. In the wake of 9/11, the speeches of George W Bush and his supporters asserted the radical distinctiveness of the US with a new belligerence. We have all heard it: it is ‘our freedoms’ that Islamic terrorists hated; they wished to kill Americans because they envied this exceptional inheritance.

The global financial crisis of 2007-10 added to the geopolitical turmoil that followed 9/11. Though the US economy expanded in the 1990s and early 2000s, economic inequality that began to grow in the Reagan era also became worse. In the post-1945 age, when academics first posed American exceptionalism as a coherent doctrine, the idea also became linked to global US military and political hegemony. In the past two generations, since the Reagan era, Americans have not prospered to the same extent, and American exceptionalism has been increasingly linked only to military hegemony.

Decline is, in fact, the midwife to the ideology of American exceptionalism. The less exceptional that circumstances in the US appear, the louder defenders of exceptionalism insist on orthodoxy. When the nation was indisputably powerful and its people prosperous, Americans did not collectively require an ‘ism’ to serve as a guiding light. In these more polarised times, when the fates of Americans become based more on their class and less on their shared nationality, the ideological orthodoxy of American exceptionalism has emerged on a political level. A previously obscure academic term became a rallying cry for a political agenda.

When Hillary Clinton joins the exceptionalist bandwagon, it reflects a political consensus that Donald Trump denies. In wanting to make America great again, Trump implicitly accepts that it is not currently ‘great’, and never was exceptional. No longer is the Republican Party the chief cheerleader of American exceptionalism. But the Democrats have picked up the mantle, and the language of exceptionalism continues to rally a party and a country.

Ian Tyrrell

This article was originally published at Aeon and has been republished under Creative Commons.

#


The Middle East in the New World Disorder

By Ambassador Chas W. Freeman, Jr. (USFS, Ret.)

Not so long ago, Americans thought we understood the Middle East, that region where the African, Asian, and European worlds collide.  When the Ottoman Empire disintegrated in World War I, the area  became a European sphere of influence with imperial British, French, and Italian subdivisions.  The Cold War split it into American and Soviet client states.  Americans categorized countries as with us or against us, democratic or authoritarian, and endowed with oil and gas or not.  We acted accordingly.

In 1991, the Soviet Union defaulted on the Cold War and left the United States the only superpower still standing.  With the disappearance of Soviet power, the Middle East became an exclusively American sphere of influence.  But a series of U.S. policy blunders and regional reactions to them have since helped thrust the region into chaos, while progressively erasing American dominance.

In the new world disorder, there are many regional sub-orders.  The Middle East is one of them.  It is entering the final stages of a process of post-imperial, national self-determination that began with Kemal Atatürk’s formation of modern Turkey from the rubble of the Ottoman Empire in 1923.  This process is entrenching the originally Western concept of the nation state in the region.  It led to Gamal Abdel Nasser’s repudiation of British overlordship and overthrow of the monarchy in Egypt in 1952, Ayatollah Khomeini’s rejection of American tutelage and replacement of the Shah with the Islamic Republic of Iran in 1979, and the misnamed “Arab Spring” in 2011.  Its latest iteration is unfolding in Saudi Arabia.

In the Middle East, as elsewhere, regional rather than global politics now drives events.  The world is reentering a diplomatic environment that would have been familiar to Henry John Temple, 3rd Viscount Palmerston, who served nineteenth century Britain as secretary of war, foreign affairs, and prime minister.  In his time, the core skill of statecraft was manipulation of regional balances of power to protect national interests and exercise influence through measures short of war.

Palmerston famously observed that in international relations, there are no permanent friends or permanent enemies, only permanent interests.  In the new world disorder, with its narcissistic nationalism, shifting alignments, and wobbling partnerships, this sounds right, even if national interests are also visibly evolving to reflect fundamental shifts in their international context.  Palmerston’s aphorism is a reminder that the flexibility and agility implicit in the hedged obligations of entente – limited commitments for limited contingencies – impart advantages that the inertia of alliance – broad obligations of mutual aid – does not.  One way or another, it is in our interest to aggregate the power of others to our own while minimizing the risks to us of doing so.

To cope with the world after the Pax Americana and to put “America first,” we Americans are going to have to relearn the classic vocabulary of diplomacy or some new, equally reality-based version of it.  If we do, we will discover that, in the classic sense of the word, we now have no “allies” in the Middle East.  The only country with which we had a de jure alliance based on mutual obligations, Turkey, has de facto departed it.

Today, Ankara and Washington are seriously estranged.  Turkey is no longer aligned with the United States on any of our major diplomatic objectives in the region, which have been: securing Israel; excluding Russian influence; opposing Iran; and sustaining strategic partnerships with Saudi Arabia and the U.A.E.  Americans can no longer count on Turkey to support or acquiesce in our policies toward the Israel-Palestine issue; Syria; Iraq; Iran; Russia; the Caucasus; the Balkans; Greece; Cyprus; Egypt; the Gulf Cooperation Council countries; the members of the Organization of Islamic Cooperation; NATO; or the EU.

Having been rebuffed by Europe, Turkey has abandoned its two-century-long drive to redefine its identity as European.  It is pursuing an independent, if erratic, course in the former Ottoman space and with Russia and China.  The deterioration in Turkey’s relations with both the EU and the United States represents a very significant weakening of Western influence in the Middle East and adjacent regions.  As the list of countries Turkey affects suggests, this has potentially far-reaching consequences.

Meanwhile, U.S. relations with Iran remain antagonistic.  American policy blunders like the destabilization of Iraq and Syria have facilitated Iran’s establishment of a sphere of influence in the Fertile Crescent.  Our lack of a working relationship with Tehran leaves the United States unable to bring our influence to bear in the region by measures short of war.  U.S. policy is thus all military, all the time.  The White House echoes decisions made in Jerusalem, Riyadh, and Abu Dhabi.  It no longer sets its own objectives and marshals others behind them.

For our own reasons, which differ from country to country, Americans have unilaterally taken under our wing a variety of client states, some of which are each other’s historic antagonists.  Our commitments have not changed despite the fact that the regional context of our relationships with our client states and their orientations and activities are all in rapid evolution.  Other than Turkey, the United States has never had a Middle Eastern partner that has seen itself as obliged to come to our aid or, indeed, to do anything at all for us except what might serve its own immediate, selfish interests.  The obligations all run the other way – from us to them.


Bragging rights: when beating your own drum helps (or hurts)

By Patrick Heck | He is a PhD candidate in social psychology at Brown University in Rhode Island, where he studies the Self, social judgment and decision making, and prosocial behavior. Via Creative Commons Attribution-No Derivatives

Social observers are particularly attuned to braggadocio. What do you think of a person who claims to be a better driver, performer or lover than average? Is this person better described as confident or cocky; self-important or honest? Would you put your health or safety in their hands? And what about the opposite type of person, who claims to be worse than others? Would you hire this person for a job? In what field?

Social scientists have been asking for decades whether boastful, self-aggrandising beliefs and behaviours are beneficial to those who make such claims. According to one school of thought, claiming to be better than others feels good, and when we feel good, we are happier and better adjusted. This argument suggests that bragging to others can satisfy the motive to craft and maintain a positive self-image. According to another line of research, however, consistently viewing oneself as superior entails a distortion of reality. Inaccurate individuals with low self-knowledge have weaker relationships and a tendency to make riskier decisions than their accurate, self-aware counterparts.

Together with Joachim Krueger at Brown University in Rhode Island, I recently proposed a middle ground: braggadocio could be a double-edged sword. In our paper in the journal Social Psychology, we argue that thinking you are better than average (and bragging to others about it) can damage some aspects of your reputation but boost others. Bragging can help or harm depending upon your goals – so you’d do well to know what you want to accomplish before tooting your own horn.

To test how observers respond to braggadocio and humility, we recruited nearly 400 volunteers and asked them to rate a series of target individuals along the two major dimensions of social perception: competence, including rationality, intelligence and naiveté, and morality, including ethics, trustworthiness and selfishness. Some of the targets were described as performing better or worse than average without making any claims themselves. Others claimed to be better or worse than average without any evidence. Still others made a claim about themselves (‘I did better/worse than average’) while the researchers also revealed their actual scores.

The results demonstrated several detrimental effects of boasting, although we observed some surprising benefits too. Perhaps the most interesting finding was what we call the ‘humility paradox’. In the absence of evidence (ie, a test score), bragging to be better than average boosted a target’s reputation as competent, but diminished their reputation as moral. Conversely, those who remained humble by claiming to be worse than average were rated as more moral and less competent than the braggarts. The paradox suggests that when deciding whether or not to boast about your performance, keen decision-makers might first stop to consider which aspect of reputation they are most interested in emphasising or protecting.

The results were especially nuanced when test subjects rated targets whose claims were either validated or violated by objective evidence (their actual test performance). For moral reputations, humility remained a beneficial strategy even when a target performed well. Across the board, participants rated targets who claimed to be worse than average as more moral than targets who claimed to be better than average, regardless of their actual performance. In the domain of morality, humility pays.

For perceived competence, evidence mattered. The absolute worst thing a target could do was to claim superiority (‘I am better than average’) when the evidence proved him wrong (‘Harry actually scored below average on the test’).

There was, to be sure, some strategic benefit to making a boastful claim: targets who claimed to be better than average were seen as quite competent either when:

(a) evidence supported this claim; or

(b) no evidence was available.

In other words, boasting appeared to benefit a target’s reputation as competent, so long as contradictory evidence was never revealed.

As is the case with most experiments in social psychology, these studies were conducted in a contrived laboratory setting, and carry several limitations. All our participants lived in the United States, although we know that cultural background can encourage or discourage boasting. Similarly, all the targets that our participants rated had male names in order to rule out any confounding effects of gender, even though we know that the gender of observers and targets plays an important role in social perception. Culture and gender are two variables we would like to incorporate in future studies on the nature and perception of bragging.

Despite these limitations, the results of our studies suggest a few strategies for daily life: in situations where your competence is of critical interest (such as a job interview or debate), claiming to be better than the other candidates could be beneficial, so long as contradictory evidence will never come to light. But in situations where your reputation as a warm or moral person is put to the test (say, while networking or on a date), it appears that humility is the best strategy, even if you truly have something to brag about.

Patrick Heck

This article was originally published at Aeon and has been republished under Creative Commons.

#


@StateDept Spox Talks “No Double Standard Policy” and 7 FAM 052 Loudly Weeps

Posted: 2:58 am ET

 

So we asked about the State Department’s “no double standard” policy on December 5 after media reports said that classified cables went out in the past two weeks warning US embassies worldwide to heighten security ahead of a possible @POTUS announcement recognizing Jerusalem as the capital of Israel.

On December 7, the State Department press corps pressed the official spokesperson about a cable that reportedly asked agency officials to defer all nonessential travel to Israel, the West Bank, and Jerusalem. Note that the security messages issued by multiple posts on December 5 and 6 were, with few exceptions, personal security reminders and warnings of potential protests.  The Worldwide Caution issued on December 6 is an update “with information on the continuing threat of terrorist actions, political violence, and criminal activity against U.S. citizens and interests abroad.”

None of the released messages included any indication that USG officials were warned to defer non-essential travel to the immediately affected areas. When pressed about this apparent double standard, the official spox insisted that “unfortunately, just as State Department policy, we don’t comment on official – whether or not there was an official communication regarding — regarding this.”

Noooooooooooooooooo!

The spox then explained what the “no double standard” policy means while refusing to comment on official communication that potentially violates that very policy. And if all else fails, try “hard to imagine that our lawyers have not gone through things.”

Holy moly guacamole, read this: 7 FAM 052  NO DOUBLE STANDARD POLICY

In administering the Consular Information Program, the Department of State applies a “no double standard” policy to important security threat information, including criminal information.

Generally, if the Department shares information with the official U.S. community, it should also make the same or similar information available to the non-official U.S. community if the underlying threat applies to both official and non-official U.S. citizens/nationals.

If a post issues information to its employees about potentially dangerous situations, it should evaluate whether the potential danger could also affect private U.S. citizens/nationals living in or traveling through the affected area.

The Department’s “No Double Standard” policy, provided in 7 FAM 052, is an integral part of CA/OCS’s approach to determine whether to send a Message.  The double standard we guard against is in sharing threat-related information with the official U.S. community — beyond those whose job involves investigating and evaluating threats — but not disseminating it to the U.S. citizen general public when that information does or could apply to them as well.

Also this via 7 FAM 051.2(b) Authorities (see also 22 CFR 71.1, 22 U.S.C. 2671(b)(2)(A), 22 U.S.C. 4802, and 22 U.S.C. 211a):

…The decision to issue a Travel Alert, Travel Warning, or a Security or Emergency Message for U.S. Citizens for an individual country is based on the overall assessment of the safety/security situation there.  By necessity, this analysis must be undertaken without regard to bilateral political or economic considerations.  Accordingly, posts must not allow extraneous concerns to color the decision of whether to issue information regarding safety or security conditions in a country, or how that information is to be presented.

As to the origin of this policy, we would need to revisit the Lockerbie Bombing and Its Aftermath (this one via ADST’s Oral History).

The State Department’s official spokesperson via the Daily Press Briefing, December 7, 2017:

QUESTION: So a cable went out to all U.S. diplomatic and consular missions yesterday that asked State Department officials to defer all nonessential travel to the entirety of Israel, the West Bank, and Jerusalem. Normally when you are discouraging American officials from going to a particular area, under the no double standard rule, you make that public to all U.S. citizens so that they have the same information. I read through the Travel Warnings on Israel, the West Bank, and Gaza yesterday, both in the middle of the day and then at the end of the day after the worldwide caution, and I saw no similar warning to U.S. citizens or advice to U.S. citizens to defer nonessential travel to those areas. Why did you say one thing in private to U.S. officials and another thing – and not say the same thing in public to U.S. citizens?

MS NAUERT: Let me state the kinds of communication that we have put out to American citizens and also to U.S. Government officials. And one of the things we often say here is that the safety and security of Americans is our top priority. There are top policy priorities, but that is our overarching, most important thing, the safety and security of Americans.

We put out a security message to U.S. citizens on the 5th of December – on Monday, I believe it was. We put out a security message to our U.S. citizens that day – that was Tuesday? Okay, thank you – on the 5th of December. We put out another one on the 6th of December as well, expressing our concerns. We want to alert people to any possible security situations out of an abundance of caution. That information was put, as I understand it, on the State Department website, but it was also issued by many of our posts overseas in areas where we thought there could be something that could come up.

In addition to that, there is a Travel Warning that goes out regarding this region. That is something that is updated every six months, I believe it is. This Travel Warning for the region has been in effect for several, several years, so that is nothing new. In addition to that, we put out a worldwide caution. That is updated every six months. We had a worldwide caution in place for several years, but yesterday, out of an abundance of caution, we updated it. As far as I’m aware of, and I won’t comment on any of our internal communications to say whether or not there were any of these internal communications because we just don’t do that on any matter, but I think that we’ve been very clear with Americans, whether they work for – work for the U.S. Government or whether they’re citizens traveling somewhere, about their safety and security. This is also a great reminder for any Americans traveling anywhere around the world to sign up for the State Department’s STEP program, which enables us to contact American citizens wherever they are traveling in the case of an emergency if we need to communicate with them.

QUESTION: But why did you tell your officials not to travel to those areas between December 4th and December 20th, and not tell American citizens the same things? Because you didn’t tell that to American citizens in all of the messages that you put up on the embassy website, on the consulate website, nor did you tell American citizens that in a Worldwide Caution, nor did you tell them that in the link to Israel, the West Bank, and Gaza that was put out by the State Department in the Worldwide Caution yesterday. You’re telling your people inside one thing, and you’re telling American citizens a different thing, and under your own rules, you are – there is supposed to be no double standard. Why didn’t you tell U.S. citizens the same thing you told the U.S. officials?

MS NAUERT: Again, unfortunately, just as State Department policy, we don’t comment on official – whether or not there was an official communication regarding —


QUESTION: (Off-mike.)

MS NAUERT: – regarding this. But I can tell you as a general matter, I think we have been very clear about the security concerns regarding Americans. We have put out those three various subjects or types of communications to American citizens who are traveling in areas that could be affected.

QUESTION: I’m going to ask you –

MS NAUERT: In terms of the U.S. Government, when we talk about the U.S. Government deferring non-essential travel, I would hope that people would not travel for non-essential reasons just as a general matter anyway.

QUESTION: But why – I’m going to ask you a hypothetical, which I would ask you to entertain, if you’ll listen to it.

MS NAUERT: I’ll listen to it. I’d be happy to listen to it.

QUESTION: If there were such communication, and you know and every U.S. diplomat who gets an ALDAC, which means every other person who works at the State Department knows that this communication went out – so if there were such communication, why would you say one thing to your own officials and a different thing to American citizens —

MS NAUERT: As our —

QUESTION: – which is what the law and your own rules require?

MS NAUERT: As you well know, we have a no “double standard.” And for folks who aren’t familiar with what that means, it’s when we tell our staff something about a particular area or a security threat, we also share that same information with the American public. I would find it hard to imagine that our lawyers have not gone through things to try to make sure that we are all on the same page with the information that we provide to U.S. Government officials as well as American citizens. And that’s all I have for you on that. Okay? Let’s move on to something else.

#

Why rudeness at work is contagious and difficult to stop

By Trevor Foulk | He is a PhD candidate in business administration at the University of Florida. He is interested in negative work behaviours, team dynamics, decision-making, and depletion/recovery. Creative Commons Attribution-No Derivatives

 

Most people can relate to the experience of having a colleague inexplicably treat them rudely at work. You’re not invited to attend a meeting. A co-worker gets coffee – for everyone but you. Your input is laughed at or ignored. You wonder: where did this come from? Did I do something? Why would he treat me that way? It can be very distressing because it comes out of nowhere and often we just don’t understand why it happened.

A large and growing body of research suggests that such incidents, termed workplace incivility or workplace rudeness, are not only very common, but also very harmful. Workplace rudeness is not limited to one industry, but has been observed in a wide variety of settings in a variety of countries with different cultures. Defined as low-intensity deviant behaviour with ambiguous intent to harm, these behaviours – small insults, ignoring someone, taking credit for someone’s work, or excluding someone from office camaraderie – seem to be everywhere in the workplace. The problem is that, despite their ‘low-intensity’ nature, the negative outcomes associated with workplace rudeness are anything but small or trivial.

It would be easy to believe that rudeness is ‘no big deal’ and that people must just ‘get over it’, but more and more researchers are finding that this is simply not true. Experiencing rudeness at work has been associated with decreased performance, decreased creativity, and increased turnover intentions, to name just a few of the many negative outcomes of these behaviours. In certain settings, these negative outcomes can be catastrophic – for example, a recent article showed that when medical teams experienced even minor insults before performing a procedure on a baby, the rudeness decimated their performance and led to mortality (in a simulation). Knowing how harmful these behaviours can be, the question becomes: where do they come from, and why do people do them?

While there are likely many reasons people behave rudely, at least one explanation that my colleagues and I have recently explored is that rudeness seems to be ‘contagious’. That is, experiencing rudeness actually causes people to behave more rudely themselves. Lots of things can be contagious – from the common cold, to smiling, yawning and other simple motor actions, to emotions (being around a happy person typically makes you feel happy). And as it turns out, being around a rude person can actually make you rude. But how?

There are two ways in which behaviours and emotions can be contagious. One is through a conscious process of social learning. For example, if you’ve recently taken a job at a new office and you notice that everybody carries a water bottle around, it likely won’t be long until you find yourself carrying one, too. This type of contagion is typically conscious. If somebody said: ‘Why are you carrying that water bottle around?’, you would say: ‘Because I saw everybody else doing it and it seemed like a good idea.’

Another pathway to contagion is unconscious: research shows that when you see another person smiling, or tapping a pencil, for example, most people will mimic those simple motor behaviours and smile or tap a pencil themselves. If someone were to ask why you’re smiling or tapping your pencil, you’d likely answer: ‘I have no idea.’

In a series of studies, my colleagues and I found evidence that rudeness can become contagious through a non-conscious, automatic pathway. When you experience rudeness, the part of your brain responsible for processing rudeness ‘wakes up’ a little bit, and you become a little more sensitive to rudeness. This means that you’re likely to notice more rude cues in your environment, and also to interpret ambiguous interactions as rude. For example, if someone said: ‘Hey, nice shoes!’ you might normally interpret that as a compliment. If you’ve recently experienced rudeness, you’re more likely to think that person is insulting you. That is, you ‘see’ more rudeness around you, or at least you think you do. And because you think others are being rude, you become more likely to behave rudely yourself.

You might be wondering: how long does this last? Without more research it’s impossible to say for sure, but in one of our studies we saw that experiencing rudeness caused rude behaviour up to seven days later. In this study, which took place in a negotiations course at a university, participants engaged in negotiations with different partners. We found that when participants negotiated with a rude partner, their partner in the next negotiation rated them as behaving rudely. Some of the negotiations took place with no time lag, some with a three-day lag, and some with a seven-day lag. To our surprise, the length of the lag seemed to be unimportant: at least within a seven-day window, the effect did not appear to wear off.

Unfortunately, because the rudeness is contagious and unconscious, it’s hard to stop. So what can be done? Our work points to a need to re-examine the types of behaviours that are tolerated at work. More severe deviant behaviours, such as abuse, aggression and violence, are not tolerated because their consequences are blatant. While rudeness of a more minor nature makes its consequences a little harder to observe, it is no less real and no less harmful, and thus it might be time to question whether we should tolerate these behaviours at work.

You might be thinking that it will be impossible to end workplace rudeness. But work cultures can change. Workers once used to smoke at their desks, and those same workers would have said it was a natural part of office life that couldn’t be removed. Yet workplace smoking is verboten everywhere now. We’ve drawn the line at smoking and discrimination – and rudeness should be the next to go.

Trevor Foulk

This article was originally published at Aeon and has been republished under Creative Commons.

#


Moderation may be the most challenging and rewarding virtue

By Aurelian Craiutu

He is a professor of political science and adjunct professor of American studies at Indiana University, Bloomington. His most recent book is Faces of Moderation: The Art of Balance in an Age of Extremes (2016). He lives in Bloomington.

Three centuries ago, the French political philosopher Montesquieu claimed that human beings accommodate themselves better to the middle than to the extremes. Only a few decades later, George Washington begged to differ. In his Farewell Address (1796), the first president of the United States sounded a warning signal against the pernicious effects of the spirit of party and faction. The latter, he argued, has its roots in the strongest passions of the human mind and can be seen in ‘its greatest rankness’ in popular government where the competition and rivalry between factions are ‘sharpened by the spirit of revenge’ and immoderation.

If one looks at our world today, we might be tempted to side with Washington over Montesquieu. Our political scene offers a clear sign of the little faith we seem to have in this virtue without which, as John Adams memorably put it in 1776, ‘every man in power becomes a ravenous beast of prey’. Although our democratic institutions depend on political actors exercising common sense, self-restraint and moderation, we live in a world dominated by hyperbole and ideological intransigence in which moderates have become a sort of endangered species in dire need of protection. Can we do something about that to save them from extinction? To answer this question, we should take a new look at moderation, which Edmund Burke regarded as a difficult virtue, proper only to noble and courageous minds. What does it mean to be a moderate voice in political and public life? What are the principles underlying moderation? What do moderates seek to achieve in society, and how do they differ from more radical or extremist minds?


Massimo Pigliucci: To be happier, focus on what’s within your control

by Massimo Pigliucci
(This article was originally published at Aeon and has been republished under Creative Commons)

God, grant me the serenity to accept the things I cannot change,
Courage to change the things I can,
And wisdom to know the difference.

This is the Serenity Prayer, originally written by the American theologian Reinhold Niebuhr around 1934, and commonly used by Alcoholics Anonymous and similar organisations. It is not just a key step toward recovery from addiction; it is a recipe for a happy life, meaning a life of serenity arrived at by consciously taking what life throws at us with equanimity.

The sentiment behind the prayer is very old, found in 8th-century Buddhist manuscripts, as well as in 11th-century Jewish philosophy. The oldest version I can think of, however, goes back to the Stoic philosopher Epictetus. Active in the 2nd century in Rome and then Nicopolis, in western Greece, Epictetus argued that:

We are responsible for some things, while there are others for which we cannot be held responsible. The former include our judgment, our impulse, our desire, aversion and our mental faculties in general; the latter include the body, material possessions, our reputation, status – in a word, anything not in our power to control. … [I]f you have the right idea about what really belongs to you and what does not, you will never be subject to force or hindrance, you will never blame or criticise anyone, and everything you do will be done willingly. You won’t have a single rival, no one to hurt you, because you will be proof against harm of any kind.

I call this Epictetus’ promise: if you truly understand the difference between what is and what is not under your control, and act accordingly, you will become psychologically invincible, impervious to the ups and downs of fortune.

Of course, this is far easier said than done. It requires a lot of mindful practice. But I can assure you from personal experience that it works. For instance, last year I was in Rome, working, as it happened, on a book on Stoicism. One late afternoon I headed to the subway stop near the Colosseum. As soon as I entered the crowded subway car, I felt an unusually strong resistance to moving forward. A young fellow right in front of me was blocking my way, and I couldn’t understand why. Then the realisation hit, a second too late. While my attention was focused on him, his confederate had slipped his hand in my left front pocket, seized my wallet, and was now stepping outside of the car, immediately followed by his accomplice. The doors closed, the train moved on, and I found myself with no cash, no driver’s licence, and a couple of credit cards to cancel and replace.

Before I started practising Stoicism, this would have been a pretty bad experience, and I would not have reacted well. I would have been upset, irritated and angry. This foul mood would have spilled over the rest of the evening. Moreover, the shock of the episode, as relatively mild as the attack had been, would have probably lasted for days, with a destructive alternation of anger and regret.

But I had been practising Stoicism for a couple of years. So my first thought was of Epictetus’ promise. I couldn’t control the thieves in Rome, and I couldn’t go back and change what had happened. I could, however, accept what had happened and file it away for future reference, focusing instead on having a nice time during the rest of my stay. After all, nothing tragic had happened. I thought about this. And it worked. I joined my evening company, related what happened, and proceeded to enjoy the movie, the dinner, and the conversation. My brother was amazed that I took things with such equanimity and that I was so calm about it. But that’s precisely the power of internalising the Stoic dichotomy of control.

And its efficacy is not limited to minor life inconveniences, as in the episode just described. James Stockdale, a fighter-jet pilot during the Vietnam War, was shot down and spent seven and a half years in Hoa Lo prison, where he was tortured and often put in isolation. He credits Epictetus for surviving the ordeal by immediately applying the dichotomy of control to his extreme situation as a captive, which not only saved his life, but also allowed him to coordinate the resistance from inside the prison, in his position as senior ranking officer.

Most of us don’t find ourselves in Stockdale’s predicament, but once you begin paying attention, the dichotomy of control has countless applications to everyday life, and all of them have to do with one crucial move: shifting your goals from external outcomes to internal achievements.

For example, let’s say that you are preparing your résumé for a possible job promotion. If your goal is to get the promotion, you are setting yourself up for a possible disappointment. There is no guarantee that you will get it, because the outcome is not (entirely) under your control. Sure, you can influence it, but it also depends on a number of variables that are independent of your efforts, including possible competition from other employees, or perhaps the fact that your boss, for whatever unfathomable reason, really doesn’t like you.

That’s why your goal should be internal: if you adopt the Stoic way, you would conscientiously put together the best résumé that you can, and then mentally prepare to accept whatever outcome with equanimity, knowing that sometimes the universe will favour you, and other times it will not. What do you gain by being anxious over something you don’t control? Or angry at a result that was not your doing? You are simply adding a self-inflicted injury to the situation, compromising your happiness and serenity.

This is no counsel for passive acceptance of whatever happens. After all, I just said that your goal should be to put together the best résumé possible! But it is the mark of a wise person to realise that things don’t always go the way we wish. If they don’t, the best counsel is to pick up the pieces, and move on.

Do you want to win that tennis match? It is outside of your control. But to play the best game you can is under your control. Do you want your partner to love you? It is outside of your control. But there are plenty of ways you can choose to show your love to your partner – and that is under your control. Do you want a particular political party to win the election? It is outside of your control (unless you’re Vladimir Putin!). But you can choose to engage in political activism, and you can vote. These aspects of your life are under your control. If you succeed in shifting your goals internally, you will never blame or criticise anyone, and you won’t have a single rival, because what other people do is largely beyond your control and therefore not something to get worked up about. The result will be an attitude of equanimity toward life’s ups and downs, leading to a more serene life.

Massimo Pigliucci is professor of philosophy at City College and at the Graduate Center of the City University of New York. His latest book is How to Be a Stoic: Ancient Wisdom for Modern Living (May, 2017). He lives in New York.


#