Conventional anti-corruption tools often fail to address root causes – but loss of US leadership could still spell trouble for efforts abroad

Source: The Conversation – USA (3) – By Diana Chigas, Professor of the Practice in International Negotiation and Conflict Resolution, Tufts University

President Donald Trump signs a series of executive orders on Feb. 10, 2025, including an order relating to the Foreign Corrupt Practices Act. Andrew Harnik/Getty Images

For nearly half a century, the Foreign Corrupt Practices Act has made it illegal for U.S. citizens and companies to bribe foreign officials. Since 1998, that has been the case for foreign companies listed on U.S. stock exchanges or acting in the U.S., too.

Under the Trump administration, however, expectations are changing. In February 2025, an executive order froze new investigations for 180 days, arguing that the act has been “stretched beyond proper bounds” and “harms American economic competitiveness.” The president ordered a review of enforcement guidelines to ensure they advance U.S. interests and competitiveness.

The Department of Justice’s revised guidelines, issued in June 2025, prioritize cases that are tied to cartels and other transnational criminal organizations, harm U.S. companies or their “fair access to compete,” or involve “infrastructure or assets” important for national security.

It is still unclear what impact the new guidelines will have on anti-corruption prosecutions globally, but their effect on the actual level of corruption will likely be small. Legal rules and sanctions designed to deter, find and punish “bad apples” have had limited success in many parts of the world. Yet the United States’ retreat from leadership could set back momentum for addressing the root causes of corruption.

New anti-corruption norm, but limited change

In 1977, when the Foreign Corrupt Practices Act was signed into law, the U.S. was alone in criminalizing bribery of foreign officials. Since then, and especially since the end of the Cold War, there’s been a paradigm shift.

Today, a global infrastructure of treaties and institutions obligates countries to criminalize corruption, adopt measures to prevent it and cooperate to recover stolen assets. All but a few members of the United Nations have adopted the U.N. Convention Against Corruption. Substantial amounts of international aid have also been allocated to strengthen anti-corruption efforts. In 2021 alone, the Development Assistance Committee of the Organisation for Economic Co-operation and Development invested over US$7.5 billion for reforms related to fighting corruption, from anti-corruption courts to public financial management.

Yet global trends in corruption, widely defined as the “abuse of entrusted power for personal gain,” are not improving. On the 2024 Corruption Perceptions Index, the most widely used global ranking of public sector corruption, two-thirds of countries scored below 50 on a scale where 0 is “very corrupt” and 100 is “very clean.” And while 32 countries had reduced corruption since 2012, 148 had either stayed the same or gotten worse.

Corruption, it turns out, can be stubbornly resistant to “best practices.”

A sign at a Congolese hospital reminds patients that payments directly to staff are not allowed.
BSIP/Universal Images Group Via Getty Images

A few examples illustrate this “whack-a-mole” dynamic. Medical personnel in Ugandan hospitals began to solicit “gifts” and “appreciation” after the government imposed greater oversight and penalties for bribery. A study of World Bank efforts in over 100 developing countries to clean up procurement corruption found that gains in one area were canceled out when government buyers started to use procedures not subject to the new rules. In my own research, my co-authors and I found that civil servants developed innovative ways to avoid enforcing a law requiring public employees convicted of corruption to be fired.

More than ‘bad apples’

I have spent the past 10 years trying to understand this paradox. One key factor we (and many others) found is that most conventional anti-corruption tools are addressing the wrong problem.

The Foreign Corrupt Practices Act and similar measures focus on preventing, detecting and punishing individual acts of corruption. Rules requiring reporting and asset declarations, monitoring and oversight, and criminal penalties for corruption belong to this category. These tools try to limit the power people have over decisions and resources and increase accountability and transparency.

This approach works where corrupt acts are sporadic, opportunistic deviations from the norm by “bad apples” acting to enrich themselves. It also assumes that rule of law and robust institutions exist.

Filipinos protest on Sept. 21, 2025, in Manila after corruption was uncovered in flood control projects that have embroiled officials, engineers, contractors and politicians.
Ezra Acayan/Getty Images

This is not the case in much of the world – especially in fragile and conflict-affected states where corruption is endemic. By “endemic,” I mean not just that corruption is widespread, but that it is embedded in politics and the economy – a “team effort” within broad networks, with informal rules of the game. As an Afghan official reportedly told U.S. Embassy officials in 2010, endemic corruption “is not just a problem for the system of governance … it is the system of governance.”

What makes the conventional anti-corruption tool kit so ineffective in contexts of endemic corruption?

#1. It does not pay to follow the rules. Without trusted leaders and institutions to implement the law, it is difficult for people to behave honestly, as they don’t trust that others will do the same. Corruption, in this sense, is a “collective action” problem. If corruption is the norm, not the exception, the short-term costs of sticking to the rules are too high.

#2. Corruption serves a useful function – even when it undermines the public good. Even when people believe it’s wrong, corruption can solve problems that seem unsolvable in their current system. For example, health workers in Nigeria often ask for bribes because their salaries are low and clinics lack needed supplies. The money helps them fulfill family obligations and make clinics work. Similarly, politicians often practice patronage because it helps them redistribute wealth to retain supporters and stabilize conflict. Unless dysfunction is addressed, incentives to bypass the rules remain.

#3. Informal institutions prevail over formal rules. When a government cannot be relied upon to provide security, services or livelihoods, people rely on their personal networks to survive. As a judge in the Central African Republic told our research team, “If someone [within your social network] asks for a service, you are required to do it, even if it goes against your own ethics. To refuse is to put oneself in opposition [to one’s clan] and this can be dangerous.”

Loss of leadership

This does not mean that conventional anti-corruption approaches are completely ineffective or irrelevant.

But they aren’t enough on their own. They work best hand in hand with interventions that address motivating factors – from low pay to a lack of livelihoods not dependent on corruption, to social norms that motivate people to seek bribes or make them hesitate to enforce the rules.

Over the past few years, momentum has built to develop these new approaches – though it is still early to assess their effectiveness. Some focus on fixing government dysfunction. Others help unite people and groups trying to resist corruption. Some projects support “horizontal” monitoring by peer firms or communities, instead of government regulation, or try to “nudge” behaviors or change social norms.

The limitations of existing anti-corruption approaches suggest that more limited enforcement of the Foreign Corrupt Practices Act is not likely, by itself, to worsen global corruption. But the loss of U.S. leadership may.

The U.S. role in anti-corruption progress cannot be overstated – as a leader in “policing” foreign corruption, a model for other countries’ laws and institutions, and a leading donor. It is still unclear whether others – such as the U.K., the most likely and dedicated candidate – can fill the gap.

Equally concerning, in my view, is the danger that the U.S. turn to a more self-serving view of anti-corruption efforts may encourage a corrupt use of anti-corruption enforcement. Many authoritarian governments have weaponized anti-corruption laws to target political opponents through selective prosecutions.

If the Foreign Corrupt Practices Act is used this way, this could not only undermine the legitimacy of global anti-corruption norms but exacerbate conflict and fuel democratic backsliding at home and abroad.

The Conversation

Diana Chigas receives funding for her research from The MacArthur Foundation, Transparency International Canada and the Wellspring Philanthropic Fund through Besa Global, Inc., a social enterprise in Canada dedicated to improving anti-corruption effectiveness in fragile and conflict-affected contexts.

ref. Conventional anti-corruption tools often fail to address root causes – but loss of US leadership could still spell trouble for efforts abroad – https://theconversation.com/conventional-anti-corruption-tools-often-fail-to-address-root-causes-but-loss-of-us-leadership-could-still-spell-trouble-for-efforts-abroad-263894

Violent acts in houses of worship are rare but deadly – here’s what the data shows

Source: The Conversation – USA (3) – By James Densley, Professor of Criminal Justice, Metropolitan State University

A church program lies on the ground near the family reunification area after the shooting in Grand Blanc, Mich., on Sept. 28, 2025. Jeff Kowalsky/AFP via Getty Images

On Sept. 28, 2025, at least four people were killed and eight others injured during a Sunday service at a Church of Jesus Christ of Latter-day Saints chapel in Grand Blanc, Michigan. Just a month earlier, two people died and 21 were injured during a Mass for students at the Catholic Church of the Annunciation in Minneapolis.

These tragedies may feel sudden and senseless, but they are part of a longer pattern that we have been tracking.

We are criminologists who have studied violence for decades. In 2023, we created a public database of homicides that occur in houses of worship across the United States. It now spans nearly 25 years of incidents, documenting how often these attacks happen, who perpetrates them, what weapons are used, when and where they occur, and how deadly they are.

What the numbers show

From 2000 to 2024, the dataset records 379 incidents and 487 deaths at religious congregations and religious community centers. Most involved a single victim, but some – like the recent shootings in Michigan and Minnesota – killed or injured many people.

About 7 in 10 incidents involved firearms, accounting for three-quarters of the deaths. Firearm cases averaged about 1.4 deaths each, compared with 1.1 for nonfirearm cases.

Handguns were the most common weapon, linked to more than 100 incidents and 147 deaths. But semiautomatic rifles, though used in only seven cases, killed 46 people — more than six per attack, on average.

The deadliest year was 2017, when 47 people were killed at places of worship, 42 of them with firearms. Twenty-six of those people were killed in a single catastrophic shooting at First Baptist Church of Sutherland Springs, Texas.

‘Mass shootings’

Mass shootings are often defined as attacks that kill four or more people. Using that threshold, the data shows 10 incidents since 2000 at houses of worship. Lower the bar to three killed, and there are 14; at two killed, 40.

Definitions shape perception. Most people associate mass shootings with high-profile tragedies like the massacres at Charleston’s Emanuel African Methodist Episcopal Church in 2015 or Pittsburgh’s Tree of Life Synagogue in 2018. But many other attacks, like the tragedy at Annunciation in Minneapolis, involve two or three deaths. Each represents a profound loss for a community.

Mourners attend a vigil at Holy Redeemer Church in Burton, Mich., on Sept. 28, 2025, following a shooting at a nearby chapel of The Church of Jesus Christ of Latter-day Saints.
AP Photo/Jose Juarez

In the cases where four or more people were killed, every perpetrator was a man in his 20s to 40s, with an average age of 32. Compared with perpetrators of other homicides at worship sites, these shooters were far more likely to have a history of mental health problems: 60% vs. 18%. They were also far more likely to have been thinking about or planning suicide – 70% vs. 17% – and to die by suicide during or after the attack: 60% vs. 10%.

There were other similarities, too. Among attackers who killed four or more people, 20% had served in the military, and 60% had a criminal background. Among attackers who killed fewer people, those numbers were 4% and 43%. Deadlier shooters more often leaked their plans or showed signs of being in crisis beforehand.

When and where

Violence is most likely to strike on Sundays – a quarter of all cases – followed by Saturdays. That reflects worship patterns: Sundays are the busiest day for most Christian denominations, while Saturdays are common for Jewish services.

Incidents cluster around mornings and nights, with mornings most common — the prime window for weekly services. And despite headlines about shootings inside sanctuaries, 71% of homicides occurred outside – in parking lots, courtyards or on steps – when people were gathering or leaving.

In two-thirds of cases, it was unclear whether the perpetrator had a connection with the congregation. Most of the other cases, though, involved attackers with clear ties, including members, relatives, pastors and employees. In dozens of cases, domestic disputes spilled into worship settings. Because services are routine, predictable gatherings, they can become flash points for private conflicts that turn deadly.

Attacks happened across the nation, but were concentrated in the South. The region tends to have more frequent attendance at religious services and looser firearm laws – a combination that helps explain the South’s overrepresentation, though no region is untouched.

Which faiths are affected

Ninety-seven percent of deadly incidents occurred at Christian churches, reflecting how many there are in the United States.

But, adjusting for the number of congregations, the data underscores other faiths’ vulnerability to targeted violence. Jewish and Muslim houses of worship, community centers and cemeteries, for example, contend with frequent threats and vandalism.

Only one incident at a gurdwara – a Sikh temple – appears in the dataset. Because there are so few in the U.S., though, that single case translates into the highest rate for any faith tradition, once the total number of congregations is taken into account. Stabbings or shootings also occurred at six Jewish synagogues and community centers, further suggesting disproportionate risk.

Two incidents involved mosques. Yet that contrasts with data showing high levels of Islamophobia in the U.S., suggesting that most violence against Muslims may occur in other settings.

People attend a vigil on Aug. 5, 2013, to mark the one-year anniversary of a shooting at a Sikh temple in Oak Creek, Wis.
Scott Olson/Getty Images

Why this research matters

Homicides in houses of worship remain rare, but when they occur, firearms make them deadlier. Victims have included pastors, rabbis, imams, monks, congregants, staff and children.

Numbers cannot capture the grief of families in Grand Blanc or Minneapolis, or the trauma that survivors carry. But they can reveal patterns that ground conversations about safety and prevention.

Houses of worship are meant to be open spaces of peace and refuge. The challenge is balancing this higher purpose with practical security. By studying these past tragedies, Americans may better prepare for the future – and prevent more families from enduring the heartbreak of recent weeks.

The Conversation

James Densley has received funding from the National Institute of Justice, Joyce Foundation, and Sandy Hook Promise Foundation.

Jillian Peterson has received funding from the National Institute of Justice, Joyce Foundation, and Sandy Hook Promise Foundation.

ref. Violent acts in houses of worship are rare but deadly – here’s what the data shows – https://theconversation.com/violent-acts-in-houses-of-worship-are-rare-but-deadly-heres-what-the-data-shows-266328

Even a government shutdown that ends quickly would hamper morale, raise costs and reduce long-term efficiency in the federal workforce

Source: The Conversation – USA (2) – By Gonzalo Maturana, Associate Professor of Finance, Emory University

Congress failed to reach a deal in time, leaving the federal government shut down. AP Photo/Mariam Zuhaib

The U.S. government shutdown couldn’t come at a worse time for federal workers.

With the government shut down, hundreds of thousands of federal employees are being furloughed – sent home without pay until funding resumes. And ahead of the shutdown, President Donald Trump suggested that a prolonged lapse in funding could open the door to “irreversible” changes, such as reducing parts of the federal workforce.

The shutdown marks another difficult moment this year for a federal workforce that has so far shed more than 300,000 jobs. This is largely due to ongoing Trump administration efforts to downsize parts of the federal government and restructure or largely eliminate certain government agencies with the stated aim of increasing efficiency.

As a team of financial economists who study labor markets and public sector employment, we have examined millions of federal personnel records spanning past government shutdowns. We have found that the consequences reach far beyond the now-familiar images of closed national parks and stalled federal services. Indeed, based on our study of the October 2013 shutdown, during which about 800,000 federal employees were furloughed for 16 days, shutdowns have an enduring negative effect on the federal workforce, reshaping its composition and weakening its performance for years to come.

What happens to workers

Millions of Americans interact with the federal government every day in ways both big and small. More than one-third of U.S. national spending is routed through government programs, including Medicare and Social Security. Federal workers manage national parks, draft environmental regulations and help keep air travel safe.

Whatever one’s political leanings, if the goal is a government that handles these responsibilities effectively, then attracting and retaining a talented workforce is essential.

Yet the ability of the federal government to do so may be increasingly difficult, in part because prolonged shutdowns can have hidden effects.

When Congress fails to pass appropriations, federal agencies must furlough employees whose jobs are not deemed “excepted” – commonly referred to as essential. Those excepted employees keep working, while others are barred from working or even volunteering until funding resumes. Furlough status reflects funding sources and mission categories, not an individual’s performance, so it confers no signal about an employee’s future prospects and primarily acts as a shock to morale.

Importantly, furloughs do not create long-term wealth losses; back pay has always been granted and, since 2019, is legally guaranteed. Employees therefore recover their pay even though they may face real financial strain in the short run.

A cynical observer might call furloughs a paid vacation, yet the data tells a different story.

National Parks are among the federal services that typically close during a shutdown, as happened in 2013.
AP Photo/Susan Walsh

Immediate consequences, longer-term effects

Using extensive administrative records on federal civilian workers from the October 2013 shutdown, we tracked how this shock to morale rippled through government operations. Employees exposed to furloughs were 31% more likely to leave their jobs within one year.

These departures were not quickly replaced, forcing agencies to rely on costly temporary workers and leading to measurable declines in core functions such as payment accuracy, legal enforcement and patenting activity.

Further, we found that this exodus builds over the first two years after the shutdown and then settles into a permanently lower headcount, implying a durable loss of human capital. The shock to morale is more pronounced among young, female and highly educated professionals with plenty of outside options. Indeed, our analysis of survey data from a later 2018-2019 shutdown confirms that morale, not income loss, drives the exits.

Employees who felt most affected reported a sharp drop in agency, control and recognition, and they were far more likely to plan a departure.

The effect of the motivation loss is striking. Using a simple economic model where workers can be expected to value both cash and purpose, we estimate that the drop in intrinsic motivation after a shutdown would require a roughly 10% wage raise to offset.
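To make that compensating-wage logic concrete, here is a minimal sketch; it assumes a worker’s job value is simply the sum of pay and the dollar value of purpose, which is a simplification for illustration rather than the exact model behind our estimate:

\[
V = w + m, \qquad (w + \Delta w) + (m - \Delta m) = w + m \;\Longrightarrow\; \Delta w = \Delta m,
\]

where \(w\) is the wage and \(m\) is the monetary equivalent of intrinsic motivation (purpose, recognition, sense of mission). If restoring a worker’s pre-shutdown job value requires \(\Delta w \approx 0.10\,w\), then the lost sense of purpose was worth roughly a tenth of pay.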

Policy implications

Some people have argued that this outflow of employees amounts to a necessary trimming, a way to shrink government by “starving the beast.”

But the evidence paints a different picture. Agencies hit hardest by furloughs turned to temporary staffing firms to fill the gaps. Over the two years after the shutdown we analyzed, these agencies spent about US$1 billion more on contractors than they saved in payroll.

The costs go beyond replacement spending, as government performance also suffers. Agencies that were more affected by the shutdown recorded higher rates of inaccurate federal payments for several years. Even after partial recovery, losses amounted to hundreds of millions of dollars that taxpayers never recouped.

Other skill-intensive functions declined as well. Legal enforcement fell in agencies that became short of experienced attorneys, and patenting activity dropped in science and engineering agencies after key inventors left.

Official estimates of shutdown costs typically focus on near-term GDP effects and back pay. But our findings show that an even bigger bill comes later in the form of higher employee turnover, higher labor costs to fill gaps, and measurable losses in productivity.

Shutdowns are blunt, recurring shocks that demoralize the public workforce and erode performance. These costs spill over to everyone who relies on government services. If the public wants efficient, accountable public institutions, then we should all care about avoiding shutdowns.

After an already turbulent year, it is unclear whether the shutdown will significantly add to the strain on federal employees or have a more limited effect, since many who were considering leaving have already left through buyouts or forced terminations this year. What is clear is that hundreds of thousands of federal employees will experience another period of uncertainty.

This story was updated on Oct. 1, 2025, to include details of the shutdown.

The Conversation

The authors do not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.

ref. Even a government shutdown that ends quickly would hamper morale, raise costs and reduce long-term efficiency in the federal workforce – https://theconversation.com/even-a-government-shutdown-that-ends-quickly-would-hamper-morale-raise-costs-and-reduce-long-term-efficiency-in-the-federal-workforce-265723

‘Whisper networks’ don’t work as well online as off − here’s why women are better able to look out for each other in person

Source: The Conversation – USA – By Carrie Ann Johnson, Assistant Teaching Professor of Women’s and Gender Studies, Iowa State University

Would you trust sensitive information from someone you know more than from an anonymous online poster? kali9/E+ via Getty Images

Whisper networks are informal channels that women use to warn each other about sexual harassment, abuse or assault. The reason they work isn’t because they are secret – they work because they are contextual.

These informal, protective warnings, shared in schools, churches, workplaces and other organizations, can be trusted because those giving and receiving the information share a common language and cultural understanding.

Since 2017, when the MeToo hashtag went viral, people have been trying to create online whisper networks. Projects like the Shitty Media Men list and the Facebook group Are We Dating the Same Guy are both attempts to build a larger warning system. The Tea app sells itself as a digital platform that gives women tools to protect themselves and others when dating men.

However, important components of whisper networks get lost when they are moved to anonymous nationwide warning systems.

I’m an organizational communications and gender scholar. My research focuses on whisper networks and how people use them to keep safe in organizations. When this idea moves to a digital platform, the information may still be useful, but it is harder for participants to gauge how reliable it is.

What makes whisper networks work

Whisper networks form when there is an environment of shared risk. Their purpose is protective. In other words, the people in whisper networks do not share information for punitive reasons, but to protect each other and to make sense of their experiences. There is an unspoken expectation that the receiver of information will also share only with people they trust.

In my study of whisper networks, participants talked about how they could assess the information they receive and give through personal interactions in their offices, congregations, schools and other organizations. They felt like they could measure the trustworthiness of the person sharing information and the trustworthiness of the people with whom they share information.

They also talked about cues, including noticing how the person sharing information treats other people in the office and how they talk about other people when they aren’t around. This important component of whisper networks is difficult to translate to an app, even when the app claims to verify people as members.

Women use coded messages and actions in whisper networks to figure out who is safe in any given room, who needs whisper network information, and whether the information they receive is worth listening to.

These protective messages tend to be shared either one on one or in small groups. Women know the information is reliable because of how it is shared and who shares it. For example, someone might say, “He is a little creepy toward the undergraduate women.” The person receiving the message uses the surrounding context codes to understand the seriousness of the situation.

Another person might say, “He makes people feel special and then uses information to be unprofessional.” Instead of a warning about actions, it is a warning about the process the harasser uses and what to watch out for. The person sharing the information wants the person they are protecting to understand the ways this perpetrator tends to move in relation to the people they work with.

None of the language is specific, and it is largely coded so that the listener understands, and the person sharing doesn’t need to worry about the repercussions of sharing it.

Interview participants told me that they usually share information in quiet conversations where they already trust the person they are talking to.

When I asked participants about how they knew they were receiving whisper network information, they talked about how the person would lean in, drop to a different tone or volume, or how the vibe would change. It’s difficult to get any of those clues through an online platform.

The risk of faulty information and misinterpretation goes up when information is shared on anonymous platforms or shared widely. When more specific stories are shared, it’s almost always in a trusted, private setting, not shared widely on an anonymous forum. When protective communication is broad, the network loses the very qualities that made it feel safe in the first place. Few of the nonverbal and social reputation signals exist on an app, and that makes the communication feel less trustworthy.

Anonymous platforms can also create potentially volatile situations. Because people can post anonymously, there is more room for sloppiness, exaggeration and even defamation. These apps build on the myth that there is an individual solution and quick fix for sexual harassment and assault instead of acknowledging the underlying structural and cultural issues.

In addition to being less effective than offline networks, whisper network apps and websites are vulnerable to hackers.

A different safety concern

Online platforms offer users a limited understanding of how their data is used and stored, so the user’s safety takes second place to the platform owners’ and investors’ financial incentives. These apps have largely been created by people who carry less risk and who are concerned with monetization, even if they also care about safety. The risks disproportionately affect those whose safety is already at risk.

In addition to the issues of effectiveness and trust is the question of safety. The Tea app has been in the news because of two separate data breaches, including over 70,000 images that were leaked to online message boards. Data included government-issued IDs, personal information and private messages. A separate breach exposed direct messages on the app.

So while it’s conceivable that some online lists could be created for specific communities that share a common culture and language, no matter how good the intent is, it is unlikely that the creators of apps and websites are at the same risk of exposure as the people who use them. In addition, apps built for specific communities or communication styles would probably be significantly less profitable than those that are promoted nationally or worldwide and so are less likely to be built or sustained.

All of this isn’t to say that apps aren’t useful and necessary. But based on my research, I don’t believe they provide the same safety and protection as in-person, organizational whisper networks.

The Conversation

Carrie Ann Johnson does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. ‘Whisper networks’ don’t work as well online as off − here’s why women are better able to look out for each other in person – https://theconversation.com/whisper-networks-dont-work-as-well-online-as-off-heres-why-women-are-better-able-to-look-out-for-each-other-in-person-265182

‘Warrior ethos’ mistakes military might for true security – and ignores the wisdom of Eisenhower

Source: The Conversation – USA – By Monica Duffy Toft, Professor of International Politics and Director of the Center for Strategic Studies, The Fletcher School, Tufts University

Hundreds of generals and admirals will converge on Quantico, Virginia, on Sept. 30, 2025, after being summoned from across the globe by their boss, Pentagon chief Pete Hegseth. While Hegseth has not formally announced the purpose of the meeting, The New York Times reports that it will cover “aspects of what he calls a shift toward a ‘warrior ethos’ at the Pentagon.”

The meeting comes soon after President Donald Trump’s Sept. 5 executive order renaming the Department of Defense the “Department of War.” With that change, Trump reverted the department to a name not used since the 1940s.

The change represents far more than rebranding – it signals an escalation in the administration’s embrace of a militaristic mindset that, as long ago as 1961, President Dwight D. Eisenhower warned against in his farewell address, and that the nation’s founders deliberately aimed to constrain.

The timing of this name change feels particularly notable when considered alongside recent reporting revealing secret U.S. military operations. In 2019, a detachment of U.S. Navy SEALs crept ashore in North Korea on a mission to plant a listening device during high-stakes nuclear talks. The risks were enormous: Discovery could have sparked a hostage crisis or even war with a nuclear-armed foe.

That Trump approved such an operation at all in his first term exemplifies an increasingly reckless militarism that has defined American foreign policy for decades. That militarism is the very subject of my book, “Dying by the Sword.”

Further, the name change was announced just days after Trump authorized a U.S. military strike on a Venezuelan boat that the administration claimed was carrying drug-laden cargo and linked to the Tren de Aragua cartel. The strike killed 11 people. The administration justified the killings by labeling those killed “narcoterrorists.”

The U.S. has beefed up military exercises in Puerto Rico during a campaign in the Southern Caribbean against boats suspected of transporting illegal drugs.
Miguel J. Rodríguez Carrillo/Getty Images

Abandoning restraint – deliberately

The Department of War existed from 1789 until 1947, when Congress passed the National Security Act reorganizing the armed services into the National Military Establishment. Just two years later, lawmakers amended the act, renaming the institution the Department of Defense.

Officials disliked the “NME” acronym – which sounded uncomfortably like “enemy” – but the change was not only about appearances.

In the aftermath of World War II, U.S. leaders wanted to emphasize a defensive rather than aggressive military posture as they entered the Cold War, a decades-long standoff between the United States and Soviet Union defined by a nuclear arms race, ideological rivalry and proxy wars short of direct great-power conflict.

The new emphasis also dovetailed with the new U.S. grand strategy in foreign affairs – diplomat George F. Kennan’s containment strategy, which aimed to prevent the expansion of Soviet power and communist ideology around the world.

Kennan’s approach narrowly survived a push for a more aggressive “rollback” strategy aimed at forcing the Soviet Union out of the central and eastern European countries it occupied and oppressed. Containment instead evolved into a long game: a team effort to keep the adversary from expanding to enslave other peoples, leading to the adversary’s collapse and disintegration without risking World War III.

On the ground, this meant fewer preparations for war and more emphasis on allies, intelligence, foreign aid and trade, along with the projection of defensive strength. The hope was that shaping the environment rather than launching attacks would cause Moscow’s influence to wither. To make this strategy viable, the U.S. military itself had to be reorganized.

In a 1949 address before Congress, President Harry S. Truman described the reorganization sparked by the 1947 legislation as a “unification” of the armed forces that would bring efficiency and coordination.

But a deeper purpose was philosophical: to project America’s military power as defensive and protective and, for Truman, to strengthen civilian oversight.

The wisdom of this restraint is clearest in Eisenhower’s farewell address of January 1961.

In less than 10 minutes, the former five-star general who had commanded Allied forces to victory in World War II cautioned Americans against the rise of a “military-industrial complex.” He acknowledged that the nation’s “arms must be mighty, ready for instant action,” but warned that “the potential for the disastrous rise of misplaced power exists and will persist.”

Creating new enemies, destabilizing regions

The risky Navy SEAL mission in North Korea illustrates how America’s militaristic approach often produces the very dangers it aspires to deter.

Rather than enhancing diplomacy, the operation risked derailing talks and escalating conflict. This is the central argument of my book: America’s now-reflexive reliance on armed force doesn’t make America great again or more secure. It makes the country less secure, by creating new enemies, destabilizing regions and diverting resources from the true foundations of security.

It also makes the U.S. less admired and respected. The State Department budget continues to be dwarfed by the Department of War’s budget, with the former never reaching more than 5.5% of the latter. And the U.S. Agency for International Development, or USAID, once the leading arm of U.S. soft power as a quiet purveyor of development aid around the world, is now shuttered.

Today’s Pentagon budget exceeds anything Eisenhower could have imagined.

President Dwight D. Eisenhower’s farewell speech, delivered on Jan. 17, 1961, in which he warned against the establishment of a “military-industrial complex.”

Trump’s rebranding of the Department of Defense into the Department of War signals a shift toward framing U.S. power primarily in terms of military force. Such a framing emphasizes the use of violence as the principal means of solving problems and equates hostility and aggression with leadership.

Yet historical experience shows that military dominance alone has not translated into strategic success. That’s the mindset that lost the U.S. endless wars in Afghanistan and Iraq, and failed in interventions in Libya and Syria – conflicts that altogether cost trillions of dollars and hundreds of thousands of lives while leaving the country less secure and eroding its international legitimacy.

“Only an alert and knowledgeable citizenry,” Eisenhower said, can compel the proper balance between military power and peaceful goals.

The very title of the book I wrote with my co-author comes from the Gospel of Matthew – Chapter 26, verse 52 – which warns that “to live by the sword is to die by the sword.” Throughout modern history, true security has come from diplomacy, international law, economic development and investments in health care and education. Not from an imaginary “warrior ethos.”

America, I would argue, doesn’t need a Department of War. It needs leaders who understand, as Eisenhower did, that living by the sword will doom us all in the end. Real security comes from the quiet power that builds legitimacy and lasting peace. The U.S. can choose again to embody those strengths, to lead not by fear but by example.

The Conversation

Monica Duffy Toft does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. ‘Warrior ethos’ mistakes military might for true security – and ignores the wisdom of Eisenhower – https://theconversation.com/warrior-ethos-mistakes-military-might-for-true-security-and-ignores-the-wisdom-of-eisenhower-266213

Censorship campaigns can have a way of backfiring – look no further than the fate of America’s most prolific censor

Source: The Conversation – USA (2) – By Amy Werbel, Professor of the History of Art, Fashion Institute of Technology (FIT)

The vast majority of Americans support the right to free speech. Jacek Boczarski/Anadolu via Getty Images

In the first year of President Donald Trump’s second term in office, his administration has made many attempts to suppress speech it disfavors: at universities, on the airwaves, in public school classrooms, in museums, at protests and even in lawyers’ offices.

If past is prologue, these efforts may backfire.

In 2018, I published my book “Lust on Trial: Censorship and the Rise of American Obscenity in the Age of Anthony Comstock.”

A devout evangelical Christian, Comstock hoped to use the powers of the government to impose moral standards on American expression in the late-19th and early-20th centuries. To that end, he and like-minded donors established the New York Society for the Suppression of Vice, which successfully lobbied for the creation of the first federal anti-obscenity laws with enforcement provisions.

Later appointed inspector for the Post Office Department, Comstock fought to abolish whatever he deemed blasphemous and sinful: birth control, abortion aids and information about sexual health, along with certain art, books and newspapers. Federal and state laws gave him the power to order law enforcement to seize these materials and have prosecutors bring criminal indictments.

I analyzed thousands of these censorship cases to assess their legal and cultural outcomes.

I found that, over time, Comstock’s censorship regime did lead to a rise in self-censorship, confiscations and prosecutions. However, it also inspired greater support for free speech and due process.

More popular – and more profitable

One effect of Comstock’s censorship campaigns: The materials and speech he disfavored often made headlines, putting them on the public’s radar as a kind of “forbidden fruit.”

For example, prosecutions targeting artwork featuring nude subjects led to both sensational media coverage and a boom in the popularity of nudes on everything from soap advertisements and cigar boxes to photographs and sculptures.

Anthony Comstock.
Bettmann/Getty Images

Meanwhile, entrepreneurs of racy forms of entertainment – promoters of belly dancing, publishers of erotic postcards and producers of “living pictures,” which were exhibitions of seminude actors posing as classical statuary – all benefited from Comstock’s complaints. If Comstock wanted it shut down, the public often assumed that it was fun and trendy.

In 1891, Comstock became irate when a young female author proposed paying him to attack her book and “seize a few copies” to “get the newspapers to notice it.” And in October 1906, Comstock threatened to shut down an exhibition of models performing athletic exercises wearing form-fitting union suits. Twenty thousand people showed up to Madison Square Garden for the exhibition – far more than the venue could hold at the time.

The Trump administration’s recent efforts to get comedian Jimmy Kimmel off the air have similarly backfired.

Kimmel had generated controversy for comments he made on his late-night talk show in the wake of conservative activist Charlie Kirk’s assassination. ABC, which is owned by The Walt Disney Co., initially acquiesced to pressure from Federal Communications Commission Chairman Brendan Carr and announced the show’s “indefinite” suspension. But many viewers, angered over the company’s capitulation, canceled their subscriptions to Disney streaming services. This led to a 3.3% drop in Disney’s share price, which spurred legal actions by shareholders of the publicly traded company.

ABC soon lifted the suspension. Kimmel returned, drawing 6.26 million live viewers – more than four times his normal audience – while over 26 million viewers watched Kimmel’s return monologue on social media. Since then, all network affiliates have resumed airing “Jimmy Kimmel Live!”

‘Comstockery’ and hypocrisy

In the U.S., disfavored political speech and obscenity are different in important ways. The Supreme Court has held that the First Amendment provides broad protections for political expression, whereas speech deemed to be obscene is illegal.

Despite this fundamental difference, social and cultural forces can make it difficult to clearly distinguish protected from unprotected speech.

In Comstock’s case, the public was happy to see truly explicit pornography removed from circulation. But their own definition of what was “obscene” – and, therefore, criminally liable – was much narrower.

In 1905, Comstock attempted to shut down a theatrical performance of George Bernard Shaw’s “Mrs. Warren’s Profession” because the plot included prostitution. The aging censor was widely ridiculed and became a “laughing stock,” according to The New York Times. Shaw went on to coin the term “Comstockery,” which caught on as a shorthand for overreaching censoriousness.

Cartoonists at the turn of the 20th century had a field day with Anthony Comstock’s overreaches.
Amazon

In a similar manner, when Attorney General Pam Bondi recently threatened Americans that the Department of Justice “will absolutely … go after you, if you are targeting anyone with hate speech,” swift backlash ensued.

Numerous Supreme Court rulings have held that hate speech is constitutionally protected. However, those in power can threaten opponents with punishment even when their speech clearly does not fall within one of the rare exceptions to the First Amendment protection for political speech.

Doing so carries risks.

The old saying “people in glass houses shouldn’t throw stones” also applies to censors: The public holds them to higher standards, lest they be exposed as hypocrites.

For critics of the Trump administration, it was jarring to see officials outraged about “hate speech,” only to hear the president announce, at Charlie Kirk’s memorial, “I hate my opponent, and I don’t want the best for them.”

In Comstock’s case, defendants and their attorneys routinely noted that Comstock had seen more illicit materials than any man in the U.S. Criticizing Comstock in 1882, Unitarian minister Octavius Brooks Frothingham quoted Shakespeare: “Who is so virtuous as to be allowed to forbid the distribution of cakes and ale?”

In other words, if you’re going to try to enforce moral standards, you better make sure you’re beyond reproach.

Free speech makes for strange bedfellows

Comstock’s censorship campaign, though self-defeating in the long run, nonetheless caused enormous suffering, just as many people today are suffering from calls to fire and harass those whose viewpoints are legal, but disliked by the Trump administration.

Comstock prosecuted women’s rights advocate Ida Craddock for circulating literature that advocated for female sexual pleasure. After Craddock was convicted in 1902, she died by suicide. She left behind a “letter to the public,” in which she accused Comstock of violating her rights to freedom of religion and speech.

A 1906 cartoon in the satirical periodical Puck mocks Anthony Comstock as a prude.
PhotoQuest/Getty Images

During Craddock’s trial, the jury hadn’t been permitted to see her writings; they were deemed “too harmful.” Incensed by these violations of the First and Fourth amendments, defense attorneys rallied together and were joined by a new coalition in support of Americans’ constitutional rights. Lincoln Steffens of the nascent Free Speech League wrote, in response to Craddock’s suicide, that “those who believe in the general principle of free speech must make their point by supporting it for some extreme cause. Advocating free speech only for a popular or uncontroversial position would not convey the breadth of the principle.”

Then, as now, the cause of free expression can bring together disparate political factions.

In the wake of the Kimmel saga, many conservative Republicans came out to support the same civil liberties also advocated by liberal Hollywood actors. Two-thirds of Americans in a September 2025 YouGov poll said that it was “unacceptable for government to pressure broadcasters to remove shows it disagrees with.”

My conclusion from studying the 43-year career of America’s most prolific censor?

Government officials may think a campaign of suppression and fear will silence their opponents, but these threats could end up being the biggest impediment to their effort to remake American culture.

The Conversation

Amy Werbel receives funding from the State University of New York.

ref. Censorship campaigns can have a way of backfiring – look no further than the fate of America’s most prolific censor – https://theconversation.com/censorship-campaigns-can-have-a-way-of-backfiring-look-no-further-than-the-fate-of-americas-most-prolific-censor-266117

McCarthyism’s shadow looms over controversial firing of Texas professor who taught about gender identity

Source: The Conversation – USA (2) – By Laura Gail Miller, Ed.D. Candidate in Educational Organizational Learning and Leadership, Seattle University

A Texas A&M free speech case raises questions about academic freedom that have featured before in American society and courts, including during the 1950s. Westend61

Texas A&M University announced the resignation of its president, Mark A. Welsh III, on Sept. 18, 2025, following a controversial decision earlier in the month to fire a professor over a classroom exchange with a student about gender identity.

The university – a public school in College Station, Texas – fired Melissa McCoul, a children’s literature professor, on Sept. 9. McCoul’s dismissal came after a student secretly filmed the professor teaching a class and discussing a children’s book that features a purple “gender unicorn,” a cartoon sometimes used to teach about gender identity.

The student questioned whether it was “legal” to be teaching about gender identity, given President Donald Trump’s January 2025 executive order – which is not legally binding – that said there are only two genders, male and female.

The video went viral, triggering backlash from Republican lawmakers who called for McCoul to be fired and praised the fact that the school also demoted the dean of the College of Arts and Sciences and revoked administrative duties from a department head.

Texas A&M officials have said that McCoul was fired because her course content was not consistent with the published course description. McCoul is appealing her firing and is considering legal action against the school.

Academic freedom advocates have condemned McCoul’s firing and say it raises questions about whether professors should be fired for addressing politically charged topics.

As a history educator researching curriculum design, civics education and generational dynamics, I study how classroom discussions often mirror larger cultural and political conflicts.

The Texas A&M case is far from unprecedented. The Cold War offers an example of another politically contentious time in American history when people questioned if and how politics should influence what gets taught in the classroom – and tried to restrict what teachers say.

The public university Texas A&M, seen here in August 2023, is the site of a controversial case involving free speech and alleged academic repression.
iStock/Getty Images Plus

Educators under suspicion in the McCarthy era

During the Cold War – a period of geopolitical tension between the U.S. and the Soviet Union that came after World War II and lasted until 1991 – fears of communist infiltration spread widely across American society, including the country’s schools.

One particularly contentious period was in the late 1940s and 1950s, during what is often referred to as the McCarthy era. The era is named after Wisconsin Sen. Joseph McCarthy, a Republican who led the charge on accusing government employees and others – often without evidence – of being communists.

Beginning in the late 1940s, local school boards, state legislatures and Congress launched investigations into teachers and professors across the country accused of harboring communist sympathies. This often led to the teachers being blacklisted and fired.

More than 20 states passed loyalty oath laws requiring public employees, including educators, to swear that they were not members of the Communist Party or affiliated groups.

In California, for example, the 1950 Levering Act mandated a loyalty oath for all state employees, including professors at public universities. Some employees refused to sign the oath, and 31 University of California professors were fired.

And in New York, the Feinberg Law, approved in 1949, authorized school districts to fire teachers who were members of “subversive organizations.” More than 250 educators were fired or forced to resign under the Feinberg Law and related anti-subversion policies between 1948 and 1953.

These laws had a chilling impact on academic life and learning.

Faculty, including those who were not under investigation, and students alike avoided discussing controversial topics, such as labor organizing and civil rights, in the classroom.

This pervasive climate of censorship also made it challenging for educators to fully engage students in critical, meaningful learning.

The Supreme Court steps in

By the mid-1950s, questions about the constitutionality of these laws – and the extent of professors’ academic freedom and First Amendment right to freedom of speech – reached the Supreme Court.

In one such case, 1957’s Sweezy v. New Hampshire, Louis C. Wyman, the New Hampshire attorney general, questioned Paul Sweezy, a Marxist economist, about the content of a university lecture he delivered at the University of New Hampshire.

Wyman wanted to determine whether Sweezy had advocated for Marxism or said that socialism was inevitable in the country. Sweezy refused to answer Wyman’s questions, citing his constitutional rights. The Supreme Court ruled in Sweezy’s favor, emphasizing the importance of academic freedom and the constitutional limits on state interference in university teaching.

The Supreme Court also considered another case, Keyishian v. Board of Regents, in 1967. With the Cold War still ongoing, this case challenged New York’s Feinberg Law, which required educators to disavow membership in communist organizations.

In striking down the law, the court declared that academic freedom is “a special concern of the First Amendment.” The ruling emphasized that vague or broad restrictions on what teachers can say or believe create an unconstitutional “chilling effect” on the classroom.

While these cases did not remove all political pressures on what teachers could discuss in class, they set significant constitutional limits on state efforts to regulate classroom speech, particularly at public institutions.

Sen. Joseph R. McCarthy, right, speaks during the McCarthy investigations in November 1954, trying to show communist subversion in high government circles.
Bettmann/Contributor

Recurring tensions from now and then

There are several important differences between the McCarthy era and current times.

For starters, conservative concern centered primarily on the spread of communism during the McCarthy era. Today, debates often involve conservative critiques of how topics such as gender identity, race and other cultural issues — sometimes grouped under the term “woke” — are addressed in schools and society.

Second, in the 1950s and ‘60s, external pressures on academic freedom often came in the form of legal mandates.

Today, the political landscape in academia is more complex and fast-paced, with pressures emanating from both the public and federal government.

Viral outrage, administrative investigations and threats to cut state or federal funding to schools can all contribute to an intensifying climate of fear of retribution that constrains educators’ ability to teach freely.

Despite these differences, the underlying dynamic between the two time periods is similar – in both cases, political polarization intensifies public scrutiny of educators.

Like loyalty oaths in the 1950s, today’s political controversies create a climate in which many teachers feel pressure to avoid contentious topics altogether. Even when no laws are passed, the possibility of complaints, investigations or firings can shape classroom choices.

Just as Sweezy and Keyishian defined the boundaries of state power in the 1950s and ‘60s, potential legal challenges like the appeal from the fired Texas A&M professor may eventually lead to court rulings that clarify how people’s First Amendment protections apply in today’s disputes over curriculum and teaching.

Whether these foundational protections will endure under the Supreme Court’s current and future makeup remains an open question.

The Conversation

Laura Gail Miller does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. McCarthyism’s shadow looms over controversial firing of Texas professor who taught about gender identity – https://theconversation.com/mccarthyisms-shadow-looms-over-controversial-firing-of-texas-professor-who-taught-about-gender-identity-265554

Why chromium is considered an essential nutrient, despite having no proven health benefits

Source: The Conversation – USA (3) – By Neil Marsh, Professor of Chemistry and Biological Chemistry, University of Michigan

You’re more likely to get chromium from your cookware than from your food. Fausto Favetta Photoghrapher/Moment via Getty Images

You might best know chromium as a bright, shiny metal used in bathroom and kitchen fittings. But is it also essential for your health?

In a form known as trivalent chromium, this metal is included in multivitamin pills and sold as a dietary supplement that companies claim can improve athletic performance and help regulate blood sugar.

I’m a biochemistry professor with a long-standing interest in how metals function in biology. Although health agencies in the United States and other countries recommend chromium as a dietary requirement, eight decades of research have resulted in slim evidence that people derive any significant health benefits from this mineral.

Why, then, did chromium come to be considered essential for human health?

What is an essential trace element?

To stay healthy, people need what are called essential trace elements in their diet. These include metals such as iron, zinc, manganese, cobalt and copper. As the word “trace” implies, you need only tiny amounts of these metals for optimal function.

Diagram of select columns and rows of the periodic table
A range of metals are considered essential (green) to humans. Others are important for only some forms of life (pink).
Jomova et al. 2022, CC BY-SA

For most of these trace elements, decades of research have shown they are genuinely essential for health. Iron, for example, is essential for carrying oxygen in your blood, and many proteins – complex molecules that carry out all of the functions necessary for life – require iron to function properly. A deficiency of iron leads to anemia, a condition that results in fatigue, weakness, headaches and brittle nails, among other symptoms. Iron supplements can help reverse these symptoms.

Importantly, biochemists have pinpointed exactly how iron helps proteins perform essential chemical reactions, not just for humans but all living organisms. Researchers know not only that iron is essential but also why it is essential.

Little evidence for chromium’s benefits

However, the same cannot be said for chromium.

Chromium deficiency – having little to no chromium in your body – is extremely rare, and researchers have not identified any clearly defined disease caused by low chromium levels.

Like all nutrients, essential metals must be absorbed through your digestive system. However, the gut absorbs only about 1% of ingested chromium. Other essential metals are absorbed more efficiently – for example, the average person absorbs around 25% of certain forms of ingested iron.

Importantly, despite many studies, scientists have yet to find any protein that requires chromium to carry out its biological function. In fact, only one protein is known to bind chromium, and this protein most likely helps your kidneys remove chromium from your blood. While some studies in people suggest chromium might be involved to some degree in regulating blood glucose levels, research on whether adding extra chromium to your body through supplements can substantially improve your body’s ability to break down and use sugar has been inconclusive.

Thus, based on biochemistry, there is currently no evidence that humans, or other animals, actually require chromium for any particular function.

Flawed research in rats

So how did chromium come to be considered an essential trace metal?

The idea that chromium might be essential for health stems from studies in the 1950s, a time when nutritionists knew very little about what trace metals are required to maintain good health.

One influential study involved feeding lab rats a diet that induced symptoms of Type 2 diabetes. Supplementing their diet with chromium seemed to cure the rats of Type 2 diabetes, and medical researchers were enticed by the suggestion that chromium might provide a treatment for this disease. Today’s widespread claims that chromium is important for regulating blood sugar can be traced to these experiments.

Chromium crystals in the shape of jagged chunks and as a cube
Chromium has many uses as a metal alloy, but not so much as a nutrient.
Alchemist-hp/Wikimedia Commons, CC BY-NC-ND

Unfortunately, these early experiments were deeply flawed by today’s standards. They lacked the statistical analyses needed to show that their results were not due to random chance, as well as important controls, such as measuring how much chromium was in the rats’ diet to begin with.

Later studies that were more rigorously designed provided ambiguous results. While some found that rats fed chromium supplements controlled their blood sugar slightly better than rats raised on a chromium-free diet, others found no significant differences. But what was clear was that rats raised on diets that excluded chromium were perfectly healthy.

Experiments on people are much harder to control than experiments on rats, and there are few well-designed clinical trials investigating the effects of chromium on patients with diabetes. Just as with the rat studies, the results are ambiguous. If there is an effect, it is very small.

Recommendations based on averages

Why, then, is there a recommended dietary intake for chromium despite its lack of documented health benefits?

The idea that chromium is needed for health persists due in large part to a 2001 report from the Institute of Medicine’s Panel on Micronutrients. This panel of distinguished nutritional researchers and clinicians was formed to evaluate available research on human nutrition and set “adequate intake” levels of vitamins and minerals. Their recommendations form the basis of the recommended daily intake labels found on food and vitamin packaging and the National Institutes of Health guidelines for clinicians.

Despite acknowledging the lack of research demonstrating clear-cut health benefits for chromium, the panel still recommended adults get about 30 micrograms per day of chromium in their diet. This recommendation was not based on science but rather on previous estimates of how much chromium adult Americans already ingest each day. Notably, much of this chromium is leached from stainless steel cookware and food processing equipment, rather than coming from our food.

So, while there may not be confirmed health risks from taking chromium supplements, there’s probably no benefit either.

The Conversation

Neil Marsh receives funding from the NIH and NSF.

ref. Why chromium is considered an essential nutrient, despite having no proven health benefits – https://theconversation.com/why-chromium-is-considered-an-essential-nutrient-despite-having-no-proven-health-benefits-252867

Russell M. Nelson, president of The Church of Jesus Christ of Latter-day Saints, pushed it away from ‘Mormon’ – a word that has courted controversy for 200 years

Source: The Conversation – USA (3) – By Konden Smith Hansen, Senior Lecturer of Religious Studies, University of Arizona

Russell Nelson, center, sits during the Church of Jesus Christ of Latter-day Saints’ biannual General Conference in Salt Lake City in 2019. George Frey/Getty Images

Russell M. Nelson, a former heart surgeon and longtime church leader, was 93 years old when he became president of The Church of Jesus Christ of Latter-day Saints in 2018. But anyone who assumed that his tenure would be uneventful, due to his advanced years, was quickly proved wrong. Visiting South America that year, he told members to buckle up: “Eat your vitamin pills. Get your rest. It’s going to be exciting.”

Nelson, who died on Sept. 27, 2025, at age 101, proved a consequential reformer: an energetic leader who streamlined bureaucracy, took steps toward gender equity and ended the church’s century-long relationship with the Boy Scouts, while reaffirming its opposition to LGBTQ+ relationships and identities.

He steered the church unapologetically through storms of public scrutiny, including accusations that the church had concealed the value of its investments. For the faithful, Nelson represented God’s mouthpiece on Earth. The church considers each president to be a “prophet, seer, and revelator.”

Yet one of his initiatives made an impact that rippled far beyond the church. In 2018, he surprised observers by declaring the use of the word “Mormon” a “major victory for Satan,” insisting on the use of the church’s full name. Individuals were to be recognized by their institutional affiliation, as “members of The Church of Jesus Christ of Latter-day Saints.”

The name of the church was given by God, and shortening it erases “all that Jesus Christ did for us,” Nelson argued. Yet adherents have long self-identified as Mormons, so the rebrand felt like a novelty to some members.

As a university lecturer teaching courses on American religion and Mormonism, I was one of many who wrestled with this change in terminology – and saw the challenges it created for my students and colleagues. For almost two centuries, the word “Mormon” has framed how Americans think about and discuss this faith.

Birth of a church

The name Mormon comes from the title of the Book of Mormon, a religious text unique to the faith. Founder Joseph Smith, who organized the church in 1830, believed he had been instructed by God to restore Jesus’ true church. He claimed that an angel had led him to uncover and translate ancient gold plates that detailed the religious history of an ancient civilization in the Americas, founded by Israelites who fled Jerusalem.

An old book open to the first page.
An 1841 edition of the Book of Mormon, on display in the museum at the Springs Preserve in Las Vegas.
Prosfilaes/Wikimedia Commons, CC BY

Early critics mockingly attached the word Mormon to the movement, but Smith insisted that in the book’s original language, the word meant “literally, ‘more good.’” By the time Smith was killed by a mob in Illinois in 1844, his followers had embraced the word.

After Smith’s death, Mormons split into different factions, with the largest group traveling on foot and by wagon to the far American West. Yet the group’s evolving practices continued to spark controversy. Polygamy and the church’s political and economic influence contributed to decades of animosity between Mormons and the rest of the nation.

The United States began seizing church property and imprisoning polygamist leaders, coercing church president Wilford Woodruff to end official support for polygamy in 1890.

A new debut

Three years later, at the Chicago World’s Fair, the church rebranded Mormonism, presenting Mormon pioneers as an embodiment of the values of the American frontier.

Woodruff, then 86 years old, spoke of himself as Utah’s oldest living pioneer and of Mormons as a people who built the American West. The Mormon Tabernacle Choir performed at the fair, reintroducing Mormons to the wider public as a sophisticated and artistic people. The crowd shouted, “Three cheers for the Mormons!” The Chicago Herald wrote, “Mormons and gentiles came together as friends.”

A black and white photo of ornate buildings around a waterway, with fountains in a plaza.
The Great Basin at the Chicago World’s Fair in 1893.
Chicago History Museum/Getty Images

Despite this, many Americans still distrusted Mormons. In 1903, high-ranking church official Reed Smoot was elected to the U.S. Senate, which provoked national outcry and led to Senate hearings that lasted until 1907. The hearings substantiated charges that the practice of polygamy persisted but exonerated Smoot as an individual. As Smoot argued, Mormons were independent of the institutional church and thus trustworthy Americans. He convinced his fellow senators that if the church’s teachings came into conflict with his conscience or oath of office, then, as a Mormon, he would uphold the latter.

Following Smoot’s lead, the church embraced the trappings of American patriotism and doubled down against plural marriage. These moves won the Latter-day Saints powerful political allies, including Theodore Roosevelt, who disliked the institutional church but viewed Mormons themselves as intensely moral and patriotic.

‘Meet the Mormons’

Ab Jenkins, a race car driver whose records made him an international celebrity in the 1930s, capitalized on this new image of Mormon individuality and wholesomeness. The “Mormon Boy” credited his clean, church-approved lifestyle for his success. Jenkins rejected alcohol and cigarette endorsements and instead emblazoned his race car, the Mormon Meteor, with a sign that read, “Yes, I’m a Mormon.”

A black and white image of an old-fashioned car against a white, flat landscape.
Ab Jenkins starts a 1939 test run in his race car, the Mormon Meteor III, on Utah’s Bonneville Salt Flats.
Underwood Archives/Getty Images

For several decades, other Mormon celebrities like family band The Osmonds and golfer Johnny Miller continued to shape positive public views of Mormons – hitting a high-water mark in 1977, when Gallup found that only 18% of Americans held unfavorable views.

Church efforts to influence social issues, however, such as its decades-long opposition to the Equal Rights Amendment, eventually took a toll. By 1991, public opinion of Mormons had fallen dramatically, with 37% of Americans viewing them unfavorably – and leaders decided that another rebrand was in order.

The previous year, senior leader Gordon B. Hinckley had admonished members to make the word Mormon “shine with added luster.” When he became president in 1995, Hinckley worked to reframe how the public saw Mormons, arguing on the “60 Minutes” TV show that Mormons were “not a weird people.”

The Salt Lake City Olympics in 2002 pushed Mormonism into the national spotlight, and that same year, the church launched its major website, Mormon.org, with stories and headlines liberally using the term “Mormon.” A media campaign followed a decade later, featuring prominent members declaring, “I’m a Mormon.” Ordinary members were then encouraged to upload their own “I’m a Mormon” profiles to this website and share them on their own social media accounts.

A tall building with an image of a woman figure skating projected on it, next to church steeples and a snow-covered mountain.
The Latter-day Saints temple in downtown Salt Lake City, center, as an Olympic banner drapes the church office building next door during the 2002 Games.
George Frey/AFP via Getty Images

Mitt Romney’s Republican nomination for the 2012 presidential election and the popularity of the satirical “Book of Mormon” musical pushed Mormons again into the national spotlight. In 2014, the church produced a documentary titled “Meet the Mormons,” shown in theaters across the U.S., which apostle Jeffrey R. Holland explained was to “show people what we’re really like.”

In 2017, a Pew Research Center survey’s “feeling thermometer” found public opinion of Mormons to have risen to the “somewhat warmer” rating of 54, a 6-point increase from 2014.

‘More good’?

That said, the church’s relationship to the word Mormon has always been complex. As far back as 1990, Nelson was already warning fellow Latter-day Saints that Mormon was “not an appropriate alternative” for the church’s full name. During the 2002 Olympics, the church advised media that the nickname was acceptable for individuals but not to refer to the institution itself.

Overall, I would argue the church has used the word Mormon to improve public opinion for more than a century. Part of this branding downplayed popular fears about the church and its influence – allowing outsiders to develop favorable views toward Mormons, even if they disliked the institution itself.

In March 2023, a Pew Research poll reported a low point in public opinion of Mormons, falling for the first time below every other measured group. A quarter of Americans held “unfavorable views of Mormons,” while only 15% held “favorable” ones.

A month later, Nelson pleaded with members to be peacemakers, lamenting that “many people seem to believe that it is completely acceptable to condemn, malign and vilify anyone who does not agree with them.” Nationally, intense polarization and violence have continued since then – including a horrific attack in Michigan on a Latter-day Saints church building on Sept. 28, 2025, just one day after Nelson’s death.

“Mormon” has been an important term in engaging those outside the faith, particularly in countering negative perceptions. Whether or not the word disappears, what may matter more for Nelson’s legacy is whether people outside the church associate it with “more good” – both institutionally and individually.

This is an updated version of an article originally published on Sept. 5, 2024.

The Conversation

Konden Smith Hansen does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Russell M. Nelson, president of The Church of Jesus Christ of Latter-day Saints, pushed it away from ‘Mormon’ – a word that has courted controversy for 200 years – https://theconversation.com/russell-m-nelson-president-of-the-church-of-jesus-christ-of-latter-day-saints-pushed-it-away-from-mormon-a-word-that-has-courted-controversy-for-200-years-266229

Trump administration is on track to cut 1 in 3 EPA staffers by the end of 2025, slashing agency’s ability to keep pollution out of air and water

Source: The Conversation – USA (2) – By Elizabeth Blum, Professor of Environmental History, Troy University

Environmental Protection Agency staff and contractors are often involved in large cleanups of toxic waste, such as after the Los Angeles fires of early 2025. Mario Tama/Getty Images

As Congress faces a Sept. 30, 2025, deadline to fund the federal government, Environmental Protection Agency Administrator Lee Zeldin has put the EPA on the chopping block. But even before Congress weighs in on the administration’s recommendations to slash the agency’s staff, the EPA’s political leaders have cut its workforce more deeply than those recommendations call for.

And a look at past efforts to cut EPA staff shows how rapidly those changes can affect Americans’ health and the environment.

Using publicly available government databases and a collection of in-depth interviews with current and former EPA employees, the Environmental Data and Governance Initiative, a group of volunteer academics that we are a part of, has begun to put some numbers behind what many have suspected. Zeldin’s cuts have diminished the EPA’s staffing levels, even before Congress has had a chance to weigh in, affecting the environment, public health and government transparency.

People hold signs saying 'There's no Planet B,' 'Save the U.S. EPA' and other messages.
EPA employees protest cuts to the agency.
Brett Phelps/The Boston Globe via Getty Images

How many people are being let go?

Precise numbers of staffing cuts are hard to pin down, but their historic scale in the first eight months of this administration is unmistakable. Zeldin’s budget proposal for the fiscal year starting October 2025, released in May, called for cutting 1,274 full-time-equivalent employee positions from a total of 14,130 in the year ending Sept. 30, 2025 – a 9% drop.

A July 18, 2025, press release from the EPA said the agency had already cut 23% of its personnel, terminating the employment of 3,707 of 16,155 employees. Using employees – the number of people – rather than full-time equivalents makes these numbers difficult to compare directly with EPA’s budget proposals.

Combining EPA data on staffing changes with conservative estimates of the pending cuts, the initiative has calculated that 25% of EPA staff are already out of the agency.

That calculation does not include other announced cuts, including a third round of deferred resignations taking effect at the end of September 2025 and in December 2025. That round may involve a similar number of full-time-equivalent departures as the past two rounds – approximately 500 and 1,500.

The agency also reportedly plans to cut as much as two-thirds of its research staff.

With those departure figures included, the initiative estimates that approximately 33% of staffers at the agency when Trump took office will be gone by the end of 2025. That would leave, at the start of 2026, an EPA staff numbering approximately 9,700 people, a level not seen since the last years of the Nixon and Ford administrations.
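For readers who want to trace the arithmetic, the short Python sketch below reproduces the percentages quoted above using only the counts cited in this article. It is an illustration, not the initiative’s own calculation: combining full-time-equivalent rounds with employee headcounts is a rough approximation the article itself cautions against, and the ~500 and ~1,500 figures for the pending deferred-resignation rounds are the approximate sizes mentioned above.

# Back-of-the-envelope check of the staffing percentages cited above.
# All counts come from the article; treating FTE rounds and employee headcounts
# as interchangeable is only a rough approximation.

fte_total = 14_130            # full-time-equivalent positions, fiscal year ending Sept. 30, 2025
fte_proposed_cut = 1_274      # cut proposed in Zeldin's May budget request
print(f"Proposed FTE cut: {fte_proposed_cut / fte_total:.0%}")                    # ~9%

employees_total = 16_155      # employees cited in the EPA's July 18, 2025 press release
employees_terminated = 3_707  # terminations reported in the same release
print(f"Terminations so far: {employees_terminated / employees_total:.0%}")       # ~23%

# The pending deferred-resignation rounds (~500 and ~1,500) plus planned
# research-staff cuts are what push the initiative's estimate from about 25%
# toward one in three.
pending_rounds = 500 + 1_500
rough_share_gone = (employees_terminated + pending_rounds) / employees_total
print(f"Rough share gone once pending rounds take effect: {rough_share_gone:.0%}")  # ~35%, close to the ~33% estimate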

These cuts are deeper than past efforts to shrink the size of the agency. In his first term, Trump proposed eliminating 21.4% of staff at the EPA, though Congress made no significant changes to the agency’s staffing. The largest actual cut to EPA staffing was under President Ronald Reagan in the early 1980s: He advocated for a 17.3% drop in staffing, although Congress held the cuts to 10%.

Effects of past cuts

In the past, cuts to the EPA caused problems and were reversed – but it took years.

The staffing and budget cuts during the first two years of the Reagan administration made it harder for the agency to meet its responsibilities.

For instance, rather than prosecute industry for polluting, Reagan’s EPA Administrator Anne Gorsuch told business leaders she would ignore their violations of environmental laws. Remaining staff were convinced that working on enforcement cases would be a “black mark” on their records.

Another top political appointee at Reagan’s EPA, Rita Lavelle, who headed the Superfund effort to clean up toxic sites, faced prison time for her official acts. She was convicted of perjury and obstructing a congressional investigation because she lied about her ties to a former employer who had polluted the Stringfellow Acid Pits, a Superfund site near Riverside, California.

A person holds a clear jar of liquid while sitting on the ground in an area covered by rocks and dirt.
A man holds a jar of contaminated water from the stream flowing out of the Stringfellow Acid Pits in California in February 1983.
Bill Nation/Sygma via Getty Images

In the wake of the scandal, Lavelle was fired and Gorsuch and more than a dozen other political appointees resigned.

In a later report on the issue, Congress accused Gorsuch, Lavelle and others of poor job performance, noting that after four years of Superfund work, “only six of the 546 … of the most hazardous sites in the Nation have been cleaned up.” The Stringfellow site, a focus of the investigation, was “threatening the health and safety of 500,000 people,” the report noted.

Amid anger over the scandals from both the public and Congress, Reagan reversed course and spent the remaining six years of his presidency building the EPA back up in both staffing and budget. Staffing, for example, increased from a low of 10,481 full-time-equivalent employees in 1982 to 15,130 in 1989. Reagan’s EPA budget, which had fallen to US$4.1 billion in 1984, increased to $4.9 billion in 1989.

The existing Trump cuts – and those proposed, if enacted by Congress – would be deeper than Reagan’s, reducing the number of people doing important research on environmental harms and the health effects of dangerous chemicals; suing companies that pollute the environment; and overseeing the cleanup of toxic sites.

The Conversation

Elizabeth (Scout) Blum is affiliated with the Environmental Data and Governance Initiative. She has received funding from EDGI.

Chris Sellers previously received funding from the National Science Foundation on a project that partly involved research into the EPA’s history.

ref. Trump administration is on track to cut 1 in 3 EPA staffers by the end of 2025, slashing agency’s ability to keep pollution out of air and water – https://theconversation.com/trump-administration-is-on-track-to-cut-1-in-3-epa-staffers-by-the-end-of-2025-slashing-agencys-ability-to-keep-pollution-out-of-air-and-water-265249