Censorship campaigns can have a way of backfiring – look no further than the fate of America’s most prolific censor

Source: The Conversation – USA (2) – By Amy Werbel, Professor of the History of Art, Fashion Institute of Technology (FIT)

The vast majority of Americans support the right to free speech. Jacek Boczarski/Anadolu via Getty Images

In the first year of President Donald Trump’s second term in office, his administration has made many attempts to suppress speech it disfavors: at universities, on the airwaves, in public school classrooms, in museums, at protests and even in lawyers’ offices.

If past is prologue, these efforts may backfire.

In 2018, I published my book “Lust on Trial: Censorship and the Rise of American Obscenity in the Age of Anthony Comstock.”

A devout evangelical Christian, Comstock hoped to use the powers of the government to impose moral standards on American expression in the late-19th and early-20th centuries. To that end, he and like-minded donors established the New York Society for the Suppression of Vice, which successfully lobbied for the creation of the first federal anti-obscenity laws with enforcement provisions.

Later appointed inspector for the Post Office Department, Comstock fought to abolish whatever he deemed blasphemous and sinful: birth control, abortion aids and information about sexual health, along with certain art, books and newspapers. Federal and state laws gave him the power to order law enforcement to seize these materials and have prosecutors bring criminal indictments.

I analyzed thousands of these censorship cases to assess their legal and cultural outcomes.

I found that, over time, Comstock’s censorship regime did lead to a rise in self-censorship, confiscations and prosecutions. However, it also inspired greater support for free speech and due process.

More popular – and more profitable

One effect of Comstock’s censorship campaigns: The materials and speech he disfavored often made headlines, putting them on the public’s radar as a kind of “forbidden fruit.”

For example, prosecutions targeting artwork featuring nude subjects led to both sensational media coverage and a boom in the popularity of nudes on everything from soap advertisements and cigar boxes to photographs and sculptures.

Anthony Comstock.
Bettmann/Getty Images

Meanwhile, entrepreneurs of racy forms of entertainment – promoters of belly dancing, publishers of erotic postcards and producers of “living pictures,” which were exhibitions of seminude actors posing as classical statuary – all benefited from Comstock’s complaints. If Comstock wanted it shut down, the public often assumed that it was fun and trendy.

In 1891, Comstock became irate when a young female author proposed paying him to attack her book and “seize a few copies” to “get the newspapers to notice it.” And in October 1906, Comstock threatened to shut down an exhibition of models performing athletic exercises wearing form-fitting union suits. Twenty thousand people showed up to Madison Square Garden for the exhibition – far more than the venue could hold at the time.

The Trump administration’s recent efforts to get comedian Jimmy Kimmel off the air have similarly backfired.

Kimmel had generated controversy for comments he made on his late-night talk show in the wake of conservative activist Charlie Kirk’s assassination. ABC, which is owned by The Walt Disney Co., initially acquiesced to pressure from Federal Communications Commission Chairman Brendan Carr and announced the show’s “indefinite” suspension. But many viewers, angered over the company’s capitulation, canceled their subscriptions to Disney streaming services. This led to a 3.3% drop in Disney’s share price, which spurred legal actions by shareholders of the publicly traded company.

ABC soon lifted the suspension. Kimmel returned, drawing 6.26 million live viewers – more than four times his normal audience – while over 26 million viewers watched Kimmel’s return monologue on social media. Since then, all network affiliates have resumed airing “Jimmy Kimmel Live!”

‘Comstockery’ and hypocrisy

In the U.S., disfavored political speech and obscenity are different in important ways. The Supreme Court has held that the First Amendment provides broad protections for political expression, whereas speech deemed to be obscene is illegal.

Despite this fundamental difference, social and cultural forces can make it difficult to clearly distinguish protected from unprotected speech.

In Comstock’s case, the public was happy to see truly explicit pornography removed from circulation. But the public’s own definition of what was “obscene” – and, therefore, criminally punishable – was much narrower than his.

In 1905, Comstock attempted to shut down a theatrical performance of George Bernard Shaw’s “Mrs. Warren’s Profession” because the plot included prostitution. The aging censor was widely ridiculed and became a “laughing stock,” according to The New York Times. Shaw went on to coin the term “Comstockery,” which caught on as a shorthand for overreaching censoriousness.

Cartoonists at the turn of the 20th century had a field day with Anthony Comstock’s overreaches.
Amazon

In a similar manner, when Attorney General Pam Bondi recently threatened Americans that the Department of Justice “will absolutely … go after you, if you are targeting anyone with hate speech,” swift backlash ensued.

Numerous Supreme Court rulings have held that hate speech is constitutionally protected. However, those in power can threaten opponents with punishment even when their speech clearly does not fall within one of the rare exceptions to the First Amendment protection for political speech.

Doing so carries risks.

The old saying “people in glass houses shouldn’t throw stones” also applies to censors: The public holds them to higher standards, lest they be exposed as hypocrites.

For critics of the Trump administration, it was jarring to see officials outraged about “hate speech,” only to hear the president announce, at Charlie Kirk’s memorial, “I hate my opponent, and I don’t want the best for them.”

In Comstock’s case, defendants and their attorneys routinely noted that Comstock had seen more illicit materials than any man in the U.S. Criticizing Comstock in 1882, Unitarian minister Octavius Brooks Frothingham paraphrased Shakespeare: “Who is so virtuous as to be allowed to forbid the distribution of cakes and ale?”

In other words, if you’re going to try to enforce moral standards, you better make sure you’re beyond reproach.

Free speech makes for strange bedfellows

Comstock’s censorship campaign, though self-defeating in the long run, nonetheless caused enormous suffering, just as many people today are suffering from calls to fire and harass those whose viewpoints are legal, but disliked by the Trump administration.

Comstock prosecuted women’s rights advocate Ida Craddock for circulating literature that advocated for female sexual pleasure. After Craddock was convicted in 1902, she died by suicide. She left behind a “letter to the public,” in which she accused Comstock of violating her rights to freedom of religion and speech.

A 1906 cartoon in the satirical periodical Puck mocks Anthony Comstock as a prude.
PhotoQuest/Getty Images

During Craddock’s trial, the jury hadn’t been permitted to see her writings; they were deemed “too harmful.” Incensed by these violations of the First and Fourth amendments, defense attorneys rallied together and were joined by a new coalition in support of Americans’ constitutional rights. Lincoln Steffens of the nascent Free Speech League wrote, in response to Craddock’s suicide, that “those who believe in the general principle of free speech must make their point by supporting it for some extreme cause. Advocating free speech only for a popular or uncontroversial position would not convey the breadth of the principle.”

Then, as now, the cause of free expression can bring together disparate political factions.

In the wake of the Kimmel saga, many conservative Republicans found themselves defending the same civil liberties championed by liberal Hollywood actors. Two-thirds of Americans in a September 2025 YouGov poll said that it was “unacceptable for government to pressure broadcasters to remove shows it disagrees with.”

My conclusion from studying the 43-year career of America’s most prolific censor?

Government officials may think a campaign of suppression and fear will silence their opponents, but these threats could end up being the biggest impediment to their effort to remake American culture.

The Conversation

Amy Werbel receives funding from the State University of New York.

ref. Censorship campaigns can have a way of backfiring – look no further than the fate of America’s most prolific censor – https://theconversation.com/censorship-campaigns-can-have-a-way-of-backfiring-look-no-further-than-the-fate-of-americas-most-prolific-censor-266117

McCarthyism’s shadow looms over controversial firing of Texas professor who taught about gender identity

Source: The Conversation – USA (2) – By Laura Gail Miller, Ed.D. Candidate in Educational Organizational Learning and Leadership, Seattle University

A Texas A&M free speech case raises questions about academic freedom that have featured before in American society and courts, including during the 1950s. Westend61

Texas A&M University announced the resignation of its president, Mark A. Welsh III, on Sept. 18, 2025, following a controversial decision earlier in the month to fire a professor over a classroom exchange with a student about gender identity.

The university – a public school in College Station, Texas – fired Melissa McCoul, a children’s literature professor, on Sept. 9. McCoul’s dismissal happened after a student secretly filmed video as the professor taught a class and discussed a children’s book that includes the image of a purple “gender unicorn,” a cartoon image that is sometimes used to teach about gender identity.

The student questioned whether it was “legal” to be teaching about gender identity, given President Donald Trump’s January 2025 executive order – which is not legally binding – that said there are only two genders, male and female.

The video went viral, triggering backlash from Republican lawmakers who called for McCoul to be fired and praised the fact that the school also demoted the dean of the College of Arts and Sciences and revoked administrative duties from a department head.

Texas A&M officials have said that McCoul was fired because her course content was not consistent with the published course description. McCoul is appealing her firing and is considering legal action against the school.

Academic freedom advocates have condemned McCoul’s firing and say it raises questions about whether professors should be fired for addressing politically charged topics.

As a history educator researching curriculum design, civics education and generational dynamics, I study how classroom discussions often mirror larger cultural and political conflicts.

The Texas A&M case is far from unprecedented. The Cold War offers an example of another politically contentious time in American history when people questioned if and how politics should influence what gets taught in the classroom – and tried to restrict what teachers say.

The public university Texas A&M, seen here in August 2023, is the site of a controversial case over free speech and academic freedom.
iStock/Getty Images Plus

Educators under suspicion in the McCarthy era

During the Cold War – a period of geopolitical tension between the U.S. and the Soviet Union that came after World War II and lasted until 1991 – fears of communist infiltration spread widely across American society, including the country’s schools.

One particularly contentious period was in the late 1940s and 1950s, during what is often referred to as the McCarthy era. The era is named after Wisconsin Sen. Joseph McCarthy, a Republican who led the charge on accusing government employees and others – often without evidence – of being communists.

Beginning in the late 1940s, local school boards, state legislatures and Congress launched investigations into teachers and professors across the country accused of harboring communist sympathies. This often led to the teachers being blacklisted and fired.

More than 20 states passed loyalty oath laws requiring public employees, including educators, to swear that they were not members of the Communist Party or affiliated groups.

In California, for example, the 1950 Levering Act mandated a loyalty oath for all state employees, including professors at public universities. Some employees refused to sign the oath, and 31 University of California professors were fired.

And in New York, the Feinberg Law, approved in 1949, authorized school districts to fire teachers who were members of “subversive organizations.” More than 250 educators were fired or forced to resign under the Feinberg Law and related anti-subversion policies between 1948 and 1953.

These laws had a chilling impact on academic life and learning.

Faculty, including those who were not under investigation, and students alike avoided discussing controversial topics, such as labor organizing and civil rights, in the classroom.

This pervasive climate of censorship also made it challenging for educators to fully engage students in critical, meaningful learning.

The Supreme Court steps in

By the mid-1950s, questions about the constitutionality of these laws – and the extent of professors’ academic freedom and First Amendment right to freedom of speech – reached the Supreme Court.

In one such case, 1957’s Sweezy v. New Hampshire, Louis C. Wyman, the New Hampshire attorney general, questioned Paul Sweezy, a Marxist economist, about the content of a university lecture he delivered at the University of New Hampshire.

Wyman wanted to determine whether Sweezy had advocated for Marxism or said that socialism was inevitable in the country. Sweezy refused to answer Wyman’s questions, citing his constitutional rights. The Supreme Court ruled in Sweezy’s favor, emphasizing the importance of academic freedom and the constitutional limits on state interference in university teaching.

The Supreme Court also considered another case, Keyishian v. Board of Regents, in 1967. With the Cold War still ongoing, this case challenged New York’s Feinberg Law, which required educators to disavow membership in communist organizations.

In striking down the law, the court declared that academic freedom is “a special concern of the First Amendment.” The ruling emphasized that vague or broad restrictions on what teachers can say or believe create an unconstitutional “chilling effect” on the classroom.

While these cases did not remove all political pressures on what teachers could discuss in class, they set significant constitutional limits on state efforts to regulate classroom speech, particularly at public institutions.

Sen. Joseph R. McCarthy, right, speaks during the McCarthy investigations in November 1954, trying to show communist subversion in high government circles.
Bettmann/Contributor

Recurring tensions, then and now

There are several important differences between the McCarthy era and current times.

For starters, conservative concern centered primarily on the spread of communism during the McCarthy era. Today, debates often involve conservative critiques of how topics such as gender identity, race and other cultural issues — sometimes grouped under the term “woke” — are addressed in schools and society.

Second, in the 1950s and ’60s, external pressures on academic freedom often came in the form of legal mandates.

Today, the political landscape in academia is more complex and fast-paced, with pressures emanating from both the public and federal government.

Viral outrage, administrative investigations and threats to cut state or federal funding to schools can all contribute to an intensifying climate of fear of retribution that constrains educators’ ability to teach freely.

Despite these differences, the underlying dynamic between the two time periods is similar – in both cases, political polarization intensifies public scrutiny of educators.

Like loyalty oaths in the 1950s, today’s political controversies create a climate in which many teachers feel pressure to avoid contentious topics altogether. Even when no laws are passed, the possibility of complaints, investigations or firings can shape classroom choices.

Just as Sweezy and Keyishian defined the boundaries of state power in the 1950s and ’60s, potential legal challenges like the appeal from the fired Texas A&M professor may eventually lead to court rulings that clarify how people’s First Amendment protections apply in today’s disputes over curriculum and teaching.

Whether these foundational protections will endure under the Supreme Court’s current and future makeup remains an open question.

The Conversation

Laura Gail Miller does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. McCarthyism’s shadow looms over controversial firing of Texas professor who taught about gender identity – https://theconversation.com/mccarthyisms-shadow-looms-over-controversial-firing-of-texas-professor-who-taught-about-gender-identity-265554

Why chromium is considered an essential nutrient, despite having no proven health benefits

Source: The Conversation – USA (3) – By Neil Marsh, Professor of Chemistry and Biological Chemistry, University of Michigan

You’re more likely to get chromium from your cookware than from your food. Fausto Favetta Photoghrapher/Moment via Getty Images

You might best know chromium as a bright, shiny metal used in bathroom and kitchen fittings. But is it also essential for your health?

In a form known as trivalent chromium, this metal is included in multivitamin pills and sold as a dietary supplement that companies claim can improve athletic performance and help regulate blood sugar.

I’m a biochemistry professor with a long-standing interest in how metals function in biology. Although health agencies in the United States and other countries recommend chromium as a dietary requirement, eight decades of research have resulted in slim evidence that people derive any significant health benefits from this mineral.

Why, then, did chromium come to be considered essential for human health?

What is an essential trace element?

To stay healthy, people need what are called essential trace elements in their diet. These include metals such as iron, zinc, manganese, cobalt and copper. As the word “trace” implies, you need only tiny amounts of these metals for optimal function.

A range of metals are considered essential (green) to humans. Others are important for only some forms of life (pink).
Jomova et al. 2022, CC BY-SA

For most of these trace elements, decades of research have shown they are genuinely essential for health. Iron, for example, is essential for carrying oxygen in your blood, and many proteins – complex molecules that carry out all of the functions necessary for life – require iron to function properly. A deficiency of iron leads to anemia, a condition that results in fatigue, weakness, headaches and brittle nails, among other symptoms. Iron supplements can help reverse these symptoms.

Importantly, biochemists have pinpointed exactly how iron helps proteins perform essential chemical reactions, not just for humans but all living organisms. Researchers know not only that iron is essential but also why it is essential.

Little evidence for chromium’s benefits

However, the same cannot be said for chromium.

Chromium deficiency – having little to no chromium in your body – is extremely rare, and researchers have not identified any clearly defined disease caused by low chromium levels.

Like other nutrients, essential metals must be absorbed by your digestive system. However, the gut absorbs only about 1% of ingested chromium. Other essential metals are absorbed more efficiently – for example, the average person absorbs around 25% of certain forms of ingested iron.

Importantly, despite many studies, scientists have yet to find any protein that requires chromium to carry out its biological function. In fact, only one protein is known to bind chromium, and this protein most likely helps your kidneys remove chromium from your blood. While some studies in people suggest chromium might be involved to some degree in regulating blood glucose levels, research on whether adding extra chromium to your body through supplements can substantially improve your body’s ability to break down and use sugar has been inconclusive.

Thus, based on biochemistry, there is currently no evidence that humans, or other animals, actually require chromium for any particular function.

Flawed research in rats

So how did chromium come to be considered an essential trace metal?

The idea that chromium might be essential for health stems from studies in the 1950s, a time when nutritionists knew very little about what trace metals are required to maintain good health.

One influential study involved feeding lab rats a diet that induced symptoms of Type 2 diabetes. Supplementing their diet with chromium seemed to cure the rats of Type 2 diabetes, and medical researchers were enticed by the suggestion that chromium might provide a treatment for this disease. Today’s widespread claims that chromium is important for regulating blood sugar can be traced to these experiments.

Chromium has many uses as a metal alloy, but not so much as a nutrient.
Alchemist-hp/Wikimedia Commons, CC BY-NC-ND

Unfortunately, these early experiments were deeply flawed by today’s standards. They lacked the statistical analyses needed to show that their results were not due to random chance. Furthermore, they lacked important controls, including measuring how much chromium was in the rats’ diet to start with.

Later studies that were more rigorously designed provided ambiguous results. While some found that rats fed chromium supplements controlled their blood sugar slightly better than rats raised on a chromium-free diet, others found no significant differences. But what was clear was that rats raised on diets that excluded chromium were perfectly healthy.

Experiments on people are much harder to control than experiments on rats, and there are few well-designed clinical trials investigating the effects of chromium on patients with diabetes. Just as with the rat studies, the results are ambiguous. If there is an effect, it is very small.

Recommendations based on averages

Why, then, is there a recommended dietary intake for chromium despite its lack of documented health benefits?

The idea that chromium is needed for health persists due in large part to a 2001 report from the Institute of Medicine’s Panel on Micronutrients. This panel of distinguished nutritional researchers and clinicians was formed to evaluate available research on human nutrition and set “adequate intake” levels of vitamins and minerals. Their recommendations form the basis of the recommended daily intake labels found on food and vitamin packaging and the National Institutes of Health guidelines for clinicians.

Despite acknowledging the lack of research demonstrating clear-cut health benefits for chromium, the panel still recommended adults get about 30 micrograms per day of chromium in their diet. This recommendation was not based on science but rather on previous estimates of how much chromium adult Americans already ingest each day. Notably, much of this chromium is leached from stainless steel cookware and food processing equipment, rather than coming from our food.

So, while there may not be confirmed health risks from taking chromium supplements, there’s probably no benefit either.

The Conversation

Neil Marsh receives funding from the NIH and NSF.

ref. Why chromium is considered an essential nutrient, despite having no proven health benefits – https://theconversation.com/why-chromium-is-considered-an-essential-nutrient-despite-having-no-proven-health-benefits-252867

Russell M. Nelson, president of The Church of Jesus Christ of Latter-day Saints, pushed it away from ‘Mormon’ – a word that has courted controversy for 200 years

Source: The Conversation – USA (3) – By Konden Smith Hansen, Senior Lecturer of Religious Studies, University of Arizona

Russell Nelson, center, sits during the Church of Jesus Christ of Latter-day Saints’ biannual General Conference in Salt Lake City in 2019. George Frey/Getty Images

Russell M. Nelson, a former heart surgeon and longtime church leader, was 93 years old when he became president of The Church of Jesus Christ of Latter-day Saints in 2018. But anyone who assumed that his tenure would be uneventful, due to his advanced years, was quickly proved wrong. Visiting South America that year, he told members to buckle up: “Eat your vitamin pills. Get your rest. It’s going to be exciting.”

Nelson, who died on Sept. 27, 2025, at age 101, proved a consequential reformer: an energetic leader who streamlined bureaucracy, took steps toward gender equity and ended the church’s century-long relationship with the Boy Scouts, while reaffirming its opposition to LGBTQ+ relationships and identities.

He steered the church unapologetically through storms of public scrutiny, including accusations that the church had concealed the value of its investments. For the faithful, Nelson represented God’s mouthpiece on Earth. The church considers each president to be a “prophet, seer, and revelator.”

Yet one of his initiatives made an impact that rippled far beyond the church. In 2018, he surprised observers by declaring the use of the word “Mormon” a “major victory for Satan,” insisting on the use of the church’s full name. Individuals were to be recognized by their institutional affiliation, as “members of The Church of Jesus Christ of Latter-day Saints.”

The name of the church was given by God, and shortening it erases “all that Jesus Christ did for us,” Nelson argued. Yet adherents have long self-identified as Mormons, so the rebrand felt like a novelty to some members.

As a university lecturer teaching courses on American religion and Mormonism, I was one of many who wrestled with this change in terminology – and saw the challenges it created for my students and colleagues. For almost two centuries, the word “Mormon” has framed how Americans think about and discuss this faith.

Birth of a church

The name Mormon comes from the title of the Book of Mormon, a religious text unique to the faith. Founder Joseph Smith, who organized the church in 1830, believed he had been instructed by God to restore Jesus’ true church. He claimed that an angel had led him to uncover and translate ancient gold plates that detailed the religious history of an ancient civilization in the Americas, founded by Israelites who fled Jerusalem.

An 1841 edition of the Book of Mormon, on display in the museum at the Springs Preserve in Las Vegas.
Prosfilaes/Wikimedia Commons, CC BY

Early critics mockingly attached the word Mormon to the movement, but Smith insisted that in the book’s original language, the word meant “literally, ‘more good.’” By the time Smith was killed by a mob in Illinois in 1844, his followers had embraced the word.

After Smith’s death, Mormons split into different factions, with the largest group traveling by foot and wagon to the far American West. Yet, the group’s evolving practices continued to spark controversy. Polygamy and the church’s political and economic influence contributed to decades of animosity between Mormons and the rest of the nation.

The United States began seizing church property and imprisoning polygamist leaders, coercing church president Wilford Woodruff to end official support for polygamy in 1890.

A new debut

Three years later, at the Chicago World’s Fair, the church rebranded Mormonism, presenting Mormon pioneers as an embodiment of the values of the American frontier.

Woodruff, then 86 years old, spoke of himself as Utah’s oldest living pioneer and of Mormons as a people who built the American West. The Mormon Tabernacle Choir performed at the fair, reintroducing Mormons to the wider public as a sophisticated and artistic people. The crowd shouted, “Three cheers for the Mormons!” The Chicago Herald wrote, “Mormons and gentiles came together as friends.”

The Great Basin at the Chicago World’s Fair in 1893.
Chicago History Museum/Getty Images

Despite this, many Americans still distrusted Mormons. In 1903, high-ranking church official Reed Smoot was elected to the U.S. Senate, which provoked national outcry and led to Senate hearings that lasted until 1907. The hearings substantiated charges that the practice of polygamy persisted but exonerated Smoot as an individual. As Smoot argued, individual Mormons were independent of the institutional church and thus trustworthy Americans. He convinced his fellow senators that if the church’s teachings came into conflict with his conscience or oath of office, then, as a Mormon, he would uphold the latter.

Following Smoot’s lead, the church embraced the trappings of American patriotism and doubled down against plural marriage. These moves won the Latter-day Saints powerful political allies, including Theodore Roosevelt, who disliked the institutional church but viewed Mormons themselves as intensely moral and patriotic.

‘Meet the Mormons’

Ab Jenkins, a race car driver whose records made him an international celebrity in the 1930s, capitalized on this new image of Mormon individuality and wholesomeness. The “Mormon Boy” credited his clean, church-approved lifestyle for his success. On his car, the Mormon Meteor, Jenkins rejected alcohol and cigarette endorsements and instead brandished a sign that read, “Yes, I’m a Mormon.”

Ab Jenkins starts a 1939 test run in his race car, the Mormon Meteor III, on Utah’s Bonneville Salt Flats.
Underwood Archives/Getty Images

For several decades, other Mormon celebrities like family band The Osmonds and golfer Johnny Miller continued to shape positive public views of Mormons – hitting a high-water mark in 1977, when Gallup found that only 18% of Americans held unfavorable views.

Church efforts to influence social issues, however, such as its decades-long opposition to the Equal Rights Amendment, eventually took a toll. By 1991, public opinion of Mormons had fallen dramatically, with 37% of Americans viewing them unfavorably – and leaders decided that another rebrand was in order.

The previous year, senior leader Gordon B. Hinckley had admonished members to make the word Mormon “shine with added luster.” When he became president in 1995, Hinckley worked to reframe how the public saw Mormons, arguing on the “60 Minutes” TV show that Mormons were “not a weird people.”

The Salt Lake City Olympics in 2002 pushed Mormonism into the national spotlight, and that same year, the church launched its major website, Mormon.org, with stories and headlines liberally using the term “Mormon.” A media campaign followed a decade later, featuring prominent members declaring, “I’m a Mormon.” Ordinary members were then encouraged to upload their own “I’m a Mormon” profiles to this website and share them on their own social media accounts.

A tall building with an image of a woman figure skating projected on it, next to church steeples and a snow-covered mountain.
The Latter-day Saints temple in downtown Salt Lake City, center, as an Olympic banner drapes the church office building next door during the 2002 Games.
George Frey/AFP via Getty Images

Mitt Romney’s Republican nomination for the 2012 presidential election and the popularity of the satirical “Book of Mormon” musical pushed Mormons again into the national spotlight. In 2014, the church produced a documentary titled “Meet the Mormons,” shown in theaters across the U.S., which apostle Jeffrey R. Holland explained was meant to “show people what we’re really like.”

In 2017, a Pew Research Center survey’s “feeling thermometer” found public opinion of Mormons to have risen to the “somewhat warmer” rating of 54, a 6-point increase from 2014.

‘More good’?

That said, the church’s relationship to the word Mormon has always been complex. As far back as 1990, Nelson was already warning fellow Latter-day Saints that Mormon was “not an appropriate alternative” for the church’s full name. During the 2002 Olympics, the church advised media that the nickname was acceptable for individuals but not to refer to the institution itself.

Overall, I would argue the church has used the word Mormon to improve public opinion for more than a century. Part of this branding downplayed popular fears about the church and its influence – allowing outsiders to develop favorable views toward Mormons, even if they disliked the institution itself.

In March 2023, a Pew Research poll reported a low point in public opinion of Mormons, falling for the first time below every other measured group. A quarter of Americans held “unfavorable views of Mormons,” while only 15% held “favorable” ones.

A month later, Nelson pleaded with members to be peacemakers, lamenting that “many people seem to believe that it is completely acceptable to condemn, malign and vilify anyone who does not agree with them.” Nationally, intense polarization and violence have continued since then – including a horrific attack in Michigan on a Latter-day Saints church building on Sept. 28, 2025, just one day after Nelson’s death.

“Mormon” has been an important term in engaging those outside the faith, particularly in countering negative perceptions. Whether or not the word disappears, what may matter more for Nelson’s legacy is whether people outside the church associate it with “more good” – both institutionally and individually.

This is an updated version of an article originally published on Sept. 5, 2024.

The Conversation

Konden Smith Hansen does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Russell M. Nelson, president of The Church of Jesus Christ of Latter-day Saints, pushed it away from ‘Mormon’ – a word that has courted controversy for 200 years – https://theconversation.com/russell-m-nelson-president-of-the-church-of-jesus-christ-of-latter-day-saints-pushed-it-away-from-mormon-a-word-that-has-courted-controversy-for-200-years-266229

Late-night TV in the US has a storied history of political commentary and presidential engagement

Source: The Conversation – UK – By Faye Davies, Senior Lecturer in Media and Cultural Theory, Birmingham City University

Earlier this month, it looked as if late-night talk show host Jimmy Kimmel had lost his job after his network, ABC, pulled his show over controversial comments he made about the death of Charlie Kirk. But within a week he was back, and his show Jimmy Kimmel Live! gained its highest ratings in more than a decade.

It was Kimmel’s first show back on the air after ABC lifted his suspension in response to public pressure. Kimmel had prompted outrage, including from the US president, Donald Trump, and his Maga supporters, after he accused what he called “the Maga gang” of attempting to capitalise on Kirk’s murder.

ABC’s decision to pull Kimmel off the air gained global attention. Trump celebrated on his Truth Social platform, citing what he said were Kimmel’s poor ratings and lack of talent. But Kimmel’s fans – and supporters of free speech in the US and beyond – cancelled their subscriptions to Disney, ABC’s owner.

Disney relented and ABC reinstated Kimmel. But the episode – as well as comments from Trump that networks whose shows were opposed to him should “maybe” have their licences “taken away” – has raised fears and prompted questions about free speech, state intervention and censorship in the US.

Late-night shows have been a cornerstone of the American media landscape since the late 1940s. They typically air after the evening news and their hosts, usually comedians, tend to open with a monologue which takes in and provides a humorous commentary on the news.




Read more:
Jimmy Kimmel is back, but how much longer will late-night comedy last?


The Tonight Show’s host Johnny Carson popularised the witty opening monologue in the 1960s. Late-night political satire in the US has tended to home in on scandalous and controversial decisions, with a distinct focus on the personalities and actions of prominent public figures. Many previous presidents have been targeted, but they haven’t shied away from engaging with the format. Both Richard Nixon and John F. Kennedy appeared on The Tonight Show in the 1960s, as did Ronald Reagan in the mid-1970s. Bill Clinton appeared on The Arsenio Hall Show as a saxophone-playing presidential hopeful in 1992.

David Letterman hosted George W. Bush in 2000. Barack Obama appeared on Saturday Night Live as a candidate before becoming the first sitting president to join late-night host Jay Leno in 2009. Even Donald Trump hosted the satirical sketch show Saturday Night Live in 2004, and again as a candidate in 2015.

It’s a powerful medium that reaches diverse audiences and, in some instances, can sway opinion. Research has found that Carson’s coverage influenced public opinion during the Watergate scandal that engulfed then-sitting president Nixon.

Political satire tends to be focused on comic metaphors and embellishment – and so not all presidents make for good jokes. For instance, Obama didn’t provide enough scandal for content.

But the twice-impeached Trump has offered endless fodder for late-night political satire. Hosts jumped on his suggestion that injecting disinfectant might be able to treat COVID-19. They found much to prod at in the Stormy Daniels scandal.

That was during his last presidency, however. This time round he seems less open to the jokes.




Read more:
New York Times v Sullivan: the 60-year old Supreme Court judgment that press freedom depends on in Trump era


Feeling the heat

Speaking soon after Kimmel made his comments, the government official responsible for licensing ABC’s local stations publicly pressured the company to punish Kimmel. Speaking on right-wing podcaster Benny Johnson’s show, the chair of the Federal Communications Commission (FCC), Brendan Carr, said: “These companies can find ways to change conduct and take actions on Kimmel, or there’s going to be additional work for the FCC ahead.” It was a clear warning that action restricting content appeared to be looming.

Disney and ABC were clearly panicked and Kimmel was pulled off air.

After Kimmel’s suspension, the world of late night rallied around him. Seth Meyers said on his show, Late Night with Seth Meyers, that the situation “has experts worried that we are rapidly devolving into repressive autocracy in the style of Russia or Hungary”. Stephen Colbert, host of The Late Show – which will be discontinued in 2026 – maintained he stood with Kimmel, warning that “with an autocrat, you cannot give an inch”. He called ABC “naive” for pulling Kimmel off the air.

Even former US president Barack Obama spoke up, claiming that muzzling reporters and commentators was dangerous government coercion.

As the clampdown on late-night shows develops, Kimmel and Colbert’s situations raise significant questions about free speech and the scope of political satire in “the land of the free”.




Read more:
The First Amendment: what it really means for free speech and why Donald Trump is trampling on it


Kimmel: contrite yet defiant

After his cancellation was reversed, Kimmel returned with an emotional and defiant 28-minute monologue. He appeared visibly moved when making it clear that: “It was never my intention to make light of the murder of a young man. I don’t think there’s anything funny about it.” Kimmel emphasised that he wasn’t laying the blame for Kirk’s death on any political side – and had been trying to achieve precisely the opposite.

Central to Kimmel’s return was his strong reaffirmation of satire’s role in American political discourse with a nod to all sides of the political spectrum: “I want to thank the people who don’t support my show and what I believe, but support my right to share those beliefs anyway.”

And while Trump appears to be doubling down on his threats, the backlash and the resulting debate over free speech, cancel culture and social media look set to keep the late-night genre part of US primetime, for now.


The Conversation

Faye Davies does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Late-night TV in the US has a storied history of political commentary and presidential engagement – https://theconversation.com/late-night-tv-in-the-us-has-a-storied-history-of-political-commentary-and-presidential-engagement-266087

The UK must invest in medicines – but not at any price

Source: The Conversation – UK – By Catia Nicodemo, Professor of Health Economics, Brunel University of London

Cryptographer/Shutterstock

The UK’s science minister, Sir Patrick Vallance, has sounded the alarm over the country’s declining investment in medicines. He warned that the NHS risks losing out on important treatments and the country could lose its place at the cutting edge of medical research if spending does not recover. It comes at a sensitive time – this year drugmakers including Merck and AstraZeneca have backtracked on plans to invest in the UK.

Vallance is correct that there is a need to encourage pharmaceutical firms to keep investing and launching new medicines in the UK. On the other hand, there is a need to protect public funds from being wasted on treatments that do not offer enough benefit for their cost.

At the moment, just 9% of NHS healthcare spending goes on medicines. This is less than in Spain (18%), Germany (17%) and France (15%). At a time when some experts believe the UK is getting sicker, this might come as a surprise.

But the UK is unusual among major health systems in how carefully it regulates drug spending. The National Institute for Health and Care Excellence (Nice) has, since its creation, judged new treatments not only on clinical evidence but on cost-effectiveness.

That means asking whether a drug’s health benefits – measured in quality-adjusted life years (QALYs) – justify its price compared with existing care. For most treatments the threshold is about £20,000 to £30,000 per QALY. This is not a perfect measure, but it gives the NHS a consistent way of deciding whether the health gained is worth the money spent.
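To see the arithmetic behind that threshold, here is a minimal sketch of the cost-per-QALY test. All figures are hypothetical illustrations, not drawn from any real Nice appraisal:

```python
def icer(extra_cost, extra_qalys):
    """Incremental cost-effectiveness ratio: extra pounds per QALY gained,
    comparing a new drug with existing care."""
    return extra_cost / extra_qalys

# Suppose a new drug costs £12,000 more per patient than existing care
# and delivers 0.5 additional quality-adjusted life years.
ratio = icer(12_000, 0.5)
print(ratio)  # 24000.0, i.e. £24,000 per QALY, inside the £20,000-£30,000 band
```

A drug whose ratio falls above the band would normally need a price discount, or stronger evidence of benefit, to be recommended.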

The value of this approach is clear. Nice’s record shows that medicines that pass its tests have added millions of QALYs to patients in England, while also preventing waste on drugs that bring only marginal improvements at high cost.

A study published earlier this year in the medical journal The Lancet found that many of the new medicines recommended by Nice between 2000 and 2020 brought substantial benefit to patients. But it also noted that some high-cost drugs deliver much less health gain than investments in prevention or early diagnosis could.

The study emphasises that maintaining rigorous thresholds around cost-effectiveness ensures that public funds go to treatments that really improve lives. In other words, the discipline of cost-effectiveness has protected the public purse while ensuring access to genuine innovations.

This regulatory strength is reinforced by national pricing schemes for branded medicines. These cap overall growth in the NHS drugs bill and require companies to pay rebates if spending rises too fast. In practice, this means that if total spending on branded medicines exceeds an agreed annual limit, pharmaceutical companies must pay back a percentage of their sales revenue to the Department of Health.

In recent years that rebate rate has been as high as 20–26% of sales, effectively lowering the price the NHS pays. This is made possible by the buying power of the health service.
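To make the mechanics concrete, here is a simplified sketch of the rebate logic described above. The flat-rate structure and every figure are illustrative assumptions; the real pricing schemes are considerably more detailed:

```python
def rebate_due(total_sales, agreed_cap, rebate_rate):
    """Rebate owed to the Department of Health when total branded-medicine
    sales exceed the agreed annual cap; nothing is owed below the cap."""
    if total_sales <= agreed_cap:
        return 0.0
    return total_sales * rebate_rate

# Hypothetical year: £14bn of branded sales against a £12bn cap,
# with a 23% rebate rate applied to sales revenue.
print(rebate_due(14e9, 12e9, 0.23))  # roughly £3.2bn repaid
```

In effect, the rebate lowers the net price the NHS pays even when list prices stay unchanged.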

Together with Nice’s appraisals, these measures have helped the NHS maintain relatively low medicines spending compared with many countries. At the same time, it still secures access to major advances in cancer therapy, immunology and rare disease treatment.

For a publicly funded service under constant financial strain, these protections are vital. Despite the pressure on its budget, the NHS has secured meaningful access to new therapies. For example, by March 2024, nearly 100,000 patients in England – many of whom would otherwise face long delays or rejection – had benefited from early access via the Cancer Drugs Fund to more than 100 drugs across 250 conditions.

The balance with Big Pharma

However, strict controls on price and access can have unintended consequences. If companies see the UK as a low-return market, they may choose to launch new drugs elsewhere first, or to limit investment in research and early trials here.

There is a danger that patients could face delays in receiving new treatments. Or the scientific ecosystem, which relies on steady collaboration with industry, could weaken.

Still, the answer is not to abandon cost-effectiveness. Without it, the NHS would risk paying high prices for small gains. This would divert money from staff, diagnostics or prevention – areas that often bring more health benefit per pound spent.

An NHS mobile screening hut.
Cost-effective spending on medicines can leave more money available for preventative and screening measures.
Marmalade Photos/Shutterstock

In such cases, raising thresholds or relaxing scrutiny would do more harm than good. Cost-effectiveness is not just about saving money. It is about fairness, ensuring that treatments funded genuinely improve lives relative to their cost.

The challenge, then, is balance. The UK should continue to hold firm on value for money, while finding ways to encourage investment. That might mean improving the speed and clarity of Nice processes, so that companies know where they stand earlier and patients can access good drugs more quickly.

It could involve reviewing thresholds periodically to account for inflation and medical progress, without undermining the principle that treatments must show sufficient benefit. And it certainly means supporting research and development through stable partnerships with universities, tax incentives and grants.

What should not be underestimated is the UK’s scientific strength. The country remains home to world-class universities, skilled researchers and an innovative biotech sector. The rapid development of the Oxford–AstraZeneca COVID vaccine showed what UK science can deliver at scale and speed.




Read more:
The UK’s speedy COVID-19 vaccine rollout: surprise success or planned perfection?


Pharmaceutical companies know this, and many – including AstraZeneca, GSK, Novo Nordisk, Pfizer, Johnson & Johnson and most recently Moderna – continue to invest in British labs and trials because of the talent and infrastructure. Danish firm Novo Nordisk has strengthened its ties with the University of Oxford, committing £18.5 million to fund 20 postdoctoral fellowships as part of its flagship research partnership.

The UK’s approach to assessing value has won respect internationally. That discipline must be preserved. Reversing the decline in investment means creating a predictable, transparent environment for industry while maintaining the protections that safeguard patients and taxpayers alike. If done well, the UK can continue to be both a responsible buyer of medicines and a world leader in science.

The Conversation

Catia Nicodemo is affiliated with the University of Oxford.

ref. The UK must invest in medicines – but not at any price – https://theconversation.com/the-uk-must-invest-in-medicines-but-not-at-any-price-266016

Four ways virtual reality can help communities heal after disasters

Source: The Conversation – UK – By Paola Di Giuseppantonio Di Franco, Associate Professor School of Philosophy and Art History, University of Essex

When natural disasters strike, they shatter lives, disrupt routines and loosen the emotional ties people have with the places they call home. For the Italian towns of Amatrice and Accumoli, devastated by a magnitude 6.2 earthquake in 2016, the damage extended far beyond bricks and mortar. Streets vanished. Landmarks were reduced to rubble. The past seemed to disappear while the future became very uncertain.

But what if technology could offer a way to reconnect with what was lost and reflect on the future of the place?

Recent research my colleagues and I conducted explores how virtual reality (VR) can help communities recover emotionally, socially and culturally after a disaster.

Working with members of the communities affected, we created immersive digital environments of their towns as they existed before the earthquake. The results revealed how VR can support healing in ways no blueprint or rebuild ever could.

Here are four ways virtual reality might help communities heal after catastrophic events strike.

1. It offers a space to grieve and remember

For many participants, the VR reconstructions were emotionally powerful experiences – one even described them as “cemeteries of place”. Stepping into a virtual version of their hometown allowed them to reconnect with deeply personal memories: the sound of a church bell, the feel of sitting on a bench while having a gelato, the view from a childhood window.

Grief in post-disaster settings isn’t just about lost lives – it’s also about the erasure of everyday spaces where people worked, gathered, played, laughed and simply lived. One resident of Amatrice told us she didn’t have the courage to drive through the town any more because the destruction was too painful to witness.

In VR, however, she was able to revisit the square where she used to sit with her family and eat ice cream. For some, this triggered sadness, but also joy, a sense of lightness – and a sense of reconnection.

2. It helps people reclaim a lost ‘sense of place’

Disasters often leave communities displaced, physically and emotionally. Familiar surroundings become unrecognisable. For residents of Amatrice and Accumoli, whose historic centres are still inaccessible or remain destroyed after nine years, daily routines and social interactions have been disrupted and must be reconstituted.

By recreating these spaces in VR, we saw how people could begin to reclaim their sense of place. The reconstructions included not just major landmarks, but also small, meaningful details, such as plastic chairs outside cafes, flowerpots on balconies, even the chatter of people in a square on a summer evening. These touches matter. They help make the virtual towns feel alive, bringing back the heritage of the everyday of these communities.

One participant said that being in the VR environment felt like “going to the living room” again, a phrase some locals once used for their evening strolls in the town square.

3. It supports intergenerational memory-sharing

Many of the younger participants in our project were children, or not yet born, when the earthquake struck. Their memories of the towns are fragmented or absent. VR gave them a way to see and understand, through their elders’ eyes, what their parents and grandparents remember: to ask questions, point to places and listen to stories.

In practice, the experience became a shared one. While one person wore the VR headset, others gathered around a laptop to observe, comment and remember. One teenager asked her mother to help find the window of her old bedroom. Another participant’s son, born two years after the earthquake, “saw” pre-quake Amatrice for the first time through VR and through his father’s narration.

These moments turned the technology into a tool for storytelling, for keeping cultural memory alive between generations.

4. It creates inclusive, community-led recovery tools

Much disaster recovery is led by top-down planning, meaning engineers, architects and bureaucrats make the decisions about what to rebuild and how. But VR offers an opportunity to include community voices from the start.

Our project used a “techno-ethnographic” approach, where residents didn’t just observe but shaped the reconstructions. We asked: what should we include? What matters to you? They pointed out favourite cafes, benches, trees and missing features. They even debated how many clocks were on the civic tower, as they could not remember.

This collaborative process gave residents a sense of agency over how their towns and their memories were represented. It also reminded us that authenticity isn’t about perfect realism. It’s about emotional truth: the way a place feels, not just how it looks.

Technology and emotional healing

Virtual reality can’t replace what’s been lost. It can’t rebuild trust, revive livelihoods or resolve trauma. But our research shows it can offer emotional healing: a space where people can mourn, reflect, reconnect and share.

It also shows that technology must be handled with care. In early versions of our VR environments, we found that some participants became distressed or disoriented, especially when scenes depicted post-earthquake ruins of the town in nighttime settings. This taught us the importance of trauma-sensitive design: allowing users to adjust lighting, control their experience, or even just step away when needed.

Ultimately, VR is not a fix but it can be a powerful complement to the long, human work of rebuilding after disaster. When designed with communities, for communities, it can help restore more than heritage. It can help restore belonging.

The Conversation

Paola Di Giuseppantonio Di Franco receives funding from UKRI through a Future Leaders Fellowship, Round 6

ref. Four ways virtual reality can help communities heal after disasters – https://theconversation.com/four-ways-virtual-reality-can-help-communities-heal-after-disasters-263479

Waiting isn’t a bad thing — it can actually boost your wellbeing

Source: The Conversation – UK – By Ayse Burcin Baskurt, Senior Lecturer, Applied Positive Psychology, University of East London

Don’t dread those moments where you have to wait – see them instead as an opportunity. Maria Markevich/ Shutterstock

Waiting can be boring, which is why we typically do anything we can to avoid it. We fill moments where we have to wait with something to keep our minds busy – such as scrolling on social media, reading the news or listening to a podcast.

But waiting isn’t always bad. Research shows that it can be beneficial as it improves self-control – an ability important for many social, cognitive and mental health outcomes.

Self-control refers to a person’s ability to regulate their thoughts, emotions and behaviour when long-term goals conflict with short-term temptations.

Self-control has broad importance – whether that’s in school or the workplace – because of its implications for learning, decision-making, performance, social relationships and wellbeing. The skill is key in resisting temptation in these settings.

Our ability to wait is a key way self-control is put to the test.

A frustrated man stares at his laptop. He cradles his head in his fists.
Don’t act on impulse – waiting can help us put space between our whims and exert self-control.
Olena Yakobchuk/ Shutterstock

This might include pausing for a moment before writing a response to an email that has annoyed us. Or maybe it’s resisting the temptation of an unhealthy food when you’re trying to eat healthier. Both of these are examples of exerting self-control and creating space between impulse and action.

Research shows that even short delays or pauses – such as ordering food ahead of time or waiting before making a purchase – can cool off impulses and help us prioritise long-term goals.

Despite the attention given to self-control in different fields of psychological research, waiting as a standalone construct has not received as much attention. Still, what research there is on the topic shows us that waiting can have similar benefits.

For instance, research has looked at what effect silence has in coaching conversations – with silence acting as a form of waiting. When the person who has been asked a question pauses before answering, it gives them the space to process their thoughts. This can help them better understand how they’re feeling, uncover memories or even shed light on things that are confusing them. In this way, silence serves a distinct purpose in communication – be it a pause for better listening, a defence or a chance for reflection.

Moments of waiting can create space for reflection. Having the opportunity to reflect on our actions, emotions and experiences can spark ideas, deeper focus and creativity.

There are many personal and cultural differences in how we perceive time spent waiting. Waiting can also be uncomfortable or frustrating for minds that crave stimulation. And in some cultures it can be framed as passive or inefficient – while in others, waiting is deemed powerful and transformative.

These differences mean that waiting can be perceived and practised differently – and so benefits will appear in different forms.

The value of waiting

To reap the benefits that can come from learning self-control, resisting urges and appreciating the moments when we’re waiting, we need to recognise the value of waiting.

Here are some evidence-based tips from positive psychology for practising it more intentionally for our own wellbeing:

1. Savouring

Have you ever bought a ticket for an event and ended up enjoying the anticipation more than the event itself? Or felt the excitement of counting down to a summer holiday with friends?

When we anticipate something exciting, part of the joy lies in the wait itself. Research shows that savouring what we look forward to helps us prolong pleasure.

Every time we think about it, we get small bursts of joy. Visualising the concert, the trip or any event that you long for makes waiting less of an obstacle and more of an extension of the experience.

2. Gratitude

There are many moments in life where we have no option but to wait – for instance, while waiting to hear from your doctor about test results. But these moments can also give us an opportunity to feel gratitude.

Pausing to reflect on what you’re grateful for can make waiting less about the frustration or worry you’re feeling and more about appreciation.

3. Meaning making

Instead of seeing waiting as an inconvenience, try re-framing the way you think about it.

The next time you’re stuck in traffic or standing in a long line, re-frame the moment as a chance to rest, pause or reflect. Changing how you think about the situation can change the experience.

When we connect waiting to a sense of purpose, waiting gains direction and meaning.

4. Mindfulness

Irritable waiting moments can be cues to practise mindfulness. Mindfulness involves paying full attention to the present moment, and looking at it with curiosity and acceptance.

Intentionally noticing what’s going on in you and around you can turn an annoying circumstance into a mini check-in and a chance to recharge. This small practice may even help to improve your wellbeing by helping you to relax and regulate emotions.

This all isn’t to suggest you should find more opportunities to sit around and wait. Rather, it’s about seeing value in the moments where we do have to wait – and about intentionally making these moments more manageable and fun.

The Conversation

Ayse Burcin Baskurt does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Waiting isn’t a bad thing — it can actually boost your wellbeing – https://theconversation.com/waiting-isnt-a-bad-thing-it-can-actually-boost-your-wellbeing-265122

Trump’s Gaza peace plan: A bit of the old, a bit of the new – and the same stumbling blocks

Source: The Conversation – Global Perspectives – By Asher Kaufman, Professor of History and Peace Studies, University of Notre Dame

U.S. President Donald Trump and Israeli Prime Minister Benjamin Netanyahu arrive for a joint news conference at the White House on Sept. 29, 2025. Alex Wong/Getty Images

The latest U.S.-sponsored peace plan for the Middle East was unveiled at the White House on Sept. 29, 2025, and immediately accepted by Israeli Prime Minister Benjamin Netanyahu.

The proposal, which U.S. President Donald Trump said marked a “historic” moment that was “very close” to ending the two-year-old war in Gaza, will now go to Hamas. The Palestinian group said it was reviewing the document, having had it delivered by Egyptian and Qatari mediators.

Should it be accepted, hostilities would end “immediately,” according to the plan. But given that all previous U.S.-backed attempts have to date failed, there is reason for skepticism. The Conversation turned to Asher Kaufman, an expert on the modern Middle East and professor of peace studies at the University of Notre Dame, to explain what is different about this plan – and how it might fare.

What are the main points of the new plan?

The plan outlined by Trump in the presence of Netanyahu consists of 20 points.

If accepted by Israel and Hamas, it would see the full withdrawal of Israel Defense Forces from the Gaza Strip in three stages.

The first stage would be dependent on the release of the remaining 48 hostages taken during the Oct. 7, 2023, attack in Israel by Hamas and Palestinian Islamic Jihad, 20 of whom are believed to be alive. At the same time, Israel would release 250 Palestinians serving life in prison, as well as 1,700 Gazans arrested after Oct. 7.

This stage would also see humanitarian aid flow immediately to the desperate population in Gaza.

Stage two would see Gaza governed by a temporary transitional body consisting of a technocratic, apolitical committee composed of Palestinians and international members.

The committee would be overseen by a “board of peace” headed by Trump and other heads of state, including former U.K. Prime Minister Tony Blair. This board would also oversee the reconstruction of the Gaza Strip and its economic development.

Hamas’ members would be given amnesty if they laid down their arms, but would also have to agree – along with other members of militant Palestinian factions – to not have any role in the governance of Gaza.

A new military body to be called the International Stabilization Force would be established and deployed in the Gaza Strip. The plan calls for it to be composed of Arab and international partners.

Only then would the Israeli military withdraw completely from Gaza, at which point the post-war Gaza plan would turn to economic redevelopment.

How does this differ from past US-backed plans?

The portions of the plan that include Israeli withdrawal, the release of hostages in exchange for Palestinian prisoners, and the provision of mass humanitarian aid to Gaza are similar to past agreements, including the last one that collapsed after Israel violated its terms in March 2025.

But there are new parts. These include the creation of the board of peace and the International Stabilization Force.

The former gives concrete structure to Trump’s older ideas to develop the Gaza Strip as a real estate venture; the latter provides a framework for an international military force that would police the strip for the foreseeable future.

The plan also mentions a long-term horizon for self-determination and the establishment of a Palestinian state – a point not raised in previous proposals, which mainly focused on ending the war in Gaza but neglected to include a longer-term pathway to statehood.

What would postwar Gaza look like under this plan?

Trump sees the Gaza Strip as a real estate development opportunity – he has said as much in the past, and again talked on Sept. 29 of the opportunities of the coastline of Gaza.

Smoke rises from the area targeted by Israeli forces in Gaza City, Gaza, on Sept. 27, 2025.
Khames Alrefi/Anadolu via Getty Images

As such, his “vision of peace” is designed mainly through an economic-development lens.

The plan envisions a reconstructed strip supported principally by regional players that could stabilize the region, providing humanitarian relief to Gazans in the short term and economic opportunities in the long term.

The Trump administration and Israel hope to have not only a Hamas-free Gaza but an entirely depoliticized Gazan population.

With no role for Hamas, who will represent Palestinians in Gaza?

It is not clear from the plan who will represent Palestinians. But reading between the lines, one can see the possibility of a revamped version of the Palestinian Authority – the body that nominally governs parts of the West Bank – taking the role of “Palestinian technocrats.” Point nine of the plan suggests that the Palestinian Authority could have a role in Gaza’s future once it “has completed its reform program,” but it does not say what this reform program entails.

The plan also suggests that Palestinian police forces would be trained and supervised by the International Stabilization Force and stationed in the Gaza Strip. That hints at the possibility that the Palestinian Authority’s police – which has long been accused by Palestinians of working in concert with the Israelis to provide security in the West Bank – could take up this role.

Netanyahu has long resisted considering the Palestinian Authority as a viable body to govern Gaza in the “day after” the war.

So if this plan goes into effect, the question of who makes up the Palestinian technocratic administration might certainly be one of the main stumbling blocks.

What are the chances of the plan being accepted?

There are two main barriers.

In Israel, Netanyahu will need to get the approval of the far-right members of his government, who in the past have resisted anything short of a continuation of the war and the final takeover of the Gaza Strip by Israel. Netanyahu knows that his political future is dependent on keeping far-right members of his coalition on board – and that dynamic has undone past pushes for the end of the war.

For Hamas, if this agreement is realized, it would mean the end of its military and political presence in the Gaza Strip.

As such, the political and militant body – which has governed the territory since June 2007 – will need to be in a desperate situation to accept the terms. Or perhaps Hamas may finally be attuned to the desperate plight of Gazans and respond to it.

The plan, as worded, gives Hamas little to hold on to as an achievement after it sparked two years of war on Oct. 7, 2023, with unbearable sacrifices for Palestinians.

It is not far-fetched to think that Netanyahu is supporting Trump’s plan knowing that the chances of its realization are very slim. In the last two years, Netanyahu has demonstrated that he is mainly motivated by his own political survival, and he will not take any step that would jeopardize it.

By accepting the plan, he demonstrates his alliance with the American president. It could also win Netanyahu valuable political capital in Israel, allowing him to present himself as willing to end the war, safe in the knowledge that it will likely be rejected by Hamas.

Given that the plan has no concrete timeline, particularly in relation to Israel’s staged withdrawal, it also buys him valuable political time. It could allow Netanyahu to place himself in a better position domestically, with national elections scheduled for October 2026. If Netanyahu sees public opinion shift in his favor, he could even move forward with early elections, as he often did in the past, to capitalize on the moment.

The Conversation

Asher Kaufman does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Trump’s Gaza peace plan: A bit of the old, a bit of the new – and the same stumbling blocks – https://theconversation.com/trumps-gaza-peace-plan-a-bit-of-the-old-a-bit-of-the-new-and-the-same-stumbling-blocks-266341

Air temperatures over Antarctica have soared 35°C above average. What does this unusual event mean for Australia?

Source: The Conversation – Global Perspectives – By Martin Jucker, Senior Lecturer in Atmospheric Science, Climate Change Research Centre, UNSW Sydney

Jeremy Stewardson/Getty

Right now, cold air high above Antarctica is up to 35°C warmer than normal. Normally, strong winds and the lack of sun would keep the temperature at around –55°C. But it’s risen sharply to around –20°C.

The sudden heating began in early September and is still taking place. Three separate pulses of heat have each pushed temperatures up by 25°C or more: temperatures spiked, fell back, then spiked again.

It looks as if an unusual event known as sudden stratospheric warming is taking place – the unexpected warming of the stratosphere, 12 to 40 kilometres above ground.

In the middle of an Antarctic winter, this atmospheric layer is normally exceptionally cold, averaging around –80°C. By the end of September it would be roughly –50°C. This month, atmospheric waves carrying heat from the surface have pushed up into this layer.

In the Northern Hemisphere, these events are very common, occurring once every two years. But in the south, sudden large-scale warming was long thought to be extremely rare. My research has shown these southern events are more common than expected if we group the very strong 2002 event with slightly weaker events such as those in 2019 and 2024.

Sudden warming may sound ominous. But weather is messy. Many factors play into what happens down where we live.

A drier, warmer spring and summer for southeastern Australia usually follow these warming events. But at present, forecasters are predicting warmer than usual temperatures across Australia alongside a wetter spring in the east.

This graph shows the air temperature 30 kilometres above the South Pole. The normal seasonal cycle of temperature is in light gray, while the black line shows actual temperatures this year. Stratospheric warming first occurred on 5 September, followed by a second pulse around 14 September and the strongest warming so far peaking on 27 September.
Martin Jucker/Japan Meteorological Agency

What’s happening in the skies over Antarctica?

High above both the Arctic and Antarctic is a large area of rotating winds called the stratospheric polar vortex. Sudden stratospheric warming events are, by definition, disruptions of these two systems.

Over Antarctica, these events are usually detected about 30 kilometres above the Southern Ocean, just to the north of Antarctica’s coastline.

The Antarctic winter runs from March to October. During this period, the continent and the atmosphere above it are dark and very cold, as the sun doesn’t rise until September.

The polar vortex traps intensely cold air and keeps it isolated from the warmer air at lower latitudes. But every now and then, this can change.

Just like the ocean, the atmosphere has waves. What’s happening at present is that large-scale atmospheric waves have spread from the surface up into the stratosphere above Antarctica, bringing heat energy with them. As these waves interact with the strong winds of the vortex, they transfer this heat.

This is only possible during the Antarctic winter, as the polar winds are only strong during these months.

While these events are called “sudden”, they’re not sudden in the sense we would commonly use the word: the warming takes place over days or weeks. But they are sudden in the sense that they’re often unexpected, as they are difficult to predict.

Temperatures have spiked in the stratosphere over Antarctica this month. This figure shows the temperature anomaly from September 12 to 21.
NOAA, CC BY-NC-ND

What does this mean for us?

What happens in Antarctica doesn’t stay in Antarctica. When a sudden warming event arrives, it can have flow-on effects for the weather.

We would usually expect southeastern Australia to be drier and warmer after sudden stratospheric warming above Antarctica.

In 2019, sudden warming over Antarctica led to drier conditions in Australia. Research has shown this influenced the megafires over the Black Summer of 2019–2020. These events can create prime conditions for bushfires.

The opposite is also true: If the polar stratosphere is even colder than usual, we expect wetter and cooler conditions over southeastern Australia.

For instance, over the 2023-24 spring and summer, forecasters predicted a dry spell driven by an El Niño event in the Pacific. But this didn’t happen. Instead, the very cold polar stratosphere produced a rather cool and wet summer.

There’s another effect, too. When the stratosphere is warmer, less ozone is destroyed in the ozone layer and more ozone is carried from the equator towards the poles.

That’s good for humans, as it means more dangerous ultraviolet rays are blocked from reaching the ground. But changing ozone levels can also contribute to the arrival of unexpected weather systems caused by a warmer stratosphere.

Sudden stratospheric warming in 2019 influenced Australia’s Black Summer megafires. Pictured: a firefighter fighting a blaze near Nowra in New South Wales.
Saeed Khan/Getty

How often does this happen?

Media coverage has suggested these events are rare. But that isn’t entirely correct.

These events were first discovered in the Northern Hemisphere, where they happen roughly every second year.

But the northern polar stratosphere is warmer and has weaker winds. This means it’s easier for atmospheric waves to disturb the vortex. In the Northern Hemisphere, sudden stratospheric warming is defined as a complete disappearance of the polar vortex.

When the same definition is used for the Southern Hemisphere, only the 2002 event would meet the criteria in our entire observational record. That’s because the intense stratospheric winds of up to 300 km/h over Antarctica are extremely difficult for atmospheric waves to penetrate.

Using this narrow definition, these events in Antarctica are estimated to happen about once every 60 years – and are expected to become even rarer.

But if we define these southern events more broadly, as a weakening of the polar vortex that produces sudden warming, they become more common. Using this definition, we estimated that events like the one in 2019 occur about once every 22 years.

At present, I am leading international work to find better ways of detecting these events in the Southern Hemisphere.

What will this event lead to?

Forecasting chaotic systems such as the weather is a hard job. The sudden warming of the stratosphere over Antarctica will have some influence over spring and summer weather in Australia and New Zealand. But the stratosphere is just one factor among many in shaping the weather as we experience it.

At present, the Australian Bureau of Meteorology is forecasting a warmer spring overall, and a wetter one in the southeast. This is because the sudden warming event is happening at the same time as ocean temperatures remain very warm, and hotter oceans lead to more evaporation and thus more rain.

But this could still change. Not all sudden stratospheric warming events end up influencing the weather near the surface. It’s worth keeping an eye on the seasonal forecasts this summer.

The Conversation

Martin Jucker receives funding from the NSW Bushfire and Natural Hazards Research Centre.

ref. Air temperatures over Antarctica have soared 35ºC above average. What does this unusual event mean for Australia? – https://theconversation.com/air-temperatures-over-antarctica-have-soared-35-c-above-average-what-does-this-unusual-event-mean-for-australia-265079