Decades of hostility between Iran and the US were preceded by a little-remembered century-long friendship

Source: The Conversation – USA – By Daniel Thomas Potts, Professor of Ancient Near Eastern Archaeology and History, New York University

The ouster of Prime Minister Mohammad Mosaddegh marked a turning point in U.S.-Iran relations. AP Photo

The British- and American-backed plot to overthrow Iran’s prime minister in 1953 laid the groundwork for the 1979 Iran hostage crisis and decades of hostility with the U.S. that have now culminated in a war launched on Iran by the U.S. and Israel.

Many Americans know only the anger and tension with Iran that grew from those roots set down during the middle of the last century. But as an archaeologist who has specialized in Iran for over 50 years, and whose research traces Iranian history through the changes its nomadic population has undergone over time, I believe it is worth recalling the time when the two countries had a distinctly different relationship.

In the 1800s, American missionaries journeyed to what was then called Persia.

The missionaries helped build important institutions – schools, colleges, hospitals and medical schools – in Persia, many of which still exist.

Dr. Joseph Plumb Cochran, an American physician fluent in Persian, Turkish, Kurdish and Assyrian, founded a hospital in Urmia in 1879, as well as Iran’s first medical school. When Cochran died at Urmia in northwestern Iran in 1905, over 10,000 people attended his funeral.

This image clashes with most American stereotypes of Iran and its people, and is at odds with decades of anti-Iranian sentiment emanating from Washington.

Iran and the United States, in fact, have a deep history of mutual respect and friendship.

From 1834, when the first Protestant American mission was established in Urmia, until 1953, when the CIA’s involvement in Iran’s internal affairs set the United States on the road to conflict with Tehran, Americans were the good guys.

Joseph Plumb Cochran in his medical college at Urmia.
Wikipedia

Imperial bad guys

For years, Americans have seen images of Iranians shouting “Death to America.” President Donald Trump returned the sentiment during his first term, vowing to bring Iran death and destruction. And on Feb. 28, 2026, after weeks of threats and military preparation, the U.S. and Israel attacked Iran, killing Supreme Leader Ali Khamenei; that war continues to this day.

But before all that happened, when Americans were the good guys, other countries played the role of manipulators, exerting undue influence over Iran.

The bad guys, at whose hands Iran suffered most, were Russia and Great Britain. Those two nations – often at the invitation of Iran’s leaders – economically exploited Persia to further their own imperial ambitions, using sustained diplomatic, military and economic pressure.

After two ill-judged wars fought against Russia – the First (1804-1813) and Second Russo-Persian Wars (1826-1828) – Persia (the name Iran was officially adopted in 1935) lost large amounts of territory to the czar.

Much later, Russia found another means of exerting control over the Persian crown, loaning millions of rubles to its rulers, like Mozaffar ed-Din Shah, who reigned from 1896 to 1907 and needed capital to fund his lavish lifestyle.

With the exception of the Anglo-Persian War (1856-1857), Persian relations with Great Britain were less openly hostile. But what they lacked in martial vigor was more than compensated for by economic exploitation.

Toward the end of the 19th century, the shah granted exclusive concessions to the British for everything from telegraph lines to tobacco. Rights to Iran’s oil were given to the Anglo-Persian (later Anglo-Iranian) Oil Company.

So assured were Britain and Russia of their control of Persia that, in 1907, they signed the infamous Anglo-Russian Convention. That agreement divided the country – unbeknownst to its Parliament, let alone its inhabitants – into Russian, British and “neutral” spheres of influence. After it became public, it provoked the outrage of ordinary Persians and the international community at large.

Cartoon from 1907 satirizing Russia and England dividing up Persia.
Punch/Pushkin House

America the good

Iran’s relations with the United States were completely different.

British and Russian imperial ambitions and involvement in Iran during the 19th and early 20th centuries left Iran dependent on, and exploited by, the governments of these two countries.

But the presence in Iran of American missionaries and, later, invited government technocrats was of an entirely different quality. These were Americans offering aid, with no expectation of official advantage for the United States government.

American Presbyterian missionary efforts in Iran began in 1834 and focused on education, with 117 schools established around Urmia by 1895. Efforts were also directed at medical and social welfare. These were nongovernmental missions. The U.S. government was conspicuous by its absence in Iran and Iranian affairs.

By the late 19th century, the Presbyterian Board of Foreign Missions had opened new stations in cities across northern Iran, from Tehran to Mashhad. American diplomatic relations with Persia were established in 1883. A decade later the American Presbyterian Hospital was founded in Tehran by John G. Wishard.

After the First World War, Presbyterian schools for both boys and girls proliferated, the most famous of which were the American College of Tehran for boys, established in 1925, and Iran Bethel School for girls.

In 1910, the Persian Parliament, aware that the country’s finances were in disarray, invited the U.S. to identify a “disinterested American expert as treasurer-general to reorganize and conduct collection and disbursement of revenue.”

Despite Russian attempts to block the initiative, W. Morgan Shuster, a distinguished career civil servant, was appointed by Persia in February 1911. He arrived in Tehran in May, bringing with him four other Americans.

The mission lasted only eight months and failed – adroitly sabotaged, unsurprisingly, by the combined efforts of British and Russian diplomats in Tehran.

American William Morgan Shuster, treasurer-general of Persia.
Wikipedia

The country’s financial situation after the First World War was still precarious. Carrying none of the colonialist baggage associated with the two European imperial powers, America was the country Iran turned to, almost as a last resort, to fix what ailed it. Riza Shah, father of the last shah, appointed an American, Arthur C. Millspaugh, as the administrator-general of the finances of Persia.

When Millspaugh arrived in Tehran in 1922, a newspaper editorial addressed him with these words: “You are the last doctor called to the death-bed of a sick person. If you fail, the patient will die. If you succeed, the patient will live.”

Despite his often testy relations with foreigners, Riza Shah acknowledged Millspaugh’s American Financial Mission was “the last hope of Persia.” The fact that the mission was far from an unqualified success does not detract from its importance. Nor did it diminish America’s image as an honest broker in Iranian eyes, in contrast to that of Russia and Great Britain.

Of course, not every Iranian-American interaction during this period was positive. Robert Imbrie, the American consul in Tehran, was brutally murdered in 1924, allegedly because a fanatical religious leader accused him of being a Baha’i and poisoning a well.

Riza Shah used the episode to crack down on dissidents and impose strict controls on public gatherings.

Students at the American Memorial School, Tabriz, 1923.
shahrefarang.com

America the bad

America’s benign image in Iran was forever shattered in 1953 when the CIA, working with Great Britain, engineered a coup against Mohammad Mosaddegh, the democratically elected prime minister who had nationalized the Anglo-Iranian Oil Company.

Even though the overthrow of Mosaddegh damaged Iranian trust in America, the number of Iranian students in the United States rose steadily in the years just prior to the Iranian revolution of 1979.

Over one-third of the approximately 100,000 Iranian students pursuing university degrees abroad in 1977 were in the U.S. By the time of the Islamic revolution two years later, that number had climbed to 51,310, making Iran by far the biggest single source of foreign students in America, with 17% of the total foreign student population. The next-largest contributor of foreign students, Nigeria, accounted for only 6%.

“Iranian students have been here for nearly a century … there are deep and abiding connections that reveal themselves when you look at the historical record,” researcher Steven Ditto, who wrote a report on Iranian students in the U.S., told The Washington Post in 2017.

The legacy of American goodwill, personal friendship and fair dealing with Iran has not been completely erased, although the war now underway may make it seem as though America’s good relationship with Iran has been lost irretrievably.

Deep friendships dating back well over a century can withstand a great deal. A reservoir of goodwill and affection may lie dormant while political storms rage. Iran and America were good friends in the past, and for good reason. I believe that Americans would do well to remember that.

This is an updated version of an article originally published on Aug. 19, 2020.

The Conversation

Daniel Thomas Potts does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Decades of hostility between Iran and the US were preceded by a little-remembered century-long friendship – https://theconversation.com/decades-of-hostility-between-iran-and-the-us-were-preceded-by-a-little-remembered-century-long-friendship-279636

Birutė Galdikas: The last of ‘Leakey’s Angels’ in primatology’s most extraordinary chapter

Source: The Conversation – USA (2) – By Mireya Mayor, Director of Exploration and Science Communication, Florida International University

Birute Galdikas carries an orangutan named Isabel in Borneo, Indonesia. The 2011 film ‘Born To Be Wild 3D’ followed her work. AP Photo/Irwin Fedriansyah

Primatologist Birutė Galdikas died on March 24, 2026, and with her passing, an era of science that began more than half a century ago in the forests of Tanzania, Rwanda and Borneo, studying humanity’s closest living relatives, is quietly coming to a close. Her death marks more than the loss of a scientist – it’s the end of one of the most extraordinary chapters in modern science.

For more than half a century, primatology had three central figures: Jane Goodall, Dian Fossey and Galdikas – often called Leakey’s Angels, after their mentor – who transformed how we understand primates and, in many ways, how we understand ourselves.

A young woman sits with orangutans playing around her in the jungle.
Birutė Galdikas, shown in 1965.
Universal Archive/Universal Images Group via Getty Images

They were sent into the field by paleoanthropologist Louis Leakey, who believed that if we understood other primates, we might better understand human evolution and human nature. It was a radical idea at the time, not only scientifically but culturally. Leakey did not send large research teams or established professors. Instead, three young women went into forests, often alone, for years at a time.

What they discovered changed science and the public imagination.

Seeing chimpanzees and apes as individuals

Before the scientists’ work, primates were often described as creatures of instinct, their behavior explained largely through simple drives for food and reproduction. After their work, people began to talk about individuals with personalities, alliances, rivalries, friendships and grief.

Goodall, Fossey and Galdikas showed that chimpanzees make tools and wage political struggles, that gorillas live in complex family groups, and that orangutans raise their young with a patience and investment that rivals that of humans. The line between humans and other primates did not disappear, but it became harder to draw cleanly.

They also changed who could be a scientist.

Three women living for years in remote forests in the 1960s and ’70s was not normal. By succeeding, they quietly expanded the boundaries of who could lead expeditions, run field sites, publish major research and become the public face of science. Many primatologists of my generation entered a field that these women forced open.

Birutė Galdikas talks about her career.

Each of these extraordinary women shaped my life in different ways. I never met Fossey, who died in Rwanda in 1985. But watching “Gorillas in the Mist,” a movie about her work, changed the course of my life and sent me toward primatology instead of law school. Years later, as a young primatologist studying lemurs, I met Goodall at a conference; she later wrote the foreword to my book and became a mentor and friend as I navigated my own path in conservation science. I met Galdikas, a scientist at Canada’s Simon Fraser University, professionally and immediately recognized a kindred spirit – another woman who had devoted her life to the study and protection of humans’ closest animal relatives.

With their deaths – Goodall died in 2025 – it falls to those of us who were inspired by them to continue and evolve their work at a time when it has never been more difficult or more important.

But the field today’s primatologists have inherited is not the same one those pioneers entered.

The next generation and primates’ struggle for survival

The first generation of field primatologists went into forests full of animals to discover how primates lived. They were explorers as much as scientists, and their work had the feel of discovery in the classic sense – new behaviors, new social structures, new understandings of intelligence and culture in animals.

Their research helped reshape anthropology, psychology and evolutionary biology. They helped answer one of the oldest questions humans ask about themselves: What makes us different from other species?

Birutė Galdikas talks about the documentary ‘Born to be Wild 3D’ and her work rescuing and returning orangutans to the wild.

By the time my generation began working in the field, many of those questions had already been answered. We knew primates used tools, formed political alliances, reconciled after fights and mourned their dead. We knew they had personalities and social strategies.

The question was no longer whether primates were like us, but whether they would survive us.

This is the quiet shift that defines modern primatology. My generation now goes into forests that are smaller, more fragmented and quieter, and the work is increasingly focused on making sure those animals are still there at all.

I have spent much of my career studying lemurs in Madagascar, where this shift is impossible to ignore. Lemurs are among the most endangered groups of mammals on Earth, with more than 90% of species threatened with extinction. In many parts of Madagascar, forests now exist only as isolated fragments surrounded by agriculture and human settlement. Some lemur populations survive in forest patches so small that a single fire or logging operation could eliminate them entirely.

Conservation begins with caring

These primates that captured the world’s attention are also the species most like us. They have long childhoods, complex societies, intelligence and emotional lives that feel familiar to us. Their similarity is what made people care. And that caring, in many cases, is what has kept them from disappearing entirely.

The great achievement of Leakey’s Angels was not only what they discovered, but that they made the world care about primates.

Before the three scientists’ work, chimpanzees, gorillas and orangutans were largely abstract animals to most people – zoo exhibits, textbook illustrations, evolutionary symbols. After their work, these creatures became individuals with names, families, histories and personalities. Each of the women’s work was celebrated in films and books, including the Morgan Freeman-narrated documentary “Born to Be Wild 3D” that followed Galdikas’ orangutan rescues.

Conservation begins with caring, and caring begins with stories. They gave the world those stories.

But caring is no longer enough. We are now in an era where the most important breakthroughs in primatology may not be new discoveries about behavior, but new ways to protect habitats, connect fragmented forests, preserve genetic diversity and help humans and primates survive on the same increasingly crowded landscapes.

The work has shifted from observation to intervention, from discovery to responsibility.

Every generation of scientists inherits a different world. The generation of Jane Goodall, Dian Fossey and Birutė Galdikas inherited a world full of primates we did not yet understand. My generation has inherited a world where we understand primates very well, but are in danger of losing them anyway.

The forests are quieter now than when these three young women went into them more than half a century ago. The responsibility, however, has only grown louder.

The central question of primatology is no longer what makes us human. It is whether a species intelligent enough to understand extinction will choose to prevent it in our closest living relatives.

The Conversation

Mireya Mayor does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Birutė Galdikas: The last of ‘Leakey’s Angels’ in primatology’s most extraordinary chapter – https://theconversation.com/birute-galdikas-the-last-of-leakeys-angels-in-primatologys-most-extraordinary-chapter-279398

Panicking scientists, canceled experiments – federal funding cuts turned my work as a research dean into crisis management

Source: The Conversation – USA – By Nara Parameswaran, Senior Associate Dean for Research, College of Human Medicine, Michigan State University

Cuts to federally funded research slow the progress of scientific innovations and new treatments. Michigan State University College of Human Medicine

Fielding frantic faculty emails and panicked texts was not how I had hoped my 2025 would begin. Little did I imagine that my role as a research dean at a medical school would come to be dominated by navigating chaotic grant terminations and delays of federal research funding, all justified in the name of scientific progress.

Under normal conditions, a major part of my job is reducing barriers for faculty, staff and students engaged in innovative research. For example, I make sure my faculty have enough staff support to handle necessary administrative tasks so they can focus on their science while writing their grants. My overall goal is to remove roadblocks and foster an environment in which new discoveries are made that can improve people’s lives.

But none of us in research leadership positions around the country had ever faced anything like the Trump administration’s attacks on universities and science.

One of my first clues that we were no longer operating in business-as-usual mode was when the White House terminated U.S. Agency for International Development grants. Michigan State University was one of the institutions affected by this major blow to agricultural, food and other global research, but the medical school where I work wasn’t directly hit. Our turn came when DOGE – the Department of Government Efficiency, a Trump administration effort at eliminating bureaucratic waste – turned its attention to the U.S. National Institutes of Health.

As the White House took aim at higher education and the scientific research enterprise with its budgetary scalpels, my world was thrown into chaos.

man standing at a podium, speaking
Nara Parameswaran’s job as a medical school research dean transformed into dealing with much more chaos and uncertainty.
Michigan State University College of Human Medicine

The human costs of grant uncertainty

While interruptions to grant funding slow scientific progress, there is an immediate real-world human cost to the upheaval.

Consider the case of one of my junior faculty members. 2025 was a critical year for them: If they didn’t receive funding, they would lose their employment – it’s common in academia for scientists to need to raise money to support their own research and part of their salary. Their NIH program officer – the person who recommends whether a grant would be funded – had previously told them their proposal would likely be successful. But by February 2025, that NIH officer was DOGE’d – that is, fired – and so the fate of the grant remained in limbo.

The review of a second grant proposal that this MSU researcher had submitted to NIH was delayed by several months after NIH suspended the panels that assess the scientific merit of grant submissions.

By the time the faculty member received initial feedback on that grant and resubmitted it for reevaluation, the government had shut down and delayed the review again. By this point, nearly a year had passed and no grant had been awarded – or rejected, for that matter.

Without funding, this faculty member cannot conduct experiments, pay or train students and other lab members, or purchase essential supplies to do experiments. As a result, both scientific progress and their career advancement remain in jeopardy; they hang on by a thread while waiting for yet another grant to come through.

Sadly, this was not an unusual case. A number of faculty – at my school and across the country – had received funding, only to have the government cancel their grants mid-project for unknown or unclear reasons. Haphazard grant terminations or prolonged uncertainty create chaos not only for faculty, but also for students, research staff and all the families who depend on these positions for income.

Identifying new resources to help faculty continue their work became one of my top priorities. But finding spare money is not a trivial task. It meant working with the college and university leaders to identify resources, prioritizing some spending while holding back on other budget items, as well as raising money from the community for what we called “research rescue.”

A small group of medical school research deans from various universities started meeting on Sundays – the only day that worked for everyone. We share information, provide advice and support, and try to think strategically about how to help faculty. We talk about any successes and commiserate about the depth of the chaos at our institutions. None of us were trained to deal with this kind of situation, and the support of this group has been critical for me personally.

In addition, the research deans from various colleges here at MSU discuss these issues regularly with each other and other university officials to strategize how to navigate these difficult times, sharing information among people with different roles.

young man in white lab coat works with materials under a lab hood
Early-career researchers are among those hardest hit by uncertainty and chaos – with some choosing to leave science altogether.
Michigan State University College of Human Medicine

A generation of scientists at risk

One of the most profound consequences of all this instability has been its impact on the next generation of scientists, especially Ph.D. students and postdoctoral scholars. Not only are these early-career researchers training to be the scientists of the future, but they are also essential contributors to performing grant-funded experiments, publishing in scientific journals and ensuring research program continuity.

By June 2025, several dozen Ph.D. students across my campus were affected by grant terminations or delays. With more than 100 faculty from my college concerned about funding, many postdoctoral scholars working under them faced uncertain futures. This wasn’t unique to our college or university – this was, and is, a national problem.

Anticipating deep cuts to funding for student stipends and training, institutions were forced to reduce or even cancel graduate student admissions for the year.

Additional widespread disruption stemmed from revoking F-1 visas, which had allowed international students to study in the U.S., and terminating related recordkeeping, placing international students in academic and personal limbo.

Recruitment and training were complicated further by a September 2025 proclamation calling for restrictions on H-1B visas, which constrained universities’ ability to recruit postdoctoral scholars and faculty who aren’t American citizens.

Unsurprisingly, these conditions have profoundly shaped how students and trainees assess their future careers. In a 2025 survey of 824 trainees, 77% said that recent executive orders or federal policy changes influenced their career plans significantly or somewhat. The long-term implications of these barriers could be grim, especially because more than half of the country’s postdoc scholars in science, technology, engineering and math, and around a third of the country’s graduate students, are visa holders.

Recent data on who received grant funding revealed troubling trends, particularly for early-career investigators. As a research dean, my major worry is about the livelihoods of these scientists, especially because most of them have young families to provide for.

woman in white lab coat has a pipette in one hand and holds up a test tube
Every dollar that NIH spent on research in 2024 generated more than twice as much value to the economy by creating jobs, supporting small businesses and developing new technologies.
Michigan State University College of Human Medicine

A new reality for scientists

Countless breakthroughs that have altered the course of human and animal health have been made possible by sustained federal research investment through agencies such as NIH. These discoveries are made by real people working in research labs or in the communities we serve, and this work requires real money.

Declines in support for these researchers, coupled with reduced graduate enrollment and ongoing visa challenges, risk erasing an entire generation of scientists, with consequences that will reverberate for many years.

All these unresolved challenges – grant terminations, potential reductions in funding for research infrastructure, federal workforce cuts, visa instability, lawsuits and threats to the next generation of the scientific workforce – have converged into a single reality: Uncertainty has become the norm. Each day brings new questions about who will be affected and how to respond in ways that protect faculty, staff and students so they can continue their important work.

The Conversation

Nara Parameswaran has received funding from National Institutes of Health, American Heart Association, Korean Ginseng Society and the California Prune industry.

ref. Panicking scientists, canceled experiments – federal funding cuts turned my work as a research dean into crisis management – https://theconversation.com/panicking-scientists-canceled-experiments-federal-funding-cuts-turned-my-work-as-a-research-dean-into-crisis-management-277162

Sex test used in IOC’s new transgender ban more likely to exclude from Olympics intersex women who were assigned female at birth

Source: The Conversation – USA – By Ari Berkowitz, Presidential Professor and Graduate Liaison for biology programs; Director, Cellular & Behavioral Neurobiology Graduate Program, University of Oklahoma

Sex testing in elite sports has had a long, inconsistent history. anton5146/iStock via Getty Images Plus

The International Olympic Committee announced a new policy on March 26, 2026, for women’s competitions: Every athlete must be tested for a gene called SRY, usually found on the Y chromosome. Males typically have a Y chromosome and females typically don’t, so the IOC says this requirement will exclude “biological males.” This announcement comes as planning for the 2028 Summer Olympics, hosted in Los Angeles, is underway.

But the IOC statement obscures the complexity of biological sex and continues the organization’s century-long record of inconsistent and biologically unsound sports policies.

I’m a biology professor and author of an upcoming book, “The Binary Delusion: How Biology Defies the Myth of Two Sexes.” While the impetus for the new policy seems to be exclusion of transgender women from women’s athletics, it will more likely exclude and draw unwelcome publicity to many more women who are not transgender.

Few elite athletes are transgender

Transgender people have faced mounting legal and political attacks in recent years.

President Donald Trump issued an executive order on Jan. 20, 2025, asserting that biological sex is simple and binary – that everyone is unambiguously female or male – and another executive order precluding “males” from women’s competitions.

At least 29 U.S. states have excluded transgender girls and women from girls’ and women’s athletic competitions. These laws are built on the idea that men on average outperform women in many sports, so women need to be protected from unfair competition.

Person holding a sign reading 'SPORTS FOR ALL' in front of U.S. Supreme Court; other people bearing trans flags
Numerous state bills have aimed to ban transgender athletes from participating in sports.
Oliver Contreras/AFP via Getty Images

But elite transgender athletes are rare. In a 2024 hearing before the U.S. Senate, the president of the NCAA testified that of the 510,000 athletes in U.S. colleges at that time, he was aware of fewer than 10 transgender athletes – less than 0.002%.

Only one known transgender woman has ever participated in an Olympic women’s competition since the committee allowed women to compete in the Games beginning in 1900: Laurel Hubbard, a weightlifter who competed for New Zealand in 2021 but did not medal.

The rarity of transgender athletes in elite competition suggests their exclusion is a solution in search of a problem.

Biological sex is complicated

The genetic test the IOC is requiring is more likely to identify intersex women.

Intersex people have a combination of typically female and typically male biological sex traits. These include sex chromosomes, internal and external reproductive anatomy, sex hormones and hormone receptors.

There are many variations of intersex traits, but three may be the most relevant for women’s athletic competitions: androgen insensitivity, 5-alpha-reductase deficiency and genetic mosaicism.

People with androgen insensitivity don’t respond as much to androgens like testosterone. Some believe that high levels of testosterone can give athletes a competitive advantage, but athletes with this condition gain little or no androgen-driven advantage, such as increased muscle growth. This also means their visible sex characteristics, including their genitals, appear mostly or entirely female. The new IOC policy has an exception for “complete” androgen insensitivity but doesn’t say how athletes would demonstrate this.

The policy also doesn’t mention partial androgen insensitivity, where androgen receptors respond to testosterone but probably not enough to gain a significant advantage in performance. Nevertheless, athletes with partial androgen insensitivity will presumably fail the test and be excluded from participating under the new policy.

People with 5-alpha-reductase deficiency make and respond to testosterone but make little or none of a more potent androgen called dihydrotestosterone, or DHT. If they have no DHT, their genitals appear more female and they gain less athletic advantage from androgens. People with this condition who have a Y chromosome will fail the new sex test and be excluded.

People with mosaicism are born with some cells that have a Y chromosome and some that do not. Women can also develop mosaicism during pregnancy, when fetal cells with Y chromosomes cross the placenta into their bodies; a woman may then carry some cells with a Y chromosome, perhaps for the rest of her life. Such cells could cause a previously pregnant athlete to fail the new test.

History of sex testing in women’s athletics

The IOC and associated sports federations have a long history of sex testing, especially for track events. Sex testing has switched from genitals to genes to testosterone levels, and now back to genes. While the stated goal for these policies was to uncover males pretending to be female, they have never found any. Instead, they identified and excluded intersex women.

For much of the 20th century, sports administrators examined genitals if a competitor was suspected of being male. In the mid-1960s, they began examining the genitals of all women participating in International Amateur Athletics Federation competitions in what were called “nude parades.”

The nude parades embarrassed athletes and sports federations and were replaced by newly available chromosome or gene tests in the late 1960s. These tests were often done without informed consent – athletes were instead told they were being tested for performance-enhancing drugs. The test results were then often revealed publicly without the athlete’s consent.

For example, in 1967, the Polish sprinter Ewa Klobukowska, who had won three gold medals and set three world records, was designated “male” by a chromosome test, though she had typically female genitals. She was excluded from competition and forced to return her medals. But the following year, she gave birth – she apparently had genetic mosaicism.

Black-and-white photo of athletes running on a track
Ewa Klobukowska (No. 3 bib) had her medals stripped after being incorrectly designated ‘male’ through genetic testing.
S&G/PA Images via Getty Images

In 1985, a Spanish hurdler, Maria José Martínez-Patiño, found out through a public announcement that she was designated “male” by a genetic test and excluded from competition. “I felt ashamed and embarrassed,” she has said in a personal account. “I lost friends, my fiancé, hope and energy. But I knew that I was a woman and that my genetic difference gave me no unfair physical advantage. I could hardly pretend to be a man; I have breasts and a vagina.” She has complete androgen insensitivity.

Genetic testing largely gave way to testing testosterone levels in recent decades, which also excluded many intersex athletes. The 2026 IOC announcement states that there is no overlap in the testosterone levels of female and male elite athletes, but published research examining hundreds of elite athletes contradicts this statement.

In 2021, the IOC announced a new policy stating that “Every person has the right to practise sport without discrimination and in a way that respects their health, safety and dignity.” But the committee left it to each sport’s federation to regulate their own competitions, leading to a confusing mix of criteria that may have paved the way for the organization’s simplified 2026 policy.

SRY gene test is misleading

The IOC’s 2026 policy hints at the complexity of biological sex, stating that sex includes “sex chromosomes, gonads and hormones.” But it’s odd that genitals didn’t make their list, considering that genitals – external sex organs like the vagina and penis – are how most ordinary people define female and male, how physicians assign sex at birth, and how the IOC itself defined sex for decades.

Sex testing through genital inspection, though embarrassing and traumatizing for many athletes, may have been a better indicator of athletic advantage from androgens like testosterone than the SRY gene. During prenatal development, androgens cause initially undefined body structures to become a penis and scrotum; in their absence, or with androgen insensitivity, these structures become a clitoris and labia instead. Thus, typically female genitals indicate low androgen levels or low sensitivity to androgens, which suggests an athlete’s physical performance was not enhanced by these hormones.

Unless the IOC takes scrupulous care to screen for these exceptions, its new genetic test will likely exclude athletes who have not gained an advantage from androgens. Their new policy, however, states that “the need for consistency and fairness across sports” will not allow for “case-by-case consideration.”

As a result, it’s likely that another generation of intersex women will be excluded from the Olympics.

The Conversation

Ari Berkowitz receives funding from the National Science Foundation.

ref. Sex test used in IOC’s new transgender ban more likely to exclude from Olympics intersex women who were assigned female at birth – https://theconversation.com/sex-test-used-in-iocs-new-transgender-ban-more-likely-to-exclude-from-olympics-intersex-women-who-were-assigned-female-at-birth-279489

NASA wants to build a base on the Moon by the 2030s – how and why it plans to build up to a long-term lunar presence

Source: The Conversation – USA – By Michelle L.D. Hanlon, Professor of Air and Space Law, University of Mississippi

NASA’s Space Launch System rocket that will take an astronaut crew around the Moon rolls out to the launchpad. Joel Kowsky/NASA via Getty Images

The next U.S. trip to the Moon isn’t about planting a flag. It’s about learning how to live and work there.

NASA has just reset its Artemis program, marking a clear strategic shift: Space exploration is moving away from a race to achieve milestones and toward a system built on repeated operations, a sustained presence and lunar infrastructure that could become part of the technology networks we rely on here on Earth.

That shift is reflected in newly announced plans to invest billions of dollars in building a long-term lunar base, with habitats, power systems and surface infrastructure designed to support ongoing human activity. The message? Humans have already normalized travel to space. The next step is normalizing living beyond Earth.

Artemis is NASA’s plan to return people to the Moon with the goal of staying. Unlike the short Apollo missions of the 1960s and 1970s, it consists of increasingly complex missions: flying around the Moon, landing on its surface and eventually establishing a base near the lunar south pole. The program aims to create a reliable way for humans to live and work there, develop technologies useful on Earth and prepare for the journey to Mars.

Rather than moving straight from the upcoming Artemis II crewed lunar flyby to a surface landing, the new road map adds an intermediate mission in 2027. Astronauts will test docking, life-support systems and communications with commercial lunar landers from SpaceX and Blue Origin, but in low Earth orbit, the region roughly 100 to 1,200 miles (160 to 2,000 kilometers) above Earth’s surface, where rescue remains possible.

NASA head Jared Isaacman discussed changes to the Artemis program on Feb. 27, 2026.

The first landing near the lunar south pole is now targeted for 2028. This timeline may sound delayed, but in reality, it has been deliberately reset to prioritize building reliable systems that can operate long into the future over speed.

As a professor of air and space law, I’ve been watching these developments closely. The United States is still in a race – particularly with China – but it is choosing to compete on its own terms. Rather than chasing the fastest possible landing, NASA is focused on building a system that can support repeated missions and a lasting human presence.

From sprint to system

The original Artemis plan aimed to leap quickly from test flights to a crewed landing while simultaneously developing new rockets, spacecraft and landing systems. That approach carried risk. Artemis I, an uncrewed mission, flew successfully in 2022. After a few delays, Artemis II is now nearing launch, with windows planned for early April 2026. But the further jump to a safe and reliable landing remains significant.

NASA’s new road map slows the transition deliberately. Instead of stand-alone milestones, NASA is now building a sequence of repeatable steps to gain hands-on experience.

This change includes a substantial new investment, with a multiphase plan for a lunar base with habitats, power systems and the surface infrastructure needed for a long-term human presence on the Moon. Consistent launch cadence and repeatable operations are how teams develop the expertise needed for safe, reliable spaceflight and eventually for traveling to Mars.

A rocket on a launchpad overlooking water.
The Artemis II Space Launch System rocket is poised to launch a crew of four to space.
NASA/John Kraus

This shift is reflected in the decision to pause the planned lunar Gateway station, a small space station intended to orbit the Moon, and prioritize infrastructure on the lunar surface itself, where astronauts will live, work and build over time.

The new changes also emphasize a shifting role for commercial companies. SpaceX’s and Blue Origin’s lunar landers are integrated into the mission architecture.

The 2027 test mission, for example, will practice docking between crewed spacecraft and new commercial lunar landers in low Earth orbit. NASA is coordinating a network of public and private partners rather than running a single government-run Apollo-like program.

This method spreads risk across partners, lowers costs and speeds development, though success now depends on multiple players working reliably together.

Law follows activity

NASA’s road map is not just about lowering technical risk. It is also about shaping the future environment of lunar activity.

International space law, including the 1967 Outer Space Treaty, sets out broad principles to guide space activities, like avoiding harmful interference with others’ activities. But those rules only gain real meaning through repeated, coordinated activity, especially on the lunar surface, where desirable landing sites are limited.

Countries and companies that maintain a sustained presence on the Moon will shape the practical expectations everyone will share while living and working there. One-off demonstrations, such as single lunar landings, don’t shape those expectations the way continued operations do.

A diagram showing the three phases on NASA's lunar base plan, with phase 1 securing access, phase 2 establishing a base and phase 3 a semipermanent crew presence
NASA’s Artemis program seeks to establish a long-term human presence on the lunar surface.
NASA TV

Why this matters – even if you never go to space

It would be easy to see these changes as purely technical, but they are not. The structure of a space program shapes what technologies are developed, how industries grow and which countries influence how space is used. Technologies developed for sustained lunar activity, including life-support systems, energy storage and advanced communications, have found applications on Earth, from medicine to disaster response.

There are economic effects as well. The Artemis program supports jobs across the United States and among its international partners. It helps build industries that extend far beyond NASA itself.

And there is a strategic dimension. As more countries and companies operate in space, the question is no longer just who arrives first, but who helps define how activity is carried out. Over time, that presence will likely become part of the infrastructure that supports daily life on Earth.

Communications, navigation, supply chains and scientific data already depend on space-based systems. As activity expands to the Moon, facilities there, from energy systems to communications relay systems that transmit data and signals back to Earth, will become integrated into those networks. What is built on the Moon will not sit apart from life on Earth, but increasingly function as an extension of it.

The Moon is becoming a place where infrastructure, industry and rules and expectations for how humans operate there are already beginning to take shape. NASA’s updated plan signals that the United States intends to be present there consistently.

The updates to the Artemis program are a statement about how the United States intends to engage in the next phase of space exploration. Rather than pursuing a single dramatic landing, the U.S. is committing to the steady, repeatable work of building a lasting foothold on the Moon, and redefining humanity’s relationship with space itself.

The Conversation

Michelle L.D. Hanlon does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. NASA wants to build a base on the Moon by the 2030s – how and why it plans to build up to a long-term lunar presence – https://theconversation.com/nasa-wants-to-build-a-base-on-the-moon-by-the-2030s-how-and-why-it-plans-to-build-up-to-a-long-term-lunar-presence-279166

Are multiverses real? An astrophysicist explains why it depends on how you define ‘real’

Source: The Conversation – USA – By Zachary Slepian, Associate Professor of Astronomy, University of Florida

Physics has multiple theories and interpretations of the existence of a multiverse. Yana Iskayeva/Moment via Getty Images

Curious Kids is a series for children of all ages. If you have a question you’d like an expert to answer, send it to CuriousKidsUS@theconversation.com.


Are multiverses real? If so, what do they look like? How do you get there without disturbing time? – Emily, age 9, Pune, Maharashtra, India


The idea of a multiverse – a hypothetical collection of all possible universes – is one that science fiction fans love to explore. But does the multiverse actually exist?

To answer the question of whether the multiverse is real, we first need to agree on what it means for something to be real. As an astrophysicist who studies cosmology – the large-scale history and structure of the universe – and the philosophy of physics, I’ve thought about this question more than a few times over my career.

The most immediate definition of “real” might be that you can see and touch it. My lunch is real in this sense, because I can taste it and you can hear me chewing it (hopefully not too loudly). So “real” might be defined as something you can perceive with at least one of your five senses.

But that would leave out a lot of things that are also real. The microwaves that heat up your food are real, but you can’t directly perceive them – only their effect, heated food. So some real things you can “see” only indirectly by the evidence they leave behind. The existence of dinosaurs is another example – you can see only their fossils.

So, you can ask two versions of the question of whether the multiverse is real. One: Can you see, hear, touch, smell or taste it? Two: Even if you can’t, is there any evidence of its effects?

Quantum mechanics of the multiverse

The answer most researchers would offer to whether you can perceive the multiverse with your five senses is probably not. But there are lots of real things that aren’t real in this sense, such as microwaves. So can we see any indirect evidence of the multiverse, such as the effects it might have on the observable world?

The short answer is yes, sort of.

The multiverse is one way to understand the behavior of very, very small things, such as atoms and subatomic particles. Scientists call the rules governing how these very small objects behave quantum mechanics. In quantum mechanics, it’s never certain what the outcome of an experiment will be. You can only write down the chance – that is, the probability – of something happening.

Schrödinger’s cat illustrates how multiple possibilities can exist at the same time.

It’s like rolling dice: You can’t be sure what number you’ll get, but you can say you have an equal chance of getting one, two, three, four, five or six on top of the dice. However, if you knew enough information about the dice – such as its exact shape and mass, the air patterns around it and the exact way you threw it – you could predict exactly what side it would land on. It might take a big computer simulation to crunch the numbers, but it’s possible.

Now imagine really, really, really small dice. Even if you had a very powerful computer, you wouldn’t be able to predict which side this super small dice would fall on. That’s because it’s governed by quantum mechanics, where you can’t predict outcomes with complete certainty. You can predict only probability.

Many worlds and the multiverse

Quantum mechanics is only somewhat random – not everything has an equal chance of happening. We can predict the chance of each scenario happening, but not the actual outcome. In the case of quantum dice, all we could know about it is that there’s a 1 in 6 chance of it landing on any face.

One way scientists have interpreted this strange property of quantum mechanics is that each possible scenario actually does happen. But when it does, it creates another universe. This is called the many-worlds view of quantum mechanics.

In the case of our quantum dice, the many-worlds view would say there’s a 1 in 6 chance of rolling each number because six universes are created every time we roll the dice. Although we stay in one of them – say, the world where the dice comes up three – five other universes are also created where the dice comes up as one of the other numbers.

In this picture of quantum mechanics, universes branch off with every scenario. Of course, we cannot really make a quantum mechanical dice and roll it – just interacting with the dice would destroy its quantum nature.

Does this mean quantum mechanics is evidence that the multiverse is real? I would say no. While it’s a fascinating way to imagine quantum mechanics, it’s just one interpretation, not undeniable evidence of the multiverse.

Illustration of sparkly blue spheres against a black background
If multiple universes possibly exist but you aren’t able to perceive any of them, do they actually exist?
Victor de Schwanberg/Science Photo Library via Getty Images

The multiverse and string theory

Another relevant aspect of the multiverse is its role in string theory. String theory argues that the fundamental particles that make up matter are themselves made of vibrating strings of energy. Think of an elastic band vibrating inside each particle.

String theory also argues that the universe has more than three dimensions. Different string theories predict different numbers of extra dimensions. This means physical constants such as the speed of light and the charge of electrons could have different values. So could the amount of stuff in the universe, such as matter. That suggests a landscape of possible different universes, each with different conditions – a multiverse.

So far, there isn’t definite evidence of a multiverse based on string theory. These universes probably wouldn’t connect to each other, otherwise they wouldn’t count as separate universes – just part of our own. So even if they do exist, we may never get direct evidence for their existence.

However, there could be indirect evidence of the existence of multiple universes. For instance, string theory can help scientists predict the results of very high-energy experiments in our own universe. It can also make predictions for how matter behaves on very, very small scales. If these predictions turn out to be true, that could be evidence for string theory. And if string theory is possibly real in our universe, this indirectly means the multiverse may also be real.

While there hasn’t been any definitive evidence in our own universe for string theory, who knows what the future may hold.


Hello, curious kids! Do you have a question you’d like an expert to answer? Ask an adult to send your question to CuriousKidsUS@theconversation.com. Please tell us your name, age and the city where you live.

And since curiosity has no age limit – adults, let us know what you’re wondering, too. We won’t be able to answer every question, but we will do our best.

The Conversation

Zachary Slepian does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Are multiverses real? An astrophysicist explains why it depends on how you define ‘real’ – https://theconversation.com/are-multiverses-real-an-astrophysicist-explains-why-it-depends-on-how-you-define-real-268357

We analyzed Philly street scenes and identified signs of gentrification using machine learning trained on longtime residents’ observations

Source: The Conversation – USA – By Maya Mueller, Ph.D. Candidate in Architectural Engineering, Drexel University

Researchers used Google Street View to pull images of gentrifying neighborhoods. @2021 Google Street View, CC BY-NC

What does gentrification in Philadelphia look like?

“High-rise, modern apartment buildings.”

“(A) modern look that’s so out of place with our traditional row homes that have been here for a hundred years.”

“Six- to seven-floor high-rises with garages in the basement. They charge an extra $200 to park.”

“Gray, industrial looking.”

“The houses are ugly as heck. No architectural style. They’re probably two-bedroom, some probably one. And they usually put a deck up. It’s not geared for kids or families. A lot of steps.”

These are some of the descriptions that longtime residents of gentrifying neighborhoods in Philly used to describe the new construction popping up around them.

We are Ph.D. candidates in architectural engineering and geography, environment and urban studies at Drexel and Temple universities in Philadelphia. Working with a multidisciplinary team of professors and students, we recently developed a new way to map gentrification in Philly neighborhoods using a combination of accounts from longtime residents, Google Street View images and machine learning.

View of grey, boxy new construction building next to two older, more traditional houses
Signs of gentrification in Philly include new buildings that don’t fit the surrounding architecture.
Jeff Fusco/The Conversation U.S., CC BY-SA

Using AI to spot gentrification

Our team posited that the best source for knowing what gentrification looks like comes from the perceptions of longtime residents in gentrifying neighborhoods.

So we held focus groups in three rapidly gentrifying neighborhoods – one in Northeast Philadelphia and two in the River Wards section north of Center City and along the Delaware River.

We asked residents to identify the visual cues of building designs, materials, colors and landscaping choices that they associate with gentrification.

Many of these residents could recount, in great detail, the exact street intersections where they saw gentrification-related development occur over the decades.

We corroborated each location they identified through historical Google Street View imagery. By examining the exteriors of these buildings, we could translate the more generalized language used in the discussions, such as “modern” or “boxy,” into more specific visual markers, such as “presence of bump-out windows” and “increased floor area ratio,” which is a measure of how much of the surface area of a land parcel a building takes up.
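
Floor area ratio is simple to compute: total floor area across all stories divided by the area of the lot. A quick sketch with invented rowhouse dimensions (hypothetical numbers, not figures from the study) shows why taller, wider infill construction registers as “increased floor area ratio”:

```python
def floor_area_ratio(total_floor_area_sqft, lot_area_sqft):
    """Floor area ratio (FAR): total building floor area divided by lot area."""
    return total_floor_area_sqft / lot_area_sqft

# Hypothetical Philly rowhouse lot: 16 ft x 60 ft = 960 sq ft.
# Older two-story rowhouse with a 16 x 40 ft footprint: 2 stories x 640 sq ft.
old_far = floor_area_ratio(2 * 16 * 40, 16 * 60)   # 1,280 / 960, about 1.33

# New four-story infill covering 16 x 50 ft: 4 stories x 800 sq ft.
new_far = floor_area_ratio(4 * 16 * 50, 16 * 60)   # 3,200 / 960, about 3.33
```

A jump like this, from roughly 1.3 to over 3, is the kind of change visible from the street even before consulting parcel records.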

When pulling panoramas of residential building exteriors from Google Street View, we looked at two distinct time periods: 2009-13 and 2017-21.

AI is getting better at spotting the visual signs of gentrification. Researchers refer to AI systems that categorize scenery according to certain characteristics, like seeming “gentrified” or “not-gentrified,” as “deep mapping” models.

Deep mapping models use neural network algorithms, which can pick up on patterns in big datasets. The particular model we used is able to pick up subtle, pixel‑level differences between two images.

The model learned to approximate how residents distinguish gentrified scenes from unchanged ones. When we tested the model’s output, we found that it was able to separate “gentrified” from “not‑gentrified” images with an accuracy of about 84%. This showed us that visual cues guided by residents’ observations can be translated into a reliable machine learning signal.
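
The study’s actual model is a deep neural network trained on Street View image pairs. Purely as a toy illustration of the underlying idea (classifying an image pair from its pixel-level differences), here is a minimal numpy sketch using synthetic data and hand-picked difference features; every number and feature below is invented, not drawn from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

def diff_features(before, after):
    """Crude stand-in for what a change-detection network learns:
    summarize per-pixel differences between the two street views."""
    d = after.astype(float) - before.astype(float)
    return np.array([np.abs(d).mean(), d.std(), (np.abs(d) > 0.5).mean()])

def make_pair(changed):
    """Synthetic before/after pair; 'changed' pairs get a large injected difference."""
    before = rng.random((32, 32))
    delta = rng.random((32, 32)) if changed else 0.02 * rng.random((32, 32))
    return before, before + delta

# Build a balanced toy dataset of 200 labeled pairs
X, y = [], []
for label in [0, 1] * 100:
    b, a = make_pair(bool(label))
    X.append(diff_features(b, a))
    y.append(label)
X, y = np.array(X), np.array(y)

# Logistic regression fit by a few hundred steps of gradient descent
w, b0 = np.zeros(X.shape[1]), 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w + b0)))
    w -= 0.1 * X.T @ (p - y) / len(y)
    b0 -= 0.1 * (p - y).mean()

acc = (((1 / (1 + np.exp(-(X @ w + b0)))) > 0.5) == y).mean()
```

On this deliberately easy synthetic data the classifier separates the two classes almost perfectly; the hard part in the real study is that genuine street scenes mix lighting, seasons, cars and renovation styles, which is why a deep model and resident-guided labels are needed.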

Gentrification doesn’t always look the same

As a neighborhood becomes gentrified, wealthier people move in and long-standing residents can be displaced through rent hikes or the loss of housing. Gentrification can also lead to the disappearance of a “sense of place” – characteristics that make a neighborhood feel familiar and like home.

Grid of before and after images of urban locations
Examples of images in the researchers’ gentrification audit.
Author provided, CC BY-SA

With deep mapping models, researchers and neighborhood stakeholders can pull their own data on landscape changes related to gentrification and better understand how gentrification changes physical environments. With better data, they can map hot spots of new development and use machine learning models that predict future trajectories of change.

For example, several of our focus group participants in one neighborhood noted that gentrification was connected to the demolition of old buildings that likely contained hazardous substances, such as asbestos and lead. They wondered about the potential for air pollution. With accurate data on where development is occurring, researchers can model relationships between new construction and environmental conditions such as air quality.

Moreover, this process can lend credibility to neighborhood groups that see changes occurring around them but lack the quantitative data to substantiate their concerns to the media and to city government.

By being more explicit about how gentrification is defined when we categorize images and train our machine learning model, researchers can be more transparent about how image data is prepared and prevent personal biases from guiding the model and the patterns it learns.

For example, certain research finds that gentrification leads to increased greenery. However, some participants in our focus groups reported that gentrification resulted in the loss of community gardens and greenery. This experience runs contrary to common assumptions in gentrification research.

Transparency in training models builds trust

By defining how gentrification is perceived by residents, researchers like us can add clarity to how we prepare the model data. Even with more clarity, however, these AI systems are still “black box” in nature. A black box model means that the connection between inputs and outputs is unclear to the model user.

One way to make the model more transparent is by applying an additional model called XAI, or explainable artificial intelligence. Through XAI, there is potential to better understand which characteristics in an image are more important to the model prediction. For example, does the model focus on the windows of a building or the relative height of buildings?

Answering these questions will help researchers and stakeholders trust model predictions.
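
One common model-agnostic XAI technique is occlusion sensitivity: mask one region of the image at a time and measure how much the prediction changes. The sketch below is a generic illustration with a stand-in scoring function, not the authors’ method; regions whose masking causes a large score change are the ones the model relies on:

```python
import numpy as np

def occlusion_map(image, score_fn, patch=8):
    """Occlusion sensitivity: mask each patch in turn and record how much
    the model's score changes. Large entries mark regions the prediction
    depends on."""
    base = score_fn(image)
    h, w = image.shape
    heat = np.zeros((h // patch, w // patch))
    for i in range(0, h, patch):
        for j in range(0, w, patch):
            masked = image.copy()
            masked[i:i + patch, j:j + patch] = image.mean()  # blank out one patch
            heat[i // patch, j // patch] = base - score_fn(masked)
    return heat

# Stand-in "model" that only looks at the top-left corner of the image
score = lambda img: img[:8, :8].mean()
img = np.random.default_rng(1).random((32, 32))
heat = occlusion_map(img, score)
# Only the top-left cell of the heat map is nonzero: masking any other
# patch leaves this model's score unchanged.
```

Applied to a real change-detection model, the same procedure would reveal whether, say, window shapes or building height drives a “gentrified” prediction.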

At the same time, one of us is leading a complementary line of research focused on explaining the reasoning behind the machine learning model decisions. In Philadelphia and many other U.S. cities, street scenes can have a dense mix of cars, vegetation and architectural styles that can confuse the model. There is a lot of complicated visual information to parse through, a lot of variety. Understanding the model’s internal logic helps ensure that its predictions reflect real neighborhood dynamics rather than irrelevant details in the imagery.

Together, these research directions aim to deepen our understanding of how gentrification unfolds on the ground and how AI can help illuminate patterns that might otherwise go unnoticed.

Read more of our stories about Philadelphia and Pennsylvania, or sign up for our Philadelphia newsletter on Substack.

The Conversation

Maya Mueller receives funding from the National Science Foundation.

Isaac Quaye received funding from the National Science Foundation.

ref. We analyzed Philly street scenes and identified signs of gentrification using machine learning trained on longtime residents’ observations – https://theconversation.com/we-analyzed-philly-street-scenes-and-identified-signs-of-gentrification-using-machine-learning-trained-on-longtime-residents-observations-277704

Basic income’s appeal today is similar to its roots in 18th-century England – it’s a way to compensate people for a common good taken for private gain

Source: The Conversation – USA (2) – By Will Glovinsky, Research Assistant Professor of Humanities, Binghamton University, State University of New York

The first basic income proposals were a reaction to the seizure of common fields by English landlords. George Stubbs/The Yorck Project, CC BY

A story has been going around about artificial intelligence for the past decade: At some point, AI advances, robots and self-driving cars will throw countless people out of work.

The rich folks who control AI companies will get richer. Most other people’s fortunes will decline as their skills lose value and they fail to get new jobs. To prevent the U.S. from suffering mass hunger and political chaos, the story goes, it will need a new system: The government will provide many people, or maybe everyone, with no-strings-attached cash payments.

There are many names for this kind of policy, including “basic income.”

Backing from a diverse group

This is essentially the story told by 2020 presidential candidate Andrew Yang and by the labor leader Andy Stern. You may also hear it from an array of tech billionaires, including OpenAI CEO Sam Altman and Tesla and SpaceX CEO Elon Musk. In telling it, those moguls also get to hype their companies’ AI models.

Local governments from Stockton, California, to Atlanta are testing basic income programs by giving low-income residents cash. Across the Atlantic, British Investment Minister Jason Stockwood has said he and other leaders are “definitely talking” about the idea.

Meanwhile, social scientists are also interested. They point to basic income experiments that have found more tangible benefits, such as fewer hospitalizations and improved parenting practices.

A man looks off to the left with a quotation in the background.
When Michael Tubbs was the mayor of Stockton, Calif., the city temporarily ran a basic income program that gave 125 people money with no strings attached.
Nick Otto/AFP via Getty Images

Telling the ‘basic income’ origin story

I think this talk about basic income – as a solution to automation-driven job losses, or simply as a way to help people – misses something important.

As a scholar of British culture, literature and politics, I study the English thinkers and activists who first called for a form of basic income at the turn of the 19th century – an era of political turmoil, technological change and the global exchange of ideas. I believe that understanding the origins of basic income policies can help clarify what’s behind the current surge of interest in the idea.

This history suggests that basic income is not just about finding a solution to automation or efficiently reducing poverty, though it might do those things.

More fundamentally, calls for basic income respond to the sense that something has been unfairly taken from ordinary people.

AI’s ‘expertise theft’

The feared mass layoffs from AI have not yet materialized, though cracks may be forming in the job market for entry-level workers.

But technology is developing rapidly, making it hard to predict the future.

Meanwhile, another side of AI’s impact on workers is coming into focus.

Three MIT economists, including two Nobel laureates, published a paper in February 2026 in which they bluntly warned that current AI models are engaged in what they called “expertise theft.”

“AI systems,” they wrote, “freely scrape content from websites, social media, YouTube, newspapers, Wikipedia, and blogs, then statistically recombine this material and sell access to the results.”

The concern is that companies will sell all of us – or our former bosses – AI-mediated access to the very ideas, artwork and knowledge contributed by generations of skilled humans.

This large-scale appropriation of the resources that knowledge workers use to make a living – skills, styles, theorems, jokes, recipes – has a historical parallel. As the Oxford economist and machine learning expert Maximilian Kasy argues, AI companies’ wholesale data theft echoes the enclosure of common lands in England in the lead-up to the Industrial Revolution.

The loss of the commons

From 1604 to 1914, English landowners leveraged their control over Parliament to seize 6.8 million acres (about 27,500 square kilometers) of land once shared by commoners. In the mid-18th century, the process began to accelerate.

Previously, common people had shared the right to plow open fields, gather firewood, graze animals and cut peat from nearby bogs. Rules and fines had discouraged overuse.

Now, with these resources fenced off, commoners had to till someone else’s land for a wage. A communal inheritance was literally hedged in.

As with AI companies’ expertise theft today, the enclosure of the commons was defended by large landowners as a modernizing step. Experts debate the issue, but the economists Leander Heldring, James A. Robinson and Sebastian Vollmer found that English enclosures contributed to a 45% increase in farm yields.

But the enclosures of lands that previously belonged to all also reduced the economic independence of ordinary people. One observer summed up the feelings of commoners this way: “All I know is, I had a cow, and an act of Parliament has taken it from me.”

Amid this widespread dispossession, the first basic income proposals arose.

A response to losses due to enclosures

In the early 1770s, the magistrates of Newcastle attempted to enclose the town’s common land and keep its rental income for themselves.

The local townspeople successfully resisted, winning an agreement that if they gave up their rights to use the land, its rental income would be divided equally among them.

The struggle inspired a young Newcastle schoolmaster named Thomas Spence to develop the world’s first basic income proposal.

The son of an impoverished netmaker, Spence never left England. But he was intrigued by reports of Indigenous American systems of egalitarian land use.

His reading persuaded him that the English enclosures were designed to fence most people out from the very resources they needed to survive, rendering them dependent. “If Grass or Nettles they could eat,” he joked, landowners would fence them off, too.


Spence therefore called for the real estate of each parish, the ancient administrative unit in England, to be collectively owned by its residents. Farms would be leased out to the highest bidder, preserving competition.

But rather than accrue to landlords, the rents would fund parish-run schools, hospitals, courts and roads. The remainder would be distributed equally every three months to all residents of the parish, regardless of their age, occupation or gender. In one version of the plan, the local women would run the parish.

In 1798, Spence estimated the dividend at almost 10 pounds annually. In 1816, his followers proposed a version that would compensate former landholders but still yield a payout of 4 pounds.

Those 4 pounds in 1816 would be worth about 342 pounds, or US$456, as of February 2026. And 10 pounds in 1798 would equal 1,126 pounds, or $1,496.

Both were huge sums at a time when male farm laborers might make about 28 pounds annually if employed year-round. The economist Thomas Malthus, Spence’s contemporary, doubted the dividend would be so high.
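As a quick sanity check on these figures, here is a minimal sketch that works through the arithmetic. All of the inputs are the article's own numbers (the modern-pound equivalents and the 28-pound wage), not independently verified estimates:

```python
# Rough arithmetic check on the dividend figures above.
# All inputs are the article's own numbers, not independent estimates.

dividend_1798 = 10   # pounds/year, Spence's 1798 estimate
dividend_1816 = 4    # pounds/year, his followers' 1816 proposal
farm_wage = 28       # pounds/year, a male farm laborer employed year-round

# Implied inflation multipliers from the article's modern equivalents
mult_1798 = 1126 / dividend_1798   # ~113x (1798 pounds -> today)
mult_1816 = 342 / dividend_1816    # ~86x  (1816 pounds -> today)

# Each dividend as a share of the annual farm wage
share_1798 = dividend_1798 / farm_wage
share_1816 = dividend_1816 / farm_wage

print(f"1798 plan: {share_1798:.0%} of a laborer's yearly wage "
      f"(~{mult_1798:.0f}x to reach today's pounds)")
print(f"1816 plan: {share_1816:.0%} of a laborer's yearly wage "
      f"(~{mult_1816:.0f}x to reach today's pounds)")
```

The shares, roughly 36% and 14% of a year-round laborer's wage, make concrete why contemporaries like Malthus found the proposed dividends strikingly large.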

Whatever the payment’s value, Spence argued that this money was owed to the people. If enclosing land they previously could farm forced commoners to work for landlords or move to northern factory towns, the payments would compensate them for the loss of their “natural rights” to the earth.

The first basic income movement

By the 1790s, Spence had landed in London.

There, hawking radical pamphlets and a sassafras-flavored beverage called saloop out of stalls and storefronts, he spread the gospel of basic income as the French Revolution raged.

A tireless propagandist, he published dialogues, handbills, ballads, anthologies and – when Spence was inevitably arrested – his own trial proceedings. He was imprisoned several times between 1792 and 1802, usually without any trial at all.

When he died in 1814, he had a loyal following of Spenceans, who chalked slogans on walls and sang ballads in the London taverns promoting his plan for unconditional cash dividends.

The doctrine of universal payments was considered so dangerous that the Spenceans were outlawed in 1817.

Nicole Huguenin runs Maui Rapid Response, a nonprofit supporting Maui fire survivors with cash assistance. She’s shown here organizing canned food at the organization’s warehouse in Kahului, Hawaii, in March 2026.
AP Photo/Mengshin Lin

Basic income aims to address dispossession

Until recently, Spence's ideas had found their closest analog in Alaska, which since 1982 has paid every resident a yearly dividend, often several thousand dollars, funded by revenue from oil drilling on state-owned lands.

In my view, Spence’s writings are evidence that the concept of basic income is a response to pervasive dispossession. Two centuries ago, Spence and his followers fought for universal cash payments because enclosure had made ordinary people too dependent on landowners for their livelihoods.

They did not emphasize that money would be good for people, as proponents do now. They argued that money was owed to people.

Today, concerns about AI-driven automation are driving the discussion about basic income. But automation may also be how the 21st-century form of semilegal theft becomes visible. Mounting calls for an “AI dividend,” provided on a regular basis, or “universal basic capital,” received as a lump sum, or even public ownership of AI may all reflect a dawning awareness of a new wave of dispossession.

This time, it’s fueled by the appropriation of humanity’s next common resource: our knowledge and skills.

The Conversation

Will Glovinsky does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Basic income’s appeal today is similar to its roots in 18th-century England – it’s a way to compensate people for a common good taken for private gain – https://theconversation.com/basic-incomes-appeal-today-is-similar-to-its-roots-in-18th-century-england-its-a-way-to-compensate-people-for-a-common-good-taken-for-private-gain-276950

Shiite grief over attacks on Iran’s sacred cities has deep historical roots

Source: The Conversation – USA (3) – By Mary Thurlkill, Professor of Religion, University of Mississippi

Several Shiite communities in South Asia recently refrained from celebrating Eid as they mourned the death of Iran’s supreme leader, Ayatollah Ali Khamenei. From Nigeria to Kashmir – well beyond the Gulf region – the assassination has stirred deep concerns among Shiite Muslims.

Shiite Islam is Iran's official and majority religion. Shiite minorities in other countries tend to view Iranian leaders as protectors and have sometimes risked personal safety to protest the war.

As the violence expands around Tehran, Shiites are not only grieving the death of their leaders but also fear the loss of holy cities and shrines that anchor their collective memory.

Many of the cities targeted in the war today are home to these types of shrines, including Qom, Isfahan and Mashhad. In Isfahan, the 17th-century Jame Abbasi Mosque, also known as Shah Mosque, sustained damage during one of the airstrikes. After the ayatollah’s death, Shiites gathered at Imam Reza Shrine in Mashhad to mourn his loss. In a signal recognized by all Shiites, Iran raised the black flag at the shrine’s dome to mark the community’s shared grief.

Qom, located about 80 miles south of Tehran, has attracted much media attention because of the large-scale military attacks against it. Various social media platforms are showing destroyed buildings and plumes of smoke filling the skyline.

After Khamenei’s death, the city was targeted because the Assembly of Experts gathered there to elect his successor. Israel attacked Qom’s Shokouhiyeh Industrial Zone, known for its drone production companies.

With the news blackout in Iran and Gulf states, it’s impossible to know the impact of military operations on holy sites like Qom. Regardless of the level of material damage, Shiites are deploring the physical and spiritual assaults against their sacred landscape.

That’s because in Shiite Islam, grief is not only personal but collective. As a scholar of medieval Islam and Shiite piety, I have seen how this grief is expressed through rituals, pilgrimage and devotion to saints.

Redemptive suffering

Shared sorrow is a key part of Iran's Twelver Shiite identity, which venerates the Prophet Muhammad's family through his daughter Fatima and his cousin and son-in-law, Ali.

Fatima and Ali’s lineage is called the Imamate, with each individual imam recognized as a sinless spiritual leader. Each imam is responsible for providing guidance as the “proof of God” on earth.

Not all Muslims recognized the leadership of Ali and the imams, however. Some of Muhammad's companions and early leaders of the Umayyad dynasty, which ruled from 661 to 750 C.E., rejected their authority and punished their followers.

According to Shiite tradition, in 680 C.E. supporters of Ali’s family living in Kufa – in modern-day Iraq – appealed to Husayn, the prophet’s grandson, for assistance. They had refused to pledge their allegiance to the Umayyad Caliph Yazid because they viewed him as illegitimate and oppressive.

Husayn gathered a small group of friends and family, including wives, children and siblings, and headed to Kufa. Their party was intercepted outside the city, on the plains of Karbala, by Yazid’s forces led by Umar ibn Sa’d.

Cut off from water and vastly outnumbered, Husayn’s camp suffered for 10 days in the desert, and the Kufans never rallied to their defense. Desperate from thirst, Husayn rode out of the camp with his infant son to appeal for water, but an enemy archer shot an arrow through the child’s neck.

Tradition says that on the 10th of Muharram, Husayn and his companions met Yazid’s military on the battlefield and were massacred. Many of the men were beheaded and women captured; Umar ibn Sa’d marched the spiked heads and shackled women through various towns on the way back to Caliph Yazid in Damascus to deter further protest.

Husayn's Kufan supporters acknowledged their failure to aid the imam and pledged to publicly atone. In 685 C.E., about 4,000 penitents revolted against the Umayyads in Syria; most of them died.

Shiites worldwide still commemorate Husayn’s death at Karbala as a sacrifice for the community’s collective redemption.

Shiites frame their own suffering – from facing injustice to martyrdom – as symbolically participating in Husayn’s sacrifice. Public ceremonies include “taziyeh” plays performed during Muharram that recreate Husayn’s martyrdom and the public recitation of poetry dedicated to his family.

A sacred landscape

The U.S. and Israel associate holy sites such as Qom with underground bunkers, uranium plants and military headquarters. But for Shiites, they are centers of pilgrimage, where the faithful seek connection with God, the imams and their sacred history.

Qom has universities and stunning sacred architecture that date back to the Safavids, a dynasty that ruled Iran from 1501 through 1736. Its seminary is the foremost clerical institution in the world, training students from Lebanon, Iraq and Afghanistan in a wide range of topics, including Shiite jurisprudence, Quranic interpretation and Arabic literature. Women also attend the seminary but with segregated classrooms and some course restrictions.

Iranians mourn the death of Khamenei in a U.S. attack during a demonstration at the Hazrat Masumeh shrine in the city of Qom, Iran, on March 1, 2026.
Stringer/Anadolu via Getty Images

Qom's primary sacred site, the shrine of Fatima bint Musa, who died in 816 C.E., is among the most important pilgrimage destinations for Shiite Muslims worldwide, attracting millions of pilgrims each year.

Popularly known as Fatima Masuma, she was the daughter of the seventh imam, Musa al-Kazim, and sister to the eighth imam, Ali al-Rida.

Iranian Shiites – known as Twelvers – believe there are 12 imams in the Prophet Muhammad’s family lineage with exalted spiritual status, and that the 12th imam never died but went into “hiding.” Shiites know the 12th imam as al-Mahdi, or the messiah: they believe he will return at the world’s end times to restore God’s justice and peace.

According to Fatima’s hagiographies, or popular sacred stories, she remained unmarried and devoted herself to scholarship. She’s known as a trustworthy transmitter of hadith – sayings from the prophet and his family – and she studied the Quran and jurisprudence. She’s especially revered in Shiite Islam because of her kinship with the imams.

The shrine of Fatima Masumeh in Qom, Iran.
Mansoreh Motamedi/Moment via Getty Images

Tradition notes that when Fatima’s father, Musa al-Kazim, was unable to meet visitors with spiritual questions, he directed them to consult his daughter.

During her lifetime, the Abbasid dynasty rose to power in Baghdad and quickly sought to curtail the imamate’s popularity because many Muslims viewed Ali’s family as the only legitimate rulers.

Saintly intercession

Just on the outskirts of Qom is a village called Jamkaran, home to another important pilgrimage site. According to tradition, the 12th imam, or Imam al-Mahdi, appeared to a devotee in the 10th century and requested a shrine be constructed.

From 1995 to 2005 the Iranian government greatly expanded the mosque complex and city infrastructure to support the millions of pilgrims who visit annually.

Shiites believe al-Mahdi is mysteriously present at the site and listens to their concerns. In a popular ritual of prayer and piety, visitors write personal requests on bits of paper and drop them down the “Well of Requests.”

Shiites share political pain and injustice not only with each other but also with the imams, bound in collective grief and prayers for redemption. These traditions help explain the powerful reactions seen across Shiite communities following attacks on sacred sites and the killing of Grand Ayatollah Khamenei.

The Conversation

Mary Thurlkill does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Shiite grief over attacks on Iran’s sacred cities has deep historical roots – https://theconversation.com/shiite-grief-over-attacks-on-irans-sacred-cities-has-deep-historical-roots-278799

Trump’s ‘God Squad’ pits energy vs. endangered species, but it’s a false choice – protecting wildlife can be good for business

Source: The Conversation – USA (2) – By Dan Salas, Director of the Sustainable Landscapes Program, University of Illinois Chicago

Boat strikes can harm or kill whales and are one concern about the oil industry’s environmental impact. Robyn Beck/AFP via Getty Images

There’s a well-worn debate in U.S. politics that goes something like this: Would you rather have abundant and affordable energy or a clean, healthy planet where wildlife can flourish?

It sounds like an either/or choice, but it doesn’t have to be.

Many corporate leaders, including those I’ve worked with, know that wildlife conservation can also be good business.

That’s worth remembering as the Trump administration prepares to convene, for the first time in over 30 years, a special committee known as the “God Squad” that has the power to override one of the nation’s most important environmental protection laws: the Endangered Species Act of 1973.

What is the God Squad?

The Endangered Species Act requires that federal agencies avoid any action that is likely to jeopardize the continued existence of any species listed under the act. That includes federal permits for development, mining, drilling or logging.

To comply with the law, companies can be required to take actions to avoid harming protected species. Those steps can be frustrating when they add delays and expense to already complex development projects.

Early in the law’s history, Congress amended it to include an exemption. It authorized the creation of the Endangered Species Act Committee, made up of federal agency leaders, which could grant exemptions to this prohibition on federal actions considered likely to risk extinction of a listed species.

In one of only three meetings over 50 years, the God Squad in the 1990s considered a request to exempt the northern spotted owl in parts of Oregon targeted for logging. The request was eventually withdrawn.
Polinova via Wikimedia Commons, CC BY

That committee – the God Squad – has seven members, and an exemption may pass only if five or more of them agree. The six permanent members are the secretaries of the interior, agriculture and the Army; the chairman of the Council of Economic Advisers; and the administrators of the Environmental Protection Agency and the National Oceanic and Atmospheric Administration. The seventh member is a designated individual from the affected state or states.

The committee’s rare actions in the past

Meetings of the God Squad are so rare that the committee has gathered only three times in its existence.

The committee’s authority is limited to very uncommon circumstances in which there are no “reasonable and prudent alternatives” that would avoid jeopardizing a listed species or impair a species’ critical habitat.

The committee’s first and most notable case was in 1979. It involved the snail darter, a tiny, then-endangered fish whose habitat would have been harmed by the proposed Tellico Dam in Tennessee. Around the same time, the committee also met to review an exemption application related to water management at the Grayrocks Dam in Wyoming and its effects on endangered whooping cranes downstream in Nebraska.

The Tennessee Valley Authority’s Tellico Dam, where the God Squad rejected a request for an exemption to the Endangered Species Act in 1979, was eventually completed after authorization from Congress.
U.S. Fish and Wildlife Service/Flickr

The third meeting of the committee was in the 1990s, when it considered exempting from the Endangered Species Act multiple timber sales in Oregon and Washington that would likely jeopardize the northern spotted owl.

For Tellico Dam, the committee denied the exemption, but Congress later cleared the way for the dam to be completed. For Grayrocks Dam, the committee granted an exemption but required the Missouri Basin Power Project to preserve habitat and manage water to reduce harm to the cranes.

In the case of the northern spotted owl, exemptions were initially granted for timber sales in Oregon but later withdrawn due to legal challenges and procedural violations. No such exemptions were authorized in Washington state.

Why is it convening now?

The official notice says the meeting is “regarding an exemption under the Endangered Species Act” with respect to oil and gas activities.

In a court document responding to a lawsuit filed over the meeting by the environmental group Center for Biological Diversity, the government wrote that the March 31 meeting was called because the “Secretary of War found it necessary for reasons of national security to exempt from the ESA’s requirements all Gulf of America oil and gas exploration and development activities” associated with the Outer Continental Shelf Oil and Gas Program.

That likely refers to a May 2025 biological opinion by NOAA Fisheries. That opinion found that oil industry operations, particularly vessels striking wildlife, could jeopardize the Rice’s whale and other rare species.

The committee could be considering exemptions to the requirements of that biological opinion, which is being challenged by both conservation groups that want more protections and by industries that consider it too restrictive.

Convening the committee also follows the mandate of President Donald Trump’s January 2025 executive order declaring a “national energy emergency.” That executive order directs the Endangered Species Act Committee to “identify obstacles to domestic energy infrastructure specifically deriving from implementation of the ESA.”

Changing the paradigm

While the common rhetoric often makes it seem like energy and environmental goals are at odds, leading energy and transportation companies have shown otherwise.

At the University of Illinois Chicago’s Energy Resources Center, my colleagues and I find ways conservationists and energy companies can work together, such as through networks like the Rights-of-Way as Habitat Working Group, which focuses on habitat conservation in working landscapes.

Balancing ecological and economic interests is not just a “nice idea” – it has been shown to be good business.

Planning new projects in ways that avoid harm to wildlife and include proactive conservation plans can avoid lawsuits, permit delays, reputational risks and increased costs.

Companies we work with in the energy and infrastructure sectors are finding that integrating ecological principles into projects and conservation practices into operations avoids other business interruptions as well.

For example, maintaining green spaces as wildlife habitat can buffer infrastructure from severe weather, erosion or flooding. Restoring or improving habitats can also reduce legal risks from environmental damage.

Maintaining natural areas on corporate lands can boost species considered at risk, like the monarch butterfly. This is land maintained near a military base.
U.S. Space Force photo by Master Sgt. Carlin Leslie

Programs like the University of Illinois Chicago’s nationwide agreements for monarch butterflies and bumblebees help companies reduce regulatory delays and help conserve endangered and declining species at the same time.

For businesses, this can create positive connections with their employees and the communities where they operate. This, in turn, improves their reputations, which can help reduce project delays and encourage investment.

What happens when the Endangered Species Act Committee convenes may influence more than the future of a few species. It could affect broader actions concerning environmental stewardship, corporate responsibility and federal oversight.

If the committee focuses solely on removing protections for wildlife, it risks eroding public trust and could hinder efforts to foster conservation in the energy industry. If instead the committee considers how to increase cooperation among industry, conservation groups and federal agencies, it could have a lasting positive outcome.

The Conversation

Dan Salas does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Trump’s ‘God Squad’ pits energy vs. endangered species, but it’s a false choice – protecting wildlife can be good for business – https://theconversation.com/trumps-god-squad-pits-energy-vs-endangered-species-but-its-a-false-choice-protecting-wildlife-can-be-good-for-business-279433