Gene Hackman had a will, but the public may never find out who inherits his $80M fortune

Source: The Conversation – USA (2) – By Naomi Cahn, Professor of Law, University of Virginia

Gene Hackman and his wife, Betsy Arakawa, pose for a photo in 1986 in Los Angeles. Donaldson Collection/Michael Ochs Archives via Getty Images

Gene Hackman was found dead inside his New Mexico home on Feb. 26, 2025, at the age of 95. The acclaimed actor’s wife, Betsy Arakawa, had also died of a rare virus – a week before his death from natural causes.

Details about the couple’s plans for Hackman’s reportedly US$80 million fortune are only starting to emerge, months after their deaths were discovered. While their wills have not yet been made public, we have seen them through a reputable source.

Both documents are short, and both sought to give the bulk of the couple’s assets to Hackman’s trust – a legal arrangement that allows someone to state their wishes for how their assets should be managed and distributed. Wills and trusts are similar in that both can be used to distribute someone’s property. They differ in that a trust can take effect during someone’s lifetime and continue long after their death. Wills take effect only upon someone’s death, for the purpose of distributing the assets that person owned.

Both trusts and wills can be administered by someone who does not personally benefit from the property.

Hackman, widely revered for his memorable roles in movies such as “The French Connection,” “Bonnie and Clyde” and “The Birdcage,” made it clear in his will that he wanted the trust to manage his assets, and he apparently named Arakawa as a third-party trustee. But that plan was dashed by Arakawa’s sudden death.

The person managing Hackman’s estate asked the court to appoint a new trustee, a request that the court approved, according to public records. But the court order is not public, and the trust itself remains private, so the public doesn’t yet know who will manage his estate or inherit his fortune. U.S. courts vary in how much access they provide to case records.

As law professors who specialize in trusts and estates, we teach courses about the transfer of property during life and at death. We believe that the drama playing out over Hackman’s assets offers valuable lessons for anyone leaving an estate, large or small, for their loved ones to inherit. It also is a cautionary tale for the tens of millions of Americans in stepfamilies.

‘Pour-over’ wills are a popular technique

The couple signed the wills in 2005, more than a decade before Hackman was diagnosed with dementia. There’s no reason to doubt that Hackman was of sound mind at that time. Although he retired from acting after his last starring role, in 2004’s “Welcome to Mooseport,” and led a very private life for a public figure, Hackman continued to write books and narrate documentaries for several more years.

Based on the wills that we have been able to review, Hackman and Arakawa used a popular estate planning technique that combined two documents: a lifetime trust and a will.

The first document, sometimes called a “living trust,” usually contains the most important details about who ultimately inherits a person’s property once they die. All other instruments – wills, financial and brokerage accounts, and life insurance policies – can pour assets into the trust at death by naming the trustee as the death beneficiary.

The trust is the only document that needs to be updated when life circumstances change, such as divorce, the death of a spouse, or the birth of a child. All of the other planning documents can be left alone because they already name the trustee of the trust as the property recipient.

Hackman also signed a second document, known as a “pour-over” will. A pour-over will is a catchall measure to ensure that anything owned at death ends up in the trust if it wasn’t transferred during life. Hackman’s pour-over will gave his estate at death to Arakawa as the designated trustee of the trust he had created.

The combination of a trust coupled with a pour-over will – a technique that Michael Jackson also used – offers many advantages.

One is that, if the trust is created during life, it can be administered privately at death without the cost, publicity and delay of probate – the court-supervised process for estate administration. That is why, while Hackman’s personal representative filed his will in probate court to administer any remaining property owned at death, the trust created during Hackman’s life can manage assets without court supervision.

An older man and an older woman look puzzled while reading a document.
It’s important to carefully consider what should happen if you both die around the same time.
Inside Creative House/iStock via Getty Images Plus

Who might get what

The trust document has not been made public, but Hackman’s personal representative stated that the trust “contains mainly out-of-state beneficiaries” who will inherit his assets.

Hackman’s beneficiaries are unlikely to be publicly identified because they appear in the trust rather than the pour-over will. His will does not leave anything directly to any relatives. Even Arakawa was not slated to receive anything personally – only in her role as trustee – though the will does mention his children in a paragraph describing his family.

Hackman had three children, all born during his first marriage, to Faye Maltese: Christopher, Elizabeth and Leslie. Hackman had acknowledged that it was hard for them to grow up with an often-absent celebrity father, but his daughters and one granddaughter released a statement after he died about missing their “Dad and Grandpa.” It is possible that Hackman’s children, as well as Arakawa, are named as beneficiaries of the trust.

Arakawa had no children of her own. Little is known about her family, except that her mother, now 91, is still alive. Arakawa’s will gave the bulk of her estate to Hackman as trustee of his trust, but only if he survived her by 90 days. If he failed to survive by 90 days, then she instructed her personal representative to establish a charitable trust “to achieve purposes beneficial to the community” consistent with the couple’s charitable preferences.

Her will refers to charitable “interests expressed … by my spouse and me during our lifetimes.” But it offers no specific guidance on which charities should benefit. Because Hackman did not survive Arakawa by 90 days, no part of her estate will pass to Hackman’s trust or his children.

Christopher Hackman has reportedly hired a lawyer, leading to speculation that he might contest some aspect of his father’s or stepmother’s estates.

Research shows that the average case length of a probate estate is 532 days, but individual cases can vary greatly in length and complexity. It is possible that the public may never learn what happens to the trust if the parties reach a settlement without litigation in court.

Man in tuxedo and a large bowtie stands next to two teenagers who are looking away.
Gene Hackman and his daughters, Elizabeth Hackman and Leslie Hackman, attend the screening of ‘Superman’ in 1978 at the Kennedy Center in Washington, D.C.
Ron Galella Collection via Getty Images

Takeaways for the rest of us

We believe that anyone thinking about who will inherit their property after they die can learn three important lessons from the fate of Hackman’s estate.

First, a living trust can provide more privacy than a will by avoiding the publicity of a court-supervised probate administration. It can also simplify the process for updating the estate plan by avoiding the need to amend multiple documents every time life circumstances change, such as the birth of a child or end of a marriage. Because all estate planning documents pour into the trust, the trust is the only document that requires any updating.

You don’t need a multimillion-dollar estate to justify the cost of creating a living trust. Some online platforms charge less than $400 for help creating one.

Second, remember that even when your closest loved ones are much younger than you are, it’s impossible to predict who will die first. If you do create a living trust, it should include a backup plan in case someone named in it dies before you. You can choose a “contingent beneficiary” – someone who will take the property if the primary beneficiary dies first. You can also choose a successor trustee who will manage the trust if the primary trustee dies first or declines to serve.

Finally, it’s important to carefully consider how best to divide the estate.

Hackman’s children and some of his other relatives may ultimately receive millions through his trust. But parents in stepfamilies must often make difficult decisions about how to divide their estate between a surviving spouse and any children they had with other partners.

The Conversation

The authors do not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.

ref. Gene Hackman had a will, but the public may never find out who inherits his $80M fortune – https://theconversation.com/gene-hackman-had-a-will-but-the-public-may-never-find-out-who-inherits-his-80m-fortune-259650

Too many em dashes? Weird words like ‘delves’? Spotting text written by ChatGPT is still more art than science

Source: The Conversation – USA (2) – By Roger J. Kreuz, Associate Dean and Professor of Psychology, University of Memphis

Language experts fare no better than everyday people. Aitor Diago/Moment via Getty Images

People are now routinely using chatbots to write computer code, summarize articles and books, or solicit advice. But these chatbots are also employed to quickly generate text from scratch, with some users passing off the words as their own.

This has, not surprisingly, created headaches for teachers tasked with evaluating their students’ written work. It’s also created issues for people seeking advice on forums like Reddit, or consulting product reviews before making a purchase.

Over the past few years, researchers have been exploring whether it’s even possible to distinguish human writing from artificial intelligence-generated text. But the best strategies to distinguish between the two may come from the chatbots themselves.

Too good to be human?

Several recent studies have highlighted just how difficult it is to determine whether text was generated by a human or a chatbot.

Research participants recruited for a 2021 online study, for example, were unable to distinguish between human- and ChatGPT-generated stories, news articles and recipes.

Language experts fare no better. In a 2023 study, editorial board members for top linguistics journals were unable to determine which article abstracts had been written by humans and which were generated by ChatGPT. And a 2024 study found that 94% of undergraduate exams written by ChatGPT went undetected by graders at a British university.

Clearly, humans aren’t very good at this.

A commonly held belief is that rare or unusual words can serve as “tells” regarding authorship, just as a poker player might somehow give away that they hold a winning hand.

Researchers have, in fact, documented a dramatic increase in relatively uncommon words, such as “delves” or “crucial,” in articles published in scientific journals over the past couple of years. This suggests that unusual terms could serve as tells that generative AI has been used. It also implies that some researchers are actively using bots to write or edit parts of their submissions to academic journals. Whether this practice reflects wrongdoing is up for debate.
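This word-frequency idea is easy to prototype. The sketch below is a toy illustration, not a validated detector – as the studies described here show, such heuristics barely beat chance – and the marker list is an assumption drawn from words mentioned in this article:

```python
import re

# Words some researchers flag as disproportionately common in
# chatbot-generated text (illustrative list, not a validated detector)
MARKERS = {"delve", "delves", "crucial", "notable", "significant"}

def marker_rate(text):
    """Occurrences of marker words per 1,000 words of text."""
    words = re.findall(r"[a-z]+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in MARKERS)
    return 1000 * hits / len(words)
```

A high rate would merely flag a text for closer reading; on its own it proves nothing about authorship.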

In another study, researchers asked people about characteristics they associate with chatbot-generated text. Many participants pointed to the excessive use of em dashes – an elongated dash used to set off text or serve as a break in thought – as one marker of computer-generated output. But even in this study, the participants’ rate of AI detection was only marginally better than chance.

Given such poor performance, why do so many people believe that em dashes are a clear tell for chatbots? Perhaps it’s because this form of punctuation is primarily employed by experienced writers. In other words, people may believe that writing that is “too good” must be artificially generated.

But if people can’t intuitively tell the difference, perhaps there are other methods for determining human versus artificial authorship.

Stylometry to the rescue?

Some answers may be found in the field of stylometry, in which researchers employ statistical methods to detect variations in the writing styles of authors.

I’m a cognitive scientist who authored a book on the history of stylometric techniques. In it, I document how researchers developed methods to establish authorship in contested cases, or to determine who may have written anonymous texts.

One tool for determining authorship was proposed by the Australian scholar John Burrows. He developed Burrows’ Delta, a computerized technique that examines the relative frequency of common words, as opposed to rare ones, that appear in different texts.

It may seem counterintuitive to think that someone’s use of words like “the,” “and” or “to” can determine authorship, but the technique has been impressively effective.
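The core of the method can be sketched in a few lines. This is a simplified illustration – real implementations use much larger function-word lists and careful tokenization – comparing a test text’s z-scored function-word frequencies against writing samples from candidate authors:

```python
from collections import Counter

def rel_freqs(text, vocab):
    """Relative frequency of each vocabulary word in a text."""
    words = text.lower().split()
    counts = Counter(words)
    return [counts[w] / len(words) for w in vocab]

def burrows_delta(test_text, candidate_texts, vocab):
    """Burrows' Delta: mean absolute difference in z-scored
    function-word frequencies between a test text and each
    candidate text. The lowest delta marks the closest style."""
    profiles = [rel_freqs(t, vocab) for t in candidate_texts]
    cols = list(zip(*profiles))
    means = [sum(c) / len(c) for c in cols]
    stds = [max((sum((x - m) ** 2 for x in c) / len(c)) ** 0.5, 1e-9)
            for c, m in zip(cols, means)]
    z = lambda freqs: [(f - m) / s for f, m, s in zip(freqs, means, stds)]
    z_test = z(rel_freqs(test_text, vocab))
    return [sum(abs(a - b) for a, b in zip(z_test, z(p))) / len(vocab)
            for p in profiles]
```

Scoring a disputed text against samples from two candidate authors yields two delta values; the lower one points to the stylistically closer candidate.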

Black-and-white photographic portrait of young woman with short hair seated and posing for the camera.
A stylometric technique called Burrows’ Delta was used to identify LaSalle Corbell Pickett as the author of love letters attributed to her deceased husband, Confederate Gen. George Pickett.
Encyclopedia Virginia

Burrows’ Delta, for example, was used to establish that Ruth Plumly Thompson, L. Frank Baum’s successor, was the author of a disputed book in the “Wizard of Oz” series. It was also used to determine that love letters attributed to Confederate Gen. George Pickett were actually the inventions of his widow, LaSalle Corbell Pickett.

A major drawback of Burrows’ Delta and similar techniques is that they require a fairly large amount of text to reliably distinguish between authors. A 2016 study found that at least 1,000 words from each author may be required. A relatively short student essay, therefore, wouldn’t provide enough input for a statistical technique to work its attribution magic.

More recent work has made use of what are known as BERT language models, which are trained on large amounts of human- and chatbot-generated text. The models learn the patterns that are common in each type of writing, and they can be much more discriminating than people: The best ones are between 80% and 98% accurate.

However, these machine-learning models are “black boxes” – that is, we don’t really know which features of texts are responsible for their impressive abilities. Researchers are actively trying to find ways to make sense of them, but for now, it isn’t clear whether the models are detecting specific, reliable signals that humans can look for on their own.

A moving target

Another challenge for identifying bot-generated text is that the models themselves are constantly changing – sometimes in major ways.

Early in 2025, for example, users began to express concerns that ChatGPT had become overly obsequious, with mundane queries deemed “amazing” or “fantastic.” OpenAI addressed the issue by rolling back some changes it had made.

Of course, the writing style of a human author may change over time as well, but it typically does so more gradually.

At some point, I wondered what the bots had to say for themselves. I asked ChatGPT-4o: “How can I tell if some prose was generated by ChatGPT? Does it have any ‘tells,’ such as characteristic word choice or punctuation?”

The bot admitted that distinguishing human from nonhuman prose “can be tricky.” Nevertheless, it did provide me with a 10-item list, replete with examples.

These included the use of hedges – words like “often” and “generally” – as well as redundancy, an overreliance on lists and a “polished, neutral tone.” It also mentioned “predictable vocabulary,” including certain adjectives such as “significant” and “notable,” along with academic terms like “implication” and “complexity.” Though it noted that these features of chatbot-generated text are common, it concluded that “none are definitive on their own.”

Chatbots are known to hallucinate, or make factual errors.

But when it comes to talking about themselves, they appear to be surprisingly perceptive.

The Conversation

Roger J. Kreuz does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Too many em dashes? Weird words like ‘delves’? Spotting text written by ChatGPT is still more art than science – https://theconversation.com/too-many-em-dashes-weird-words-like-delves-spotting-text-written-by-chatgpt-is-still-more-art-than-science-259629

Water recycling is paramount for space stations and long-duration missions − an environmental engineer explains how the ISS does it

Source: The Conversation – USA – By Berrin Tansel, Professor of Civil and Environmental Engineering, Florida International University

The water recovery system on the ISS is state of the art. Roscosmos State Space Corporation via AP, File

When you’re on a camping trip, you might have to pack your own food and maybe something to filter or treat water that you find. But imagine your campsite is in space, where there’s no water, and packing jugs of water would take up room when every inch of cargo space counts. That’s a key challenge engineers faced when designing the International Space Station.

Before NASA developed an advanced water recycling system, water made up nearly half the payload of shuttles traveling to the ISS. I am an environmental engineer and have conducted research at Kennedy Space Center’s Space Life Sciences Laboratory. As part of this work, I helped to develop a closed-loop water recovery system.

Today, NASA recovers over 90% of the water used in space. Clean water keeps an astronaut crew hydrated, hygienic and fed, since the crew can use it to rehydrate food. Recovering used water is a cornerstone of closed-loop life support, which is essential for future lunar bases, Mars missions and even potential space settlements.

A rack of machinery.
A close-up view of the water recovery system’s racks – these contain the hardware that provides a constant supply of clean water for four to six crew members aboard the ISS.
NASA

NASA’s environmental control and life support system is a set of equipment and processes that perform several functions to manage air and water quality, waste, atmospheric pressure and emergency response systems such as fire detection and suppression. The water recovery system − one component of environmental control and life support − supports the astronauts aboard the ISS and plays a central role in water recycling.

Water systems built for microgravity

In microgravity environments like the ISS, every form of water available is valuable. The water recovery systems on the ISS collect water from several sources, including urine, moisture in cabin air, and hygiene water from activities such as brushing teeth.

On Earth, wastewater includes various types of water: residential wastewater from sinks, showers and toilets; industrial wastewater from factories and manufacturing processes; and agricultural runoff, which contains fertilizers and pesticides.

In space, astronaut wastewater is much more concentrated than Earth-based wastewater. It contains significantly higher levels of urea – a compound from urine – salts, and surfactants from soaps and materials used for hygiene. To make the water safe to drink, the system needs to remove all of these quickly and effectively.

The water recovery systems used in space employ some of the same principles as Earth-based water treatment. However, they are specifically engineered to function in microgravity with minimal maintenance. These systems also must operate for months or even years without the need for replacement parts or hands-on intervention.

NASA’s water recovery system captures and recycles nearly all forms of water used or generated aboard the space station. It routes the collected wastewater to a system called the water processor assembly, where it is purified into safe, potable water that exceeds many Earth-based drinking water standards.

The water recovery and treatment system on the ISS consists of several subsystems.

Recovering water from urine and sweat

The urine processor assembly recovers about 75% of the water from urine by heating and vacuum compression. The recovered water is sent to the water processor assembly for further treatment. The remaining liquid, called brine, still contains a significant amount of water. So, NASA developed a brine processor assembly system to extract the final fraction of water from this urine brine.

In the brine processor assembly, warm, dry air evaporates water from the leftover brine. A filter separates the contaminants from the water vapor, and the water vapor is collected to become drinking water. This innovation pushed the water recovery system’s overall water recovery rate to an impressive 98%. The remaining 2% is combined with the other waste generated.
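The two-stage arithmetic behind that 98% figure is simple to check. The sketch below is an illustrative mass balance using the article’s round numbers, not NASA’s actual engineering model:

```python
def overall_recovery(first_stage, second_stage):
    """Total fraction of water recovered when a second stage
    reprocesses the water the first stage left behind."""
    return first_stage + (1 - first_stage) * second_stage

# The urine processor recovers ~75% of the water. For the chain to
# reach ~98% overall, the brine processor must recover about 92% of
# the water remaining in the brine: 0.75 + 0.25 * 0.92 = 0.98.
print(round(overall_recovery(0.75, 0.92), 2))  # 0.98
```

The same formula shows why the last few percent are the hardest: each added stage works on an ever-smaller, ever-more-concentrated leftover stream.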

An astronaut in a red shirt holds a small metal cylinder.
The filter used in brine processing has helped achieve 98% recovery.
NASA

The air revitalization system condenses moisture from the cabin air – primarily water vapor from sweat and exhalation – into liquid water. It directs the recovered water to the water processor assembly, which treats all the collected water.

Treating recovered water

The water processor assembly’s treatment process includes several steps.

First, all the recovered water goes through filters to remove suspended particles such as dust. Then, a series of filters removes salts and some of the organic contaminants, followed by a chemical process called catalytic oxidation that uses heat and oxygen to break down the remaining organic compounds. The final step is adding iodine to the water to prevent microbial growth while it is stored.

Japan Aerospace Exploration Agency astronaut Koichi Wakata next to the International Space Station’s water recovery system, which recycles urine and wastewater into drinking water. As Wakata humorously puts it, ‘Here on board the ISS, we turn yesterday’s coffee into tomorrow’s coffee.’

The output is potable water – often cleaner than municipal tap water on Earth.

Getting to Mars and beyond

To make human missions to Mars possible, NASA has estimated that spacecraft must reclaim at least 98% of the water used on board. While self-sustaining travel to Mars is still a few years away, the new brine processor on the ISS has increased the water recovery rate enough that this 98% goal is now within reach. However, more work is needed to develop a compact system that can be used on a spacecraft.

The journey to Mars is complex, not just because of the distance involved, but because Mars and Earth are constantly moving in their respective orbits around the Sun.

The distance between the two planets varies depending on their positions. On average, they’re about 140 million miles (225 million km) apart; at their closest theoretical approach, when the two planets’ orbits bring them close together, they are about 33.9 million miles (54.6 million km) apart.

A typical crewed mission is expected to take about nine months one way. A round-trip mission to Mars, including surface operations and return trajectory planning, could take around three years. In addition, launch windows occur only every 26 months, when Earth and Mars align favorably.
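That 26-month launch-window cycle falls out of the two planets’ orbital periods. A quick back-of-the-envelope calculation using the standard synodic-period formula (the period values below are approximate):

```python
T_EARTH = 365.25   # Earth's orbital period, in days
T_MARS = 686.98    # Mars' orbital period, in days

# Synodic period: time between successive Earth-Mars alignments,
# given by 1/S = 1/T_earth - 1/T_mars
synodic_days = 1 / (1 / T_EARTH - 1 / T_MARS)
synodic_months = synodic_days / 30.44  # average month length in days

print(round(synodic_days))    # about 780 days
print(round(synodic_months))  # about 26 months
```

Because Earth laps Mars only once per synodic period, a missed window means waiting more than two years for the next favorable alignment.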

As NASA prepares to send humans on multiyear expeditions to the red planet, space agencies around the world continue to focus on improving propulsion and perfecting life support systems. Advances in closed-loop systems, robotic support and autonomous operations are all inching the dream of putting humans on Mars closer to reality.

The Conversation

Berrin Tansel does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Water recycling is paramount for space stations and long-duration missions − an environmental engineer explains how the ISS does it – https://theconversation.com/water-recycling-is-paramount-for-space-stations-and-long-duration-missions-an-environmental-engineer-explains-how-the-iss-does-it-260171

To better detect chemical weapons, materials scientists are exploring new technologies

Source: The Conversation – USA – By Olamilekan Joseph Ibukun, Postdoctoral Research Associate in Chemistry, Washington University in St. Louis

German troops make their way through a cloud of smoke or gas during a gas training drill, circa 1916. Henry Guttmann/Hulton Archive via Getty Images

Chemical warfare is one of the most devastating forms of conflict. It leverages toxic chemicals to disable, harm or kill without any physical confrontation. Across various conflicts, it has caused tens of thousands of deaths and affected over a million people through injury and long-term health consequences.

Despite its name, mustard gas isn’t a gas at room temperature – it’s a yellow-brown, oily liquid that can vaporize into a toxic mist. In the 1880s, the German chemist Viktor Meyer refined its synthesis into a more stable form. Mustard gas gained international notoriety during World War I and has been used as a weapon many times since.

A vintage photograph of a soldier poking a cylinder, which releases a cloud of smoke.
German soldiers release poison gas from cylinders during World War I.
Henry Guttmann Collection/Hulton Archive via Getty Images

It is nearly impossible to guarantee that mustard gas will never be used in the future, so the best way to prepare for the possibility is to develop a very easy way to detect it in the field.

My colleagues and I, who are chemists and materials science researchers, are keen on developing a rapid, easy and reliable way to detect toxic chemicals in the environment. But doing so will require overcoming several technological challenges.

Effects on human health and communities

Mustard gas damages the body at the cellular level. When it comes into contact with the skin or eyes or is inhaled, it dissolves easily in fats and tissues and quickly penetrates the body. Once inside the body, it changes into a highly reactive form that attaches to and damages DNA, proteins and other essential parts of cells. Once it reacts with DNA, the damage can’t be undone – it may stop cells from functioning properly and kill them.

Mustard gas exposure can trigger large, fluid-filled blisters on the skin. It can also severely irritate the eyes, leading to redness, swelling and even permanent blindness. When inhaled, it burns the lining of the airways, leading to coughing, difficulty breathing and long-term lung damage. Symptoms often don’t appear for several hours, which delays treatment.

Four photos of people holding out their forearms, which have large blisters.
The forearms of test subjects exposed to nitrogen mustard and lewisite, chemicals that cause large, fluid-filled blisters on the skin.
Naval Research Laboratory

Even small exposures can cause serious health problems. Over time, it can weaken the immune system and has been linked to an increased risk of cancers due to its effects on DNA.

The effects of even a one-time exposure can carry down to the next generation. For example, studies have reported physical abnormalities and disorders in the children of men who were exposed to mustard gas, while some of the men became infertile.

The best way to prevent serious health problems is to detect mustard gas early and keep people away from it.

Detecting mustard gas early

The current methods to detect mustard gas rely on sophisticated chemistry techniques. These require expensive, delicate instruments that are difficult to carry to the war front and are too fragile to be kept in the field as a tool for detecting toxic chemicals. These instruments are conventionally designed for the laboratory, where they stay in one location and are handled carefully.

Many researchers have attempted to improve detection techniques. While each offers a glimpse of hope, they also come with setbacks.

Some scientists have been working on a wearable electrochemical biosensor that could detect mustard gas in both liquid and vapor form. They succeeded in developing tiny devices that provide real-time alerts. But stability became a problem: the enzymes degrade, and environmental noise can cloud the signal. Because of these issues, the devices haven’t been used successfully in the field.

To simplify detection, others developed molecularly imprinted polymer test strips targeting thiodiglycol, a mustard gas breakdown product. These strips change color when they come into contact with this compound, and they’re cheap, portable and easy to use in the field. The main concern is that they detect a chemical present in the aftermath of mustard gas use, not the agent itself, which isn’t quite as effective.

One of the most promising breakthroughs came in 2023 in the form of fluorescent probes: tiny molecular detective tools that generate a color-change signal when they sense the target chemical. But these probes remain vulnerable to environmental interference such as humidity and temperature, making them less reliable in rugged field conditions.

Some other examples under development include a chemical sensor device that families could have at home, or even a wearable device.

Wearable devices are tricky, however, since they need to be small. Researchers have been trying to integrate tiny nanomaterials into sensors. Other teams are looking at how to incorporate artificial intelligence. Artificial intelligence could help a device interpret data faster and respond more quickly.

Researchers bridging the gap

Now at Washington University in St. Louis, Makenzie Walk and I are part of a team of researchers working on detecting these chemicals, led by Jennifer Heemstra and M.G. Finn. Another member is Seth Taylor, a postdoctoral researcher at Georgia Tech.

Our team of researchers hopes to use the lessons learned from prior sensors to develop an easy and reliable way to rapidly detect these chemicals in the field. Our approach will involve testing different molecular sensor designs on compounds modeled after specific chemical weapons. The sensors would initiate a cascade of reactions that generate a bright, colorful fluorescent signal in the laboratory.

We are figuring out which compounds these chemical agents react with best, and which might make good candidates for use in a detector. These tests allow us to determine how much of the chemical will need to be in the air to trigger a reaction that we can detect, as well as how long it will need to be in the air before we can detect it.

Additionally, we are investigating how the structure of the chemicals we work with influences how they react. Some react more quickly than others, and understanding their behavior will help us pick the right compounds for our detector. We want them to be sensitive enough to detect even small amounts of mustard gas quickly, but not so sensitive that they frequently give false positive results.

Eliminating the use of these chemicals would be the best approach to avoid future recurrence. The 1997 Chemical Weapons Convention bans the production, use and accumulation of chemical weapons. But countries such as Egypt, North Korea and South Sudan have not signed or officially adopted the international arms control treaty.

To discourage countries that don’t sign the treaty from using these weapons, other countries can use sanctions. For example, the U.S. learned that Sudan used chemical weapons in 2024 during a conflict, and in response it placed sanctions on the government.

Even without continued use of these chemical weapons, traces of the chemicals may still linger in the environment. Technology that can quickly identify chemical threats in the environment could prevent more disasters from occurring.

As scientists and global leaders collectively strive for a safer world, the ability to detect when a dangerous chemical is released or is present in real time will improve a community’s preparedness, protection and peace of mind.

The Conversation

Makenzie Walk and Jen Heemstra contributed to this article.

The Heemstra lab receives funding from the Defense Threat Reduction Agency (DTRA).

ref. To better detect chemical weapons, materials scientists are exploring new technologies – https://theconversation.com/to-better-detect-chemical-weapons-materials-scientists-are-exploring-new-technologies-257296

More than 50% of Detroit students regularly miss class – and schools alone can’t solve the problem

Source: The Conversation – USA – By Jeremy Singer, Assistant Professor of Education, Wayne State University

Nobody learns in an empty classroom. Jeffrey Basinger/Newsday RM via Getty Images

Thousands of K-12 students in Detroit consistently miss days of school.

Chronic absenteeism is defined as missing at least 10% of school days – or 18 in a 180-day academic year. In Detroit, chronic absenteeism rose during the COVID-19 pandemic and remains a persistent challenge.
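As a rough illustration of that definition, the 10% cutoff is a simple calculation. This sketch is illustrative only; the function name and the 180-day default are assumptions for the example, not something from the article:

```python
def is_chronically_absent(days_missed: int, school_days: int = 180,
                          threshold: float = 0.10) -> bool:
    """A student is chronically absent if missed days reach 10% of the year."""
    return days_missed >= threshold * school_days

# 18 missed days in a 180-day year meets the 10% threshold; 17 does not.
```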

To encourage attendance, the Detroit Public Schools Community District is getting creative. This past year, Michigan’s largest school district awarded US$200 gift cards to nearly 5,000 high schoolers for attending all their classes during a two-week period, and Superintendent Nikolai Vitti also floated the idea of providing bikes to help students get to class. Some district students lack access to reliable transportation.

To understand the consequences of kids regularly missing school, The Conversation U.S. spoke with Sarah Lenhoff, associate professor of education at Wayne State University and director of the Detroit Partnership for Education Equity & Research, an education-focused research collaborative, and Jeremy Singer, an assistant professor of education at Wayne State University. Lenhoff and Singer wrote a book published in March about the socioeconomic drivers of chronic absenteeism in K-12 schools and how policymakers and communities, not just educators, can help.

Is chronic absenteeism the same as truancy?

No. Truancy is how schools have thought about and dealt with student attendance problems since the early days of public education in the United States in the 19th century, and it is still defined in state law across the country. Truancy focuses on “unexcused” absences and compliance with mandatory school attendance laws. By contrast, chronic absenteeism includes any absence – whether “excused” or “unexcused” – because each absence can be consequential for student learning and development.

Chronic absenteeism is usually defined as missing 10% or more of school days. The 10% threshold is somewhat arbitrary, since researchers know that the consequences of missing school accumulate with each day missed. But the specific definition has been solidified in research and by policymakers, and most states now include a measure of chronic absenteeism in their education accountability systems.

How big of a problem is chronic absenteeism in Detroit’s K-12 public schools?

Detroit has among the highest chronic absenteeism rates in the country: more than 50% in recent school years. Prior to the pandemic, the average rate of chronic absenteeism nationwide was about 15%, and it was around 24% in 2024.

In one of our prior studies, we found Detroit’s chronic absenteeism rate was much higher than in other major cities – even others with high absenteeism rates, such as Milwaukee and Philadelphia.

This is related to the depth of social and economic inequalities that Detroit families face. Compared to other major cities, Detroit has higher rates of poverty, unemployment and crime. It has worse public health conditions. And even its winters are some of the coldest of major U.S. cities. All of these factors make it harder for kids to attend school.

Rates of chronic absenteeism spiked in Detroit during the COVID-19 pandemic, as they did statewide. The Detroit Public Schools Community District has come close to returning to its pre-pandemic levels of absenteeism. The rates were 66% in the 2023-24 school year compared to 62% in the school year right before the pandemic began, 2018-19.

Detroit’s charter schools have struggled more to bring down their chronic absenteeism rates post-pandemic, but the numbers are lower overall – 54% in the 2023-24 school year compared to 36% in 2018-19.

A school social worker from Noble Elementary-Middle School protests outside Detroit Public Schools headquarters.
Bill Pugliano/Getty Images

How does missing school affect students?

The connection between attendance and achievement is clear: Students who miss more school on average score worse on reading and math tests. As early as pre-K, being chronically absent is linked to lower levels of school readiness, both academically and behaviorally. By high school, students who miss more school tend to earn lower grades and GPAs and are less likely to graduate.

And it’s not just the absent students who are affected. When more kids in a class miss school regularly, that is associated with lower overall test scores and worse measures of skills such as executive functioning for other students in that class.

Does chronic absenteeism vary by family income or other factors?

Rates of chronic absenteeism are much higher among students from low-income families. In these cases, absenteeism is often driven by factors outside a student’s control, such as unstable housing, unreliable transportation, health issues, lack of access to child care, or parents who work nontraditional hours. These challenges make it harder for students to get to school consistently, even when families are deeply committed to education.

School-based factors also influence attendance. Students are more likely to be chronically absent in schools with weaker relationships with families or a less positive school culture. However, even schools with strong practices may struggle if they serve communities facing deep socioeconomic hardship.

Ultimately, we don’t view chronic absenteeism as an issue of student motivation or family values. Rather, we see it as an issue related to the unequal conditions that shape students’ lives.

Does punishing absent kids or their parents work?

Many schools have suspended students for absences, or threatened their parents with fines or jail time. In some cases, families have lost social services due to their children’s chronic absenteeism.

Research shows these strategies are not only ineffective, they can make the problem worse.

For example, we found that when schools respond with punishment instead of support, they often alienate the very students and families who are already struggling to stay connected. Harsh responses can deepen mistrust between families and schools. When absences are treated as a personal failing caused by a lack of motivation or irresponsibility rather than symptoms of deeper challenges, students and parents may disengage further.

Instead, educators might ask: What’s getting in the way of consistent attendance, and how can we help? That shift from blame to understanding can help improve attendance.

What can policymakers, school districts and community organizations do to reduce chronic absenteeism?

Chronic absenteeism is a societal issue, not just a school problem. In other words, we need to recognize that chronic absenteeism is not a problem that schools can solve alone. While educators work to improve conditions within schools, policymakers and community leaders can take responsibility for the broader factors that influence attendance.

This could look like investing more resources and fostering collaboration across sectors such as health care, housing, transportation and social services to better support students and their families. Community organizations can play a role too, offering wraparound services such as mental health care, access to transportation, and after-school programming, all of which can support families. In the meantime, educators can focus on what they can control: strengthening communication with families, building supportive relationships and helping families connect with existing services that can remove attendance barriers.

The Conversation

Sarah Lenhoff receives funding from the Skillman Foundation, the Joyce Foundation, the Kresge Foundation, the William T. Grant Foundation, the American Institutes for Research, and the Urban Institute.

Jeremy Singer does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. More than 50% of Detroit students regularly miss class – and schools alone can’t solve the problem – https://theconversation.com/more-than-50-of-detroit-students-regularly-miss-class-and-schools-alone-cant-solve-the-problem-260773

Emil Bove confirmed – his appeals court nomination echoed earlier controversies, but with a key difference

Source: The Conversation – USA – By Paul M. Collins Jr., Professor of Legal Studies and Political Science, UMass Amherst

Emil Bove, Donald Trump’s nominee to serve as a federal appeals judge for the 3rd Circuit, is sworn in during a confirmation hearing in Washington, D.C., on June 25, 2025. Bill Clark/CQ-Roll Call, Inc, via Getty Images

President Donald Trump’s nomination of his former criminal defense attorney, Emil Bove, to be a judge on the United States Court of Appeals for the 3rd Circuit was mired in controversy.

On June 24, 2025, Erez Reuveni, a former Department of Justice attorney who worked with Bove, released an extensive, 27-page whistleblower report. Reuveni claimed that Bove, as the Trump administration’s acting deputy attorney general, said “that it might become necessary to tell a court ‘fuck you’” and ignore court orders related to the administration’s immigration policies. Bove’s acting role ended on March 6 when he resumed his current position of principal associate deputy attorney general.

When asked about this statement at his June 25 Senate confirmation hearing, Bove said, “I don’t recall.”

And on July 15, 80 former federal and state judges signed a letter opposing Bove’s nomination. The letter argued that “Mr. Bove’s egregious record of mistreating law enforcement officers, abusing power, and disregarding the law itself disqualifies him for this position.”

A day later, more than 900 former Department of Justice attorneys submitted their own letter opposing Bove’s confirmation. The attorneys argued that “Few actions could undermine the rule of law more than a senior executive branch official flouting another branch’s authority. But that is exactly what Mr. Bove allegedly did through his involvement in DOJ’s defiance of court orders.”

On July 17, Democrats walked out of the Senate Judiciary Committee vote, in protest of the refusal by Chairman Chuck Grassley, a Republican from Iowa, to allow further investigation and debate on the nomination. Republicans on the committee then unanimously voted to move the nomination forward for a full Senate vote.

Late in the evening of July 29, and after two more whistleblower complaints about Bove’s conduct had emerged, the U.S. Senate confirmed Bove’s nomination in a 50-49 vote.

As a scholar of the courts, I know that most federal court appointments are not as controversial as Bove’s nomination. But highly contentious nominations do arise from time to time.

Here’s how three controversial nominations turned out – and how Bove’s nomination was different in a crucial way.

Robert Bork testifies before the Senate Judiciary Committee for his confirmation as associate justice of the Supreme Court in September 1987.
Mark Reinstein/Corbis via Getty Images

Robert Bork

Bork is the only federal court nominee whose name became a verb.

“Borking” is “to attack or defeat (a nominee or candidate for public office) unfairly through an organized campaign of harsh public criticism or vilification,” according to Merriam-Webster.

This refers to Republican President Ronald Reagan’s 1987 appointment of Bork to the Supreme Court.

Reagan called Bork “one of the finest judges in America’s history.” Democrats viewed Bork, a federal appeals court judge, as an ideologically extreme conservative, with their opposition based largely on his extensive scholarly work and opinions on the U.S. Court of Appeals for the District of Columbia Circuit.

In opposing the Bork nomination, Sen. Ted Kennedy of Massachusetts took the Senate floor and gave a fiery speech: “Robert Bork’s America is a land in which women would be forced into back-alley abortions, blacks would sit at segregated lunch counters, rogue police could break down citizens’ doors in midnight raids, schoolchildren could not be taught about evolution, writers and artists could be censored at the whim of government, and the doors of the federal courts would be shut on the fingers of millions of citizens for whom the judiciary is often the only protector of the individual rights that are the heart of our democracy.”

Ultimately, Bork’s nomination failed by a 58-42 vote in the Senate, with 52 Democrats and six Republicans rejecting the nomination.

Ronnie White

In 1997, Democratic President Bill Clinton nominated White to the United States District Court for the Eastern District of Missouri. White was the first Black judge on the Missouri Supreme Court.

Republican Sen. John Ashcroft, from White’s home state of Missouri, led the fight against the nomination. Ashcroft alleged that White’s confirmation would “push the law in a pro-criminal direction.” Ashcroft based this claim on White’s comparatively liberal record in death penalty cases as a judge on the Missouri Supreme Court.

However, there was limited evidence to support this assertion. This led some to believe that Ashcroft’s attack on the nomination was motivated by stereotypes that African Americans, like White, are soft on crime.

Even Clinton implied that race may be a factor in the attacks on White: “By voting down the first African-American judge to serve on the Missouri Supreme Court, the Republicans have deprived both the judiciary and the people of Missouri of an excellent, fair, and impartial Federal judge.”

White’s nomination was defeated in the Senate by a 54-45 party-line vote. In 2014, White was renominated to the same judgeship by President Barack Obama and confirmed by a largely party-line 53-44 vote, garnering the support of a single Republican, Susan Collins of Maine.

Ronnie White, a former justice for the Missouri Supreme Court, testifies during an attorney general confirmation hearing in Washington in January 2001.
Alex Wong/Newsmakers

Miguel Estrada

Republican President George W. Bush nominated Estrada to the Court of Appeals for the District of Columbia Circuit in 2001.

Estrada, who had earned a unanimous “well-qualified” rating from the American Bar Association, faced deep opposition from Senate Democrats, who believed he was a conservative ideologue. They also worried that, if confirmed, he would later be appointed to the Supreme Court.

Miguel Estrada, President George Bush’s nominee to the U.S. Court of Appeals for the District of Columbia, is sworn in during his hearing before Senate Judiciary on Sept. 26, 2002.
Scott J. Ferrell/Congressional Quarterly/Getty Images

However, unlike Bork – who had an extensive paper trail as an academic and judge – Estrada’s written record was very thin.

Democrats sought to use his confirmation hearing to probe his beliefs. But they didn’t get very far, as Estrada dodged many of the senators’ questions, including ones about Supreme Court cases he disagreed with and judges he admired.

Democrats were particularly troubled by allegations that Estrada, when he was screening candidates for Justice Anthony Kennedy, disqualified applicants for Supreme Court clerkships based on their ideology.

According to one attorney: “Miguel told me his job was to prevent liberal clerks from being hired. He told me he was screening out liberals because a liberal clerk had influenced Justice Kennedy to side with the majority and write a pro-gay-rights decision in a case known as Romer v. Evans, which struck down a Colorado statute that discriminated against gays and lesbians.”

When asked about this at his confirmation hearing, Estrada initially denied it but later backpedaled. Estrada said, “There is a set of circumstances in which I would consider ideology if I think that the person has some extreme view that he would not be willing to set aside in service to Justice Kennedy.”

Unlike the Bork nomination, Democrats didn’t have the numbers to vote Estrada’s nomination down. Instead, they successfully filibustered the nomination, knowing that Republicans couldn’t muster the required 60 votes to end the filibuster. This marked the first time in Senate history that a court of appeals nomination was filibustered. Estrada would never serve as a judge.

Bove stands out

As the examples of Bork, Estrada and White make clear, contentious nominations to the federal courts often involve ideological concerns.

This is also true for Bove, who was opposed in part because of the perception that he is a conservative ideologue.

But the main concerns about Bove were related to a belief that he is a Trump loyalist who shows little respect for the rule of law or the judicial branch.

This makes Bove stand out among contentious federal court nominations.

This story, originally published on July 21, 2025, has been updated to reflect the Senate’s confirmation of Bove.

The Conversation

Paul M. Collins Jr. does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Emil Bove confirmed – his appeals court nomination echoed earlier controversies, but with a key difference – https://theconversation.com/emil-bove-confirmed-his-appeals-court-nomination-echoed-earlier-controversies-but-with-a-key-difference-261347

Light pollution is encroaching on observatories around the globe – making it harder for astronomers to study the cosmos

Source: The Conversation – USA – By Richard Green, Astronomer Emeritus, Steward Observatory, University of Arizona

Light pollution from human activity can threaten radio astronomy – and people’s view of the night sky. Estellez/iStock via Getty Images

Outdoor lighting for buildings, roads and advertising can help people see in the dark of night, but many astronomers are growing increasingly concerned that these lights could be blinding us to the rest of the universe.

An estimate from 2023 showed that human-produced light in the night sky is increasing by as much as 10% per year.
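To put that growth rate in perspective, a steady 10% annual increase compounds quickly. A short sketch, with function names chosen for illustration:

```python
import math

def brightness_factor(years: float, annual_growth: float = 0.10) -> float:
    """Cumulative sky-brightness increase after compounding annual growth."""
    return (1 + annual_growth) ** years

def doubling_time(annual_growth: float = 0.10) -> float:
    """Years for sky brightness to double at a constant growth rate."""
    return math.log(2) / math.log(1 + annual_growth)

# At 10% per year, sky brightness roughly doubles in a little over seven years.
```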

I’m an astronomer who has chaired a standing commission on astronomical site protection for the International Astronomical Union and participated in its working groups studying ground-based light pollution.

My work with these groups has centered on the idea that light from human activities now affects astronomical observatories on what used to be distant mountaintops.

Map of North America’s artificial sky brightness, as a ratio to the natural sky brightness.
Falchi et al., Science Advances (2016), CC BY-NC

Hot science in the cold, dark night

While orbiting telescopes like the Hubble Space Telescope or the James Webb Space Telescope give researchers a unique view of the cosmos – particularly because they can see light blocked by the Earth’s atmosphere – ground-based telescopes also continue to drive cutting-edge discovery.

Telescopes on the ground capture light with gigantic, precisely shaped focusing mirrors that can be 20 to 35 feet (6 to 10 meters) wide. Moving all astronomical observations to space to escape light pollution is not feasible: space missions cost far more, and many large ground-based telescopes are already in operation or under construction.

Around the world, there are 17 ground-based telescopes with primary mirrors as big or bigger than Webb’s 20-foot (6-meter) mirror, and three more under construction with mirrors planned to span 80 to 130 feet (24 to 40 meters).

The newest telescope starting its scientific mission right now, the Vera Rubin Observatory in Chile, has a mirror 28 feet (8.4 meters) in diameter and a 3-gigapixel camera. One of its missions is to map the distribution of dark matter in the universe.

To do that, it will collect a sample of 2.6 billion galaxies. The typical galaxy in that sample is 100 times fainter than the natural glow in the nighttime air in the Earth’s atmosphere, so this Rubin Observatory program depends on near-total natural darkness.
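Astronomers usually express such brightness ratios in magnitudes, where a factor of 100 in brightness corresponds to exactly 5 magnitudes. A quick check, with the function name chosen for illustration:

```python
import math

def magnitude_difference(flux_ratio: float) -> float:
    """Convert a brightness (flux) ratio into an astronomical magnitude difference."""
    return 2.5 * math.log10(flux_ratio)

# A galaxy 100 times fainter than the natural sky glow is 5 magnitudes fainter.
```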

The more light pollution there is, the fewer stars a person can see when looking at the same part of the night sky. The image on the left depicts the constellation Orion in a dark sky, while the image on the right is taken near the city of Orem, Utah, a city of about 100,000 people.
jpstanley/Flickr, CC BY

Any light scattered at night – road lighting, building illumination, billboards – would add glare and noise to the scene, greatly reducing the number of galaxies Rubin can reliably measure in the same time, or greatly increasing the total exposure time required to get the same result.

The LED revolution

Astronomers care specifically about artificial light in the blue-green range of the electromagnetic spectrum, as that used to be the darkest part of the night sky. A decade ago, the most common outdoor lighting was from sodium vapor discharge lamps. They produced an orange-pink glow, which meant that they put out very little blue and green light.

Even observatories relatively close to growing urban areas had skies that were naturally dark in the blue and green part of the spectrum, enabling all kinds of new observations.

Then came the solid-state LED lighting revolution. Those lights put out a broad rainbow of color with very high efficiency – meaning they produce lots of light per watt of electricity. The earliest versions of LEDs put out a large fraction of their energy in the blue and green, but advancing technology now gets the same efficiency with “warmer” lights that have much less blue and green.

Nevertheless, the formerly pristine darkness of the night sky now has much more light, particularly in the blue and green, from LEDs in cities and towns, lighting roads, public spaces and advertising.

The broad output of color from LEDs affects the whole spectrum, from ultraviolet through deep red.

The U.S. Department of Energy commissioned a study in 2019 that predicted the higher energy efficiency of LEDs would drive down the amount of power used for lighting at night, with the amount of light emitted staying roughly the same.

But satellites looking down at Earth reveal that just isn’t the case. The amount of light is going steadily up, meaning that cities and businesses kept their electricity bills about the same as energy efficiency improved, and simply got more light.

Natural darkness in retreat

As human activity spreads out over time, many of the remote areas that host observatories are becoming less remote. Light domes from large urban areas slightly brighten the dark sky at mountaintop observatories up to 200 miles (320 kilometers) away. When these urban areas are adjacent to an observatory, the addition to the skyglow is much stronger, making detection of the faintest galaxies and stars that much harder.

The Mt. Wilson Observatory in the Angeles National Forest may look remote, but urban sprawl from Los Angeles means that it is much closer to dense human activity today than it was when it was established in 1904.
USDA/USFS, CC BY

When the Mt. Wilson Observatory was constructed in the Angeles National Forest near Pasadena, California, in the early 1900s, it was a very dark site, far from the 500,000 people then living in Greater Los Angeles. Today, 18.6 million people live in the LA area, and urban sprawl has brought civilization much closer to Mt. Wilson.

When Kitt Peak National Observatory was first under construction in the late 1950s, it was far from metro Tucson, Arizona, with its population of 230,000. Today, that area houses 1 million people, and Kitt Peak faces much more light pollution.

Even telescopes in darker, more secluded regions – like northern Chile or western Texas – experience light pollution from industrial activities like open-pit mining or oil and gas facilities.

European Southern Observatory’s Very Large Telescope at the Paranal site in the sparsely populated Atacama Desert in northern Chile.
J.L. Dauvergne & G. Hüdepohl/ESO, CC BY-ND

The case of the European Southern Observatory

An interesting modern challenge is facing the European Southern Observatory, which operates four of the world’s largest optical telescopes. Their site in northern Chile is very remote, and it is nominally covered by strict national regulations protecting the dark sky.

AES Chile, an energy provider with strong U.S. investor backing, announced a plan in December 2024 for the development of a large industrial plant and transport hub close to the observatory. The plant would produce liquid hydrogen and ammonia for green energy.

Even though it would formally comply with the national lighting norm, the fully built operation could scatter enough artificial light into the night sky to leave the observatory’s currently pristine darkness in a state similar to that of the legacy observatories now near large urban areas.

The location of AES Chile’s planned project in relation to the European Southern Observatory’s telescope sites.
European Southern Observatory, CC BY-ND

This light pollution could mean the facility won’t have the same ability to detect and measure the faintest galaxies and stars.

Light pollution doesn’t only affect observatories. Today, around 80% of the world’s population cannot see the Milky Way at night. Some Asian cities are so bright that the eyes of people walking outdoors cannot become visually dark-adapted.

In 2009, the International Astronomical Union declared that there is a universal right to starlight. The dark night sky belongs to all people – its awe-inspiring beauty is something that you don’t have to be an astronomer to appreciate.

The Conversation

Richard Green is affiliated with the International Astronomical Union and the American Astronomical Society, as well as DarkSky International.

ref. Light pollution is encroaching on observatories around the globe – making it harder for astronomers to study the cosmos – https://theconversation.com/light-pollution-is-encroaching-on-observatories-around-the-globe-making-it-harder-for-astronomers-to-study-the-cosmos-260387

‘AI veganism’: Some people’s issues with AI parallel vegans’ concerns about diet

Source: The Conversation – USA – By David Joyner, Associate Dean and Senior Research Associate, College of Computing, Georgia Institute of Technology

Ethical concerns – like the mistreatment of content creators decried by this protester – drive both veganism and resistance to using AI. Mario Tama/Getty Images

New technologies usually follow the technology adoption life cycle. Innovators and early adopters rush to embrace new technologies, while laggards and skeptics jump in much later.

At first glance, it looks like artificial intelligence is following the same pattern, but a new crop of studies suggests that AI might follow a different course – one with significant implications for business, education and society.

This general phenomenon has often been described as “AI hesitancy” or “AI reluctance.” The typical adoption curve assumes a person who is hesitant or reluctant to embrace a technology will eventually do so anyway. This pattern has repeated over and over – why would AI be any different?

Emerging research on the reasons behind AI hesitancy, however, suggests there are different dynamics at play that might alter the traditional adoption cycle. For example, a recent study found that while some causes of this hesitation closely mirror those regarding previous technologies, others are unique to AI.

As someone who closely watches the spread of AI, I think there may be a better analogy: veganism.

AI veganism

The idea of an AI vegan is someone who abstains from using AI, the same way a vegan is someone who abstains from eating products derived from animals. Generally, the reasons people choose veganism do not fade automatically over time. They might be reasons that can be addressed, but they’re not just about getting more comfortable eating animals and animal products. That’s why the analogy in the case of AI is appealing.

Unlike with many other technologies, it’s important not to assume that skeptics and laggards will eventually become adopters. Many of those refusing to embrace AI actually fit the traditional archetype of an early adopter: the study on AI hesitation focused on college students, who are often among the first demographics to adopt new technologies.

There is some historical precedent for this analogy. Under the hood, AI is just a set of algorithms. Algorithmic aversion is a well-known phenomenon where humans are biased against algorithmic decision-making – even if it is shown to be more effective. For example, people prefer dating advice from humans over advice from algorithms, even when the algorithms perform better.

But the analogy to veganism applies in other ways, providing insights into what to expect in the future. In fact, studies show that three of the main reasons people choose veganism each have a parallel in AI avoidance.

Ethical concerns

One motivation for veganism is concern over the ethical sourcing of animal by-products. Similarly, studies have found that when users are aware that many content creators did not knowingly opt into letting their work be used to train AI, they are more likely to avoid using AI.

Many vegans have ethical concerns about the treatment of animals. Some people who avoid using AI have ethical concerns about the treatment of content creators.
Vuk Valcic/SOPA Images/LightRocket via Getty Images

These concerns were at the center of the Writers Guild of America and Screen Actors Guild-American Federation of Television and Radio Artists strikes in 2023, where the two unions argued for legal protections against companies using creatives’ works to train AI without consent or compensation. While some creators may be protected by such trade agreements, lots of models are instead trained on the work of amateur, independent or freelance creators without these systematic protections.

Environmental concerns

A second motivation for veganism is concern over the environmental impacts of intensive animal agriculture, from deforestation to methane production. Research has shown that the computing resources needed to support AI are growing exponentially, dramatically increasing demand for electricity and water, and that efficiency improvements are unlikely to lower the overall power usage due to a rebound effect, which is when efficiency gains spur new technologies that consume more energy.

One preliminary study found that increasing users’ awareness of AI’s power demands can affect how they use these systems. Another survey, of students at Cambridge University, found that concern about the water used to cool AI systems was a factor in their refusal to use the technology.

a woman in a crowd holds a hand-painted sign
Both AI and meat production spark concerns about environmental impact.
Kichul Shin/NurPhoto via Getty Images

Personal wellness

A third motivation for veganism is concern about the possible negative health effects of eating animals and animal products. A parallel concern may be at work in AI veganism.

A Microsoft Research study found that people who were more confident in using generative AI showed diminished critical thinking. The 2025 Cambridge University survey found some students avoiding AI out of concern that using it could make them lazy.

It is not hard to imagine that the possible negative mental health effects of using AI could drive some AI abstinence in the same way the possible negative physical health effects of an omnivorous diet may drive some to veganism.

How society reacts

Veganism has led to a dedicated industry catering to that diet: some restaurants feature vegan entrees, and some manufacturers specialize in vegan foods. Could some companies likewise use the absence of AI as a selling point for their products and services?

If so, it would be similar to how companies such as DuckDuckGo and the Mozilla Foundation provide alternative search engines and web browsers with enhanced privacy as their main feature.

Vegans remain a small minority in the U.S., with estimates ranging as high as 4% of the population. But the persistence of veganism has enabled a niche market to serve them. Time will tell whether AI veganism takes hold.

The Conversation

David Joyner does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. ‘AI veganism’: Some people’s issues with AI parallel vegans’ concerns about diet – https://theconversation.com/ai-veganism-some-peoples-issues-with-ai-parallel-vegans-concerns-about-diet-260277

When socialists win Democratic primaries: Will Zohran Mamdani be haunted by the Upton Sinclair effect?

Source: The Conversation – USA – By James N. Gregory, Professor of History, University of Washington

Democratic mayoral candidate Zohran Mamdani, right, and Attorney General of New York Letitia James walk in the NYC Pride March on June 29, 2025, in New York. AP Photo/Olga Fedorova

It has happened before: an upset victory by a Democratic Socialist in an important primary election after an extraordinary grassroots campaign.

In the summer of 1934, Upton Sinclair earned the kind of headlines that greeted Zohran Mamdani’s primary victory on June 24, 2025, in the New York City mayoral election.

Mamdani’s win surprised nearly everyone, not just because he beat the heavily favored former Gov. Andrew Cuomo, but because he did so by a large margin and with a unique coalition, and because his Muslim identity and membership in the Democratic Socialists of America should have, in conventional political thinking, made victory impossible.

This sounds familiar, at least to historians like me. Upton Sinclair, the famous author and a socialist for most of his life, ran for governor in California in 1934 and won the Democratic primary election with a radical plan that he called End Poverty in California, or EPIC.

The news traveled the globe and set off intense speculation about the future of California, where Sinclair was then expected to win the general election. His primary victory also generated theories about the future of the Democratic Party, where this turn toward radicalism might complicate the policies of the Democratic administration of Franklin D. Roosevelt.

What happened next may concern Mamdani supporters. Business and media elites mounted a campaign of fear that put Sinclair on the defensive. Meanwhile, conservative Democrats defected, and a third candidate split progressive votes.

In the November election, Sinclair lost decisively to incumbent Gov. Frank Merriam, who would have stood less chance against a conventional Democrat.

As a historian of American radicalism, I have written extensively about Sinclair’s EPIC movement, and I direct an online project that includes detailed accounts of the campaign and copies of campaign materials.

Sinclair’s 1934 campaign initiated the on-again, off-again influence of radicals in the Democratic Party and illustrates some of the potential dynamics of that relationship, which, almost 100 years later, may be relevant to Mamdani in the coming months.

A man waves through the window of a black car.
Upton Sinclair is seen in September 1934 in Poughkeepsie, N.Y., following a conference with President Franklin D. Roosevelt.
Bettmann/Contributor/Getty Images

California, 1934

Sinclair launched his gubernatorial campaign in late 1933, hoping to make a difference but not expecting to win. California remained mired in the Great Depression. The unemployment rate had been estimated at 29% when Roosevelt took office in March 1933 and had improved only slightly since then.

Sinclair’s Socialist Party had failed badly in the 1932 presidential election as Democrat Roosevelt swept to victory. Those poor results included California, where the Democratic Party had been an afterthought for more than three decades.

Sinclair decided that it was time to see what could be accomplished by radicals working within that party.

Reregistering as a Democrat, he dashed off a 64-page pamphlet with the futuristic title “I, Governor of California and How I Ended Poverty.” It detailed his plan to solve California’s massive unemployment crisis by having the state take over idle farms and factories and turn them into cooperatives dedicated to “production for use” instead of “production for profit.”

A black and white photo shows a man on a stage, the American flag behind him, speaking to a crowd.
Sinclair speaks to a group in his campaign headquarters in Los Angeles, Calif., in September 1934.
Bettmann/ Contributor/Getty Images

Sinclair soon found himself presiding over an explosively popular campaign, as thousands of volunteers across the state set up EPIC clubs – numbering more than 800 by election time – and sold the weekly EPIC News to raise campaign funds.

Mainstream Democrats waited too long to worry about Sinclair and then failed to unite behind an alternative candidate. But it would not have mattered. Sinclair celebrated a massive primary victory, gaining more votes than all of his opponents combined.

Newspapers around the world told the story.

“What is the matter with California?” The Boston Globe asked, according to author Greg Mitchell. “That is the farthest shift to the left ever made by voters of a major party in this country.”

Building fear

Primaries are one thing. But in 1934, the November general election turned in a different direction.

Terrified by Sinclair’s plan, business leaders mobilized to defeat EPIC, forming the kind of cross-party coalition that is rare in America except when radicals pose an electoral threat. Sinclair described the effort in a book he wrote shortly after the November election: “I, Candidate for Governor: And How I Got Licked.”

Nearly every major newspaper in the state, including the five Democratic-leaning Hearst papers, joined the effort to stop Sinclair. Meanwhile, a high-priced advertising agency set up bipartisan groups with names like California League Against Sinclairism and Democrats for Merriam, trumpeting the names of prominent Democrats who refused to support Sinclair.

Few people of any party were enthusiastic about Merriam, who had recently angered many Californians by sending the National Guard to break a longshoremen’s strike in San Francisco, only to trigger a general strike that shut down the city.

A black and white photo depicts a billboard criticizing Democrat Upton Sinclair.
A billboard supports Republican Frank Merriam and opposes Democrat Upton Sinclair for governor of California in January 1934.
Bettmann /Contributor/Getty Images

The campaign against Sinclair attacked him with billboards, radio and newsreel programming, and relentless newspaper stories about his radical past and supposedly dangerous plans for California.

EPIC faced another challenge: Raymond Haight, running on the Progressive Party ticket, threatened to divide left-leaning voters.

Sinclair tried to defend himself, energetically denouncing what he called the “Lie Factory” and offering revised, more moderate versions of some elements of the EPIC plan. But the Red Scare campaign worked. Merriam easily outdistanced Sinclair, winning by a plurality in the three-way race.

New York, 2025

Will a Democratic Socialist running for mayor in New York face anything similar in the months ahead?

A movement to stop Mamdani is coming together, and some of its rhetoric resonates with the 1934 campaign to stop Sinclair.

The Guardian newspaper has described Bill Ackman as a “loquacious billionaire hedge funder”; Ackman said he and others in the finance industry are ready to commit “hundreds of millions of dollars” to an opposing campaign.

In 1934, newspapers publicized threats by major companies, most famously Hollywood studios, to leave California in the event of a Sinclair victory. The Wall Street Journal, Fortune magazine and other media outlets have recently warned of similar threats.

And there may be something similar about the political dynamics.

Sinclair’s opponents could offer only a weak alternative candidate. Merriam had few friends and many critics.

In 2025, New York City Mayor Eric Adams, who abandoned the Democratic primary and is now running as an independent, is arguably weaker still, having been rescued by President Donald Trump from a corruption indictment that might have sent him to prison. If he is the best hope to stop Mamdani, the campaign strategy will likely parallel 1934: all attack ads, with little effort to promote Adams.

But there is an important difference in the way the New York contest is setting up. Andrew Cuomo remains on the ballot as an independent, and his name could draw votes that might otherwise go to Adams.

Curtis Sliwa, the Republican candidate, will also be on the ballot. Whereas in 1934 two candidates divided progressive votes, in 2025 three candidates are going to divide the stop-Mamdani votes.

Religion also looms large in the campaign ahead. The New York City metro area’s U.S. Muslim population is said to be at least 600,000, compared to an estimated 1.6 million Jewish residents. Adams has announced that the threat of antisemitism will be the major theme of his campaign.

The stop-Sinclair campaign also relied on religion, focusing on his professed atheism and pulling quotations from books he had written denouncing organized religion. However, a statistical analysis of voting demographics suggests that this effort had little effect.

The Conversation

James N. Gregory does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. When socialists win Democratic primaries: Will Zohran Mamdani be haunted by the Upton Sinclair effect? – https://theconversation.com/when-socialists-win-democratic-primaries-will-zohran-mamdani-be-haunted-by-the-upton-sinclair-effect-260168

Unpacking Florida’s immigration trends − demographers take a closer look at the legal and undocumented population

Source: The Conversation – USA – By Matt Brooks, Assistant Professor of Sociology, Florida State University

Immigration has dominated recent public discourse about Florida, whether it be the opening of Alligator Alcatraz, a migrant detention facility in the middle of the Everglades, or Florida Gov. Ron DeSantis declaring an “immigration emergency” for the state that has lasted more than two years.

As demographers – that is, people who count people – we’ve noticed that this conversation has proceeded largely without the benefit of a clear description of Florida’s immigrant population.

Here’s a snapshot.

How many immigrants are in Florida?

We used data from the Office of Homeland Security Statistics and the American Community Survey, conducted annually by the U.S. Census Bureau. Homeland Security provides estimates of the state’s undocumented population and annual counts of authorized arrivals. Census data allow us to describe the social and economic characteristics of Florida’s immigrant population.

In 2023, the most recent year for which the Department of Homeland Security provides publicly available data, an estimated 590,000 immigrants without legal status were living in Florida. This is the third-largest population of immigrants without legal status in the U.S., behind California and Texas. But in contrast to those two states, the number of immigrants entering Florida illegally has been shrinking since 2018.

On the other hand, DHS data points to recent growth in Florida’s population of immigrants with legal status. This represents a rebound from declines between 2016 and 2020.

In 2023, Florida welcomed 72,850 residents from outside the country. This is just 0.3% of Florida’s population that year. About 95% of these new Florida residents were admitted as lawful permanent residents, or green card holders. The remainder entered as refugees (3%) and people granted asylum (2%).

For comparison, U.S. Census Bureau estimates suggest roughly 640,000 people moved to Florida in 2023 from other states.

Who makes up Florida’s immigrant population?

The American Community Survey data tells us even more about Florida’s immigrant population. The survey estimates that 4,996,874 foreign-born individuals lived in Florida in 2023, up from 3,798,062 in 2013. These numbers include those who are in the U.S. legally and illegally and encompass both recent arrivals and long-term residents.

In 2023, about 22% of Florida residents – and nearly 7% of Florida children – were immigrants. An additional 29% of Florida children have at least one immigrant parent.

According to the American Community Survey, nearly half of Florida’s immigrants were born in Cuba, Haiti, Venezuela, Colombia or Mexico. Despite being born elsewhere, Florida’s immigrants in many ways resemble other Floridians: About 20% hold a bachelor’s degree, compared to 22% of nonimmigrant Floridians, and 13% of both groups have a graduate degree. The vast majority of Florida’s immigrants, 89%, speak English, and most, 57%, are naturalized citizens.

Immigrants make up a disproportionate share of Florida’s workforce, particularly in essential sectors of the state’s economy. They account for more than 47% of Florida’s agricultural workers, 41% of hotel workers and 35% of construction workers.

Florida immigrants also work in sectors that many might not consider to be “immigrant jobs.” They constitute 33% of child care workers, 21% of school and university employees and 27% of health care workers.

Across all sectors, immigrants have lower unemployment rates than nonimmigrants. Although available data cannot tell us the extent to which these numbers are bolstered by undocumented immigrants, the importance of Florida’s immigrants for the state’s economy is undeniable.

Florida’s population is growing faster than that of any other state, boosted by people moving in from abroad and from other states. This growth both reflects and feeds the state’s economic vitality. Between 2019 and 2024, Florida’s GDP grew twice as fast as the nation’s overall.

Is Florida experiencing an “immigration emergency”? That’s for politicians to decide. Our research suggests that policies that discourage new arrivals or encourage – or force – migrants to leave could jeopardize Florida’s robust economy and the well-being of its population.

The Conversation

The authors do not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.

ref. Unpacking Florida’s immigration trends − demographers take a closer look at the legal and undocumented population – https://theconversation.com/unpacking-floridas-immigration-trends-demographers-take-a-closer-look-at-the-legal-and-undocumented-population-261425