Yes, ADHD diagnoses are rising, but that doesn’t mean it’s overdiagnosed

Source: The Conversation – USA (3) – By Carol Mathews, Professor of Psychiatry, University of Florida

Differences in how ADHD is defined explain why the condition is sometimes perceived as overdiagnosed. Catherine Falls Commercial/Moment via Getty Images

Many news outlets have reported an increase – or surge – in attention-deficit/hyperactivity disorder, or ADHD, diagnoses in both children and adults. At the same time, health care providers, teachers and school systems have reported an uptick in requests for ADHD assessments.

These reports have led some experts and parents to wonder whether ADHD is being overdiagnosed and overtreated.

As researchers who have spent our careers studying neurodevelopmental disorders like ADHD, we believe that fears of widespread overdiagnosis are misplaced, perhaps based on a fundamental misunderstanding of the condition.

Understanding ADHD as one end of a spectrum

Discussions about overdiagnosis of ADHD imply that you either have it or you don’t.

However, when epidemiologists ask people in the general population about their symptoms of ADHD, some have a few symptoms, some have a moderate level, and a few have lots of symptoms. But there is no clear dividing line between those who are diagnosed with ADHD and those who are not, since ADHD – much like blood pressure – occurs on a spectrum.

Treating mild ADHD is similar to treating mild high blood pressure – it depends on the situation. Care can be helpful when a doctor considers the details of a person’s daily life and how much the symptoms are affecting them.

Not only can ADHD symptoms be very different from person to person, but research shows that ADHD symptoms can change within an individual. For example, symptoms become more severe when the challenges of life increase.

ADHD symptoms fluctuate depending on many factors, including whether the person is at school or home, whether they have had enough sleep, if they are under a great deal of stress or if they are taking medications or other substances. Someone who has mild ADHD may not experience many symptoms while they are on vacation and well rested, for example, but they may have impairing symptoms if they have a demanding job or school schedule and have not gotten enough sleep. These people may need treatment for ADHD in certain situations but may do just fine without treatment in other situations.

This is similar to what is seen in conditions like high blood pressure, which can change from day to day or from month to month, depending on a person’s diet, stress level and many other factors.

Can ADHD symptoms change over time?

ADHD symptoms start in early childhood and typically are at their worst in mid- to late childhood. Thus, the average age of diagnosis is between 9 and 12 years old. This is also the time when children are transitioning from elementary school to middle school and may be experiencing changes in their environment that make their symptoms worse.

Classes can be more challenging beginning around fifth grade than in earlier grades. In addition, the transition to middle school typically means that children move from having all their subjects taught by one teacher in a single classroom to having to change classrooms with a different teacher for each class. These changes can exacerbate symptoms that were previously well-controlled.

Symptoms can also wax and wane throughout life. For most people, symptoms improve – but may not completely disappear – after age 25, which is also the time when the brain has typically finished developing.

Psychiatric problems that often co-occur with ADHD, such as anxiety or depression, can worsen ADHD symptoms that are already present. These conditions can also mimic ADHD symptoms, making it difficult to know which to treat. High levels of stress leading to poorer sleep, and increased demands at work or school, can also exacerbate or cause ADHD-like symptoms.

Finally, the use of some substances, such as marijuana or sedatives, can worsen, or even cause, ADHD symptoms. In addition to making symptoms worse in someone who already has an ADHD diagnosis, these factors can also push someone who has mild symptoms into full-blown ADHD, at least for a short time.

The reverse is also true: Symptoms of ADHD can be minimized or reversed in people who do not meet full diagnostic criteria once the external cause is removed.

Kids with ADHD often have overlapping symptoms with anxiety, depression, dyslexia and more.

How prevalence is determined

Clinicians diagnose ADHD based on symptoms of inattention, hyperactivity and impulsivity. To make an ADHD diagnosis in children, six or more symptoms in at least one of these three categories must be present. For adults, five or more symptoms are required, but they must begin in childhood. For all ages, the symptoms must cause serious problems in at least two areas of life, such as home, school or work.

Current estimates show that the strict prevalence of ADHD is about 5% in children. In young adults, the figure drops to 3%, and it is less than 1% after age 60. Researchers use the term “strict prevalence” to mean the percentage of people who meet all of the criteria for ADHD based on epidemiological studies. It is an important number because it provides clinicians and scientists with an estimate of how many people are expected to have ADHD in a given group.

In contrast, the “diagnosed prevalence” is the percentage of people who have been diagnosed with ADHD based on real-world assessments by health care professionals. The diagnosed prevalence in the U.S. and Canada ranges from 7.5% to 11.1% in children under age 18. These rates are quite a bit higher than the strict prevalence of 5%.

Some researchers claim that the difference between the diagnosed prevalence and the strict prevalence means that ADHD is overdiagnosed.

We disagree. In clinical practice, the diagnostic rules allow a patient to be diagnosed with ADHD if they have most of the symptoms and those symptoms cause distress, impairment or both, even when they don’t meet the full criteria. And much evidence shows that increases in the diagnosed prevalence can be attributed to diagnosing milder cases that may have been missed previously. The validity of these mild diagnoses is well documented.

Consider children who have five inattentive symptoms and five hyperactive-impulsive symptoms. These children would not meet strict diagnostic criteria for ADHD even though they clearly have a lot of ADHD symptoms. But in clinical practice, these children would be diagnosed with ADHD if they had marked distress, disability or both because of their symptoms – in other words, if the symptoms were interfering substantially with their everyday lives.

So it makes sense that the diagnosed prevalence of ADHD is substantially higher than the strict prevalence.
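As a rough illustration only, and not clinical guidance, the strict counting rule described above can be sketched as a simple decision function. The function name, inputs and simplifications here are hypothetical; real diagnosis involves clinical judgment far beyond symptom counts:

```python
# Hypothetical sketch of the strict counting rule described in the article:
# children need six or more symptoms in at least one category, adults five or
# more, and everyone needs serious problems in at least two areas of life.
# This is an illustration, not a diagnostic tool.

def meets_strict_criteria(inattentive: int, hyperactive_impulsive: int,
                          age: int, impaired_domains: int) -> bool:
    """Return True only if symptom counts and impairment meet the strict bar."""
    threshold = 6 if age < 18 else 5
    enough_symptoms = (inattentive >= threshold
                       or hyperactive_impulsive >= threshold)
    return enough_symptoms and impaired_domains >= 2

# The child from the example above: five symptoms in each category and clear
# impairment, yet the strict rule is not met because neither category reaches six.
print(meets_strict_criteria(inattentive=5, hyperactive_impulsive=5,
                            age=10, impaired_domains=2))  # False
```

This makes the article’s point concrete: a child with ten total symptoms and real impairment falls outside the strict criteria, which is exactly the kind of case clinicians diagnose and treat in practice.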

Middle-aged woman sitting at a table and giving a pill to an adolescent girl, who is sipping a glass of water.
A robust body of literature shows the negative outcomes associated with underdiagnosis and undertreatment of ADHD.
SolStock/E+ via Getty Images

Implications for patients, parents and clinicians

People who are concerned about overdiagnosis commonly worry that people are taking medications they don’t need or diverting resources away from those who need them more. Other concerns are that people may experience side effects from the medications, or that they may be stigmatized by a diagnosis.

Those concerns are important. However, there is strong evidence that underdiagnosis and undertreatment of ADHD lead to serious negative outcomes in school, work, mental health and quality of life.

In other words, the risks of not treating ADHD are well-established. In contrast, the potential harms of overdiagnosis remain largely unproven.

It is important to consider how to manage the growing number of milder cases, however. Research suggests that children and adults with less severe ADHD symptoms may benefit less from medication than those with more severe symptoms.

This raises an important question: How much benefit is enough to justify treatment? These are decisions best made in conversations between clinicians, patients and caregivers.

Because ADHD symptoms can shift with age, stress, environment and other life circumstances, treatment needs to be flexible. For some, simple adjustments like classroom seating changes, better sleep or reduced stress may be enough. For others, medication, behavior therapy or a combination of these interventions may be necessary. The key is a personalized approach that adapts as patients’ needs evolve over time.

The Conversation

Carol Mathews receives funding from the National Institutes of Health and the International OCD Foundation. She is affiliated with the International OCD Foundation, and the Family Foundation for OCD Research. She acts as a consultant for the Office of Mental Health for the State of New York.

Stephen V. Faraone receives research funding from the National Institutes of Health, the European Union, the Upstate Foundation and Supernus Pharmaceuticals. With his institution, he holds US patent US20130217707 A1 for the use of sodium-hydrogen exchange inhibitors in the treatment of ADHD. His continuing medical education programs are supported by The Upstate Foundation, Corium Pharmaceuticals, Tris Pharmaceuticals and Supernus Pharmaceuticals. He acts as a consultant to multiple pharmaceutical companies.

ref. Yes, ADHD diagnoses are rising, but that doesn’t mean it’s overdiagnosed – https://theconversation.com/yes-adhd-diagnoses-are-rising-but-that-doesnt-mean-its-overdiagnosed-257108

Focused sound energy holds promise for treating cancer, Alzheimer’s and other diseases

Source: The Conversation – USA (3) – By Richard J. Price, Professor of Biomedical Engineering, University of Virginia

Focused ultrasound directs powerful beams of energy to specific disease targets in the body. Andriy Onufriyenko/Moment via Getty Images

Sound waves at frequencies above the threshold for human hearing are routinely used in medical care. Also known as ultrasound, these sound waves can help clinicians diagnose and monitor disease, and can also provide first glimpses of your newest family members.

And now, patients with conditions ranging from cancer to neurodegenerative diseases like Alzheimer’s may soon benefit from recent advances in this technology.

I am a biomedical engineer who studies how focused ultrasound – the concentration of sound energy into a specific volume – can be fine-tuned to treat various conditions. Over the past few years, this technology has seen significant growth and use in the clinic. And researchers continue to discover new ways to use focused ultrasound to treat disease.

A brief history of focused ultrasound

Ultrasound is generated with a probe containing a material that converts electrical current into vibrations, and vice versa. As ultrasound waves pass through the body, they reflect off the boundaries of different types of tissue. The probe detects these reflections and converts them back into electrical signals that computers can use to create images of those tissues.

Over 80 years ago, scientists found that focusing these ultrasonic waves into a volume about the size of a grain of rice can heat up and destroy brain tissue. This effect is analogous to concentrating sunlight with a magnifying glass to ignite a dry leaf. Early investigators began testing how focused ultrasound could treat neurological disorders, pain and even cancer.

Frontal brain MRI
MRI of a patient treated for essential tremor using focused ultrasound, with the targeted part of the brain circled in red.
Jmarchn/Wikimedia Commons, CC BY-SA

Yet, despite these early findings, technical hurdles stood in the way of applying focused ultrasound in the clinic. For example, because the skull absorbs ultrasound energy, sending focused beams with high enough energy to reach damaged brain tissue proved difficult. Researchers eventually overcame this problem by integrating large arrays of ultrasound transducers – the probes that convert between electrical signals and vibrations – with image-based information about skull shape and density. This change allowed researchers to better fine-tune the beams to their targets.

Only after scientists made key advances in imaging technology and acoustic physics in recent years has the promise of focused ultrasound begun to be realized in the clinic. Hundreds of clinical trials aimed at treating dozens of conditions have been completed or are ongoing. Researchers have found notable success with a condition called essential tremor, which causes uncontrolled shaking, usually of the hands. Focused ultrasound treatments for essential tremor are now performed routinely at many locations around the world.

I believe some of the most exciting applications for focused ultrasound include improving drug delivery to the brain, stimulating immune responses against cancer, and treating rare diseases of the central nervous system.

Delivering drugs to the brain

The blood-brain barrier is evolution’s exquisite solution to keeping noxious substances away from this most critical organ. The blood-brain barrier is composed of very tightly connected cells lining the inside of blood vessels. It allows only certain types of molecules to enter the brain, protecting against pathogens and toxins. However, the blood-brain barrier is problematic when it comes to treating disease because it blocks therapies from reaching their intended target.

More than 20 years ago, pioneering studies determined that sending low-intensity pulses of focused ultrasound could temporarily open the blood-brain barrier by causing microbubbles in blood vessels to oscillate. This oscillation pushes and pulls on the surrounding vessel walls, briefly opening tiny pores that allow drugs in the bloodstream to penetrate into the brain. Critically, the blood-brain barrier opens only where the focused ultrasound is applied.

Focused ultrasound can allow drugs to reach targeted areas of the brain.

After many years of testing the safety of this technique and improving control of ultrasound energy, researchers have developed several devices using focused ultrasound to open the blood-brain barrier for treatment. Clinical trials testing the ability of these devices to deliver drugs to the brain to treat conditions like glioblastoma, brain metastases and Alzheimer’s disease are underway.

In parallel, there has been significant progress in developing gene therapies for numerous brain diseases. Gene therapy involves fixing or replacing faulty genetic material to treat a specific disease. Applying gene therapy to the brain is especially challenging because such therapies typically do not cross the blood-brain barrier.

Animal studies have shown that using focused ultrasound to open the blood-brain barrier can facilitate the delivery of gene therapies to their intended targets in the brain, opening doors to testing this technique in people.

Stimulating immune responses against cancer

Cancer immunotherapy instructs the patient’s own immune system to fight the disease. However, many patients – especially those afflicted with breast cancer, pancreatic cancer and glioblastoma – have tumors that are immunologically “cold,” meaning they are unresponsive to traditional immunotherapies.

Researchers have learned that focused ultrasound can destroy solid tumors in ways that allow the immune system to better recognize and destroy cancer cells. One way focused ultrasound does this is by turning tumors into debris that then literally flows to the lymph nodes. Once immune cells in the lymph nodes encounter this debris, they can initiate an immune response specifically against the cancer.

Inspired by these breakthroughs, the University of Virginia started the world’s first focused ultrasound immuno-oncology center in 2022 to support research in this area and push the most promising approaches to the clinic. For example, my colleagues are running a clinical trial at the center to test the use of focused ultrasound and immunotherapy to treat patients with advanced melanoma.

Treating rare diseases with focused ultrasound

Research on focused ultrasound has primarily focused on the most devastating and prevalent diseases, such as cancer and Alzheimer’s disease. However, I believe that further developments in, and increased use of, focused ultrasound in the clinic will eventually benefit patients with rare diseases.

One rare disease of particular interest for my lab is cerebral cavernous malformation, or CCM. CCMs are lesions in the brain that occur when the cells that make up blood vessels undergo uncontrolled growth. While uncommon, when these lesions grow and hemorrhage, they can cause debilitating neurological symptoms. The most common treatment for CCM is surgical removal of the brain lesions; however, some CCMs are located in brain areas that are difficult to access, creating a risk of side effects. Radiation is another treatment option, but it, too, can lead to serious adverse effects.

We found that using focused ultrasound to open the blood-brain barrier can improve drug delivery to CCMs. We also observed that focused ultrasound treatment itself could stop CCMs from growing in mice, even without administering a drug. While we don’t yet understand how focused ultrasound stabilizes CCMs, abundant research on the safety of this technique in patients treated for other conditions has allowed neurosurgeons to begin designing clinical trials testing it in people with CCM.

With further research and advancements, I am hopeful that focused ultrasound can become a viable treatment option for many devastating rare diseases.

The Conversation

Richard J. Price receives funding from the National Institutes of Health and the UVA Focused Ultrasound Cancer Immunotherapy Center.

ref. Focused sound energy holds promise for treating cancer, Alzheimer’s and other diseases – https://theconversation.com/focused-sound-energy-holds-promise-for-treating-cancer-alzheimers-and-other-diseases-262622

Flamingos are making a home in Florida again after 100 years – an ecologist explains why they may be returning for good

Source: The Conversation – USA – By Jerome Lorenz, Biology Researcher, Florida International University

Peaches, who was blown into Florida by Hurricane Idalia in 2023, was sighted in Mexico in June 2025. Kara Durda/Audubon Florida

Hurricane Idalia blew a flamboyance, or flock, of 300-400 flamingos that was likely migrating between the Yucatan Peninsula and Cuba off course in August 2023 and unceremoniously deposited the birds across a wide swath of the eastern United States, from Florida’s Gulf Coast all the way up to Wisconsin and east to Pennsylvania.

After Hurricane Idalia, more than 300 credible sightings of flamingos across the eastern U.S. were reported.
Audubon Florida

I’m an estuarine scientist. That means I study ecosystems where fresh water flows into the ocean. I’ve spent 35 years with Audubon Florida studying the ecology of American flamingos and other wading birds in Florida Bay, Everglades National Park. So naturally, I was thrilled and intrigued by the sudden arrival of these flamingos.

One of the birds was rescued in the Tampa area after nearly drowning in the Gulf of Mexico. His rescuers named him Peaches.

A colleague and I were able to place a GPS tracking device and a bright blue band around his spindly leg, with the code “US02” engraved in white letters.

A woman holds a flamingo while two men are trying to put a band on its leg.
Melissa Edwards, Avian Hospital Director at Seaside Seabird Sanctuary, holds Peaches still while Dr. Frank Ridgley of Zoo Miami and the author, Dr. Jerome Lorenz, place a band and GPS tracker on his leg. Dr. Lorenz has banded or supervised the banding of nearly 3,000 roseate spoonbills, but Peaches was his first and only flamingo to date.
Linda Lorenz

We were hoping to track his movements and see whether he ended up settling in Florida. Unfortunately, a few days after Peaches was released back into the wilds of Tampa Bay, the tracking device failed. His last reported sighting was on a beach near Marco Island on Oct. 5, 2023.

Then, in June 2025, I received an email from colleagues at the Rio Lagartos Biosphere Reserve in Yucatan, Mexico, who had photographed Peaches, blue band still in place, nesting in the reserve.

Peaches’ story is the latest piece in the historical puzzle of flamingos in Florida. Though the native population disappeared more than 100 years ago, recent events lead me to believe that flamingos may be coming back to the Sunshine State, and that their return has been facilitated by the concerted effort to restore the Everglades and coastal ecosystems.

Decimation of a population

In 1956, ornithologist Robert Porter Allen, founder of the National Audubon Society’s Everglades Science Center, wrote “The Flamingos: Their Life History and Survival,” which is still considered a seminal document on the history of flamingos in Florida.

In his book, Allen cites several historical and scientific manuscripts from the 1800s that indicate flamboyances of hundreds to thousands were seen in the Everglades, Florida Bay and the Florida Keys.

Allen documents the demise of flamingos in the late 1800s, in Florida and throughout their Caribbean and Bahamian range. Like all wading birds in Florida, they fell victim to the women’s fashion trend of adorning hats with bird feathers. Wading bird feathers were literally worth their weight in gold.

Led by the National Association of Audubon Societies’ vocal opposition, the grassroots environmental movement that followed brought about laws prohibiting the hunting and sale of bird feathers. But enforcement of those laws in sparsely populated Florida was difficult, and on two occasions deputized Audubon wardens were murdered protecting wading bird nesting colonies.

Fortunately, within a few years, societal pressure turned the tide against the practice of wearing feathers. The passage of the Migratory Bird Treaty Act in 1918 officially ended the feather trade.

Given legal protection, most species managed to reestablish huge nesting populations in the Everglades by the 1930s-1940s, presumably migrating from remote populations in Central America and the Caribbean.

Flamingos, however, did not.

A long road to recovery

In 1956, 40 years after hunting had ended, Allen estimated flamingo populations were only about 25% of what they had been in the previous century, with numbers plummeting from 168,000 to 43,000 breeding adults. They nested in significant numbers at only four locations, compared to 29 historically.

Flamingos’ unique breeding behaviors and their longevity – they can live up to 50 years in the wild – may account for their struggle to bounce back. Other Florida wading birds can nest multiple times a year at different locations, laying three to five eggs at a time.

Flamingos, on the other hand, nest only once a year, generally returning to the same location year after year, and lay only one egg. Furthermore, they prefer forming huge nesting colonies, with thousands of nests, in part due to their elaborate group courtship rituals.

Reason to hope

As a result of their rarity from the 1950s to 1980s, scientists – including myself – believed that any flamingos sighted intermittently around Florida were not wild birds but rather escapees from captive populations.

The largest flock observed in the state between 1930 and 1976 was 14 birds spotted in Biscayne Bay in 1934, on the day after Hialeah Race Track in Miami imported a group of about 30 flamingos. The track’s owners had failed to pinion the birds, and they simply flew away upon release.

But my opinion began to change in 2002, when a flamingo that was banded as a chick at Rio Lagartos was photographed in Florida Bay. In 2012, a second bird from Rio Lagartos was photographed.

By that time, I had observed flamingos in Florida Bay on several occasions, including larger flamboyances of 24 and 64 individuals. Although I still thought the majority of these flocks were escapees, the banded birds provided some evidence that at least a few wild flamingos were starting to spend time in Florida.

Then in 2015, my colleagues put a tracking device on a flamingo they had captured at the Key West Naval Air Station. Conchy, as we called him, was given the blue band US01 and released in Florida Bay in December 2015.

He lived in Florida Bay for two years, and the fact that he stayed for that long was proof to me that it was possible for flamingos to make a more permanent home in Florida.

Conchy was banded and given a GPS tracker by Dr. Frank Ridgley of Zoo Miami before being re-released into Florida Bay in 2015.

In 2018, several colleagues and I published a paper laying out evidence from historical accounts, along with previously overlooked museum evidence, that flamingos were native to Florida. We also presented new data from researchers and citizen science portals that strongly indicated wild flamingo numbers were increasing in Florida, suggesting that the population might finally be recovering.

Call it a comeback

Fast-forward to today, and it appears that this slow comeback may finally have legs. Six months after Hurricane Idalia, my colleagues at Audubon Florida and I conducted a weeklong online survey of flamingo sightings in Florida.

We received more than 50 reputable observations. After sorting through these observations to remove duplicates, we concluded that at least 100 flamingos remained in the state.

Then in July 2025, a flock of 125 individuals was photographed in Florida Bay. Based on our observations, my colleagues and I believe that the flamingos that arrived with Idalia may be reestablishing a home in Florida.

Progress toward restoration

The question is, why now? The 24 flamingos I saw in 1992 and the 64 I saw in 2004 didn’t take up permanent residence in the state. So what’s changed?

To me, the answer is clear: Efforts to restore the Everglades and Florida’s coastal ecosystems are beginning to show progress.

When I arrived in the Keys in 1989, Florida Bay was undergoing an ecological collapse. A 1993 interagency report by the federal government found that a hundred years of draining, diking and rerouting the flows of the Everglades to create urban and agricultural lands had raised the salt content of the water, making it uninhabitable for many estuarine animals.

The report noted that the bay’s famous seagrass beds were undergoing a massive die-off, accompanied by algal blooms that depleted oxygen levels, thereby killing fish in large numbers. Mangrove trees were dying on its myriad islands, and birds that for decades had nested in them had disappeared.

These events kick-started Everglades restoration efforts, and in 2000 the U.S. Congress passed the Comprehensive Everglades Restoration Plan with nearly unanimous bipartisan support. With a cost in the tens of billions of dollars, it was to be the largest and most expensive ecological restoration project the world has ever seen.

Today, the bay’s health is vastly improved from the condition I observed in the 1980s. Water flow has gotten better, and the salinity is back to levels appropriate to support wildlife.

In 2018 and 2021, more than 100,000 pairs of wading birds such as white ibis, wood storks and roseate spoonbills nested in the Everglades. These numbers hadn’t been seen since the 1940s. In the 1980s and 1990s, a year with 20,000 nesting pairs was considered a banner year.

While the Everglades and Florida Bay are still a long way from full restoration, I believe that the return of flamingos such as Conchy and Peaches is evidence that these efforts are on the right track.

The Conversation

Jerome Lorenz has received funding from The Lynn and Louis Wolfson II Family Foundation, the Batchelor Foundation and the Ron Magill Conservation Endowment. He is retired from the National Audubon Society but still does some volunteer work for the Everglades Science Center.

ref. Flamingos are making a home in Florida again after 100 years – an ecologist explains why they may be returning for good – https://theconversation.com/flamingos-are-making-a-home-in-florida-again-after-100-years-an-ecologist-explains-why-they-may-be-returning-for-good-258658

Government shutdown hasn’t left US consumers glum about the economy – for now, at least

Source: The Conversation – USA (2) – By Joanne Hsu, Research Associate Professor at the Institute for Social Research, University of Michigan

Economic clouds gathering? Perhaps not yet. Brendan Smialowski/AFP via Getty Images

The ongoing federal shutdown has resulted in a pause on regular government data releases, meaning economic data has been in short supply of late. That has left market-watchers and monetary policymakers somewhat in the dark over key indicators in the U.S. economy.

Fortunately, the University of Michigan’s Surveys of Consumers is unaffected by the impasse in Washington and released its preliminary monthly report on Oct. 10, 2025; the final read of the month will be released in two weeks.

The Conversation U.S. spoke with Joanne Hsu, the director of the Surveys of Consumers, on what the latest data shows about consumer sentiment – and whether the shutdown has left Americans feeling blue.

What is consumer sentiment?

Consumer sentiment is something that we at the University of Michigan have measured since 1946. It looks at American attitudes toward the current state of the economy and the future direction of the economy through questions on personal finances, business conditions and buying conditions for big-ticket items.

Over the decades, it has been closely followed by policymakers, business leaders, academic researchers and investors as a leading indicator of the overall state of the economy.

When sentiment is on the decline, consumers tend to pull back on spending – and that can lead to a slowdown in the economy. The opposite is also true: High or rising sentiment tends to lead to increased spending and a growing economy.

How is the survey compiled?

Every month, we interview a random sample of the U.S. population across the 48 contiguous states and the District of Columbia. Around 1,000 or so people take part in it every month, and we include a representative sample across ages, income, education level, demography and geography. People from across all walks of life are asked around 50 questions pertaining to the economy, personal finances, job prospects, inflation expectations and the like.

When you aggregate that all together, it gives a useful measure of the health of the U.S. economy.

What does the latest survey show?

The latest survey shows virtually no change in overall sentiment between September and October. Consumers are not feeling that optimistic at the moment, but generally no worse than they were last month.

Pocketbook issues – high prices of goods, inflation and possible weakening in the labor market – are suppressing sentiment. Views of consumers across the country converged earlier in the year when the Trump administration’s tariffs were announced. Since then, higher-wealth and higher-income consumers have reported improved sentiment, but for lower-income Americans – those who don’t own stocks – sentiment hasn’t lifted since April.

In October, we also saw a slight decline in inflation expectations, though they remain relatively high – midway between their level of about a year ago and the highs around the time of the tariff announcements in April and May.

Has the government shutdown affected consumer sentiment?

The government shutdown was in place for around half the time of the latest survey period, which ran from Sept. 23-Oct. 6, 2025. And so far, we are not seeing evidence that it is impacting consumer sentiment one way or another.

And that is not super-surprising. It is not that people don’t care about the shutdown, just that it hasn’t affected how they see the economy and their personal finances yet.

History shows that federal shutdowns do move the needle a little. In 2019, around 10% of people spontaneously mentioned the then-shutdown in the January survey. We saw a decline in sentiment in that month, but it did improve again the following month.

Looking back, we tend to see a stronger reaction to shutdowns when a debt ceiling crisis is attached. In 2013, for example, there was a decline in consumer sentiment coinciding with concerns that the debt ceiling would be breached. But it quickly rebounded when the government reopened.

Whether or not we see a decline in sentiment because of the current shutdown depends on how long it lasts – and how consumers believe it will impact pocketbook issues, namely prices and job prospects.

The Conversation

Joanne Hsu receives research funding from NIA, NIH, and various sponsors of the University of Michigan Surveys of Consumers.

ref. Government shutdown hasn’t left US consumers glum about the economy – for now, at least – https://theconversation.com/government-shutdown-hasnt-left-us-consumers-glum-about-the-economy-for-now-at-least-267264

New president of The Church of Jesus Christ of Latter-day Saints inherits a global faith far more diverse than many realize

Source: The Conversation – USA (3) – By Brittany Romanello, Assistant Professor of Sociology, University of Arkansas

Missionary Sayon Ang holds up a sign signifying she speaks Cambodian during the twice-annual conference of The Church of Jesus Christ of Latter-day Saints on Oct. 4, 2014, in Salt Lake City. AP Photo/Kim Raff

The Church of Jesus Christ of Latter-day Saints has spent the past few weeks in a moment of both mourning and transition. On Sept. 28, 2025, a shooting and arson at a Latter-day Saints meetinghouse in Michigan killed four people and wounded eight more. Just a day earlier, Russell M. Nelson, president of the church, had died at age 101. Dallin H. Oaks, the longest-serving of the church’s top leaders, was announced as the new president on Oct. 14.

Oaks will inherit leadership of a religious institution that is both deeply American and increasingly global – diversity at odds with the way it’s typically represented in mainstream media, from “The Secret Life of Mormon Wives” to “The Book of Mormon” Broadway musical.

As a cultural anthropologist and ethnographer, I research Latter-day Saints communities across the United States, particularly Latina immigrants and young adults. When presenting my research, I’ve noticed that many people still closely associate the church with Utah, where its headquarters are located.

An ornate white building with a tall spire, and green mountains in the background.
The Latter-day Saints temple in Cochabamba, Bolivia, was dedicated in 2000.
Parallelepiped09/Wikimedia Commons, CC BY-SA

The church has played a pivotal role in Utah’s history and culture. Today, though, only 42% of its residents are members. The stereotype of Latter-day Saints as mostly white, conservative Americans is just one of many long-standing misconceptions about LDS communities and beliefs.

Many people are surprised to learn there are vibrant congregations far from the American West’s “Mormon Corridor.” There are devout Latter-day Saints everywhere from Ghana and the United Arab Emirates to Russia and mainland China.

Global growth

Joseph Smith founded The Church of Jesus Christ of Latter-day Saints in upstate New York in 1830 and immediately sent missionaries to preach along the frontier. The first overseas missionaries traveled to England in 1837.

Shortly after World War II, church leaders overhauled their missionary approach to increase the number of international missions. This strategy led to growth across the globe, especially in Central America, South America and the Pacific Islands.

Today, the church has over 17.5 million members, according to church records. A majority live outside the U.S., spread across more than 160 countries.

One way the church and researchers track this global growth is by the construction of new temples. These buildings, used not for weekly worship but for special ceremonies like weddings, were once almost exclusively located in the United States. Today, they exist in dozens of countries, from Argentina to Tonga.

During Nelson’s presidency, which began in 2018, he announced 200 new temples, more than any of his predecessors. Temples are a physical and symbolic representation of the church’s commitment to being a global religion, although cultural tensions remain.

Two men in suits walk by a large map of the world framed on the wall of a hallway.
Two missionaries for The Church of Jesus Christ of Latter-day Saints walk through the Missionary Training Center in Provo, Utah, in 2008.
AP Photo/George Frey

Among U.S. members, demographics are also shifting. Seventy-two percent of American members are white, down from 85% in 2007, according to the Pew Research Center. Growing numbers of Latinos – 12% of U.S. members – have played a significant role sustaining congregations across the country.

There are congregations in every U.S. state, including the small community of Grand Blanc, Michigan, site of the tragic shooting. Suspect Thomas Jacob Sanford, who was fatally shot by police, had recently gone on a tirade against Latter-day Saints during a conversation with a local political candidate.

In the following days, an American member of the church raised hundreds of thousands of dollars for Sanford’s family.

Growing pains

Despite the church’s diversity, its institutional foundations remain firmly rooted in the United States. The top leadership bodies are still composed almost entirely of white men, and most are American-born.

As the church continues to grow, questions arise about how well the norms of a Utah-based church fit the realities of members in Manila or Mexico City, Bangalore or Berlin. How much room is there, even in U.S. congregations, for local cultural expressions of faith?

Latino Latter-day Saints and members in Latin America, for example, have faced pushback against cultural traditions that were seen as distinctly “not LDS,” such as making altars and giving offerings during Dia de los Muertos. In 2021, the church launched a Spanish-language campaign using Day of the Dead imagery to increase interest among Latinos. Many members were happy to see this representation. Still, some women I spoke with said that an emphasis on whiteness and American nationalism, as well as anti-immigrant rhetoric they’d heard from other members, deterred them from fully celebrating their cultures.

A couple dressed nicely and holding hands walks by a large portrait of Jesus, portrayed as a bearded white man, inside a large hallway.
People attend the twice-annual conference of The Church of Jesus Christ of Latter-day Saints on April 6, 2024, in Salt Lake City.
AP Photo/Rick Bowmer

Even aesthetic details, like musical styles, often reflect a distinctly American model. The standardized hymnal, for example, contains patriotic songs like “America the Beautiful.” This emphasis on American culture can feel especially out of sync in countries with high membership rates and histories of U.S. military or political intervention.

Expectations about clothing and physical appearance, too, have prompted questions about representation, belonging and authority. It was only in 2024, for instance, that the church offered members in humid areas sleeveless versions of the sacred garments Latter-day Saints wear under clothing as a reminder of their faith.

Historically, the church viewed tattoos as taboo – a violation of the sanctity of the body. Many parts of the world have thousands of years of sacred tattooing traditions – including Oceania, which has high rates of church membership.

Change ahead?

Among many challenges, the next president of the church will navigate how to lead a global church from its American headquarters – a church that continues to be misunderstood and stereotyped, sometimes to the point of violence.

A white building in the distance, with palm trees and a clear reflecting pool in the foreground.
The temple in Laie, Hawaii, opened in the early 1900s, making it one of the church’s oldest.
Kaveh/Wikimedia Commons, CC BY-SA

The number of Latter-day Saints continues to grow in many parts of the world, but this growth brings a greater need for cultural sensitivity. The church, historically very uniform in its efforts to standardize Latter-day Saints history, art and teachings, is finding that harder to maintain when congregations span dozens of countries, languages, customs and histories.

Organizing the church like a corporation, with a top-down decision-making process, can also make it difficult to address painful racial histories and the needs of marginalized groups, like LGBTQ+ members.

The transition in leadership offers an opportunity not only for the church but for the broader public to better understand the multifaceted, global nature of Latter-day Saints’ lives today.

This article has been updated with Dallin Oaks officially named president of The Church of Jesus Christ of Latter-day Saints on Oct. 14.

The Conversation

Brittany Romanello does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. New president of The Church of Jesus Christ of Latter-day Saints inherits a global faith far more diverse than many realize – https://theconversation.com/new-president-of-the-church-of-jesus-christ-of-latter-day-saints-inherits-a-global-faith-far-more-diverse-than-many-realize-266337

Typhoon leaves flooded Alaska villages facing a storm recovery far tougher than most Americans will ever experience

Source: The Conversation – USA (2) – By Rick Thoman, Alaska Climate Specialist, University of Alaska Fairbanks

A Coast Guard helicopter flies over flooded homes in Kipnuk, Alaska, on Oct. 12, 2025. U.S. Coast Guard

Remnants of a powerful typhoon swept into Western Alaska’s Yukon-Kuskokwim Delta on Oct. 12, 2025, producing a storm surge that flooded villages as far as 60 miles up the river. The water pushed homes off their foundations and set some afloat with people inside, officials said. More than 50 people had to be rescued in Kipnuk and Kwigillingok, hundreds were displaced in the region, and at least one person died.

Typhoon Halong was an unusual storm, likely fueled by the Pacific’s near-record warm surface temperatures this fall. Its timing means recovery will be even more difficult than usual for these hard-hit communities, as Alaska meteorologist Rick Thoman of the University of Alaska Fairbanks explains.

Disasters in remote Alaska are not like disasters anywhere in the lower 48 states, he explains. While East Coast homeowners recovering from a nor’easter that flooded parts of New Jersey and other states the same weekend can run to Home Depot for supplies or drive to a hotel if their home floods, none of that exists in remote Native villages.

Homes are jumbled together and buildings are surrounded by water after the storm.
Ex-Typhoon Halong’s storm surge and powerful winds knocked homes off their foundations and set some afloat.
U.S. Coast Guard via AP

What made this storm unusual?

Halong was an ex-typhoon, similar to Merbok in 2022, by the time it reached the delta. A week earlier, it had been a powerful typhoon east of Japan. The jet stream picked it up and carried it to the northeast, which is pretty common, and weather models did a pretty good job in forecasting its track into the Bering Sea.

But as the storm approached Alaska, everything went sideways.

The weather model forecasts changed, reflecting a faster-moving storm, and Halong shifted to a very unusual track, moving between Saint Lawrence Island and the Yukon-Kuskokwim Delta coast.

A weather map with the storm's locations over time, headed toward Nome but then turning.
Ex-Typhoon Halong’s storm track showing its turn toward Alaska’s Yukon-Kuskokwim Delta.
Rick Thoman

Unlike Merbok, which was very well forecast by the global models, this one’s final track and intensity weren’t clear until the storm was within 36 hours of crossing into Alaska waters. That’s too late for evacuations in many places.

Did weather balloon launches canceled in 2025 affect the forecast?

That’s a question for future research, but here’s what we know for sure: There have not been any upper-air weather balloon observations at Saint Paul Island in the Bering Sea since late August or at Kotzebue since February. Bethel and Cold Bay are limited to one per day instead of two. At Nome, there were no weather balloons for two full days as the storm was moving toward the Bering Sea.

Did any of this cause the forecast to be off? We don’t know, because we don’t have the data, but it seems likely that the missing observations had some effect on model performance.

Why is the delta region so vulnerable in a storm like Halong?

The land in this part of western Alaska is very flat, so major storms can drive the ocean into the delta, and the water spreads out.

Most of the land there is very close to sea level, in some places less than 10 feet above the high tide line. Permafrost is also thawing, land is subsiding, and sea-level rise is adding to the risk. For many people, there is literally nowhere to go. Even Bethel, the region’s largest town, about 60 miles up the Kuskokwim River, saw flooding from Halong.

These are very remote communities with no roads to cities. The only way to access them is by boat or plane. Right now, they have a lot of people with nowhere to live, and winter is closing in.

Native residents of Kipnuk discuss the challenges of permafrost loss and climate change in their village. Alaska Institute for Justice.

These villages are also small. They don’t have extra housing or the resources to rapidly recover. The region was already recovering from major flooding in summer 2024. Kipnuk’s tribe was able to get federal disaster aid, but that aid was approved only in early January 2025.

What are these communities facing in terms of recovery?

People are going to have really difficult decisions to make. Do they leave the community for the winter and hope to rebuild next summer?

There likely isn’t much available housing in the region, with the flooding so widespread on top of a housing shortage. Do displaced people go to Anchorage? Cities are expensive.

There is no easy answer.

It’s logistically complicated to rebuild in places like Kipnuk. You can’t just get on the phone and call up your local building contractor.

Almost all of the supplies – everything from plywood to nails and windows – have to come in by barge, and that isn’t going to happen in winter. You can’t truck it in – there are no roads. Planes can only fly in small amounts – the runways are short and not built for cargo planes.

The National Guard might be able to help fly in supplies. But then you still need to have people who can do the construction and other repair work.

Everything is 100 times more complicated when it comes to building in remote communities. Even if national or state help is approved, it would be next summer before most homes could be rebuilt.

Is climate change playing a role in storms like these?

That will be another question for future research, but sea-surface temperature in most of the North Pacific that Typhoon Halong passed over before reaching the Aleutian Islands has been much warmer than normal. Warm water fuels storms.

An animated map shows anomalously warm temperatures across the ocean between Japan and the U.S.
A comparison of daily sea-surface temperatures shows how anomalously warm much of the northern Pacific Ocean was ahead of and during Typhoon Halong.
NOAA Coral Reef Watch

Halong also brought lots of very warm air northward with it. East of the track on Oct. 11, Unalaska reached 68 degrees Fahrenheit (20 degrees Celsius), an all-time high there for October.

The Conversation

Rick Thoman retired from National Weather Service Alaska Region in 2018.

ref. Typhoon leaves flooded Alaska villages facing a storm recovery far tougher than most Americans will ever experience – https://theconversation.com/typhoon-leaves-flooded-alaska-villages-facing-a-storm-recovery-far-tougher-than-most-americans-will-ever-experience-267423

What the First Amendment doesn’t protect when it comes to professors speaking out on politics

Source: The Conversation – USA (2) – By Neal H. Hutchens, University Research Professor of Education, University of Kentucky

Employees at public and private colleges do not have the same First Amendment rights. dane_mark/Royalty-free

American colleges and universities are increasingly firing or punishing professors and other employees for what they say, whether it’s on social media or in the classroom.

After the Sept. 10, 2025, killing of conservative activist Charlie Kirk, several universities – including Iowa State University, Clemson University and Ball State University – fired or suspended employees for making negative online comments about Kirk.

Some of these dismissed professors compared Kirk to a Nazi, described his views as hateful, or said there was no reason to be sorry about his death.

Some professors are now suing their employers for taking disciplinary action against them, claiming they are violating their First Amendment rights.

In one case, the University of South Dakota fired Phillip Michael Cook, a tenured art professor, after he posted on Facebook in September that Kirk was a “hate spreading Nazi.” Cook, who took down his post within a few hours and apologized for it, then sued the school, saying it was violating his First Amendment rights.

A federal judge stated in a Sept. 23 preliminary order that the First Amendment likely protected what Cook posted. The judge ordered the University of South Dakota to reinstate Cook, and the university announced on Oct. 4 that it would reverse Cook’s firing.

Cook’s lawsuit, as well as other lawsuits filed by dismissed professors, is testing how much legal authority colleges have over their employees’ speech – both when they are on the job and when they are not.

For decades, American colleges and universities have traditionally encouraged free speech and open debate as a core part of their academic mission.

As scholars who study college free speech and academic freedom, we recognize that these events raise an important question: When, if ever, can a college legally discipline an employee for what they say?

A university campus with various buildings and trees is seen from above.
An aerial view of University of South Dakota’s Vermillion campus, one of the places where a professor was recently fired for posting comments about Charlie Kirk, a decision that was later reversed.
anup khanal – CC BY-SA 4.0

Limits of public employees’ speech rights

The First Amendment limits the government’s power to censor people’s free speech. People in the United States can, for instance, join protests, criticize the government and say things that others find offensive.

But the First Amendment only applies to the government – which includes public colleges and universities – and not private institutions or companies, including private colleges and universities.

This means private colleges typically have wide authority to discipline employees for their speech.

In contrast, public colleges are considered part of the government. The First Amendment limits the legal authority they have over their employees’ speech. This is especially true when an employee is speaking as a private citizen – such as participating in a political rally outside of work hours, for example.

The Supreme Court ruled in a landmark 1968 case that public employees’ speech rights as private citizens can extend to criticizing their employer, like if they write a letter critical of their employer to a newspaper.

The Supreme Court also ruled in 2006 that the First Amendment does not protect public employees from being disciplined by their employers when they say or write something as part of their official job duties.

Even when a public college employee is speaking outside of their job duties as a private citizen, they might not be guaranteed First Amendment protection. To reach this legal threshold, what they say must be about something of importance to the public, or what courts call a “matter of public concern.”

Talking or writing about news, politics or social matters – such as Kirk’s killing – often meets the legal test for when speech is about a matter of public concern.

In contrast, courts have ruled that personal workplace complaints or gossip typically do not warrant free speech protection.

And in some cases, even when a public employee speaks as a private citizen on a topic that a court considers a matter of public concern, their speech may still be unprotected.

A public employer can still convince a court that its reasons for prohibiting an employee’s speech – like preventing conflict among co-workers – are important enough to deny this employee First Amendment protection.

Lawsuits brought by employees of public colleges and universities who have been fired for their comments about Kirk will likely be decided based on whether what they said or wrote amounts to a matter of public concern. Another important factor is whether a court is convinced that an employee’s speech about Kirk was disruptive enough to a college’s operations to justify the firing.

Academic freedom and professors’ speech

There are also questions over whether professors at public universities, in particular, can cite other legal rights to protect their speech.

Academic freedom refers to a faculty member’s rights connected to their teaching and research expertise.

At both private and public colleges, professors’ work contracts – like the ones typically signed after receiving tenure – potentially provide legal protections for faculty speech connected to academic freedom, such as in the classroom.

However, the First Amendment does not apply to how a private college regulates its professors’ speech or academic freedom.

Professors at public colleges have at least the same First Amendment free speech rights as their fellow employees, like when speaking in a private citizen capacity.

Additionally, the First Amendment might protect a public college professor’s work-related speech when academic freedom concerns arise, like in their teaching and research.

In 2006, the Supreme Court left open the question of whether the First Amendment covers academic freedom, in a case where it found the First Amendment did not cover what public employees say when carrying out their official work.

Since then, the Supreme Court has not dealt with this complicated issue. And lower federal courts have reached conflicting decisions about First Amendment protection for public college professors’ speech in their teaching and research.

A large gray stone plaque shows the First Amendment in front of a green grassy field and buildings in the distance.
The First Amendment is on display in front of Independence Hall in Philadelphia.
StephanieCraig/iStock via Getty Images Plus

Future of free speech for university employees

Some colleges, especially public ones, are testing the legal limits of their authority over their employees’ speech.

These incidents demonstrate a culture of extreme political polarization in higher education.

Beyond legal questions, colleges are also grappling with how to define their commitments to free speech and academic freedom.

In particular, we believe campus leaders should consider the purpose of higher education. Even if legally permitted, restricting employees’ speech could run counter to colleges’ traditional role as places for the open exchange of ideas.

The Conversation

The authors do not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.

ref. What the First Amendment doesn’t protect when it comes to professors speaking out on politics – https://theconversation.com/what-the-first-amendment-doesnt-protect-when-it-comes-to-professors-speaking-out-on-politics-266128

What are climate tipping points? They sound scary, especially for ice sheets and oceans, but there’s still room for optimism

Source: The Conversation – USA (2) – By Alexandra A Phillips, Assistant Teaching Professor in Environmental Communication, University of California, Santa Barbara

Meltwater runs across the Greenland ice sheet in rivers. The ice sheet is already losing mass and could soon reach a tipping point. Maria-José Viñas/NASA

As the planet warms, it risks crossing catastrophic tipping points: thresholds where Earth systems, such as ice sheets and rain forests, change irreversibly over human lifetimes.

Scientists have long warned that if global temperatures rose more than 1.5 degrees Celsius (2.7 degrees Fahrenheit) above preindustrial levels and stayed high, the risk of passing multiple tipping points would increase. For each of these elements, like the Amazon rain forest or the Greenland ice sheet, hotter temperatures lead to melting ice or drier forests that leave the system more vulnerable to further changes.

Worse, these systems can interact. Freshwater melting from the Greenland ice sheet can weaken ocean currents in the North Atlantic, disrupting air and ocean temperature patterns and marine food chains.

World map showing locations for potential tipping points.
Pink circles show the systems closest to tipping points. Some would have regional effects, such as loss of coral reefs. Others are global, such as the beginning of the collapse of the Greenland ice sheet.
Global Tipping Points Report, CC BY-ND

With these warnings in mind, 194 countries a decade ago set 1.5 C as a goal they would try not to cross. Yet in 2024, the planet temporarily breached that threshold.

The term “tipping point” is often used to illustrate these problems, but apocalyptic messages can leave people feeling helpless, wondering if it’s pointless to slam the brakes. As a geoscientist who has studied the ocean and climate for over a decade and recently spent a year on Capitol Hill working on bipartisan climate policy, I still see room for optimism.

It helps to understand what a tipping point is – and what’s known about when each might be reached.

Tipping points are not precise

A tipping point is a metaphor for runaway change. Small changes can push a system out of balance. Once past a threshold, the changes reinforce themselves, amplifying until the system transforms into something new.
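This self-reinforcing behavior can be made concrete with a toy model (my illustration, not from the report): a system whose stabilizing feedback is overwhelmed once forcing crosses a critical value. The sketch below integrates the classic fold-bifurcation equation dx/dt = x − x³ + f. Below the critical forcing f ≈ 0.385 the system sits in a stable "lower" state; past it, that state vanishes and the system runs away to a new equilibrium – and stays there even when the forcing is removed (hysteresis).

```python
import math

def simulate(f_values, x0=-1.0, dt=0.01, steps_per_f=5000):
    """Integrate dx/dt = x - x**3 + f with forward Euler, holding each
    forcing level long enough for the system to settle near equilibrium.
    Returns the settled state at each forcing level."""
    x = x0
    settled = []
    for f in f_values:
        for _ in range(steps_per_f):
            x += dt * (x - x**3 + f)
        settled.append(x)
    return settled

# Critical forcing where the lower equilibrium disappears: 2/(3*sqrt(3)).
f_crit = 2.0 / (3.0 * math.sqrt(3.0))   # ~0.385

# Ramp forcing up past the threshold, then back down to zero.
up = [i * 0.05 for i in range(10)]       # f = 0.00 .. 0.45
down = list(reversed(up))
states_up = simulate(up)                           # starts near x = -1
states_down = simulate(down, x0=states_up[-1])

tipped = states_up[-1] > 0    # jumped to the upper branch past f_crit
stuck = states_down[-1] > 0   # stays there even after forcing returns to 0
```

The hysteresis is the crucial point: reversing the forcing does not reverse the change, which is why tipping elements are described as irreversible on human timescales.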

Almost as soon as “tipping points” entered the climate science lexicon – following Malcolm Gladwell’s 2000 book, “The Tipping Point: How Little Things Can Make a Big Difference” – scientists warned the public not to confuse global warming policy benchmarks with precise thresholds.

A tall glacier front seen from above shows huge chunks of ice calving off into Disko Bay.
The Greenland ice sheet, which is 1.9 miles (3 kilometers) thick at its thickest point, has been losing mass for several years as temperatures rise and more of its ice is lost to the ocean. A tipping point would mean runaway ice loss, with the potential to eventually raise sea level 24 feet (7.4 meters) and shut down a crucial ocean circulation.
Sean Gallup/Getty Images

The scientific reality of tipping points is more complicated than crossing a temperature line. Instead, different elements in the climate system have risks of tipping that increase with each fraction of a degree of warming.

For example, the beginning of a slow collapse of the Greenland ice sheet, which could raise global sea level by about 24 feet (7.4 meters), is one of the most likely tipping elements in a world more than 1.5 C warmer than preindustrial times. Some models place the critical threshold at 1.6 C (2.9 F). More recent simulations estimate runaway conditions at 2.7 C (4.9 F) of warming. Both simulations consider when summer melt will outpace winter snow, but predicting the future is not an exact science.

Bars with gradients show the rising risk as temperatures rise that key systems, including Greenland ice sheet and Amazon rain forest, will reach tipping points.
Gradients show science-based estimates from the Global Tipping Points Report of when key global or regional climate tipping points are increasingly likely to be reached. Every fraction of a degree increases the likeliness, reflected in the warming color.
Global Tipping Points Report 2025, CC BY-ND

Forecasts like these are generated using powerful climate models that simulate how air, oceans, land and ice interact. These virtual laboratories allow scientists to run experiments, increasing the temperature bit by bit to see when each element might tip.

Climate scientist Timothy Lenton first identified climate tipping points in 2008. In 2022, he and his team revisited temperature collapse ranges, integrating over a decade of additional data and more sophisticated computer models.

Their nine core tipping elements include large-scale components of Earth’s climate, such as ice sheets, rain forests and ocean currents. They also simulated thresholds for smaller tipping elements that pack a large punch, including die-offs of coral reefs and widespread thawing of permafrost.

A few fish swim among branches of a white coral skeleton during a bleaching event.
The world may have already passed one tipping point, according to the 2025 Global Tipping Points Report: Corals reefs are dying as marine temperatures rise. Healthy reefs are essential fish nurseries and habitat and also help protect coastlines from storm erosion. Once they die, their structures begin to disintegrate.
Vardhan Patankar/Wikimedia Commons, CC BY-SA

Some tipping elements, such as the East Antarctic ice sheet, aren’t in immediate danger. The ice sheet’s stability is due to its massive size – nearly six times that of the Greenland ice sheet – making it much harder to push out of equilibrium. Model results vary, but they generally place its tipping threshold between 5 C (9 F) and 10 C (18 F) of warming.

Other elements, however, are closer to the edge.

Alarm bells sounding in forests and oceans

In the Amazon, self-perpetuating feedback loops threaten the stability of the Earth’s largest rain forest, an ecosystem that influences global climate. As temperatures rise, drought and wildfire activity increase, killing trees and releasing more carbon into the atmosphere, which in turn makes the forest hotter and drier still.

By 2050, scientists warn, nearly half of the Amazon rain forest could face multiple stressors. That pressure may trigger a tipping point with mass tree die-offs. The once-damp rainforest canopy could shift to a dry savanna for at least several centuries.

Rising temperatures also threaten biodiversity underwater.

The second Global Tipping Points Report, released Oct. 12, 2025, by a team of 160 scientists including Lenton, suggests tropical reefs may have passed a tipping point that will wipe out all but isolated patches.

Coral loss on the Great Barrier Reef. Australian Institute of Marine Science.

Corals rely on algae called zooxanthellae to thrive. Under heat stress, the algae leave their coral homes, draining reefs of nutrition and color. These mass bleaching events can kill corals, stripping the ecosystem of vital biodiversity that millions of people rely on for food and tourism.

Low-latitude reefs have the highest risk of tipping, with the upper threshold at just 1.5 C, the report found. Above this amount of warming, there is a 99% chance that these coral reefs tip past their breaking point.

Similar alarms are ringing for ocean currents, where freshwater ice melt is slowing down a major marine highway that circulates heat, known as the Atlantic Meridional Overturning Circulation, or AMOC.

Two illustrations show how the AMOC looks today and its expected weaker state in the future
How the Atlantic Ocean circulation would change as it slows.
IPCC 6th Assessment Report

The AMOC carries warm water northward from the tropics. In the North Atlantic, as sea ice forms, the surface gets colder and saltier, and this dense water sinks. The sinking action drives the return flow of cold, salty water southward, completing the circulation’s loop. But melting land ice from Greenland threatens the density-driven motor of this ocean conveyor belt by dilution: Fresher water doesn’t sink as easily.

A weaker current could create a feedback loop, slowing the circulation further and leading to a shutdown within a century once it begins, according to one estimate. Like a domino, the climate changes that would accompany an AMOC collapse could worsen drought in the Amazon and accelerate ice loss in the Antarctic.

There is still room for hope

Not all scientists agree that an AMOC collapse is close. For the Amazon rain forest and the North Atlantic, some researchers cite a lack of evidence that the forest is collapsing or that the currents are weakening.

In the Amazon, researchers have questioned whether the modeled vegetation data that underpin tipping point concerns are accurate. In the North Atlantic, there are similar concerns about whether the available measurements show a genuine long-term weakening trend.

A map of the Amazon shows large areas along its edges and rivers in particular losing tree cover
The Amazon forest has been losing tree cover to logging, farming, ranching, wildfires and a changing climate. Pink shows areas with greater than 75% tree canopy loss from 2001 to 2024. Blue is tree cover gain from 2000 to 2020.
Global Forest Watch, CC BY

Climate models that predict collapses are also less accurate when forecasting interactions between multiple tipping points. Some interactions can push systems out of balance, while others pull an ecosystem closer to equilibrium.

Other changes driven by rising global temperatures, like melting permafrost, likely don’t meet the criteria for tipping points because they aren’t self-sustaining. Permafrost could refreeze if temperatures drop again.

Risks are too high to ignore

Despite the uncertainty, tipping points are too risky to ignore. Rising temperatures put people and economies around the world at greater risk of dangerous conditions.

But there is still room for preventive actions – every fraction of a degree in warming that humans prevent reduces the risk of runaway climate conditions. For example, a full reversal of coral bleaching may no longer be possible, but reducing emissions and pollution can allow reefs that still support life to survive.

Tipping points highlight the stakes, but they also underscore the climate choices humanity can still make to stop the damage.

The Conversation

Alexandra A Phillips does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. What are climate tipping points? They sound scary, especially for ice sheets and oceans, but there’s still room for optimism – https://theconversation.com/what-are-climate-tipping-points-they-sound-scary-especially-for-ice-sheets-and-oceans-but-theres-still-room-for-optimism-265183

The limits of free speech protections in American broadcasting

Source: The Conversation – USA – By Michael J. Socolow, Professor of Communication and Journalism, University of Maine

FCC Chairman Brendan Carr testifies in Washington on May 21, 2025. Brendan Smialowski/AFP via Getty Images

The chairman of the Federal Communications Commission is displeased with a broadcast network. He makes his displeasure clear in public speeches, interviews and congressional testimony.

The network, afraid of the regulatory agency’s power to license its owned-and-operated stations, responds quickly by changing the content of its broadcasts. Network executives understand that the FCC’s criticism is supported by the White House and that the chairman implicitly represents the president.

I’m not just referring to the recent controversy between FCC Chairman Brendan Carr, ABC and Jimmy Kimmel. The same chain of events has happened repeatedly in U.S. history.

President Franklin Delano Roosevelt’s FCC chairman, James Lawrence Fly, warned the networks about censoring news commentators.

Then there was John F. Kennedy’s FCC chairman, Newton Minow, who criticized the networks for not airing more news and public affairs programming to support American democracy during the Cold War.

And there was George W. Bush’s FCC chairman, Michael Powell. He decided that a fleeting “wardrobe malfunction” during the 2004 Super Bowl halftime show – when Janet Jackson’s breast was exposed – was sufficient to punish CBS with a fine.

In each of those cases, the FCC represented the views of the White House. And in each case, the regulatory agency was employed to pressure the networks into airing content more aligned with the administration’s ideology.

But what’s interesting in those four examples is that two of the FCC chairmen were Democrats – Fly and Minow – and two were Republicans – Powell and Carr.

As a media historian, I’m aware of the long-existing bipartisan enthusiasm for exploiting the fact that no First Amendment exists in American broadcasting. Pressuring broadcasters by leveraging FCC power occurs regardless of which party controls the White House. And when the agency is used in partisan fashion, the rival party will criticize such politicization of regulation as a threat to free speech.

This recurring cycle is made possible by the fact that broadcasting is licensed by the government. Since a Supreme Court decision in 1943, the supremacy of the FCC in broadcast regulation has been unquestioned.

Such strong governmental oversight separates broadcasting from any other medium of mass communication in the United States. And it’s the reason why there’s no “free speech” when it comes to Kimmel, or any other performer, on U.S. airwaves.

The FCC’s empowerment

Since its establishment in 1934, the FCC’s primary role in broadcasting has been to authorize local station licenses “in the public interest, convenience, or necessity.”

In 1938, the FCC began its first investigation into network practices and policies, which resulted in new regulations. One of the new rules stated that no network could own and operate more than one licensed station in any single market. This forced NBC, which owned two networks that operated stations in several markets, to divest itself of one of its networks. NBC sued.

In the first serious constitutional test of the FCC’s full authority, in 1943, the Supreme Court vindicated the FCC’s expansive power over all U.S. broadcasting in its 5–4 verdict in National Broadcasting Co. v. United States. The ruling has stood ever since.

That’s why there’s no First Amendment in broadcasting. The Supreme Court ruled that, due to spectrum scarcity – the idea that the airwaves are a limited public resource and therefore not every American can operate a broadcast station – the FCC’s power over broadcasting must be expansive.

The 1934 act, the 1943 Supreme Court decision read, “gave the Commission … expansive powers … and a comprehensive mandate to ‘encourage the larger and more effective use of radio in the public interest,’ if need be, by making ‘special regulations applicable to radio stations engaged in chain (network) broadcasting.’”

The ruling also explains why the FCC can be credited with having created the American Broadcasting Company. Yes, the same ABC that suspended Kimmel in the face of FCC threats was the network that emerged from NBC’s forced divestiture of its Blue Network as a result of the 1943 Supreme Court decision.

The empowerment of the FCC by NBC v. U.S. led to such content restrictions as the Fairness Doctrine, instituted in 1949 and intended to ensure balanced political broadcasting, and later, additional FCC rules against obscenity and indecency on the airwaves. The Supreme Court decision also encouraged FCC chairmen to flex their regulatory muscles in public more often.

A Black woman and white man sing onstage.
A federal appeals court ruled on Nov. 2, 2011, that CBS should not be fined US$550,000 for Janet Jackson’s infamous ‘wardrobe malfunction.’
AP Photo/David Phillip

For example, when CBS suspended news commentator Cecil Brown in 1943 for truthful but critical news commentary about the U.S. World War II effort, FCC Chairman Fly expressed his displeasure with the network’s decision.

“It is a little strange,” Fly told the press, “that all Americans are to enjoy free speech except radio commentators.”

When FCC Chairman Minow complained about television in the U.S. devolving into a “vast wasteland” in 1961, the networks responded both defensively and productively. They invested far more money into news and public affairs programming. That led to significantly more news reporting and documentary production throughout the 1960s and 1970s.

A ‘hands-off’ FCC

In the early 2000s, FCC Chairman Powell promised to “refashion the FCC into an outfit that is fast, decisive and, above all, hands-off.”

Yet his promise to be “hands-off” did not apply to content regulation. In 2004, his FCC concluded a contentious legal battle with Clear Channel Communications over comments by shock jock Howard Stern that were ruled “indecent.” The settlement resulted in a US$1.75 million payment by Clear Channel Communications – the largest fine ever collected by the FCC for speech on the airwaves.

Powell apparently enjoyed policing content, as evidenced by the $550,000 fine his FCC levied against CBS for the fleeting exposure of singer Janet Jackson’s breast during the Super Bowl. The fine was eventually overturned. But Powell did successfully lobby Congress to significantly hike the amounts the FCC could fine broadcasters for indecency. The maximum fine for a single incident increased from $32,000 to $325,000, and up to $3 million if a network airs the content on multiple stations.

Powell’s regulatory activism, done mostly to curb the outrageous antics of radio shock jocks, resulted in some of the most significant and long-lasting restrictions on broadcast freedom in U.S. history. Thus, Carr’s 2025 threats toward ABC can be viewed in a historical context as an extension of established FCC activism.

Demonstrators hold signs in front of a building with columns.
Demonstrators hold signs on Sept. 18, 2025, outside Los Angeles’ El Capitan Entertainment Centre, where the late-night show ‘Jimmy Kimmel Live!’ is staged.
AP Photo/Damian Dovarganes

But Carr’s threat also appeared to contradict his previously espoused values.

As the author of the FCC section in Project 2025, a conservative blueprint for federal government policies, Carr wrote: “The FCC should promote freedom of speech … and pro-growth reforms that support a diversity of viewpoints.” In exploiting the FCC’s licensing power to threaten to penalize speech he found offensive, Carr failed to promote either freedom of speech or diversity of viewpoints.

If there’s one thing the Carr-Kimmel episode teaches us, it’s that more Americans should understand the structural constraints in the U.S. system of broadcasting. Media literacy has proved essential as curbs on free expression – both official and unofficial – have become more popular.

When the FCC threatens a broadcaster, it does so in Americans’ name.

If Americans applaud regulatory activism when it supports their partisan beliefs, consistency demands they accept the same regulatory activism in the hands of their political opponents. If Americans prefer their political opposition show restraint in the regulation of broadcasting, then they need to promote restraint when their preferred administration is in power.

The Conversation

Michael J. Socolow does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. The limits of free speech protections in American broadcasting – https://theconversation.com/the-limits-of-free-speech-protections-in-american-broadcasting-266206

Industrial facilities owned by profitable companies release more of their toxic waste into the environment

Source: The Conversation – USA (2) – By Mahelet G Fikru, Professor of Economics, Missouri University of Science and Technology

Toxic chemical pollution can come in many forms, including compounds that float on top of water. Brett Hondow/iStock / Getty Images Plus

How much pollution an industrial or mining facility emits isn’t determined just by its location, its industry or the type of work it does. That’s what our team of environmental and financial economists found when we examined how corporate characteristics shape pollution emissions.

Pollution emissions rates also vary with specific characteristics of the company that owns the facility – such as how many patents it holds, how profitable it is and how many employees it has, according to an analysis we have conducted of corporate pollution data.

We found that industrial and mining facilities owned by profitable companies with relatively few patents and fewer employees tend to release higher proportions of their toxic waste into the environment – into the air, into water or onto soil.

By contrast, industrial sites owned by unprofitable companies with higher levels of innovation and more personnel tend to handle higher proportions of their toxic waste in more environmentally responsible ways, such as processing it into nontoxic forms, recycling it or burning it to generate energy.

Corporations publish their pollution data

A 1986 federal law requires companies that are in certain industries, employ more than 10 people and make, use or process significant amounts of certain toxic or dangerous chemicals to tell the government where those chemicals go after the company is done with them.

That data is collected by the U.S. Environmental Protection Agency in a database called the Toxics Release Inventory. That data includes information about the companies, their facilities and locations, and what they do with their waste chemicals.

The goal is not only to inform the public about which dangerous chemicals are being used in their communities, but also to encourage companies to use cleaner methods and handle their waste in ways that are more environmentally responsible.

Overall, U.S. companies reported releasing 3.3 billion pounds (1.5 billion kilograms) of toxic chemicals to the environment in 2023, a 21% decrease from 2014. The decline reflects increased waste management and the adoption of pollution prevention and cleaner technologies, as well as the fact that disclosure requirements motivate companies to reduce releases.

The 2023 releases came from over 21,600 industrial facilities in all 50 states and various U.S. territories, including Puerto Rico, the U.S. Virgin Islands, Guam and American Samoa. One-fifth of the facilities reporting toxic releases in 2023 were in Texas, Ohio and California.

What kinds of businesses release toxic pollution?

Metal mining, chemical manufacturing, primary metals, natural gas processing and electric utilities represent the top five polluting industrial sectors in the U.S. Combined, businesses in those sectors accounted for 78% of the toxic chemicals released in 2023.

Research has found that, often, higher levels of toxic chemical releases come from industrial facilities in less populated, economically disadvantaged, rural or minority communities.

But geography and population are not the whole story. Even within the same area, some facilities pollute a lot less than others. Our inquiry into the differences between those facilities has found that corporate characteristics matter a lot – such as operational size, innovative capacity and financial strength.

In our analysis, we combined the data companies reported to the EPA about toxic chemical releases with financial information on those companies and ZIP-code level geographic and demographic data. We found that corporate characteristics like profitability, employment size and number of patents are more strongly connected with toxic chemical releases than a community’s population density, minority-group percentage or household income.

We looked at what percentage of its toxic chemical waste a facility or mine released to the environment versus how much it treated, recycled or incinerated.

The average facility in our sample, which included 1,976 facilities owned by companies for which financial data is available, released about 39% of its toxic chemical waste to the environment, whether to air, water or land – with the remaining 61% of it managed through recycling, treatment or energy recovery either on-site or off-site.
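The release-rate metric described above can be expressed as a simple ratio: the amount released to air, water or land, divided by the total toxic waste generated. The following is a minimal sketch with made-up numbers, not the researchers’ actual code or data; the function name and inputs are illustrative only.

```python
def release_rate(released_lbs, managed_lbs):
    """Share of a facility's toxic waste released to air, water or land,
    versus managed through recycling, treatment or energy recovery."""
    total = released_lbs + managed_lbs
    if total == 0:
        return 0.0
    return released_lbs / total

# A hypothetical facility releasing 39 tons and managing 61 tons matches
# the 39%/61% average split reported for the study's sample.
print(round(release_rate(39.0, 61.0), 2))  # prints 0.39
```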

But facilities in different industries have different release rates. For example, about 99% of toxic chemicals from coal mines are released to the environment, compared with 81% for natural gas extraction, recovery and processing; 25% for power-generating electric utilities; and less than 3% for electrical equipment manufacturers.

The role of innovation

One corporate attribute we examined was innovation, which we measured by counting corporations’ patent families, which are groups of patent documents related to the same invention, even if they are filed in different countries. We found that companies with more patent families tend to release less of their toxic waste to the environment.

Specifically, facilities owned by the top 25% of companies, when ranked by innovation, released an average of 32.5% of their toxic waste to the environment, which is 8 percentage points lower than the average for facilities owned by the remaining companies in the sample.
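The quartile comparison above amounts to ranking facilities by their owner’s patent-family count, splitting off the top 25%, and comparing average release rates between the two groups. Here is a minimal sketch of that calculation using entirely made-up numbers; the study’s actual dataset, sample size and variable names differ.

```python
from statistics import mean

# Hypothetical (patent_families, release_rate) pairs per facility.
facilities = [
    (120, 0.30), (95, 0.35), (80, 0.33),
    (10, 0.42), (5, 0.45), (2, 0.48),
    (1, 0.40), (0, 0.44), (3, 0.39),
    (150, 0.28), (60, 0.34), (4, 0.41),
]

# Rank by owner innovation (patent families) and split off the top 25%.
ranked = sorted(facilities, key=lambda f: f[0], reverse=True)
cut = len(ranked) // 4
top, rest = ranked[:cut], ranked[cut:]

top_avg = mean(rate for _, rate in top)
rest_avg = mean(rate for _, rate in rest)
print(f"top-quartile avg release rate: {top_avg:.3f}, others: {rest_avg:.3f}")
```

In this toy sample, facilities owned by the most innovative companies release a smaller share of their waste, mirroring the direction of the study’s finding.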

We hypothesize that innovation may give firms a competitive advantage that also enables them to adopt cleaner production technologies or invest in more environmentally conscious methods of handling waste containing toxic chemicals, thereby preventing toxic chemicals from being directly released to the environment.

Size and profitability matter, too

We also looked at companies’ size – in terms of number of employees – and their profitability, to see how those connected with pollution rates at the facilities the company owns.

We found that larger companies, those with more than 19,000 employees, own facilities that release an average of 31% of their toxic chemical waste to the environment. By contrast, facilities owned by midsized companies, with 1,000 to 19,000 workers, release 45% on average. Those owned by smaller companies, with fewer than 1,000 employees, release an average of 42% of their toxic chemical waste to the environment.

An important note is that those larger companies, which are more likely to have multiple locations, often own facilities that handle larger volumes of chemicals. So even if they release smaller proportions of their toxic waste to the environment, that may still add up to larger quantities.

We also found that industrial facilities owned by profitable firms have higher average rates of releasing toxic chemicals to the environment than those owned by unprofitable companies.

Facilities owned by companies with positive net income, according to their income statements obtained from PitchBook, a company that collects data on corporations, released an average of 40% of their toxic-chemical-containing wastes to the environment. Facilities owned by companies with negative net income released an average of 31% of their toxic chemical waste to the environment. To us, that indicates that financially strong companies are not necessarily more environmentally responsible. That may be evidence that profitable firms make money in part by contaminating the environment rather than paying for pollution prevention or cleanup.

Our analysis shows that geography and demographics alone do not fully account for industries’ and facilities’ differing levels of pollution. Corporate characteristics are also key factors in how toxic waste is handled and disposed of.

The Conversation

The authors do not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointments.

ref. Industrial facilities owned by profitable companies release more of their toxic waste into the environment – https://theconversation.com/industrial-facilities-owned-by-profitable-companies-release-more-of-their-toxic-waste-into-the-environment-265227