Federal energy office illustrates the perils of fluctuating budgets and priorities

By Christelle Khalaf, Associate Director, Government Finance Research Center, University of Illinois Chicago

How much money goes into which pile often changes with the presidency. valiantsin suprunovich/iStock / Getty Images Plus

When new presidential administrations enter the White House, federal agencies often find their funding and priorities shifting, sometimes dramatically.

I’m a scholar who studies how policy and market shifts affect regional economies, labor markets and public systems, particularly in the context of critical infrastructure such as energy and water. I’ve seen how both of those types of changes – of funding levels and priorities – destabilize agencies and cut off long-term projects before they achieve their intended goals.

In one research project, with co-authors Dr. Deborah A. Carroll and Zach Perkins, I took a close look at one office within a federal agency, the Department of Energy’s Office of Energy Efficiency and Renewable Energy. What we found serves as an example of how these changes have played out in the past, and it gives context to how the Trump administration’s changes are playing out now in that agency and across the federal government.

The office, known by researchers and its personnel as EERE, is mainly focused on funding research and development to advance energy efficiency and renewable energy technologies and reduce the costs of those technologies to consumers. Its key efforts involve low-emission transportation, renewable electricity generation and decreasing the carbon emissions of buildings and industry processes.

It makes grants to, and enters research and development agreements with, small businesses, industry, national laboratories, universities and state and local governments. Recipients are often required to contribute matching funds or other support to the project to complement the federal funding.

In general, Congress appropriates funding to the office as part of the yearly budget process. However, the office also receives sporadic influxes of additional funding to stimulate the economy or address concerns related to energy security and greenhouse gas emissions. Ultimately, the amount of funding EERE gets depends in part on overall economic conditions or national crises.

Boosting funding levels

Some of those supplemental allocations can be significant, and many last until the funds have been spent, even if that takes a number of years. Following the energy crisis in the early 2000s, Congress allocated EERE a total of about $7 billion in funding for research and development in energy efficiency, renewable energy and biofuels.

Then in 2009, following the Great Recession, Congress gave EERE $16.7 billion – most of which was to help low-income families pay to install efficient light sources or insulation to save them money. About $5.4 billion was for research and development.

In 2020, amid the COVID-19 pandemic, Congress passed the Energy Act of 2020, mainly focusing on nuclear energy and carbon capture technologies but also providing over $500 million in research and development funding for EERE.

In 2021, the Infrastructure Investment and Jobs Act allocated about $16.3 billion to EERE. And in 2022, the Inflation Reduction Act provided an additional $18 billion. As with other additional funding allocations, Congress made most of that money available until the total authorized amount is spent.

But the future of these allocations is uncertain. A January 2025 executive order by President Donald Trump directed all agencies to immediately pause the disbursement of funds Congress approved in both laws.

In its 2026 budget, the Trump administration is proposing spending $900 million on EERE’s work – a roughly 70% reduction from its 2025 allocation of $3.5 billion. This echoes a move during Trump’s first term, when the White House proposed cutting the office’s funding by nearly 70% between the 2017 and 2018 budgets. At that time, however, Congress decided to keep the office’s budget largely intact. Congress will review and decide on this proposed budget as well.

A row of solar panels against a blue sky.
Solar energy is just one of the Office of Energy Efficiency and Renewable Energy’s areas of research.
alexsl/iStock / Getty Images Plus

Shifting priorities

How those varying amounts of money are spent also changes, often based on shifts in political leadership with different views about what types of technologies are most worth investing in, and about the most effective role of government in developing new technologies.

Our qualitative analysis has found that Republican administrations typically believe that very-early-stage research and development is an appropriate role for the federal government, but that as technologies move closer to commercialization, the private sector should take the lead.

In contrast, we found that Democratic administrations believe that promising innovations often fail to reach the market due to insufficient private sector support during the demonstration and deployment phases. So they tend to advocate for increased federal involvement to assist with the transition from research to market-ready technologies.

There is also a partisan difference in which technologies get financial support. Solar and wind energy technologies have historically received higher funding under Democratic administrations. In contrast, bioenergy and hydrogen technologies have received higher funding under Republican administrations.

Funding the future

EERE often funds projects that are considered too risky for private investors to fund alone. Expanding knowledge requires experimentation, so some EERE projects have achieved notable success, while others have not.

For instance, the office’s investments have played a pivotal role in both spreading electric vehicle technologies and reducing their cost to consumers. Beginning with a major funding boost from the American Recovery and Reinvestment Act of 2009, and with further allocations in subsequent years, EERE helped fund breakthroughs in battery manufacturing, power electronics and electric drive systems.

These advancements contributed to a sharp rise in adoption: In 2012, there were just 100,000 electric vehicles registered in the U.S. By 2022, that number was above 3 million. And in 2014, hybrid, plug-in hybrid and battery electric vehicles accounted for 3% of all new light-duty vehicle sales. By 2024, that share had grown to 19%.

EERE’s investments in electric vehicles powered by hydrogen fuel cells, by contrast, have not done so well. Despite significant government support in the 2000s, their commercial availability remains largely limited to California, where most of the country’s hydrogen refueling stations are located.

A person connects a plug to a car.
Various aspects of electric vehicle technologies have received federal support.
Cavan Images/Cavan via Getty Images

A change in approach

Our analysis of the office’s operations finds that such volatility in funding levels and priorities can create an environment that hinders thoughtful project selection. Programs that begin under one administration can’t be counted on to continue under subsequent presidents, and dollars allocated for the future may be repurposed down the road, leaving projects only partially finished.

Studies also find that rapidly increasing budgets can create misaligned incentives as public administrators scramble to use the funds during the authorization period. For example, some may prioritize grantees who can accept and spend money rapidly, regardless of the potential public benefit of their innovation.

Further, the shifting priorities complicate long-term planning for government officials, researchers and businesses. Sustaining innovation over a long period takes years of commitment. Studies have shown that inconsistent or volatile government funding can hinder overall technological progress and discourage private investment. One example is the exploration of algae-based biofuels in the 1980s, which was shut down in the 1990s due to shifting federal priorities. That stalled progress in the field and led to a loss of more than half of the genetic legacy collected through the program. In the late 2000s, the federal government resumed funding algae-based biofuel research.

Overall, research by us and others underscores the importance of sustained funding and institutional continuity to ensure the success of publicly funded research and development. That’s what other peer countries are doing: boosting long-term investments in clean energy with consistent priorities and predictable funding.

Following that model, in contrast to the current practice of ever-shifting priorities, would create more effective opportunities to develop, produce and deploy innovative energy technologies in the U.S., helping to maintain global competitiveness and reduce reliance on foreign manufacturing.

The Conversation

Christelle Khalaf received funding from the Alfred P. Sloan Foundation to examine EERE R&D funding trends. She has also received funding from the Department of Energy for separate research.

ref. Federal energy office illustrates the perils of fluctuating budgets and priorities – https://theconversation.com/federal-energy-office-illustrates-the-perils-of-fluctuating-budgets-and-priorities-255936

AI helps tell snow leopards apart, improving population counts for these majestic mountain predators

By Eve Bohnett, Assistant Scholar, Center for Landscape Conservation Planning, University of Florida

Snow leopards are hard to find and count, which makes protecting them difficult. zahoor salmi/Moment via Getty Images

Snow leopards are known as the “ghosts of the mountains” for a reason. Imagine waiting for months in the harsh, rugged mountains of Asia, hoping to catch even a glimpse of one. These elusive big cats move silently across rocky slopes, their pale coats blending so seamlessly with snow and stone that even the most seasoned biologists seldom spot them in the wild.

Travel writer Peter Matthiessen spent two months in 1973 searching the Tibetan plateau for them and wrote a 300-page book about the effort. He never saw one. Forty years later, Peter’s son Alex retraced his father’s steps – and didn’t see one either.

Researchers have struggled to come up with a figure for the global population. In 2017, the International Union for Conservation of Nature reclassified the snow leopard from endangered to vulnerable, citing estimates of between 2,500 and 10,000 adults in the wild. However, the group also warned that numbers continue to decline in many areas due to habitat loss, poaching and human-wildlife conflict. Those of us who study these animals want to help protect the species and their habitat – if only we can determine exactly where they live and how many there are.

Traditional tracking methods – searching for footprints, droppings and other signs – have their limits. Instead of waiting for a lucky face-to-face encounter, conservationists from the Wildlife Conservation Society, led by experts including Stéphane Ostrowski and Sorosh Poya Faryabi, began deploying automated camera traps in Afghanistan. These devices snap photos whenever movement is detected, capturing thousands of images over months, all in hopes of obtaining a rare glimpse of a snow leopard.

But capturing images is only half the battle. The next, even harder task is telling one snow leopard apart from another.

Two images of snow leopards.
Are these the same animal or different ones? It’s really hard to tell.
Eve Bohnett, CC BY-ND

At first glance, it might sound simple: Each snow leopard has a unique pattern of black rosettes on its coat, like a fingerprint or a face in a crowd. Yet in practice, identifying individuals by these patterns is slow, subjective and prone to error. Photos may be taken at odd angles, under poor lighting, or with parts of the animal obscured – making matches tricky.

A common mistake happens when photos from different cameras are marked as depicting different animals when they actually show the same individual, inflating population estimates. Worse, camera trap images can get mixed up or misfiled, splitting encounters of one cat across multiple batches and identities.

I am a data analyst working with the Wildlife Conservation Society and other partners at Wild Me. My work, along with that of others, has found that even trained experts can misidentify animals, failing to recognize repeat visitors at locations monitored by motion-sensing cameras and counting the same animal more than once. One study found that the snow leopard population was overestimated by more than 30% because of these human errors.

To avoid these pitfalls, researchers follow camera sorting guidelines: At least three clear pattern differences or similarities must be confirmed between two images to declare them the same or different cats. Images too blurry, too dark or taken from difficult angles may have to be discarded. Identification efforts range from easy cases with clear, full-body shots to ambiguous ones needing collaboration and debate. Despite these efforts, variability remains, and more experienced observers tend to be more accurate.

Now people trying to count snow leopards are getting help from artificial intelligence systems, in two ways.

Spotting the spots

Modern AI tools are revolutionizing how we process these large photo libraries. First, AI can rapidly sort through thousands of images, flagging those that contain snow leopards and ignoring irrelevant ones such as those that depict blue sheep, gray-and-white mountain terrain, or shadows.

A snow leopard stands amid rocks.
Unique spots and spot patterns are key to telling snow leopards apart.
Eve Bohnett, CC BY-NC-ND

AI can identify individual snow leopards by analyzing their unique rosette patterns, even when poses or lighting vary. Each snow leopard encounter is compared with a catalog of previously identified photos and assigned a known ID if there is a match, or entered as a new individual if not.

In a recent study, several colleagues and I evaluated two AI algorithms, both separately and in tandem.

The first algorithm, called HotSpotter, identifies individual snow leopards by comparing key visual features such as coat patterns, highlighting distinctive “hot spots” with a yellow marker.

The second is a newer method called pose invariant embeddings, which operates much like facial recognition technology: It recognizes layers of abstract features in the data, identifying the same animal regardless of how it is positioned in the photo or what the lighting conditions are.

We trained these systems using a curated dataset of photos of snow leopards from zoos in the U.S., Europe and Tajikistan, and with images from the wild, including in Afghanistan.

On its own, each model correctly identified the cat from a large photo library about 74% of the time. But when combined, the two systems were correct 85% of the time.
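To give a concrete sense of how two imperfect matchers can be combined, here is a minimal score-fusion sketch in Python. The function, candidate IDs, scores and equal weighting are hypothetical illustrations, not the actual Whiskerbook implementation: assume each algorithm scores a query image against every cataloged cat, and the fused ranking averages the two matchers' normalized scores.

```python
def fuse_rankings(scores_a, scores_b, weight_a=0.5):
    """Fuse two matchers' similarity scores (higher = more similar).

    scores_a, scores_b: dicts mapping candidate ID -> raw score.
    Scores are min-max normalized per matcher before averaging,
    so one matcher's scale does not dominate the other's.
    """
    def normalize(scores):
        lo, hi = min(scores.values()), max(scores.values())
        span = (hi - lo) or 1.0
        return {cid: (s - lo) / span for cid, s in scores.items()}

    a, b = normalize(scores_a), normalize(scores_b)
    fused = {cid: weight_a * a.get(cid, 0.0) + (1 - weight_a) * b.get(cid, 0.0)
             for cid in set(a) | set(b)}
    # Candidates sorted from best to worst fused score.
    return sorted(fused.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical scores for one query image against three cataloged cats:
hotspotter = {"cat_01": 0.62, "cat_02": 0.55, "cat_03": 0.10}
embeddings = {"cat_01": 0.48, "cat_02": 0.71, "cat_03": 0.20}

ranking = fuse_rankings(hotspotter, embeddings)
```

In this toy example the two matchers disagree on the best candidate, and fusion resolves the tie by rewarding the cat both rank highly – the same intuition behind the accuracy gain the study reports.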

These algorithms were integrated into Wildbook, an open-source, web-based software platform developed by the nonprofit organization Wild Me and now adopted by ConservationX. We deployed the combined system on a free website, Whiskerbook.org, where researchers can upload images, seek matches using the algorithms, and confirm those matches with side-by-side comparisons. This site is among a growing family of AI-powered wildlife platforms that are helping conservation biologists work more efficiently and more effectively at protecting species and their habitats.

Two images of snow leopards, one in daylight and one in infrared light.
A view from an online wildlife-tracking system suggests a possible match for a snow leopard caught by a remote camera.
Wildbook/Eve Bohnett, CC BY-ND

Humans still needed

These AI systems aren’t error-proof. AI quickly narrows down candidates and flags likely matches, but expert validation ensures accuracy, especially with tricky or ambiguous photos.

Another study we conducted pitted AI-assisted groups of experts and novices against each other. Each was given a set of three to 10 images of 34 known captive snow leopards and asked to use the Whiskerbook platform to identify them. They were also asked to estimate how many individual animals were in the set of photos.

The experts accurately matched about 90% of the images and delivered population estimates within about 3% of the true number. In contrast, the novices identified only 73% of the cats and underestimated the total number, sometimes by 25% or more, incorrectly merging two individuals into one.

Both sets of results were better than when experts or novices did not use any software.

The takeaway is clear: Human expertise remains important, and combining it with AI support leads to the most accurate results. My colleagues and I hope that by using tools like Whiskerbook and the AI systems embedded in them, researchers will be able to more quickly and more confidently study these elusive animals.

With AI tools like Whiskerbook illuminating the mysteries of these mountain ghosts, we have another way to safeguard snow leopards – but success depends on continued commitment to protecting their fragile mountain homes.


Eve Bohnett receives funding from San Diego State Research Foundation and Wildlife Conservation Society. She is affiliated with University of Florida.

ref. AI helps tell snow leopards apart, improving population counts for these majestic mountain predators – https://theconversation.com/ai-helps-tell-snow-leopards-apart-improving-population-counts-for-these-majestic-mountain-predators-258154

At Antarctica’s midwinter, a look back at the frozen continent’s long history of dark behavior

By Daniella McCahey, Assistant Professor of History, Texas Tech University

Is this visitor to Antarctica going crazy or having a good time? Tim Bieber/Photodisc via Getty Images

As Midwinter Day approaches in Antarctica – the darkest point of the year, when the night is at its longest – those spending the winter on the frozen continent will follow a tradition dating back more than a century to the earliest days of Antarctic exploration: They will celebrate having made it through the growing darkness and into a time when they know the Sun is on its way back.

The experience of spending a winter in Antarctica can be harrowing, even when living with modern conveniences such as hot running water and heated buildings. At the beginning of the current winter season, in March 2025, global news outlets reported that workers at the South African research station, SANAE IV, were “rocked” when one worker allegedly threatened and assaulted other members of the station’s nine-person winter crew. Psychologists intervened – remotely – and order was apparently restored.

The desolate and isolated environment of Antarctica can be hard on its inhabitants. As a historian of Antarctica, I see the events at SANAE IV as a continuation of perceptions – and realities – that Antarctic environments can trigger deeply disturbing behavior and even drive people to madness.

A view of a small cluster of buildings below a cone-shaped hill, with a dark sky and the Moon shining.
Long hours of constant near-darkness take their toll in the Antarctic winter.
Andrew Smith, via Antarctic Sun, CC BY-ND

Early views

The very earliest examples of Antarctic literature depict the continent affecting both mind and body. In 1797, for instance, more than two decades before the continent was first sighted by Europeans, the English poet Samuel Taylor Coleridge wrote “The Rime of the Ancient Mariner.” It tells the tale of a ship blown by storms into an endless maze of Antarctic ice, which the crew escapes by following an albatross. For unexplained reasons, one sailor kills the albatross and faces a lifetime’s torment for doing so.

In 1838, Edgar Allan Poe published the story of “Arthur Gordon Pym of Nantucket,” who journeyed into the Southern Ocean. Even before arriving in Antarctica, the tale involves mutiny, cannibalism and a ship crewed by dead men. As the story ends, Pym and two others drift southward, encountering an enormous, apparently endless cataract of mist that parts before their boat, revealing a large ghostly figure.

H.P. Lovecraft’s 1936 story “At the Mountains of Madness” was almost certainly based on real stories of polar exploration. In it, the men of a fictitious Antarctic expedition encounter circumstances that “made us wish only to escape from this austral world of desolation and brooding madness as swiftly as we could.” One man even experiences an unnamed “final horror” that causes a severe mental breakdown.

The 1982 John Carpenter film “The Thing” also involves these themes, when men trapped at an Antarctic research station are being hunted by an alien that perfectly impersonates the base members it has killed. Paranoia and anxiety abound, with team members frantically radioing for help, and men imprisoned, left outside or even killed for the sake of the others.

Whether to gird themselves for what may come or just as a fun tradition, the winter-over crew at the United States’ South Pole Station watches this film every year after the last flight leaves before winter sets in.

A trailer for the 1982 film ‘The Thing,’ set at an Antarctic research station.

Real tales

These stories of Antarctic “madness” have some basis in history. A long-told anecdote in modern Antarctic circles is of a man who stabbed, perhaps fatally, a colleague over a game of chess at Russia’s Vostok station in 1959.

Better documented were reports from 2018, when Sergey Savitsky stabbed Oleg Beloguzov at the Russian Bellingshausen research station over multiple grievances, including the one most seized upon by the media: Beloguzov’s tendency to reveal the endings of books that Savitsky was reading. A criminal charge against Savitsky was dropped.

In 2017, staff at South Africa’s sub-Antarctic Marion Island station reported that a team member smashed up a colleague’s room with an ax over a romantic relationship.

Mental health

Concerns over mental health in Antarctica go much further back. In the so-called “Heroic Age” of Antarctic exploration, from about 1897 to about 1922, expedition leaders prioritized the mental health of the men on their expeditions. They knew their crews would be trapped inside with the same small group for months on end, in darkness and extreme cold.

American physician Frederick Cook, who accompanied the 1898-1899 Belgica expedition, the first group known to spend the winter within the Antarctic Circle, wrote in helpless terms of being “doomed” to the “mercy” of natural forces, and of his worries about the “unknowable cold and its soul-depressing effects” in the winter darkness. In his 2021 book about that expedition, writer Julian Sancton called the ship the “Madhouse at the End of the Earth.”

Cook’s fears became real. Most men complained of “general enfeeblement of strength, of insufficient heart action, of a mental lethargy, and of a universal feeling of discomfort.”

“When at all seriously afflicted,” Cook wrote, “the men felt that they would surely die” and exhibited a “spirit of abject hopelessness.”

And in the words of Australian physicist Louis Bernacchi, a member of the 1898-1900 Southern Cross expedition, “There is something particularly mystical and uncanny in the effect of the grey atmosphere of an Antarctic night, through whose uncertain medium the cold white landscape looms as impalpable as the frontiers of a demon world.”

Footage from 1913 shows the force of the wind at Cape Denison, which has been called ‘the home of the blizzard.’

A traumatic trip

A few years later, the Australasian Antarctic Expedition, which ran from 1911 to 1914, experienced several major tragedies, including two deaths during an exploring trip that left expedition leader Douglas Mawson starving and alone amid deeply crevassed terrain. The 100-mile walk to relative safety took him a month.

A lesser-known set of events on that same expedition involved wireless-telegraph operator Sidney Jeffryes, who arrived in Antarctica in 1913 on a resupply ship. Cape Denison, the expedition’s base, had some of the most severe environmental conditions anyone had encountered on the continent, including winds estimated at over 160 miles an hour.

Jeffryes, the only man in the crew who could operate the radio telegraph, began exhibiting signs of paranoia. He transmitted messages back to Australia saying that he was the only sane man in the group and claiming the others were plotting to kill him.

In Mawson’s account of the expedition, he blamed the conditions, writing:

“(T)here is no doubt that the continual and acute strain of sending and receiving messages under unprecedented conditions was such that he eventually had a ‘nervous breakdown.’”

Mawson hoped that the coming of spring and the possibility of outdoor exercise would help, but it did not. Shortly after his return to Australia in February 1914, Jeffryes was found wandering in the Australian bush and institutionalized. For many years, his role in Antarctic exploration was ignored, seeming a blot or embarrassment on the masculine ideal of Antarctic explorers.

A group of people stand on a rocky shore waving at a small boat in the distance.
After five months of isolation in trying conditions on a remote Antarctic island, 22 men rejoice at their rescue in August 1916.
Frank Hurley, Underwood & Underwood, via Library of Congress

Wider problems

Unfortunately, the widespread focus on Antarctica as a place that causes disturbing behavior makes it easy to gloss over larger, more systemic problems.

In 2022, the United States Antarctic Program and the Australian Antarctic Division released reports finding that sexual assault and harassment are common at Antarctic bases and in more remote field camps. Scholars have generally not linked those events to the specifics of the cold, darkness and isolation, but rather to a continental culture of heroic masculinity.

As humans look to live in other extreme environments, such as space, Antarctica represents not only a cooperative international scientific community but also a place where, cut off from society as a whole, human behavior changes. The celebrations of Midwinter Day honor survival in a place of wonder that is also a place of horror, where the greatest threat is not what is outside, but what is inside your mind.


Daniella McCahey does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. At Antarctica’s midwinter, a look back at the frozen continent’s long history of dark behavior – https://theconversation.com/at-antarcticas-midwinter-a-look-back-at-the-frozen-continents-long-history-of-dark-behavior-253906

After the smoke clears, a wildfire’s legacy can haunt rivers for years, putting drinking water at risk

By Ben Livneh, Associate Professor of Hydrology, University of Colorado Boulder

Burned ground can become hydrophobic and almost waxlike, allowing rainfall to quickly wash contaminants downslope. Carli Brucker

Picture a wildfire raging across a forested mountainside. The smoke billows and the flames rise. An aircraft drops vibrant red flame retardant. It’s a dramatic, often dangerous scene. But the threat to water supplies is only just beginning.

After the smoke clears, the soil, which was once nestled beneath a canopy of trees and a spongy layer of leaves, is now exposed. Often, that soil is charred and sterile, with the heat making the ground almost water-repellent, like a freshly waxed car.

When the first rain arrives, the water rushes downhill. It carries with it a slurry of ash, soil and contaminants from the burned landscape. This torrent flows directly into streams and then rivers that provide drinking water for communities downstream.

As a new research paper my colleagues and I just published shows, this isn’t a short-term problem. The ghost of the fire can haunt these waterways for years.

Scientists explain how wildfires can contaminate water supplies and the ways they measure the effects, summarized in their 2024 publication. University of Colorado-Boulder.

This matters because forested watersheds are the primary water source for nearly two-thirds of municipalities in the United States. As wildfires in the western U.S. become larger and more frequent, the long-term security and safety of water supplies for downstream communities is increasingly at risk.

Charting the long tail of wildfire pollution

Scientists have long known that wildfires can affect water quality, but two key questions remained: Exactly how bad is the impact? And how long does it last?

To find out, my colleagues and I led a study, coordinated by engineer Carli Brucker. We undertook one of the most extensive analyses of post-wildfire water quality to date. The results were published June 23, 2025, in the journal Nature Communications Earth & Environment.

We gathered decades of water quality data from 245 burned watersheds across the western U.S. and compared them to nearly 300 similar, unburned watersheds.

A map of watersheds in the western U.S.
A map of the basins studied shows the outlines of fires in red and burned basins in black. The blue basins did not burn and were used for comparisons.
Carli Brucker, et al., 2025, Nature Communications Earth & Environment

By creating a computer model for each basin that accounted for its normal water quality variability, based on factors such as rainfall and temperature, we were able to isolate the impact of the wildfire. This allowed us to see how much the water quality deviated after the fire, year after year.
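The baseline-and-deviation idea can be illustrated with a small sketch in Python. This is a deliberately simplified stand-in for the study's statistical model, and all numbers here are hypothetical: fit a least-squares line predicting a basin's prefire sediment levels from rainfall, then express a postfire observation as a multiple of what that baseline predicts for the same rainfall.

```python
def fit_baseline(rainfall, sediment):
    """Least-squares line: expected sediment as a function of rainfall,
    fit on prefire years only. Returns a prediction function."""
    n = len(rainfall)
    mean_x = sum(rainfall) / n
    mean_y = sum(sediment) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(rainfall, sediment))
    sxx = sum((x - mean_x) ** 2 for x in rainfall)
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return lambda x: intercept + slope * x

# Hypothetical prefire record: sediment roughly tracks rainfall.
prefire_rain = [10.0, 12.0, 14.0, 16.0]   # e.g. inches per year
prefire_sed = [20.0, 24.0, 28.0, 32.0]    # e.g. mg/L

expected = fit_baseline(prefire_rain, prefire_sed)

# A postfire year with ordinary rainfall but extraordinary sediment:
postfire_rain, postfire_sed = 13.0, 520.0
ratio = postfire_sed / expected(postfire_rain)  # multiple of the baseline
```

Because the baseline already accounts for how sediment normally varies with rainfall, the ratio isolates the fire's contribution rather than mistaking a wet year for fire damage.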

The results were stark. In the first year after a fire, the concentrations of some contaminants skyrocketed. We found that levels of sediment and turbidity – the cloudiness of the water – were 19 to 286 times higher than prefire levels. That much sediment can clog filters at water treatment plants and require expensive treatment and maintenance. Think of trying to use a coffee filter with muddy water – the water just won’t flow through.

Concentrations of organic carbon, nitrogen and phosphorus were three to 103 times greater in the burned basins. These dissolved remnants of burned plants and soil are particularly problematic. When they mix with the chlorine used to disinfect drinking water, they can form harmful chemicals called disinfection byproducts, some of which are linked to cancer.

More surprisingly, we found the impacts to be remarkably persistent. While the most dramatic spikes in phosphorus, nitrate, organic carbon and sediment generally occurred in the first one to three years, some contaminants lingered for much longer.

Charts show how contaminants lingered in water supplies for years after wildfires.
Contaminants including phosphorus, organic carbon and nitrates lingered in water supplies for years after wildfires. The charts show the average among all burned basins eight years before fires (light blue) and all burned basins after fires (orange). The gray bars show levels in the year immediately after the fire. The horizontal purple line shows levels that would be expected without a fire, based on the prefire years.
Carli Brucker, et al., 2025, Nature Communications Earth & Environment

We saw significantly elevated levels of nitrogen and sediment for up to eight years following a fire. Nitrogen and phosphorus act like fertilizer for algae. A surge of these nutrients can trigger algal blooms in reservoirs, which can produce toxins and create foul odors.

This extended timeline suggests that wildfires are fundamentally altering the landscape in ways that take a long time to heal. In our previous laboratory-based research, including a 2024 study, we simulated this process by burning soil and vegetation and then running water over them.

After mountain slopes burn, the rain that falls on them washes ash, charred soil and debris downstream.
Carli Brucker

The stuff that leaches out is a cocktail of carbon, nutrients and other compounds that can exacerbate flood risks and degrade water quality in ways that require more expensive treatment at water treatment facilities. In extreme cases, the water quality may be so poor that communities can’t withdraw river water at all, and that can create water shortages.

After the Buffalo Creek Fire in 1996 and then the Hayman Fire in 2002, Denver’s water utility spent more than US$27 million over several years to treat the water, remove more than 1 million cubic yards of sediment and debris from a reservoir, and fix infrastructure. State Forest Service crews planted thousands of trees to help restore the surrounding forest’s water filtering capabilities.

A growing challenge for water treatment

This long-lasting impact poses a major challenge for water treatment plants that make river water safe to drink. Our study highlights that utilities can’t just plan for a few bad months after a fire. They need to be prepared for potentially eight or more years of degraded water quality.

We also found that where a fire burns matters. Watersheds with thicker forests or more urban areas that burned tended to have even worse water quality after a fire.

Since many municipalities draw water from more than one source, understanding which watersheds are likely to have the largest water quality problems after fires can help communities locate the most vulnerable parts of their water supply systems.

As temperatures rise and more people move into wildland areas in the American West, the risk of wildfires increases, and it is becoming clear that preparing for longer-term consequences is crucial. The health of forests and our communities’ drinking water are inseparably linked, with wildfires casting a shadow that lasts long after the smoke clears.

The Conversation

Ben Livneh receives funding from the Western Water Assessment NOAA grant #NA21OAR4310309, ‘Western Water Assessment: Building Resilience to Compound Hazards in the Inter-Mountain West’.

ref. After the smoke clears, a wildfire’s legacy can haunt rivers for years, putting drinking water at risk – https://theconversation.com/after-the-smoke-clears-a-wildfires-legacy-can-haunt-rivers-for-years-putting-drinking-water-at-risk-259118

Reproducibility may be the key idea students need to balance trust in evidence with healthy skepticism

Source: – By Sarah R. Supp, Associate Professor of Data Analytics, Denison University

Reproducing results can increase trust in scientific studies. Huntstock via Getty Images

Many people have been there.

The dinner party is going well until someone introduces a controversial topic. In today’s world, that could be anything from vaccines to government budget cuts to immigration policy. The conversation starts to get heated. Finally, someone announces with great authority that a scientific study supports their position. The discussion comes to an abrupt halt, because the dinner guests disagree about how far to trust scientific evidence. Some may believe science always speaks the truth, some may think science can never be trusted, and others may disagree about which of two studies with contradicting claims is “right.”

How can the dinner party – or society – move beyond this kind of impasse? In today’s world of misinformation and disinformation, healthy skepticism is essential. At the same time, much scientific work is rigorous and trustworthy. How do you reach a healthy balance between trust and skepticism? How can researchers increase the transparency of their work to make it possible to evaluate how much confidence the public should have in any particular study?

As teachers and scholars, we see these problems in our own classrooms and in our students – and they are mirrored in society.

The concept of reproducibility may offer important answers to these questions.

Reproducibility is what it sounds like: reproducing results. In some ways, reproducibility is like a well-written recipe, such as a recipe for an award-winning cake at the county fair. To help others reproduce their cake, the proud prizewinner must clearly document the ingredients used and then describe each step of the process by which the ingredients were transformed into a cake. If others can follow the directions and come up with a cake of the same quality, then the recipe is reproducible.

Think of the English scholar who claims that Shakespeare did not author a play that has historically been attributed to him. A critical reader will want to know exactly how they arrived at that conclusion. What is the evidence? How was it chosen and interpreted? By parsing the analysis step by step, reproducibility allows a critical reader to gauge the strength of any kind of argument.

We are a group of researchers and professors from a wide range of disciplines who came together to discuss how we use reproducibility in our teaching and research.

Based on our expertise and the students we encounter, we collectively see a need for higher-education students to learn about reproducibility in their classes, across all majors. It has the potential to benefit students and, ultimately, to enhance the quality of public discourse.

The foundation of credibility

Reproducibility has always been a foundation of good science because it allows researchers to scrutinize each other’s studies for rigor and credibility and expand upon prior work to make new discoveries. Researchers are increasingly paying attention to reproducibility in the natural sciences, such as physics and medicine, and in the social sciences, such as economics and environmental studies. Even researchers in the humanities, such as history and philosophy, are concerned with reproducibility in studies involving analysis of texts and evidence, especially with digital and computational methods. Increased interest in transparency and accessibility has followed the rising importance of computer algorithms and numerical analysis in research. This work should be reproducible, but it often remains opaque.

Broadly, research is reproducible if it answers the question: “How do you know?” – such that another researcher could theoretically repeat the study and produce consistent results.

Reproducible research is explicit about the materials and methods that were used in a study to make discoveries and come to conclusions. Materials include everything from scientific instruments such as a tensiometer measuring soil moisture to surveys asking people about their daily diet. They also include digital data such as spreadsheets, digitized historic texts, satellite images and more. Methods include how researchers make observations and analyze data.

To reproduce a social science study, for example, we would ask: What is the central question or hypothesis? Who was in the study? How many individuals were included? What were they asked? After data was collected, how was it cleaned and prepared for analysis? How exactly was the analysis run?

Proper documentation of all these steps, plus making available the original data from the study, allows other scientists to redo the research, evaluate the decisions made during the process of gathering and analyzing information, and assess the credibility of the findings.

This short video, made by the National Academies, explains the key concepts in reproducing scientific findings and notes ways the process can be improved.

Over the past 20 years, the need for reproducibility has become increasingly apparent. Scientists have discovered that some published studies are too poorly documented for others to repeat, lack verified data sources, are questionably designed or are even fraudulent.

Putting reproducibility to work: An example

A highly contentious, retracted study from 1998 linked the measles, mumps and rubella (MMR) vaccine and autism. Scientists and journalists used their understanding of reproducibility to discover the flaws in the study.

The central question of the study was not about vaccines but aimed to explore a possible relationship between colitis – an inflammation of the large intestine – and developmental disorders. The authors explicitly wrote, “We did not prove an association between measles, mumps, and rubella vaccine and the syndrome described.”

The study observed just 12 patients who were referred to the authors’ gastroenterology clinic and had histories of recent behavioral disorders, including autism. This sample of children is simply too small and selective to be able to make definitive conclusions.

In this study, the researchers translated children’s medical charts into summary tables for comparison. When a journalist attempted to reproduce the published data tables from the children’s medical histories, they found pervasive inconsistencies.

Reproducibility allows for corrections in research. The article was published in a respected journal, but it lacked transparency with regard to patient recruitment, data analysis and conflicts of interest. Whereas traditional peer review involves critical evaluation of a manuscript, reproducibility also opens the door to evaluating the underlying data and methods. When independent researchers attempted to reproduce this study, they found deep flaws. The article was retracted by the journal and by most of its authors. Independent research teams conducted more robust studies, finding no relationship between vaccines and autism.

Each research discipline has its own set of best practices for achieving reproducibility. Disciplines in which researchers use computational or statistical analysis require sharing the data and software code for reproducing studies. In other disciplines, researchers interpret nonnumerical qualities of data sources such as interviews, historical texts, social media content and more. These disciplines are working to develop standards for sharing their data and research designs for reproducibility. Across disciplines, the core principles are the same: transparency of the evidence and arguments by which researchers arrived at their conclusions.

It is true that the underlying data for some studies cannot be fully released to the public – for example, confidential patient health information or the exact locations for species threatened by illegal poaching. But this does not mean that the research didn’t employ many other reproducibility techniques or that the findings should be discredited. Even without publicly available data, the description of the data and methods should be transparent enough to understand and to replicate.

Reproducibility in the classroom

Colleges and universities are uniquely situated to promote reproducibility in research and public conversations. Critical thinking, effective communication and intellectual integrity, staples of higher-education mission statements, are all served by reproducibility.

Teaching faculty at colleges and universities have started taking some important steps toward incorporating reproducibility into a wide range of undergraduate and graduate courses. These include assignments to replicate existing studies, training in reproducible methods to conduct and document original research, preregistration of hypotheses and analysis plans, and tools to facilitate open collaboration among peers. A number of initiatives to develop and disseminate resources for teaching reproducibility have been launched.

Despite some progress, reproducibility still needs a central place in higher education. It can be integrated into any course in which students weigh evidence, read published literature to make claims, or learn to conduct their own research. This change is urgently needed to train the next generation of researchers, but that is not the only reason.

Reproducibility is fundamental to constructing and communicating claims based on evidence. Through a reproducibility lens, students evaluate claims in published studies as contingent on the transparency and soundness of the evidence and analysis on which the claims are based. When faculty teach reproducibility as a core expectation from the beginning of a curriculum, they encourage students to internalize its principles in how they conduct their own research and engage with the research published by others.

Institutions of higher education already prioritize cultivating engaged, literate and critical citizens capable of solving the world’s most challenging contemporary problems. Teaching reproducibility equips students, and members of the public, with the skills they need to critically analyze claims in published research, in the media and even at dinner parties.

Also contributing to this article are participants in the 2024 Reproducibility and Replicability in the Liberal Arts workshop, funded by the Alliance to Advance Liberal Arts Colleges (AALAC) [in alphabetical order]: Ben Gebre-Medhin (Department of Sociology and Anthropology, Mount Holyoke College), Xavier Haro-Carrión (Department of Geography, Macalester College), Emmanuel Kaparakis (Quantitative Analysis Center, Wesleyan University), Scott LaCombe (Statistical and Data Sciences, Smith College), Matthew Lavin (Data Analytics Program, Denison University), Joseph J. Merry (Sociology Department, Furman University), Laurie Tupper (Department of Mathematics and Statistics, Mount Holyoke College).

Editor’s note: This article has been updated to clarify standards for good reproducibility.

The Conversation

Sarah Supp receives funding from the National Science Foundation, awards #1915913, #2120609, and #2227298.

Joseph Holler receives funding from the National Science Foundation, award #2049837.

Peter Kedron receives funding from the National Science Foundation, award #2049837 and from Esri.

Richard Ball has received funding from the Alfred P. Sloan Foundation and the United Kingdom Reproducibility Network.

Anne M. Nurse and Nicholas J. Horton do not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.

ref. Reproducibility may be the key idea students need to balance trust in evidence with healthy skepticism – https://theconversation.com/reproducibility-may-be-the-key-idea-students-need-to-balance-trust-in-evidence-with-healthy-skepticism-251771

4 creative ways to engage children in STEM over the summer: Tips to foster curiosity and problem-solving at home

Source: – By Amber M. Simpson, Associate Professor of Mathematics Education, Binghamton University, State University of New York

Families and caregivers can boost children’s confidence and interest in science, technology, engineering and mathematics while school is out for summer. heshphoto/Getty Images

The Trump administration is reshaping the pursuit of science through federal cuts to research grants and the Department of Education. This will have real consequences for students interested in science, technology, engineering and mathematics, or STEM.

One of those consequences is the elimination of learning opportunities such as robotics camps and access to advanced math courses for K-12 students.

As a result, families and caregivers are more essential than ever in supporting children’s learning.

Based on my research, I offer four ways to support children’s summer learning in ways that feel playful and engaging but still foster their interest, confidence and skills in STEM.

1. Find a problem

To support STEM learning outside of school, encourage children to find and solve problems.
kali9/Getty Images

Look for “problems” in or around your home and engineer a solution. Engineering a solution can include brainstorming ideas, drawing a sketch, creating a prototype or first draft, testing and improving the prototype, and communicating about the invention.

For example, one family in our research created an upside-down soap dispenser for the following problem: “the way it’s designed” – specifically, the straw – “it doesn’t even reach the bottom of the container. So there’s a lot of soap sitting at the bottom.”

To identify a problem and engage in the engineering design process, families are encouraged to use common materials such as cardboard boxes, cotton balls, construction paper, pine cones and rocks.

Our research found that when children engage in engineering in the home environment with caregivers, parents and siblings, they communicate about and apply science and math concepts that are often “hidden” in their actions.

For instance, when building a paper roller coaster for a marble, children think about how the height will affect the speed of the marble. In math, this relates to the relationship between two variables, or the idea that one thing, such as height, impacts another, the speed. In science, they are applying concepts of kinetic energy and potential energy. The higher the starting point, the more potential energy is converted into kinetic energy, which makes the marble move faster.
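For readers who want the numbers behind that marble example, here is a small back-of-the-envelope sketch. It is a hypothetical, idealized calculation that ignores friction and the marble’s rolling rotation, which a real paper track would not:

```python
import math

G = 9.81  # gravitational acceleration in m/s^2

def marble_speed(drop_height_m: float) -> float:
    """Ideal speed at the bottom of a drop of the given height.

    Setting potential energy m*g*h equal to kinetic energy (1/2)*m*v^2
    gives v = sqrt(2*g*h); the marble's mass cancels out.
    """
    return math.sqrt(2 * G * drop_height_m)

# A higher start means a faster marble, but not proportionally faster:
# quadrupling the height only doubles the speed.
for height in (0.25, 0.5, 1.0):
    print(f"{height:.2f} m drop -> {marble_speed(height):.2f} m/s")
```

Working through why the speed grows with the square root of the height, rather than in direct proportion, is exactly the kind of hidden math and science conversation a roller coaster build can spark.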

In addition, children are learning what it means to be an engineer through their actions and experience. Families and caregivers play a role in supporting their creative thinking and willingness to work through challenging problems.

2. Spark curiosity

Spontaneous learning moments can lead to deep engagement and learning of STEM concepts.
cglade/Getty Images

Open up a space for exploration of STEM concepts driven by children’s interests.

In my current research, STEM professionals who were homeschooled talk about the power of learning sparked by curiosity.

One participant stated, “At one time, I got really into ladybugs, well Asian beetles I guess. It was when we had like hundreds in our house. I was like, what is happening? So, I wanted to figure out like why they were there, and then the difference between ladybugs and Asian beetles because people kept saying, these aren’t actually ladybugs.”

Researchers call these experiences serendipitous science engagement, or spontaneous math moments. These moments lead to deep engagement with and learning of STEM concepts. They may also be a chance to learn things alongside your child.

3. Facilitate thinking

In my research, I have found that being uncertain about STEM concepts can lead children to explore and consider different ideas. One concept in particular – playful uncertainties – describes when parents and caregivers know the answer to a child’s question but act as if they do not.

For example, suppose your child asks, “How can we measure the distance between St. Louis, Missouri, and Nashville, Tennessee, on this map?” You might respond, “I don’t know. What do you think?” This gives children the chance to share their ideas before a parent or caregiver guides them toward a response.

4. Bring STEM to life

Overhearing or participating in budget talks can help children develop math skills and financial literacy.
SeizaVisuals/Getty Images

Turn ordinary moments into curious conversations.

“This recipe is for four people, but we have 11 people coming to dinner. What should we do?”
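That recipe question boils down to proportional reasoning, which can be sketched in a few lines. The ingredient names and quantities below are made up for illustration; only the 4-serving and 11-guest numbers come from the example:

```python
from fractions import Fraction

def scale_recipe(ingredients, from_servings, to_servings):
    """Scale every ingredient by the ratio of guests to original servings."""
    factor = Fraction(to_servings, from_servings)  # 11/4 = 2.75 for this dinner
    return {name: float(qty * factor) for name, qty in ingredients.items()}

# A hypothetical 4-serving recipe scaled up for 11 dinner guests.
base = {"flour_cups": 2.0, "eggs": 3.0, "milk_cups": 1.0}
print(scale_recipe(base, from_servings=4, to_servings=11))
# {'flour_cups': 5.5, 'eggs': 8.25, 'milk_cups': 2.75}
```

The follow-up questions are where the learning happens: Do you round 8.25 eggs up or down? Is it easier to triple the recipe and accept leftovers?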

In a recent interview, one participant described how much they learned from listening in on financial conversations, seeing how decisions got made about money, and watching how bills were handled. They were developing financial literacy and math skills.

As they noted, “By the time I got to high school, I had a very good basis on what I’m doing and how to do it and function as a person in society.”

Globally, many individuals lack financial literacy, which can lead to poor outcomes later in life in areas such as retirement planning and debt management.

Why is this important?

Research shows that talking with friends and family about STEM concepts supports how children see themselves as learners and their later success in STEM fields, even if they do not pursue a career in STEM.

My research also shows how family STEM participation gives children opportunities to explore STEM ideas in ways that go beyond what they typically experience in school.

In my view, these kinds of STEM experiences don’t compete with what children learn in school – they strengthen and support it.

The Conversation

Amber M. Simpson receives funding from the U.S. National Science Foundation.

ref. 4 creative ways to engage children in STEM over the summer: Tips to foster curiosity and problem-solving at home – https://theconversation.com/4-creative-ways-to-engage-children-in-stem-over-the-summer-tips-to-foster-curiosity-and-problem-solving-at-home-257407

NCAA will pay its current and former athletes in an agreement that will transform college sports

Source: – By Joshua Lens, Associate Professor of Instruction of Sport & Recreation Management, University of Iowa

Former Arizona State University swimmer Grant House is one of the plaintiffs in the class action lawsuit filed against the NCAA. Mike Comer/NCAA Photos via Getty Images

The business of college sports was upended after a federal judge approved a settlement between the NCAA and former college athletes on June 6, 2025.

After a lengthy litigation process, the NCAA has agreed to provide US$2.8 billion in back pay to former and current college athletes, while allowing schools to directly pay athletes for the first time.

Joshua Lens, whose scholarship centers on the intersection of sports, business and the law, tells the story of this settlement and explains its significance within the rapidly changing world of college sports.

What will change for players and schools with this settlement?

The terms of the settlement included the following changes:

  • The NCAA and conferences will distribute approximately $2.8 billion in media rights revenue back pay to thousands of athletes who competed since 2016.

  • Universities will have the ability to enter name, image and likeness, or NIL, agreements with student-athletes. So schools can now, for example, pay them to appear in ads for the school or for public appearances.

  • Each university that opts in to the settlement can disburse up to $20.5 million to student-athletes in the 2025-26 academic year, a number that will likely rise in future academic years.

  • Athletes’ NIL agreements with certain individuals and entities will be subject to an evaluation that will determine whether the NIL compensation exceeds an acceptable range based on a perceived fair market value, which could result in the athlete having to restructure or forgo the deal.

  • The NCAA’s maximum sport program scholarship limits will be replaced with maximum team roster size limits for universities that choose to be part of the settlement.

Why did the NCAA agree to settle with, rather than fight, the plaintiffs?

In 2020, roughly 14,000 current and former college athletes filed a class action lawsuit, House v. NCAA, seeking damages for past restrictions on their ability to earn money.

For decades, college athletics’ primary governing body, the NCAA, permitted universities whose athletics programs compete in Division I to provide their athletes with scholarships that would help cover their educational expenses, such as tuition, room and board, fees and books. By focusing only on educational expenses, the NCAA was able to reinforce the notion that collegiate athletes are amateurs who may not receive pay for participating in athletics, despite making money for their schools.

A year later, in 2021, the U.S. Supreme Court unanimously ruled in a separate case, Alston v. NCAA, that the NCAA violated antitrust laws by limiting the amount of education-related benefits, such as laptops, books and musical instruments, that universities could provide to their athletes. The ruling challenged the NCAA’s amateurism model while opening the door for future lawsuits tied to athlete compensation.

It also bolstered the plaintiffs’ case in House v. NCAA, compelling college athletics’ governing body to take part in settlement talks.

What were some of the key changes that took place in college sports after the Supreme Court’s decision in Alston v. NCAA?

Following Alston, the NCAA permitted universities to dole out several thousand dollars in what’s called “education benefits pay” to student-athletes. This could include cash bonuses for maintaining a certain GPA or simply satisfying NCAA academic eligibility requirements.

But contrary to popular belief, the Supreme Court’s Alston decision didn’t let college athletes be paid via NIL deals. The NCAA continued to maintain that this would violate its principles of amateurism.

However, many states, beginning with California, introduced or passed laws that required universities within their borders to allow their athletes to accept NIL compensation.

With over a dozen states looking to pass similar laws, the NCAA folded on June 30, 2021, changing its policy so athletes could accept NIL compensation for the first time.

Will colleges and universities be able to weather all of these financial commitments?

The settlement will result in a windfall for certain current and former collegiate athletes, with some expected to receive several hundred thousand dollars.

Universities and their athletics departments, on the other hand, will have to reallocate resources or cut spending. Some will cut back on travel for certain sports, others have paused facility renovations, and some athletic departments may resort to cutting sports whose revenue does not cover their expenses.

As Texas A&M University athletic director Trev Alberts has explained, however, college sports does not have a revenue problem – it has a spending problem. Even in the well-resourced Southeastern Conference, for example, many universities’ athletics expenses exceed their revenue.

Do you see any future conflicts on the horizon?

Many observers hope the settlement brings stability to the industry. But there’s always a chance that the settlement will be appealed.

More potential challenges could involve Title IX, the federal gender equity statute that prohibits discrimination based on sex in schools.

What if, for example, a university subject to the statute distributes the vast majority of revenue to male athletes? Such a scenario could violate Title IX.

NCAA President Charlie Baker, who has served in his role since 2023, has overseen major changes in conference governance and athlete compensation.
David J. Griffin/Icon Sportswire via Getty Images

On the other hand, a university that more equitably distributes revenue among male and female athletes could face legal backlash from football athletes who argue that they should be entitled to more revenue, since their games earn the big bucks.

And as I pointed out in a recent law review article, an athlete or university may challenge the new enforcement process that will attempt to limit athletes’ NIL compensation to an acceptable range based on a fair market valuation.

The NCAA and the conferences named in the lawsuit have hired the accounting firm Deloitte to determine whether athletes’ compensation from NIL deals falls within an acceptable range based on a fair market valuation, looking to other collegiate and professional athletes to set a benchmark range. If athletes and universities have struck deals that are too generous, both could be penalized under the terms of the settlement.
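In the abstract, a benchmark-range evaluation like the one described above amounts to comparing a deal against deals struck by comparable athletes. The sketch below is purely hypothetical: the function name, the 25% tolerance and the dollar figures are invented for illustration and do not reflect Deloitte’s actual methodology:

```python
from statistics import median

def within_benchmark_range(deal_value, comparable_deals, tolerance=0.25):
    """Hypothetical check: does a deal stay within `tolerance` above the
    median of deals struck by comparable athletes?

    Only overshooting is flagged, since the settlement's concern is
    compensation that exceeds a fair market range.
    """
    benchmark = median(comparable_deals)
    return deal_value <= benchmark * (1 + tolerance)

# Deals comparable athletes have signed (hypothetical figures).
comparables = [50_000, 60_000, 75_000]
print(within_benchmark_range(70_000, comparables))   # within range
print(within_benchmark_range(500_000, comparables))  # would need restructuring
```

The legal challenges the article anticipates would likely turn on exactly these design choices: which athletes count as comparable, and how wide an “acceptable range” is.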

Finally, the settlement does not address – let alone solve – issues facing international student-athletes who want to earn money via NIL. Most international student-athletes’ visas, and the laws regulating them, heavily limit their ability to accept compensation for work, including NIL pay. Some lawmakers have tried to address this issue in the past, but it hasn’t been a priority for the NCAA, as it has lobbied Congress for a federal NIL law.

The Conversation

Joshua Lens owns The Compliance Group, which provides NCAA compliance consulting services for universities and conferences.

ref. NCAA will pay its current and former athletes in an agreement that will transform college sports – https://theconversation.com/ncaa-will-pay-its-current-and-former-athletes-in-an-agreement-that-will-transform-college-sports-256178

How school choice policies evolved from supporting Black students to subsidizing middle-class families

Source: – By Kendall Deas, Assistant Professor of Education Policy, Law, and Politics, University of South Carolina

Originally developed as a tool to help Black children attend better schools, school voucher programs now serve a different purpose. Drazen via Getty Images

School voucher programs that allow families to use public funds to pay tuition to attend private schools have become increasingly popular.

Thirteen states and the District of Columbia currently operate voucher programs.

In addition, 15 states have universal private school choice programs that offer vouchers, education savings accounts and tax credit scholarships.

More states are considering school choice and voucher programs as the Trump administration advocates for widespread adoption.

School vouchers have a long history in the U.S.

The first vouchers were offered in the 1800s to help children in sparsely populated towns in rural Vermont and Maine attend classes in public and private schools in nearby districts.

After the U.S. Supreme Court’s 1954 Brown v. Board of Education decision, in which justices ruled that separating children in public schools on the basis of race was unconstitutional, segregationists used vouchers to avoid school integration.

More recently, school voucher programs have been pitched as a tool to provide children from low-income families with quality education options.

As a scholar who specializes in education policy, law and politics, I can share how current policies have strayed from efforts to support low-income Black children.

History of school voucher programs

Over time, as school voucher policies grew in popularity, they evolved into education subsidies for middle-class families.
Peter Dazeley/Getty Images

Research from education history scholars shows that more recent support for school choice was not anchored in an agenda to privatize public schools but rooted in a mission to support Black students.

Over time, as school voucher policies grew in popularity, they evolved into subsidies for middle-class families to send their children to private and parochial schools.

School choice policies have also expanded to include education savings account programs and vouchers funded by tax credit donations.

Vouchers can redirect money away from public schools, many of which serve Black students.

Impact on public schools

School voucher programs can negatively impact the quality of public schools serving Black students.
Connect Images via Getty Images

States looking to add or expand school choice and voucher programs have adopted language from civil rights activists pushing for equal access to quality education for all children. For example, they contend that school choice is a civil right all families and students should have as U.S. citizens. But school voucher programs can exclude Black students and harm public schools serving Black students in a host of ways, research shows.

This impact of voucher programs disproportionately affects schools in predominantly Black communities with lower tax bases to fund public schools.

Since the Brown v. Board ruling, school voucher programs have been linked to racial segregation. These programs were at times used to circumvent integration efforts: They allowed white families to transfer their children out of diverse public schools into private schools.

In fact, school voucher programs tend to exacerbate both racial and economic segregation, a trend that continues today.

For example, private schools that receive voucher funding are not always required to adopt the same antidiscrimination policies as public schools.

School voucher programs can also negatively impact the quality of public schools serving Black students.

As some of the best and brightest students leave to attend private or parochial schools, public schools in communities serving Black students often face declining enrollments and reduced resources.

In cities such as Macon, Georgia, families say that majority Black schools lack resources because so many families use the state’s voucher-style program to attend mostly white private schools.

Moreover, the cost of attending a private or parochial school can be so high that even with a school voucher, Black families still struggle to afford sending their children to these schools.

Vouchers can siphon school funding

Two parents walk hand-in-hand in a school hallway while talking with an educator.
Voucher programs can disproportionately affect funding in majority Black school districts.
kali9/Getty Images

Research from the Economic Policy Institute, a nonpartisan, nonprofit think tank based in Washington, D.C., shows that voucher programs in Ohio result in majority Black school systems such as the Cleveland Metropolitan School District losing millions in education funding.

Another example is the Marion County School District, a South Carolina system where about 77% of students are Black.

Marion County sits in the heart of a region of the state known as the “Corridor of Shame,” notorious for inadequate school funding and poor student achievement. The 17 counties along the corridor are predominantly minority communities with high poverty rates and poorly funded public schools, a result of the area’s low tax base due to a lack of industry.

On average, South Carolina school districts spent an estimated US$18,842 per student during the 2024-25 school year.

In Marion County, per-student funding was $16,463 during the 2024-25 school year.

By comparison, in Charleston County, the most affluent in the state, per-student funding was more than $26,000.
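The scale of these funding gaps is easier to grasp at the classroom level. A minimal back-of-envelope sketch using the figures quoted above, assuming a hypothetical class size of 25 students (the article does not give one), and treating the Charleston figure as a lower bound since it is described only as “more than $26,000”:

```python
# Per-student funding figures from the article (2024-25 school year).
marion_per_student = 16_463
state_avg_per_student = 18_842
charleston_per_student = 26_000  # lower bound: "more than $26,000"

class_size = 25  # assumed classroom size, not from the article

gap_vs_state = state_avg_per_student - marion_per_student
gap_vs_charleston = charleston_per_student - marion_per_student

print(f"Marion County trails the state average by ${gap_vs_state:,} per student,")
print(f"or about ${gap_vs_state * class_size:,} per {class_size}-student classroom.")
print(f"Against Charleston County, the gap is at least ${gap_vs_charleston:,} per student.")
```

Under those assumptions, a single Marion County classroom operates with roughly $59,000 less than a state-average classroom, and the per-student gap with Charleston County is at least $9,537.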

Returning voucher policy to its roots

Rather than focus on school choice and voucher programs that take money away from public schools serving Black students, I argue that policymakers should address systemic inequities in education to ensure that all students have access to a quality education.

Establishing restrictions on the use of funds and requiring preferences for low-income Black students could help direct school voucher policies back toward their original intent.

It would also be beneficial to expand and enforce civil rights laws to prevent discrimination against Black students.

These measures would help ensure all students, regardless of background, have access to quality education.

The Conversation

Kendall Deas does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. How school choice policies evolved from supporting Black students to subsidizing middle-class families – https://theconversation.com/how-school-choice-policies-evolved-from-supporting-black-students-to-subsidizing-middle-class-families-252481

The complex reality of college student mental health: Data reveals both challenges and positive trends

Source: – By Jeffrey A. Hayes, Professor of Education and Psychology, Penn State

College students are facing mental health challenges, but not all is lost. Bevan Goldswain/Getty Images

The word “crisis” is used frequently and, I would argue, inaccurately, to depict the psychological well-being of today’s college students.

It is true that college students’ mental health has deteriorated in many regards during the past two decades.

The Healthy Minds Study, which gathers national survey data on tens of thousands of students annually, has found that the percentage who considered suicide in the prior year rose from 6% in 2007 to 13% in 2024. The percentage of students who made a specific suicide plan tripled during that period.

While some news reports portray the current state of student mental health as an unprecedented crisis, the full picture is more nuanced. As a psychologist who has been researching college student mental health for more than 20 years, as summarized in my recent book, “College Student Mental Health and Wellness: Coping on Campus,” I believe recent data suggests a turning of the tide.

The 2024 Healthy Minds Study found a slight decrease over the previous two years in the percentage of students contemplating suicide.

Data also reveals a similar decline in the percentage of students dealing with severe anxiety from 2022 to 2024.

It is the first two-year decrease in either measure since the study began collecting data.

Reason for concern

A student with dark hair sits in front of a laptop computer crying with her head in her hands.
The demand for psychological services at college and university counseling centers has outpaced growth in undergraduate enrollment.
Peter Dazeley/Getty Images

To be clear, there is reason for concern about the psychological well-being of college students.

Healthy Minds Study researchers found that in 2007, 9% of college students were taking psychotropic medication such as antidepressants. In 2024, that number had grown to 26%.

A 2024 national survey conducted by the American College Health Association found that more than a third of students received mental health care in the previous year.

The demand for psychological services at college and university counseling centers has outpaced growth in undergraduate enrollment more than fourfold.

From 2013 to 2021, suicidal thoughts, depression and anxiety worsened, particularly among Native American and Alaska Native students and other students of color.

During that same time, there was a 13% increase in students who were at risk for developing an eating disorder.

Findings from another national dataset gathered by the Center for Collegiate Mental Health, an international network of more than 800 college and university counseling centers, indicate that from 2010 to 2024, depression symptoms increased 18% among students receiving psychological services, general anxiety symptoms rose more than 25%, and social anxiety symptoms climbed more than 30%.

In addition, students’ family-related distress steadily increased during the past decade.

The sky is not falling

A group of six smiling students walks outside on a college campus.
Despite disturbing trends in student mental health, recent data suggests that fewer students are contemplating suicide and dealing with anxiety.
Ariel Skelley/Getty Images

Despite these challenges, there is good news regarding decreases in the share of students considering self-injury and reporting depression symptoms.

Data from the Healthy Minds Study reveals that the percentage of students considering self-injury has not increased the past two years, after more than doubling from 14% in 2007 to 29% in 2022.

A similar pattern can be found in Center for Collegiate Mental Health data about depression. Depression symptoms have decreased each of the past two academic years.

The network has been collecting depression data since 2010, and never before have scores dropped in consecutive years.

Other researchers have noted a similar recent decrease in depression among college students.

The Center for Collegiate Mental Health data also indicates that students’ academic distress peaked following the onset of COVID-19 and declined each of the past three years, returning to pre-pandemic levels. Students’ frustration has also shown a gradual, 7% decline from 2010 to 2024.

Furthermore, for the first time since 2012, there has been a two-year uptick in college students who are flourishing, according to data from the Healthy Minds Study. Other researchers have found a similar recent trend, accompanied by a decrease in student loneliness.

More good news, based on data, about what students put in their bodies: Symptoms related to eating disorders have not increased in any of the past four years, according to the Center for Collegiate Mental Health. Data from the network indicates that current alcohol use is at its lowest level since 2010, declining 29% over that period.

Binge drinking has also decreased 18% since 2012, according to the Healthy Minds Study.

We need data, not dread

A student and mental health therapist sit and talk in a college library
Mental health professionals need accurate data to support the psychological well-being of college students.
SeventyFour/Getty Images

Valid data can help in discerning the truth about college student mental health.

Data that captures national trends in college student psychological well-being is needed to support mental health professionals. For example, as data reveals emerging trends, such as an increase in college students with attention-deficit/hyperactivity disorder, training can be provided to clinicians in treating students with these concerns.

Campus mental health professionals and administrators can also use data to advocate for resources they need to support students. For instance, our research has found that students of color are more likely to seek psychological help when there are therapists on staff from the same ethnic or racial background. This data can inform hiring practices at college and university counseling centers.

Finally, continuous data collection can help determine how college student mental health is impacted by specific events, such as pandemics, campus shootings and laws that eliminate diversity, equity and inclusion programs. During the COVID-19 pandemic, social anxiety decreased, while general anxiety spiked.

These events may not affect students equally.

International students, a group that already experiences heightened suicidal thoughts, may be particularly impacted by recent news of visa cancellations and deportations.

The Conversation

Jeffrey A. Hayes has received a research grant from the American Foundation for Suicide Prevention to study college student suicide.

ref. The complex reality of college student mental health: Data reveals both challenges and positive trends – https://theconversation.com/the-complex-reality-of-college-student-mental-health-data-reveals-both-challenges-and-positive-trends-257086

How the end of carbon capture could spark a new industrial revolution

Source: – By Andres Clarens, Professor of Civil and Environmental Engineering, University of Virginia

Steelmaking uses a lot of energy, making it one of the highest greenhouse gas-emitting industries.
David McNew/Getty Images

The U.S. Department of Energy’s decision to claw back US$3.7 billion in grants from industrial demonstration projects may create an unexpected opening for American manufacturing.

Many of the grant recipients were deploying carbon capture and storage – technologies that are designed to prevent industrial carbon pollution from entering the atmosphere by capturing it and injecting it deep underground. The approach has long been considered critical for reducing the contributions chemicals, cement production and other heavy industries make to climate change.

However, the U.S. policy reversal could paradoxically accelerate emissions cuts from the industrial sector.

An emissions reality check

Heavy industry is widely viewed as the toughest part of the economy to clean up.

The U.S. power sector has made progress, cutting emissions 35% since 2005 as coal-fired power plants were replaced with cheaper natural gas, solar and wind energy. More than 93% of new grid capacity installed in the U.S. in 2025 was forecast to be solar, wind and batteries. In transportation, electric vehicles are the fastest-growing segment of the U.S. automotive market and will lead to meaningful reductions in pollution.

But U.S. industrial emissions have been mostly unchanged, in part because of the massive amount of coal, gas and oil required to make steel, concrete, aluminum, glass and chemicals. Together these materials account for about 22% of U.S. greenhouse gas emissions.

The global industrial landscape is changing, though, and U.S. industries cannot, in isolation, expect that yesterday’s means of production will be able to compete in a global marketplace.

Even without domestic mandates to reduce their emissions, U.S. industries face powerful economic pressures. The EU’s new Carbon Border Adjustment Mechanism imposes a tax on the emissions associated with imported steel, chemicals, cement and aluminum entering European markets. Similar policies are being considered by Canada, Japan, Singapore, South Korea and the United Kingdom, and were even floated in the United States.

The false promise of carbon capture

The appeal of carbon capture and storage, in theory, was that it could be bolted on to an existing factory with minimal changes to the core process and the carbon pollution would go away.

Government incentives for carbon capture allow producers to keep using polluting technologies and prop up gas-powered chemical production or coal-powered concrete production.

The Trump administration’s pullback of carbon capture and storage grants now removes some of these artificial supports.

Freed from the expectation that carbon capture will help them meet regulations, industries may now have space to focus on materials breakthroughs that could revolutionize manufacturing while solving their emissions problems.

The materials innovation opportunity

So, what might emissions-lowering innovation look like for industries such as cement, steel and chemicals? As a civil and environmental engineer who has worked on federal industrial policy, I study the ways these industries intersect with U.S. economic competitiveness and our built environment.

There are many examples of U.S. innovation to be excited about. Consider just a few industries:

Cement: Cement is one of the most widely used materials on Earth, but the technology has changed little over the past 150 years. Today, its production generates roughly 8% of total global carbon pollution. If cement production were a country, it would rank third globally after China and the United States.

Researchers are looking at ways to make concrete that can shed heat or be lighter in weight to significantly reduce the cost of building and cooling a home. Sublime Systems developed a way to produce cement with electricity instead of coal or gas. The company lost its Industrial Demonstrations Program grant in May 2025, but it has a new agreement with Microsoft.

Making concrete do more could accelerate the transition. Researchers at Stanford and separately at MIT are developing concrete that can act as a capacitor and store over 10 kilowatt-hours of energy per cubic meter. Such materials could potentially store electricity from your solar roof or allow for roadways that can charge cars in motion.
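To put the 10 kilowatt-hours-per-cubic-meter figure in perspective, a rough sketch of what that storage density would mean for a household. The home battery capacity (13.5 kWh, typical of common residential units) and the 30 kWh daily household consumption figure are assumptions for illustration, not from the article:

```python
# Storage density cited in the research above.
CONCRETE_KWH_PER_M3 = 10

home_battery_kwh = 13.5   # assumed: capacity of a common residential battery
daily_home_use_kwh = 30   # assumed: rough average daily U.S. household use

# Volume of energy-storing concrete needed to match each benchmark.
volume_for_battery = home_battery_kwh / CONCRETE_KWH_PER_M3
volume_for_day = daily_home_use_kwh / CONCRETE_KWH_PER_M3

print(f"{volume_for_battery:.2f} cubic meters of concrete ~ one home battery")
print(f"{volume_for_day:.1f} cubic meters of concrete ~ one day of household use")
```

Under those assumptions, a foundation slab of a few cubic meters could in principle hold as much energy as a dedicated home battery, which is what makes the roadway and rooftop applications the researchers describe plausible.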

How concrete could be used as a capacitor. MIT.

Technologies like these could give U.S. companies a competitive advantage while lowering emissions. Heat-shedding concrete cuts air conditioning demand, lighter formulations require less material per structure, and energy-storing concrete could potentially replace carbon-intensive battery manufacturing.

Steel and iron: Steel and iron production generate about 7% of global emissions with centuries-old blast furnace processes that use intense heat to melt iron ore and burn off impurities. A hydrogen-based steelmaking alternative exists today that emits only water vapor, but it requires new supply chains, infrastructure and production techniques.

U.S. Steel has been developing techniques to create stronger microstructures within steel, allowing structures to be built with 50% less material than conventional designs while retaining their strength. When a skyscraper needs that much less steel to achieve the same structural integrity, that eliminates millions of tons of iron ore mining, coal-fired blast furnace operations and transportation emissions.

Chemicals: Chemical manufacturing has created simultaneous crises over the past 50 years: PFAS “forever chemicals” and microplastics have been showing up in human blood and across ecosystems, and the industry generates a large share of U.S. industrial emissions.

Companies are developing ways to produce chemicals using engineered enzymes instead of traditional petrochemical processes, achieving 90% lower emissions in a way that could reduce production costs. These bio-based chemicals can naturally biodegrade, and the chemical processes operate at room temperature instead of requiring high heat that uses a lot of energy.

Is there a silver bullet without carbon capture?

While carbon capture and storage might not be the silver bullet for reducing emissions that many people thought it would be, new technologies for managing industrial heat might turn out to be the closest thing to one.

Most industrial processes require temperatures between 300 and 1,830 degrees Fahrenheit (150 and 1,000 degrees Celsius) for everything from food processing to steel production. Currently, industries burn fossil fuels directly to generate this heat, creating emissions that electric alternatives cannot easily replace. Heat batteries may offer a breakthrough solution by storing renewable electricity as thermal energy, then releasing that heat on demand for industrial processes.

How thermal batteries work. CNBC.

Companies such as Rondo Energy are developing systems that store wind and solar power in bricklike materials heated to extreme temperatures. Essentially, they convert electricity into heat during times when electricity is abundant, usually at night. A manufacturing facility can later use that heat, which allows it to reduce energy costs and improve grid reliability by not drawing power at the busiest times. The Trump administration cut funding for projects working with Rondo’s technology, but the company’s products are being tested in other countries.

Industrial heat pumps provide another pathway by amplifying waste heat to reach the high temperatures manufacturing requires, without using as much fossil fuel.

The path forward

The Department of Energy’s decision forces industrial America into a defining moment. One path leads backward toward pollution-intensive business as usual, propping up obsolete processes. The other path drives forward through innovation.

Carbon capture offered an expensive Band-Aid on old technology. Investing in materials innovation and new techniques for making them promises fundamental transformation for the future.

The Conversation

Andres Clarens receives funding from the National Science Foundation and the Alfred P. Sloan Foundation.

ref. How the end of carbon capture could spark a new industrial revolution – https://theconversation.com/how-the-end-of-carbon-capture-could-spark-a-new-industrial-revolution-257894