Attacks on Nigeria’s energy systems weaken the country – research unpacks costs, risks and ways forward

Source: The Conversation – Africa – By Haruna Inuwa, DPhil Candidate, Energy Systems, University of Oxford

Energy systems are coming under attack globally because disrupting power or fuel supplies offers strategic, economic or political leverage. This can be in local conflicts or large-scale geopolitical confrontations.

Nigeria illustrates this clearly: militants in the Niger Delta sabotage pipelines to assert control and tap into oil revenues, while the extremist group Boko Haram and armed bandits in the north hit power lines to weaken state presence.

These incidents reveal how conflict actors weaponise energy systems.

We recently published a study assessing how militancy, insurgency and armed banditry undermine Nigeria’s energy systems by disrupting oil, gas and power infrastructure. We compiled novel datasets of energy-related incidents, mapping their timing, location and cost from 2009 to 2025.

Our findings show that more than 2,300 separate attacks were recorded. We see a widening pattern of energy insecurity that drains national revenue, drives away investment, and worsens environmental injustice.

This explains why Nigeria’s energy insecurity has become one of its most serious development and security challenges.

We recommend investment in decentralised systems, community engagement in oil regions, and policies supporting industrial decarbonisation to strengthen resilience and advance climate goals.

The price

According to our estimates, between 2009 and 2024, approximately US$20 billion was lost as a result of attacks. During the 2013-2016 surge in militancy, losses peaked at roughly US$17 billion.

We found that the South-South (Niger Delta) region remains the epicentre of oil sabotage, with peak revenue losses of US$8.62 trillion (2009-2012) and sustained environmental damage.

Attacks and oil theft along the Trans-Niger Pipeline were particularly devastating. This pipeline moves 450,000 barrels of crude oil daily from oil-producing fields in the Niger Delta region to export terminals. Each disruption not only shuts down production but also deprives the government of huge revenues.

Since 2021, tactics have shifted. Over 40 attacks have targeted transmission lines in the North-East and North-Central, largely linked to Boko Haram and armed bandits.

Case studies of the 2016 Shell Forcados terminal bombing and the 2024 Shiroro transmission line attack show that reliance on backup generators increased electricity costs by 3.2 to 6 times.

Beyond the financial toll, communities suffer respiratory illnesses, unsafe drinking water and food insecurity.

Disruptions have made Nigeria’s grid more unstable and pose risks to critical infrastructure projects nearing completion, including gas pipelines.

Attacks threaten regional energy trade and integration projects, such as the West African Power Pool, West African Gas Pipeline, Nigeria-Morocco Gas Pipeline, and the proposed Nigeria-Algeria Gas Pipeline, which rely on secure cross-border energy infrastructure.

Foreign investors view these risks as prohibitive. In 2020 alone, Nigeria lost around US$40 billion in foreign direct investment because of attacks on energy infrastructure.

Oil theft and sabotage have also left a toxic legacy in the Niger Delta. Each pipeline rupture spills crude into rivers and farmland, wiping out livelihoods.

We find that clean-up costs from oil spills on the Trans-Niger Pipeline alone ranged from US$150 million to US$290 million in each four-year period (2009-2012, 2013-2016, 2017-2020 and 2021-2024), highlighting continuous environmental degradation in the Niger Delta.

To put this in context, the United Nations Environment Programme has estimated that a US$1 billion, 30-year clean-up is needed in Ogoniland, while Reuters reported that addressing oil pollution in Bayelsa State alone might require US$12 billion over 12 years. Compared with Nigeria’s GDP of US$375 billion in 2024, these figures underscore the substantial financial strain that this attack-induced environmental crisis places on national resources.

Our analysis indicates that insurgents and bandits have shifted tactics since 2021. We see increased disruption and attacks on power infrastructure in the northern part of the country.

More than 40 incidents targeting high-voltage transmission lines have been recorded in just four years, a 20-fold increase from the previous decade. Two major examples show the consequences: the 2016 Forcados terminal bombing cut national power generation by 3,132 MW, while the 2024 Shiroro transmission-line attack left the north-western part of the country in darkness for two weeks.

During attack-induced outages, businesses and households switch to diesel or petrol generators. We find that this backup electricity costs three to six times more than grid power, with the North-East and North-West experiencing the highest cost increases.

Each attack also carries an invisible environmental cost. Backup generators release far more carbon dioxide than grid electricity. During the 2016 and 2024 outages, we estimated sharp spikes in CO₂ across the South-West and South-South, Nigeria’s most energy-hungry regions.

This trend undermines Nigeria’s commitments under the National Climate Change Policy 2021-2030, which aims to cut emissions and expand energy access using renewable energy. Insecurity, therefore, is not just an economic or social problem – it is an obstacle to climate progress.

How Nigeria can respond

Our research points to several steps that could make Nigeria’s energy systems more resilient:

  1. Invest in decentralised and modular power systems: Smaller, locally managed plants – such as the 52-megawatt Maiduguri Emergency Power Plant – are harder to sabotage and quicker to repair.

  2. Rebuild trust with host communities: Environmental remediation and transparent benefit-sharing can reduce grievances that drive sabotage. Local participation in energy projects must move beyond tokenism.

  3. Adopt technology for early warning and monitoring: Pressure sensors, drones and predictive analytics can detect tampering and leaks in real time. Government contracts with former militants to guard pipelines must be coupled with strict accountability.

  4. Accelerate innovative clean-energy deployment: In light of Nigeria’s commitment to achieving its climate goals, it is important to explore emerging decarbonisation pathways, including clean hydrogen.

Nigeria’s energy wealth has long promised prosperity, but persistent insecurity has made it a liability. The financial losses, pollution and emissions caused by repeated attacks erode resilience and deter investment. This challenge is not unique to Nigeria; it reflects a broader global reality in which energy transitions depend on secure infrastructure.

Achieving a stable, decentralised and low-carbon system will require protecting the assets that make it possible.

The Conversation

Haruna Inuwa receives funding from Petroleum Trust Development Fund, Nigeria. However, the views expressed herein do not necessarily reflect the Nigeria government’s official policies.

Stephanie Hirmer receives funding from the Climate Compatible Growth (CCG) Programme which is funded by UK aid from the UK government. The views expressed in this work do not necessarily reflect the UK government’s official policies.

Alycia Leonard does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Attacks on Nigeria’s energy systems weaken the country – research unpacks costs, risks and ways forward – https://theconversation.com/attacks-on-nigerias-energy-systems-weaken-the-country-research-unpacks-costs-risks-and-ways-forward-271366

Donkeys are a common sight in northern Namibia – what colonial history has to do with it

Source: The Conversation – Africa – By Giorgio Miescher, Associate Researcher University of Basel and University of Namibia, University of Basel

Donkeys are an unassuming yet ubiquitous presence in northern Namibia. They traverse sandy village roads, pull carts stacked with firewood, and graze freely along the northern edge of the Etosha National Park.

The story of how they came to occupy such a central role in rural life – and in such large numbers – is a fascinating one that’s linked to the country’s colonial history, the management of wildlife versus domestic animals, and the role of migrant workers.

We are historians who specialise in Namibia and Southern Africa. Our research focuses on colonial legacies in nature conservation and land. In a research paper we retraced the routes of the domesticated donkey through a conservation landscape.

We found that donkeys occupy a contradictory status in communities in northern Namibia. They are indispensable, yet undervalued. For example, they remain central to tasks such as ploughing, hauling water and transporting logs. Yet their social status remains curiously low. They are rarely used in ceremonies, have little monetary value, and are strongly associated with those who cannot afford tractors or cars.

We conclude that this ambiguity has arisen from the long histories of colonial rule, labour migration, conservation and veterinary control that shaped northern Namibia.

The great trek north

We traced donkeys’ ability to move across one of the country’s most significant borders: the veterinary cordon fence known as the Red Line. The Red Line is an inner-Namibian border, over 1,000 kilometres long, running from west to east and separating the country into two distinct parts. It originated under German colonial rule (1884-1915) and was fully implemented under South African rule (1915-1990).

It still exists today.

The Red Line separated the more densely populated northern parts of the country from the settler-colonial heartland, the so-called Police Zone in central and southern Namibia. The Etosha Game Reserve served as a buffer zone between the Police Zone and the Owambo region in the central north, conceptualised as a migrant labour reservoir.

Donkeys entered Namibia’s central north relatively late, and only became common in the 1920s and 1930s. Their presence across the region was driven largely by migrant labourers working on contract. As thousands of men travelled between the Police Zone and Owambo, many returned home with equines – especially donkeys – purchased in the south.

Cheap, hardy, and resistant to many diseases, donkeys became essential companions on the workers’ long journeys. Donkeys carried heavy loads of clothes, tools and other goods, including gramophones and radios, earned through contract labour.

Since they were associated with commodities, donkeys also became a symbol of modernity expanding from the thriving settler economy in the south.

Today, people still recount how returning labour migrants used donkeys to haul luggage through predator-rich landscapes within Etosha, or how villagers took their carts to meet these men halfway. Donkeys also served as ambulances during emergencies in the Namibian Liberation War (1966-1989).

Their presence has also been entangled with colonial border regimes and conservation policies.

The tensions

During the rinderpest epidemic of 1896-97, in a failed attempt to stop the disease from entering the colony, German colonial authorities established a cordon of military outposts along the southern edge of the Etosha Pan. Although intended to control the movement of cattle, this cordon would later become the Red Line.

The devastation of rinderpest prompted German forces to import donkeys and mules as disease-resistant alternatives to oxen. These animals gradually filtered into civilian hands in the Police Zone, the heartland of settler colonialism in central and southern Namibia, and became increasingly common by the 1910s.

The establishment of Game Reserve 2, comprising today’s Etosha National Park and the areas north-west of the Etosha Pan, was part of a policy to seal off Owambo from the Police Zone. Hunting and human movement in the reserve became highly regulated.

In 1915 South Africa defeated the German forces and took over Namibia. The new colonial power maintained the inner border and formalised it as the Red Line in the 1920s and 1930s. They banned cattle movement across the Red Line but allowed equines, provided they carried veterinary certificates.

Donkeys thus became one of the few domestic animals permitted to cross the border legally.

As migrant labour expanded, so too did the flow of donkeys northward. By the late 1920s and 1930s, hundreds of donkeys passed through Etosha each year. In Owambo, they were quickly adopted for local agriculture and transport. Even as motorised lorries and buses began to dominate long-distance travel from the 1930s onward, many migrant workers still preferred to buy donkeys as durable companions.

By the 1940s, however, administrators in Owambo began to worry about the donkeys’ impact on grazing. Restrictions were introduced, but donkeys continued to slip into the north through unofficial routes.

From the 1950s onward, the situation changed dramatically as the Etosha National Park was transformed into a fenced conservation area. Residents and livestock were expelled, and by 1961 the southern boundary was fully fenced. Donkey traffic through Etosha came to an end.

Meanwhile, the northern boundary of Etosha became a flashpoint. The government of the pseudo-independent new Ovamboland homeland resisted efforts to fence this border and insisted on continued movement of wildlife out of Etosha – especially zebra, an important local food source. Conservation officials accused communities of using donkeys to disguise poaching tracks and allowing their animals to stray into the park.

New rules

With Namibia’s independence in 1990, new animal-movement regulations emerged, but donkeys retained their special status. Unlike cattle, they were still permitted to cross the Red Line.

Their symbolic and practical importance has changed. Migrant workers no longer return with donkeys from the south, and motorised transport dominates even in rural areas.

But donkeys remain deeply woven into the fabric of northern Namibian life. They continue to support poorer households, endure harsh environments, and live in proximity to wildlife. Their presence evokes conflicting memories – of difficult journeys and colonial border regimes, but also of development and modernity.


The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.

ref. Donkeys are a common sight in northern Namibia – what colonial history has to do with it – https://theconversation.com/donkeys-are-a-common-sight-in-northern-namibia-what-colonial-history-has-to-do-with-it-273058

South Africa’s new immigration policy takes a digital direction – will it succeed?

Source: The Conversation – Africa (2) – By Alan Hirsch, Senior Research Fellow New South Institute, Emeritus Professor at The Nelson Mandela School of Public Governance, University of Cape Town

South Africa has a new draft white paper on immigration, citizenship and refugees. This, the fourth in three decades, represents a step change from the previous efforts. It is a genuine attempt to develop an efficient but humane set of policies.

Based on my work on migration over two decades, I am convinced that the policies in this new paper are far more ambitious than previous reforms. They represent a genuine attempt to address a complex and sensitive set of challenges in a comprehensive way, using state-of-the-art technological tools. The key question is: are the reforms practically and politically feasible?

The first post-apartheid immigration white paper, published in 1997, led to the new Immigration Act of 2002. This was the second significant reform to immigration policy in the post-apartheid era. The first was the Refugee Act of 1998. The Refugee Act represented a bold realignment. In it South Africa acceded to global and African refugee treaties. It also placed human rights at the centre of the policy.

The 2002 Immigration Act was reformist rather than revolutionary. It was rightly criticised for not getting to grips with the legacy of migration patterns in southern Africa.

The white paper represents a far more coherent and systematic rethink than previous South African piecemeal reforms or similar attempts elsewhere in Africa.

The changes are being driven by Home Affairs minister Leon Schreiber. Schreiber is unusual among politicians: he is a trained political scientist with genuine expertise in public policy. He is ambitious and seems determined to accomplish as much as he can in the current term of government. The impression I get is that his senior officials buy into the reforms – indeed, they devised many of them.

The generational change is essentially digitisation. All civil records about citizens, migrants, prospective migrants, visitors, asylum seekers and refugees will be digitised and integrated. If it works, it could result in a watertight management system for immigration, citizenship and refugee protection. This would be a huge step up from the current jumble of paper-based and incomplete datasets.

If completely successful it would eliminate both the massive inefficiency of the Department of Home Affairs, and the fraud and general confusion which still plague the governance of migrants and refugees in South Africa.

Fit for the 21st century

Digitisation and integration of information systems was recommended by the Lubisi enquiry into documentation fraud commissioned by the previous minister.

In my own work on South Africa’s migration policies, I made similar recommendations, with the benefit of the evidence in the Lubisi report and other sources.

At the heart of the system being proposed in the new white paper is an Intelligent Population Register. This is a modern, digitised system to manage and use comprehensive population data. Countries like Estonia and Denmark have pioneered such systems, and India has shown how a digital ID system can be extended to its massive population. Botswana already has an integrated civil registration system similar to the one South Africa is planning.

As the minister of Home Affairs put it, an intelligent population register

uses advanced technologies, such as artificial intelligence, machine learning, biometrics, interoperability and real-time data integration, to improve governance, integrated service delivery, and national planning.

The new system will require mandatory birth and death registration, and biometric data not only for citizens but also for foreigners, regular and irregular, who reside in the country. This would provide data that enables far more effective social and economic policies than the current incomplete population register.

Irregular foreigners, including asylum seekers and others whose status is yet to be determined, will be:

  • counted

  • allowed to use the banking system irrespective of their status

  • expected to pay tax.

Other improvements are that it will be:

  • more difficult for unethical visa applicants to game the system

  • easier to keep track of refugees and asylum seekers

  • more difficult to carry out identity theft.

The other major change is that the new system will introduce a “merit-based path” to naturalisation, in contrast to the existing “mechanical and compliance-based” pathway.

Merit is preferred to years served. After five years of permanent residence, naturalisation will be acquired according to a set of accomplishments that are yet to be detailed. This will be available to immigrants who have come in through a points-based system, as well as to current citizens of Zimbabwe, Lesotho and Angola holding exemption permits. The yet-to-be-finalised points system will include assessments of educational qualifications, acquired skills and some measure of social impact.

The points-based system for skilled immigrants will replace or, for now, complement the critical skills list.

Other immigration reforms include a new start-up visa for tech firms, a subset of an investment visa which replaces the business visa, and new age and income requirements for retiree immigrants. The recently introduced Trusted Employer Scheme, Trusted Tour Operator Scheme and the remote work visa are endorsed in the white paper.

Reforms are proposed to speed up the asylum applications process, including a dedicated immigration court. Even those who obtain refugee status may be returned to the “first safe country” they passed through when fleeing their home country.

Countries which are safe for returnees would be designated by government – those which do not have raging civil wars or extreme repression or similar hazards for their citizens. South Africa would have to get agreement from the designated safe countries that they would accept returnees without prejudice.

Caveats and concerns

None of these reforms will be easy. Some, like the various points-based systems for entry, permanent residence and citizenship, and the establishment of dedicated refugee courts, are complex proposals not yet fully explained.

Other concerns include the privacy implications of the intelligent population register and the willingness of other countries to agree to being designated first safe country. Both issues are vulnerable to court challenges. Prospective first safe countries may require some incentive to cooperate, and South Africa might have to offer to accept a considerable share of the refugees.

There are also some issues covered in previous white papers not addressed here. Whether and how to draw on the financial and networking resources of the South African diaspora is not discussed. Nor is the issue of proactive policies to promote the social integration of foreigners.

Also not covered is the issue of lower-skilled migrants. However, migrant labour, mostly low-skilled, is the focus of the White Paper on National Labour Migration Policy republished by the Department of Employment and Labour last year.

The ambition signalled in the new policy paper is impressive. Whether it is doable, and whether the project will be completed, depends on many factors: political, technical and judicial.


Alan Hirsch does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. South Africa’s new immigration policy takes a digital direction – will it succeed? – https://theconversation.com/south-africas-new-immigration-policy-takes-a-digital-direction-will-it-succeed-274038

Afcon drama: what went wrong and what went right at the continent’s biggest football cup in Morocco

Source: The Conversation – Africa – By Chuka Onwumechili, Professor of Communications, Howard University

The 35th edition of the Africa Cup of Nations, hosted by Morocco, produced thrills and several storylines, some good and others not so good. It ended in victory for Senegal – their second Afcon championship. While the 1-0 win over Morocco was deserved, the final ended on a sour note as fans invaded the field and the eventual winners walked off the pitch for 16 minutes.

I’m a sports communications scholar and an author of multiple books on football as it relates to Africa.

The top four positives of the tournament were:

  • quality matches played on impeccable surfaces

  • expanded media coverage

  • increased global interest

  • higher attendance figures.

On the downside, however, we had the Senegalese team walkout during the final, bad refereeing decisions, especially in games involving Morocco, and ticketing challenges.

This 2025 Afcon provided examples of quality pitches and marketing that future hosts should learn from. For CAF (the Confederation of African Football), however, the lessons of this tournament are to provide better security around the field and better-trained match officials.

What went well

The infrastructure at Afcon showed Morocco’s readiness to co-host the 2030 World Cup. The country spent US$1.4 billion on six stadiums alone, and as much as US$10 billion on allied public transport infrastructure. The matches were of high quality on excellent surfaces.

The fans who watched the spectacular football were transported by a high-speed rail system and other seamless means of transport.

The quality of the surfaces may help explain why there were fewer surprises or upsets. All four teams that reached the semi-final stage – Egypt, Morocco, Nigeria and Senegal – were top-ranked in their groups.

Eventually, the championship game was contested by the two top ranked African teams. The game was outstanding as the well-known names produced memorable football throughout the tournament.






Expanded media coverage

The decision to expand into additional markets led to wider media coverage in China, Brazil and key European markets. With several well-known players from European clubs participating, a global audience was assured. Teams like Real Madrid, PSG, Bayern Munich, Manchester United and Liverpool had players participating.

Beyond those were world-renowned players such as Sadio Mane, Riyad Mahrez and Pierre-Emerick Aubameyang. Those names were certain to attract media audiences across the world.

Viewership rose overall, with remarkable increases in Europe. France recorded 3.4 million viewers and the UK had 1.7 million viewers.

Increased global interest

CAF announced a 90% increase in revenue. This year’s revenue was US$192.6 million (US$114 million profit), compared with US$105.6 million (US$72 million profit) at the previous Afcon. Commercial interest has grown alongside greater media reach: from just nine partners at the 2021 tournament to 17 in 2023 and 23 in this one.

Attendance figures have also risen remarkably. Figures announced at the end of the competition showed 1.34 million attended the games. The number of attendees in 2023 in Côte d’Ivoire was 1.1 million.

This clearly shows increased interest in the tournament. Morocco’s proximity to Europe was also a critical factor. More attendees travelled from the continent and elsewhere.

The prizes awarded to teams at the tournament also set records, with Senegal taking home US$11.6 million. Teams eliminated at the group stage received US$1.3 million each.






Errors

Angry scenes: The championship game was marred by a Senegalese walkout in protest over a penalty awarded to Morocco during extra time, delaying the game for 16 minutes. Senegal was already angered by the cancellation of its goal late in regulation time. The protest lasted until one of the team’s famous faces, Sadio Mane, asked his teammates to continue the game.

By then angry Senegalese fans had torn seats in the stands and multiple fights had broken out. In the end, Morocco could not convert the penalty and Senegal scored a memorable goal to emerge the winner.

Refereeing questions: Throughout the tournament, Morocco appeared to be favoured by several refereeing decisions and non-decisions. CAF should consider match official exchange programmes with other confederations as a way of improving officiating. This would not only help Afcon but also expose officials to other continental events.

Also of concern, Moroccan ball boys were seen seizing opposing goalkeepers’ towels in both the Nigeria v Morocco and Senegal v Morocco games.

Ticketing challenges: There were also ticketing problems. Although tickets were sold out, several stadiums were largely empty during the group games, which may be attributed to secondary sellers buying more tickets than they could re-sell. Nonetheless, an average of 21,167 fans attended each game. Media attendance also rose during the tournament: reports indicated that over 3,800 journalists covered the event from Morocco.

Looking ahead

The competition demonstrated Morocco’s readiness to host World Cup games in 2030. Morocco, along with Spain and Portugal, will host the games, featuring 48 teams. All six cities used for the 2025 Afcon will host the world in 2030. Portugal will have only two host cities and Spain will provide nine venues.

It will be difficult for the host nations for the 2027 Afcon to match Morocco’s accomplishment.

The three hosts for 2027 – Kenya, Tanzania and Uganda – should at least measure up to what Côte d’Ivoire accomplished hosting the 2023 event.

They can look to improve the ticketing system, at the least. Further improving security around stadiums and educating the ball boys would help in protecting visiting teams.

But the on-field disturbances should not take away from this tournament’s numerous accomplishments, on and off the field, or from the quality of the facilities provided.


Chuka Onwumechili does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Afcon drama: what went wrong and what went right at the continent’s biggest football cup in Morocco – https://theconversation.com/afcon-drama-what-went-wrong-and-what-went-right-at-the-continents-biggest-football-cup-in-morocco-273819

Can shoes alter your mind? What neuroscience says about foot sensation and focus

Source: The Conversation – USA – By Atom Sarkar, Professor of Neurosurgery, Drexel University

Your shoes might not necessarily free your mind.

Athletic footwear has entered a new era of ambition. No longer content to promise just comfort or performance, Nike claims its shoes can activate the brain, heighten sensory awareness and even improve concentration by stimulating the bottom of your feet.

“By studying perception, attention and sensory feedback, we’re tapping into the brain-body connection in new ways,” said Nike’s chief science officer, Matthew Nurse, in the company’s press release for the shoes. “It’s not just about running faster — it’s about feeling more present, focused and resilient.”

Other brands like Naboso sell “neuro-insoles,” socks and other sensory-based footwear to stimulate the nervous system.

It’s a compelling idea: The feet are rich in sensory receptors, so could stimulating them really sharpen the mind?

As a neurosurgeon who studies the brain, I’ve found that neuroscience suggests the reality is more complicated – and far less dramatic – than the marketing implies.

Close links between feet and brain

The soles of the feet contain thousands of mechanoreceptors that detect pressure, vibration, texture and movement.

Signals from these receptors travel through peripheral nerves to the spinal cord and up to an area of the brain called the somatosensory cortex, which maintains a map of the body. The feet occupy a meaningful portion of this map, reflecting their importance in balance, posture and movement.

Footwear also affects proprioception – the brain’s sense of where the body is in space – which relies on input from muscles, joints and tendons. Because posture and movement are tightly linked to attention and arousal, changes in sensory feedback from the feet can influence how stable, alert or grounded a person feels.

This is why neurologists and physical therapists pay close attention to footwear in patients with balance disorders, neuropathy or gait problems. Changing sensory input can alter how people move.

But influencing movement is not the same thing as enhancing cognition.

Proprioception is the sense of where your body is in space.

Minimalist shoes and sensory awareness

Minimalist shoes, with thinner soles and greater flexibility, allow more information about touch and body position to reach the brain compared with heavily cushioned footwear. In laboratory studies, reduced cushioning can increase a wearer’s awareness of where their foot is placed and when it’s touching the ground, sometimes improving their balance or the steadiness of their gait.

However, more sensation is not automatically better. The brain constantly filters sensory input, prioritizing what is useful and suppressing what is distracting. For people unaccustomed to minimalist shoes, the sudden increase in sensory feedback may increase cognitive load – drawing attention toward the feet rather than freeing mental resources for focus or performance.

Sensory stimulation can heighten awareness, but there is a threshold beyond which it becomes noise.

Can shoes improve concentration?

The question of whether sensory footwear can improve concentration is where neuroscience becomes especially skeptical.

Sensory input from the feet activates somatosensory regions of the brain. But brain activation alone does not equal cognitive enhancement. Focus, attention and executive function depend on distributed networks involving various other areas of the brain, such as the prefrontal cortex, the parietal lobe and the thalamus. They also rely on neurotransmitters that modulate the nervous system, such as dopamine and norepinephrine.

There is little evidence that passive underfoot stimulation – textured soles, novel foam geometries or subtle mechanical features – meaningfully improves concentration in healthy adults. Some studies suggest that mild sensory input may increase alertness in specific populations – such as older adults training to improve their balance or people in rehabilitation for sensory loss – but these effects are modest and highly dependent on context.

Put simply, feeling more sensory input does not mean the brain’s attention systems are working better.

Blurred shot of the legs and shoes of three people running
How you move in your shoes might matter more for your cognition than the shoes themselves.
Elena Popova/Moment via Getty Images

Belief, expectation and embodied experience

While shoes may not directly affect your cognition, that does not mean the mental effects people report are imaginary.

Belief and expectation still play a powerful role in medicine. Placebo effects and their influence on perception, motivation and performance are well documented in neuroscience. If someone believes a shoe improves focus or performance, that belief alone can change perception and behavior – sometimes enough to produce measurable effects.

There is also growing interest in embodied cognition, the idea that bodily states influence mental processes. Posture, movement and physical stability can shape mood, confidence and perceived mental clarity. Footwear that alters how someone stands or moves may indirectly influence how focused they feel, even if it does not directly enhance cognition.

In the end, believing a product gives you an advantage may be the most powerful effect it has.

Where science and marketing diverge

The problem is not whether footwear influences the nervous system – it does – but the imprecision of the marketing. When companies claim their shoes are “mind-altering,” they often blur the distinction between sensory modulation and cognitive enhancement.

Neuroscience supports the idea that shoes can change sensory input, posture and movement. It does not support claims that footwear can reliably improve concentration or attention for the general population. If shoes truly produced strong cognitive changes, those effects would be robust, measurable and reproducible. So far, they are not.

Shoes can change how you feel in your body, how you move through space and how aware you are of your physical environment. Those changes may influence confidence, comfort and perception – all of which matter to experience.

But the most meaningful “mind-altering” effects a person can experience through physical fitness still come from sustained movement, training, sleep and attention – not from sensation alone. Footwear may shape how the journey feels, but it is unlikely to rewire the destination.

The Conversation

Atom Sarkar does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Can shoes alter your mind? What neuroscience says about foot sensation and focus – https://theconversation.com/can-shoes-alter-your-mind-what-neuroscience-says-about-foot-sensation-and-focus-273759

NASA’s Artemis II crewed mission to the Moon shows how US space strategy has changed since Apollo – and contrasts with China’s closed program

Source: The Conversation – USA – By Michelle L.D. Hanlon, Professor of Air and Space Law, University of Mississippi

As part of the Artemis II mission, humans will fly around the Moon for the first time in decades. Roberto Moiola/Sysaworld via Getty Images

When Apollo 13 looped around the Moon in April 1970, more than 40 million people around the world watched the United States recover from a potential catastrophe. An oxygen tank explosion turned a planned landing into an urgent exercise in problem-solving, and the three astronauts on board used the Moon’s gravity to sling themselves safely home. It was a moment of extraordinary human drama, and a revealing geopolitical one.

The Cold War space race was a two-player contest. The Soviet Union and the United States operated in parallel, rarely cooperating, but clearly measuring themselves against one another. By 1970, the United States had already landed on the Moon, and competition centered on demonstrating technological capability, political and economic superiority and national prestige. As Apollo 13 showed, even missions that did not go as planned could reinforce a country’s leadership if they were managed effectively.

More than half a century later, NASA’s Artemis II mission will send humans around the Moon again in early 2026, this time deliberately. But the strategy going into Artemis II looks very different from that of 1970. The United States is no longer competing against a single rival in a largely symbolic race.

An artist's impression of a spacecraft flying over the surface of the Moon.
The crew will make a single flyby of the Moon in an Orion capsule, shown in this illustration.
NASA, CC BY-NC

As a professor of air and space law, I research questions of governance and conflict avoidance beyond Earth. From a space law perspective, sustained human activity on the Moon and beyond depends on shared expectations about safety and responsible behavior. In practice, the countries that show up, operate repeatedly and demonstrate how activity on the lunar surface and in outer space can be carried out over time shape these expectations.

Artemis II matters, but not as nostalgia or merely as a technical test flight. It is a strategic signal that the United States intends to compete in a different kind of Moon race, one defined less by singular achievements and more by sustained presence, partnerships and the ability to shape how activity on the Moon is conducted.

From a 2-player race to a crowded field

Today, more countries are competing to land on the Moon than ever before, with China emerging as a pacing competitor. While national prestige remains a factor, the stakes now extend well beyond flags and firsts.

Governments remain central actors in the race to the Moon, but they no longer operate alone. Commercial companies design and operate spacecraft, and international partnerships shape missions from the start.

China, in particular, has developed a lunar program that is deliberate, well-resourced and focused on establishing a long-term presence, including plans for a research station. Its robotic missions have landed on the Moon’s far side and returned samples to Earth, and Beijing has announced plans for a crewed landing by 2030. Together, these steps reflect a program built on incremental capability rather than symbolic milestones.

Why Artemis II matters without landing

Artemis II, scheduled to launch in February 2026, will not land on the Moon. Its four-person crew will loop around the Moon’s far side, test life-support and navigation systems, and return to Earth. This mission may appear modest. Strategically, however, crewed missions carry a different weight than robotic missions.

A diagram showing the trajectory of Artemis II and major milestones, from jettisoning its rocket boosters to the crew capsule's separation.
Artemis II’s four-person crew will circle around the Earth and the Moon.
NASA

Sending people beyond low Earth orbit requires sustained political commitment to spaceflight, funding stability and systems reliable enough that sovereign and commercial partners can align their own plans around them.

Artemis II also serves as a bridge to Artemis III, the mission where NASA plans to land astronauts near the Moon’s south pole, currently targeted for 2028. A credible, near-term human return signals that the U.S. is moving beyond experimentation and toward a sustained presence.

The Artemis II mission, detailed from launch to splashdown.

2 different models for going back to the Moon

The contrast between U.S. and Chinese lunar strategies is increasingly clear.

China’s program is centrally directed and tightly controlled by the state. Its partnerships are selective, and it has released few details about how activities on the Moon would be coordinated with other countries or commercial actors.

The U.S. approach, by contrast, is intentionally open. The Artemis program is designed so partners, both other countries and companies, can operate within a shared framework for exploration, resource use and surface activity.

This openness reflects a strategic choice. Coalitions among countries and companies expand their capabilities and shape expectations about how activities such as landing, operating surface equipment and using local resources are conducted.

When vague rules start to matter

International space law already contains a framework relevant to this emerging competition. Article IX of the 1967 Outer Space Treaty requires countries to conduct their activities with “due regard” for the interests of others and to avoid harmful interference. In simple terms, this means countries are expected to avoid actions that would disrupt or impede the activities of others.

For decades, this obligation remained largely theoretical. On Earth, however, similarly open-ended rules, particularly in maritime contexts, created international conflicts as shipping traffic, resource extraction and military activity increased. Disputes intensified as some states asserted claims that extended beyond what international law recognized.

The Moon is now approaching a comparable phase.

As more actors converge on resource-rich regions, particularly near the lunar south pole, due regard becomes an immediate operational question rather than a theoretical future issue. How it is interpreted – whether it means simply staying out of each other’s way or actively coordinating activities – will shape who can operate where, and under what conditions.

Washington is naming the race – without panic

During his second Senate Commerce Committee confirmation hearing, NASA Administrator Jared Isaacman was asked directly about competition with China in lunar exploration. He emphasized the importance of keeping U.S. space efforts on track over time, linking the success of the Artemis program to long-term American leadership in space.

A similar perspective appears in a recent U.S. government assessment, the U.S.-China Economic and Security Review Commission’s 2025 annual report to Congress. Chapter 7 addresses space as a domain of strategic competition, highlighting China’s growing capabilities. The report frames human spaceflight and deep-space infrastructure – including spacecraft, lunar bases and supporting technologies – as part of broader strategic efforts. It emphasizes growing a human space program over time, rather than changing course in response to individual setbacks or the accomplishments of other countries.

Three people sitting at a panel table and one speaking at a podium with the NASA logo. Projected behind them is a slide reading Artemis Accords, with the flags of several countries.
The U.S. approach to spaceflight is emphasizing international cooperation.
Joel Kowsky/NASA via Getty Images

Recent U.S. policy reflects this emphasis on continuity. A new executive order affirms federal support for sustained lunar operations, as well as commercial participation and coordination across agencies. Rather than treating the Moon as a short-term challenge, the order anticipates long-term activity where clear rules, partnerships and predictability matter.

Artemis II aligns with this posture as one step in the U.S.’s plans for sustained activity on the Moon.

A different kind of test

As Artemis II heads toward the Moon, China will also continue to advance its lunar ambitions, and competition will shape the pace and manner of activity around the Moon. But competition alone does not determine leadership. In my view, leadership emerges when a country demonstrates that its approach reduces uncertainty, supports cooperation and translates ambition into a set of stable operating practices.

Artemis II will not settle the future of the Moon. It does, however, illustrate the American model of space activity built on coalitions, transparency and shared expectations. If sustained, that model could influence how the next era of lunar, and eventually Martian, exploration unfolds.

The Conversation

Michelle L.D. Hanlon is affiliated with For All Moonkind, Inc. a non-profit organization focused on protecting human cultural heritage in outer space.

ref. NASA’s Artemis II crewed mission to the Moon shows how US space strategy has changed since Apollo – and contrasts with China’s closed program – https://theconversation.com/nasas-artemis-ii-crewed-mission-to-the-moon-shows-how-us-space-strategy-has-changed-since-apollo-and-contrasts-with-chinas-closed-program-270245

Repeated government lying, warned Hannah Arendt, makes it impossible for citizens to think and to judge

Source: The Conversation – USA – By Stephanie A. (Sam) Martin, Frank and Bethine Church Endowed Chair of Public Affairs, Boise State University

Despite evidence to the contrary, Homeland Security Secretary Kristi Noem said at a Jan. 24, 2026, news conference that Alex Pretti ‘came with a weapon … and attacked’ officers, who took action to ‘defend their lives.’ AP Photo/Julia Demaree Nikhinson

In Minneapolis, two recent fatal encounters with federal immigration agents have produced not only grief and anger, but an unusually clear fight over what is real.

In the aftermath of Alex Pretti’s killing on Jan. 24, 2026, federal officials claimed the Border Patrol officers, who fired their weapons at least 10 times, acted in self-defense.

But independent media analyses showed the victim holding a phone, not a gun, throughout the confrontation. Conflicting reports about the earlier death of Renée Good have similarly intensified calls for independent review and transparency. Minnesota state and local officials have described clashes with federal agencies over access to evidence and investigative authority.

That pattern matters because in fast-moving crises, early official statements often become the scaffolding on which public judgment is built. Sometimes those statements turn out to be accurate. But sometimes they do not.

When the public repeatedly experiences the same sequence – confident claims, partial disclosures, shifting explanations, delayed evidence, lies – the damage can outlast any single incident.

It teaches people that “the facts” are simply one more instrument of power, distributed strategically. And once that lesson sinks in, even truthful statements arrive under suspicion.

And when government stories keep changing, democracy pays the price.

CNN’s Jake Tapper goes through key excerpts from a judge’s ruling which found that Border Patrol official Greg Bovino lied “multiple times” about events surrounding his deployment of tear gas in a Chicago neighborhood.

Lying in politics

This is not a novel problem. During the U.S. Civil War, for example, President Abraham Lincoln handled hostile press coverage with a blunt mix of repression and restraint. His administration shut down hundreds of newspapers, arrested editors and censored telegraph lines, even as Lincoln himself often absorbed vicious, personal ridicule.

The Iran-Contra scandal in the 1980s brought similar disingenuous attempts by the Reagan administration to manage public perception, as did misleading presidential claims about weapons of mass destruction in the 2003 leadup to the Iraq War.

During the Vietnam era, the gap between what officials said in public and what they knew in private was especially stark.

Both the Johnson and Nixon administrations repeatedly insisted the war was turning a corner and that victory was near. However, internal assessments described a grinding stalemate.

Those contradictions came to light in 1971 when The New York Times and The Washington Post published the Pentagon Papers, a classified Defense Department history of U.S. decision-making in Vietnam. The Nixon administration fiercely opposed the document’s public release.

Several months later, political philosopher Hannah Arendt published an essay in the New York Review of Books called “Lying in Politics.” It was also reprinted in a collection of essays titled “Crises of the Republic.”

Arendt, a Jewish refugee who fled Germany in 1933 to escape Nazi persecution and the very real risk of deportation to a concentration camp, argued that when governments try to control reality rather than report it, the public stops believing and becomes cynical. People “lose their bearings in the world,” she wrote.

‘Nobody believes anything any longer’

Arendt first articulated this argument in 1951 with the publication of “The Origins of Totalitarianism,” in which she examined Nazism and Stalinism. She further refined it in her reporting for The New Yorker on the 1961 trial of Adolf Eichmann, a major coordinator of the Holocaust.

Arendt did not wonder why officials lie. Instead, she worried about what happens to a public when political life trains citizens to stop insisting on a shared, factual world.

Arendt saw the Pentagon Papers as more than a Vietnam story. They were evidence of a broader shift toward what she called “image-making” – a style of governance in which managing the audience becomes at least as important as following the law. When politics becomes performance, the factual record is not a constraint. It is a prop that can be manipulated.

The greatest danger of organized, official lying, Arendt warned, is not that people will believe something that is false. It is that repeated, strategic distortions make it impossible for citizens to orient themselves in reality.

“The result of a consistent and total substitution of lies for factual truth is not that the lie will now be accepted as truth and truth be defamed as a lie,” she wrote, “but that the sense by which we take our bearings in the real world … [gets] destroyed.”

She sharpened the point further in a line that feels especially poignant in today’s fragmented, rapid and adversarial information environment:

“If everybody always lies to you, the consequence is not that you believe the lies, but rather that nobody believes anything any longer,” she wrote. “A lying government has constantly to rewrite its own history … depending on how the political wind blows. And a people that no longer can believe anything cannot make up its mind. It is deprived not only of its capacity to act but also of its capacity to think and to judge.”

When officials lie time and again, the point isn’t that a single lie becomes accepted truth, but that the story keeps shifting until people don’t know what to trust. And when this happens, citizens cannot deliberate, approve or dissent coherently, because a shared world no longer exists.

A gray-haired woman with a cigarette, looking thoughtful.
Political theorist Hannah Arendt in 1963.
Bettman/Getty Images

Maintaining legitimacy

Arendt helps clarify what Minneapolis is showing us, and why the current federal government posture matters beyond one city.

Immigration raids are high-conflict operations by design. They happen quickly, often without public visibility, and they ask targeted communities to accept a heavy federal presence as legitimate. When killings occur in that context, truth and transparency are essential. They protect the government’s legitimacy with the public.

Reporting on the Pretti case shows why. Even as federal government leaders issued definitive claims about the victim’s allegedly threatening behavior – they said Pretti approached agents while pointing a gun – video evidence contradicted that official account.

The point isn’t that every disputed detail in a fast-moving, complicated event causes public harm. It’s that when officials make claims that appear plainly inconsistent with readily available evidence – as in the initial accounts of what happened with Pretti – that mismatch is itself damaging to public trust.

Distorted declarations paired with delayed disclosure, selective evidence or interagency resistance to outside investigations nudge the public toward a conclusion that official accounts are a strategy for controlling the story, and not a description of reality.

Truth is a public good

Politics is not a seminar in absolute clarity, and competing claims are always part of the process. Democracies can survive spin, public relations and even occasional falsehoods.

But Arendt’s observations show that it is the normalization of blatant dishonesty and systematic withholding that threatens democracy. Those practices corrode the factual ground on which democratic consent is built.

The U.S. Constitution assumes a people capable of what Arendt called judgment – citizens who can weigh evidence, assign responsibility and act through law and politics.

If people are taught that “truth” is always contingent and always tactical, the harm goes beyond misinformation. A confused, distrustful public is easier to manage and harder to mobilize into meaningful democratic participation. It becomes less able to act, because action requires a shared world in which decisions can be understood, debated and contested.

The Minneapolis shootings are not only an argument about use of force. They are a test of whether public institutions will treat facts and truth as a public good – something owed to the community precisely when tensions are highest. If democratic life depends on a social contract among the governed and those governing, that contract cannot be sustained on shifting sand. It requires enough shared reality to support disagreement.

When officials reshape the facts, the damage isn’t only to the record. The damage is to the basic belief that a democratic public can know what its government has done.

The Conversation

Stephanie A. (Sam) Martin does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Repeated government lying, warned Hannah Arendt, makes it impossible for citizens to think and to judge – https://theconversation.com/repeated-government-lying-warned-hannah-arendt-makes-it-impossible-for-citizens-to-think-and-to-judge-274340

Colorado ski resorts got some welcome snowfall from Winter Storm Fern, but not enough to turn a dry and warm winter around

Source: The Conversation – USA (2) – By Steven R. Fassnacht, Professor of Snow Hydrology, Colorado State University

Colorado ski resorts faced sparse snow conditions in early 2025. Hyoung Chang/Getty Images

Winter Storm Fern brought Colorado’s mountain towns a bit of what they’ve spent weeks hoping for.

It snowed 23 inches (58 centimeters) at the Crested Butte ski resort over the weekend of Jan. 24-25, 2026. Aspen Snowmass got 13 inches (33 cm).

It was a welcome change in Colorado, where the ski season is off to a slow start. By Thanksgiving 2025, Colorado had only about 45% of its average snowpack for that time of year. Thanksgiving weekend is when many western ski resorts, such as Steamboat and Vail, typically open for the season.

By January 2026, the snowpack had increased only slightly to 57% of average. About half of the runs were open at central Colorado resorts in late January.

Two people in winter clothes sit on a ski lift with a ski resort with sparse snow visible in the background.
Colorado’s ski season started out dry, with less than half the average snowpack in November.
Hyoung Chang/Getty Images

On top of a dry fall, Colorado has been unseasonably warm. Denver’s average air temperature for December 2025 was 11 degrees Fahrenheit (6 degrees Celsius) warmer than normal.

This phenomenon is not entirely new. Over the past four decades, Colorado has seen a decline in November snowfall, which is a problem for developing the snowpack base for ski runs. There has also been a decline in March snowpack, which can reduce skier numbers during spring break, when many families and college-age skiers typically flock to the mountains.

In spite of the welcome snowfall, the forecast for the rest of the season, which runs through April, doesn’t look good. It’s expected to continue to be warmer than average across the Colorado mountains.

The warm temperatures and lack of snow in Colorado are a problem for skiers and ski resorts. This has translated into a growing economic impact on the state’s mountain communities.

We are a snow hydrologist and a historian of the ski industry. We’re concerned about how this year’s continuing low snowfall will affect Colorado’s US$5 billion ski industry, the state’s environment and water resources across the western U.S.

Creating snow

The ski industry relies on natural snow falling from the sky. Natural snowfall can be supplemented by resorts making their own snow, often in a race to be the first ski area to open. This year, Keystone opened first after beginning to make snow before Halloween.

Ski resorts use snow guns, which blow fine water particles with high-pressure air so that they freeze and fall as snow. But to make snow, the wet-bulb temperature needs to be colder than 28 F (minus 2 C). Wet-bulb temperature combines air temperature, humidity and air pressure.
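That threshold can be sketched in a few lines of Python. This example uses Stull’s (2011) empirical approximation, which estimates wet-bulb temperature from air temperature and relative humidity at sea-level pressure; the function names and the sea-level assumption are ours for illustration, not from the article, and real snowmaking controllers also factor in air pressure at altitude.

```python
import math

def wet_bulb_c(temp_c: float, rh_percent: float) -> float:
    """Approximate wet-bulb temperature (deg C) from air temperature
    and relative humidity, using Stull's (2011) empirical formula.
    Assumes standard sea-level pressure; illustrative only."""
    t, rh = temp_c, rh_percent
    return (t * math.atan(0.151977 * math.sqrt(rh + 8.313659))
            + math.atan(t + rh)
            - math.atan(rh - 1.676331)
            + 0.00391838 * rh ** 1.5 * math.atan(0.023101 * rh)
            - 4.686035)

def can_make_snow(temp_c: float, rh_percent: float,
                  threshold_c: float = -2.0) -> bool:
    """Snow guns work when the wet-bulb temperature is below the
    threshold of about 28 F (minus 2 C) cited in the article."""
    return wet_bulb_c(temp_c, rh_percent) < threshold_c

# Dry air lets resorts make snow even with the thermometer at freezing:
print(round(wet_bulb_c(0.0, 50.0), 1), can_make_snow(0.0, 50.0))   # -3.5 True
# Humid air at the same air temperature does not:
print(round(wet_bulb_c(0.0, 95.0), 1), can_make_snow(0.0, 95.0))   # -0.5 False
```

The example shows why snowmaking depends on humidity and not just the thermometer: at 32 F (0 C), dry air yields a wet-bulb temperature cold enough for the guns to run, while humid air at the same reading does not.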

A man in snow gear stands behind a machine blasting a stream of snow into the air.
Snowmaking in Colorado can help ski resorts to start the season, but they still need snow to fall from the sky.
Boulder Daily Camera/via Getty Images

Snowmaking covers only a fraction of any ski resort in Colorado. Even the resorts with the most extensive snowmaking, such as Keystone, can cover only 40% of their runs with machine-made snow. Other resorts can cover less than 10%. Snowmaking in Colorado provides the base for the ski season to start, but it can’t replace a season with no snow.

By comparison, ski areas in other states, such as Arizona’s Snowbowl, rely on snowmaking throughout the winter. Snowmaking can cover most of a ski resort in the eastern U.S., where resorts tend to be smaller than ones in the West.

Snowmaking has environmental costs. On average, snowmaking accounts for 67% of a ski resort’s electricity costs and consumes billions of gallons of water.

Snow is made using water taken from streams during low-flow conditions, or times when less water is available. The water is essentially stored on the ski slope, with about 80% flowing back into the streams when it melts.

Due to water rights legislation unique to the western U.S., a ski resort cannot easily use more water to make more snow without going through an extensive and often expensive legal process.

Regardless of how much snow can be made, Colorado’s ski areas mostly rely on natural snowfall.

History of snowmaking

This year is far from the first time Colorado’s ski industry has struggled with a lack of snow. Many areas in the state did not see snow during the winter of 1976-77 until after the December holidays. The lack of snowfall caused skier numbers to drop a staggering 38% from the previous season.

That dry season convinced the ski industry to take matters into its own hands. In the summer of 1977, Winter Park Ski Area made a $1.2 million investment in snowmaking, which saved the following season. Other larger ski resorts followed suit and invested heavily in the technology over the following five years.

A black and white photo shows a machine to make snow in the foreground and skiers in the background on a mountain.
A November 1981 archival photo from The Denver Post shows the Loveland ski resort west of Denver after it opened using mechanically made snow.
Denver Post/via Getty Images

Over the past decade, Vail Resorts, owner of 42 ski resorts worldwide, has invested more than $100 million in snowmaking to compensate for marginal snow years.

These investments reflect a broader rivalry within the industry as resorts compete for a limited number of skiers. By the end of the 1990s, snowmaking was understood as essential for ski resorts across the country.

What’s next

Low snow years are not just a problem for skiing. Colorado has a semi-arid climate, so water stored in the snowpack is a crucial resource. Up to 80% of Colorado’s water supply comes from snow, so below-average snowfall usually means a drought during the summer months that follow. Dry winters also lead to more wildfires.

Snowpack also affects summer tourism activities in Colorado that rely on water from snowmelt, such as whitewater rafting, whitewater parks, fishing and related river activities.

A person fishing stands in a shallow river surrounded by brown and gold foliage.
Snowpack has a direct impact on Colorado’s summer tourism, including fishing.
UCG/Getty Images

Coloradans may hope for snow to change the course of this winter. It’s happened before, as with the March 2003 Colorado blizzard, when 3 feet (1 meter) of snow fell in a day, or the winter of 2010-11, which started out drier than average and went on to become the wettest year on record.

The ski industry has tried to insulate itself from bad snow years through season pass sales and diversifying entertainment options. Vail Resorts and Alterra Mountain Company require skiers and snowboarders to buy their Epic and Ikon season passes by October or spend upward of $300 for a day lift ticket. But the high costs of skiing are making the sport more exclusive.

Numerous ski resorts, such as Winter Park, have invested heavily in summer activities such as mountain biking and music festivals to increase revenue. But you can bet your mittens that people in Colorado’s mountain towns are hoping for another big dump of natural snow – and as soon as possible.

The Conversation

The authors do not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.

ref. Colorado ski resorts got some welcome snowfall from Winter Storm Fern, but not enough to turn a dry and warm winter around – https://theconversation.com/colorado-ski-resorts-got-some-welcome-snowfall-from-winter-storm-fern-but-not-enough-to-turn-a-dry-and-warm-winter-around-272008

How fire, people and history shaped the South’s iconic longleaf pine forests

Source: The Conversation – USA (2) – By Andrea De Stefano, Assistant Professor of Forestry, Mississippi State University

A land manager examines young longleaf pines, some in their grassy phase, in a private forest in South Carolina. AP Photo/James Pollard

For thousands of years, one tree species defined the cultural and ecological identity of what is now the American South: the longleaf pine. The forest once stretched across 92 million acres from Virginia to Texas, but about 5% of that original forest remains. It was one of North America’s richest ecosystems, and it nearly disappeared.

As part of my job with the Mississippi State University forestry extension, I help private landowners, public agencies and nonprofit conservation groups restore these ecosystems. The forests’ story begins before European settlement, when Native peoples shaped and sustained this vast landscape using one of nature’s oldest tools: fire.

Longleaf pine trees depend on fire for survival and regeneration. Fire reduces competition from other plants, recycles nutrients into the soil and maintains the open structure of the landscape where longleaf pines grow best. In its open, grassy woodlands, red-cockaded woodpeckers, gopher tortoises, orchids, pitcher plants and hundreds of other species find homes.

A map of the southeastern United States shows the historical longleaf pine forest range in yellow and National Forests in green.
Historically, the longleaf pine forest had a vast range.
Andrea De Stefano, CC BY

Native stewardship

Longleaf pine seedlings spend about three to 10 years in a low, grasslike stage, building deep roots and resisting flames that sweep across the forest floor. Regular, low-intensity fires keep the ground open and sunny, and allow an incredibly diverse understory to flourish: pine lilies, meadow beauties, white bog orchids, carnivorous pitcher plants and dozens of native grasses.

For millennia, Native American tribes intentionally set fires to keep these areas open for hunting, travel and agriculture. This practice is evident from Indigenous oral histories, early European accounts and archaeological findings. Fire was part of daily life – a tool, not a danger.

People stand in a spacious open grove of trees.
A postcard from the early 20th century shows people standing next to longleaf pine trees in Mississippi.
Mississippi Department of Archives and History via Wikimedia Commons

European settlers arrive

When the first Europeans made it to that part of North America, they encountered a landscape that seemed almost limitless: tall, straight pines ideal for shipbuilding; deep soils in the uplands suited for farming; and understory, the plants that grow in the shade of the forest, perfect for open-range grazing.

Longleaf pine trees became the backbone of early industries. They provided lumber, fuel and naval supplies, such as tar, pitch and turpentine, which were essential for waterproofing wooden ships. By the mid-1800s, the naval industry alone consumed millions of longleaf pines each year, especially in the Carolinas, Georgia and Florida.

At the same time, livestock, especially hogs, roamed freely and caused unexpected ecological damage. Hogs rooted up and ate the starchy roots of young longleaf seedlings, often wiping out an area’s entire year of seedlings before they could grow beyond the grass stage.

Still, even into the mid-1800s, millions of acres of longleaf forest remained intact. That would soon change.

People, equipment and machines stand amid tall trees.
Workers build a logging railroad through a longleaf pine forest in Texas in 1902.
Corbis Historical via Getty Images

Industrial logging and the collapse of a forest

By the late 19th century, the industrial South entered a new era of logging. Railroads could reach deep into forests that were previously inaccessible. Steam-powered skidders dragged huge logs to mobile mills that could turn thousands of acres of trees into lumber in a single season. Lumber towns appeared overnight, then disappeared once the last trees were cut.

Most longleaf forests were felled between 1880 and 1930, with little thought given to regrowth. Land was cheap, timber was valuable, and scientific forestry was in its infancy. After logging, what was left on the ground at many sites burned in wildfires too hot for young longleaf pines to survive. Some of the fires were ignited accidentally by sparks from railroads or logging operations, others by lightning, and some by people attempting to clear land.

Other parcels of land were overrun by hogs or converted to farms. Still other forestland simply failed to regenerate, because longleaf requires both good seed years and carefully timed burning to establish new generations of seedlings. By 1930, the once-vast longleaf forest was effectively gone.

A video shows the process of railroad-enabled logging of longleaf pine forests.

A turning point

The early 20th century brought public debates about fire. National forestry leaders, trained in northern ecosystems where wildfire was destructive, insisted that all fire was harmful and should be quickly extinguished. Southern landowners disagreed. They had long understood that fire kept the woods open, reduced pests and improved forage.

A series of pioneering researchers, including Herbert Stoddard, Austin Cary and others, proved scientifically what Native peoples had practiced for centuries: Prescribed fire is essential for longleaf pine forests.

By the 1930s, prescribed fire began to gain acceptance among Southern landowners and wildlife biologists, and by the 1940s it was recognized by state forestry agencies and the U.S. Forest Service as a legitimate management tool. This shift marked the beginning of a slow recovery of the forest.

Yet, after the logging of old-growth longleaf pine forests ended, foresters faced challenges regenerating the trees. Early planting attempts often failed. The longleaf species grows more slowly than loblolly or slash pine, making it less attractive to industry.

Millions of acres that once supported longleaf pines were converted to fast-growing plantation pines through the mid-20th century. By 1990, only 2.9 million acres of longleaf pine forest remained.

An open grassy area is punctuated by tall trees that are spaced well apart.
A view of a stand of young longleaf pines near Waycross, Ga., in 1936.
Carl Mydans via Library of Congress

A new era of restoration

Beginning in the 1980s, research breakthroughs began to offer the prospect of change. Studies across the Southeast demonstrated that longleaf pine trees could be reliably planted if seedling quality, site preparation and fire timing were carefully managed.

Improved genetics – for instance, choosing those seedlings more likely to grow straight and tall and those more resistant to disease and drought – and starting seedlings in containers increased survival dramatically.

A tree trunk shows black burn marks on its bark.
A longleaf pine tree shows marks from past controlled burns.
AP Photo/Chris Carlson

At the same time, landowners and agencies began to appreciate the benefits of longleaf pines. They are strong enough to withstand hurricanes, resistant to pests and disease, and provide high-quality timber and exceptional wildlife habitat. And they are compatible with grazing, need little to no fertilizer or other support to grow, and are ready to adapt to a warming, more fire-prone climate.

Today, many organizations are restoring longleaf pine trees across national forests, private lands and working farms.

Landowners are choosing the species not only for conservation but for recreation, hunting and cultural reasons.

In many parts of the South, longleaf pines have become a symbol of both heritage and resilience to hurricanes, drought, wildfire and climate change.

The longleaf pine ecosystem is more than a forest: It is the story of how people have shaped landscapes over centuries. It thrived under Native fire stewardship, declined under industrial exploitation, and is now returning – thanks to science, collaboration and cultural rediscovery.

The future of the longleaf pine forest will depend on continued use of prescribed fire, support for and from private landowners and recognition that restoring a complex ecosystem takes time. But across the South, the open, grassy longleaf pine ecosystems are coming back. A forest once given up for lost is becoming, again, a living emblem of the southern landscape.


Andrea De Stefano does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. How fire, people and history shaped the South’s iconic longleaf pine forests – https://theconversation.com/how-fire-people-and-history-shaped-the-souths-iconic-longleaf-pine-forests-272003

Oversalting your sidewalk or driveway harms local streams and potentially even your drinking water – 3 tips to deice responsibly

Source: The Conversation – USA (2) – By Steven Goldsmith, Associate Professor of Environmental Science, Villanova University

Conservation organizations recommend using one 12-ounce coffee mug of deicer for every 10 sidewalk squares. Joe Lamberti/Getty Images News via Getty Images

Snow has returned to the Philadelphia region, and along with it the white residues on streets and sidewalks that result from the over-application of deicers such as sodium chloride, or rock salt, as well as more modern salt alternatives.

As an environmental scientist who studies water pollution, I know that much of the excess salt flows into storm drains and ultimately into area streams and rivers.

For example, a citizen science stream-monitoring campaign led by the Stroud Water Research Center in Chester County, about 40 miles west of Philadelphia, found that chloride concentrations in southeastern Pennsylvania streams remained higher than EPA-recommended levels not only after winter snowfalls but in many cases also during summer months, showing that salt persists in watersheds year-round.

Once there, it can have a profound impact on fish and other aquatic life. This includes a decrease in the abundance of macroinvertebrates, which are small organisms that form the base of many freshwater food webs, and reductions in growth and reproduction in fish.

Increased salt concentrations can also degrade and pollute the local water supply. Working with other researchers at Villanova University, I have measured spikes in sodium levels in Philadelphia region tap water during and immediately after snow melts. These spikes can pose a health risk to people on low-sodium diets.

What local governments can do

In recent years, many state and local governments nationwide have adopted best management practices – such as roadway brining, more efficient salt spreaders and improved storm forecasting – to limit damage from salt to infrastructure, including roads and bridges.

Roadway brining works by applying a salt solution, or brine, that contains about 23% sodium chloride by weight before a storm. Unlike road salt, brine adheres to the pavement and can prevent ice from sticking to the roadway during the storm, potentially reducing the need for subsequent road salt applications.

View from overpass of snow-covered highway with two cars and snowplow
A view of I-676 during the major winter storm in Philadelphia on Jan. 25, 2026. Philly and other local governments pretreat major roads with brine to prevent ice.
Wolfgang Schwan/Anadolu via Getty Images

The environmental benefits of these best practices, when properly administered, are promising. The Maryland State Highway Administration reduced its total salt usage on roadways by almost 50% by using multiple best practices.

The extent to which these strategies continue to reduce the salt burden on roads and, by extension, improve the water quality of streams elsewhere will depend largely on political will and corresponding economic investments.

Yet, roads are not the only source of salt to our streams. Recent studies have suggested that the cumulative amount of salt applied to other impervious surfaces in a watershed, such as parking lots, driveways and sidewalks, can exceed that applied to roads.

For example, one survey of private contractors suggests their application rate can be up to 10 times higher than that of transportation departments.

I do not know of any studies that have been able to determine a household application rate.

How to salt at home

To better understand how individuals or households deice their properties, and what they know about the environmental impacts of deicing, I collaborated with a team of environmental scientists and psychologists at Villanova University and the local conservation-focused nonprofit Lower Merion Conservancy.

In winter 2024-2025, the Lower Merion Conservancy disseminated a survey in a social media campaign that received over 300 responses from residents in southeastern Pennsylvania. We are completing the analysis to determine a household application rate, but some of our initial findings provide a starting point for engaging households on how to limit the environmental impact of deicers.

One key finding is that only 7% of respondents reported being aware of municipal ordinances regarding deicer use on residential sidewalks.

Of those who applied deicers to their property, 55% indicated they were unsure whether they used them in a way that would reduce environmental harm.

About 80% of all respondents indicated interest in learning more about the environmental impacts of road salt.

Based on these survey results, here are several actionable steps that homeowners can take to reduce their deicer use.

1. Check your local municipal ordinance

Most municipalities in the greater Philadelphia area do not require deicer use but instead require clearing a walkable path – in most cases, 3 feet wide – free of snow and ice within a certain time frame after a storm event ends.

For example, the city of Philadelphia requires this be done within six hours, the borough of Narberth within 12 hours and Lower Merion and Haverford townships within 24 hours.

Narberth and Lower Merion specify which abrasives – such as sand, ashes and sawdust – or deicers, like rock salt, can be used if ice persists.

2. Use rock salt and other deicers judiciously

The recommended amount from conservation organizations is one 12-ounce coffee mug of deicer for every 10 sidewalk squares. Keep in mind that “pet-friendly” deicers are not necessarily environmentally friendly. Many of these deicers contain magnesium chloride, which is harmful to plants and aquatic life.
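The mug rule above is simple arithmetic, and it can be sketched in a few lines of code. This is only an illustration of the guideline cited in this article (one 12-ounce mug per 10 sidewalk squares); the constant and function names are invented for the example.

```python
# Conservation-group rule of thumb cited above:
# one 12-ounce mug of deicer treats about 10 sidewalk squares.
OZ_PER_MUG = 12
SQUARES_PER_MUG = 10

def deicer_needed_oz(sidewalk_squares):
    """Ounces of deicer suggested for a given number of sidewalk squares."""
    return OZ_PER_MUG * sidewalk_squares / SQUARES_PER_MUG

# A 25-square stretch of sidewalk calls for about 30 ounces,
# or two and a half mugs.
print(deicer_needed_oz(25))
```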

Deicers coupled with dyes might be a good choice, since the color makes over-application easy to spot. Dyes can also temporarily reduce concrete’s surface reflectivity, thereby increasing its warming effect and enabling melting.

Finally, it’s important to know that many deicers become ineffective at or below certain temperatures. Rock salt (sodium chloride) loses its effectiveness at 15 degrees Fahrenheit (minus 9 Celsius), magnesium chloride at 5 F (minus 15 C) and calcium chloride at minus 20 F (minus 29 C). If temperatures are expected to fall below those numbers, it might make sense to skip the salt.
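The temperature cutoffs above amount to a simple lookup. A minimal sketch, using only the thresholds quoted in this article (the dictionary and function names are invented for illustration):

```python
# Approximate lowest effective pavement temperatures (degrees F) for
# common deicers, from the figures cited in this article.
EFFECTIVE_FLOOR_F = {
    "sodium chloride (rock salt)": 15,
    "magnesium chloride": 5,
    "calcium chloride": -20,
}

def usable_deicers(forecast_f):
    """Return the deicers still effective at the forecast temperature."""
    return [name for name, floor in EFFECTIVE_FLOOR_F.items()
            if forecast_f >= floor]

# At 10 F, rock salt has stopped working, but the other chlorides still melt ice.
print(usable_deicers(10))
```

Below minus 20 F the list comes back empty, which is the article's point: when no deicer can work, skip the salt.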

Person spills bag of teal snow salt into a salt spreader
Colored deicers can make it easier to not spread too much.
Heather Diehl/Getty Images

3. Sweep up after

We have all seen rock salt on sidewalks for days on end, especially when a storm never materializes. If the next storm brings rain, this leftover salt will form a concentrated brine solution that will wash down the nearest storm drain and into a local waterway.

Leftover salt can be swept up and reapplied after the next storm event, saving money and supplies.



Steven Goldsmith receives funding from the National Fish and Wildlife Foundation. He is affiliated with Villanova University.

ref. Oversalting your sidewalk or driveway harms local streams and potentially even your drinking water – 3 tips to deice responsibly – https://theconversation.com/oversalting-your-sidewalk-or-driveway-harms-local-streams-and-potentially-even-your-drinking-water-3-tips-to-deice-responsibly-274353