How Donald Trump’s economic policies, including uncertainty around tariffs, are damaging the US economy

Source: The Conversation – UK – By John Whittaker, Senior Teaching Fellow in Economics, Lancaster University

Donald Trump set a deadline of July 9 2025 for trade deals to be made before he hits some of the world’s biggest economies with his controversial tariffs. It’s impossible to predict what will happen on the day, but it is already clear that his economic policies are damaging American interests.

Just look at the state of US government debt, for example. Currently it stands at US$36 trillion (£26 trillion). And with total economic output (GDP) worth US$29 trillion per year, that debt is around 123% of GDP, the highest it has been since 1946.

Government debts are alarmingly high in other countries too (the UK’s is at 104% of GDP, with France at 116% and China at 113%), but the US is towards the top of the range.

The recently passed budget reconciliation bill (what Trump calls the “big beautiful bill”) is projected to add US$3 trillion to that debt over the next decade. With these sorts of numbers, there is little prospect of putting US debt on a downward track.

In 2024, the US government had to borrow an additional US$1.8 trillion to cover spending not supported by tax revenue (the budget deficit). This is equivalent to 6.2% of GDP, a figure that is officially projected to rise to 7.3% over the next 30 years.

The predictable consequence of this fiscal profligacy and the chaotic tariff programme is the high rate of interest that the US government is having to pay on its borrowing.

For instance, the interest rate on ten-year US government debt (otherwise known as its yield) has risen from 0.5% in mid-2020 to 4.3% now. And as government debt yields rise, so do interest rates on mortgages and corporate borrowing.

The power of the dollar

For decades, the United States has enjoyed a high level of trust in the strength, openness and stability of its economy.

As a result, US bonds or “treasuries”, the financial assets that the government sells to raise money for public spending, have long been considered safe investments by financial institutions around the world. And the US dollar has been the dominant currency for international payments and debts.

Sometimes referred to as an “exorbitant privilege”, this status of the US dollar as the world’s reserve currency brings big advantages. It benefits US consumers by making imported goods cheaper, albeit contributing to the trade deficits (which arise when a country’s imports are worth more than its exports) that bother the president so much.

It also means the US government can borrow a lot of money before doubts arise about its ability to repay. Investors will generally buy as many bonds as the US government needs to issue to pay for its spending.

The dominance of the dollar in international transactions also brings political power, such as the ability to exclude Russia from major global payment systems.

But this privilege is being eroded by the US president’s tariff agenda. Economic motives aside, it is the way the tariffs are being applied – their size and their unpredictability – that is really sapping investor confidence.

It’s costly to adjust trading patterns and supply chains in response to tariffs. So when the scope of future tariffs is unknown, the rational response is to stop investing while awaiting greater certainty.

The dollar has lost 8% in value since the beginning of the year, reflecting investor doubts about the US economy, and making imports even more expensive.

Financial markets are vulnerable

But perhaps the biggest danger to US financial markets is a sudden rise in yields on government debt. No investor wants to be left holding a bond when its yield rises because – as with all fixed-interest debt – the rise in yield causes the bond’s market value to fall. This is because new bonds are issued with a higher yield, making existing bonds less attractive and less valuable.

A bond holder expecting a rise in yield therefore has an incentive to sell it before the rise occurs. But the rise in yield can become self-reinforcing if the scramble to sell becomes a stampede.
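
To see the scale of this effect, here is a minimal sketch of the standard present-value calculation behind it, using assumed figures rather than anything from the article: a bond’s market price is simply its fixed payments discounted at the prevailing yield.

```python
# Illustrative only: price of a ten-year bond as the present value of its
# fixed payments, discounted at the market yield. All figures are assumptions.

def bond_price(coupon_rate, market_yield, years=10, face=100.0):
    """Sum the discounted annual coupons plus the discounted repayment of face value."""
    coupons = sum(coupon_rate * face / (1 + market_yield) ** t
                  for t in range(1, years + 1))
    principal = face / (1 + market_yield) ** years
    return coupons + principal

print(round(bond_price(0.043, 0.043), 2))  # 100.0 – priced at par when yield equals coupon
print(round(bond_price(0.043, 0.048), 2))  # about 96.1 – the price falls as the yield rises
```

On these assumed numbers, a half-point rise in yields knocks roughly 4% off the market value of a ten-year bond, which is why holders have such a strong incentive to sell before an expected rise.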

Indeed, there was a jump in US yields after the increases in trade tariffs announced on “liberation day” in early April, with the yield on ten-year treasuries rising by 0.5 percentage points in just four days.

Damaged dollar? Dilok Klaisataporn/Shutterstock

Fortunately, this rise was halted on April 10 when the tariffs were abruptly paused, allegedly in response to the fall in bond prices and an accompanying fall in share prices. The opinion of a senior central banker, that financial markets had been close to “meltdown”, was one of several such warnings.

The dollar is unlikely to be quickly dislodged from its pedestal as the world’s reserve currency, as the alternatives are not attractive. The euro is not suitable because it is the currency of 20 EU countries, each with its own separate government debt. Nor is the Chinese yuan a likely contender, given the Chinese government involvement in managing the yuan exchange rate.

But since March, foreign central banks have been selling off US treasuries, often choosing to hold gold instead.

On Trump’s watch, the reputation of the US dollar as the ultimate safe asset has been tarnished, leaving the financial system more vulnerable – and borrowing more expensive.

The Conversation

John Whittaker does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. How Donald Trump’s economic policies, including uncertainty around tariffs, are damaging the US economy – https://theconversation.com/how-donald-trumps-economic-policies-including-uncertainty-around-tariffs-are-damaging-the-us-economy-259809

Agatha Christie’s mid-century ‘manosphere’ reveals a different kind of dysfunctional male

Source: The Conversation – UK – By Gill Plain, Professor of English Literature and Popular Culture, University of St Andrews

This piece contains spoilers for Towards Zero.

Agatha Christie, a middle-class English crime writer who preferred to be known as a housewife, is the world’s bestselling novelist. Since her death in 1976, her work has been translated into over 100 languages and adapted for cinema, TV and even video games.

Her writing is characterised by its cheerful readability and ruthless dissection of hypocrisy, greed and respectability. Christie is fascinated by power and its abuse, and explores this through the skilful deployment of recognisable character types. The suspects in her books are not just there for the puzzle – they also exemplify the attitudes, ideals and assumptions that shaped 20th-century British society.

If we want to know about the mid-century “manosphere”, then, there is no better place to look than in the fiction of Agatha Christie. What did masculinity mean to this writer, and would we recognise it in the gender types and ideals of today? Some answers might be found through the recent BBC adaptation of Towards Zero, which confronts viewers with a range of dysfunctional male types.

Chief among these is Thomas Royde, a neurotic, twitching figure driven to breakdown by the shame of having his word doubted. Gaslit by his pathologically perfect cousin Nevile, Thomas has been dispatched to the colonies, where he has compounded his injuries through financial failure. The adaptation imagines him, broke and broken, returning to the family home with trauma quite literally written on his body.

This is not the Thomas Royde of Christie’s original 1944 novel. That figure was stoic, silent and perfectly capable of managing his failure to live up to the spectacular masculinity of cousin Nevile. Christie’s Thomas may have regretted his romantic losses and physical limitations, but the idea of exposing his pain in public would have horrified him.

This is not a case of repression; rather it speaks to a world in which pain is respected, but simply not discussed. Thomas’s friends, we are told, “had learned to gauge his reactions correctly from the quality of his silences”. The stoical man of few words is a recurrent type within Christie’s fiction. It’s a mode of masculinity of which she approves – even while poking fun at it – and one recognised by her mid-20th century audience.

These are men who embody ideal British middle-class values: steady, reliable, resilient, modest, good humoured and infinitely sensible. They find their fictional reward in happy unions, sometimes with sensible women, sometimes with bright young things who benefit from their calm assurance.

Christie also depicted more dangerous male types – attractive adventurers who might be courageous, or reckless and deadly. These charismatic figures present a troubling mode of masculinity in her fiction, from the effortlessly charming Ralph in The Murder of Roger Ackroyd (1926), to Michael Rogers, the all too persuasive narrator of Endless Night (1967).

Superficially, these two types of men might be mapped onto Christie’s own experiences. Her autobiography suggests that she was irresistibly drawn to something strange and inscrutable in her first husband, Archie. By contrast, her second husband, the archaeologist Max Mallowan, brought friendship and shared interests.

Yet while it’s possible to see biographical resonances in these types, it is equally important to recognise them as part of a middle-class world view that set limits on acceptable masculinities. In my book, Agatha Christie: A Very Short Introduction, I explore these limits, examining a cultural climate riven with contradictions.

A different time

Mid-20th century culture insisted that men be articulate when discussing public matters – science, politics, sport – but those who extended this to the emotions were not to be trusted. They were seen to be glib, foolish or possibly dangerous.

British masculinity acts rather than talks and does a decent job of work. As a result, work itself is a vital dimension of man-making in Christie’s novels, and in the fiction of contemporaries like Nigel Balchin, Hammond Innes and Nevil Shute.

These writers witnessed the conflicting pressures on men, expected to be both soldiers and citizens, capable of combat and domestic breadwinning. They saw the damage caused by war, unemployment and the loss of father figures. But the answer wasn’t talking. Rather, the best medicine for wounded masculinity was the self-respect that comes with doing a good day’s work.

This ideology still resonates within understandings of “healthy” masculinity, but there are limits to the problems that can be solved through a companionable post-work pint. Which brings us back to the BBC’s Towards Zero. Contemporary adaptations often speak to the preoccupations of their moment, and the plot is driven by one man’s all-consuming hatred of his ex-wife.

With apologies for plot spoilers, perfect Nevile turns out to be a perfect misogynist, scheming against the woman who has – to his mind – humiliated him. But the world of his hatred is a long way from the online “manosphere” of our contemporary age.

Quite aside from the technological gulf separating the eras, Christie does not imagine misogyny as an abusive mass phenomenon, a set of echo chambers which figure men as the victims of feminism. Rather, Nevile, like all Christie’s murderers, kills for reasons that can clearly be defined, detected and articulated: he is an isolated madman, not a cultural phenomenon.

Towards Zero’s topicality – its preoccupation with celebrity, resentment of women and a manipulative gaslighting villain – does much to explain its adaptation, but it does not account for the radical revision of Thomas Royde. Is it an indication that stoicism is out of fashion? Or simply a desire to convert Christie’s cool-tempered fictions into melodramas appropriate for a social-media age?

Whatever the thinking, there is a familiar consolation for Thomas’s pain. He might not get the girl of his dreams, but he does get something better: a steady, reliable woman whose modest virtues illustrate that, in Christie’s world, “ideal masculinity” is unexpectedly non-binary. Women can be just as stoic, reserved and resilient as men.

Christie’s “manosphere”, then, has its share of haters, but they are isolated figures forced to disguise their resentments. They also, frequently, meet untimely ends – another reason why Christie remains a bestseller to this day.

The Conversation

Gill Plain does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Agatha Christie’s mid-century ‘manosphere’ reveals a different kind of dysfunctional male – https://theconversation.com/agatha-christies-mid-century-manosphere-reveals-a-different-kind-of-dysfunctional-male-254726

I survived the 7/7 London bombings, but as a British Muslim I still grew up being called a terrorist

Source: The Conversation – UK – By Neema Begum, Assistant Professor in British Politics, University of Nottingham

Twenty years ago, I was walking through central London with my history teacher when a bus exploded behind us. We were in London for an awards ceremony at Westminster where I was to pick up the award for best opposition speaker in the Youth Parliament competition.

We had arrived at Euston station and all local transport had been cancelled. At this point, we heard that there’d been a bomb scare.

We bought a map at the station and set off to walk to Westminster when the number 30 bus exploded on Tavistock Square. It was the loudest sound I’d ever heard. People were running and screaming. We ran too and took shelter in a nearby park.

We later learned that four bombs had been detonated on London’s transport system. The attack, carried out by British-born Islamist extremists on July 7 2005, claimed the lives of 52 people and injured hundreds. My teacher and I were far enough away from the bus to be physically unhurt.

Four years on from the attacks on 9/11, this was a time when, in the minds of many, Muslims were already associated with terrorism. Despite going to a state school where the pupils were predominantly Muslim, we were called terrorists in the playground.

In the aftermath of 7/7, there was no space for Muslim survivors like me. No headlines about our fear, our trauma or our belonging – only suspicion. While I was lucky to walk away physically unscathed, I carried a different kind of wound: being part of a community that was treated with collective blame.

My academic research focuses on ethnic minority voting behaviour, political participation and representation in Britain. The events of 7/7 marked a critical moment in how British Muslims are still viewed as inherently suspect today.

Over the last 25 years, Muslim communities have been viewed as places where terrorism is fostered. Following 7/7, British Muslims were viewed as a security threat by politicians, the media and many non-Muslims.

One stark example was the implementation of the Prevent counter-terrorism programme after 7/7. Prevent has contributed to increased surveillance and marginalisation of Muslim communities in the UK.

Fear of Muslims, and especially of “home-grown terrorists”, has meant that Muslims are made to feel they must condemn terrorist acts. Although an overwhelming majority of Muslims in the UK identify as British and are proud to be British citizens, British Muslims often feel they must prove their “Britishness” and distance themselves from stereotypes of Muslims as terrorists or terrorist sympathisers.

Post-7/7 arguments that British Muslims were at odds with “British values”, and fears that Britain was sleepwalking into segregation, have persisted in politics and the media. Negative portrayals of Islam and Muslims in the media – including stigmatising, offensive and biased news reports – have not helped.

In 2013, a device planted by an extreme right-wing white supremacist exploded outside the mosque I attended as a child. In 2025, hate crimes against Muslims have reached record levels.

Stereotypes of Muslims in politics

Twenty years after the London bombings, there are more Muslim voices in politics and media, and a greater awareness of Islamophobia. The idea that London could have a Muslim mayor, as it does today with Sadiq Khan, may have been unthinkable in the immediate aftermath of 7/7.

But the fear that gripped the country in 2005 never disappeared; it just changed shape. Today it shows up in political attacks and increases in anti-Muslim hate crimes in the context of the war in Gaza. It also shows up in attacks on the religious freedoms of British Muslims – like calls for a burka ban – under the guise of “British values”.

While there are more Muslims in politics at every level, they are not exempt from stereotypes. In my research on ethnic minority local councillors, I’ve found Muslim women councillors were often stereotyped as submissive and oppressed in white council spaces.

A hijab-wearing Muslim woman councillor received comments that she wasn’t “westernised enough” and that she needed to be “more modernised”. Another Muslim woman councillor had a white male journalist remark that she was “very confident” in a way she felt was derisive.

Working against ingrained stereotypes of how a Muslim woman would behave, these councillors often faced a double burden: having to constantly prove their “modernity” and competence while simultaneously navigating accusations of being either too passive or too assertive – never quite fitting the narrow expectations imposed upon them.

The 7/7 memorial in London’s Hyde Park. Chris Dorney/Shutterstock

In research on ethnic minority voting behaviour in the EU referendum, I found that campaign groups for Brexit such as Muslims for Britain drew on “good Muslim” narratives to buttress their claims to Britishness. For example, they have referred to the sacrifices Muslim soldiers made for Britain in the two world wars, to position British Muslims – particularly those with south Asian heritage – as established and loyal members of the nation.

Even as a survivor of terrorism, I – like many British Muslims – am constantly made to prove my distance from it. I have particularly noticed this as a woman of Bangladeshi heritage, sharing a surname with Shamima Begum, who joined Islamic State as a teenager and had her UK citizenship stripped.

Begum is also my mother’s name, my classmates’ name, and shared by many British Bengali women. It belongs to Nadiya Hussain (née Begum), winner of The Great British Bake Off, and Halima Begum, chief executive of Oxfam. Behind every headline are real, complex communities still hoping to be seen beyond the shadow of suspicion.

The Conversation

Neema Begum receives funding from the British Academy.

ref. I survived the 7/7 London bombings, but as a British Muslim I still grew up being called a terrorist – https://theconversation.com/i-survived-the-7-7-london-bombings-but-as-a-british-muslim-i-still-grew-up-being-called-a-terrorist-259316

Ageing isn’t the same everywhere – why inflammation may be a lifestyle problem

Source: The Conversation – UK – By Samuel J. White, Associate Professor & Head of Projects, York St John University

The Orang Asli age differently. Azami Adiputera/Shutterstock.com

For years, scientists have believed that inflammation inevitably increases with age, quietly fuelling diseases like heart disease, dementia and diabetes. But a new study of Indigenous populations challenges that idea and could reshape how we think about ageing itself.

For decades, scientists have identified chronic low-level inflammation – called “inflammaging” – as one of the primary drivers of age-related diseases. Think of it as your body’s immune system stuck in overdrive – constantly fighting battles that don’t exist, gradually wearing down organs and systems.

But inflammaging might not be a universal feature of ageing after all. Instead, it could be a byproduct of how we live in modern society.

The research, published in Nature Aging, compared patterns of inflammation in four very different communities around the world. Two groups were from modern, industrialised societies – older adults living in Italy and Singapore.

The other two were Indigenous communities who live more traditional lifestyles: the Tsimane people of the Bolivian Amazon and the Orang Asli in the forests of Malaysia.

The researchers analysed blood samples from more than 2,800 people, looking at a wide range of inflammatory molecules, known as cytokines. Their goal was to find out whether a pattern seen in earlier studies – where certain signs of inflammation rise with age and are linked to disease – also appears in other parts of the world.

The answer, it turns out, is both yes and no.

Among the Italian and Singaporean participants, the researchers found a fairly consistent inflammaging pattern. As people aged, levels of inflammatory markers in the blood, such as C-reactive protein and tumour necrosis factor, rose together. Higher levels were linked to a greater risk of chronic diseases including kidney disease and heart disease.

But in the Tsimane and Orang Asli populations, the inflammaging pattern was absent. The same inflammatory molecules did not rise consistently with age, and they were not strongly linked to age-related diseases.

In fact, among the Tsimane, who face high rates of infections from parasites and other pathogens, inflammation levels were often elevated. Yet this did not lead to the same rates of chronic diseases that are common in industrialised nations.

Despite high inflammatory markers, the Tsimane experience very low rates of conditions such as heart disease, diabetes and dementia.

Inflammaging may not be universal

These results raise important questions. One possibility is that inflammaging, at least as measured through these blood signals, is not a universal biological feature of ageing. Instead, it may arise in societies marked by high-calorie diets, low physical activity and reduced exposure to infections.

In other words, chronic inflammation linked to ageing and disease might not simply result from an inevitable biological process, but rather from a mismatch between our ancient physiology and the modern environment.

The study suggests that in communities with more traditional lifestyles – where people are more active, eat differently and are exposed to more infections – the immune system may work in a different way. In these groups, higher levels of inflammation might be a normal, healthy response to their environment, rather than a sign that the body is breaking down with age.

Another possibility is that inflammaging may still occur in all humans, but it might appear in different ways that are not captured by measuring inflammatory molecules in the blood. It could be happening at a cellular or tissue level, where it remains invisible to the blood tests used in this research.

Chronic low-level inflammation may be a lifestyle problem. Nattakorn_Maneerat/Shutterstock.com

Why this matters

If these findings are confirmed, they could have significant consequences.

First, they challenge how we diagnose and treat chronic inflammation in ageing. Biomarkers used to define inflammaging in European or Asian populations might not apply in other settings, or even among all groups within industrialised nations.

Second, they suggest that lifestyle interventions aimed at lowering chronic inflammation, such as exercise, changes in diet, or drugs targeting specific inflammatory molecules, might have different effects in different populations. What works for people living in cities might be unnecessary, or even ineffective, in those living traditional lifestyles.

Finally, this research serves as an important reminder that much of our knowledge about human health and ageing comes from studies conducted in wealthy, industrialised nations. Findings from these groups cannot automatically be assumed to apply worldwide.

The researchers are clear: this study is just the beginning. They urge scientists to dig deeper, using new tools that can detect inflammation not just in the blood, but within tissues and cells where the real story of ageing may be unfolding. Just as important, they call for more inclusive research that spans the full range of human experience, not just the wealthy, urbanised corners of the world.

At the very least, this study offers an important lesson. What we thought was a universal truth about the biology of ageing might instead be a local story, shaped by our environment, lifestyle and the way we live.

The Conversation

The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.

ref. Ageing isn’t the same everywhere – why inflammation may be a lifestyle problem – https://theconversation.com/ageing-isnt-the-same-everywhere-why-inflammation-may-be-a-lifestyle-problem-260322

The ‘Mind’ diet is good for cognitive health – here’s what foods you should put on your plate

Source: The Conversation – UK – By Aisling Pigott, Lecturer, Dietetics, Cardiff Metropolitan University

The ‘Mind’ diet is very similar to the Mediterranean diet, but emphasises consuming nutrients that benefit the brain. Svetlana Khutornaia/ Shutterstock

There’s long been evidence that what we eat can affect our risk of dementia, Alzheimer’s disease and cognitive decline as we age. But can any one diet actually keep the brain strong and lower dementia risk? Evidence suggests the so-called “Mind diet” might.

The Mind diet (which stands for the Mediterranean-Dash intervention for neurodegenerative delay) combines the well-established Mediterranean diet with the “Dash” diet (dietary approaches to stop hypertension). However, it also includes some specific dietary modifications based on their benefits to cognitive health.

The Mediterranean diet is based on traditional eating patterns from countries which border the Mediterranean sea, while the Dash diet was developed specifically to help lower blood pressure.

Both emphasise eating plenty of plant-based foods (such as fruits, vegetables, nuts and seeds), low-fat dairy products (such as milk and yoghurts) and lean proteins including fish and chicken. Both diets include very little red and processed meat. The Dash diet, however, places greater emphasis on consuming low-sodium foods, less added sugar and fewer saturated and trans fats to reduce blood pressure.

Both diets are well researched and have been shown to be effective in preventing lifestyle-related diseases, including cardiovascular disease and hypertension. They have also been shown to help protect the brain’s neurons from damage and to benefit cognitive health.

The Mind diet follows many of the core tenets of both diets, but places greater emphasis on consuming foods that contain nutrients which promote brain health and help prevent cognitive decline – such as leafy green vegetables, berries, nuts and olive oil.

Numerous studies have been conducted on the Mind diet, and the evidence for this dietary approach’s brain health benefit is pretty convincing.

For instance, one study asked 906 older adults about their usual diet, giving them a “Mind score” based on the number of foods and nutrients they regularly consumed that are linked with lower dementia risk. People with higher Mind scores showed slower cognitive decline when followed up almost five years later.
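
As a concrete illustration of how a diet-adherence score of this kind can be tallied, here is a minimal sketch. The food groups and one-point-per-group scoring below are simplifying assumptions for demonstration only; the study’s actual Mind score grades a larger set of food groups by how often they are eaten.

```python
# Simplified, illustrative Mind-style scoring: one point for each
# brain-healthy food group eaten regularly, plus one point for each
# "limit" group successfully avoided. Component lists are assumptions.

BRAIN_HEALTHY = {"leafy greens", "berries", "nuts", "olive oil",
                 "whole grains", "fish", "beans"}
LIMIT = {"red meat", "butter", "cheese", "pastries", "fried food"}

def mind_style_score(regular_foods):
    """Score a diet from the set of food groups a person eats regularly."""
    healthy_points = len(BRAIN_HEALTHY & regular_foods)  # healthy groups consumed
    avoided_points = len(LIMIT - regular_foods)          # limit groups avoided
    return healthy_points + avoided_points

# Example: a diet heavy in greens, berries and fish, but including cheese.
print(mind_style_score({"leafy greens", "berries", "fish", "cheese"}))  # 3 + 4 = 7
```

Higher totals indicate closer adherence; researchers can then relate such scores to cognitive change over the follow-up period, as the study above did.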

Another study of 581 participants found that people who had closely followed either the Mind diet or the Mediterranean diet for at least a decade had fewer signs of amyloid plaques in their brain when examined post-mortem. Amyloid plaques are a key hallmark of Alzheimer’s disease. Higher intake of leafy greens appeared to be the most important dietary component.

A systematic review of 13 studies on the Mind diet has also found a positive association between adherence to the diet and cognitive performance and function in older people. One paper included in the review even demonstrated a 53% reduction in Alzheimer’s disease risk in those who adhered to the diet.

The Mind diet encourages eating berries, which contain a plant compound thought to be beneficial for the brain. etorres/Shutterstock

It’s important to note that most of this research is based on observational studies and food frequency questionnaires, which have their limitations due to reliability issues and participant bias. Only one randomised controlled trial was included in the review. It found that women who were randomly assigned to follow the Mind diet rather than a control diet for a short period of time showed a slight improvement in memory and attention.

Research in this field is ongoing, so hopefully we’ll soon have a better understanding of the diet’s benefits – and know exactly why it’s so beneficial.

Mind your diet

UK public health guidance recommends people follow a balanced diet to maintain good overall health. But the Mind diet offers a more targeted approach for those hoping to look after their cognitive health.

While public health guidance encourages people to eat at least five portions of fruit and vegetables daily, the Mind diet would recommend choosing leafy green vegetables (such as spinach and kale) and berries for their cognitive benefits.

Similarly, while UK guidance says to choose unsaturated fats over saturated ones, the Mind diet explicitly recommends that these fats come from olive oil. This is due to the potential neuroprotective effects of the fats found in olive oil.

If you want to protect your cognitive function as you age, here are some other small, simple swaps you can make each day to more closely follow the Mind diet:

  • upgrade your meals by sprinkling nuts and seeds on cereals, salads or yoghurts to increase fibre and healthy fats
  • eat the rainbow of fruit and vegetables, aiming to fill half your plate with these foods
  • remember that canned and frozen fruits and vegetables can be just as nutrient-rich as fresh ones
  • bake or air-fry vegetables and meats instead of frying to reduce fat intake
  • opt for unsaturated fats and oils, such as olive oil, in salads and dressings
  • bulk out meat or meat alternatives with pulses and legumes, such as chickpeas, lentils or beans. These can easily be added to dishes such as spaghetti bolognese, chilli, shepherd’s pie or curry
  • use tinned salmon, mackerel or sardines in salads or as protein sources for meal planning.

These small changes can have a meaningful impact on your overall health – including your brain’s health. With growing evidence linking diet to cognitive function, even little changes to your eating habits may help protect your mind as you age.

The Conversation

Aisling Pigott receives funding from Health and Care Research Wales

Sophie Davies does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. The ‘Mind’ diet is good for cognitive health – here’s what foods you should put on your plate – https://theconversation.com/the-mind-diet-is-good-for-cognitive-health-heres-what-foods-you-should-put-on-your-plate-259106

The NHS ten-year health plan is missing a crucial ingredient: nature

Source: The Conversation – UK – By Andrea Mechelli, Professor of Early Intervention in Mental Health, King’s College London

The UK government has finally unveiled its much-anticipated ten-year plan for improving England’s health. It contains a long-overdue focus on prevention, after years of sidestepping by previous administrations.

The plan rightly recognises that preventing illness before it begins is the most effective way to improve people’s wellbeing. It should have the added benefit of reducing strain on the NHS and easing the nation’s financial burden.

Mental health, too, is given the attention it deserves. Recognised as integral to our overall health, its inclusion couldn’t be more timely. A 2023 international study found that one in two people will experience a mental health condition in their lifetime – a much higher figure than previously estimated.

But one striking omission threatens to undermine the plan’s success: nature. Evidence tells us that it’s one of the most powerful means of supporting physical and mental health. And yet it is not mentioned once in the plan’s 168 pages.

If this plan is about prevention, then nature should be central to it. The science is unequivocal: contact with the natural world supports human health in wide-ranging and profound ways. It lowers stress, improves mood and alleviates symptoms of anxiety.

For children, time in nature can even aid brain development. Nature helps reduce exposure to air pollution, moderates urban heat, and fosters physical activity and social connection.

It can also reduce feelings of loneliness, improve the diversity of our gut microbiota – by exposing us to a wider range of environmental microbes that help train and balance the immune system – and support the immune system by reducing inflammation. All of these play a vital role in protecting against chronic disease.


Then there are the intangible yet no less important benefits. Nature provides a sense of awe and wonder – feelings that help us gain perspective, boost emotional resilience and find deeper meaning in everyday life.

Our own research shows that even small, everyday moments in nature – watching birds from your window, for example, or pausing under a blooming tree on your way to the shop – can significantly boost mental wellbeing.

Consider this: a Danish study found that growing up near green spaces during the first ten years of life reduces the risk of developing mental health problems in adulthood by a staggering 55%. A UK study similarly showed that people living in greener neighbourhoods were 16% less likely to experience depression and 14% less likely to develop anxiety.

And as heatwaves become more frequent and intense – with soaring illness and mortality rates – the cooling effects of trees and parks will become more vital than ever for protecting our health.

Not all green space is equal

But it’s not just access to green space that matters – it’s also the quality of that space.

Green areas rich in biodiversity, with a wide variety of plant life, birds, insects and fungi, provide much greater health benefits than sparse or manicured lawns. Biodiversity builds resilience not just in ecosystems, but in our bodies and minds.

A recent study in The Lancet Planetary Health found that people living in areas with greater bird diversity were significantly less likely to experience depression and anxiety, even after accounting for socioeconomic and demographic factors.

This research underlines a simple but urgent truth: we cannot talk about human health without talking about biodiversity.

To deliver true prevention and resilience, we need a joined-up approach across government: one that aligns health policy with environmental protection, housing, urban design, education and transport. This means rethinking how we plan and build our communities: what kind of housing we develop, how we move around, what we grow and eat and how we live in relationship with the ecosystems that support us.

There are many ways this vision can be put into action. The Neighbourhood Health Service outlined in the ten-year plan could be tied directly to local, community-led efforts such as Southwark’s Right to Grow campaign, which gives residents the right to cultivate unused land. This kind of initiative improves access to fresh food, promotes physical activity, strengthens community bonds and increases green cover – all of which support long-term health.

School curricula could be revised to give children the opportunity to learn not just about nature, but also in nature – developing ecological literacy, emotional resilience and healthier habits for life. Health professionals could be trained to understand and promote the value of time outdoors for managing chronic conditions and supporting recovery. Green social prescribing – already gaining ground across the UK – should be fully integrated into standard care, with robust resourcing and cross-sector support.

Learning from success

Scotland’s Green Health Partnerships show what’s possible. These initiatives bring together sectors including health, environment, education, sport and transport to promote nature-based health solutions – from outdoor learning and physical activity in parks, to conservation volunteering and nature therapy.

They don’t just improve health; they strengthen communities, build climate resilience and create cost-effective, scalable solutions for prevention.

The ten-year plan is a once-in-a-generation opportunity. It could help remove departmental silos and unify national goals across health, climate, inequality and economic recovery, while saving billions in the process. But in its current form, it misses a crucial ingredient.

By failing to recognise the centrality of nature in our health, the government overlooks one of the simplest and most effective ways to build resilience – both human and ecological. Surely it is not beyond a nation of nature lovers to put nature at the heart of our future health?

The Conversation

Andrea Mechelli receives funding from Wellcome Trust.

Giulia Vivaldi, Michael Smythe, and Nick Bridge do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.

ref. The NHS ten-year health plan is missing a crucial ingredient: nature – https://theconversation.com/the-nhs-ten-year-health-plan-is-missing-a-crucial-ingredient-nature-260508

Fun with fossils: South African kids learn a whole lot more about human evolution from museum workshops

Source: The Conversation – Africa – By Shaw Badenhorst, Associate Professor in Zooarchaeology, University of the Witwatersrand

‘Find the fossil sites’ interactive display, Maropeng exhibition, Cradle of Humankind. flowcomm, CC BY

South Africa has one of the world’s richest fossil records of hominins (humans and their fossil ancestors). But many misconceptions still exist regarding human evolution, and school textbooks contain inaccuracies.

South Africans still have some of the lowest rates of acceptance of human evolution, mostly due to conflicting religious views. Religion and the non-acceptance of evolution hinder the understanding of the subject by teachers and learners.

It doesn’t help that school subjects (evolution being one of them) are often taught in unengaging ways rather than through interactive methods.

Many studies have shown that collaborations between schools and informal science learning centres, such as natural history museums, can have a positive effect on school learners. Inquiry-based activities at museums have been shown to help learners gain knowledge and meaning about the past. Museum visits foster “thinking skills” through guided conversation and questions asked by educators and learners. New information is gained through reasoning, inference and deduction, which enhance learning.


In 2018, a team of researchers from the University of the Witwatersrand launched workshops on human evolution for grade 12 learners (in the final year of secondary school) in South Africa’s Gauteng province. The aim was to stimulate interest in the palaeosciences and improve learner performance. We worked with learners from 13 schools in the area. The workshops were conducted at the museum of the university’s Evolutionary Studies Institute.

From tests before and after the workshops, we found that they improved the learners’ understanding and acceptance of concepts related to evolution. More teacher training and school visits to museums and exhibitions could build on this success.

Workshops on human evolution

Our human evolution workshops were attended by both well-resourced and historically disadvantaged schools. The grade 12 learners, aged 17 and 18 years, visited the fossil preparatory laboratory, searched for clues in the museum while answering a worksheet, and did activities on human evolution using inquiry-based approaches.

These activities included measuring and describing skulls of apes and hominins, comparing hip bones to see whether the creature was able to walk upright on two legs, investigating stone tools, and drawing a phylogenetic tree (a diagram showing the evolutionary relationships between species). Due to financial constraints, some of the workshops were held at the schools themselves.

The 687 learners wrote a test before and after the workshop to assess their knowledge of hominin evolution. Their scores increased from an average of 39% to 61%.

The location of the workshops (either at the museum or at the school) did not affect the scores, suggesting that workshops can be scaled to reduce costs. Feedback from interviews indicated that learners regarded the workshops as beneficial, enabling them to learn new facts and gain a deeper understanding of human evolution. Teachers echoed the same view.

One learner said:

It was pretty enjoyable, and informative and interesting. Especially the part when we asked questions and we actually got answered. It helped us to understand the knowledge more.

Another said:

It is always better to physically see things as compared to seeing a picture of it, it is easier to understand it this way.

A teacher commented that learners

could literally see exactly what is happening and it is not just talk, they can touch it and they can take part in the experiment, which is not something they are exposed to at school.

It was apparent that learners understood human evolution better after the workshops. In Gauteng province’s preliminary exam paper, learners who attended the workshops scored nearly double the average of learners from schools that did not attend (41% versus 21%). While the scores are still low, and there is still much room for improvement, the results suggest that a short, hands-on workshop can make a major difference to learners.

The workshops also increased acceptance of evolution, from 41% to 51%. (It was not the purpose of the workshops to increase acceptance, but rather to improve understanding of the topic.)

Why the workshops worked

In our view, the workshops were successful because they used inquiry-based learning, with learners working in groups, solving problems and physically handling fossil casts. This enabled active participation in the learning process.

With this approach, learners took ownership of the learning process, which developed their curiosity, interest and desire to learn. The guidance of a subject expert during the workshops enhanced their quality and the learning experience. It’s clear that visits to places like natural history museums created connections which helped with understanding concepts such as human evolution in the classroom, and developed an enjoyment of learning.

What’s next

We recommend that teachers receive training in human evolution and how to teach this topic. Common misconceptions of teachers can be identified through surveys, and intervention training must be planned around these misconceptions. The Gauteng Department of Education has a free professional development programme offering training to teachers (not publicly available), which can be used for this purpose.

Various institutions in Gauteng offer exhibitions on human evolution and fossils, including the University of the Witwatersrand, the Ditsong National Museum of Natural History, Maropeng Cradle of Humankind, Sterkfontein Caves and the Sci-Bono Discovery Centre. The provincial education department must promote school visits to these places. Human evolution can be one of the most rewarding topics for learners, especially in a country where the fossil record is right on the doorstep.

It’s vital for grade 12 learners in South Africa to have a solid understanding of human evolution – it fosters critical thinking about science, identity and our shared African origins. This knowledge not only deepens their appreciation of the continent’s fossil heritage, but also counters misinformation with evidence-based insight.


This article was prepared with Grizelda van Wyk and in memory of Ian J. McKay.

The Conversation

Shaw Badenhorst works for the University of the Witwatersrand. He receives funding from GENUS, the National Research Foundation and the Palaeontological Scientific Trust.

ref. Fun with fossils: South African kids learn a whole lot more about human evolution from museum workshops – https://theconversation.com/fun-with-fossils-south-african-kids-learn-a-whole-lot-more-about-human-evolution-from-museum-workshops-259319

A surprisingly effective way to save the capercaillie: keep its predators well-fed – new research

Source: The Conversation – UK – By Chris Sutherland, Reader in Statistical Ecology, University of St Andrews

A male capercaillie showing off its colours. Rolands Linejs/Shutterstock

Conserving species can be a complicated affair. Take this dilemma.

After being hunted to near extinction, a native predator is recovering and eating more of an endangered prey species, whose own numbers are declining as a result. Should conservationists accept that some successes mean losing other species, or reinstate lethal control of this predator in perpetuity?

Or perhaps there is a third option that involves new means of managing species in the face of new conditions. This issue is playing out globally, as land managers grapple with predators such as wolves and lynx reclaiming their historic ranges.

In the ancient Caledonian pine forests of Scotland there are fewer than 500 capercaillie remaining. This grouse is beset by multiple threats, not least climate-driven shifts in spring weather, which change when chicks are reared and what nutrition is available, and are driving the species’ Europe-wide decline.

Additionally, and in common with other ground-nesting birds, capercaillie lose eggs and chicks to carnivores. As such, the recovery of the pine marten (a relative of weasels and otters) from its own near extinction in Scotland is contributing to the decline of capercaillie.

A capercaillie cock displaying for a hen. Jack Bamber

Internationally, little has been achieved to slow the heating of Earth’s climate, and decades of dedicated conservation efforts have not arrested the decline of capercaillie. Extinction will follow unless new solutions are found.

Killing pine martens, the capercaillie’s predators, might offer short-term relief, but it is socially and politically contested and scientific evidence on its effectiveness is meagre. Most importantly, it risks undermining the recovery of species conservationists have worked hard to restore. Instead, the challenge is to reduce the effects of predators, not their numbers, and encourage coexistence between species.

We have tried one such method in Scotland – with incredibly positive results.

A non-lethal alternative for controlling predators

Our idea is simple: predators have to be efficient, so when given access to a free meal, they are less likely to hunt for harder-to-find prey like capercaillie nests.

Taking the bait: a pine marten eating carrion. Jack Bamber

Satiated predators are less likely to kill and eat prey that is of concern to conservationists. This is called diversionary feeding: giving predators something easy to eat at critical times – in this case between April and July, when capercaillie build their ground nests and rear chicks.

To test this idea, we systematically dumped deer carrion across 600 square kilometres of the Cairngorms national park in north-eastern Scotland during the eight weeks in which capercaillie lay and incubate eggs. This area is home to the last Scottish stronghold of the capercaillie. We also placed artificial nests containing chicken eggs across the same area to represent capercaillie nests.

Through this landscape-scale experiment, we showed that the rate of pine marten predation on artificial nests fell from 53% to 22% with diversionary feeding. This decrease – from roughly a one-in-two chance of a nest being eaten to around one in five – is a massive increase in nest survival.

A capercaillie brood, with chicks and hen highlighted. Jack Bamber

This was a strong indication that the method worked. But we were unsure whether the effect seen in artificial nests translated to real capercaillie nests, and to the number of chicks surviving to independence.

Counting chicks in forests with dense vegetation is difficult, and land managers are increasingly reluctant to use trained dogs. Our innovation was to count capercaillie chicks using camera traps (motion-activated cameras which can take videos and photos) at dust baths, which are clear patches of ground where chicks and hens gather to preen.

We deployed camera traps across the landscape in areas with and without diversionary feeding and measured whether a female capercaillie had chicks or not, and how many she had. Chicks are fragile and many die early in life. The number of chicks in a brood declined at the same rate in the fed and unfed areas.

However, in areas where predators received diversionary feeding, 85% of the hens we detected had chicks compared to just 37% where predators were unfed. That sizeable difference mirrored the improvement seen in artificial nest survival.

Fewer nests being predated led to more hens with broods, such that by the end of the summer we observed a staggering 130% increase in the number of chicks per hen in fed areas: 1.9 chicks per hen, compared with fewer than half that number in unfed areas.

So, does diversionary feeding provide a non-lethal alternative to managing conservation conflict and promoting coexistence? Our work suggests it does.

A mature capercaillie brood. Jack Bamber

Diversionary feeding is now a key element of the capercaillie emergency plan, which is the Scottish government’s main programme for recovering the species. Diversionary feeding will probably be adopted across all estates with capercaillie breeding records in the Cairngorms national park by 2026.

This rapid implementation of scientific evidence is a direct result of working closely, from conception, with wildlife managers and policy makers. For capercaillie, diversionary feeding has real potential to make a difference, a glimmer of hope in their plight (some nicer weather in spring might help too).

More broadly, for conservationists, land managers, gamekeepers, farmers, researchers and anyone else involved in managing wildlife, this work is testament to the fact that, with the right evidence and a willingness to adapt, we can move beyond the binary of killing or not killing, and instead find smarter ways to promote the coexistence of native predators and native prey.

The Conversation

Jack Anthony Bamber received funding from the SUPER DTP.

Xavier Lambin would like to credit the academic contribution of Kenny Kortland, environment policy advisor for Scottish Forestry.

Chris Sutherland does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. A surprisingly effective way to save the capercaillie: keep its predators well-fed – new research – https://theconversation.com/a-surprisingly-effective-way-to-save-the-capercaillie-keep-its-predators-well-fed-new-research-259925

How the myth of ‘Blitz spirit’ defined and divided London after 7/7

Source: The Conversation – UK – By Darren Kelsey, Reader in Media and Collective Psychology, Newcastle University

The “Blitz spirit” is one of Britain’s most enduring national myths – the stories we tell ourselves about who we were, and who we still believe we are today. Growing up among football fans, I heard constant nostalgic refrains about England and Germany, wartime bravery and national pride.

Chants about “two world wars and one World Cup” or “ten German bombers in the air” were cultural rituals, flexes of a shared memory that many had never experienced themselves.

Blitz spirit refers to the resilience, unity and stoic determination of civilians during the German bombing raids (the Blitz) of the second world war. It has reemerged time and again, symbolising a collective pride in facing adversity with courage, humour and a “keep calm and carry on” attitude.

After the July 7 bombings in 2005, which killed 52 people and injured more than 700, I noticed how quickly the Blitz spirit reappeared. British newspapers reached into the past and pulled the myth forward.

The Independent on July 8 said, “London can take it, and it can do so because its stoicism is laced as it always has been with humour.” The Daily Mail evoked images of “London during the Blitz… with everyone dancing through the bombs”.

Tony Parsons opened his Daily Mirror column with “07/07 war on Britain: We can take it; if these murderous bastards go on for a thousand years, the people of our islands will never be cowed”, alongside an image of St Paul’s Cathedral during the Blitz.

The spirit of working-class wartime London was, ironically, even applied to bankers and City traders who “kept the economy alive” after the attacks. A July 8 Times article claimed: “A Dunkirk spirit spread through London’s financial districts as Canary Wharf and City workers vowed they would not be deterred.”

The use of river transport to evacuate workers reinforced the analogy. The Times described how “bankers and lawyers in London’s riverside Canary Wharf complex experienced their own version of the Dunkirk-style evacuations”, assisted by a “flotilla of leisure vessels and little ships”.

I was fascinated: why this story, and why now? That question became the heart of a book I published in 2015 – one that explored how a myth born in 1940 was reborn in 2005, repurposed for a very different London.

What I found was that the “Blitz spirit” wasn’t a lie, but it was a myth in the academic sense: a simplified, selective story built from the most comforting parts of the past.

Wartime Britain was not uniformly united, stoic and proud. There were deep class divides. Looting occurred. Morale was rock-bottom in many cities and communities. Evacuees weren’t always welcomed with open arms. Government censorship and transnational propaganda masked social unrest.

Understandably, these messy realities were left out of the postwar narrative. But what happens when we bring that myth into the present?

The myth of the ‘Blitz spirit’

Londoners did come together after the 7/7 bombings – there were undoubtedly examples of communities and strangers supporting each other and maintaining a sense of resilience that enabled them to continue their lives undeterred.

But it was not one single unified message. Hate crimes against British Muslim communities in the weeks after the 2005 attacks exposed cracks in the narrative of national unity.

Some used the Blitz spirit to support Tony Blair and George W. Bush, casting them as Churchillian leaders standing firm against a new fascism in the form of global terrorism. For others, the same figures represented a betrayal of British values.

The wartime comparisons were evoked instead to shame Blair and Bush. The Express made its feelings clear: “It was throw up time when Blair was compared to Churchill by some commentators. What an insult!”

The Blitz spirit also became a weapon in anti-immigration discourse. Some argued that Britain, unlike in 1940, had become a “soft touch” – compromised by EU human rights laws, welfare handouts and multiculturalism. The underlying message: today’s London could never be as brave or unified as wartime London.

Writing in The Sun, Richard Littlejohn said: “War office memo. Anyone caught fighting on the beaches will be prosecuted for hate crimes.”

An article in the Express condemning human rights laws said: “What a good thing these people weren’t running things when Hitler was doing his worst. Would the second world war have been more easily won if we had spent more time talking about freedom of speech than bombing Nazi Germany?”

Multicultural resilience

And yet, another narrative emerged – one that saw London’s multicultural identity as a strength, not a weakness. Here, the Blitz spirit wasn’t just a historical relic, but a kind of transcendental force. The city’s soul, it was said, remained resilient – passed down across generations, regardless of race, class or religion. For some, this was proof that Britain had evolved and still held fast to its best values.

A letter to the Daily Mirror (July 17) invoked the Blitz spirit through a cross-cultural lens: “Colour, creed and cultures forgotten, black helping white and vice versa… We stood firm in the Blitz and we’ll do so again, going about our business as usual.”

The Sunday Times quoted Michael Portillo, who framed London’s resilience as multicultural continuity: “Fewer than half the names of those killed on the 7th look Anglo-Saxon… Today’s Londoners come in all colours and from every cultural background. Yet they have inherited the city’s historic attitudes of nonchalance, bloody-mindedness and defiance.”

The Blitz spirit, as my research revealed, is not a single story. It is a narrative tool used for many different – often opposing – purposes. It can bring people together, or be used to divide. It can inspire pride, or be weaponised in fear.

National myths don’t just reflect who we were – they shape who we think we are. They’re never neutral. They’re always curated, always contested. If we want to be genuinely proud of our country – and we should – then we also have to be honest about the stories we cling to. We must ask: what’s left out, and who decides?

The Conversation

Darren Kelsey does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. How the myth of ‘Blitz spirit’ defined and divided London after 7/7 – https://theconversation.com/how-the-myth-of-blitz-spirit-defined-and-divided-london-after-7-7-259948

Salmonella cases are at ten-year high in England – here’s what you can do to keep yourself safe

Source: The Conversation – UK – By Rob Kingsley, Professor, Microbiology, Quadram Institute

Salmonella causes salmonellosis – an infection that typically results in vomiting and diarrhoea. Lightspring/Shutterstock

Salmonella cases in England are the highest they’ve been in a decade, according to recent UK Health Security Agency (UKHSA) data. There was a 17% increase in cases observed from 2023 to 2024 – culminating in 10,388 detected infections last year. Children and older adults accounted for around a fifth of cases.
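
Taken together, those two figures also give a rough sense of the previous year's baseline. A crude back-calculation (ours, not a figure from the UKHSA report) implies around 8,880 reported cases in 2023:

\[
\text{cases in 2023} \approx \frac{10{,}388}{1.17} \approx 8{,}880
\]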

Although the number of infections caused by foodborne pathogens such as Salmonella has broadly decreased over the last 25 years, this recent spike suggests a broader issue is at play. A concurrent increase in Campylobacter cases points to a possible common cause affecting the risk of both pathogens – such as changes in consumer behaviour or in food supply chains.

While the UK maintains a high standard of food safety, any increase in the incidence of pathogens such as Salmonella warrants serious attention.

Salmonella is a genus of bacteria that is among the most common causes of foodborne illness globally. It causes salmonellosis – an infection typically marked by vomiting and diarrhoea.

Most cases of salmonellosis don’t require medical intervention. But approximately one in 50 cases results in more serious blood infections. Fortunately, fatalities from Salmonella infections in the UK are extremely rare – occurring in approximately 0.2% of all reported infections.

Salmonella infections are typically contracted from contaminated foods. But a key challenge in controlling Salmonella in the food supply chain lies in the diverse range of foods it can contaminate.

Salmonella is zoonotic, meaning it’s present in animals, including livestock. This allows it to enter the food chain and subsequently cause human disease. This occurs despite substantial efforts within the livestock industry to prevent it from happening – including through regular testing and high welfare practices.

Salmonella can be present on many retail food products – including raw meat, eggs, unpasteurised milk, vegetables and dried foods (such as nuts and spices). When present, it’s typically at very low contamination levels. This means it doesn’t pose a threat to you if the product is stored and cooked properly.

Vegetables and leafy greens can also become contaminated with Salmonella through cross-contamination, which may occur from contaminated irrigation water on farms, during processing or during storage at home. As vegetables are often consumed raw, preventing cross-contamination is particularly critical.

Spike in cases

It’s premature to draw definitive conclusions regarding the causes of this recent increase in Salmonella cases. But the recent UKHSA report suggests the increase is probably due to many factors.

Never prepare raw meat next to vegetables you intend to eat without cooking, as cross-contamination can spread Salmonella. kathrinerajalingam/Shutterstock

One contributing factor is increased diagnostic testing, which means more cases are being detected. This can be viewed as a positive, as robust surveillance is integral to maintaining a safe food supply.

The UKHSA also suggests that changes in the food supply chain and the way people are cooking and storing their food due to the cost of living crisis could also be influential factors.

To better understand why Salmonella cases have spiked, it will be important for researchers to conduct more detailed examinations of the specific Salmonella strains responsible for the infections. While Salmonella is commonly perceived as a singular bacterial pathogen, there are actually numerous strains (serotypes).

DNA sequencing can tell us which of the hundreds of Salmonella serotypes are responsible for human infections. Two serotypes, Salmonella Enteritidis and Salmonella Typhimurium, account for most infections in England.

Although the UKHSA reported an increase in both serotypes in 2024, the data suggests that Salmonella Enteritidis has played a more significant role in the observed increase. This particular serotype is predominantly associated with egg contamination.

Salmonella Enteritidis is now relatively rare in UK poultry flocks, thanks to vaccination and surveillance programmes introduced in the 1980s and 1990s. So the important question is where these additional S Enteritidis infections are originating.

Although the numbers may seem alarming, what the UKHSA has reported is actually a relatively moderate increase in Salmonella cases. There’s no reason for UK consumers to be alarmed. Still, this data underscores the importance of thoroughly investigating the underlying causes to prevent this short-term increase from evolving into a longer-term trend.

Staying safe

The most effective way of lowering your risk of Salmonella involves adherence to the “4 Cs” of food hygiene:

1. Cleaning

Thoroughly wash hands before and after handling any foods – especially raw meat. It’s also essential to keep workspaces, knives and utensils clean before, during and after preparing your meal.

2. Cooking

Salmonella is inactivated by cooking at the right temperature. In general, food should be cooked to an internal temperature above 65°C, maintained for at least ten minutes. When reheating food, it should reach 70°C or above for two minutes to kill any bacteria that have grown since it was first cooked. (The thresholds in this list are gathered into a short sketch after it.)

3. Chilling

Raw foods – especially meat and dairy – should always be stored below 5°C as this inhibits Salmonella growth. Leftovers should be cooled quickly and also stored at 5°C or lower.

4. Cross-contamination

To prevent Salmonella passing from raw foods to those that are already prepared or can be eaten raw (such as vegetables and fruit), it’s important to wash hands and clean surfaces after handling raw meat, and to use different chopping boards for ready-to-eat foods and raw meat.
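
The cooking, reheating and chilling advice above boils down to a handful of numeric thresholds. As a purely illustrative aid, here is a minimal Python sketch that encodes them: the numbers are those quoted in this list, while the function names and structure are our own invention, not part of any official food-safety guidance.

```python
# Illustrative encoding of the temperature guidance quoted above.
# Thresholds come from the article; the functions are hypothetical.

COOK_TEMP_C = 65        # cook to an internal temperature above 65°C...
COOK_MINUTES = 10       # ...maintained for at least ten minutes
REHEAT_TEMP_C = 70      # reheat to 70°C or above...
REHEAT_MINUTES = 2      # ...for at least two minutes
CHILL_TEMP_C = 5        # store raw food and leftovers at 5°C or lower


def cooked_safely(internal_temp_c: float, minutes_held: float) -> bool:
    """Cooking rule: above 65°C, held for at least ten minutes."""
    return internal_temp_c > COOK_TEMP_C and minutes_held >= COOK_MINUTES


def reheated_safely(internal_temp_c: float, minutes_held: float) -> bool:
    """Reheating rule: 70°C or above, held for at least two minutes."""
    return internal_temp_c >= REHEAT_TEMP_C and minutes_held >= REHEAT_MINUTES


def stored_safely(fridge_temp_c: float) -> bool:
    """Chilling rule: 5°C or lower inhibits Salmonella growth."""
    return fridge_temp_c <= CHILL_TEMP_C


if __name__ == "__main__":
    print(cooked_safely(72, 12))   # True: hot enough, for long enough
    print(reheated_safely(68, 3))  # False: below the 70°C reheating threshold
    print(stored_safely(4))        # True: fridge is at 5°C or lower
```

In the kitchen, a food thermometer rather than code is the right tool – the sketch simply makes the thresholds explicit.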

Most Salmonella infections are mild and clear up on their own within a few days. But taking the right steps when storing and preparing your meals can significantly lower your risk of contracting one.

The Conversation

Rob Kingsley receives funding from the Biotechnology and Biological Sciences Research Council (BBSRC) and the Bill & Melinda Gates Foundation.

ref. Salmonella cases are at ten-year high in England – here’s what you can do to keep yourself safe – https://theconversation.com/salmonella-cases-are-at-ten-year-high-in-england-heres-what-you-can-do-to-keep-yourself-safe-260032