From blood sugar to gut bacteria, how beans can improve your health

Source: The Conversation – UK – By Raysa El Zein, Lecturer, Life Sciences, University of Westminster

Beans, pulses and legumes are affordable and nutritious. Pixel-Shot/ Shutterstock

Celebrity chefs Jamie Oliver, Hugh Fearnley‑Whittingstall and Tom Kerridge have backed a new campaign that is putting the spotlight on beans. The Bang In Some Beans campaign is a bid to double the UK’s intake of beans, legumes and pulses by 2028.

Such a campaign is long overdue. Despite beans on toast being a British favourite, beans, pulses and legumes remain under-consumed in the UK. According to data from the Food Foundation, two-thirds of the UK population eat less than one portion of beans a week.

Beans are one of the most affordable and nutritious foods out there. With food costs continuing to rise and poor nutrition contributing to a growing number of diseases, beans may offer a solution to both problems.

Encouraging greater bean consumption could also help close the UK’s fibre gap, as most of the UK population do not meet the recommended 30g of fibre per day. Beans are one of the simplest, most achievable ways to bridge that gap.

If you still aren’t convinced, here are just a few of the health benefits beans can provide:

1. They can help you manage your weight

Beans are a great source of protein, fibre and micronutrients such as iron, magnesium and potassium. Increasing bean intake could improve your health and reduce chronic disease risk.

Research also shows that people who consume higher amounts of beans have lower body weight, smaller waist circumference and lower blood pressure. These are all associated with reduced risk of multiple chronic diseases including obesity, diabetes and heart disease.

Not only are beans low in calories, their high fibre and protein content can help increase satiety (the feeling of fullness), which is a key factor in appetite regulation and long-term weight management.

2. They’re good for your heart

An abundance of research links eating beans to a healthy heart. Diets rich in beans can significantly lower LDL (“bad”) cholesterol, improve blood pressure and reduce inflammation.

The fibre in beans binds cholesterol in the gut so it can be excreted from the body. Their potassium and magnesium content supports vascular function, which is essential for a healthy heart. This is why, for those managing cardiovascular diseases or hyperlipidaemia, beans should be a cornerstone of a heart healthy diet.

3. They’re good for blood sugar levels

Beans have a low glycaemic index. This means they release energy slowly, which reduces blood sugar spikes. Their fibre and protein content also helps slow carbohydrate absorption, which promotes better blood sugar control. Both factors are important for preventing or managing type 2 diabetes.

Evidence from clinical trials shows incorporating beans into meals also benefits other aspects of blood sugar in people with, or at risk of, type 2 diabetes – such as improving fasting blood sugar and insulin levels.

A randomised controlled trial of over 100 people with type 2 diabetes found that those who consumed at least one cup of legumes daily for three months not only had better blood sugar control, they also had a significant decrease in body weight, waist circumference, cholesterol levels and blood pressure.

4. They can benefit gut health

Beans support gut health by providing both soluble and insoluble fibre. These act as prebiotics, feeding beneficial gut bacteria.

The fermentation of these fibres in the gut also produces short-chain fatty acids – compounds which have anti-inflammatory effects and support the colon. Regular consumption contributes to improved digestion and bowel regularity.

Beans have many gut health benefits. BearFotos/Shutterstock

Boosting your bean consumption

You don’t need to make dramatic dietary changes to incorporate more beans into your diet. Here are a few simple ways to eat more of them.

1. Start gradually.

Begin with small portions (about half a cup of cooked beans) a few times a week, increasing gradually as your digestive system adjusts. This helps you avoid flatulence and bloating.

2. Mix up varieties.

Rotate between beans and other pulses, such as chickpeas, kidney beans, lentils, black beans and cannellini beans. Diversity boosts nutrient variety and keeps meals interesting.

3. Add beans to familiar dishes.

Stir beans or other legumes into soups, stews, curries, salads or pasta sauces. Even a handful can make a meaningful difference.

4. Choose canned beans.

These are just as nutritious as dried or fresh beans – just ensure you rinse them well to reduce the sodium content. If you do use dried beans, ensure you soak them overnight and cook them thoroughly to neutralise anti-nutrients such as phytates (which can reduce absorption of other nutrients) and improve their digestibility.

Nutritionally speaking, chickpeas and lentils are good choices, as they’re high in fibre and protein. Black beans contain antioxidants – compounds which have been linked to lower risk of diseases such as cancer, diabetes and Alzheimer’s disease.

Ultimately, the best beans are the ones you can integrate into your diet and will eat regularly.

However, there are some groups of people who should be mindful when increasing their bean intake, as some of the compounds they contain can have a negative impact on health.

People with IBS, IBD or digestive sensitivities may struggle with bloating or gastric discomfort if they consume large amounts of beans. Beans should be introduced into the diet gradually based on how well your body tolerates them.

People with kidney disease may want to be careful due to the high potassium content in beans. In this case, it’s important to consult with a doctor before consuming diets rich in beans.

Those who suffer from low iron or zinc levels may also want to be careful with how they prepare beans. The anti-nutrient compounds in beans can disrupt the absorption of minerals, which is why it’s so important to soak beans and cook them well.

Beans are a nutritional powerhouse. High in fibre, protein and key micronutrients, they support heart, metabolic and gut health while being both affordable and environmentally friendly.

The Conversation

Raysa El Zein does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. From blood sugar to gut bacteria, how beans can improve your health – https://theconversation.com/from-blood-sugar-to-gut-bacteria-how-beans-can-improve-your-health-269653

The cancer blood test making waves – and what the numbers really show

Source: The Conversation – UK – By John Ferguson, Senior Lecturer in Statistical Science, University of Galway

BLKStudio/Shutterstock.com

Progress in cutting the global toll of cancer remains painfully slow, but a new blood test has sparked unusual levels of hope. Researchers say it could one day make routine screening far more effective by catching cancers earlier, when treatment has the best chance of saving lives.

The Galleri blood test, developed by US firm Grail, is the latest entrant to attract worldwide attention, after early trial results were described as “exciting” by researchers.

A press release claims the test, currently being trialled by the NHS, can detect signals from 50 cancers and correctly identify the disease in 62% of people who receive a positive result.

It also appears to be highly accurate at ruling cancer out, with a reported 99.6% success rate among those who were disease-free. At first glance, these headline figures appear to represent a significant step forward.

But before we reach for the champagne, it’s worth looking more closely at what these numbers really mean. Early promise does not always translate into real-world performance.

The Pathfinder 2 trial, involving 23,161 people aged over 50 from the US and Canada with no prior cancer diagnosis, produced the figures now circulating widely. Of the 216 participants who tested positive, 133 were later found to have cancer, giving the “positive predictive value” (PPV) of 62% that has been so widely reported.

That metric answers a crucial question: “If I test positive, what’s the chance I actually have cancer?” It also means, however, that 38% of positive results were false alarms.

Specificity – how often a test avoids falsely diagnosing cancer – is equally important, given the anxiety and medical follow-up triggered by an incorrect result. Here, the test performed well: 99.6% of people without cancer received a correct negative result.

Yet even this strong number has implications. If everyone aged over 50 in the UK were tested – more than 26 million people – the same rate would still generate over 100,000 false positives.

What has been less widely discussed is sensitivity, the measure of how many true cancer cases the test actually detects. On this measure, the result was 40.4%, meaning the test missed around three in every five cancers that appeared over the following year.
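
For readers who want to check the arithmetic, this short Python sketch reproduces the headline figures from the trial counts quoted above. The only assumption beyond the article’s own numbers is the simplification that nearly everyone screened in the 26-million projection is cancer-free.

    # Headline figures reported from the Pathfinder 2 trial (quoted above).
    positives = 216          # participants with a positive Galleri result
    true_positives = 133     # of those, later confirmed to have cancer

    ppv = true_positives / positives
    print(f"Positive predictive value: {ppv:.1%}")    # 61.6%, reported as 62%

    # False-alarm rate among people who are actually cancer-free.
    specificity = 0.996
    false_positive_rate = 1 - specificity             # 0.4%

    # Rough projection: screen all 26 million people in the UK aged over 50,
    # assuming (as a simplification) that almost all are cancer-free.
    over_50s = 26_000_000
    print(f"Expected false positives: {over_50s * false_positive_rate:,.0f}")  # 104,000

    # Sensitivity: the share of true cancer cases the test detects.
    sensitivity = 0.404
    print(f"Cancers missed: {1 - sensitivity:.1%}")   # 59.6% - about three in five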

The Galleri cancer test in numbers. John Ferguson, CC BY-SA

The figure that’s been less widely reported

That shortfall may disappoint those hoping for a catch-all screening tool. It also raises the risk that patients could be falsely reassured by a negative result, potentially delaying a diagnosis.

Statisticians caution that the reported PPV, specificity and sensitivity are estimates rather than fixed values, and each comes with uncertainty. They also note that tests often perform less well outside carefully controlled trials, meaning real-world accuracy could be lower.

So, where does this leave the Galleri test? It may well become a useful addition to future screening programmes, provided that negative results are not viewed as definitive by patients or doctors.

But the low sensitivity means many cancers would still be missed in its current form. The test is also expensive – US$949 (£723) in the US – and no evidence yet shows that widely using it reduces cancer deaths.

The early data is encouraging, but perhaps the excitement deserves to be tempered. This technology may be a step forward, but it is not a solution on its own.

The Conversation

John Ferguson does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. The cancer blood test making waves – and what the numbers really show – https://theconversation.com/the-cancer-blood-test-making-waves-and-what-the-numbers-really-show-270438

Tim Berners-Lee wants everyone to own their own data – his plan needs state and consumer support to work

Source: The Conversation – UK – By Alex Zarifis, Lecturer in Information Systems, University of Southampton

Tim Berners-Lee, the creator of the world wide web, has released an important new book about the problems we face online and how to solve them. It is called This is for Everyone, meaning that the internet should be for all.

The philosophy espoused in the book is that the internet should not be a tool for the concentration of power among an elite. He wants the internet to function in a way that maximises the benefit to society.

His central idea, as he has written before, is that people should own their data. Personal data is any data that can be linked to us, such as our purchasing habits, health information and political opinions.

Everyone owning their own data is a radically different approach from what we have today, where big tech companies own most of it. This change is needed for two reasons.

The first is about people’s right to privacy – so we don’t all feel we live in a glass box, with everything we do monitored and affecting our careers and the prices we pay for services such as insurance. If AI is steered to make more money for an insurer, it will do that, but it will not necessarily treat people fairly.

The second reason is that in a world being shaped by AI and data, if we do not own our data, we will have no power and no say in our future. For most of human history, workers’ labour was needed, and this gave them some power to pursue a fairer deal for themselves.

Most of us have the power to deny our valuable labour if we feel we are not treated fairly, but this may not have the same effect in the future. For many of us, in the highly automated, AI-driven world we are moving towards, our labour will not always be needed. Our data, however, will be very valuable, and if people own their data, they will still have a voice. When a tech giant owns our data, it holds all the cards.

None of these ideas are new, but as with the creation of the world wide web, Berners-Lee excels in bringing the best ideas together into one coherent, workable vision.

Many people have pet hates about the internet: some dislike how algorithms promote controversial views, while others resent handing over more personal information for a service than is necessary. Berners-Lee’s ability to see the bigger picture comes from having had a front-row seat to the development of the world wide web from the start.

But what would this look like?

In practice, owning our data would mean having a data wallet app on our phone which internet companies might request access to. The internet companies could offer a small payment, or make their service free in exchange for the access. The individual could choose to manage access themselves on a case-by-case basis, or delegate the management of the data to a trusted third party such as a data union.

Berners-Lee recommends two possible solutions to break free from the oligopolistic situation we are in. The first is for government to intervene and create the regulation that would maximise the social good of the internet by limiting the power of big tech.

This is highly unlikely in the United States where big tech is fully supported by the state. While a court in the US recently decided that Google had acted illegally to keep its monopoly status in search, it was not broken up under monopoly laws because it would be “messy”.

Elsewhere, though, for instance in the EU and Australia, there is a concerted effort to limit the negative outcomes of the internet for society. The EU constantly updates its General Data Protection Regulation so that it offers some protection to citizens’ privacy, while in Australia a world-first social media ban has been passed for children under 16.

Berners-Lee’s vision would require governments to go further. He has repeatedly asked for governments to regulate big tech warning that failing to do so would lead to the internet being “weaponised at scale” to maximise profit not social good. The regulation would seek to broaden competition beyond a small number of giant tech companies.

Beyond state intervention, Berners-Lee presents other ways forward. Perhaps, he contends, people themselves can begin building better alternatives. For example, more people could use social media such as Mastodon.social, which is decentralised and does not promote polarising views.

As he sees it, a key part of the problem is that we become tied into platforms run by the giants. Owning our data would go some way to having a fairer relationship. Instead of being locked into an increasingly small number of big tech firms this would open the door to new platforms offering a better deal.

Berners-Lee co-founded the Open Data Institute, which works to build agreement on new online standards. He is promoting what he calls socially linked data and co-founded Inrupt, which offers an online wallet to store all our personal data. This could include our passport, qualifications and information about our health.

This decentralised model would give people the ability to analyse their data locally within the wallet to gain insights on their finances and health, without giving their data away. They would have the option to share their data, but this would now be from a position of strength.

Access would be given to a specific organisation, to use specific personal data, for a specific purpose. AI, even more so than the internet, gives power to whoever has the data. If the data is shared, so will the power.
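
As a thought experiment, that access rule – a specific organisation, specific data, a specific purpose – is simple enough to express in code. The Python sketch below is purely illustrative: the class and method names are invented here, and this is not Inrupt’s or Solid’s actual API.

    from dataclasses import dataclass, field

    @dataclass(frozen=True)
    class AccessGrant:
        organisation: str        # who may access the data
        data_fields: frozenset   # which personal data, e.g. {"blood_pressure"}
        purpose: str             # the single purpose it may be used for

    @dataclass
    class DataWallet:
        data: dict
        grants: list = field(default_factory=list)

        def grant(self, organisation, data_fields, purpose):
            self.grants.append(AccessGrant(organisation, frozenset(data_fields), purpose))

        def request(self, organisation, field_name, purpose):
            # Release a field only if a matching grant exists.
            for g in self.grants:
                if (g.organisation == organisation
                        and field_name in g.data_fields
                        and g.purpose == purpose):
                    return self.data[field_name]
            raise PermissionError(f"No grant for {organisation}/{field_name}/{purpose}")

    # A health app may read blood pressure for research - and nothing else.
    wallet = DataWallet(data={"blood_pressure": "120/80", "passport_no": "X123"})
    wallet.grant("health-app.example", {"blood_pressure"}, "research")
    wallet.request("health-app.example", "blood_pressure", "research")   # allowed
    # wallet.request("health-app.example", "passport_no", "research")    # PermissionError

The point of such a design is that the default is denial: nothing leaves the wallet unless the owner has recorded a grant naming the organisation, the data and the purpose.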

Unlikely, but you never know

Despite proposing solutions, his vision is the underdog here. The chances of it prevailing in the face of big tech power are limited. But it is a powerful message that better alternatives are possible. This message can motivate citizens and leaders to push for a fairer internet that maximises social good.

The future of the internet and the future of humanity are interwoven. Will we actively engage to shape the future we want, or will we be helpless, passive consumers? The worsening or “enshittification” of services has become an almost inevitable part of the innovation cycle. Many of us now wonder when, not if, the service we receive will start to degrade dramatically once we are locked in.

There is dissatisfaction but this has not yet led to people changing their habits, possibly because there have not been better alternatives. Berners-Lee made the world wide web a success because his solution was more decentralised than the alternatives. People are now seeing the results of the overcentralised internet, and they want to go back to those decentralised principles.

Berners-Lee has offered an alternative vision. To succeed, it would need support from both consumers and states. That may seem unlikely, but once, so did the idea that the world would be connected via a single online information-sharing platform.

The Conversation

Alex Zarifis does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Tim Berners-Lee wants everyone to own their own data – his plan needs state and consumer support to work – https://theconversation.com/tim-berners-lee-wants-everyone-to-own-their-own-data-his-plan-needs-state-and-consumer-support-to-work-269042

Does BBC Civilisations get its four stories of collapse correct? Experts weigh in

Source: The Conversation – UK – By Jay Silverstein, Senior Lecturer in the Department of Chemistry and Forensics, Nottingham Trent University

In four episodes, the BBC’s Civilisations series tells the story of the fall of the Romans, Aztecs, Egypt’s Ptolemies and Japan’s Edo Samurais. The show tells these stories through a combination of recreated dramatic scenes, explanation from experts and discussions of objects from the British Museum. Here, four experts in each period have reviewed the episodes and shared their recommendations for further reading.

The Collapse of the Roman Empire

The canonical date of the fall of the Western Roman Empire is 476, when the general Odoacer deposed the last emperor, Romulus Augustulus – a child who had been on the throne for less than a year. I teach my students that this relatively muted event was probably not noticed by many ordinary people at the time, as very little likely immediately changed in their daily lives.

Instead, the much more dramatic events of 410 were the real collapse moment of the ancient world: the metropolis of Rome, the capital of the empire, was sacked by King Alaric and his Gothic army. As one of the expert contributors to this episode puts it, you would remember where you were when the news reached you.

The episode’s key achievement is to depict the way that Roman mistreatment of the Goths – a Germanic-speaking people, many of whom fled war with the Huns into the Roman Empire – effectively threatened their survival and backed them into a corner. While historians have long discussed these realities, it’s refreshing to see this message presented in such a compelling and humane way to the wider public. The contemporary resonances are obvious, and while history cannot provide us with answers, it can give us food for thought.

Further reading
To learn more about the end of the Western Roman Empire, I would recommend starting with the very readable and provocative introduction by Bryan Ward-Perkins, The Fall of Rome: And the End of Civilization. It looks at the very real changes that ordinary people would have experienced as a centuries-old empire fell apart.

Tim Penn is Lecturer in Roman and Late Antique Material Culture at University of Reading

The Last Days of the Ptolemies in Egypt

Neither the gradual decline nor the final fall of the Ptolemaic dynasty in Egypt in 30 BC is accurately realised in this episode. It presents a simplistic narrative riddled with factual inaccuracies. It also features inadvertent misreadings or deliberate misrepresentations that play fast and loose with the historical chronology of the reign of Cleopatra VII, and the significant historical figures that were part of it.

Such inaccuracy is not helped by the fact that, with the exception of two contributors, no one participating is actually an expert on this specific period of ancient Egyptian history. One prominent figure is not even an historian or archaeologist at all.

Most of the artefacts that are incorporated in an attempt to provide insight don’t date to this period of Egyptian history, and lead the narrative off in irrelevant directions. It’s not clear who the intended audience is, nor what they are expected to take away from this, beyond appreciation for the sumptuous dramatisation that unfolds in the background. There was potential here, such as the contribution of climate change and the wider geopolitical context, that was unfortunately squandered.

Further reading

If you want to read about Cleopatra’s reign specifically, then Duane W. Roller’s Cleopatra: A Biography is good. For the Ptolemaic dynasty more broadly, from start to end, I’d recommend Lloyd Llewelyn-Jones’s The Cleopatras: The Forgotten Queens of Egypt.

Jane Draycott is Senior Lecturer in Ancient History at the University of Glasgow

The Collapse of the Aztec Empire

The episode on the Aztecs focuses on the Aztec emperor Moctezuma in the early 16th century. It offers a refreshing shift from the Eurocentric narrative that often paints him as indecisive while glorifying his nemesis, the conquistador Hernán Cortés. Here, the roles are reversed: Cortés’s ambition and brutality are exposed, while Moctezuma appears as a thoughtful and capable leader. Their confrontation feels less like a simple conquest and more like a high-stakes chess match – Moctezuma had Cortés in check until one audacious move changed history.

If you’re looking for a comprehensive account of the Aztec collapse, this episode won’t deliver that. Experts such as Matthew Restall, known for challenging colonial myths, are used sparingly, and the story remains selective. Key events are skipped, and contradictory sources are left out. All of this is inevitable in a single-episode format.

What it does offer is a visually stunning, well narrated introduction to imperial collapse, framed through iconic artefacts that bring the past to life.

Further reading

To learn more about the fall of the Aztecs, read The True History of the Conquest of New Spain, Volume 4 by Bernal Díaz del Castillo – a Spaniard who served under Cortés during the conquest of the Aztec Empire. There are many translations, but the first edition of the text, edited by Mexican historian Genaro García and translated by Alfred Percival Maudslay, is my pick.

Jay Silverstein is Senior Lecturer in the Department of Chemistry and Forensics at Nottingham Trent University

The End of the Samurai in Japan

This episode deals with the military encounter between the American “black ships” (kurofune 黒船) under naval commodore Matthew Perry and the Tokugawa shogunate 徳川幕府 between 1852 and 1855. The interviewed historians are certainly familiar with the event, yet the conceptual framing is not quite right.

“Traditional Japan” is introduced as an unchanging and isolated place. In reality, Japan had lived in close economic and cultural symbiosis with continental East Asia since at least the rise of Buddhism in the 6th century.

A 1637 proclamation, known as sakoku, by the Tokugawa shogunate did make Japan a hostile place for Christians and foreigners. However, the Protestant Dutch, arch-enemies of their former Spanish overlords, were granted the right to send annual expeditions. These became the basis for Japan’s “Dutch studies” (rangaku 蘭學), an exchange of scientific knowledge which is ignored by the programme. Meanwhile, contact with China and Korea continued, albeit under stricter regulations.

The documentary dwells on the image of a powerful and conservative samurai class without alluding to the social transformations which had eroded its influence. The capital Edo was not only the largest city on earth, but a veritable engine of urbanisation and commercialisation.

This documentary is still a pleasure to watch, but the premise that Perry’s western gunboats led to the “fall” of Japanese civilisation is erroneous.

Further reading
If you want to know more about the political and social turmoil that led to the end of the samurais and the Tokugawa shogunate, I recommend The Emergence of Meiji Japan by Marius B. Jansen.

Lars Laamann is Senior Lecturer in the History of China at SOAS, University of London


This article features references to books that have been included for editorial reasons, and may contain links to bookshop.org. If you click on one of the links and go on to buy something from bookshop.org The Conversation UK may earn a commission.




The Conversation

The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.

ref. Does BBC Civilisations get its four stories of collapse correct? Experts weigh in – https://theconversation.com/does-bbc-civilisations-get-its-four-stories-of-collapse-correct-experts-weigh-in-270114

The uncompromising politics of Jimmy Cliff

Source: The Conversation – UK – By Kenny Monrose, Researcher, Department of Sociology, University of Cambridge

“I have a dislike for politicians as they’re not truthful people. It’s the nature of politics that you cannot be straight, you have to lie and cheat,” said the reggae singer Jimmy Cliff, who died on November 24 at the age of 81.

Cliff was born James Chambers on July 30, 1944 in Somerton, Saint James Parish, Jamaica. Long before luminaries such as Bob Andy, the Wailers, Lee Perry and others had made an indelible mark on Jamaican popular music, Jimmy had taken the genre to “foreign” – not just to the US or the UK but around the world. Suffice to say, Jimmy Cliff was reggae’s first international star.

His career started seriously with ska recordings for legendary Chinese-Jamaican producer Leslie Kong on his Beverly’s label. As well as being a musician, Jimmy acted as an artist and repertoire representative, finding and developing new talent for Kong.

Cliff, at the request of singer Desmond Dekker, invited Bob Marley to record his first song Judge Not at Federal studios in 1962. In the same year, Jimmy recorded Hurricane Hattie, a number about the tropical cyclone that devastated the Caribbean – notably British Honduras – in 1961.

Some of Cliff’s subsequent early hits included Miss Jamaica and King of Kings, both of which showcased his lyrical dexterity on the frenetic tempo of ska.

Jimmy had a knack of reflecting world events in his music at any given opportunity. By the end of the 1960s, he had become, through his material, one of the strongest advocates of the growing anti-war movement, typified by the 1968 recording Vietnam.

Vietnam, for me, was an incredibly courageous song to record at the time. It is reminiscent of Wilfred Owen’s first world war poems “Futility” and “Dulce et Decorum Est”, which reflect the senselessness of war.

In it, he sings:

Don’t be alarmed, she told me the telegram said
But mistress Brown your son is dead.
Vietnam, Vietnam, Vietnam
What I’m saying now somebody stop that war

The importance and power of protest against war loom at the epicentre of this song, making it resonate today.

Similarly, Cliff’s soul-wrenching crossover hit Many Rivers to Cross, again recorded in 1968, is a cry for resilience. It became an anthem for Windrush arrivals who had left the Caribbean and sojourned to the mother country of Britain.

The song reflected Jimmy’s own story: he moved to London in the mid-60s and frequently recounted how difficult it was for him. Today it speaks to anyone, anywhere, who has felt the sting of displacement, loneliness, heartbreak and loss.

Struggling Man from 1973 opens with:

Every man has a right to live.
Love is all that we have to give.
Together we struggle by your will to survive,
and together we fight just to stay alive.

This composition captured the political climate and general mood of 1970s Britain, as a series of recessions began to grip the country. But it also resonated globally with the emergence of the international oil crisis, which affected the lives of millions.

Jimmy was unquestionably a renaissance man who deftly moved with ease from being a singer to songwriter and then actor. Many recount his role as Rhyging, the anti-hero of Perry Henzell’s 1972 film The Harder They Come. Ivanhoe Martin (Jimmy Cliff), aka Rhyging, is a struggling singer who, despite hits, resorts to crime to get by. The film highlighted the corruption and exploitation in Jamaica’s music industry.

As well as acting in the film, Cliff provided the heart of its soundtrack with the title track, The Harder They Come. Three of his earlier songs also feature. His turn as Rhyging is regarded as one of the most powerful performances in Jamaican cinema.

In the 80s, Jimmy returned to his reggae roots, recording Rub-A-Dub Partner in 1981. He also contributed to the emergence of reggae dancehall culture in 1988 when he recorded Pressure on Botha with the uncompromising Jamaican deejay Josey Wales. The song is a political track hitting out against the then state president of South Africa, P.W. Botha, a central figure in the apartheid regime.

Jimmy Cliff was without doubt the greatest exponent of Jamaican music, taking reggae to an international audience while placing the island firmly on the map. As an artist, his contribution spanned every category of the genre, from ska through to dancehall.

It was not only reggae that benefited from the brilliance of Jimmy Cliff: he worked with a number of artists from a broad range of musical backgrounds, including the Rolling Stones, Sting, La Toya Jackson, Kool and the Gang, Jimi Hendrix, Elvis Costello and Annie Lennox, to name but a few. After recording 33 albums, performing for 50 years and winning a Grammy in 1986, Jimmy Cliff was inducted into the Rock and Roll Hall of Fame.

For a man who said he hated politics, it is exactly his uncompromising sense of right and his engagement with the world that will make his legacy everlasting.

The Conversation

Kenny Monrose does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. The uncompromising politics of Jimmy Cliff – https://theconversation.com/the-uncompromising-politics-of-jimmy-cliff-270596

The real reason states first emerged thousands of years ago – new research

Source: The Conversation – UK – By Christopher Opie, Senior Lecturer in Evolutionary Anthropology, University of Bristol

Shutterstock/RawPixel

Globalisation, migration, climate change and war – nation states are currently under huge pressure on many fronts. Understanding the forces that initially drove the emergence of states across the world may help explain why.

For a long time after humans evolved, we lived in oral-based, mostly small-scale and egalitarian societies. Things began to change with the dawn of the Holocene, when a suite of climatic, social and technological shifts led to the emergence of the first states about 5,000 years ago.

The earliest known state was in Mesopotamia (now southern Iraq), followed by Egypt, the Indus Valley, China and Meso-America. The long-standing view was that the invention of agriculture was the spur for these large-scale human societies to emerge. But there was a 4,000-year gap between the expansion of agriculture (circa 9,000 years ago) and the founding of the earliest states, which throws this link into question.

One theory suggests it was the intensification of agriculture that spurred the creation of states. Once fertilisation and irrigation were used, it produced a surplus that elites could extract to build and maintain states.

However, an alternative view, first put forward by anthropologist James Scott, is gaining ground. This proposes that states didn’t emerge from agriculture in general – rather, they almost invariably formed in societies that grew cereal grains.

Grasses such as wheat, barley, rice and maize grow above ground, ripen at a predictable time, and the grains they produce are readily stored. This makes them perfect for the systems of taxation that Scott argues fuelled state formation.

By Scott’s account, Mafia-style protection rackets forced people to produce grain, from which tax could be extracted and used to fund further exploitation. Scott proposed that these protection rackets were effectively the original states.

Grain: the fuel of ancient state formation. Shutterstock/Hari Seldon

In the meantime, writing was invented and adopted as the information system to record those taxes. Once states had formed, writing had a huge influence on the structure and institutions of those societies. States, controlled by very small elites, used writing to build institutions and laws to maintain extreme hierarchies.

We tested these ideas, combining data from hundreds of societies worldwide with a global language family tree representing the ancestral relationships between those societies. We then used a mathematical model to evaluate claims about how statehood and its possible drivers evolved along the branches of this tree.
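
The paper’s actual analysis is a far more sophisticated Bayesian phylogenetic method, but its core building block – asking how likely a trait such as statehood is to have been gained and lost along the branches of a tree, given gain and loss rates – can be sketched simply. Everything in the Python example below (the toy tree, branch lengths and rates) is invented for illustration.

    from math import exp, log

    # Transition probabilities for a two-state trait (0 = no state, 1 = state)
    # evolving along a branch of length t, with gain rate q01 and loss rate q10.
    def p_matrix(t, q01, q10):
        s = q01 + q10
        e = exp(-s * t)
        return [[(q10 + q01 * e) / s, (q01 - q01 * e) / s],   # from state 0
                [(q10 - q10 * e) / s, (q01 + q10 * e) / s]]   # from state 1

    # Felsenstein's pruning algorithm. A tip is ("name", observed_state);
    # an internal node is ((left_child, branch_len), (right_child, branch_len)).
    def partials(node, q01, q10):
        if isinstance(node[0], str):                          # tip
            return [1.0 if s == node[1] else 0.0 for s in (0, 1)]
        (left, tl), (right, tr) = node
        Ll, Lr = partials(left, q01, q10), partials(right, q01, q10)
        Pl, Pr = p_matrix(tl, q01, q10), p_matrix(tr, q01, q10)
        return [sum(Pl[s][i] * Ll[i] for i in (0, 1)) *
                sum(Pr[s][j] * Lr[j] for j in (0, 1)) for s in (0, 1)]

    def log_likelihood(tree, q01, q10):
        L = partials(tree, q01, q10)
        pi0, pi1 = q10 / (q01 + q10), q01 / (q01 + q10)       # stationary frequencies
        return log(pi0 * L[0] + pi1 * L[1])

    # A toy four-society tree: two related societies with states (A, B),
    # two without (C, D).
    ab = ((("A", 1), 0.2), (("B", 1), 0.2))
    cd = ((("C", 0), 0.2), (("D", 0), 0.2))
    tree = ((ab, 0.3), (cd, 0.3))
    print(log_likelihood(tree, q01=0.5, q10=0.5))

The real model is fitted to hundreds of societies and, crucially, lets the gain rate of one trait (statehood) depend on the presence of another (grain agriculture); comparing the fit of models with and without that dependence is how such analyses test claims about which change came first.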

Our results suggest that intensive agriculture, with fertilisation and irrigation, was just as likely to be the result of state formation as it was to be its cause. On the other hand, grain agriculture consistently predicted subsequent state formation and the adoption of taxes.

We also found a strong correlation between non-grain agriculture and the formation of states. However, crops such as vegetables, fruit, roots and tubers – which were hard to tax – were more likely to be lost, not gained, as states were formed. This is consistent with the idea that grains were favoured over other forms of agriculture by emerging states for their taxation potential.

Trying to test causal claims about complex social changes in the deep past is inherently uncertain, but our results provide new evidence in support of Scott’s theory – that grain agriculture fuelled the formation of states, and that writing, invented and adopted to record taxation, was then used by states to maintain themselves through a very hierarchical system of laws and societal structures.

Lessons for the modern state

Our findings also highlight a broader connection between social systems and modes of information.

Long after the first emergence of writing, the invention of the printing press in medieval Europe is thought to have been integral to a raft of social changes that followed. As a much larger number of people were literate, information became both easier and cheaper to disseminate.

In turn, mass education, which became compulsory in the late 19th century in England and many other countries, is sometimes credited with the rise of universal suffrage and the beginning of democracy.

This change in the information system of societies clearly had a profound effect on the functioning of the state, yet writing has always been a system controlled by a small elite. Even after the emergence of mass literacy in many countries, publishers, working within state rules, have exerted control and influence over how and what we read.

This helps us understand current concerns about the destabilisation of modern nation states. Digital technologies and AI are disrupting how we generate, store and broadcast information; globalisation and cryptocurrencies are disrupting our taxation systems; and our agricultural production is under pressure because of climate change.

It may feel worlds apart, but the challenges and choices facing states today have been playing out since the dawn of the earliest states, thousands of years ago.

This article contains references to books that have been included for editorial reasons, and this may include links to bookshop.org. If you click on one of the links and go on to buy something from bookshop.org The Conversation UK may earn a commission.

The Conversation

Christopher Opie received funding from the Leverhulme Trust – Early Career Fellowship – ECF 619

Quentin Douglas Atkinson receives funding from the Royal Society of New Zealand.

ref. The real reason states first emerged thousands of years ago – new research – https://theconversation.com/the-real-reason-states-first-emerged-thousands-of-years-ago-new-research-268539

The gender pay gap looks different depending where you are on the income ladder

Source: The Conversation – UK – By Vanessa Gash, Deputy Director of the Violence and Society Centre, City St George’s, University of London

Somchai_Stock/Shutterstock

Despite decades of progress, the gender pay gap remains a persistent feature of the UK labour market. According to women’s rights charity the Fawcett Society, November 22 marked Equal Pay Day 2025 – the day when women effectively stop getting paid due to the wage gap with men.

This gender pay gap means women continue to earn less than men – currently by around 11% in the UK. This is not just because of differences in education or job type, but due to deeper inequalities in how work and care responsibilities are distributed.

A study on barriers to equal pay that I undertook with colleagues used 40 years of work history data from the UK Household Longitudinal Study to uncover how these inequalities play out across income groups.

We found that differences in work history – particularly full-time employment – account for nearly 29% of the gender pay gap on average. Women have shorter full-time work histories and spend more time doing part-time roles and unpaid care work. This reflects the challenges of reconciling paid employment with caregiving responsibilities.

For example, men in our sample had an average of 20 years of full-time work, compared to 14 years for women. Women also spent significantly more time in unpaid family care – more than two years on average – while men had less than three weeks.

This disparity is not just about time – it’s about how the labour market rewards different types of work. Full-time work earns a premium, while part-time work and unpaid care are penalised. Our research found that a year of full-time work increases hourly pay by 4%, while a year of part-time work decreases it by 3%.
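
Taken at face value, those two effects compound over a working life. The back-of-envelope Python sketch below uses the average work histories quoted earlier; treating the woman’s remaining six years as part-time is an assumption made purely for illustration, not a figure from the study.

    # +4% hourly pay per year of full-time work, -3% per year of part-time work
    # (the coefficients quoted above), assumed to compound multiplicatively.
    ft_premium, pt_penalty = 1.04, 0.97

    man = ft_premium ** 20                       # 20 years full-time
    woman = ft_premium ** 14 * pt_penalty ** 6   # 14 years full-time, 6 part-time (assumed)

    print(f"Implied pay ratio (woman/man): {woman / man:.2f}")   # ~0.66

The gap implied by work history alone is far larger than the overall 11% figure – a reminder that this is a caricature: the full model controls for many other factors and attributes only around 29% of the gap to work history.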

The cost of being female

Even after education, occupation, sector and work history are taken into account, women still face a significant pay penalty. We found that this “female residual” – a proxy for discriminatory pay practices and cultural biases in how women and men behave – accounts for 43% of the average gender pay gap. This is an astonishing finding from a complex model that controlled for a wide variety of predictors of pay differentials.

In low-income households, the penalty is even more severe. Women in these households would earn more than men if not for this “female residual”. For example, for low-paid public sector workers, other factors like education and work history might create an expectation of women outearning men. But the female residual here meant that any advantage was cancelled out.

This finding challenges the assumption that discrimination is more prevalent among high earners and underscores the need to focus on inequality at the bottom of the income distribution.

Interestingly, the impact of part-time work varies by income group. Among wealthier households, part-time work increases the gender pay gap. But in poorer households, it does not appear to carry the same penalty – and may even offer a slight premium for women. This suggests that exposure to part-time work for men in lower income groups is associated with very low pay.

With this in mind, policies encouraging full-time work for women may not be appropriate for all groups. In low-income households, part-time work may be a necessary and viable option, especially when the quality of the available jobs is poor and caregiving responsibilities are high.

However, we also found that men face a stronger penalty for part-time work than women, which may discourage them from sharing caregiving duties. This reinforces traditional gender roles and perpetuates inequality.

Sex-segregated occupations – those dominated by either men or women – also play a role in the gender pay gap. Female-dominated jobs (for example, care work, hospitality and retail) tend to be lower paid, less regulated and offer fewer opportunities for progressing up the career ladder.

We found that this segregation accounts for 17% of the average gender pay gap. Yet, we also found penalties associated with male-dominated occupations such as construction, particularly in low-income households. This challenges the assumption that male-dominated jobs are always better paid.

On the other hand, we found that public-sector employment, union membership and paid parental leave reduce the gender pay gap, especially for women in low-income households. These offer some protection against discrimination at the same time as supporting work-life balance.

But these benefits are not evenly distributed. Women in wealthier households are less likely to rely on these protections, while those in poorer households benefit disproportionately. This highlights the importance of jobs that offer these protections for low-income workers.

Our findings suggest that equal pay policies must be tailored to the needs of different income groups. For wealthier households, policies that support full-time work and chip away at sex segregation may be effective so that women can more readily access better-paid jobs.

But for poorer households, the focus should be on improving access to stable and better-paid jobs, while reducing discrimination and supporting flexible work arrangements.

Crucially, efforts to close the gender pay gap must avoid pitting the gains of high-earning women against the losses of low-earning men. In an era of rising political populism, this could undermine support for equality.

Instead, we need an approach that promotes good-quality employment for all and that supports equalised caregiving responsibilities. If we fail to address the barriers that prevent men and women from participating fully in both paid work and unpaid care work, we are unlikely to see reductions in the gender pay gap any time soon.

The Conversation

Vanessa Gash does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. The gender pay gap looks different depending where you are on the income ladder – https://theconversation.com/the-gender-pay-gap-looks-different-depending-where-you-are-on-the-income-ladder-270199

The world’s little-known volcanoes pose the greatest threat

Source: The Conversation – UK – By Mike Cassidy, Associate Professor, School of Geography, Earth and Environmental Sciences, University of Birmingham

El Chichón volcano in Mexico erupted explosively in 1982 after lying dormant for centuries. Michael Cassidy, CC BY-NC-ND

The next global volcanic disaster is more likely to come from volcanoes that appear dormant and are barely monitored than from the likes of famous volcanoes such as Etna in Sicily or Yellowstone in the US.

Often overlooked, these “hidden” volcanoes erupt more often than most people realise. In regions like the Pacific, South America and Indonesia, an eruption from a volcano with no recorded history occurs every seven to ten years. And their effects can be unexpected and far-reaching.

One volcano has just done exactly that. In November 2025, the Hayli Gubbi volcano in Ethiopia erupted for the first time in recorded history – at least the past 12,000 years. It sent ash plumes 8.5 miles into the sky, with volcanic material falling in Yemen and drifting into airspace over northern India.

You don’t have to look far back in history to find another example. In 1982, the little-known and unmonitored Mexican volcano El Chichón erupted explosively after lying dormant for centuries. This series of eruptions caught authorities off-guard: hot avalanches of rock, ash and gas flattened vast areas of jungle. Rivers were dammed, buildings destroyed, and ash fell as far as Guatemala.

More than 2,000 people died and 20,000 were displaced in Mexico’s worst volcanic disaster in modern times. But the catastrophe did not end in Mexico. The sulphur from the eruption formed reflective particles in the upper atmosphere, cooling the northern hemisphere and shifting the African monsoon southwards, causing extreme drought.

This alone would test the resilience and coping strategies of any region. But when it coincided with a vulnerable population that was already experiencing poverty and civil war, disaster was inevitable. The Ethiopian (and East African) famine of 1983-85 claimed the lives of an estimated 1 million people. This brought global attention to poverty with campaigns like Live Aid.

Few scientists, even within my field of Earth science, realise that a remote, little-known volcano played a part in this tragedy.

Despite these lessons, global investment in volcanology has not kept pace with the risks: fewer than half of active volcanoes are monitored, and scientific research still disproportionately focuses on the well-known few.

There are more published studies on one volcano (Mount Etna) than on all 160 volcanoes of Indonesia, the Philippines and Vanuatu combined. These are some of the most densely populated volcanic regions on Earth – and the least understood.

The largest eruptions don’t just affect the communities around them. They can temporarily cool the planet, disrupt monsoons and reduce harvests across entire regions. In the past, such shifts have contributed to famines, disease outbreaks and major social upheaval, yet scientists still lack a global system to anticipate or manage these future risks.

Mount Etna on the Italian island of Sicily. Wead/Shutterstock

To help address this, my colleagues and I recently launched the Global Volcano Risk Alliance, a charity that focuses on anticipatory preparedness for high-impact eruptions. We work with scientists, policymakers and humanitarian organisations to highlight overlooked risks, strengthen monitoring capacity where it is most needed, and support communities before eruptions occur.

Acting early, rather than responding only after disaster strikes, stands the best chance of preventing the next hidden volcano from becoming a global crisis.

Why ‘quiet’ volcanoes aren’t safe

So why do volcanoes fail to receive attention proportionate to their risk? In part, it comes down to predictable human biases. Many people tend to assume that what has been quiet will remain quiet (normalcy bias). If a volcano has not erupted for generations, it is often instinctively considered safe.

The likelihood of an event tends to be judged by how easily examples come to mind (a mental shortcut known as the availability heuristic). Well-known volcanoes or eruptions, such as the Icelandic ash cloud of 2010, are familiar and can feel threatening, while remote volcanoes with no recent eruptions rarely register at all.






These biases create a dangerous pattern: we only invest most heavily after a disaster has already happened (response bias). El Chichón, for instance, was only monitored after the 1982 catastrophe. However, three-quarters of large eruptions (on the scale of El Chichón or bigger) come from volcanoes that have been quiet for at least 100 years and that, as a result, receive the least attention.

Volcano preparedness needs to be proactive rather than reactive. When volcanoes are monitored, when communities know how to respond, and when communication and coordination between scientists and authorities is effective, thousands of lives can be saved.

Disasters have been averted in these ways in 1991 (at Mount Pinatubo in the Philippines), in 2019 (at Mount Merapi in Indonesia) and in 2021 (at La Soufrière on the Caribbean island of Saint Vincent).

To close these gaps, the world needs to shift attention towards undermonitored volcanoes in regions such as Latin America, south-east Asia, Africa and the Pacific – places where millions of people live close to volcanoes that have little or no historical record. This is where the greatest risks lie, and where even modest investments in monitoring, early warning and community preparedness could save the most lives.




The Conversation

Mike Cassidy receives funding from the UK’s Natural Environment Research Council. He is the co-founder and chair of the Global Volcano Risk Alliance charity.

ref. The world’s little-known volcanoes pose the greatest threat – https://theconversation.com/the-worlds-little-known-volcanoes-pose-the-greatest-threat-266292

First human bird-flu death from H5N5 – what you need to know

Source: The Conversation – UK – By Ed Hutchinson, Professor, MRC-University of Glasgow Centre for Virus Research, University of Glasgow

Melanie Hobson/Shutterstock.com

H5N1 bird flu has infected growing numbers of people worldwide in recent years, but this week saw something new: the first recorded human case of an H5N5 avian influenza virus. What is this virus and how concerned about it should we be?

What happened?

In early November, a resident of Grays Harbor, a county on the south-west Pacific coast of Washington state about 100 miles from Seattle, became severely unwell with flu-like symptoms including high fever, respiratory distress and confusion.

They were admitted to hospital, and on November 14 officials confirmed that tests showed infection with an H5N5 avian influenza virus. The patient, an older adult with underlying conditions, was treated in hospital, but sadly they died on November 21.

This was the first reported human infection with an H5N5 influenza virus.

What is H5N5 influenza virus?

H5N5 influenza viruses are a type of avian influenza (bird flu) – an influenza A virus that infects birds.

Bird flu viruses are classified as either “high pathogenicity” or “low pathogenicity” based on the severity of symptoms they cause in poultry. (Their severity also varies in other bird species.) This H5N5 strain, like the widespread and much-reported-on H5N1 strain, is one of the high pathogenicity forms.

Where did it come from?

This hasn’t yet been formally confirmed. However, the patient kept a flock of backyard poultry that were exposed to wild birds, which suggests how they might have caught the virus.

H5N5 is found in wild birds around the world, and it is relatively common for it to pass from them into flocks of poultry. This is, however, the first time an H5N5 influenza virus has been found to go one step further and infect a human.

What does the name mean? Is it similar to H5N1?

Influenza A viruses are one of the major branches of the influenza virus family, and are divided into subtypes based on differences in the two proteins that form spikes on the surface of virus particles: haemagglutinin (HA) and neuraminidase (NA).

Both proteins are good targets for the immune system’s antibodies. The proteins rapidly mutate as the virus evolves to evade these antibodies, and the different forms that result are used to categorise influenza A viruses.

The bird flu viruses H5N1 and H5N5 both have HA proteins of the same H5 subtype (though recognisably distinct from each other), but have NA proteins of different subtypes. Just as humans can be infected by different influenza A virus subtypes during the same winter season (H1N1 and H3N2), genetic studies show us this H5N5 virus is distinct from the dominant H5N1 strain that is also circulating in birds worldwide.

Should we be worried about what happens next?

H5N5 is an ecological and agricultural threat. Although bird flu vaccines exist, at the moment, political and economic factors make it hard to use them in US poultry. Instead, the virus must be controlled by surveillance, housing poultry indoors, increasing farm biosecurity and, as a last resort, by mass culling of infected poultry.

This is challenging enough, but bird flu also demands our attention because the virus is a potential cause of new pandemics.

In the long run, this risk is very significant. However, it is worth remembering that, although influenza is better at changing its host species and creating pandemics than any other virus, that is still an incredibly hard thing for the virus to do.

The vast majority of “spillover” infections of bird flu into humans are one-off events. They can vary unpredictably in their effects. Most are quite mild (for example, causing conjunctivitis), but some can be very severe, as was the case in this first recorded case of H5N5. But after infecting one human, most avian influenza viruses go no further.

Scientists will watch for several warning signs that a virus may be adapting to humans, especially any hint of person-to-person spread. There is no sign that this has happened here.

At the moment, the wider risk to humans from H5N5 is still low, and there is no reason to think this was anything other than a tragic one-off case. However, there will be plenty of opportunities for influenza viruses to try again. As H5N5 and other subtypes of avian influenza virus continue to circulate, it is important we continue to monitor this virus carefully.

The Conversation

Ed Hutchinson receives funding from UKRI and the Wellcome Trust. He has unpaid positions as a board member of the European Scientific Working Group on Influenza (ESWI), as Chair-Elect of the Microbiology Society’s Virus Division and as a scientific advisor to Pinpoint Medical.

ref. First human bird-flu death from H5N5 – what you need to know – https://theconversation.com/first-human-bird-flu-death-from-h5n5-what-you-need-to-know-270535