Recipes from the middle ages have much in common with how our grandparents used to cook

Source: The Conversation – UK – By Diane Purkiss, The William F Pollard Tutorial Fellow in English, University of Oxford

Painting of a banquet from the manuscript of The Romance of Alexander the Great, mid-15th century. Wiki Commons

“You have to keep beating it for longer,” my grandmother instructed me. “It isn’t pale yet. It’s still too yellow.” I didn’t ask how long this would take. I was nine years old, and I understood what my grandmother meant. You have to keep doing something until it works. It’s like asking: “Are we there yet?”

I watched for the miraculous transformation. The eggs, golden when first beaten, were lightening to a soft lemon colour. The texture was changing. You couldn’t see the sugar anymore; it had looked like sand, but now it was invisible, cloaked in the egg. My grandmother stopped beating, and lifted up the beater. A stream of thick liquid hung down, like the wet sand you used to reinforce a sandcastle. “Yes, that’s enough. Now add the melted butter. Slowly. Then the flour. We’ll need a bit more.”

My grandmother taught me to cook. She never weighed anything. The only measurement she used was a pink breakfast teacup, and it was more a useful scoop than a measure. Instead, she worked towards a desired result. You didn’t cook things for five minutes. You cooked things until you got the result you wanted. The first thing she taught me to make was bechamel sauce. She didn’t call it that. She called it white sauce with flavour. I could make it when I was five, and I still do it the same way.

Her cooking was preliterate, or, more exactly, a special kind of literacy, a grammar of ingredients and heat and air.

I’m a food historian and the author of English Food: A People’s History. I have never found the recipes of the middle ages as difficult to understand as most food historians do – perhaps because they look a little like my grandmother’s instructions.

Cooking in the middle ages

A medieval recipe
The recipe for sambocade from Add MS 5016.
British Library

Take and make a crust in a trap, and take cruddes and wryng out þe wheyze, and drawe hem þurgh a straynour, and put in þe straynour crustes. Do þerto sugar the þridde part and somdel whyte of ayren, and shake þerin blomes of elren, and bake it up with eurose, and messe it forth.

This is a recipe for “sambocade” from a medieval manuscript held in the British Library. Sambocade is an elderflower cheesecake of sorts. It uses curds – the beginnings of cheese – and the recipe gives quite detailed instructions on how to make them, a method a little like making Greek yoghurt. You add sugar, egg white and elderflowers, along with rosewater. Then you serve it.

A recipe like this is not a series of instructions. It is meant to act as a reminder, a series of quick notes to recall to mind something taught orally – something taught as my grandmother taught me.

Just as Google Maps will not tell me how to walk by putting one foot in front of the other, this kind of recipe doesn’t tell me what I’m looking for or how to achieve it. It doesn’t give exact measurements. It doesn’t really give any measurements at all. But if you made this recipe half a dozen times, you would soon understand the process required. And then, it would be yours, in a way that a recipe tested or created by another cook can never quite be yours.

Medieval image of a baker putting bread into an oven
A baker with his assistant making bread rolls, from a book of hours manuscript (circa 1500).
Bodleian Library

In my kitchen I still keep my mother’s recipe book, a manuscript volume in which she tried to preserve recipes that were gifts from friends. All of it is in her handwriting.

It contains a recipe for cheesecake from the days when cheesecake was a little-known novelty; it notes that the recipe comes from an American friend. It contains exact quantities and exact baking times, although the result is a lot more strongly baked than the majority of cheesecakes now. The exact quantities preserve a memory of the effect that’s difficult to reconstruct from recipes that come from earlier times.

In the same way, I have only my memories of my grandmother’s cooking to preserve what she did; she was barely literate, and her own recipes consisted solely of lists of ingredients. These were kept in a shoe box and after she died, my mother threw it away on the grounds that it was of no possible value to anyone. All the same, every time I make a sponge cake, I say to myself, is it pale enough yet?




The Conversation

Diane Purkiss is affiliated with Keble College, University of Oxford.

ref. Recipes from the middle ages have much in common with how our grandparents used to cook – https://theconversation.com/recipes-from-the-middle-ages-have-much-in-common-with-how-our-grandparents-used-to-cook-264139

Putting your CV together? Complete honesty might not be the best policy

Source: The Conversation – UK – By Tom Lane, Senior Lecturer in Economics, Newcastle University

PeopleImages/Shutterstock

Writing a CV requires important decisions. What should you include, what should you leave out – and how honest should you be?

One particularly tricky dilemma that might come up is whether to disclose weaknesses on your CV or remain silent about them. Common sense suggests it’s not advisable to advertise your flaws, but what about important information that employers might expect you to supply? Could the omission of such details look suspicious?

Research my colleague and I conducted looks at this specific question, focusing on the academic qualifications of new graduates entering the job market. And it provides a clear, evidence-based answer: if your grades are low, you are better off not disclosing this.

Complete honesty is not the best policy.

In the UK, where we did the research, most universities award undergraduate degrees on a scale: first-class, upper second (2:1), lower second (2:2) and third. While a first or 2:1 is often seen as evidence of strong performance, lower degrees are held in lower esteem.

A graduate jobseeker with a lower classification has a choice of what to reveal on their CV. They can be upfront about it, or they could simply state that they have a degree, without mentioning the class. (A third option, to lie about the class, is probably a bad idea because employers can and do ask for proof.)

Perhaps surprisingly, traditional economic theory would probably favour fronting up. Interactions like this, where a “seller” (in our case, a jobseeker supplying their skills) holds information about their quality that they can voluntarily disclose or not to “buyers” (here, employers), have been popular subjects for analysts of game theory (the mathematical study of strategic interactions).

The idea starts with the notion that people who fail to supply available evidence about their quality look like they have something to hide. Some economists have concluded that buyers will assume non-disclosing sellers must be not merely bad, but of the lowest possible quality level.

In our context, this means employers would think that any graduate whose CV omits degree classification information has a third-class degree, and should treat them accordingly. To avoid this, it would be in the interests of any applicant who earned a 2:2 or higher to disclose it.

To see how jobseekers actually behave, we analysed the CVs of recent graduates on the job website Monster. We noticed that a substantial minority left their degree class undisclosed. Included among them, presumably, were plenty of applicants with at least a 2:2.

To work out whether these applicants were making a mistake, we also conducted a large experiment, sending more than 12,000 applications to genuine graduate job vacancies. These varied only in the jobseeker’s degree classification, and whether this was disclosed on their CV, with other details kept the same.

Success was measured by how often applications resulted in invitations for an interview or further communication. As expected, the most successful of our applications were those with a first-class degree.

However, those who said nothing about degree class were not the least successful. Instead, their success rate was in between that achieved by jobseekers disclosing 2:1s and 2:2s. Applicants who openly reported a third-class degree were the least likely to receive a response.

Put simply, then, full disclosure harmed their chances.

The third degree

Our findings challenge the neat logic of traditional economic theory. If employers always assumed the worst about missing information, hiding poor grades should not help.

Yet in practice, it seems recruiters do not have time to scrutinise every detail. Faced with hundreds of applications, they may skim CVs, focusing on standout positives or negatives. If the grade is not there, it may simply go unnoticed.

Of course, interviewers might ask about grades later in the application process, but by initially concealing this information, otherwise unattractive applicants can help themselves get to the interview stage, at which point they can use other qualities to impress.

Graduates throw caps into the sky.
Don’t mention it.
Roman Samborskyi/Shutterstock

The practical message of our research is clear. If you have strong academic credentials, highlight them proudly. But if your results are weaker, you are under no obligation to advertise them. Omitting them will not guarantee success, but it may increase your chances.

The graduate job market remains highly competitive. Yet our study suggests that lower grades do not need to define a candidate’s prospects, provided they make careful choices about self-presentation.

Strategic omissions may help level the playing field for those whose academic record does not reflect their potential. So if you have recently graduated with a third, there’s no need to panic, and no need to mention it either.

The Conversation

Tom Lane does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Putting your CV together? Complete honesty might not be the best policy – https://theconversation.com/putting-your-cv-together-complete-honesty-might-not-be-the-best-policy-263679

Curious Kids: why do we need to do homework?

Source: The Conversation – UK – By James Williams, Emeritus reader in science education and communication, University of Sussex

PeterPike/Shutterstock

Why do we need to do homework when we already spend all day in school? – Grace, aged nine, Belfast

If you’ve ever stared at your homework feeling stuck, you’re not alone. Many children say it makes them feel stressed, bored, or even anxious. Why do teachers keep giving you work to do at home when you’ve already spent hours learning at school?

The available research suggests that for secondary school students, well-designed homework can lead to about five extra months of progress in subjects like maths and English. In primary school, the impact is smaller – around three months – but still useful.

Homework helps you practise what you’ve learned, remember it better and build skills like time management and independence.

However, research shows that how you feel about homework depends on a few things. If you find your homework boring, it might be because the activity you’ve been given to do really is pretty boring. Not all homework is equal.


Curious Kids is a series by The Conversation that gives children the chance to have their questions about the world answered by experts. If you have a question you’d like an expert to answer, send it to curiouskids@theconversation.com and make sure you include the asker’s first name, age and town or city. We won’t be able to answer every question, but we’ll do our very best.


A worksheet that doesn’t connect to your lessons is not so helpful. A task that challenges you to think, create, or explore concepts and ideas is much better.

Teachers have to think hard about the tasks they set and how they explain them. If the task is explained clearly and if students get helpful feedback, the chance they will complete it is much higher. Teachers must also choose meaningful tasks that help you see homework as part of learning – not just extra work.

Girl doing homework with pen, paper and laptop
Sometimes homework really is boring.
Studio Romantic/Shutterstock

Homework that’s creative or linked to your passions is more enjoyable. Then comes the idea of success. If the task feels impossible, it’s easy to give up. Finally, does it make sense? Homework that connects to what you learned in class feels more useful.

As a science teacher, I would always try to set the homework early in the lesson rather than right at the end. Knowing what was going to be expected meant that the children better understood the task and could link it to the work being done in the lesson.

How you do homework

Your attitude toward homework isn’t just about the task – it’s also about the people around you. If you have parents or guardians who encourage you, help you plan your time, or show interest in your work, this can make homework feel more positive. That said, there is research that shows that while it’s helpful for parents to ask whether you’ve done your homework, helping you do it isn’t actually useful.

Some children also face bigger challenges. Not everyone has a quiet space to work, or someone at home who can help. This is called the “homework gap” and it can make school feel unfair.

It’s up to schools whether they set homework, and some schools are rethinking homework altogether. They are looking to make it more accessible and creative. Some schools make homework optional rather than demanding it for every subject. Schools are also looking at how they can make homework fair for everyone. This includes ideas such as homework clubs, where you can get help and work with friends.

Homework isn’t going away any time soon. But it doesn’t have to be a burden. When it’s well-designed, supported by teachers and parents, and connected to learning, it can help you grow – not just as a student, but as a thinker.

So next time you sit down with your homework, ask yourself: What can I learn from this? And if it feels too hard or pointless, speak up. Your voice matters.

The Conversation

James Williams does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Curious Kids: why do we need to do homework? – https://theconversation.com/curious-kids-why-do-we-need-to-do-homework-262992

Two seventh-century people found with west African ancestry – a story of diversity and integration in early Anglo-Saxon society

Source: The Conversation – UK – By Duncan Sayer, Professor in Archaeology, University of Lancashire

In 2022, archaeologists analysed ancient DNA from a number of early medieval cemeteries and found two individuals who stood out. One was from Updown Eastry in Kent, known as Updown girl, and the other was a young man from Worth Matravers in Dorset. Both were dated to the 7th century and both appeared to have west African heritage.

Two recent papers on these findings, along with other discoveries, highlight that English people from this time with west African heritage spanned generations and social status. The burials of these individuals also show that they were integrated into their respective communities. For example, Updown girl was buried next to her maternal relatives.

As a result, the presence of African heritage should not be a surprise. Early medieval society was much richer and more globally connected than most people believe.

Updown Eastry is a cemetery associated with the early Anglo-Saxon Kentish elite and part of a royal network. Updown girl was aged between 11 and 13 at the time of her death and was buried around the middle of the seventh century.

An analysis of her autosomal DNA (which derives from both parents) found she was 67% continental northern European and 33% west African – most closely related to modern-day Esan and Yoruba populations in Nigeria. One of her great-grandparents was 100% west African. Some of her maternal relatives were buried close by and their ancestry derived from northern Europe.

The second burial was of a young man aged between 17 and 25 at the time of his death. He was found in a grave with an unrelated adult male in a small cemetery that was in use for around 100 years, with his burial dated to between AD605 and AD650.

Analysis of the site shows that the burial population had predominantly (77%) western British and Irish ancestry. Worth Matravers contained four primary family groups mostly related along the maternal line, suggesting a degree of matrilocality (where women remain after marriage) within this community. The young man also stood out because his Y-chromosome DNA was consistent with west African ancestry (25%) coming from his grandfather.

Some modern ideas of medieval England paint it as an insular place with little or no diversity. However, England was much more connected to the rest of the world and its society was, as a result, much less homogenous than we imagine. Some early Anglo-Saxons had brown eyes and African ancestors.

Finds connecting Britain to the world

Royal burials like that at Sutton Hoo, Suffolk, and Prittlewell, Essex, contained objects from far afield, including Byzantine silver bowls and a jug from the eastern Mediterranean.

Amethysts and garnets have been found in seventh century jewellery and these stones were mined in Sri Lanka and India. Analysis of loop-like bag openings found in female graves from the fifth to seventh century revealed that these were made from African elephant ivory.

The Byzantine reconquest of north Africa in AD533 to AD534 provided new sources of sub-Saharan gold. In the west of Britain, fragments of red slip ware (distinctive Byzantine amphora vessels or pottery) have been found at sites associated with elites, like Tintagel in northern Cornwall. There is also evidence of glass beads made in early medieval England being found in contemporary Tanzania.

The newly emerging elite of seventh century England were looking east and were building new ideas about governance derived from old or far-flung places. Christianity, for instance, came from Rome, part of Byzantium.

There were also historical references to people from the African continent known to be part of society at the time. For instance, in the late seventh century, the African abbot Hadrian joined Archbishop Theodore in Canterbury. And later in the 10th century, an Old English vernacular verse from Exodus described “the African maiden on the ocean’s shore, adorned with gold”.

While we cannot rule out the possibility that the ancestors of Updown girl and the young man from Worth Matravers had been slaves, we must also be careful of interpreting the evidence through a post-colonial bias. The closer we look, the richer and more complex the connections between Britain, Byzantium and Africa are.

We do not know if these Africans were slaves, but we do know that early medieval slaves would have included western British, Frankish and Anglo-Saxon people too.

At a royal centre like Eastry in Kent, many accents might be found as well as different ways of wearing clothing. These places contained well-travelled people connected via family and marriage. DNA and isotopic studies also show that movement for marriage was common among early medieval elite women, who married into wealthy families, particularly in the east of Britain. So, we must also consider other possibilities alongside slavery, including religion, trading, travelling, marriage and seafaring.

Indeed the difference between Updown Eastry, an elite site, and Worth Matravers, a small coastal community, is critical to understanding the range of possibilities. African ancestry is found at both ends of the social spectrum and in the east and west of England.

Though England was more diverse than we think, life was not easy and, like these two examples, people died young. As well as disease, death by violence was also known – the weapons we find in early medieval graves were displayed as well as being functional objects.

DNA and cemetery evidence points to the importance of kinship and family for survival. These units provided shelter, protection, food and care. The evidence suggests that both of these African descendants were fully integrated into their respective communities, sharing family ties and even the grave.




The Conversation

Duncan Sayer does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Two seventh-century people found with west African ancestry – a story of diversity and integration in early Anglo-Saxon society – https://theconversation.com/two-seventh-century-people-found-with-west-african-ancestry-a-story-of-diversity-and-integration-in-early-anglo-saxon-society-263375

Environmental pressures need not always spark conflict – lessons from history show how crisis can be avoided

Source: The Conversation – UK – By Jay Silverstein, Senior Lecturer in the Department of Chemistry and Forensics, Nottingham Trent University

Afghan farmers plough a field while US soldiers patrol in Helmand province in 2010.
ResoluteSupportMedia / Flickr, CC BY-ND

The expectation that competition for dwindling resources drives societies towards conflict has shaped much of the discourse around climate change and warfare. As resources become increasingly vulnerable to environmental fluctuations, climate change is often framed as a trigger for violence.

In one study from 2012, German-American archaeologist Karl Butzer examined the conditions leading to the collapse of ancient states. Among the primary stressors he identified were climate anxieties and food shortages.

States that could not adapt followed a path towards failure. This included pronounced militarisation and increased internal and external warfare. Butzer’s model can be applied to collapsed societies throughout history – and to modern societies in the process of dissolution.


Wars and climate change are inextricably linked. Climate change can increase the likelihood of violent conflict by intensifying resource scarcity and displacement, while conflict itself accelerates environmental damage. This article is part of a series, War on climate, which explores the relationship between climate issues and global conflicts.


Bronze age aridification in Mesopotamia from roughly 2200BC to 2100BC, for example, is correlated with an escalation of violence there and the collapse of the Akkadian empire. Some researchers also identify drought as a major factor in recent wars in east Africa.

There is a wide consensus that climatic stress contributes to regional escalations of violence when it has an impact on food production. Yet historical evidence reveals a more complex reality. While conflict can arise from resource scarcity and competition, societal responses to environmental stress also depend on other factors – including cultural traditions, technological ingenuity and leadership decisions.

The temptation to draw a direct correlation between climate stress and war is both reductionist and misleading. Such a perspective risks surrendering human agency to a deterministic “law of nature” – a law to which humanity need not subscribe.

Catalysing transformation

In the first half of the 20th century, researchers grappled with the Malthusian dilemma: the fear that population growth would outpace the environment’s carrying capacity. This dynamic has contributed to the collapse of certain civilisations around the world.

These include the Maya and Indus Valley civilisations in Mesoamerica and south Asia respectively. The same applies to the Hittites in what is now modern-day Turkey and the Chaco Canyon culture in the US south-west.

Civilisations affected by climate stress:

A table documenting examples of civilisations affected by climate stress.
Many civilisations have been affected by climate stress in the past.
Jay Silverstein, CC BY-NC-ND

However, history is equally rich with examples of societies that have successfully averted crisis through innovation and adaptation. From the dawn of agriculture (10,000BC) onward, human ingenuity has consistently expanded the boundaries of environmental possibility. It has also intensified the means of food production.

Irrigation systems, efficient planting techniques and the selective breeding of crops and livestock enabled early agricultural societies to flourish. In Roman (8th century BC to 5th century AD) and early medieval Europe (5th to 8th centuries AD), the development of iron ploughshares revolutionised soil cultivation. And water-lifting technologies – from the Egyptian shaduf to Chinese water wheels and Persian windmills – expanded arable land and intensified production.

In the 19th century, when Europe’s population surged and natural fertiliser supplies such as guano became strained, the Haber-Bosch process revolutionised agriculture by enabling nitrogen to be extracted from the atmosphere. This allowed Europe to meet its growing demand for food and, incidentally, munitions.

Danish economist Esther Boserup’s work from 1965, The Conditions of Agricultural Growth, challenged the Malthusian orthodoxy. It demonstrated that population pressure can stimulate technological innovation. Boserup’s insights remain profoundly relevant today.

As humanity confronts an escalating environmental crisis driven by global warming, we stand at another historic inflection point. The reflexive response to climate stress – political instability and conflict – should be challenged by a renewed commitment to adaptation, cooperation and innovation.

The measuring shaft of a nilometer.
A nilometer, which was used to gauge the optimal time to open agricultural canals in ancient Egypt.
Baldiri / Wikimedia Commons, CC BY-NC-SA

Dwindling military superiority

There are many examples of societies successfully overcoming environmental threats. But our history is also full of failed civilisations that more often than not suffered ecological catastrophe.

In many cases, dwindling resources and the lure of wealth in neighbouring societies contributed to invasion and military confrontation. Droughts have been implicated in militaristic migration in central Asia, such as the westward movement of the Huns and the southward push of the Aryans.

Asymmetries in military power can encourage or deter conflict. They offer opportunities for reward or impose strategic constraints. And while military superiority has largely shielded the wealthiest nations in the modern era, this protection may erode in the foreseeable future.

Natural disasters that erode security infrastructure are becoming increasingly frequent and severe. In 2018, for example, two hurricanes caused a combined US$8.3 billion (£6.2 billion) in damage to two military facilities in the US. There has also been a proliferation of inexpensive military technologies like drones.

Both of these developments could create new opportunities to challenge dominant powers. Under such conditions, increases in military conflict should be expected in the coming decades.

In my view, dramatic action must be taken to avoid a spiral of conflict. Ideals, knowledge and data should be translated into political and economic will. This will require coordinated efforts by every nation.

The growth of organisations such as the Center for Climate and Security, a US-based research institute focused on systemic climate and ecological security risks, signals movement in the right direction. Yet such organisations face a steep climb in their efforts to translate geopolitical climate issues into meaningful political action.

One of the main barriers is the rise of anti-intellectualism and populist politics. Often aligned with unregulated capitalism, this can undermine the very strategies needed to address the unfolding crisis.

If we are to avoid human tragedy, we will need to transform our worldview. This requires educating those unaware of the causes and consequences of global warming. It also means holding accountable those whose greed and lust for power have made them adversaries of life on Earth.

History tells us that environmental stress need not lead to war. It can instead catalyse transformation. The path forward lies not in fatalism, but in harnessing the full spectrum of human creativity and resilience.

The Conversation

Jay Silverstein does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Environmental pressures need not always spark conflict – lessons from history show how crisis can be avoided – https://theconversation.com/environmental-pressures-need-not-always-spark-conflict-lessons-from-history-show-how-crisis-can-be-avoided-262300

International law isn’t dead. But the impunity seen in Gaza urgently needs to be addressed

Source: The Conversation – UK – By James Sweeney, Professor, Lancaster Law School, Lancaster University

Philippe Lazzarini, the commissioner general of the United Nations Relief and Works Agency for Palestine Refugees (Unrwa), says that Gaza is “becoming the graveyard of international humanitarian law”.

International humanitarian law (IHL) regulates the conduct of armed conflict, which is the legal expression for war. It covers everything from what is a lawful target, to the treatment of prisoners and injured people, and even to the testing of new weapons. The main rules of IHL can be found in the Geneva Conventions of 1949.

Lazzarini, though, has gone so far as to say that we “have made the Geneva convention[s] almost irrelevant. What is happening and being accepted today in Gaza is not something that can be isolated; it will become the new norm for all future conflicts”.

There can be no doubt that the situation in Gaza is dire. There is plausible evidence of the Israeli military carrying out war crimes there in its military operation triggered by, and commenced soon after, the devastating attack by Hamas against Israel on October 7 2023. The Hamas attack itself involved the commission of war crimes – as did its taking of hostages and the subsequent treatment of them in captivity. But to say that all these atrocities render the law irrelevant is premature.

There are several reasons for this. One is that there is a difference between the existence of an important rule and its enforcement. Even where a rule is not being enforced, international law gives us a precise language to articulate exactly what is wrong with the situation. I recently wrote that what appears to have been a deliberate “double tap” attack against Nasser hospital in Khan Younis, southern Gaza, on August 25, violated IHL and can be seen as a war crime.

I have also written that other Israeli operations in Gaza amount to a crime against humanity, as they are part of a widespread or systematic attack against a civilian population. I, and others, have seriously contemplated the idea that a genocide is under way.

These legal expressions are important, and to accuse anyone of perpetrating the crimes that they embody has very serious political consequences. That is why, however implausibly, states like Israel and Russia have tried to maintain that they are totally compliant with international law.

Enforcement and impunity

Returning to the issue of enforcement, it is important to recognise that there are in fact several legal and political forums that provide an opportunity for it. These include the organs of the UN, including the International Court of Justice. There are also International Criminal Court arrest warrants for key leaders in respect of the events in both Gaza and Ukraine.

States that have signed up to the International Criminal Court are under an obligation to arrest people who are wanted by it. Yet several opportunities to arrest Vladimir Putin have been spurned, such as when he visited Mongolia recently. Likewise, Hungary failed to arrest the Israeli prime minister, Benjamin Netanyahu, when he visited earlier this year (Hungary has since denounced the court).

It’s debatable whether either of them will ever face trial. But the arrest warrants have already had political consequences. Putin was unable to attend the Brics summit in South Africa in 2023 because that country recognises the ICC. There were mixed reactions internationally to the news of the warrant against Netanyahu, with some states affirming their support or at least their intention to comply with the warrants if necessary.

But history tells us that leaders who once seemed untouchable have eventually faced justice in one form or another.

Did the surviving leading Nazis ever expect to go on trial at a hastily convened military tribunal in Nuremberg? Did Augusto Pinochet expect that he would die under house arrest in his native Chile, facing trial for his actions during and since the military coup of 1973? Or that Saddam Hussein would face the death penalty and be hanged for his crimes in Iraq? Or that Libya’s Muammar Gaddafi would be ousted, abused, and then killed by a militia? Probably not.

A fair trial at the ICC would be preferable to most of those examples.

Justice for violations of international humanitarian law clearly needs to be seen to be done – if we don’t want Lazzarini’s catastrophic prediction to become a reality.

The Conversation

James Sweeney does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. International law isn’t dead. But the impunity seen in Gaza urgently needs to be addressed – https://theconversation.com/international-law-isnt-dead-but-the-impunity-seen-in-gaza-urgently-needs-to-be-addressed-264520

Mass hysteria at Heathrow airport – how social contagion works

Source: The Conversation – UK – By Kit Yates, Professor of Mathematical Biology and Public Engagement, University of Bath

Heathrow’s Terminal 4 was evacuated on September 8 as fire crews were called in to investigate “possible hazardous materials” at the London airport. After a few hours of halted flights and frustrating inconvenience, emergency services declared that no “adverse substance” had been found anywhere in the airport.

People were allowed back into the terminal, and normal service was resumed. In the meantime, however, 21 people were treated at the scene by the London Ambulance Service. So what really happened at Heathrow?

According to the Metropolitan Police, it was probably “mass hysteria”. Such outbreaks – variously called mass psychogenic disorder, mass sociogenic illness, epidemic hysteria or mass hysteria – are all types of social contagion. They are typically characterised by the rapid spread, between members of a social group, of symptoms that have no apparent known cause and for which no physical infectious agent can be identified. The symptoms are real, but the trigger is psychological.

History is full of examples. In 1962, a textile factory in the US city of Spartanburg, South Carolina, shut down after dozens of workers reported rashes, numbness, nausea and fainting. Investigators suspected an insect in a shipment of cloth, but no evidence of such a cause was ever found.

Sociologists later concluded that, while an insect bite may have triggered the first case, the rest were probably psychogenic (something that originates from psychological factors rather than a physical cause). Clusters of illness followed social ties, and the main predictors were background anxiety and stress – classic conditions for hysterical contagion.

Mass psychogenic effects have been recorded even further back in time. The infamous “dancing plague” of 1518 in Strasbourg began with a single woman dancing without pause. Within weeks, hundreds of others had joined her.

In a misguided attempt to help the victims “dance away their mania”, officials in the town hired musicians and erected an enormous stage for the merrymakers to help them burn off their energy. Unsurprisingly, this only attracted more people to the fray. At its height, 15 people a day were reported to be dropping dead until the dancing abruptly stopped.

Positive feedback loop

In their early stages, infectious diseases typically spread according to a mathematical mechanism known as a positive feedback loop. These are characterised by a signal that triggers a response – or series of responses – which ultimately ends up amplifying the original signal.

In an epidemic, infected individuals can come into contact and infect susceptible people, creating more infectious individuals who have the power to infect more people, and so on.

Something similar happens in the spread of social epidemics – only in these cases, the illness is spread by the infectious power of emotion, rather than something physical. The same mathematics that we use to describe the explosive onset of an infectious disease can be used to describe the viral outbreak of an idea.
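As an illustrative sketch (my own, not from the article), this feedback loop can be written as a minimal discrete-time SIR model; the parameter values below are assumed round numbers, not figures from any real outbreak.

```python
# Minimal discrete-time SIR sketch of a positive feedback loop:
# each infectious person generates new infections in proportion to
# how many infectious people there already are.

def sir_steps(population=10_000, infected=1, beta=0.4, gamma=0.1, days=30):
    """Return the daily count of infectious individuals."""
    s, i = population - infected, infected
    history = [i]
    for _ in range(days):
        new_infections = beta * i * s / population  # the signal amplifies itself
        recoveries = gamma * i
        s -= new_infections
        i += new_infections - recoveries
        history.append(i)
    return history

curve = sir_steps()
```

While nearly everyone is still susceptible, each day’s infectious count is multiplied by roughly 1 + beta − gamma, which is the explosive early growth the article describes – and the same shape applies whether the “infection” is a pathogen or an idea.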

Just because an illness is spread by an idea or emotion, rather than a virus or bacterium, it doesn’t make that illness any less real for the communities or people affected. Scientists have suggested that a hugely diverse range of social phenomena – from generosity to violence and from kindness to unemployment – may be socially contagious.

Some scientists have even come full circle by suggesting that conditions like obesity, which is typically considered a non-communicable disorder, may have a strong social component that allows them to spread like a contagious disease. Whether teen pregnancy, for example, is genuinely socially contagious, as some scientists claim, is still hotly debated.

What is clear is that positive feedback loops can amplify an initially small quantity to unexpected magnitudes. For this reason, the impact of positive feedback is sometimes referred to as the snowball effect. A small amount of snow that begins rolling down a hillside picks up more snow as it rolls and increases in size. The bigger it gets, the more snow it picks up, until the initially small snowball has gathered both size and pace.

It seems that social contagion, mediated by a positive feedback loop, may have been the cause of the disruption at Heathrow airport. Of the 21 people assessed by ambulance staff, all but one were discharged at the scene. The Metropolitan Police even used the positive feedback loop terminology, suggesting the incident may have started with a single person falling ill and then “snowballed” from there.

The situation at Heathrow was quickly resolved, but when ideas spread like diseases, they’re much harder to stop than actual germs. Underestimating an idea’s potency, its longevity and its ability to enthral can lead us to misjudge or misunderstand how a situation will unfold.

One only has to look at the pervasive spread of disinformation throughout the COVID pandemic to see the damage that dangerously incorrect ideas – overstating the potential harms of effective vaccines, underplaying the risks of contracting COVID and falsely claiming the effectiveness of unproven treatments – can do.

The viral spread of such falsehoods through social media means they can reach far and wide in virtually no time – and are, consequently, extremely difficult to counter. We underestimate the snowballing of these pervasive myths at our peril.

The Conversation

Kit Yates does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Mass hysteria at Heathrow airport – how social contagion works – https://theconversation.com/mass-hysteria-at-heathrow-airport-how-social-contagion-works-264900

Why painting your home white could help you survive a heatwave

Source: The Conversation – UK – By Rosa Schiano-Phan, Reader in Architecture and Environmental Design, University of Westminster

At a seminar on building cooling strategies in the late 1990s, I vividly remember hearing that “in 30 years time, the climate of London will feel like Marseille’s today”. That warning stuck with me. Back then, it sounded both alarming and oddly appealing.

Three decades on, it no longer feels theoretical. As a Londoner of Mediterranean origin, I’ve lived through the shift. When I co-wrote The Architecture of Natural Cooling, I drew not only on professional expertise but also on childhood memories of white walls, shady courtyards and shuttered windows. These ancient techniques – once suited to the Mediterranean – now hold lessons for modern Britain, where heatwaves are becoming the new normal.

One of the simplest and most effective ways to cool a building is to change its colour. White surfaces reflect sunlight rather than absorb it, and studies show that painting roofs white, or adding some other type of reflective coating, can reduce internal temperatures by more than 1°C and sometimes by more than 4°C. They can even lower the surrounding outdoor temperatures by up to 2°C.

That might not sound like much, but across a whole city it can make a real difference, helping to counter the urban heat island effect (where human-made surfaces absorb heat and mean a city is hotter than surrounding countryside) and keep homes more comfortable during the hottest hours of the day.
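A rough back-of-envelope sketch (my own toy calculation, not one from the article) shows why reflectivity matters so much. In a simplified “sol-air” energy balance, the surface’s temperature rise above the air is the absorbed solar radiation divided by a combined heat-transfer coefficient; the insolation and coefficient values below are assumed round numbers.

```python
# Toy steady-state surface energy balance: absorbed solar heat is shed
# to the air at a rate proportional to the surface-air temperature gap:
#   delta_T = (1 - albedo) * insolation / h_combined

INSOLATION = 800.0   # W/m^2, assumed clear midday summer sun
H_COMBINED = 25.0    # W/(m^2*K), assumed convective + radiative coefficient

def surface_excess_temp(albedo):
    """Surface temperature rise above air temperature, in kelvin."""
    return (1.0 - albedo) * INSOLATION / H_COMBINED

dark_roof = surface_excess_temp(albedo=0.10)   # dark surface absorbs 90%
white_roof = surface_excess_temp(albedo=0.70)  # white surface absorbs 30%
```

Under these assumptions, the dark roof surface sits roughly 29°C above the air and the white one roughly 10°C above it – a surface difference of about 19°C, only a fraction of which reaches the rooms below, which is broadly consistent with the 1–4°C internal reductions reported in the studies.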

Seaside scene with white buildings
White roofs and thick walls in Tunisia.
BTWImages / shutterstock

The success of these strategies, however, comes with a caveat. The more low-energy “passive” strategies – shutters, white buildings, ventilation and so on – we adopt in combination, the more likely they are to work effectively. A white roof, for example, is more effective if windows stay shut during the hottest hours, with shutters or external shades to keep the sun out.

If you close the windows, you will be better off with heavyweight walls and floors because the materials store coolness from the night air and release it through the day. That’s one reason Mediterranean homes often stay comfortable for longer even in extreme heat.

Night-time ventilation also plays a key role – at least if the air outside actually cools down after dark. In cities like London or Manchester, with a strong urban heat island effect, reflective roofs and avoiding the waste heat generated by air-conditioning units become even more crucial.

What about winter?

Some people may worry that a white roof might make their home colder in winter. But this is a very marginal problem, especially if the roof is well-insulated. How much you’ll need to heat your home is driven by the ability of your home’s outer shell to retain the heat that is already inside, rather than its ability to prevent heat coming from outside.

Rooftop view in winter
In winter, retaining heat is more important than absorbing sunlight.
Multishooter / shutterstock

In northern climates, winter sunlight is weak and often blocked by clouds. If, in a cold climate with sunny skies, you want to harness solar energy for warmth, it’s more effective to let sunlight in through double glazed windows than to rely on darker building materials.

A practical upgrade

Repainting your house white is not excessively expensive, at least compared to the big overall costs involved in heating and maintaining a home. Many homeowners, especially in suburban residential areas in the UK, already choose white finishes when refurbishing.

On flat or low-pitched roofs, reflective coatings can be applied at relatively low cost. For steeply pitched roofs, paint is not a practical option, as it would soon wear away and look terrible, requiring regular repainting. Tile roofs also need to “breathe” and let moisture out – paint could block this process, leading to damp problems. The best option is to replace dark shingles or slate tiles with more reflective clay tiles that reduce the roof’s surface temperature. This is a more time consuming and expensive option, with costs in the UK starting from about £125 per square metre of roof.

The climate is changing and there’s no getting away from it. Yet sometimes the best solutions aren’t hi-tech or expensive. A coat of white paint, combined with a few other simple design strategies, could help keep Britain’s homes cooler, cheaper to run and better prepared for the climate changes and high energy prices expected in the decades ahead.


Don’t have time to read about climate change as much as you’d like?

Get a weekly roundup in your inbox instead. Every Wednesday, The Conversation’s environment editor writes Imagine, a short email that goes a little deeper into just one climate issue. Join the 45,000+ readers who’ve subscribed so far.

The Conversation

Rosa Schiano-Phan does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Why painting your home white could help you survive a heatwave – https://theconversation.com/why-painting-your-home-white-could-help-you-survive-a-heatwave-264634

As pine martens are reintroduced to south-west England, a new study shows why local people need to be involved

Source: The Conversation – UK – By Roger Auster, Lecturer in Environmental Social Science, Centre for Resilience in Environment, Water and Waste, University of Exeter

Fifteen pine martens have been reintroduced to the south west of England as part of the Two Moors project. Terry Whittaker 2020Vision, CC BY-NC-ND

Fifteen pine martens were relocated from Scotland to Dartmoor, Devon, late last year in the first phase of a reintroduction to south-west England. This autumn, more of these domestic cat-sized mammals will be released into Exmoor as part of a long-term recovery strategy to restore pine marten populations.

Pine martens live primarily in woodland habitats, feeding on fruits, small mammals and birds. They were once found throughout Britain, until habitat loss from woodland clearance and increased predator control led to population collapse. It is thought pine martens lived in south-west England until the late 19th century.

In 2023, before plans for this release had been agreed, my colleague Kirsty Frith and I were commissioned by the Two Moors Pine Marten Project – a partnership of seven organisations, including the county’s environmental charity Devon Wildlife Trust and Dartmoor National Park Authority – to independently capture perspectives of local people and interest groups on the proposals. This “social feasibility” assessment used an approach similar to one used previously for a release in Wales to determine how a pine marten reintroduction would be received in this area.

Our new study, published in Human Dimensions of Wildlife, outlines how we used a technique called Q-methodology. This method identifies shared perspectives and enables a rich understanding of subjectivity.

For participants, this involves a sorting exercise with discussion, placing written statements into a configuration to illustrate their levels of agreement with each. Once completed, their sorting arrangements are statistically compared and interpreted to identify perspectives which participants associate with.
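A highly simplified sketch of the statistical step (synthetic data and my own simplification – the real study used formal Q-methodology analysis with factor rotation): each participant’s completed sort is treated as a vector of ranks, the vectors are correlated person by person, and clusters of high correlation indicate a shared perspective.

```python
import numpy as np

# Each row is one participant's Q-sort: the rank they gave each of 8
# statements (synthetic data; group A broadly pro-reintroduction,
# group B broadly opposed, so B's ranks roughly mirror A's).
sorts = np.array([
    [ 3,  2,  1,  0, -1, -1, -2, -2],   # participant A1
    [ 3,  1,  2,  0, -1, -2, -1, -2],   # participant A2
    [ 2,  3,  1, -1,  0, -2, -2, -1],   # participant A3
    [-2, -2, -1,  0,  1,  1,  3,  2],   # participant B1
    [-2, -1, -2,  0,  1,  2,  2,  3],   # participant B2
])

# Person-by-person correlation matrix: high values mean two people
# arranged the statements in a similar pattern.
corr = np.corrcoef(sorts)

within_group = corr[0, 1]   # two pro-reintroduction participants
between_group = corr[0, 3]  # a pro-reintroduction vs. an opposed participant
```

In this toy example the two A participants correlate strongly and positively, while an A and a B participant correlate negatively – the kind of pattern that, after factor analysis, resolves into the distinct perspectives the study reports.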

small brown pine marten climbs out of enclosure onto ground
A remote camera trap captures the moment that a pine marten takes its first step into the Devon countryside.
Devon Wildlife Trust, CC BY-NC-ND

Pining for martens?

Three main perspectives were identified. The anonymised participants included farmers, land managers, shooting representatives, conservationists and local residents.

Two of these perspectives supported pine martens and their reintroduction. Although similar, they exhibited some differences. The first viewpoint was more favourable to pine martens and reintroduction as a point of principle, with fewer reservations about introducing wild animals into the countryside. As one environmental farm advisor commented, “living around more nature and wildlife is a good thing”.

Although the second viewpoint still agreed strongly with reintroduction in this region, emphasis was on the motivation to restore the native population of pine martens and natural habitats. Some people expressed concerns about whether there might be negative effects on threatened native wildlife, for example, bats or dormice.

Participants wanted further evidence about the effects pine martens would have on habitats and more information about future plans for monitoring them and dealing with any issues. One participant, an environmental professional and public official, held this viewpoint and agreed with the reintroduction of pine martens “if it is done well and it is well planned”.

The third perspective was opposed to pine martens and their reintroduction. These participants were worried about introducing a predator like pine martens because they perceived them to be a threat to native wildlife, poultry and gamebirds.

They were also concerned about the availability of management support if there were negative effects from the reintroduction of pine martens. As one gamekeeper and conservationist viewed it, “they would add to the taking of wildlife when we have already lost more than 50%”.

What next?

Our new paper and previous research highlight two key challenges for any pine marten reintroduction project. Addressing them can improve our ability to coexist with pine martens.

close up face of brown pine marten
Pine martens are acrobatic hunters and people’s perceptions of them vary drastically.
Terry Whittaker 2020Vision, CC BY-NC-ND

People can have very different, polarised views. To minimise any conflict, reintroduction projects need to support inclusive dialogue around pine martens and how they can be monitored and managed. Unanimous support may be unlikely, but more collaborative relationships can be developed when people are involved in making plans for reintroduction.

It also really matters that people have contrasting understandings of predation. While supporters of reintroduction believed pine martens would contribute towards a functioning ecosystem, people who were less supportive were concerned that pine martens could kill threatened wildlife. Giving space for sensitive, nuanced conversations helps build trust and mutual understanding.

Our findings highlight the importance of assessing social feasibility before wildlife reintroductions take place. For future success, it is just as crucial as ecological feasibility.



The Conversation

This research was commissioned by the Two Moors Pine Marten Project partnership. At the time of the research project, this included: Devon Wildlife Trust, National Trust, Woodland Trust, Exmoor National Park Authority, and Dartmoor National Park Authority.
Additional content contributed by Kirsty Frith.

ref. As pine martens are reintroduced to south-west England, a new study shows why local people need to be involved – https://theconversation.com/as-pine-martens-are-reintroduced-to-south-west-england-a-new-study-shows-why-local-people-need-to-be-involved-240559

How our minds trick us into thinking we are being greener than we really are

Source: The Conversation – UK – By John Everett Marsh, Reader in Cognitive Psychology, University of Lancashire

non c/Shutterstock

You’re in the supermarket. Imported beef mince, shrink-wrapped vegetables and cleaning spray are already in your basket. Then you toss in some organic apples and feel a flicker of moral relief. Surely that small green gesture lightens the load?

Not quite. Objectively, every extra product increases your carbon footprint. But psychology research reveals a curious illusion: when we add eco-friendly items, we often judge our shopping basket as having a smaller carbon footprint than before.

This mental glitch is called the negative footprint illusion, and it matters for how we shop, how businesses market themselves and how governments design climate policies.

The illusion has been demonstrated across dozens of studies. In a typical experiment, people are asked to estimate the carbon footprint of 150 standard houses. Then they estimate the footprint of those same houses plus 50 eco-houses. Mathematically, the second total must be higher – there are simply more houses. Yet participants often judge the mixed set as lower.

In other words, adding a “good” item doesn’t just seem to cancel out a “bad” one. It creates a false impression that the total footprint has gone down, when in reality it has gone up. And the more “green” items you add, the stronger the illusion becomes.

What’s striking is how stubborn this bias is. It occurs among people with strong environmental values, people with scientific training and even among experts in energy systems. Education and numeracy don’t protect us. This isn’t a problem of knowledge, but of how the mind simplifies complex judgements.

Why does it happen?

The main culprit is averaging. Instead of adding up the total impact, we unconsciously average the mix. Toss in a few low-impact items and the “average impression” improves, even though the overall footprint goes up.
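The averaging account can be made concrete with a toy calculation (the per-item footprint figures are illustrative assumptions, not data from the studies): adding a low-impact item raises the total but lowers the average, and the average appears to be what our intuition tracks.

```python
# Illustrative per-item carbon footprints, in kg CO2e (assumed values).
basket = [27.0, 6.0, 4.5]          # beef mince, wrapped veg, cleaning spray
greener_basket = basket + [0.4]    # ...plus the organic apples

total_before = sum(basket)                       # 37.5
total_after = sum(greener_basket)                # 37.9 -- objectively higher
avg_before = total_before / len(basket)          # 12.5
avg_after = total_after / len(greener_basket)    # ~9.5 -- "feels" lower
```

The mismatch between the two bottom lines is the illusion in miniature: the basket’s true footprint went up, but its average impression went down.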

Our memory also plays tricks. If a sequence ends with an eco-friendly item, that last impression weighs heavily and colours the whole set. Likewise, when items are arranged irregularly, we find it harder to keep track of how many there are, so we default to averages rather than totals.

Psychologists have long shown that even when people are told about a bias, they often fall right back into it. Our latest experiments suggest the same applies to the so-called negative footprint illusion. That suggests it isn’t just sloppy reasoning but a deeper mental tendency: the mind simplifies.

The illusion may seem harmless in a lab, but it has real-world consequences when it comes to shopping, for example.

Businesses have also learned, consciously or not, to exploit this bias. A fast-food chain might showcase paper straws while still promoting beef-heavy menus. A hotel might advertise its towel-reuse policy while quietly expanding its energy-hungry facilities. These green cues create a halo that spills over to the whole brand.

Woman Choosing Bamboo Eco Friendly Biodegradable Toothbrush in Zero Waste Shop
One green product doesn’t cancel out the others.
dmitriylo/Shutterstock, CC BY-SA

Even well-intentioned policy nudges can misfire. Offering more green-labelled choices is often assumed to drive better behaviour. But if those choices mask the real cost of consumption, they may backfire – encouraging people to consume more under the false impression of virtue.

Can it be fixed?

The good news is the illusion can be reduced. One promising approach is “summative priming”: nudging people to think in totals rather than averages. In experiments, participants who first completed simple “totalling” tasks were later more accurate in judging carbon footprints.

Research shows that when eco-friendly items appear at the end of a list, they distort overall impressions more strongly. Placing them earlier makes the illusion weaker. Likewise, when items are arranged in a regular, predictable structure, people find it easier to keep track of totals and are less prone to averaging errors.

These tweaks won’t eliminate cognitive bias entirely, but they show that design matters. Product labels, online platforms and policy communications can all be shaped to help people think in terms of totals rather than averages.

Climate change is driven by millions of everyday decisions: what we buy, what we eat, what we throw away. Understanding the psychological biases behind those decisions is essential.

The negative footprint illusion reminds us that even well-intentioned, environmentally conscious people can misjudge the true impact of their actions. Simply offering more green options isn’t enough. If those options distort our perceptions, they may slow genuine progress.

The challenge, then, is not only to provide information – carbon scores, eco-labels, green badges – but to present it in ways that match how people actually think. That means designing interventions that highlight totals, not averages, and that help consumers see the cumulative impact of their choices.

Climate change is a global problem, but it is fuelled by small misjudgments at the individual level. By recognising how our minds work, we can design smarter tools, better policies and more honest messages – and nudge ourselves towards the sustainable future we urgently need.



The Conversation

John Everett Marsh is a Reader in Cognitive Psychology at the University of Lancashire in the UK. He is also a Visiting Associate Professor at the Luleå University of Technology in Sweden and Bond University in Australia. He receives funding from Riksbankens Jubileumsfond.

Patrik Sörqvist receives funding from Stiftelsen Riksbankens Jubileumsfond and The Swedish Energy Agency.

ref. How our minds trick us into thinking we are being greener than we really are – https://theconversation.com/how-our-minds-trick-us-into-thinking-we-are-being-greener-than-we-really-are-263959