Every product we touch has a footprint. A phone, a fridge, a hospital syringe. Each begins and ends in the same place: the planet’s resources.
The EU’s recent Ecodesign for Sustainable Products Regulation aims to break the cycle of take, make, waste by forcing manufacturers to think circularly. Products will need to last longer, be easier to repair and feed back into the economy instead of ending up in landfill.
It represents a major shift for most industries. But for healthcare, where safety and sterility come first, it could be revolutionary.
In the UK alone, the NHS generates approximately 156,000 tonnes of waste each year from hospitals and specialist clinics, equivalent to more than 5,700 40ft containers. Up to 90% of such waste comes from single-use disposable products or components.
Although medical products are included under the ecodesign regulation, the rules will only apply where patient health and safety are not compromised. Products that pose a risk to patients, such as those where infection, contamination, or reduced effectiveness could occur, may be exempt.
But are considerations for human health and environmental protection really at odds with one another? Or can we expand the principle of “do no harm” to include the planet itself?
In the US, climate commitments are being rolled back. The Trump administration’s withdrawal from the Paris climate agreement has slowed progress toward a more sustainable medtech industry. Implementation of new emissions standards has also been delayed, including rules to reduce ethylene oxide, a cancer-causing chemical used to sterilise surgical kits and medical devices.
These setbacks stall innovation in cleaner, safer alternatives such as CO₂ and UV light sterilisation. This matters because reusing devices, when safely sterilised, could dramatically reduce waste and resource use.
Fortunately, many sustainability gains in medtech are already within reach. By examining the full lifecycle of devices, from production to disposal, it is possible to identify where the biggest improvements can be made.
Green public procurement policies can immediately encourage healthcare providers to make more sustainable purchasing choices. Smarter research and development decisions can improve repairability, reduce material use and waste, and simplify components for easier assembly and disassembly.
Standardisation also enables interchangeable parts across devices, as seen with consumer products’ universal power supplies. This approach extends product lifespans and allows parts to be recovered and reprocessed for use in future devices, provided they meet the necessary medical standards.
Using consistent materials across devices also ensures they are directed into the correct waste, recycling, or reuse streams rather than ending up in landfill. Even sterile packaging can be reimagined to minimise volume, avoid mixed materials, and favour fully recyclable mono-materials.
Some of the world’s leading medtech companies are already proving what is possible. Medtronic is aiming for net-zero emissions by 2030 through designing smaller, longer-lasting products, investing in new materials and enforcing responsible sourcing across its supply chain.
Johnson & Johnson is cutting waste by recycling and using closed-loop systems to recapture valuable materials from single-use devices. The company also measures and publicly shares the environmental footprint of its products.
Abbott, a global healthcare and medical devices company, has committed to a 90% reduction in waste across its product lifecycles, with a particular focus on minimising the environmental impact of packaging.
The path to a sustainable medtech industry is not without challenges, but it is achievable. As regulations advance, companies innovate and healthcare professionals push for change, the sector has an opportunity to redefine what innovation really means. It is no longer just about safer, more efficient care – it is about care that protects the planet too.
With the medtech industry valued at US$587 billion (£459 billion) in the US alone, and 8% of that directed toward research and development, the potential for transformation is enormous. Imagine the progress if even a fraction of that funding were channelled into responsible innovation – empowering every stakeholder through education, engagement and sustainable action.
By aligning environmental responsibility with patient safety, and investing in circular design, smarter procurement, connected infrastructures and genuine collaboration, medtech can show that health and sustainability are not competing priorities. They are, in fact, inseparable.
Muireann McMahon does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
Source: The Conversation – UK – By Lynda Yorke, Associate Professor (Senior Lecturer) in Critical Physical Geography, Bangor University
Nestled between the Carneddau mountains and the Afon (River) Conwy, the small village of Dolgarrog in north Wales looks peaceful. But the huge hydro-electric pipes that run down the hillside are a constant reminder of the village’s history, and of how the same source of power that once brought prosperity also unleashed disaster.
On November 2 1925, the dam at Llyn Eigiau burst. A torrent of water and boulders thundered down the valley, sweeping through the northern part of Dolgarrog and destroying the small settlement of Porth Llŵyd. Sixteen people were killed.
One hundred years later, Dolgarrog’s story is not just one of tragedy. The village has become what its residents call a living memorial. It’s a place where disaster is not only remembered, but woven into the landscape, the law and the community’s sense of itself.
At 8pm that night, the inhabitants of Dolgarrog felt the force of a catastrophic sequence of engineering failures in the mountains above.
Two reservoirs, Llyn Eigiau and lower Coedty, supplied electricity to the local aluminium works, an industry that sustained the village. But the upper dam at Eigiau had been built on a foundation of glacial clay and boulders. After a dry summer, the clay had cracked. When autumn rains came, water seeped through. The dam wall gave way, unleashing a surge down the afon Porth Llŵyd.
This flood rapidly reached the lower Coedty dam, overwhelming its embankment. As the second dam failed, the water rushed like a massive tsunami wave down the steep gorge of afon Porth Llŵyd. Ripping out the hydro-electric pipeline, it created a deadly flow of water, debris and boulders that destroyed homes, and swept villagers into the afon Conwy.
Newsreel footage depicting the aftermath of the Dolgarrog dam disaster.
From local tragedy to national protection
The Dolgarrog disaster was not the first dam failure in the UK, but it was the one that forced government action. Public outrage over the deaths of 16 villagers led directly to the Reservoirs (Safety Provisions) Act 1930, the first law in the UK to regulate dam safety.
For the first time, large reservoirs had to be inspected and supervised by qualified, independent engineers. This ended the era when private companies could self-regulate. It marked a major shift in how the UK governed risk and infrastructure.
That legislation, updated in 1975, created an invisible, yet mandatory, safety structure that continues to protect people today.
If the law is an unseen memorial, the land around Dolgarrog is a visible one. The remnants of the Llyn Eigiau dam wall still stand, a stark reminder of the engineering flaws that caused the disaster.
Downstream toward the Coedty dam, the torn-up peat moorland is barely visible. But the afon Porth Llŵyd gorge still shows the impact of the powerful flood, constrained by its bedrock walls. As the flood waters thundered down the gorge, they shattered, split and tore at the bedrock walls, ripping huge boulders from their rest.
The boulders dumped at the gorge’s outlet formed a huge fan of rock debris still visible at the roadside – a chilling, preserved record of the suffering.
That landscape tells a story, not just of destruction but of recovery. The village’s memorial walk, created in 2004 around the boulder field, traces the path of the flood and symbolises the community’s ability to reclaim the space. It is both a site of reflection and an everyday walking route. This is cultural resilience and proof that remembrance and daily life coexist.
Disasters are not just events of the past: they shape how we individually and collectively experience places, politics and society. Dolgarrog’s residents are marking the centenary with a programme of events under the banner “Dolgarrog Past, Present and Future”. These include commissioned art, musical performances, history projects and a lantern parade – acts of remembrance that also look forward.
Lessons for today
The lessons of Dolgarrog are as urgent now as they were a century ago. In an age of climate change, when extreme rainfall and flood risks are rising, the need for strong safety standards and accountable infrastructure has never been greater.
The 1925 disaster shows why state oversight of private infrastructure is vital when public lives depend on it. It also offers a model of resilience, one that is legislative as well as communal.
A hundred years on, the memory of the 16 villagers who died is not only preserved in stone and ceremony, but in the law itself, and in the ongoing safety of every major reservoir across the UK. Dolgarrog remains a living memorial to both the dangers of neglect and the power of collective renewal.
Lynda Yorke receives funding from NERC, British Council and Learned Society of Wales.
Giuseppe Forino has received funding from NERC, British Council and Learned Society of Wales.
Every few years, a familiar anxiety resurfaces in British public discourse: that sharia law is establishing a parallel legal system and threatening the sovereignty of English law. Those fears were reignited following Donald Trump’s recent speech to the UN, where he claimed that London wants “to go to sharia law”.
Such claims ignore two realities. First, that the English legal system is adaptive and capable of accommodating diversity. And second, that having multiple legal systems is – far from undermining British law – an inevitable legacy of Britain’s colonial history. Looking to that history, it should be no surprise that it is a feature of modern, multicultural Britain.
My research shows how British colonial administrators deliberately designed plural legal systems to sustain imperial rule. The colonial state recognised that it could not rule diverse populations by imposing English law on multicultural societies.
In northern Nigeria, this approach became a defining feature of colonial governance. English law operated alongside Islamic courts, which handled family disputes and aspects of land tenure. Allowing limited autonomy for Africans under sharia was both a pragmatic and political strategy. It maintained local legitimacy while ensuring that English law remained supreme in cases of conflict.
A similar arrangement existed in British India. This legacy continues to shape how law functions in postcolonial, multicultural Britain today.
How sharia operates in Britain today
There is no separate sharia legal system in the UK. What exist are sharia councils and the Muslim Arbitration Tribunal. The sharia councils have no statutory authority under English law. They may be used to resolve personal disputes such as marriage, divorce and inheritance.
The Muslim Arbitration Tribunal, in existence since the early 2000s, operates under the Arbitration Act 1996. This law allows private arbitration between consenting adults in civil disputes. But such tribunals must operate within the boundaries of English law.
Sharia councils have a slightly longer history, dating back to the 1980s. Their number and activities are difficult to track: in 2009, the rightwing thinktank Civitas estimated there were at least 85, while a 2012 study by a researcher at the University of Reading identified 30.
No comprehensive survey has been conducted since, leaving the exact number uncertain. This lack of official oversight fuels the perception that the councils pose a challenge to Britain’s legal sovereignty.
But, as a 2018 Home Office review confirmed, sharia councils hold no legal jurisdiction in England and Wales.
The review did acknowledge concerns raised by women’s rights groups about gender inequality and lack of representation of women in some councils. It concluded that these issues called for better regulation and oversight, and that the “state would be justified in intervening” in bad practices by sharia councils that disadvantage women.
It also found that public fears are fuelled by misleading terms, used in both the media and sometimes by councils themselves. For example, referring to the councils as “courts” and their members as “judges” reinforces misconceptions about the existence of a parallel legal system.
Multifaith Britain and the law
English law is capable of accommodating and regulating diverse legal practices without losing its sovereignty. Besides sharia councils, other faith-based arbitration bodies exist in Britain.
The Beth Din courts, for example, serve the Jewish community, offering guidance on issues of marriage and divorce. While they cannot compel a divorce, they can encourage or persuade a husband to grant a religious divorce certificate.
The Roman Catholic Church, which complies with the Marriage Act 1949, operates its own tribunals to consider annulments under canon law. None of these institutions undermine the authority of English courts.
The same applies to sharia councils. Participation is voluntary: individuals choose to use these forums, often to resolve family or inheritance matters in line with their faith. English civil courts remain fully available to them.
Following concerns about the protection of women’s rights in the councils, the 2018 Home Office review recommended stronger safeguards. These include requiring civil registration of marriages, greater transparency in decision-making, and education about legal rights.
The review found that nearly all users of the sharia councils were women, with over 90% seeking an Islamic divorce. Many were unable to obtain a civil divorce because their marriages had never been registered under English law, leaving them without legal recourse in the civil legal system.
The review stressed that its proposed safeguards were designed to protect vulnerable women, rather than suppress or prohibit sharia councils from operating. This recognises that demand for religious divorce will continue whether or not sharia councils are prohibited.
The UK government accepted the review’s findings but has not established a regulatory body. This suggests that most safeguards are currently dependent on voluntary good practice within the councils.
Postcolonial legal pluralism
In a postcolonial, multifaith society like Britain, legal pluralism is not a sign of a fragmented legal sovereignty – it’s an acknowledgement of social reality. The persistence of sharia in modern Britain reflects a society still negotiating how to govern cultural and religious difference through law, as the empire once did.
Other postcolonial societies have accepted this. In India, different personal law systems for Hindus, Muslims and Christians coexist under one constitution. There is an ongoing debate in the country about how to balance faith-based identity with the rights guaranteed by the secular state.
The same question now faces Britain. The challenge is not whether to recognise the arbitrating powers of sharia councils, but how to regulate them fairly – ensuring that every citizen, regardless of faith, can exercise their rights within the boundaries of English law.
Femi Owolade does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
Suspected witches kneeling before James VI in Daemonologie, his 1597 treatise on witches. Wikimedia Commons
In the 16th century, witches and demons weren’t just for Halloween. People were terrified and preoccupied with them – even kings.
In 1590, James VI of Scotland – who was later also crowned James I of England – travelled by sea to Denmark to wed a Danish princess, Anne. On the return journey, the fleet was hit by a terrible storm and one of the ships was lost.
James, a pious Protestant who would go on to sponsor the translation of the King James bible, was convinced he’d been the target of witchcraft. On his return, he set in motion the brutal North Berwick witch trials.
A few years later, James decided to write a treatise called Daemonologie, setting out his views on the relationship between witches and their master, the devil.
Meanwhile, another firm Halloween favourite – ghosts – had fallen out of favour in the wake of the Protestant Reformation because they were seen as a hangover from Catholicism.
In this episode of The Conversation Weekly podcast, Penelope Geng, an associate professor of English at Macalester College in the US who teaches a class on demonology, takes us back to a time when beliefs around witches, ghosts and demons were closely tied to religious politics. She explains how these beliefs have come to influence the way witches and ghouls have been portrayed in popular culture ever since:
It seemed that at a very grassroots level, people believed in the existence of witches and devils. At a very high theological level, writers were talking about it. So I think, compared to today, the early modern period really was a moment in which people were somewhat obsessed with thinking about this eternal struggle between good and evil and their own place in this warfare.
You can also read an article Penelope Geng wrote on the difference between ghosts and demons, and the way they were portrayed in literature, as part of The Conversation’s Curious Kids series.
This episode of The Conversation Weekly was written and produced by Katie Flood, Mend Mariwany and Gemma Ware. Mixing by Eleanor Brezzi and theme music by Neeta Sarl.
Listen to The Conversation Weekly via any of the apps listed above, download it directly via our RSS feed or find out how else to listen here. A transcript of this episode is available on Apple Podcasts or Spotify.
Penelope Geng does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
At the University of Alberta, for example, there is no longer a vice-provost for equity, diversity and inclusion. The institution now has an office of “access, community and belonging”.
While language evolves naturally, this change looks more like a disguised retreat than genuine progress, because it lacks the deliberate commitment that equity requires.
The current rebranding efforts are more about appeasement than progress. They are reactive gestures, driven by external pressure, rather than responses to the needs and demands of the communities most affected.
In the business world, the trend is striking. Mentions of “DEI” in the regulatory filings of S&P 500 companies have fallen by 70% since 2022, replaced by more consensual terms such as “belonging” and “inclusive culture”.
This shift allows organisations to dodge accountability, mask inequalities and replace measurable equity frameworks with vague platitudes.
Why does this matter?
By softening the language they use, organisations give themselves a socially acceptable way to sidestep the hard work that equity demands. It is as if they had “moved beyond” equity when they never did the work in the first place. It is, in a sense, an illusion.
Removing equity from organisational language has tangible consequences. First, it undermines accountability. Effective equity frameworks create measurable, verifiable goals. Terms such as “belonging” are harder to define and easier to abandon. They allow organisations to claim a commitment to inclusion without making the effort that real systemic change requires.
Finally, organisations expose themselves to risk. Rolling back DEI harms morale, retention, innovation and performance, and can even increase legal exposure.
A 2025 survey by NYU’s Meltzer Center for Diversity, Inclusion, and Belonging found that 80% of executives believe that scaling back equity efforts increases reputational and legal risk. It also reports a broad consensus that DEI initiatives improve companies’ financial performance.
But meritocracy assumes equal opportunity and obscures the fact that “merit” is a social construct that depends on context. It ignores that unequal barriers – such as access to education and networks – shape individual success, over and above a person’s actual achievements.
For example, in a study of 445 participants with management experience, researchers asked participants to make decisions about bonuses, promotions and dismissals for fictional employees. When an organisation’s culture emphasised meritocracy, men received higher bonuses than women with identical qualifications.
Conversely, when the corporate culture instead emphasised managers’ discretionary power, the bias reversed in favour of women. The likeliest explanation is that the framing signalled a potential gender bias, triggering an overcorrection.
In a third scenario, where neither meritocracy nor managerial discretion was emphasised, there was no significant difference in the bonuses awarded.
Although that last scenario seems promising, most workplaces prioritise meritocracy, consciously or not. Merit- or performance-based pay remains the norm in most organisations, meaning the first scenario is the most common.
Without transparency, talk of who “deserves” a promotion or a bonus tends to reinforce inequality. Nepotism, network advantages and selective visibility often fill the void when equity frameworks are abandoned. Networks and visibility matter, but they must not be confused with merit.
While some institutions are backing away from their DEI commitments, others in Canada and Europe are staying the course, embedding equity in their strategy, leadership and performance frameworks.
Advancing equity in the current climate requires both strategy and sustained engagement. Here is where organisations can start:
Set explicit, measurable equity goals and embed them in the corporate strategy.
Improve data transparency by collecting and publicly sharing disaggregated information on recruitment, promotion, pay equity, staff turnover and employee experience.
Give diverse voices real decision-making power in shaping policies and initiatives. Employee resource groups are an excellent starting point.
Hold leaders accountable by training them to advance equity and tying their incentives to concrete diversity, equity and inclusion outcomes.
Communicate transparently and authentically about the impact of DEI by sharing stories and metrics that show how equity efforts have improved the organisation’s performance.
These solutions are already working. In my consulting practice, I have supported organisations that are making progress by building trust, energising their teams and driving innovation. In the end, they are more effective and more resilient.
The business case for equity, diversity and inclusion is well established: DEI drives performance, supports growth and is a leadership imperative. In the current political climate, it is crucial to stay focused on results rather than be drawn into a narrative that frames equity as unnecessary or divisive.
The way forward
Rebranding “equity” as “belonging” does not advance justice, especially in the absence of a shared definition of what “belonging” actually means. It politely denies the need to dismantle real systemic barriers. To the people who face those barriers, it sounds like an empty promise.
No one chooses their race, sex, socioeconomic background or sexual orientation, or to live with a disability or the lasting after-effects of military service (post-traumatic stress disorder, for example). Institutions, however, can choose to tackle the inequalities tied to these experiences and to dismantle the barriers people face.
This moment also calls for honest reflection within the DEI field itself. Some initiatives overreached or lost sight of their purpose, which has contributed to the current backlash. Openly acknowledging these missteps is part of the work of rebuilding DEI’s credibility.
Moving forward requires reducing polarisation, opening up dialogue and better coordinating action so that every person has a fair chance to thrive and succeed.
Simon Blanchette does not work for, consult, own shares in or receive funding from any organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their research organisation.
Source: The Conversation – (in Spanish) – By Raúl Rivas González, Professor of Microbiology and member of the Sociedad Española de Microbiología, Universidad de Salamanca
The most iconic visual element of Halloween is, without doubt, the jack-o’-lantern: a hollowed-out pumpkin carved to resemble a face, often grotesque or grinning, that serves as a lantern. A light source, traditionally a candle, is placed inside the hollow pumpkin, and its flicker casts ghostly shadows and brings the carved grimace to life, creating an unmistakable atmosphere of mystery and celebration.
The pumpkin species most used for Halloween – a contraction of the English expression All Hallows’ Eve, meaning the eve of All Saints’ Day – is Cucurbita pepo; for example, the Connecticut Field or Jack O’Lantern varieties. Its colour, rounded shape and tough rind make it ideal for carving. But beware: a pumpkin was not always used. At first, other vegetables were carved. Specifically, turnips.
Stingy Jack
The most famous legend about the origin of Halloween’s carved-vegetable tradition takes us to 18th-century Ireland and a figure from Irish folklore known as Stingy Jack, though he also goes by Jack the Smith, Drunk Jack and Jack the Eccentric.
Jack tricked the devil twice. First he persuaded him to turn into a coin to pay for one last drink before carrying him off to hell, and then he left him trapped at the top of a tree. To win back his freedom, the devil had to promise Jack that he would never again claim his soul.
When Jack died, however, he could not enter heaven because of his sinful life. The devil kept his side of the bargain, and our man was condemned to wander the Earth in eternal darkness, unable to rest in either heaven or hell. In mockery, the devil threw him a burning ember from the underworld to light his way. Jack placed the ember inside a hollowed-out turnip, which he used as a lantern. And so he became Jack O’Lantern (Jack of the lantern).
Samhain, the Gaelic festival
Halloween’s roots are ancient, tied in large part to the Celtic festival of Samhain, which marked the end of the harvest season and the beginning of the Celtic new year. In the 18th century, many people in Ireland celebrated Samhain, the Gaelic festival, with its rituals of going from house to house in search of food and drink. Hence the tradition of trick-or-treating.
Since darkness reigned in pre-industrial Ireland, many people carved turnips, potatoes and other root vegetables and added coal embers or candles to create improvised lanterns to guide those taking part. Sometimes they even carved faces into them.
There was also an old belief that on Halloween night the boundary between the world of the living and the world of the dead grew thin, allowing good and evil spirits to return. For that reason, the vegetable lanterns were placed in windows or doorways that night to ward off those spirits, including Jack O’Lantern’s. Over time, locals began carving terrifying faces into the turnips to frighten away evil spirits.
A microorganism swapped turnips and potatoes for pumpkins
The Great Irish Famine, also known as the potato famine, was a devastating period of starvation, disease and mass migration that afflicted Ireland between 1845 and 1849, although its consequences stretched on until 1852.
Although exact figures are unknown, records suggest there were at least 1.1 million deaths and the mass emigration of another million people, a population decline of almost 30%. The grim outcome was, in truth, multifactorial: political, religious, economic and, above all, microbiological causes converged. Specifically, a blight caused by the oomycete Phytophthora infestans, known as late blight, destroyed the potato crop, the staple food of much of Ireland’s poor. The pathogen is highly aggressive, and the most important plants and crops it attacks are the potato and the tomato.
To the misfortune of the Irish, the oomycete established itself in the soil. The years that followed were devastating, as the pathogen persisted and the potato crops failed or rotted. At the time, there were no chemical treatments or genetic methods to fight Phytophthora. Famine and its associated diseases preyed on the poorest. Malnutrition opened the door to severe infections such as typhoid fever, tuberculosis, diphtheria and cholera. Hundreds of thousands of people could not withstand the catastrophe and perished.
The Irish took Halloween to the United States
The more fortunate managed to emigrate in search of new opportunities, though many travelled in appalling conditions and perished during the crossing. The largest colony of Irish emigrants settled in the United States, a majority-Protestant country. Yet more than 90% of the Irish migrants were Catholic, which had a notable impact on American society.
Indeed, many traditions crossed the Atlantic with the Irish emigrants. When the time came to celebrate Halloween, there was one sizeable problem: in America, turnips were not easy to find. Fortunately, the Irish immigrants found a native vegetable that was much bigger, more striking, abundant in autumn and easy to hollow out and carve: the pumpkin. It immediately replaced turnips and every other candidate vegetable.
Pumpkins carved with chilling faces soon became essential to Halloween celebrations. Although they initially served to ward off Jack’s spirit and other wandering souls, over time the jack-o’-lantern evolved into a simple festive symbol. Today it adorns homes, blending terror and fun throughout the month of October.
Fun pumpkins, but dangerous too
But beware: pumpkins can hide a danger. A study published in 2006 analysed the rapid fungal decomposition of Halloween pumpkins (Cucurbita pepo) in Northern Ireland during October 2005. It found that, once carved and displayed, they developed marked microbial deterioration. The research revealed the presence of at least five genera of fungi (Penicillium, Gibberella, Mucor, Nectria and Fusarium), three of them known to cause infections in immunocompromised people – a risk that had not been documented before.
So, while these vegetables can remain part of the festivities, in healthcare settings with vulnerable patients it is advisable to inspect carved pumpkins regularly to minimise the spread of spores, and to discard them immediately if fungal contamination is observed.
Raúl Rivas González does not receive a salary from, consult for, own shares in or receive funding from any company or organisation that would benefit from this article, and has declared no relevant affiliations beyond the academic position cited.
Source: The Conversation – in French – By François Langot, Professor of Economics, Deputy Director of i-MIP (PSE-CEPREMAP), Le Mans Université
To stabilise France’s public debt, the state must reduce its deficit. Beyond raising taxes, it must also cut its spending. But before cutting, it is worth understanding how that spending has evolved over the past 30 years.
A historical analysis of state spending can inform today’s budget decisions. What has it financed? Public employees’ wages? Purchases of goods and services? Transfers? What types of public goods has it produced (education, health, defence…)?
The next state budget must take account of these past trends and any resulting imbalances, while recognising that these budget choices will have impacts on growth and on inequality specific to the type of spending concerned.
Nearly €30 billion in announced savings
The finance bill (PLF) currently under discussion for 2026 provides for €30 billion of savings, or 1.03% of GDP. These savings comprise €16.7 billion in spending cuts (0.57 points of GDP) and €13.3 billion in tax increases. The public deficit, projected at 5.6% of GDP in 2025 (€163.5 billion), would therefore be reduced by only 18.35%. To meet the goal of stabilising the public debt, this effort will have to be amplified in the coming years, saving approximately €120 billion (4 points of GDP) – four times the savings planned in the PLF 2026.
These coming spending cuts sit within a broader context. On average, in the 1990s, public spending represented 54% of GDP. By the 2020s, it had risen by 3 points, to 57% of GDP – an additional €87.6 billion of annual spending, more than five times the savings written into the PLF for 2026. Since 2017, spending has risen by one point of GDP, an annual increase of €29.2 billion (1.75 times the savings in the PLF 2026). Given these large past increases, spending cuts are possible without calling the French social model into question. But which spending should be cut?
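These orders of magnitude can be cross-checked with a quick back-of-envelope calculation (our arithmetic, using only the figures quoted above, not additional data):

\[
\frac{30}{163.5} \approx 18.3\%, \qquad \frac{30\ \text{bn euros}}{1.03\%} \approx 2{,}900\ \text{bn euros of GDP} \;\Rightarrow\; 1\ \text{point of GDP} \approx 29\ \text{bn euros}.
\]

This is consistent with €16.7 billion of cuts amounting to 0.57 points, with the €29.2 billion per point cited for the post-2017 rise, and with 4 points of GDP coming to roughly €120 billion.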
Each spending item comprises purchases of goods and services (G&S) used by the state (in the broad sense: central government, local government and social security administrations) to produce, wages paid to its employees, and transfers paid to the population. Which item has grown strongly since 1995?
Table 1 shows that in 1995, 40.2% of spending went on transfers (22.05 points of GDP), 35.5% on purchases of G&S (19.45 points of GDP) and 24.3% on wages (13.33 points of GDP). In 2023, 44.1% went on transfers (+3.06 points of GDP), 34.5% on purchases of G&S (−0.15 points of GDP) and 21.4% on wages (−1.07 points of GDP). The budget has therefore shifted markedly towards transfers. Spending on wages grew more slowly than GDP, with the weight of these salaries in total spending falling sharply.
How to read: in 1995, transfers represented 22.05 points of GDP, or 40.2% of total spending. The figure in parentheses indicates that item’s share of total spending. Δ: the difference between 2023 and 1995 in points of GDP, with the change in share in parentheses.
The state has therefore contained its purchases of G&S and reduced its wage bill, even as its headcount grew by more than 20% (FIPECO data). Over the same period, private sector salaried and self-employed employment rose by 27% (Insee data). Headcount growing more slowly than in the private sector, combined with a rising share of state production in GDP, points to stronger labour productivity growth in the public sector. But this did not translate into higher public sector pay. On the contrary, the public-private wage gap narrowed sharply over the period, from +11.71% in favour of the public sector in 1996 (Insee (1999) data for the public sector, Insee (1997) for the private sector) to 5.5% in 2023 (Insee (2024a) for the private sector, Insee (2024b) for the public sector).
This first decomposition shows that the organisation of state production (purchases of G&S and wages) has not drifted, but that redistributive spending has risen strongly (+3.06 points of GDP in 30 years). These increases in transfers amount to three-quarters of the savings needed to stabilise the public debt.
Less and less money for pupils and defence
State spending can also be broken down by service, that is, by function (education, defence, social protection…). Figure 1 shows that spending on general services, education and defence has grown more slowly than GDP since 1995 (red area). Their budgets fell by 2.14, 0.78 and 0.68 points of GDP respectively. While the fall in the first can be explained in part by rationalisation linked to information technology, and the second by the end of conscription, the fall in education is more surprising.
It is all the more surprising given that Aubert et al. (2025) have shown that 15% of this budget (0.75 points of GDP) consisted of pension spending that “should” therefore be reallocated to pensions for greater transparency. The steady growth of this pension contribution within the education budget means that spending devoted to pupils is falling sharply, which can be linked to the deterioration in the results of pupils in France in tests such as Pisa. Finally, in the current geopolitical context, the fall in the defence budget may also seem “strategically unwise”.
How to read: in 1995, social protection spending represented 21.41 points of GDP, of which 18.14 points were transfers, 1.16 points wages and 2.11 points G&S; in 2023, it represented 23.33 points of GDP, of which 20.16 points were transfers, 1.12 points wages and 2.0 points G&S.
More and more for health and social protection
The green area of figure 1 groups the functions whose budgets grew faster than GDP, from the smallest increase (public order/safety, +0.24 points of GDP) to the largest (health, +1.72 points of GDP, and social protection, +1.92 points of GDP). These two items account for 65.3% of the increases. Next come the sport/culture/religion, environment and housing budgets, which share 24% of the total spending increase equally (roughly 8% each). Finally, the economic affairs and public order/safety budgets explain the remaining 10.7% of the increase: 6.4% for the former and 4.3% for the latter.
Focusing on the largest increases, health and social protection, the explanations differ. For social protection, operating costs (G&S and wages) are virtually stable while benefits have risen sharply (+2 points of GDP). Health spending has also seen the benefits provided grow (+1 point of GDP), but is marked by rising operating costs: +0.6 points of GDP for G&S and +0.12 points for health workers’ pay, even as pay elsewhere in the public sector fell – that of education staff, for example, dropping from 4.28 to 3.47 points of GDP (−0.81 points).
Within social protection, more and more for sickness and pensions
Social protection, the largest spending item (23.33% of GDP), comprises the sub-functions shown in figure 2. With the exception of sickness/disability (+0.07 points of GDP), social exclusion (+0.43 points of GDP) and pensions (+2.41 points of GDP), every sub-function of social protection saw its share fall (red area). Pension reforms have therefore been insufficient to stop pensions being the fastest-growing item of spending.
Finally, if we add to health spending the part of social protection spending linked to sickness and disability (see figure 2), this overall health spending rose by 1.79 points of GDP between 1995 and 2023.
What lessons can be drawn?
These trends suggest that future budgets could target savings on health spending and pensions, two items that have already grown strongly in the past. Part of that increase is, of course, linked to the unavoidable ageing of the population. But another part comes from the rise in the benefits paid to each recipient. For example, the average retirement pension has risen from 50% of the average wage in the 1990s to 52.3% in 2023, and the cost of treating a myocardial infarction has risen from 4.5 times the minimum wage (Smic) in the 1990s to 5.6 times in the 2020s.
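Expressed as relative increases (back-of-envelope arithmetic based on the two ratios just quoted, not additional data):

\[
\frac{52.3}{50} - 1 \approx +4.6\%, \qquad \frac{5.6}{4.5} - 1 \approx +24\%.
\]

In other words, the average pension has edged up relative to the average wage, while the cost of treating an infarction has grown by about a quarter relative to the minimum wage.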
France 24, October 2025.
By contrast, a catch-up on education and defence seems necessary in view of past underinvestment and the challenges ahead. Public sector pay also needs to be reconsidered. Table 2 shows that the PLF 2026 proposes measures that partly deliver this rebalancing by reducing social protection spending, and pensions in particular. Finally, the PLF 2026 provides for an increase in the defence budget, while the €8.6 billion reduction in the budgets of functions other than defence and public order spares education.
Beyond these rebalancing arguments, budget choices must also rest on an assessment of their impact on activity (growth and employment). The analyses of Langot et al. (2024) indicate that cuts to transfers indexed to past earnings (such as pensions) can have a positive effect on growth, making it easier to stabilise the public debt – unlike tax increases.
Prioritising the production of public goods over transfers is also justified by geopolitical and climate challenges, and it reduces inequality as well (see André et al. (2023)).
The authors do not work for, consult, own shares in or receive funding from any organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their research organisation.
Meanwhile, the single-use plastic packaging used to reduce food wastage poses a more insidious problem. Once discarded, the single-use plastics that cushion, seal, protect and extend the shelf life of our groceries can linger in landfills, beneath the ground, in rivers and on the seabed for centuries.
This mounting plastic waste could disrupt ecosystems, negatively affect food security through declining animal health and cause health issues in people. If binning good-to-eat food has historically been reviled as consumers’ great moral failing, their over-reliance on single-use plastic food packaging could be a longer-lasting sin.
The rest was incinerated, landfilled or shipped abroad, typically to countries with weaker waste management systems, where it was buried, burned or haphazardly stored, with the risk of leaking into rivers and seas.
Traces of plastic have been detected everywhere from Arctic ice to the hottest deserts, from the bellies of seabirds to human blood, lungs and placentas. Unlike food waste, the damage of plastic waste is cumulative, slowly imparting a toxic legacy throughout ecosystems for future generations.
The scale of the single-use plastics problem is not meant to diminish the problem of food waste. Throwing out a pack of mackerel fillets or a tub of smashed avocado from the fridge is not only disrespectful to the third of UK children under five living in food-insecure homes. It also disregards the huge amount of carbon emissions needed to produce, preserve, transport, retail and store those items from producer to consumer.
An estimated 16 million tonnes of carbon dioxide is produced from UK households’ wasted consumable food and drink. But damaging as it is, food waste has an end point: it decomposes, breaks down, then returns to the soil.
In contrast, plastic packaging persists indefinitely, slowly fragmenting into smaller parts and disintegrating into stubborn chemical constituents that stick around. Each plastic bottle, crisp packet and meat tray that ends up in the natural environment represents a long-term alteration of the material world.
Food waste decays, plastic stays
Why then does binning plastic packaging rarely invite as fervent a reaction as scraping a plate of uneaten dinner into the bin? Our research suggests that part of the answer lies in how each act of wastage is morally framed.
Food is very visible, desirable and morally loaded – it is something held dear in most religions and communities. Several faiths explicitly denounce the wasting of food as sinful or wrong. Secular British history too is replete with memories of food shortages, rationing, rising prices and austerity periods which have led to strong moral attitudes against food waste.
According to research by the anti-poverty charity the Trussell Trust, approximately 14 million people in the UK faced hunger in the year to September 2025.
Binning good-to-eat food is usually considered morally unacceptable. 5PH/Shutterstock
By comparison, plastic is more abstract. Plastic food packaging is hidden in plain sight, often serving as a “passenger” rather than a driver of our consumption. After we remove the food, we toss plastic packaging into the trash – ideally the recycling bin – without a further thought.
Where food is deep-seated in moral and even sacred meanings around nourishing the body, sharing and caring, identity and celebration, plastic is devoid of such values. Throwing food away can feel like an affront to the communities we identify with, but binning plastic does not carry the same stigma. We do not view ourselves as “wasting” plastic, we merely “dispose” of it.
Among the members of 27 households we interviewed, many expressed their frustration about good-to-eat food ending up in bins or landfills. Most cited the usefulness of plastic packaging in keeping food fresh and helping to reduce waste.
For them, the consequences of binning plastics are dispersed and delayed. No great cautionary tale from our collective memory exists to warn us of the complex, longer-term challenges that will follow.
To overcome the challenges of tomorrow, we must reassess the hierarchy of things that we, as consumers, feel guilty about. Food waste certainly matters, but so too does plastic packaging. The problem is that plastics have not been a part of our moral economy for very long.
Plastics arrived as a modern convenience, not as a moral appendage to our sense of identity or community like food has been for millennia. There are no ancient and collective traumas tied to plastics’ wanton consumption, abuse or scarcity, no prayers of gratitude for plastic packaging, and no great piety or moral proverbs condemning its thoughtless disposal.
Our existing moral frameworks are coloured with images of hunger, famine, bread lines and emaciated bodies that provide us with the imagination to condemn the wasting of food.
But we require new stories and perspectives to position plastic waste as an evil that will outlive us, haunt our waterways, crowd the stomachs of wildlife, leach into our food systems, and poison our bodies long after our shopping habits have changed.
James Cronin received funding from the UKRI Natural Environment Research Council as a co-investigator of the ‘Plastic Packaging in People’s Lives’ (PPiPL) project. Project Reference: NE/V010611/1. More can be read about the PPiPL project here: https://www.lancaster.ac.uk/ppipl/
Alexandros Skandalis received funding from the UKRI Natural Environment Research Council as a co-investigator of the ‘Plastic Packaging in People’s Lives’ (PPiPL) project. Project Reference: NE/V010611/1. More can be read about the PPiPL project here: https://www.lancaster.ac.uk/ppipl/
Charlotte Hadley received funding from the UKRI Natural Environment Research Council as a co-investigator of the ‘Plastic Packaging in People’s Lives’ (PPiPL) project. Project Reference: NE/V010611/1. More can be read about the PPiPL project here: https://www.lancaster.ac.uk/ppipl/
While Halloween offers a chance to embrace all things spooky and supernatural, the real terrors this season aren’t confined to ghost stories. From fingers sliced while carving pumpkins to contact lens infections that can lead to life-threatening heart conditions, the festivities come with genuine medical hazards – some surprisingly severe.
In the US, 44% of Halloween-related injuries stem from pumpkin carving, ranging from minor scratches to lacerations that slice through major nerves, blood vessels and tendons. Specific pumpkin carving knives or tools have been shown to be much safer, though not risk-free.
Pumpkins pose additional dangers when candles are lit inside them. The flames can ignite property or costumes, often leaving victims with severe burns. There is a notable spike in burn-related injuries each year around Halloween, particularly among children. One high-profile case involved TV personality Claudia Winkelman’s daughter, Matilda, who suffered life-changing injuries in 2014, aged eight, when her Halloween costume caught fire.
Costumes themselves create multiple hazards beyond burns. Ill-fitting outfits can lead to broken bones from slips and trips, while masks and heavy headwear obscure vision. Latex allergies from costume materials represent another risk, causing anything from irritation and rashes to, in very rare cases, death.
The combination of dark October evenings and dark costumes creates a particularly dangerous scenario. Data from the UK covering 27 years revealed that on Halloween, the risk of children being killed or seriously injured in traffic accidents is higher than on any other day – and 34% higher between 5pm and 6pm, probably coinciding with rush hour.
In the US, childhood pedestrian deaths are fourfold higher on Halloween than on any other day. A separate study found there are four additional pedestrian deaths on Halloween compared with other days.
On Halloween, appearances can be deceiving – sometimes literally. Coloured contact lenses present significant risks to eye health and overall wellbeing. They can cause irritation and redness, eye injury when they snap and cut into the eye, or even a life-threatening heart infection.
Damage to the eye from ill-fitting or poor-quality contact lenses can promote bacterial growth. These bacteria can migrate from the eye, often via the bloodstream, to elsewhere in the body. One place they can set up camp is the heart, causing conditions such as infective endocarditis, which kills about one in five people who develop it. The condition is challenging to treat because medicines and immune cells struggle to reach the heart lining.
Face paints carry both short- and longer-term risks. Skin irritation and pore-blocking can be an immediate annoyance, along with corneal scratches if paint enters the eyes. Ingestion and prolonged or repeated exposure can increase the risk of absorbing potentially toxic elements such as heavy metals and arsenic, which increase cancer risk.
Plastic fangs and other teeth-modifying sets can damage teeth. Designed as one-size-fits-all products, they’re likely to loosen teeth and exacerbate existing looseness. If using adhesive to hold them in place, ensure it’s approved for dental use. Products like superglue and nail glues will damage tooth enamel – a layer that cannot regenerate – and can burn the gums and inside of the mouth.
The obvious concern on Halloween is feeling unwell from consuming excessive sweets or chocolate. However, other consumption hazards have emerged in recent years. Hospital admissions of children who have ingested gummies containing THC or other banned substances have risen noticeably in countries that have legalised or decriminalised cannabis.
For those watching their calorie intake, sugar-free options may backfire – a phenomenon sometimes referred to as “Halloween diarrhoea”. Sorbitol, an artificial sweetener used in sugar-free products, is only about 60% as sweet as sucrose, meaning more must be added to achieve the desired taste. As little as 20g of sorbitol can have a laxative effect in 50% of healthy people. For context, a stick of sugar-free chewing gum contains roughly 1.25g.
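For scale, a rough worked example (back-of-envelope arithmetic using the figures above, not data from a study):

\[
\frac{20\ \text{g}}{1.25\ \text{g per stick}} = 16\ \text{sticks},
\]

so it would take around 16 sticks of sugar-free gum to reach the dose that has a laxative effect in half of healthy people.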
Hard sweets present a choking risk year-round, but particularly to younger children, and the increased sweet consumption around Halloween elevates this risk further. Children with nut allergies face additional jeopardy – the incidence of nut-related anaphylaxis increases by approximately 70% on Halloween.
Beyond food, other Halloween traditions carry risks. Trauma to the eye from eggs used as projectiles is commonly seen during the festivities, with some victims losing their sight from such injuries.
Certain crimes and resulting injuries also increase around Halloween, with assaults showing a significant increase. The commercialisation of Halloween celebrations is thought to play a role, with promotional drink offers partly to blame.
Sensible precautions – wearing lights or reflective strips when out with children, moderating sweet intake, and supervising tasks like pumpkin carving – can substantially reduce the risk of becoming another Halloween hospital statistic.
Adam Taylor does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.