Why mistletoe is thriving, even as its traditional orchards are lost

Source: The Conversation – UK – By Adele Julier, Senior Lecturer in Terrestrial Ecology, University of Portsmouth

Reflexpixel / Shutterstock

Mistletoe is a richly symbolic winter plant with an unusual life cycle. With more than half of England’s traditional orchards lost since the mid-20th century, it would be easy to assume mistletoe is disappearing too. But that’s not the case. Despite dramatic changes in land use, mistletoe in Britain and Ireland is not in decline – and in some places it may even be spreading.

Mistletoe is a name used for a variety of different plants across the world, but in Britain it generally means European mistletoe (Viscum album), a semiparasitic plant that grows on the branches of trees. Being semiparasitic means it takes water and some of its nutrients from the tree on which it grows, while also capturing its own energy through photosynthesis.

This unusual feature allows mistletoe to thrive high in the tree canopy, but also makes it dependent on both suitable host trees and the animals that help it reproduce.

Mistletoe is best known today for the tradition of kissing beneath it at Christmas, a custom that became popular in the 19th century. The plant also features in Greek and Norse mythology and has some tenuous associations with ancient druidic practices in Britain.

These cultural associations have helped cement mistletoe’s image as a plant of tradition, protection and continuity, even as its ecology proves surprisingly dynamic.

A complicated life

Mistletoe’s life cycle is more complicated than the average plant. It’s among the 6% or so of flowering plants where male and female flowers grow on separate plants. Both produce tiny green flowers that smell sweet and fruity to attract pollinators such as flies and bees. The female plant grows white berries which are coated in a sticky substance called viscin.

Mistletoe has colonised this apple tree – the leaves and branches are from different organisms.
Tom Meaker / Shutterstock

Seeds tend to be dispersed by birds. Species such as mistle thrushes, waxwings and blackcaps eat the berries and then either poop seeds out or wipe them off their beaks and feet onto nearby branches. The sticky coating helps the seeds adhere to the bark, where they can germinate.

The fact that it has separate male and female plants, and its reliance on birds, makes mistletoe surprisingly slow to spread. If a lone plant is growing in a new location, it could be years before more arrive. These biological constraints have traditionally limited mistletoe to places with the right combination of climate, host trees, pollinators and seed dispersers.

Orchards to gardens

Mistletoe is most commonly associated with orchards, especially apple trees, though it can also grow on poplar, lime, hawthorn and willow, and very rarely on oak. It is found all across England, but is most abundant in the south-west midlands. The plant is the official county flower of Herefordshire, where it has long been associated with the county’s orchards.

Mistletoe is evergreen, and is particularly noticeable in winter when its host sheds its leaves.
Dietrich Leppert / Shutterstock

The National Trust estimates, however, that 56% of England’s traditional orchards vanished over the course of the 20th century. You might expect this would mean wild mistletoe is in trouble – at least in England.

But that’s not the case. Its conservation status in Britain is “least concern”, and although it is often found in orchards, mistletoe is now most common in gardens. Its abundance in the south-west midlands of England could be due more to a wet and warm climate than to the presence of orchards.

Waxwings visit the UK in winter, often from Scandinavia. They feed on berries, including those of mistletoe.
Dmytro Komarovskyi / Shutterstock

There have been reports of mistletoe spreading quickly in places such as Essex and Cambridge. Blackcaps, a key disperser of mistletoe seeds, have only recently started overwintering in Britain. Warmer winters have altered their migration, increasing the time they spend in the UK and therefore the time they have to spread mistletoe seeds. Changes in bird behaviour linked to climate change may therefore be affecting the distribution of one of Britain’s most familiar plants.

If you would like to get involved with mistletoe research, there is a citizen science project run by the Tree Council called MistleGO!, in which you can record sightings, helping researchers to track the plant’s distribution. You can also buy mistletoe growing kits, although it is best to wait until early spring to sow the seeds – and it might be several years before your mistletoe plant is large enough to harvest for your Christmas party.

Even as its traditional orchards disappear, mistletoe will remain a festive fixture. It’s a living example of how complex interactions between different species amid climate change and changing landscapes make it hard to predict what the wildlife of the future will look like.

The Conversation

This article would not have been possible without reference to the work of Jonathan Briggs, who has an excellent website called The Mistletoe Pages (https://mistletoe.org.uk/mp/) and who authored several key works including a 2021 review in the journal British & Irish Botany titled: ‘Mistletoe, Viscum album (Santalaceae), in Britain and Ireland; a discussion and review of current status and trends’.

ref. Why mistletoe is thriving, even as its traditional orchards are lost – https://theconversation.com/why-mistletoe-is-thriving-even-as-its-traditional-orchards-are-lost-272154

How family gatherings unlock forgotten childhood memories that help us understand who we really are

Source: The Conversation – UK – By Jane Aspell, Professor of Cognitive Neuroscience, Faculty of Science and Engineering, Anglia Ruskin University

If you’re driving home for Christmas (insert Chris Rea earworm here) – and by that I mean the old family home – you’re likely to be experiencing a familiar mix of excited anticipation and faint dread of being trapped in close quarters with relatives. There’s nothing like Christmas for mental time travel triggered by family traditions and well-worn arguments.

You might also have the sort of family, like mine, which often insists on perceiving and treating you as you were 40-plus years ago. Although I’m closer than I’d like to 50, my father still voices concerns about me crossing roads, “wrapping up warm” and leaving electric plugs switched on. I will forever be his little girl.

Since our identity is in part created by how those around us see us and behave towards us, the festive season can temporarily cause us to regress to a past, childish version of our self – and this isn’t always welcome. But I’d like to suggest there is a silver lining of opportunity here: the chance to gain access to forgotten memories.

As a professor of cognitive neuroscience, I’ve been lucky enough to be able to test this idea with colleagues in my lab. In particular, we wanted to scientifically investigate whether people can recall more detailed childhood memories if they can “reinhabit” the body they had as a child.

I think it makes intuitive sense that this might work: the body I had as a child was very different to the one I currently occupy in middle age, and it seems reasonable to suppose that a (usually overlooked) aspect of our childhood memories – indeed of all memories – is the kind of body we used to have.

Our bodily experience is so ever-present that we usually don’t even notice it unless we are in some pain or discomfort. But there is not a minute of your life when your brain does not receive a mass of sensory input from and about your body: the sight of your hands in your peripheral vision, the sound of your footsteps and breathing, the beating of your heart, the contractions of your stomach and the tension in your muscles. Since the body is a big part of what we perceive in every moment, its varying form (as we age and change) should also be encoded in our memories.

As time passes, remote memories can dim, and some may even seem to disappear. But in most cases, they are never really “gone” from the brain – we just need the right trigger to reactivate them and bring them back into our consciousness.

A magic mental jigsaw

Memory is a bit like a magic mental jigsaw. Once you get hold of one jigsaw piece, a linking piece can suddenly pop into your mind. Our idea was to give participants in our lab the piece that enables them to re-experience their childhood bodies, in the hope that this could enable better access to memories that were laid down when they occupied those younger bodies.

We did this by causing our participants to experience a body illusion known as the “enfacement illusion”. We asked them to sit facing a computer screen with an attached webcam. On the screen, they could see a live video of their own face as filmed by the camera, but for half the participants there was a twist: the video had been distorted by a popular Snapchat app filter. Instead of seeing a video of their face as it currently looked, they saw their face morphed into a childlike version: their face resembled how it looked when they were a child.

The ‘enfacement illusion’ experiment explained. Video: Anglia Ruskin University.

We instructed them to move their head from side to side for 90 seconds while keeping their eyes fixed on the screen. This movement was important, as it provided crucial information to their brains about the self-relatedness of the image they saw.

Given that the face on screen moved exactly in time with their own face, this tricked the brain into accepting that the face on screen was really theirs. It was as though the participant was looking into a mirror but seeing the face they had as a child looking back at them. A different group of participants watched an undistorted video of their own face as they made the same movements.

To test whether this brief illusion has effects on memory recall, immediately after the illusion the participants took part in an “autobiographical memory interview”. The lead researcher – my former PhD student Utkarsh Gupta – followed a strict protocol to ask them a series of questions that would result in them describing an individual memory from their childhood in as much detail as possible. These interviews were recorded, and the transcripts were later numerically rated for specificity and detail by two researchers who were blind to the group that each participant had been assigned to.

Although the illusion was very brief, we found a significant difference between the memories described by participants in each group. As we had predicted, those who “re-embodied” their childlike faces were able to recall significantly more detailed memories than the participants who viewed their current face.

Our study was therefore able to show that body, self and memory interact, as indeed they must, in order for our brains to create our experience of personal identity – what makes “me” the person that I am.

Our identity necessarily evolves over time (even though our parents may sometimes have difficulty recognising that). And integrating memories of our past with the present moment is not always easy.

Our memories are not only records of the things that we previously saw, thought, smelt and heard. They are also records of the kind of body that our self used to drive around in. All our past selves are etched into our brains. The ghosts of Christmases past never really melt away.

The Conversation

Jane Aspell receives funding from the Leverhulme Trust and has previously been funded by Versus Arthritis, the Bial Foundation, the British Academy, The Urology Foundation and the Wellcome Trust.

ref. How family gatherings unlock forgotten childhood memories that help us understand who we really are – https://theconversation.com/how-family-gatherings-unlock-forgotten-childhood-memories-that-help-us-understand-who-we-really-are-272021

What makes a song sound ‘Christmassy’? Musicologist explains

Source: The Conversation – UK – By Samuel J Bennett, Senior Lecturer in Music Production, Nottingham Trent University

Shutterstock/Krakenimages.com

Within the first notes of many classic Christmas songs, we’re transported directly to the festive season. Why do these particular pieces of music, above all others, get us thinking of the holidays?

In his book Music’s Meanings, the popular music researcher Philip Tagg explores the ways in which we as listeners construe the music that we hear. Tagg applies semiotics, the study of how we interpret signs in the world around us, to music. These signs may be viewed differently by different people and may change their meaning over time.

To illustrate this concept, Tagg cites the example of the pedal steel guitar, originally drawn from Hawaiian musical tradition and carrying connotations of the islands. Eventually this instrument found its way into country music, so successfully that, Tagg argues, we are now likely to think immediately of country music when hearing the instrument, without the concept of Hawaii ever crossing our minds.

Just as the pedal steel guitar may place us immediately within the realm of country music, there is one instrument that will likely do the same for Christmas – sleigh bells.

Sleigh bells

From light orchestral pieces such as Prokofiev’s Troika (1933), right through to Ariana Grande’s Santa Tell Me (2014), sleigh bells have long acted as convenient shorthand for composers to tell their listeners that this piece belongs to the Christmas canon.

The reasons for this link stem from the non-musical world. We associate Christmas with the winter season and snowy weather. Sleighs, through their use as transport in such weather, developed a direct associative link with Christmas, and as a result, so did the bells used to warn pedestrians of their approach. As with Tagg’s pedal guitar example, we’ve reached the point where we generally link sleigh bells directly with the concept of Christmas, rather than thinking of the intermediary idea of the sleigh at all.

Santa Tell Me uses sleigh bells to evoke a Christmassy sound.

There’s a link to the wider instrument family of bells too. Through the practice of churches ringing out their bells, particularly in celebration of the birth of Christ, larger bells have also developed a presence, not only in Christmas music, but in Christmas decorations and art.

Last year, the UK Official Charts Company published a list of the “top 40 most-streamed Christmas songs”. If you were to listen to the list, you’d find bell-like sounds in the majority of them, from the glockenspiel-like introduction of Mariah Carey’s All I Want for Christmas Is You (1994) to the synthesised tubular bells of Band Aid’s Do They Know It’s Christmas (1984).




Read more: Band Aid at 40: how the problematic Christmas hit changed the charity sector


There are other musical elements which help spread the Christmas cheer, from lyrical melodies to strident brass parts. Most of these elements, though, have one thing in common. They aren’t modern sounds, or particularly common in modern pop music; instead, they remind us of the past.

The nostalgia of Christmas

Christmas is a nostalgic holiday, in more ways than one. The word “nostalgia” initially referred to a type of homesickness, rather than the fond remembrance of a hazy past time that we more commonly use it to refer to now. But both senses of the word can be used to describe the feelings we associate with Christmas.

It’s a time when many of us travel home to family, taking not only a geographical trip, but a temporal one, immersing ourselves in a world of well-worn tradition and familiarity, where the pace of our day-to-day life doesn’t apply.

Artists know this, feeding our nostalgia through music, lyrics and visuals which evoke the past. This is possibly why most Christmas albums consist of interpretations of past holiday classics, rather than original material. It’s a straightforward appeal to the nostalgic and the familiar; if we already know a song, it’s easier to immediately latch on to a new recording of it. Some artists, though, take the nostalgia trip one step further, emulating what is arguably the ultimate Christmas style of music – the easy listening crooner song.

Billie Eilish performs Have Yourself A Merry Little Christmas in 2023.

Whether it’s Bing Crosby or Nat King Cole, the warmth of a crooning voice nestled among light orchestral instrumentation has become inextricably linked with Christmas. It’s a sound that, unless you have a personal affinity with the style, you’re unlikely to hear much outside of the festive season.

It’s telling that when Billie Eilish performed a version of Have Yourself a Merry Little Christmas on Saturday Night Live in 2023, she eschewed her usual synthesised sounds in favour of a traditional trio of piano, drums and upright bass, and delivered the vocal in a gentle, warming tone. It all conspires to make us think of some imagined, simpler past, with chestnuts by the fire and picturesque snow settling outside.

Finally, we return to that list of the most-streamed Christmas songs. There’s one artist, and indeed one album, that makes the top 20 with two entries – Michael Bublé, with his 2011 album Christmas. Checking this album against our list of Christmas musical elements reveals a clean sweep. It’s crooned from top to bottom, features lightly orchestrated versions of classic Christmas songs, and yes, includes sleigh bells. It doesn’t get much more Christmassy than that.


The Conversation

Samuel J Bennett does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. What makes a song sound ‘Christmassy’? Musicologist explains – https://theconversation.com/what-makes-a-song-sound-christmassy-musicologist-explains-271349

Could your boss be lonely? Here’s why it matters more than you think

Source: The Conversation – UK – By Karolina Nieberle, Associate Professor of Social and Organisational Psychology, Durham University

PeopleImages/Shutterstock

Loneliness is the pain we feel when our social connections fall short of fulfilling our needs. At its core, it reflects a fundamental human need: to feel close to and connected with others. But it is also often an invisible experience.

Loneliness is not just a personal issue. It is also a workplace one. Gallup’s 2025 global workplace report showed that 22% of employees felt lonely on their previous workday. Managers weren’t immune either: 23% of them reported feeling lonely.

Workplace loneliness can affect anyone and can quietly damage engagement, wellbeing and performance. For leaders, the stakes are high. When they experience loneliness, it can subtly shape how they interact with their teams. They may communicate less openly, avoid feedback or appear withdrawn. A lonely leader influences their entire workplace environment, shaping team dynamics, morale and performance.

With our colleagues, Michelle Hammond (Oakland University) and Keming Yang (Durham University), we have studied loneliness in the workplace and found that managers might feel lonely due to the demands of their role and their experiences during the workday – experiences that can vary from day to day.

As managers move up the hierarchy, their status and responsibilities increase, which can create distance from both their team members and peers. Building connections depends on being able to show vulnerability. But daily pressures, tough decisions and confidentiality constraints often make it difficult for managers to open up. As a result, their need for social connection can go unmet on some days, while on other days they may feel engaged and well connected.

Our research looked at the consequences of short-term loneliness among leaders. In two independent studies with UK managers, we found that fluctuations in their loneliness levels had implications for how they approached leadership.

On days or in situations when managers felt lonely, they engaged less with their work (spending time on matters unrelated to work, for example, or letting others do their tasks) and showed lower levels of engagement with their team members (avoiding their employees, for instance).

The consequences of short-term loneliness for managers did not stop at the end of the workday. After a day in which they felt lonely, managers distanced themselves more from others in the evening. This created a loop that perpetuated loneliness into the next workday, and it helps to explain why managers sometimes feel lonely for extended periods.

There’s more to life than reports.
Khakimullin Aleksandr/Shutterstock

But our research uncovered a key resource outside of work that helped managers mitigate the consequences of loneliness and stopped it from affecting them for a longer period. This centred on how important their relationships with family and friends were in their life – something we called “family identity salience”.

Managers who placed greater value on their family and social connections were better able to switch off from work in the evenings, and loneliness from their workday did not spill over into their home life. Loneliness still affected their leadership at work, but it didn’t lead them to withdraw socially at home. As a result, they were able to start the next workday with a clean slate.

This “family identity salience” motivates managers to create protective boundaries between their daily work and home domains. It helps them shift out of work mode and reconnect with their friends and families after work – especially important on tough days.

Not just managers

Although managers’ loneliness has the greatest implications for the health of the workplace overall, anyone can feel lonely at work sometimes, whether or not they are a manager.

It may be helpful for workers to explore which experiences and situations make them feel lonely. They could also consider the situations when their manager might feel lonely. On the other hand, some situations might make them feel close to others, including managers. Talking to peers and sharing experiences can help to raise awareness of the issue.

To prevent occasional loneliness, workers could make themselves (and others) aware of the networks and groups that offer connection. These could be immediate team members, peers, (senior) managers, colleagues in other departments or external partners. They should think about what connects them with each of these groups and the steps they can take to strengthen their connection with them.

In addition to workplace networks, employees should invest in their relationships outside work. They can remind themselves why these relationships matter, and keep family and social goals visible (with photos, reminders or personal notes) to reinforce a sense of identity beyond work. The energy and support resources that people gain from time with friends and family outside work can unlock benefits in the professional sphere too.

Workers can also take steps to sustain and expand their relationships outside of work. For example, they might be the one who arranges dates, phone calls and shared activities with the people they value.

The best way for people to shield themselves from workplace loneliness is by not placing all their eggs in their “work basket”. Building resilience by nourishing and investing in interests and connections to places and people is a good way to celebrate all the facets of what makes us human.

The Conversation

The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.

ref. Could your boss be lonely? Here’s why it matters more than you think – https://theconversation.com/could-your-boss-be-lonely-heres-why-it-matters-more-than-you-think-272129

With UK unemployment rising, will the government’s plan for young people pay off? An economist’s view

Source: The Conversation – UK – By Rachel Scarfe, Lecturer in Economics, University of Stirling

Monkey Business Images/Shutterstock

There are nearly one million young people in the UK who are not in employment, education or training (so-called Neets). After falling in number during the 2010s, before the pandemic, this cohort of 16- to 24-year-olds has grown from 750,000 only six years ago. This is a worrying shift, for several reasons.

Research shows that a spell of unemployment at a young age can have outsized negative effects on the young person. Workers who were unemployed for even a short time at a young age have to contend with lower wages and poorer mental health even years later. In the three months to October, unemployment in the UK climbed to 5.1%, with young people particularly badly affected.

To address these challenges, the UK’s autumn budget introduced a package of measures intended to help young people move into stable work. The announcements include more apprenticeships, employment support and a guaranteed work placement for long-term unemployed young people.

There were also policies aimed at young people already in work. The government previously promised to abolish the “discriminatory” lower minimum wage for 18 to 20-year-olds. As a step towards that, the minimum wage for this age group will increase by 85p per hour in April 2026, from £10 to £10.85. This compares to an increase of 50p per hour, from £12.21 to £12.71, for workers aged 21 or over.

To make sure employers play by the rules, the government also announced stricter enforcement of employment regulations, including the minimum wage, by the new Fair Work Agency.

Together, these policies have a range of implications for young workers. The minimum wage increase means that full-time workers aged over 21 will earn around £900 more per year. And those aged 18 to 20 will receive about £1,500 more.
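
The rough arithmetic behind those figures is easy to check. Here is a minimal sketch in Python, assuming a 35-hour working week paid over 52 weeks – an assumption of mine, since the article does not state the hours behind its estimates:

```python
# Back-of-envelope check of the annual pay gains quoted above.
# ASSUMPTION: a 35-hour week and 52 paid weeks per year; the article
# does not say what hours its ~£900 and ~£1,500 figures assume.

HOURS_PER_WEEK = 35
WEEKS_PER_YEAR = 52

def annual_gain(hourly_rise: float) -> float:
    """Extra annual pay, in pounds, from a rise in the hourly minimum wage."""
    return hourly_rise * HOURS_PER_WEEK * WEEKS_PER_YEAR

print(f"Aged 21 and over (+£0.50/hr): £{annual_gain(0.50):,.0f}")  # £910
print(f"Aged 18 to 20 (+£0.85/hr): £{annual_gain(0.85):,.0f}")     # £1,547
```

With those assumed hours, the sketch reproduces the article’s rounded figures of around £900 and £1,500 a year.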

Stronger enforcement should reduce the risk of young people being underpaid. This year, more than 40,000 workers won compensation for earning less than the minimum wage. But of course, these are only employees of firms that have been caught – the actual number of underpaid workers is likely to be higher. More effective enforcement should boost workers’ pay and living standards.

The guaranteed jobs scheme is expected to create around 55,000 jobs – and research indicates that programmes of this kind can help young people remain in employment even after the placement ends. More funding for apprenticeships also opens up opportunities for young people to enter skilled careers.

The other side of the coin

But there are also downsides. Although the minimum wage has increased substantially over the past few years – from £8.91 in 2022 to £12.71 from April – living costs have been rising as well, and increases in other costs have absorbed much of the rise. In particular, average monthly rents have been rising nearly as fast as the minimum wage over the last few years.

Not only that, but employers may respond to higher minimum wages by reducing new hires or relying more heavily on flexible arrangements, such as zero-hours contracts. Evidence shows that as the minimum wage has risen, employers have moved towards flexible, temporary and hourly-paid jobs.

This is concerning for full-time workers, but also for young people relying on part-time work in sectors such as hospitality or retail while studying.

For businesses, the debate has centred on rising costs, but the picture is actually more nuanced. Higher minimum wages do increase labour and administration costs. And employing young workers can be riskier – they have less experience and it is not easy for firms to know how productive they might be compared to more seasoned workers. As a result, higher minimum wages for young workers can encourage firms to substitute towards hiring older, and possibly less risky, workers.

A more cautious approach might have been for the government to address the challenges for young people sequentially, first expanding employment opportunities, and then later raising their minimum wage.

Yet the measures in the budget could create opportunities. Evidence has consistently shown that higher minimum wages can reduce staff turnover by encouraging workers to stay in their jobs, which are now worth more to them. This is particularly true for younger workers, who tend to move jobs more often. This can lower recruitment costs and reduce interruptions for businesses, especially when they have invested in training staff.

Small and medium-sized firms will benefit directly from government-funded apprenticeships. They will no longer have to pay 5% of the training costs, making employing an apprentice more cost effective. And more flexible rules around apprenticeships give businesses greater freedom to tailor training to their needs, helping them build a workforce with relevant skills at a time of increasing technological change.

Today’s young people face significant uncertainty – nobody knows what the labour market will look like in five years’ time. But these changes represent a modest step towards supporting them.

But by increasing the minimum wage at the same time, the government is taking a gamble. On the one hand, higher wages alongside policies aimed at reducing the number of Neets could help young people into work and encourage them to stay there. But on the other, the wage increase could undermine these efforts if firms begin hiring fewer young workers. In that case, even well-designed employment schemes would struggle to offset the loss of opportunities.

The Conversation

Rachel Scarfe is a member of the Labour Party.

ref. With UK unemployment rising, will the government’s plan for young people pay off? An economist’s view – https://theconversation.com/with-uk-unemployment-rising-will-the-governments-plan-for-young-people-pay-off-an-economists-view-271993

How cranberries can be a Christmas cracker for health this festive season

Source: The Conversation – UK – By Dipa Kamdar, Senior Lecturer in Pharmacy Practice, Kingston University

Media_Photos/Shutterstock

From festive sauces to brightly coloured juices, cranberries have long been part of our diets. Beyond their tart flavour and seasonal appeal, these red berries are often described as a superfood with several potential health benefits.

Cranberry supplements are promoted as a convenient way to get these benefits without the sugar or sharp taste of the juice. So what does the science actually say about cranberries, and are supplements as effective as eating the fruit?

Cranberries are best known for their role in helping prevent urinary tract infections (UTIs). The fruit contains compounds called proanthocyanidins. These compounds appear to stop bacteria such as E. coli from sticking to the lining of the urinary tract, which is one of the first steps in developing an infection. This explains why cranberry products may help prevent UTIs, although they do not treat infections once bacteria have already attached and multiplied.

Research supports cranberry’s preventive role in women who experience recurrent infections and in children, although results vary between studies. One study found both cranberry juice and tablets reduced UTI rates in women, but tablets worked slightly better and were more cost-effective. Both forms reduced antibiotic use compared with placebo.

Some research suggests cranberry juice can help reduce urinary tract infections in women and children.
Pixel-Shot/Shutterstock

Cranberries have also been investigated for their effects on heart health. They are rich in antioxidants such as anthocyanins, proanthocyanidins and quercetin. Antioxidants help protect cells from damage caused by unstable molecules called free radicals. Research shows that cranberry juice or extracts can improve several risk factors for heart disease.

These include raising levels of HDL cholesterol, often called good cholesterol because it helps remove excess cholesterol from the bloodstream, and lowering LDL cholesterol in people with diabetes. LDL is sometimes described as bad cholesterol because high levels can build up in artery walls, and it becomes even more harmful when it is oxidised. Oxidised LDL is more likely to stick to artery walls and fuel inflammation, which contributes to plaque formation. Cranberries’ antioxidants may help slow this process.

They may also improve flexibility in blood vessels, reduce blood pressure and lower homocysteine, an amino acid linked to inflammation at high levels. However, not all studies report the same findings, so the evidence remains mixed.

Researchers are also studying cranberries for their possible role in cancer prevention. Lab and animal studies show that cranberry compounds, including ursolic acid, may slow the growth of tumour cells. Some compounds have anti-inflammatory effects, which is important because chronic inflammation can contribute to the development of cancer.

A clinical trial found that cranberry juice may help reduce the risk of stomach cancer by blocking H. pylori, a bacterium strongly linked to this form of cancer, from attaching to the stomach lining. Adults who drank about two glasses of cranberry juice had lower infection rates. Lab and animal studies point to other possible anti-cancer effects, and upcoming research will determine whether these laboratory findings translate to humans.

The antioxidant and anti-inflammatory properties of cranberries may also support brain health. A 2022 study found that adults who consumed freeze-dried cranberry powder each day, which is equivalent to about 100 grams of fresh cranberries, showed better memory for daily tasks and improved blood flow to brain regions involved in learning. They also had reduced LDL cholesterol. High LDL can contribute to hardened arteries, which affects circulation.

Cranberries may also support the immune system. Studies suggest their natural compounds can make it less likely to catch colds or flu. Cranberries are a source of vitamin C, vitamin E, carotenoids and iron, all of which contribute to normal immune function.

Supplements, juice and whole fruits

Cranberry supplements are often promoted as an easier alternative to juice or fresh fruit. They deliver concentrated extracts of dried, powdered cranberries, usually standardised to contain a set amount of proanthocyanidins. This allows people to obtain active compounds without the sugars found in many commercial cranberry juices. However, whole fresh or frozen cranberries provide fibre and a wider range of nutrients that may be missing in supplements. Eating fruit also encourages healthier overall habits, while capsules can tempt people to treat them as a shortcut.

Supplements provide concentrated extracts of dried, powdered cranberries, but the whole fruit provides fibre and a wider range of nutrients too.
Pixel-Shot/Shutterstock

For most people, cranberries are safe to consume in moderation. Large amounts of juice or supplements can cause stomach upset or diarrhoea. Cranberries contain oxalates, natural chemicals that may contribute to kidney stones in people who are prone to them. Sweetened cranberry juices can also undermine potential health benefits by raising sugar intake.

The most important safety concern is the potential interaction between cranberries and certain medicines. Some case reports suggest cranberry juice may enhance the blood thinning effect of warfarin, which increases the risk of bleeding. Evidence is inconsistent, but people taking warfarin are usually advised to avoid large quantities of cranberry products. There may also be interactions with other drugs processed by the liver, although these effects are not well established.

Cranberries, then, whether eaten whole or taken as supplements, offer real health benefits, especially in reducing the risk of recurrent urinary tract infections. They may also support heart health, reduce inflammation and provide some protection against certain cancers, although the evidence for these effects is less clear. Supplements cannot replace a balanced diet, and whole cranberries provide additional nutrients and fibre that extracts cannot match. Some people should exercise caution, particularly those at risk of kidney stones or those taking specific medications.

Cranberries are not a magic solution, but they can be a valuable addition to the table, whether in a festive sauce, a handful of fruit or an occasional supplement. Enjoy them for their flavour and colour, and consider any health benefits a welcome bonus.

The Conversation

Dipa Kamdar does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. How cranberries can be a Christmas cracker for health this festive season – https://theconversation.com/how-cranberries-can-be-a-christmas-cracker-for-health-this-festive-season-269522

Doubts about women in combat don’t stand up to history

Source: The Conversation – UK – By Ashleigh Percival-Borley, PhD Candidate in the Department of History, Durham University

British special forces soldiers take part in a training exercise. PRESSLAB / Shutterstock

Germany has unveiled plans to introduce voluntary military service. From January 2026, all 18-year-old men will be required to complete a questionnaire asking if they are interested and willing to join the armed forces. Women will not be required to fill out this form.

Across Europe, the pattern is similar. In countries where military service is compulsory such as Austria, Cyprus, Estonia, Finland, Greece, Latvia, Lithuania, Switzerland and Ukraine, women’s enlistment remains voluntary.

The German government’s move, which has sparked a debate within the country about the role of women in the armed forces, comes months after the US defence secretary, Pete Hegseth, said in a speech to a hall of generals that if “no women qualify for some combat jobs, then so be it”.

As a former British Army combat medic who served in Afghanistan, what I recognise here is an age-old myth that war is, and always has been, a man’s world.

During my military service, I learned the different sounds made by bullets whizzing past my ears or pinging overhead. I also became familiar with the unmistakable ringing after an IED explosion. I know from experience that competence, professionalism, teamwork and a certain amount of luck all matter on the battlefield. A person’s gender does not.

History agrees with this sentiment. From the Scythian warriors of the ancient steppes – the inspiration for the Amazons, the race of women warriors of Greek mythology – and Viking shieldmaidens, to the Japanese samurai and women fighting in the crusades, evidence reveals women not only participating in battle but leading it.

The modern era has been no different. Women like Harriet Tubman guided raids during the American civil war in the 19th century.

Polish women performed crucial roles in the Warsaw uprising against German forces in 1944. And Britain’s female agents in the Special Operations Executive (SOE) assassinated, sabotaged and led resistance forces in the second world war.

Odette Hallowes joined the Special Operations Executive in 1942 and was sent to occupied France to work with the French resistance.
Imperial War Museums / Wikimedia Commons

Yet these women are largely remembered as exceptions, having performed extraordinary roles due to wartime necessity, rather than as proof of a long tradition of competence and ability under fire. Their stories remain at odds with the wider war narrative in a culture that is uncomfortable seeing women as combatants.

This was evident in Britain following the second world war, which saw the largest mobilisation of women for war work in history. Women were called upon to carry out a variety of war roles, including pilots and anti-aircraft gunners. Some women even parachuted into occupied territories as secret soldiers.

These roles allowed women to bypass the combat taboo. Yet they were still regarded as temporary, effectively excluding them from the broader war story. After the war ended, there was a strong push in Britain for women to return to traditional roles as housewives and mothers.

This was not new. Following the first world war, the 1919 Restoration of Pre-War Practices Act forced women out of the jobs they had taken during the war so that returning soldiers could be reinstated. There was no similar law following the second world war, but the government and media still encouraged women to leave working roles and focus on home life.

Magazines promoted the idea of the perfect homemaker, with Christian Dior’s 1947 “new look” fashion collection reinforcing a nostalgic vision of femininity that symbolised the broader cultural return to pre-war gender norms.

Some women welcomed this return to gendered ideals; others resisted. Pearl Witherington, an SOE agent who commanded 3,500 Maquis resistance fighters in France, was recommended for a Military Cross medal following the war. But, as a woman, she was not allowed to receive it.

Witherington refused a civil MBE honour when offered it instead, writing in a letter to Vera Atkins, an intelligence officer in the SOE: “The work which I undertook was of a purely military nature in enemy occupied country … The men have received military decorations, why this discrimination with women when they put the best of themselves into the accomplishment of their duties?”

Witherington became so important in Nazi-occupied France that the Germans put up posters offering one million francs for her capture. The reluctance to recognise her achievements shows how women’s military service was quietly stripped of its combat significance in the post-war years.

Excluding women no more

Modern conflicts have made the exclusion of women’s presence in war increasingly untenable. Insurgencies, as well as cyber and drone warfare, mean the boundaries between combatants and non-combatants have become much more blurred. Many wars nowadays no longer have clear frontlines, making it harder to distinguish between those who fight and those who don’t.

The increasing complexity of modern battlefields has demanded broader thinking and adaptability beyond traditional combat practices. This shift has contributed to the adoption of gender-neutral military standards and the more widespread inclusion of women in combat roles in many armies.

Women are serving on the frontlines in Ukraine.
Dmytro Sheremeta / Shutterstock

The British Army has employed gender-neutral physical standards for combat roles since 2019. Male and female recruits must pass a 4km march carrying 40kg of equipment in less than 40 minutes, followed by a 2km march carrying 25kg of equipment in under 15 minutes.
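
For a sense of the pace those standards demand, the arithmetic below converts each stage into a minimum average speed – purely illustrative, using only the distances and time limits quoted above:

```python
# Minimum average speed implied by each loaded-march standard.
# Uses only the distances and time limits quoted in the text above.

def required_speed_kmh(distance_km: float, time_limit_min: float) -> float:
    """Average speed (km/h) needed to finish within the time limit."""
    return distance_km / (time_limit_min / 60)

print(f"4km carrying 40kg in 40min: {required_speed_kmh(4, 40):.1f} km/h")  # 6.0
print(f"2km carrying 25kg in 15min: {required_speed_kmh(2, 15):.1f} km/h")  # 8.0
```

In other words, a sustained 6km/h march under a 40kg load, followed by an 8km/h pace under 25kg.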

The Australian Defence Force has adopted similar standards since 2017, while the Canadian military has been employing women in combat roles for 25 years. As a former combat medic, I support this approach.

War has always been a test of human skill and courage, not of gender. A bullet doesn’t care which body it shatters and nor should history.

The Conversation

Ashleigh Percival-Borley does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Doubts about women in combat don’t stand up to history – https://theconversation.com/doubts-about-women-in-combat-dont-stand-up-to-history-268589

Teenagers are preparing for the jobs of 25 years ago – and schools are missing the AI revolution

Source: The Conversation – UK – By Irina Rets, Research Fellow, Institute of Educational Technology, The Open University

Matej Kastelic/Shutterstock

The government has recently released its national youth strategy, which promises better career advice for young people in England. It’s sorely needed: for teenagers today, the future of work probably feels more like a moving target than a destination. Barely three years after ChatGPT went mainstream, the labour market has already shifted under young people’s feet.

In the US, job postings for roles requiring no degree have dropped by 18% since 2022, and roles requiring no prior experience by 20%. Administrative and professional service jobs – once key entry points for school-leavers – are down by as much as 40%.

While headlines often warn of looming mass job losses due to GenAI, the reality is more complex. Jobs are not simply disappearing but transforming, and new kinds of jobs are appearing.

Research has projected that the adoption of new technologies will displace around two million jobs in the UK by 2035. However, this loss is expected to be offset by the creation of approximately 2.6 million new roles, particularly in higher-skilled occupations and healthcare roles.

Despite a transformed job market, OECD data from 80 countries shows that most young people still aim for traditional roles – as architects, vets and designers as well as doctors, teachers and lawyers – even as demand rises in digital, green and technical sectors. One-third of students in the OECD survey said school has not taught them anything useful for a job.

Students from more disadvantaged backgrounds are hit hardest. They engage less in career development activities, have less access to online career information and are less likely to recognise the value of education for future transitions.

Meanwhile, the very skills young people say they lack – digital skills and being informed, followed by drive, creativity and reflection – are the ones the labour market now demands.

The workforce challenge is, fundamentally, an education challenge. But schools aren’t keeping up with the world students are entering. Despite unprecedented labour-market change, teenagers’ career aspirations have not shifted in 25 years.

While older students and graduates often have networks or some workplace experience to fall back on, school-leavers do not. Yet they need to prepare for a future in which the labour market is changing faster than ever.

Future-proof skills

Young people are told they need “skills for the future”. But the evidence about which skills matter is messy, uneven and often contradictory.

A few things are clear, though. One is that digital and AI-related skills now carry significant premiums. Workers with AI or machine-learning skills earn more, and early evidence suggests that GenAI literacy can boost wages in non-technical roles by up to 36%.

Cognitive skill requirements have also surged. Critical thinking, prompt engineering – the ability to ask the right questions and provide clear, context-rich instructions to AI tools to obtain relevant results – and evaluating AI outputs are increasingly valued.
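
To make that contrast concrete, here is a hypothetical illustration of a vague request versus the kind of clear, context-rich instruction described above. Both prompts are invented for this example, not drawn from the article or any study:

```python
# Hypothetical example of prompt engineering: the same request phrased
# vaguely versus with audience, scope and constraints spelled out.
# Both strings are invented illustrations.

vague_prompt = "Write about jobs and AI."

context_rich_prompt = (
    "You are briefing a school careers adviser. In under 150 words, "
    "summarise how entry-level job postings have changed since 2022 "
    "and list three skills employers increasingly ask for. "
    "Use plain English suitable for 16-year-olds."
)
```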

School-leavers are likely to need AI skills in the job market.
MAYA LAB/Shutterstock

However, not everything can be outsourced to AI – especially numbers. While large language models (LLMs) excel at text, they perform less well on quantitative tasks that involve pattern detection or numerical reasoning, although this may change with newer models. This makes strong numeracy a growing advantage for humans, not a declining one.

Creativity and empathy also matter – even though AI is everywhere. The future paradox is clear: young people are expected to adapt to AI systems while also offering the human qualities that machines cannot. They must be data-savvy and emotionally intelligent, digitally fluent and genuinely collaborative.

It doesn’t help that even employers are confused. Many organisations, especially small and medium-sized businesses, may not fully understand which AI-related skills they need or how to identify them. This confusion shows up in job ads, which shape who applies and who is excluded.

My research with colleagues shows, for example, that language describing jobs influences the gender and racial makeup of applicants. Ads emphasising flexibility and caring qualities tend to attract more women, reinforcing workforce segregation. If employers do not know what skills they need, or what signals they are sending, it is unreasonable to expect schools to fill the gap alone.
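
As a rough illustration of how this kind of analysis works, the sketch below counts gendered terms in a job ad. The word lists are short samples chosen for the example, in the spirit of published gendered wordlists – they are not the lists used in the research:

```python
# Toy gendered-language analysis of a job ad. The wordlists below are
# short illustrative samples, NOT the lists used in the cited research.

FEMININE_CODED = {"caring", "supportive", "collaborative", "flexible"}
MASCULINE_CODED = {"competitive", "driven", "ambitious", "decisive"}

def code_job_ad(text: str) -> dict:
    """Count feminine- and masculine-coded words appearing in an ad."""
    words = {w.strip(".,;:!?").lower() for w in text.split()}
    return {"feminine": len(words & FEMININE_CODED),
            "masculine": len(words & MASCULINE_CODED)}

ad = "We seek a driven, competitive self-starter to join our flexible, supportive team."
print(code_job_ad(ad))  # {'feminine': 2, 'masculine': 2}
```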

Identifying demand

The UK lacks a coordinated national labour market information system that could help schools, policymakers and employers see – in real time – where demand is emerging.

Preparing teenagers for the future cannot be left to a single careers lesson or a one-off talk from a visiting employer. Nor can it rely solely on career advisers operating in isolation.

A whole-school approach, supported by the wider employment and labour-market ecosystem, would make a significant difference. This means linking every subject to real-world skills and careers, and every student routinely encountering employers, workplaces and skills-building opportunities. Teenagers need up-to-date information and advice about higher education and careers, and support that challenges stereotypes and barriers.

This is not about telling students there is a “right” job or a single future path. It is about giving them tools to navigate uncertainty with confidence.

Young people need schools that understand the world they are entering, and employers who understand what they are asking for. Most of all, they need systems that recognise the future of work has changed – and help them change with it.

The Conversation

Irina Rets does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Teenagers are preparing for the jobs of 25 years ago – and schools are missing the AI revolution – https://theconversation.com/teenagers-are-preparing-for-the-jobs-of-25-years-ago-and-schools-are-missing-the-ai-revolution-270630

Christmas at the end of the world: the curious allure of festive apocalypse films and TV

Source: The Conversation – UK – By Andrew Crome, Senior Lecturer in History, Manchester Metropolitan University

Navigating the chaos of Christmas celebrations can feel a bit like fighting through the battle of Armageddon. Yet while it might be tempting to escape this with a hot chocolate and another viewing of Love Actually, Christmas films needn’t be jolly.

Each year brings its share of snowbound action films and bauble-laden slasher movies. But some filmmakers choose to take things a step further – to the apocalypse. If you find yourself longing for the end of civilisation as December 25 nears, fear not – film and TV have you covered.

This link isn’t as counterintuitive as it might seem. In the Christian church calendar, the lead-up to Christmas is supposed to heighten anticipation for Christ’s return. The theme of apocalypse resonates through some of the best-known Christmas images: Sandro Botticelli’s famous Mystic Nativity (1500), for example, depicts the birth of Christ along with scenes from the Bible’s Book of Revelation. An inscription declares that the artist was living through “the second woe of the Apocalypse”.

My research has explored how and why popular culture might use Christmas when depicting “the end”. Like the Ghost of Christmas Yet to Come, I can therefore point you in the direction of some of the best festive end-times stories.

The trailer for Night of the Comet.

Some seasonal horror films use festive settings to add a lighter, playful touch. In the December zombie apocalypse Night of the Comet (1984), Christmas trees and Santa suits appear amid the chaos, while Scottish musical horror Anna and the Apocalypse (2017) creatively turns giant candy canes into weapons against the undead.

Others use the holiday to generate strong emotions. Christmas is the most widely celebrated cultural festival in the west, so depictions of it carry an obvious emotional charge. This is why stories of the imminent destruction of Earth show families recreating Christmas celebrations at all times of the year as they await the end, as in Last Night (1998) or the Netflix animation Carol and the End of the World (2023).

TV shows, from Fear the Walking Dead (2015) to the comedy The Last Man on Earth (2018), have depicted characters drawing strength from memories of festivities or attempting to recreate a post-apocalyptic Christmas.

This reflects religious studies researcher Christopher Deacy’s observation that even secular visions of Christmas often contain a sense of “eschatological hope” – the desire to enter a transformed, ideal and perfected world.

While in English the word “apocalypse” suggests catastrophe or extinction, in Greek the term signifies a “revelation” of reality on both a personal and cosmic level. A last Christmas, therefore, serves as revelatory for characters – as they realise what truly matters to them beyond their own needs, fulfilling one of the classic functions of an apocalyptic story.

Christmas after the bomb

Ancient depictions of the apocalypse, like the Book of Revelation, often sought to confront readers with the horrors awaiting those who did not repent. In apocalyptic media, Christmas can serve a similar, confrontational role. The 1939 animated, Oscar-nominated short Peace on Earth depicted humanity’s destruction through endless warfare, with animals rebuilding a new world after discovering the Bible and the hope of Christmas.

Hanna-Barbera’s 1955 remake, Good Will to Men (also Oscar-nominated), heightened the Dickensian festive imagery before delivering an even more devastating vision, as an elderly mouse graphically recounts humanity’s annihilation by the atomic bomb.

Although Christmas survived the fallout in this instance, in British productions it wasn’t so lucky. The haunting portrayal of the first Christmas after the bomb, in Peter Watkins’ 1965 docudrama The War Game, showed an unshaven and haggard vicar playing Silent Night on a gramophone to traumatised survivors. The carol’s lyrics about hopeful birth and childhood are undercut with narration revealing the fate of survivors – a mother who will give birth to a stillborn child, a child who will be bedbound until death, and other youngsters expressing their desire to die.

Hanna-Barbera’s Good Will to Men.

Even grimmer is the brief festive scene in the BBC’s notorious 1984 nuclear apocalypse film, Threads. A group of shattered survivors sit in silence around a fire, the only soundtrack a baby’s wails in a grim parody of the nativity scene. The on-screen caption identifies the date only as December 25, rather than as Christmas Day. The festival has ceased to exist here; it is a day of subsistence survival like every other.

Perhaps the bleakest depiction in recent years belongs to the 2021 British black comedy Silent Night. When a group of British families gather in the country to celebrate Christmas, it slowly becomes apparent they are awaiting certain death from climate catastrophe on Boxing Day. The characters are armed with government-issued suicide pills and a special “Exit” app, and the cosy festive stylings of most of the film give way to toxic fogs, horrifying injuries and parents euthanising their own children.

These apocalyptic scenarios are what researchers have described as “avertive”. They portray a horrifying future to encourage viewers to fight against it, whether by protesting against nuclear proliferation or environmental destruction.

Although this sort of vision might not seem particularly festive, it has a deeper root in Christmas storytelling than we might think. After all, when Scrooge is shown Tiny Tim’s death in A Christmas Carol (1843), or George Bailey the corruption of Pottersville in It’s a Wonderful Life (1946), it is precisely to avert those horrific visions from becoming a reality.

So why not wrap up in a warm blanket, grab a mulled wine, and settle down to consider the end of everything – and your role in it – this Christmas? On second thoughts, maybe Love Actually doesn’t sound so bad.


The Conversation

Andrew Crome does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Christmas at the end of the world: the curious allure of festive apocalypse films and TV – https://theconversation.com/christmas-at-the-end-of-the-world-the-curious-allure-of-festive-apocalypse-films-and-tv-271025