Following recent antisemitic violence and aggression, calls from some quarters for a temporary ban on pro-Palestine marches have gained traction. Conservative party leader Kemi Badenoch has firmly supported a ban, while Keir Starmer, the prime minister, has suggested that some protests may need to be stopped. The government’s independent reviewer of terrorism legislation has called for a moratorium on such marches.
Those who have made such calls do so on the grounds that pro-Palestine marches, whatever their intent, are contributing to a “tone of Jew hatred within our country”, in the words of Chief Rabbi Sir Ephraim Mirvis. Starmer has also expressed concern about the “cumulative” effect of the marches on Jewish communities.
This is an understandable position in some ways. There can be little denying that some participants in pro-Palestine events have articulated antisemitic positions. And in a period where more clearly needs to be done to address antisemitic violence and aggression, a ban appears to provide a way for authorities to send a clear message that there is no place for antisemitism in Britain today.
Yet there are also problems with such proposals. As policymakers consider their options, it is important that these problems are taken seriously.
Evidence on the relationship between protest activity and targeted violence outside of the protest arena is limited. The available evidence points to a complex and context-dependent relationship.
Some studies have found that when protests increase, extremism and extremist violence can also rise, especially when society is more divided. Such a pattern has been observed, for example, in the US, where the bipartisan thinktank the Center for Strategic and International Studies identified heightened protest activity and rising domestic terrorism during the early 2020s.
However, many studies of nonviolent protest show that it reduces political violence, by providing nonviolent means of pursuing social and political objectives.
Where heightened protest activity coincides with increased extremist violence, it is often unclear whether protests or marches themselves are the cause. Today, people participating in social movements are likely to access and share information through a range of (often unregulated) spaces both offline and online. It is difficult to assess how important protests themselves might be in influencing people to go on to engage in targeted violence.
This is not simply academic nitpicking. It means that it is possible that a ban on marches would have little to no effect on the use of targeted violence against Jewish communities.
In fact, there is a distinct possibility that banning pro-Palestine marches, even if only temporarily, might actually increase violence.
Studies show that violence is less likely to escalate when moderate groups within protest movements are present and have influence. This has been observed, for example, in research into the escalation or inhibition of violence during waves of far-right protest.
Expanded state repression – such as bans on certain forms of previously legal protest – can weaken the position of moderate factions. When this happens, calls for restraint and advocacy of non- or less-violent strategies can lose credibility within the movement, weakening the “internal brakes” on violence.
Practicalities of enforcement
A moratorium on pro-Palestine marches would also raise many questions about the practicalities of any restrictions. For one, calls on the police to ban other contentious demonstrations that risk hostility towards different groups would increase.
What particular types of action would be banned? Marches? Demonstrations? Would size be a factor? Would it cover a protest against the ban on the protest? What about other forms of action such as sit-ins, information stands or coordinated online action? And what sanctions would be imposed on those who did not comply?
Attempting to enforce such bans could become a significant drain on already stretched public resources, not least because activists would probably seek to increase pressure on authorities because of those costs. This is one of the most obvious lessons to draw from responses to the government’s attempts to ban the group Palestine Action.
In addition to this, police have also recently been authorised to consider the “cumulative impact” of protests on local areas when policing. They have had to grapple with how and when to incorporate this in addition to their usual powers.
Before introducing a ban, it’s important to think about the example it would set and how it could influence future decisions about the right to protest. The UK would be less able to criticise authoritarian countries and illiberal democracies that misuse counter-extremism and counter-terrorism powers to limit people’s freedoms.
None of this is to deny the urgency of confronting antisemitic violence and aggression in the UK. This requires sustained political commitment, effective policing and community protection. But restricting the right to protest is a blunt and risky instrument.
The available evidence suggests it may do little to reduce harm and could, in some circumstances, make matters worse. Politicians should therefore be cautious before treating bans on marches as a solution to complex and deeply rooted problems.
Joel Busher has received funding from the Centre for Research and Evidence on Security Threats (CREST) for his work on the escalation and inhibition of political violence.
Tufyal Choudhury does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
The upcoming Senedd elections may shift the balance of power in Wales. Any new government must immediately grapple with the significant ongoing challenges of embedding educational reforms across the additional learning needs system.
Recent policy proposals to change the system of support for children with special educational needs in England have brought a heightened focus on how education systems might best support all learners. In Wales, special educational needs and disabilities are referred to as additional learning needs (ALN).
Wales reached a major milestone in August 2025 when the ALN code came fully into effect, four years after its publication.
Despite the devolution of education and increasing divergence in education policy between Wales and England, the ALN code in Wales shares some of the same ambitions as England’s recent policy plans.
These reforms in Wales sought to increase the rights and autonomy of children and young people. They provide statutory individual development plans for those needing anything additional to universal learning provision. They also extend support for learners aged up to 25. The intention is to improve consistency and strengthen multi-agency collaboration across education, health and social care.
The progress of reform
The additional learning needs reforms in Wales reflect a commendable shift towards rights-based, person-centred planning and autonomy for children and young people and their families.
This is also a key tenet of the Curriculum for Wales, which has been implemented in primary schools since 2022 and rolled out gradually in secondary schools over subsequent years. The curriculum framework focuses on learner voice and on providing a broad, purpose-led and flexible curriculum. It is designed to ensure that even those from disadvantaged backgrounds or with complex needs are supported to access a meaningful education.
A key issue relates to the identification of learners with ALN. Under the new system, there has been a 53% decrease in the number of learners being identified as having ALN. This is despite a reported increase in children presenting with more complex needs, indicating that learning needs may in fact be increasing. Data also suggests that it is those with low to moderate needs who are much less likely to be formally identified.
It has been suggested that this reduction could be due to children who might previously have been identified with ALN being catered for through an improved universal offering.
However, teachers have reported that the proportion of learners in their classes with ALN has increased over the past five years. A majority – 65% – of teachers in Wales reported that there were learners in their classes who still needed additional support, but were no longer identified as having ALN following changes in identification criteria.
Lacking resources
This has hugely increased workloads as teachers attempt to provide adequate learner support. At the same time, the number of in-house specialist staff available to advise and support delivery has dramatically reduced. Without the resources to support more learners with additional needs, many teachers have reported that children are often not receiving the education they are entitled to.
There have been significant strides towards developing inclusive schools across Wales. Even in the best cases, though, there is a long way to go. In reality, the overall picture behind the reduced identification of ALN points to problems with identification criteria and resources, and to doubts over whether the current policy encompasses all children in need, rather than to a sudden shift to high-quality inclusive education.
Schools report an increase in local authorities refusing requests for assessments or access to support for struggling learners. They have suggested the bar is being raised for access to support, without clarity or transparency. There’s also a clear indication from specialist staff in Wales that they have insufficient time to fulfil their ALN duties.
This suggests that processes and resources for identifying learners with ALN are playing a significant part in the reduced identification. Many learners could be slipping through the net, rather than experiencing effective inclusive provision.
This tension between policy intent and practice is familiar territory when it comes to inclusion. There are ongoing concerns that legislative reform has outpaced operational readiness and available resources, leading to a crisis point.
This crisis is exacerbated for Welsh-medium learners. The policy intention is for a fully bilingual system. But finding Welsh-medium specialists and honouring language preference is proving challenging. This has led to families struggling to find support in their preferred language. Such battles are at odds with both Welsh language policy and the principles of person-centred planning and autonomy that are central to the reforms.
Whatever the outcome of the Senedd elections, educators and families across Wales will be hoping for an increased sense of momentum and urgency. They’ll also be looking for a commitment to sustained and appropriate levels of funding to ensure learners in Wales can be supported to access their education.
Emily Roberts-Tyler does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
Four humans recently looped around the Moon. Their vessel, the Orion capsule flown on NASA’s Artemis mission, was a thin metal shell whose life-support system kept them alive: it provided a carefully balanced atmosphere, a closed water loop, a finite supply of food and a means of disposing of human waste. The life support was not optional. It was a necessity.
Consider this: not once in the history of human spaceflight has an astronaut been known to tamper with their life support system. No one has ever decided to vent some oxygen for fun. No one has argued for a personal right to increase their CO₂ output. Sabotage is unthinkable – socially intolerable. Their fellow crew members and mission control would intervene immediately.
Now consider Earth.
We are doing to our planetary life support what no astronaut has done to theirs. We are damaging it – venting carbon, acidifying the oceans, stripping topsoil and collapsing biodiversity – not maliciously, but with a shrug. It is legal. It is profitable. And in most circles, it is entirely socially acceptable.
The Victorian novelist George Eliot would have understood why. In Middlemarch, she showed us a town that preferred a satisfying, simple myth (that a charismatic quack can cure ills) over difficult, complex truths (the role of germs, statistics, slow systematic change). Humans, she argued, do not naturally reach for what is true. We reach for what is near, simple and emotionally rewarding.
Climate science is the anti-myth. It is delayed, diffuse, impersonal and global. It asks us to change behaviour today for a benefit that will arrive decades from now, elsewhere on the planet, for people we will never meet.
The Artemis crew members live by a different narrative. They are guided by a simple, undeniable truth. That they are in a small, fragile vessel. The life support is essential. Damaging it is not an option.
Earth is a vessel too. It is just larger, its support systems less visible, and the consequences of damage slower to arrive. As the economist Kenneth Boulding argued 60 years ago, we must learn to see our planet as a closed system – not an open frontier.
What narrative could protect Earth like it protects astronauts?
Not a policy paper. Not a carbon tax (though we need those). A story.
We have candidate myths already. None is perfect, but each is more powerful than the cold scientific facts.
The one pane of glass narrative holds that Earth is not a planet we live on. It is a pressurised cabin with a single irreplaceable window. Every tonne of CO₂ etches a crack in that glass. You wouldn’t hammer the Artemis capsule window. Why do it here?
The blood of the body myth portrays the biosphere not as nature but as the collective and extended organ system of humanity. Deforesting the Amazon and burning oil are not business as usual: they are acts of self-harm.
The crew of the damned narrative hinges on the concept that you are not a consumer. You are a temporary tenant on a multi-generational voyage. Nature and the previous shift built the vessel. The next shift will inherit it. To degrade Earth’s systems is to defile the ancestors and curse the children. That is not a crime. It is a sin that will outlast your name.
None of these stories will work if they remain metaphors. They become common sense only when they are visibly, socially and economically enforced – when a CEO who opens a new coal mine is treated with the same universal horror as an astronaut reaching for the oxygen valve.
Imagine every human decision – personal, professional, political – tested against one simple question: “If we were in a capsule looping around the Moon, would this be a safe use of our shared life support?”
Repeated sufficiently, the right conclusion would become habitual. For those resisting, the rest of the crew would intervene. On Earth, there is no mission control – only us.
Chris Rapley does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
Linus Pauling was one of the most brilliant scientists of the 20th century. He won two Nobel prizes and transformed our understanding of chemical bonds and the structure of proteins. Late in his career, though, he became famous for something very different: a passionate belief that very high doses of vitamin C could help people with cancer. Many doctors scoffed. When Pauling himself later died of cancer aged 93, he was held up as a classic case of the “halo effect”: being a genius in one field doesn’t guarantee wisdom in another.
Half a century on, the story looks more complicated. Pauling was wrong in important ways, but he was not entirely wrong. Modern research is giving vitamin C a second look in cancer, and it turns out that under certain conditions it can behave less like a gentle vitamin and more like a drug.
Pauling’s vitamin C story began in the 1970s, when he teamed up with the Scottish doctor Ewan Cameron and gave patients with advanced, incurable cancer very large amounts of vitamin C – first as a drip into a vein, then as tablets. Compared with similar patients who did not get vitamin C, they reported that the vitamin‑treated group lived longer and felt better. For some, they suggested, survival could be several times longer.
Two large trials, run by the Mayo Clinic, a leading non-profit medical centre in the US, then put this to the test. The results were clear: there was no benefit.
Patients who took vitamin C pills lived no longer than those who didn’t. For most oncologists, that was the end of the matter. Vitamin C was filed away with other “alternative” remedies, and Pauling’s late-career crusade was widely seen as a sad mistake.
What neither the trials’ critics nor their defenders noticed at the time was this: Pauling and Cameron had started with vitamin C given into a vein; the Mayo Clinic trials used tablets only. That matters because the gut can only absorb so much vitamin C. Once you reach a modest daily dose, the body simply stops taking in much more. Swallow as many tablets as you like, and the level of vitamin C in your blood levels off.
By contrast, a drip into a vein can raise blood levels to tens or even hundreds of times higher than tablets ever could. At those extreme levels, vitamin C starts to behave differently inside the body.
At everyday levels, vitamin C acts as an antioxidant: it mops up harmful molecules and protects our cells. At very high levels, especially around tumours, it can flip roles.
In laboratory studies, high-dose vitamin C helps generate hydrogen peroxide, a reactive substance that can damage cells. Cancer cells seem especially vulnerable because they are already under stress. They grow rapidly, often in areas with poor blood supply, and produce lots of reactive molecules of their own. Their internal “cleanup” systems are stretched thin.
Add a sudden pulse of hydrogen peroxide and many cancer cells tip over the edge: their DNA and energy machinery are damaged and they die. Normal cells, which are under less strain and have better defences, are more likely to survive. In this way, very high doses of vitamin C behave less like a daily supplement and more like a weak, selective chemotherapy drug. Crucially, the doses needed for this effect cannot be reached with tablets.
What the latest evidence shows
In people, the evidence is still early and mixed. Small trials have given high-dose vitamin C through a vein to patients with hard-to-treat cancers such as ovarian, pancreatic or brain tumours. So far, many patients can receive large doses several times a week without serious side-effects. Problems can occur, especially in people with poor kidney function or rare inherited conditions, which is why this is not a harmless wellness drip to be sold on the high street.
A few studies suggest that adding vitamin C infusions to chemotherapy may help some patients live a little longer or help with side-effects, but other studies show no clear benefit. The trials are small and varied, so we cannot draw firm conclusions.
One consistent signal is quality of life: patients given vitamin C alongside chemotherapy often report less fatigue, less pain and fewer side-effects, such as nausea. For someone with advanced cancer, that matters, even if it is not the sweeping cure Pauling once promised.
Lab work also hints at subtler roles. Vitamin C is involved in enzymes that influence how our DNA is “marked” and read, and in how cells divide and respond to low oxygen – important in cancer behaviour.
In some experiments, high vitamin C levels make cancer cells grow less aggressively and make them more sensitive to treatment. There are even early suggestions that vitamin C may help the immune system recognise and attack tumours, though this remains speculative.
Partly right
So, was Pauling right after all? The fairest answer is that he was partly right, for reasons he did not fully understand, and he exaggerated the promise. He was wrong to promote vitamin C tablets as a powerful cure for cancer. Large, careful trials have not found that swallowing high-dose vitamin C helps people with established cancer live longer. He was also wrong to present vitamin C as a near-universal remedy for many illnesses.
But he was not entirely wrong to suspect that vitamin C might have a special role in cancer treatment. He sensed, long before we could prove it, that very high doses given into a vein would behave quite differently from ordinary supplements.
Modern research has confirmed that intravenous vitamin C reaches much higher levels in the blood and has distinct biological effects. What we do not yet have are large, definitive randomised trials showing that high-dose intravenous vitamin C clearly prolongs life for most cancer patients. Until we do, it should be seen as experimental – promising enough to study, but not proven enough to replace standard therapies. Any use belongs in clinical trials or in carefully supervised medical settings, not in clinics selling expensive “immune boosts”.
The “vitamins in cancer” story continues to evolve. If the story of vitamin C and cancer teaches us anything, it is that science rarely moves in straight lines. A bold idea, some flawed early studies, a fierce backlash – and then, years later, a quieter, more careful return to the question.
Pauling may never be fully vindicated, but neither was he simply deluded. In his enthusiasm, he may have glimpsed a sliver of truth long before the rest of us knew how to look for it.
Justin Stebbing does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
Source: The Conversation – UK – By Giulia De Togni, Chancellor’s Fellow, School of Population Health Sciences, University of Edinburgh
The robot pauses at the edge of the room as an engineer checks its sensors. Then, with a soft mechanical hum, this humanoid machine begins to move. It lifts a mannequin from a bed, slowly and carefully. The engineers hold their breath.
I am in a robotics lab in Tokyo, Japan, as part of my Wellcome research fellowship. The engineers have repeated this test hundreds of times over several weeks, with mixed results.
Japan has one of the world’s oldest populations, and a strained health and care workforce. It has also long been the global leader in the development and deployment of care robots.
Government-led initiatives such as Society 5.0 and Moonshot promote a “super-smart” society in which, by 2050, robots could be integrated into everyday life. One early example is the impending trial of humanoid baggage handlers at Tokyo’s Haneda airport.
In a care sector that is globally under pressure, different types of robot – from humanoid and pet-like “companions” to more straightforward mechanical aids – could prove useful. Some help lift people, reducing physical strain on care workers. Others remind a patient to take medication, support rehabilitation exercises, and monitor their vitals.
However, my research shows there is still a big gap between staged robotic demonstrations and everyday reality.
Uniquely human skills
Many of the robots I observed were tested in carefully controlled environments. Floors were cleared, lighting was adjusted, engineers stood nearby ready to step in. In some cases, the robots’ actions were partly, if not entirely, controlled from a distance.
In contrast, real care environments are busy, unpredictable and crowded. People move suddenly. Their needs change from moment to moment. Technologies that work well in labs still struggle in these settings.
A carer can notice a change in someone’s mood and adjust how they speak. They can offer comfort without being asked. These are uniquely human skills. As one family caregiver put it: “The promise of robotic care is practical, but the experience of care is emotional – that’s where the tension lies.”
Some family carers and professional careworkers welcomed the idea of robotic assistance, especially for physically demanding tasks like lifting. Others worried that too much reliance on machines could make care feel impersonal.
“To some older adults, these technologies are helpful tools,” said one careworker. “To others, they feel confusing, frustrating – a glimpse of a future they never asked for.”
Such perspectives are often missing from media narratives that focus on robot success stories. In Japan, these are shaped by government strategies and economic priorities. Innovation is never neutral, reflecting political agendas about how society should respond to ageing and labour shortages.
The challenges over care that societies face are not only technical but social, ethical and cultural. They raise questions about what care should be, how it is valued, and what kind of future we want. “Among families and caregivers, hope and hesitation sit side by side,” a technology developer told me. “Efficiency is often welcome, but not at the cost of losing the human touch.”
The future for care?
While Japan has been successful in exporting socially assistive robots such as Paro (a therapeutic robot that resembles a baby seal) and the humanoid Pepper, China is rapidly expanding the market with more affordable, mass-produced technologies and humanoid innovation.
However, we are still a long way from the vision of care robots feeding, washing and otherwise supporting people in the way human carers do every day. Participants in my research, including technology developers, all agreed that robots should never fully replace human carers.
Technologies that assist with lifting, mobility and routine monitoring are the most likely to become widely used and ethically and socially accepted. In these areas, robots can complement human care rather than try to replace it.
Care is, at its core, a deeply human activity, not just a series of programmable tasks. It relies on relationships, trust and mutual understanding. Robots may support these processes, but they cannot replace them.
Additionally, some technologies are likely to remain expensive, available mainly to well-funded care homes or private users. This raises issues about access to good-quality care.
Care robot developments in Japan show what can be achieved through sustained investment and political support. But they also shed light on the large amount of work needed to ensure responsible research and innovation practices in this area.
The real question is not just what robots can do. It is what kind of care we want in the future – and how technology can support it without deepening inequalities, limiting access to good-quality care, and losing the power of human touch.
Giulia De Togni received funding from the Wellcome Trust. She is affiliated with the University of Edinburgh.
Sir David Attenborough has mastered the craft of storytelling. He has undoubtedly inspired generations of people around the globe to love and care for the natural world. And in doing so, he’s become one of the most recognisable – and most trusted – faces on our screens.
Now, he’s celebrating his 100th birthday and a lifetime of wildlife filmmaking. As part of The Conversation UK’s climate storytelling strand, four experts critique how he has influenced everything from conservation and documentary production to the communication of the biggest story of all – climate change.
Scientific insight
Ben Garrod, science broadcaster and Professor of Evolutionary Biology and Science Engagement at the University of East Anglia, has presented alongside Attenborough in several landmark documentaries. Here he reflects on Attenborough’s passion for furthering our scientific understanding of the natural world.
I once sat on a remote beach with Attenborough, near the very tip of South America. I can still clearly remember the warmth of the rounded, flat stones beneath me. We sat only a metre or so apart. We’d just spent the morning filming the excavation of the largest dinosaur ever discovered.
Over lunch, Attenborough had recalled we were close to a beach he’d filmed at years before, where grey whale mothers drew in close to shore with their calves to rub against the stone in the shallows to exfoliate their skin. As luck would have it, it was the perfect time of year and before long, there we were watching a mother and calf just a few metres offshore.
Facts and figures bubbled out of Attenborough excitedly, not at all like the calm and more measured way we’re all so used to. For those few minutes, he was childlike in his wonder and excitement at the scene in front of us and I marvelled at how he has not only maintained that love for the natural world for so long but how he has always so passionately shared it with the rest of us.
For a century now, Attenborough’s life has been intimately interwoven not only with humanity’s growing scientific understanding of the natural world but also with its accelerating loss. Over a career spanning more than 70 years, he has been our most trusted and prolific mediator between scientific knowledge and the public.
His early landmark BBC series Life on Earth: A Natural History (1979) did something few academic texts ever could. It made the complexity of evolutionary biology accessible. Across his work, natural selection, adaptation, ecology and behaviour are not presented as intangible concepts but as organic processes shaping form, function and ultimately survival across the natural world.
In doing so, Attenborough helped normalise evolutionary thinking for hundreds of millions of viewers worldwide, embedding complex scientific principles into popular culture, right in our living rooms.
Central to his work has been a commitment to scientific accuracy. Attenborough’s programmes have been developed in close collaboration with academics and field researchers, ensuring narratives about animal behaviour, ecosystems and biodiversity reflect current evidence.
This relationship between science and storytelling has been crucial. Rather than dumbing down complexity, Attenborough’s “everyday” approach demonstrates that audiences can engage with content that could all too easily be written off as belonging only to more academic and scientifically literate viewers.
Yet the tone of his work has changed. His early documentaries were characterised by a sense of abundance and discovery. Over time, as scientific evidence for biodiversity loss and climate change mounted, his work shifted accordingly. More recently, his documentaries increasingly shine a light on human impact, habitat destruction and extinction risk. This evolution in tone mirrors the science itself, highlighting Attenborough’s credibility as a communicator willing to adjust his message as the evidence demands.
Attenborough’s contribution to conservation has not come through activism alone. Research shows that an emotional connection to nature precedes any behavioural change. Attenborough has actively helped build the public conditions necessary for conservation policy and action by fostering wonder, curiosity and empathy for the natural world. His influence can be traced in the generations of scientists, conservationists and educators who cite his programmes as formative experiences.
For many, particularly those without access to wild spaces, Attenborough’s work provides an opportunity and gateway to encounter wild animals and remote ecosystems but also local habitats, helping give us all access to the wonder he perceives in the world around him.
As he turns 100, Attenborough's legacy is surely inseparable from the global environmental challenges we now face. He has helped society understand not only how life evolved but, more importantly, why it matters that we protect it now. In an era defined by ecological crisis, his work reminds us that scientific knowledge is most powerful when it connects people to the living world so strongly that it compels us to care enough to protect it, so that we might carry on his legacy and, like him, act as stewards.
Natural history filmmaking
Jean-Baptiste Gouyon, Professor of Science Communication at the UCL Department of Science and Technology Studies, explains the impact Attenborough has had on natural history television.
In the early 1950s, television was taking off across Britain, but the BBC was still finding its visual voice. Its controller, Cecil McGivern, warned in June 1952 that there was “far too much emphasis…on the spoken word and far too little on the thing seen”. Most early television producers had come from BBC radio and initially made programmes that resembled radio with pictures.
Into this world stepped a young David Attenborough, unencumbered by a career in sound, ready to invent a new language for television and, in the process, reshape natural history filmmaking. At 26, he earned his first natural history credit as producer of The Coelacanth (1953), a 20-minute programme prompted by the capture of a live coelacanth “living fossil” fish off Madagascar.
Eschewing sensationalism, Attenborough tied the story to Darwin’s theory of evolution. This use of wildlife programmes to communicate scientific ideas became his trademark.
The programme blended prerecorded footage with live studio sequences featuring evolutionary biologist Julian Huxley, who used the coelacanth to illustrate life’s transition from sea to land.
With the Zoo Quest series (1954), Attenborough began reshaping wildlife television. For these programmes, he travelled to exotic places with staff from the London Zoo to capture animals for the collection. Each episode relied on prerecorded film linked by live studio sequences, allowing tighter narrative control. The hero in the films, shot by Charles Lagus, was Attenborough himself, who back in London also presented the studio sequences. By assuming all the roles of hero, producer, narrator and presenter, Attenborough became the central performer in the story.
From then on, Attenborough's fluid on-screen performances gained him much acclaim. A very hard worker, he put much effort into producing highly detailed scripts, which left little to chance. Indeed, by the early 1960s, he had all but lost faith in live television, writing to a BBC colleague:
Zoo Quest was one of Attenborough’s early documentary series.
To begin with I got a tremendous kick out of the excitement of putting out programmes live. But it wore off after a bit and really, except for challenging interviews with lots of ‘immediacy’, I’m for film or some other sort of controlled recording process every time. It is so maddening to miss an effect because of some small mechanical hitch, as so often happens live.
Consistently high ratings encouraged others to emulate his method, and live formats became less fashionable. Film-based production also allowed programmes to be stockpiled, repeated and sold, supporting a more sustainable business model.
After Attenborough moved into BBC management in 1965, his goal was to turn natural history television into a science communication genre. He argued that it was “important” to move away from programmes that simply showcased the beauty of nature and instead engage viewers “to examine in a serious and critical way new trends and ideas in zoology”. Returning to hands-on programme-making a decade later, he embedded this vision in his magnum opus, Life on Earth (1979).
Attenborough looks back on filming Life On Earth.
In the early 1950s, when Attenborough joined the BBC, natural history television had been mostly conceived of as a specialist genre catering for amateur naturalists to share in the aesthetic and emotional enjoyment of nature. By the 1980s, he had helped transform it into one of the most popular genres of TV programming and a powerful conduit for science communication. This influence continues in his later work, including Planet Earth II, Blue Planet II and Our Planet, which combine cinematic storytelling with urgent environmental themes.
As he celebrates his 100th birthday, Attenborough’s legacy endures, defining natural history television as one of the most powerful forms of science communication and inspiring generations to look at the living world with wonder and understanding.
Communicating research
Saffron O’Neill researches climate communication and public engagement. She explains the ways Attenborough has shaped climate communication techniques across the world.
Attenborough is one of the few voices on climate change that almost everyone is willing to listen to. Over seven decades, his work has transformed how scientific knowledge is communicated, combining advances in broadcasting with powerful storytelling.
My colleague, PhD researcher Kate Holden, is exploring how young people engage with marine sustainability through online video, from traditional nature documentaries to YouTubers like MrBeast. Attenborough still stands out as an expert young people take seriously.
Part of his appeal lies in his willingness to meet audiences where they are, adapting to changing media habits. He joined Instagram in 2020 (breaking the Guinness World Record for the fastest time to reach one million followers) and has collaborated with Netflix to stream shows.
In recent years Attenborough has worked on programmes for more modern platforms, including Netflix.
Attenborough has shown the power of the media to shape how we see the natural world. Although there is little evidence for the appealing notion that watching a documentary like Blue Planet II directly drives behavioural change (such as reducing people's plastic consumption), nature documentaries can certainly drive both public and policy interest via increased media attention.
Engaging the public on climate and nature requires moving beyond a simple notion of “getting the message across” and towards recognising the complexity and power of storytelling. For this, Attenborough’s success is an invaluable model.
His programmes combine top-class storytelling with pioneering technology. The visual appeal of his richly crafted documentaries is matched by compelling stories about little-known species. His work forms a substantial archive of success – many of the most popular TV programmes of all time are his nature documentaries.
In a highly cited paper from 2007, a team led by environmental social scientist Irene Lorenzoni defined engagement with climate change. They claimed that: “It is not enough for people to know about climate change in order to be engaged; they also need to care about it, be motivated and able to take action.”
Early Attenborough programming focused on increasing people's knowledge about the natural world and, as part of this, implicitly providing a reason to care about it. Increasingly, though, he has moved to a more explicit stance on the climate emergency and our moral and ethical duty to act. An analysis of Attenborough's use of language carried out in the late 2010s demonstrates this. It shows how he now uses emotional appeals to action. During an appearance on the Outrage + Optimism podcast he said: "we have an obligation on our shoulders and it would be to our deep eternal shame if we fail to acknowledge that."
When a communicator like activist Greta Thunberg makes an appeal to morality, it can polarise audiences. Attenborough’s broad popularity makes his message reach wider audiences. His trustworthiness, storytelling mastery and innovative use of technology helps explain why he continues to have such a lasting impact on science and environmental communication, seven decades after his first broadcast.
Speaking up about climate change
Chloe Brimicombe, Climate Scientist at the University of Oxford, explores whether Attenborough’s on-screen attention to the climate crisis could have started earlier.
In his early documentaries, Attenborough focused on the wonder of the natural world.
However, in recent years his beliefs changed with the science, and more of his films started to cover climate change directly: for example, Climate Change: The Facts (2019) and A Perfect Planet (2021).
Attenborough’s works are part of the culture of the UK and the world. In my own life Attenborough’s works have always been present. During my undergraduate degree at Aberystwyth University, I was shown Frozen Planet in a lecture about glaciers and ice sheets because my lecturer was featured in the series. That moment stuck with me as I started my career as a climate scientist.
During my PhD in environmental sciences at the University of Reading, my fellow researchers were all big fans of Attenborough and of what could be achieved through the power of documentary film-making. In 2025, I was lucky enough to attend the film premiere of Ocean with David Attenborough, something I consider a once-in-a-lifetime opportunity.
As well as inspiring audiences with awe and wonder, documentaries can be an important way to communicate what is happening to our changing climate. They reach audiences that might not otherwise engage with the subject. Documentary making has drawn criticism, however, for focusing on a producer's interests rather than the scientific background behind an issue.
This has led to schemes such as the Wellcome Trust Public Engagement Scheme being set up to help bring scientists and documentary makers together.
In A Life on Our Planet (2020), Attenborough talks about the changes he has seen in the natural environment and his concern for the future of the planet. The 2025 premiere of Ocean with David Attenborough took place just before the UN's ocean summit in Nice, France, helping prompt real policy discussions and changes, including support for the global oceans treaty, a landmark international agreement that creates a network of protected ocean sanctuaries.
Attenborough may have been late in communicating specifically on climate change. But, in recent years he has changed to being a strong advocate. Now, it’s time to make sure that message is heard and acted upon so that the world’s wonders remain for many generations to come.
The climate crisis has a communications problem. How do we tell stories that move people – not just to fear the future, but to imagine and build a better one? This article is part of Climate Storytelling, a series exploring how arts and science can join forces to spark understanding, hope and action.
Chloe Brimicombe is part-funded by the ESRC research council and is a heat ambassador for Shade the UK.
Jean-Baptiste Gouyon received funding from The Leverhulme Trust and the AHRC.
Saffron O’Neill receives funding from the ESRC (UKRI3360 and ES/W00805X/1). She also receives funding for C3DS (Centre for Climate Communication and Data Science) from the Children’s Investment Fund Foundation (grant: 2210–08101) and the University of Exeter. The funders had no role in the conceptualization, design, data collection, analysis, decision to publish, or preparation of the manuscript and therefore the findings and conclusions are those of the author and do not necessarily reflect the positions or policies of the funders.
Ben Garrod does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
Source: The Conversation – UK – By Heather Ellis, Vice-Chancellor’s Fellow, School of Education, University of Sheffield
The UK government has launched its first review of school food standards in over a decade, alongside plans to extend free school meals to an additional 500,000 children in families receiving universal credit.
Much of the coverage has focused on specific menu changes, including the possible removal of sugary desserts such as steamed sponge. The attention these changes attract may reflect the fact that, for those who have experienced it, school food has never been only about nutrition. It is also about welfare, discipline, pleasure, stigma and care.
The School Meals Service: Past, Present – and Future? is a project I worked on that brings together archival research, oral histories and ethnographic work in schools across the UK. We were also the principal academic partner for the Food Museum’s ongoing School Dinners exhibition near Ipswich, which explores the changing history of school meals through objects, menus, memories and tastes – from semolina and sponge pudding to Turkey Twizzlers.
Since school meals were first introduced in legislation in 1906, they have changed repeatedly. Early provision was patchy and often associated with charity. After the 1944 Education Act, school meals became part of the postwar welfare settlement, intended to provide children with a nutritious meal during the school day.
For decades, the classic image of the school dinner was “meat and two veg”, followed by puddings such as sponge, semolina, rice pudding, jam roly-poly or custard.
From the 1980s, the provision of school meals became more fragmented. Nutritional standards were removed, local authorities had more freedom, and commercial catering reshaped menus. Later debates around Turkey Twizzlers and processed food, driven by people like celebrity chef Jamie Oliver, were part of this longer story. Today’s government review of school food standards is another chapter in that history.
What children remember
When people recall school dinners, they rarely talk about calories or guidelines. They remember texture, smell and noise.
Joanne, who attended school in Surrey and East Yorkshire from the late 1960s to 1980, described being served vegetables she could not eat: “Mush. Cold … you can’t have that unless you eat your beans … it put me off for life.”
The dining hall mattered as much as the food. Ella, who went to school in Rotherham from 1996 to 2010, remembered the anxiety of a space where “someone would puke and I would freak out … I can’t be in here”. Lauren, who attended schools in Northumbria and Merseyside from 1998 to 2012, recalled mashed potato that “you could pick up with a fork and it would just stick”.
Stigma, inequality and school food
School meals could also expose inequality. Free school meals have long been a vital safety net, but they have also carried stigma.
Joyce, who went to school in Glasgow in the 1960s, remembered the teacher calling children forward with the phrase “come out the frees”. She described it as “the walk of shame”.
Naomi, who attended school in Birmingham in the 1980s, showed how this could intersect with racism. Her mother paid for school meals despite financial strain because she worried Naomi might be singled out: “there weren’t many Black kids in my school”.
Yet school dinners were also remembered with affection. For many people, puddings such as sponge and custard were the best part of the day. For others they evoke control, compulsion or, like for Joyce and Naomi, embarrassment. That is why the removal of steamed sponge resonates. It is not just dessert. It is part of a shared national memory.
Beyond the menu
The Food Museum exhibition captures this complexity. Visitors encounter the familiar foods, but also the people behind them: pupils, parents, cooks, dinner staff, teachers and policymakers.
The exhibition, which has been shortlisted for a 2026 Museums and Heritage Award, draws directly on our research into how school meals changed over time and why those changes mattered socially, economically and culturally.
Today's reforms emphasise healthier ingredients: more fruit and vegetables, fewer fried foods and less sugar. These aims matter. But history and our research suggest that what is served is only part of the story. So do the dining hall, the queue, the noise, the payment system, the stigma, the pleasure and the memories children carry into adulthood.
School dinners are one of the most widely shared experiences of British childhood. As they continue to evolve it is worth considering not just what is on the plate, but how it feels to eat it.
Podcasting has become one of our most intimate cultural forms. We often listen alone, through headphones, to voices that guide us through complex or deeply personal stories. Over time, we come to trust these voices not just for the information they convey, but for the sense that someone has listened, selected and shaped what we hear.
That relationship is unsettled by The Epstein Files, a new AI-generated podcast series that promises to process millions of Epstein-related documents into a coherent narrative. But when no one is clearly responsible for what we hear, the authority of the voice becomes harder to trust.
Created by data entrepreneur Adam Levy, the series draws on more than three million documents linked to Jeffrey Epstein and presents them as a “forensic audit” in the form of a conversational podcast between two AI-generated hosts.
Launched in February 2026, it's had more than two million downloads so far. It's a daily, self-updating show built through an automated pipeline that ingests, cross-references and scripts material using AI systems, operating at a speed that traditional newsrooms could only dream of.
At first listen, The Epstein Files works, sounding like a carefully crafted podcast. But despite the jokes, cross-talk, hesitations and filler words that mirror shows like This American Life, Serial or S-Town, there are no identifiable human speakers behind the voices. From research to publication, the process appears to be largely automated, in line with Levy’s intention to “strip the emotion” from the story.
The hosts also claim that the podcast acts as a filter, combining AI-assisted processing with “human analysis” to review the records rather than speculate. But this distinction is harder to verify when the processes behind selection, interpretation and emphasis remain largely invisible.
Emotion, judgement and interpretation are seen here as irritations or threats. However, systems that select, rank and narrate information do not become neutral simply because those decisions bypass direct human involvement.
The series presents itself as “the first AI native” investigative documentary. Yet it lacks many of the features we’ve come to expect. There are no interviews, no location recordings, and hardly any sonic cues to guide the listener. Instead, it relies almost entirely on simulated conversation.
Scale is not judgement
The use of AI in podcasting is not simply a technical development. It disrupts the way shows are produced, structured and distributed. Rather than acting as a tool, these systems are beginning to reshape or obscure editorial processes that usually rely on human judgement.
The Epstein Files demonstrates how effectively AI can process vast quantities of material, producing a narrative that sounds coherent. But coherence is not the same as sense making, and pattern recognition is not interpretation. Deciding what matters, what is credible, and what should be left out remains a human task.
Automation does not remove judgement. Instead it relocates it, often in ways that are harder to see. Decisions are embedded in training data, system design and weighting mechanisms while appearing as neutral or unbiased outputs.
When information can be processed at scale, the question is no longer just what we know, but how we decide what counts as knowledge. Editorial standards don’t disappear, but they become harder to identify.
Why audio makes this harder
The human voice carries assumptions of authenticity. It signals presence, experience and connection. When we hear someone speak, we tend to assume a relationship between voice and responsibility. That assumption becomes more difficult to sustain when the voice is artificial yet sounds convincingly human.
These nameless hosts are not neutral. They are modelled on familiar broadcast styles associated with authority in western media. In doing so, they reproduce ideas about professionalism and trust, while remaining detached from any identifiable speaker.
What is striking about The Epstein Files is how persuasively authority is performed. The conversational structure suggests multiple perspectives, the tone implies neutrality, and the pacing suggests careful deliberation. But none of this guarantees that the material has been critically evaluated.
Content that creates itself
It could be argued that automation results in more transparency. But this relies on the assumption that volume can substitute for editorial oversight. When material is misinterpreted, stripped of context or simply wrong, it’s often unclear how those mistakes might be identified or addressed.
This is particularly troubling with material such as the Epstein case, which centres on human harm and exploitation. Such stories demand sensitivity, restraint and clearly traceable accountability. The way these stories are processed and retold can also feel detached from the people most affected by them.
At the same time, AI-generated podcasts are growing in number. They are cheap to produce and increasingly difficult to distinguish from human-made content. Their appeal may lie in speed, availability and the impression that someone has already done the work of sorting through chaos.
For audiences, the question is not only how to identify what is true or false. It’s also about recognising what is missing. Listening has typically meant encountering different voices, perspectives and forms of responsibility. When those elements are reduced or removed, the act of listening itself begins to change. The Epstein Files offers little sense of a right of reply for its audience. There is no clear editorial voice and no visible chain of accountability.
Broadcasting always depended on relationships between voices and listeners, and between storytelling and editorial judgement. This is beginning to change. The Epstein Files does not signal the end of podcasting or investigative journalism. But it marks a moment in which the cultural meaning of the voice is being tested.
Co-presence and community is central to radio and podcasting. But in The Epstein Files, nobody is there. There may be voices but if you listen very closely, you’ll notice that no one ever takes a breath.
Kathryn McDonald does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
Across the UK, working‑class boys are navigating an unprecedented convergence of pressures. There are entrenched gaps between working-class boys and their peers in their levels of attainment at every stage of education.
Often, however, the solutions for addressing this gap in attainment have roots in assumptions and stereotypes. These tend towards positioning working-class boys as somehow suffering from an innate deficiency: apathy, laziness or a lack of ambition for their future careers and employment. The evidence does not back these stereotypes up.
Our research has focused on understanding the experiences of these boys. In 2023, we carried out research that used creative activities to explore what being a young man meant for them. We found that some of the young men felt the need to create protective identities linked to aggression, emotional suppression and educational disinterest at school to avoid harm. For them, being a boy who expressed themselves was a risky enterprise. One boy said:
I feel like you know the bullying and torment would definitely go up quite a bit for, I guess, you know, something stupid like writing how I feel on a page.
We worked with young men who were open and able to engage in challenging and complex discussions, but who made it clear to us that doing this in their own educational environments could lead to social, emotional and potentially even physical harm.
We saw young men with deep-rooted aspirations they were often afraid to express. We did not see problematic boys in need of discipline, but a need to understand and address the relational and structural conditions that shape their behaviour.
Lacking resources – and evidence
In almost every public debate about boys, whether it be attainment gaps, misogyny or youth violence, teachers are positioned as society’s key defence. The government’s recent violence against women and girls strategy, for instance, foregrounds the role of educators in shaping boys’ attitudes and preventing future harm.
But it assumes that schools possess the frameworks, training, and relational bandwidth to meet these challenges. Crucially, it also assumes that we truly understand the daily dynamics between teachers and working‑class boys.
Teachers are under pressure to take responsibility for shaping boys’ attitudes. Dean Drobot/Shutterstock
The reality is that we don’t. The last major study of teacher perceptions was over 20 years ago. This decades-wide gap in evidence and understanding is a void which, our findings demonstrate, has been filled by stereotype and assumption. Rather than a focus on what boys need to achieve at school, there’s a risk that they are seen, both within schools and by the general public, as perpetrators of misogyny and violent behaviour in waiting – that they are an issue that needs to be targeted.
We’ve recently carried out a national survey of over 500 teachers, exploring their perceptions of boys and young men in the classroom. It was followed up with in-depth focus groups with 40 working-class boys aged from 12 to 16, as well as 17 teachers.
We found that teachers showed a high level of confidence in their ability to model dignity, respect and active listening in the classroom. However, the perspectives of young men painted a far more inconsistent picture, pointing toward two significant disconnects.
First, that respect is defined very differently by educators and the boys and young men they taught. Around 90% of teachers reported that they consistently modelled dignity and respect in the classroom. But the boys often described the respect they received from teachers as conditional, inconsistent or transactional: teachers expected respect by virtue of their authority, and paid it to the young men only in return for theirs.
Second, that masculinity, emotion and online influence are poorly understood and rarely discussed. When asked, just a third of the educators we surveyed could recall a meaningful conversation with a male student about masculinity. Many felt uncomfortable and unprepared to have conversations like this. From the boys’ side, they described significant emotional needs which were often unmet, limited safe spaces to discuss feelings, and punitive responses to distress.
How teachers perceive working-class boys, and the opportunities they have to discuss masculinity at school, aren’t the only factors affecting academic attainment for these young men. Poverty, for instance, has a significant impact on early attainment and a lasting impact on educational success. But our research showed that when reflective, safe, judgement-free conversations occurred, the boys and young men responded positively. It demonstrates that working‑class boys engage, reflect and thrive in educational contexts where they feel respected, listened to and understood.
On the other hand, the research suggests that teachers are influenced by wider societal narratives. Within the study, many educators defaulted to talking about misogyny or Andrew Tate even when not asked directly. This suggests a narrow focus on issues related to masculinity, shaped by wider social anxieties.
The boys and young men consistently faced contradictory expectations about who they should be. They reported being told to “open up”, yet faced being penalised or ridiculed when they did. They were told to avoid harmful online content, yet weren’t provided any space to engage in critical, deliberate conversations about what they had seen.
Without that space for conversation on which to build, it is our fear that efforts to tackle misogyny, disengagement or disparities in educational outcomes will continue to fall short.
Jon Rainford conducts research for Boys' Impact.
Alex Blower is affiliated with Arts University Bournemouth and Boys’ Impact.
In central Seoul, South Korea, a motorway once covered a buried urban stream. Today, that same stretch has been uncovered – a process known as daylighting – and this river is home to plants, fish and insects. This flowing water cools the city in summer and attracts tens of thousands of people every day. What used to be concrete now boosts biodiversity, the local economy and community wellbeing.
Similar transformations are unfolding elsewhere.
In Christchurch, New Zealand, river habitats and wetlands were rebuilt after a major earthquake in 2011, guided in part by Māori knowledge of waterways and floodplains. In Vancouver, Canada, nature-based stormwater systems have been integrated into urban design through long-term collaboration with local First Nations.
Across the world, urban planning projects are beginning to take a different approach. One that designs with living freshwater systems, rather than trying to control and contain them.
In a new study, our international team of freshwater scientists and planning experts highlights that, while our towns and cities contain some of the world’s most degraded rivers, wetlands and ponds, they also provide huge opportunities for protecting and restoring freshwater wildlife.
Cities and towns have historically been designed with people in mind. Planning systems prioritise housing, transport, economic growth and flood defence – often treating rivers and streams as infrastructure rather than living ecosystems.
This hasn’t always been the case. Ancient civilisations, from the Indus to the Maya, built settlements around water. They worked with floods, wetlands and seasonal flows in ways that supported both people and nature. With the dawn of industrialisation and modern planning, floodplains were built on, rivers were straightened, streams buried and waterways increasingly engineered to move water through cities rather than support wildlife.
The consequences are stark and hard to ignore: degraded urban waterways, declining freshwater species and cities left more vulnerable to climate-driven floods, heatwaves and water scarcity, all contributing to a global collapse in freshwater biodiversity.
Our rivers, lakes, ponds and wetlands occupy only a tiny fraction of the planet while supporting roughly a third of all vertebrate species. Importantly, freshwater acts as an ecological life-support system, sustaining a range of species – including us.
This is why the latest figures are so alarming. Populations of freshwater vertebrates such as salmon and eels have fallen by 85% over the last 50 years, one of the steepest collapses of any group of species on Earth. Urban waterways sit at the heart of this rapid decline.
Movement to deal with this crisis has started. Countries have signed up to ambitious global agreements, pledging to halt and reverse biodiversity loss.
But translating these promises into real change remains a major challenge.
Urban planners as allies
Urban planners shape the very environments where freshwater pressures are most intense – towns and cities. Every day, they make decisions affecting how land is zoned, how stormwater is managed, where green space goes, what buffers are protected, and how urban form evolves. Their actions ripple through entire catchments.
Yet urban planners often aren't supported or equipped with the ecological knowledge needed to incorporate freshwater biodiversity into daily practice.
Urban planners need the tools, training and support to recognise freshwater ecosystems as valuable living systems that underpin city resilience, human health and everyday wellbeing – rather than obstacles to be overcome.
In cities such as Breda in the Netherlands, Los Angeles in the US and Nanjing in China, this different way of thinking about freshwater is taking hold. And planners aren’t working alone.
Local residents and Indigenous communities, ecologists, engineers and even schools are often involved from the outset. Together, they bring diverse knowledge of the local context and can build shared environmental stewardship. Early collaboration helps ensure freshwater biodiversity isn’t an afterthought and results in lasting care for rivers, ponds and wetlands.
Education matters too.
To foster this transition, the silos between planning, ecology and engineering need to be broken down. Land-use decisions can then be made with a clearer understanding of how water behaves across an entire catchment and how that shapes freshwater habitats.
Just as important is how knowledge flows. Freshwater biodiversity research doesn’t always reach the people making day-to-day planning decisions, or those designing and building projects on the ground. When planners, scientists and delivery teams have access to shared tools, open data or simple design guidance, nature-positive ideas are far more likely to make it off the page and into our cities.
Clear rules are also useful. Biodiversity targets only make a difference if they are backed up by practical local standards and the resources to implement them. For example, we need standards on how to protect riverbanks, restore floodplains or design stormwater systems that work with nature, rather than against it. Without that clarity – and the training and resources to support it – planners are often left trying to balance competing demands on their own.
There are still big gaps in what we know. How much space do urban rivers really need, and how does this vary from place to place? Which nature-based solutions work best across different landscapes? Urban planners can help answer these questions by learning from what works and using that knowledge to improve outcomes for freshwater biodiversity.
Urban planners – often working behind the scenes within local and devolved governments – are at the forefront of this transformation. They can embed freshwater biodiversity into the hearts of our cities.
However, planners cannot do this alone. Freshwater scientists, policymakers, river restoration specialists, engineers, social scientists and economists can work with planners. Universities and professional bodies can rethink how planning is taught. Governments can recognise planners as agents of ecological recovery, not just arbiters of urban growth.
Cities could become hubs for freshwater restoration and recovery, rather than hotspots of decline. They can become places where rivers, wetlands and people thrive together – with benefits that flow far beyond city boundaries.
Helen A. L. Currie receives funding from UKRI Engineering & Physical Sciences Research Council Noise Network +.
Irene Gregory-Eaves receives funding from the Natural Sciences and Engineering Research Council of Canada.
Steven J Cooke receives funding from the Natural Sciences and Engineering Research Council of Canada. He is appointed as Canadian Commissioner for the Great Lakes Fishery Commission.