How politics, technology and the environmental crisis turned these movies into horror films in 2026

Source: The Conversation – UK – By Alexander Sergeant, Lecturer in Digital Media Production, University of Westminster

A famous expression, often wrongly attributed to Mark Twain, states that comedy is merely tragedy plus time. This theory highlights how our response to films can depend on the context in which we see them.

We tend to think of the genre of a film as something very fixed, decided by a combination of studio producers and marketers. But, in the right context, films can move across many different genres in the span of their lifetime, depending on the audiences that watch them.

To demonstrate this idea, here are five scary films for 2026. The twist, however, is that none of these films were ever intended to be horror films. Most on the list were satire or comedy when they were made. Instead, they have become horrific due to the way they touch on contemporary issues surrounding the global politics of President Donald Trump, impending environmental disaster, ever-accelerating technology and contemporary attitudes towards gender.

1. Duck Soup (1933)

The finest film produced by the famous Marx Brothers comedy troupe, Duck Soup is an anarchic political satire that tells the story of an unserious playboy president named Rufus T. Firefly. Beloved by film enthusiasts, the film showcases a series of mishaps and misdeeds caused by his selfish, erratic behaviour, which inadvertently leads his country of Freedonia into war with its neighbours.

Duck Soup is considered a classic of Hollywood slapstick and quick-witted verbal comedy. But, in the era of a genuinely unserious president, its central joke might not feel funny any more. Nor is there much comfort in the realisation that, nearly 100 years after its release, this biting satire on the politics of rising authoritarianism is as timely now as it was then.

2. The Apartment (1960)

People often say “they don’t make them like they used to any more” when trying to articulate a nostalgia for the films of the past. That description can be aptly applied to Billy Wilder’s romantic comedy-drama The Apartment. They do not make films like this any more. But in this case, that’s a good thing.

Jack Lemmon’s “Buddy Boy” Baxter is the bachelor who routinely loans his apartment out to his bosses for them to conduct extra-marital affairs. Shirley MacLaine’s Fran is the loveable but down-on-her-luck elevator operator involved in a tawdry situation with Baxter’s boss. Their own romance emerges out of a suicide attempt, workplace harassment and abuses of power. It feels like the film is set not just in the past, but in some creepy alternative world.

To be fair to The Apartment, it hardly treats some of the more problematic behaviour of its characters as virtues we are supposed to admire. But it never quite attacks the deeply unpleasant nature of its central conceit either. Baxter is not just a loveable goof unaware of what he’s got himself mixed up in. He’s a complicit enabler. And Fran is not a ditsy but loveable woman mixed up with the wrong crowd. She’s a victim.

3. Idiocracy (2006)

Idiocracy was something of a box office bomb, receiving neither the marketing campaign nor the reviews it needed to succeed. The fact it has since become a cult hit speaks to how startlingly prescient the film is for contemporary audiences, who are now discovering it in droves 20 years later.

Idiocracy tells the story of a young man put into suspended animation who wakes up 500 years in the future. The average intelligence of the population has severely decreased, to the extent that the world has become increasingly consumerist, vulgar, crass and prejudiced in its thinking. America has even elected a former pro wrestler and porn star, Dwayne “Mountain Dew” Camacho, as its leader.

Made in 2006, during George W. Bush's second term and before the rise of Barack Obama, the film initially failed as a comedy. It now works perfectly as a terrifying exaggeration of what the world looks like in 2026.

4. Wall-E (2008)

Wall-E is part of a long history of animations with an interest in the environment, from FernGully: The Last Rainforest (1992) to Princess Mononoke (1997). That part of its dystopian vision still stands up. The film’s vivid opening of Wall-E wandering around a silent world of trash is still its best moment.

The film’s vision of the humanity that has left the garbage-strewn world behind, however, has become increasingly concerning over time. Given its prediction of a humanity grown dumb, obese and screen-obsessed, it is increasingly difficult to watch Wall-E as a nostalgic childhood treat.

5. Her (2013)

The amazing feat pulled off by this absurdist romantic drama was to somehow get an audience to root for the idea of a romantic pairing between a lonely middle-aged man and an AI-enabled operating system. More than a decade later, Her’s open-minded approach to AI seems far more fraught.

As the romance develops between Theodore (Joaquin Phoenix) and Samantha (voiced by Scarlett Johansson), it is difficult not to imagine the fingerprints of powerful but not necessarily benign tech moguls turning the screws tighter, manipulating Theodore further into spurning human contact for his digital desires.

Equally, it is difficult not to wonder whose voice has been stolen to create her warm, affectionate tones, or to ask what the company might do with the recording of their conversations. The dangers in our current technological reality have once again spoilt a perfectly good film.




The Conversation

Alexander Sergeant does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. How politics, technology and the environmental crisis turned these movies into horror films in 2026 – https://theconversation.com/how-politics-technology-and-the-environmental-crisis-turned-these-movies-into-horror-films-in-2026-268679

Winter Olympics: the new video technology that could help power Britain’s skeleton team to gold

Source: The Conversation – UK – By Steffi Colyer, Senior Lecturer in Sports Biomechanics, Centre for Health and Injury and Illness Prevention in Sport, University of Bath

Skeleton is an exhilarating Winter Olympic sport in which athletes race head-first down an ice track at speeds reaching over 80 miles per hour (130km/h). While the event can look basic at first glance, success relies heavily on highly engineered equipment and extensive wind‑tunnel testing – much like elite Olympic track cycling programmes.

Each run begins with the athlete pushing a sled (also known as a “tea tray”) explosively off the starting block, then sprinting rapidly for about 30 metres downhill. After diving on the sled, they ride the rest of the course with their head just a few inches above the ice. The sleds have no brakes, and riders wear only a thin suit and helmet for protection.

A powerful start is considered the defining component of skeleton performance. So, developing a skeleton athlete’s strength and power while refining their pushing technique is a central focus in the lead-up to competitions. The biggest of all these, the Winter Olympics, is being held in Milan and Cortina d’Ampezzo, Italy, this month. Skeleton events start on February 12.

While Britain does not tend to rank highly in Winter Olympic sports, in skeleton it has won a world-best nine Olympic medals, including three golds. Over the past ten years, my colleagues and I at the University of Bath have worked with Team GB skeleton athletes to help improve their starts, using a form of “markerless” motion capture technology.

But the applications of this technology extend far beyond the Winter Olympics. There is potential for it to replace traditional motion capture systems in the film, TV and gaming industries, and to be used in injury rehabilitation.

How motion analysis began

The origins of motion analysis can be traced back to the pioneering work of English photographer Eadweard Muybridge in the late 19th century. Muybridge developed early techniques for capturing sequences of images, including documenting equine gait.

Eadweard Muybridge developed pioneering motion capture techniques. Video: Cantor Arts Centre.

By manually annotating specific features across successive images, researchers have since been able to build a detailed picture of how a person or animal moves. But while this method was the standard for many decades, it was both time- and labour-intensive.

So, technological advances in cameras and computer processing led to the development of automated methods of motion analysis – notably, marker-based motion capture. This uses reflective markers placed on key parts of the body, which are automatically tracked by infra-red cameras as the person moves around.

In film, animation and gaming, this means an actor’s body movements and facial expressions can be translated into realistic CGI characters. Marker-based technology is currently the most widely used 3D motion analysis technique across the film, gaming and health sectors, with an estimated global market value of over US$300 million (£220 million).

However, this advanced technology has limitations too, including the need for specialist equipment, controlled laboratory environments, and lengthy preparation time to attach the markers. These can be problematic in sports and many other fields – particularly during live competitions and public performances.

As a result, the field of motion analysis has come almost full circle. Thanks to major advances in computer vision and artificial intelligence, biomechanists like me are once again extracting detailed movement information directly from video images – but this time in an automated way.

The markerless motion capture systems we use rely on deep‑learning models that are trained on a huge number of images of people performing everyday activities. When applied to unseen images, the algorithms can then automatically detect the same body landmarks. By fusing multiple camera views, a simplified digital 3D skeleton can be extracted, from which the person’s movement across time can be modelled and analysed.
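The multi-camera fusion step described above can be illustrated with standard linear (direct linear transform, DLT) triangulation, which recovers a 3D landmark position from its 2D detections in two or more calibrated cameras. This is a minimal sketch in Python with NumPy, not the actual system used at Bath; the projection matrices and pixel detections are assumed to come from a calibrated camera rig and a pose-estimation model.

```python
import numpy as np

def triangulate_point(proj_mats, points_2d):
    """Linear (DLT) triangulation of one body landmark.

    proj_mats: list of 3x4 camera projection matrices (one per camera)
    points_2d: list of (x, y) pixel detections of the same landmark,
               one per camera
    Returns the landmark's 3D position as a length-3 array.
    """
    rows = []
    for P, (x, y) in zip(proj_mats, points_2d):
        # Each camera view contributes two linear constraints on the
        # homogeneous 3D point.
        rows.append(x * P[2] - P[0])
        rows.append(y * P[2] - P[1])
    A = np.asarray(rows)
    # The homogeneous solution is the right singular vector associated
    # with the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]
```

Repeating this for every detected landmark in every frame yields the simplified 3D skeleton from which movement over time can be modelled and analysed.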


Analysing the optimum technique

Markerless motion capture makes it possible to unobtrusively measure athletes’ movements outside the lab, in training and even during competitions. Our recent research has demonstrated its value in many different sports, including badminton, tennis and Olympic weightlifting.

In skeleton, the unique, bent-over position at the start of each run, as the athlete sprints alongside the sled with one hand holding it, makes this form of biomechanical analysis particularly important.

Using markerless motion capture, we have explored the differing roles of an athlete’s limbs in the push-start performance, comparing these biomechanics with conventional sprinting. Importantly, we have also validated this markerless approach by comparing it with a traditional marker‑based system.

The optimum starting technique for each skeleton athlete is shaped by their physical characteristics, including factors such as relative limb lengths and flexibility. Analysing each athlete’s pushing technique, how it relates to their performance and how this evolves over time, can help give them a crucial competitive edge during this all-important first phase of each skeleton run.

Medals can be won and lost by hundredths of seconds as athletes sprint away from the starting block. In these first few seconds, we hope Britain’s athletes reap the benefit of our markerless motion capture technology.

The Conversation

Steffi Colyer receives funding via the Centre for the Analysis of Motion, Entertainment Research & Applications (CAMERA) from the UKRI’s Engineering and Physical Sciences Research Council.

ref. Winter Olympics: the new video technology that could help power Britain’s skeleton team to gold – https://theconversation.com/winter-olympics-the-new-video-technology-that-could-help-power-britains-skeleton-team-to-gold-274859

New Start’s expiration will make the world less safe – even if it doesn’t spark another nuclear arms race

Source: The Conversation – UK – By Paul van Hooft, Research Leader, Defence and Security, RAND Europe

The New Start nuclear arms control treaty expires on February 4, opening the way for a period of great power uncertainty and the possibility of a new arms race.

The US-Russian agreement, negotiated in 2010 and extended in 2021, limited the number of deployed nuclear warheads to 1,550, with roughly 3,500 non-deployed warheads in reserve. It was the last of the arms control agreements that were rooted in the legacy of the cold war negotiations between the US and the Soviet Union, and the era of relative optimism that followed it.

The current US administration has shown little appetite for extending New Start on a purely bilateral basis, as it hopes to include China in any new agreement. China’s nuclear arsenal has more than doubled in size in the past five years, to an estimated 600 warheads, with further increases expected over the next decade.

The emphasis on China is not recent. Even before the latest Trump administration took office in January 2025, a 2023 bipartisan committee concluded that the US nuclear force posture was insufficient for a modern “Two Peer Plus” environment. Two Peer Plus means the US needs to be able to jointly confront two major nuclear-armed powers – Russia and China – while having the capacity to face threats from other lesser adversaries, mainly North Korea.

The Trump administration would prefer a treaty that would address this new reality. However, Chinese officials are reluctant to discuss joining such an agreement. They argue their arsenal is still far smaller than that of either the US or Russia and that it is unfair to ask China to restrain itself.

Allowing New Start to lapse is a negative signal but it is unlikely to immediately trigger a cold war–style arms race. Washington and Moscow seem more confident in their assured second-strike capabilities than they were then, with fewer concerns about a disarming first strike. And, no matter the rhetoric, both will probably want to avoid rocking the boat too much or too soon.

More likely is that the US will invest in further modernising its delivery systems to improve accuracy against mobile targets and penetration against hardened Russian bases and missile silos.

Russia, meanwhile, has recently displayed some exotic new delivery systems, including the Poseidon nuclear torpedo and the Burevestnik nuclear cruise missile.

But it is unclear whether the country could produce these systems at a scale to make a difference. Or even whether such systems truly change anything about the overall US-Russia calculus, given the large numbers of land- and submarine-based ballistic missiles that Russia already has.

Russia may therefore be incentivised to deploy more warheads on existing missile systems. This would be faster and cheaper than developing and producing new delivery methods.

End of the peace dividend?

Perhaps more importantly, the consequences of New Start’s expiration extend far beyond gradual shifts in nuclear force posture. Its demise marks the collapse of yet another pillar of the post-cold war international order. With the end of the intermediate-range nuclear forces (INF) treaty in 2019, the conventional armed forces in Europe treaty in 2023, and others, the arms control architecture is already on its last legs.

The demise of the INF treaty was also partly triggered by Sino-American competition. The US wanted to be less restrained in the Indo-Pacific, where China had built a formidable arsenal of short- and medium-range ballistic and cruise missiles.

The broader system of negotiations and institutionalised diplomatic exchanges is equally important. They facilitated understanding of the other’s interests and red lines, not out of sympathy but to allow states to manage the inherent uncertainty and mistrust present in strategic competition.

Graphic showing current nuclear stockpiles, 2026.
Note: arrows refer to general trends over a multi-year period, rather than individual year-to-year changes. Numbers may fluctuate year-over-year for several reasons, including actual changes in a country’s stockpile and/or reassessments by the authors based on new data.
Source: Federation of American Scientists

The decline of this kind of diplomacy may not have immediately noticeable effects on international security, but it makes misperceptions and misjudgements likelier, both in peacetime and during crises. Personalised diplomacy is ill-suited to managing long-term rivalry between nuclear-armed states: it is less likely to address deeper structural conflicts of interest or to build deep knowledge of the other side.

Even if New Start were extended once more, it would not resolve the underlying structural challenges. China’s unlikely participation remains a central obstacle. So does the broader Two Peer Plus problem, in which the US must weigh the combined risks posed by Russia and China.

At the same time, US strategic priorities have shifted. The Trump administration’s new national security strategy and national defense strategy made clear that Europe and Russia are no longer central concerns, eclipsed by the western hemisphere and the Indo-Pacific.




Read more: What the US national security strategy tells us about how Trump views the world


Where does this leave Europe?

Europe has limited leverage to prevent the drift away from arms control. French and British nuclear forces are deliberately small and geared toward existential deterrence, leaving little scope for negotiated reductions. If anything, given the growing pressures on these states to take a greater role in European deterrence, the incentive is to expand their nuclear options, not to limit themselves.

Over the long term, Europe’s most viable path may therefore be to first invest significantly in conventional capabilities to generate pressure on Russia and then to consider building up the existing European nuclear arsenals. Such an asymmetric approach to arms control would be difficult. It would depend on Russia reaching exhaustion in its war on Ukraine, leaving it unable and unwilling to compete with both the US and Europe.

The end of New Start will leave the strategic environment more uncertain and thus more dangerous for everyone. It will not trigger an immediate arms race, but it will further erode the norms, transparency and institutionalised dialogue that have helped manage nuclear competition for decades.

It may well further incentivise non-nuclear states to reconsider their non-proliferation commitments, already under pressure due to the uncertainty surrounding the US security guarantees. In a more multipolar nuclear world marked by mistrust, technological rivalry and diverging priorities, deterrence without arms control becomes more brittle, not more stable.

The Conversation

Paul van Hooft receives funding from RAND Europe to work on deterrence and arms control related issues.

ref. New Start’s expiration will make the world less safe – even if it doesn’t spark another nuclear arms race – https://theconversation.com/new-starts-expiration-will-make-the-world-less-safe-even-if-it-doesnt-spark-another-nuclear-arms-race-275116

What will a rebuilt Gaza look like? The competing visions for the Strip’s future

Source: The Conversation – UK – By Timothy J. Dixon, Emeritus Professor in the School of the Built Environment, University of Reading; University of Oxford

A girl walks along a street in Gaza to get food during the war between Hamas and Israel. Jaber Jehad Badwan / Wikimedia Commons, FAL

Following a visit to Gaza in January, the UN undersecretary general, Jorge Moreira da Silva, called the level of destruction there “overwhelming”. He estimated that, on average, every person in the densely populated territory is now “surrounded by 30 tonnes of rubble”.

This staggering level of destruction raises urgent questions about how, and by whom, Gaza should be rebuilt. Since 2023, a variety of reconstruction plans and other initiatives have tried to imagine what Gaza could look like when the conflict ends for good. But which of these visions will shape Gaza’s future?

The Israeli government’s Gaza 2035 plan, which was unveiled in 2024, lays out a three-stage programme to integrate the Gaza Strip into a free-trade zone with Egypt’s El-Arish Port and the Israeli city of Sderot.

AI renderings show futuristic skyscrapers, solar farms and water desalination plants in the Sinai peninsula. The plan also shows offshore oil rigs and a new high-speed rail corridor along Salah al-Din Road, Gaza’s main highway that connects Gaza City and Rafah.

The US government has proposed a similar futuristic vision for Gaza. Its August 2025 Gaza Reconstitution, Economic Acceleration and Transformation Trust plan shows a phased series of modern, AI-powered smart cities developed over a ten-year time frame. The plan, which would place Gaza under a US-run trusteeship, suggested that poor urban design lies at the heart of “Gaza’s ongoing insurgency”.

Jared Kushner presenting the ‘Gaza Riviera’ Project at World Economic Forum in Davos, January 2026.

The latest iteration of this vision was unveiled by Donald Trump’s son-in-law, Jared Kushner, at the recent World Economic Forum in Davos.

He presented slides showing Gaza reconstructed as a “Riviera” of the Middle East, with luxury beachfront resorts, gleaming tower blocks, residential zones and modern transport hubs. Kushner suggested it was “doable” to complete the construction of a “new” Rafah city in “two to three years”.

It has been reported that the US and Israeli visions are heavily influenced by US-based economics professor Joseph Pelzman’s economic plan for Gaza. This plan, Pelzman said on a podcast in 2024, would involve destroying Gaza and restarting from scratch.

In contrast to the US and Israeli visions, the February 2025 Gaza “Phoenix” plan includes input from the people of Gaza. It has a much stronger focus on maintaining and reconstructing the existing buildings, culture and social fabric of the enclave.

The plan was developed by a consortium of international experts together with professionals and academics from Gaza, the West Bank and the Palestinian diaspora, and suggests a reconstruction and development phase of at least five years.

Other plans from the Arab world take a more technocratic view of reconstruction, but still have a short timescale for reconstruction. These include a five-year plan by the United Arab Emirates-based Al Habtoor Group, which promises to grant 70% of ownership in the holding company that will manage Gaza’s reconstruction to the Palestinians.

Feasibility of rebuilding Gaza

So, how feasible are these different visions and how inclusive are they for the people of Gaza? Rebuilding cities after war takes time and money, and also requires local resources. Even in China, a country with plentiful resources and abundant skilled labour, major new cities are rarely completed in less than 20 years.

And in Gaza rebuilding will be complicated by the fact that there are now 61 million tonnes of rubble there, as well as other hazardous debris such as unexploded munitions and human remains. This will need to be removed before any reconstruction can commence, with the UN estimating that clearing the rubble alone could take as long as 20 years.

For comparison, the Polish capital of Warsaw experienced a similar level of destruction during the second world war and it took four decades to rebuild and reconstruct the city’s historic centre. The time frames for reconstruction outlined in all of the plans for Gaza are far shorter than this and, even with modern construction methods, are unlikely to be feasible.

The US and Israeli visions also fail to include Palestinians in the planning of Gaza’s future, overlooking any need to consult with Gazan residents and community groups. This has led critics to argue that the plans amount to “urbicide”, the obliteration of existing cultures through war and reconstruction.

Reports that suggest Gazan residents will be offered cash payments of US$5,000 (£3,650) to leave Gaza “voluntarily” under the US plan, as well as subsidies covering four years of rent outside Gaza, will not have alleviated these concerns.

At the same time, the US plan does not propose a conventional land compensation programme for Gazan residents who lost their homes and businesses during the war. These people will instead be offered digital tokens in exchange for the rights to redevelop their land.

The tokens could eventually be redeemed for an apartment in one of Gaza’s new cities. But the plan also envisages the sale of tokens to investors being used to fund reconstruction. The Council on American-Islamic Relations, the largest Muslim civil rights and advocacy organisation in the US, says the “mass theft” of Palestinian land through the token scheme would amount to a war crime.

With their emphasis on community engagement and the repair and renewal of existing structures, the Phoenix plan and the other Arab-led visions are at least a step forward. But without a fully democratic consensus on how to rebuild Gaza, it is difficult to see how the voices of the Gazan people can be heard.

Whichever vision wins out, history shows that post-war reconstruction succeeds when it involves those whose lives have been destroyed. This is evidenced somewhat ironically by the US Marshall Plan, which funded the reconstruction of many European economies and cities after the second world war, and involved close engagement with civil society and local communities to achieve success.

The Conversation

Timothy J. Dixon does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. What will a rebuilt Gaza look like? The competing visions for the Strip’s future – https://theconversation.com/what-will-a-rebuilt-gaza-look-like-the-competing-visions-for-the-strips-future-274591

Bruce Springsteen’s Streets of Minneapolis: how digital circulation boosts the impact of a protest song

Source: The Conversation – UK – By Adam Behr, Reader in Music, Politics and Society, Newcastle University

In moments of creeping authoritarianism, culture sometimes reacts faster than institutions. Bruce Springsteen’s rush-released song in the wake of killings of two Minneapolis residents by agents of US Immigration and Customs Enforcement (ICE) was not just an act of commentary, but a deliberate intervention in public discourse.

Streets of Minneapolis operates as an alarm signal, its directness placing it in the public square, where naming and narration carry political weight. What also distinguishes Streets of Minneapolis is not just its fidelity to the tradition of the protest song, but its mode of circulation as a rapid response in the digital age.

This is Springsteen at his most declarative, operating not in the interior emotional space of the confessional singer-songwriter but in the outward-facing register of public address. His specificity – naming people, streets, organisations and the “winter of ’26” – marks the song as political communication rather than personal reflection.

His framing of the killings involves a shift from individual tragedies towards a shared civic injury. The repeated invocation of “our Minneapolis” performs rhetorical work, translating private loss into a shared collective experience and situating it as a wider public concern that extends beyond the city itself.

That movement from the individual to the collective places Streets of Minneapolis within a wider lineage of protest song, creating narratives out of real events so they can be remembered and acted upon. In this sense, the song does not simply respond to politics, but actively participates in political thought and action. “We’ll take a stand” is not a metaphorical flourish but a direct appeal.

Springsteen makes this lineage explicit through the early acoustic section, replete with insistent harmonica, and a vocal delivery and intonation that consciously signal Bob Dylan’s early protest music. Structurally, too, Springsteen’s call to action echoes Dylan works like The Lonesome Death of Hattie Carroll – moral force emerging through the accumulation of detail and reportage.

While Dylan’s later career moved away from direct protest toward the personal and allegorical, Springsteen here leans into that more direct mode of storytelling. It follows the protest song logic whereby narration becomes an engine of persuasion, reshaping contemporary events into historical record.

The reference carries added resonance given Dylan’s Minnesota roots, serving as a reminder that place, memory and music have long been intertwined in American protest culture.

Springsteen quotes himself, too, both musically and thematically, with a clear nod in the title to Streets of Philadelphia and a closing musical call-back to Born in the USA, its own tub-thumping aesthetic belying the portrait of a disillusioned Vietnam veteran in the lyrics.

These are not just nostalgic gestures but also markers of continuity. By folding earlier works into this new song, he situates the current moment within a longer trajectory of American struggle, via musical linkages between himself and Dylan – and Woody Guthrie before that.

Digital circulation and rapid response

Where protest songs once depended on live performance, radio play and physical distribution, they now travel through platforms. Within hours of release, Streets of Minneapolis was embedded in news coverage, shared across social media and dissected in comment threads and reaction videos.

Listeners encounter it not only as a song but as a reference point to be reposted, quoted, argued or aligned with. In that process, its energy comes less from a single, fixed message than from how it is used, repeated and spread through ongoing conversations.

This dynamic places protest music alongside other contemporary forms of political communication, particularly those shaped by meme culture and the logic of the online platforms through which much creative work is experienced. Short excerpts, lyrical fragments and recognisable musical cues circulate easily across feeds, videos and posts, where they are paired with captions, visuals and commentary.

In recent election cycles, for instance, music has functioned less as a background soundtrack or simple celebrity endorsement than as material that can be repurposed – looped in clips, used ironically, set against images, or mobilised to signal approval or dissent. In this environment, music functions as a part of the communicative infrastructure, enabling participation as much as persuasion.

This also comes amid growing political conflict around culture itself. While there is a longer history of public disputes between the Trump administrations and the artistic community, these tensions have recently escalated into direct interventions, including the cancellation of shows and the temporary closure of the Kennedy Center, pointing to an environment in which music and performance are increasingly politicised and directly entangled with power.

Seen in this context, Streets of Minneapolis is both traditional and distinctly contemporary. It draws on the narrative starkness and moral framing of folk protest, but gains traction through digital circulation. The killings in Minneapolis of Renée Good and Alex Pretti were the immediate catalyst, but the song’s significance lies in how it carries that moment forward.

As authoritarian power shifts gear, from creeping practice to open and violent assertion, the protest song adjusts its form and reach. Streets of Minneapolis reflects that transition, drawing on Springsteen’s longstanding role as a public narrator of American life. It can’t halt state action, but it can help to prevent it from going unnoticed and unrecorded.


Looking for something good? Cut through the noise with a carefully curated selection of the latest releases, live events and exhibitions, straight to your inbox every fortnight, on Fridays. Sign up here.


The Conversation

Adam Behr has received funding from the Arts and Humanities Research Council and the British Academy.

ref. Bruce Springsteen’s Streets of Minneapolis: how digital circulation boosts the impact of a protest song – https://theconversation.com/bruce-springsteens-streets-of-minneapolis-how-digital-circulation-boosts-the-impact-of-a-protest-song-275004

The sale of Russell & Bromley is a symbol of the challenges facing independent heritage brands

Source: The Conversation – UK – By Naomi Braithwaite, Associate Professor in Fashion and Material Culture, Nottingham Trent University

William Barton/Shutterstock

The UK heritage shoemaker Russell & Bromley has been bought by high-street clothing giant Next. Despite the brand’s rescue from administration, dozens of jobs will be lost in initial redundancies, and there are rumours that more than 30 shops could close. As one of the few independently owned footwear brands left in the UK, the sale spells another loss for the industrial heritage of the British footwear industry.

The closure of fashion stores is nothing new, and the gradual demise of the British high street has been well documented. In fact, research in 2021 revealed that the fastest-declining sector on the UK high street was fashion retail. Shifts in consumer behaviour driven by online shopping, alongside fast fashion, placed inevitable pressure on independent, mid- to high-end stores like Russell & Bromley.

With so much competition (particularly in the context of footwear, where many clothing retailers and supermarkets have added shoe lines), staying relevant has become even more challenging.

What set Russell & Bromley apart was its long history. It was founded in 1880 in Sussex and continued under the leadership of five generations of the same family. It has a strongly defined heritage as a British independent brand, with a focus on craftsmanship and understated luxury.

It has also been a favourite of the Princess of Wales, which guaranteed the brand further endorsement. More recently, another brand linked to the princess, LK Bennett, was sold to the owner of Poundland. LK Bennett was founded in 1990, also as a high-end shoe retailer, later branching out into clothing as well.

The Russell & Bromley sale followed the announcement that heritage sports shoe brand Gola had been acquired by Japanese conglomerate Marubeni – a response to booming sales in retro trainers. Gola, too, has a long history. It was founded in 1905 in Leicester, once a centre of British shoemaking, making it one of Britain’s oldest sportswear brands.




Read more:
The history of sneakers: from commodity to cultural icon


Its origins were in making football boots, but the brand took off in the 1960s, when its Harrier style was favoured by football fans and later endorsed by celebrities including Liam Gallagher and Paul Weller. But in recent years, Gola struggled to compete with the powerhouses of Nike and Adidas.

The cases of Russell & Bromley and Gola exemplify the challenges of maintaining independence in a complex global footwear industry where conglomerates are increasingly dominant. The brands’ change in ownership highlights the transformation of what was once a flourishing footwear manufacturing and retail industry.

Dominance and decline

The 1960s was the heyday of fashion retail on the British high street with the emergence of boutiques like Mary Quant’s Bazaar and the advent of Topshop in 1964, which brought a new, younger consumer.

Footwear retailers were always a staple on the high street, with brands like Dolcis, and Lilley and Skinner. Both were part of the Leicester-based conglomerate the British Shoe Corporation, and alongside Clarks and Russell & Bromley they captured the footwear retail market.

But the UK’s fashion footwear retail industry started to decline in the 1990s with the closure of the British Shoe Corporation and its huge portfolio of stores.

This decline in shoe retail followed a significant change in the UK’s footwear manufacturing industry. While Northampton remains a centre of excellence for men’s footwear manufacturing, shoe factories in Leicester started to close from the 1980s. They could no longer compete with the prices and volumes of manufacturers in Brazil, India and China.

Recently, China has taken the lead in global shoe manufacturing, adapting the traditional skills and craftsmanship with technical advances and the ability to produce high volumes.

So where does Next fit into this picture? In 1982, the Midlands-based clothing company opened its first womenswear store and by 1988 had launched the Next Directory, which introduced home delivery. Consumers no longer had to go to separate shops to find shoes to match their outfits – suddenly it was all available in one place.

It was not just Leicester that suffered the decline of its footwear industry. London also had a long history of shoemaking, but failed to weather the competitive landscape. The 1990s saw an increase in international brands and retailers entering the UK retail space, placing further pressure on domestic brands.

Artisanal shoemaking equipment laid out on a workbench alongside completed and half-made shoes.
There’s still a market for artisanal footwear.
sopf/Shutterstock

Despite this uncertainty and change in UK footwear and retail, Russell & Bromley continued to thrive well into the 21st century. This is testament to its position as a high-end retailer that sold its own well-crafted shoes and bags with the desirable Made in Italy label. Investments in a refresh in 2025 may have proved too costly, as the market became increasingly difficult.

While there is plenty of choice for consumers at the lower and designer ends of the footwear market, the mid- to high-price points where Russell & Bromley sits risk being squeezed out.

Italy, Spain and Portugal have maintained their rich shoemaking heritage. While this has much to do with legacy, it may also be the result of these countries’ continued endorsement by luxury brands, where the allure of artisanal shoes resonates with higher price points.

Despite the sales, the Russell & Bromley and Gola brands are not being lost. Under their new owners, they will be able to go on representing the legacy of the British footwear industry, which has been defined by heritage, fashion and craftsmanship.

The Conversation

Naomi Braithwaite does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. The sale of Russell & Bromley is a symbol of the challenges facing independent heritage brands – https://theconversation.com/the-sale-of-russell-and-bromley-is-a-symbol-of-the-challenges-facing-independent-heritage-brands-274444

The British public has lost faith in politics – the Peter Mandelson scandal must be a wake-up call for Keir Starmer

Source: The Conversation – UK – By Sam Power, Lecturer in Politics, University of Bristol

Peter Mandelson giving a speech in Washington during his tenure as Ambassador. Flickr/UKinUSA, CC BY-SA

Peter Mandelson is under criminal investigation after documents released by the US government appeared to show that he released sensitive information to Jeffrey Epstein and his associates while he was a government minister.

He could potentially face charges of misconduct in public office. As Spotlight on Corruption outlines, this law more often than not covers offences committed by serving police officers and prison staff.

In Mandelson’s case, the police are likely to be looking at four specific elements: whether he was a public officer, whether he wilfully misconducted himself (which is a weird way to put it but I didn’t write the law), whether this happened to such a degree that it amounted to an abuse of public trust in the office holder, and whether all of the above happened without reasonable justification.

Mandelson was appointed by Keir Starmer to serve as his ambassador to the US and questions remain about how thoroughly he was vetted or how much Starmer knew about his relationship with Epstein. Either way, it is worth restating that Mandelson was literally known as The Prince of Darkness. It is also worth restating that he twice resigned during the New Labour years following allegations of impropriety.

Starmer, Johnson and standards in public office

Starmer may not like it, but his approach to standards is aligned with that of former prime minister Boris Johnson. When he came into office, Starmer updated the ministerial code in a number of small but important ways. And in the foreword, he said that “restoring trust in politics is the greatest test of our era”. Where he differs from Johnson is that I genuinely think Starmer and his team believe that.

However, neither of them think (or thought) – as far as I can see – that improving trust has all that much to do with matters of ethics, standards or corruption. Or trust. Take, for example, something we hear again and again about the Starmer administration’s fundamental political approach – the so-called “pothole strategy”. This is the idea that fixing things like potholes and delivering tangible everyday improvements in people’s lives will rebuild public faith in politics.

This is underpinned by the five missions of government, its “plan for change”: kickstarting economic growth, building an NHS fit for the future, creating safer streets, breaking down barriers to opportunity and making Britain a clean energy superpower.

All of these may be laudable goals but they don’t amount to a genuine reflection on how to restore trust in politics. That is entirely absent from governmental priorities. Indeed, in the very next sentence in the ministerial code, Starmer wrote: “The British people have lost faith in [politics’] ability to change their lives for the better.” The pothole strategy writ large. Implicit in this is that standards and ethics are secondary. That if you deliver, the trust will come.

Johnson made the same miscalculation. He too assumed that the public cares more about outcome than process. That if you fix the potholes, standards can come second. Both prime ministers have misread the terrain in that they have effectively seen this as a zero-sum game.

Johnson didn’t like the idea of standards, and simply thought that if he did a good enough job the people wouldn’t care either. Starmer, I suspect, thinks that they do matter, but are also something that can be put in a box to be managed later. Neither are correct. Standards, delivery and trust are not mutually exclusive but fundamentally interconnected.

Keir Starmer in parliament.
Starmer responds to questions about Mandelson at PMQs on February 4.
Flickr/UK Parliament, CC BY-ND

The appointment of Mandelson is actually a pretty good microcosm of this. He had a well-known and chequered record. But he was also (until he obviously wasn’t) believed to be good at his job. He was seen as someone who delivered. Michael Gove, editor of The Spectator, thought so at least when in May last year he described Mandelson as someone who was “thoughtful and original”, had a “breadth of vision” and an “intrinsically worthwhile … skillset to interpret the British government to Americans”. The Guardian headline, when a trade deal was (briefly) secured with the US, said: “Cometh the hour, cometh the Mandelson.”

He was also, of course, good friends with Epstein, a convicted paedophile. And the public takes a pretty dim view of this kind of thing – while at the same time wanting someone to fix their potholes. The research bears this out. A UCL study found that “in short, the public does care about integrity”.

Standards and growth are not mutually exclusive

As someone who spends their life studying ethics, standards and corruption, I think there are reasons to be cheerful. Johnson didn’t care, Starmer does. He just has the wrong diagnosis. And I hope his team is beginning to realise this after this sordid affair, because it’s not too late.

Starmer’s government is about to deliver an elections bill, which offers a prime opportunity to tighten the rules in British politics. If it is truly ambitious and caps donations, it would prevent the overreach of a wealthy elite (and isn’t the Epstein story just a truly horrific example of this?).

Meanwhile, the Lobbying Act is in dire need of reform. In fact, some reports have described how Mandelson lobbied Barack Obama’s chief economic adviser Larry Summers (someone also linked to Epstein) in an attempt to water down restrictions on banking activities. These kinds of informal advances echo concerns subsequently raised during the Greensill scandal that engulfed David Cameron a few years ago.

And the UK’s Ethics and Integrity Commission, if properly supported, could chart a new path in our understanding of standards in public life. This is a new body (which has largely subsumed the Committee on Standards in Public Life), but little is known about its role beyond that. A clear steer (and further investment) from the government would help to clarify this.

And the best thing is you can do all these things and fix the potholes as well. Standards are, when it comes down to it, pretty cheap. It’s just a shame some of our politicians are as well.

The Conversation

Sam Power has received funding from the Economic and Social Research Council and the Engineering and Physical Sciences Research Council.

ref. The British public has lost faith in politics – the Peter Mandelson scandal must be a wake-up call for Keir Starmer – https://theconversation.com/the-british-public-has-lost-faith-in-politics-the-peter-mandelson-scandal-must-be-a-wake-up-call-for-keir-starmer-275113

Does the exodus to UpScrolled signify the end of TikTok?

Source: The Conversation – Canada – By Jaigris Hodson, Associate Professor of Interdisciplinary Studies, Royal Roads University

Soon after American investors took control of TikTok’s U.S. operations, users started complaining that content on certain topics was being suppressed. (Unsplash/Appshunter.io)

Until recently, you might never have heard of the TikTok competitor UpScrolled. But as of Jan. 29, the app had reached No. 1 in Apple’s app store as disgruntled TikTok users in the United States rushed to sign up.

The exodus to UpScrolled comes after a group of American investors, including Oracle founder Larry Ellison, acquired a majority stake in TikTok’s U.S. operations on Jan. 22, a day before the deadline set by President Donald Trump for the app’s U.S. operations to be separated from Chinese parent company ByteDance.

Trump and other American officials have long pushed for acquiring TikTok’s U.S. operations, citing concerns over China accessing the data of U.S. citizens. However, soon after the acquisition, TikTok users started complaining of shadow banning, a disputed tactic in which a site lets you post but prevents anyone else from seeing what you post.

The acquisition comes amid civil unrest in the U.S. as Immigration and Customs Enforcement (ICE) officers conduct raids in cities like Minneapolis that have resulted in multiple deaths and hospitalizations. Concerned users have been uploading videos documenting ICE’s actions, but following the acquisition they began to notice their videos were not garnering any attention on TikTok, or sometimes not uploading at all.




Read more:
Blaming ‘wine moms’ for ICE protest violence is another baseless, misogynist myth


Users posting about other topics such as Palestine have also expressed concerns about censorship. Palestinian journalist Bisan Owda’s account was banned shortly after the acquisition. It was restored following an outcry from users.

TikTok says anyone experiencing a disruption over the last couple of weeks has not been shadow banned; it was the result of technical problems following a polar vortex and associated weather-related issues. But this statement from U.S. TikTok came after one million downloads of UpScrolled and reports of concerned users deleting TikTok.

Controlling the algorithm

It may indeed be a coincidence that people had trouble uploading videos critical of ICE at a time of changing ownership, but the whole incident had users talking.

As part of the acquisition, TikTok has been programmed with a new U.S.-specific content moderation algorithm that influences what people do and don’t see. As with every other social network, the algorithm is considered proprietary information, meaning no academic or policymaker can independently audit it.

Trump has expressed interest in controlling social media algorithms, so it’s no wonder people are connecting the outage with possible censorship. Looking at Reddit posts about the TikTok sale reveals how upset some users are.

It’s well known that China engages in censorship on the Chinese version of TikTok, Douyin. In fact, this practice was commented on by France’s President Emmanuel Macron, who stated that children on TikTok in China receive more educational content than children in France do.

Knowing this, it’s not surprising that American users would connect the dots and suggest that any TikTok outage would be a result of government censorship.

The truth is, there’s no way to know for sure whether censorship did occur in the first week of the takeover, or whether it’s still occurring in less obvious ways now. Regardless of whether direct government interference is an issue, the algorithm still filters content in ways that often lead to misinformation spreading among a global user base.

Is time up for TikTok?

Does the rush of users from TikTok to apps like UpScrolled spell hard times ahead for TikTok U.S.? We’ve been here before, and the apps that take a temporary hit usually bounce back. After Elon Musk took over Twitter in 2022 and rebranded it X, many users, including high-profile celebrities and corporations, left the platform. However, engagement is still strong among people who identify as right wing and MAGA.

Every couple of years, it seems, news outlets publish articles about reasons to leave Facebook. But Facebook and X are still going strong. The fact that these sites survive the exodus of both high-profile and regular users is likely due to network effects.

Social media platforms become more valuable the more people are on them. Not only do they become more interesting when there are more people posting content, but people also want to be on platforms where their friends, family and favourite celebrities already are.

Network effects mean that unless UpScrolled continues its explosive growth, people are unlikely to keep choosing it over the more established TikTok. At best, we might see a Twitter/X effect, in which TikTok hosts more pro-U.S.-government content creators and the people who want to follow them, while UpScrolled hosts more critical content creators and their followers. This is basically what happened when many left-leaning users moved to BlueSky as an alternative to X.

Because each social network engages in or facilitates different types of content filtering, each provides a different kind of echo chamber that people self-select into or out of.

These echo chambers are a problem because they reinforce beliefs, even ones grounded in mis- and disinformation, and in turn create deeper, more polarized divisions between people that are hard to escape. Since young people report getting most of their news from social media sites, people concerned about algorithms have more than just government censorship to worry about.

The Conversation

Jaigris Hodson receives funding from the Social Sciences and Humanities Research Council of Canada (SSHRC)

ref. Does the exodus to UpScrolled signify the end of TikTok? – https://theconversation.com/does-the-exodus-to-upscrolled-signify-the-end-of-tiktok-274813

CFC replacements cause vast ‘forever chemical’ pollution – new research

Source: The Conversation – UK – By Lucy Hart, PhD Candidate, Environmental Science, Lancaster Environment Centre, Lancaster University

Arctic ice samples show how concentrations of an abundant forever chemical have changed over recent decades. WizartoProduction/Shutterstock

When the phaseout of ozone-destroying chlorofluorocarbons (CFCs) was first agreed in 1987, the world narrowly avoided an environmental catastrophe. However, the replacement of CFCs is causing the pollution of the Earth’s surface with a “forever chemical” that could remain in the environment for centuries.

The chemical trifluoroacetic acid (TFA) is a breakdown product of numerous chemicals, including CFC replacement gases used in refrigeration and air conditioning, pharmaceuticals such as gases used in inhalation anaesthesia, pesticides, solvents and other forever chemicals from a class known as per-and polyfluoroalkyl substances (PFAS).

Concentrations of TFA have been increasing in rainwater, drinking water, soil and plants over the past two decades. Environmental removal of any of the thousands of different PFAS chemicals is extremely challenging because existing removal technology is difficult to scale up.

If emissions aren’t restricted, the projected cost of PFAS removal has been estimated at €100 billion (£86 billion) per year for Europe. Some researchers have labelled TFA as a “planetary boundary threat” which means it could disrupt Earth’s natural systems beyond repair and threaten our survival.

While some PFAS have been linked to numerous cancers and fertility problems, the long-term health effects of TFA on humans and wildlife remain unknown. However, it has been detected in human blood, breast milk and urine, and is being considered for classification as toxic to reproduction by German government agencies.

While understanding of its consequences continues to develop, increasing TFA pollution urgently needs to be addressed.




Read more:
The last ozone-layer damaging chemicals to be phased out are finally falling in the atmosphere


A better understanding of the many TFA sources and their relative contributions to environmental levels is required to inform targeted policy.

Evidence from ice cores can offer clues to help untangle these sources. TFA concentrations in Arctic ice have risen over recent decades, matching the increasing use of CFC replacement gases. In 2020, Canadian researchers hypothesised that some of these gases, which are known to break down to produce TFA in the atmosphere, could be a major source.

These CFC replacements – known as hydrochlorofluorocarbons (HCFCs) and hydrofluorocarbons (HFCs) – are commonly used in refrigeration, air conditioning and for making insulating foams. They eventually leak into the atmosphere as gases and can travel vast distances. These CFC replacements break down to form TFA and other gases. TFA can be either dissolved in clouds then washed out of the atmosphere through rain or deposited directly from air onto the Earth’s surface.

Our new study, published in the journal Geophysical Research Letters, quantified the contribution of these CFC replacements and also inhalation anaesthetics to global TFA production. We found that one-third of a million tonnes of TFA (335,500 tonnes) has been deposited to the Earth’s surface from these sources between 2000 and 2022.

raindrops falling on puddle
The forever chemical TFA is transported vast distances in the air and can end up washing back to the Earth’s surface in rain.
Astrid Gast/Shutterstock

HCFCs and HFCs have now been phased down under various amendments to the 1987 Montreal protocol on substances that deplete the ozone layer, because they are potent greenhouse gases. Despite this, TFA production increased over the period, with peak production projected to occur anywhere between 2025 and 2100.

By comparing the amounts of TFA in our model to Arctic ice core records, we found that these sources can explain virtually all of the TFA deposited in the Arctic. This is particularly concerning because it highlights the ability of TFA pollution to spread around the globe. Emissions from highly populated regions in the northern hemisphere can have a big effect on far-flung regions once considered to be pristine, such as the Arctic.




Read more:
What’s the forever chemical TFA doing in the UK’s rivers?


Peak TFA

However, when we compared our model results to rainwater concentrations closer to emissions regions in developed countries with extensive infrastructure or manufacturing, we found that the sources in our model could not explain all the observed TFA. We questioned whether this missing TFA could be explained by a refrigerant known as HFO-1234yf. This chemical is increasingly used in vehicle air-conditioning because of its low impact on global warming.

While often promoted as a sustainable climate-friendly alternative to HFCs, hydrofluoroolefins (HFOs) can produce TFA much more quickly than HFCs (this process takes days for HFOs and years for HFCs). This may mean that the HFOs don’t travel as far in the atmosphere before breaking down, so more TFA gets deposited back on land closer to the regions they are emitted from.

By adding estimated emissions of HFO-1234yf to the model, we were able to explain much of the gap between the predicted and actual measurements of TFA.

Emissions of HFOs are highly uncertain, so there may be other unknown sources contributing to the TFA observed in rainwater. But with the increasing use of HFOs, TFA will certainly continue to accumulate in the environment. If left unregulated, emissions from these sources will not peak until well into the future.

Given the risk of its irreversible accumulation in the environment, animals and people, plus a growing understanding of its effects on human health and nature, preventing pollution at source is the safest and healthiest option.


Don’t have time to read about climate change as much as you’d like?

Get a weekly roundup in your inbox instead. Every Wednesday, The Conversation’s environment editor writes Imagine, a short email that goes a little deeper into just one climate issue. Join the 47,000+ readers who’ve subscribed so far.


The Conversation

Lucy Hart receives funding from Natural Environment Research Council ECORISC CDT.

Ryan Hossaini receives funding from the Natural Environment Research Council

ref. CFC replacements cause vast ‘forever chemical’ pollution – new research – https://theconversation.com/cfc-replacements-cause-vast-forever-chemical-pollution-new-research-274776

How our lab is helping develop an Alzheimer’s test that can be done at home

Source: The Conversation – UK – By Eleftheria Kodosaki, Research Fellow in Neuroimmunology, UCL

These tests could be done at home, making Alzheimer’s diagnosis more accessible. nito/ Shutterstock

Imagine diagnosing one of the most challenging neurological diseases with just a quick finger-prick, a few drops of blood and a test sent in the post. This may sound like science fiction, but we are hoping our research could soon help it become a reality.

Our team at the UK Dementia Research Institute’s Biomarker Factory at UCL is part of the global effort to develop and validate a test for Alzheimer’s disease. We’re currently working to overcome the various technical challenges so that this test can one day be available to the broader public.

What do finger-prick tests look for?

At their core, these finger-prick tests are designed to detect specific biomarkers. Biomarkers are biological molecules found in the blood which indicate signs of disease. In the case of Alzheimer’s disease, the brain gradually accumulates abnormal proteins. These proteins form structures such as amyloid plaques and tau tangles which damage the brain’s neural networks. They’re also involved in brain inflammation.

These abnormal proteins can be detected in the brain, cerebrospinal fluid and, importantly, the blood years before symptoms arise.

Recently, research has also shown these biomarkers can be measured in dried blood samples from a simple finger-prick. A study focusing on 337 people showed that these dried blood samples can reliably detect Alzheimer’s-related changes in biomarkers with a diagnostic accuracy of around 86% compared to conventional methods.

Once refined and validated, these tests could aid with early detection, screening at-risk people, tracking disease progression or even evaluating the effectiveness of emerging treatments.

What are the shortcomings of current diagnostic tools?

In addition to cognitive tests (which check for cognitive decline and memory problems), there are currently two robust approaches for diagnosing signs of Alzheimer’s in the brain.

The first is PET imaging. These scans detect disease characteristics using radioactive tracers which light up areas of the brain where tangles and plaques may be present. However, PET scans are expensive, use radioactivity and require specialist facilities.

The second method uses a spinal tap to extract cerebrospinal fluid (the clear, colourless liquid that protects the brain and removes waste). This looks for the same biomarkers as finger-prick tests. However, this method is invasive and can be painful and stressful for patients. Some people also may not be eligible to have it done.

A person lies inside a PET scan machine. A screen to the right shows the ongoing scan of their brain.
PET scans are expensive and require specialist facilities.
Gorodenkoff/ Shutterstock

Cognitive tests also have shortcomings. As a result, people whose first language isn’t the one in which the test is being administered, or those who have other health conditions that also cause cognitive problems, may be misdiagnosed.

And, while cognitive testing can give an idea about a potential issue, these tests alone can’t tell us what specific condition is causing symptoms. This can also lead to misdiagnosis.

Even traditional blood tests done in a clinic have limitations. These tests require immediate processing (or refrigeration) and careful handling to avoid influencing biomarker levels. This makes traditional blood tests impractical for large-scale, population-level screening – particularly in underserved or rural regions.

By contrast, the finger-prick test we’re developing can be done at home and posted to a lab without refrigeration.

What are we working on in the lab?

Our lab is currently working to improve the sensitivity, reliability and real-world usability of these finger-prick tests.

We’re currently experimenting with different, sensitive biomarker detection methods – using just tiny volumes of blood collected from either the finger or the vein and seeing how these compare.

Alongside tau and amyloid, we’re also testing other proteins associated with Alzheimer’s and various neurodegenerative disorders – such as Parkinson’s disease and multiple sclerosis.




Read more:
New Alzheimer’s drug: what you need to know about donanemab’s promising trial results


Our hope with these tests is not only to identify Alzheimer’s disease, but to catch it before irreversible brain damage occurs. This would open a window for early intervention.

With novel therapies emerging that may slow the disease, early identification is critical.

What challenges have we encountered?

Designing these tests hasn’t been straightforward. We’ve encountered a few major hurdles along the way.

The first hurdle we encountered had to do with the biomarkers themselves.

Alzheimer’s biomarker levels are often much lower in the blood than they are in cerebrospinal fluid. So the technological methods needed to measure them accurately had to be very sensitive.

Another obstacle we encountered related to sample quality. Without refrigeration, the proteins can degrade – giving inaccurate readings and potentially misdiagnoses. So we’re currently working to develop collection and mailing methods that ensure these dried blood proteins are stable and don’t degrade before testing.

Data interpretation has also been a challenge. Although these tests are accurate for the majority of cases, we still need to figure out how to interpret outliers – such as participants who have high biomarker levels without other signs of the disease, and those who have low biomarker levels with significant signs of the disease. So even when we detect elevated biomarkers, interpreting what that means for a person’s Alzheimer’s risk is complex.

Alzheimer’s biomarkers are also not exclusive to the disease. Similar biomarkers can occur in other neurological conditions such as vascular dementia and multiple sclerosis, and even in otherwise asymptomatic people, including healthy newborns.

We’ve since refined our tests to make them more sensitive, and we’re currently comparing devices that make at-home sample collection easier. These improvements are steadily making the tests more reliable.

What could our work mean for Alzheimer’s diagnosis?

It’s important to emphasise that these tests are still at least a few years away from routine use. But, if validated, finger-prick tests could revolutionise Alzheimer’s diagnosis in several ways.

They would allow for earlier detection of the disease and broaden access for patients. They would also enable larger, more diverse population studies to be conducted – reducing historical gaps in Alzheimer’s research and improving our understanding of the disease globally.

The idea of diagnosing Alzheimer’s with a quick, finger-prick test marks a profound shift in how we could approach neurodegenerative diseases. Moving beyond invasive, costly procedures toward accessible, patient-friendly diagnostics carries enormous potential – for patients, their families and future research.

The Conversation

The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.

ref. How our lab is helping develop an Alzheimer’s test that can be done at home – https://theconversation.com/how-our-lab-is-helping-develop-an-alzheimers-test-that-can-be-done-at-home-273967