Source: The Conversation – UK – By Dale Pankhurst, PhD Candidate and Tutor in the School of History, Anthropology, Philosophy and Politics, Queen’s University Belfast
After two years of war, Israel and Hamas have agreed on the “first phase” of a US-backed peace plan for Gaza. The deal, if it holds, will involve the release of Israeli hostages and Palestinian prisoners, the withdrawal of Israeli troops from Gaza and the entry of aid into the enclave.
The president of the Palestinian Authority, Mahmoud Abbas, has welcomed the news. He has expressed hope that the deal acts as a “prelude to reaching a permanent political solution” between Israel and Palestine.
But what lies ahead for Hamas? A clause in the wider peace plan calls for the full dissolution of the group, both as a militant organisation and as a civil administration. It is difficult to see how Hamas leadership will negotiate their way through this without some form of disarmament or demobilisation.
The Israeli government, with backing from the US and other western countries such as the UK, has repeatedly said the full demobilisation of Hamas and its militant wing is the only possible outcome it will accept. This leads to a significant dilemma for Hamas.
Its entire reason for existence is to seek the destruction of the Israeli state through violence. There is no room for peaceful, democratic means in its objectives. So if the Hamas leadership are to pursue some form of demobilisation, they risk fracturing the organisation into dissenting armed factions that continue their militancy against Israel.
The Wall Street Journal reports that Hamas’s lead negotiator, Khalil al-Hayya, as well as other political officials living outside of Gaza, are ready to accept disarmament as part of a wider peace process. But analysts suggest other leaders and militants still based in Gaza may be less willing to compromise.
Hamas has remained remarkably resilient throughout the two years of war in Gaza. US figures from early 2025 showed that Hamas had added up to 15,000 new volunteers since the October 7 attacks in 2023, largely replacing those it had lost since the start of the conflict. Many of these recruits may be reluctant to surrender their weapons after losing family and property during the war.
At the same time, Hamas is not the only armed Palestinian group operating in Gaza. Although Hamas led the October 7 attacks against Israel, the attacking force contained militants from multiple armed groups.
These included Palestinian Islamic Jihad (PIJ), the Marxist-Leninist Popular Front for the Liberation of Palestine, the Al-Aqsa Martyrs Brigades, the Maoist Democratic Front for the Liberation of Palestine (DFLP) and the Palestinian Mujahideen Movement.
Some of these groups, including the PIJ, are thought to have joined Hamas in peace talks with Israel. Others are less willing to enter negotiations. The DFLP, for example, has said in a statement that it rejects any form of international mandate or guardianship in Gaza. This includes the future involvement of the former British prime minister, Sir Tony Blair, or an international security force.
Beyond Gaza, Hamas has to consider its future in broader Palestinian politics. The armed group has ruled over Gaza since 2007, when it expelled its traditional opponent, Fatah, from the strip following a bloody feud. Fatah continues to wield significant political authority in the West Bank through its dominance of the Palestinian Authority.
Relations between Hamas and Fatah have been cordial in recent years. But Hamas may fear any demobilisation of its armed forces could shift the balance of power within Palestinian politics, enabling the Palestinian Authority to renew efforts for Gaza to rejoin the West Bank under a single, unified political authority.
Some form of disarmament is possible
Comparable case studies show that the disarmament and demobilisation of insurgent groups is possible, at least in part. In Northern Ireland, the Provisional Irish Republican Army (Pira) decommissioned a large portion of its weaponry in 2005 following protracted peace negotiations.
The Revolutionary Armed Forces of Colombia (Farc) also demobilised its armed units in 2017, a year after a historic peace settlement was reached between the Colombian state and the leftist rebels. Both organisations disarmed despite the presence of other armed groups, such as dissident republicans in Northern Ireland and the National Liberation Army in Colombia, that continued to wage violent campaigns.
Yet in Northern Ireland, the Pira never fully demobilised its volunteer base nor did it decommission all of its weapons. British security services and the Northern Irish police have found evidence that Pira members have been involved in several murders against internal opponents since the group decommissioned.
British intelligence also believes that the Pira’s militant structures and decision-making body, the army council, remain intact. They allege that these people now oversee the political strategy of Sinn Féin, an Irish republican political party.
While some insurgent groups disarm and demobilise, their legacy is slow to fade. Would Israel be willing to accept a similar disarmament, demobilisation and reintegration arrangement in Gaza as the British have done in Northern Ireland?
It is difficult to see the government of Israeli prime minister Benjamin Netanyahu, which has repeatedly insisted that Hamas must be completely destroyed, doing so. Yet a different Israeli administration might.
It also remains to be seen whether Hamas could plausibly disarm a portion of its forces, such as its rocket units and armed assault groups, and allow others to be absorbed into a security force system governed by a body styled on the Palestinian Authority.
A monumental shift in strategic direction would be required for Hamas to reach this point. And the group is arguably more ideologically entrenched now as an Islamist Palestinian movement than the Pira was in the 1990s or the Farc in the 2010s.
Hamas is at a crossroads. It now faces either a period of negotiating for its future with little room to manoeuvre or further war with Israel if it refuses to dissolve. The challenge for mediators is to find a pathway that satisfies Israeli security demands and Hamas’s own quest for survival and transformation within Palestinian politics.
Dale Pankhurst does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
A major shift is unfolding in the field of skin cancer prevention, ignited by new research showing that an everyday vitamin supplement may prevent many cases of the world’s most frequently diagnosed cancer.
Whereas previous studies hinted at a potential benefit, the latest research – spanning more than 33,000 US veterans – suggests that adding this simple vitamin pill to daily routines could dramatically lower skin cancer risk, especially for those who have already experienced their first case.
The scale, breadth and clarity of this evidence are driving calls to rethink how skin cancer is prevented.
Skin cancer is the world’s most common form of cancer. Non-melanoma types, including basal cell carcinoma and cutaneous squamous cell carcinoma, account for millions of new cases each year.
These cancers are linked to cumulative sun exposure, fair skin and ageing. Existing prevention strategies focus on avoiding ultraviolet (UV) rays and using sunscreen, but rates continue to climb, and patients diagnosed with one skin cancer typically face a stubborn cycle of recurrence.
Enter nicotinamide, a cheap, widely available supplement. Researchers observed that this form of vitamin B3 bolsters the skin’s natural repair systems after UV damage, reduces inflammation, and helps the immune system detect and clear abnormal cells.
In the new study, over 12,000 patients who began taking nicotinamide at 500mg twice daily for more than a month were compared to more than 21,000 who did not. Those taking nicotinamide saw a 14% lower risk of developing any new skin cancer. The protective effect was most profound when started promptly after a first diagnosed skin cancer, resulting in a 54% drop in the risk of additional cancers.
This benefit faded if supplementation started only after multiple recurrences, suggesting that timing matters. The effect was seen across both main skin cancer types but was particularly robust for squamous cell carcinoma, which can behave more aggressively and carries a greater risk of complications.
It’s important to underscore that, while hopeful, these findings do not suggest nicotinamide should replace sun avoidance or routine skin checks. Wearing hats, using sunscreen and seeking shade remain pillars of prevention.
Still, the simplicity, safety and low cost of nicotinamide mean that its incorporation as a daily “add-on” is an accessible step for most people, especially those with a track record of skin cancer. For dermatologists, this is an attractive profile compared to some prescription medicines used to prevent recurrence, which may be more expensive or have worse side-effects.
As a secondary prevention tool, it stands out as effective and practical. The timing of intervention appears paramount, with the greatest benefit gained when nicotinamide is offered straight away. In practice, this shifts the conversation, urging healthcare professionals and patients to view the first cancer as a red flag to act decisively.
The findings emerge from an observational study using real-world data, meaning researchers looked at health records and drew statistical associations. Most participants were white males, so the broader relevance of these findings remains uncertain.
While this type of study cannot prove cause and effect as powerfully as a randomised trial, the results align with earlier, smaller trials that hinted at the same benefit. They reinforce the idea that a simple, non-pharmaceutical intervention could help in the battle against the world’s most common cancer, and at a fraction of the expense or risk of more intensive therapies.
This research does not settle every question. It remains to be seen how nicotinamide performs over very long periods and whether the benefit is as robust in more diverse populations. Additionally, people who have never had skin cancer were not the focus, so broader recommendations are likely to stay reserved for those with a prior history.
Still, for those confronting the anxiety of a first skin cancer diagnosis, the promise of a readily available, low-cost and well-tolerated supplement offers a new sense of control.
Justin Stebbing does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
I recently heard Professor Luigi Ferrucci, an expert on ageing, speak at my local university’s medical school. One line really stuck with me: “The next great step in ageing science will be understanding how lifestyle factors slow down ageing.”
That, to me, is the ultimate goal. If we can slow the ageing process, we could delay or shorten the time we spend living with age-related illnesses. In other words, we might stay healthy for longer and only experience those diseases in the last few years of life, feeling younger and better overall.
As Ferrucci gave his talk, a new study was being published showing that one of the most surprising factors influencing ageing is our social life. It turns out that staying connected to others could slow how fast we age.
We’ve known for a while that people with strong social ties tend to live longer and enjoy better health. What’s been less clear is how our social connections affect our bodies on a biological level.
In this new American study of more than 2,000 adults, researchers looked at the strength and consistency of people’s social connections – things like family relationships, involvement in community or religious groups, emotional support and how active they were in their communities.
They devised a measure called “cumulative social advantage” (CSA) – essentially, how socially connected and supported someone is. This was a step forward because most earlier studies looked only at single factors like marriage or friendship.
The researchers then compared CSA to different measures of ageing. They looked at biological age (based on DNA changes, known as “epigenetic clocks”), levels of inflammation throughout the body, and how people’s stress-related hormones – such as cortisol and adrenaline – were behaving.
They found that people with stronger social connections tended to show slower biological ageing and lower inflammation. However, there wasn’t much of a link between social life and short-term stress responses, though the researchers suggested that this might simply be because those are harder to measure.
Altogether, the study adds to growing evidence that our social connections are closely tied to how we age. But perhaps we shouldn’t be too surprised. Humans have evolved over hundreds of thousands of years as social beings.
For our ancient ancestors, belonging to a group wasn’t just about company – it was key to survival. Working together kept us safer, helped us find food and supported our wellbeing. It makes sense, then, that our bodies have developed to thrive when we’re socially connected.
The study also found that social advantage is linked with broader inequalities. People with higher levels of education or better incomes, or those belonging to certain ethnic groups, often showed slower ageing and lower inflammation. This suggests that both our social and economic circumstances affect how we age.
There seem to be two ways to respond to this. First, we need social policies that reduce poverty and improve education and opportunity, because these factors clearly shape health and ageing. But second, we also have some individual control. Strengthening our own social lives – staying connected, supportive and involved – can also make a difference.
I remember being in Washington DC in 2014 for the 40th anniversary of the US National Institute on Aging, where Ferrucci now serves as chief scientific director. During the event, someone asked the head of social sciences: “What will be the most important research area for the next century?” Without hesitation, he replied: “Social science and genetics.”
At the time, no such research programme existed – but he was right. As this new study shows, bringing together these two fields is helping us understand not just how we age, but how we might age better.
James Goodwin does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
Before we can answer this question, we need to think about another one: “what is art?” Art is something people make to share ideas or feelings. It can make others think or feel something too. Art can be many things including music, stories, paintings or drawings.
Cave paintings are often called the first art ever made. However, it’s possible the people who created the paintings thought of them as mysterious and powerful, quite different from art as we think of it today.
So who made them, why did they make them, and where can we find them? In a cave called Chauvet in southern France, archaeologists found drawings of animals such as woolly rhinos and mammoths that died out over 10,000 years ago. The people who made the drawings used black charcoal and red ochre – a colour made from crushed-up rocks that were chewed and spat into the artist’s hand, then pressed against the cave walls. Similar cave paintings have been found in Australia, India and Somaliland.
Some people think the cave paintings weren’t just for fun or decoration. They believe the drawings were supposed to be a kind of “magic”. By drawing animals like deer or bison, they argue, the person who made the picture (maybe a hunter) thought it would give them magical power over the animal they were hoping to catch.
Early thinking about art
A long time ago, a Greek thinker named Aristotle said that the point of art was to imitate the world around us. For him, art wasn’t just painting or drawing – it also included acting and even giving speeches. Because artists used their hands to make things, people thought of them like workers or craftspeople – similar to cooks, hairdressers, or blacksmiths.
In 13th- and 14th-century Europe, art was mostly connected to the church, and was made to help people feel closer to God. Artists were part of groups called guilds, based on the kind of work they did, and people saw them more as skilled workers than as creative individuals.
It wasn’t until the 15th and 16th centuries, known as the Renaissance in Europe, that artists began to see themselves as creators, not just craftsmen. A big change happened in 1436 when a man named Leon Battista Alberti wrote a famous book called On Painting, which claimed that art was just as important as poetry and science. His ideas had a huge effect in the city of Florence in Italy, where three very famous artists worked: Leonardo da Vinci, Michelangelo and Raphael.
People started to think more about artists as special individuals, which was shown in another important book, Lives of the Artists, written by Giorgio Vasari in 1550.
Art began to be divided into two groups. The first was called the “fine arts”, which included painting, sculpture and drawing. These were seen as more important because they expressed big ideas and emotions. The second group was called the “decorative arts”, like glass-making, wood-carving and book decorations. These were thought to be less important because they were more about looking nice or being useful.
In the late 19th century, people started to like the decorative arts more, because artists wanted to focus on handmade things instead of factory-made items. But painting was still seen as the most important kind of art. Then, in 1914, a French artist named Marcel Duchamp changed how people thought about art.
He started using everyday objects and turning them into art just by choosing them and signing them. He called these “readymades”. His most famous one was called Fountain – it was actually a type of toilet (a urinal) that he signed with a fake name, “R. Mutt”, and tried to put in an art show in New York in 1917. Duchamp said that picking an ordinary object and calling it art was enough to make it art, because the artist made the choice.
Duchamp helped change art by showing that it isn’t just about painting or making statues – it’s also about ideas.
Today, many artists use their work to talk about important issues and to make people think. In this way, they are no different from the artists of the past – such as the first cave dwellers who exerted power over their prey, or Duchamp, who challenged the very meaning of art.
And so the answer to the question “who invented art?” is quite simple. Humankind invented art – from the moment we were able to trace a pattern in the sand, or transfer a simple idea to the wall of a cave.
Frances Fowle does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
Source: The Conversation – UK – By Thankom Arun, Professor of Global Development and Accountability, University of Essex
Keir Starmer’s first visit to India was a chance to talk about trade, technology and a closer relationship. The UK prime minister said he was impressed by the country’s “sheer scale” and impressive economic growth.
He may be fairly envious of that growth which, at 7.8% for the first quarter of the year, is several times higher than the UK’s. The country is projected to become the world’s third-largest economy by 2030, with an estimated GDP of US$7.3 trillion (£5.5 trillion). Starmer may also have noticed that one of India’s biggest economic successes is in the burgeoning sector of financial technology, where it is in direct competition with the UK.
Commonly referred to as “fintech”, financial technology involves digital tools and software which make things like banking and investing more efficient and accessible. For years, London has been celebrated as a global fintech hub.
But our research suggests that India’s very different approach to fintech may be a more resilient and forward-looking model – and one which offers important lessons for the UK and its government.
For in the UK, fintech is almost entirely a London-based affair. The capital attracts more than 80% of the country’s investment in the sector, and is home to most of its startups.
But the cost of this success is that other parts of the UK lag behind. Our study shows that this concentration limits innovation and employment outside of London. In effect, the city’s “superhub” status may now be holding back the next stage of national fintech development.
India’s story looks very different. Rather than revolving around a single big city, fintech has evolved across a broad range of regional hubs. Bangalore, Mumbai and New Delhi lead the way, but newer centres such as Telangana and Tamil Nadu are also rapidly emerging.
This spread of growth is not accidental. It reflects years of government investment in digital public infrastructure across the country, as well as a great deal of foreign investment.
We found that between 2000 and 2022, India pulled in US$144 billion compared to Britain’s US$82 billion, reflecting growing investor confidence in India’s fintech market.
That investment is also much more widely spread out in India, where we found a much more balanced and resilient innovation landscape compared to the UK. The outcome is not only faster growth, but more development throughout the country.
Overall we found that the trajectory of India’s fintech sector appears to be more sustainable and regionally dispersed than the UK’s – a clear validation of the country’s “digital-first” strategy, which was launched ten years ago.
We also noted the success of direct government involvement in platforms like the Unified Payments Interface (a game changing initiative which allows instant money transfers between any two bank accounts using a smartphone and now processes over 12 billion transactions a month).
Indian innovation
For the UK, our research concluded that although London’s dominance in fintech has served the country well, it now risks becoming a bottleneck. A fintech sector model built around a single global city limits regional opportunities and undermines national productivity.
If the government’s broader “national renewal” agenda is to succeed, fintech policy could become a test case for rebalancing the economy. That means encouraging investment in regional clusters and directly supporting innovation outside London.
But the two countries could also help each other out. Together, they could create one of the most dynamic fintech partnerships in the world.
As well as trade and investment, such a collaboration could provide a dynamic model for inclusive, technology-driven growth, linking Britain’s financial expertise with India’s digital ingenuity.
For the global fintech landscape is changing. The era of dominant hubs, whether London, New York or Singapore, is giving way to a more decentralised model. And India’s rise shows that the future of finance lies not in concentration, but in connectivity.
Sustainable innovation depends not just on capital and talent, but on geography, inclusion, and the ability to share the digital dividends of growth. If Keir Starmer looks east for inspiration and partnership, he may find that India’s fintech journey offers precisely the blueprint the UK needs – one that proves there is greater strength to be found not in one hub, but in many.
Source: The Conversation – UK – By Deborah Fry, Professor of International Child Protection Research and Director of Data at the Childlight Global Child Safety Institute, University of Edinburgh
Around 5 million children across western Europe report having been raped or sexually assaulted by the age of 18, according to the latest data gathered by Childlight, the Global Child Safety Institute. That’s about 7% of the child population.
In south Asia, data for India, Nepal and Sri Lanka suggests the figure rises to 12% of children – more than 50 million young people in those three countries alone.
The online picture is equally alarming. In western Europe alone, one in five children (19.6%) say they have faced unwanted or pressured sexual interactions online before adulthood.
The data also reveals that over 60% of all child sexual abuse material in western Europe (and 30% globally) is hosted in the Netherlands.
These shocking figures come from Childlight’s latest Into the Light index. As Childlight’s director of data, and as a professor of international child protection research, I have spent nearly 20 years studying the patterns of child sexual exploitation and abuse worldwide. What our data shows is both deeply troubling and a call to urgent action.
How we measure the scale
In 2024, we launched the inaugural Into the Light index – the first comprehensive global report of child sexual exploitation and abuse. It introduced a new framework, the first regional prevalence estimates and indicators of child sexual abuse material.
The 2025 edition goes further. It covers both online and offline abuse and country-level data for 41 countries in western Europe and south Asia, incorporating the analysis of:
89 studies which used survey methods to identify victims of rape and sexual assault
crime statistics and child helpline data
global child sexual abuse material trends, including AI-generated imagery and hosting patterns.
For western Europe, we reviewed 48 studies from 19 countries, finding that between 3.7% and 9.6% of children reported being raped or sexually assaulted by the age of 18. For south Asia, representative data from India, Nepal and Sri Lanka shows around 12% of children were raped or sexually assaulted by 18 – 14.5% of girls and 11.5% of boys.
What the data reveals
Our research points to widespread abuse, and some key issues emerged.
AI-generated child sexual abuse material is rising: reports rose 1,325% between 2023 and 2024, amid growing concerns about deepfake images placing children’s faces onto sexual material. The rise was reflected in reports to the National Center for Missing and Exploited Children, which climbed to over 67,000 in 2024 from 4,700 logged in 2023.
Meanwhile, familial abuse is leading to the creation of new child sexual abuse material, with a large proportion of identified material depicting immediate family members.
Behind these numbers are real children, millions who stay silent out of fear, guilt or loyalty to family members. Yet the consequences are lifelong, affecting mental health, physical health and even life expectancy.
Childlight, hosted by the University of Edinburgh and the University of New South Wales, is the world’s first independent global data institute dedicated to protecting children from sexual exploitation and abuse.
As I have written before, the fight to keep young people safe from harm has been hampered by how data differs in quality and consistency around the world. Our aim is to work in partnership with many other organisations to help join up the system and close the data gaps.
What can be done
The good news is that solutions exist and momentum is building, with 30 governments globally pledging action to improve online safety for children since an intergovernmental summit in Colombia last November.
Legislation is showing promising signs. The EU Digital Services Act and the UK Online Safety Act now require platforms to assess child risk, report incidents and publish transparency data. Australia’s eSafety Commissioner has also compelled firms to publish reports revealing how they are failing to track the problem.
Enforcement is having an impact. In April 2025, Kidflix, one of the largest paedophile platforms in the world, was shut down through an international Europol-backed operation, with servers seized and perpetrators arrested.
Prevention is working too. The Barnahus model in Europe, for example, brings together police, health and social services to support children in a child-friendly environment. It has been linked to more perpetrators being charged and convicted.
In addition, “blocklist” technology, which acts as a virtual shield, is thwarting 3 million attempts to view illegal sexual images of children online every week. Lists of known online addresses which host child sexual abuse material are compiled and shared by organisations including the Internet Watch Foundation, so they can be blocked by major internet service providers, shutting down access to harmful images.
Urgency matters
The law must require proactive detection and removal of child sexual abuse material. Education and open conversations that empower children and families must be supported and encouraged. And finally we must invest in prevention models that work.
In the UK, that could mean extending the law on criminalising paedophile manuals to include material generated by AI. It could mean a Barnahus expansion, and it certainly should mean reforming the Criminal Injuries Compensation Scheme so all victims of child sexual abuse (including those harmed “virtually” through technology) are recognised and supported.
Child sexual exploitation and abuse is not inevitable. Like other public health crises, it is preventable – and preventing it can spare a lifetime of trauma, with benefits for children, families, communities and economies.
But prevention depends on first understanding the true scale and nature of the problem. Our data is a spotlight, exposing what too often remains hidden in the shadows.
Deborah Fry receives funding from Human Dignity Foundation.
Give me control of a nation’s money supply, and I care not who makes its laws. (Mayer Amschel Rothschild, founder of the Rothschild banking dynasty.)
Throughout history, control over money has been one of the most powerful levers of state authority. Rulers have long understood that whoever issues and manages the currency also commands the economy and, by extension, society itself.
In Tudor England, Henry VIII’s “Great Debasement” between 1542 and 1551 reduced the silver content of coins from more than 90% to barely one-third, while leaving the king’s portrait shining on the surface, of course. The policy financed wars and courtly extravagance, but also fuelled inflation and public distrust in coinage.
Centuries earlier, Roman emperors had resorted to similar tricks with the denarius, steadily reducing its silver content until by the 3rd century AD, it contained little more than trace amounts, undermining its credibility and contributing to economic instability.
Outside Europe, the same pattern held. In 11th-century China, the Song dynasty pioneered paper money, extending state control over taxation and trade. This was a groundbreaking innovation, but later dynasties such as the Ming over-issued notes, sparking inflation and loss of trust in the currency.
Such episodes underline a timeless truth: money is never neutral. It has always been an instrument of governance – whether to project authority, consolidate control or disguise fiscal weakness. The establishment of central banks, from the Bank of England in 1694 to the US Federal Reserve in 1913, formalised that authority.
Today, the same story is entering a new digital chapter. As Axel van Trotsenburg, senior managing director of the World Bank, wrote in 2024: “Embracing digitalisation is no longer a choice. It’s a necessity.” By this he meant not simply switching to online banking, but making the currencies we use, and the mechanisms for regulating them, entirely digital.
Just as rulers once clipped coins or over-printed notes, governments are now testing how far digital money can extend their reach – both within and beyond national boundaries. Of course, different governments and political systems have very different ideas about how the money of the future should be designed.
In March 2024, Donald Trump, then a former president back on the campaign trail, declared: “As your president, I will never allow the creation of a central bank digital currency.” It was a campaign moment, but also a salvo in a much larger battle – not just over the future of money, but over who controls it.
In the US, the issuance of currency – whether in the form of physical cash or digital bank deposits and electronic payments – has traditionally been monopolised by the Federal Reserve (more commonly known as “the Fed”), a technocratic institution designed to operate independently of the elected branches of government. But Trump’s hostility toward the Fed is well documented, and noisy.
During his second term, Trump has publicly berated the Fed’s chair, Jerome Powell, calling him “a stubborn MORON” over his interest rate policies, and even floating the idea of replacing him. Trump’s discomfort with the Fed’s autonomy echoes earlier populist movements such as President Andrew Jackson’s 1830s crusade against the Second Bank of the United States, when federal financial elites were portrayed as obstacles to democratic control of money.
In March 2025, when Trump issued an executive order establishing a Strategic Bitcoin Reserve, he signalled the opening of a new front in this institutional battle. By incorporating bitcoin into an official US reserve, the world’s largest economy is, for the first time, sanctioning its use as part of state financial infrastructure.
For a leader like Trump, who has consistently sought to break, bypass or dominate independent institutions – from the judiciary to intelligence agencies – the idea of replacing the Fed’s influence with a state-aligned crypto ecosystem may represent the ultimate act of executive assertion.
Such a step reframes bitcoin as more than an investment fad or criminal fallback; it is being drawn into the formal monetary system – in the US, at least.
America’s crypto future?
Bitcoin is, by a distance, the world’s most valuable cryptocurrency (at the time of writing, one coin is worth just shy of US$120,000), having hit a record high in August 2025. Like gold, its value is ensured in part by its finite supply, and its security by the blockchain technology that makes it unhackable.
For most who buy bitcoin, its key value is not as a currency but as a speculative investment product – a kind of “digital gold” or high-risk stock that investors buy hoping for big returns. Many people have indeed made millions from their purchases.
But now, thanks in particular to Trump’s aggressively pro-crypto, anti-central bank approach, bitcoin’s potential role as part of a new form of state-controlled digital currency is in the spotlight like never before.
Trump’s framing of bitcoin as “freedom money” reflects its traditional sales pitch as being censorship-resistant, unreviewable, and free from state control. At the same time, his blurring of public authority and private financial interest, when it comes to cryptocurrencies, has raised some serious ethical and governance concerns.
But the crucial innovation here is that Trump is not proposing a truly libertarian system. It is a hybrid model: one where the issuance of money may become privatised while control of the US’s financial reserve strategy – and associated political and economic narratives – remains firmly in state hands.
This raises provocative questions about the future of the Federal Reserve. Could it be sidelined not through legal abolition, but by the growing relevance of parallel monetary systems blessed by the executive? The possibility is no longer far-fetched.
According to a 2023 paper published by the Bank for International Settlements, a powerful if little-known organisation that coordinates central bank policy globally: “The decentralisation of monetary functions across public and private actors introduces a new era of contestable monetary sovereignty.”
In plain English, this means money is no longer the sole domain of states. Tech firms, decentralised communities and even AI-powered platforms are now building alternative value systems that challenge the monopoly of national currencies.
Calls to diminish the role of central banks in shaping macroeconomic outcomes are closely tied to the rise of what the University of Cambridge’s Bennett School of Public Policy calls “crypto populism” – a movement that shifts legitimacy away from unelected technocrats towards “the people”, whether they are retail investors, cryptocurrency miners or politically aligned firms.
Supporters of this agenda argue that central banks have too much unchecked power, from manipulating interest rates to bailing out financial elites, while ordinary savers bear the costs through inflation or higher borrowing charges.
In the US, Trump and his advisers have become the most visible proponents, tying bitcoin and also so-called “stablecoins” (cryptocurrencies designed to maintain a stable value by being pegged to an external asset) to a broader populist narrative about wresting control from elites.
The emergence of this dual monetary system is causing deep unease in traditional financial institutions. Even the economist-activist Yanis Varoufakis – a long-time critic of central banks – has warned of the dangers of Trump’s approach, suggesting that US private stablecoin legislation could deliberately weaken the Fed’s grip on money, while “depriving it of the means to clean up the inevitable mess” that will follow.
Weaponisation of the dollar
Some of the US’s rival nations also feel deep unease about its approach to money – in part because of what analysts call the “weaponisation of the dollar”. This describes how US financial dominance, via Swift and correspondent banking systems, has long enabled sanctions that effectively exclude targeted governments, companies or individuals from global finance.
These tools have been used extensively against Iran, Russia, Venezuela and others – triggering efforts by countries including China, Russia and even some EU states to build alternative payment systems and digital currencies, aimed at reducing dependency on the dollar. As the Atlantic put it in 2023, the US appeared to be “pushing away allies and adversaries alike by turning its currency into a geopolitical bludgeon”.
Spurred on by these concerns and an increasing desire to delink from the dollar as the world’s anchor currency, many countries are now moving towards creating their own central bank digital currencies (CBDCs) – government-issued digital currencies backed and regulated by state institutions.
While fully live CBDCs are already in use in countries ranging from the Bahamas and Jamaica to Nigeria, many more are in active pilot phases – including China’s digital yuan (e-CNY). Having been trialled in multiple cities since 2019, the e-CNY now has millions of domestic users and, by mid-2024, had processed nearly US$1 trillion in retail transactions.
A key part of Beijing’s ambition is to use the digital yuan as a strategic hedge against dollar-based clearance systems, positioning it as part of a wider plan to reduce China’s reliance on the US dollar in international trade. Likewise, the European Central Bank has framed its digital euro – which entered its preparation phase in October 2023 – as essential to future European monetary sovereignty, stating that it would reduce reliance on non-European (often US-controlled) digital payment providers such as Visa, Mastercard and PayPal.
In this way, CBDCs are becoming a new front in global competition over who sets the rules of money, trade and financial sovereignty in the digital age. As governments rush to build and test these systems, technologists, civil libertarians and financial institutions are clashing over how best to do this – and whether the world should embrace or fear the rise of central bank digital currencies.
Trojan horses for surveillance?
The experience of using a CBDC will be much like today’s mobile banking apps: you’ll receive your salary directly into a digital wallet, make instant payments in shops or online, and transfer money to friends in seconds. The key difference is all of that money will be a direct claim on the central bank, guaranteed by the state, rather than a private bank.
In many countries, CBDCs are being pitched as more efficient tools for economic inclusion and societal benefit. A 2023 Bank of England consultation paper emphasised that its proposal for a digital pound would be “privacy-respecting by design” and “non-programmable by the state”. It would not replace cash but sit alongside it, the BoE suggested, with each citizen allowed to hold digital pounds only up to a capped limit (suggested at £10,000-£20,000) to avoid destabilising commercial bank deposits.
However, some critics see CBDCs as Trojan horses for surveillance. In 2019, a report by the professional services network PwC suggested that CBDCs, if unchecked, could entrench executive power by removing intermediary financial institutions and enabling programmable, direct government control over citizen transactions. According to the report, this could mean stimulus payments that expire if not spent within 30 days, or taxes deducted at the moment of transaction. In other words, CBDCs could be tools of efficiency – but also of unprecedented oversight.
A 2024 CFA Institute paper warned that digital currencies could allow governments to trace, tax or block payments in real time – tools that authoritarian regimes might embrace. The Bank for International Settlements (BIS) has called the advent of this “programmable money” inevitable.
Imagine, for example, a parent transferring 20 digital pounds to their child’s CBDC wallet, but with a rule that this money can only be spent on food, not video games. When the child uses it at a supermarket, their payment is programmed so that the retailer’s suppliers and the tax authority are paid instantly (£15 to the shop, £3 to wholesalers, £2 straight to the tax office) with no extra steps. In theory, at least, everyone is happy: the parent sees the child spent the money responsibly, the suppliers are paid immediately, and the retailer’s tax bill is settled automatically.
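To make the mechanics of that scenario concrete, here is a minimal Python sketch of how such a programmable payment rule might work. It is purely illustrative: the wallet class, the “food” category and the split amounts are assumptions taken from the example above, not a description of any real CBDC design.

from dataclasses import dataclass, field

@dataclass
class ProgrammableWallet:
    balance: float
    allowed_categories: set = field(default_factory=set)

    def pay(self, amount, category, split):
        # Enforce the spending rule attached to the money itself.
        if category not in self.allowed_categories:
            raise ValueError(f"spending on '{category}' is not permitted")
        if amount > self.balance:
            raise ValueError("insufficient funds")
        if abs(sum(split.values()) - amount) > 1e-9:
            raise ValueError("split must add up to the payment amount")
        self.balance -= amount
        return split  # in a real system, each recipient would be credited instantly

# The parent funds the child's wallet and restricts it to food purchases.
wallet = ProgrammableWallet(balance=20.0, allowed_categories={"food"})

# At the checkout, the payment is split at the moment of transaction:
# £15 to the shop, £3 to the wholesalers, £2 straight to the tax office.
settlement = wallet.pay(20.0, "food", {"shop": 15.0, "wholesalers": 3.0, "tax_office": 2.0})
print(settlement, wallet.balance)  # the three credits, and a remaining balance of 0.0

The point of the sketch is simply that the rule travels with the money: the same logic that makes such payments efficient is also what gives whoever writes the rules fine-grained control over how the money can be used.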
In technical terms, programmable payments such as this are straightforward for CBDCs. But such a system raises big questions about privacy and personal freedom. Some critics fear that programmable CBDCs might be used to restrict spending on disapproved categories such as alcohol and fuel, create expiry dates for unemployment benefits, or enforce climate targets through money flow limits. The BIS has warned that CBDCs should be “designed with safeguards” to preserve user privacy, financial inclusion and interoperability across borders.
Even well-intentioned digital systems can create tools of surveillance. CBDC architecture choices – such as default privacy settings, tiered access or transaction expiry – can all shape the extent of executive control embedded in the system. If designed without democratic oversight, these infrastructures risk institutional capture.
Some CBDC pilots – including China’s e-CNY, the Sand Dollar and the eNaira – have been criticised for omitting clear privacy guarantees, with their respective central banks deferring decisions on privacy protections to future legislation. According to Norbert Michel, director of the Cato Institute’s Center for Monetary and Financial Alternatives and one of the most prominent US voices warning about the risks of CBDCs:
A fully implemented CBDC gives the government complete control over the money going into, and coming out of, every person’s account. It’s not difficult to see that this level of government control is incompatible with both economic and political freedom.
Fears of mission creep
The concerns being raised about central bank digital currencies extend beyond personal payment controls. A recent analysis by Rand Corporation highlighted how law enforcement capabilities could dramatically increase with the introduction of CBDCs. While this could strengthen efforts to stop money laundering and the financing of terrorism, it also raises fears of “mission creep”, whereby the same tools could be used to police ordinary citizens’ spending or political activities.
Concerns about mission creep – the idea that a system introduced for limited goals (efficiency, anti-money laundering) gradually expands into broader tools of control – extend into other areas of digital authoritarianism. The Bennett School has cautioned that without legal and political safeguards, CBDCs risk empowering state surveillance and undermining democratic oversight, especially in an interconnected global system.
It is not anti-technology or overly conspiratorial to ask hard questions about the design, governance and safeguards built into our future money. The legitimacy of CBDCs will hinge on public trust, and that trust must be earned. As has been highlighted by the OECD, democratic values like privacy, civic trust and rights protection must all be integral to CBDC design.
The future of money
Predictably, public views on what we want our money to look like in future are mixed. The tensions we see between centralised CBDCs and decentralised alternatives reflect fundamentally different philosophies.
In the US, populist rhetoric has found a strong base among cryptocurrency investors and libertarian movements. At the same time, surveys in Europe suggest many people remain sceptical of replacing a central bank’s authority, associating it with stability and trustworthiness.
For the US Federal Reserve, the debate over bitcoin, decentralised finance (“DeFi”) and stablecoins goes to the heart of American financial power. Behind closed doors, some US officials worry that both the unchecked use of stablecoins and a widespread adoption of foreign CBDCs like China’s e‑CNY will erode the dollar’s central role and weaken the US’s monetary policy apparatus.
In this context, Trump’s push to elevate crypto into a US Strategic Bitcoin Reserve carries serious implications. While US officials generally avoid direct comment on partisan moves, their policy documents make the stakes clear: if crypto expands outside regulatory boundaries, this could undermine financial stability and weaken the very tools – from monetary policy to sanctions – that sustain the dollar’s global dominance.
Meanwhile, the Bank of England’s governor, Andrew Bailey, writing in the Financial Times this week, sounded more accommodating of a financial future that includes stablecoins, suggesting: “It is possible, at least partially, to separate money from credit provision, with banks and stablecoins coexisting and non-banks carrying out more of the credit provision role.” He has previously stressed that stablecoins must “pass the test of singleness of money”, ensuring that one pound always equals one pound (something that cannot be guaranteed if a currency is backed by risky assets).
This isn’t just caution for caution’s sake – it’s grounded in both history and recent events.
During the US’s Free Banking Era in the middle of the 19th century, state-chartered banks could issue their own paper money (banknotes) with little oversight. These “wildcat banks” often issued more notes than they could redeem, especially when economic stress hit – meaning people holding those notes found they weren’t worth the paper they were printed on.
A much more recent example is the collapse of TerraUSD (UST) in May 2022. Terra was a so-called stablecoin that was supposed to keep its value pegged 1:1 with the US dollar. In practice, it relied on algorithms and reserves that turned out to be fragile. When confidence cracked, UST lost its peg, dropping from $1 to as low as 10 cents in a matter of days. The crash wiped out over US$40 billion (around £29 billion) in value and shook trust in the whole stablecoin sector.
But Bailey’s crypto caution extends to CBDCs too. In his most recent Mansion House speech, the Bank of England governor said he remains unconvinced of the need for a “Britcoin” CBDC, so long as improvements to bank payment systems (such as making bank transfers faster, cheaper and more user-friendly) prove effective.
Ultimately, the form our money takes in future is not a question of technology so much as trust. In its latest guidance, the IMF underscores the necessity of earning public trust, not assuming it, by involving citizens, watchdog groups and independent experts in CBDC design, rather than allowing central banks or big tech to shape it unilaterally.
If done right, digital money could be more inclusive, more transparent, and more efficient than today’s systems. But that future is not guaranteed. The code is already being written – the question is: by whom, and with what values?
The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.
Source: The Conversation – UK – By Zsuzsanna Varga, Senior Lecturer, Political & International Studies, University of Glasgow
When László Krasznahorkai, winner of the 2025 Nobel prize in literature, first burst onto Hungary’s literary scene in 1985, it was clear he was a unique talent. His first novel, Satantango, soon became a cult classic.
The novel’s Hungarian readers were living in the stifling atmosphere of the dying years of state socialism. They were quick to understand the parallels between the novel – about an isolated rural community – and their own isolation from the rest of the world.
They were drawn, too, to Satantango’s sense of physical and psychological decay, and the way it recognised the mundanity of their everyday lives. At least, that was my experience when I first read the book in 1985 in Budapest as an undergraduate student of Hungarian literature.
Oppressive atmosphere and stagnation often feature in the work of Central European writers. But, unlike the oeuvre of many earlier authors, Krasznahorkai’s writing also gained immense popularity on the international – or more specifically, German – scene.
To some extent, this was the result of timing. In the 1990s, western readers often still reacted to art portraying the world behind the recently demolished iron curtain with a mixture of amazement and curiosity.
Novels set in “new Europe” appeared in great numbers, exemplified by the British novelists Julie Burchill (No Exit, 1993) and Tibor Fischer (Under the Frog, 1992). But Germany was more receptive to Central European authors who wrote in less widely spoken languages. For this reason, it served as a seat of literary consecration for them.
László Krasznahorkai’s first interview about his Nobel prize win.
Critics in the early 1990s were inclined to read both Satantango and Melancholy of Resistance as reflections of historical cataclysms. Yet, though Krasznahorkai’s fiction is deeply rooted in Hungarian history, Satantango keeps references to the country’s history vague and fairly abstract. The novel’s universe is only dystopian on the surface: tragic-comedic elements abound, leaving the reader simultaneously baffled and entertained.
International recognition
It is usually English-language publications that lead to the popular rise of non-Anglophone fiction – meaning it took a decade for Krasznahorkai to be recognised.
The novel that first drew wider international attention was George Szirtes’ 1998 English translation of Melancholy of Resistance, which follows the journey of a stuffed whale transported by a travelling circus. This success was followed by translations of War and War in 2006. Satantango, while already a cult classic translated into other languages, did not appear in English until 2012.
As his works became better known, critics increasingly understood Krasznahorkai’s writing within a postmodern framework. Critic Jacob Silverman suggested that Satantango’s main concern is “the realisation that knowledge led either to wholesale illusion or to irrational depression”.
Writer David Auerbach, in a similar tone, suggested that Krasznahorkai’s major concern was the process of making meaning in a world where psychology and rationality are no longer serviceable tools of interpretation.
It was the award of the Man Booker international prize in 2015 that cemented Krasznahorkai’s reputation with the English-language reading public. The author’s decision to split the prize between his translator Szirtes, who was responsible for introducing him to the English–speaking world, and Ottilie Mulzet, who produced a stream of translations of his later work, shows that perceptive translators play a key role in international recognition.
Hungarian fiction has never fared better in the international arena than in the 21st century. The process started with the Nobel prize being awarded to Imre Kertész in 2002. Since then, the works of Antal Szerb (Journey by Moonlight, 2016) and Sándor Márai (Embers, 2016), as well as Magda Szabó (The Door, 2020) have garnered considerable critical success and reached a wide audience in English translation.
These immensely different writers have shown that the audience – readers, translators, critics and publishers – need to pay attention to work coming from languages that are not necessarily seen as part of the movements of world literature. Their efforts will be amply rewarded.
Zsuzsanna Varga does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
Across Latin America, democracy is coming under severe pressure. Authoritarian leaders have been entrenching political power through constitutional manipulation, militarised policing and the persecution of dissent.
In Venezuela, Nicaragua, El Salvador and Argentina, regimes are increasingly eroding democracy and mounting a backlash against human rights.
It is in this bleak regional landscape that the Nobel Committee’s decision to award the 2025 peace prize to María Corina Machado has landed. The award is a recognition of one woman’s defiance. But it is also an opportunity to ask what kind of democracy and what kind of peace the world should aspire to.
Machado has long been the face of Venezuela’s democratic opposition. Disqualified from public office, vilified by Nicolás Maduro’s regime and repeatedly threatened, she embodies the persistence of civic dissent.
The Nobel prize committee’s citation reads: “She is receiving the Nobel Peace Prize for her tireless work promoting democratic rights for the people of Venezuela, and for her struggle to achieve a just and peaceful transition from dictatorship to democracy.”
Yet that transition is a long way from being achieved and remains deeply uncertain. Venezuela has fallen victim to increasing political polarisation and is now suffering one of the worst displacement crises in the hemisphere, with 8 million people having left the country since 2014. And the threat of US interference is ever present.
The prize thus risks celebrating an aspiration more than an outcome. It represents a fragile hope in a region where democratic renewal is both urgent and unfinished.
A feminist reading of courage and contradiction
The award makes Machado the first Venezuelan to receive the Nobel peace prize, underscoring the international significance of her career and support for the Venezuelan democratic cause. There is no doubt that her courage is extraordinary.
Machado has refused exile, rejected violence and unified a fragmented opposition under conditions that would crush most political careers. She was forced to go into hiding last year shortly after alleging fraud in Nicolás Maduro’s reelection on July 28 2024.
For decades, women in Latin America have been at the forefront of resistance movements. From the Mothers of the Plaza de Mayo in Argentina to the Ni Una Menos protests in Buenos Aires, Santiago, Bogotá and Mexico City, women’s groups have focused on advancing human rights and social justice. Machado’s recognition inserts Venezuelan women’s political agency into that prominent tradition.
In a region where politics remains saturated by machismo and military archetypes, this is not trivial. The image of a woman – assertive, unapologetic, unbowed – being recognised as a symbol of peace and democratic resistance matters deeply.
Yet the discussion cannot stop there. Machado comes from a powerful business family. She was educated in exclusive schools in Venezuela and the US and shaped by early work in her family’s steel company – all of which may have informed and defined her political outlook.
Her position as an elite woman in a country whose crisis has hit the poor and working-class hardest highlights the need to broaden the conversation about what a just and inclusive democracy looks like.
Her economic agenda – market-oriented and pro-privatisation – raises questions about whether democratic renewal can balance economic reforms, social protection and grassroots priorities, and about how best to address the inequalities that underpin Venezuela’s crisis.
In recognising Machado, the Nobel committee has invited reflection not only on the courage of individual leaders but also on how democratic movements can more fully integrate issues of peace and social justice for all alongside the fight against authoritarianism.
The announcement of the 2025 Nobel peace prize.
Democracy, peace and the displaced
The Nobel committee described Machado’s resistance as peaceful. In Venezuela the concept of peace is multifaceted. It encompasses not only the absence of violence but also the profound challenges of hunger, displacement and uncertainty that millions continue to face.
The mass displacement since 2014 has disproportionately affected women and girls. They often flee for gender-specific reasons such as the collapse of maternal healthcare and increased rates of gender-based violence. Many have been exposed to trafficking and sexual violence and have faced bureaucratic indifference.
They are the collateral damage of Venezuela’s authoritarian collapse. But their plight is also symptomatic of an international order that fails to protect women.
This situation underscores the necessity of broadening our understanding of peace to include the protection and rights of women – in this case, the many displaced Venezuelan women and girls.
Such an understanding of peace must demand a transition that not only restores electoral democracy but guarantees dignity for those who lost everything to repression and political, economic and humanitarian decay.
A mirror for the region
Machado’s Nobel prize is especially timely, awarded as it has been against a backdrop of democratic backsliding and even erosion across Latin America. Her experience highlights how the more democracy is undermined by a regime in power, the more difficult it becomes for an opposition to unseat that regime in elections – or to assume office if it does win.
Latin American democracies are losing institutional capacity to restrain the executive – while on the streets, popular protest is often forcibly repressed. Many opposition politicians and activists have no option but to flee or hide.
This has been Machado’s experience. But this Nobel prize sends a signal that global institutions are watching, and it highlights deep concern about the future of democracy and the fragility of peace.
Pia Riggirozzi has received funding from the ESRC for the project Redressing Gendered Health Inequalities of Displaced Women and Girls in Contexts of Protracted Crisis in Central and South America (ReGHID).
Source: The Conversation – UK – By Michael Collins, Reader in American Studies and Chair of The British Association for American Studies, King’s College London
When Jack Kerouac published On the Road in 1957, he presented the novel as the product of a single marathon writing binge. It was a method he had been working on since the late 1940s that his friend Allen Ginsberg dubbed “spontaneous bop prosody”.
Despite the manuscript actually being a synthesis of ten years of notes and fragments (as this film shows), the press went mad for the myth of spontaneous prose. What could be more exciting at the height of the cold war than presenting the US as the place of ultimate freedom and possibility, something apparently unavailable to non-western or socialist nations?
On the Road seemed to be an organic, undiluted product of America. It poured out raw and thick from the mind of a man whose voice was marketed as a synthesis of the repressed forces that lay buried beneath the veneer of American postwar prosperity. The US “culture industry” burned heavy diesel in the promotion of Kerouac.
There were naysayers. Other writers like Truman Capote loathed the book. Of the Beat generation in general he once quipped: “None of them can write, not even Mr Kerouac … [it] isn’t writing at all – it’s typing”.
There were also many conservative critics who denounced the work for its sexual content. But here was the cleverness of the “culture industry”. These critics could be rendered as an older cultural elite unable to grasp the significance of the novel. Or even as dried-up husks of an obsolescent religious right. This was the late fifties; a new world was coming. Better get off the freeway if you can’t stand the speed.
The trailer for Kerouac’s Road: The Beat of A Nation.
However much his fans might cling to this vision of the novel, Kerouac does not have the reputation now that he did in the 1960s and 1970s. Waves of feminist criticism, ecological theory and a more cautious stance in literary culture toward the American political project have left him something of a fossil.
It is intriguing, therefore, to see the director of Kerouac’s Road, Ebs Burnough (former deputy social secretary to the Obama White House), return to this mythologisation of the American open road. Does something of Kerouac’s Americana still exist, the film asks, in the era of Black Lives Matter (BLM), of militarised police, at a time when those conservatives who were once so easy to denounce have taken control of the public discourse?
Burnough’s film attempts to synthesise two competing narratives that do not quite hold together. First, it’s a strong (if not wholly original) account of the reception of Kerouac’s novel and the author himself. It uses interviews with celebrity fans, the American writer Joyce Johnson (a former girlfriend of Kerouac’s who has superb things to say about life for the “Beat” women) and Kerouac’s biographer Ann Charters.
Second, the director interweaves three micro-narratives of contemporary American road trips that bear a rather loose relationship to the ideas of freedom Kerouac is held to represent. One follows a young Black man from Philadelphia who is in the process of leaving the poverty of his home city for the promise of Morehouse College (the reverse of Kerouac’s narrative: he dropped out of Columbia). Another follows a couple living on the road to re-energise their marriage. A third shows a woman’s reunion with her abusive father.
The principal issue is that the film cannot reconcile its nostalgia for Kerouac’s era with the true historical and political conditions of the contemporary US.
For one, Kerouac’s novel is anything but an account of American plurality. On the Road has a relentlessly over-determined first-person voice, notable far more for its blind spots than for the expansive, panoramic vision of American life that Burnough takes it to be.
To its credit, the film does address the fact that, at its core, On the Road is about one man’s (the narrator Sal Paradise’s) obsession with another (Dean Moriarty). Yet there is little here about the irony that the very obsession with unfiltered first-person speech, which Kerouac’s novel made so fashionable in American literature, has since turned toxic. It is now associated with the masculine “free speech” and the suppression of alternatives that define contemporary political discourse.
Today, Sal Paradise would have a podcast. And I am not convinced, swathed in deep misogyny and violence as the novel is, that it would be much different from some of the worst of the manosphere.
On the Road is an exercise in resource extraction – of people (especially women), fuel and landscape, all seen as salve to the troubled male soul. This is what makes it interesting as a cultural account of the 1950s. In the film, only the singer-songwriter Natalie Merchant (who is predictably brilliant, insightful and wise) and the comic and cultural critic W. Kamau Bell come close to seeing this.
The Trayvon generation
At 25 minutes in, the director overdubs Joyce Johnson speaking about the Beat generation as the voice of the underclass of the 1950s on to an image of Amin (the Morehouse student) wearing a BLM hoodie.
It is hard to know if the director is being ironic, or if what the poet Elizabeth Alexander has called “the Trayvon generation” (after Trayvon Martin, the 17-year-old African-American boy who was fatally shot by a neighbourhood watch volunteer in 2012) is meant to be seen as a Beat generation in utero. The economic and social chasm between the conditions of the late 1940s and 1950s and the present day makes this parallel seem highly dubious. Most Black men in the contemporary US would not risk crossing the country at 70mph while drinking and driving a Chevy, as Sal Paradise does in On the Road.
The film does, at moments, address the pervasive culture of violence in contemporary America. One of the most poignant interviews is with Amin’s mother, who is worried about her son being shot. Yet the film does not suggest this could be the responsibility of the police. The fault, it seems to imply, lies within the community. This is an egregious misrepresentation of the purpose of BLM, and seems at best politically muted on the director’s part.
The film is very unwilling to undertake the critique needed to measure the distance between the Beats and Trump’s America. Indeed, it reproduces many of Kerouac’s own flaws in being so optimistic about the US. The problems of the world today are no more solved by a road trip than they were in 1957.
Michael Collins does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.