Jane Austen: why are adaptations of Mansfield Park and Northanger Abbey so rare?

Source: The Conversation – UK – By Amy Wilcockson, Research Assistant, Scottish Literature, University of Glasgow

More than two centuries after her death, Jane Austen is one of the most adapted authors of all time, her life and novels dramatised for film and television from every angle imaginable. Despite the plethora of Pride and Prejudice adaptations, Netflix is making its own version, starring Emma Corrin, Jack Lowden and Olivia Colman.

Sense and Sensibility is being rehashed too, with Daisy Edgar-Jones as Elinor Dashwood. On the small screen, the BBC released the hit documentary Jane Austen: Rise of a Genius alongside an adaptation of Gill Hornby’s novel Miss Austen that centres on Austen’s sister Cassandra (plus a forthcoming sequel). A dramatisation of Janice Hadlow’s novel The Other Bennet Sister, which takes up the story of Mary, the dowdy younger sister of Lizzy, has also commenced filming.

Emma got a smart and entertaining do-over in 2020 starring Anya Taylor-Joy as the arrogant but well-meaning matchmaker. And of course Carrie Cracknell’s Persuasion got its Fleabag moment in 2022 starring Dakota Johnson as a wine-swigging heartbroken Anne Elliot whispering acerbic asides to the audience.


This article is part of a series commemorating the 250th anniversary of Jane Austen’s birth. Despite having published only six books, she is one of the best-known authors in history. These articles explore the legacy and life of this incredible writer.


So far so good, say Austen fans, who rejoice in these continued adaptations as they celebrate the 250th anniversary of her birth this year. Yet it is the same stories that keep being told.

Given the packed cinemas for the 20th anniversary screenings of the 2005 Pride and Prejudice film starring Keira Knightley as Lizzy, plus the enduring appeal of that wet-shirted Mr Darcy moment from the BBC series in 1995, it is clear this novel is Austen’s most enduring work. But do we need another adaptation? Or another “alternative” view of Austen or Lizzy Bennet’s lives?

I’m the first to admit that I’m an Austen fan. Her stories have timeless appeal. They focus on romance and class, alongside larger issues of the Regency period such as power, the role of women and even slavery – although the representation of slavery and empire in Austen’s work is long contested.

So what of the “forgotten”, less-adapted novels: Northanger Abbey and Mansfield Park? These are the two Austen novels that bring wider issues most clearly into focus. Why are film-makers happy to leave these stories be? Are their narratives less compelling or have we been brainwashed by Mr Darcy and his breeches?

Northanger Abbey was last adapted in 2007 for ITV, starring Felicity Jones as the heroine Catherine Morland. Its previous iteration premiered in 1987 with Katharine Schlesinger in the lead role. There has never been a film version.

Written in 1798-99, Northanger Abbey was not published until six months after Austen’s death, in December 1817. It is a gothic pastiche, satirising the melodramatic plots and moody locations of popular novels at the time. It also offers a harsh criticism of the conventions of marriage, wealth and social status faced by young women.

Influenced by her sensational gothic reading material, Catherine Morland initially believes General Tilney, with whom she is staying, is guilty of killing his wife. While not a murderer, General Tilney does treat Catherine callously.

After learning that she is not a wealthy heiress, he declares her unsuitable to marry his son, Henry, turfs her out of Northanger Abbey, and leaves her facing a long journey home alone – a fate perilous to any proper young lady. Snobbishness and gender conventions combine as Austen ridicules class and social ambition.

Published in 1814, Mansfield Park was Austen’s third novel. Long considered to be the odd one out of Austen’s works, it was adapted as a TV series in 1983, with film versions released in 1999 and 2007.

In Mansfield Park, Austen examines bigger issues, including infidelity, gambling and most problematically of all, the fact that Sir Thomas Bertram (the uncle of the heroine Fanny Price) owns a plantation in Antigua.

Bringing up the slavery question

Fanny asks her uncle about the slave trade, but is ignored. By positioning a key character as a plantation owner, many scholars – myself included – argue that Austen was trying to draw attention to this debate in her novel. There is also plenty of circumstantial evidence that Mansfield Park is named for Lord Mansfield, a judge who played an important role in ending slavery in England.

Recent research examining Austen’s family demonstrates that three of her brothers were engaged in anti-slavery activism, her letters reveal that she was “much in love” with the abolitionist Thomas Clarkson, and some critics argue that Austen herself supported abolition. Mansfield Park and Emma both feature discussions of the slave trade.

At the very least, Austen was interested in questions of slavery and race. While it is impossible to definitively decipher her personal views from her literary works, it is clear that important issues such as slavery feature in her novels, albeit subtly.

Perhaps it is this serious and timely subject matter, so unlike the usual Austen narrative, that puts off film-makers. But Northanger Abbey and Mansfield Park deserve their time in the limelight.

Rather than iterations of Austen’s afterlives or Lizzy’s family members, powerful and original adaptations of these two novels would invigorate new generations of readers and filmgoers. Who wouldn’t want to watch Greta Gerwig’s Northanger Abbey? It is a serious travesty that a film version has never been released.

Perhaps big studios simply haven’t got around to commissioning a new Northanger Abbey or Mansfield Park. But in overlooking them, they are neglecting a third of Austen’s published novels.

They represent Austen’s most nuanced works, focusing not just on romance (although both heroines get their happy endings) but on society’s wider issues. Crucially, they demonstrate that their author was not just a writer of fluffy romance, but an informed observer of politics and society and the structures that underpinned them.

Even more radically, film-makers could offer a different perspective by adapting one of her contemporaries’ novels – Austen was not the only female author writing during this period.

Scottish novelist Susan Ferrier admired Austen’s work, yet was a hugely successful author in her own right, outselling Austen in the 1800s. Any of her three novels – Marriage, The Inheritance and Destiny – would be a sure-fire Regency hit. Or there is the Gothic pioneer Ann Radcliffe, whose tale The Mysteries of Udolpho, one of the Gothic novels Catherine Morland so enjoys, is ripe for the big screen.

Audiences would perhaps see film versions of her fellow authors’ works as a way to honour Austen’s legacy too, offering viewers something familiar yet different.

The Conversation

Amy Wilcockson does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Jane Austen: why are adaptations of Mansfield Park and Northanger Abbey so rare? – https://theconversation.com/jane-austen-why-are-adaptations-of-mansfield-park-and-northanger-abbey-so-rare-262739

Blair’s ID cards failed in the 2000s – could Starmer’s version fare better?

Source: The Conversation – UK – By Tim Holmes, Senior Lecturer in Criminology and Policing, Bangor University

The UK government is once again looking at the possibility of introducing identity cards, with the prime minister Keir Starmer announcing plans for a new scheme for all UK citizens.

The argument is familiar. With tougher ID systems, illegal immigration would be harder and the UK less appealing. But it also raises a familiar set of questions. How would such a scheme work? And what lessons are there to be learned from the last time the UK had ID cards?

Identity cards were compulsory during the second world war, but the system was scrapped in 1952 after growing unease about police powers and civil liberties.

Fifty years later, Tony Blair’s Labour government proposed new biometric ID cards backed by a national database. Ministers claimed they would help tackle terrorism, illegal immigration and identity theft while giving people secure access to public services.

At the time, terrorism, illegal immigration and identity theft were major concerns. The 9/11 hijackers had avoided detection in the US, 23 illegal immigrants had died while cockle picking in Morecambe Bay in 2004, and people were increasingly falling victim to online fraud and identity theft.

The Identity Cards Act was passed in 2006. The scheme would introduce cards for citizens with new biometric security features, with the data stored on a national database. The intention was that eventually, whether you wanted a card or not, you would not be able to function in UK society without one.

Some argued it would lead to the UK becoming a surveillance society. Protest groups warned of the risks, while Liberal Democrat MP Simon Hughes vowed to go to prison rather than accept the card and the power it gave the state.

In the end, the cards were never tested. The scheme collapsed in 2010, undone not by principle but by cost and a change of government.

2025 proposals

Rising public concern over illegal immigration has once again led to calls for solutions.

The UK government’s latest proposals follow a home affairs committee inquiry into digital IDs and electronic visas in June, which examined whether migrants should be required to use them to prove their status when applying for jobs. The argument is that a tougher ID system would deter illegal immigrants from attempting to enter the country.

The UK is already far more digitally monitored than it was 20 years ago. Biometric passports, digital driving licences and online identity checks are used as a matter of course.

In 2010, when the last ID card scheme was scrapped, public attitudes towards surveillance in public spaces were generally favourable. Monitoring in private spaces was viewed far less favourably.

In 2025, attitudes towards surveillance vary depending on the type. There is now more concern around the mass surveillance of people’s online activities, for example.

Identity schemes are used in 142 countries around the world, 70 with electronic ID. Biometric technology has improved considerably over the past 20 years. More than 120 countries now use facial recognition in passport systems, while UK police forces have integrated the technology into their work.

The question is not whether cards can verify identity – they can. It’s whether they reduce crime or illegal immigration. That depends on how essential they become to everyday life. If an ID check is required for employment, housing and access to services, people without documents may be pushed into the margins, rather than required to leave the country.

In 2005, writer Arun Kundnani argued that ID cards risked becoming “exclusion cards”, creating a new underclass of people unable to access services legally but still present in the shadow economy. That would give organised crime networks even greater power over undocumented migrants, offering illegal routes into housing and work.

Another unresolved question is cost. The last scheme collapsed under the financial weight of setting up the infrastructure and issuing cards nationwide. With public finances tight, the government could find itself facing the same problem again.

Surveillance

There are also broader questions about trust. Academic Clive Norris, who has studied mass surveillance, has warned that constant monitoring encourages the view that ordinary citizens cannot be trusted: “If we are gathering data on people all the time on the basis that they may do something wrong, this is promoting a view that as citizens we cannot be trusted.”

Digital identity cards could bring benefits. For those entitled to live and work in the UK, they might make access to services simpler and faster. But the debate is about more than efficiency. It goes to the heart of how much oversight the state should have over everyday life, and whether a costly system would achieve its stated aims.

The last attempt at ID cards was sunk before it could be tested. Two decades on, the UK is more accustomed to digital surveillance and more anxious about immigration. The question is whether that makes this the right time for a second attempt – or whether the country risks repeating old mistakes.

The Conversation

Tim Holmes does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Blair’s ID cards failed in the 2000s – could Starmer’s version fare better? – https://theconversation.com/blairs-id-cards-failed-in-the-2000s-could-starmers-version-fare-better-264517

HPV: what you need to know about the common virus linked to cancer

Source: The Conversation – UK – By Dan Baumgardt, Senior Lecturer, School of Psychology and Neuroscience, University of Bristol

Few viruses are as widespread – and sometimes misunderstood – as the human papillomavirus, or HPV. It’s so common that most of us – up to 80% – will encounter it at some point in our lives, often without even realising it. Understanding HPV matters, given that it is linked to several types of cancer.

Scientists have identified more than 200 types of HPV, making it one of the most diverse viral families known – and a complex one at that. Many strains are low risk, causing either no symptoms or benign warts. HPV types 1, 2 and 4, for instance, are responsible for the common skin wart. Many will have experienced these, including the familiar verruca (plantar wart) picked up at swimming pools.

Some strains, such as HPV 6 and 11, cause genital warts – small growths that appear on the genitals or around the anus. Treatments such as creams, surgical removal or freezing can get rid of the visible warts, but they don’t remove the virus itself. This means the virus can still be passed to sexual partners until the body’s immune system clears it.

Most seriously, certain types of HPV – particularly 16 and 18 – have known links to cancer. They belong to a group of about 14 high-risk strains that can enter human cells and damage their DNA. This damage interferes with the cells’ normal controls on growth and division, which can lead to the development of cancer.

Repeated or persistent infection with these strains increases the risk of developing cancer. So, too, does smoking, which reduces the ability of the immune system to clear the virus.

Because HPV comes in so many forms – from harmless skin warts to strains linked with cancer – it’s easy to see how myths and confusion can take hold. To separate fact from fiction, here are five key points that everyone should know about the virus.

1. HPV is not just associated with cervical cancer

While cervical cancer remains the most recognised HPV-related malignancy, the virus is also linked to cancers of the vulva, vagina, anus, penis, mouth and throat. Emerging evidence suggests some types may also contribute to developing skin cancer.

This broad cancer risk explains why the widely available HPV vaccine is recommended for both sexes. The vaccine’s ability to prevent HPV infection makes population-wide immunisation beneficial, as transmission may occur between heterosexual and homosexual partners alike.

2. You don’t need to have symptoms or genital warts to pass the virus on

HPV can remain on the skin for months before the immune system clears it, allowing transmission through contact even before genital warts appear and after they’ve been treated. This is why condoms should be used for at least three months after visible warts have resolved.

A condom should still be used three months after genital warts have resolved. AtlasStudio/Shutterstock.com

3. HPV transmission can occur from more than just vaginal or anal sex

Oral and throat cancers can develop following HPV infection acquired through oral sex. The incidence of mouth and throat cancer is increasing worldwide, with oral sex now the most significant behavioural risk factor. Using condoms during oral sex can help reduce this risk.

HPV can also spread through the use of sex toys. One study highlighted the ability of transmissible HPV to remain on sex toys, and the need for proper cleaning practices and for avoiding shared use.

4. Condoms are not 100% effective at preventing spread

Condoms can lower the risk of HPV transmission, but they can’t offer full protection, as uncovered skin can still carry the virus.

This is why many sexually active people will come into contact with a strain of the virus at some point in their lives, even when practising safe sex.

5. Even vaccinated women need to have smear tests

Current HPV vaccines target the main high-risk virus types but cannot cover all cancer-causing strains, or treat existing infections. In rarer cases, cervical cancer can also arise without HPV infection. This is why women aged 25 to 64 are still invited for cervical screening every five years, even after vaccination.

Women should also seek urgent medical review for other indicators of cervical cancer. These include pain or bleeding after sex, bleeding between periods or after menopause, and changes in vaginal discharge.

Even though the HPV vaccine is widely available, uptake has dropped in some areas. The COVID pandemic disrupted routine vaccination programmes, while misinformation about the vaccine’s safety and effectiveness has shaken trust. In some places, low awareness of HPV’s link to different cancers – and of the need to vaccinate boys as well as girls – has also made public understanding more difficult.

The World Health Organization has set a target of fully vaccinating 90% of girls by age 15 by 2030. At present, only about 48% of girls worldwide are fully vaccinated, so there is more work to be done.

Although HPV is often harmless, the potential consequences of some strains are too significant to ignore. But no one should be fearful of an active sex life. For those eligible for the HPV vaccine, protection is not just for the individual, but also for future sexual partners who could otherwise be exposed. By staying informed and taking preventative measures, we can reduce the effect of this common virus and keep ourselves and others safer.

The Conversation

Dan Baumgardt does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. HPV: what you need to know about the common virus linked to cancer – https://theconversation.com/hpv-what-you-need-to-know-about-the-common-virus-linked-to-cancer-263678

Jamaican prime minister returns to power amid reduction in violent crime

Source: The Conversation – UK – By Amalendu Misra, Professor of International Politics, Lancaster University

Jamaicans voted Andrew Holness, the leader of the centre-right Jamaica Labour party (JLP), into power for a third consecutive term on September 3. Holness beat Mark Golding of the People’s National party (PNP) in a tight election, with the JLP winning 34 seats and the PNP 29.

A head of government winning a third straight term is a rare feat in a participatory democracy. Several factors have contributed to Holness’s enduring appeal among voters.

Jamaica’s economy has improved since Holness first took office in 2016. Public debt fell from more than 140% of GDP in 2013 to 73.4% by 2024, while the World Bank expects the economy to grow by 1.7% in 2025.

At the same time, Holness has sought to upgrade and improve public access to beaches across Jamaica. Most of the country’s beaches are gated and in the hands of hoteliers, expats and private companies.

The Holness government introduced an initiative in 2024 to create new “beach parks” for ordinary Jamaicans. It also strengthened regulations to prevent private developments from blocking public access to beaches.

But what is likely to have contributed most to Holness’s victory are his crime-busting policies. Jamaica has been reeling under gang violence for the past 25 years. As recently as 2023, Jamaica had the second-highest rate of homicide in the Caribbean region – behind only Haiti.

Holness has overseen a steady decline in the country’s murder rate since then. There was an 18.7% decline in homicides in Jamaica from 2023 to 2024, while the island registered an even greater drop of 43% between 2024 and 2025.

As in nearby gang-infested Haiti, criminality and violence have long thrived in Jamaica owing to political patronage. Most homicides there are carried out by gangs known as “posses”. These groups have in the past been linked to major Jamaican political parties.

The JLP and PNP both fostered the development of posses in inner-cities and deprived areas by providing them with weapons and a free hand to run protection rackets in exchange for political support. Jamaican politicians have on occasion also allegedly paid gangsters to carry out crimes for political gain.

In the 2010s, an article in the Irish Independent accused Bruce Golding, then Jamaica’s JLP prime minister, of openly using the powerful “Shower Posse” gang to intimidate opposition voters in elections three years earlier. Shower Posse was led by Christopher “Dudus” Coke, a convicted drug kingpin who is now jailed in the US.

Golding “categorically denied” the claims at the time, and called them part of a conspiracy to undermine his government. However, regardless of the accuracy of these specific allegations, collusion between criminals and political elites turned Jamaica into a hotbed of criminality and spiralling violence.

Responsive governance

Holness’s success in fighting crime rests on three pillars: fighting urban poverty, clamping down on the drugs trade and putting tight restrictions on the importation of firearms.

One of his main focuses has been enhancing social programmes to reduce the allure of gang membership. His government has put in place a social pension, while also raising the minimum wage. These policies contributed to the national poverty rate falling to 8.2% in 2023 – its lowest level since measurements began in 1989.

Holness also amended Jamaica’s 2014 Gang Suppression Act in 2021, a year into his second term. This gave the police and military more power to combat criminality and was followed by the launch of an anti-gang task force in 2022. The task force oversaw direct combat with national and transnational gangs operating within Jamaica.

That same year, Holness launched his “Get Every Illegal Gun” campaign. This initiative was accompanied by severe penalties for those found in possession of illegal weapons. The countrywide crackdown on illegal firearms is widely credited with having brought down rates of violence across Jamaica.

However, while Holness’s zero tolerance stance towards criminality has successfully tackled crime rates, there are some concerns about his approach. His critics often cite human rights violations associated with the introduction of a state of emergency in parts of the capital Kingston and 14 other parishes in 2022. The measure enabled the authorities to arrest people and search buildings without a warrant.

Holness justified the move by saying gang violence had forced Jamaicans “to hide under their beds, hide their daughters, can’t go to church, and they see their sons and their boyfriends and husbands killed. That’s the reality”.

The election of Holness for a third time is by no means a guarantee that Jamaica will complete its transition from rampant violence to peace. His populist economic promises, such as lowering the income tax rate from 25% to 15%, earned him much-needed votes. But it is unlikely that such promises can be sustained in the long run.

Jamaican society has also not been completely freed from the ravages of its violent past. Parts of the country, such as Tivoli Gardens, Grants Pen and Trench Town in Kingston, Rose Heights, Flankers and Norwood in the city of Montego Bay, and most of Spanish Town (colloquially known as the valley of death), still reel from vendetta violence.

It is these lingering fears that may have motivated a voter turnout of just 39.5% in the recent election – a turnout far lower than when Jamaicans last went to the polls in 2020. Holness’s vision of “a stronger, safer, more prosperous Jamaica” is still a long way from the finishing line.

The Conversation

Amalendu Misra is a past recipient of British Academy and Nuffield Foundation Fellowships.

ref. Jamaican prime minister returns to power amid reduction in violent crime – https://theconversation.com/jamaican-prime-minister-returns-to-power-amid-reduction-in-violent-crime-264644

We risk a deluge of AI-written ‘science’ pushing corporate interests – here’s what to do about it

Source: The Conversation – UK – By David Comerford, Professor of Economics and Behavioural Science, University of Stirling

Back in the 2000s, the American pharmaceutical firm Wyeth was sued by thousands of women who had developed breast cancer after taking its hormone replacement drugs. Court filings revealed the role of “dozens of ghostwritten reviews and commentaries published in medical journals and supplements being used to promote unproven benefits and downplay harms” related to the drugs.

Wyeth, which was taken over by Pfizer in 2009, had paid a medical communications firm to produce these articles, which were published under the bylines of leading doctors in the field (with their consent). Any medical professionals reading these articles and relying on them for prescription advice would have had no idea that Wyeth was behind them.

The pharmaceutical company insisted that everything written was scientifically accurate and – shockingly – that paying ghostwriters for such services was common in the industry. Pfizer ended up paying out more than US$1 billion (£744 million) in damages over the harms from the drugs.

The articles in question are an excellent example of “resmearch” – bullshit science in the service of corporate interests. While the overwhelming majority of researchers are motivated to uncover the truth and check their findings robustly, resmearch is unconcerned with truth – it seeks only to persuade.

We’ve seen numerous other examples in recent years, such as soft drinks companies and meat producers funding studies that are less likely than independent research to show links between their products and health risks.

A major current worry is that AI tools reduce the costs of producing such evidence to virtually zero. Just a few years ago it took months to produce a single paper. Now a single individual using AI can produce multiple papers that appear valid in a matter of hours.

Already, the public health literature is seeing a slew of papers that draw on data optimised for use with AI to report single-factor results. These link a single factor to some health outcome – an association between eating eggs and developing dementia, for example.

These studies lend themselves to specious results. When datasets span thousands of people and hundreds of pieces of information about them, researchers will inevitably find misleading correlations that occur by chance.
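
To see why, here is a minimal illustrative sketch (my own toy simulation in Python, not taken from any of the studies described here, with entirely hypothetical sample sizes): the health outcome is pure noise, yet testing hundreds of factors one at a time still throws up a crop of “significant” results.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

n_people, n_factors = 5000, 300                      # hypothetical survey: 5,000 people, 300 recorded factors
factors = rng.normal(size=(n_people, n_factors))     # e.g. dietary and lifestyle variables
outcome = rng.integers(0, 2, size=n_people)          # a health outcome generated independently of every factor

# Test each factor against the outcome one at a time – a "single-factor" analysis
p_values = [
    stats.ttest_ind(factors[outcome == 1, j], factors[outcome == 0, j]).pvalue
    for j in range(n_factors)
]

false_positives = sum(p < 0.05 for p in p_values)
print(f"{false_positives} of {n_factors} factors look 'significant' at p < 0.05 by chance alone")

With 300 independent tests, roughly 15 factors (about 5%) will cross the conventional threshold despite there being no real effect at all – exactly the kind of chance association that can be dressed up as a publishable single-factor finding.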

A search of the leading academic databases Scopus and PubMed showed that an average of four single-factor studies were published per year between 2014 and 2021. In the first ten months of 2024 alone, a whopping 190 were published.

These weren’t necessarily motivated by corporate interests – some could, for example, be the result of academics looking to publish more material to boost their career prospects. The point is more that with AI facilitating these kinds of studies, they become an added temptation for businesses looking to promote products.

Incidentally, the UK has just given some businesses an additional motivation for producing this material. New government guidance asks baby-food producers to make marketing claims that suggest health benefits only if supported by scientific evidence.

While well-intentioned, it will incentivise firms to find results that their products are healthy. This could increase their demand for the sort of AI-assisted “scientific evidence” that is ever more available.

Fixing the problem

One issue is that research does not always go through peer review prior to informing policy. In 2021, for example, US Supreme Court justice Samuel Alito, in an opinion on the right to carry a gun, cited a briefing paper by a Georgetown academic that presented survey data on gun use.

The academic and gun survey were funded by the Constitutional Defence Fund, which the New York Times describes as a “pro-gun nonprofit”.

Since the survey data are not publicly available and the academic has refused to answer questions about this, it is impossible to know whether his results are resmearch. Still, lawyers have referenced his paper in cases across the US to defend gun interests.

One obvious lesson is that anyone relying on research should be wary of any that has not passed peer review. A less obvious lesson is that we will need to reform peer review as well. There has been much discussion in recent years about the explosion in published research and the extent to which reviewers do their jobs properly.

Over the past decade or so, several groups of researchers have made meaningful progress in identifying procedures that reduce the risk of specious findings in published papers. Advances include getting authors to publish a research plan before doing any work (known as preregistration), then transparently reporting all the research steps taken in a study, and making sure reviewers check this is in order.

Also, for single-factor papers, there is a recent method called specification curve analysis, which comprehensively tests the robustness of the claimed relationship against alternative ways of slicing the data.
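
As a rough illustration of the idea – a simplified sketch using hypothetical data and variable names, not the full published method – the same factor-outcome relationship can be re-estimated under every combination of control variables, and the whole spread of estimates examined rather than one cherry-picked model:

import itertools
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2000

# Hypothetical data: the outcome is driven by age, not by the "factor" being promoted
df = pd.DataFrame({
    "factor": rng.normal(size=n),
    "age": rng.normal(50, 10, size=n),
    "income": rng.normal(size=n),
    "smoker": rng.integers(0, 2, size=n),
})
df["outcome"] = 0.05 * df["age"] + rng.normal(size=n)

controls = ["age", "income", "smoker"]
estimates = []

# Re-estimate the factor-outcome relationship under every combination of control variables
for k in range(len(controls) + 1):
    for subset in itertools.combinations(controls, k):
        formula = "outcome ~ factor" + "".join(f" + {c}" for c in subset)
        fit = smf.ols(formula, data=df).fit()
        estimates.append({"specification": formula,
                          "coefficient": fit.params["factor"],
                          "p_value": fit.pvalues["factor"]})

# A robust effect would stay clearly non-zero across all specifications; a fragile one will not
print(pd.DataFrame(estimates).sort_values("coefficient"))

If the estimated coefficient on the factor of interest flips sign or collapses towards zero depending on which controls are included, that is a warning that the headline result is fragile.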

Peer review is under threat from AI publishing. Gorodenkoff

Journal editors in many fields have adopted these proposals, and updated their rules in other ways too. They often now require authors to publish their data, their code and the survey or materials used in experiments (such as questionnaires, stimuli and so on). Authors also have to disclose conflicts of interest and funding sources.

Some journals have gone further. In response to the finding about AI-optimised datasets, they now require authors to cite all other published secondary analyses similar to theirs, and to disclose how AI was used in their work.

Some fields have definitely been more reformist than others. Psychology journals have, in my experience, gone further to adopt these processes than have economics journals.

For instance, a recent study applied additional robustness checks to analyses published in the top-tier American Economic Review. This suggested that studies published in the journal systematically overstated the strength of evidence contained within the data.

In general, the current system seems ill-equipped to cope with the deluge of papers that AI will precipitate. Reviewers need to invest time, effort and scrupulous attention checking preregistrations, specification curve analyses, data, code and so on.

This requires a peer-review mechanism that rewards reviewers for the quality of their reviews.

Public trust in science remains high worldwide. That is good for society because the scientific method is an impartial judge that promotes what is true and meaningful over what is popular or profitable.

Yet AI threatens to take us further from that ideal than ever. If science is to maintain its credibility, we urgently need to incentivise meaningful peer review.

The Conversation

David Comerford currently receives funding from Open Philanthropy for a project to design a system that incentivizes meaningful and timely peer review. He has previously received funding from UKRI, IDRC and the Chief Scientist’s Office of the Scottish Government.

ref. We risk a deluge of AI-written ‘science’ pushing corporate interests – here’s what to do about it – https://theconversation.com/we-risk-a-deluge-of-ai-written-science-pushing-corporate-interests-heres-what-to-do-about-it-264606

Plans to ‘maximise extraction’ of North Sea oil and gas would soon run into geological limits

Source: The Conversation – UK – By Mark Ireland, Senior Lecturer in Energy Geoscience, Newcastle University

North Sea oil is in its geological twilight. James Jones Jr / shutterstock

“We are going to get all our oil and gas out of the North Sea”, Conservative Party leader Kemi Badenoch said recently. Her promise to “maximise extraction” sets up a clash between political ambitions, economic reality and geological limits.

Reform UK has also said drilling for more oil and gas in the North Sea would be a “day one” priority. But even if the Conservatives or Reform were elected and lifted the current moratorium on new exploration licences, there might not be enough oil and gas under the seabed – or enough appetite from investors – to deliver on that promise.

BP, in those days British Petroleum, first extracted gas from under the North Sea in 1967. It marked the start of what was to become, for decades, one of the most valuable sectors of the UK economy, with more than 400 separate oil and gas fields developed to date.

But production peaked in 1999 and the North Sea now produces less than half as much as in its heyday.

It is now a “mature” basin: most of the biggest and easiest-to-develop fields have already been discovered and depleted. What remains are smaller, sometimes more remote, and often more technically challenging or expensive resources and reserves.

This is typical of ageing oil and gas provinces, where production declines even as operating costs rise. New projects must compete with oil and gas extracted in other parts of the world, where production is easier, cheaper and more appealing to investors.

Finding oil and gas

Historically, only one in eight exploration wells in the North Sea led to a field producing oil and gas. That ratio has improved: between 2008 and 2017, a bit more than one in four wells led to a commercial success.

But far fewer wells are being drilled today. Even with the advances in technology, such as improved geophysical imaging which allows us to better define opportunities ahead of drilling, the big discoveries were probably made decades ago.

UK exploration wells vs offshore fields by year:

The number of exploration wells is down hugely from its peak in the 1980s and early 90s. Mark Ireland / NSTA

The UK government’s North Sea Transition Authority estimates there could still be around 3.5 billion barrels of oil equivalent in more than 400 undeveloped prospects. But most of these potential fields are small, isolated or technically complex. Developing them will require high oil and gas prices, fiscal stability, and a lot of investor confidence.

Politics vs geology

Even if a future government relaxes exploration licensing rules, geology will remain the bigger constraint. The North Sea is simply not as cheap as it was, and global fossil fuel giants have many other options. It is currently far cheaper to produce oil and gas in other regions – the Middle East or North Africa, for example – and projects in those countries are all competing for the same capital.

Volatility in the energy sector will continue to make investors cautious. The 2015 oil price crash cut activity in the UK sector to its lowest level in decades, and it has never fully recovered. As fossil fuels are sold on the global market, political volatility, international and national, can lead to rapid shifts in investor confidence.

In the UK, the introduction of a windfall tax in 2023 and changing requirements for environmental impact assessments are making decision-making on long-term projects riskier. And while the UK still needs considerable volumes of gas in future (and more modest amounts of oil), demand for both is declining as our energy system evolves and renewable energy expands.

The UK’s mix of economic uncertainty, mature geology and smaller discoveries will make it harder to attract major international energy firms.

The future of the North Sea

That doesn’t mean the North Sea has finished as a source of oil and gas. For instance, undeveloped discoveries – where oil or gas has been confirmed but not yet produced – represent a lower-risk opportunity. But returns may be modest as many are relatively small and isolated from existing infrastructure.

New exploration licenses, if issued, might extend production modestly, but they are unlikely to deliver another game-changing discovery.

Some analysts argue that future licensing should be highly strategic, limited to projects with clear economic importance or climate compatibility. That approach could reduce reliance on imported gas, which tends to be more carbon-intensive than gas produced domestically. This would certainly make more sense than restarting fracking. But it would still not recreate the industry’s heyday.

Easy oil is over

The North Sea will still produce oil and gas for years to come, but its role will shrink. Even with friendlier policies, the era of big discoveries and rapid growth isn’t coming back.

Maximising extraction may sound appealing to politicians, but geology, economics and climate commitments all point to the North Sea’s best oil and gas days being behind it. The real challenge now is managing the decline while investing in the cleaner solutions that will replace oil and gas.


The Conversation

Mark Ireland has previously received funding from companies with fossil fuel interests.

ref. Plans to ‘maximise extraction’ of North Sea oil and gas would soon run into geological limits – https://theconversation.com/plans-to-maximise-extraction-of-north-sea-oil-and-gas-would-soon-run-into-geological-limits-264614

Can certain food cravings predict a cancer diagnosis, up to three months before other symptoms appear?

Source: The Conversation – UK – By Justin Stebbing, Professor of Biomedical Sciences, Anglia Ruskin University

Roman Samborskyi/Shutterstock.com

Why do health stories about food and cancer grab so much attention? Because they offer an enticing promise: that a single item on your plate, or even a sudden change in what you crave, might hold the key to spotting disease early.

It’s a compelling idea, but in reality the science of appetite, taste, and cancer is far messier than the headlines suggest.

This eye-catching idea oversimplifies reality. While cancer can change appetite and taste, there’s no solid evidence that a sudden craving, such as an abrupt fixation on sweets, serves as a dependable early warning signal for undiagnosed cancer.

This is a classic case where interesting clinical anecdotes and stories have been stretched into a sweeping rule that doesn’t work as a screening tool.

The grain of truth behind these headlines comes from clinical observations. Some cancer patients do report altered taste and appetite. In older case studies, patients described dramatic changes – tea suddenly tasting awful, or favourite foods becoming repulsive – sometimes before diagnosis, sometimes after treatment began.

These accounts seem compelling, but they were never designed to prove that a particular craving reliably predicts cancer. They show that cancer can affect how we taste and eat, not that a single symptom can replace proper diagnosis.

Modern research paints a more complex picture. Studies examining “altered food behaviour” around cancer cover a wide range of changes: cravings, aversions, emotional eating and treatment-related appetite shifts.

These studies look at different cancers, stages, and time points – before, during and after treatment. The overall message is that eating behaviour can change in the context of cancer, influenced by biology (inflammation and metabolism), physiology (changes to taste and smell) and psychology (stress and mood).




Read more:
Why a daily glass of milk really could reduce bowel cancer risk – an oncologist explains


What we don’t see is a specific craving pattern that reliably warns of cancer in healthy people. Appetite changes can be part of the cancer story, but they’re not a diagnostic shortcut.

It’s worth bearing in mind how common appetite changes are in everyday life. Many ordinary factors affect what tastes good and what the body wants, including medications, pregnancy, stress, quitting smoking and anaemia.

A sudden enthusiasm for a particular food might be interesting, but it rarely points to a single cause. That’s why doctors look for clusters of symptoms and lasting patterns rather than drawing conclusions from one change.

Chewing ice

There is one area where cravings connect meaningfully to health: ice chewing. Constantly chewing ice (called pagophagia) can signal iron deficiency, which has treatable causes that should be found and addressed. This is completely different from claims that tumours program sugar cravings.

Ice chewing represents a well-established link between unusual eating behaviour and a specific, testable condition. Iron deficiency itself is both common and often missed.

Iron is essential for making haemoglobin, which carries oxygen in red blood cells, and plays broader roles in energy and immune function. When levels drop, symptoms are often vague: persistent fatigue despite adequate sleep, exercise intolerance, shortness of breath and headaches, to name a few.

These overlap with many other conditions, which is why testing matters rather than guessing. Iron comes from red meat, poultry, seafood, beans, lentils, leafy greens, and fortified cereals and breads. However, a “good” diet doesn’t always guarantee adequate iron if losses are high, needs are elevated, or absorption is poor – another reason to confirm and treat the problem with proper testing.

A craving for chewing ice could signal an iron deficiency. New Africa/Shutterstock.com

No magic clues

Returning to the headlines, it’s easy to see why supposed tell-tale cravings capture attention. They promise a simple signal in a confusing health landscape. But medicine rarely offers magic clues.

The sensible approach is twofold. First, treat new, persistent, and unexplained changes in taste or appetite as worth noting – not panicking about. Consider the full picture: other symptoms, recent illnesses, medications, stress and overall health. If behaviour like ice chewing appears or fatigue becomes stubborn, checking for iron deficiency makes sense.

Second, for cancer risk concerns, rely on established warning signs and screening tests. Unexplained weight loss, unusual bleeding, changes in bowel habits, swallowing difficulties, new or changing lumps and age-appropriate screening catch far more cancers than chasing a single craving ever will.




Read more:
King Charles is changing his diet to keep his cancer at bay – here’s what the evidence says


The craving narrative carries another danger: it can fuel harmful behaviour, like trying to “starve” a tumour by cutting out major nutrients.

Severe restriction can cause dangerous weight loss, malnutrition and poorer tolerance of treatment, undermining recovery rather than helping. Tumours don’t outsmart sensible nutrition. What helps most is maintaining strength with a balanced diet, staying active when possible, following evidence-based screening and treatment, and using targeted tests – like iron studies – when symptoms suggest they might be helpful.

Appetite and taste are sensitive measures of health and their changes deserve attention. They’re part of the medical conversation, not a crystal ball.

If something feels wrong and stays wrong – whether that’s a new aversion to familiar foods, an odd fixation that won’t go away, or constant ice chewing – the next step isn’t to search Google for hidden meanings. Instead, talk with a doctor.

Simple tests can quickly rule out common problems, and if something more serious is happening, acting on established warning signs and screening guidelines offers the best chance of catching it early.

The Conversation

Justin Stebbing does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Can certain food cravings predict a cancer diagnosis, up to three months before other symptoms appear? – https://theconversation.com/can-certain-food-cravings-predict-a-cancer-diagnosis-up-to-three-months-before-other-symptoms-appear-264378

History is full of failed attempts to establish new currencies. So what makes crypto different?

Source: The Conversation – UK – By Hiroki Shin, Associate Professor of History, University of Birmingham

Bukhta Yurii/Shutterstock

The confusion and commotion over cryptocurrency often reminds me of the 19th-century German drama Faust. In Goethe’s masterpiece, the devil Mephistopheles offers an emperor the tantalising vision of limitless wealth through the printing of paper money.

The emperor grasps the idea (unheard of at the time the play is set), and the magical wealth which paper creates brings brief prosperity to his troubled dominion.

But what appeared to be an inexhaustible source of value soon proves illusory. A combination of misunderstanding and hype leads to economic and moral corruption, and the empire descends into chaos.

It is a tale which could well have parallels for digital currencies now. People use them despite not fully understanding how they work, sometimes to their financial loss.

And history shows us that we should be wary of the idea that currency systems always change for the better, following some sort of natural evolutionary path. In fact, new currency systems don’t always succeed, and even when they do, monetary regime change can be a long and arduous process.

Coins and tokens were used more than 2,000 years ago, and continued through to the 19th century before paper finally dominated. Rather than a clean, irreversible shift from coins to notes, nations often alternated between the two systems.

There were failed experiments with paper money in 14th-century China, 17th-century Sweden and 18th-century France, to mention just a few.

Research on these problematic attempts suggests that social division also makes new currency shifts especially vulnerable.

During the American war of independence, for example, a currency (the “continental dollar”) was briefly introduced in 1775. It was later abandoned due to mismanagement and misunderstanding, but had also served to sharpen political tensions between the patriots, who supported it, and the loyalists to Britain, who detested it.

Similarly, in the 1750s and 1760s, the Swedish government issued non-redeemable paper money to pay its war debts. The consequent extreme inflation coincided with intense social division and led to a period of political chaos.

In 1789, at the start of the French Revolution, a paper form of government bond known as the “assignat” was issued, which rapidly lost its value. Seven years later, it had become virtually worthless.

Britain fared slightly better, as I explore in my book The Age of Paper. Its departure from the metal standard in 1797, amid the financial pressures of the Anglo–French war, did not produce a collapse of the nation’s paper currency.

But the paper-based regime came to a halt in 1819, a year of bitter class conflicts, which culminated in the Peterloo massacre, where at least 18 people were killed and hundreds wounded by the cavalry at a peaceful rally for democratic reform. The public had come to detest Bank of England notes, which became a symbol of economic depression and political oppression.

Britain then followed the pattern of other nations, reverting to a traditional monetary system that rested on the solid value of precious metals.

These cases of failed paper currencies – and there are many more – show that the general acceptance of currency ultimately requires shared values and social solidarity. Paper currency works when people trust it, knowing that it has been valued and accepted by others in the past, and will be valued and accepted in the future.

It would otherwise be unlikely for a piece of paper to become a reliable means of payment and value. Once such shared values are lost, there is usually a downward cycle of currency depreciation.

Cryptic currency

In the 21st century, cryptocurrencies challenge the conventional idea of money as something of value – or at least linked to something of value, like gold – and as something that is issued and managed by a trusted central authority.

For cryptocurrencies exist only in the realm of blockchain technology. Their value is created and maintained not by central banks, but by complex computer algorithms.

To many, all of these abstract computational processes make cryptocurrencies as mysterious as Mephistopheles’ dark magic in Faust.

Even so, with Donald Trump’s strong support, cryptocurrencies are enjoying a surge in popularity. This trend will undoubtedly be reinforced by further deregulation, which means requirements for transparency will be relaxed and safeguards for consumer protection weakened.

The rise in popularity has coincided with the US government’s apparent preference for weakening the dollar in the international currency markets as a way of boosting US exports by pushing down the prices of US goods abroad.

These events may lead to a profound transformation in the monetary system on a global scale. As the US dollar loses value and crypto regulations are relaxed, countries and investors around the world may be enticed to diversify their assets and increase their holdings of cryptocurrency.

But the combination of social division and rapid expansion may not be a positive sign for the future of cryptocurrency. Far from establishing it as a dominant medium of exchange in a new decentralised regime, history suggests that its rapid growth in a fractured society might instead accelerate its self-destruction.

The Conversation

Hiroki Shin does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. History is full of failed attempts to establish new currencies. So what makes crypto different? – https://theconversation.com/history-is-full-of-failed-attempts-to-establish-new-currencies-so-what-makes-crypto-different-258867

Stories of people at war – what to watch, read and see this week

Source: The Conversation – UK – By Naomi Joseph, Arts + Culture Editor, The Conversation

Gritty social realism is never an easy watch. It depicts characters, often working class, struggling to find their bearings in a society that is hostile to their very existence. They tend to expose hard truths about the world we live in and often how we are complicit in upholding harsh systems that continuously fail people.

In Britain, this genre is championed by the likes of Ken Loach and Mike Leigh. In the Francophone world, it is dominated by Luc and Jean-Pierre Dardenne and now Jasmin Gordon.

The Courageous is Gordon’s debut film as a director. It follows the complicated and messy character of Jule, a single mother of three who has made many mistakes (one of which has left her with a jaunty ankle bracelet) and just can’t make things work.

As our reviewer Alison Smith, an expert in European cinema, writes: “Jule is fundamentally at odds with the ordered society of rural Switzerland, and in consequence her life is a constant struggle.” But, she doesn’t want anyone, especially her kids, to know how bad things are.

This is a morally murky film: Jule’s attempts to keep up appearances drive her to behave in ways that further marginalise her. However, it is clear that the system, and those who exist comfortably within it, don’t want to help in any way, pushing her to more desperate acts and into deeper isolation. As Smith found, it is an incredibly powerful film but, like many other social realist works, there are few uplifting moments.

The Courageous is in select cinemas now

Warring couples and kings

If you are looking for something a bit more silly and lighthearted, then we recommend you go and see The Roses instead. A remake of the 1989 film The War of the Roses, this reboot stars the endlessly charming Olivia Colman and Benedict Cumberbatch as the Roses, who turn from lovers to fighters when one’s career takes off as the other’s tanks.

We sent Veronica Lamarche, an expert in relationships and psychology, to see the movie and she was struck by how people were affected by the dynamic between the couple. One woman, she notes, left the screening in tears having seen a reflection of her relationship in the film.

In her piece, she highlights how the film reflects the way couples need to learn to talk to each other – and maybe even to argue. She outlines the psychology and offers advice on how to handle bumps before things get so bad you’re throwing knives at each other.

The Roses is in cinemas now

Someone who was good at winning battles and also negotiating was King Æthelstan, whose skills in these fields led to the unification of several regions into what would become the kingdom of England. The First King of England: Æthelstan and the Birth of a Kingdom by David Woodman addresses both English unification and Viking politics.

There is family drama, legal intrigue and an exploration of how historical narratives are formed in this book. Clare Downham, an expert in the period, found it a welcome addition to the history of England’s first king, one that shies away from simplistic views of this complicated and, at times, unlikable monarch.

Battling Normans and playwrights

It is often incorrectly asserted that the formation of England came purely as a result of the battle of Brunanburh (937). However, much paperwork and negotiation was also involved. A much more important battle in the history of England was the battle of Hastings in 1066.

A new BBC drama King and Conqueror depicts the events leading up to the battle and the Norman conquest. It stars James Norton as the Anglo-Saxon king, Harold II and Nikolaj Coster-Waldau as the Norman duke, William II (who would go on to be known as William the Conqueror).

We know what happened at that battle thanks to two main sources: the Bayeux Tapestry and the chronicler William of Poitiers. We asked art historian and expert in the tapestry Millie Horton-Insch what she thought of the series, and she said that she was pleased to see “the narrative devices that are most effective in this new drama are those also included in tapestry”.

King and Conqueror is on BBC iPlayer now

A new play at London’s Wyndham’s theatre, Born With Teeth, imagines the process of Shakespeare (Edward Bluemel) and his contemporary Christopher “Kit” Marlowe (Ncuti Gatwa) writing Henry VI Parts 1, 2 and 3 together. These plays have historically been attributed to the bard alone – but some have argued since at least the 18th century that Marlowe contributed to these works. Recent linguistic analysis of the plays does seem to back up this hypothesis.

The play presents three imagined secret meetings in the back room of a pub where the pair butt heads as they are forced to write together. Will Shüler, an expert in theatre, found it to be a funny and imaginative play about theatre itself and the creative process, particularly under the religious constraints of the Elizabethan age.

Born with Teeth is on at Wyndham’s Theatre until November 1 2025

The Conversation

ref. Stories of people at war – what to watch, read and see this week – https://theconversation.com/stories-of-people-at-war-what-to-watch-read-and-see-this-week-264636

Why the new US military operation against Latin American drug cartels stokes regional tensions

Source: The Conversation – UK – By Adriana Marin, Lecturer in International Relations, Coventry University

The US president, Donald Trump, has signalled a new approach to tackling the “narco-terrorists” in Latin America, and particularly Venezuela, making it clear he is willing to use military force against them. A report in the New York Times that Trump had issued a “secret directive” to the Pentagon to employ force against certain drug cartels appeared to be borne out by a US strike, on September 2, on a Venezuelan speed boat in the southern Caribbean that killed 11 people.

The president said the strike was against members of Tren de Aragua (TdA), a Venezuelan gang he has branded “narco-terrorists”.

The situation escalated when two Venezuelan fighter jets flew over US Navy ships in the Caribbean Sea two days later in a move that the Pentagon condemned as “highly provocative”.

The US secretary of state, Marco Rubio, has warned that operations against drug cartels “will happen again”. He added that previous US drug policies had not worked and “what will stop them is when you blow them up”.

Trump released a grainy video of a speeding boat on social media after the September 2 incident. US officials said the boat was carrying drugs, but attempts at verification were inconclusive. A Venezuelan government official questioned whether the video depicted what Washington claimed.

The operation raises legal questions over proportionality and use of force. If this was an intentional strike against a non-state armed group, it signals a significant shift in US policy. The deployment of counterterrorism methods – once directed at al-Qaida or the Islamic State – against a Latin American criminal cartel represents a dramatic escalation with serious implications.

This also fits within a wider Trump initiative to take on drug cartels, including issuing a US$50 million (£37 million) reward for information leading to the arrest of Venezuela’s president, Nicolás Maduro, whom the Trump administration links with drug smuggling. In early 2025, the US designated the TdA as a foreign terrorist organisation (FTO), along with several other Latin American cartels.

PBS reports on the attack showing the video of a speeding boat.

The decision was unusual. FTO status has historically been applied to ideologically driven groups, not profit-orientated criminal organisations. Yet the designation unlocked the US’s ability to use counterterrorism measures, along with a political narrative that frames gangs as wartime adversaries rather than criminals.

The US has not ratified the United Nations Convention on the Law of the Sea, which governs maritime enforcement, but it generally treats many of its provisions as international law. Domestically, only Congress can declare war under the constitution, while the president acts as commander-in-chief. Previous administrations have relied on the 2001 Authorization for Use of Military Force as the legal basis for counterterrorism operations abroad, but this has never been applied to drug cartels. This creates a grey zone: Washington claims authority to act, but both the international and domestic legal foundations remain contested.




Read more:
Guyana’s president wins another term in election watched keenly by Venezuela and US


Expanding the fight

FTO designation expands what can be done under domestic law. However, it does not create a right to kill suspects in international waters. Such a shift is important, as it changes what would usually fall within the remit of policing, reframing it as armed conflict. This militarisation introduces the apparatus of warfare: missiles, warships, and rules of engagement that lower thresholds for the use of lethal force.

By conflating organised crime with terrorism, responses risk becoming militarised in ways that lack accountability. A warning from the US defense secretary, Pete Hegseth, that “it won’t stop with just this strike” is a further suggestion that this is a campaign rather than a one-off action. Militarised counter-narcotics operations are not new, but framing them through the lens of counter-terrorism is, and it suggests a wider use of military force.

Proponents of a hardline approach contend that cartels such as TdA resemble insurgent organisations. They work across borders, adapt quickly and use violence; they diversify into trafficking, extortion and protection rackets, while exploiting migration flows and infiltrating law enforcement.

Marco Rubio, the US secretary of state, confirms reports that the Trump administration is going to use ‘full powers’ to take on drug cartels.

From this perspective, conventional criminal justice tools are ineffective. Extraditions are often delayed and prosecutions unreliable because cartels frequently operate across borders, benefit from corrupt protection networks and are difficult to apprehend.

Yet conflating organised crime with terrorism carries serious risks. Unlike al-Qaida or the Islamic State, TdA seeks profit and control, not radical political change. Labelling it a terrorist organisation risks blurring legal boundaries. The designation of an act as terrorism often shifts rules of engagement from due process to battlefield logic, lowering the threshold for lethal force.

The legal basis is also tenuous. FTO status broadens domestic authorities but does not itself provide a licence to use force under international law. Any claim of self-defence would also require an imminent threat, which has not been demonstrated.

Risks in the region

This strike delivers Maduro a propaganda gift. For years, Venezuela has portrayed US pressure as imperial aggression designed to undermine its sovereignty. The destruction of a Venezuelan vessel by a US missile, even in international waters, appears to validate that claim. It is likely to give Maduro an opportunity to rally domestic supporters, consolidate control over security institutions and court sympathetic foreign allies who share his anti-US position.

Neighbouring governments face a dilemma. Many are weary of cartel violence, human trafficking, and the effects of criminal infiltration. Some may even welcome Washington’s tougher approach. Yet few leaders wish to legitimise unilateral US military action. Even if there is some support for tougher action against cartels, regional political leaders are likely to divide over whether the potential benefits outweigh the risks of being drawn into conflicts they did not sanction.

Finally, there is a deterrence paradox. High-profile strikes may remove leaders, but they rarely dismantle networks. Instead, groups splinter, adapt and sometimes embed further into civilian life. The “balloon effect” – squeezing crime in one place only to displace it elsewhere – remains a constant. In short, military action does not usually eliminate criminal economies; it often changes or moves them. Militarisation risks fuelling escalation.

The US strike against the TdA blurs the line between law enforcement and war. It sets a precedent where states can justify cross-border assassinations under the guise of “counter-terrorism” against criminal suspects. The question is not whether TdA is violent – it is. The real issue is whether labelling it as “terrorism” legitimises a military approach that could be counterproductive, unlawful and dangerous. Washington’s new “narco-terrorism” doctrine risks fuelling the very instability it claims to fight.

The Conversation

Adriana Marin does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Why the US new military operation against Latin American drug cartels stokes regional tensions – https://theconversation.com/why-the-us-new-military-operation-against-latin-american-drug-cartels-stokes-regional-tensions-264645