Fining hospitals for medical misogyny won’t help women – it will hurt them

Source: The Conversation – UK – By Philip Broadbent, Wellcome Multimorbidity PhD Fellow & Public Health Registrar, University of Glasgow

At the back of the queue. toodtuphoto/Shutterstock.com

Hospitals that score poorly on feedback from female patients could soon see their budget cut under a plan unveiled in April by Wes Streeting, the UK’s health secretary. Branded “patient power payments”, the scheme would tie a slice of hospital income to women’s experiences of care, a measure designed to end what Streeting himself has called an “appalling culture of medical misogyny” in England’s National Health Service.

The instinct behind the policy is understandable. Women’s anger is real, well founded and long overdue for a serious answer.

The backlog in women’s care is the clearest illustration of the problem. Nearly a quarter of a million women are on waiting lists for gynaecological care in England. This number has roughly doubled since 2018 and has grown faster than any other clinical speciality’s waiting list.

In a survey of more than 100,000 women, half said their pain had been dismissed or overlooked. In the UK today, obtaining a diagnosis of endometriosis (a painful condition affecting roughly one in ten women) takes an average of nearly nine years and roughly ten visits to a GP.

This is a cultural sickness. Whether the right response is to dock money from overburdened hospitals is a different question.

Pay-for-performance schemes for hospitals have a long and somewhat chequered history abroad. A US review found that American hospitals serving the poorest patients paid roughly 10% of all penalty dollars under Medicare’s quality programmes while taking home only 5% of the bonuses. Research has also found that such programmes risk widening inequality by diverting funds from hospitals that care for the sickest patients.

The surveys themselves are not without bias. Female, Asian and black doctors score lower on patient experience ratings than their white male peers, even when the care is identical. These scores often pick up charm, confidence and continuity rather than clinical quality.

Health secretary Wes Streeting carrying a red binder.
Streeting has a plan – not a great one, though.
repic/Shutterstock.com

The problem with importing this logic into England is that the hospitals most likely to score the worst are already in the worst shape.

Patients in deprived areas tend to be sicker and develop multiple long-term conditions up to seven years earlier than those in wealthier neighbourhoods – factors that drag satisfaction scores down without necessarily reflecting poor clinical care.

Hospitals serving patients in the most deprived areas recorded the steepest fall in finances last year, and the NHS as a whole is carrying a deficit above £1 billion, with more than 20,000 roles needing to be cut just to balance the books.

The inverse care law

Strip cash from hospitals with the longest waits, the hardest caseloads and the smallest budgets, and the women least well served to start with will find their care worse, not better. That is what the inverse care law, first set out in 1971, predicts: good healthcare is scarcest where it is needed most, and scarcer still where market pressures dominate.

The doom loop is easy to foresee: a struggling hospital loses funding because its ratings are poor; it cannot adequately recruit or retain gynaecologists; waiting times lengthen; ratings drop further; more money disappears.

The question, then, is what would actually work.

Do other policies show more promise? Between 2023 and 2025, England funded women’s health hubs – one-stop community clinics offering contraception, menopause care and help with period problems and pelvic pain under one roof. Evaluations led by Rand Europe reported shorter waits, fewer hospital referrals and better patient experience. Dedicated funding for the hubs has not been renewed and many face being scaled back or closed in 2026.

A deeper fix might be found within medicine itself. Much of modern practice was built around male bodies, and women were largely excluded from clinical trials until the 1990s, leaving drug doses, pain research and disease criteria mostly calibrated around men.

The textbook heart attack symptom of crushing chest pain is one more likely to be experienced by men. This is one reason why women are around 50% more likely than men to be given the wrong diagnosis when having a heart attack.




Read more: Are heart attack symptoms sexist?


Women reporting abdominal pain are routinely sent through slow general clinics rather than dedicated gynaecology services. Resetting those defaults costs relatively little and tackles the bias at its source.

The reality is that this problem is bigger than any one policy can fix. Women are dismissed in consultations, wait years for routine gynaecology and are misdiagnosed for conditions from endometriosis to heart attacks – issues that need clinical time, training and steady staffing, not budget cuts.

Obstetrics and gynaecology has among the worst vacancy rates in the English health service. One in five obstetricians and gynaecologists plans to leave the NHS within five years and nurses are quitting the profession at record rates. A scheme that cuts the budgets of struggling hospitals threatens to speed the exodus driving the problem in the first place.

“Patient power payments” has the appeal of borrowing the language of consumer choice. It reframes a cultural failure as a marketplace one. But a marketplace will always reward the customers with the most power to walk away, and punish the hospitals left to look after the women with the least.

The Conversation

Philip Broadbent receives funding from the Wellcome Trust 223499/Z/21/Z

ref. Fining hospitals for medical misogyny won’t help women – it will hurt them – https://theconversation.com/fining-hospitals-for-medical-misogyny-wont-help-women-it-will-hurt-them-281079

From floppy discs to Claude Mythos, how ransomware grew into a multibillion-dollar industry

Source: The Conversation – UK – By Anja Shortland, Professor in Political Economy, King’s College London

jijomathaidesigners/Shutterstock

When evolutionary biologist Joseph Popp coded the first documented piece of ransomware in 1989, he had little idea it would become a major criminal business model capable of bringing economies to their knees.

Popp, who worked for the World Health Organization at the time, wanted to warn people about the dangers of ignoring health warnings, poor sexual hygiene and (human) virus transmission.

He sent out 20,000 floppy discs that, when loaded, flashed up a demand for money to regain files that had supposedly been encrypted (in fact, it was just their file names). He was later arrested and charged with 11 counts of blackmail, but declared mentally unfit to stand trial.

In 1996, two Columbia University computer scientists published a paper explaining how criminals could use more sophisticated versions of Popp’s scheme to mount large-scale extortion operations. At the heart of this was malicious software that could be used to encrypt, block access to or steal a person or organisation’s files and data.

However, two preconditions still had to be met for ransomware to become a feasible criminal business: communication channels that were difficult to monitor, and a payments process outside financial regulation.

The Tor protocol, released by US intelligence services to protect their covert communications, solved the first problem in 2004. Cryptocurrencies solved the second – in particular, when bitcoin cash machines started appearing in North American cities from 2013.

Today, artificial intelligence makes coding malware and crafting convincing phishing emails in any language simple. And the latest model in Anthropic’s AI system, Claude Mythos, recently proved more effective at hacking into computer systems than humans.

As an expert in extortive crime, I am increasingly concerned about public and political apathy towards the threats posed by ransomware. To better understand these threats, it’s worth tracing ransomware’s evolution over the past two decades – and how improvements in computer security and law enforcement, plus changes in data regulation, have prompted new criminal strategies each time.

Cut out the middlemen

The first generation, which came to global attention in the mid-2010s, was known as “commodity ransomware”. A pioneering example, Cryptolocker, was developed by Russia-based hackers who infiltrated hundreds of thousands of computers, seeking to cut out the middlemen previously needed to commit financial fraud. They proved that a large majority of their victims would happily pay a small ransom to restore data that had been locked by their malware.

As both competent and incompetent hackers piled into this new market, victims shared information about rogue operators and put them out of business. This led to the second generation of ransomware such as Ryuk, which emerged in 2018.

In this phase, criminals abandoned the indiscriminate “spray-and-pray” approach in favour of targeting individual cash-rich businesses. They would set an individual ransom, negotiate with the company, and even offer to help with decryption if paid. Fast-rising ransoms more than compensated for this increased administrative effort.

In response, many companies began investing in multi-factor authentication, better threat monitoring, advance warning systems and software patches for known vulnerabilities.

However, these security benefits were soon offset by the impact of COVID on work practices across the world. The pandemic led to widespread remote working, with many people using unsecured devices and connections that were vulnerable to cyber-attack.

A multibillion-dollar industry

The next ransomware innovation was driven by the emergence of back-up systems that enabled companies to restore encrypted files without the criminals’ help. This was coupled with the emergence of tighter data privacy regulation such as GDPR in Europe and the UK.

Invented in 2019, third-generation ransomware weaponised these regulations, which threatened firms with massive fines if confidential data about clients or staff was revealed. The criminal gangs now sought out and exfiltrated an organisation’s most sensitive files, then threatened to publicise them through dedicated dark web leak sites.

This so-called double-extortion model – encrypting an organisation’s data while threatening to make it public – brought many businesses back to the negotiation table.

Ransomware had become a multibillion-dollar industry – with the Conti gang, sheltered by Russia and employing hundreds of people, among the key players setting new records for ransomware demands. Its attacks on critical infrastructure and hospitals saw it sanctioned by the UK government in 2023.

Video: BBC News.

This new approach forced many governments to row back on imposing hefty fines for data breaches, since many were the result of criminal attacks. Meanwhile, new initiatives by law enforcement – supported by the private sector – targeted and broke up the largest and most egregious ransomware gangs.

Today’s fourth generation of ransomware, building on the latest AI technology, looks nimbler and slimmed-down in comparison. Anyone who gains access to a network can lease weapons-grade malware on the dark web without forming long-term ties with a particular gang.

Advanced AI-based hacking tools make ransomware accessible to many more criminals and politically motivated hacktivists. And around one-quarter of breaches still result in ransom payments. For criminals sheltered by their governments, only the digital infrastructure is at risk of being taken down by western law enforcement.

Lessons not learned

While coverage of Claude Mythos suggests even the most sophisticated cyber defences could now be vulnerable, the troubling reality is that many individuals and organisations are still using out-of-date, unpatched or only partially upgraded software. This means even early-generation ransomware techniques are still lucrative.

While Popp sent out his floppy discs to promote better sexual hygiene, today’s poor cyberhygiene is leaving many public and private networks open to malware attacks. The intended lesson of his original ransomware caper – be vigilant and properly heed health warnings – has still only been partially learnt in the digital world.

Many western societies appear to have grown accepting of criminals leeching off business conducted on the internet. Not even a steady stream of human fatalities, caused by attacks on hospitals and medical providers, has generated the level of response required to stamp out this dangerous threat.

The hope that governments sheltering cybercriminals can be encouraged (or forced) to stop them targeting critical national infrastructure appears increasingly fragile amid current geopolitical tensions. At all levels of society, we need to get smarter about cyber defence.

The Conversation

Anja Shortland does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment. Anja’s latest book, We Know You Can Pay a Million: Inside the Dark Economy of Hacking and Ransomware, is published by Profile Books.

ref. From floppy discs to Claude Mythos, how ransomware grew into a multibillion-dollar industry – https://theconversation.com/from-floppy-discs-to-claude-mythos-how-ransomware-grew-into-a-multibillion-dollar-industry-281000

Venice is sinking – we analysed every plan to save it, and none would preserve the city as we know it

Source: The Conversation – UK – By Robert James Nicholls, Professor of Climate Adaptation, University of East Anglia

Venice has co-existed with the sea throughout its 1,500-year history, perhaps better than any other city on earth. Yet over the past century it has flooded increasingly often, as the sea rises and the city itself sinks under its own weight.

We recently published an academic analysis of the various options Venice has to ensure its long-term survival.

Our study compares a range of possible strategies against different degrees of sea-level rise. These include maintaining the current system of mobile barriers, building ring dykes to separate the city from the lagoon in which it sits, enclosing the whole lagoon within a much larger defence system, or – in the most extreme case – relocating much of the city and its population inland.

Each option becomes relevant at different points as sea levels rise. The city’s flood defences have already been upgraded substantially, at a cost of €6 billion (£5.2 billion). This involves a series of huge steel gates attached to the seafloor, known as the Mose barriers. When raised, these barriers effectively seal off the Venetian Lagoon from the wider Mediterranean Sea.

Steel barriers rising from the water
Mose barriers sealing Venice off from the sea.
Zaltrona / shutterstock

The Mose barriers mean the flood risks are currently manageable, but the frequency of their use is rising. In the first five years of use (between 2020 and 2025) the system was closed for 108 high waters, while in the first two months of 2026 it was activated 30 times. And as sea levels continue to climb, it would need to be closed more and more often – potentially for weeks at a time each year.

This creates a series of problems. Frequent closures would disrupt shipping and tourism, alter the lagoon’s ecology, and would require major new systems for sewage treatment and huge pumps to maintain lagoon water levels. A system designed for occasional protection risks becoming a semi-permanent barrier – something it was never intended to be.

With additional measures – such as raising the city by injecting sea water into the rocks deep underground to partially reverse the subsidence – these barriers could remain effective for some time, perhaps even after a metre of sea-level rise.

But even under relatively low levels of warming, the sea is projected to keep rising for centuries, eventually pushing beyond what the barriers can handle.

At that point, more radical measures may be necessary. Building a ring of dykes around the city would physically separate Venice from the lagoon, and may be needed by the end of this century.

Aerial shot of Venice surrounded by dykes
Venice in the 2100s? An AI-generated impression of the city surrounded by dykes.
The Conversation / Gemini, CC BY-SA

A fully enclosed lagoon – protected by a much larger “super levee” and supported by continuous pumping – could protect the city from up to 10m of sea level rise, but at severe cost to the living lagoon.

The only other option is to relocate the city to safer ground. This may be necessary beyond about 5m of sea-level rise, which is projected to occur after 2300.

Difficult choices ahead

The financial costs of these choices are substantial. We used the costs of Mose and other previous engineering projects (adjusted for inflation to 2024 prices) to estimate the cost of each adaptation strategy.

Diagrams of Venice's flood protection options
The strategies described in this article, with an additional line showing superlevees (part of the closed lagoon strategy).
Lionello et al (2026) / Scientific Reports, CC BY-SA

The dykes could cost between €500 million and €4.5 billion. Closing the lagoon with a super levee could initially cost more than €30 billion, and relocating the city could cost up to €100 billion.

But costs aren’t the only issue. How do you even put a price on the cultural value of Venice? Especially as none of these measures will be able to sustain the Venice we see today in the long-term. Adaptation can manage change up to a certain point – beyond that, we are no longer preserving the present. Rather, we are designing a fundamentally different future.

Our analysis shows there is no optimal adaptation strategy. Any approach involves trade-offs between the wellbeing and safety of Venice’s residents, economic prosperity, the future of the lagoon’s ecosystems, heritage preservation, and the region’s traditions and culture. In addition, many of these measures can take decades to fully implement, so early planning is essential.

At least Venice is thinking about these things in a long-term way. Most vulnerable coastal areas are not. In fact, many continue to attract businesses and people, even as rising seas gradually narrow the range of viable long-term options.

With its long and unique history, Venice has particular challenges, but all low-lying coastal areas should recognise the danger of long-term sea-level rise and start preparing now.

The Conversation

Robert James Nicholls received funding from the Horizon 2020 program of the European Commission through the CoCliCo project (#101003598).

Piero Lionello has received funding from the Italian Ministry for Education and Research (PNRR-HPC Center). He is a member of the Scientific Committee of the International Centre for Climate Change Research and Studies, and co-coordinator of the MedECC network (Mediterranean Experts on Environmental and Climate Change).

Marjolijn Haasnoot does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Venice is sinking – we analysed every plan to save it, and none would preserve the city as we know it – https://theconversation.com/venice-is-sinking-we-analysed-every-plan-to-save-it-and-none-would-preserve-the-city-as-we-know-it-280891

With talk of closer EU alignment, the UK is signalling to Europe that it’s a partner worthy of trust

Source: The Conversation – UK – By Ursula F Ott, Professor of International Business, Nottingham Trent University

PM Keir Starmer and European Commission President Ursula von der Leyen have reset the UK-EU relationship – but UK alignment would take things a step further. Alexandros Michailidis/Shutterstock

It is now almost a decade since the UK voted for Brexit and since the tariffs of US president Donald Trump’s first term increased global trade frictions. Brexit removed the UK from the European single market for goods and services. Now though, the country is proposing a pivot back towards alignment with EU regulations.

What could not have been widely predicted back in 2016 was the COVID pandemic, or a war on European soil. The UK has been exposed to these shocks without the EU support system. So what may once have been impossible to imagine is now on the cards: adopting EU single market rules under new UK legislation.

In May 2025, the UK and EU reached a new trade agreement, paving the way for both sides to move closer on their economies and business. This was hastened by unpredictable US trade tariffs and a weakening of the US-UK-EU relationship. In addition, it has been estimated in a comprehensive study that Brexit has reduced the size of the UK economy by 6-8%.

Politically, the approach announced by the UK prime minister, Keir Starmer, is a courageous step. UK legislation would allow the country to adopt new EU laws without the need for parliament to vote each time. But any plan is certain to provoke strong opposition from the Conservatives and Reform UK.

However, it is a signal of the seriousness of the UK’s intentions to move closer to the EU by adapting to its regulations and giving up independence from EU law. That is a costly move for the UK in terms of its credibility, but the U-turn should reinforce its commitment to the EU.

But beyond this, there are three clear benefits to the UK.

  1. The EU is built on rules and regulations that guide the bloc’s labour market, trade and security systems. Alignment would clearly help UK businesses, consumers and individual workers to manoeuvre within these systems.

  2. By breaking from the single market, the UK chose a costlier approach to trading and investing across the EU border. Aligning regulations would reduce cross-border bureaucracy.

  3. The EU is looking for new trading partners after supply chain disruptions from COVID and the Ukraine war – not to mention the current impact on oil and gas supplies. The EU does not need to rely on the UK, but a new direction in the relationship could reduce the threat of supply chain disruption in future.

A better deal for consumers?

So what could this mean for UK businesses and consumers? Food producers trading within the UK-EU zone would have a quicker turnaround of their fresh produce. This would reach shop shelves in the UK and EU more quickly, giving shoppers better-quality fresh foods.

Reducing the complex paperwork and export health certificates required at borders would allow a free flow of fresh food even between Great Britain and Northern Ireland (which remained part of the single market). This trade has been disrupted since Brexit, affecting food producers through paperwork and border delays, and undermining food security.

Border checks, paperwork and adapting to legal requirements are expensive and so increase food prices (and with that, inflation). Bringing trade between the EU and the UK closer could reduce these costs, and should also allow producers to benefit more from global value chains.

US tariffs are at their highest levels since the second world war, and the knock-on cost effects of supply chain disruption in the Middle East make a strong case for strengthening ties between neighbours.

Going forward, it will be resilience rather than efficiency in trade that will be important for both businesses and nations. Both will want to be able to reconfigure networks at speed. If inflation rises due to product shortages, governments have limited fiscal space to offer direct support to citizens (which would mean increased levels of spending), or to cut taxes.

Another benefit could come in the form of foreign direct investment into the UK from overseas. In 2025, this began shifting from low-cost developing countries towards capital-intensive and technologically-driven investments in developed countries – and especially in the EU (Germany, Italy and France).

Alignment with EU regulation could give investors more confidence to commit to the UK. Foreign direct investment in renewable energy and AI products, for example, would benefit both the UK’s workers and its consumers.

This is a time of new geopolitical alliances, cooperation and blocs. Trading and investment options could help secure economic, political and societal stability in a volatile world. So far, this is a relatively small step by the UK – but starting to align to EU regulations could ease a complex relationship.

The Conversation

Ursula F Ott does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. With talk of closer EU alignment, the UK is signalling to Europe that it’s a partner worthy of trust – https://theconversation.com/with-talk-of-closer-eu-alignment-the-uk-is-signalling-to-europe-that-its-a-partner-worthy-of-trust-280961

Our Freedom: Then and Now explores what freedom means to Brits, 80 years after the second world war

Source: The Conversation – UK – By Mark Rawlinson, Associate Professor History of Art, University of Nottingham

Marking the 80th anniversary of the end of the second world war, Our Freedom: Then and Now is a nationwide photography project exploring how communities understand freedom.

The show opened at London’s Southbank Centre in April and is now touring the UK. This exhibition offers an alternative perspective to the idea that this is currently a nation divided. From the Highlands of Scotland to libraries in southwest England, it asks a simple yet powerful question: what did freedom mean in 1945, and what does it mean now?

The Socially Engaged Photography Network sent 22 photographers to work closely with community projects, ensuring the photographs were created in collaboration with participants. This approach is distinct from traditional photojournalism, which often speaks about rather than with the people photographed.

By spending time in places such as Maesteg Town Hall and libraries in Stornoway, artists including Johannah Churchill, Sam Ivin and Leticia Valverdes have focused on making photographs that portray the viewpoints of the people involved.

Projects marking the 80th anniversary of the end of the second world war can easily lapse into cliche, but Our Freedom: Then and Now avoids sentimentality. In fact, part of its power lies in engaging with the complexities of contemporary society and culture. It avoids simple slogans and instead the photographs foreground thoughtful reflections on conflict and the ongoing importance of finding common ground and sustaining connection.

As Stephanie Peacock, the UK’s minister for sport, tourism, civil society and youth, said at the launch, the project comes at an important time. With fewer people having direct memories of the war, sharing their reminiscences alongside the voices of schoolchildren and young artists creates a conversation between those who remember 1945 and those who will shape 2045.

This exchange fosters two forms of understanding: participants learn about themselves, and viewers learn about others. According to Simon Mellor, Arts Council England’s deputy chief executive, these works bring local experiences into national conversations, offering a valuable space for dialogue in difficult times.

This was certainly my experience. I left the gallery surprised by the many ways freedom is experienced and understood across the UK. Whether it’s a veteran in Wolverhampton or a student in Hartlepool, the cumulative effect of individuals’ thoughts about freedom and community was fascinating and thought-provoking.

The exhibition is grounded by poet laureate Simon Armitage’s specially commissioned poem, Freedom Road. Echoing the participant photographs, the poem shifts its focus from grand images of liberation to the simple, everyday actions that make up real freedom. He writes:

You can’t dig up freedom like a potato

from the verges of Freedom Way, or pan it

from Freedom Beck like inklings of gold;

it won’t be delivered to Freedom Avenue

gift-wrapped in silver string.

Armitage suggests that freedom is most real when it goes unnoticed, such as the ability to disagree with a neighbour, walk where we want, and live as we choose. This idea aligns with the exhibition’s main goal: to show that freedom is something we live every day – not just a piece of history to remember now and then, but something current and vital.

The exhibition on tour

The exhibition’s tour is as ambitious as the work itself. After starting at the Southbank Centre, it travels to places like Eden Court in Inverness, the McKechnie Institute in South Ayrshire and the Strand Arts Centre in Belfast, bringing the art back to the communities that helped create it.

This return is important because it shows that art doesn’t just happen in big cities; it grows from local libraries and community centres and derives its power from these regional identities. In 2025, more than 530,000 people took part in the events and performances leading up to this exhibition.

By steering clear of easy sentimentality, Our Freedom: Then and Now does something more meaningful. It offers an honest look at how we live together. The exhibition recognises the difficult parts of our shared histories while reminding us of our shared humanity.

In a nation that can feel divided, Our Freedom: Then and Now uses photography to highlight what people have in common and where we might work harder to find those commonalities. It’s a reminder that, even though freedom requires work, it is not only worth it but necessary.

The Conversation

Mark Rawlinson does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Our Freedom: Then and Now explores what freedom means to Brits, 80 years after the second world war – https://theconversation.com/our-freedom-then-and-now-explores-what-freedom-means-to-brits-80-years-after-the-second-world-war-280955

Who is calling the shots in Iran?

Source: The Conversation – UK – By Andreas Krieg, Associate Professor, Defence Studies Department, King’s College London

Following the last round of talks between the United States and Iran in Islamabad, Iran’s foreign minister and negotiator Abbas Araghchi declared in a post on X on April 17 that the Strait of Hormuz was “completely open”. This came after he also signalled that his government could be flexible over the issue of nuclear enrichment as well as Iran’s support for its proxies in the region.

Then came an abrupt correction. Mohammad Bagher Zolghadr, a former commander in the Islamic Revolutionary Guards Corps (IRGC) who was recently appointed as secretary of the Supreme National Security Council, is understood to have complained to the IRGC, submitting a report that criticised Araghchi for “deviation from the delegation’s mandate”.

The negotiating team was called back to Tehran. Araghchi was attacked by state-run media which said his post had “provided the best opportunity for Trump to go beyond reality, declare himself the winner of the war and celebrate victory.” And the Strait of Hormuz was declared closed.

This episode demonstrates the new reality in the Islamic Republic, where the IRGC increasingly calls the shots in all matters of statecraft and government. The rest of the state is a façade at most.

Over the six weeks of war, Iran’s former leadership has been decimated: the supreme leader, Ali Khamenei, was killed in a US strike on the first day of US and Israeli attacks. Many of his senior colleagues have also been killed. Iran is no longer best understood as a state with a powerful militia. It has become, more precisely, a powerful militia with a state – a political order with the IRGC at its core.

The other traditional centres of power – the government and the clergy – have effectively been relegated to mere front organisations. Amid the fog of war, even the new supreme leader, Mojtaba Khamenei, appears merely as a legitimising ornament. In any case, Khamenei is reported to have been severely injured in the attack that killed his father and is apparently taking no part in government.

So who is running the country? The answer points unmistakably to the IRGC and its leader, Ahmad Vahidi.

Guardians of the revolution

The IRGC was created after the 1979 revolution, precisely because Ayatollah Ruhollah Khomeini and his allies did not trust the conventional state apparatus to defend the revolution. Over time it grew beyond its role as guardians of the revolution into an all-encompassing, all-channel network. It became a military, an intelligence service, an economic conglomerate and a regional expeditionary network. Its internal security force, the Basij, gave it an arm of mass social control inside Iran. The Quds Force was set up to export the revolution through Iran’s proxies in Lebanon, Iraq, Syria, Yemen and beyond.

Far from destroying this architecture, sanctions deepened it. They led to the creation of front companies linked to the IRGC doing illicit deals and operating circuits of patronage that enriched those closest to the centre of power. What emerged was a parallel state that gradually outgrew the formal one.

The IRGC is organised as a network with a core and a periphery. Its central hub decides strategy. This is surrounded by a network of decentralised cells capable of operating with a high degree of autonomy. This is called Iran’s “mosaic defence doctrine”. And it was built to operate precisely the way it is now: to keep fighting amid attempts at decapitation and disruption.

A new leader emerges

After IRGC chief Mohammad Pakpour was killed on the opening day of the conflict, Ahmad Vahidi, a former interior minister and a founding member of the IRGC, emerged to take his place. Initially appointed in an emergency capacity, he has since consolidated effective control as the civilian presidency has been hollowed out.

With the new supreme leader apparently incapacitated and the clergy sidelined, Vahidi and his group of allies – IRGC commanders and security council hardliners such as Ali Akbar Ahmadian and Mohammad Bagher Zolghadr – have set the mandate and red lines for the ceasefire talks.

The IRGC’s red lines are clear: it will not surrender uranium enrichment altogether; it wants to preserve its missile programme and the axis of resistance; and it wants sanctions to be lifted and access to Iranian assets overseas that are presently frozen. Room for negotiation exists only on technical details about enrichment levels, timelines for lifting sanctions or the language of any deals that are agreed.

In times of war, states tend to centralise as civilian institutions shrink. Hard men tend to rise, especially after many of the influential political pragmatists, such as Ali Larijani, the former secretary of the security council, were deliberately taken out by Israel.

The IRGC’s dominance was not suddenly conjured by this war, but prepared by decades of institutional entrenchment, economic capture and delegated coercion. This military dictatorship in the making needed the war to consolidate its influence over competing nodes in the network – most importantly the clergy.

This has profound consequences for the negotiations. These are not straightforward bargaining sessions between statesmen: Washington’s real estate moguls turned negotiators are speaking to Iranian counterparts who are on a short lead held by the IRGC. Progress in negotiations should not be judged by what Iran’s diplomats say in public, but by what the Guard allows to be implemented in practice.

Trump and Israel’s failed decapitation strategy leaves a potent system in place that feels emboldened by the desperation in the White House to find a diplomatic off-ramp. To think that this war-hardened system of hardliners will capitulate is wishful thinking.

The past few days have made it clear that the IRGC is now a militia with a state, using the civic and military institutions of the Islamic Republic as its outer skin. While there is room for negotiation to build a mutually acceptable deal, the US administration needs to be realistic about where the IRGC’s red lines are and what cards it actually has to play against a resilient network with a very high threshold for pain.

The Conversation

Andreas Krieg does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Who is calling the shots in Iran? – https://theconversation.com/who-is-calling-the-shots-in-iran-281066

Dan Dare is blasting off again: why, as a scientist, I’m excited for the comics’ return

Source: The Conversation – UK – By Elizabeth Stanway, Reader in Astronomy and Astrophysics, University of Warwick

Dan Dare: Pilot of the Future was a groundbreaking science fiction comic serial, first appearing in the UK comic The Eagle in 1950. Now, more than 75 years later, a reinvention of the series is underway, with the first new graphic novel – written by Alex de Campi (Bad Girls and Madi) with art by Marc Laming (Marvel’s Star Wars) – set for release later this year.

As a science fiction enthusiast and a scientist, I am excited to see what it will be like. I’m sure I’m not alone, as a number of scientists – including the late astrophysicist and cosmologist Stephen Hawking and planetary scientist Colin Pillinger – cited Dan Dare’s exciting vision of the future as instrumental in their decision to pursue science.

Daniel McGregor Dare is an officer in the Britain-based Interplanet Space Force (ISF). Faced with overpopulation and starvation on Earth, the ISF is tasked with exploring the possibility of crop production or trade on Venus. After initial problems, Dan Dare and a small group of colleagues are able to reach the surface of the planet. Once there, they find a habitable world with two native species: the friendly Therons, and the inimical Treens, with the latter led by their “super-scientist” the Mekon. Defeating the Mekon, and making arrangements for food supply with the Therons, Dare opens up the solar system, and ultimately the wider galaxy, to humanity.

While the concept of Dan Dare originated with a clergyman, Marcus Morris, its formative years and storylines were shaped by a very different man. Writer and artist Frank Hampson was known for the attention he paid to the science, working from detailed models and reference photographs. He gave thought to plausible design and stayed abreast of developing vehicle technologies and concepts, while also working with a scientific advisor.

In an early story, “The Red Moon Mystery” (serialised in The Eagle in 1952), for instance, he had the character Professor Peabody explain planetary orbits, magnetic fields and spectroscopic biosignatures to a young audience. He also drew a sequence with accurate representations of the Royal Observatory at Herstmonceux in Sussex, and a character closely based on the astronomer royal of the time, Sir Harold Spencer Jones.

This level of precision both added to the verisimilitude of his stories and appealed to an enthusiastic audience that saw a bright future in space exploration, an audience that included budding scientists Hawking and Pillinger.

Sadly, the level of scientific accuracy in the series declined after Hampson’s departure, with writers introducing more bizarre aliens and unexplained interstellar travel. But the series’ engagement with technical accuracy and scientific plausibility, there from the very beginning, continues in many ways, and is part of the reason for its longevity and continued relevance.

A new Dan Dare

Despite its many reinventions over the decades, much of this premise has remained unchanged. Dare has always represented humanity’s best, and is typically shown as an optimistic exemplar of bravery, chivalry and honour. The Kickstarter page for the new Dan Dare: First Contact novel makes it clear that the current creative team respects the character’s origins. As the new reboot’s writer Alex de Campi says:

if you are already a Dan Dare fan, there’s a ton of references to the classic stories as well as a sincere respect for Frank Hampson’s legacy from our entire creative team.

But like the 1990s graphic novel written by Scottish comic writer Grant Morrison or the 2010s audio dramas made by B7 Productions, there will be some changes in the story. For instance, these iterations have given more agency to Dare’s female scientist colleague Professor Jocelyn Peabody. They have also typically been darker and more cynical regarding the political or commercial interests funding human spaceflight.

Cover of Dan Dare
Dan Dare is back!
Wikimedia

The new Dan Dare team also acknowledge Hampson would have expected changes in scientific and contextual representation:

In First Contact, the science is updated, making Dan’s world one we can understand from our current point of view: a world of bickering oligarchs, broken nations, and climate disaster. The stakes are immediate: humanity is only just getting faster-than-light travel.

As I’ve discussed in my own work on the relationship between science and science fiction, the stories have always reflected our changing understanding of solar system habitability. Already by 1950, scientific studies were making it clear that Venus was uninhabitable, although popular culture and even school textbooks often retained the older visions. As a result, more recent versions have tended to gloss over issues such as the origin of the Treens, sometimes relocating their civilisation to cloud cities high in the atmosphere of Venus.

The changing science shouldn’t be a surprise: the role of science fiction has always been to mirror and extrapolate as much from the sociopolitical concerns of a time as from its technology and science. Good science fiction has always balanced accurate science with fine storytelling and a critical eye towards social trends and their logical extremes. The new Dan Dare project will do so for a new audience, adding to a remarkable eight-decade long record of popular engagement with space science.

The Conversation

Elizabeth Stanway does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Dan Dare is blasting off again: why, as a scientist, I’m excited for the comics’ return – https://theconversation.com/dan-dare-is-blasting-off-again-why-as-a-scientist-im-excited-for-the-comics-return-281053

Why understanding autism means looking beyond spoken language – two autistic researchers of communication explain

Source: The Conversation – UK – By Lou Harvey, Associate Professor of Education, University of Leeds

The idea of the “autism spectrum” is widely used in diagnosis, education and public discussion. First developed by the psychiatrist Lorna Wing in the 1980s, the term was intended to reflect the wide range of autistic experiences and needs.

But a growing body of research is questioning whether the concept still helps us understand autistic lives.

We are autistic researchers of communication, education and neurodiversity. Our research focuses on paying attention to how people express knowledge and experience when communication does not fit mainstream expectations, particularly when it goes beyond spoken language.

Across this work, one finding is consistent: both autistic and non-autistic people communicate meaningfully in various ways. But this variety is often overlooked or misunderstood by traditional models of autism.

These models tend to come from cognitive science and clinical practice, where autism is defined primarily as a communication “disorder”. They suggest that autistic people have difficulty speaking, maintaining eye contact, or engaging in back-and-forth conversation.

Diagnosis is typically based on external observation by doctors, rather than on autistic people’s own accounts of their experience.

When different perspectives are dismissed

Critics argue that this approach reflects what is known as “neuronormativity”. This is the belief that there is a standard or “normal” way to communicate, think and behave. It is rooted in an assumption that language, especially speech, is what makes us fully human. Therefore, when people communicate differently, their knowledge can be treated as less valid or harder to access.

Autistic scholar M. Remi Yergeau has argued that autism has often been framed as a “narrative condition” by cognitive scientists. In other words, it is assumed that autistic people are unable to express meaningful self-knowledge.

If someone’s way of communicating is already judged incoherent or unintelligible, their perspective can easily be dismissed. This means autistic people are not considered to be reliable sources of knowledge about their own lives.

Our research, and that of other autistic scholars, challenges this assumption.

Communication is more than words

There is increasing evidence that autistic people express themselves in a wide range of ways that are not always recognised as communication. Chris’s research, for example, shows how autistic people often communicate through deep engagement with particular interests. These interests can be a way of expressing identity, connection and meaning, rather than simply being a “symptom”.

Many autistic people also use rhythmic or repeated movement and sound – often referred to as “stimming” – or repetition of words and phrases, known as echolalia. These forms of expression can communicate comfort, distress, humour, joy or focus. They can also provide sensory regulation or pleasure. They may not fit conventional ideas of language, but they are meaningful.

However, because of the deeply ingrained belief that “real” communication must be verbal, these forms of expression have received little attention within mainstream science. Yet they point to something important: communication and knowledge are not just about words. They are also about feeling, and the things we cannot say.




Read more: What autistic people – and those with ADHD and dyslexia – really think about the word ‘neurodiversity’


Research by the neuroscientist Antonio Damasio has shown that emotion is not separate from thinking but fundamental to it. Feelings shape attention, decision-making and understanding. In this sense, feeling is part of how we know the world. If we are to take people’s knowledge about themselves seriously, we must pay attention to it.

Our research builds on this idea, showing that communication and knowledge are not limited to what can be clearly spoken or measured.

Aerial view of crowd people connected by lines
Language has limits.
Master1305/Shutterstock

From diagnosis to paying attention

Clinical diagnosis remains necessary because it enables access to support and services. But diagnosis alone may not fully capture how autistic people experience and communicate their needs.

We suggest a shift in emphasis. Rather than asking, “What is wrong with this person?” we may ask, “How can we pay attention to this person?”

Paying attention means taking feeling seriously as a way of knowing and recognising that language has limits. Research by Lou and colleagues has found that when spoken language is not available or not sufficient, other forms of interaction – such as art, play, care and simply being with others – can become more meaningful. These forms of communication are often harder to observe or measure than language, which may explain why they have received less attention in traditional research.

But they are fundamental to how many people, autistic or not, experience connection and understanding. Recognising this has practical implications. It means that decisions about education, support and policy are shaped by how autistic people actually experience the world.

In schools, this could lead to better identification of barriers and more responsive teaching practices. In policy, it could inform more effective approaches to special educational needs provision, diagnosis and employment support.

More broadly, it suggests that expanding how we understand communication could benefit everyone. All of us – regardless of whether we are autistic – have experiences that are difficult to express in words.




Read more: Why it’s time to rethink the notion of an autism ‘spectrum’


The concept of the autism spectrum was originally intended to reflect diversity. But if it continues to rely on narrow assumptions about communication and knowledge, it may not fully capture that diversity in practice. Our work is part of a growing area of research exploring how to better recognise different forms of expression and understanding, including those that fall outside conventional definitions of language.

Taking these forms seriously does not mean abandoning science. It means broadening what we consider to be valid evidence, and who we recognise as knowledgeable about autistic experience. If we do this, we may find that approaches designed to support autistic people can support many others too.

The Conversation

Lou Harvey has received funding from the UK Arts and Humanities Research Council, the former Higher Education Funding Council for England, and the Society for Research into Higher Education.

Chris Bailey received academic funding from the UK Literacy Association (UKLA), to a total of £1,400, for the Ruling Passions project that is mentioned in this article.

ref. Why understanding autism means looking beyond spoken language – two autistic researchers of communication explain – https://theconversation.com/why-understanding-autism-means-looking-beyond-spoken-language-two-autistic-researchers-of-communication-explain-278633

How school grades can affect mental health – particularly for girls

Source: The Conversation – UK – By Anna Linder, Researcher in Health Economics, Lund University

LightField Studios/Shutterstock

Schools increasingly rely on testing, grading and performance accountability. In England, Ofsted inspections and school league tables sharpen the focus on measurable performance. Similar developments have taken place in Sweden, where repeated reforms have introduced earlier and more detailed assessments.

Performance-driven school environments shape young people’s wellbeing. Yet despite frequent reforms to evaluation systems, their psychological consequences rarely take centre stage in policy debates.

Our new study connects these trends with rising youth mental health issues. Our research shows that earlier and more formal grading can increase clinically diagnosed mental health problems, particularly among girls.

Our research examined a Swedish reform introduced in 2012 that moved the start of formal grading from grade eight (around age 14) to grade six (around age 12). This meant official grades and clearer signals of relative performance arrived two years earlier than before.

To estimate the effects, we compared children born just before and just after the reform cut-off. Because exposure depended strictly on date of birth, students on either side were similar in background but differed in whether they received earlier grades. We also accounted for certain underlying trends across this time period, such as an overall increase in mental health diagnoses over time. Comparing cohorts in this way allows us to isolate whether earlier grading itself led to changes in mental health diagnoses.

Our analysis draws on nationwide linked education and health registers covering more than 520,000 children born between July 1992 and June 2000. We examined psychiatric diagnoses recorded in outpatient and inpatient care during the year students entered grade nine (the end of lower-secondary school).

Earlier grades affect girls’ mental health

Earlier grading increased diagnoses of depression and anxiety among girls, with the largest effects among girls whose academic achievement ranged from low to average. Effects for boys were smaller and less consistent.

Among girls, the share diagnosed with depression or anxiety increased from 1.4% to 2.0%. While the absolute change (0.6 percentage points) may appear modest, psychiatric diagnoses at this age are relatively uncommon. The change represents roughly a two-fifths increase compared with before the reform.

A young teen girl with her face in her hands being comforted by adults
Depression and anxiety were shown to be more common in girls who received grades at earlier ages.
SeventyFour/Shutterstock

Our findings point to academic pressure and social comparison as likely reasons for this increase in mental health problems. Formal grades make performance more visible at a younger age, clearly signalling how a child ranks among their peers. At a stage when young people’s understanding of themselves is still developing, this may heighten their sensitivity to comparison and perceived failure.

One plausible explanation is greater sensitivity to performance feedback among girls. In earlier research, we found that when girls received grades more favourable than their measured performance would predict, their mental health improved. This suggests they may be particularly responsive to evaluative feedback, and therefore more vulnerable when grading intensifies.

Wider consequences

Our findings indicate that academic pressure may contribute to gender gaps in adolescent mental health. If girls are more likely to internalise the pressure and stress of academic evaluation, earlier grading may unintentionally widen the well-documented existing gender disparities in mental health.




Read more: Making sense of the widening gender mental health gap: what teenage girls told us


We do not argue that grading is inherently harmful. Grades can motivate, guide learning and inform parents and teachers. But timing and design matter. When evaluation becomes more formal earlier in schooling, unintended psychological costs can emerge alongside academic goals.

As grading systems continue to evolve, questions of timing and intensity deserve careful thought. Schools are not only institutions for measuring performance, but environments where young people form their identities. Designing education systems that support both learning and healthy development requires taking both aims seriously.

Education policy inevitably involves trade-offs. Systems designed to measure and raise standards also shape students’ daily experience. Our findings suggest that when policymakers move formal evaluation to younger ages, they should weigh mental health impacts alongside academic benefits.

Accountability policies should consider psychological effects. This does not mean abandoning grading, but evaluation systems should be sensitive to students’ developmental stage and accompanied by relevant support that helps students interpret feedback constructively.

Students respond differently to evaluation. Reforms that work well for some may create strain for others, particularly those already vulnerable to performance pressure. Monitoring wellbeing alongside academic outcomes can help identify unintended consequences early.

The Conversation

Anna Linder receives funding from the Swedish Research Council for Health, Working Life and Welfare and the Public Health Agency of Sweden.

Gawain Heckley currently receives funding from the Swedish Research Council, the Swedish Research Council for Health, Working Life and Welfare, and Jan Wallanders och Tom Hedelius stiftelse.

Ulf Gerdtham receives funding from the Swedish Council for Working Life and Social Research.

ref. How school grades can affect mental health – particularly for girls – https://theconversation.com/how-school-grades-can-affect-mental-health-particularly-for-girls-277907

Alzheimer’s drugs offer little benefit, major review finds – and the reasons go deeper than the science

Source: The Conversation – UK – By Simon Kolstoe, Associate Professor of Bioethics, University of Portsmouth

PeopleImages/Shutterstock.com

How is it possible to spend tens of billions of dollars developing drugs to treat a serious disease that affects millions of people, and yet end up with something that does not work? This is a mystery that has bedevilled Alzheimer’s research for years.

A new review of the evidence has concluded that the leading class of Alzheimer’s drugs “probably result in little to no difference” in a range of measures, including reducing dementia severity. This finding was quickly used to further justify the NHS’s decision two years ago not to fund these drugs.

These findings are disappointing, not just for researchers and drug companies, but also for the tens of millions of people and their families suffering from the effects of a devastating disease.

Medical research is often reported through success stories, but Alzheimer’s disease has remained stubbornly resistant to the development of life-changing breakthroughs. This has not gone unnoticed. A couple of years ago, investigative journalists uncovered significant fraud in important studies underpinning some of the science behind the leading Alzheimer’s drugs.

While this fraud is not solely responsible for the lack of progress in Alzheimer’s research, it does reveal how vested interests can distort science and how commercial interests can sometimes override indications that a specific approach may not actually be working. It also reveals how social, political and economic factors can distort and hold back entire fields of research.

A century of science and still no answers

The German psychiatrist Alois Alzheimer first identified the disease that bears his name in 1906. Over the subsequent years, it was found to be characterised by abnormal protein deposits in the brain called amyloid “plaques”, and tangles of a similarly misfolded protein called tau.

As these misfolded proteins are not found in healthy brains, it was assumed that they were the cause of the disease. But subsequent studies showed that the amounts of these protein deposits did not correlate well with disease severity, unlike similar diseases in which misfolded protein deposits in other parts of the body lead directly to organ failure.

This complex relationship between the pathological changes in the brains of people with Alzheimer’s and the psychological progression of the disease has split the research field for many years.

At one point, those proposing that amyloid deposits (or at least the molecular processes leading to them) were the main cause of the disease were even referred to as “Baptists”, while those holding tau responsible were called the “Tauists”. Although these have been the two main theories as to the cause of the disease, there have been numerous others, linking the disease to the abnormal behaviour of neurotransmitters, inflammation, the presence of pollutants, age-related changes, DNA damage, viruses and even sleep disturbance.

In situations like this, when there are many competing theories, researchers who start working on one theory can start to become entrenched. This is an unfortunate byproduct of competitive funding models, where research money tends to flow to the researchers who are most successful at arguing that their approach is the most promising and therefore worthy of receiving more research money. This is an interesting example of how science is not always an entirely objective endeavour.

An illustration of amyloid plaques clogging up a brain's neurons.
Amyloid plaques (in orange) are a hallmark of Alzheimer’s. But are they the cause?
Kateryna Kon/Shutterstock.com

This pressure on researchers to publish papers and attract funding is probably a contributing factor to the significant fraud linked specifically to some working on the amyloid hypothesis for Alzheimer’s. In one case, a researcher in the US was forced to resign from his university following the retraction of a much-cited paper, and the discovery that over 20 other papers may have similarly questionable data.

In a separate case, an academic faced fraud charges, while a pharmaceutical company they worked with came under investigation for allegedly misleading investors. Both of these cases were in connection with a different approach to treating Alzheimer’s, namely, targeting a protein called filamin A.

Indeed, controversies within Alzheimer’s research have become so frequent that they have inspired an entire book dedicated to examining the issue.

Matthew Schrag, a neuroscientist who played a key role in exposing elements of fraud in Alzheimer’s research, said: “You can cheat to get a paper. You can cheat to get a degree. You can cheat to get a grant. You can’t cheat to cure a disease. Biology doesn’t care.”

While scientific breakthroughs undoubtedly underpin much of modern life, the example of Alzheimer’s research serves as a reminder that the path from defining a problem to discovering a solution is rarely straightforward.

It would be nice to think that the main incentive for most researchers might be solving a problem or curing a disease, but the actual situation is far more complex. Research relies on funding, and researchers get jobs based on reputation, often in the form of publications. Because of this, the wrong behaviour can become incentivised.

The complexity of Alzheimer’s disease and the lack of obvious answers or cures make this field particularly susceptible to distortion by the social factors that can influence science.

As researchers and pharmaceutical companies compete for funding and investment, the science starts to get lost behind the games that are played. The end result is not only financial loss and lack of progress, but in the case of this devastating disease, millions of people also end up suffering through a lack of effective treatments.

The Conversation

Prior to moving into bioethics, Simon Kolstoe spent 17 years working on protein-folding diseases (including Alzheimer’s) on grants funded by the MRC, BBSRC, Wellcome Trust and a number of pharmaceutical companies. He receives no income from his former research in this area.

ref. Alzheimer’s drugs offer little benefit, major review finds – and the reasons go deeper than the science – https://theconversation.com/alzheimers-drugs-offer-little-benefit-major-review-finds-and-the-reasons-go-deeper-than-the-science-280833