The rise and fall of globalisation: battle to be top dog

Source: The Conversation – Global Perspectives – By Steve Schifferes, Honorary Research Fellow, City Political Economy Research Centre, City St George’s, University of London

A world map showing the extent of the British Empire in 1886. Norman B. Leventhal Map & Education Center, Boston Public Library/Wikimedia Commons, CC BY

This is the first in a two-part series. Read part two here.

For nearly four centuries, the world economy has been on a path of ever-greater integration that even two world wars could not totally derail. This long march of globalisation was powered by rapidly increasing levels of international trade and investment, coupled with vast movements of people across national borders and dramatic changes in transportation and communication technology.

According to economic historian J. Bradford DeLong, the value of the world economy (measured at fixed 1990 prices) rose from US$81.7 billion (£61.5 billion) in 1650, when this story begins, to US$70.3 trillion (£53 trillion) in 2020 – an 860-fold increase. The most intensive periods of growth corresponded to the two periods when global trade was rising fastest: first during the “long 19th century” between the end of the French revolution and start of the first world war, and then as trade liberalisation expanded after the second world war, from the 1950s up to the 2008 global financial crisis.

Now, however, this grand project is on the retreat. Globalisation is not dead yet, but it is dying.

Is this a cause for celebration, or concern? And will the picture change again when Donald Trump and his tariffs of mass disruption leave the White House? As a longtime BBC economics correspondent who was based in Washington during the global financial crisis, I believe there are sound historical reasons to worry about our deglobalised future – even once Trump has left the building.

Trump’s tariffs have amplified the world’s economic problems, but he is not the root cause of them. Indeed, his approach reflects a truth that has been emerging for many decades but which previous US administrations – and other governments around the world – have been reluctant to admit: namely, the decline of the US as the world’s no.1 economic power and engine of world growth.

In each era of globalisation since the mid-17th century, a single country has sought to be the clear world leader – shaping the rules of the global economy for all. In each case, this hegemonic power had the military, political and financial power to enforce these rules – and to convince other countries that there was no preferable path to wealth and power.

But now, as the US under Trump slips into isolationism, there is no other power ready to take its place and carry the torch for the foreseeable future. Many people’s pick, China, faces too many economic challenges, including the absence of a truly international currency – and, as a one-party state, it lacks the democratic mandate needed to gain acceptance as the world’s new dominant power.

While globalisation has always produced many losers as well as winners – from the slave trade of the 18th century to displaced factory workers in the American Midwest in the 20th century – history shows that a deglobalised world can be an even more dangerous and unstable place. The most recent example came during the interwar years, when the US refused to take up the mantle left by the decline of Britain as the 19th century’s hegemonic global power.

In the two decades from 1919, the world descended into economic and political chaos. Stock market crashes and global banking failures led to widespread unemployment and increasing political instability, creating the conditions for the rise of fascism. Global trade declined sharply as countries put up trade barriers and started self-defeating currency wars in the vain hope of boosting their exports. Instead, global growth ground to a halt.

A century on, our deglobalising world is vulnerable again. But to chart whether this means we are destined for a similarly chaotic and unstable future, we first need to explore the birth, growth and reasons behind the imminent demise of this extraordinary global project.

French model: mercantilism, money and war

By the mid-1600s, France had emerged as the strongest power in Europe – and it was the French who developed the first overarching theory of how the global economy could work in their favour. Nearly four centuries later, many aspects of “mercantilism” have been revived by Trump’s US playbook, which could be entitled How To Dominate the World Economy by Weakening Your Rivals.

France’s version of mercantilism was based on the idea that a country should put up trade barriers to limit how much other countries could sell to it, while boosting its own industries to ensure that more money (in the form of gold) came into the country than left it.

England and the Dutch Republic had already adopted some of these mercantilist policies, establishing colonies around the globe run by powerful monopolistic trading companies that aimed to challenge and weaken the Spanish empire, which had prospered on the gold and silver it seized in the Americas. In contrast to these “seaborne empires”, the much larger empires in the east such as China and India had the internal resources to generate their own revenue, meaning international trade – although widespread – was not critical to their prosperity.

Portrait of French finance minister Jean-Baptiste Colbert
French finance minister Jean-Baptiste Colbert, architect of mercantilism.
Metropolitan Museum of Art/Wikimedia

But it was France which first systematically applied mercantilism across the whole of government policy – led by the finance minister Jean-Baptiste Colbert (in office 1661-1683), who had been granted unprecedented powers by King Louis XIV to strengthen the financial might of the French state. Colbert believed trade would boost the coffers of the state and strengthen France’s economy while weakening its rivals, stating:

It is simply, and solely, the absence or abundance of money within a state [which] makes the difference in its grandeur and power.

In Colbert’s view, trade was a zero-sum game. The more France could run a trade surplus with other countries, the more gold bullion it could accumulate for the government and the weaker its rivals would become if deprived of gold. Under Colbert, France pioneered protectionism, tripling its import tariffs to make foreign goods prohibitively expensive.

At the same time, he strengthened France’s domestic industries by providing subsidies and granting them monopolies. Colonies and government trading companies were established to ensure France could benefit from the highly lucrative trade in goods such as spices, sugar – and slaves.

Colbert oversaw the expansion of French industries into areas like lace and glass-making, importing skilled craftsmen from Italy and granting these new companies state monopolies. He invested heavily in infrastructure such as the Canal du Midi, and dramatically increased the size of France’s navy and merchant marine to challenge its British and Dutch rivals.

Global trade at this time was highly exploitative, involving the forced seizure of gold and other raw materials from newly discovered lands (as Spain had been doing with its conquests in the New World from the late 15th century). It also meant benefiting from the trade in humans, with huge profits as slaves were seized and sent to the Caribbean and other colonies to produce sugar and other crops.

In this era of mercantilism, trade wars often led to real wars, fought across the globe to control trade routes and seize colonies. Following Colbert’s reforms, France began a long struggle to challenge the overseas empires of its maritime rivals, while also engaging in wars of conquest in continental Europe.

France initially enjoyed success in the 17th century both on land and sea against the Dutch. But ultimately, its state-run French East India Company was no rival to the ruthless, commercially driven activities of the Dutch and British East India companies, which delivered enormous profits to their shareholders and revenues for their governments.

Indeed, the huge profits made by the Dutch from the Far Eastern spice trade explain why they had no hesitation in handing over their small North American colony of New Amsterdam, in return for expelling the British from a small toehold on one of their spice islands in what is now Indonesia. In 1664, that Dutch outpost was renamed New York.

After a century of conflict, Britain gradually gained ascendancy over France, conquering India and forcing its great rival to cede Canada in 1763 after the Seven Years’ war. France never succeeded in fully countering Britain’s naval strength. Resounding defeats by fleets led by Horatio Nelson in the early 19th century, coupled with Napoleon’s defeat at Waterloo by a coalition of European powers, marked the end of France’s time as Europe’s hegemonic power.

Painting of French ships under fire during the Battle of Trafalgar.
The battle of Trafalgar, off southwestern Spain in October 1805, was decisive in ending France’s era of dominance.
Yale Center for British Art/Wikimedia

But while the French model of globalisation ultimately failed to deliver dominance of the world economy, that has not prevented other countries – and now President Trump – from embracing its principles.

France found that tariffs alone could not sufficiently fund its wars nor boost its industries. Its broad version of mercantilism led to endless wars that spread around the globe, as countries retaliated both economically and militarily and tried to seize territories.

More than two centuries later, there is an uncomfortable parallel with what Trump’s endless tariff wars might bring, both in terms of ongoing conflict and the organisation of rival trade blocs. France’s experience also suggests that protectionism alone, as proposed by Trump, will not be enough to revive the US’s domestic industries.

British model: free trade and empire

The ideology of free trade was first spelled out by British economists Adam Smith and David Ricardo, the founding fathers of classical economics. They argued trade was not a zero-sum game, as Colbert had suggested, but that all countries could mutually benefit from it. According to Smith’s classic text, The Wealth of Nations (1776):

If a foreign country can supply us with a commodity cheaper than we ourselves can make, better buy it off them with some part of the produce of our own industry, employed in such a way that we have some advantages.

As the world’s first industrial nation, by the 1840s Britain had created an economic powerhouse based on the new technologies of steam power, the factory system, and railroads.

Smith and Ricardo argued against the creation of state monopolies to control trade, proposing minimal state intervention in industry. Ever since, Britain’s belief in the benefits of free trade has proved stronger and more long-lasting than that of any other major industrial power – more deeply embedded in both its politics and popular imagination.

This ironclad commitment was born out of a bitter political struggle in the 1840s between manufacturers and landowners over the protectionist Corn Laws. The landowners who had traditionally dominated British politics backed high tariffs, which benefited them but resulted in higher prices for staples like bread. The repeal of the Corn Laws in 1846 upended British politics, signalling a shift of power to the manufacturing classes – and ultimately to their working-class allies once they gained the right to vote.

Illustration of an Anti-Corn Law League meeting.
An Anti-Corn Law League meeting held in London’s Exeter Hall in 1846.
Wikimedia

In time, Britain’s advocacy of free trade unleashed the power of its manufacturing to dominate global markets. Free trade was framed as the way to raise living standards for the poor (the exact opposite of President Trump’s claim that it harms workers) and had strong working-class support. When the Conservatives floated the idea of abandoning free trade in the 1906 general election, they suffered a devastating defeat – the party’s worst until 2024.

As well as trade, a central element in Britain’s role as the new global hegemonic power was the rise of the City of London as the world’s leading financial centre. The key was Britain’s embrace of the gold standard, which placed its currency, the pound, at the heart of the new global economic order: by linking the pound to a fixed amount of gold, Britain ensured its value would not fluctuate. Thus the pound became the worldwide medium of exchange.

This encouraged the development of a strong banking sector, underpinned by the Bank of England as a credible and trustworthy “lender of last resort” in a financial crisis. The result was a huge boom in international investment, opening access to overseas markets for British companies and individual investors.

In the late 19th century, the City of London dominated global finance, investing in everything from Argentinian railways and Malaysian rubber plantations to South African gold mines. The gold standard became a talisman of Britain’s power to dominate the world economy.

The pillars of Britain’s global economic dominance were a highly efficient manufacturing sector, a commitment to free trade to ensure its industry had access to global markets, and a highly developed financial sector which invested capital around the world and reaped the benefits of global economic development. But Britain also did not hesitate to use force to open up foreign markets – for example, during the Opium Wars of the 1840s, when China was compelled to open its markets to the lucrative trade in opium from British-owned India.

By the end of the 19th century, the British empire incorporated one quarter of the world’s population, providing a source of cheap labour and secure raw materials as well as a large market for Britain’s manufactured goods. But that was still not enough for its avaricious leaders: Britain also made sure that local industries did not threaten its interests – by undermining the Indian textile industry, for example, and manipulating the Indian currency.

In reality, globalisation in this era was about domination of the world economy by a few rich European powers, meaning that much global economic development was curtailed to protect their interests. Under British rule between 1750 and 1900, India’s share of world industrial output declined from 25% to 2%.

But for those at the centre of Britain’s global formal and informal empire, such as the middle-class residents of London, this was a halcyon time – as economist John Maynard Keynes would later recall:

For middle and upper classes … life offered, at a low cost and with the least trouble, conveniences, comforts and amenities beyond the compass of the richest and most powerful monarchs of other ages. The inhabitant of London could order by telephone, sipping his morning tea in bed, the various products of the whole Earth, in such quantity as he might see fit, and reasonably expect their early delivery upon his doorstep.

US model: protectionism to neoliberalism

While Britain enjoyed its century of global dominance, the United States embraced protectionism for longer than any other major western economy, starting from its foundation in 1776.

The introduction of tariffs to protect and subsidise emerging US industries had first been articulated in 1791 by the fledgling nation’s first treasury secretary, Alexander Hamilton – Caribbean immigrant, founding father and future subject of a record-breaking musical. The Whig Party under Henry Clay and its successor, the Republican Party, were both strong supporters of this policy for most of the 19th century. Even as US industry grew to overshadow all others, its government maintained some of the highest tariff barriers in the world.

Alexander Hamilton on the front of a US$10 note from 1934
Founding father Alexander Hamilton on the front of a US$10 note from 1934.
Wikimedia

Tariff rates rose to 50% in the 1890s with the backing of future president William McKinley, both to help industrialists and pay for generous pensions for 2 million civil war veterans and their dependants – a key part of the Republican electorate. It is no accident that President Trump has festooned the White House with pictures of Hamilton, Clay and McKinley – all supporters of protectionism and high tariffs.

In part, the US’s enduring resistance to free trade reflected its access to a seemingly limitless internal supply of raw materials, while its rapidly growing population, swelled by immigration, provided domestic markets that powered its growth while keeping out foreign competition.

By the late 19th century, the US was the world’s biggest steel producer with the largest railroad system in the world and was moving rapidly to exploit the new technologies of the second industrial revolution – based on electricity, petrol engines and chemicals. Yet it was only after the second world war that the US assumed the role of global superpower – in part because it was the only country on either side of the war that had not suffered severe damage to its economy and infrastructure.

In the wake of global destruction in Europe and Asia, the US’s dominance was political, military and cultural, as well as financial – but the US vision of a globalised world had some important differences from its British predecessor.

The US took a much more universalist and rules-based approach, focusing on the creation of global organisations that would establish binding regulations – and open up global markets to unfettered American trade and investment. It also aimed to dominate the international economic order by replacing the pound sterling with the US dollar as the global medium of exchange.

Within a week of its entry into the second world war, plans were laid to establish US global financial hegemony. The US treasury secretary, Henry Morgenthau, began work on establishing an “inter-allied stabilisation fund” – a playbook for post-war monetary arrangements which would enshrine the US dollar at its heart.

This led to the creation of the International Monetary Fund (IMF) and World Bank at the Bretton Woods conference in New Hampshire in 1944 – institutions dominated by the US, which encouraged other countries to adopt the same economic model of free trade and free enterprise. The Allied nations, which were simultaneously meeting to establish the United Nations in the hope of ensuring future world peace, had suffered the devastating effects of the Great Depression and war – and they welcomed the US’s commitment to shape a new, more stable economic order.

How the 1944 Bretton Woods deal ensured the US dollar would be the world’s dominant currency. Video: Bloomberg TV.

As the world’s biggest and strongest economy, the US initially met little resistance to its plan for a new international economic order built in its own image. The motive was as much political as economic: the US wanted to provide economic benefits to ensure the loyalty of its key allies and counter the perceived threat of a communist takeover – in complete contrast to Trump’s mercantilist view today that all other countries are out to “rip off” the US, and that its own military might means it has no real need for allies.

After the war finally ended, the US dollar, now linked to gold at a fixed rate of $35 per ounce to guarantee its stability, assumed the role of the free world’s principal currency. It was both used for global trade transactions and held by foreign central banks as their currency reserves – giving the US economy an “exorbitant privilege”. The stable value of the dollar also made it easier for the US government to sell Treasury bonds to foreign investors, enabling it to more easily borrow money and run up trade deficits with other countries.

The conditions were set for an era of US political, financial and cultural dominance, which saw the rise of globally admired brands such as McDonald’s and Coca Cola, as well as a powerful US marketing arm in the form of Hollywood. Perhaps even more significantly, the relaxed, well-funded campuses of California would prove a perfect petri dish for the development of new computer technologies – backed initially by cold war military investment – which, decades later, would lead to the birth of the big-tech companies that dominate today’s digital economy.

The US view of globalisation was broader and more interventionist than the British model of free trade and empire. Rather than having a formal empire, it wanted to open up access to the entire world economy, which would provide global markets for American products and services.

The US believed you needed global economic institutions to police these rules. But as in the British case, the benefits of globalisation were still unevenly shared. While countries that embraced export-led growth such as Japan, Korea and Germany prospered, other resource-rich but capital-poor countries such as Nigeria only fell further behind.

From dream to despair

Though the legend of the American dream grew and grew, by the 1970s the US economy was coming under increasing pressure – in particular from German and Japanese rivals, who by then had recovered from the war and modernised their industries.

Troubled by these perceived threats and a growing trade deficit, in 1971 President Richard Nixon stunned the world by announcing that the US was going off the gold standard – forcing other countries to bear the cost of adjustment for the US balance of payments crisis by making them revalue their currencies. This had a profound effect on the global financial system: within a decade, most major currencies had abandoned fixed exchange rates for a new system of floating rates, effectively ending the 1944 Bretton Woods settlement.

US president Richard Nixon announces the US is leaving the gold standard on August 15 1971.

The end of fixed exchange rates opened the door to the “financialisation” of the global economy, vastly expanding global investment and lending – much of it by US financial firms. This gave succour to the burgeoning neoliberal movement that sought to further rewrite the rules of the financial world order. In the 1980s and ’90s, these policy prescriptions became known as the Washington consensus: a set of rules – including opening markets to foreign investment, deregulation and privatisation – that was imposed on developing economies in crisis, in return for them receiving support from US-led organisations like the World Bank and IMF.

In the US, meanwhile, the increasing reliance on the finance and hi-tech sectors increased levels of inequality and fostered resentment in large parts of American society. Both Republicans and Democrats embraced this new world order, shaping US policy to favour their hi-tech and financial allies. Indeed, it was the Democrats who played a key role in deregulating the financial sector in the 1990s.

Meanwhile, the decline of US manufacturing industries accelerated, as did the gap between the incomes of those in the hinterland, where manufacturing was based, and residents of the large metropolitan cities.

By 2023, the lowest 50% of US citizens received just 13% of total personal income, while the top 10% received almost half (47%). The wealth gap was even greater, with the bottom 50% holding only 6% of total wealth, while more than a third (36%) was held by the top 1% alone. Real incomes of the bottom 50% have barely grown since 1980.

The bottom half of the US population was suffering from a surge in “deaths of despair” – a term coined by economist Anne Case and Nobel laureate Angus Deaton to describe high mortality rates from drug overdoses, alcohol abuse and suicide among working-class Americans. Rising costs of housing, medical care and university education all contributed to widespread indebtedness and growing financial insecurity. By 2019, a study found that two-thirds of people who filed for bankruptcy cited medical issues as a key reason.

The decline in US manufacturing accelerated after China was admitted to the World Trade Organization in 2001, further swelling America’s already soaring trade and budget deficits. Political and business elites hoped the move would open up the huge Chinese market to US goods and investment, but China’s rapid modernisation made its industries more competitive than their American rivals in many fields.

Ultimately, this era of intensive financialisation of the world economy created a series of regional and then global financial crises, damaging many Latin American and Asian economies. This culminated in the 2008 global financial crisis, precipitated by reckless lending by US financial institutions. The world economy took more than a decade to recover as countries wrestled with slower growth, lower productivity and less trade than before the crisis.

For those who chose to read it, the writing was on the wall for America’s era of global domination decades ago. But it would take Trump’s victory in the 2016 presidential election – a profound shock to many in the US “liberal establishment” – to make clear that the US was now on a very different course that would shake up the world.

Making a bad situation more dangerous

In my view, Trump is the first modern-day US president to fully understand the powerful alienation felt by many working-class American voters, who believed they were left out of the US’s immense post-war economic growth that so benefited the largely urban American middle classes. His strongest supporters have always been lower-middle-class voters from rural areas who are not college-educated.

Yet Trump’s key policies will ultimately do little for them. High tariffs to protect US jobs, expulsion of millions of illegal immigrants, dismantling protections for minorities by opposing DEI (diversity, equity and inclusion) programmes, and drastically cutting back the size of government will have increasingly negative economic consequences in the future, and are very unlikely to restore the US economy to its previous dominant position.

US president Donald Trump unveils his global tariff ‘hit list’ on April 3 2025. BBC News.

Long before he first became president, Trump hated the eye-watering US trade deficit (he’s a businessman, after all) – and believed that tariffs would be a key weapon for ensuring US economic dominance could be maintained. Another key part of his “America First” ideology was to repudiate the international agreements that were at the heart of the US’s postwar approach to globalisation.

In his first term, however, Trump (having not expected to win) was ill-prepared for power. But second time around, conservative thinktanks had spent years outlining detailed policies and identifying key personnel who could implement the radical U-turn in US economic policy.

Under Trump 2.0, we have seen a return to the mercantilist point of view reminiscent of France in the 17th and 18th centuries. His assertion that countries which ran a trade surplus with the US “were ripping us off” echoed the mercantilist belief that trade was a zero-sum game – rather than the 20th-century view, pioneered by the US, that globalisation brings benefits to all, no matter the precise balance of that trade.

Trump’s tax-and-tariff plans, which extend tax breaks for the very rich while squeezing the poor through benefit cuts and tariff-driven inflation, will increase inequality in the US.

At the same time, the passing of the One Big Beautiful Bill is predicted to add some US$3.5 trillion to US government debt – even after the Elon Musk-led “Department of Government Efficiency” cuts imposed on many Washington departments. This adds pressure to the key US Treasury bond market at the centre of the world financial system, and raises the cost of financing the huge US deficit while weakening its credit rating. Continuing these policies could threaten a default by the US, which would have devastating consequences for the entire global financial system.

For all the macho grandstanding from Trump and his supporters, his economic policies are a demonstration of American weakness, not strength. While I believe his highlighting of some of the ills of the US economy was overdue, the president is rapidly squandering the economic credibility and goodwill that the US built up in the postwar years, as well as its cultural and political hegemony. For people living in America and elsewhere, he is making a bad situation more dangerous – including for many of his most ardent supporters.

That said, even without Trump’s economic and societal disruptions, the end of the US era of hegemonic dominance would still have happened. Globalisation is not dead, but it is dying. The troubling question we all face now is what happens next.

This is the first of a two-part Insights long read on the rise and fall of globalisation. Read part two here: why the next global financial meltdown could be much worse with the US on the sidelines.

Steve Schifferes does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. The rise and fall of globalisation: battle to be top dog – https://theconversation.com/the-rise-and-fall-of-globalisation-battle-to-be-top-dog-267910

Keeping up with the Kardashians? Why owning more can leave us feeling less

Source: The Conversation – UK – By Cathrine Jansson-Boyd, Professor of Consumer Psychology, Anglia Ruskin University

The Kardashian family is back with a new season of their reality series, The Kardashians, on Disney Plus.

As a researcher of consumer psychology, I have written about consumer neuroscience and how brands and media shape behaviour and self-perception. Watching The Kardashians through that lens reveals more than entertainment. It exposes how luxury and aspiration are woven into identity and sold back to us as self-worth.

The first episode is a materialistic feast. There are close-ups of Dior and Chanel handbags and belts, diamond jewellery and a house sign that reads: “Need money for Birkin.” The Kardashians drive luxury cars, wear designer sunglasses indoors and chat about their Saint Laurent outfits.

Even the camera lingers on the glittering shop windows of Rodeo Drive in Beverly Hills, home to some of the world’s most exclusive designer stores, though no one is actually shopping. If you haven’t seen it, you probably get the idea. In the Kardashian universe, the unspoken motto seems to be: “To have is to be.”

In their world, material possessions are woven into identity and presented as something to aspire to. But is it really all that glamorous?

Overconsumption can lower our wellbeing. Young people, in particular, often turn to excessive consumption to fit in, boost confidence or gain prestige. Teenagers who idolise others for their wealth or possessions are more likely to struggle with their sense of identity later in life.

Research shows that children and adolescents who place strong importance on material possessions often struggle to develop a clear sense of identity. Without learning who they are beyond what they own, they may find it harder to build lasting self-worth and life satisfaction.

Rather than helping us define who we are, possessions can get in the way. They can obscure or distort our sense of self, leading us to equate value with visibility. On top of that, materialism is linked to depression, likely because people often fail to achieve the identity and happiness they hope consumption will bring.

The Kardashian-Jenners have a massive following. Sisters Kylie, Kim and Khloé each have more than 300 million Instagram followers, a clear sign of their influence.

When we admire someone, we naturally compare ourselves to them, a process known as social comparison. It helps us judge where we stand, whether we are better or worse off than others. In this context, owning the same bag, car or outfit becomes a way to measure worth, since possessions often symbolise status and make the buyer feel closer to the celebrity, as if buying into their world.

Social comparison is known to drive materialism. It can start to feel like a competition to “catch up” with those we look up to through conspicuous consumption.

When we fail to keep up with the Kardashians, we may feel inadequate, even if we know deep down we were never in the same race. The Kardashian brand cleverly capitalises on this very idea.

The original series title, Keeping Up With the Kardashians, puns on the human instinct to compare and compete. This dynamic fuels not only the show’s popularity but also its beauty, fashion and lifestyle empires, which invite fans to buy into the brand both literally and symbolically.

You might think the solution is simply to choose better role models, but it is not that straightforward. People often compare themselves to others without realising it, automatically relying on social comparison when processing information about other people. This tendency does not stop at television.

Social media platforms intensify the same dynamic, giving us endless opportunities to measure our worth against curated snapshots of other people’s lives. Research from 2024 shows that heavy exposure to idealised social-media content is associated with increased materialism, lower life satisfaction and greater stress.

Another study found that engagement with influencer content featuring luxury goods can trigger upward social comparison – the tendency to compare ourselves with people we see as better off – leading to feelings of envy and a stronger urge to buy similar products in order to close that gap.

From influencer “unboxings,” where people film themselves opening luxury purchases, to filtered “day in the life” videos, social media users are constantly exposed to lifestyles that appear effortlessly perfect. When we scroll through feeds full of luxury, beauty and success, we can become more materialistic without ever consciously deciding to.

Seeing the extreme wealth of people like the Kardashians surrounded by luxury can spark feelings of envy and relative deprivation, leading to dissatisfaction with our own lives. That dissatisfaction can then trigger compulsive shopping as we try to soothe those uncomfortable emotions and project wealth ourselves.

Unsurprisingly, compulsive buying is closely tied to materialism. If you value possessions and feel envy toward others, you are far more likely to buy impulsively in an attempt to catch up.

Watching glamorous lifestyles where people seem to have it all can be fun escapism, but it also blurs the line between aspiration and insecurity. Shows like The Kardashians offer a fantasy of perfection that few can match, yet they invite us to measure ourselves against it.

In the end, the pursuit of luxury may leave us feeling emptier, not richer. After all, when having becomes being, it is worth asking what is left of the self once the shopping stops.

The Conversation

Cathrine Jansson-Boyd does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Keeping up with the Kardashians? Why owning more can leave us feeling less – https://theconversation.com/keeping-up-with-the-kardashians-why-owning-more-can-leave-us-feeling-less-268367

How the first animals evolved – a new clue from a tiny relative

Source: The Conversation – UK – By Max Telford, Jodrell Professor of Zoology and Comparative Anatomy, UCL

The next time you go wild swimming, whether in a lake, river or sea, you are probably sharing the water with one of your tiniest, yet closest relatives.

This near-family member is a microscopic, single-celled organism called a choanoflagellate. Scientists are still puzzled by how animals evolved from such simple beginnings. But a new paper describes the discovery of an important new clue.

Choanoflagellates, like most single-celled organisms, can only survive in water, where they live much of their life as a single cell, no more complex than an amoeba.

They are nevertheless more closely related to our own kingdom of life than any other kind of organism is – choanoflagellates are cousins of the animals. Scientists are finding evidence of how the very first animals evolved, engraved in the structure and function of choanoflagellate cells and written in their DNA code.

Before we get to the new clue, it is worth thinking about what makes animals unique. As I describe in my new book, the simple answer is that, compared to most of the rest of life, animals have large and complicated bodies built from many cells. Your own body contains tens of trillions of cells and even a tiny fruit fly has close to a million.

Ladybird taking flight from a person's finger.
Even a tiny insect like this ladybird has millions of cells in its body.
Pazargic Liviu/Shutterstock

Our oldest animal ancestor must have evolved a way for its many cells to stick together and form a super-organism from a host of cooperating cells.

The first animals must also have invented ways to produce the many different kinds of cells we have today: muscle cells, nerve cells, egg cells and sperm, to name a few. The division of labour among different kinds of cells is one aspect of arguably the biggest invention of the first animals: embryogenesis.

This is the earliest period of an animal’s life when, starting from a single fertilised egg cell, cell division begins to create all the cells that will make up the animal. These cells then each take on their own specific task and finally the many cells get carefully organised to make a functioning organism.

Scientists are hoping that studying choanoflagellates will help them learn how these skills first evolved.

Choanoflagellates don’t have large, complex bodies and they don’t have embryogenesis. They do, however, have a few animal like qualities. Their cells, for example, can adopt a handful of different shapes with different roles.

Just as our cells can take the form of a nerve or a muscle, theirs can switch from the standard funnel and flagellum form to become amorphous, shape-shifting blobs like an amoeba.

Choanoflagellates can also make tiny multi-celled colonies. In the presence of certain species of bacteria their cells stick together to make little groups of cells called rosettes. The rosettes seem to form because they are better at catching the bacteria (which the choanoflagellates eat) than single cells are.

The rosettes can grow a little, but when they reach a size of about ten cells the bonds between the cells stretch and snap, splitting the rosettes down the middle to form two smaller rosettes of cooperating choanoflagellates.

The resemblance of these rosettes to the earliest stages of an animal embryo seems like a coincidence, however. Unlike an animal embryo, they are not destined to develop into anything else. The new study is about the growth of these rosettes.

In many animals, from mice to flies, there is a small group of genes that works together to control how big different organs get – how many cells they contain. Named Hippo, Warts and Yorkie, these genes sound like a mob of gangsters.

They are known collectively as the Hippo pathway. When genes in the Hippo pathway mutate in a growing fly or mouse embryo, the result is flies with huge eyes or newborn mice with monstrous livers.

In adult humans, when Hippo genes mutate, they can produce cancerous growths of uncontrollably dividing cells.

Choanoflagellates have a host of genes in common with animals. Although they don’t have organs like eyes or livers (or embryos or cancer), they do have the genes of the Hippo gang. The new paper first describes how the researchers developed a new technique that lets scientists target any gene in a choanoflagellate so that it can be deliberately mutated.

When the Warts gene was mutated, the rosettes of cells grew twice as large so that they ended up containing 20 cells or more. This uncontrolled growth is strikingly similar to what had been seen in previous studies of flies, mice and some human cancers.

The details of just how the Hippo genes control rosette size are not yet known. But the new work adds to the picture we are building of the single-celled precursor of the animals. It is another animal-like characteristic of the choanoflagellates.

If we travelled back in time to meet this tiny beast, we would never mistake it for a member of the animal kingdom. But we would nevertheless find in its repertoire a handful of skills that were going to prove useful in the evolution of animals.

Evolution took the materials that were available – the ability to make different kinds of cells; to stick those cells together; to regulate the number of cells and so on – and tinkered with them. From these beginnings, natural selection would then do its thing, resulting, 600 million years later, in the amazing diversity of the animal kingdom from jellyfish to flies, tapeworms, starfish and you.


This article features references to books that have been included for editorial reasons, and may contain links to bookshop.org. If you click on one of the links and go on to buy something from bookshop.org The Conversation UK may earn a commission.

The Conversation

Max Telford does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. How the first animals evolved – a new clue from a tiny relative – https://theconversation.com/how-the-first-animals-evolved-a-new-clue-from-a-tiny-relative-268238

Climate change is becoming an insurance crisis

Source: The Conversation – UK – By Meilan Yan, Senior Lecturer in Financial Economics, Loughborough University

oleschwander/Shutterstock

Imagine waking up to find your living room underwater for the second time in five years. You try to claim insurance, only to be told your property is now uninsurable. Premiums have tripled. Your mortgage lender is concerned. And your biggest asset, your home, is rapidly losing value.

This isn’t just a personal disaster. It’s a warning sign of a much broader crisis.

The risks associated with climate change are breaking the insurance industry. In the past decade alone, flood frequency has increased fourfold in the tropics and 2.5 times in mid-latitude regions. In the UK, at least one in six people already live with flood risk, heavy-rainfall extremes are increasing, and expected annual damages could rise by 27% by the 2050s.

Insurance claims from extreme weather are surging. The Association of British Insurers (the UK insurance and long-term savings trade body) reports a record £585 million in home weather-damage payouts for 2024.

Climate change is driving more frequent and severe events, pushing traditional insurance models to their limits. Insurers are left with little choice but to raise premiums sharply or withdraw coverage entirely. When insurance becomes unaffordable or unavailable, households are exposed, property values fall, mortgages become harder to secure, and the risk of a wider financial crisis grows.


Our research into the insurance industry shows that UK resilience is falling behind. Policymakers in the UK tried to avert an insurance crisis by launching Flood Re in 2016, a joint scheme between government and insurers designed to keep insurance affordable for households in high-risk areas. It was meant as a temporary bridge, due to close in 2039 once stronger flood defences and better land-use planning are in place.

But progress has been painfully slow. In January 2024, the House of Commons public accounts committee reported that the government’s £5.2 billion flood defence programme was 40% behind schedule and was expected to protect just 200,000 properties by 2027 – far short of its original 336,000 target.

By 2025, Flood Re was under mounting strain. Reinsurance costs had risen by £100 million in just three years, and policy uptake had jumped by 20% in a single year – both signs that private insurers were retreating from high-risk markets.

In July 2025, Flood Re’s CEO, Perry Thomas, warned that the UK’s overall flood resilience had worsened since the scheme’s launch, as mortgage lenders, housebuilders and successive governments have “failed to pull their weight”.

Tree fallen onto a building in the street.
Storm damage is more likely as climate change risk increases.
pcruciatti/Shutterstock

When insurance becomes unaffordable or unavailable, households are left exposed and property values decline, making mortgages harder to obtain. This erosion of coverage threatens the wider financial system: banks rely on insured property as collateral, but without cover, that collateral rapidly loses value.

If the government fails to meet its climate adaptation targets, as many as 3 million UK homes could become effectively worthless within 30 years.

For the banking sector, this creates the risk of homes becoming stranded assets — uninsurable, unmortgageable and falling in value — leading to rising defaults and mounting losses. Unless lenders adopt climate-adjusted risk models that integrate physical hazards such as flooding, storms and heatwaves, they risk underestimating the true exposure of their mortgage portfolios.

If these climate-risk-exposed mortgages are mispriced and then bundled into mortgage-backed securities and sold to investors, the resulting shock could cascade through credit markets – like the 2008 subprime mortgage crisis, when large volumes of high-risk home loans to borrowers with poor or limited credit histories were repackaged and sold as safe investments. The difference is that this time the crash would be driven by physical climate damage rather than purely financial mismanagement.

A one-way street

Traditional financial crises follow cycles of growth, downturn and recovery, but climate risk moves in only one direction. Rising global temperatures are driving more frequent and severe floods and storms. Without timely adaptation, the damage compounds, eroding property values, undermining insurance and threatening financial stability.

Historical insurance models treated extreme weather as rare “tail risks,” but these events are now more frequent, severe, and interconnected. The tail is becoming “fat,” and shocks ripple across sectors and regions. In short, risk is evolving and insurance frameworks must evolve with it.
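The “fattening tail” point can be made concrete with a toy calculation. The numbers below are hypothetical, invented purely for illustration (they are not real actuarial figures): when the probability of an extreme-loss year rises, an insurer’s expected annual payout climbs sharply even though routine claims are unchanged.

```python
# Toy sketch (hypothetical numbers): how a "fatter tail" - a higher chance
# of an extreme event - raises an insurer's expected annual payout.
# Two discrete outcomes per year: a routine claim, or a rare extreme flood.

def expected_annual_loss(p_extreme: float,
                         routine_loss: float = 5_000.0,
                         extreme_loss: float = 250_000.0) -> float:
    """Expected payout when the extreme event occurs with probability
    p_extreme, and a routine claim occurs otherwise."""
    return p_extreme * extreme_loss + (1 - p_extreme) * routine_loss

# Historical model: extreme flood assumed to be a 1-in-100-year event.
historical = expected_annual_loss(p_extreme=0.01)   # 7,450.0

# Climate-adjusted model: the same event is now a 1-in-25-year event.
adjusted = expected_annual_loss(p_extreme=0.04)     # 14,800.0

print(historical, adjusted)
```

In this sketch, quadrupling the probability of the extreme year roughly doubles the expected annual payout, which is the kind of repricing pressure that drives the premium rises and coverage withdrawals described above.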

Flooding is no longer just an environmental issue. It is a systemic financial threat. Insurers, regulators and lenders must adopt forward-looking models that translate physical climate risks into financial metrics. These models influence market behaviour by shaping how capital is allocated, assets are valued, and risks are priced.

This, in turn, guides investment, planning and adaptation — the process of adjusting systems, infrastructure and practices to withstand and recover from climate impacts.

Effective adaptation measures, such as upgraded flood defences, reduce the future risk of climate-related damage. It’s a feedback loop: better modelling enables smarter adaptation, which in turn strengthens financial stability.


The Conversation

The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.

ref. Climate change is becoming an insurance crisis – https://theconversation.com/climate-change-is-becoming-an-insurance-crisis-260952

After the first world war, séances boomed – and dead soldiers ‘wrote’ home

Source: The Conversation – UK – By Alice Vernon, Lecturer in Creative Writing and 19th-Century Literature, Aberystwyth University

A typical séance in the 1920s. The Graphic, CC BY-SA

In March 1915, a young British man named Raymond Lodge was deployed to Ypres, Belgium, to fight on the front lines of the first world war. By September, he was dead, aged just 26.

A few weeks later, however, Raymond got in touch with his family. “TELL FATHER I HAVE MET SOME FRIENDS OF HIS”, came the message hastily scrawled in all caps by the spiritualist medium Mrs Osborne Leonard.

Raymond’s father was Sir Oliver Lodge, a prominent physicist whose work helped to develop radio communications. Sir Oliver was also a member of the Society for Psychical Research, an organisation which, among other things, investigated ghosts. The “friends” Raymond had apparently met beyond the grave included F.W.H. Myers, a founding member of the society, who had died in 1901.

A black and white photo of a bearded man in a suit
Sir Oliver Lodge.
Lafayette Ltd., CC BY

Sir Oliver, previously fairly sceptical, was soon drawn into lengthy séances with Mrs Leonard, poring over messages allegedly from Raymond about death and the afterlife. He compiled them into a book entitled Raymond, or Life and Death, which was published in 1916. It proved so popular that it ran to many editions, with soldiers on the front being sent copies by their loved ones.

Spiritualism began in the late 1840s as a pseudo-Christian practice that believed communication with the dead was entirely possible. While it dwindled in popularity at the turn of the century, it was reinvigorated to new levels in the aftermath of the first world war. The popularity of Lodge’s book, moreover, led to dozens of copycat publications, where other soldiers “wrote” of their experiences of the utopian spiritualist afterlife to their families.

Claude’s Book (1919) is one such example, “transcribed” from séances with young Claude by his mother, L. Kelway-Bamber. Kelway-Bamber, having been so heartened by Sir Oliver’s sittings, had hired Mrs Leonard herself to get in touch with her son. Spiritualist mediums were in high demand once more.

Beyond the cynicism

It’s easy to dismiss these séances, even to scoff at them as nothing more than charlatans exploiting public grief, especially from the point of view of modern scepticism. When I was researching my new book, Ghosted: A History of Ghost Hunting and Why We Keep Looking, this was my initial reaction to reading about these bizarre encounters with the spirits of the dead.

But as a sociological phenomenon, borne of mass grief, I think it’s more complicated than that. We may laugh at fraudulent mediums quivering melodramatically as they channel the so-called spirits of the dead, but to discuss spiritualism’s cultural significance requires a more nuanced and sensitive approach.

By the end of the first world war, nearly 9 million soldiers had been killed. General mortality rates were already high prior to the war, and people were no strangers to sudden, unexpected bereavement. But never before had death affected so many people at once, and taken so many young men in the prime of their lives.

If we look at spiritualism in the aftermath of the first world war, not to identify fraud and shun its believers as being gullible, we can build up an incredibly detailed picture of why so many clung to séance tables in the hope of contacting their loved ones again.

Everyone’s loss of their son, brother or husband was uniquely painful, and yet these deaths lost their significance when half the families on the street had also lost their young men. Suddenly, though, everyone knew Raymond Lodge’s name. He stood out among the legions of dead Tommies because of the séances Sir Oliver held with Mrs Leonard.

This, I think, is why so many grieving families took up spiritualism and wrote their own books – not to piggyback on Raymond’s popularity, but to make their sons, brothers and husbands seem special, too. Moreover, many didn’t know exactly what had happened to soldiers; being able to “speak” to them from beyond the grave made it seem like they were happy and at peace, enjoying themselves in the afterlife, and not in pieces in a muddy ditch thousands of miles from home.

Mary Lodge, Raymond’s mother, sums up the problem with a brief sentence her husband includes in the book: “We can face Christmas now.” We can accuse Mrs Leonard of exploiting grief, but we can’t deny that it eased the suffering of many, regardless of the ethical and moral dilemmas posed by spiritualism.

The history of ghost-hunting and séances is rife with fraud, and scepticism is often required to read around anecdotes to uncover what was really happening, but it’s also a vital resource to help us understand grief and fear of death at certain points in human history. By examining the motivation behind ghost-hunting from a more sympathetic perspective, we can learn a great deal about what it means to be alive.


The Conversation

Alice Vernon does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. After the first world war, séances boomed – and dead soldiers ‘wrote’ home – https://theconversation.com/after-the-first-world-war-seances-boomed-and-dead-soldiers-wrote-home-266508

‘Hallucinated’ cases are affecting lawyers’ careers – they need to be trained to use AI

Source: The Conversation – UK – By Craig Smith, Lecturer in Law, University of Salford

Gorodenkoff/Shutterstock

Generative artificial intelligence, which produces original content by drawing on large existing datasets, has been hailed as a revolutionary tool for lawyers. From drafting contracts to summarising case law, generative AI tools such as ChatGPT and Lexis+ AI promise speed and efficiency.

But the English courts are now seeing a darker side of generative AI. This includes fabricated cases, invented quotations, and misleading citations entering court documents.

As someone who studies how technology and the law interact, I argue it is vital that lawyers are taught how, and how not, to use generative AI. Lawyers need to be able to avoid the risk of sanctions for breaking the rules – and, more broadly, to help prevent the development of a legal system that decides questions of justice based on fabricated case law.

On 6 June 2025, the high court handed down a landmark judgment on two separate cases: Frederick Ayinde v The London Borough of Haringey and Hamad Al-Haroun v Qatar National Bank QPSC and QNB Capital LLC.

The court reprimanded a pupil barrister (a trainee) and a solicitor after their submissions contained fictitious and inaccurate case law. The judges were clear: “freely available generative artificial intelligence tools… are not capable of conducting reliable legal research”.

As such, the use of unverified AI output can no longer be excused as error or oversight. Lawyers, junior or senior, are fully responsible for what they put before the court.

Hallucinated case law

AI “hallucinations” – the confident generation of non-existent or misattributed information – are well documented. Legal cases are no exception. Research has recently found that hallucination rates range from 58% to 88% in response to specific legal queries, often on precisely the sorts of issues lawyers are asked to resolve.

These errors have now leapt off the screen and into real legal proceedings. In Ayinde, the trainee barrister cited a case that did not exist at all. The erroneous example had been misattributed to a genuine case number from a completely different matter.

In Al-Haroun, a solicitor listed 45 cases provided by his client. Of these, 18 were fictitious and many others irrelevant. The judicial assistant is quoted in the judgment as saying: “The vast majority of the authorities are made up or misunderstood”.

These incidents highlight a profession facing a perfect storm: overstretched practitioners, increasingly powerful but unreliable AI tools, and courts no longer willing to treat errors as mishaps. For the junior legal profession, the consequences are stark.

Many are experimenting with AI out of necessity or curiosity. Without the training to spot hallucinations, though, new lawyers risk reputational damage before their careers have fully begun.

The high court took a disciplinary approach, placing responsibility squarely on the individual and their supervisors. This raises a pressing question. Are junior lawyers being punished too harshly for what is, at least in part, a training and supervision gap?

Education as prevention

Law schools have long taught research methods, ethics, and citation practice. What is new is the need to frame those same skills around generative AI.

While many law schools and universities are either exploring AI within their modules or creating new modules that look at AI, there is a broader shift towards considering how AI is changing the legal sector as a whole.

Students must learn why AI produces hallucinations, how to design prompts responsibly, how to verify outputs against authoritative databases and when using such tools may be inappropriate.

The high court’s insistence on responsibility is justified. The integrity of justice depends on accurate citations and honest advocacy. But the solution cannot rest on sanction alone.

Hands holding pens over document
How to use AI – and how not to use it – should be part of legal training.
Lee Charlie/Shutterstock

If AI is part of legal practice, then AI training and literacy must be part of legal training. Regulators, professional bodies and universities share a collective duty to ensure that junior lawyers are not left to learn through error or in the most unforgiving of environments, the courtroom.

Similar issues have arisen from non-legal professionals. In a Manchester civil case, a litigant in person admitted relying on ChatGPT to generate legal authorities in support of their argument. The individual returned to court with four citations, one entirely fabricated and three with genuine case names but with fictitious quotations attributed to them.

While the submissions appeared legitimate, closer inspection by opposing counsel revealed the paragraphs did not exist. The judge accepted the litigant had been inadvertently misled by the AI tool and imposed no penalty. This shows both the risks of unverified AI-generated content entering proceedings and the challenges for unrepresented parties in navigating court processes.

The message from Ayinde and Al-Haroun is simple but profound: using GenAI does not reduce a lawyer’s professional duty, it heightens it. For junior lawyers, that duty will arrive on day one. The challenge for legal educators is to prepare students for this reality, embedding AI verification, transparency, and ethical reasoning into the curriculum.

The Conversation

Craig Smith does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. ‘Hallucinated’ cases are affecting lawyers’ careers – they need to be trained to use AI – https://theconversation.com/hallucinated-cases-are-affecting-lawyers-careers-they-need-to-be-trained-to-use-ai-265898

In drug trials, lack of oversight of research ethics boards could put Canadian patients at risk

Source: The Conversation – Canada – By Joel Lexchin, Associate professor, Department of Family and Community Medicine, University of Toronto; York University, Canada; University of Sydney

Research ethics boards are supposed to ensure that, among other things, patients understand the nature of the research and have given informed consent. (Unsplash/Nappy)

New drug approvals by Health Canada are based on the results of clinical trials. But before clinical trials can go ahead, they need to be approved by ethics committees known as Research Ethics Boards (REBs).

Virtually all hospitals where research is conducted have REBs, as do universities and other institutions. The REBs are supposed to ensure that patients understand the nature of the research and have given informed consent, that the trials are conducted in an ethical way that minimizes any harm to them and that the investigators are competent to do the research.

Given the crucial role they play, it’s important that REBs are not influenced by factors like financial motives, conflicts of interest or the goals of drug companies. Without oversight, these factors may encroach on the decisions made by REBs in Canada.

REBs in Canada

All that Canada’s Food and Drug Regulations say about REBs is that they need to approve clinical trials.

The Tri-Council Policy Statement does lay out who needs to be on a REB and gives some details about how REBs should operate, but these regulations only apply to research that’s funded by the tri-council, comprising the Canadian Institutes of Health Research (CIHR), the National Science and Engineering Research Council and the Social Sciences and Humanities Research Council.

Canada has no accreditation or inspection system for REBs and no oversight mechanism for the way that they undertake their reviews. An article in the Journal of Law, Medicine and Ethics noted that: “Aside from identifying information on the REB and its chair, no further information about the REB or its review is required” by Health Canada.

There used to be a National Council on Ethics in Human Research. The organization largely provided education, but there was the possibility that it could have been transformed into a national accrediting and oversight body.

But in 2010, its funding from Health Canada and CIHR was pulled. In its place, the Canadian General Standards Board published the voluntary Canadian Standard for Research Ethics Oversight of Biomedical Clinical Trials, but this guidance was withdrawn in 2018 due to limited use and support for its revision.

The still existing Canadian Association of Research Ethics Boards operates as a forum for discussion and has no regulatory powers.

For-profit REBs

The absence of any standards and regulations is becoming increasingly problematic. At least 70 per cent of clinical trials are now being done in the community, outside of health-care institutions and their in-house REBs. In addition, drug companies, which sponsor the vast majority of clinical trials, want a quick turnaround in approval by REBs.

A report by the Law Commission of Canada described academic-based REBs as:

“overburdened and … stretched to the breaking point … As the work becomes increasingly complicated with globalization, technology and commercialization, REBs are struggling to find committee chairs or even members.”

In response to the movement of trials into the community, where they aren’t covered by institutional REBs, it’s reasonable to assume that the number of for-profit REBs has grown, although there are no definitive estimates of their number. Drug companies pay these for-profit REBs a fee to review their trials.

Trudo Lemmens, professor and Scholl Chair in Health Law and Policy at the University of Toronto Faculty of Law, has argued that the credibility and integrity of the research review is compromised by the perception of a possible conflict-of-interest (COI) when commercial REBs approve a clinical trial.

If an REB turns down too many trials or demands costly changes to the research protocol, companies may be reluctant to submit future research proposals to it. Although there has been no research to date to verify or refute this concern, Lemmens argues that the honesty of individual REB members is not enough to remedy the situation.

In early October 2025, the New York Times published an investigation of for-profit institutional review boards (IRBs), the American equivalent of REBs, in the United States. The story focused on two companies that dominate the business: WCG and Advarra, the latter controlled by private equity.

According to the Times, both companies “have close corporate relationships with drugmakers. And both have become part of multipronged enterprises selling pharmaceutical companies a wide range of drug-testing services — blurring the line between the reviewer and the reviewed, introducing potential conflicts of interest that threaten the review boards’ mission.”

Several former Advarra employees told the Times that the company had imposed daily quotas on reviewing informed-consent forms for trial volunteers. Alana Levy, a former consent form development editor, said that falling short meant “you get a warning,” but that reviewing more than a certain number could earn a bonus. Advarra disputed those allegations and said it “maintains strong safeguards and internal policies to ensure the independence of its Institutional Review Board.”

Advarra also operates in Canada and “supports more Canadian sites than any other partner, offering the broadest provincial coverage and experience in the industry.” On its website, it advertises the speed of its reviews, with a turnaround time of four to five days for reviewing protocols and consent forms for trials taking place at multiple sites.

Oversight needed

When good ethical oversight is lacking, the patients in clinical trials may be put at risk. The results from those trials may be compromised, meaning that the information that doctors rely on to prescribe the drugs is unreliable, and their patients are getting suboptimal care.

Health Canada needs to step up and establish regulations for how REBs operate and have an inspection system to ensure that its regulations are being followed.

Alberta is the only jurisdiction in Canada without for-profit REBs. Among its other responsibilities, the Health Research Ethics Board of Alberta oversees ethics approval of research involving human subjects that is done in the community. Other provinces should follow the Alberta model.

The Conversation

Between 2022-2025, Joel Lexchin received payments for writing a brief for a legal firm on the role of promotion in generating prescriptions for opioids, for being on a panel about pharmacare and for co-writing an article for a peer-reviewed medical journal on semaglutide. He is a member of the Boards of Canadian Doctors for Medicare and the Canadian Health Coalition. He receives royalties from University of Toronto Press and James Lorimer & Co. Ltd. for books he has written. He has received funding from the Canadian Institutes of Health Research in the past.

ref. In drug trials, lack of oversight of research ethics boards could put Canadian patients at risk – https://theconversation.com/in-drug-trials-lack-of-oversight-of-research-ethics-boards-could-put-canadian-patients-at-risk-262105

Why Canadians need two dramatic educational shifts to honour reconciliation

Source: The Conversation – Canada – By Jennifer Wallner, Associate Professor, School of Political Studies, L’Université d’Ottawa/University of Ottawa

When speaking about Canada’s Truth and Reconciliation Commission, Mazina Giizhik — also known as Justice Murray Sinclair — often declared: “Education has gotten us into this mess, and education will get us out.”

Sinclair captured an essence of formal schooling that is frequently ignored.

Contemporary discourse often draws on older philosophic traditions to discuss education as a force for democracy, liberation and self-expression. But formal schooling is also a structuring force — an instrument of the state.

Through education, states legitimize their authority while helping to cultivate the kinds of citizens the state wishes to govern — in other words, education is a tool of “statecraft.”

Education as statecraft highlights an ambiguity of schooling. Among its objectives, public schooling is a standardization tool, producing great benefits for some with potentially devastating consequences for others. Such ambiguity is strikingly visible in states forged through processes of contested settler-colonialism, like Canada.

Who do we as inhabitants of Turtle Island, or as Canadians, want to be in the era of reconciliation? If we are committed to truth and reconciliation, we must recognize education’s ambivalent role.

This should have implications for reforming public school curricula and teacher competencies, as well as acting in partnership with Indigenous governments to support Indigenous governing autonomy and capacity in education and other matters.

We address these questions as scholars whose combined expertise is partly concerned with education policy. Jennifer Wallner, the lead author of this story, is a settler scholar born in Canada of European immigrants, and Gavin Furrey, co-author, is a settler scholar born in the United States of primarily European descent, with Lakota ancestry and Rosebud Sioux tribal citizenship.

Cultivating citizens

The relationship between schooling and the cultivation of citizens is well-documented. According to data from more than 100 countries, governments began to oversee and direct primary schooling on average 65 years before democratizing.

Other analyses suggest schooling in non-democratic regimes is used to quash rebellion and preserve the status quo. Even when schooling was introduced in democratic regimes, education was perceived as a means to instill a certain order and help the state shape its desired citizens.

Public schooling played a pivotal role in legitimizing the nascent authority of the future Canadian state.

Emerging from competing British and French colonial projects, settler authorities used education to encourage migration, enforce preferred linguistic, political and economic order and safeguard their peoples and regimes from Indigenous Peoples.

Broader curricular shifts needed

Prior to Confederation in 1867, colonial legislatures introduced measures to establish formal schooling. Consequently, when leaders negotiated the division of powers, provinces claimed jurisdiction over the field — with one key exception.

Provinces retained responsibility for the schooling of settlers while the federal government claimed authority over Indigenous Peoples, who were seen as a threat to the desired order of the Canadian state centred on liberalism, representative democracy, private property and capitalism.

Residential and day schools overseen by the federal government were the key instrument used to “protect” settlers, secure land and assimilate First Nations, Métis and Inuit peoples.

The language of instruction was predominantly in English, reflecting the preferred Anglo-dominant order being forged throughout most of the country. Provincial curricula long presented racist images of Indigenous Peoples.

Education researcher Dwayne Donald, a descendant of the Amiskwaciwiyiniwak (Beaver Hills people) and the Papaschase Cree, has shown how in Canadian myth, the separation and exclusion of Indigenous Peoples from everyone else has been enforced through the colonial image of the fort.




Read more:
Decolonizing history and social studies curricula has a long way to go in Canada


Deeper and broader curricular shifts are needed, since some provinces’ curricula still do not recognize Indigenous legal traditions or governance practices, and Indigenous Peoples are often depicted as largely without agency.

Investments in schooling infrastructure

Indigenous communities are reckoning with the devastating effects of residential schools and other forms of colonial schooling. Despite the harm caused by colonial policies, Indigenous Peoples note that they continue to survive and thrive through their knowledge, practices, resistance, resilience and activism.




Read more:
Acting with one mind: Gwich’in lessons for truth and reconciliation


But inadequate funding is a barrier. Between 1996 (the year the last residential school closed) and 2016, the First Nations population grew by 29 per cent. In this same window, a federal cap on the annual growth rate of core program funding to First Nations for elementary and secondary education was in effect.

This led to an annual decline of up to four per cent in funding per student for First Nations throughout this period, which had a notable impact on schooling infrastructure.

Studies confirm that the majority of First Nations students must leave their communities for secondary school.

Teachers in First Nations schools are paid less than their provincial counterparts and culturally sensitive post-secondary educational programs and professional development tailored to First Nations are wanting.

Meaningful social, economic participation

The Final Report of the Truth and Reconciliation Commission emphasized education in two ways: to ensure Indigenous and non-Indigenous students alike are provided the tools for meaningful social and economic participation, and to ensure all Canadians understand the history and legacy of residential schooling.




Read more:
How Indigenous-led health education in remote communities can make reconciliation real


It highlighted the importance of integrating Indigenous content and perspectives within mainstream curriculum. The Winnipeg School Division foreshadowed such transformative work when it adopted its Indigenous Education Policy in 1996. In Saskatchewan, a 2018 policy framework supports the infusion of Indigenous content, perspectives and ways of knowing to the benefit of all learners.

If treaties are to be understood as a framework for relationships of mutual aid and non-domination, schools are essential for preparing settler society to engage in such a relationship.

Self-determining Peoples

Indigenous scholars also emphasize the importance of educating Indigenous youth to prepare them to be members of a self-determining people.

Mi’kmaw professor of education emeritus Marie Battiste, for example, argues that Indigenous peoples ought to focus on building their own institutions and on cultivating knowledge systems in Indigenous languages rather than simply Indigenizing shared school spaces.

A growing number of modern treaties and negotiated agreements have, in some cases, improved financing, expanded educational options or enabled local management of education.

But some researchers highlight pernicious problems related to large-scale agreements: for example, while the James Bay and Northern Québec Agreement includes a program to recognize and compensate roles in hunting and trapping, it has been criticized for not properly considering women’s realities.

Indigenous signatories to agreements have developed their autonomy steadily as they navigate new questions of how to best invest education funds and what services to prioritize for their students.

Acting in partnership, mutual respect

Problems with collaboration or communication also exist, for example, around secondary diploma accreditation.

Even when funding is available to build schools, limited space can be an issue, as communities also need new homes and other infrastructure for growing populations.

Limited housing for teachers in remote locations contributes to high vacancy rates and impacts what educational services and programs can be offered. Capacities for Indigenous governance, including education governance, are impacted by evolving political, social, economic, geographic, health and environmental factors.

If schools are to fashion a new order of mutual respect between multiple authorities, then settler schools must continue transforming to meet the challenge.

Additionally, federal and provincial authorities must act in partnership with Indigenous governments to support Indigenous governing autonomy and capacity.

The Conversation

Jennifer Wallner received funding from the Social Sciences and Humanities Research Council and is the Jean-Luc Pepin Research Chair in Canadian Politics at the University of Ottawa.

Gavin Furrey works for the Cree School Board as a Project Development Officer.

ref. Why Canadians need two dramatic educational shifts to honour reconciliation – https://theconversation.com/why-canadians-need-two-dramatic-educational-shifts-to-honour-reconciliation-265164

How global cross-cultural folklore and legends shape the monsters we fear

Source: The Conversation – Canada – By Amala Poli, PhD in English (Medical/ Health Humanities), Western University

It’s that time of year again when you grab a tub of popcorn and settle in for a cozy evening with a familiar slasher film — a haunted house, a masked villain and the perfect jump scare that you probably already know is coming but still eagerly anticipate.

While Halloween is a celebration of all things spooky, horror films in particular have seen a significant commercial boom in recent years. Critics dubbed 2023 “the year horror went highbrow” as films like Talk to Me, Beau Is Afraid and Pearl, released in 2022-23, blurred the line between arthouse cinema and mainstream fright.

In 2025, horror remains the highest-grossing genre, earning more than US$54 million at the box office, with Sinners, Final Destination: Bloodlines and 28 Years Later topping the charts.

Associated in the past with pulpy, low-brow or unserious undertones, horror is increasingly being recognized in academic research as a genre that challenges normative beliefs, disturbs the status quo and exposes collective anxieties.

Horror has always drawn on folklore — South Asian stories of churels (witches), African figures like the Sasabonsam and Indigenous stories of the Wendigo. But Halloween, when horror is most visible in the West, often overlooks the global roots of the genre.

This year, let’s look beyond the western canon and consider the global traditions that terrify us and inspire Hollywood.

As we indulge in our annual horror binge, it’s worth asking: whose fears are we watching on screen and whose stories have we overlooked?

The real nightmare behind Freddy Krueger

What if our dreams were real and the monsters we saw in our sleep could haunt our waking lives?

This idea forms the basis for the enduring popularity of the classic slasher series A Nightmare on Elm Street (1984). Freddy Krueger embodies the fear of a nightmare come alive. However, few know that director Wes Craven was inspired by the dab tsog, a malevolent night spirit in the folklore of the Hmong Laotian community.

In the late 1970s and early 1980s, more than 100 men from the Hmong Laotian immigrant community in the United States died in their sleep without clear explanation, as though they had been scared to death. They were diagnosed as victims of Sudden Unexpected Nocturnal Death Syndrome (SUNDS), a mysterious ailment.

The Hmong believed that the dab tsog attacked vulnerable people, producing sensations of breathlessness and chest-crushing pressure.

Hearing about SUNDS and knowing that there was no conclusive scientific evidence about the cause of deaths inspired Craven’s construction of Freddy Krueger. Some Hmong believed that alienation in their new homes and a lack of connection to ancestral spirits caused the dab tsog attacks.

Craven, fascinated by news articles on the subject, was inspired to capture the fear of going to sleep through Krueger’s sleep paralysis attacks.

Greed and the Wendigo warning

Often in horror, the monster outside is a reflection of the one within.

This idea drives Antlers, in which the Wendigo — a legendary spirit from Indigenous North American folklore — manifests as a terrifying creature that preys on human greed and violence.

In Algonquian traditions, the Wendigo is more than a flesh-eating monster: it is a moral warning against unchecked consumption, selfishness and the violation of natural and social order.

The Wendigo legend challenges isolation and greed, emphasizing the importance of community and sustainability.

Director Scott Cooper and writer Nick Antosca drew inspiration from this legend to create a horror story grounded in real-world anxieties such as trauma, poverty and the consequences of human exploitation of the environment.

By translating the Wendigo myth into a cinematic monster, Antlers draws on its ethical warnings as a commentary on the drug crisis in North America.

When the past refuses to rest

What if the ghosts that follow us across borders aren’t just supernatural but memories of generational trauma that we cannot escape?

His House, a horror film set in the aftermath of the South Sudanese civil war, draws on Dinka folklore to explore the haunting of refugee trauma.

The story traces the heartbreak and isolation of Rial and Bol, a couple who grapple with the British immigration system and the dissonance of adjusting to a new country completely alien to their own. Rial tells Bol the story of the apeth, a night witch who torments the Dinka people, which they later encounter in their new home.

In Dinka belief, the apeth is a malevolent force that feeds on guilt and disrupts domestic harmony — a spirit that “eats” the good fortune of its victims. Director Remi Weekes adapted this myth to mirror the couple’s struggle with guilt for escaping and their displacement in a cold and xenophobic British system.

Haunted by migration and heritage

Is migration always haunted by ghosts from another place?

Tracing the complexities of assimilation in the diaspora identity, It Lives Inside follows Samidha, or Sam, an Indian-American teenager struggling to fit into white suburbia while distancing herself from her heritage.

When her estranged friend Tamira warns her of a demonic presence sealed inside a glass jar, Sam dismisses her until the entity, known in Hindu folklore as a Pishacha, is unleashed. In Indian mythology, the Pishacha is a flesh-eating demon that feeds on negative emotions and possesses those overcome by shame, anger or grief, devouring the soul from within.

Director Bishal Dutta reimagines this spirit as a metaphor for the cultural and psychological tension of growing up between worlds. The monster that haunts Sam is as much her suppressed East Indian identity as it is a supernatural being — a corporeal embodiment of internalized fear and generational conflict.

Through the language of horror, It Lives Inside transforms the diaspora experience into a chilling allegory of belonging and denial.

What scares us connects us

As horror becomes a genre to reckon with, increasingly immersing us in social critique rather than mere spectacle, it also reminds us that fear is a universal emotion and that the stories we tell about it are profoundly cultural.

From the Hmong’s dab tsog to the Dinka’s apeth, from the Wendigo to the Pishacha, horror offers us ways to rethink the crises of our time — greed, trauma, grief and displacement — through ominous figures familiar to specific communities.

A truly global Halloween looks beyond the usual monsters and toward the myths that continue to unsettle imaginations across the world.

As horror’s commercial success suggests a growing search for catharsis, cultural complexity and emotional depth, it reveals that the genre’s real power lies not only in its ability to frighten us but in how it connects us across borders through our shared fascination with what we fear.

The Conversation

Amala Poli does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. How global cross-cultural folklore and legends shape the monsters we fear – https://theconversation.com/how-global-cross-cultural-folklore-and-legends-shape-the-monsters-we-fear-266549

Why are 4.7 million Floridians insured through ACA marketplace plans, and what happens if they lose their subsidies?

Source: The Conversation – USA (3) – By Robert Applebaum, Senior Research Scholar, Scripps Gerontology Center, Miami University

4.7 million Floridians use health insurance plans obtained from the ACA marketplace. Joe Raedle/Getty Images News

Significant Figures is a series from The Conversation in which scholars explain an important number in the news.




When the Affordable Care Act, also known as the ACA or Obamacare, was enacted in 2010, lawmakers hoped it would help reduce the number of uninsured Americans. That year, an estimated 48.2 million people – about 18% of the U.S. population under age 65 – did not have health insurance.

By 2023, the number of uninsured Americans had dropped by nearly 50%, to 25.3 million people under 65, or 9.5% of the total population.

I’m a gerontologist who studies the U.S. health care system. ACA health care subsidies are at the center of a now monthlong U.S. government shutdown that could become the longest in U.S. history. So I looked at the available data about ACA marketplace plan usage in Florida to understand how the debates in Washington could affect access to health care in the Sunshine State going forward.

How the ACA expanded access to health insurance

The ACA implemented a three-pronged strategy to expand access to affordable health insurance.

One was the use of fines. Until 2018, the government fined anyone who chose not to get health insurance. The government also fined businesses with 50 or more full-time employees that didn’t offer their employees affordable health care plans. The idea was to give healthy people an incentive to get insurance, lowering costs for everyone.

Ultimately, the fines had little impact on the number of insured Americans, with one notable exception: the ACA’s requirement that health plans allow young adults ages 19 to 25 to remain on their parents’ health insurance. For this group, the uninsured rate dropped from 31.5% in 2010 to 13.1% in 2023.

Second, the ACA allowed for Medicaid to be expanded to low-income Americans who were employed but working in low-wage jobs. The expansion of Medicaid to low-income workers at 138% of the federal poverty level was originally required nationwide. But a 2012 Supreme Court ruling allowed states to choose whether they would participate in Medicaid expansion.

As of 2025, 16 million Americans are covered by the expansion. However, 10 states, including Florida, have opted out.

The third way the ACA changed the health insurance system is that it established health insurance subsidies that the government can provide. Those subsidies are for low- and moderate-income Americans who do not receive health insurance through their employers and aren’t eligible for Medicaid, Medicare or any other government-operated health insurance program.

This established a private health insurance marketplace that includes federal subsidies to make insurance more affordable. As of October 2025, more than 24 million Americans get their health insurance through the subsidized marketplace.

Florida and the ACA marketplace

The number of people insured under the ACA in each state varies. But the state with the largest number of residents on marketplace insurance plans is Florida. About 4.7 million Florida residents are covered through these plans, representing 27% of the state’s under-65 population, compared to the national average of 8.8%. Of those on marketplace plans, 98% receive a subsidy at some level.

There are several reasons why this rate is so much higher in Florida than elsewhere.

First, only 40% of Sunshine State residents are covered by an employer-based health insurance plan, compared to 49% for the nation as a whole. This is the lowest rate in the country. A contributing factor is that Florida ranks fifth in the proportion of its workforce that is self-employed, with 1.3 million Floridians in this category.

The state’s lower rate could also be related to the high number of seasonal and part-time workers in the tourism industry.

Another reason is that the state has relatively few people enrolled in Medicaid, the federal program that provides mainly low-income people with health insurance coverage. Among Floridians ages 44 to 64, only 11% are enrolled in Medicaid, compared to 17% for the nation overall.

Florida hasn’t expanded Medicaid, and it’s also more restrictive than most states about who can enroll in the program.

States set their own Medicaid eligibility criteria, and they determine what services Medicaid will cover and at what cost. Florida has the second-lowest Medicaid expenditures per enrollee in the nation, and it ranks last on Medicaid expenditures for adults under 65.

An uncertain path ahead

Because Florida residents rely heavily on marketplace plans, ending ACA subsidies would have a big effect on Floridians.

Unless Congress reverses course and preserves the insurance subsidies that have not been renewed, the average marketplace plan premium is predicted to increase by more than 100%, from $74 to $159 per month. An American earning $28,000 annually – $13.50 per hour – would see a nearly fivefold increase, from $27 to $130 per month. And a worker making $35,000 per year would see their premium increase from $86 to $217 per month.
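These percentage claims can be sanity-checked with a few lines of arithmetic. Here is a minimal sketch using only the dollar figures quoted in this article (illustrative only, not an official projection):

```python
# Sanity-checking the premium increases cited above, using the
# article's dollar figures for monthly marketplace premiums.

def pct_increase(before: float, after: float) -> float:
    """Percentage increase going from `before` to `after`."""
    return (after - before) / before * 100

average = pct_increase(74, 159)     # average marketplace premium
ratio_low = 130 / 27                # $28,000 earner: new premium / old premium
mid_income = pct_increase(86, 217)  # $35,000 earner

print(f"Average premium: +{average:.0f}%")      # ~+115%, i.e. "more than 100%"
print(f"$28,000 earner pays {ratio_low:.1f}x")  # ~4.8x, roughly fivefold
print(f"$35,000 earner: +{mid_income:.0f}%")    # ~+152%
```

The arithmetic bears out the article's characterizations: the average premium roughly doubles, and the lowest-income example rises almost fivefold.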

At 13.4%, Florida already has the third-highest proportion of uninsured residents under 65. It is safe to assume that if the federal marketplace subsidies disappear and health insurance premiums become unaffordable for more people, the result will be more uninsured Floridians. And if healthy, younger people can’t afford insurance, premiums are likely to go up for everyone else with insurance.

The path to resolve the ongoing debate is uncertain. In my view, however, it is clear that states such as Florida, Texas and Georgia, which haven’t expanded Medicaid and rely heavily on the marketplace plans, will be dramatically affected by cuts to federal subsidies.

The Conversation

Robert Applebaum does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Why are 4.7 million Floridians insured through ACA marketplace plans, and what happens if they lose their subsidies? – https://theconversation.com/why-are-4-7-million-floridians-insured-through-aca-marketplace-plans-and-what-happens-if-they-lose-their-subsidies-268269