Despite naysayers and rising costs, data shows that college still pays off for students – and society overall

Source: The Conversation – USA (2) – By Stanley S. Litow, Adjunct Professor of International and Public Affairs, Columbia University

College graduates earn more immediately after graduation and later on in their careers than high school graduates. DBenitostock/Moment

No industry has perhaps felt the negative effect of a radical shift in federal policy under the second Trump administration more than higher education.

Many American colleges and universities, especially public institutions, have experienced swift and extensive federal cuts to grants, research and other programs in 2025.

Meanwhile, new restrictive immigration policies have prevented many international students from enrolling in public and private universities. Universities and colleges are also facing various other challenges – like the threat to academic freedom.

These shifts coincide with the broader, increasingly amplified argument that getting a college degree does not matter after all. A September 2025 Gallup poll shows that 35% of people rated college as “very important,” another 40% rated it “fairly important,” and 24% said it is “not too important.”

By comparison, 75% of surveyed people in 2010 said that college was “very important,” while 21% said it was “fairly important” and 4% said it was “not too important.”

Still, as a scholar of education, economic development and social issues, I know that there is ample and growing evidence that a college degree is still very much worth it. Graduating from college is directly connected to higher entry-level wages and long-term career success.

College diplomas are seen on display as part of an art exhibition in Grand Central Terminal in New York in 2022.
Timothy A. Clary/AFP via Getty Images

A growing gap

Some people argue that a college degree does not matter, since there might not be enough jobs for college graduates and other workers, given the growth of artificial intelligence, for example. Some clear evidence shows otherwise.

An estimated 18.4 million workers with a college degree in the U.S. will retire from now through 2032, according to Georgetown University’s Center on Education and the Workforce. This is far greater than the 13.8 million workers who will enter the workforce with college degrees during this same time frame.

Meanwhile, an additional 700,000 new jobs that require college degrees – spanning from environmental positions to advanced manufacturing – will be created from now through 2032.

The gap between those expected to leave and enter the workforce with college degrees creates a serious problem. One major question is whether there will be enough people to fill the available jobs that require a college degree.

In 2023, foreign-born people made up 16% of registered nurses in the U.S., though that percentage is higher in certain states, like California. But restrictions on immigration could limit the number of potential nurses able to fill open positions.

Nursing and teaching are two fields expected to grow over the next few decades, and they will require more workers due to retirements.

Other fields, like accounting, engineering, law and many others, are also expected to have more college-educated workers retire than there are new workers to fill their positions.

Worth the cost

The average annual salary of a college graduate from the class of 2023 was US$64,291 in 2024, according to the National Association of Colleges and Employers.

That overall average salary, measured one year after graduation, marked an increase from the average $60,028 that the class of 2022 earned in 2023, equivalent to $63,850 today.

While no directly comparable data is available, full-time, year-round workers ages 25 to 34 with only a high school diploma earned $41,800 in median annual earnings in 2022, or $46,100 today.

Overall lifetime earnings for those with college degrees are about $1.2 million more than what people with a high school diploma make, according to the recent Georgetown findings.

People who earn more generally have more money to support their families and contribute to their immediate communities. Their higher taxes also contribute to the U.S. economy, supporting needed services like education, public safety and health care.

People with college degrees are also more likely than those who are not college graduates to vote, volunteer and make charitable donations to help others in need.

College matters for individuals, but it clearly also helps improve the economy.

With 64 public colleges across the state, the State University of New York system is the largest post-secondary network of higher education schools in the country. For every $1 the state of New York invests in SUNY, the SUNY system returns $8.70 to the state in terms of economic growth, according to 2024 findings by the Rockefeller Institute, an independent public policy research organization affiliated with SUNY. And that is only one state.

The Stony Brook University campus, part of the State University of New York system, is shown in May 2022.
Howard Schnapp/Newsday RM via Getty Images

A new way forward

It isn’t likely that the expected number of college-educated people who will soon retire will suddenly decrease, or that the anticipated number of people entering the workforce will unexpectedly increase.

There are practical reasons why some people do not want to go to college, or cannot attend. Indeed, the percentage of young people enrolled as college undergraduates fell almost 15% from 2010 through 2022.

For one, tuition and fees at private colleges have increased about 32% since 2006, after adjusting for inflation. And in-state tuition and fees at public universities have also grown about 29% since 2006.

Total federal student loan debt for college has also tripled since 2007, standing at about $1.84 trillion in 2024.

I believe that in order to ensure enough college-educated people can fill the anticipated work openings in the future, universities and the government should embrace needed changes to increase both enrollment and completion rates.

Artificial intelligence will transform work worldwide, for example, and that shift should be incorporated into higher education curricula and degrees. Soft skills – like problem-solving, collaboration, presentation and writing skills – will become more important and should be prioritized in the learning process.

I believe that universities should also prioritize experiential education, including paid internships that offer students academic credit. This can help students gain experience that is both accredited and connected to direct career pathways.

Universities and high schools could also expand their offerings of microcredentials – short, focused learning programs that teach practical skills in a specific area – so students can connect their education with clear career pathways.

These reforms aren’t easy. They require a commitment to change, and all of this work will require deep partnerships with the government. While that might be a heavy lift currently at the federal level, it is both possible and achievable to make advances on these and other changes at the state level.

American universities and colleges have always been key to preparing the workforce for economic opportunity. At the end of World War II, for example, Columbia University and IBM worked together to help create the academic discipline now called computer science.

This action did more than help one university or one employer. It fueled change across higher education and across private companies and the government, leading to massive economic growth.

Universities have made countless other contributions to strengthen and expand the economy. Considering solutions to some of the challenges that stop students from going to college could help ensure that more students see the value in a college education – and a tangible way for them to connect it to a future career.

The Conversation

Stanley S. Litow does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Despite naysayers and rising costs, data shows that college still pays off for students – and society overall – https://theconversation.com/despite-naysayers-and-rising-costs-data-shows-that-college-still-pays-off-for-students-and-society-overall-267612

AI is changing who gets hired – what skills will keep you employed?

Source: The Conversation – USA (2) – By Murugan Anandarajan, Professor of Decision Sciences and Management Information Systems, Drexel University

Success in the age of AI may depend less on technical skills and more on human judgment, adaptability and trust. Malte Mueller/Getty Images

The consulting firm Accenture recently laid off 11,000 employees while expanding its efforts to train workers to use artificial intelligence. It’s a sharp reminder that the same technology driving efficiency is also redefining what it takes to keep a job.

And Accenture isn’t alone. IBM has already replaced hundreds of roles with AI systems, while creating new jobs in sales and marketing. Amazon cut staff even as it expands teams that build and manage AI tools. Across industries, from banks to hospitals and creative companies, workers and managers alike are trying to understand which roles will disappear, which will evolve and which new ones will emerge.

I research and teach at Drexel University’s LeBow College of Business, studying how technology changes work and decision-making. My students often ask how they can stay employable in the age of AI. Executives ask me how to build trust in technology that seems to move faster than people can adapt to it. In the end, both groups are really asking the same thing: Which skills matter most in an economy where machines can learn?

To answer this, I analyzed data from two surveys my colleagues and I conducted over this summer. For the first, the Data Integrity & AI Readiness Survey, we asked 550 companies across the country how they use and invest in AI. For the second, the College Hiring Outlook Survey, we looked at how 470 employers viewed entry-level hiring, workforce development and AI skills in candidates. These studies show both sides of the equation: those building AI and those learning to work with it.

AI is everywhere, but are people ready?

More than half of organizations told us that AI now drives daily decision-making, yet only 38% believe their employees are fully prepared to use it. This gap is reshaping today’s job market. AI isn’t just replacing workers; it’s revealing who’s ready to work alongside it.

Our data also shows a contradiction. While many companies now depend on AI internally, only 27% of recruiters say they’re comfortable with applicants using AI tools for tasks such as writing resumes or researching salary ranges.

In other words, the same tools companies trust for business decisions still raise doubts when job seekers use them for career advancement. Until that view changes, even skilled workers will keep getting mixed messages about what “responsible AI use” really means.

In the Data Integrity & AI Readiness Survey, this readiness gap showed up most clearly in customer-facing and operational jobs such as marketing and sales. These are the same areas where automation is advancing quickly, and layoffs tend to occur when technology evolves faster than people can adapt.

At the same time, we found that many employers haven’t updated their degree or credential requirements. They’re still hiring for yesterday’s resumes while tomorrow’s work demands fluency in AI. The problem isn’t that people are being replaced by AI; it’s that technology is evolving faster than most workers can adapt.

Fluency and trust: The real foundations of adaptability

Our research suggests that the skills most closely linked with adaptability share one theme: what I call “human-AI fluency.” This means being able to work with smart systems, question their results and keep learning as things change.

Across companies, the biggest challenges lie in expanding AI, ensuring compliance with ethical and regulatory standards and connecting AI to real business goals. These hurdles aren’t about coding; they’re about good judgment.

In my classes, I emphasize that the future will favor people who can turn machine output into useful human insight. I call this digital bilingualism: the ability to fluently navigate both human judgment and machine logic.

What management experts call “reskilling” – or learning new skills to adapt to a new role or major changes in an old one – works best when people feel safe to learn. In our Data Integrity & AI Readiness Survey, organizations with strong governance and high trust were nearly twice as likely to report gains in performance and innovation. The data suggests that when people trust their leaders and systems, they’re more willing to experiment and learn from mistakes. In that way, trust turns technology from something to fear into something to learn from, giving employees the confidence to adapt.

According to the College Hiring Outlook Survey, about 86% of employers now offer internal training or online boot camps, yet only 36% say AI-related skills are important for entry-level roles. Most training still focuses on traditional skills rather than those needed for emerging AI jobs.

The most successful companies make learning part of the job itself. They build opportunities to learn into real projects and encourage employees to experiment. I often remind leaders that the goal isn’t just to train people to use AI but to help them think alongside it. This is how trust becomes the foundation for growth, and how reskilling helps retain employees.

The new rules of hiring

In my view, the companies leading in AI aren’t just cutting jobs; they’re redefining them. To succeed, I believe companies will need to hire people who can connect technology with good judgment, question what AI produces, explain it clearly and turn it into business value.

In companies that are putting AI to work most effectively, hiring isn’t just about resumes anymore. What matters is how people apply traits like curiosity and judgment to intelligent tools. I believe these trends are leading to new hybrid roles such as AI translators, who help decision-makers understand what AI insights mean and how to act on them, and digital coaches, who teach teams to work alongside intelligent systems. Each of these roles connects human judgment with machine intelligence, showing how future jobs will blend technical skills with human insight.

That blend of judgment and adaptability is the new competitive advantage. The future won’t just reward the most technical workers, but those who can turn intelligence – human or artificial – into real-world value.

The Conversation

Murugan Anandarajan does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. AI is changing who gets hired – what skills will keep you employed? – https://theconversation.com/ai-is-changing-who-gets-hired-what-skills-will-keep-you-employed-267376

Trump’s ‘golden age’ economic message undercut by his desire for much lower interest rates – which typically signal a weak jobs market

Source: The Conversation – USA (2) – By Joshua Stillwagon, Associate Professor of Economics, Babson College

President Donald Trump has said he believes the U.S. economy has entered a ‘golden age’ on his watch. AP Photo/Mark Schiefelbein

President Donald Trump seems to want to have it both ways on the U.S. economy.

On the one hand, he recently said the economy is in its “golden age” and referred to the U.S. as the “hottest country anywhere in the world.”

Yet at the same time, he has outright demanded that the Federal Reserve sharply slash interest rates to fuel economic activity. And his recently handpicked governor, Stephen Miran, has led the charge in pushing for a bigger cut than preferred by his new colleagues at the Fed.

When an economy is strong, central banks typically don’t cut interest rates and may even raise them to avoid spurring inflation. And so to support his argument for large cuts, Miran has played up “downside risks” to the economy and a weakening labor market, contrasting with Trump’s talk of a “golden age.”

Trump and Miran also seem to be ignoring the problem of inflation, which the president has said “has been defeated” and Miran considers close enough to the Fed’s target of 2%. Yet, inflation remains high and has been picking back up in recent months – one of the core reasons the Fed has taken a gradual approach to lowering interest rates.

I’m a macroeconomist, which means I study big-picture factors affecting an economy, such as interest rates.

It’s well known that lower rates spur faster growth, and of course all presidents want a stronger economy on their watch. But the Fed’s job when it sets interest rates is to deal with whatever reality the data shows – and make decisions accordingly.

Is the economy hot or not?

In the simplest terms, the Fed raises interest rates when the economy is “hot,” or inflation is above the Fed’s 2% target, and lowers them when there are concerns about unemployment.

At its most recent meeting, in September, the Fed lowered rates a quarter of a point, citing slowing jobs growth and increased economic uncertainty. Trump nominee Miran was the only one of the 12 members of the Fed’s policy-setting committee to instead vote for a more aggressive half-point cut.

The only credible rationale for that large an interest rate cut, in the face of still-high inflation, is the belief that the labor market is incredibly weak. According to the Fed’s preferred measure, the personal consumption expenditures index, inflation has been accelerating all summer and was 2.7% at the end of August, well above the Fed’s 2% target.

There’s no doubt jobs growth has slowed considerably in recent months, but enough to completely ignore the risk of driving inflation higher? At this point at least, the Fed doesn’t think so.

And if the economy were in fact running hot, as the president claims, the Fed would have little choice but to keep rates flat or raise them, especially given elevated inflation.

Stephen Miran, who was recently nominated to the Federal Open Market Committee, has been pushing for much larger rate cuts than his colleagues.
AP Photo/Mariam Zuhaib

Risks of following political whims

This situation gets at the heart of why central bank independence matters.

Trump’s efforts to influence the Federal Reserve have not been subtle and break with Congress’ intention to insulate the Fed from political manipulation. Besides pressing for big rate cuts, he has tried to fire a member of the Board of Governors over questionable allegations and mused about removing Fed Chair Jerome Powell.

The risks of following the wishes of a president in the face of what the data shows were starkly demonstrated in 2021, when Turkey’s president, Recep Tayyip Erdogan, fired the head of the country’s central bank. The central banker was pushing rates higher to tame inflation, which was at about 20%, but Erdogan demanded they be lowered. In response, Turkey’s lira plunged to record lows and inflation soared to over 70% in 2022.

Something similar could happen in the U.S. if Trump continues down the same path of meddling with the Fed. As a sign of how much Wall Street worries about this risk, a recent study estimated that if Trump followed through on his threat to fire Powell, the stock market could lose an estimated US$1 trillion as a result.

That’s because the Fed’s credibility rests on its ability to make decisions driven by economic evidence, not political expedience. That independence means policymakers must weigh data on inflation, jobs and growth rather than election cycles or partisan demands.

Justifying deeper rate cuts

Looking ahead to the Fed’s next meeting Oct. 28-29, policymakers face a delicate balancing act. With inflation still running above target and signs of slowing jobs growth, it needs to lower rates enough to prevent a downturn but not so low that inflation spirals out of control.

Traders are putting near-100% odds on two more quarter-point cuts this year, one on Oct. 29 and another in December. This would bring the Fed’s benchmark interest rate to a range of 3.5%-3.75% by the end of 2025, down from 4%-4.25% now.

Based on Miran’s own interest rate projections, he’s likely to again push for a larger cut of a half-point or more at both meetings, as he believes the Fed’s benchmark rate should be below 3% by the end of the year.

To me, as an economist, the only way a Fed acting independently could reasonably justify such a significant cut in rates in the next few months is if the unemployment rate begins rising steadily, with the economy clearly at risk of slipping into a recession.

The Conversation

Joshua Stillwagon was a long-time organizer and judge for an academic competition hosted at the Federal Reserve Bank of Boston, and has presented research at the Federal Reserve Bank of Boston.

ref. Trump’s ‘golden age’ economic message undercut by his desire for much lower interest rates – which typically signal a weak jobs market – https://theconversation.com/trumps-golden-age-economic-message-undercut-by-his-desire-for-much-lower-interest-rates-which-typically-signal-a-weak-jobs-market-266969

What’s the difference between ghosts and demons? Books, folklore and history reflect society’s supernatural beliefs

Source: The Conversation – USA (3) – By Penelope Geng, Associate Professor of English, Macalester College

Curious Kids is a series for children of all ages. If you have a question you’d like an expert to answer, send it to curiouskidsus@theconversation.com.


What’s the difference between ghosts and demons? – Landon W., age 15, The Colony, Texas


Belief in the spirit world is a key part of many faiths and religions. A 2023 survey of 26 countries revealed that about half the respondents believed in the existence of angels, demons, fairies and ghosts. In the United States, a 2020 poll found that about half of Americans believe ghosts and demons are real.

While the subject of demons and ghosts can inspire dread, the concepts themselves can be confusing: Is there a difference between the two?

Historically, communities have understood the supernatural according to their religious and spiritual traditions. For example, the terrifying ghosts of Pu Songling’s “Strange Tales from a Chinese Studio” operate differently than those haunting the works of William Shakespeare, even though both writers lived in the 17th century.

‘Hamlet, Horatio, Marcellus and the Ghost,’ from Shakespeare’s ‘Hamlet,’ Act 1, Scene 4.
Robert Thew/Gertrude and Thomas Jefferson Mumford Collection via The Met

Literary representations of ghosts and demons often reflect the anxieties of communities experiencing social, religious or political upheaval. As a scholar of early modern English literature, my research focuses on how everyday people in 16th- and 17th-century Europe used storytelling to navigate major social changes. This era, often called the Renaissance, was punctuated by the establishment of mass media through printing, the global spread of colonization and the emergence of modern science and medicine.

Digging into the literary archive can reveal people’s ideas about demons and ghosts – and what made them different.

Martin Luther and the Reformation

On Oct. 31, 1517, Martin Luther, an ex-law student and former monk, boldly published his Ninety-Five Theses. In it, he rejected the Catholic Church’s promise that monetary payment to the church could reduce the amount of time one’s soul spent in purgatory. What began as a local protest in Wittenberg, Germany, soon swept all the major European powers into a life and death struggle over religious reform. Towns were besieged, landscapes scorched, villages pillaged.

This period, called the Reformation, led to the establishment of new Christian denominations. Among these Protestant churches’ early teachings was the edict that purgatory did not exist and souls could not return to Earth to haunt the living. Protestant reformers insisted that after death, one’s soul was immediately judged. The virtuous flew up to God in heaven; the sinful burned in hell with the Devil.

According to Protestants, ghosts were invented by Catholic priests to scare people into obedience. For example, the English translator of Ludwig Lavater’s 1572 book “Of Ghostes and Spirites Walking by Night” insists ghosts are the “falsehood of Monkes, or illusions of devils, franticke imaginations, or some other frivolous and vaine perswasions.” Should you ever encounter an “apparition,” you must call it out for what it truly is: a devil pretending to be a ghost.

Title page of Christopher Marlowe’s ‘Doctor Faustus.’
Iohn Wright/Wikimedia Commons

Christopher Marlowe’s play “Doctor Faustus” comments on these debates. Written in the 1580s for a primarily Protestant audience, the play features a scene in which Dr. Faustus and his devil companion, Mephistopheles, trick the pope by snatching away his meal. A bewildered member of the papal court concludes “it may be some ghost … come to beg a pardon of your Holiness.” The audience knows full well, however, that these pranks are committed by the necromancer and his demon.

Ghostly haunting

In spite of Protestantism’s official stance against ghosts, belief in them persisted in the popular imagination.

Archival records show that ordinary people held fast to popular beliefs despite what their religious authorities decreed. For example, the casebook of Richard Napier, an astrological physician, reports several cases of “spirit” hauntings, including that of a young mother named Catherine Wells who had been “vexed … with a spirit” for three continuous years.

Popular plays provide additional evidence. Shakespeare’s “Hamlet” opens with a midnight visitation by the ghost of Hamlet’s father, telling his son he cannot rest in peace until his murderer is brought to justice. Ghostly victims seeking justice appear in other Shakespearean plays, including “Macbeth” and “Richard III.”

Cheap print, a form of common media, capitalized on the public’s interest in the paranormal. Part entertainment, part journalism, cheap print was read by all sorts of people. A 1662 pamphlet titled “A strange and wonderfull discovery of a horrid and cruel murther [murder]” describes Isabel Binnington’s unsettling encounter with the ghost of Robert Eliot. In her testimonial, she claims that Eliot’s ghost promised he would never hurt her. What he wanted was simply for her to hear his story: He had been murdered for his coins in the very house she occupied.

A 1730 broadside ballad called “The Suffolk Miracle” – still performed today – tells the tale of young lovers parted by an overprotective father. After the daughter is whisked away, her beloved dies of a broken heart. When his ghost later appears to her, she “joy’d to see her heart’s delight.”

Demonic possession

While reformed Protestant thinkers rejected the existence of ghosts, they enthusiastically accepted the reality of devils.

Reports of demonic possession were popular. Before his ascension to the English throne, King James VI of Scotland published a literary treatise on demonology in 1597. He argues that “assaultes of Sathan are most certainly practized” and “detestable slaves of the Devill” live among us.

‘Saint Anthony Tormented by Demons,’ 1470–74.
Martin Schongauer/Rogers Fund via The Met

The diaries of English Puritans offer further proof that beliefs about devilish encounters were common. In the 1650s, the Calvinist preacher Thomas Hall insisted that his godliness attracted the attention of Satan like a moth to a candle. From an early age, he complained, he was subjected to “Satanicall buffettings” and terrifying dreams. He believed, however, that surviving demonic temptation demonstrated his unwavering devotion to God.

Distinguishing ghosts from demons

Based on the literature, what can we conclude about how people saw ghosts and demons?

Early modern people often represented ghosts as sad and pitiable. They were depicted as the spiritual remainder of a recently deceased person, haunting their friends and kin – or, occasionally, a stranger. They retained some of their humanity and were psychically connected to a place, such as their former home, or to a person, such as their most cherished companion.

Demons, by contrast, were almost always malevolent tricksters who served the Devil. Demons lacked knowledge of what it meant to be human. Hell was the demons’ lair. Early modern texts describe them visiting the earthly plane to corrupt, possess or tempt humans to commit self-harm or violence against others.


Then and now, stories of ghosts and demons have provoked fear and wonder. Tales of the supernatural have inspired the imagination of kings, theologians, playwrights and everyday people.

Approaching the topic of the otherworldly with intellectual humility can inspire deeper curiosity about cultures across space and time. As Hamlet muses to his friend after meeting the ghost of his father, “There are more things in Heaven and Earth, Horatio, than are dreamt of in your philosophy.”


Hello, curious kids! Do you have a question you’d like an expert to answer? Ask an adult to send your question to CuriousKidsUS@theconversation.com. Please tell us your name, age and the city where you live.

And since curiosity has no age limit – adults, let us know what you’re wondering, too. We won’t be able to answer every question, but we will do our best.

The Conversation

Penelope Geng does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. What’s the difference between ghosts and demons? Books, folklore and history reflect society’s supernatural beliefs – https://theconversation.com/whats-the-difference-between-ghosts-and-demons-books-folklore-and-history-reflect-societys-supernatural-beliefs-250997

An Indigenous approach shows how changing the clocks for Daylight Saving Time runs counter to human nature – and nature itself

Source: The Conversation – USA (3) – By Rachelle Wilson Tollemar, Lecturer in Spanish Environmental Cultural Studies, University of Wisconsin-Madison

Humans and nature can find balance in each other. timnewman/E+ via Getty Images

It is that time again. Time to wonder: Why do we turn the clocks forward and backward twice a year? Academics, scientists, politicians, economists, employers, parents – and just about everyone else you will interact with this week – are likely debating a wide range of reasons for and against Daylight Saving Time.

But the reason is right there in the name: It’s an effort to “save” daylight hours, which some express as an opportunity for people to “make more use of” time when it’s light outside.

But as an Indigenous person who studies environmental humanities, I believe this sort of effort, and the debate about it, misses a key ecological perspective.

Biologically speaking, it is normal, and even critical, for nature to do more during the brighter months and to do less during the darker ones. Animals go into hibernation, plants into dormancy.

Humans are intimately interconnected with, interdependent on, and interrelated to nonhuman beings, rhythms and environments. Indigenous knowledges, despite their complex, diverse and plural forms, cohere in reminding humans that we too are an equal part of nature. Like trees and flowers, we are beings who also need winter to rest and summer to bloom.

As far as we humans know, we are the only species that chooses to fight against our biological presets, regularly changing our clocks, miserably dragging ourselves into and out of bed at unnatural hours.

The reason, many scholars agree, is that capitalism teaches humans that they are separate from, and superior to, nature – like the point on top of a pyramid. That, and, I argue, the fact that capitalism wants people to work the same number of hours year-round, no matter the season. This mindset runs counter to the way Indigenous people have lived for thousands of years.

A group of people stand around an open circle on an island, as the Sun rises behind a bridge across the water.
A large gathering of people celebrate Indigenous Peoples Day in 2024, by watching the Sun rise over San Francisco Bay.
Tayfun Coskun/Anadolu via Getty Images

The nature of time and work

Indigenous views of the world are not the pyramids or lines of capitalism but the circles and cycles of life.

Concretely, time correlates with terrestrial and celestial changes. Historic records and oral interviews document that in traditional Indigenous cultures of the past, human activity was scheduled according to nature’s recurring patterns. So, for example, a meeting might have been scheduled not at 4 p.m. on Thursday, but rather at the next full moon. Everyone knew well in advance when that would occur and could plan accordingly.

Such an acute sensitivity to nature’s calendar has symbolic meaning, too. To look up and see the Moon in the sky at night is to see the same Moon that someone once saw centuries ago and someone else will hopefully see centuries into the future. Time is interwoven with nature in a sense that far exceeds Western understanding. It embodies past, present and future all at once. Time is life.

The 2015 movie ‘El Abrazo de la Serpiente (Embrace of the Serpent)’ examines the relationship between Indigenous cultures of knowing and colonizing forces.

In this Indigenous context, Daylight Saving Time is nonsensical – if not outright comical. Time can’t be changed any more than a clock’s hands can grab the Sun and move its position in the sky. The Sun will continue to cycle at its gravitational will for generations – and economic systems – to come.

Like time, Indigenous approaches to work are also more expansive than the capitalist economy’s. They validate and value all life-sustaining activities as work. Taking care of oneself, of the sick, of the elderly, of the young, of the land, or even merely resting, for example, are equally valuable activities.

That’s because the objective of most Indigenous economies is not to increase an economist-invented measurement of production by working from 9 a.m. to 5 p.m., Monday through Friday. Rather, their goal is to find and generate a holistic well-being for all.

Daylight Saving Time is exclusively designed for 9-to-5 workers. It attempts to boost economic activity by giving them, and them alone, more light. Think about it: Care workers, who are predominantly women, work beyond daylight hours year-round. Where is their temporal accommodation? Though likely not malicious or even purposeful, the political intervention of Daylight Saving Time ignores the massive workforce that operates on the periphery of the mainstream economy. In some ways, it reinforces the discriminatory idea that only some workers are worthy of economic recognition and accommodation.

In this sense, Daylight Saving Time raises the question: Does the economy really need that extra hour of sunshine and worker productivity? Traditional economic philosophies would likely answer no out of principle; they may see Daylight Saving Time as trespassing the biophysical, ethical and sacred limits of the world ecology by encouraging cultures of overwork and overconsumption.

A person swipes a card in a machine on a wall.
A worker swipes a time card to clock in at the beginning of their shift.
halbergman/E+ via Getty Images

The working of time and nature

Since the invention of the clock, capitalism has increasingly treated time as an inanimate object largely independent of the environment.

While the rest of nature rises and slumbers to lunar and solar cycles, humans work and sleep to the resetting of their artificial clocks.

In their 2016 book “The Slow Professor,” humanities scholars Maggie Berg and Barbara K. Seeber connect this objectification of time to an inhumane culture of work.

Modern workers, they write, are increasingly expected to treat time as a numerical asset that can be managed, measured and controlled. Time for rest and relaxation has no countable home in the capitalist economy of life.

There are certainly practical benefits to using time to measure and monitor economic activities – such as knowing the precise time a meeting is scheduled to start and end. But Berg and Seeber’s work reveals how that reasonable practicality has been subverted to hold workers captive within what I argue is an unsustainable, unnatural and exploitative environment. Work time and life time have blurred into one.

In capitalism, work is expected to grow infinitely, despite existing within a finite world inhabited by limited beings. At a time when human activity depletes the world’s ecology – rather than sustaining it as it once did – this around-the-clock approach to work is simply incompatible with nature.

In sum, Daylight Saving Time reproduces the same destructive logic that has led humans and nonhumans into the present socio-ecological crises. Disobeying and dominating the laws, rhythms and shape of nature, as seen in the seasonal exploitation of human energy and labor via Daylight Saving Time, perpetuates the unparalleled social and environmental decline uniquely characteristic of the current capitalist era.

Looking backwards, progressing forward

Unlike capitalism, a relatively recent invention, Indigenous wisdom espouses a set of philosophies as old as time. It reminds humans that there are other ways of interacting with time, work and the environment – ways that existed before capitalism and that can exist afterward, too.

In my view, people might be better off if the discussion about changing the clocks in the fall and spring wasn’t about how much time we can “make use of” or how much daylight we might “save,” but rather about reducing the number of hours we are expected to be made useful – and profitable – to secure a more just and sustainable existence for all.

The Conversation

Rachelle Wilson Tollemar does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. An Indigenous approach shows how changing the clocks for Daylight Saving Time runs counter to human nature – and nature itself – https://theconversation.com/an-indigenous-approach-shows-how-changing-the-clocks-for-daylight-saving-time-runs-counter-to-human-nature-and-nature-itself-253097

Why most of us are reluctant to switch banks, even though it could cut our environmental impact

Source: The Conversation – UK – By Marcel Lukas, Senior Lecturer in Banking and Finance and Vice-Dean Executive Education, University of St Andrews

Geobor/Shutterstock

Beyond cutting back on meat or making the jump to an electric vehicle, another way consumers can reduce their environmental impact is to switch to a green bank. It’s a lifestyle change that could deliver powerful effects – removing money from the fossil fuels pipeline – for little effort or inconvenience.

Yet it has been claimed that people in the UK are more likely to get divorced than switch banks – despite there being services that make changing your current account easy.

The UK’s seven-day Current Account Switch Service (Cass), in operation since 2013, has completed more than 11.6 million switches, including over a million in the year to March 2025. The service switches your incoming and outgoing payments including salary payments, direct debits and standing orders.

Cass reports that 99.7% of these account switches were completed within seven working days – and nearly 90% of people who used the service were satisfied with it. Yet relative to the whole UK population, the number of people actually switching remains modest. The process works, but behaviour lags.

A set of well-documented psychological tendencies helps to explain this gap.

“Prospect theory” shows that people weigh any potential losses more heavily than equivalent gains. This tilts people toward staying with their familiar provider when a change involves any chance of disruption or error.

The “endowment effect” increases the subjective value of someone’s existing bank account, simply because they already own it.


Ever wondered how to spend or invest your money in ways that actually benefit people and planet? Or are you curious about the connection between insurance and the climate crisis?

Green Your Money is a new series from the business and environment teams at The Conversation exploring how to make money really matter. Practical and accessible insights from financial experts in the know.


Many people’s status quo bias turns hesitation into inertia, because departing from a default requires attention and effort. And our usual bias towards the present adds a timing problem: the admin is immediate while the benefits arrive later and accrue gradually.

All these mechanisms interact with how people organise their money. Many maintain informal “mental accounts” for bills, savings and day-to-day spending. A bank move forces a rewiring of standing orders, direct debits, salary instructions and payees. It feels like opening a filing cabinet and relabelling everything.

Fear of missing a mortgage or utility payment is especially salient. As such, guarantees like those around direct debits and standing orders provided by the Cass system only help if people trust they are actually covered.

shocked man looking at his phone with his hand on his forehead.
The terror of a missed payment can be a barrier to change.
Andrey Popov/Shutterstock

While financial education improves what people know, studies have found it typically only encourages modest changes in behaviour. A meta-analysis of 201 studies reported that education efforts explained only about 0.1% of changes in money-related behaviour (things like saving, dealing with debt and avoiding fees), with these effects often disappearing over time.

A later meta-analysis with a sample size of more than 160,000 people found that financial education improved knowledge more than it prompted changes in behaviour. This was measured across areas such as budgeting and saving.

These reviews do not test current account switching directly. But they support the narrower point that information alone usually only shifts real-world financial actions a little.

However, small changes to the way choices are presented can move outcomes when used at scale. A comprehensive analysis of 23 million participants found that “nudges” – such as making a process simpler or sending a reminder – increased behaviour changes in areas as varied as signing up to a savings plan or making safety improvements in the home by about 1.4 percentage points, on average. While this may seem small, scale is key.

We can view this finding through a banking lens. Without intervention, perhaps 5% of customers would switch to a better account. A simple nudge might boost this to 6.4%. Across 100,000 customers, that’s 1,400 additional people making a beneficial switch.
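The arithmetic behind that illustration can be sketched in a few lines of Python. Note that the 5% baseline is the article's hypothetical figure, while the 1.4-percentage-point uplift is the average nudge effect reported in the meta-analysis:

```python
# Back-of-the-envelope calculation of the nudge effect at scale.
baseline_rate = 0.05    # assumed share of customers who switch unaided
nudge_uplift = 0.014    # average uplift from a simple nudge (meta-analysis)
customers = 100_000

nudged_rate = baseline_rate + nudge_uplift
extra_switchers = customers * nudge_uplift

print(f"Switch rate: {baseline_rate:.1%} -> {nudged_rate:.1%}")
# Switch rate: 5.0% -> 6.4%
print(f"Additional switchers per {customers:,} customers: {extra_switchers:,.0f}")
# Additional switchers per 100,000 customers: 1,400
```

A 1.4-point shift looks trivial for any one person, but multiplied across a bank's whole customer base it translates into thousands of beneficial switches.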

How to overcome your inertia

So-called “implementation-intention techniques”, where a person sets conditions to help them achieve a goal, are a practical option. Across 94 tests, researchers found that forming an explicit “if–then” plan – for example: “If I get the job, then I’ll increase the amount I save” – produced a medium-to-large improvement in people attaining their goals.

In terms of banking, this technique could be used along the following lines: “If it is Sunday at 8pm, I will compare three accounts for 30 minutes. If one is clearly better on fees or green credentials, I will apply for it the same evening. Once approved, I will move two direct debits per night until I’m finished.”

In my experience, there are three steps that can help you overcome inertia when it comes to your finances.

First, convert a general goal into time-boxed tasks on your calendar, using an if–then plan rather than a vague intention.

Second, use frustrations with your existing bank account to motivate you – such as being charged a fee you did not expect, or discovering your bank’s environmental failings. Your motivation to act is elevated at these moments.

Third, to make the task less overwhelming, use a short checklist of payees and subscriptions. Ticking items off in small batches should reduce the cognitive load you feel.

Broader lessons

Clear communication about how much switching services such as Cass will do for an account holder can make them worry less about the risks. This should also help them realise whether certain authorisations need to be switched manually.

But the lessons here apply beyond current accounts. Loss aversion, attachment to the familiar, present bias and default effects also shape decisions about savings products, energy tariffs and mobile contracts – choices that all come with environmental consequences.

Systems that assume consumers will tirelessly compare their options will disappoint. Those that make better options prominent, easy and well timed are far more likely to encourage meaningful change, at scale.

The Conversation

Marcel Lukas is the Vice-Dean Executive Education at the University of St Andrews Business School and has previously received funding from the British Academy, UKRI and Interface UK.

ref. Why most of us are reluctant to switch banks, even though it could cut our environmental impact – https://theconversation.com/why-most-of-us-are-reluctant-to-switch-banks-even-though-it-could-cut-our-environmental-impact-267042

Why fasting won’t cleanse your body – or beat cancer

Source: The Conversation – UK – By Justin Stebbing, Professor of Biomedical Sciences, Anglia Ruskin University

Jo Panuwat D/Shutterstock

Every few months, a new “miracle cure” for cancer trends on social media. From superfoods and supplements to extreme diets, the promises are always bold – and almost always misleading. The latest claim suggests that a 21-day water fast can “starve” cancer cells and trigger the body to heal itself. It sounds simple, even empowering: stop eating and your body will do the rest.

But biology is rarely that simple. Cancer is not a single disease, and metabolism does not switch neatly between “sick” and “healthy.” While fasting can affect how our cells use energy, there is no scientific evidence that it can eradicate tumours. In fact, prolonged fasting can be dangerous, especially for people already weakened by cancer or its treatments.

While fasting can influence metabolism, immunity and some aspects of cell growth, there is no credible evidence that prolonged water fasting can treat or cure cancer.

Fasting, in its many forms – from intermittent fasting to short-term calorie restriction – has been shown in laboratory studies to influence how cells repair themselves and manage energy. Research published in 2024 shows that fasting temporarily suppresses intestinal stem cell activity, followed by a powerful regenerative phase once food is reintroduced. This rebound in stem cell growth is driven by a pathway known as mTOR, which promotes protein synthesis and cell proliferation.

Some celebrities and influencers claim that water fasting could help treat cancer.
Artfilmphoto/Shutterstock

While this regeneration helps tissues recover, it can also create a vulnerable window in which harmful mutations may occur more easily, raising the risk of tumour formation.

Most research on fasting’s effects has focused on intermittent or short fasts lasting between 12 and 72 hours, not on extreme water-only fasts that continue for weeks. A 21-day water fast, as promoted in some wellness circles, carries serious risks. Extended fasting can cause dehydration, electrolyte imbalances, dangerously low blood pressure and muscle loss.

Cancer itself often leads to malnutrition, and fasting can accelerate wasting (cachexia), weaken the immune system and increase susceptibility to infection. Many cancer patients are undergoing chemotherapies that require adequate nutrition to maintain organ function and safely metabolise drugs. Combining these treatments with prolonged fasting can amplify toxicity, delay recovery and worsen fatigue.

There are ongoing clinical studies into short fasting or fasting-mimicking diets before chemotherapy, but these are medically supervised, typically lasting less than 48 hours and carefully monitored for safety.

Fasting continues to intrigue scientists because it activates ancient survival mechanisms. During food scarcity, the body triggers processes such as autophagy, where cells recycle damaged components. This process can reduce inflammation and improve metabolic health in animal studies.

But in cancer, the story is far more complex. Cancer cells are resourceful. They can adapt to fasting by finding alternative fuel sources, sometimes outcompeting healthy cells under nutrient stress. Long periods without nutrition can also weaken immune cells that normally detect and attack tumours.

The 2024 fasting study demonstrates this duality. Fasting may reset metabolism, but refeeding rapidly activates growth pathways such as mTOR. In healthy cells, this helps repair tissues. In cells already carrying DNA damage or early mutations, it can encourage malignant progression. This makes fasting a complex biological stress factor rather than a harmless or therapeutic intervention.

The ‘detox’ myth

Much of fasting’s popular appeal comes from the myth of “detoxification”: the belief that abstaining from food “cleanses” the body. In reality, organs such as the liver, kidneys and lymphatic system already perform this task continuously. Cancer is not caused by accumulated “toxins” that can be flushed out. It develops through genetic changes that cause uncontrolled cell growth. No research has shown that fasting can eliminate cancer cells or shrink tumours in humans.

Water fasts won’t ‘detox’ your body.
PeopleImages/Shutterstock

Controlled studies have observed only short-term metabolic shifts that may influence inflammation or insulin signalling. These effects could help reduce long-term risk factors for chronic disease, but they do not reverse cancer once it has developed.

The promise and limits of metabolic research

There is scientific interest in how metabolism affects cancer. Researchers are exploring whether targeted calorie restriction or ketogenic diets could make tumour cells more sensitive to treatment while protecting healthy ones. These studies are still in early stages and focus on precision, not deprivation. None involve starving the body of all nutrients for weeks.

Sensational claims blur the line between hypothesis and proof, giving vulnerable patients false hope. They cherry-pick facts, citing fasting’s role in cell repair while omitting the crucial detail that most findings come from animal models, not human trials. For someone undergoing cancer treatment, attempting an unsupervised extreme fast could delay essential care, worsen side effects, or even put their life at risk.

Fasting is a physiological stressor. In small, controlled doses, it can trigger adaptive processes that benefit health. In excess, especially during illness, it can cause harm.

A 21-day water fast is neither a plausible nor a safe cancer treatment. Research into fasting helps us understand how cells respond to nutrition and stress, but that knowledge underscores fasting’s complexity rather than supports it as therapy. While balanced nutrition, hydration, regular physical activity and adequate sleep can all support resilience during cancer therapy, none replace medical treatments designed to target tumour biology. Cancer care requires targeted, evidence-based treatments such as chemotherapy, radiotherapy, surgery and immunotherapy.

Fasting research is helping us understand the deep connections between metabolism and disease, but that is very different from curing cancer with a glass of water and willpower. It is understandable that people want control when facing something as frightening as cancer. The search for alternatives often comes from fear, frustration or a wish to avoid painful treatments. But hope should never rest on misinformation.

The Conversation

Justin Stebbing does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Why fasting won’t cleanse your body – or beat cancer – https://theconversation.com/why-fasting-wont-cleanse-your-body-or-beat-cancer-267555

How anatomical names can carry hidden histories of power and exclusion

Source: The Conversation – UK – By Lucy E. Hyde, Lecturer, Anatomy, University of Bristol

Gabriel Falloppius explaining one of his discoveries to the Cardinal Duke of Ferrara WellcomeTrust, CC BY-SA

Buried in your body is a tribute to a long-dead Italian anatomist, and he is not the only one. You are walking around with the names of strangers stitched into your bones, brains, and organs. We all are.

Some of these names sound mythical. The Achilles tendon, the band at the back of your ankle, pays homage to a Greek hero felled by an arrow in his weak spot. The Adam’s apple nods to a certain biblical bite of fruit. But most of these names are not myths. They belong to real people, mostly European anatomists from centuries ago, whose legacies live on every time someone opens a medical textbook.

They are called eponyms: anatomical structures named after people rather than described for what they actually are.

Take the fallopian tubes. These small passageways between the ovaries and the uterus were described in 1561 by Gabriele Falloppio, an Italian anatomist with a fascination for tubes who also gave his name to the Fallopian canal in the ear.

Gabriele Falloppio (1523–1562) was an Italian anatomist and surgeon who described the fallopian tubes in his 1561 work, Observationes Anatomicae.
https://commons.wikimedia.org/w/index.php?curid=1724751

Or “Broca’s area”, named for Paul Broca, the 19th-century French physician who linked a region of the left frontal lobe to speech production. If you have ever studied psychology or known someone who has had a stroke, you have probably heard his name.

Then there is the eustachian tube, that small airway you pop open when you yawn on a plane. It is named after Bartolomeo Eustachi, a 16th-century physician to the Pope. These men have all left fingerprints on our anatomy, not in the flesh, but in the language.

Why have we stuck with these names for centuries? Because eponyms are more than medical trivia. They are woven into the culture of anatomy. Generations of students have chanted them in lecture halls and scribbled them into notes. Surgeons drop them mid-operation as if chatting about old friends.

They are short, snappy and familiar. “Broca’s area” takes two seconds to say. Its descriptive alternative, “posterior inferior frontal gyrus,” feels like reciting an incantation. In busy clinical settings, brevity often wins.

Eponyms also come with stories, which make them memorable. Students remember Falloppio because he sounds like a Renaissance lute player. They remember Achilles because they know where to aim the arrow. In a field that can feel like a wall of Latin, a human story becomes a useful hook.

The Achilles tendon was named in 1693 after the Greek hero Achilles.
Panos Karas/Shutterstock

And, of course, there is tradition. Medical language is built on centuries of scholarship. For many, erasing eponyms would feel like tearing down history itself.

But there is a darker side to this linguistic love affair. For all their charm, eponyms often fail at their main purpose. They rarely tell you what a structure is or what it does. “Fallopian tube” gives no clue about its role or location. “Uterine tube” does.

Eponyms also reflect a narrow version of history. Most originated during the European Renaissance, a time when anatomical “discovery” often meant claiming knowledge that already existed elsewhere. The people being celebrated are overwhelmingly white European men. The contributions of women, non-European scholars and Indigenous knowledge systems are almost invisible in this language.

Then there is the truly uncomfortable truth: some eponyms honour people with horrific pasts. “Reiter’s syndrome,” for example, was named after Hans Reiter, a Nazi physician who conducted brutal experiments on prisoners at Buchenwald. Today, the medical community uses the neutral term “reactive arthritis,” a small but meaningful refusal to celebrate someone who caused harm.

Every eponym is a small monument. Some are quaint and historical. Others are monuments we would rather not keep polishing.

Descriptive names, by contrast, simply make sense. They are clear, universal and useful. You do not need to memorise who discovered something, only where it is and what it does.

If you hear “nasal mucosa,” you immediately know it is inside the nose. Ask someone to locate the “Schneiderian membrane,” and you will probably get a blank stare.




Read more: Medical jargon is often misunderstood by the general public – new study


Descriptive terms are easier to translate, standardise and search. They make anatomy more accessible for learners, clinicians and the public. Most importantly, they do not glorify anyone.

So what should we do with all these old names?

There is a growing movement to phase out eponyms, or at least to use them alongside descriptive ones. The International Federation of Associations of Anatomists (IFAA) encourages descriptive terms in teaching and writing, with eponyms in parentheses.

That does not mean we should burn the history books. It means adding context. We can teach the story of Paul Broca while acknowledging the bias built into naming traditions. We can remember Hans Reiter not by attaching his name to a disease, but as a cautionary tale.

This dual approach allows us to preserve the history without letting it dictate the future. It makes anatomy clearer, fairer, and more honest.

The language of anatomy is not just academic jargon. It is a map of power, memory, and legacy written into our flesh. Every time a doctor says “Eustachian tube,” they echo the 16th century. Every time a student learns “uterine tube,” they reach for clarity and inclusion.

Perhaps the future of anatomy is not about erasing old names. It is about understanding the stories they carry and deciding which ones are worth keeping.

The Conversation

Lucy E. Hyde does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. How anatomical names can carry hidden histories of power and exclusion – https://theconversation.com/how-anatomical-names-can-carry-hidden-histories-of-power-and-exclusion-267880

Children with special educational needs are more likely to miss school – it’s a sign of a system under strain

Source: The Conversation – UK – By Caroline Bond, Professor of Education, Manchester University

Hryshchyshen Serhii/Shutterstock

Pupils with special educational needs and disabilities are twice as likely as their peers to be persistently absent from school.

Persistent absence means missing 10% or more of school sessions (a session is a morning or afternoon at school). For those with an Education, Health and Care Plan (EHCP) – a legal document that lays out the support they are entitled to – the picture is even worse. They are up to seven times more likely to be severely absent, meaning that they miss 50% or more of school sessions. Absence is higher still for pupils in special schools compared with those in mainstream education.

Suspensions tell a similar story. Pupils with special educational needs are almost four times more likely to be suspended than those without.

Engagement among pupils with special educational needs also drops sharply in secondary school. Only 45% say they like being at school. And it’s not just pupils who feel the system isn’t working: three-quarters of teachers in a recent survey said schools are not inclusive enough for all pupils.

The current approach to inclusion often relies on case-by-case fixes, but this isn’t sustainable. Since 2016, the number of EHCPs has risen by over 80%, yet the systems for assessing and meeting children’s needs have not kept pace. Many children’s needs go unidentified or unmet, leaving families feeling unsupported and forced to fight for help in an under-resourced system.

Girl holding mother's hand doesn't want to go to school
Many teachers also feel that school isn’t inclusive enough for children with special educational needs.
Ground Picture/Shutterstock

Schools, too, say they struggle to access the external professionals needed for assessments. In one survey, school staff ranked meeting the needs of pupils with special educational needs as their second-biggest challenge, just after budget pressures.

Lifelong effects

When needs go unmet, the consequences can be long-lasting. Persistent absence and suspension both increase the risk of young people leaving school without qualifications and not going into work or training. These issues can spill into adulthood, with poorer job prospects and a higher risk of involvement with the criminal justice system. Addressing special educational needs effectively isn’t just about education – it’s about improving life chances.

The solutions start with making mainstream education genuinely inclusive and properly funded. Schools need cultures that promote belonging and partnership with families to rebuild trust and confidence. National standards for inclusion would help, as would more training for school staff and leaders, alongside better access to specialist support professionals.

We also need to rethink what counts as success in education. A broader mix of qualifications and career paths would help young people play to their strengths and prepare for the future. Schools can also boost engagement by giving pupils more say in decisions that affect them, offering greater choice in the curriculum, and ensuring access to enrichment activities – sport, arts, volunteering and social opportunities – which are proven to improve attendance and wellbeing.

For pupils with special educational needs, timely, targeted support can make all the difference. Skilled mentors, smaller classes, adapted timetables and evidence-based support programmes can help pupils boost school attendance and academic progress. They can also help children manage their emotions and enable them to feel more connected to school. For those struggling with transitions – such as moving schools or preparing for work – proactive planning, supported internships and job coaches can ease the process and build confidence.

Even with good inclusive practice, some pupils will still struggle. In those cases, high-quality alternative provision can offer a temporary respite and a route back to mainstream education.

Unless we rethink what education is for – and how we support pupils to engage with it – thousands of young people will be denied their potential. One of us (Caroline Bond) contributed to the development of an approach that mainstream schools can use to help children feel safe in school. It was created with parents, autistic young people and professionals to offer a practical way for schools to understand and support pupils who find school attendance especially difficult.

With school attendance under national scrutiny and special educational needs funding under pressure, this is a crucial moment to ask how we can build a system that genuinely includes every young person – not just in name, but in practice.

The Conversation

Luke Munford receives funding from UKRI and the National Institute for Health and Care Research (NIHR).

Caroline Bond does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Children with special educational needs are more likely to miss school – it’s sign of a system under strain – https://theconversation.com/children-with-special-educational-needs-are-more-likely-to-miss-school-its-sign-of-a-system-under-strain-266942

Changes to the BBC’s Written Archives Centre threaten open research – and might infringe on the broadcaster’s charter

Source: The Conversation – UK – By John Wyver, Professor of the Arts on Screen, University of Westminster

The BBC’s Written Archives Centre (WAC) is housed in an unassuming bungalow on the outskirts of Reading, 40 miles west of London. It holds one of the greatest document collections of British and global history from the past century.

For half that time, researchers, storytellers and interested members of the public were able to mine its extensive resources for monographs, dissertations and broadcasts relating to the BBC. Recent changes to the conditions of access, however, mean that independent and exploratory research at the WAC is no longer possible.

The centre houses scripts, personnel files, production notes, meeting minutes, correspondence and other materials related to BBC radio and TV broadcasts since 1922. It reveals how politicians, pop stars, monarchs and artists have engaged with one of the most powerful media organisations of the past century. It also captures the debates, decisions, and everyday lives behind the BBC’s operations.

Because of the BBC’s importance, the WAC’s archives reflect countless aspects of our social, political and cultural history. The changing roles of women since the 1920s have been traced through the riches of the archive, as have transformations in ideas of class and social relations, in understandings of LGBT+ identities, and in celebrations and conflicts of race and immigration.

Even so, researchers know there is far, far more to be uncovered. The WAC is one of Britain’s most significant resources for revealing the history of the past century, second only to The National Archives housed at Kew.

But earlier this year, the WAC quietly introduced changes to who can use it and how. Personal enquiries from the public can no longer be answered, and the reading room is now only open on Wednesdays and Thursdays. Most significant for researchers was the decision to end the vetting and opening of files on request.

Many of the archive users, including myself, feel we were not involved in any meaningful consultation before these changes were made. In 2024, there had been a single online meeting at which a small number of users were asked for their suggestions for improvements. At that meeting there was no mention of the proposed changes and no sense of seeking feedback. No other consultation seems to have been undertaken.

Some two-thirds of the hundreds of thousands of WAC files have not yet been opened for use by researchers. Until early this year, the exceptional archivists there would, in response to an enquiry, identify relevant files. They would then read and, if necessary, minimally redact (removing certain personal details, for example) files that had not previously been opened.

In the work for my forthcoming history of television between the wars, Magic Rays of Light: The Early Years of Television in Britain, I estimate that roughly half of the 300-plus files I consulted were opened especially for me.

BBC managers gave two reasons for ending on-request vetting, shared in online meetings that I participated in. One is the straitened finances of the corporation, which have necessitated severe cutbacks to many services. Users suggested ways to mitigate this in those meetings, but their suggestions so far appear to have been ignored by those responsible for the change.

The other reason given for the ending of on-request vetting is an internal shift towards a more focused, curatorial approach to the WAC. Under the new arrangements, batches of files will be made available according to internal priorities decided, like the WAC’s new timetable, solely by the BBC.

Those objecting to this change were told that the new priorities will reflect more closely the BBC’s programming and business concerns. This aims to facilitate, for example, a smoother marking of “content moments” such as anniversaries.

More than 500 academics and independent researchers, including myself, have signed an open letter expressing “profound concern” about the changes. Recognising that the review of the BBC’s charter is fast approaching, the letter calls on the BBC “to publish a code of practice affirming continuing WAC access and the continued availability of files on request”.

Without on-request opening of files, many WAC users feel they are effectively barred from independent research and can no longer plan new books or other projects with any confidence. More generally, they point out that the BBC's new conditions flout a generally accepted principle of responsible archives: a clear separation between providing access and the practice of curation.

The campaigners also highlight that the WAC is a public resource paid for over decades by public funds through the licence fee. Closing down the channel for independent access, they suggest, infringes in a significant way one of the five public purposes of the BBC defined by the BBC’s Charter: “To support learning for people of all ages.”

The campaigners laid out their “public purposes” argument in a different, detailed letter sent directly to the BBC board’s chair, Samir Shah, in mid-August, and in individual letters to each of the members of the board, which has the mandate to deliver the BBC’s mission and public purposes. No response has been forthcoming.

The BBC has promised that "some" files will be newly vetted and opened up, with the selection decided solely by the corporation, but it has not said how many files, which ones, or on what timetable. The community of users who journey out to the reading room of the WAC bungalow remains frustrated in its efforts to undertake meaningful independent research.

When contacted by The Conversation for comment on its changes to the WAC, the BBC responded:

We are taking on a new approach to make a wider selection of BBC history accessible and searchable, with an ambition to open up more of the written archive from 30% to 50% over the next five years.

Given the level of resource available, we are moving to a series of structured content releases rather than individual requests for specific content, which will open up the written archive further and deliver greater value for all licence fee payers.

The service will continue to offer access and reading room visits for researchers and support freedom of information and subject access requests.




The Conversation

John Wyver has in the past received funding from the AHRC for a research project that has made use of the resources of the Written Archives Centre.

ref. Changes to the BBC’s Written Archives Centre threaten open research – and might infringe on the broadcaster’s charter – https://theconversation.com/changes-to-the-bbcs-written-archives-centre-threaten-open-research-and-might-infringe-on-the-broadcasters-charter-267929