AI is changing who gets hired – what skills will keep you employed?

Source: The Conversation – USA (2) – By Murugan Anandarajan, Professor of Decision Sciences and Management Information Systems, Drexel University

Success in the age of AI may depend less on technical skills and more on human judgment, adaptability and trust. Malte Mueller/Getty Images

The consulting firm Accenture recently laid off 11,000 employees while expanding its efforts to train workers to use artificial intelligence. It’s a sharp reminder that the same technology driving efficiency is also redefining what it takes to keep a job.

And Accenture isn’t alone. IBM has already replaced hundreds of roles with AI systems, while creating new jobs in sales and marketing. Amazon has cut staff even as it expands the teams that build and manage AI tools. Across industries, from banks to hospitals and creative companies, workers and managers alike are trying to understand which roles will disappear, which will evolve and which new ones will emerge.

I research and teach at Drexel University’s LeBow College of Business, studying how technology changes work and decision-making. My students often ask how they can stay employable in the age of AI. Executives ask me how to build trust in technology that seems to move faster than people can adapt to it. In the end, both groups are really asking the same thing: Which skills matter most in an economy where machines can learn?

To answer this, I analyzed data from two surveys my colleagues and I conducted this summer. For the first, the Data Integrity & AI Readiness Survey, we asked 550 companies across the country how they use and invest in AI. For the second, the College Hiring Outlook Survey, we looked at how 470 employers viewed entry-level hiring, workforce development and AI skills in candidates. These studies show both sides of the equation: those building AI and those learning to work with it.

AI is everywhere, but are people ready?

More than half of organizations told us that AI now drives daily decision-making, yet only 38% believe their employees are fully prepared to use it. This gap is reshaping today’s job market. AI isn’t just replacing workers; it’s revealing who’s ready to work alongside it.

Our data also shows a contradiction. While many companies now depend on AI internally, only 27% of recruiters say they’re comfortable with applicants using AI tools for tasks such as writing resumes or researching salary ranges.

In other words, the same tools companies trust for business decisions still raise doubts when job seekers use them for career advancement. Until that view changes, even skilled workers will keep getting mixed messages about what “responsible AI use” really means.

In the Data Integrity & AI Readiness Survey, this readiness gap showed up most clearly in customer-facing and operational jobs such as marketing and sales. These are the same areas where automation is advancing quickly, and layoffs tend to occur when technology evolves faster than people can adapt.

At the same time, we found that many employers haven’t updated their degree or credential requirements. They’re still hiring for yesterday’s resumes while tomorrow’s work demands fluency in AI. The problem isn’t that people are being replaced by AI; it’s that technology is evolving faster than most workers can adapt.

Fluency and trust: The real foundations of adaptability

Our research suggests that the skills most closely linked with adaptability share one theme: what I call “human-AI fluency.” This means being able to work with smart systems, question their results and keep learning as things change.

Across companies, the biggest challenges lie in expanding AI, ensuring compliance with ethical and regulatory standards and connecting AI to real business goals. These hurdles aren’t about coding; they’re about good judgment.

In my classes, I emphasize that the future will favor people who can turn machine output into useful human insight. I call this digital bilingualism: the ability to fluently navigate both human judgment and machine logic.

What management experts call “reskilling” – or learning new skills to adapt to a new role or major changes in an old one – works best when people feel safe to learn. In our Data Integrity & AI Readiness Survey, organizations with strong governance and high trust were nearly twice as likely to report gains in performance and innovation. The data suggests that when people trust their leaders and systems, they’re more willing to experiment and learn from mistakes. In that way, trust turns technology from something to fear into something to learn from, giving employees the confidence to adapt.

According to the College Hiring Outlook Survey, about 86% of employers now offer internal training or online boot camps, yet only 36% say AI-related skills are important for entry-level roles. Most training still focuses on traditional skills rather than those needed for emerging AI jobs.

The most successful companies make learning part of the job itself. They build opportunities to learn into real projects and encourage employees to experiment. I often remind leaders that the goal isn’t just to train people to use AI but to help them think alongside it. This is how trust becomes the foundation for growth, and how reskilling helps retain employees.

The new rules of hiring

In my view, the companies leading in AI aren’t just cutting jobs; they’re redefining them. To succeed, I believe companies will need to hire people who can connect technology with good judgment, question what AI produces, explain it clearly and turn it into business value.

In companies that are putting AI to work most effectively, hiring isn’t just about resumes anymore. What matters is how people apply traits like curiosity and judgment to intelligent tools. I believe these trends are leading to new hybrid roles such as AI translators, who help decision-makers understand what AI insights mean and how to act on them, and digital coaches, who teach teams to work alongside intelligent systems. Each of these roles connects human judgment with machine intelligence, showing how future jobs will blend technical skills with human insight.

That blend of judgment and adaptability is the new competitive advantage. The future won’t just reward the most technical workers, but those who can turn intelligence – human or artificial – into real-world value.

The Conversation

Murugan Anandarajan does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. AI is changing who gets hired – what skills will keep you employed? – https://theconversation.com/ai-is-changing-who-gets-hired-what-skills-will-keep-you-employed-267376

Trump’s ‘golden age’ economic message undercut by his desire for much lower interest rates – which typically signal a weak jobs market

Source: The Conversation – USA (2) – By Joshua Stillwagon, Associate Professor of Economics, Babson College

President Donald Trump has said he believes the U.S. economy has entered a ‘golden age’ on his watch. AP Photo/Mark Schiefelbein

President Donald Trump seems to want to have it both ways on the U.S. economy.

On the one hand, he recently said the economy is in its “golden age” and referred to the U.S. as the “hottest country anywhere in the world.”

Yet at the same time, he has outright demanded that the Federal Reserve sharply slash interest rates to fuel economic activity. And his recently handpicked governor, Stephen Miran, has led the charge in pushing for a bigger cut than preferred by his new colleagues at the Fed.

When an economy is strong, central banks typically don’t cut interest rates and may even raise them to avoid spurring inflation. And so to support his argument for large cuts, Miran has played up “downside risks” to the economy and a weakening labor market, contrasting with Trump’s talk of a “golden age.”

Trump and Miran also seem to be ignoring the problem of inflation, which the president has said “has been defeated” and Miran considers close enough to the Fed’s target of 2%. Yet, inflation remains high and has been picking back up in recent months – one of the core reasons the Fed has taken a gradual approach to lowering interest rates.

I’m a macroeconomist, which means I study big-picture factors affecting an economy, such as interest rates.

It’s well known that lower rates spur faster growth, and of course all presidents want a stronger economy on their watch. But the Fed’s job when it sets interest rates is to deal with whatever reality the data shows – and make decisions accordingly.

Is the economy hot or not?

In the simplest terms, the Fed raises interest rates when the economy is “hot,” or inflation is above the Fed’s 2% target, and lowers them when there are concerns about unemployment.

At its most recent meeting, in September, the Fed lowered rates a quarter of a point, citing slowing jobs growth and increased economic uncertainty. Trump nominee Miran was the only one of the 12 members of the Fed’s policy-setting committee to instead vote for a more aggressive half-point cut.

The only credible rationale for an interest rate cut that large, in the face of still-high inflation, is a belief that the labor market is incredibly weak. According to the Fed’s preferred measure, the personal consumption expenditures index, inflation has been accelerating all summer and was 2.7% at the end of August, well above the Fed’s 2% target.

There’s no doubt jobs growth has slowed considerably in recent months, but enough to completely ignore the risk of driving inflation higher? At this point at least, the Fed doesn’t think so.

And if the economy were in fact running hot, as the president claims, the Fed would have little choice but to keep rates flat or raise them, especially given elevated inflation.

Stephen Miran, who was recently nominated to the Federal Open Market Committee, has been pushing for much larger rate cuts than his colleagues.
AP Photo/Mariam Zuhaib

Risks of following political whims

This situation gets at the heart of why central bank independence matters.

Trump’s efforts to influence the Federal Reserve have not been subtle and break with Congress’ intention to insulate the Fed from political manipulation. Besides pressing for big rate cuts, he has tried to fire a member of the Board of Governors over questionable allegations and mused about removing Fed Chair Jerome Powell.

The risks of following the wishes of a president in the face of what the data shows were starkly demonstrated in 2021, when Turkey’s president, Recep Tayyip Erdogan, fired the head of the country’s central bank. The central banker was pushing rates higher to tame inflation, which was at about 20%, but Erdogan demanded they be lowered. In response, Turkey’s lira plunged to record lows and inflation soared to over 70% in 2022.

Something similar could happen in the U.S. if Trump continues down the same path of meddling with the Fed. As a sign of how much Wall Street worries about this risk, a recent study estimated that if Trump followed through on his threat to fire Powell, the stock market could lose US$1 trillion.

That’s because the Fed’s credibility rests on its ability to make decisions driven by economic evidence, not political expedience. That independence means policymakers must weigh data on inflation, jobs and growth rather than election cycles or partisan demands.

Justifying deeper rate cuts

Looking ahead to the Fed’s next meeting Oct. 28-29, policymakers face a delicate balancing act. With inflation still running above target and signs of slowing jobs growth, the Fed needs to lower rates enough to prevent a downturn but not so much that inflation spirals out of control.

Traders are putting near-100% odds on two more quarter-point cuts this year, one on Oct. 29 and another in December. This would bring the Fed’s benchmark interest rate to a range of 3.5%-3.75% by the end of 2025, down from 4%-4.25% now.
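The arithmetic behind that projection is straightforward, and can be sketched as a quick check (the variable names here are illustrative, not from any official source):

```python
# Sketch: two quarter-point (0.25 percentage point) cuts applied to the
# Fed's current 4.00%-4.25% target range, as described in the article.
current_range = (4.00, 4.25)  # lower and upper bounds, in percent
cut_size = 0.25               # one quarter-point cut
num_cuts = 2                  # expected cuts: Oct. 29 and December

# Subtract the total expected reduction from both ends of the range.
projected_range = tuple(round(bound - cut_size * num_cuts, 2)
                        for bound in current_range)
print(projected_range)  # (3.5, 3.75)
```

A half-point cut at both meetings, as Miran's projections imply, would instead lower the range by a full percentage point, to 3.0%-3.25%.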

Based on Miran’s own interest rate projections, he’s likely to again push for a larger cut of a half-point or more at both meetings, as he believes the Fed’s benchmark rate should be below 3% by the end of the year.

To me, as an economist, the only way a Fed acting independently could reasonably justify such a significant cut in rates in the next few months is if the unemployment rate begins rising steadily, with the economy clearly at risk of slipping into a recession.

The Conversation

Joshua Stillwagon was a long-time organizer and judge for an academic competition hosted at the Federal Reserve Bank of Boston, and has presented research at the Federal Reserve Bank of Boston.

ref. Trump’s ‘golden age’ economic message undercut by his desire for much lower interest rates – which typically signal a weak jobs market – https://theconversation.com/trumps-golden-age-economic-message-undercut-by-his-desire-for-much-lower-interest-rates-which-typically-signal-a-weak-jobs-market-266969

What’s the difference between ghosts and demons? Books, folklore and history reflect society’s supernatural beliefs

Source: The Conversation – USA (3) – By Penelope Geng, Associate Professor of English, Macalester College

Curious Kids is a series for children of all ages. If you have a question you’d like an expert to answer, send it to curiouskidsus@theconversation.com.


What’s the difference between ghosts and demons? – Landon W., age 15, The Colony, Texas


Belief in the spirit world is a key part of many faiths and religions. A 2023 survey of 26 countries revealed that about half the respondents believed in the existence of angels, demons, fairies and ghosts. In the United States, a 2020 poll found that about half of Americans believe ghosts and demons are real.

While the subject of demons and ghosts can inspire dread, the concepts themselves can be confusing: Is there a difference between the two?

Historically, communities have understood the supernatural according to their religious and spiritual traditions. For example, the terrifying ghosts of Pu Songling’s “Strange Tales from a Chinese Studio” operate differently than those haunting the works of William Shakespeare, even though both writers lived in the 17th century.

‘Hamlet, Horatio, Marcellus and the Ghost,’ from Shakespeare’s ‘Hamlet,’ Act 1, Scene 4.
Robert Thew/Gertrude and Thomas Jefferson Mumford Collection via The Met

Literary representations of ghosts and demons often reflect the anxieties of communities experiencing social, religious or political upheaval. As a scholar of early modern English literature, my research focuses on how everyday people in 16th- and 17th-century Europe used storytelling to navigate major social changes. This era, often called the Renaissance, was punctuated by the establishment of mass media through printing, the global spread of colonization and the emergence of modern science and medicine.

Digging into the literary archive can reveal people’s ideas about demons and ghosts – and what made them different.

Martin Luther and the Reformation

On Oct. 31, 1517, Martin Luther, an ex-law student and former monk, boldly published his Ninety-Five Theses. In them, he rejected the Catholic Church’s promise that monetary payment to the church could reduce the amount of time one’s soul spent in purgatory. What began as a local protest in Wittenberg, Germany, soon swept all the major European powers into a life-and-death struggle over religious reform. Towns were besieged, landscapes scorched, villages pillaged.

This period, called the Reformation, led to the establishment of new Christian denominations. Among these Protestant churches’ early teachings was the edict that purgatory did not exist and souls could not return to Earth to haunt the living. Protestant reformers insisted that after death, one’s soul was immediately judged. The virtuous flew up to God in heaven; the sinful burned in hell with the Devil.

According to Protestants, ghosts were invented by Catholic priests to scare people into obedience. For example, the English translator of Ludwig Lavater’s 1572 book “Of Ghostes and Spirites Walking by Night” insists ghosts are the “falsehood of Monkes, or illusions of devils, franticke imaginations, or some other frivolous and vaine perswasions.” Should you ever encounter an “apparition,” you must call it out for what it truly is: a devil pretending to be a ghost.

Title page of Christopher Marlowe’s ‘Doctor Faustus.’
Iohn Wright/Wikimedia Commons

Christopher Marlowe’s play “Doctor Faustus” comments on these debates. Written in the 1580s for a primarily Protestant audience, the play features a scene in which Dr. Faustus and his devil companion, Mephistopheles, trick the pope by snatching away his meal. A bewildered member of the papal court concludes “it may be some ghost … come to beg a pardon of your Holiness.” The audience knows full well, however, that these pranks are committed by the necromancer and his demon.

Ghostly haunting

In spite of Protestantism’s official stance against ghosts, belief in them persisted in the popular imagination.

Archival records show that ordinary people held fast to popular beliefs despite what their religious authorities decreed. For example, the casebook of Richard Napier, an astrological physician, reports several cases of “spirit” hauntings, including that of a young mother named Catherine Wells who had been “vexed … with a spirit” for three continuous years.

Popular plays provide additional evidence. Shakespeare’s “Hamlet” opens with a midnight visitation by the ghost of Hamlet’s father, telling his son he cannot rest in peace until his murderer is brought to justice. Ghostly victims seeking justice appear in other Shakespearean plays, including “Macbeth” and “Richard III.”

Cheap print, a form of common media, capitalized on the public’s interest in the paranormal. Part entertainment, part journalism, cheap print was read by all sorts of people. A 1662 pamphlet titled “A strange and wonderfull discovery of a horrid and cruel murther [murder]” describes Isabel Binnington’s unsettling encounter with the ghost of Robert Eliot. In her testimonial, she claims that Eliot’s ghost promised he would never hurt her. What he wanted was simply for her to hear his story: He had been murdered for his coins in the very house she occupied.

A 1730 broadside ballad called “The Suffolk Miracle” – still performed today – tells the tale of young lovers parted by an overprotective father. After the daughter is whisked away, her beloved dies of a broken heart. When his ghost later appears to her, she “joy’d to see her heart’s delight.”

Demonic possession

While reformed Protestant thinkers rejected the existence of ghosts, they enthusiastically accepted the reality of devils.

Reports of demonic possession were popular. Before his accession to the English throne, King James VI of Scotland published a literary treatise on demonology in 1597. He argues that “assaultes of Sathan are most certainly practized” and “detestable slaves of the Devill” live among us.

‘Saint Anthony Tormented by Demons,’ 1470–74.
Martin Schongauer/Rogers Fund via The Met

The diaries of English Puritans offer further proof that beliefs about devilish encounters were common. In the 1650s, the Calvinist preacher Thomas Hall insisted that his godliness attracted the attention of Satan like a moth to a candle. From an early age, he complained, he was subjected to “Satanicall buffettings” and terrifying dreams. He believed, however, that surviving demonic temptation demonstrated his unwavering devotion to God.

Distinguishing ghosts from demons

Based on the literature, what can we conclude about how people saw ghosts and demons?

Early modern people often represented ghosts as sad and pitiable. They were depicted as the spiritual remainder of a recently deceased person, haunting their friends and kin – or, occasionally, a stranger. They retained some of their humanity and were psychically connected to a place, such as their former home, or to a person, such as their most cherished companion.

Demons, by contrast, were almost always malevolent tricksters who served the Devil. Demons lacked knowledge of what it meant to be human. Hell was the demons’ lair. Early modern texts describe them visiting the earthly plane to corrupt, possess or tempt humans to commit self-harm or violence against others.

Faustus getting dragged to hell at the Globe Theatre.

Then and now, stories of ghosts and demons have provoked fear and wonder. Tales of the supernatural have inspired the imagination of kings, theologians, playwrights and everyday people.

Approaching the topic of the otherworldly with intellectual humility can inspire deeper curiosity about cultures across space and time. As Hamlet muses to his friend after meeting the ghost of his father, “There are more things in Heaven and Earth, Horatio, than are dreamt of in your philosophy.”


Hello, curious kids! Do you have a question you’d like an expert to answer? Ask an adult to send your question to CuriousKidsUS@theconversation.com. Please tell us your name, age and the city where you live.

And since curiosity has no age limit – adults, let us know what you’re wondering, too. We won’t be able to answer every question, but we will do our best.

The Conversation

Penelope Geng does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. What’s the difference between ghosts and demons? Books, folklore and history reflect society’s supernatural beliefs – https://theconversation.com/whats-the-difference-between-ghosts-and-demons-books-folklore-and-history-reflect-societys-supernatural-beliefs-250997

An Indigenous approach shows how changing the clocks for Daylight Saving Time runs counter to human nature – and nature itself

Source: The Conversation – USA (3) – By Rachelle Wilson Tollemar, Lecturer in Spanish Environmental Cultural Studies, University of Wisconsin-Madison

Humans and nature can find balance in each other. timnewman/E+ via Getty Images

It is that time again. Time to wonder: Why do we turn the clocks forward and backward twice a year? Academics, scientists, politicians, economists, employers, parents – and just about everyone else you will interact with this week – are likely debating a wide range of reasons for and against Daylight Saving Time.

But the reason is right there in the name: It’s an effort to “save” daylight hours, which some express as an opportunity for people to “make more use of” time when it’s light outside.

But as an Indigenous person who studies environmental humanities, I believe this sort of effort, and the debate about it, misses a key ecological perspective.

Biologically speaking, it is normal, and even critical, for nature to do more during the brighter months and to do less during the darker ones. Animals go into hibernation, plants into dormancy.

Humans are intimately interconnected with, interdependent on, and interrelated to nonhuman beings, rhythms and environments. Indigenous knowledges, despite their complex, diverse and plural forms, cohere in reminding humans that we, too, are an equal part of nature. Like trees and flowers, we are beings who also need winter to rest and summer to bloom.

As far as we humans know, we are the only species that chooses to fight against our biological presets, regularly changing our clocks, miserably dragging ourselves into and out of bed at unnatural hours.

The reason, many scholars agree, is that capitalism teaches humans that they are separate from, and superior to, nature – like the point on top of a pyramid. I also argue that capitalism wants people to work the same number of hours year-round, no matter the season. This mindset runs counter to the way Indigenous people have lived for thousands of years.

A large gathering of people celebrate Indigenous Peoples Day in 2024 by watching the Sun rise over San Francisco Bay.
Tayfun Coskun/Anadolu via Getty Images

The nature of time and work

Indigenous views of the world are not the pyramids or lines of capitalism but the circles and cycles of life.

Concretely, time correlates with terrestrial and celestial changes. Historic records and oral interviews document that in traditional Indigenous cultures of the past, human activity was scheduled according to nature’s recurring patterns. So for example, a meeting might have been scheduled not at 4 p.m. on Thursday, but rather at the next full moon. Everyone knew well in advance when that would arise and could plan accordingly.

Such an acute sensitivity to nature’s calendar has symbolic meaning, too. To look up and see the Moon in the sky at night is to see the same Moon that someone once saw centuries ago and someone else will hopefully see centuries into the future. Time is interwoven with nature in a sense that far exceeds Western understanding. It embodies past, present and future all at once. Time is life.

The 2015 movie ‘El Abrazo de la Serpiente (Embrace of the Serpent)’ examines the relationship between Indigenous cultures of knowing and colonizing forces.

In this Indigenous context, Daylight Saving Time is nonsensical – if not outright comical. Time can’t be changed any more than a clock’s hands can grab the Sun and move its position in the sky. The Sun will continue to cycle at its gravitational will for generations – and economic systems – to come.

Like time, Indigenous approaches to work are also more expansive than the capitalist economy’s. They validate and value all life-sustaining activities as work. Taking care of oneself, of the sick, of the elderly, of the young, of the land, or even merely resting, for example, are equally valuable activities.

That’s because the objective of most Indigenous economies is not to increase an economist-invented measurement of production by working from 9 a.m. to 5 p.m., Monday through Friday. Rather, their goal is to find and generate a holistic well-being for all.

Daylight Saving Time is exclusively designed for 9-to-5 workers. It attempts to boost economic activity by giving them, and them alone, more light. Think about it: Care workers, who are predominantly women, work beyond daylight hours year-round. Where is their temporal accommodation? Though likely not malicious or even purposeful, the political intervention of Daylight Saving Time ignores the massive workforce that operates on the periphery of the mainstream economy. In some ways, it reinforces the discriminatory idea that only some workers are worthy of economic recognition and accommodation.

In this sense, Daylight Saving Time raises the question: Does the economy really need that extra hour of sunshine and worker productivity? Traditional economic philosophies would likely answer no out of principle; they may see Daylight Saving Time as trespassing the biophysical, ethical and sacred limits of the world ecology by encouraging cultures of overwork and overconsumption.

A worker swipes a time card to clock in at the beginning of their shift.
halbergman/E+ via Getty Images

The working of time and nature

Since the invention of the clock, capitalism has increasingly treated time as an inanimate object largely independent of the environment.

While the rest of nature rises and slumbers to lunar and solar cycles, humans work and sleep to the resetting of their artificial clocks.

In their 2016 book “The Slow Professor,” humanities scholars Maggie Berg and Barbara K. Seeber connect this objectification of time to an inhumane culture of work.

Modern workers, they write, are increasingly expected to treat time as a numerical asset that can be managed, measured and controlled. Time for rest and relaxation has no countable home in the capitalist economy of life.

There are certainly practical benefits to using time to measure and monitor economic activities – such as knowing the precise time a meeting is scheduled to start and end. But Berg and Seeber’s work reveals how that reasonable practicality has been subverted to hold workers captive within what I argue is an unsustainable, unnatural and exploitative environment. Work time and life time have blurred into one.

In capitalism, work is expected to grow infinitely, despite existing within a finite world inhabited by limited beings. At a time when human activity depletes the world’s ecology – rather than sustaining it as it once did – this around-the-clock approach to work is simply incompatible with nature.

In sum, Daylight Saving Time reproduces the same destructive logic that has led humans and nonhumans into the present socio-ecological crises. Disobeying and dominating the laws, rhythms and shape of nature, as seen in the seasonal exploitation of human energy and labor via Daylight Saving Time, perpetuates the unparalleled social and environmental decline uniquely characteristic of the current capitalist era.

Looking backwards, progressing forward

Unlike the relatively recent inception of capitalism, Indigenous wisdom espouses a set of philosophies as old as time. It reminds humans that there are other ways of interacting with time, work and the environment – ways that existed before capitalism and that can exist afterward, too.

In my view, people might be better off if the discussion about changing the clocks in the fall and spring wasn’t about how much time we can “make use of” or how much daylight we might “save,” but rather about reducing the number of hours we are expected to be made useful – and profitable – to secure a more just and sustainable existence for all.

The Conversation

Rachelle Wilson Tollemar does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. An Indigenous approach shows how changing the clocks for Daylight Saving Time runs counter to human nature – and nature itself – https://theconversation.com/an-indigenous-approach-shows-how-changing-the-clocks-for-daylight-saving-time-runs-counter-to-human-nature-and-nature-itself-253097

Pumpkins’ journey from ancient food staple to spicy fall obsession spans thousands of years

Source: The Conversation – USA – By Shelley Mitchell, Senior Extension Specialist, Horticulture and Landscape Architecture, Oklahoma State University

Pumpkin patch excursions have become a fall staple in many U.S. households. Creative Touch Imaging Ltd./NurPhoto via Getty Images

October in much of the U.S. brings cooler weather, vibrant fall colors and, of course, pumpkin-spiced everything. This is peak pumpkin season, with most of the American pumpkin crop harvested in October.

With the pumpkin spice craze fully underway, I find myself thinking more about pumpkins. As an extension specialist working at Oklahoma State University’s botanic garden, I educate the people pouring in to buy pumpkins at our annual sale about the plant’s storied history and its prominence today.

While people often picture pumpkins as bright orange, they actually come in a wide range of colors, including red, yellow, white, blue and even green. They vary in size and texture too: Some are smooth, others warty. They can even be miniature or giant.

The word “pumpkin” comes from the Greek word “pepon,” meaning “large melon.” Botanically, pumpkins are fruits because they contain seeds, and they belong to the squash family, Cucurbitaceae. This family also includes cucumbers, zucchini and gourds. Pumpkins are grown for many purposes: food, seasonal decorating, carving for Halloween and even giant pumpkin contests.

A crowd of people look at five large pumpkins lined up on small platforms
Some pumpkins can be over 1,000 pounds. Pumpkin-growing contests are common at county and state fairs.
Joseph Prezioso/AFP via Getty Images

All 50 states produce some pumpkins, with Illinois harvesting the most. In 2023, Illinois grew 15,400 acres of pumpkins. The next largest amount was grown in Indiana, with about 6,500 acres.

Pumpkin yields vary each year, depending on the varieties grown and the growing conditions in each area. The top six pumpkin-producing states are California, Illinois, Indiana, Michigan, Pennsylvania and Washington.

Early pumpkin history

Pumpkins originated in Central and South America, ending up in North America as Native Americans migrated north and carried the seeds with them. The oldest pumpkin seeds discovered were found in Mexico and date back about 9,000 years.

Pumpkins were grown as a crop even before corn or beans, the other two sisters in a traditional Native American “three sisters” garden. The three sister crops – corn, beans and squash – are planted together, and each has a role in helping the others grow.

Native Americans planted corn in the spring, and once the plants were a few inches tall, they planted beans. As the corn grows, the beans vine around the stalks, which serve as a natural trellis. Beans can also take nitrogen from the atmosphere and, with the help of bacteria, convert it into forms that plants can use as fertilizer, such as ammonia.

After the beans started growing, it was time to plant squash, such as pumpkin. Squash leaves covered the ground, shading the soil and helping keep it moist. The giant leaves also helped reduce the number of weeds that would compete with the corn, bean and squash growth.

Every part of the pumpkin plant is edible, even the flowers. Some Native American groups would dry pumpkins’ tough outer shells, cut them into strips and weave them into mats.

Pumpkins were introduced to Europe from North America through the Columbian Exchange. Europeans found that pumpkins grown in the New World were easier to cultivate and sweeter than those grown in 1600s England or France, likely due to the weather and soil conditions in the Americas.

A black and white illustration of a group of people loading pumpkins in a cart.
People have been harvesting pumpkin for centuries. This historical illustration from around 1893 shows the pumpkin harvest in Hungary.
bildagentur-online/uig via Getty Images

Baking American pumpkins

Native Americans introduced early settlers to pumpkins, and the colonists eagerly incorporated them into their diet, even making pies with them.

Early settlers’ pumpkin pies were hollowed-out pumpkins filled with milk, honey and spices, cooked over an open fire or in hot ashes. Others followed English traditions, combining pumpkin and apple with sugar and spices between two crusts.

The custard-style pumpkin pie we know today first appeared in 1796 as part of the first cookbook written and published in the United States, “American Cookery,” by Amelia Simmons. There were actually two pumpkin pie recipes: one used mace, nutmeg and ginger, the other just allspice and ginger.

The pumpkin spice craze

Pumpkin spice, sold as a single premixed blend for convenience, first appeared in the early 1930s. The mix typically includes cinnamon, nutmeg, ginger, allspice and cloves.

Pumpkins and pumpkin spice are now synonymous with fall in America. Pumpkin spice flavoring is used in candles, marshmallows, coffees, lotions, yogurts, pretzels, cookies, milk and many other products.

A white mug with a Starbucks logo, filled with foamy coffee and powdered cinnamon on top.
Starbucks’ pumpkin spice latte kicked off the craze that put this seasonal flavor in high demand.
Beata Zawrzel/NurPhoto via Getty Images

While pumpkin spice is available in one form or another all year long, sales of pumpkin-spiced products increase exponentially in the fall. The pumpkin spice craze is so popular that the start of the pumpkin spice season is a couple of months before the pumpkins themselves are even ready to harvest in October.

Pumpkin excursions

Americans continue to wholeheartedly embrace pumpkins today. Commercially grown pumpkins are typically hand-harvested as soon as they mature, when the skins are hard enough that they can’t be dented by a press of the thumb.

Children often take field trips to pumpkin patches to pick their own. With the growing popularity of agritourism, many farmers now invite customers into the field to harvest pumpkins themselves, earning more per pumpkin than they could by selling through wholesale markets. Customer harvesting also reduces labor costs, produces immediate profits and builds community relationships.

In addition, farmers often combine the you-pick experience with other sources of income: corn mazes, hay rides, petting zoos and more. The customers get fresher fruit, enjoy a fun and educational activity and support the local economy.

This year, pumpkin spice flavors were available across the United States by late August, and the industry started promoting pumpkin spice season in July. But because fall offers the right conditions for pumpkin picking, the season will keep its hold on the pumpkin spice craze, and consumers will continue to eagerly await its return each year.

The Conversation

Shelley Mitchell does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Pumpkins’ journey from ancient food staple to spicy fall obsession spans thousands of years – https://theconversation.com/pumpkins-journey-from-ancient-food-staple-to-spicy-fall-obsession-spans-thousands-of-years-268260

Dinosaur ‘mummies’ help scientists visualize the fleshy details of these ancient animals

Source: The Conversation – USA – By Paul C. Sereno, Professor of Paleontology, University of Chicago

A mummy of a juvenile duck-billed dinosaur, _Edmontosaurus annectens_, preserved as a dried carcass. Tyler Keillor/Fossil Lab

Dinosaur “mummies” couldn’t have been further from my mind as I trudged up a grassy knoll on the Zerbst Ranch in east-central Wyoming, followed by University of Chicago undergraduates on a field trip linked to my “Dinosaur Science” course.

As a university professor, I realized early that to understand paleontology, students would need to see first-hand where fossils are born. And that field experience had to be real, a place I wanted to be – somewhere where we had a shot at discovery.

I chose outcrops of the Lance Formation, a rock formation composed largely of sandstones laid down during the last few million years of the dinosaur era. These rocks are well exposed in the parched badlands of Wyoming, crisscrossed for more than a century by dinosaur hunters. Yet, perhaps they missed something.

Then I saw it.

At the top of the hill lay a massive concretion – a hardened, iron-stained rock the size of a compact car – surrounded by some fossil bone fragments. Poking from its side were a series of small rod-shaped bones I recognized as the stomach ribs of the giant predator Tyrannosaurus rex.

Mummies nearby

But T. rex wasn’t alone among the amazing finds that field season. On that same field trip, colleagues working nearby uncovered two fossilized duckbills – a plant-eating dinosaur that roamed in herds and grew to the length of T. rex. They showed signs of extraordinary preservation.

Poking out of the vertical wall of a cutbank in a seasonally dry river was a vertebra – part of the backbone – and some ossified tendons.

“What do you think?” asked my colleague Marcus Eriksen, who counts paleoartistry, science education and environmentalism as his other mainstays alongside paleontology. “You’ve got the back half of a duckbill,” I said, referring to Edmontosaurus annectens, the formal name for the dinosaur most likely to be on T. rex’s dinner menu.

It would take Marcus two field seasons to remove 15 feet of rock overlying the skeleton. To his surprise, the tail bones were covered with large areas of scaly skin and topped by a row of spikes. When I visited the exposed skeleton and took a look at its feet, I saw a hairline around the final toe bone. “Pull back, take more,” I said, wide-eyed at what I saw. “I think it has the hooves.”

A rock showing the outline of several bones and a hoof.
A juvenile duckbill dinosaur’s hoof preserved as a thin layer of clay.
Tyler Keillor/Fossil Lab

Yet another group of bone hunters in the area found a Triceratops skeleton next to a large slab of its scaly skin. Finding even a patch of skin on a skeleton merits celebration in paleontological circles. Discovering large areas of the outer fleshy surface of a dinosaur is the find of a lifetime.

Mummification mystery

How is the skin of a dinosaur “mummy” preserved? What composes the “skin impressions”?

Are these dinosaur “mummies” preserved like the human mummies from Egypt, where salt and oils applied after someone died were used to desiccate and then preserve skin, hair, internal organs and, as recently shown, their genomes?

No. Dinosaur “mummies” don’t preserve dehydrated skin. But many researchers thought that, just maybe, traces of tissue structure or even original organic materials might remain.

A hand touching a beige rock with a lined, crest pattern and dimpled texture.
A fossil preserves the scaly skin of a crest over the back of a juvenile duck-billed dinosaur.
Tyler Keillor/Fossil Lab

To lift the veil on dinosaur mummification, I needed expertise and digital savvy beyond my own. I recruited Evan Saitta to rigorously determine the composition of the ancient scaly skin, after learning he was cooking reptile skin to mimic fossilization.

I brought others on board: Dan Vidal, a Spanish paleontologist able to digitally capture surface detail in 3D; Nathan Myhrvold, a polymath scientist fresh off studying the chemistry of barbecue; Stephanie Baumgart, a paleontologist steeped in the CT scans of living vertebrates; María Ciudad Real and Lauren Bop, the first skilled at analyzing CT scans and the latter at combining them into composite figures; Tyler Keillor, who would invent new methods to clean ancient skin tissue; and Dani Navarro, a superb Spanish paleoartist who reimagines prehistoric scenes.

Clay mask, crests, hooves and scales

We used a diamond blade to section the skin, spikes and hooves, and found that all were made of a very thin bounding layer of clay – a clay mask or template – one-hundredth of an inch (less than 1 millimeter) thick. The sand on both sides of the clay layer was indistinguishable, suggesting that when the carcass was buried, the same sand that pressed against the outside also entered the dried, hollowed carcass through many cracks and holes, filling all of the internal spaces. Even the spaces inside the spikes and hooves were sand-filled.

We found no evidence of tissue structures inside the clay layer, whether looking at scaly skin, a spike or a hoof. Nor could we find any trace of original organic materials. In other words, the original skin inside the clay layer must have decayed and been washed away by groundwater, the same water that saturated the bones en route to their fossilization.

The very real-looking skin, spikes and hooves of our duckbill are actually a mask of clay, a thin layer applied to the outside that captured all of the original form and texture of the fleshy body surface.

To test the digital rendering we’d created, we compared the digital version of the duckbill’s hoofed foot with a fossilized duckbill footprint on a museum shelf in Canada, discovered in beds of the same age as those of the Lance Formation. We scaled our foot up slightly and found it a snug fit. Together, the foot and footprint generated the first complete view of a duckbill’s fleshy foot.

The only duckbill alive at this time was Edmontosaurus annectens, the likely track-maker. The footprint was preserved so perfectly that it showed the scales on the sole of the foot.

The ‘mummy’ zone

The Lance Formation’s unique geology allowed many of these dinosaur mummies to be preserved under clay, in a small area.

Drilling in pursuit of natural gas and oil in Wyoming has shown that the sandstone composing the Lance Formation is extraordinarily thick beneath the mummies, measuring more than 1,000 meters (over 3,200 feet). This is five times thicker than anywhere else in the West, suggesting that the formation subsided more quickly in the mummy zone, with periodic floods covering up dried dinosaur carcasses.

In this last epoch of the dinosaur era in western North America, a monsoonal climate took hold. Severe droughts brought death to vast herds of duck-billed dinosaurs – for some, right as they looked for the last bit of water in a dry riverbed before succumbing. Flash floods followed, bringing tons of sandy sediment that would cover a sun-dried dinosaur carcass in an instant.

Only rarely do scientists have the chance to accurately visualize what any large dinosaur looked like when alive, because all we normally have are bones to reconstruct beasts with no close living analog. Dinosaur “mummies” give us that extraordinary opportunity through a fluke of preservation.

The dream research team I assembled was able to clean, scan, resize, combine and otherwise restore the life appearance of a duck-billed dinosaur from rare dinosaur mummies – breathing life back into the fossils, and allowing all to appreciate the grandeur of past life.

The Conversation

Paul C. Sereno is President of the Scitopia Foundation, a 501c3 organization dedicated to science education in out-of-school time, with operations and a founding facility in planning in Chicago (Scitopia Chicago).

ref. Dinosaur ‘mummies’ help scientists visualize the fleshy details of these ancient animals – https://theconversation.com/dinosaur-mummies-help-scientists-visualize-the-fleshy-details-of-these-ancient-animals-267619

Navigating mental illness in the workplace can be tricky, but employees are entitled to accommodations

Source: The Conversation – USA (3) – By Julie Wolfe, Assistant Professor of Psychiatry, University of Colorado Anschutz Medical Campus

Coping with mental illness can make starting and completing simple tasks at work more difficult. Fiordaliso/Moment via Getty Images

Mental health challenges can affect anyone, regardless of background or circumstance, and they are becoming more common across the United States.

In 2022, a national survey found that about 60 million American adults – approximately 23% of the U.S. adult population – were living with a mental illness, defined as a diagnosable mental, emotional or behavioral disorder.

This translates to a nearly 37% increase over the past decade.

These conditions can have a profound and lasting effect on patients’ lives, including their ability to engage meaningfully and sustainably in the workforce.

Globally, depression and anxiety are estimated to lead to 12 billion lost working days annually, costing an estimated US$1 trillion per year in lost productivity worldwide and $47 billion in the United States.

I am a medical director and practicing psychiatrist. I work with graduate students, residents, faculty and staff on a health science campus, supporting their mental health – including when it intersects with challenges in the workplace.

I often meet with patients who feel unsure about how to approach conversations with their schools, programs or employers regarding their mental health, especially when it involves taking time off for care. This uncertainty can lead to delays in treatment, even when it’s truly needed.

Mental health by the numbers

Anxiety and depression are the most common mental health conditions in the U.S. Nineteen percent of American adults suffer from an anxiety disorder, and more than 15% have depression.

Meanwhile, about 11% of Americans experience other conditions such as post-traumatic stress disorder, commonly known as PTSD, bipolar disorder, borderline personality disorder or obsessive-compulsive disorder.

Rates of anxiety and depression increased worldwide during the COVID-19 pandemic. But one positive consequence of the pandemic is that talking about mental health has become more normalized and less stigmatized, including in the workplace.

Struggling at work

For those with mental illness, the traditional expectation of maintaining a strict separation between personal and professional life is not only unrealistic, it may even be detrimental. The effect of mental illness on a person’s work varies depending on the type, severity and duration of their symptoms.

For instance, severe depression can affect basic self-care, making it difficult to complete tasks such as bathing, eating or even getting out of bed. Severe anxiety can also be profoundly debilitating and limit a person’s ability to leave the house due to intense fear or panic. The symptoms of such severe mental illness may make it difficult even to show up to work.

On the other hand, someone struggling with mild depression or anxiety may have a hard time initiating or completing tasks that they would typically manage with ease and find it difficult to interact with colleagues. Both depression and anxiety may affect sleep, which can contribute to cognitive lapses and increased fatigue during the work day.

Someone with PTSD may find that certain environments remind them of traumatic experiences, making it difficult to fully engage in their work. And a person experiencing a manic episode related to bipolar disorder might need to take time away from work entirely to focus on their stabilization and recovery.

Knowing when to ask for help

Identifying a trusted colleague, supervisor or human resources representative can be an important first step in managing your mental health at work. While selecting the right person to confide in may be challenging, especially given the vulnerability associated with disclosing mental health concerns, doing so can open pathways to appropriate resources and tailored support services.

For instance, it might encourage an employer to consider offering access to free or low-cost mental health care if it’s not already available, or to provide flexible scheduling that makes it easier for employees to get mental health treatment.

It’s also important to be aware of changes in your mental health. The earlier you can recognize signs of decline, the sooner you can get the support that you need, which might prevent symptoms from worsening.

On the other hand, sharing sensitive information with someone who is not equipped to respond appropriately could lead to unintended consequences, such as workplace gossip, unmet expectations and increased frustration due to perceived lack of support. However, even if your supervisor or manager is not understanding, that doesn’t change the fact that you have rights in the workplace.

In 2022, U.S. Surgeon General Vivek Murthy warned that American workplaces needed to change to better support employees’ well-being.

Consider exploring accommodations

The Americans with Disabilities Act provides critical protections for individuals with disabilities in the workplace. Under the act, it is unlawful for employers to discriminate against qualified individuals based on a disability.

The law also requires employers to provide reasonable accommodations so that people who qualify can participate fully in the workplace, provided the accommodations do not impose an undue burden on the employer.

There are many reasonable accommodations for workers with mental illness. These can include protected time to attend mental health appointments and flexibility in work schedules and work location.

For instance, if your job allows for it, working from home can be helpful. If your job requires being on site, a private work space is another reasonable accommodation. Someone with anxiety might find that working in a quiet, private space helps reduce distractions that trigger their symptoms, making it easier for them to stay focused and get things done.

Other possible accommodations include providing sick leave or flexible vacation time to use for mental health days or appointments, or allowing an employee to take breaks according to their individual needs rather than a fixed schedule. Employers can also provide support by offering equipment or technology such as white noise machines or dictation software.

The role of the workplace

An organization’s commitment to supporting employee mental health can play a large role in shaping how well employees perform at work – and, ultimately, the organization’s success.

Relying on individual employees to manage their mental health is not a sustainable long-term strategy for employers and may lead to significant workplace disruptions, such as more missed work days and lower productivity.

Studies show that when employers lead targeted initiatives promoting mental health, overall workplace functioning and resilience improve. These initiatives might include educating employees on mental health, providing accessible care, helping employees have better work-life balance and designing supportive workplace policies for those who are struggling. These steps help reduce stigma and signal to employees that it’s safe to seek support.

The Conversation

Julie Wolfe does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Navigating mental illness in the workplace can be tricky, but employees are entitled to accommodations – https://theconversation.com/navigating-mental-illness-in-the-workplace-can-be-tricky-but-employees-are-entitled-to-accommodations-259802

Demolishing the White House East Wing to build a ballroom embodies Trump’s heritage politics

Source: The Conversation – USA – By R. Grant Gilmore III, Director, Historic Preservation and Community Planning Program, College of Charleston

Demolition in process on the East Wing of the White House, Oct. 23, 2025. AP Photo/Jacquelyn Martin

From ancient Egypt to Washington, D.C., rulers have long used architecture and associated stories to project power, control memory and shape national identity. As 17th-century French statesman Jean-Baptiste Colbert observed:

“In the absence of brilliant deeds of war, nothing proclaims the greatness and spirit of princes more than building works.”

Today, the Trump administration is mobilizing heritage and architecture as tools of ideology and control. In U.S. historic preservation, “heritage” is the shared, living inheritance of places, objects, practices and stories – often plural and contested – that communities value and preserve. America’s architectural heritage is as diverse as the people who created, inhabited and continue to care for it.

As an archaeologist with three decades of practice, I read environments designed by humans. Enduring modifications to these places, especially to buildings and monuments, carry power and speak across generations.

In his first term as president, and even more so today, Donald Trump has pushed legacy-building through architecture and heritage policy to an extreme. He is remaking the White House physically and metaphorically in his image, consistent with his long record of putting his name on buildings as a developer.

In December 2020, Trump issued an executive order declaring classical and traditional architectural styles the “preferred” design for new federal buildings. The order derided Brutalist and modernist structures as inconsistent with national values.

Now, Trump is seeking to roll back inclusive historical narratives at U.S. parks and monuments. And he is reviving sanitized myths about America’s history of slavery, misogyny and Manifest Destiny, for use in museums, textbooks and public schools.

Yet artifacts don’t lie. And it is the archaeologist’s task to recover these legacies as truthfully as possible, since how the past is remembered shapes the choices a nation makes about its future.

The Trump administration tore down much of the White House East Wing without either consulting historians and preservation experts or opening the project to public comment.

Architecture as political power and legacy

Dictators, tyrants and kings build monumental architecture to buttress their own egos, a practice known as authoritarian monumentalism. They also seek to build the national ego – another word for nationalism.

Social psychologists have found that the awe we experience when we encounter something vast diminishes the “individual self,” making viewers feel respect and attachment to creators of awesome architecture. Authoritarian monumentalism often exploits this phenomenon. For example, in France, King Louis XIV expanded the Palace of Versailles and renovated its gardens in the mid-1600s to evoke perceptions of royal grandeur and territorial power in visitors.

Many leaders throughout history have built “temples to power” while erasing or overshadowing the memory of their predecessors – a practice known as damnatio memoriae, or condemnation to oblivion.

In the ancient world, the Sumerians, Babylonians, Egyptians, Romans, Chinese dynasties, Mayans and Incas all left behind architecture that still commands awe in the form of monuments to gods, rulers and communities. These monuments conveyed power and often served as instruments of physical and psychological control.

In the 19th century, Napoleon fused conquest with heritage. Expeditions to Egypt and Rome, and the building of Parisian monuments – the Arc de Triomphe and the Vendôme Column, both modeled on Roman precedents – reinforced his legitimacy.

Albert Speer’s and Hermann Giesler’s monumental neoclassical designs in Nazi Germany, such as the party rally grounds in Nuremberg, were intended to overwhelm the individual and glorify the regime. And Josef Stalin’s Soviet Union suppressed avant-garde experimentation in favor of monumental “socialist realist” architecture, projecting permanence and centralized power.

Now, Trump has proposed building his own triumphal arch in Arlington, Virginia, just across the Potomac River from the Lincoln Memorial, as a symbol to mark the 250th anniversary of the Declaration of Independence.

Four men in military uniforms with swastika arm bands inspect a large model of a stadium.
German Chancellor Adolf Hitler and architect Albert Speer in 1932, inspecting Speer’s model for a huge stadium to be built at Nuremberg.
Corbis via Getty Images

An American alternative

Born of Enlightenment ideals of John Locke, Voltaire and Adam Smith, the American Revolution rejected the European idea of monarchs as semidivine rulers. Instead, leaders were expected to serve the citizenry.

That philosophy took architectural form in the Federal style, which was dominant from about 1785 to 1830. This clear, democratic architectural language was distinct from Europe’s ornate traditions, and recognizably American.

Its key features were Palladian proportions – measurements rooted in classical Roman architecture – and an emphasis on balance, simplicity and patriotic motifs.

James Hoban’s White House and Thomas Jefferson’s Monticello embodied this style. Interiors featured lighter construction, symmetrical lines, and motifs such as eagles, urns and bellflowers. They rejected the opulent rococo styles associated with monarchy.

Americans also recognized preservation’s political force. In 1816, the city of Philadelphia bought Independence Hall – constructed in 1753, and the place where the Declaration of Independence and the Constitution were debated and signed – to keep it from being demolished. Today the building is part of a U.S. national park and a UNESCO World Heritage Site.

Early preservationists saved George Washington’s home, Mount Vernon, Jefferson’s Monticello, and other landmarks, tying democracy’s endurance to the built environment.

Architecture, memory and Trump

In remaking the White House and prescribing the style and content of many federal sites, Trump is targeting not just buildings but the stories they tell.

By challenging narratives that depart from white, Anglo-Saxon origin myths, Trump is using his power to roll back decades of work toward creating a more inclusive national history.

These actions ignore the fact that America’s strength lies in its identity as a nation of immigrants. The Trump administration has singled out the Smithsonian Institution – the world’s largest museum complex, founded “for the increase and diffusion of knowledge” – for ideological reshaping. Trump also is pushing to restore recently removed Confederate monuments, helping to revive “Lost Cause” mythology about the Civil War.

Trump’s 2020 order declaring classical and traditional architectural styles the preferred design for government buildings echoed authoritarian leaders like Adolf Hitler and Stalin, whose governments sought to dictate aesthetics as expressions of ideology. The American Institute of Architects publicly opposed the order, warning that it imposed ideological restrictions on design.

Trump’s second administration has advanced this agenda by adopting many recommendations in the Heritage Foundation’s Project 2025 blueprint. Notably, Project 2025 calls for repealing the 1906 Antiquities Act – which empowers presidents to quickly designate national monuments on federal land – and for shrinking many existing monuments. Such rollbacks would undercut the framework that has safeguarded places like Devils Tower in Wyoming and Muir Woods in California for over a century.

Trump’s new ballroom is a distinct departure from the core values embodied in the White House’s Federal style. Although many commentators have described it as rococo, it is more aligned with the overwrought and opulent styles of the Gilded Age – a time in American history, from about 1875 through 1895, with many parallels to the present.

In ordering its construction, Trump has ignored long-standing consultation and review procedures that are central to historic preservation. The demolition of the East Wing may have ignored processes required by law at one of the most important U.S. historic sites. It’s the latest illustration of his unilateral and unaccountable methods for getting what he wants.

A man in a suit at a podium holds out a model of a white arch topped with a gilded statue.
At a press conference on Oct. 15, 2025, President Donald Trump holds a model of a proposed arch to be built across the Potomac River from the Lincoln Memorial.
Kevin Dietsch/Getty Images

Instruments of memory and identity

When leaders push selective histories and undercut inclusive ones, they turn heritage into a tool for controlling public memory. This collective understanding and interpretation of the past underpins a healthy democracy. It sustains a shared civic identity, ensures accountability for past wrongs and supports rights and participation.

Heritage politics in the Trump era seeks to redefine America’s story and determine who gets to speak. Attacks on so-called “woke” history seek to erase complex truths about slavery, inequality and exclusion that are essential to democratic accountability.

Architecture and heritage are never just bricks and mortar. They are instruments of memory, identity and power.

The Conversation

R. Grant Gilmore III does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Demolishing the White House East Wing to build a ballroom embodies Trump’s heritage politics – https://theconversation.com/demolishing-the-white-house-east-wing-to-build-a-ballroom-embodies-trumps-heritage-politics-264947

Influencers could learn a thing or two from traditional journalism about disclosing who’s funding their political coverage

Source: The Conversation – USA – By Edward Wasserman, Professor of Journalism, University of California, Berkeley

When influencers accept money and don’t disclose it, then they’re being influenced. Bambu Productions, Getty Images

Online influencers, through their postings on Instagram, Threads, TikTok and elsewhere, have created an exuberant universe of news and commentary that often outruns mainstream media in reach and even impact. They work the same waterfront as journalism and public relations, but their relationship with those mainstay practices built around fact and advocacy is an uneasy one.

And when it comes to the rules that are supposed to keep communicators honest, they have been slow to step up, as a raging controversy over undisclosed payments to freelance influencers shows.

For the past month, social media has been ablaze with postings about a provocative story alleging improper political influence among left-leaning online commentators. Headlined “A Dark Money Group is Secretly Funding High-Profile Democratic Influencers,” it ran in Wired, the San Francisco-based magazine that specializes in tech, and was written by Taylor Lorenz, a high-profile reporter who has built a stormy career covering tech for outlets including The Washington Post, The New York Times and The Atlantic.

The 3,600-word article focused on Chorus, described as a secretive arm of the Sixteen Thirty Fund, whose wide-ranging support for progressive causes totals more than US$100 million a year. Starting in spring 2025, Lorenz reported, Chorus quietly recruited and supported a coterie of liberal political influencers, with monthly stipends of anywhere from $250 to $8,000.

Just how tightly Chorus sought to control what the 90-some freelancers actually produce is somewhat unclear, and was sharply disputed in the reaction to the article.

But what is clear to me, as a journalist and student of media ethics, is that any creators who conceal financial support while weighing in on matters of interest to their funders are, by implication, falsely presenting themselves as independent voices. They are no less deceitful than the business journalist who covers a company they secretly invest in.

Furious response misses the point

The Wired story declared that the program supporting influencers, called the Chorus Creator Incubator Program, “was aimed at bolstering Democratic messaging on the internet.” Funded commentators got regular briefings with lawmakers and others, organized by Chorus, on newsworthy issues.

The paid influencers also allegedly agreed to forewarn Chorus about interviews with prominent sources, Lorenz wrote, saying “creators in the program must funnel all bookings with lawmakers and political leaders through Chorus. Creators also have to loop Chorus in on any independently organized engagements with government officials or political leaders.”

And, a big red flag for anyone concerned with ethical communications practices: Participating influencers were also prohibited from telling anybody about the money they were getting.

The Wired story plainly hit a nerve and triggered a spasm of angry postings on Instagram, TikTok, Bluesky, YouTube, X, Facebook and other social media sites. But for all their passion, the comments brought to light the disheveled state of online ethics.

Ad hominem attacks predominated. Posts denounced Lorenz as a liar and a hypocrite who had no business exposing the program because she herself admittedly receives similar funding. Some said her real motive was sabotaging the left. Others praised the Chorus program as a valuable attempt to sharpen the skills of participants and enrich their reporting. Still others asserted that Chorus is not hands-on, never assigns or edits anybody’s stories and is an overdue corrective that gives left-leaning influencers just the kind of support the political right has had for years.

Only rarely did the commentary touch on what should have been the white-hot core of the problem: The absence of a shared understanding of the basic responsibilities that online influencers have to the people they serve. Those responsibilities are no different from those of journalists or professional advocates – to come clean.

As Don Heider, head of the Markkula Center for Applied Ethics at Santa Clara University, told Lorenz with admirable clarity: “If the contract for getting money from a particular interest group says you can’t disclose it, then it’s pretty simple, you can’t take the money.” Or, said the influencer Overopinionatedbrit3 on TikTok: “If you are getting paid disclose or you are an influencer being influenced.”

An obligation to disclose

The principle of disclosure is one that is widely accepted by professional communicators.

From their earliest iterations a century ago, journalism codes have recognized that conflict of interest is perhaps the most toxic threat to the credibility of reporters and the trust they seek from audiences.

The Public Relations Society of America has based its efforts to professionalize advocacy in part on an insistence that practitioners not conceal support or withhold information about whose message they are conveying – prohibitions that, sadly, are not universally observed. One notorious breach was the use of on-air “military analysts” by CNN and other networks during the 2003 invasion of Iraq. They were typically former high-ranking officers now employed by arms contractors whose paychecks depended on cordial relations with the Pentagon, but who nevertheless proffered supposedly independent expert appraisals of the U.S. military campaign to CNN viewers. None of that was disclosed to the public.

Femi Redwood, who chairs the National Association of Black Journalists LGBTQ+ task force, was one of the few respondents among the flood of comments on the Wired story who zeroed in on the absence of online standards. Redwood defined the problem as “the intersection of news and influencing without the ethics of journalism,” and called for a code that would make the ethical obligations of influencers explicit.

The world of influencers is, admittedly, a bit of a Wild West. How universally ethical guidelines would be embraced and whether platforms might find the stomach to consider enforcement remain open – but pivotal – questions.

The lure of the high road

But the social media commentariat may be receptive.

Online practitioners have long claimed greater intellectual independence and cleaner hands than the legacy newspeople they challenge, who they say are trapped in the cobwebs of institutional bias and material thralldom. Much of their claim to the high road has rested on their greater candor – renamed transparency and hailed as the “new objectivity” – which calls for influencers to fess up about their predispositions and biases rather than go the traditional mainstream route and imply they have none. Secrecy over funding, plainly, is incompatible with such transparency.

Indeed, in the current moment, when colossal news organizations have been brought to heel by an administration in Washington that uses their owners’ financial ambitions to enforce ideological discipline, the influencers’ potential ability to claim moral superiority seems even stronger.

But the freelance model doesn’t ensure independence. It may only create a shifting roster of dependencies and allegiances that are wholly invisible to the audience being served and a potent source of corruption.

Disclosure is an imperfect remedy. But failing to adopt it as a minimum expectation leaves the robust online universe with a moral taint that is lethal to trust.

The Conversation

Edward Wasserman does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Influencers could learn a thing or two from traditional journalism about disclosing who’s funding their political coverage – https://theconversation.com/influencers-could-learn-a-thing-or-two-from-traditional-journalism-about-disclosing-whos-funding-their-political-coverage-267905

You’ve just stolen a priceless artifact – what happens next?

Source: The Conversation – USA (2) – By Leila Amineddoleh, Adjunct Professor of Law, New York University

The tiara of Empress Eugénie was one of eight priceless pieces of jewelry stolen from the Louvre in Paris on Oct. 19, 2025. Zhang Mingming/VCG via Getty Images

The high-profile heist at the Louvre in Paris on Oct. 19, 2025, played out like a scene from a Hollywood movie: a gang of thieves steal an assortment of dazzling royal jewels on display at one of the world’s most famous museums.

But with the authorities hot in pursuit, the robbers still have more work to do: How can they capitalize on their haul?

Most stolen works are never found. In the art crime courses I teach, I often point out that the recovery rate is below 10%. This is particularly disturbing when you consider that between 50,000 and 100,000 artworks are stolen each year globally – the actual number may be higher due to underreporting – with the majority stolen from Europe.

That said, it’s quite difficult to actually make money off stolen works of art. Yet the types of objects stolen from the Louvre – eight pieces of priceless jewelry – could give these thieves an upper hand.

At 9:30 a.m. local time on Oct. 19, 2025, four thieves reportedly used a lift mounted on a vehicle to enter the Louvre’s Apollo Gallery.
Murat Usubali/Anadolu via Getty Images

A narrow market of buyers

Pilfered paintings can’t be sold on the art market because thieves can’t convey what’s known as “good title,” the ownership rights that belong to a legal owner. Furthermore, no reputable auction house or dealer would knowingly sell stolen art, nor would responsible collectors purchase stolen property.

But that doesn’t mean stolen paintings don’t have value.

In 2002, thieves broke into Amsterdam’s Van Gogh Museum through the roof and departed with “View of the Sea at Scheveningen” and “Congregation Leaving the Reformed Church in Nuenen” in tow. In 2016, Italian police recovered the relatively unscathed artworks from a Mafia safehouse in Naples. It isn’t clear whether the Mafia actually purchased the works, but it’s common for criminal syndicates to hold onto valuable assets as collateral of some sort.

Van Gogh’s 1884-85 oil on canvas painting ‘Congregation Leaving the Reformed Church in Nuenen’ was one of two of the artist’s works stolen from Amsterdam’s Van Gogh Museum in 2002.
Van Gogh Museum

Other times, stolen works do unwittingly end up in the hands of collectors.

In the 1960s in New York City, an employee of the Guggenheim Museum stole a Marc Chagall painting from storage. But the crime wasn’t even discovered until an inventory was taken years later. Unable to locate the work, the museum simply removed it from its records.

In the meantime, collectors Jules and Rachel Lubell bought the piece for US$17,000 from a gallery. When the couple requested that an auction house review the work for an estimate, a former Guggenheim employee at Sotheby’s recognized it as the missing painting.

The Guggenheim demanded that the painting be returned, and a contentious court battle ensued. In the end, the parties settled the case, and the painting was returned to the museum after an undisclosed sum was paid to the collectors.

Some people do knowingly buy stolen art. After World War II, stolen works circulated on the market, with buyers fully aware of the widespread plunder that had just taken place across Europe.

Eventually, international laws were developed that gave the original owners the opportunity to recover looted property, even decades after the fact. In the U.S., for example, the law even allows descendants of the original owners to regain ownership of stolen works, provided they can offer enough evidence to prove their claims.

Jewels and gold easier to monetize

The Louvre theft didn’t involve paintings, though. The thieves came away with bejeweled property: a sapphire diadem; a necklace and single earring from a matching set linked to 19th-century French queens Marie-Amélie and Hortense; an opulent matching set of earrings and a necklace that belonged to Empress Marie-Louise, Napoleon Bonaparte’s second wife; a diamond brooch; and Empress Eugénie’s diadem and her corsage-bow brooch.

These centuries-old, exquisitely crafted works have unique historic and cultural value. But even if each one were broken to bits and sold for parts, they would still be worth a lot of money. Thieves can peddle the precious gemstones and metals to unscrupulous dealers and jewelers, who could reshape and sell them. Even at a fraction of their value – the price received for looted art is always far lower than that received for legitimately sourced art – the gems are worth millions of dollars.

An emerald-and-diamond necklace that belonged to Napoleon’s second wife, Empress Marie Louise, was among the items stolen from the Louvre on Oct. 19, 2025.
Maeva Destombes/Hans Lucas/AFP via Getty Images

While it is difficult to sell stolen goods on the legitimate market, there is an underground market for looted artworks. The pieces may be sold in backrooms, in private meetings or even on the dark web, where participants cannot be identified. Studies have also revealed that stolen – and sometimes forged – art and antiquities often appear on mainstream e-commerce sites like Facebook and eBay. After making a sale, the vendor may delete his or her online store and disappear.

A heist’s sensational allure

While films like “The Thomas Crown Affair” feature dramatic heists pulled off by impossibly attractive bandits, most art crimes are far more mundane.

Art theft is usually a crime of opportunity, and it tends to take place not in the heavily guarded halls of cultural institutions, but in storage units or while works are in transit.

Most large museums and cultural institutions do not display all the objects within their care. Instead, the rest sit in storage. Less than 10% of the Louvre’s collection is ever on display at one time – only about 35,000 of the museum’s 600,000 objects. The remainder can go unseen for years, even decades.

Works in storage can be unintentionally misplaced – like Andy Warhol’s rare silkscreen “Princess Beatrix,” which was likely accidentally discarded, along with 45 other works, during the renovation of a Dutch town hall – or simply pilfered by employees. According to the FBI, around 90% of museum heists are inside jobs.

In fact, days before the Louvre crime, a Picasso work valued at $650,000, “Still Life with Guitar,” went missing during its journey from Madrid to Granada. The painting was part of a shipment including other works by the Spanish master, but when the shipping packages were opened, the piece was missing. The incident received much less public attention.

To me, the biggest mistake the thieves made wasn’t abandoning the crown they dropped or the vest they discarded, essentially leaving clues for the authorities.

Rather, it was the brazen nature of the heist itself – one that captured the world’s attention, all but ensuring that French detectives, independent sleuths and international law enforcement will be on the lookout for new pieces of gold, gems and royal bling being offered up for sale in the years to come.

The Conversation

Leila Amineddoleh does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. You’ve just stolen a priceless artifact – what happens next? – https://theconversation.com/youve-just-stolen-a-priceless-artifact-what-happens-next-267947