At one elite college, over 80% of students now use AI – but it’s not all about outsourcing their work

Source: The Conversation – USA (2) – By Germán Reyes, Assistant Professor of Economics, Middlebury

Students have quickly incorporated the likes of ChatGPT into their work, but little research is available on how they’re using generative AI. Photo by Alejandra Villa Loarca/Newsday RM via Getty Images

Over 80% of Middlebury College students use generative AI for coursework, according to a recent survey I conducted with my colleague and fellow economist Zara Contractor. This is one of the fastest technology adoption rates on record, far outpacing the 40% adoption rate among U.S. adults – and it happened less than two years after ChatGPT’s public launch.

Although we surveyed only one college, our results align with similar studies, providing an emerging picture of the technology’s use in higher education.

Between December 2024 and February 2025, we surveyed over 20% of Middlebury College’s student body, or 634 students, to better understand how students are using artificial intelligence, and published our results in a working paper that has not yet gone through peer review.

What we found challenges the panic-driven narrative around AI in higher education and instead suggests that institutional policy should focus on how AI is used, not whether it should be banned.

Not just a homework machine

Contrary to alarming headlines suggesting that “ChatGPT has unraveled the entire academic project” and “AI Cheating Is Getting Worse,” we discovered that students primarily use AI to enhance their learning rather than to avoid work.

When we asked students about 10 different academic uses of AI – from explaining concepts and summarizing readings to proofreading, creating programming code and, yes, even writing essays – explaining concepts topped the list. Students frequently described AI as an “on-demand tutor,” a resource that was particularly valuable when office hours weren’t available or when they needed immediate help late at night.

We grouped AI uses into two types: “augmentation” to describe uses that enhance learning, and “automation” for uses that produce work with minimal effort. We found that 61% of the students who use AI employ these tools for augmentation purposes, while 42% use them for automation tasks like writing essays or generating code.

Even when students used AI to automate tasks, they showed judgment. In open-ended responses, students told us that when they did automate work, it was often during crunch periods like exam week, or for low-stakes tasks like formatting bibliographies and drafting routine emails, not as their default approach to completing meaningful coursework.


Of course, Middlebury is a small liberal arts college with a relatively large portion of wealthy students. What about everywhere else? To find out, we analyzed data from other researchers covering over 130 universities across more than 50 countries. The results mirror our Middlebury findings: Globally, students who use AI tend to be more likely to use it to augment their coursework, rather than automate it.

But should we trust what students tell us about how they use AI? An obvious concern with survey data is that students might underreport uses they see as inappropriate, like essay writing, while overreporting legitimate uses like getting explanations. To verify our findings, we compared them with data from AI company Anthropic, which analyzed actual usage patterns from university email addresses of their chatbot, Claude AI.

Anthropic’s data shows that “technical explanations” represent a major use, matching our finding that students most often use AI to explain concepts. Similarly, Anthropic found that designing practice questions, editing essays and summarizing materials account for a substantial share of student usage, which aligns with our results.

In other words, our self-reported survey data matches actual AI conversation logs.

Why it matters

As writer and academic Hua Hsu recently noted, “There are no reliable figures for how many American students use A.I., just stories about how everyone is doing it.” These stories tend to emphasize extreme examples, like a Columbia student who used AI “to cheat on nearly every assignment.”

But these anecdotes can conflate widespread adoption with universal cheating. Our data confirms that AI use is indeed widespread, but students primarily use it to enhance learning, not replace it. This distinction matters: By painting all AI use as cheating, alarmist coverage may normalize academic dishonesty, making responsible students feel naive for following rules when they believe “everyone else is doing it.”

Moreover, this distorted picture provides biased information to university administrators, who need accurate data about actual student AI usage patterns to craft effective, evidence-based policies.

What’s next

Our findings suggest that extreme policies like blanket bans or unrestricted use carry risks. Prohibitions may disproportionately harm students who benefit most from AI’s tutoring functions while creating unfair advantages for rule breakers. But unrestricted use could enable harmful automation practices that may undermine learning.

Instead of one-size-fits-all policies, our findings lead me to believe that institutions should focus on helping students distinguish beneficial AI uses from potentially harmful ones. Unfortunately, research on AI’s actual learning impacts remains in its infancy – no studies I’m aware of have systematically tested how different types of AI use affect student learning outcomes, or whether AI impacts might be positive for some students but negative for others.

Until that evidence is available, everyone interested in how this technology is changing education must use their best judgment to determine how AI can foster learning.

The Conversation

Germán Reyes does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. At one elite college, over 80% of students now use AI – but it’s not all about outsourcing their work – https://theconversation.com/at-one-elite-college-over-80-of-students-now-use-ai-but-its-not-all-about-outsourcing-their-work-262856

Data that taxpayers have paid for and rely on is disappearing – here’s how it’s happening and what you can do about it

Source: The Conversation – USA (2) – By Margaret Levenstein, Research Professor at the Institute for Social Research, University of Michigan

Many U.S. government agencies collect data and make it publicly available. Anna Moneymaker/Getty Images

People rely on data from federal agencies every day – often without realizing it.

Rural residents use groundwater level data from the U.S. Geological Survey’s National Water Information System to decide where to dig wells. High school coaches turn to weather apps supported by data from the National Weather Service to decide when to move practice inside to avoid life-threatening heat. Emergency managers use data from the Census Bureau’s American Community Survey to ensure that residents without vehicles have seats on evacuation buses during local emergencies.

On Jan. 31, 2025, websites and datasets from across the federal government began to disappear. As that happened, archivists and researchers from around the world sprang into action, grabbing what they could before it was gone.

Trust in the federal statistical system took another hit when Bureau of Labor Statistics Commissioner Erika McEntarfer was fired on the heels of a dismal Aug. 1, 2025, employment report.

And reduced data collection at the bureau was already causing concern before her dismissal. The bureau has ceased collection of critical inputs to the Consumer Price Index, likely reducing that inflation indicator’s accuracy, especially at the level of specific locations and products.

As researchers of economics and epidemiology at the University of Michigan, we have spent years working with data, often from the federal government. When data and information began to disappear, we were spurred into action to preserve these important public goods.

The Inter-university Consortium for Political and Social Research, where we work – commonly known as ICPSR – has been making data from governments and researchers available for more than 60 years. We are stewards of this data, preserving it and ensuring that it is accessible in a safe and responsible manner.

Unfortunately, government data is now at risk of becoming less available or disappearing. But there are steps that researchers – and the public – can take to reduce that risk.

Data at risk

Some 8,000 pages were removed from federal websites within a few days of Jan. 31, 2025. Though many were soon restored following substantial outcry and some court orders, it’s still unclear how the restored webpages and datasets may have been changed.

Data.gov, launched in 2009, lists many datasets available from the government, providing pointers back to the agency where the data resides. Congress codified this data transparency in the Open Government Data Act in 2019.
Screenshot by The Conversation, CC BY-SA

In one preliminary examination, researchers found that 49% of the 232 datasets they reviewed had been substantially altered, including the replacement of the word “gender” with “sex.” This alteration can obscure nonbinary gender identities. Only 13% of the changes the researchers found were documented by the government.

U.S. government data has also become less accessible because of mass firings of federal workers and the dismantling of entire agencies.

Important efforts like the Data Rescue Project and the Internet Archive have been able to preserve a great deal of knowledge and data, but they are mostly limited to publicly available data and information.

No one left to vet data

Many important government data resources contain sensitive or identifying information. This means officials must vet requests before they grant access to data rescue efforts. But many agencies have had their ability to conduct vetting and manage access severely curtailed and, in some cases, eliminated altogether.

Take the Pregnancy Risk Assessment Monitoring System, which provides key data on maternal and child health from around the United States. The Centers for Disease Control and Prevention integrates data collected at state and local levels and adds population information to come up with estimates. While some of this data is publicly available, access to most data from 2016 and later requires a request to the CDC and a signed data use agreement.

At the start of 2025, multiple researchers reported to our team at the Inter-university Consortium for Political and Social Research that the CDC had stopped processing these requests. In February, researchers discovered that the Pregnancy Risk Assessment Monitoring System would be discontinued.

The CDC suggested that data collection would restart at some point. But on April 1, the entire Pregnancy Risk Assessment Monitoring System team was laid off. This made one of the most valuable sources of data on the health of mothers and babies largely inaccessible, and put plans for its future in limbo.

Similar situations have played out at other agencies, including the dismantled U.S. Agency for International Development and the National Center for Education Statistics. Data collected, cleaned and harmonized using taxpayer dollars is now languishing on inaccessible servers.

Inaccessible data

The portal that researchers use to apply for access to restricted federal statistical data now includes a list of data that researchers can no longer access.

The portal that researchers use to apply for access to restricted datasets from 16 agencies has added a list of several large datasets that are no longer available.
The Conversation, CC BY-ND

Some organizations are leading efforts to restore access to particular datasets. The Inter-university Consortium for Political and Social Research, for instance, has an agreement with USAID to preserve and provide access to USAID’s education data. Unfortunately, these efforts barely scratch the surface. With very few staff left, there isn’t a clear estimate of which other USAID resources remain inaccessible.

According to our count, 354 restricted datasets from the federal statistical system’s Standard Application Portal have become unavailable due to firings, layoffs and funding cuts.

Data is critical for people – and for the state and local governments that represent them – to make good decisions. Federal data is also used for oversight, so that researchers can verify that the government is doing what it’s supposed to in accordance with its congressionally mandated missions. Government efficiency requires accountability.

And accountability requires high-quality and timely data on operations.

The mass firings of federal employees mean that those tasked with ensuring this accountability are doing so while struggling to obtain necessary data.

So where do we go from here?

While the pace of intentional government data removal appears to have slowed, it hasn’t stopped. New datasets under threat of disappearing are being rescued daily. Restructured federal agencies and related changes to – or neglect of – official websites can make data difficult or impossible to find.

What you can do

If you identify data that is at risk, perhaps because its collection has been discontinued or it covers a controversial topic, you can report your observations to the Data Rescue Project, a grassroots effort of archivists, librarians and other concerned people.

The Data Rescue Project has been working for months to identify and preserve government data, including in the Inter-university Consortium for Political and Social Research’s DataLumos open-access archive.

Similarly, the Public Environmental Data Partners, a coalition of nonprofits, archivists and researchers, are preserving federal environmental data and have a nomination form.

Efforts to identify restored data that has been altered are also gaining steam.

Dataindex tracks Federal Register notices that describe proposed changes to 24 widely used datasets from across the federal government, including the American Community Survey from the Census Bureau, the National Crime Victimization Survey from the Bureau of Justice Statistics, and the National Health Interview Survey from the CDC. The website also facilitates comment on proposed alterations.

You can help researchers understand the scale of data alterations that have been, and continue to be, made. If you notice changes in public datasets, you can share that information with the American Statistical Association’s FedStatMonitoring project.

The Inter-university Consortium for Political and Social Research is continuing our efforts to ensure the preservation of, and access to, existing data, including from the Bureau of Labor Statistics and the Consumer Financial Protection Bureau.

At the same time, we and other groups are planning future efforts in data collection to avoid gaps in our knowledge.

The federal statistical system is both large and complex, including hundreds of thousands of datasets that people depend on in many ways, from weather forecasts to local economic indicators. If the federal government continues to step back from its role as a provider of high-quality, trusted data, others – including state and local governments, academia, nonprofits and companies – may need to fill the gap by stepping up to collect it.

The Conversation

Margaret Levenstein receives or has received funding from the National Science Foundation, the National Institutes of Health, and the U.S. Census Bureau.

John Kubale receives funding from the National Institutes of Health and Flu Lab. He previously worked for the US Centers for Disease Control and Prevention.

ref. Data that taxpayers have paid for and rely on is disappearing – here’s how it’s happening and what you can do about it – https://theconversation.com/data-that-taxpayers-have-paid-for-and-rely-on-is-disappearing-heres-how-its-happening-and-what-you-can-do-about-it-251787

Twelver Shiism – a branch of Islam that serves both as a spiritual and political force in Iran and beyond

Source: The Conversation – USA (3) – By Massumeh H. Toosi, PhD Student in Philanthropic Studies, Indiana University

Iranian Shiite mourners during Ashura, the 10th day of Muharram, on July 6, 2025, in Tehran. Photo by Majid Saeedi/Getty Images

Twelver Shiism is the largest branch within Shiism – one of the two major sects within Islam. Shiism is the second-largest tradition within Islam overall, following the Sunni tradition.

Iran is the only country to have Twelver Shiism as its official religion. In this tradition, religious leaders known as the marājiʿ al-taqlīd – the highest-ranking clerics within Twelver Shiism – and other high-ranking clerics, including ayatollahs, are regarded as moral and spiritual authorities whose guidance extends to both religious and political matters.

The second-largest population of Twelvers after Iran is in Iraq. Other major communities live in Pakistan, India, Lebanon, Azerbaijan and other countries of the Persian Gulf, such as Bahrain and Kuwait. There are also Twelver communities in some Western countries.

I am a practicing Twelver and have worked for an anthropological research project that highlights the rich cultural traditions of ethnic groups across Iran, based on written historical documents that cover various topics. This experience deepened my appreciation for Iran’s diversity, including the many ways in which Twelver Shiism is practiced and understood. Twelver Shiism is deeply rooted in a spiritual, theological and ethical tradition with over a millennium of history.

In Twelver Shiism, these values touch many aspects of daily and communal life. They are present in traditions and rituals, such as during Muḥarram, the first month of the Islamic calendar. They are also reflected in art, architecture and philanthropy as moral and religious obligations.

History and core beliefs

According to the Shiite tradition, the Prophet Muhammad’s family holds a special, divinely guided role in both religious and political leadership of the Muslim community.

Twelver Shiites, however, believe in a continuous line of 12 imams, considered to be descendants of the prophet through his daughter Fatima and son-in-law Ali. These imams represent moral integrity and spiritual authority and have a deep knowledge of the Quran and Islamic law.

The origin of Shiite identity traces back to the period following Muhammad’s death in A.D. 632. One group of Muslims supported Abu Bakr, a close companion of the prophet, as the first caliph – the successor to Muhammad and the leader of the Muslim community. This group later came to be known as Sunnis. Others believed that Ali, the prophet’s cousin and son-in-law, had been designated to lead. This group became known as Shi‘at Ali – the Party of Ali – which eventually evolved into the Shiite branch of Islam.

In the following years, Imam Hussein ibn Ali – the grandson of Muhammad – refused to recognize the authority of the Umayyad caliph Yazid, the ruler of the Umayyad dynasty from 680 to 683. Yazid’s rule marked the beginning of dynastic succession in the caliphate, a change many Shiites criticized as a departure from earlier Islamic principles of leadership. Hussein objected to Yazid’s claim on both political and moral grounds. He questioned Yazid’s legitimacy and refused to pledge allegiance to a ruler he believed was unjust.

Accompanied by a small group of companions and family members, Hussein embarked on a journey toward Kufa, Iraq, where he was intercepted and ultimately killed at the Battle of Karbala in 680. In Twelver Shiism, this death is revered as martyrdom, and the event holds enormous historical and religious significance, as it stands as a symbol of resistance to injustice when faced with tyranny. The remembrance of Karbala stands at the heart of the Shiite worldview. Within the Twelver tradition, it affirms the right to resist injustice.

In the following centuries, further differences over succession led to divisions within Shiites. Twelver Shiites recognize 12 imams, while other groups, such as the Ismailis and Zaydis, follow different descendants of Ali and have formed their own interpretations of religious authority.

Mourning and reflection

The Battle of Karbala is mourned as a tragedy but remembered as a moral triumph among Shiite Muslims.

During the month of Muḥarram, the first 10 nights are set aside for reflection on the martyrdom of Hussein. This period culminates on the 10th day, known as Āshūrā.

The observances of Ashura, the 10th day of Muharram, on July 6, 2025, in Tehran.
Majid Saeedi/Getty Images

During these nights, Shiite families and communities create spaces of grief and remembrance. In many neighborhoods, shrines and mosques – grand or modest – are adorned with black flags, handwritten prayers, chains and candles.

Taziyyah – elegies and processions – enact the story of the martyrdom of Hussein as a symbol of resistance and sacrifice.

During those 10 days, people often turn living rooms, basements or alleyways into mourning spaces. They hang black, red and green cloth on the walls, and they light candles. Families and neighbors gather to recite poetic elegies and serve tea and food. These moments bring mourning into the rhythm of daily life.

The ritual and the art of the Taziyyah.

Culture and art

Love for the prophet’s family, and a continuing search for justice and spiritual meaning, have inspired poetry, architecture, ritual and daily practice among Twelver Shiites across generations.

In Twelver contexts, especially in Iran and Iraq, grief is expressed through shrine design. Holy shrines are decorated with mirror mosaics, glowing tiles and engraved prayers. This architecture has symbolic and spiritual significance that deepens the sense of awe and respect in visitors, conveying a sense of hope for divine light and inviting reflection on martyrdom as the most honored form of death, or ashraf al-mawt.

Decorative ceiling of the Imam Hussein Shrine in Karbala, Iraq.

There are other examples of how the spiritual memory of Karbala has inspired Twelver Shiites across generations: Artists in the Saqqakhana movement, an Iranian art movement that emerged in the 1950s and ‘60s, drew inspiration from devotional structures and other popular rituals of Shiite piety. The word saqqakhanan – house of the water-giver – refers to small shrinelike fountains installed on neighborhood corners, where water is offered in remembrance of Hussein’s thirst at Karbala.

These structures, often decorated with chains, candles, mirrors and handwritten prayers, serve as modest devotional spaces where passersby pause for reflection or prayer. The artists used materials such as calligraphy, amulets, talismans, prayer beads and cloth, and combined them with techniques from modern painting and sculpture. Their goal was not to depict faith directly but to translate its emotional and symbolic language of devotion into visual expression. This movement carried into modern art galleries.

Philanthropic practices

This living tradition also shapes how Twelver Shiism understands generosity and philanthropy as a moral and religious obligation.

Philanthropy in Twelver Shiism evolved under the guidance of contemporary “marāji,” the highest-ranking religious scholars, who interpret the obligations of giving in light of present-day realities.

Like all Muslims, Twelvers are required to give zakat, a key Islamic philanthropic tradition and an obligatory act of giving. In addition, Shiite tradition uniquely includes Khums, a 20% religious obligation on annual savings and surplus income, which Twelvers pay directly to recognized Shiite religious authorities or their appointed representatives. Half is allocated to the needy descendants of the prophet, while the other half is entrusted to religious authorities to support religious, educational and charitable initiatives.

Shiite communities also sustain charitable giving through the institution of waqf, or charitable endowment, which supports mosques, religious schools and aid for the poor.

According to Islamic scholar Seyyed Hossein Nasr, the diversity within Islam, including the Shiite tradition, reflects the richness of Islam. Twelver Shiism, in this view, stands as a profound spiritual path within the broader Islamic tradition.

The Conversation

Massumeh H. Toosi does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Twelver Shiism – a branch of Islam that serves both as a spiritual and political force in Iran and beyond – https://theconversation.com/twelver-shiism-a-branch-of-islam-that-serves-both-as-a-spiritual-and-political-force-in-iran-and-beyond-259853

Size matters, but so does beauty and vigour — at least when it comes to peacocks

Source: The Conversation – Canada – By Rama Shankar Singh, Professor (Emeritus) of Biology, McMaster University

In 1871, Charles Darwin introduced his theory of sexual selection by female choice in The Descent of Man. He suggested females of a species would exhibit a preference for beauty and ornamentation when choosing mates, leading to a prevalence of those traits.

Darwin claimed this explained the evolution of the peacock’s long tail. More than 150 years later, evidence from peafowl research challenges Darwin’s theory.

Our research on the peacock’s long tail discovered a simple developmental rule that explains its symmetry, complexity and beauty. It suggests peahens choose their mates on the basis of size, vigour and beauty, not beauty alone, as Darwin had thought.

Darwin’s assumptions

Darwin saw the peacock’s impractically long tail as maladaptive; it was too long to be explained by his grand theory of natural selection that held that species evolved only traits that could help them survive.

As he wrote to a fellow scientist: “The sight of a feather in a peacock’s tail, whenever I gaze at it, makes me sick!”

Darwin made two implicit assumptions that, our research shows, undermine his sexual selection theory. First, Darwin could not see that maladaptation can also be a product of adaptation since trade-offs between traits are common in nature.

Peacock tails here refer to the long, iridescent feathers that trail behind. Taller trains (the height when the feathers are fanned out) can be beneficial to males in securing mates, but at the same time, long tails are maladaptive because, for example, they may hinder escape from predators.

After mating season, peacocks shed their long trains.
(J.M.Garg/Wikimedia Commons)

Second, Darwin assumed peahens admired the peacock tail “as much as we do” and that the birds assessed mates on the basis of esthetic appeal. He argued that birds have a feel for beauty. Later, this explanation would set the stage for research exploring how females assess beauty in their mates.

Researchers focused on the tail’s brightly coloured eye-shaped spots, but despite the large number of studies conducted over the last 30 years, no uncontroversial support for eyespot-based female choice has been found.

Complexity and vision

As a fruit fly geneticist interested in the variation and evolution of sex and reproduction-related genes, I unexpectedly stumbled on the evolution of the peacock’s long tail. I noticed its excessive complexity and wondered if peahens saw what we see.

I examined museum specimens of peacock tails and made two important discoveries. First, I found that a zigzag/alternate arrangement of follicles gave rise to the symmetry, complexity and beauty of the peacock’s train. It is remarkable that this alternate arrangement, the densest form of spherical packing known, would produce such wonderful effects when applied to living things.

An illustration of a peacock published in Darwin’s The Descent of Man.
(Wikimedia Commons)

Second, because feathers and eyespots are parts of the same structure, the size of the train and the number of eyespots are developmentally correlated. Peahens cannot see eyespots and train size as separate traits, as we do; peahens react only to the green-blue colour of the eyespots, and the eyespots are too small to see from a distance. Therefore, peahens view the tails as one complex trait that combines train size and some aspects of the eyespot colours.

What this means is that females cannot see eyespots without seeing the train first, which raises the possibility of direct selection based on the train and not the indirect result of selection through the attractiveness of the eyespots.

Since sexual selection and mate choice are an important part of the standard evolutionary processes involved in natural selection, there is no need for a separate sexual selection theory. Darwin was wrong in this respect.

Addressing beliefs

For a variety of reasons, the sexual selection theory found scant support during Darwin’s time. Naturalist Alfred Russel Wallace, the co-discoverer of natural selection, was among those who argued sexual selection was subsumed under natural selection.

But Darwin had other reasons to push his sexual selection theory. He used it to solve three problems at once. First, of course, to explain the evolution of secondary and often exaggerated sexual traits, particularly in birds, including peafowl.

Second, he used his theory to explain race formation in humans, arguing for inherent race-specific standards of beauty that worked as a means of isolation between races.

Prevailing Victorian views, however, held women as weak and unable to exercise decisive preference over males. They also saw appreciation of beauty as an exclusively human trait not shared with other animals. This led Darwin to craft a theory that attributed beauty-based female choice to birds and beauty-based male choice to humans.

Last, Darwin used peacock feathers to challenge the religious establishment and open the door to the esthetic appreciation of the animal world — beauty, intelligence and morality, which were taken as God-given.

This research provides reasons to reflect on why sexual selection theory is controversial, even after a century and a half. Sexual selection as a process of mate choice is common sense, but sexual selection as a theory is wrong.

The Conversation

Rama Shankar Singh received funding from the Natural Sciences and Engineering Research Council of Canada. He is affiliated with the Centre for Peace Studies, McMaster University.

ref. Size matters, but so does beauty and vigour — at least when it comes to peacocks – https://theconversation.com/size-matters-but-so-does-beauty-and-vigour-at-least-when-it-comes-to-peacocks-261070

Cultivating for color: The hidden trade-offs between garden aesthetics and pollinator preferences

Source: The Conversation – USA (2) – By Claire Therese Hemingway, Assistant Professor of Ecology & Evolutionary Biology, University of Tennessee

Colorful gardens can be pollinator-friendly with native flowering plants. Borchee/E+ via Getty Images

People often prioritize aesthetics when choosing plants for their gardens. They may pick flowers based on colors that create visually appealing combinations and varieties that have bigger and brighter displays or more fragrant and pleasant-smelling flowers. Some may also choose species that bloom at different times in order to maintain a colorful display throughout the growing season.

Many gardeners also strive for ecological harmony, seeking to maintain pollinator-friendly gardens that support bees, butterflies, hummingbirds and other pollinators. But there are some notable ways in which the preferences of humans and pollinators have the potential to diverge, with negative consequences for pollinators.

As a cognitive ecologist who studies animal decision-making, I find that understanding how pollinators learn about and choose between flowers can add a helpful perspective to garden aesthetics.

Pollinator preferences and rewards

Over millions of years, plants have evolved suites of floral traits to attract specific types of pollinators.

A hummingbird with its beak inside a small, bright red flower with a narrow tube-like shape
Hummingbirds are attracted to and pollinate red, tubular flowers.
AGAMI stock/iStock via Getty Images Plus

Pollinators can be attracted to flowers based on color, pattern, scent and texture. For instance, hummingbirds typically visit bright red and orange flowers with narrow openings and a tubular shape. The striking red cardinal flower is one that is primarily pollinated by hummingbirds, for example.

Bees are often attracted to blue, yellow and white flowers that can be either narrow and tubular or open. Lavender, sage and sunflower plants all bear flowers that attract bee pollinators.

A bee perching on a lavender sprig with many small, purple flowers
Bees pollinate blue, purple, white and yellow flowers, such as the lavender flowers here.
Leila Coker/iStock via Getty Images Plus

Flowers offer rewards to visiting pollinators. Nectar is a sugar-rich solution that flowers produce to attract pollinators, which use it to meet their energy needs.

Flowers have an ulterior motive for providing this energy source. While a pollinator drinks the nectar, pollen can stick to its body and be carried to the next flower it visits. This process is essential for the plant’s reproductive success. Pollen contains the plants’ male gametes, which, when deposited onto another flower in the right place, can fertilize the female gametes and produce seeds that grow into new plants. Pollen is also nutrient-rich, containing proteins, lipids and amino acids. Many species, such as bees, collect pollen to feed their developing young.

A battle for resources

Pollinators visit flowers in search of floral rewards. But modifying the features of flowers for aesthetics can be a hindrance to pollinators trying to get these rewards. For example, popular garden plants such as roses and peonies are often bred to have more petals and larger flower heads, making them more visually striking and appealing to humans. But these extra petals may block a pollinator’s access to the center of the flower, where floral rewards are located.

Further, plants have limited resources. Spending them on building aesthetically pleasing but energetically expensive features can mean there’s less left to invest in signals and rewards essential for attracting pollinators. In extreme cases, breeding for aesthetics has led to plants with what scientists and gardeners call “double flowers.” In these varieties, extra petals replace reproductive parts entirely. These plants are often altogether unrewarding for pollinators, since the flowers no longer produce nectar or pollen. Double flowers occur from mutations that convert the pollen-producing stamens and other reproductive organs into petals. They can occur naturally but are rare in wild populations.

Since these double flowers cannot spread pollen or produce seeds effectively, they are unable to reproduce in the wild and pass these mutations on. To cultivate them as garden varieties, people propagate these plants through cuttings – small sections cut from the stem that can be rooted to grow clones of the parent plant. Many common garden plants have popular double-flowered varieties, including roses, peonies, camellias, marigolds, tulips, dahlias and chrysanthemums.

Roses, for example, have become synonymous with densely packed petals. But these popular garden varieties are usually double-flowered or have many extra petals that block access to the center of the rose and provide no rewards to pollinators.

Consider making your gardens friendlier to pollinators by avoiding these double-flowered plants, and ask your local garden center for recommended varieties if you need help.

Two pink flowers side by side, the left one has five petals, visible and accessible stamens in the center. The right one has many more petals packed densely, and no stamens visible
The five-petaled wild rose, left, is much better for pollinators than garden roses with double flowers and many more petals but no reproductive parts.
(L) Clara Nila/iStock and (R) Alex Manders/iStock via Getty Images Plus

Sometimes gardeners intentionally prevent plants from flowering, which limits or eliminates their value to pollinators.

For example, herbs such as thyme, oregano, mint and basil are generally most flavorful and tender before the plant begins to flower. Once it flowers, the plant diverts energy to reproductive structures, and leaves become tougher and lose flavor. As a result, gardeners often pinch off flower buds and harvest leaves frequently to promote continued growth and delay flowering. Letting some of your herbs flower occasionally can help pollinators without affecting your kitchen supplies too much.

Stems of a flowering basil plant with green leaves and tiny white flowers
Letting garden herbs such as basil flower can be beneficial to pollinators.
Rafael Goes/iStock via Getty Images Plus

Finding the right flowers

Flowers with unusual colors and stronger fragrances can be difficult for pollinators to detect and recognize.

People might favor bright and unusual flower colors in their gardens over naturally occurring shades. But this preference might not align with what the pollinators have evolved to favor in nature. For example, planting human-preferred colors, such as white or pink morphs of hummingbird-pollinated flowers that are typically red, can reduce a flower’s visibility and attractiveness. Even when cultivated varieties have a similar enough color to natural ones to attract pollinators, they may lack other important visual components, such as ultraviolet floral patterns that can guide pollinators to nectar sources.

Breeding plant varieties to accentuate particular aesthetic traits may have unintended consequences for other traits. Changes in flower color through selective breeding can also affect leaf color, as genes involved in pigment production can affect multiple plant tissues. Besides changing the overall appearance of the plant, changing leaf color may alter background contrast between flowers and the leaves, which can make flowers less conspicuous to pollinators.

Many pollinators use a combination of color and scent to detect and discriminate between flowers. But breeding for traits such as color and brightness can alter floral scents due to unintended genetic changes or energetic trade-offs.

Scent helps pollinators locate flowers from farther away, so an unfamiliar or reduced fragrance may make flowers harder to find. Disruption in either color or scent can make flowers less noticeable to pollinators expecting a familiar pairing and can hinder a pollinator’s ability to learn which flowers are suitable for it in the first place.

A balanced approach for healthy gardens

When flowers are harder to find, pollinators are less likely to visit them. When they offer poor or no rewards, pollinators quickly abandon them for better options. This disruption in plant-pollinator interactions has implications not only for pollinator health but also for garden vitality. Many plants rely on animal pollinators to reproduce and make mature seeds. These are either collected by gardeners or allowed to drop to the ground and sprout on their own to grow new flowering plants the following year.

A close-up of a bee covered in pollen on top of a yellow flower
Pollinators not only transfer pollen from flower to flower, they also gather pollen to feed their young.
John Kimbler/500px via Getty Images

When selecting plants for a garden, gardeners who want to support pollinators might consider choosing native varieties that have evolved alongside local pollinators.

In most areas of the country, there are native plants with colorful and interesting flowers that bloom at different times, from early spring to late fall. These plants tend to produce reliable floral signals and offer the nectar and pollen needed to support pollinator nutrition and development.

Sterile varieties and double flowers offer little or no rewards for pollinators, and gardens with them may not attract as many pollinators. Letting herbs flower after some harvesting is another simple way to support beneficial insects.

With choices informed by not just your aesthetic preferences but also those of pollinators, you can create colorful gardens that support wildlife and stay in bloom across various seasons.

The Conversation

Claire Therese Hemingway does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Cultivating for color: The hidden trade-offs between garden aesthetics and pollinator preferences – https://theconversation.com/cultivating-for-color-the-hidden-trade-offs-between-garden-aesthetics-and-pollinator-preferences-261631

Has extreme poverty really plunged since the 1980s? New analysis suggests not

Source: The Conversation – UK – By Jason Hickel, Professor at the Institute for Environmental Science and Technology, Autonomous University of Barcelona

oceanfishing/Shutterstock

Data from the World Bank suggests that extreme poverty has declined dramatically over the past four decades, from 47% of the world’s population in 1981 to around 10% today.

This narrative is based on the World Bank’s method of calculating the share of people who live on less than US$3 per day in 2021 prices. This is adjusted for general price differences between countries (what’s known as purchasing power parity, or PPP).

But a growing body of literature argues that the World Bank’s PPP-based method has a major empirical limitation. The problem is that it does not account for the cost of meeting basic needs in any given context. Having more than US$3 PPP does not guarantee that a person can afford the specific goods and services that are necessary for survival in a particular location.

In recent years, scholars have developed what they argue is a more accurate method for measuring extreme poverty. This is done by comparing people’s incomes to the prices of essential goods (specifically food, shelter, clothing and fuel) in each country.

This approach is known as the “basic needs poverty line” (BNPL), and it more closely reflects what the original concept of extreme poverty was intended to measure. There is robust data from household consumption surveys and consumer prices covering the period from 1980-2011.

The BNPL data indicates that the story of global poverty over the past few decades is more complex – and troubling – than the World Bank narrative suggests.

This data indicates that between 1980 and 2011, the global extreme poverty rate declined by only six percentage points, from 23% to 17%. During the same period, the number of people in extreme poverty actually increased, from 1.01 billion to 1.20 billion.

What’s more, the alleviation of poverty has not been steady. In the 1980s and 1990s, an additional one billion people were thrown into extreme poverty. This occurred during the period when market reforms were implemented across most of the global south (developing countries in Africa, Asia and Latin America), often under pressure from western-controlled financial institutions. There was improvement throughout the 2000s, but progress has ultimately been slow and shallow.

Rising food insecurity

Robust BNPL data does not exist after 2011. However, data from the UN Food and Agriculture Organization’s (FAO) surveys on food insecurity shows that the proportion of the world population without reliable access to food increased steadily during the past decade or so, going from 21% in 2014 to 30% in 2022.

This includes cases of severe food insecurity, which is associated with prolonged periods of hunger. The share of the world population suffering in this way has increased from 7.7% to 11.3%.

Given that secure access to food is central to the BNPL method, we may assume that post-2011 poverty trends have probably not improved much, if at all.

This has important implications for the United Nations’ millennium development goals. The first of these set out to halve the proportion of the world’s population living in extreme poverty between 1990 and 2015. But the data on basic-needs poverty and food insecurity indicates that this goal was probably not achieved.

Extreme poverty is not a natural condition, but a sign of severe dislocation. Data on real wages since the 15th century indicates that, under normal conditions, across different societies and eras, people are generally able to meet their subsistence needs except during periods of severe social displacement.

This includes crises like famine and war, and the institutionalised denial of resources to marginalised people, particularly under European colonialism.

What’s more, the BNPL data shows that many countries have achieved very low levels of extreme poverty, even where GDP per capita is not high. They have done this by using strategies such as public provisioning and price controls for basic essentials.

This is consistent with previous research that found that these strategies can enable better social outcomes at any level of income.

In fact, research shows that the world economy already has enough productive capacity to eliminate global poverty many times over. Indeed, it is possible not only to eliminate extreme poverty, but also to eliminate deprivation at much higher thresholds.

With these levels of production, we could ensure universal access to healthcare, education, modern housing, sanitation systems, electricity, clean cooking stoves, refrigeration, mobile phones, internet, computers, transport, household appliances and other necessities for decent living standards, for more than eight billion people.

The fact that poverty persists at such high levels today indicates that severe dislocation is institutionalised in the world economy – and that markets have failed to meet the basic needs of much of humanity.

Ending extreme poverty is the first objective of the UN’s sustainable development goals. The world economy has the resources and productive capacity to realise this goal – and more. But achieving it will require organising production to guarantee universal access to the specific goods and services that people need to live decent lives.

The Conversation

Jason Hickel receives funding from European Research Council (ERC-2022-SYG reference number 101071647).

Dylan Sullivan receives funding from European Research Council (ERC-2022-SYG reference number 101071647).

Michail Moatsos receives funding from UKRI.

ref. Has extreme poverty really plunged since the 1980s? New analysis suggests not – https://theconversation.com/has-extreme-poverty-really-plunged-since-the-1980s-new-analysis-suggests-not-261144

Canada and the U.K.’s conditional recognition of Palestine reveals the uneven rules of statehood

Source: The Conversation – Canada – By Catherine Frost, Professor of Political Science, McMaster University

Canada and the United Kingdom have said they will recognize Palestinian statehood during the United Nations General Assembly in September, provided certain conditions are met.

Canada’s position is premised on seeing political and military reform from the Palestinian Authority, the governing body responsible for the autonomous Palestinian territories.

The U.K., responding to a severe food crisis in Gaza, said it would extend recognition unless the Israeli government agrees to a ceasefire, takes steps to “end the appalling situation in Gaza” and commits to a “long-term, sustainable peace.”

Read more: Why UK recognition of a Palestinian state should not be conditional on Israel’s actions

These cautious, conditional endorsements reflect the workings of a dated international system that governs the birth of states. France, by contrast, has opted to recognize Palestine without conditions. What explains these different approaches?

Officially, state recognition is governed by international law. In practice, it is subject to a complex mix of national, global and moral considerations.

This process grants existing states significant discretion in recognizing new ones, with the expectation that such decisions serve international peace. But this can result in an uneven statehood process for aspiring nations.

How states are born

The 1933 Montevideo Convention outlines the core criteria for statehood recognition: a permanent population, control over a defined territory, a functioning government and the capacity to open relations with other states.

When recognition is given on this basis, it is essentially acknowledging that these qualities are already in place. Yet these requirements are not ironclad, and some experts have argued that recognition can also be extended on humanitarian or moral grounds, such as in response to human rights violations.

In such cases, recognition becomes more of a statement that a state should have the opportunity to exist, rather than a confirmation that it already does. The classic case would be a group facing colonial domination. The American colonies appealed to this principle in the 1776 Declaration of Independence, for example.

Because individual states decide when such exceptions apply, these measures provide uncertain relief for aspiring nations.

As a final step, new states can apply for membership in the UN. This application is first considered by the UN Security Council. If nine states agree, and none of the council’s permanent members object, the application continues to the UN General Assembly for approval.

But a single veto from any of the five permanent members — China, France, Russia, the U.K. and the United States — can paralyze statehood at the start. In 2024, for example, the U.S. vetoed Palestine’s request for full UN membership.

Statehood in waiting

To date, 147 of 193 states in the United Nations recognize Palestinian statehood. Palestine has also had special observer status at the UN since 2012, and before that it had limited standing before international courts typically reserved for states.

But Palestine is not the only instance where the international system has struggled to address atypical or contested statehood.

After a wave of recognitions in post-colonial Africa and post-Second World War Europe, the recognition of new states slowed to a crawl toward the end of the 20th century. This trend suggests there is a conservative quality to the recognition system.

Wary of rewarding violent separatism, international bodies have traditionally favoured negotiated solutions for state birth, including upholding a parent-state veto over any independence efforts.

This principle was most clearly articulated by the Canadian Supreme Court in a 1998 advisory opinion. It warned that an independent Québec, without first agreeing on terms of exit with the rest of Canada, was unlikely to gain international recognition.

There is wisdom to this approach, but such rules cannot prevent political breakdown in every case. A growing number of unrecognized states have left millions stranded in political limbo.

This includes Somaliland, which split from Somalia in 1991 and has been operating as a de facto state ever since without receiving formal recognition from any other country.

Palestine is not an instance of state breakup, but rather an unresolved case of colonization and occupation. Decades of negotiations with Israel, the occupying power, have failed. Yet formal statehood has still proven elusive. A cumbersome recognition system may be helping to keep the problem alive.

Cracks in the system

Even when recognition occurs, the results can be disappointing.

South Sudan, the UN’s newest member, was universally recognized in 2011 under close UN supervision and with the consent of its parent state, Sudan. Yet it quickly descended into civil war, a conflict from which it has yet to fully emerge.

Kosovo was recognized by states like the U.S. and Canada when it declared independence in 2008 following the breakup of Yugoslavia, but it still has fewer recognitions than Palestine.

A handful of states like Togo and Sierra Leone even began de-recognizing it under pressure from Kosovo’s one-time parent state, Serbia, although there is a broadly accepted principle that once a state is recognized, barring any complete disaster, it should remain recognized.

Meanwhile, rising sea levels threaten to leave some island states like Tuvalu without the territorial requirements for normal statehood. The International Court of Justice has signalled the statehood of such nations should survive, but has not said how.

Read more: The Australia-Tuvalu deal shows why we need a global framework for climate relocations

These examples suggest the current state recognition system is ill-equipped to face today’s changing world.

Allowing established states to set the rules for who qualifies is unlikely to solve these problems. While setting special terms for new entrants may have value in the short term, the longer-term need is for a fairer, more transparent system.

Experts are working on ways to make the system more inclusive for aspiring states and unrepresented peoples, including by opening up access to diplomatic venues. If successful, these measures could change the way future states are born.

The Conversation

Catherine Frost receives funding from the Social Sciences and Humanities Research Council of Canada.

ref. Canada and the U.K.’s conditional recognition of Palestine reveals the uneven rules of statehood – https://theconversation.com/canada-and-the-u-k-s-conditional-recognition-of-palestine-reveals-the-uneven-rules-of-statehood-262418

Can a game stop vaccine misinformation? This one just might

Source: The Conversation – UK – By Sander van der Linden, Professor of Social Psychology in Society, University of Cambridge

Christopher Penler/Shutterstock.com

Modern vaccines have saved over 150 million lives. Yet misinformation about them can still have deadly consequences. A gunman recently opened fire at the US Centers for Disease Control and Prevention headquarters, wrongly believing that the coronavirus vaccine had caused his depression.

Public health is increasingly being threatened by the spread of dangerous misinformation. In fact, there have been several recent cases of healthy unvaccinated children who died after contracting the highly contagious measles virus – including in July in Liverpool. Childhood vaccination rates in the UK are now at their lowest point in over a decade, well below the World Health Organization recommended threshold of 95% for herd immunity.

A key question for scientists and public health practitioners alike is how to design interventions that help reduce people’s susceptibility to health misinformation. To help accomplish this, we designed a free browser game, Bad Vaxx, that simulates social media and lets players step into the shoes of online grifters who peddle vaccine misinformation, using four common manipulation tactics.

In three experimental trials, we found that the game helps people discern significantly better between credible and misleading information about vaccinations, boosts players’ confidence in their judgments, and reduces their willingness to share vaccine-related misinformation with others.

Much research shows that once people are exposed to misinformation, they often continue to rely on falsehoods even after seeing a debunk or fact-check. Fact-checks matter, but it’s difficult to get people to engage with science and to spread corrective information across an increasingly fractured media landscape.

Inoculating minds

Our new game draws inspiration from a more preemptive approach known as “prebunking”. Prebunking aims to prevent people from encoding misinformation into their brains in the first place.

The most common way to prebunk misinformation is through psychological inoculation – an approach that befittingly parallels the immunisation analogy. Just as the body gains immunity to infection through exposure to severely weakened doses of a viral pathogen (that is, the vaccine), so too can the mind acquire cognitive resistance to misinformation. This happens through exposure to weakened doses of the tricks used to manipulate people online, along with clear examples of how to identify and neutralise them.

One way to build immunity is to immerse people in a social media simulation. This is exactly what happens in our game, Bad Vaxx.

In a controlled setting, people are exposed to weakened doses of the main techniques used to deceive people on vaccination through humorous and entertaining scenarios where people interact with four shady characters. They include Ann McDoctal, who loves to float scary anecdotes about vaccines, Dr Forge, who fakes his expertise and gets traction by pumping out pseudoscience, Ali Natural, who promotes the naturalistic fallacy (“if it’s natural, it must be good”), and the conspiracy theorist Mystic Mac, who doubts all official narratives.

The player can choose between two competing perspectives: one, take on the role of an online manipulator to see how the sausage is made, or two, try to defeat the characters by reducing their influence.

Cognitive inoculation is thought to work, in part, by introducing a sense of threat to elicit motivation to resist propaganda, which both perspectives aim to achieve, albeit through different means. People were randomly assigned to either the “good” or “evil” version of our 15-minute game or a placebo group (who played Tetris).

We “pre-registered” our study, meaning we wrote down our hypothesis and analysis plan before collecting any data, so we couldn’t move the goalposts.

We measured effectiveness by asking people how manipulative they found vaccine misinformation embedded in social media posts, how confident they are in their judgments, and whether they intend to share the post with their networks.

We based the test on real-world misinformation that corresponded to each of the techniques featured in the game or a non-manipulative (neutral) counterpart. This was done to see if the game improves people’s ability to discern between misleading and credible content. For example, a conspiratorial post read: “Vaccine database wiped by government to hide uptick in vaccine injuries.”

In all our tests, we found that both versions of the game helped people get much better at spotting fake vaccine information. Players also became more confident in their ability to tell real from fake, and they made better decisions about what to share online. The version where you play as the “good guy” worked slightly better than the version where you play as the “bad guy”.

Boosting discernment without breeding cynicism

We also found that people became significantly better at spotting false and manipulative content without becoming sceptical of credible content that doesn’t use manipulation. In other words, players became more discerning.

Of course, the immunisation analogy should not be over-interpreted: the effects of psychological interventions are generally modest and do wear off. But epidemiological simulations show that, when applied across millions of people, prebunking can help contain the spread of misinformation.

Although vaccination decisions are complex, much research has shown a robust link between exposure to misinformation and reduced vaccination coverage. Needless to say, misinformation about vaccines is not new. In the 1800s, anti-vaxxers falsely claimed that taking the cowpox vaccine against smallpox would turn you into a human-cow hybrid.

What’s different today is that the most influential misinformation is coming from the top, including prominent politicians and influencers, who spread thoroughly debunked claims, such as the myth that the MMR vaccine causes autism.

Empowering the public to identify pseudoscience, misdirection, and manipulation in matters of life and death is therefore crucially important. We hope that our easy-to-play, short prebunking game can be integrated into educational curriculums, used by public health officials, doctors and patients in medical settings, and feature as part of international public health campaigns on social media and beyond. After all, viruses need a susceptible host. If enough people are immunised, misinformation will no longer have a chance to spread.

The Conversation

Sander van der Linden has received funding from the UK Cabinet Office, Google, the American Psychological Association, the US Centers for Disease Control, EU Horizon 2020, the Templeton World Charity Foundation, and the Alfred Landecker Foundation. He has lectured and/or consulted for the WHO, UN, Meta, Google, the Global Engagement Center (US State Dept), and UK Defense and national intelligence.

Jon Roozenbeek has received funding from the UK Cabinet Office, the US State Department, the ESRC, Google, the American Psychological Association, the US Centers for Disease Control, EU Horizon 2020, the Templeton World Charity Foundation, and the Alfred Landecker Foundation.

During her time at Stanford University, Ruth Elisabeth Appel has been supported by an SAP Stanford Graduate Fellowship in Science and Engineering, a Stanford Center on Philanthropy and Civil Society PhD Research Fellowship, a Stanford Impact Labs Summer Collaborative Research Fellowship, and a Stanford Impact Labs Postdoctoral Fellowship. She has interned at Google in 2020 and attended an event where food was paid for by Meta. After completing her research at Stanford University, which forms the basis for this article, she joined Anthropic to research the economic and societal impacts of AI.

ref. Can a game stop vaccine misinformation? This one just might – https://theconversation.com/can-a-game-stop-vaccine-misinformation-this-one-just-might-262468

How scientists can contribute to social movements and climate action

Source: The Conversation – UK – By Aaron Thierry, PhD Candidate, Social Science, Cardiff University

Despite decades of scientists’ warnings about climate and ecological breakdown, record-breaking heat and escalating environmental disasters have become commonplace. Science has been attacked, dismissed and politicised, and the world is accelerating in a terrifying direction.

To scientists this can feel particularly overwhelming. So what can we do?

Scientific knowledge alone hasn’t generated the urgent societal action many scientists expected. Therefore, to protect ourselves, future generations and countless other species, some scientists have started to reflect on their tactics. Not prepared to be neutral in the face of such an all-encompassing threat, scientists like us have been asking what our role should be in an era when our planet’s life support systems are crumbling so rapidly, while governments and officials pour fuel on the flames.

Answering this question has led some of us to join social movements and take part in peaceful protests. Three years ago, a group of scientists was arrested while protesting the UK government’s decision to license new oil fields. We were among the lab-coat-wearing protesters who took the science to the government that day, pasting huge posters explaining the dangers of new fossil fuels onto the windows of the department that was committing us to them for decades to come.

It was a surreal experience, recently documented in the short film Plan Z: From Lab Coats to Handcuffs (2024) and a book called Scientists on Survival: Personal Stories of Climate Action (2025).

While taking a visible, public stand against harmful decisions can be a provocative and effective route for scientists to push for change, it isn’t the only way we can be effective advocates. Recent surveys reveal a great appetite among scientists to be more involved in social movements, but many face barriers to participation and often don’t know where to start or how best to contribute.

In our recent article published in the journal npj Climate Action in collaboration with our colleague and science communicator Abi Perrin, we explore how scientists across all disciplines, backgrounds and career stages can get involved in activism in a range of practical ways. Whatever your strengths and limitations (depending on your status and which country you live in), there are many positive ways to engage.

From silos to society

Currently, scientific disciplines can be quite isolated from one another, and scientists generally aren’t very prominent in the public domain. We might feel restricted to speaking only to our own very specific expertise, but a scientist’s job involves understanding complex information and communicating it in simpler, more useful forms.

Scientists can communicate about climate and nature, even without writing a PhD thesis on it. And we can be very powerful when we do: scientists are still widely trusted.

Politicians need to listen to scientists, not just the lobbyists. This is why engaging directly with MPs (or equivalents) is a route more scientists like us are taking. For instance, scientists in the UK have been important and prominent champions of the Climate and Nature Bill currently being debated in the House of Commons.

Social movements need scientists too. We can use our research and communication skills to inform and improve campaigns, bringing them to a wider audience. This support adds credibility to campaigns. We can also analyse what works, investigating, for example, the effectiveness of different activism strategies and targets in a range of contexts.

Academics have also supported activist campaigns against destructive infrastructure development by speaking at public hearings, as well as providing expert witness testimonies for activists in court for acts of protest.

Scientists can also push for cultural and policy change within our own institutions, including research institutions, science academies and professional bodies. This might include cutting ties to the fossil fuel industry, reorienting research and teaching to focus on sustainable development or accelerating the decarbonisation of campus activities.

We can support colleagues and students who engage in protests and encourage peers and leaders to do the same. It is easier to take action when you know you are not acting alone.

It’s more important than ever that our professional bodies and institutions are emboldened by their membership to advocate for the public good that science brings and for the defence of academic freedom, recognising that this often means speaking truth to power.

Collective action is crucial. We can all seek out allies, organise among peers and build powerful coalitions, rather than hoping science will passively translate into change. Time is of the essence.




The Conversation

Aaron Thierry receives funding from ESRC. He is affiliated with Scientists for Extinction Rebellion.

Tristram Wyatt is affiliated with Scientists for Extinction Rebellion.

ref. How scientists can contribute to social movements and climate action – https://theconversation.com/how-scientists-can-contribute-to-social-movements-and-climate-action-261959

How inflammatory bowel disease may accelerate the progression of dementia

Source: The Conversation – UK – By Iris Mikulic, Research Assistant, Department of Neurobiology, Care Sciences and Society, Karolinska Institutet

Orawan Pattarawimonchai/Shutterstock

You have probably heard the phrase “follow your gut” – often used to mean trusting your instinct and intuition. But in the context of the gut-brain axis, the phrase takes on a more literal meaning. Scientific research increasingly shows that the brain and gut are in constant, two-way communication. Once overlooked, this connection is now at the forefront of growing interest in neuroscience, nutrition and mental health.

The gut–brain axis is a highly complex system of interconnected pathways that relay information through diverse signals. Previous research has suggested that gut inflammation may contribute to the development of dementia. This may occur through the triggering of systemic inflammation and the disruption of the pathways between the gut and the brain.

While interest in the gut-brain axis has grown rapidly, there is still limited understanding of whether intestinal inflammation might accelerate cognitive decline in people who already have dementia.

IBD and dementia connection

Our study explored this under-researched question, aiming to expand understanding in this area and improve the care of those affected. We focused on people who had already been diagnosed with both dementia and inflammatory bowel disease (IBD).

Dementia refers to a group of neurological disorders with different underlying causes, all characterised by progressive cognitive decline and increasing loss of independent function.

It is a growing global health concern, with the number of diagnoses rising steadily around the world. Older age remains the most significant risk factor for developing the condition.

In 2024, the FDA approved donanemab, a second novel drug aimed at slowing the progression of early-stage Alzheimer’s disease – the most common form of dementia. However, there is still no cure, and current treatments are primarily focused on managing symptoms.

Inflammatory bowel disease (IBD) is a complex, chronic inflammatory condition affecting the gastrointestinal tract. It includes Crohn’s disease, ulcerative colitis, and IBD-unclassified (also called indeterminate IBD), which refers to cases where symptoms and clinical findings do not clearly fit the criteria for either Crohn’s disease or ulcerative colitis.

IBD is typically characterised by symptoms such as abdominal pain, diarrhoea and changes in bowel habits. However, because it can have systemic (extra-intestinal) effects, the condition can also affect other parts of the body, including the skin, eyes, joints and liver, and can cause general fatigue.

IBD should not be confused with irritable bowel syndrome (IBS), which is a common functional condition of the gastrointestinal tract. IBS can cause similar symptoms – such as abdominal pain and changes in bowel habits – however, unlike IBD, there are no changes in gut tissue.

There is currently no cure for IBD, except in ulcerative colitis, where in some select cases surgery may be curative. However, the condition can often be managed with anti-inflammatory medications and lifestyle changes.

IBD is a global health problem. Worldwide, between 1990 and 2021, new cases increased across all age groups, with the biggest jump seen in people aged 50 to 54. The smallest increase occurred in children under five. Importantly, IBD can be diagnosed from early childhood to later life, but in older adults in particular, symptoms can be mistaken for other conditions – potentially delaying diagnosis and treatment.

For our study, we used data from the Swedish Registry for Cognitive/Dementia Disorders (SveDem) – a comprehensive national quality registry that holds detailed medical information on people with various forms of dementia across Sweden. From this database, we identified people who were diagnosed with IBD after their dementia diagnosis. We then compared 111 people who had both dementia and newly diagnosed IBD with a control group of 1,110 people who had dementia but no IBD diagnosis. The two groups were closely matched for age, gender, type of dementia, other health conditions and medication use.

Measuring cognitive decline

To measure changes in cognitive function, we used the Mini-Mental State Examination (MMSE) score. The MMSE is a standardised test made up of 11 tasks, with a maximum score of 30 points. It is widely used by healthcare professionals to assess memory, attention, language and other aspects of cognitive performance, particularly when dementia is suspected. People without dementia typically score between 25 and 30, while those with dementia often score below 24.

In our study, we compared MMSE scores between the two groups. We also looked at changes in MMSE scores before and after the IBD diagnosis in people who had both dementia and IBD. Our results showed that those with both conditions experienced a significantly faster decline in cognitive function. This decline became more noticeable after the IBD diagnosis. On average, people with both diagnoses lost nearly one additional MMSE point per year compared to those with dementia alone. This level of decline is comparable to the difference seen between people with dementia who take the new Alzheimer’s drug donanemab and those who do not.
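The decline comparison described above can be illustrated with a toy calculation. This is a sketch with entirely synthetic numbers, not the study’s data or methodology: each hypothetical patient has a few (years since baseline, MMSE score) observations, a per-patient annual decline is estimated by least-squares slope, and the group averages are compared.

```python
# Hypothetical illustration (synthetic data, not the study's): comparing
# average annual MMSE decline between two groups of patients.

def annual_slope(observations):
    """Least-squares slope of MMSE score over time, in points per year."""
    n = len(observations)
    mean_t = sum(t for t, _ in observations) / n
    mean_s = sum(s for _, s in observations) / n
    num = sum((t - mean_t) * (s - mean_s) for t, s in observations)
    den = sum((t - mean_t) ** 2 for t, _ in observations)
    return num / den

# Each patient: list of (years_since_baseline, MMSE score) pairs.
dementia_only = [
    [(0, 24), (1, 22), (2, 20)],   # loses ~2 points/year
    [(0, 23), (1, 21), (2, 19)],
]
dementia_and_ibd = [
    [(0, 24), (1, 21), (2, 18)],   # loses ~3 points/year
    [(0, 23), (1, 20), (2, 17)],
]

def mean_slope(group):
    return sum(annual_slope(p) for p in group) / len(group)

# A positive difference means the IBD group declines faster.
extra_decline = mean_slope(dementia_only) - mean_slope(dementia_and_ibd)
print(f"Additional decline with IBD: {extra_decline:.1f} MMSE points/year")
# prints "Additional decline with IBD: 1.0 MMSE points/year"
```

In the real study, the groups were also matched on age, gender, dementia type, comorbidities and medication use, which this sketch omits.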

Our findings suggest that IBD – and the systemic inflammation it causes – may contribute to a faster worsening of cognitive function. This highlights the need for closer monitoring of people with both conditions. Managing IBD effectively through anti-inflammatory medications, nutritional support and, in some cases, surgery might help reduce neuroinflammation, thereby slowing the progression of dementia.

While our results indicate that cognitive decline was significantly faster in people with both dementia and newly diagnosed IBD, it is important to note that this was an observational study, so we cannot establish direct causality. The study also had some limitations. For instance, we lacked data on IBD severity and the specific treatments patients were receiving. We also did not explore differences by gender, dementia subtype, or IBD subtype.

Additionally, since dementia is typically diagnosed in older age, elderly-onset IBD cases may have been underdiagnosed. Finally, while SveDem is a valuable national registry, it does not yet include all newly diagnosed dementia cases in Sweden.

Understanding how IBD influences the brain could open the door to new strategies for protecting cognitive health in older adults. Furthermore, identifying whether specific IBD treatments can slow cognitive decline may benefit people living with both conditions and could help with the refinement of care for this vulnerable patient population.

The Conversation

Hong Xu receives funding from the Swedish Research Council (Starting grant#2022-01428) and the Center for Innovative Medicine Foundation (CIMED, FoUI-1002840).

Jonas F. Ludvigsson has coordinated an unrelated study on behalf of the Swedish IBD quality register (SWIBREG). That study received funding from Janssen corporation. Dr Ludvigsson has also received financial support from Merck/MSD for an unrelated study on IBD; and for developing a paper reviewing national healthcare registers in China. Dr Ludvigsson has also an ongoing research collaboration on celiac disease with Takeda.

Iris Mikulic does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. How inflammatory bowel disease may accelerate the progression of dementia – https://theconversation.com/how-inflammatory-bowel-disease-may-accelerate-the-progression-of-dementia-260904