Calls to designate the Bishnoi gang a terrorist group shine a spotlight on Canadian security laws

Source: The Conversation – Canada – By Basema Al-Alami, SJD Candidate, Faculty of Law, University of Toronto

British Columbia Premier David Eby recently called on Prime Minister Mark Carney to designate the India-based Bishnoi gang a terrorist organization.

Brampton Mayor Patrick Brown echoed the request days later. The RCMP has also alleged the gang may be targeting pro-Khalistan activists in Canada.

These claims follow a series of high-profile incidents in India linked to the Bishnoi network, including the murder of a Punjabi rapper in New Delhi, threats against a Bollywood actor and the killing of a Mumbai politician in late 2024.

How terrorism designations work

Eby’s request raises broader legal questions. What does it mean to label a group a terrorist organization in Canada, and what happens once that label is applied?

Under Section 83.05 of the Criminal Code, the federal government can designate an entity a terrorist organization if there are “reasonable grounds to believe” it has engaged in, supported or facilitated terrorist activity. The term “entity” is defined broadly, covering individuals, groups, partnerships and unincorporated associations.

The process begins with intelligence and law enforcement reports submitted to the public safety minister, who may then recommend to cabinet that the group be listed if the minister believes the legal threshold is met. If cabinet agrees, the group is officially designated a terrorist organization.

A designation carries serious consequences: assets can be frozen and financial dealings become criminalized. Banks and other institutions are protected from liability if they refuse to engage with the group. Essentially, the designation cuts the group off from economic and civic life, often without prior notice or public hearing.

As of July 2025, Canada has listed 86 entities, from the Islamic Revolutionary Guard Corps to far-right and nationalist organizations. In February, the government added seven violent criminal groups from Latin America, including the Sinaloa cartel and La Mara Salvatrucha, known as MS-13.

This marked a turning point: for the first time, Canada extended terrorism designations beyond ideological or political movements to include transnational criminal networks.

Why the shift matters

This shift reflects a deeper redefinition of what Canada considers a national security threat. For much of the post-9/11 era, counterterrorism efforts in Canada have concentrated on groups tied to ideological, religious or political agendas — most often framed through the lens of Islamic terrorism.

This has determined not only who is targeted, but also what forms of violence are taken seriously as national security concerns.

That is why the recent expansion of terrorism designations — first with the listing of Mexican cartels in early 2025, and now potentially with the Bishnoi gang — feels so significant.

It signals a shift away from targeting ideology alone and toward labelling profit-driven organized crime as terrorism. While transnational gangs may pose serious public safety risks, designating them terrorist organizations could erode the legal and political boundaries that once separated counterterrorism initiatives from criminal law.

Canada’s terrorism listing process only adds to these concerns. The decision is made by cabinet, based on secret intelligence, with no obligation to inform the group or offer a chance to respond. Most of the evidence remains hidden, even from the courts.

While judicial review is technically possible, it is limited, opaque and rarely successful.

In effect, the label becomes final. It brings serious legal consequences like asset freezes, criminal charges and immigration bans. But the informal fallout can be just as harsh: banks shut down accounts, landlords back out of leases, employers cut ties. Even without a trial or conviction, the stigma of being associated with a listed group can dramatically change someone’s life.

What’s at stake

Using terrorism laws to go after violent criminal networks like the Bishnoi gang may seem justified. But it quietly expands powers that were originally designed for specific types of threats. It also stretches a national security framework already tainted by racial and political bias.

Read more:
Canadian law enforcement agencies continue to target Muslims

For more than two decades, Canada’s counterterrorism laws have disproportionately targeted Muslim and racialized communities under a logic of pre-emptive suspicion. Applying those same powers to organized crime, especially when it impacts immigrant and diaspora communities, risks reproducing that harm under a different label.

Canadians should be asking: what happens when tools built for exceptional threats become the default response to complex criminal violence?

As the federal government considers whether to label the Bishnoi gang a terrorist organization, the real question goes beyond whether the group meets the legal test. It’s about what kind of legal logic Canada is endorsing.

Terrorism designations carry sweeping powers, with little oversight and lasting consequences. Extending those powers to organized crime might appear pragmatic, but it risks normalizing a process that has long operated in the shadows, shaped by secrecy and executive discretion.

As national security law expands, Canadians should ask not just who gets listed, but how those decisions are made and what broader political agendas they might serve.

The Conversation

Basema Al-Alami does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Calls to designate the Bishnoi gang a terrorist group shine a spotlight on Canadian security laws – https://theconversation.com/calls-to-designate-the-bishnoi-gang-a-terrorist-group-shine-a-spotlight-on-canadian-security-laws-259844

Universities in every state care for congressional papers that document US political history − federal cuts put their work at risk

Source: The Conversation – USA – By Katherine Gregory, Assistant Professor, University Libraries, Mississippi State University

The papers of members of Congress are fertile ground for research into Congress’ role in shaping U.S. history. cunfek, iStock/Getty Images Plus

In 1971, the president of Mississippi State University, Dr. William L. Giles, invited President Richard Nixon to attend the dedication of U.S. Sen. John C. Stennis’ papers to the university library’s archives.

Nixon declined, but the Republican president sent a generous note in support of the veteran Democrat Stennis.

“Future students and scholars who study there will … familiarize themselves with the outstanding record of a U.S. Senator whose … judgment in complex areas of national security have been a source of strength and comfort to those who have led this Nation and to all who are concerned in preserving the freedom we cherish.”

Nixon’s prediction came true, perhaps ironically, considering the legal troubles over his own papers during the Watergate crisis. Congress passed the Presidential Records Act of 1978 after Nixon resigned.

Stennis’ gift to his alma mater prompted a windfall of subsequent congressional donations to what is now the Mississippi Political Collections at Mississippi State University Libraries.

Now, 55 years later, Mississippi State University holds a body of records from a bipartisan group of officials that has positioned it to tell a major part of the state’s story in national and global politics. That story is told to over 100 patrons and dozens of college and K-12 classes each year.

The papers are fertile ground for scholarly research into Congress’ role in shaping U.S. history, with its extraordinary powers over lawmaking, the economy and one of the world’s largest militaries.

Mississippi State University, where I work as an assistant professor and director of the Mississippi Political Collections, is not alone in providing such a rich source of history. It is part of a national network of universities that hold and steward congressional papers.

But support for this stewardship is in jeopardy. With the White House’s proposed elimination of independent granting agencies such as the National Endowment for the Humanities and the Institute of Museum and Library Services, it is unclear what money will be available for this work in the future.

A 1963 letter from Sen. John Stennis to a constituent about agricultural legislation and Russians in Cuba.
Mississippi State University

From research to public service

Mississippi State University’s expansive political archive is neither unique nor a break from the practices of our national peers:

• The Richard Russell Library for Political Research and Studies at the University of Georgia – named after the U.S. senator from Georgia from 1933 to 1971 – has grown since its founding in 1974 into one of America’s premier research libraries of political history, with more than 600 manuscript collections and an extensive oral history collection.

• Iowa Sen. Tom Harkin donated his papers to Drake University to form The Harkin Institute, which memorializes Harkin’s role as chief sponsor of the Americans with Disabilities Act through disability policy research and education.

• Sens. Robert and Elizabeth Dole’s papers are the bedrock of the Dole Institute of Politics at the University of Kansas.

• In 2023, retiring Sens. Richard Shelby and Patrick Leahy donated their archives – Shelby to the University of Alabama and Leahy to the University of Vermont.

By lending their papers and their political celebrity, members of Congress have laid the groundwork for repositories like these to promote policy research that enables local and state governments to shape legislation on issues central to their states.

More complete history

When the repositories are at universities, they also provide educational programming that encourages public service for the next generations.

At Mississippi State University, the John C. Stennis Institute for Government and Community Development sponsors an organization that allows students to learn about government, voting, organizing and potential careers on Capitol Hill with trips to Washington, D.C.

Depositing congressional papers in states and districts, to be cared for by professional archivists and librarians, extends the life of the records and expands their utility.

When elected officials give their papers to their constituents, they ensure the public can see and use the papers. This is a way of returning their history to them, while giving them the power to assemble a more complete, independent version of their political history. While members of Congress are not required by law to donate their papers, they passed a bipartisan concurrent resolution in 2008 encouraging the practice.

Users of congressional archives range from historians to college students, local investigative journalists, political memoirists and documentary filmmakers. In advance of the 2020 election, we contributed historical materials to CNN’s reporting on Joe Biden’s controversial relationship with the Southern bloc of segregationist senators in his early Senate years.

A copy of a 1947 letter about Indian resource rights from U.S. Rep. Carl Albert of Oklahoma, who ultimately became the 46th speaker of the U.S. House of Representatives, to a Native American constituent.
Carl Albert Center Congressional and Political Collections, University of Oklahoma

Preserving the archives

While the results contribute to the humanities, the process of archival preservation and management is as complex a science as any other.

“Congressional records” is a broad term that encompasses many formats such as letters, diaries, notes, meeting minutes, speech transcripts, guestbooks and schedules.

They also include ephemera such as campaign bumper stickers, military medals and even ceremonial pieces of the original U.S. Capitol flooring. They contain rare photographs of everything from natural disaster damage to state dinners and legacy audiovisual materials such as 8 mm film, cassette tapes and vinyl records. Members of Congress also have donated their libraries of hundreds of books.

Archival preservation is a constantly evolving science. Only in the mid-20th century was the acid-free box developed to arrest the deterioration of paper records. After the advent of film-based photographs, archivists learned to keep them away from light and heat, and they observed that audiovisual materials such as 8 mm tape quickly decompose from acid decay if not stored in proper conditions.

Alongside preservation work comes the task of inventorying the records for public use. Archivists write finding aids – itemized, searchable catalogs of the records – and create metadata, which describes items in terms of size, creation date and location.

Future congressional papers will include born-digital content such as email and social media. This means traditional archiving will give way to digital preservation and data management. Federal law mandates that digital records have alt-text and transcription, and managing them requires specialized expertise in file storage and data security, because congressional papers often contain case files with sensitive personal data.

With congressional materials often clocking in at hundreds or thousands of linear feet, emerging artificial intelligence and automation technologies will usher this field into a new era, with AI speeding metadata and cataloging work to deliver usable records for researchers faster than ever.

No more funding?

All of this work takes money; most of it takes staff time. Institutions meet these needs through federal grants – the very grants at risk from the Trump administration’s proposed elimination of the agencies that administer them.

For example, West Virginia University has been awarded over $400,000 since 2021 from the National Endowment for the Humanities for the American Congress Digital Archives Portal project, a website that centralizes digitized congressional records at the university and a growing list of partners such as the University of Hawaii and the University of Oklahoma.

Past federal grants have funded other congressional papers projects, from basic supply needs such as folders to more complex repair of film and tape.

The Howard Baker Center for Public Policy at the University of Tennessee used National Endowment for the Humanities funds to purchase specialized supplies needed to store the papers of its namesake, the Republican senator who also served as chief of staff to President Ronald Reagan.

National Endowment for the Humanities funds helped process U.S. Rep. Pat Williams’ papers at the University of Montana, resulting in a searchable finding aid for the 87 boxes of records documenting the Montana Democrat’s 18 years in Congress.

President Franklin D. Roosevelt said, “I have an unshaken conviction that democracy can never be undermined if we maintain our library resources and a national intelligence capable of utilizing them.”

With the current threat to federal grants – and agencies – that pay for the crucial work of stewarding these congressional papers, it appears that these records of democracy may no longer play their role in supporting that democracy.

The Conversation

Katherine Gregory received funding from the National Endowment for the Humanities and is a member of the Society of American Archivists.

ref. Universities in every state care for congressional papers that document US political history − federal cuts put their work at risk – https://theconversation.com/universities-in-every-state-care-for-congressional-papers-that-document-us-political-history-federal-cuts-put-their-work-at-risk-256053

Your data privacy is slipping away – here’s why, and what you can do about it

Source: The Conversation – USA – By Mike Chapple, Teaching Professor of IT, Analytics, and Operations, University of Notre Dame

Cybersecurity and data privacy are constantly in the news. Governments are passing new cybersecurity laws. Companies are investing in cybersecurity controls such as firewalls, encryption and awareness training at record levels.

And yet, people are losing ground on data privacy.

In 2024, the Identity Theft Resource Center reported that companies sent out 1.3 billion notifications to the victims of data breaches. That’s more than triple the notices sent out the year before. It’s clear that despite growing efforts, personal data breaches are not only continuing, but accelerating.

What can you do about this situation? Many people think of the cybersecurity issue as a technical problem. They’re right: Technical controls are an important part of protecting personal information, but they are not enough.

As a professor of information technology, analytics and operations at the University of Notre Dame, I study ways to protect personal privacy.

Solid personal privacy protection is made up of three pillars: accessible technical controls, public awareness of the need for privacy, and public policies that prioritize personal privacy. Each plays a crucial role in protecting personal privacy. A weakness in any one puts the entire system at risk.

The first line of defense

Technology is the first line of defense, guarding access to computers that store data and encrypting information as it travels between computers to keep intruders from gaining access. But even the best security tools can fail when misused, misconfigured or ignored.

Two technical controls are especially important: encryption and multifactor authentication. These are the backbone of digital privacy – and they work best when widely adopted and properly implemented.

Read more:
The hidden cost of convenience: How your data pulls in hundreds of billions of dollars for app and social media companies

Encryption uses complex math to put sensitive data in an unreadable format that can only be unlocked with the right key. For example, your web browser uses HTTPS encryption to protect your information when you visit a secure webpage. This prevents anyone on your network – or any network between you and the website – from eavesdropping on your communications. Today, nearly all web traffic is encrypted in this way.
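
To make this concrete, here is a minimal sketch, using only Python’s standard library, of the TLS layer that HTTPS rides on: the client negotiates an encrypted session with the server before any data moves. The host example.com is just a placeholder for illustration.

```python
# A minimal look at the TLS handshake underneath an HTTPS connection
# (Python standard library only).
import socket
import ssl

hostname = "example.com"  # placeholder host for illustration

# Open a TCP connection, then wrap it in TLS with certificate verification.
context = ssl.create_default_context()
with socket.create_connection((hostname, 443)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=hostname) as tls_sock:
        # Everything sent through tls_sock from here on is encrypted in transit.
        print("Negotiated protocol:", tls_sock.version())  # e.g. 'TLSv1.3'
        print("Certificate subject:", tls_sock.getpeercert()["subject"])
```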

But if we’re so good at encrypting data on networks, why are we still suffering all of these data breaches? The reality is that encrypting data in transit is only part of the challenge.

Securing stored data

We also need to protect data wherever it’s stored – on phones, laptops and the servers that make up cloud storage. Unfortunately, this is where security often falls short. Encrypting stored data, or data at rest, isn’t as widespread as encrypting data that is moving from one place to another.

While modern smartphones typically encrypt files by default, the same can’t be said for cloud storage or company databases. Only 10% of organizations report that at least 80% of the information they have stored in the cloud is encrypted, according to a 2024 industry survey. This leaves a huge amount of unencrypted personal information potentially exposed if attackers manage to break in. Without encryption, breaking into a database is like opening an unlocked filing cabinet – everything inside is accessible to the attacker.
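
As a rough illustration of what closing that gap looks like, here is a minimal sketch of encrypting a record before it is stored, assuming the third-party cryptography package is installed; the record contents and key-handling details are hypothetical.

```python
# A minimal sketch of encryption at rest using the third-party
# "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

# Generate the key once and keep it separate from the data it protects
# (for example, in a key management service, never next to the database).
key = Fernet.generate_key()
fernet = Fernet(key)

record = b"patient: Jane Doe, diagnosis: ..."  # hypothetical sensitive record
token = fernet.encrypt(record)  # this ciphertext is what actually gets stored

# An attacker who copies the stored token but not the key sees unreadable bytes;
# with the key, the original record is recovered exactly.
assert fernet.decrypt(token) == record
```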

Multifactor authentication is a security measure that requires you to provide more than one form of verification before accessing sensitive information. This type of authentication is more difficult to crack than a password alone because it requires a combination of different types of information. It often combines something you know, such as a password, with something you have, such as a smartphone app that can generate a verification code, or with something you are, such as a fingerprint. Proper use of multifactor authentication reduces the risk of compromise by 99.22%.
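
The “something you have” factor is often a time-based one-time password, the mechanism behind most authenticator apps. Here is a minimal sketch of how those codes are generated and checked, assuming the third-party pyotp package; because the code changes every 30 seconds and derives from a secret that never travels with the password, a stolen password alone is not enough.

```python
# A minimal sketch of time-based one-time passwords (TOTP).
# Requires the third-party "pyotp" package (pip install pyotp).
import pyotp

# Enrollment: the server creates a shared secret for the user, who loads it
# into an authenticator app (typically by scanning a QR code).
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

# Login: the app displays a short-lived 6-digit code derived from the secret
# and the current time; the server recomputes it independently and compares.
code = totp.now()
print("Code from the app:", code)
print("Server accepts it:", totp.verify(code))  # True within the time window
```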

While 83% of organizations require that their employees use multifactor authentication, according to another industry survey, this still leaves millions of accounts protected by nothing more than a password. As attackers grow more sophisticated and credential theft remains rampant, closing that 17% gap isn’t just a best practice – it’s a necessity.

Multifactor authentication is one of the simplest, most effective steps organizations can take to prevent data breaches, but it remains underused. Expanding its adoption could dramatically reduce the number of successful attacks each year.

Awareness gives people the knowledge they need

Even the best technology falls short when people make mistakes. Human error played a role in 68% of 2024 data breaches, according to a Verizon report. Organizations can mitigate this risk through employee training, data minimization – meaning collecting only the information necessary for a task, then deleting it when it’s no longer needed – and strict access controls.

Policies, audits and incident response plans can help organizations prepare for a possible data breach so they can stem the damage, see who is responsible and learn from the experience. It’s also important to guard against insider threats and physical intrusion using physical safeguards such as locking down server rooms.

Public policy holds organizations accountable

Legal protections help hold organizations accountable in keeping data protected and giving people control over their data. The European Union’s General Data Protection Regulation is one of the most comprehensive privacy laws in the world. It mandates strong data protection practices and gives people the right to access, correct and delete their personal data. And the General Data Protection Regulation has teeth: In 2023, Meta was fined €1.2 billion (US$1.4 billion) when Facebook was found in violation.

Despite years of discussion, the U.S. still has no comprehensive federal privacy law. Several proposals have been introduced in Congress, but none have made it across the finish line. In its place, a mix of state regulations and industry-specific rules – such as the Health Insurance Portability and Accountability Act for health data and the Gramm-Leach-Bliley Act for financial institutions – fill the gaps.

Some states have passed their own privacy laws, but this patchwork leaves Americans with uneven protections and creates compliance headaches for businesses operating across jurisdictions.

The tools, policies and knowledge to protect personal data exist – but people’s and institutions’ use of them still falls short. Stronger encryption, more widespread use of multifactor authentication, better training and clearer legal standards could prevent many breaches. It’s clear that these tools work. What’s needed now is the collective will – and a unified federal mandate – to put those protections in place.


This article is part of a series on data privacy that explores who collects your data, what and how they collect, who sells and buys your data, what they all do with it, and what you can do about it.

The Conversation

Mike Chapple does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Your data privacy is slipping away – here’s why, and what you can do about it – https://theconversation.com/your-data-privacy-is-slipping-away-heres-why-and-what-you-can-do-about-it-251768

Scientific norms shape the behavior of researchers working for the greater good

Source: The Conversation – USA – By Jeffrey A. Lee, Professor of Geography and the Environment, Texas Tech University

Mentors model the ethical pursuit of scientific knowledge. sanjeri/E+ via Getty Images

Over the past 400 years or so, a set of mostly unwritten guidelines has evolved for how science should be properly done. The assumption in the research community is that science advances most effectively when scientists conduct themselves in certain ways.

The first person to write down these attitudes and behaviors was Robert Merton, in 1942. The founder of the sociology of science laid out what he called the “ethos of science,” a set of “values and norms which is held to be binding on the man of science.” (Yes, it’s sexist wording. Yes, it was the 1940s.) These now are referred to as scientific norms.

The point of these norms is that scientists should behave in ways that improve the collective advancement of knowledge. If you’re a cynic, you might be rolling your eyes at such a Pollyannaish ideal. But corny expectations keep the world functioning. Think: Be kind, clean up your mess, return the shopping cart to the cart corral.

I’m a physical geographer who realized long ago that students are taught biology in biology classes and chemistry in chemistry classes, but rarely are they taught about the overarching concepts of science itself. So I wrote a book called “The Scientific Endeavor,” laying out what scientists and other educated people should know about science itself.

Scientists in training are expected to learn the big picture of science after years of observing their mentors, but that doesn’t always happen. And understanding what drives scientists can help nonscientists better understand research findings. These scientific norms are a big part of the scientific endeavor. Here are Merton’s original four, along with a couple I think are worth adding to the list:

Universalism

Scientific knowledge is for everyone – it’s universal – and not the domain of an individual or group. In other words, a scientific claim must be judged on its merits, not the person making it. Characteristics like a scientist’s nationality, gender or favorite sports team should not affect how their work is judged.

Also, the past record of a scientist shouldn’t influence how you judge whatever claim they’re currently making. For instance, Nobel Prize-winning chemist Linus Pauling was not able to convince most scientists that large doses of vitamin C are medically beneficial; his evidence didn’t sufficiently support his claim.

In practice, it’s hard to judge contradictory claims fairly when they come from a “big name” in the field versus an unknown researcher without a reputation. It is, however, easy to point out such breaches of universalism when others let scientific fame sway their opinion one way or another about new work.

When asked about patenting his polio vaccine, Jonas Salk replied, ‘There is no patent. Could you patent the sun?’
Bettmann via Getty Images

Communism

Communism in science is the idea that scientific knowledge is the property of everyone and must be shared.

Jonas Salk, who led the research that resulted in the polio vaccine, provides a classic example of this scientific norm. He published the work and did not patent the vaccine so that it could be freely produced at low cost.

When scientific research doesn’t have direct commercial application, communism is easy to practice. When money is involved, however, things get complicated. Many scientists work for corporations, and they might not publish their findings in order to keep them away from competitors. The same goes for military research and cybersecurity, where publishing findings could help the bad guys.

Disinterestedness

Disinterestedness refers to the expectation that scientists pursue their work mainly for the advancement of knowledge, not to advance an agenda or get rich. The expectation is that a researcher will share the results of their work, regardless of a finding’s implications for their career or economic bottom line.

Research on politically hot topics, like vaccine safety, is where it can be tricky to remain disinterested. Imagine a scientist who is strongly pro-vaccine. If their vaccine research results suggest serious danger to children, the scientist is still obligated to share these findings.

Likewise, if a scientist has invested in a company selling a drug, and the scientist’s research shows that the drug is dangerous, they are morally compelled to publish the work even if that would hurt their income.

In addition, when publishing research, scientists are required to disclose any conflicts of interest related to the work. This step informs others that they may want to be more skeptical in evaluating the work, in case self-interest won out over disinterest.

Disinterestedness also applies to journal editors, who are obligated to decide whether to publish research based on the science, not the political or economic implications.

Organized skepticism

Merton’s last norm is organized skepticism. Skepticism does not mean rejecting ideas because you don’t like them. To be skeptical in science is to be highly critical and look for weaknesses in a piece of research.

By the time new research is published in a reputable journal, it has made it past several sets of skeptical eyes.
gorsh13/iStock via Getty Images Plus

This concept is formalized in the peer review process. When a scientist submits an article to a journal, the editor sends it to two or three scientists familiar with the topic and methods used. They read it carefully and point out any problems they find.

The editor then uses the reviewers’ reports to decide whether to accept the article as is, reject it outright or request revisions. If the decision is to revise, the author makes each requested change or tries to convince the editor that the reviewer is wrong.

Peer review is not perfect and doesn’t always catch bad research, but in most cases it improves the work, and science benefits. Traditionally, results weren’t made public until after peer review, but that practice has weakened in recent years with the rise of preprints, reducing the reliability of information for nonscientists.

Integrity and humility

I’m adding two norms to Merton’s list.

The first is integrity. It’s so fundamental to good science that it almost seems unnecessary to mention. But I think it’s justified since cheating, stealing and lazy scientists are getting plenty of attention these days.

The second is humility. You may have made a contribution to our understanding of cell division, but don’t tell us that you cured cancer. You may be a leader in quantum mechanics research, but that doesn’t make you an authority on climate change.

Scientific norms are guidelines for how scientists are expected to behave. A researcher who violates one of these norms won’t be carted off to jail or fined an exorbitant fee. But when a norm is not followed, scientists must be prepared to justify their reasons, both to themselves and to others.

The Conversation

Jeffrey A. Lee does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Scientific norms shape the behavior of researchers working for the greater good – https://theconversation.com/scientific-norms-shape-the-behavior-of-researchers-working-for-the-greater-good-255159

President Trump’s tug-of-war with the courts, explained

Source: The Conversation – USA – By Paul M. Collins Jr., Professor of Legal Studies and Political Science, UMass Amherst

The U.S. Supreme Court in Washington, D.C. Stefani Reynolds/Bloomberg

The Supreme Court handed President Donald Trump a big win on June 27, 2025, by limiting the ability of judges to block Trump administration policies across the nation.

But Trump has not fared nearly as well in the lower courts, where he has lost a series of cases through different levels of the federal court system. On June 5, a single judge temporarily stopped the administration from preventing Harvard University from enrolling international students.

And a three-judge panel of the U.S. Court of International Trade blocked Trump on May 28 from imposing tariffs on China and other nations. The Trump administration has appealed this decision. It will be taken up in July by all 11 judges on the United States Court of Appeals for the Federal Circuit.

After that, the case can be appealed to the Supreme Court.

I’m a scholar of the federal courts. The reasons why some courts have multiple judges and others have a single judge can be confusing. Here’s a guide to help understand what’s going on in the federal courts.

Federal District Courts

The U.S. District Courts are the trial courts in the federal system and hear about 400,000 cases per year. A single judge almost always presides over cases.

This makes sense for a jury trial, since a judge might make dozens of spur-of-the-moment decisions during the course of a trial, such as ruling on a lawyer’s objection to a question asked of a witness. If a panel of, say, three judges performed this task, it would prolong proceedings because the three judges would have to deliberate over every ruling.

A more controversial role of District Courts involves setting nationwide injunctions. This happens when a single judge temporarily stops the government from enforcing a policy throughout the nation.

There have been more than two dozen nationwide injunctions during Trump’s second term. These involve policy areas as diverse as ending birthright citizenship, firing federal employees and banning transgender people from serving in the military.

President Donald Trump speaks at the White House on June 27, 2025, after the Supreme Court curbed the power of lone federal judges to block executive actions.
Andrew Caballero-Reynolds/AFP via Getty Images

Trump and Republicans in Congress argue that the ability to issue nationwide injunctions gives too much power to a single judge. Instead, they believe injunctions should apply only to the parties involved in the case.

On June 27, the Supreme Court agreed with the Trump administration and severely limited the ability of District Court judges to issue nationwide injunctions. This means that judges can generally stop policies from being enforced only against the parties to a lawsuit, instead of everyone in the nation.

In rare instances, a panel of three District Court judges hears a case. Congress decides what cases these special three-judge panels hear, reserving them for especially important issues. For example, these panels have heard cases involving reapportionment, which is how votes are translated into legislative seats in Congress and state legislatures, and allegations that a voter’s rights have been violated.

The logic behind having three judges hear such important cases is that they will give more careful consideration to the dispute. This may lend legitimacy to a controversial decision and prevents a single judge from exercising too much power.

There are also specialized courts that hear cases involving particular policies, sometimes in panels of three judges. For instance, three-judge panels on the U.S. Court of International Trade decide cases involving executive orders related to international trade.

The federal Court of Appeals

The U.S. Court of Appeals hears appeals from the District Courts and specialized courts.

The 13 federal circuit courts that make up the U.S. Court of Appeals are arranged throughout the country and handle about 40,000 cases per year. Each circuit court has six to 29 judges. Cases are decided primarily by three-judge panels.

Having multiple judges decide cases on the Court of Appeals is seen as worthwhile, since these courts are policymaking institutions. This means they set precedents for the judicial circuit in which they operate, which covers three to nine states.

Supporters of this system argue that by having multiple judges on appellate courts, the panel will consider a variety of perspectives on the case and collaborate with one another. This can lead to better decision-making. Additionally, having multiple judges check one another can boost public confidence in the judiciary.

The party that loses a case before a three-judge panel can request that the entire circuit rehear the case. This is known as sitting en banc.

Because judges on a circuit can decline to hear cases en banc, this procedure is usually reserved for especially significant cases. For instance, the U.S. Court of Appeals for the Federal Circuit has agreed to an en banc hearing to review the Court of International Trade’s decision to temporarily halt Trump’s sweeping tariff program. It also allowed the tariffs to remain in effect until the appeal plays out, likely in August.

The exception to having the entire circuit sit together en banc is the 9th Circuit, based in San Francisco, which has 29 judges, far more than other circuit courts. It uses an 11-judge en banc process, since having 29 judges hear cases together would be logistically challenging.

Cargo ships are seen at a container terminal in the Port of Shanghai, China, in May 2025. A three-judge panel of the U.S. Court of International Trade blocked Trump from imposing tariffs on China and other nations.
CFOTO/Future Publishing via Getty Images

The US Supreme Court

The U.S. Supreme Court sits atop the American legal system and decides about 60 cases per year.

Cases are decided by all nine justices, unless a justice declines to participate because of a conflict of interest. As with other multimember courts, advocates of the nine-member makeup argue that the quality of decision-making is improved by having many justices participate in a case’s deliberation.

Each Supreme Court justice is charged with overseeing one or more of the 13 federal circuits. In this role, a single justice reviews emergency appeals from the District Courts and an appellate court within a circuit. This authorizes them to put a temporary hold on the implementation of policies within that circuit or refer the matter to the entire Supreme Court.

In February, for example, Chief Justice John Roberts blocked a Court of Appeals order that would have compelled the Trump administration to pay nearly US$2 billion in reimbursements for already completed foreign aid work.

In March, a 5-4 majority of the high court sent the case back to U.S. District Judge Amir Ali, who subsequently ordered the Trump administration to release some of the funds.

The federal judicial system is complex. The flurry of executive orders from the Trump administration means that cases are being decided on a nearly daily basis by a variety of courts.

A single judge will decide some of these cases, and others are considered by full courts. Though the nine justices of the Supreme Court technically have the final say, the sheer volume of legal challenges means that America’s District Courts and Court of Appeals will resolve many of the disputes.

The Conversation

Paul M. Collins Jr. does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. President Trump’s tug-of-war with the courts, explained – https://theconversation.com/president-trumps-tug-of-war-with-the-courts-explained-258234

The US has high hopes for a new Gaza ceasefire, but Israel’s long-term aims seem far less peaceful

Source: The Conversation – Global Perspectives – By Ali Mamouri, Research Fellow, Middle East Studies, Deakin University

US President Donald Trump has hosted Israeli Prime Minister Benjamin Netanyahu for dinner at the White House, where he has declared talks to end the war in Gaza are “going along very well”.

In turn, Netanyahu revealed he has nominated Trump for the Nobel Peace Prize, saying:

he is forging peace as we speak, in one country, in one region, after the other.

Despite all the talk of peace, negotiations in Qatar between Israeli and Palestinian delegations have broken up without a breakthrough. The talks are expected to resume later this week.

If an agreement is reached, it will likely be hailed as a crucial opportunity to end nearly two years of humanitarian crisis in Gaza, following the October 7, 2023, attacks in which about 1,200 Israelis were killed by Hamas-led militants.

However, there is growing scepticism about the durability of any truce. A previous ceasefire agreement reached in January led to the release of dozens of Israeli hostages and hundreds of Palestinian prisoners.

But it collapsed by March, when Israel resumed military operations in Gaza.

This breakdown in trust on both sides, combined with ongoing Israeli military operations and political instability, suggests the new deal may prove to be another temporary pause rather than a lasting resolution.

Details of the deal

The proposed agreement outlines a 60-day ceasefire aimed at de-escalating hostilities in Gaza and creating space for negotiations toward a more lasting resolution.

Hamas would release 10 surviving Israeli hostages and return the remains of 18 others. In exchange, Israel is expected to withdraw its military forces to a designated buffer zone along Gaza’s borders with both Israel and Egypt.

The agreement being thrashed out in Doha includes the release of Israeli hostages, held in Gaza for the past 22 months.
Anas-Mohammed/Shutterstock

While the specific terms of a prisoner exchange remain under negotiation, the release of Palestinian detainees held in Israeli prisons is a central component of the proposal.

Humanitarian aid is also a key focus of the agreement. Relief would be delivered through international organisations, primarily UN agencies and the Palestinian Red Crescent.

However, the agreement does not specify the future role of the US-backed Gaza Humanitarian Fund, which has been distributing food aid since May.

The urgency of humanitarian access is underscored by the scale of destruction in Gaza. According to Gaza’s Health Ministry, Israel’s military campaign has killed more than 57,000 Palestinians. The offensive has triggered a hunger crisis, displaced much of the population internally, and left vast areas of the territory in ruins.

Crucially, the agreement does not represent an end to the war, one of Hamas’s core demands. Instead, it commits both sides to continue negotiations throughout the 60-day period, with the hope of reaching a more durable and comprehensive ceasefire.

Obstacles to a lasting peace

Despite the apparent opportunity to reach a final ceasefire, especially after Israel has inflicted severe damage on Hamas, Netanyahu’s government appears reluctant to fully end the military campaign.

There is scepticism a temporary ceasefire would lead to permanent peace.
Anas-Mohammed/Shutterstock

A central reason is political: Netanyahu’s ruling coalition heavily relies on far-right parties that insist on continuing the war. Any serious attempt at a ceasefire could lead to the collapse of his government.

Militarily, Israel has achieved several of its tactical objectives.

It has significantly weakened Hamas and other Palestinian factions and caused widespread devastation across Gaza. This is alongside the mass arrests, home demolitions, and killing of hundreds of Palestinians in the West Bank.

And it has forced Hezbollah in Lebanon to scale back its operations after sustaining major losses.

Perhaps most notably, Israel struck deep into Iran’s military infrastructure, killing dozens of high-ranking commanders and damaging its missile and nuclear capabilities.

Reshaping the map

Yet Netanyahu’s ambitions may go beyond tactical victories. There are signs he is aiming for two broader strategic outcomes.

First, by making Gaza increasingly uninhabitable, his government could push Palestinians to flee. This would effectively pave the way for Israel to annex the territory in the long term – a scenario advocated by many of his far-right allies.

Speaking at the White House, Netanyahu said he is working with the US on finding countries that will take Palestinians from Gaza:

if people want to stay, they can stay, but if they want to leave, they should be able to leave.

Second, prolonging the war allows Netanyahu to delay his ongoing corruption trial and extend his political survival.

True intentions

At the heart of the impasse is the far-right’s vision for total Palestinian defeat, with no concession and no recognition of a future Palestinian state. This ideology has consistently blocked peace efforts for three decades.

Israeli leaders have repeatedly described any potential Palestinian entity as “less than a state” or a “state-minus”, a formulation that falls short of Palestinian aspirations and international legal standards.

Today, even that limited vision appears to be off the table, as Israeli policy moves towards complete rejection of Palestinian statehood.

With Palestinian resistance movements significantly weakened and no immediate threat facing Israel, this moment presents a crucial test of Israel’s intentions.

Is Israel genuinely pursuing peace, or seeking to cement its dominance in the region while permanently denying Palestinians their right to statehood?

Following its military successes and the normalisation of relations with several Arab states under the Abraham Accords, Israeli political discourse has grown increasingly bold.

Some voices in the Israeli establishment are openly advocating for the permanent displacement of Palestinians to neighbouring Arab countries such as Jordan, Egypt and Saudi Arabia. This would effectively erase the prospect of a future Palestinian state.

This suggests that for certain factions within Israel, the end goal is not a negotiated settlement, but a one-sided resolution that reshapes the map and the people of the region on Israel’s terms.

The coming weeks will reveal whether Israel chooses the path of compromise and coexistence, or continues down a road that forecloses the possibility of lasting peace.

The Conversation

Ali Mamouri does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. The US has high hopes for a new Gaza ceasefire, but Israel’s long-term aims seem far less peaceful – https://theconversation.com/the-us-has-high-hopes-for-a-new-gaza-ceasefire-but-israels-long-term-aims-seem-far-less-peaceful-260286

Higher ed’s relationship with marriage? It’s complicated – and depends on age

Source: The Conversation – USA (2) – By John V. Winters, Professor of Economics, Iowa State University

Education rates are rising; marriage rates are falling. But the relationship between those two trends isn’t straightforward. Ugur Karakoc/E+ via Getty Images

The longer someone stays in school, the more likely they are to delay getting married – but education does not reduce the overall likelihood of being married later in life, according to our research recently published in Education Economics. Education also influences who Americans marry: Obtaining a four-year degree vs. just a high school diploma more than doubles someone’s likelihood of marrying a fellow college graduate.

Previous research has documented that the more education you have, the more likely you are to get married. But correlation does not imply causality, and plenty of other factors influence marriage and education.

My research with economist Kunwon Ahn provides evidence that there is indeed a causal link between education and marriage – but it’s a nuanced one.

Our study applies economic theory and advanced statistics to a 2006-2019 sample from the American Community Survey: more than 8 million people, whom we divided into different cohorts based on birthplace, birth year and self-reported ancestry.

To isolate the causal relationship, we needed to sidestep other factors that can influence someone’s decisions about marriage and education. Therefore, we did not calculate based on individuals’ own education level. Instead, we estimated their educational attainment using a proxy: their mothers’ level of education. On the individual level, plenty of people finish more or less education than their parents. Within a cohort, however, the amount of schooling that mothers have, on average, is a strong predictor of how much education children in that cohort received.

We found that an additional year of schooling – counting from first grade to the end of any postgraduate degrees – reduces the likelihood that someone age 25 to 34 is married by roughly 4 percentage points.

Among older age groups, the effects of education were more mixed. On average, the level of education has almost zero impact on the probability that someone age 45 to 54 is married. Among people who were married by that age, being more educated reduces their likelihood of being divorced or separated.

However, more education also makes people slightly more likely to have never been married by that age. In our sample, about 12% of people in that age group have never married. An additional year of education increases that share, on average, by 2.6 percentage points.

Why it matters

Marriage rates are at historical lows in the United States, especially for young people. Before 1970, more than 80% of Americans 25 to 34 were married. By 2023, that number had fallen to only 38%, according to data from the U.S. Census Bureau.

Over the same time, the percentage of Americans with a college degree has increased considerably. Additional education can increase someone’s earning potential and make them a more attractive partner.

Yet the rising costs of higher education may make marriage less attainable. A 2016 study found that the more college debt someone had, the less likely they were to ever marry.

While marriage rates have fallen across the board, the drop is most pronounced for lower-income groups, and not all of the gap is driven by education. One of the other causes may be declining job prospects for lower-income men. Over recent decades, as their earning potential has dwindled and women’s job options have grown, it appears some of the economic benefits of marriage have declined.

Declining marriage rates have important effects on individuals, families and society as a whole. Many people value the institution for its own sake, and others assign it importance based on religious, cultural and social values. Economically, marriage has important consequences for children, including how many children people have and the resources that they can invest in those children.

What still isn’t known

Education levels are only part of the explanation for trends in marriage rates. Other cultural, social, economic and technological factors are likely involved in the overall decline, but their exact contribution is still unknown.

One idea gaining traction, though little research has been done on it, considers the ways smartphones and social media may be reducing psychological and social well-being. We stay in more, go out less, and are increasingly divided – all of which could make people less likely to marry.

The Research Brief is a short take on interesting academic work.

The Conversation

John V. Winters does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Higher ed’s relationship with marriage? It’s complicated – and depends on age – https://theconversation.com/higher-eds-relationship-with-marriage-its-complicated-and-depends-on-age-258664

How slashing university research grants impacts Colorado’s economy and national innovation – a CU Boulder administrator explains

Source: The Conversation – USA (2) – By Massimo Ruzzene, Vice Chancellor of Research and Innovation, University of Colorado Boulder

Federal funding cuts to the University of Colorado Boulder have already impacted research and could cause even more harm. Glenn J. Asakawa/University of Colorado

The Trump administration has been freezing or reducing federal grants to universities across the country.

Over the past several months, universities have lost more than US$11 billion in funding, according to NPR. More than two dozen universities, including the University of Colorado Boulder and the University of Denver, have been affected. Research projects on cancer, farming solutions and climate resiliency are just a few of the many nationally that have seen cuts.

The Conversation asked Massimo Ruzzene, senior vice chancellor for research and innovation at the University of Colorado Boulder, to explain how these cuts and freezes are impacting the university he works for and Colorado’s local economy.

How important are federal funds to CU Boulder?

Federal funding pays for approximately 70% of CU Boulder’s research each year. That’s about $495 million in the 2023-2024 fiscal year.

The other 30% of research funding comes from a variety of sources. The second-largest is international partnerships at $127 million. Last year, CU Boulder also received $27 million in philanthropic gifts to support research and approximately $29 million from collaborations with industry.

CU Boulder uses this money to fund research that advances fields like artificial intelligence, space exploration and planetary sciences, quantum technologies, biosciences and climate and energy.

At CU Boulder, federal funding also supports research projects like the Dust Accelerator Laboratory that helps us understand the composition and structure of cosmic dust. This research allows scientists to reconstruct the processes that formed planets, moons and organic molecules.

How much federal funding has CU Boulder lost?

So far in 2025, CU Boulder has received 56 grant cancellations or stop-work orders. Those amount to approximately $30 million in lost funding. This number does not include awards that are on hold and awaiting action by the sponsor.

This number also does not include funds that have been inaccessible due to the considerable lag in funding from agencies such as the National Science Foundation and the National Institutes of Health.

Nationwide, National Science Foundation funding has dropped by more than 50% through the end of May of this year compared with the average of the past 10 years. The university anticipates that the funding it receives from these agencies will drop by a similar amount, but the numbers are still being collected for this year.

What research has been impacted?

A wide variety. To take just one example, CU Boulder’s Cooperative Institute for Research in Environmental Sciences and the Institute of Arctic and Alpine Research investigate how to monitor, predict, respond to and recover from extreme weather conditions and natural disasters.

This research directly impacts the safety, well-being and prosperity of Colorado residents facing wildfires, droughts and floods.

Michael Gooseff, a researcher from the College of Engineering and Applied Science, collects weather data from the McMurdo Dry Valleys in Antarctica.
Byron Adams/University of Colorado Boulder

Past research from these groups includes recovery efforts following the 2021 Marshall Fire in the Boulder area. Researchers collaborated with local governments and watershed groups to monitor environmental impacts and develop dashboards that detailed their findings.

How might cuts affect Colorado’s aerospace economy?

Colorado has more aerospace jobs per capita than any other state. The sector employs more than 55,000 people and contributes significantly to both Colorado’s economy and the national economy.

This ecosystem encompasses research universities such as CU Boulder and Colorado-based startups like Blue Canyon Technologies and Ursa Major Technologies. It also includes established global companies like Lockheed Martin and Raytheon Technologies.

At CU Boulder, the Laboratory for Atmospheric and Space Physics is one of the world’s premier space science research institutions. Researchers at the lab design, build and operate spacecraft and other instruments that contribute critical data. That data helps us understand Earth’s atmosphere, the Sun, planetary systems and deep space phenomena. If the projects the lab supports are cut, then it’s likely the lab will be cut as well.

The Presidential Budget Request proposes cuts of up to 24% to NASA’s annual budget, including a 47% reduction for the Science Mission Directorate, which supports more than a dozen space missions at CU Boulder. That cut could have an immediate impact of approximately $50 million on university programs.

Scientists test the solar arrays on NASA’s Mars Atmosphere and Volatile Evolution orbiter spacecraft at Lockheed Martin’s facility near Denver. Photo courtesy of LASP

One of the largest space missions CU Boulder is involved in is the Mars Atmosphere and Volatile Evolution orbiter. MAVEN, as it’s known, provides telecommunications and space weather monitoring capabilities. These are necessary to support future human and robotic missions to Mars over the next decade and beyond, a stated priority for the White House. If MAVEN were to be canceled, experts estimate it would cost almost $1 billion to restart.

Have the cuts hit quantum research?

While the federal government has identified quantum technology as a national priority, the fiscal year 2026 budget proposal only maintains existing funding levels. It does not introduce new investments or initiatives.

I’m concerned that this stagnation, amid broader cuts to science agencies, could undermine progress in this field and undercut the training of its critical workforce. The result could be the U.S. ceding its leadership in quantum innovation to global competitors.

The Conversation

Massimo Ruzzene receives funding from the National Science Foundation.

ref. How slashing university research grants impacts Colorado’s economy and national innovation – a CU Boulder administrator explains – https://theconversation.com/how-slashing-university-research-grants-impacts-colorados-economy-and-national-innovation-a-cu-boulder-administrator-explains-257869

3 basic ingredients, a million possibilities: How small pizzerias succeed with uniqueness in an age of chain restaurants

Source: The Conversation – USA (2) – By Paula de la Cruz-Fernández, Cultural Digital Collections Manager, University of Florida

Variety is the sauce of life. Suzanne Kreiter/Boston Globe via Getty Images

At its heart, pizza is deceptively simple. Made from just a few humble ingredients – baked dough, tangy sauce, melted cheese and maybe a few toppings – it might seem like a perfect candidate for the kind of mass-produced standardization that defines many global food chains, where predictable menus reign supreme.

Yet, visit two pizzerias in different towns – or even on different blocks of the same town – and you’ll find that pizza stubbornly refuses to be homogenized.

We are researchers working on a local business history project that documents the commercial landscape of Gainesville, Florida, in the 20th and 21st centuries. As part of that project, we’ve spent a great many hours over the past two years interviewing local restaurant owners, especially those behind Gainesville’s independent pizzerias. What we’ve found reaffirms a powerful truth: Pizza resists sameness – and small pizzerias are a big reason why.

Why standardized pizza rose but didn’t conquer

While tomatoes were unknown in Italy until the mid-16th century, they have since become synonymous with Italian cuisine – especially through pizza.

Pizza arrived in the U.S. from Naples in the early 20th century, when Italian immigration was at its peak. Two of the biggest destinations for Italian immigrants were New York City and Chicago, and today each has a distinctive pizza style. A New York slice can easily be identified by its thin, soft, foldable crust, while Chicago pies are known for deep, thick, buttery crusts.

After World War II, other regions developed their own types of pizza, including the famed New Haven and Detroit styles. The New Haven style is known for being thin, crispy and charred in a coal-fired oven, while the Detroit style has a rectangular, deep-dish shape and thick, buttery crust.

By the latter half of the 20th century, pizza had become a staple of the American diet. And as its popularity grew, so did demand for consistent, affordable pizza joints. Chains such as Pizza Hut, founded in 1958, and Papa John’s, established in 1984, applied the model pioneered by McDonald’s in the late 1940s, adopting limited menus, assembly-line kitchens and franchise models built for consistency and scale. New technologies such as point-of-sale systems and inventory management software made things even more efficient.

As food historian Carol Helstosky explains in “Pizza: A Global History,” the transformation involved simplifying recipes, ensuring consistent quality and developing formats optimized for rapid expansion and franchising. What began as a handcrafted, regional dish became a highly replicable product suited to global mass markets.

Today, more than 20,000 Pizza Huts operate worldwide. Papa John’s, which runs about 6,000 pizzerias, built its brand explicitly on a promise rooted in standardization. In this model, success means making pizza the same way, everywhere, every time.

So, what happened to the independent pizzerias? Did they get swallowed up by efficiency?

Not quite.

Chain restaurants don’t necessarily suffocate small competitors, recent research shows. In fact, in the case of pizza, they often coexist, sometimes even fueling creativity and opportunity. Independent pizzerias – there are more than 44,000 nationwide – lean into what makes them unique, carving out a niche. Rather than focusing only on speed or price, they compete by offering character, inventive toppings, personal service and a sense of place that chains just can’t replicate.

A local pizza scene: Creativity in a corporate age

For an example, look no further than Gainesville. A college town with fewer than 150,000 residents, Gainesville doesn’t have the same culinary cachet as New York or Chicago, but it has developed a distinctive pizza scene. With 13 independent pizzerias serving Neapolitan, Detroit, New York, Mediterranean and other styles, hungry Gators have a plethora of options when craving a slice.

What makes Gainesville’s pizza scene especially interesting is the range of backgrounds its proprietors have. Through interviews with pizzeria owners, we found that some had started as artists and musicians, while others had worked in engineering or education – and each had their own unique approach to making pizzas.

The owner of Strega Nona’s Oven, for example, uses his engineering background to turn dough-making into a science, altering the proportions of ingredients by as little as half of a percent based on the season or even the weather.
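
To illustrate what adjustments that small look like in practice, here is a minimal sketch of baker's-percentage arithmetic, the convention in which each dough ingredient is expressed as a share of flour weight. The hydration, salt and yeast figures below are hypothetical; the article does not publish this pizzeria's actual formula.

    # Baker's percentages: each ingredient is a share of flour weight,
    # so a half-point change is easy to reason about.
    # The hydration, salt and yeast numbers are illustrative only.
    FLOUR_GRAMS = 1000

    def dough_weights(hydration_pct, salt_pct=2.0, yeast_pct=0.4):
        """Return ingredient weights in grams for a given hydration."""
        return {
            "flour": FLOUR_GRAMS,
            "water": FLOUR_GRAMS * hydration_pct / 100,
            "salt":  FLOUR_GRAMS * salt_pct / 100,
            "yeast": FLOUR_GRAMS * yeast_pct / 100,
        }

    # Dropping hydration by half a percent, say on a humid day,
    # removes just 5 grams of water per kilogram of flour.
    dry_day = dough_weights(hydration_pct=62.0)
    humid_day = dough_weights(hydration_pct=61.5)
    print(dry_day["water"] - humid_day["water"])  # 5.0

In other words, a half-percent adjustment amounts to about a spoonful of water per kilogram of flour, a difference only careful measurement would catch.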

Satchel’s Pizza, on the other hand, is filled with works made by its artist owner, including mosaic windows, paintings, sculptures and fountains.

Gainesville’s independent pizzerias often serve as what sociologists call “third places”: spaces for gathering that aren’t home or work. And their owners think carefully about how to create a welcoming environment. For example, the owner of Scuola Pizza insisted the restaurant be free of TVs so diners can focus on their food. Squarehouse Pizza features a large outdoor space, an old school bus repurposed with tables and chairs for dining, and a stage for live music.

Squarehouse also is known for its unusual toppings on square, Detroit-style pies – for example, the Mariah Curry, topped with curry chicken or cauliflower and coconut curry sauce. It refreshes its specialty menus every semester or two.

While the American pizza landscape may be shaped by big brands and standardized menus, small pizzerias continue to shine. Gainesville is a perfect example of how a local pizza scene in a small Southern college town can remain one of a kind, even in a globalized industry. Small pizzerias don’t just offer food – they offer a flavorful reminder that the marketplace rewards distinctiveness and local character, too.

The Conversation

The authors do not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.

ref. 3 basic ingredients, a million possibilities: How small pizzerias succeed with uniqueness in an age of chain restaurants – https://theconversation.com/3-basic-ingredients-a-million-possibilities-how-small-pizzerias-succeed-with-uniqueness-in-an-age-of-chain-restaurants-259661

The aftermath of floods, hurricanes and other disasters can be hardest on older rural Americans – here’s how families and neighbors can help

Source: The Conversation – USA (3) – By Lori Hunter, Professor of Sociology, Director of the Institute of Behavioral Science, University of Colorado Boulder

Edith Schaecher, center, and her daughter and granddaughter look at a photo album recovered from her tornado-damaged home in Greenfield, Iowa, in May 2024. AP Photo/Charlie Neibergall

Hurricanes, tornadoes and other extreme weather do not distinguish between urban and rural boundaries. But when a disaster strikes, there are big differences in how well people are able to respond and recover – and older adults in rural areas are especially vulnerable.

If a disaster causes injuries, getting health care can take longer in rural areas. Many rural hospitals have closed, leaving patients traveling longer distances for care.

At the same time, rural areas have higher percentages of older adults, a group that is more likely to have chronic health problems that make experiencing natural disasters especially dangerous. Medical treatments, such as dialysis, can be disrupted when power goes out or clinics are damaged, and injuries are more likely around property damaged by flooding or powerful winds.

As a sociologist who studies rural issues and directs the Institute of Behavioral Science at the University of Colorado Boulder, I believe that understanding the risks is essential for ensuring healthier lives for older adults. I see many different ways rural communities are helping reduce their vulnerability in disasters.

Disasters disrupt health care, especially in isolated rural regions

According to the U.S. Census Bureau, about 20% of the country’s rural population is age 65 and over, compared with only 16% of urban residents. That’s about 10 million older adults living in rural areas.

There are three primary reasons rural America has been aging faster than the rest of the country: Young people have been leaving for college and job opportunities, meaning fewer residents are starting new families. Many older rural residents are choosing to “age in place” where they have strong social ties. And some rural areas are gaining older adults who choose to retire there.

An aging population means rural areas tend to have a larger percentage of residents with chronic disease, such as dementia, heart disease, respiratory illness and diabetes.

According to research from the National Council on Aging, nearly 95% of adults age 60 and older have at least one chronic condition, while more than 78% have two or more. Rural areas also have higher rates of death from chronic diseases, particularly heart disease.

At the same time, health care access in rural areas is rapidly declining.

Nearly 200 rural hospitals have closed or stopped providing in-patient care since 2005. Over 700 more — one-third of the nation’s remaining rural hospitals — were considered to be at risk of closing even before the cuts to Medicaid that the president signed in July 2025.

Hospital closures have left rural residents traveling about 20 miles farther for common in-patient health care services than they did two decades ago, and even farther for specialist care.

Those miles might seem trivial, but in emergencies when roads are damaged or flooded, they can mean losing access to care and treatment.

After Hurricane Katrina struck New Orleans in 2005, 44% of patients on dialysis missed at least one treatment session, and almost 17% missed three or more.

When Hurricanes Matthew and Florence hit rural Robeson County, North Carolina, in 2016 and 2018, some patients who relied on insulin to manage their blood sugar levels went without insulin for weeks. The county had high rates of poverty and poor health already, and the healthy foods people needed to manage the disease were also hard to find after the storm.

Insulin is important for treating diabetes – a chronic disease estimated to affect nearly one-third of adults age 65 and older. But a sufficient supply can be harder to maintain when a disaster knocks out power, because insulin should be kept cool, and medical facilities and drugstores may be harder for patients to reach.

Rural residents also often live farther from community centers, schools or other facilities that can serve as cooling centers during heat waves or evacuation centers in times of crisis.

Alzheimer’s disease can make evacuation difficult

Cognitive decline also affects older adults’ ability to manage disasters.

Over 11% of Americans age 65 and older – more than 7 million people – have Alzheimer’s disease or a related dementia, and prevalence is higher among older adults in rural areas than among those in urban areas.

Caregivers for family members living with dementia may struggle to find time to prepare for disasters. And when disaster strikes, they face unique challenges. Disasters disrupt routines, which can cause agitation for people with Alzheimer’s, and patients may resist evacuation.

Living through a disaster can also worsen brain health over the long run. Older adults who lived through the 2011 Great East Japan earthquake and tsunami were found to have greater cognitive decline over the following decade, especially those who lost their homes or jobs, or whose health care routines were disrupted.

Social safety nets are essential

One asset many rural communities do have is a strong social fabric. Those social connections can help reduce older adults’ vulnerability when disasters strike.

Following severe flooding in Colorado in 2013, social connections helped older adults navigate the maze of paperwork required for disaster aid, and some even provided personal loans.

Community support through churches and other groups, like this church in rural Argyle, Wis., whose building was hit by a tornado in 2024, can help older adults recover from disasters. Ross Harried/NurPhoto via Getty Images

Friends, family and neighbors in rural areas often check in on seniors, particularly those living alone. They can help older residents develop disaster response plans that ensure access to medications and medical treatment and include a plan for evacuation.

Rural communities and local groups can also help build up older adults’ mental and physical health before and after storms by developing educational, social and exercise programs. Better health and social connections can improve resilience, including older adults’ ability to respond to alerts and recover after disasters.

Ensuring that everyone in the community has that kind of support is important in rural areas and cities alike as storm and flood risks worsen, particularly for older adults.

The Conversation

Lori Hunter receives funding from the National Institutes of Health and the National Science Foundation.

ref. The aftermath of floods, hurricanes and other disasters can be hardest on older rural Americans – here’s how families and neighbors can help – https://theconversation.com/the-aftermath-of-floods-hurricanes-and-other-disasters-can-be-hardest-on-older-rural-americans-heres-how-families-and-neighbors-can-help-247691