Your data privacy is slipping away – here’s why, and what you can do about it

Source: The Conversation – USA – By Mike Chapple, Teaching Professor of IT, Analytics, and Operations, University of Notre Dame

Cybersecurity and data privacy are constantly in the news. Governments are passing new cybersecurity laws. Companies are investing in cybersecurity controls such as firewalls, encryption and awareness training at record levels.

And yet, people are losing ground on data privacy.

In 2024, the Identity Theft Resource Center reported that companies sent out 1.3 billion notifications to the victims of data breaches. That’s more than triple the notices sent out the year before. It’s clear that despite growing efforts, personal data breaches are not only continuing, but accelerating.

What can you do about this situation? Many people think of the cybersecurity issue as a technical problem. They’re right: Technical controls are an important part of protecting personal information, but they are not enough.

As a professor of information technology, analytics and operations at the University of Notre Dame, I study ways to protect personal privacy.

Solid personal privacy protection is made up of three pillars: accessible technical controls, public awareness of the need for privacy, and public policies that prioritize personal privacy. Each plays a crucial role in protecting personal privacy. A weakness in any one puts the entire system at risk.

The first line of defense

Technology is the first line of defense, guarding access to computers that store data and encrypting information as it travels between computers to keep intruders from gaining access. But even the best security tools can fail when misused, misconfigured or ignored.

Two technical controls are especially important: encryption and multifactor authentication. These are the backbone of digital privacy – and they work best when widely adopted and properly implemented.






Encryption uses complex math to put sensitive data in an unreadable format that can only be unlocked with the right key. For example, your web browser uses HTTPS encryption to protect your information when you visit a secure webpage. This prevents anyone on your network – or any network between you and the website – from eavesdropping on your communications. Today, nearly all web traffic is encrypted in this way.

But if we’re so good at encrypting data on networks, why are we still suffering all of these data breaches? The reality is that encrypting data in transit is only part of the challenge.

Securing stored data

We also need to protect data wherever it’s stored – on phones, laptops and the servers that make up cloud storage. Unfortunately, this is where security often falls short. Encrypting stored data, or data at rest, isn’t as widespread as encrypting data that is moving from one place to another.

While modern smartphones typically encrypt files by default, the same can’t be said for cloud storage or company databases. Only 10% of organizations report that at least 80% of the information they have stored in the cloud is encrypted, according to a 2024 industry survey. This leaves a huge amount of unencrypted personal information potentially exposed if attackers manage to break in. Without encryption, breaking into a database is like opening an unlocked filing cabinet – everything inside is accessible to the attacker.
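As an illustration of what encrypting data at rest involves, here is a minimal sketch in Python using the third-party `cryptography` package. The package choice, the key handling and the sample record are assumptions for illustration only, not a recommendation of any particular product:

```python
# Minimal sketch: encrypt a record before it is stored, so a stolen
# database file is unreadable without the key. Assumes the third-party
# `cryptography` package is installed; the sample record is hypothetical.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in practice, keep this in a key-management service
cipher = Fernet(key)

record = b"name=Jane Doe; ssn=000-00-0000"   # hypothetical sensitive record
token = cipher.encrypt(record)               # ciphertext, safe to store "at rest"

# Only a holder of the key can turn the ciphertext back into the record.
assert cipher.decrypt(token) == record
```

Without the key, the stored `token` is just opaque bytes – the digital equivalent of a locked filing cabinet.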

Multifactor authentication is a security measure that requires you to provide more than one form of verification before accessing sensitive information. This type of authentication is harder to crack than a password alone because it requires a combination of different types of information. It often combines something you know, such as a password, with something you have, such as a smartphone app that generates a verification code, or something you are, such as a fingerprint. Proper use of multifactor authentication reduces the risk of compromise by 99.22%.
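The "something you have" factor is often a time-based one-time password, or TOTP: the six-digit code a smartphone authenticator app regenerates every 30 seconds. A minimal sketch of the standard algorithm (RFC 6238), using only Python's standard library:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, t=None, digits=6, step=30):
    """Time-based one-time password (RFC 6238) from a base32 shared secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if t is None else t) // step)  # 30-second time window
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                                # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# The RFC 6238 test secret, evaluated at its published test time of 59 seconds:
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", t=59))  # → 287082
```

Because the server and the phone derive the code independently from a shared secret and the clock, the code never travels over the network until the user types it in, and it expires within seconds.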

While 83% of organizations require that their employees use multifactor authentication, according to another industry survey, this still leaves millions of accounts protected by nothing more than a password. As attackers grow more sophisticated and credential theft remains rampant, closing that 17% gap isn’t just a best practice – it’s a necessity.

Multifactor authentication is one of the simplest, most effective steps organizations can take to prevent data breaches, but it remains underused. Expanding its adoption could dramatically reduce the number of successful attacks each year.

Awareness gives people the knowledge they need

Even the best technology falls short when people make mistakes. Human error played a role in 68% of 2024 data breaches, according to a Verizon report. Organizations can mitigate this risk through employee training, data minimization – meaning collecting only the information necessary for a task, then deleting it when it’s no longer needed – and strict access controls.

Policies, audits and incident response plans can help organizations prepare for a possible data breach so they can stem the damage, determine who is responsible and learn from the experience. It's also important to guard against insider threats and physical intrusion with safeguards such as locking down server rooms.

Public policy holds organizations accountable

Legal protections help hold organizations accountable in keeping data protected and giving people control over their data. The European Union’s General Data Protection Regulation is one of the most comprehensive privacy laws in the world. It mandates strong data protection practices and gives people the right to access, correct and delete their personal data. And the General Data Protection Regulation has teeth: In 2023, Meta was fined €1.2 billion (US$1.4 billion) when Facebook was found in violation.

Despite years of discussion, the U.S. still has no comprehensive federal privacy law. Several proposals have been introduced in Congress, but none have made it across the finish line. In its place, a mix of state regulations and industry-specific rules – such as the Health Insurance Portability and Accountability Act for health data and the Gramm-Leach-Bliley Act for financial institutions – fill the gaps.

Some states have passed their own privacy laws, but this patchwork leaves Americans with uneven protections and creates compliance headaches for businesses operating across jurisdictions.

The tools, policies and knowledge to protect personal data exist – but people’s and institutions’ use of them still falls short. Stronger encryption, more widespread use of multifactor authentication, better training and clearer legal standards could prevent many breaches. It’s clear that these tools work. What’s needed now is the collective will – and a unified federal mandate – to put those protections in place.


This article is part of a series on data privacy that explores who collects your data, what and how they collect, who sells and buys your data, what they all do with it, and what you can do about it.

The Conversation

Mike Chapple does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Your data privacy is slipping away – here’s why, and what you can do about it – https://theconversation.com/your-data-privacy-is-slipping-away-heres-why-and-what-you-can-do-about-it-251768

Scientific norms shape the behavior of researchers working for the greater good

Source: The Conversation – USA – By Jeffrey A. Lee, Professor of Geography and the Environment, Texas Tech University

Mentors model the ethical pursuit of scientific knowledge. sanjeri/E+ via Getty Images

Over the past 400 years or so, a set of mostly unwritten guidelines has evolved for how science should be properly done. The assumption in the research community is that science advances most effectively when scientists conduct themselves in certain ways.

The first person to write down these attitudes and behaviors was Robert Merton, in 1942. The founder of the sociology of science laid out what he called the “ethos of science,” a set of “values and norms which is held to be binding on the man of science.” (Yes, it’s sexist wording. Yes, it was the 1940s.) These now are referred to as scientific norms.

The point of these norms is that scientists should behave in ways that improve the collective advancement of knowledge. If you’re a cynic, you might be rolling your eyes at such a Pollyannaish ideal. But corny expectations keep the world functioning. Think: Be kind, clean up your mess, return the shopping cart to the cart corral.

I’m a physical geographer who realized long ago that students are taught biology in biology classes and chemistry in chemistry classes, but rarely are they taught about the overarching concepts of science itself. So I wrote a book called “The Scientific Endeavor,” laying out what scientists and other educated people should know about science itself.

Scientists in training are expected to learn the big picture of science after years of observing their mentors, but that doesn’t always happen. And understanding what drives scientists can help nonscientists better understand research findings. These scientific norms are a big part of the scientific endeavor. Here are Merton’s original four, along with a couple I think are worth adding to the list:

Universalism

Scientific knowledge is for everyone – it’s universal – and not the domain of an individual or group. In other words, a scientific claim must be judged on its merits, not the person making it. Characteristics like a scientist’s nationality, gender or favorite sports team should not affect how their work is judged.

Also, the past record of a scientist shouldn’t influence how you judge whatever claim they’re currently making. For instance, Nobel Prize-winning chemist Linus Pauling was not able to convince most scientists that large doses of vitamin C are medically beneficial; his evidence didn’t sufficiently support his claim.

In practice, it’s hard to judge contradictory claims fairly when they come from a “big name” in the field versus an unknown researcher without a reputation. It is, however, easy to point out such breaches of universalism when others let scientific fame sway their opinion one way or another about new work.

When asked about patenting his polio vaccine, Jonas Salk replied, ‘There is no patent. Could you patent the sun?’
Bettmann via Getty Images

Communism

Communism in science is the idea that scientific knowledge is the property of everyone and must be shared.

Jonas Salk, who led the research that resulted in the polio vaccine, provides a classic example of this scientific norm. He published the work and did not patent the vaccine so that it could be freely produced at low cost.

When scientific research doesn’t have direct commercial application, communism is easy to practice. When money is involved, however, things get complicated. Many scientists work for corporations, and they might not publish their findings in order to keep them away from competitors. The same goes for military research and cybersecurity, where publishing findings could help the bad guys.

Disinterestedness

Disinterestedness refers to the expectation that scientists pursue their work mainly for the advancement of knowledge, not to advance an agenda or get rich. The expectation is that a researcher will share the results of their work, regardless of a finding’s implications for their career or economic bottom line.

Research on politically hot topics, like vaccine safety, is where it can be tricky to remain disinterested. Imagine a scientist who is strongly pro-vaccine. If their vaccine research results suggest serious danger to children, the scientist is still obligated to share these findings.

Likewise, if a scientist has invested in a company selling a drug, and the scientist’s research shows that the drug is dangerous, they are morally compelled to publish the work even if that would hurt their income.

In addition, when publishing research, scientists are required to disclose any conflicts of interest related to the work. This step informs others that they may want to be more skeptical in evaluating the work, in case self-interest won out over disinterest.

Disinterestedness also applies to journal editors, who are obligated to decide whether to publish research based on the science, not the political or economic implications.

Organized skepticism

Merton’s last norm is organized skepticism. Skepticism does not mean rejecting ideas because you don’t like them. To be skeptical in science is to be highly critical and look for weaknesses in a piece of research.

By the time new research is published in a reputable journal, it has made it past several sets of skeptical eyes.
gorsh13/iStock via Getty Images Plus

This concept is formalized in the peer review process. When a scientist submits an article to a journal, the editor sends it to two or three scientists familiar with the topic and methods used. They read it carefully and point out any problems they find.

The editor then uses the reviewer reports to decide whether to accept as is, reject outright or request revisions. If the decision is revise, the author then makes each change or tries to convince the editor that the reviewer is wrong.

Peer review is not perfect and doesn’t always catch bad research, but in most cases it improves the work, and science benefits. Traditionally, results weren’t made public until after peer review, but that practice has weakened in recent years with the rise of preprints, reducing the reliability of information for nonscientists.

Integrity and humility

I’m adding two norms to Merton’s list.

The first is integrity. It's so fundamental to good science that it almost seems unnecessary to mention. But I think it's justified, since cheating, stealing and laziness among scientists are getting plenty of attention these days.

The second is humility. You may have made a contribution to our understanding of cell division, but don’t tell us that you cured cancer. You may be a leader in quantum mechanics research, but that doesn’t make you an authority on climate change.

Scientific norms are guidelines for how scientists are expected to behave. A researcher who violates one of these norms won't be carted off to jail or forced to pay an exorbitant fine. But when a norm is not followed, scientists must be prepared to justify their reasons, both to themselves and to others.

The Conversation

Jeffrey A. Lee does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Scientific norms shape the behavior of researchers working for the greater good – https://theconversation.com/scientific-norms-shape-the-behavior-of-researchers-working-for-the-greater-good-255159

President Trump’s tug-of-war with the courts, explained

Source: The Conversation – USA – By Paul M. Collins Jr., Professor of Legal Studies and Political Science, UMass Amherst

The U.S. Supreme Court in Washington, D.C. Stefani Reynolds/Bloomberg

The Supreme Court handed President Donald Trump a big win on June 27, 2025, by limiting the ability of judges to block Trump administration policies across the nation.

But Trump has not fared nearly as well in the lower courts, where he has lost a series of cases through different levels of the federal court system. On June 5, a single judge temporarily stopped the administration from preventing Harvard University from enrolling international students.

And a three-judge panel of the U.S. Court of International Trade blocked Trump on May 28 from imposing tariffs on China and other nations. The Trump administration has appealed this decision. It will be taken up in July by all 11 judges on the United States Court of Appeals for the Federal Circuit.

After that, the case can be appealed to the Supreme Court.

I’m a scholar of the federal courts. The reasons why some courts have multiple judges and others have a single judge can be confusing. Here’s a guide to help understand what’s going on in the federal courts.

Federal District Courts

The U.S. District Courts are the trial courts in the federal system and hear about 400,000 cases per year. A single judge almost always presides over cases.

This makes sense for a jury trial, since a judge might make dozens of spur-of-the-moment decisions during the course of a trial, such as ruling on a lawyer’s objection to a question asked of a witness. If a panel of, say, three judges performed this task, it would prolong proceedings because the three judges would have to deliberate over every ruling.

A more controversial role of District Courts involves setting nationwide injunctions. This happens when a single judge temporarily stops the government from enforcing a policy throughout the nation.

There have been more than two dozen nationwide injunctions during Trump’s second term. These involve policy areas as diverse as ending birthright citizenship, firing federal employees and banning transgender people from serving in the military.

President Donald Trump speaks at the White House on June 27, 2025, after the Supreme Court curbed the power of lone federal judges to block executive actions.
Andrew Caballero-Reynolds/AFP via Getty Images

Trump and Republicans in Congress argue that the ability to issue nationwide injunctions gives too much power to a single judge. Instead, they believe injunctions should apply only to the parties involved in the case.

On June 27, the Supreme Court agreed with the Trump administration and severely limited the ability of District Court judges to issue nationwide injunctions. This means that judges can generally stop policies from being enforced only against the parties to a lawsuit, instead of everyone in the nation.

In rare instances, a panel of three District Court judges hears a case. Congress decides what cases these special three-judge panels hear, reserving them for especially important issues. For example, these panels have heard cases involving reapportionment, which is how votes are translated into legislative seats in Congress and state legislatures, and allegations that a voter’s rights have been violated.

The logic behind having three judges hear such important cases is that they will give more careful consideration to the dispute. This may lend legitimacy to a controversial decision and prevents a single judge from exercising too much power.

There are also specialized courts that hear cases involving particular policies, sometimes in panels of three judges. For instance, three-judge panels on the U.S. Court of International Trade decide cases involving executive orders related to international trade.

The federal Court of Appeals

The U.S. Court of Appeals hears appeals from the District Courts and specialized courts.

The 13 federal circuit courts that make up the U.S. Court of Appeals are arranged throughout the country and handle about 40,000 cases per year. Each circuit court has six to 29 judges. Cases are decided primarily by three-judge panels.

Having multiple judges decide cases on the Court of Appeals is seen as worthwhile, since these courts are policymaking institutions. This means they set precedents for the judicial circuit in which they operate, which covers three to nine states.

Supporters of this system argue that by having multiple judges on appellate courts, the panel will consider a variety of perspectives on the case and collaborate with one another. This can lead to better decision-making. Additionally, having multiple judges check one another can boost public confidence in the judiciary.

The party that loses a case before a three-judge panel can request that the entire circuit rehear the case. This is known as sitting en banc.

Because judges on a circuit can decline to hear cases en banc, this procedure is usually reserved for especially significant cases. For instance, the U.S. Court of Appeals for the Federal Circuit has agreed to an en banc hearing to review the Court of International Trade’s decision to temporarily halt Trump’s sweeping tariff program. It also allowed the tariffs to remain in effect until the appeal plays out, likely in August.

The exception to having the entire circuit sit together en banc is the 9th Circuit, based in San Francisco, which has 29 judges, far more than other circuit courts. It uses an 11-judge en banc process, since having 29 judges hear cases together would be logistically challenging.

Cargo ships are seen at a container terminal in the Port of Shanghai, China, in May 2025. A three-judge panel of the U.S. Court of International Trade blocked Trump from imposing tariffs on China and other nations.
CFOTO/Future Publishing via Getty Images

The US Supreme Court

The U.S. Supreme Court sits atop the American legal system and decides about 60 cases per year.

Cases are decided by all nine justices, unless a justice declines to participate because of a conflict of interest. As with other multimember courts, advocates of the nine-member makeup argue that the quality of decision-making is improved by having many justices participate in a case’s deliberation.

Each Supreme Court justice is charged with overseeing one or more of the 13 federal circuits. In this role, a single justice reviews emergency appeals from the District Courts and an appellate court within a circuit. This authorizes them to put a temporary hold on the implementation of policies within that circuit or refer the matter to the entire Supreme Court.

In February, for example, Chief Justice John Roberts blocked a Court of Appeals order that would have compelled the Trump administration to pay nearly US$2 billion in reimbursements for already completed foreign aid work.

In March, a 5-4 majority of the high court sent the case back to U.S. District Judge Amir Ali, who subsequently ordered the Trump administration to release some of the funds.

The federal judicial system is complex. The flurry of executive orders from the Trump administration means that cases are being decided on a nearly daily basis by a variety of courts.

A single judge will decide some of these cases, and others are considered by full courts. Though the nine justices of the Supreme Court technically have the final say, the sheer volume of legal challenges means that America’s District Courts and Court of Appeals will resolve many of the disputes.

The Conversation

Paul M. Collins Jr. does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. President Trump’s tug-of-war with the courts, explained – https://theconversation.com/president-trumps-tug-of-war-with-the-courts-explained-258234

Higher ed’s relationship with marriage? It’s complicated – and depends on age

Source: The Conversation – USA (2) – By John V. Winters, Professor of Economics, Iowa State University

Education rates are rising; marriage rates are falling. But the relationship between those two trends isn’t straightforward. Ugur Karakoc/E+ via Getty Images

The longer someone stays in school, the more likely they are to delay getting married – but education does not reduce the overall likelihood of being married later in life, according to our research recently published in Education Economics. Education also influences who Americans marry: Obtaining a four-year degree vs. just a high school diploma more than doubles someone’s likelihood of marrying a fellow college graduate.

Previous research has documented that the more education you have, the more likely you are to get married. But correlation does not imply causality, and plenty of other factors influence marriage and education.

My research with economist Kunwon Ahn provides evidence that there is indeed a causal link between education and marriage – but it’s a nuanced one.

Our study applies economic theory and advanced statistics to a 2006-2019 sample from the American Community Survey: more than 8 million people, whom we divided into different cohorts based on birthplace, birth year and self-reported ancestry.

To isolate the causal relationship, we needed to sidestep other factors that can influence someone’s decisions about marriage and education. Therefore, we did not calculate based on individuals’ own education level. Instead, we estimated their educational attainment using a proxy: their mothers’ level of education. On the individual level, plenty of people finish more or less education than their parents. Within a cohort, however, the amount of schooling that mothers have, on average, is a strong predictor of how much education children in that cohort received.

We found that an additional year of schooling – counting from first grade to the end of any postgraduate degrees – reduces the likelihood that someone age 25 to 34 is married by roughly 4 percentage points.

Among older age groups, the effects of education were more mixed. On average, the level of education has almost zero impact on the probability that someone age 45 to 54 is married. Among people who were married by that age, being more educated reduces their likelihood of being divorced or separated.

However, more education also makes people slightly more likely to have never been married by that age. In our sample, about 12% of people in that age group have never married. An additional year of education increases that, on average, by 2.6 percentage points.

Why it matters

Marriage rates are at historical lows in the United States, especially for young people. Before 1970, more than 80% of Americans 25 to 34 were married. By 2023, that number had fallen to only 38%, according to data from the U.S. Census Bureau.

Over the same time, the percentage of Americans with a college degree has increased considerably. Additional education can increase someone’s earning potential and make them a more attractive partner.

Yet the rising costs of higher education may make marriage less attainable. A 2016 study found that the more college debt someone had, the less likely they were to ever marry.

While marriage rates have fallen across the board, the drop is most pronounced for lower-income groups, and not all of the gap is driven by education. One of the other causes may be declining job prospects for lower-income men. Over recent decades, as their earning potential has dwindled and women’s job options have grown, it appears some of the economic benefits of marriage have declined.

Declining marriage rates have important effects on individuals, families and society as a whole. Many people value the institution for its own sake, and others assign it importance based on religious, cultural and social values. Economically, marriage has important consequences for children, including how many children people have and the resources that they can invest in those children.

What still isn’t known

Education levels are only part of the explanation for trends in marriage rates. Other cultural, social, economic and technological factors are likely involved in the overall decline, but their exact contribution is still unknown.

One idea gaining traction, though little research has been done on it, considers the ways smartphones and social media may be reducing psychological and social well-being. We stay in more, go out less, and are increasingly divided – all of which could make people less likely to marry.

The Research Brief is a short take on interesting academic work.

The Conversation

John V. Winters does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Higher ed’s relationship with marriage? It’s complicated – and depends on age – https://theconversation.com/higher-eds-relationship-with-marriage-its-complicated-and-depends-on-age-258664

How slashing university research grants impacts Colorado’s economy and national innovation – a CU Boulder administrator explains

Source: The Conversation – USA (2) – By Massimo Ruzzene, Vice Chancellor of Research and Innovation, University of Colorado Boulder

Federal funding cuts to the University of Colorado Boulder have already impacted research and could cause even more harm. Glenn J. Asakawa/University of Colorado

The Trump administration has been freezing or reducing federal grants to universities across the country.

Over the past several months, universities have lost more than US$11 billion in funding, according to NPR. More than two dozen universities, including the University of Colorado Boulder and the University of Denver, have been affected. Projects studying cancer, farming solutions and climate resiliency are just a few of the many nationally that have seen cuts.

The Conversation asked Massimo Ruzzene, senior vice chancellor for research and innovation at the University of Colorado Boulder, to explain how these cuts and freezes are impacting the university he works for and Colorado’s local economy.

How important are federal funds to CU Boulder?

Federal funding pays for approximately 70% of CU Boulder’s research each year. That’s about $495 million in the 2023-2024 fiscal year.

The other 30% of research funding comes from a variety of sources. The second-largest is international partnerships at $127 million. Last year, CU Boulder also received $27 million in philanthropic gifts to support research and approximately $29 million from collaborations with industry.

CU Boulder uses this money to fund research that advances fields like artificial intelligence, space exploration and planetary sciences, quantum technologies, biosciences and climate and energy.

At CU Boulder, federal funding also supports research projects like the Dust Accelerator Laboratory that helps us understand the composition and structure of cosmic dust. This research allows scientists to reconstruct the processes that formed planets, moons and organic molecules.

How much federal funding has CU Boulder lost?

So far in 2025, CU Boulder has received 56 grant cancellations or stop-work orders. Those amount to approximately $30 million in lost funding. This number is not inclusive of awards that are on hold and awaiting action by the sponsor.

This number also does not include funds that have been inaccessible due to the considerable lag in funding from agencies such as the National Science Foundation and the National Institutes of Health.

Nationwide, National Science Foundation funding has dropped by more than 50% through the end of May of this year compared with the average of the past 10 years. The university anticipates that our funding received from these agencies will drop a similar amount, but the numbers are still being collected for this year.

What research has been impacted?

A wide variety. To take just one example, CU Boulder’s Cooperative Institute for Research in Environmental Sciences and the Institute of Arctic and Alpine Research investigate how to monitor, predict, respond to and recover from extreme weather conditions and natural disasters.

This research directly impacts the safety, well-being and prosperity of Colorado residents facing wildfires, droughts and floods.

Michael Gooseff, a researcher from the College of Engineering and Applied Science, collects weather data from the McMurdo Dry Valleys in Antarctica.
Byron Adams/University of Colorado Boulder

Past research from these groups includes recovery efforts following the 2021 Marshall Fire in the Boulder area. Researchers collaborated with local governments and watershed groups to monitor environmental impacts and develop dashboards that detailed their findings.

How might cuts affect Colorado’s aerospace economy?

Colorado has more aerospace jobs per capita than any other state. The sector employs more than 55,000 people and contributes significantly to both Colorado’s economy and the national economy.

This ecosystem encompasses research universities such as CU Boulder and Colorado-based startups like Blue Canyon Technologies and Ursa Major Technologies. It also includes established global companies like Lockheed Martin and Raytheon Technologies.

At CU Boulder, the Laboratory for Atmospheric and Space Physics is one of the world’s premier space science research institutions. Researchers at the lab design, build and operate spacecraft and other instruments that contribute critical data. That data helps us understand Earth’s atmosphere, the Sun, planetary systems and deep space phenomena. If the projects the lab supports are cut, then it’s likely the lab will be cut as well.

The Presidential Budget Request proposes up to 24% cuts to NASA’s annual budget. These include reductions of 47% for the Science Mission Directorate. The directorate supports more than a dozen space missions at CU Boulder. That cut could have an immediate impact on university programs of approximately $50 million.

Scientists test the solar arrays on NASA’s Mars Atmosphere and Volatile Evolution orbiter spacecraft at Lockheed Martin’s facility near Denver.
Photo courtesy of LASP

One of the largest space missions CU Boulder is involved in is the Mars Atmosphere and Volatile Evolution orbiter. MAVEN, as it’s known, provides telecommunications and space weather monitoring capabilities. These are necessary to support future human and robotic missions to Mars over the next decade and beyond, a stated priority for the White House. If MAVEN were to be canceled, experts estimate that it would cost almost $1 billion to restart it.

Have the cuts hit quantum research?

While the federal government has identified quantum technology as a national priority, the fiscal year 2026 budget proposal only maintains existing funding levels. It does not introduce new investments or initiatives.

I’m concerned that this stagnation, amid broader cuts to science agencies, could undermine progress in this field and undercut the training of its critical workforce. The result could be the U.S. ceding its leadership in quantum innovation to global competitors.

The Conversation

Massimo Ruzzene receives funding from the National Science Foundation.

ref. How slashing university research grants impacts Colorado’s economy and national innovation – a CU Boulder administrator explains – https://theconversation.com/how-slashing-university-research-grants-impacts-colorados-economy-and-national-innovation-a-cu-boulder-administrator-explains-257869

3 basic ingredients, a million possibilities: How small pizzerias succeed with uniqueness in an age of chain restaurants

Source: The Conversation – USA (2) – By Paula de la Cruz-Fernández, Cultural Digital Collections Manager, University of Florida

Variety is the sauce of life. Suzanne Kreiter/Boston Globe via Getty Images

At its heart, pizza is deceptively simple. Made from just a few humble ingredients – baked dough, tangy sauce, melted cheese and maybe a few toppings – it might seem like a perfect candidate for the kind of mass-produced standardization that defines many global food chains, where predictable menus reign supreme.

Yet, visit two pizzerias in different towns – or even on different blocks of the same town – and you’ll find that pizza stubbornly refuses to be homogenized.

We are researchers working on a local business history project that documents the commercial landscape of Gainesville, Florida, in the 20th and 21st centuries. As part of that project, we’ve spent a great many hours over the past two years interviewing local restaurant owners, especially those behind Gainesville’s independent pizzerias. What we’ve found reaffirms a powerful truth: Pizza resists sameness – and small pizzerias are a big reason why.

Why standardized pizza rose but didn’t conquer

While tomatoes were unknown in Italy until the mid-16th century, they have since become synonymous with Italian cuisine – especially through pizza.

Pizza arrived in the U.S. from Naples in the early 20th century, when Italian immigration was at its peak. Two of the biggest destinations for Italian immigrants were New York City and Chicago, and today each has a distinctive pizza style. A New York slice can easily be identified by its thin, soft, foldable crust, while Chicago pies are known for deep, thick, buttery crusts.

After World War II, other regions developed their own types of pizza, including the famed New Haven and Detroit styles. The New Haven style is known for being thin, crispy and charred in a coal-fired oven, while the Detroit style has a rectangular, deep-dish shape and thick, buttery crust.

By the latter half of the 20th century, pizza had become a staple of the American diet. And as its popularity grew, so did demand for consistent, affordable pizza joints. Chains such as Pizza Hut, founded in 1958, and Papa John’s, established in 1984, applied the model pioneered by McDonald’s in the late 1940s, adopting limited menus, assembly line kitchens and franchise models built for consistency and scale. New technologies such as point-of-sale systems and inventory management software made things even more efficient.

As food historian Carol Helstosky explains in “Pizza: A Global History,” the transformation involved simplifying recipes, ensuring consistent quality and developing formats optimized for rapid expansion and franchising. What began as a handcrafted, regional dish became a highly replicable product suited to global mass markets.

Today, more than 20,000 Pizza Huts operate worldwide. Papa John’s, which runs about 6,000 pizzerias, built its brand explicitly on a promise rooted in standardization. In this model, success means making pizza the same way, everywhere, every time.

So, what happened to the independent pizzerias? Did they get swallowed up by efficiency?

Not quite.

Chain restaurants don’t necessarily suffocate small competitors, recent research shows. In fact, in the case of pizza, they often coexist, sometimes even fueling creativity and opportunity. Independent pizzerias – there are more than 44,000 nationwide – lean into what makes them unique, carving out a niche. Rather than focusing only on speed or price, they compete by offering character, inventive toppings, personal service and a sense of place that chains just can’t replicate.

A local pizza scene: Creativity in a corporate age

For an example, look no farther than Gainesville. A college town with fewer than 150,000 residents, Gainesville doesn’t have the same culinary cachet as New York or Chicago, but it has developed a distinctive pizza scene. With 13 independent pizzerias serving Neapolitan, Detroit, New York and Mediterranean styles and more, hungry Gators have a plethora of options when craving a slice.

What makes Gainesville’s pizza scene especially interesting is the range of backgrounds its proprietors have. Through interviews with pizzeria owners, we found that some had started as artists and musicians, while others had worked in engineering or education – and each had their own unique approach to making pizzas.

The owner of Strega Nona’s Oven, for example, uses his engineering background to turn dough-making into a science, altering the proportions of ingredients by as little as half of a percent based on the season or even the weather.

Satchel’s Pizza, on the other hand, is filled with works made by its artist owner, including mosaic windows, paintings, sculptures and fountains.

Gainesville’s independent pizzerias often serve as what sociologists call “third places”: spaces for gathering that aren’t home or work. And their owners think carefully about how to create a welcoming environment. For example, the owner of Scuola Pizza insisted the restaurant be free of TVs so diners can focus on their food. Squarehouse Pizza features a large outdoor space, an old school bus repurposed with tables and chairs for dining, and a stage for live music.

Squarehouse also is known for its unusual toppings on square, Detroit-style pies – for example, the Mariah Curry, topped with curry chicken or cauliflower and coconut curry sauce. It refreshes its specialty menus every semester or two.

While the American pizza landscape may be shaped by big brands and standardized menus, small pizzerias continue to shine. Gainesville is a perfect example of how a local pizza scene in a small Southern college town can be so unique, even in a globalized industry. Small pizzerias don’t just offer food – they offer a flavorful reminder that the marketplace rewards distinctiveness and local character, too.

The Conversation

The authors do not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.

ref. 3 basic ingredients, a million possibilities: How small pizzerias succeed with uniqueness in an age of chain restaurants – https://theconversation.com/3-basic-ingredients-a-million-possibilities-how-small-pizzerias-succeed-with-uniqueness-in-an-age-of-chain-restaurants-259661

The aftermath of floods, hurricanes and other disasters can be hardest on older rural Americans – here’s how families and neighbors can help

Source: The Conversation – USA (3) – By Lori Hunter, Professor of Sociology, Director of the Institute of Behavioral Science, University of Colorado Boulder

Edith Schaecher, center, and her daughter and granddaughter look at a photo album recovered from her tornado-damaged home in Greenfield, Iowa, in May 2024. AP Photo/Charlie Neibergall

Hurricanes, tornadoes and other extreme weather do not distinguish between urban and rural boundaries. But when a disaster strikes, there are big differences in how well people are able to respond and recover – and older adults in rural areas are especially vulnerable.

If a disaster causes injuries, getting health care can take longer in rural areas. Many rural hospitals have closed, leaving patients traveling longer distances for care.

At the same time, rural areas have higher percentages of older adults, a group that is more likely to have chronic health problems that make experiencing natural disasters especially dangerous. Medical treatments, such as dialysis, can be disrupted when power goes out or clinics are damaged, and injuries are more likely around property damaged by flooding or powerful winds.

As a sociologist who studies rural issues and directs the Institute of Behavioral Science at the University of Colorado Boulder, I believe that understanding the risks is essential for ensuring healthier lives for older adults. I see many different ways rural communities are helping reduce their vulnerability in disasters.

Disasters disrupt health care, especially in isolated rural regions

According to the U.S. Census Bureau, about 20% of the country’s rural population is age 65 and over, compared with only 16% of urban residents. That’s about 10 million older adults living in rural areas.

There are three primary reasons rural America has been aging faster than the rest of the country: Young people have been leaving for college and job opportunities, meaning fewer residents are starting new families. Many older rural residents are choosing to “age in place” where they have strong social ties. And some rural areas are gaining older adults who choose to retire there.

An aging population means rural areas tend to have a larger percentage of residents with chronic diseases, such as dementia, heart disease, respiratory illness and diabetes.

According to research from the National Council on Aging, nearly 95% of adults age 60 and older have at least one chronic condition, while more than 78% have two or more. Rural areas also have higher rates of death from chronic diseases, particularly heart disease.

At the same time, health care access in rural areas is rapidly declining.

Nearly 200 rural hospitals have closed or stopped providing in-patient care since 2005. Over 700 more — one-third of the nation’s remaining rural hospitals — were considered to be at risk of closing even before the cuts to Medicaid that the president signed in July 2025.

Hospital closures have left rural residents traveling about 20 miles farther for common in-patient health care services than they did two decades ago, and even farther for specialist care.

Those miles might seem trivial, but in emergencies when roads are damaged or flooded, they can mean losing access to care and treatment.

After Hurricane Katrina struck New Orleans in 2005, 44% of patients on dialysis missed at least one treatment session, and almost 17% missed three or more.

When Hurricanes Matthew and Florence hit rural Robeson County, North Carolina, in 2016 and 2018, some patients who relied on insulin to manage their blood sugar levels went without insulin for weeks. The county had high rates of poverty and poor health already, and the healthy foods people needed to manage the disease were also hard to find after the storm.

Insulin is important for treating diabetes – a chronic disease estimated to affect nearly one-third of adults age 65 and older. But a sufficient supply can be harder to maintain when a disaster knocks out power, because insulin should be kept cool, and medical facilities and drugstores may be harder for patients to reach.

Rural residents also often live farther from community centers, schools or other facilities that can serve as cooling centers during heat waves or evacuation centers in times of crisis.

Alzheimer’s disease can make evacuation difficult

Cognitive decline also affects older adults’ ability to manage disasters.

Over 11% of Americans age 65 and older – more than 7 million people – have Alzheimer’s disease or related dementia, and the prevalence is higher in rural areas’ older populations compared with urban areas.

Caregivers for family members living with dementia may struggle to find time to prepare for disasters. And when disaster strikes, they face unique challenges. Disasters disrupt routines, which can cause agitation for people with Alzheimer’s, and patients may resist evacuation.

Living through a disaster can also worsen brain health over the long run. Older adults who lived through the 2011 Great East Japan earthquake and tsunami were found to have greater cognitive decline over the following decade, especially those who lost their homes or jobs, or whose health care routines were disrupted.

Social safety nets are essential

One thing that many rural communities have that helps is a strong social fabric. Those social connections can help reduce older adults’ vulnerability when disasters strike.

Following severe flooding in Colorado in 2013, social connections helped older adults navigate the maze of paperwork required for disaster aid, and some even provided personal loans.

Community support through churches, like this one whose building was hit by a tornado in rural Argyle, Wis., in 2024, and other groups can help older adults recover from disasters.
Ross Harried/NurPhoto via Getty Images

Friends, family and neighbors in rural areas often check in on seniors, particularly those living alone. They can help them develop disaster response plans to ensure older residents have access to medications and medical treatment, and that they have an evacuation plan.

Rural communities and local groups can also help build up older adults’ mental and physical health before and after storms by developing educational, social and exercise programs. Better health and social connections can improve resilience, including older adults’ ability to respond to alerts and recover after disasters.

Ensuring that everyone in the community has that kind of support is important in rural areas and cities alike as storm and flood risks worsen, particularly for older adults.

The Conversation

Lori Hunter receives funding from the National Institutes of Health and the National Science Foundation.

ref. The aftermath of floods, hurricanes and other disasters can be hardest on older rural Americans – here’s how families and neighbors can help – https://theconversation.com/the-aftermath-of-floods-hurricanes-and-other-disasters-can-be-hardest-on-older-rural-americans-heres-how-families-and-neighbors-can-help-247691

What is the ‘Seven Mountains Mandate’ and how is it linked to political extremism in the US?

Source: The Conversation – USA (3) – By Art Jipson, Associate Professor of Sociology, University of Dayton

People pray before Republican vice presidential nominee J.D. Vance at a town hall hosted by Lance Wallnau on Sept. 28, 2024, in Monroeville, Pa. AP Photo/Rebecca Droke

Vance Boelter, who allegedly shot Melissa Hortman, a Democratic Minnesota state representative, and her husband, Mark Hortman, on June 14, 2025, studied at Christ for the Nations Institute in Dallas. The institute is a Bible school linked to the New Apostolic Reformation, or NAR.

The NAR is a loosely organized but influential charismatic Christian movement that shares similarities with Pentecostalism, especially in its belief that God actively communicates with believers through the Holy Spirit. Unlike traditional Pentecostalism, however, the organization emphasizes modern-day apostles and prophets as authoritative leaders tasked with transforming society and ushering in God’s kingdom on Earth. Prayer, prophecy and worship are defined not only as acts of devotion but as strategic tools for advancing believers’ vision of government and society.

After the shooting, the Christ for the Nations Institute issued a statement “unequivocally” denouncing “any and all forms of violence and extremism.” It stated: “Our organization’s mission is to educate and equip students to spread the Gospel of Jesus Christ through compassion, love, prayer, service, worship, and value for human life.”

But the shooting has drawn attention to the school and the larger Christian movement it belongs to. One of the most important aspects of NAR teachings today is what is called the “Seven Mountains Mandate.”

The Seven Mountains Mandate calls on Christians to gain influence, or “take dominion,” over seven key areas of culture: religion, family, education, government, media, business and the arts.

With over three decades of experience studying extremism, I offer a brief overview of the history and core beliefs of the Seven Mountains Mandate.

‘Dominion of Christians’

The Seven Mountains concept was originally proposed in 1975 by evangelical leader Bill Bright, the founder of Campus Crusade for Christ. Now known as “Cru,” the Campus Crusade for Christ was founded as a global ministry in 1951 to promote Christian evangelism, especially on college campuses.

United by a shared vision to influence society through Christian values, Bright partnered with Loren Cunningham, the founder of Youth With A Mission, a major international missionary training and outreach organization, in the 1970s.

The Seven Mountains Mandate was popularized by theologian Francis Schaeffer, who linked it to a larger critique of secularism and liberal culture. Over time, it evolved.

C. Peter Wagner, a former seminary professor who helped organize and name the New Apostolic Reformation, is often regarded as the theological architect of the group. He developed it into a call for dominion. In his 2008 book “Dominion! How Kingdom Action Can Change the World,” he urged Christians to take authoritative control of cultural institutions.

For Wagner, “dominion theology” – the idea that Christians should have control over all aspects of society – was a call to spiritual warfare, so that God’s kingdom would be “manifested here on earth as it is in heaven.”

Bill Johnson.
Doctorg via Wikimedia Commons

Since 1996, Bill Johnson, a senior leader of Bethel Church, and Johnny Enlow, a self-described prophet and Seven Mountains advocate, among others, have taken the original idea of the Seven Mountains Mandate and reshaped it into a more aggressive, political and spiritually militant approach. Spiritual militancy reflects an aggressive, us-vs.-them mindset that blurs the line between faith and authoritarianism, promoting dominion over society in the name of spiritual warfare.

Their version doesn’t just aim to influence culture; it frames the effort as a spiritual battle to reclaim and reshape the nation according to their vision of God’s will.

Lance Wallnau, another Christian evangelical preacher, televangelist, speaker and author, has promoted dominion theology since the early 2000s. During the 2020 U.S. presidential election, Wallnau, along with several prominent NAR figures, described Donald Trump as anointed by God to reclaim the “mountain” of government from demonic control.

In their book “Invading Babylon: The 7 Mountains Mandate,” Wallnau and Johnson explicitly call for Christian leadership as the only antidote to perceived moral decay and spiritual darkness.

The beliefs

Sometimes referred to as Seven Mountains of Influence or Seven Mountains of Culture, the seven mountains are not neutral domains but seen as battlegrounds between divine truth and demonic deception.

Adherents believe that Christians are called to reclaim these areas through influence, leadership and even, if necessary, the use of force, and to confront demonic political forces, as religion scholar Matthew Taylor demonstrates in his book “The Violent Take It By Force.”

Diverse perspectives and interpretations surround the rhetoric and actions associated with the New Apostolic Reformation. Some analysts have pointed out how the NAR is training its followers for an active confrontation. Other commentators have said that the rhetoric calling for physical violence is anti-biblical and should be denounced.

NAR-aligned leaders have framed electoral contests as struggles between “godly” candidates and those under the sway of “satanic” influence.

Similarly, NAR prophet Cindy Jacobs has repeatedly emphasized the need for “spiritual warfare” in schools to combat what she characterizes as “demonic ideologies” such as sex education, LGBTQ+ inclusion or discussions of systemic racism.

In the NAR worldview, cultural change is not merely political or social but considered a supernatural mission; opponents are not simply wrong but possibly under the sway of demonic influence. Elections become spiritual battles.

This belief system views pluralism as weakness, compromise as betrayal, and coexistence as capitulation. Frederick Clarkson, a senior research analyst at Political Research Associates, a progressive think tank based in Somerville, Massachusetts, defines the Seven Mountains Mandate as “the theocratic idea that Christians are called by God to exercise dominion over every aspect of society by taking control of political and cultural institutions.”

Some critics argue that the call to “take back” the culture is not metaphorical but literal, and that believers are encouraged to see themselves as soldiers in a holy war to dominate society. That interpretation, however, is contested.

Many within the movement see the language of warfare as spiritually focused on prayer, evangelism and influencing hearts and minds. Still, the line between metaphor and mandate can blur, especially when rhetoric about “dominion” intersects with political and cultural action. That tension is part of an ongoing debate both within and outside the movement.

Networks that spread the beliefs

This belief system is no longer confined to the margins. It is spread widely through evangelical churches, podcasts, YouTube videos and political networks.

It’s hard to know exactly how many churches are part of the New Apostolic Reformation, but estimates suggest that about 3 million people in the U.S. attend churches that openly follow NAR leaders.

At the same time, the Seven Mountains Mandate doesn’t depend on centralized leadership or formal institutions. It spreads organically through social networks, social media – notably podcasts and livestreams – and revivalist meetings and workshops.

André Gagné, a theologian and author of “American Evangelicals for Trump: Dominion, Spiritual Warfare, and the End Times,” writes about the ways in which the mandate spreads by empowering local leaders and believers. Individuals are authorized – often through teachings on spiritual warfare, prophetic gifting, and apostolic leadership – to see themselves as agents of divine transformation in society, called to reclaim the “mountains,” such as government, media and education, for God’s kingdom.

This approach, Gagné explains, allows different communities to adapt the action mandate to their unique cultural, political and social contexts. It encourages individuals to see themselves as spiritual warriors and leaders in their domains – whether in business, education, government, media or the arts.

Small groups or even individuals can start movements or initiatives without waiting for top-down directives. The only recognized authorities are the apostles and prophets running the church or church network the believers attend.

The framing of the Seven Mountains Mandate as a divinely inspired mission, combined with the movement’s emphasis on direct spiritual experiences and a specific interpretation of scripture, can create an environment where questioning the mandate is perceived as challenging God’s authority.

Slippery slope

These beliefs have increasingly fused with nationalist rhetoric and conspiracy theories.

The ‘Appeal to Heaven’ flags symbolize the belief that people have the right to appeal directly to God’s authority when they think the government has failed.
Paul Becker/Becker1999 via Flickr, CC BY

A powerful example of NAR political rhetoric in action is the rise and influence of the “Appeal to Heaven” flags. For those in the New Apostolic Reformation, these flags symbolize the belief that when all earthly authority fails, people have the right to appeal directly to God’s authority to justify resistance.

This was evident during the Jan. 6, 2021, Capitol insurrection, when these flags were prominently displayed.

To be clear, NAR leaders are not calling for violence but rather for direct political engagement and protest. For some believers, however, the calls for “spiritual warfare” may become a slippery slope into justification for violence, as in the case of the alleged Minnesota shooter.

Understanding the Seven Mountains Mandate is essential for grasping the dynamics of contemporary efforts to align government and culture with a particular vision of Christian authority and influence.

The Conversation

Art Jipson does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. What is the ‘Seven Mountains Mandate’ and how is it linked to political extremism in the US? – https://theconversation.com/what-is-the-seven-mountains-mandate-and-how-is-it-linked-to-political-extremism-in-the-us-260034

Here’s a way to save lives, curb traffic jams and make commutes faster and easier − ban left turns at intersections

Source: The Conversation – USA – By Vikash V. Gayah, Associate Professor of Civil Engineering, Penn State

Research shows left turns at intersections are dangerous and slow traffic. Benjamin Rondel/The Image Bank via Getty Images

More than 60% of traffic collisions at intersections involve left turns. Some U.S. cities – including San Francisco, Salt Lake City and Birmingham, Alabama – are restricting left turns.

Dr. Vikash Gayah, a professor of civil engineering at Penn State University and the interim director of the Larson Transportation Institute, discusses how left turns at intersections cause accidents, make traffic worse and use more gas.

Dr. Vikash Gayah discusses why left turns should be banned at some intersections.

The Conversation has collaborated with SciLine to bring you highlights from the discussion, edited for brevity and clarity.

How dangerous are left turns at intersections?

Vikash Gayah: When you make a left turn, you have to cross oncoming traffic. When you have a green light, you need to wait for a gap in the oncoming traffic before turning left. If you misjudge when you decide to turn, you could hit the oncoming traffic, or be hit by it. That’s an angle crash, one of the most dangerous types of crashes.

Also, the driver of the left-turning vehicle is typically looking at oncoming traffic. But pedestrians may be crossing the street they’re turning on to. Often the driver doesn’t see the pedestrians, and that too can cause a serious accident.

On the other hand, right turns require merging into traffic, but they don’t cross directly in front of oncoming vehicles. So right turns are much, much safer than left turns.

What are the statistics on the unique dangers of left turns?

Gayah: Approximately 40% of all crashes occur at intersections, including about 50% of crashes that involve a serious injury and 20% of crashes that involve a fatality.

About 61% of the crashes at intersections involve a left turn. Left-hand turns are generally the least frequent movement at an intersection, so that 61% is a lot.

Why are left turns inefficient for traffic flow?

Gayah: When left-turning vehicles are waiting for the gap, they can block other lanes from moving, particularly when several vehicles are waiting to turn left.

Instead of a solid green light, many intersections use a green arrow to let left-turning vehicles move. But to do that, all other movements at the intersection have to stop. Stopping all other traffic just to serve a few left turns makes the intersection less efficient.

Also, every time you move to another “phase” of traffic – like the green arrow – the intersection has a brief period of time when all the lights are red. Traffic engineers call that an all-red time, and that’s when the intersection is not serving any vehicles. All-red time is two to three seconds per phase change, and that wasted time adds up quickly to further make the intersection less efficient.

Roundabouts reduce the need for left turns, but they don’t work everywhere.
Pete Ark/Moment via Getty Images

What restrictions have been tried in different cities?

Gayah: When a downtown is not very busy – in the off-peak periods – allowing left turns is fine because you don’t need that additional ability to move vehicles at each intersection.

Some cities are implementing signs that say no left turns at intersections from 7 to 9 a.m., the morning peak period, or 4 to 6 p.m., the afternoon peak period. In San Francisco, for example, Van Ness Avenue restricts left turns during peak periods.

But cities aren’t implementing these restrictions on a larger scale. They tend to apply along individual corridors or at isolated intersections rather than across essentially the entire downtown. Where possible, banning left turns downtown-wide would make the street network more efficient.

Roundabouts are one approach to avoiding left turns.

Gayah: Roundabouts are safe because there’s no longer a need to cross opposing traffic. Everyone circulates in the same direction. You find where you need to go and then exit.

But restricting left turns, in general, is more efficient. Roundabouts aren’t as efficient when traffic is heavier: the roundabout fills up, which can cause gridlock, where no vehicle can move. Traditional intersections are less prone to gridlock.

Roundabouts also take up more space. Installing a roundabout might mean expanding the intersection. In some downtowns, that means tearing down buildings or removing sidewalks. Restricting left turns only requires a sign that says “no left turns” or “no left turns during peak periods.” That’s it.

What are the benefits of banning left turns in urban areas?

Gayah: Any way you cut it, eliminating left turns will result in longer travel distances. I’ll have to travel a longer distance to get where I need to go. The worst case is having to circle the block: four extra block lengths of driving.

But not all trips require circling the block. In a typical downtown, each trip will be about one block length longer on average. That’s not a lot of extra distance. And that extra driving is more than offset by the fact that each intersection with banned left turns is now moving more vehicles. Which means every time you’re at an intersection, you wait less time, on average. So you travel a slightly longer distance but get to where you’re going more quickly.

Does avoiding left turns improve fuel efficiency?

Gayah: Our research found that even though vehicles travel longer distances on average with the restricted left turns, they use less fuel – about 10% to 15% less per trip – because they don’t stop as much at intersections.

This is why UPS and other fleets route their vehicles to avoid left turns. There’s less idling and fewer stops.

Do you think banning left turns could become widely accepted?

Gayah: It’s a new strategy, so it’s uncomfortable for some people. But when they get to their destination faster, I think people will latch onto it.

Watch the full interview to hear more.

SciLine is a free service based at the American Association for the Advancement of Science, a nonprofit that helps journalists include scientific evidence and experts in their news stories.

The Conversation

Vikash V. Gayah’s research has been funded by various State Departments of Transportation (including Pennsylvania, Wisconsin, Washington State, Montana, South Dakota and North Carolina), US Department of Transportation (via the Mineta National Transit Research Consortium, the Mid-Atlantic Universities Transportation Center, and the Center for Integrated Asset Management for Multimodal Transportation Infrastructure Systems), Federal Highway Administration, National Cooperative Highway Research Program, and National Science Foundation.

ref. Here’s a way to save lives, curb traffic jams and make commutes faster and easier − ban left turns at intersections – https://theconversation.com/heres-a-way-to-save-lives-curb-traffic-jams-and-make-commutes-faster-and-easier-ban-left-turns-at-intersections-257877

How can the James Webb Space Telescope see so far?

Source: The Conversation – USA – By Adi Foord, Assistant Professor of Astronomy and Astrophysics, University of Maryland, Baltimore County

This is a James Webb Space Telescope image of NGC 604, a star-forming region about 2.7 million light-years from Earth. NASA/ESA/CSA/STScI

Curious Kids is a series for children of all ages. If you have a question you’d like an expert to answer, send it to CuriousKidsUS@theconversation.com.


How does the camera on the James Webb Space Telescope work and see so far out? – Kieran G., age 12, Minnesota


Imagine a camera so powerful it can see light from galaxies that formed more than 13 billion years ago. That’s exactly what NASA’s James Webb Space Telescope is built to do.

Since it launched in December 2021, Webb has been orbiting more than a million miles from Earth, capturing breathtaking images of deep space. But how does it actually work? And how can it see so far? The secret lies in its powerful cameras – especially ones that don’t see light the way our eyes do.

I’m an astrophysicist who studies galaxies and supermassive black holes, and the Webb telescope is an incredible tool for observing some of the earliest galaxies and black holes in the universe.

When Webb takes a picture of a distant galaxy, astronomers like me are actually seeing what that galaxy looked like billions of years ago. The light from that galaxy has been traveling across space for billions of years before reaching the telescope’s mirror. It’s like having a time machine that takes snapshots of the early universe.

By using a giant mirror to collect ancient light, Webb has been discovering new secrets about the universe.

A telescope that sees heat

Unlike regular cameras or even the Hubble Space Telescope, which take images of visible light, Webb is designed to see a kind of light that’s invisible to your eyes: infrared light. Infrared light has longer wavelengths than visible light, which is why our eyes can’t detect it. But with the right instruments, Webb can capture infrared light to study some of the earliest and most distant objects in the universe.

A dog, shown normally, then through thermal imaging, with the eyes, mouth and ears brighter than the rest of the dog.
Infrared cameras, like night-vision goggles, allow you to ‘see’ the infrared waves emitted by warm objects such as humans and animals. The temperatures for the images are in degrees Fahrenheit.
NASA/JPL-Caltech

Although the human eye cannot see it, people can sense infrared light as heat, and specialized technology such as infrared cameras or thermal sensors can detect it. For example, night-vision goggles use infrared light to detect warm objects in the dark. Webb uses the same idea to study stars, galaxies and planets.

Why infrared? When visible light from faraway galaxies travels across the universe, it stretches out. This is because the universe is expanding. That stretching turns visible light into infrared light. So, the most distant galaxies in space don’t shine in visible light anymore – they glow in faint infrared. That’s the light Webb is built to detect.
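The stretching follows a simple relation from cosmology, though the article doesn’t spell it out: the observed wavelength is the emitted wavelength multiplied by one plus the redshift, written z. Here is a quick sketch with illustrative numbers (z = 10 is roughly the redshift of some of the earliest galaxies Webb studies):

```python
# Cosmological redshift: lambda_observed = (1 + z) * lambda_emitted.
# The specific values below are illustrative, not from the article.

def observed_wavelength(emitted_microns, z):
    """Wavelength after being stretched by the expansion of the universe."""
    return (1 + z) * emitted_microns

green_light = 0.55  # microns, near the middle of the visible range
stretched = observed_wavelength(green_light, z=10)
print(f"{stretched:.2f} microns")  # 6.05 microns: well into the infrared
```

So light that left an early galaxy as ordinary visible light arrives at Webb stretched far into the infrared, which is exactly the range its instruments are built to detect.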

A diagram of the electromagnetic spectrum, with radio, micro and infrared waves having a longer wavelength than visible light, while UV, X-ray and gamma rays have shorter wavelengths than visible light.
The rainbow of visible light that you can see is only a small slice of all the kinds of light. Some telescopes can detect light with a longer wavelength, such as infrared light, or light with a shorter wavelength, such as ultraviolet light. Others can detect X-rays or radio waves.
Inductiveload, NASA/Wikimedia Commons, CC BY-SA

A golden mirror to gather the faintest glow

Before the light reaches the cameras, it first has to be collected by the Webb telescope’s enormous golden mirror. This mirror is over 21 feet (6.5 meters) wide and made of 18 smaller mirror pieces that fit together like a honeycomb. It’s coated in a thin layer of real gold – not just to look fancy, but because gold reflects infrared light extremely well.

The mirror gathers light from deep space and reflects it into the telescope’s instruments. The bigger the mirror, the more light it can collect – and the farther it can see. Webb’s mirror is the largest ever launched into space.

The JWST's mirror, which looks like a large, roughly hexagonal shiny surface made up of 18 smaller hexagons put together, sitting in a facility. The mirror is reflecting the NASA meatball logo.
Webb’s 21-foot primary mirror, made of 18 hexagonal mirrors, is coated with a plating of gold.
NASA

Inside the cameras: NIRCam and MIRI

The most important “eyes” of the telescope are two science instruments that act like cameras: NIRCam and MIRI.

NIRCam stands for near-infrared camera. It’s the primary camera on Webb and takes stunning images of galaxies and stars. It also has a coronagraph – a device that blocks out starlight so it can photograph very faint objects near bright sources, such as planets orbiting bright stars.

NIRCam works by imaging near-infrared light, the type just beyond what human eyes can see, and splitting it into different wavelengths. This helps scientists learn not just what something looks like but what it’s made of. Different materials in space absorb and emit infrared light at specific wavelengths, creating a kind of unique chemical fingerprint. By studying these fingerprints, scientists can uncover the properties of distant stars and galaxies.

MIRI, or the mid-infrared instrument, detects longer infrared wavelengths, which are especially useful for spotting cooler and dustier objects, such as stars that are still forming inside clouds of gas. MIRI can even help find clues about the types of molecules in the atmospheres of planets that might support life.

Both cameras are far more sensitive than the standard cameras used on Earth. NIRCam and MIRI can detect the tiniest amounts of heat from billions of light-years away. If you had Webb’s NIRCam as your eyes, you could see the heat from a bumblebee on the Moon. That’s how sensitive it is.

Two photos of space, with lots of stars and galaxies shown as little dots. The right image shows more, brighter dots than the left.
Webb’s first deep-field image: The MIRI image is on the left and the NIRCam image is on the right.
NASA

Because Webb is trying to detect faint heat from faraway objects, it needs to keep itself as cold as possible. That’s why it carries a giant sun shield about the size of a tennis court. This five-layer sun shield blocks heat from the Sun, Earth and even the Moon, helping Webb stay incredibly cold: around -370 degrees F (-223 degrees C).

MIRI needs to be even colder. It has its own special refrigerator, called a cryocooler, to keep it chilled to nearly -447 degrees F (-266 degrees C). If Webb were even a little warm, its own heat would drown out the distant signals it’s trying to detect.

Turning space light into pictures

Once light reaches the Webb telescope’s cameras, it hits sensors called detectors. These detectors don’t capture regular photos like a phone camera. Instead, they convert the incoming infrared light into digital data. That data is then sent back to Earth, where scientists process it into full-color images.

The colors we see in Webb’s pictures aren’t what the camera “sees” directly. Because infrared light is invisible, scientists assign colors to different wavelengths to help us understand what’s in the image. These processed images help show the structure, age and composition of galaxies, stars and more.
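That color-assignment step can be sketched with a tiny example. The band data and channel mapping below are made up for illustration, and real Webb image processing is far more involved; the idea is simply that each infrared wavelength band is normalized and assigned to a red, green or blue display channel:

```python
import numpy as np

# Three hypothetical infrared brightness maps (2x2 pixels each), one per band.
# By convention, longer wavelengths are usually assigned to redder colors.
long_band   = np.array([[0.0, 2.0], [4.0, 8.0]])   # -> red channel
medium_band = np.array([[1.0, 1.0], [2.0, 4.0]])   # -> green channel
short_band  = np.array([[0.5, 0.5], [1.0, 2.0]])   # -> blue channel

def normalize(band):
    """Scale a band's brightness values into the 0-1 display range."""
    return (band - band.min()) / (band.max() - band.min())

# Stack the normalized bands into one RGB image: shape (height, width, 3).
rgb = np.dstack([normalize(long_band), normalize(medium_band), normalize(short_band)])
print(rgb.shape)  # (2, 2, 3)
```

The result is a "false-color" image: the colors are a translation chosen by scientists, not what any eye or camera would see directly.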

By using a giant mirror to collect invisible infrared light and sending it to super-cold cameras, Webb lets us see galaxies that formed just after the universe began.


Hello, curious kids! Do you have a question you’d like an expert to answer? Ask an adult to send your question to CuriousKidsUS@theconversation.com. Please tell us your name, age and the city where you live.

And since curiosity has no age limit – adults, let us know what you’re wondering, too. We won’t be able to answer every question, but we will do our best.

The Conversation

Adi Foord does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. How can the James Webb Space Telescope see so far? – https://theconversation.com/how-can-the-james-webb-space-telescope-see-so-far-257421