Exploring questions of meaning, ethics and belief through Japanese anime

Source: The Conversation – USA (2) – By Ronald S. Green, Professor and Chair of the Department of Philosophy and Religious Studies, Coastal Carolina University

A still from the Japanese anime ‘Spirited Away.’ Choo Yut Shing via Flickr, CC BY

Uncommon Courses is an occasional series from The Conversation U.S. highlighting unconventional approaches to teaching.

Title of course:

Anime and Religious Identity: Cultural Aesthetics in Japanese Spiritual Worlds

What prompted the idea for the course?

As a scholar who studies Japanese religion and has a lifelong love of visual storytelling, I started using anime in my class to spark conversations around the Buddhist ideas of karma and Shintō notions of “kami,” or spirits in nature.

When I introduced the idea of karma, a scene from “Mob Psycho 100” – a Japanese manga and anime series from 2016 to 2022 about a shy teenage boy with powerful psychic abilities – came up in discussion. It sparked a conversation about how our intentions and actions carry real moral weight. In Buddhism, karma is not just about punishment or reward in a future life. It is believed to play out in the present – shaping how we relate to others and how we grow or get stuck as people.

Later, when I explained kami in Shintō, a quiet moment from “Mushishi” helped students think differently about the world around them. “Mushishi” is a slow-paced, atmospheric anime about a wandering healer who helps people affected by mysterious spiritlike beings called mushi. These beings are not gods or monsters but part of nature itself – barely seen, yet always present. The series gave students a visual language for imagining how spiritual forces might exist in ordinary places.

The Japanese anime series ‘Mushishi.’

Over the years, two moments convinced me to create a full course. First was my students’ strong reaction to Gyōmei Himejima, the Pure Land Buddhist priest in “Demon Slayer.” He is a gentle but powerful guardian who refuses to hate the demons he must fight. His actions lead to honest and thoughtful conversations about compassion, fear and the limits of violence.

One student asked, “If Gyōmei doesn’t hate even the demons, does that mean violence can be compassionate?” Another pointed out that Gyōmei’s strength does not come from anger, but from grief and empathy. These kinds of insights showed me that anime was helping students think through complex ethical questions that would have been harder to engage through abstract theory alone.

The second moment came from watching “Dragon Ball Daima.” In this 2024 series, familiar heroes are turned into children. This reminded me of Buddhist stories about being reborn and starting over, and it prompted new questions: If someone loses all the strength they had built up over time, are they still the same person? What, if anything, remains constant about the self, and what changes?

What does the course explore?

This course helps students explore questions of meaning, ethics and belief that anime brings to life. It examines questions such as: What happens when the past resurfaces? What does it mean to carry the weight of responsibility? How should we act when our personal desires come into conflict with what we know is right? And how can suffering become a path to transformation?

What materials does the course feature?

We start with “Spirited Away,” a 2001 animated film about a young girl who becomes trapped in a spirit world after her parents are transformed into pigs. The story draws on Shintō ideas such as purification, sacred space and kami. Students learn how these religious concepts are expressed through the film’s visual design, soundscape and narrative structure.

Later in the semester, we watch “Your Name,” a 2016 film in which two teenagers mysteriously begin switching bodies across time and space. It’s a story about connection, memory and longing. The idea of “musubi,” a spiritual thread that binds people and places together, becomes central to understanding the film’s emotional impact.

“Attack on Titan,” which first aired in 2013, immerses students in a world marked by moral conflict, sacrifice and uncertainty. The series follows a group of young soldiers fighting to survive in a society under siege by giant humanoid creatures known as Titans. Students are often surprised to learn that this popular series engages with profound questions drawn from Buddhism and existential thought, such as the meaning of freedom, the tension between destiny and individual choice, and the deeper causes of human violence.

The characters in these stories face real struggles. Some are spirit mediums or time travelers. But all of them must make hard decisions about who they are and what they believe.

As the semester goes on, students develop visual or written projects such as short essays, podcasts, zines or illustrated stories. These projects help them explore the same questions as the anime, but in their own voices.

Why is this course relevant now?

Anime has become a global phenomenon. But even though millions of people watch it, many do not realize how deeply it draws on Japanese religious traditions. In this course, students learn to look closely at what anime is saying about life, morality and the choices we make.

Through these characters’ journeys, students learn that religion is not just something found in ancient texts or sacred buildings. It can also live in the stories we tell, the art we create and the questions we ask about ourselves and the world.

The Conversation

Ronald S. Green does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Exploring questions of meaning, ethics and belief through Japanese anime – https://theconversation.com/exploring-questions-of-meaning-ethics-and-belief-through-japanese-anime-260035

AI and art collide in this engineering course that puts human creativity first

Source: The Conversation – USA (2) – By Francesco Fedele, Associate Professor of Civil and Environmental Engineering, Georgia Institute of Technology

A Georgia Tech University course links art and artificial intelligence. Yuichiro Chino/Moment via Getty Images

Uncommon Courses is an occasional series from The Conversation U.S. highlighting unconventional approaches to teaching.

Title of course:

Art and Generative AI

What prompted the idea for the course?

I see many students viewing artificial intelligence as humanlike simply because it can write essays, do complex math or answer questions. AI can mimic human behavior but lacks meaningful engagement with the world. This disconnect inspired the course, which was shaped by the ideas of 20th-century German philosopher Martin Heidegger. His work highlights how we are deeply connected and present in the world. We find meaning through action, care and relationships. Human creativity and mastery come from this intuitive connection with the world. Modern AI, by contrast, simulates intelligence by processing symbols and patterns without understanding or care.

In this course, we reject the illusion that machines fully master everything and put student expression first. In doing so, we value uncertainty, mistakes and imperfection as essential to the creative process.

This vision expands beyond the classroom. In the 2025-26 academic year, the course will include a new community-based learning collaboration with Atlanta’s art communities. Local artists will co-teach with me to integrate artistic practice and AI.

The course builds on my 2018 class, Art and Geometry, which I co-taught with local artists. The course explored Picasso’s cubism, which depicted reality as fractured from multiple perspectives; it also looked at Einstein’s relativity, the idea that time and space are not absolute and distinct but part of the same fabric.

What does the course explore?

We begin with exploring the first mathematical model of a neuron, the perceptron. Then, we study the Hopfield network, which mimics how our brain can remember a song from just listening to a few notes by filling in the rest. Next, we look at Hinton’s Boltzmann Machine, a generative model that can also imagine and create new, similar songs. Finally, we study today’s deep neural networks and transformers, AI models that mimic how the brain learns to recognize images, speech or text. Transformers are especially well suited for understanding sentences and conversations, and they power technologies such as ChatGPT.
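
For readers who want to see the starting point of that lineage in code, here is a minimal sketch of a perceptron learning the logical AND function – an illustration of the general model, written in Python with NumPy, not material from the course itself.

```python
import numpy as np

# Toy perceptron -- the first mathematical model of a neuron --
# learning the logical AND function. Illustrative only.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])  # input pairs
y = np.array([0, 0, 0, 1])                      # AND: fires only for (1, 1)

w = np.zeros(2)  # synaptic weights
b = 0.0          # bias (firing threshold)
lr = 0.1         # learning rate

for epoch in range(20):
    for xi, target in zip(X, y):
        fired = int(np.dot(w, xi) + b > 0)  # threshold activation
        error = target - fired
        w += lr * error * xi                # perceptron learning rule
        b += lr * error

print([int(np.dot(w, xi) + b > 0) for xi in X])  # -> [0, 0, 0, 1]
```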

In addition to AI, we integrate artistic practice into the coursework. This approach broadens students’ perspectives on science and engineering through the lens of an artist. The first offering of the course in spring 2025 was co-taught with Mark Leibert, an artist and professor of the practice at Georgia Tech. His expertise is in art, AI and digital technologies. He taught students fundamentals of various artistic media, including charcoal drawing and oil painting. Students used these principles to create art using AI ethically and creatively. They critically examined the source of training data and ensured that their work respects authorship and originality.

Students also learn to record brain activity using electroencephalography – EEG – headsets. Through AI models, they then learn to transform neural signals into music, images and storytelling. This work inspired performances where dancers improvised in response to AI-generated music.

The Improv AI performance at Georgia Tech on April 15, 2025. Dancers improvised to music generated by AI from brain waves and sonified black hole data.

Why is this course relevant now?

AI entered our lives so rapidly that many people don’t fully grasp how it works, why it works, when it fails or what its mission is.

In creating this course, my aim was to empower students by filling that gap. Whether they are new to AI or not, the goal is to make its inner algorithms clear, approachable and honest. We focus on what these tools actually do and how they can go wrong.

We place students and their creativity first. We reject the illusion of a perfect machine, and we deliberately provoke the AI algorithm to falter and hallucinate – that is, to generate inaccurate or nonsensical responses. To do so, we use a small dataset, reduce the model size or limit training. It’s in these flawed states of AI that students step in as conscious co-creators. The students are the missing algorithm, taking back control of the creative process. Their creations do not obey the AI but reimagine its output by the human hand. The artwork is rescued from automation.
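
To suggest how starving a model of data produces these flawed states, here is a toy sketch in Python: a tiny bigram word model trained on a deliberately small corpus, whose samples quickly drift into fluent-looking nonsense. The corpus and setup are invented for illustration and are not the course’s actual exercise.

```python
import random
from collections import defaultdict

# A deliberately tiny training corpus -- far too little data for the
# model to capture real structure, which is exactly the point.
corpus = (
    "the dancer follows the music and the music follows the brain "
    "the brain imagines the dancer and the dancer imagines the music"
).split()

# Learn bigram transitions: which words were observed following which.
transitions = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    transitions[prev].append(nxt)

def generate(start="the", length=12):
    """Sample a word sequence; sparse data makes the output drift into nonsense."""
    words = [start]
    for _ in range(length - 1):
        followers = transitions.get(words[-1])
        if not followers:  # dead end: no observed continuation
            break
        words.append(random.choice(followers))
    return " ".join(words)

print(generate())  # e.g. "the music follows the brain the dancer imagines ..."
```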

What’s a critical lesson from the course?

Students learn to recognize AI’s limitations and harness its failures to reclaim creative authorship. The artwork isn’t generated by AI; it’s reimagined by students.

Students learn that chatbot queries have an environmental cost because large AI models use a lot of power. They avoid unnecessary iterations when designing prompts or using AI, which helps reduce carbon emissions.

The Improv AI performance on April 15, 2025, featured dancer Bekah Crosby responding to AI-generated music from brain waves.

What will the course prepare students to do?

The course prepares students to think like artists. Through abstraction and imagination they gain the confidence to tackle the engineering challenges of the 21st century. These include protecting the environment, building resilient cities and improving health.

Students also realize that while AI has vast engineering and scientific applications, ethical implementation is crucial. Understanding the type and quality of training data that AI uses is essential. Without it, AI systems risk producing biased or flawed predictions.

The Conversation

Francesco Fedele does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. AI and art collide in this engineering course that puts human creativity first – https://theconversation.com/ai-and-art-collide-in-this-engineering-course-that-puts-human-creativity-first-256673

How Philadelphia’s current sanitation strike differs from past labor disputes in the city

Source: The Conversation – USA – By Francis Ryan, Associate Professor of Labor Studies and Employment Relations, Rutgers University

Curbside trash collection has been on pause in Philadelphia since July 1, 2025. AP Photo/Matt Slocum

As the Philadelphia municipal worker strike enters its second week, so-called “Parker piles” – large collections of garbage that some residents blame on Mayor Cherelle Parker – continue to build up in neighborhoods across the city.

The striking AFSCME District Council 33 union represents about 9,000 blue-collar workers in the city, including sanitation workers, 911 dispatchers, city mechanics and water department staff.

The Conversation U.S. asked Francis Ryan, a professor of labor studies at Rutgers University and author of “AFSCME’s Philadelphia Story: Municipal Workers and Urban Power in Philadelphia in the Twentieth Century,” about the history of sanitation strikes in Philly and what makes this one unique.

Has anything surprised you about this strike?

This strike marks the first time in the history of labor relations between the City of Philadelphia and the AFSCME District Council 33 union that social media is playing a significant role in how the struggle is unfolding.

The union is getting its side of the story out on Instagram and other social media platforms, and citizens are taking up its cause or expressing sympathy with it.

Some city residents are referring to the garbage build-up sites as ‘Parker piles.’
AP Photo/Tassanee Vejpongsa

How successful are trash strikes in Philly or other U.S. cities?

As I describe in my book, Philadelphia has a long history of sanitation strikes that goes back to March 1937. At that time, a brief work stoppage brought about discussions between the city administration and an early version of the current union.

When more than 200 city workers were laid off in September 1938, workers responded with a weeklong sanitation strike. Street battles raged in West Philadelphia when strikers blocked police-escorted trash wagons staffed by workers hired to replace them.

Philadelphia residents, many of whom were union members who worked in textile, steel, food and other industries, rallied behind the strikers. The strikers’ demands were met, and a new union, the American Federation of State, County and Municipal Employees, or AFSCME, was formally recognized by the city.

This strike was a major event because it showed how damaging a garbage strike could be. The fact that strikers were willing to fight in the streets to stop trash services showed that such events had the potential for violence, not to mention the health concerns from having tons of trash on the streets.

There was another two-week trash strike in Philadelphia in 1944, but there wouldn’t be another for more than 20 years.

However, a growing number of sanitation strikes popped up around the country in the 1960s, the most infamous being the 1968 Memphis Sanitation Strike.

Black sanitation workers peacefully march wearing placards reading ‘I Am A Man’ during the Memphis sanitation strike in 1968.
Bettmann via Getty Images

In Memphis, a majority African American sanitation workforce demanded higher wages, basic safety procedures and recognition of their union. Dr. Martin Luther King Jr. rallied to support the Memphis workers and their families as part of his Poor People’s Campaign, which sought to organize working people from across the nation into a new coalition to demand full economic and political rights.

On April 4, 1968, Dr. King was assassinated. His death put pressure on Memphis officials to settle the strike, and on April 16 the strikers secured their demands.

Following the Memphis strike, AFSCME began organizing public workers around the country, and in the years that followed, into the 1970s, there were sanitation strikes and slowdowns across the nation, including in New York City, Atlanta, Cleveland and Washington, D.C. Often, these workers, who were predominantly African American, gained the support of significant sections of the communities they served and secured modest wage boosts.

By the 1980s, such labor actions were becoming fewer. In 1986, Philadelphia witnessed a three-week sanitation strike that ended with the union gaining some of its wage demands, but losing on key areas related to health care benefits.

Workers begin removing mounds of trash after the 18-day strike in Philadelphia in July 1986.
Bettmann via Getty Images

How do wages and benefits for DC33 workers compare to other U.S. cities?

DC 33 president Greg Boulware has said that the union’s members make an average salary of $46,000 per year. According to MIT’s Living Wage Calculator, that is $2,000 less than what a single adult with no kids needs to reasonably support themselves living in Philadelphia.

Sanitation workers who collect curbside trash earn a salary of $42,500 to $46,200, or $18-$20 an hour. NBC Philadelphia reports that those wages are the lowest of the major cities it examined, with hourly pay in the other cities ranging from $21 an hour in Dallas to $25-$30 an hour in Chicago.

Unlike in earlier eras, social media now makes these personal narratives and perspectives public – like those of former sanitation worker Terrill Haigler, aka “Ya Fav Trashman” – and that visibility is shaping the way many citizens respond to these disruptions. I see a level of support for the strikers that I believe is unprecedented going back as far as 1938.

What do you think is behind this support?

The pandemic made people more aware of the role of essential workers in society. If the men and women who do these jobs can’t afford their basic needs, something isn’t right. This may explain why so many people are seeing things from the perspective of striking workers.

At the same time, money is being cut from important services at the federal, state and local levels. The proposed gutting of the city’s mass transit system by state lawmakers is a case in point. Social media allows people to make these broader connections and start conversations.

If the strike continues much longer, I think it will gain more national and international attention, and bring discussions about how workers should be treated to the forefront.

The Conversation

Francis Ryan does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. How Philadelphia’s current sanitation strike differs from past labor disputes in the city – https://theconversation.com/how-philadelphias-current-sanitation-strike-differs-from-past-labor-disputes-in-the-city-260676

Universities in every state care for congressional papers that document US political history − federal cuts put their work at risk

Source: The Conversation – USA – By Katherine Gregory, Assistant Professor, University Libraries, Mississippi State University

The papers of members of Congress are fertile ground for research into Congress’ role in shaping U.S. history. cunfek, iStock/Getty Images Plus

In 1971, the president of Mississippi State University, Dr. William L. Giles, invited President Richard Nixon to attend the dedication of U.S. Sen. John C. Stennis’ papers to the university library’s archives.

Nixon declined, but the Republican president sent a generous note in support of the veteran Democrat Stennis.

“Future students and scholars who study there will … familiarize themselves with the outstanding record of a U.S. Senator whose … judgment in complex areas of national security have been a source of strength and comfort to those who have led this Nation and to all who are concerned in preserving the freedom we cherish.”

Nixon’s prediction came true, perhaps ironically, considering the legal troubles over his own papers during the Watergate crisis. Congress passed the Presidential Records Act of 1978 after Nixon resigned.

Stennis’ gift to his alma mater prompted a wave of subsequent congressional donations to what is now the Mississippi Political Collections at Mississippi State University Libraries.

Now, 55 years later, Mississippi State University holds a body of records from a bipartisan group of officials that has positioned it to tell a major part of the state’s story in national and global politics. That story is told to over 100 patrons and dozens of college and K-12 classes each year.

The papers are fertile ground for scholarly research into Congress’ role in shaping U.S. history, with its extraordinary powers over lawmaking, the economy and one of the world’s largest militaries.

Mississippi State University, where I work as an assistant professor and director of the Mississippi Political Collections, is not alone in providing such a rich source of history. It is part of a national network of universities that hold and steward congressional papers.

But support for this stewardship is in jeopardy. With the White House’s proposed elimination of independent granting agencies such as the National Endowment for the Humanities and the Institute of Museum and Library Services, it is unclear what money will be available for this work in the future.

A 1963 letter from Sen. John Stennis to a constituent about agricultural legislation and also Russians in Cuba.
Mississippi State University

From research to public service

Mississippi State University’s building of an expansive political archive is neither unique nor a break from practices by our national peers:

• The Richard Russell Library for Political Research and Studies at the University of Georgia – named after the U.S. senator from Georgia from 1933 to 1971 – has grown since its founding in 1974 into one of America’s premier research libraries of political history, with more than 600 manuscript collections and an extensive oral history collection.

• Iowa Sen. Tom Harkin donated his papers to Drake University to form The Harkin Institute, which memorializes Harkin’s role as chief sponsor of the Americans with Disabilities Act through disability policy research and education.

• Sens. Robert and Elizabeth Dole’s papers are the bedrock of the Dole Institute of Politics at the University of Kansas.

• In 2023, retiring Sens. Richard Shelby and Patrick Leahy donated their archives – Shelby to the University of Alabama and Leahy to the University of Vermont.

By lending their papers and relative political celebrity, members of Congress have laid the groundwork for repositories like these to promote policy research that helps local and state governments shape legislation on issues central to their states.

More complete history

When the repositories are at universities, they also provide educational programming that encourages public service for the next generations.

At Mississippi State University, the John C. Stennis Institute for Government and Community Development sponsors an organization that allows students to learn about government, voting, organizing and potential careers on Capitol Hill with trips to Washington, D.C.

Depositing congressional papers in states and districts, to be cared for by professional archivists and librarians, extends the life of the records and expands their utility.

When elected officials give their papers to their constituents, they ensure the public can see and use the papers. This is a way of returning their history to them, while giving them the power to assemble a more complete, independent version of their political history. While members of Congress are not required by law to donate their papers, they passed a bipartisan concurrent resolution in 2008 encouraging the practice.

Users of congressional archives range from historians to college students, local investigative journalists, political memoirists and documentary filmmakers. In advance of the 2020 election, we contributed historical materials to CNN’s reporting on Joe Biden’s controversial relationship with the Southern bloc of segregationist senators in his early Senate years.

A copy of a letter from U.S. Rep. Carl Albert of Oklahoma, who ultimately became the 46th speaker of the U.S. House of Representatives.
Carl Albert Center Congressional and Political Collections, University of Oklahoma

Preserving the archives

While the results contribute to the humanities, the process of archival preservation and management is as complex a science as any other.

“Congressional records” is a broad term that encompasses many formats such as letters, diaries, notes, meeting minutes, speech transcripts, guestbooks and schedules.

They also include ephemera such as campaign bumper stickers, military medals and even ceremonial pieces of the original U.S. Capitol flooring. They contain rare photographs of everything from natural disaster damage to state dinners and legacy audiovisual materials such as 8 mm film, cassette tapes and vinyl records. Members of Congress also have donated their libraries of hundreds of books.

Archival preservation is a constantly evolving science. Only in the mid-20th century was the acid-free box developed to arrest the deterioration of paper records. After the advent of film-based photographs, archivists learned to keep them away from light and heat, and they observed that audiovisual materials such as 8 mm tape quickly decompose from acid decay if not stored in proper conditions.

Alongside preservation work comes the task of inventorying the records for public use. Archivists write finding aids – itemized, searchable catalogs of the records – and create metadata, which describes items in terms of size, creation date and location.
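
For illustration, a single catalog entry might look like the sketch below; the field names, box number and page count are hypothetical, not drawn from a real finding aid or an archival standard such as DACS or EAD.

```python
# Hypothetical metadata record for the 1963 Stennis letter pictured above.
record = {
    "collection": "John C. Stennis Collection",
    "item_title": "Letter to a constituent on agricultural legislation",
    "creation_date": "1963",
    "extent": "1 letter, 2 pages",    # size
    "location": "Box 42, Folder 7",   # invented box/folder for the example
    "formats": ["correspondence", "typescript"],
}
print(record["item_title"])
```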

Future congressional papers will include born-digital content such as email and social media. This means traditional archiving will give way to digital preservation and data management. Federal law mandates that digital records have alt-text and transcription, and archivists need specialized expertise in file storage and data security because congressional papers often contain case files with sensitive personal data.

With congressional materials often clocking in at hundreds or thousands of linear feet, emerging artificial intelligence and automation technologies will usher this field into a new era, with AI speeding metadata and cataloging work to deliver usable records for researchers faster than ever.

No more funding?

All of this work takes money; most of it takes staff time. Institutions meet these needs through federal grants – the very grants at risk from the Trump administration’s proposed elimination of the agencies that administer them.

For example, West Virginia University has been awarded over $400,000 since 2021 from the National Endowment for the Humanities for the American Congress Digital Archives Portal project, a website that centralizes digitized congressional records at the university and a growing list of partners such as the University of Hawaii and the University of Oklahoma.

Past federal grants have funded other congressional papers projects, from basic supply needs such as folders to more complex repair of film and tape.

The Howard Baker Center for Public Policy at the University of Tennessee used National Endowment for the Humanities funds to purchase specialized supplies needed to store the papers of its namesake, the Republican senator who also served as chief of staff to President Ronald Reagan.

National Endowment for the Humanities funds helped process U.S. Rep. Pat Williams’ papers at the University of Montana, resulting in a searchable finding aid for the 87 boxes of records documenting the Montana Democrat’s 18 years in Congress.

President Franklin D. Roosevelt said, “I have an unshaken conviction that democracy can never be undermined if we maintain our library resources and a national intelligence capable of utilizing them.”

With the current threat to federal grants – and agencies – that pay for the crucial work of stewarding these congressional papers, it appears that these records of democracy may no longer play their role in supporting that democracy.

The Conversation

Katherine Gregory received funding from the National Endowment for the Humanities and is a member of the Society of American Archivists.

ref. Universities in every state care for congressional papers that document US political history − federal cuts put their work at risk – https://theconversation.com/universities-in-every-state-care-for-congressional-papers-that-document-us-political-history-federal-cuts-put-their-work-at-risk-256053

Your data privacy is slipping away – here’s why, and what you can do about it

Source: The Conversation – USA – By Mike Chapple, Teaching Professor of IT, Analytics, and Operations, University of Notre Dame

Cybersecurity and data privacy are constantly in the news. Governments are passing new cybersecurity laws. Companies are investing in cybersecurity controls such as firewalls, encryption and awareness training at record levels.

And yet, people are losing ground on data privacy.

In 2024, the Identity Theft Resource Center reported that companies sent out 1.3 billion notifications to the victims of data breaches. That’s more than triple the notices sent out the year before. It’s clear that despite growing efforts, personal data breaches are not only continuing, but accelerating.

What can you do about this situation? Many people think of the cybersecurity issue as a technical problem. They’re right: Technical controls are an important part of protecting personal information, but they are not enough.

As a professor of information technology, analytics and operations at the University of Notre Dame, I study ways to protect personal privacy.

Solid personal privacy protection is made up of three pillars: accessible technical controls, public awareness of the need for privacy, and public policies that prioritize personal privacy. Each plays a crucial role in protecting personal privacy. A weakness in any one puts the entire system at risk.

The first line of defense

Technology is the first line of defense, guarding access to computers that store data and encrypting information as it travels between computers to keep intruders from gaining access. But even the best security tools can fail when misused, misconfigured or ignored.

Two technical controls are especially important: encryption and multifactor authentication. These are the backbone of digital privacy – and they work best when widely adopted and properly implemented.

Encryption uses complex math to put sensitive data in an unreadable format that can only be unlocked with the right key. For example, your web browser uses HTTPS encryption to protect your information when you visit a secure webpage. This prevents anyone on your network – or any network between you and the website – from eavesdropping on your communications. Today, nearly all web traffic is encrypted in this way.
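
To see this in action outside the browser, here is a minimal Python sketch using the widely used requests library; the URL is a placeholder, and the point is simply that TLS encryption and certificate verification happen by default.

```python
import requests

# Fetching a page over HTTPS: the TLS handshake, certificate check and
# encryption all happen automatically (verify=True is the default), so an
# eavesdropper on the network sees only ciphertext.
resp = requests.get("https://example.com", timeout=10)
print(resp.status_code)  # 200 if the encrypted exchange succeeded
```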

But if we’re so good at encrypting data on networks, why are we still suffering all of these data breaches? The reality is that encrypting data in transit is only part of the challenge.

Securing stored data

We also need to protect data wherever it’s stored – on phones, laptops and the servers that make up cloud storage. Unfortunately, this is where security often falls short. Encrypting stored data, or data at rest, isn’t as widespread as encrypting data that is moving from one place to another.

While modern smartphones typically encrypt files by default, the same can’t be said for cloud storage or company databases. Only 10% of organizations report that at least 80% of the information they have stored in the cloud is encrypted, according to a 2024 industry survey. This leaves a huge amount of unencrypted personal information potentially exposed if attackers manage to break in. Without encryption, breaking into a database is like opening an unlocked filing cabinet – everything inside is accessible to the attacker.
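
What encrypting data at rest can look like in code: below is a minimal Python sketch using the open-source cryptography package’s Fernet interface. It is an illustration only; real systems also need key management, rotation and access controls.

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()    # symmetric key; store it separately from the data
cipher = Fernet(key)

record = b"SSN: 123-45-6789"   # hypothetical sensitive record
token = cipher.encrypt(record) # opaque bytes without the key -- the locked cabinet
print(token)

assert cipher.decrypt(token) == record  # readable again only with the key
```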

Multifactor authentication is a security measure that requires you to provide more than one form of verification before accessing sensitive information. This type of authentication is more difficult to crack than a password alone because it requires a combination of different types of information. It often combines something you know, such as a password, with something you have, such as a smartphone app that can generate a verification code, or with something you are, like a fingerprint. Proper use of multifactor authentication reduces the risk of compromise by 99.22%.
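
The “something you have” factor is often a time-based one-time password, or TOTP – the six-digit code an authenticator app displays. Here is a minimal Python sketch using the third-party pyotp library; it illustrates the mechanism, not a production login flow.

```python
import pyotp

secret = pyotp.random_base32()  # shared once with the user's authenticator app
totp = pyotp.TOTP(secret)

code = totp.now()               # 6-digit code that changes every 30 seconds
print(code)

# At login, the server checks the submitted code alongside the password.
assert totp.verify(code)        # True within the current time window
```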

While 83% of organizations require that their employees use multifactor authentication, according to another industry survey, this still leaves millions of accounts protected by nothing more than a password. As attackers grow more sophisticated and credential theft remains rampant, closing that 17% gap isn’t just a best practice – it’s a necessity.

Multifactor authentication is one of the simplest, most effective steps organizations can take to prevent data breaches, but it remains underused. Expanding its adoption could dramatically reduce the number of successful attacks each year.

Awareness gives people the knowledge they need

Even the best technology falls short when people make mistakes. Human error played a role in 68% of 2024 data breaches, according to a Verizon report. Organizations can mitigate this risk through employee training, data minimization – meaning collecting only the information necessary for a task, then deleting it when it’s no longer needed – and strict access controls.

Policies, audits and incident response plans can help organizations prepare for a possible data breach so they can stem the damage, see who is responsible and learn from the experience. It’s also important to guard against insider threats and physical intrusion using physical safeguards such as locking down server rooms.

Public policy holds organizations accountable

Legal protections help hold organizations accountable in keeping data protected and giving people control over their data. The European Union’s General Data Protection Regulation is one of the most comprehensive privacy laws in the world. It mandates strong data protection practices and gives people the right to access, correct and delete their personal data. And the General Data Protection Regulation has teeth: In 2023, Meta was fined €1.2 billion (US$1.3 billion) when Facebook was found in violation.

Despite years of discussion, the U.S. still has no comprehensive federal privacy law. Several proposals have been introduced in Congress, but none have made it across the finish line. In its place, a mix of state regulations and industry-specific rules – such as the Health Insurance Portability and Accountability Act for health data and the Gramm-Leach-Bliley Act for financial institutions – fill the gaps.

Some states have passed their own privacy laws, but this patchwork leaves Americans with uneven protections and creates compliance headaches for businesses operating across jurisdictions.

The tools, policies and knowledge to protect personal data exist – but people’s and institutions’ use of them still falls short. Stronger encryption, more widespread use of multifactor authentication, better training and clearer legal standards could prevent many breaches. It’s clear that these tools work. What’s needed now is the collective will – and a unified federal mandate – to put those protections in place.


This article is part of a series on data privacy that explores who collects your data, what and how they collect, who sells and buys your data, what they all do with it, and what you can do about it.

The Conversation

Mike Chapple does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Your data privacy is slipping away – here’s why, and what you can do about it – https://theconversation.com/your-data-privacy-is-slipping-away-heres-why-and-what-you-can-do-about-it-251768

Scientific norms shape the behavior of researchers working for the greater good

Source: The Conversation – USA – By Jeffrey A. Lee, Professor of Geography and the Environment, Texas Tech University

Mentors model the ethical pursuit of scientific knowledge. sanjeri/E+ via Getty Images

Over the past 400 years or so, a set of mostly unwritten guidelines has evolved for how science should be properly done. The assumption in the research community is that science advances most effectively when scientists conduct themselves in certain ways.

The first person to write down these attitudes and behaviors was Robert Merton, in 1942. The founder of the sociology of science laid out what he called the “ethos of science,” a set of “values and norms which is held to be binding on the man of science.” (Yes, it’s sexist wording. Yes, it was the 1940s.) These now are referred to as scientific norms.

The point of these norms is that scientists should behave in ways that improve the collective advancement of knowledge. If you’re a cynic, you might be rolling your eyes at such a Pollyannaish ideal. But corny expectations keep the world functioning. Think: Be kind, clean up your mess, return the shopping cart to the cart corral.

I’m a physical geographer who realized long ago that students are taught biology in biology classes and chemistry in chemistry classes, but rarely are they taught about the overarching concepts of science itself. So I wrote a book called “The Scientific Endeavor,” laying out what scientists and other educated people should know about science itself.

Scientists in training are expected to learn the big picture of science after years of observing their mentors, but that doesn’t always happen. And understanding what drives scientists can help nonscientists better understand research findings. These scientific norms are a big part of the scientific endeavor. Here are Merton’s original four, along with a couple I think are worth adding to the list:

Universalism

Scientific knowledge is for everyone – it’s universal – and not the domain of an individual or group. In other words, a scientific claim must be judged on its merits, not the person making it. Characteristics like a scientist’s nationality, gender or favorite sports team should not affect how their work is judged.

Also, the past record of a scientist shouldn’t influence how you judge whatever claim they’re currently making. For instance, Nobel Prize-winning chemist Linus Pauling was not able to convince most scientists that large doses of vitamin C are medically beneficial; his evidence didn’t sufficiently support his claim.

In practice, it’s hard to judge contradictory claims fairly when they come from a “big name” in the field versus an unknown researcher without a reputation. It is, however, easy to point out such breaches of universalism when others let scientific fame sway their opinion one way or another about new work.

When asked about patenting his polio vaccine, Jonas Salk replied, ‘There is no patent. Could you patent the sun?’
Bettmann via Getty Images

Communism

Communism in science is the idea that scientific knowledge is the property of everyone and must be shared.

Jonas Salk, who led the research that resulted in the polio vaccine, provides a classic example of this scientific norm. He published the work and did not patent the vaccine so that it could be freely produced at low cost.

When scientific research doesn’t have direct commercial application, communism is easy to practice. When money is involved, however, things get complicated. Many scientists work for corporations, and they might not publish their findings in order to keep them away from competitors. The same goes for military research and cybersecurity, where publishing findings could help the bad guys.

Disinterestedness

Disinterestedness refers to the expectation that scientists pursue their work mainly for the advancement of knowledge, not to advance an agenda or get rich. The expectation is that a researcher will share the results of their work, regardless of a finding’s implications for their career or economic bottom line.

Research on politically hot topics, like vaccine safety, is where it can be tricky to remain disinterested. Imagine a scientist who is strongly pro-vaccine. If their vaccine research results suggest serious danger to children, the scientist is still obligated to share these findings.

Likewise, if a scientist has invested in a company selling a drug, and the scientist’s research shows that the drug is dangerous, they are morally compelled to publish the work even if that would hurt their income.

In addition, when publishing research, scientists are required to disclose any conflicts of interest related to the work. This step informs others that they may want to be more skeptical in evaluating the work, in case self-interest won out over disinterest.

Disinterestedness also applies to journal editors, who are obligated to decide whether to publish research based on the science, not the political or economic implications.

Organized skepticism

Merton’s last norm is organized skepticism. Skepticism does not mean rejecting ideas because you don’t like them. To be skeptical in science is to be highly critical and look for weaknesses in a piece of research.

By the time new research is published in a reputable journal, it has made it past several sets of skeptical eyes.
gorsh13/iStock via Getty Images Plus

This concept is formalized in the peer review process. When a scientist submits an article to a journal, the editor sends it to two or three scientists familiar with the topic and methods used. They read it carefully and point out any problems they find.

The editor then uses the reviewer reports to decide whether to accept as is, reject outright or request revisions. If the decision is revise, the author then makes each change or tries to convince the editor that the reviewer is wrong.

Peer review is not perfect and doesn’t always catch bad research, but in most cases it improves the work, and science benefits. Traditionally, results weren’t made public until after peer review, but that practice has weakened in recent years with the rise of preprints, reducing the reliability of information for nonscientists.

Integrity and humility

I’m adding two norms to Merton’s list.

The first is integrity. It’s so fundamental to good science that it almost seems unnecessary to mention. But I think it’s justified, since cheating, stealing and laziness among scientists are getting plenty of attention these days.

The second is humility. You may have made a contribution to our understanding of cell division, but don’t tell us that you cured cancer. You may be a leader in quantum mechanics research, but that doesn’t make you an authority on climate change.

Scientific norms are guidelines for how scientists are expected to behave. A researcher who violates one of these norms won’t be carted off to jail or fined an exorbitant fee. But when a norm is not followed, scientists must be prepared to justify their reasons, both to themselves and to others.

The Conversation

Jeffrey A. Lee does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Scientific norms shape the behavior of researchers working for the greater good – https://theconversation.com/scientific-norms-shape-the-behavior-of-researchers-working-for-the-greater-good-255159

President Trump’s tug-of-war with the courts, explained

Source: The Conversation – USA – By Paul M. Collins Jr., Professor of Legal Studies and Political Science, UMass Amherst

The U.S. Supreme Court in Washington, D.C. Stefani Reynolds/Bloomberg

The Supreme Court handed President Donald Trump a big win on June 27, 2025, by limiting the ability of judges to block Trump administration policies across the nation.

But Trump has not fared nearly as well in the lower courts, where he has lost a series of cases through different levels of the federal court system. On June 5, a single judge temporarily stopped the administration from preventing Harvard University from enrolling international students.

And a three-judge panel of the U.S. Court of International Trade blocked Trump on May 28 from imposing tariffs on China and other nations. The Trump administration has appealed this decision. It will be taken up in July by all 11 judges on the United States Court of Appeals for the Federal Circuit.

After that, the case can be appealed to the Supreme Court.

I’m a scholar of the federal courts. The reasons why some courts have multiple judges and others have a single judge can be confusing. Here’s a guide to help understand what’s going on in the federal courts.

Federal District Courts

The U.S. District Courts are the trial courts in the federal system and hear about 400,000 cases per year. A single judge almost always presides over cases.

This makes sense for a jury trial, since a judge might make dozens of spur-of-the-moment decisions during the course of a trial, such as ruling on a lawyer’s objection to a question asked of a witness. If a panel of, say, three judges performed this task, it would prolong proceedings because the three judges would have to deliberate over every ruling.

A more controversial role of District Courts involves issuing nationwide injunctions. This happens when a single judge temporarily stops the government from enforcing a policy throughout the nation.

There have been more than two dozen nationwide injunctions during Trump’s second term. These involve policy areas as diverse as ending birthright citizenship, firing federal employees and banning transgender people from serving in the military.

President Donald Trump speaks at the White House on June 27, 2025, after the Supreme Court curbed the power of lone federal judges to block executive actions.
Andrew Caballero-Reynolds/AFP via Getty Images

Trump and Republicans in Congress argue that the ability to issue nationwide injunctions gives too much power to a single judge. Instead, they believe injunctions should apply only to the parties involved in the case.

On June 27, the Supreme Court agreed with the Trump administration and severely limited the ability of District Court judges to issue nationwide injunctions. This means that judges can generally stop policies from being enforced only against the parties to a lawsuit, instead of everyone in the nation.

In rare instances, a panel of three District Court judges hears a case. Congress decides what cases these special three-judge panels hear, reserving them for especially important issues. For example, these panels have heard cases involving reapportionment, which is how votes are translated into legislative seats in Congress and state legislatures, and allegations that a voter’s rights have been violated.

The logic behind having three judges hear such important cases is that they will give more careful consideration to the dispute. This may lend legitimacy to a controversial decision and prevent a single judge from exercising too much power.

There are also specialized courts that hear cases involving particular policies, sometimes in panels of three judges. For instance, three-judge panels on the U.S. Court of International Trade decide cases involving executive orders related to international trade.

The federal Court of Appeals

The U.S. Court of Appeals hears appeals from the District Courts and specialized courts.

The 13 federal circuit courts that make up the U.S. Court of Appeals are arranged throughout the country and handle about 40,000 cases per year. Each circuit court has six to 29 judges. Cases are decided primarily by three-judge panels.

Having multiple judges decide cases on the Court of Appeals is seen as worthwhile, since these courts are policymaking institutions. This means they set precedents for the judicial circuit in which they operate, which covers three to nine states.

Supporters of this system argue that by having multiple judges on appellate courts, the panel will consider a variety of perspectives on the case and collaborate with one another. This can lead to better decision-making. Additionally, having multiple judges check one another can boost public confidence in the judiciary.

The party that loses a case before a three-judge panel can request that the entire circuit rehear the case. This is known as sitting en banc.

Because judges on a circuit can decline to hear cases en banc, this procedure is usually reserved for especially significant cases. For instance, the U.S. Court of Appeals for the Federal Circuit has agreed to an en banc hearing to review the Court of International Trade’s decision to temporarily halt Trump’s sweeping tariff program. It also allowed the tariffs to remain in effect until the appeal plays out, likely in August.

The exception to having the entire circuit sit together en banc is the 9th Circuit, based in San Francisco, which has 29 judges, far more than other circuit courts. It uses an 11-judge en banc process, since having 29 judges hear cases together would be logistically challenging.

Cargo ships are seen at a container terminal in the Port of Shanghai, China, in May 2025. A three-judge panel of the U.S. Court of International Trade blocked Trump from imposing tariffs on China and other nations.
CFOTO/Future Publishing via Getty Images

The US Supreme Court

The U.S. Supreme Court sits atop the American legal system and decides about 60 cases per year.

Cases are decided by all nine justices, unless a justice declines to participate because of a conflict of interest. As with other multimember courts, advocates of the nine-member makeup argue that the quality of decision-making is improved by having many justices participate in a case’s deliberation.

Each Supreme Court justice is charged with overseeing one or more of the 13 federal circuits. In this role, a single justice reviews emergency appeals from the District Courts and an appellate court within a circuit. This authorizes them to put a temporary hold on the implementation of policies within that circuit or refer the matter to the entire Supreme Court.

In February, for example, Chief Justice John Roberts blocked a Court of Appeals order that would have compelled the Trump administration to pay nearly US$2 billion in reimbursements for already completed foreign aid work.

In March, a 5-4 majority of the high court sent the case back to U.S. District Judge Amir Ali, who subsequently ordered the Trump administration to release some of the funds.

The federal judicial system is complex. The flurry of executive orders from the Trump administration means that cases are being decided on a nearly daily basis by a variety of courts.

A single judge will decide some of these cases, and others are considered by full courts. Though the nine justices of the Supreme Court technically have the final say, the sheer volume of legal challenges means that America’s District Courts and Court of Appeals will resolve many of the disputes.

The Conversation

Paul M. Collins Jr. does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. President Trump’s tug-of-war with the courts, explained – https://theconversation.com/president-trumps-tug-of-war-with-the-courts-explained-258234

Higher ed’s relationship with marriage? It’s complicated – and depends on age

Source: The Conversation – USA (2) – By John V. Winters, Professor of Economics, Iowa State University

Education rates are rising; marriage rates are falling. But the relationship between those two trends isn’t straightforward. Ugur Karakoc/E+ via Getty Images

The longer someone stays in school, the more likely they are to delay getting married – but education does not reduce the overall likelihood of being married later in life, according to our research recently published in Education Economics. Education also influences who Americans marry: Obtaining a four-year degree vs. just a high school diploma more than doubles someone’s likelihood of marrying a fellow college graduate.

Previous research has documented that the more education you have, the more likely you are to get married. But correlation does not imply causality, and plenty of other factors influence marriage and education.

My research with economist Kunwon Ahn provides evidence that there is indeed a causal link between education and marriage – but it’s a nuanced one.

Our study applies economic theory and advanced statistics to a 2006-2019 sample from the American Community Survey: more than 8 million people, whom we divided into different cohorts based on birthplace, birth year and self-reported ancestry.

To isolate the causal relationship, we needed to sidestep other factors that can influence someone’s decisions about marriage and education. Therefore, we did not calculate based on individuals’ own education level. Instead, we estimated their educational attainment using a proxy: their mothers’ level of education. On the individual level, plenty of people finish more or less education than their parents. Within a cohort, however, the amount of schooling that mothers have, on average, is a strong predictor of how much education children in that cohort received.

We found that an additional year of schooling – counting from first grade to the end of any postgraduate degrees – reduces the likelihood that someone age 25 to 34 is married by roughly 4 percentage points.

Among older age groups, the effects of education were more mixed. On average, the level of education has almost zero impact on the probability that someone age 45 to 54 is married. Among people who were married by that age, being more educated reduces their likelihood of being divorced or separated.

However, more education also makes people slightly more likely to have never been married by that age. In our sample, about 12% of people in that age group have never married. An additional year of education increases that, on average, by 2.6 percentage points.

Why it matters

Marriage rates are at historical lows in the United States, especially for young people. Before 1970, more than 80% of Americans 25 to 34 were married. By 2023, that number had fallen to only 38%, according to data from the U.S. Census Bureau.

Over the same time, the percentage of Americans with a college degree has increased considerably. Additional education can increase someone’s earning potential and make them a more attractive partner.

Yet the rising costs of higher education may make marriage less attainable. A 2016 study found that the more college debt someone had, the less likely they were to ever marry.

While marriage rates have fallen across the board, the drop is most pronounced for lower-income groups, and not all of the gap is driven by education. One of the other causes may be declining job prospects for lower-income men. Over recent decades, as their earning potential has dwindled and women’s job options have grown, it appears some of the economic benefits of marriage have declined.

Declining marriage rates have important effects on individuals, families and society as a whole. Many people value the institution for its own sake, and others assign it importance based on religious, cultural and social values. Economically, marriage has important consequences for children, including how many children people have and the resources that they can invest in those children.

What still isn’t known

Education levels are only part of the explanation for trends in marriage rates. Other cultural, social, economic and technological factors are likely involved in the overall decline, but their exact contribution is still unknown.

One idea gaining traction, though little research has been done on it, considers the ways smartphones and social media may be reducing psychological and social well-being. We stay in more, go out less, and are increasingly divided – all of which could make people less likely to marry.

The Research Brief is a short take on interesting academic work.

The Conversation

John V. Winters does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Higher ed’s relationship with marriage? It’s complicated – and depends on age – https://theconversation.com/higher-eds-relationship-with-marriage-its-complicated-and-depends-on-age-258664

How slashing university research grants impacts Colorado’s economy and national innovation – a CU Boulder administrator explains

Source: The Conversation – USA (2) – By Massimo Ruzzene, Vice Chancellor of Research and Innovation, University of Colorado Boulder

Federal funding cuts to the University of Colorado Boulder have already impacted research and could cause even more harm. Glenn J. Asakawa/University of Colorado

The Trump administration has been freezing or reducing federal grants to universities across the country.

Over the past several months, universities have lost more than US$11 billion in funding, according to NPR. More than two dozen universities, including the University of Colorado Boulder and the University of Denver, have been affected. Research projects on cancer, farming solutions and climate resiliency are just a few of the many nationally that have seen cuts.

The Conversation asked Massimo Ruzzene, senior vice chancellor for research and innovation at the University of Colorado Boulder, to explain how these cuts and freezes are impacting the university he works for and Colorado’s local economy.

How important are federal funds to CU Boulder?

Federal funding pays for approximately 70% of CU Boulder’s research each year. That’s about $495 million in the 2023-2024 fiscal year.

The other 30% of research funding comes from a variety of sources. The second-largest is international partnerships at $127 million. Last year, CU Boulder also received $27 million in philanthropic gifts to support research and approximately $29 million from collaborations with industry.

CU Boulder uses this money to fund research that advances fields like artificial intelligence, space exploration and planetary sciences, quantum technologies, biosciences and climate and energy.

At CU Boulder, federal funding also supports research projects like the Dust Accelerator Laboratory that helps us understand the composition and structure of cosmic dust. This research allows scientists to reconstruct the processes that formed planets, moons and organic molecules.

How much federal funding has CU Boulder lost?

So far in 2025, CU Boulder has received 56 grant cancellations or stop-work orders. Those amount to approximately $30 million in lost funding. This number does not include awards that are on hold and awaiting action by the sponsor.

This number also does not include the funds that have been inaccessible due to the considerable lag in funding from agencies such as the National Science Foundation and the National Institutes of Health.

Nationwide, National Science Foundation funding has dropped by more than 50% through the end of May of this year compared with the average of the past 10 years. The university anticipates that the funding we receive from these agencies will drop by a similar amount, but the numbers are still being collected for this year.

What research has been impacted?

A wide variety. To take just one example, CU Boulder’s Cooperative Institute for Research in Environmental Sciences and the Institute of Arctic and Alpine Research investigate how to monitor, predict, respond to and recover from extreme weather conditions and natural disasters.

This research directly impacts the safety, well-being and prosperity of Colorado residents facing wildfires, droughts and floods.

Michael Gooseff, a researcher from the College of Engineering and Applied Science, collects weather data from the McMurdo Dry Valleys in Antarctica. Byron Adams/University of Colorado Boulder

Past research from these groups includes recovery efforts following the 2021 Marshall Fire in the Boulder area. Researchers collaborated with local governments and watershed groups to monitor environmental impacts and develop dashboards that detailed their findings.

How might cuts affect Colorado’s aerospace economy?

Colorado has more aerospace jobs per capita than any other state. The sector employs more than 55,000 people and contributes significantly to both Colorado’s economy and the national economy.

This ecosystem encompasses research universities such as CU Boulder and Colorado-based startups like Blue Canyon Technologies and Ursa Major Technologies. It also includes established global companies like Lockheed Martin and Raytheon Technologies.

At CU Boulder, the Laboratory for Atmospheric and Space Physics is one of the world’s premier space science research institutions. Researchers at the lab design, build and operate spacecraft and other instruments that contribute critical data. That data helps us understand Earth’s atmosphere, the Sun, planetary systems and deep space phenomena. If the projects the lab supports are cut, then it’s likely the lab will be cut as well.

The Presidential Budget Request proposes cutting NASA’s annual budget by up to 24%, including a 47% reduction for the Science Mission Directorate. The directorate supports more than a dozen space missions at CU Boulder, so that cut could have an immediate impact of approximately $50 million on university programs.

Scientists test the solar arrays on NASA’s Mars Atmosphere and Volatile Evolution orbiter spacecraft at Lockheed Martin’s facility near Denver. Photo courtesy of LASP

One of the largest space missions CU Boulder is involved in is the Mars Atmosphere and Volatile Evolution orbiter. MAVEN, as it’s known, provides telecommunications and space weather monitoring capabilities. These are necessary to support future human and robotic missions to Mars over the next decade and beyond, a stated priority for the White House. If MAVEN were to be canceled, experts estimate that it would cost almost $1 billion to restart it.

Have the cuts hit quantum research?

While the federal government has identified quantum technology as a national priority, the fiscal year 2026 budget proposal only maintains existing funding levels. It does not introduce new investments or initiatives.

I’m concerned that this stagnation, amid broader cuts to science agencies, could undermine progress in this field and undercut the training of its critical workforce. The result could be the U.S. ceding its leadership in quantum innovation to global competitors.

The Conversation

Massimo Ruzzene receives funding from the National Science Foundation.

ref. How slashing university research grants impacts Colorado’s economy and national innovation – a CU Boulder administrator explains – https://theconversation.com/how-slashing-university-research-grants-impacts-colorados-economy-and-national-innovation-a-cu-boulder-administrator-explains-257869

3 basic ingredients, a million possibilities: How small pizzerias succeed with uniqueness in an age of chain restaurants

Source: The Conversation – USA (2) – By Paula de la Cruz-Fernández, Cultural Digital Collections Manager, University of Florida

Variety is the sauce of life. Suzanne Kreiter/Boston Globe via Getty Images

At its heart, pizza is deceptively simple. Made from just a few humble ingredients – baked dough, tangy sauce, melted cheese and maybe a few toppings – it might seem like a perfect candidate for the kind of mass-produced standardization that defines many global food chains, where predictable menus reign supreme.

Yet, visit two pizzerias in different towns – or even on different blocks of the same town – and you’ll find that pizza stubbornly refuses to be homogenized.

We are researchers working on a local business history project that documents the commercial landscape of Gainesville, Florida, in the 20th and 21st centuries. As part of that project, we’ve spent a great many hours over the past two years interviewing local restaurant owners, especially those behind Gainesville’s independent pizzerias. What we’ve found reaffirms a powerful truth: Pizza resists sameness – and small pizzerias are a big reason why.

Why standardized pizza rose but didn’t conquer

While tomatoes were unknown in Italy until the mid-16th century, they have since become synonymous with Italian cuisine – especially through pizza.

Pizza arrived in the U.S. from Naples in the early 20th century, when Italian immigration was at its peak. Two of the biggest destinations for Italian immigrants were New York City and Chicago, and today each has a distinctive pizza style. A New York slice can easily be identified by its thin, soft, foldable crust, while Chicago pies are known for deep, thick, buttery crusts.

After World War II, other regions developed their own types of pizza, including the famed New Haven and Detroit styles. The New Haven style is known for being thin, crispy and charred in a coal-fired oven, while the Detroit style has a rectangular, deep-dish shape and thick, buttery crust.

By the latter half of the 20th century, pizza had become a staple of the American diet. And as its popularity grew, so did demand for consistent, affordable pizza joints. Chains such as Pizza Hut, founded in 1958, and Papa John’s, established in 1984, applied the model pioneered by McDonald’s in the late 1940s, adopting limited menus, assembly-line kitchens and franchise models built for consistency and scale. New technologies such as point-of-sale systems and inventory management software made things even more efficient.

As food historian Carol Helstosky explains in “Pizza: A Global History,” the transformation involved simplifying recipes, ensuring consistent quality and developing formats optimized for rapid expansion and franchising. What began as a handcrafted, regional dish became a highly replicable product suited to global mass markets.

Today, more than 20,000 Pizza Huts operate worldwide. Papa John’s, which runs about 6,000 pizzerias, built its brand explicitly on a promise rooted in standardization. In this model, success means making pizza the same way, everywhere, every time.

So, what happened to the independent pizzerias? Did they get swallowed up by efficiency?

Not quite.

Chain restaurants don’t necessarily suffocate small competitors, recent research shows. In fact, in the case of pizza, they often coexist, sometimes even fueling creativity and opportunity. Independent pizzerias – there are more than 44,000 nationwide – lean into what makes them unique, carving out a niche. Rather than focusing only on speed or price, they compete by offering character, inventive toppings, personal service and a sense of place that chains just can’t replicate.

A local pizza scene: Creativity in a corporate age

For an example, look no farther than Gainesville. A college town with fewer than 150,000 residents, Gainesville doesn’t have the same culinary cachet as New York or Chicago, but it has developed a truly distinctive pizza scene. With 13 independent pizzerias serving Neapolitan, Detroit, New York and Mediterranean styles and more, hungry Gators have a plethora of options when craving a slice.

What makes Gainesville’s pizza scene especially interesting is the range of backgrounds its proprietors have. Through interviews with pizzeria owners, we found that some had started as artists and musicians, while others had worked in engineering or education – and each had their own unique approach to making pizzas.

The owner of Strega Nona’s Oven, for example, uses his engineering background to turn dough-making into a science, altering the proportions of ingredients by as little as half of a percent based on the season or even the weather.
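
Professional bakers typically make such adjustments in “baker’s percentages,” where every ingredient is weighed as a share of the flour. The sketch below is a generic illustration of that arithmetic, with made-up numbers rather than Strega Nona’s actual recipe, showing why a half-percentage-point tweak matters at batch scale.

```python
# Baker's percentages: each ingredient is expressed as a percentage
# of the flour weight. Generic numbers, not any pizzeria's recipe.
def dough_weights(flour_g: float, hydration_pct: float,
                  salt_pct: float = 2.5, yeast_pct: float = 0.4) -> dict:
    """Return ingredient weights in grams for a given flour weight."""
    return {
        "flour": flour_g,
        "water": flour_g * hydration_pct / 100,
        "salt": flour_g * salt_pct / 100,
        "yeast": flour_g * yeast_pct / 100,
    }

# A dry winter day vs. a humid summer day: raising hydration by half
# a percentage point adds 50 g of water to a 10 kg batch of flour.
dry_day = dough_weights(10_000, hydration_pct=62.5)
humid_day = dough_weights(10_000, hydration_pct=62.0)
print(dry_day["water"] - humid_day["water"])  # 50.0
```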

Satchel’s Pizza, on the other hand, is filled with works made by its artist owner, including mosaic windows, paintings, sculptures and fountains.

Gainesville’s independent pizzerias often serve as what sociologists call “third places”: spaces for gathering that aren’t home or work. And their owners think carefully about how to create a welcoming environment. For example, the owner of Scuola Pizza insisted the restaurant be free of TVs so diners can focus on their food. Squarehouse Pizza features a large outdoor space, an old school bus repurposed with tables and chairs for dining, and a stage for live music.

Squarehouse also is known for its unusual toppings on square, Detroit-style pies – for example, the Mariah Curry, topped with curry chicken or cauliflower and coconut curry sauce. It refreshes its specialty menus every semester or two.

While the American pizza landscape may be shaped by big brands and standardized menus, small pizzerias continue to shine. Gainesville is a perfect example of how a local pizza scene in a small Southern college town can remain one of a kind, even in a globalized industry. Small pizzerias don’t just offer food – they offer a flavorful reminder that the marketplace rewards distinctiveness and local character, too.

The Conversation

The authors do not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.

ref. 3 basic ingredients, a million possibilities: How small pizzerias succeed with uniqueness in an age of chain restaurants – https://theconversation.com/3-basic-ingredients-a-million-possibilities-how-small-pizzerias-succeed-with-uniqueness-in-an-age-of-chain-restaurants-259661