‘Big’ legislative package shifts more of SNAP’s costs to states, saving federal dollars but causing fewer Americans to get help paying for food

Source: The Conversation – USA (2) – By Tracy Roof, Associate Professor of Political Science, University of Richmond

People shop for food in Brooklyn in 2023 at a store that makes sure that its customers know it accepts SNAP benefits, also known as food stamps and EBT.
Spencer Platt/Getty Images

The legislative package that President Donald Trump signed into law on July 4, 2025, has several provisions that will shrink the safety net, including the Supplemental Nutrition Assistance Program, long known as food stamps. SNAP spending will decline by an estimated US$186 billion through 2034 as a result of several changes Congress made to the program that today helps roughly 42 million people buy groceries – an almost 20% reduction.

In my research on the history of food stamps, I’ve found that the program was meant to be widely available to most low-income people. The SNAP changes break that tradition in two ways.

The Congressional Budget Office estimates that about 3 million people are likely to be dropped from the program and lose their benefits. This decline will occur in part because more people will face time limits if they don’t meet work requirements. Even those who meet the requirements may lose benefits because of difficulty submitting the necessary documents.

And because states will soon have to take on more of the costs of the program, which totaled over $100 billion in 2024, they may eventually further restrict who gets help due to their own budgetary constraints.

Summing up SNAP’s origins

Inspired by the plight of unemployed coal miners whom John F. Kennedy met in Appalachia when he campaigned for the presidency in 1960, the early food stamps program was not limited to single parents with children, older people and people with disabilities, like many other safety net programs were at the time. It was supposed to help low-income people afford more and better food, regardless of their circumstances.

In response to the national attention drawn in the late 1960s to widespread hunger and malnutrition in other parts of the country, such as among tenant farmers in the rural South, the limited food stamps program was expanded. It reached every part of the country by 1974.

From the start, the states administered the program and covered some of its administrative costs, while the federal government paid for the benefits in full. This arrangement encouraged states to enroll everyone who needed help without fearing the budgetary consequences.

Who could qualify and how much help they could get were set by uniform national standards, so that even the residents of the poorest states would be able to afford a budget-conscious but nutritionally adequate diet.

The federal government’s responsibility for the cost of benefits also allowed spending to automatically grow during economic downturns, when more people need assistance. These federal dollars helped families, retailers and local economies weather tough times.

The changes to the SNAP program included in the legislative package that Congress approved by narrow margins and Trump signed into law, however, will make it harder for the program to serve its original goals.

Restricting benefits

Since the early 1970s, most so-called able-bodied adults who were not caring for a child or an adult with disabilities had to meet a work requirement to get food stamps. Welfare reform legislation in 1996 made that requirement stricter for such adults between the ages of 18 and 50 by imposing a three-month time limit if they didn’t log 20 hours or more a week of employment or another approved activity, such as verified volunteering.

Budget legislation passed in 2023 expanded this rule to adults up to age 54. The 2025 law will further expand the time limit to adults up to age 64 and parents of children age 14 or over.

States can currently get permission from the federal government to waive work requirements in areas with insufficient jobs or unemployment above the national average. This flexibility to waive work requirements will now be significantly limited and available only where at least 1 in 10 workers are unemployed.

Concerned senators secured an exemption from the work requirements for most Native Americans and Native Alaskans, who are more likely to live in areas with limited job opportunities.

A 2023 budget deal exempted veterans, the homeless and young adults exiting the foster care system from work requirements because they can experience special challenges getting jobs. The 2025 law does not exempt them.

The new changes to SNAP policies will also deny benefits to many immigrants with authorization to be in the U.S., such as people granted political asylum or official refugee status. Immigrants without authorization to reside in the U.S. will continue to be ineligible for SNAP benefits.

Tracking ‘error rates’

Critics of food stamps have long argued that states lack incentives to carefully administer the program because the federal government is on the hook for the cost of benefits.

In the 1970s, as the number of Americans on the food stamp rolls soared, the U.S. Department of Agriculture, which oversees the program, developed a system for assessing if states were accurately determining whether applicants were eligible for benefits and how much they could get.

A state’s “payment error rate” estimates the share of benefits paid out that were more or less than an applicant was actually eligible for. The error rate was not then and is not today a measure of fraud. Typically, it just indicates the share of families who get a higher – or lower – amount of benefits than they are eligible for because of mistakes or confusion on the part of the applicant or the case worker who handles the application.

Congress tried to penalize states with error rates over 5% in the 1980s but ultimately suspended the effort under state pressure. After years of political wrangling, the USDA started to consistently enforce financial penalties on states with high error rates in the mid-1990s.

States responded by increasing their red tape. For example, they asked applicants to submit more documentation and made them go through more bureaucratic hoops, like having more frequent in-person interviews, to get – and continue receiving – SNAP benefits.

These demands hit low-wage workers hardest because their applications were more prone to mistakes. Low-income workers often don’t have consistent work hours and their pay can vary from week to week and month to month. The number of families getting benefits fell steeply.

The USDA tried to reverse this decline by offering states options to simplify the process for applying for and continuing to get SNAP benefits over the course of the presidencies of Bill Clinton, George W. Bush and Barack Obama. Enrollment grew steadily.

Penalizing high rates

Since 2008, states with error rates over 6% have had to develop a detailed plan to lower them.

Despite this requirement, the national average error rate jumped from 7.4% before the pandemic to a record high of 11.7% in 2023. Rates rose as states struggled with a surge of people applying for benefits, a shortage of staff in state welfare agencies and procedural changes.

Republican leaders in Congress have responded to that increase by calling for more accountability.

Making states pay more

The big legislative package will increase states’ expenses in two ways.

It will cut the federal government’s share of the program’s administrative costs from half to 25%, beginning in the 2027 fiscal year.

And some states will have to pay a share of benefit costs for the first time in the program’s history, depending on their payment error rates. Beginning in the 2028 fiscal year, states with an error rate between 6% and 8% would be responsible for 5% of the cost of benefits. Those with an error rate between 8% and 10% would have to pay 10%, and states with an error rate over 10% would have to pay 15%. The federal government would continue to pay all benefits in states with error rates below 6%.
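To make the tiered formula concrete, here is a minimal sketch of how a state’s share could be computed from its payment error rate, as described above. It is illustrative only; how the statute treats rates that fall exactly on a tier boundary is an assumption here.

```python
def state_benefit_share(payment_error_rate: float) -> float:
    """Return the share of SNAP benefit costs a state would pay under the
    tiers described above. The rate is a percentage, e.g. 10.9 for 10.9%.
    Treatment of rates exactly at a boundary (6%, 8%, 10%) is an assumption."""
    if payment_error_rate < 6:
        return 0.0    # federal government continues to pay all benefits
    elif payment_error_rate < 8:
        return 0.05   # state pays 5% of benefit costs
    elif payment_error_rate < 10:
        return 0.10   # state pays 10%
    else:
        return 0.15   # state pays 15%

# Example: a state with an error rate near 25%, like Alaska's, lands in the top tier.
print(state_benefit_share(25.0))  # 0.15
```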

Republicans argue the changes will give states more “skin in the game” and ensure better administration of the program.

While the national payment error rate fell from 11.68% in the 2023 fiscal year to 10.93% a year later, 42 states still had rates in excess of 6% in 2024. Twenty states plus the District of Columbia had rates of 10% or higher.

At nearly 25%, Alaska has the highest payment error rate in the country. But Alaska won’t be in trouble right away. To ease passage in the Senate, where the vote of Sen. Lisa Murkowski, an Alaska Republican, was in doubt, a provision was added to the bill allowing several states with the highest error rates to avoid cost sharing for up to two years after it begins.

Democrats argue this may encourage states to actually increase their error rates in the short term.

The effect of the new law on the amount of help an eligible household gets is expected to be limited.

About 600,000 individuals and families will lose an average of $100 a month in benefits because of a change in the way utility costs are treated. The law also prevents future administrations from increasing benefits beyond the cost of living, as the Biden Administration did.

States cannot cut benefits below the national standards set in federal law.

But the shift of costs to financially strapped states will force them to make tough choices. They will have to cut back spending on other programs, increase taxes, discourage people from getting SNAP benefits or drop the program altogether.

The changes will, in the end, make it even harder for Americans who can’t afford the bare necessities to get enough nutritious food to feed their families.

The Conversation

Tracy Roof does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. ‘Big’ legislative package shifts more of SNAP’s costs to states, saving federal dollars but causing fewer Americans to get help paying for food – https://theconversation.com/big-legislative-package-shifts-more-of-snaps-costs-to-states-saving-federal-dollars-but-causing-fewer-americans-to-get-help-paying-for-food-260166

Why Texas Hill Country, where a devastating flood killed dozens, is one of the deadliest places in the US for flash flooding

Source: The Conversation – USA (2) – By Hatim Sharif, Professor of Civil and Environmental Engineering, The University of Texas at San Antonio

A Kerrville, Texas, resident watches the flooded Guadalupe River on July 4, 2025. Eric Vryn/Getty Images

Texas Hill Country is known for its landscapes, with shallow rivers winding among hills and through rugged valleys. But that geography also makes it one of the deadliest places in the U.S. for flash flooding.

In the early hours of July 4, 2025, a rush of flood water swept through an area dotted with summer camps and small towns about 70 miles west of San Antonio. At least 27 people died, and about two dozen girls from one camp and other people in the area were still unaccounted for the following morning, officials said. More than 200 people had to be rescued.

The flooding began as many flash floods in this region do, with a heavy downpour that sent water sheeting off the hillsides into creeks. The creeks poured into the Guadalupe River. Around 3 a.m. on July 4, National Weather Service data shows the river was rising about 1 foot every 5 minutes near the camp. By 4:30 a.m., the water had risen more than 20 feet.

Flood expert Hatim Sharif, a hydrologist and civil engineer at the University of Texas at San Antonio, explains what makes this part of the country, known as Flash Flood Alley, so dangerous.

What makes Hill Country so prone to flooding?

Texas as a whole leads the nation in flood deaths, and by a wide margin. A colleague and I analyzed data from 1959 to 2019 and found 1,069 people had died in flooding in Texas over those six decades. The next highest total was in Louisiana, with 693.

Many of those flood deaths have been in Hill Country, an area known as Flash Flood Alley. It’s a crescent of land that curves from near Dallas down to San Antonio and then westward.

The hills are steep, and the water moves quickly when it floods. This is a semi-arid area with soils that don’t soak up much water, so the water sheets off quickly and the shallow creeks can rise fast.

When those creeks converge on a river, they can create a wall of water that wipes out homes and washes away cars and, unfortunately, anyone in its path.

Hill Country has seen some devastating flash floods. In 1987, heavy rain in western Kerr County quickly flooded the Guadalupe River, triggering a flash flood similar to the one in 2025. Ten teenagers being evacuated from a camp died in the rushing water.

San Antonio, considered the gateway to Hill Country, was hit with another flash flood on June 12, 2025, that killed 13 people whose cars were swept away when they drove into high water from a flooding creek near an interstate ramp in the early morning.

Why does the region get such strong downpours?

One reason Hill Country gets powerful downpours is the Balcones Escarpment.

The escarpment is a line of cliffs and steep hills created by a geologic fault. When warm air from the Gulf rushes up the escarpment, it condenses and can dump a lot of moisture. That water flows down the hills quickly, from many different directions, filling streams and rivers below.

As temperatures rise, the warmer atmosphere can hold more moisture, increasing the downpour and flood risk.

A tour of the Guadalupe River and its flood risk.

The same effect can contribute to flash flooding in San Antonio, where the large amount of paved land and lack of updated drainage to control runoff adds to the risk.

What can be done to improve flash flood safety?

First, it’s important for people to understand why flash flooding happens and just how fast the water can rise and flow. In many arid areas, dry or shallow creeks can quickly fill up with fast-moving water and become deadly. So people should be aware of the risks and pay attention to the weather.

Improving flood forecasting, with more detailed models of the physics and water velocity at different locations, can also help.

Probabilistic forecasting, for example, can provide a range of rainfall scenarios, enabling authorities to prepare for worst-case scenarios. A scientific framework linking rainfall forecasts to the local impacts, such as streamflow, flood depth and water velocity, could also help decision-makers implement timely evacuations or road closures.

Education is particularly essential for drivers. One to two feet of moving water can wash away a car. People may think their trucks and SUVs can go through anything, but fast-moving water can flip a truck and carry it away.

Officials can also do more to barricade roads when the flood risk is high to prevent people from driving into harm’s way. We found that 58% of the flood deaths in Texas over the past six decades involved vehicles.

The storm on June 12 in San Antonio was an example. It was early morning, and drivers had poor visibility. Cars drove into floodwater without seeing the risk until it was too late.

The Conversation

Hatim Sharif does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Why Texas Hill Country, where a devastating flood killed dozens, is one of the deadliest places in the US for flash flooding – https://theconversation.com/why-texas-hill-country-where-a-devastating-flood-killed-dozens-is-one-of-the-deadliest-places-in-the-us-for-flash-flooding-260555

Employers are failing to insure the working class – Medicaid cuts will leave them even more vulnerable

Source: The Conversation – USA (3) – By Sumit Agarwal, Assistant Professor of Internal Medicine, University of Michigan

The Congressional Budget Office estimates that 7.8 million Americans across the U.S. will lose their coverage through Medicaid – the public program that provides health insurance to low-income families and individuals – under the multitrillion-dollar domestic policy package that President Donald Trump signed into law on July 4, 2025.

That includes 247,000 to 412,000 of my fellow Michigan residents, based on early June projections for the House reconciliation bill. There are similarly deep projected cuts in the Senate version of the legislation, which Trump signed.

Many of these people are working Americans who will lose Medicaid because of the onerous paperwork involved with the law’s work requirements.

They won’t be able to get coverage in the Affordable Care Act Marketplaces after losing Medicaid. Premiums and out-of-pocket costs are likely to be too high for those making less than 100% to 138% of the federal poverty level who do not qualify for health insurance marketplace subsidies. Funding for this program is also under threat.

And despite being employed, they also won’t be able to get health insurance through their employers because it is either too expensive or not offered to them. Researchers estimate that coverage losses will lead to thousands of medically preventable deaths across the country because people will be unable to access health care without insurance.

I am a physician, health economist and policy researcher who has cared for patients on Medicaid and written about health care in the U.S. for over eight years. I think it’s important to understand the role of Medicaid within the broader insurance landscape. Medicaid has become a crucial source of health coverage for low-wage workers.

A brief history of Medicaid expansion.

Michigan removed work requirements from Medicaid

A few years ago, Michigan was slated to institute Medicaid work requirements, but the courts blocked the implementation of that policy in 2020. It would have cost upward of US$70 million due to software upgrades, staff training, and outreach to Michigan residents enrolled in the Medicaid program, according to the Michigan Department of Health and Human Services.

Had it gone into effect, 100,000 state residents were expected to lose coverage within the first year.

The state took the formal step of eliminating work requirements from its statutes earlier this year in recognition of implementation costs being too high and mounting evidence against the policy’s effectiveness.

When Arkansas instituted Medicaid work requirements in 2018, there was no increase in employment, but within months, thousands of people enrolled in the program lost their coverage. The reason? Not that large numbers of enrollees failed to meet the work criteria, but that many were tripped up by the paperwork and red tape. It is a recipe for widespread coverage losses without meeting any of the policy’s purported goals.

Work requirements, far from incentivizing work, paradoxically remove working people from Medicaid with nowhere else to go for insurance.

Shortcomings of employer-sponsored insurance

Nearly half of Americans get their health insurance through their employers.

In contrast to a universal system that covers everyone from cradle to grave, an employer-first system leaves huge swaths of the population uninsured. This includes tens of millions of working Americans who are unable to get health insurance through their employers, especially low-income workers who are less likely to even get the choice of coverage from their employers.

Over 80% of managers and professionals have employer-sponsored health coverage, but only 50% to 70% of blue-collar workers in service jobs, farming, construction, manufacturing and transportation can say the same.

There are some legal requirements mandating employers to provide health insurance to their employees, but the reality of low-wage work means many do not fall under these legal protections.

For example, employers are allowed to incorporate a waiting period of up to 90 days before health coverage begins. The legal requirement also applies only to full-time workers. Health coverage can thus remain out of reach for seasonal and temporary workers, part-time employees and gig workers.

Even if an employer offers health insurance to their low-wage employees, those workers may forego it because the premiums and deductibles are too high to make it worth earning less take-home pay.

To make matters worse, layoffs are more common for low-wage workers, leaving them with limited options for health insurance during job transitions. And many employers have increasingly shed low-wage staff, such as drivers and cleaning staff, from their employment rolls and contracted that work out. Known as the fissuring of the workplace, it allows employers of predominantly high-income employees to continue offering generous benefits while leaving no such commitment to low-wage workers employed as contractors.

Medicaid fills in gaps

Low-income workers without access to employer-sponsored insurance had virtually no options for health insurance in the years before key parts of the Affordable Care Act went into effect in 2014.

Research my coauthors and I conducted showed that blue-collar workers have since gained health insurance coverage, cutting the uninsured rate by a third thanks to the expansion of Medicaid eligibility and subsidies in the health insurance marketplaces. This means low-income workers can more consistently see doctors, get preventive care and fill prescriptions.

Further evidence from Michigan’s experience has shown that Medicaid can help the people it covers do a better job at work by addressing health impairments. It can also improve their financial well-being, including fewer problems with debt, fewer bankruptcies, higher credit scores and fewer evictions.

Premiums and cost sharing in Medicaid are minimal compared with employer-sponsored insurance, making it a more realistic and accessible option for low-income workers. And because Medicaid is not tied directly to employment, it can promote job mobility, allowing workers to maintain coverage within or between jobs without having to go through the bureaucratic complexity of certifying work.

Of course, Medicaid has its own shortcomings. Payment rates to providers are low relative to other insurers, access to doctors can be limited, and the program varies significantly by state. But these weaknesses stem largely from underfunding and political hostility – not from any intrinsic flaw in the model. If anything, Medicaid’s success in covering low-income workers and containing per-enrollee costs points to its potential as a broader foundation for health coverage.

The current employer-based system, which is propped up by an enormous and regressive tax break for employer-sponsored insurance premiums, favors high-income earners and contributes to wage stagnation. In my view, which is shared by other health economists, a more public, universal model could better cover Americans regardless of how someone earns a living.

Over the past six decades, Medicaid has quietly stepped into the breach left by employer-sponsored insurance. Medicaid started as a welfare program for the needy in the 1960s, but it has evolved and adapted to fill the needs of a country whose health care system leaves far too many uninsured.

This article was updated on July 4, 2025, to reflect Trump signing the bill into law.

The Conversation

Sumit Agarwal does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Employers are failing to insure the working class – Medicaid cuts will leave them even more vulnerable – https://theconversation.com/employers-are-failing-to-insure-the-working-class-medicaid-cuts-will-leave-them-even-more-vulnerable-259256

AI isn’t replacing student writing – but it is reshaping it

Source: The Conversation – USA (2) – By Jeanne Beatrix Law, Professor of English, Kennesaw State University

Studies have shown that many students are using AI to brainstorm, learn new information and revise their work. krisanapong detraphiphat/Moment via Getty Images

I’m a writing professor who sees artificial intelligence as more of an opportunity for students than a threat.

That sets me apart from some of my colleagues, who fear that AI is accelerating a glut of superficial content, impeding critical thinking and hindering creative expression. They worry that students are simply using it out of sheer laziness or, worse, to cheat.

Perhaps that’s why so many students are afraid to admit that they use ChatGPT.

In The New Yorker magazine, historian D. Graham Burnett recounts asking his undergraduate and graduate students at Princeton whether they’d ever used ChatGPT. No one raised their hand.

“It’s not that they’re dishonest,” he writes. “It’s that they’re paralyzed.”

Students seem to have internalized the belief that using AI for their coursework is somehow wrong. Yet, whether my colleagues like it or not, most college students are using it.

A February 2025 report from the Higher Education Policy Institute in the U.K. found that 92% of university students are using AI in some form. As early as August 2023 – a mere nine months after ChatGPT’s public release – more than half of first-year students at Kennesaw State University, the public research institution where I teach, reported that they believed that AI is the future of writing.

It’s clear that students aren’t going to magically stop using AI. So I think it’s important to point out some ways in which AI can actually be a useful tool that enhances, rather than hampers, the writing process.

Helping with the busywork

A February 2025 OpenAI report on ChatGPT use among college-aged users found that more than one-quarter of their ChatGPT conversations were education-related.

The report also revealed that the top five uses for students were writing-centered: starting papers and projects (49%); summarizing long texts (48%); brainstorming creative projects (45%); exploring new topics (44%); and revising writing (44%).

These figures challenge the assumption that students use AI merely to cheat or write entire papers.

Instead, they suggest that students are leveraging AI to free up more time to engage in deeper processes and metacognitive behaviors – deliberately organizing ideas, honing arguments and refining style.

If AI allows students to automate routine cognitive tasks – like information retrieval or ensuring that verb tenses are consistent – it doesn’t mean they’re thinking less. It means their thinking is changing.

Of course, students can misuse AI if they use the technology passively, reflexively accepting its outputs and ideas. And overreliance on ChatGPT can erode a student’s unique voice or style.

However, as long as students learn how to use AI intentionally, this shift can be seen as an opportunity, rather than a loss.

Clarifying the creative vision

It has also become clear that AI, when used responsibly, can augment human creativity.

For example, science comedy writer Sarah Rose Siskind recently gave a talk to Harvard students about her creative process. She spoke about how she uses ChatGPT to brainstorm joke setups and explore various comedic scenarios, which allows her to focus on crafting punchlines and refining her comedic timing.

Note how Siskind used AI in ways that didn’t supplant the human touch. Instead of replacing her creativity, AI amplified it by providing structured and consistent feedback, giving her more time to polish her jokes.

Another example is the Rhetorical Prompting Method, which I developed alongside fellow Kennesaw State University researchers. Designed for university students and adult learners, it’s a framework for conversing with an AI chatbot, one that emphasizes the importance of agency in guiding AI outputs.

When writers use precise language to prompt, critical thinking to reflect, and intentional revision to sculpt inputs and outputs, they direct AI to help them generate content that aligns with their vision.

There’s still a process

The Rhetorical Prompting Method mirrors best practices in process writing, which encourages writers to revisit, refine and revise their drafts.

When using ChatGPT, though, it’s all about thoughtfully revisiting and revising prompts and outputs.

For instance, say a student wants to create a compelling PSA for social media to encourage campus composting. She considers her audience. She prompts ChatGPT to draft a short, upbeat message in under 50 words that’s geared to college students.

Reading the first output, she notices it lacks urgency. So she revises the prompt to emphasize immediate impact. She also adds specifics that are important to her message, such as the location of an information session. The final PSA reads:

“Every scrap counts! Join campus composting today at the Commons. Your leftovers aren’t trash – they’re tomorrow’s gardens. Help our university bloom brighter, one compost bin at a time.”
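For writers who want to try this same prompt-revise loop outside the chat window, here is a minimal, hypothetical sketch using the OpenAI Python SDK. The model name and prompt wording are assumptions for illustration; the Rhetorical Prompting Method itself is a human framework, not a piece of software.

```python
# Hypothetical sketch of the prompt-revise loop described above.
# Assumes the OpenAI Python SDK and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

def draft(prompt: str) -> str:
    """Send one prompt and return the model's reply."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # model choice is an assumption
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# First prompt: audience and length are specified up front.
first = draft("Write an upbeat PSA under 50 words encouraging college "
              "students to join campus composting.")

# After reading the draft, the writer revises the prompt to add urgency
# and a concrete detail: the information session at the Commons.
revised = draft("Revise this PSA to stress immediate impact and mention the "
                "info session at the Commons, still under 50 words:\n" + first)
print(revised)
```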

The Rhetorical Prompting Method isn’t groundbreaking; it’s riffing on a process that’s been tested in the writing studies discipline for decades. But I’ve found that it works because it shows writers how to prompt intentionally.

I know this because we asked users about their experiences. In an ongoing study, my colleagues and I polled 133 people who used the Rhetorical Prompting Method for their academic and professional writing:

  • 92% reported that it helped them evaluate writing choices before and during their process.

  • 75% said that they were able to maintain their authentic voice while using AI assistance.

  • 89% responded that it helped them think critically about their writing.

The data suggests that learners take their writing seriously. Their responses reveal that they are thinking carefully about their writing styles and strategies. While this data is preliminary, we continue to gather responses in different courses, disciplines and learning environments.

All of this is to say that, while there are divergent points of view over when and where it’s appropriate to use AI, students are certainly using it. And being provided with a framework can help them think more deeply about their writing.

AI, then, is not just a tool that’s useful for trivial tasks. It can be an asset for creativity. If today’s students – who are actively using AI to write, revise and explore ideas – see AI as a writing partner, I think it’s a good idea for professors to start thinking about helping them learn the best ways to work with it.

The Conversation

Jeanne Beatrix Law does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. AI isn’t replacing student writing – but it is reshaping it – https://theconversation.com/ai-isnt-replacing-student-writing-but-it-is-reshaping-it-254878

How proposed changes to higher education accreditation could impact campus diversity efforts

Source: The Conversation – USA (2) – By Jimmy Aguilar, PhD Candidate in Urban Education Policy, University of Southern California

An executive order seeks to remove ‘discriminatory ideology’ in universities. Critics contend it politicizes the accreditation process. Abraham Gonzalez Fernandez via Getty Images

President Donald Trump on April 23, 2025, signed an executive order that aims to change the higher education accreditation process. It asks accrediting agencies to root out “discriminatory ideology” and roll back diversity, equity and inclusion initiatives on college campuses.

The Conversation asked Jimmy Aguilar, who studies higher education at the University of Southern California, to explain what accreditation is, why it matters and how the Trump order seeks to change it.

What is accreditation and how does it work?

Accreditation is a process that evaluates whether colleges and universities meet standards of academic rigor, institutional integrity and financial stability.

In the United States, there were 88 accrediting agencies during the 2022-23 academic year.

The agencies are formally recognized by the Department of Education and the Council for Higher Education Accreditation.

Accreditation is not a one-time stamp of approval, but a continuous process.

At its core, accreditation is a guarantor of quality in higher education.

The process involves self-assessment and peer review visits.

Colleges typically undergo a full review every five to 10 years, depending on the accrediting agency.

Institutions must meet standards for curriculum, faculty, student services and outcomes, and provide documentation.

Then, federally recognized accrediting agencies review the documentation.

Teams, often made up of peer reviewers from other colleges, conduct campus visits and evaluations before granting or renewing accreditation.

Why do universities need to be accredited?

Accreditation assures students, employers and the public that an institution meets basic academic standards.

It also signals credibility and secures federal financial support.

Without it, colleges cannot access key funding sources such as Pell Grants and federal student loans.

The funding is essential for college budgets and students’ access to higher education.

Accreditation is also required for professional licensure in fields such as teaching, nursing, medicine and law.

It also helps ensure that students can transfer credits between institutions.

What does Trump’s executive order do?

President Donald Trump displays a signed executive order in the Oval Office at the White House on April 23, 2025, in Washington.
Chip Somodevilla/Getty Images

The executive order would reshape the college accreditation system, aligning it with the administration’s political priorities. Those priorities include the rollback of DEI initiatives.

The order seeks to use federal oversight to weaken institutional DEI policies and priorities. It also promotes new standards aligned with the administration’s interpretation of “merit-based” education.

The executive order also directs the Department of Education to penalize agencies that require colleges to implement DEI-related standards.

The Trump administration claims that such standards amount to “unlawful discrimination.”

Penalties may include increased oversight or loss of federal recognition. This would render the accreditation seal meaningless, according to the executive order.

The order also proposes a broad overhaul of the accreditation process, including:

  • Promoting “intellectual diversity” in faculty hiring. The executive order argues that promoting a broader range of viewpoints among faculty will enhance academic freedom. Critics often interpret this language as an effort to increase conservative ideological representation.

  • Streamlining the process for institutions to switch accreditors. During Trump’s first term, his administration removed geographic restrictions, giving colleges more flexibility to choose. The new executive order goes further. It makes it easier for schools to leave agencies whose standards they disagree with.

  • Expanding recognition of new accrediting agencies to increase competition.

  • Linking accreditation more directly to student outcomes. This would shift focus to metrics such as graduation rates and earnings, rather than commitments to diversity or equity.

A 2023 Supreme Court ruling that outlawed affirmative action in university admissions has been a point of contention in the debate over diversity, equity and inclusion in higher education.
Joe Daniel Price/Getty Images

The executive order singles out accreditors for law schools, such as the American Bar Association, and for medical schools, such as the Liaison Committee on Medical Education.

The order accuses them of enforcing DEI standards that conflict with a 2023 Supreme Court ruling that outlawed affirmative action in university admissions.

However, the ruling was limited to race-conscious admissions. It did not directly address faculty hiring or accreditation standards.

That raises questions about whether the order’s interpretation extends beyond the scope of the court’s decision.

The ruling has nonetheless been a point of contention in the debate over diversity, equity and inclusion.

The American Association of University Professors and the Lawyers’ Committee for Civil Rights Under Law have denounced the executive order.

The groups argue that it threatens to politicize accreditation and suppress efforts to promote equity and inclusion.

Nevertheless, the order represents a push by the federal government to influence higher education governance.

The Conversation

Jimmy Aguilar does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. How proposed changes to higher education accreditation could impact campus diversity efforts – https://theconversation.com/how-proposed-changes-to-higher-education-accreditation-could-impact-campus-diversity-efforts-255309

Why the traditional college major may be holding students back in a rapidly changing job market

Source: The Conversation – USA (2) – By John Weigand, Professor Emeritus of Architecture and Interior Design, Miami University

Rethinking the college major could help colleges better understand what employers and students need. Westend61/Getty Images

Colleges and universities are struggling to stay afloat.

The reasons are numerous: declining numbers of college-age students in much of the country, rising tuition at public institutions as state funding shrinks, and a growing skepticism about the value of a college degree.

Pressure is mounting to cut costs by reducing the time it takes to earn a degree from four years to three.

Students, parents and legislators increasingly prioritize return on investment and degrees that are more likely to lead to gainful employment. This has boosted enrollment in professional programs while reducing interest in traditional liberal arts and humanities majors, creating a supply-demand imbalance.

The result has been increasing financial pressure and an unprecedented number of closures and mergers, to date mostly among smaller liberal arts colleges.

To survive, institutions are scrambling to align curriculum with market demand. And they’re defaulting to the traditional college major to do so.

The college major, developed and delivered by disciplinary experts within siloed departments, continues to be the primary benchmark for academic quality and institutional performance.

This structure likely works well for professional majors governed by accreditation or licensure, or more tightly aligned with employment. But in today’s evolving landscape, reliance on the discipline-specific major may not always serve students or institutions well.

As a professor emeritus and former college administrator and dean, I argue that the college major may no longer be able to keep up with the combinations of cross-disciplinary skills and career readiness that employers demand, or with the flexibility students need to best position themselves for the workplace.

Students want flexibility

The college curriculum may be less flexible now than ever.
MoMo Productions/Digital Vision via Getty Images

I see students arrive on campus each year with different interests, passions and talents – eager to stitch them into meaningful lives and careers.

A more flexible curriculum is linked to student success, and students now consult AI tools such as ChatGPT to figure out course combinations that best position them for their future. They want flexibility, choice and time to redirect their studies if needed.

And yet, the moment students arrive on campus – even before they apply – they’re asked to declare a major from a list of predetermined and prescribed choices. The major, coupled with general education and other college requirements, creates an academic track that is anything but flexible.

Not surprisingly, around 80% of college students switch their majors at least once, suggesting that more flexible degree requirements would allow students to explore and combine diverse areas of interest. And the number of careers, let alone jobs, that college graduates are expected to have will only increase as technological change becomes more disruptive.

As institutions face mounting pressures to attract students and balance budgets, and the college major remains the principal metric for doing so, the curriculum may be less flexible now than ever.

How schools are responding

The college major emerged as a response to an evolving workforce that prioritized specialized knowledge.
Fuse/Corbia via Getty Images

In response to market pressures, colleges are adding new high-demand majors at a record pace. Between 2002 and 2022, the number of degree programs nationwide increased by nearly 23,000, or 40%, while enrollment grew only 8%. Some of these majors, such as cybersecurity, fashion business or entertainment design, arguably connect disciplines rather than stand out as distinct. Thus, these new majors siphon enrollment from lower-demand programs within the institution and compete with similar new majors at competitor schools.

At the same time, traditional arts and humanities majors are adding professional courses to attract students and improve employability. Yet, this adds credit hours to the degree while often duplicating content already available in other departments.

Importantly, while new programs are added, few are removed. The challenge lies in faculty tenure and governance, along with a traditional understanding that faculty set the curriculum as disciplinary experts. This makes it difficult to close or revise low-demand majors and shift resources to growth areas.

The result is a proliferation of under-enrolled programs, canceled courses and stretched resources – leading to reduced program quality and declining faculty morale.

Ironically, under the pressure of declining demand, there can be perverse incentives to grow credit hours required in a major or in general education requirements as a way of garnering more resources or adding courses aligned with faculty interests. All of which continues to expand the curriculum and stress available resources.

Universities are also wrestling with the idea of liberal education and how to package the general education requirement.

Although liberal education is increasingly under fire, employers and students still value it.

Students’ career readiness skills – their ability to think critically and creatively, to collaborate effectively and to communicate well – remain strong predictors of future success in the workplace and in life.

Reenvisioning the college major

Even while keeping the requirement that students complete a major in order to earn a degree, colleges can allow students to bundle smaller modules – such as variable-credit minors, certificates or course sequences – into a customizable, modular major.

This lets students, guided by advisers, assemble a degree that fits their interests and goals while drawing from multiple disciplines. A few project-based courses can tie everything together and provide context.

Such a model wouldn’t undermine existing majors where demand is strong. For others, where demand for the major is declining, a flexible structure would strengthen enrollment, preserve faculty expertise rather than eliminate it, attract a growing number of nontraditional students who bring to campus previously earned credentials, and address the financial bottom line by rightsizing curriculum in alignment with student demand.

One critique of such a flexible major is that it lacks depth of study, but it is precisely the combination of curricular content that gives it depth. Another criticism is that it can’t be effectively marketed to an employer. But a customized major can be clearly named and explained to employers to highlight students’ unique skill sets.

Further, as students increasingly try to fit cocurricular experiences – such as study abroad, internships, undergraduate research or organizational leadership – into their course of study, these can also be approved as modules in a flexible curriculum.

It’s worth noting that while several schools offer interdisciplinary studies majors, these are often overly prescriptive or don’t grant students access to in-demand courses. For a flexible-degree model to succeed, course sections would need to be available and added or deleted in response to student demand.

Several schools also now offer microcredentials – skill-based courses or course modules that increasingly include courses in the liberal arts. But these typically need to be completed in addition to the requirements of the major.

We take the college major for granted.

Yet it’s worth noting that the major is a relatively recent invention.

Before the 20th century, students followed a broad liberal arts curriculum designed to create well-rounded, globally minded citizens. The major emerged as a response to an evolving workforce that prioritized specialized knowledge. But times change – and so can the model.

The Conversation

John Weigand does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Why the traditional college major may be holding students back in a rapidly changing job market – https://theconversation.com/why-the-traditional-college-major-may-be-holding-students-back-in-a-rapidly-changing-job-market-258383

At Cannes, decency and dress codes clash with fashion’s red carpet revolution

Source: The Conversation – USA (2) – By Elizabeth Castaldo Lundén, Research Fellow at the School of Cinematic Arts, University of Southern California

Jennifer Lawrence and Robert Pattinson appear on the red carpet prior to the screening of ‘Die, My Love’ at the 78th annual Cannes Film Festival on May 17, 2025. Kristy Sparow/Getty Images

Ahead of the Cannes Film Festival, the spotlight moved from movie stars and directors to the festival’s fashion rules.

Cannes reminded guests to follow the standard black-tie dress code for evening events at the Grand Theatre Lumière – “long dresses and tuxedos” – while highlighting acceptable alternatives, such as cocktail dresses and pantsuits for women, and a black or navy suit with a tie for men.

The real stir, however, came from two additions to the formal guidelines: a ban on nudity “for decency reasons” and a restriction on oversize garments.

The new rules caught many stylists and stars by surprise, with some decrying the move as a regressive attempt to police clothing.

It’s hard not to wonder whether this is part of some broader conservative cultural shift around the world.

But I study the cultural and economic forces behind fashion and media, and I think a lot of the criticism of Cannes is unfounded. To me, the festival isn’t changing its identity. It’s reasserting it.

Red carpet control

Concerns about indecency on the red carpet have appeared before – most notably during the first televised Academy Awards in 1953.

In 1952, the National Association of Radio and Television Broadcasters adopted a censorship code in response to concerns about television’s influence on young audiences. Among its rules for “decency and decorum” were guidelines against revealing clothing, suggestive movements or camera angles that emphasized body parts – all to avoid causing “embarrassment” to the viewers.

Actress Inger Stevens at the 39th Academy Awards in 1967, a year before she was reprimanded for her skimpy attire.
Bettmann/Getty Images

To ensure that no actress would break the decency dress code, the Academy of Motion Picture Arts and Sciences hired acclaimed costume designer Edith Head as a fashion consultant for the show in 1953.

In my book “Fashion on the Red Carpet,” I explain how Head equipped backstage staff with kits to deal with any sartorial emergencies that might arise. That same year, the balcony cameras at the Pantages Theatre accidentally peeked down into the actresses’ cleavage as they walked to the stage. From then on, a supply of tulle – a type of versatile fabric that can easily cover revealing openings that expose too much skin – was kept backstage.

The 1960s posed new challenges. Youth fashion trends clashed with traditional dress codes and television censorship. In 1968, after actress Inger Stevens appeared on the red carpet wearing a mini skirt, the Academy sent a letter reminding attendees of the black-tie – preferably floor-length – dress code. When Barbra Streisand’s Scaasi outfit accidentally turned see-through under the lighting in 1969, Head again warned against “freaky, far-out, unusual fashion” ahead of the 1970 ceremony.

However, in the 1970s, the Oscars eliminated Head’s fashion consultant position. Despite maintaining its black-tie dress code, the absence of a fashion consultant opened the door to some provocative attire, ranging from Cher’s see-through, sheer outfits, to Edy Williams’ provocative, barely-there getups.

Once the fashion consultant position was eliminated for the Oscars, many attendees – like actress Edy Williams – tried to stand out from the crowd with provocative attire.
Fotos International/Getty Images

Old rules in a new era

Racy red carpet appearances have since become a hallmark of awards shows, particularly in the digital age.

Extravagance and shock are a way for celebrities and brands to stand out amid a glut of social media content, especially as brands increasingly pay a fortune to turn celebrities into walking billboards.

And in an era when red carpet looks are carefully curated ahead of time through partnerships with fashion brands, many celebrities expressed frustration about being unable to sport the outfits they had planned to wear at Cannes.

Stylist Rose Forde lamented the restrictions, saying, “You should be able to express yourself as an artist, with your style however you feel,” while actress Chloë Sevigny described the code as “an old-fashioned archaic rule.”

But I still can’t see the Cannes rules as part of any sort of broader conservative backlash.

Whether at the Oscars or the MTV Video Music Awards, backlash over celebrities baring too much skin has gone on for decades. Cannes hasn’t been spared from controversy, either: There was Michelle Morgan’s bikini in 1946, La Cicciolina’s topless look in 1988, Madonna’s Jean Paul Gaultier lingerie in 1991, Leila Depina’s barely-there pearl outfit in 2023 and Bella Hadid’s sheer pantyhose dress in 2024, to name just a few.

Cape Verdean model Leila Depina arrives for the screening of the film ‘Asteroid City’ during the 2023 Cannes Film Festival.
Christophe Simon/AFP via Getty Images

The festival has routinely reminded guests of its dress code, regardless of the cultural zeitgeist.

The “decency” rule, for example, is actually required by French law. Article 222-32 of the French Criminal Code classifies showing private parts in public as a sexual offense, punishable by up to a year in prison and a fine. While the legal definition hinges on intent and setting, the festival, as a public event, technically has to operate within that framework.

Compared to white-tie events like the Nobel Prize ceremony or a state banquet, Cannes’ black-tie requirement is relatively flexible. It allows for cocktail-length dresses and even accommodates pants and flat sandals for women.

Meanwhile, the worry about voluminous clothes points to a practical issue: the movement of bodies in tight spaces.

Unlike the Met Gala – where the fashion spectacle is the focus, and its red carpet is a stage for photo-ops – Cannes is a film festival. The red carpet is the main path thousands of people use to enter the theater.

A dramatic gown – like the one worn at the Met Gala by Cardi B in 2024 – could block others and cause delays. While a photo-op may be the primary goal for celebrities and the brands they promote, the festival has a screening schedule to stick to, and attendees must be able to easily access the venue and their seats.

Red carpet rules are fluid. Sometimes they adapt to cultural shifts. Sometimes they resist them. And sometimes, they’re there to make sure you can fit in your seat in the movie theater.

The Conversation

Elizabeth Castaldo Lundén received funding from Fulbright (2023-2024)

ref. At Cannes, decency and dress codes clash with fashion’s red carpet revolution – https://theconversation.com/at-cannes-decency-and-dress-codes-clash-with-fashions-red-carpet-revolution-256948

From the marriage contract to breaking the glass under the chuppah, many Jewish couples adapt their weddings to celebrate gender equality

Source: The Conversation – USA (3) – By Samira Mehta, Associate Professor of Women and Gender Studies & Jewish Studies, University of Colorado Boulder

The ketubah is a binding document in Jewish law that traditionally spells out a groom’s responsibilities toward his wife − but that many couples adapt to be more egalitarian. PowerSiege/iStock via Getty Images Plus

Traditional Jewish weddings share one key aspect with traditional Christian weddings. Historically, the ceremony was essentially a transfer of property: A woman went from being the responsibility of her father to being the responsibility of her husband.

That may not be the first thing Americans associate with weddings today, but it lives on in rituals and vows. Think, in a traditional Christian wedding, of a bride promising “to obey” her husband, or being “given away” by her father after he walks her down the aisle.

Feminism has changed some aspects of the Christian wedding. More egalitarian or feminist couples, for example, might have the bride be “given away” by both her parents, or have both the bride and groom escorted in by parents. Others skip the “giving” altogether. Queer couples, too, have reimagined the wedding ceremony.

Mara Mooiweer, left, and Elisheva Dan dance during their socially distanced wedding in Brookline, Mass., during the COVID-19 pandemic.
Jessica Rinaldi/The Boston Globe via Getty Images

During research for my book “Beyond Chrismukkah,” about Christian-Jewish interfaith families, many interviewees wound up talking about their weddings and the rituals that they selected or innovated for the day to reflect their cultural background. Some of them had also designed their ceremonies to reflect feminism and marriage equality – something that the interfaith weddings had in common with many weddings where both members of the couple were Jewish.

These values have transformed many Jewish couples’ weddings, just as they have transformed the Christian wedding. Some Jewish couples make many changes, while some make none. And like every faith, Judaism has lots of internal diversity – not all traditional Jewish weddings look the same.

Contracts and covenants

Perhaps one of the most important places where feminism and marriage equality have reshaped traditions is in the “ketubah,” or Jewish marriage contract.

A traditional ketubah is a simple legal document in Hebrew or Aramaic, a related ancient language. Two witnesses sign the agreement, which states that the groom has acquired the bride. However, the ketubah is also sometimes framed as a tool to protect women. The document stipulates the husband’s responsibility to provide for his wife and confirms what he should pay her in case of divorce. Traditional ketubot – the plural of ketubah – did not discuss love, God or intentions for the marriage.

A man in a blue-gray suit signs a colorfully decorated piece of paper as another man in a white shirt watches.
A groom signs the ketubah as witnesses sit beside him in Jerusalem, Israel, in 2014.
Dan Porges/Getty Images

Contemporary ketubot in more liberal branches of Judaism, whether between opposite- or same-sex couples, are usually much more egalitarian documents that reflect the home and the marriage the couple want to create. Some couples adapt the Aramaic text; others keep the Aramaic and pair it with a text in the language they speak every day, describing their intentions for their marriage.

Rather than being simple, printed documents, contemporary ketubot are often beautiful pieces of art, made to hang in a place of prominence in the newlyweds’ home. Sometimes the art makes references to traditional Jewish symbols, such as a pomegranate for fertility and love. Other times, the artist works with the couple to personalize their decorations with images and symbols that are meaningful to them.

Contemporary couples will often also use their ketubah to address an inherent tension in Jewish marriage. Jewish law gives men much more freedom to divorce than it gives women. Because women cannot generally initiate divorce, they can end up as “agunot,” which literally means “chained”: women whose husbands have refused to grant them a religious divorce. Even if the couple have been divorced in secular court, an “agunah” cannot, according to Jewish law, remarry in a religious ceremony.

Contemporary ketubot will sometimes make a note that, while the couple hope to remain married until death, if the marriage deteriorates, the husband agrees to grant a divorce if certain conditions are met. This prevents women from being held hostage in unhappy marriages.

Other couples eschew the ketubah altogether in favor of a new type of document called a “brit ahuvim,” or covenant of lovers. These documents are egalitarian agreements between couples. The brit ahuvim was developed by Rachel Adler, a feminist rabbi with a deep knowledge of Jewish law, and is grounded in ancient Jewish laws for business partnerships between equals. That said, many Jews, including some feminists, do not see the brit ahuvim as equal in status to a ketubah.

A colorful, framed drawing on a white wall, with two older women barely visible sitting on a couch at the back of the room.
Two female ducks are depicted on the ketubah hanging in the sunroom in Lennie Gerber and Pearl Berlin’s home in High Point, N.C.
AP Photo/Allen G. Breed

Building together

Beyond the ketubah, there are any number of other changes that couples make to symbolize their hopes for an egalitarian marriage.

Jewish ceremonies often take place under a canopy called the chuppah, which symbolizes the home that the couple create together. In a traditional Jewish wedding, the bride circles the groom three or seven times before entering the chuppah. The circling represents both her protection of their home and the idea that the groom is now her priority.

Many couples today omit this custom, because they feel it makes the bride subservient to the groom. Others keep the circling but reinterpret it: In circling the groom, the bride actively creates their home, an act of empowerment. Other egalitarian couples, regardless of their genders, share the act of circling: Each spouse circles three times, and then the pair circle once together.

In traditional Jewish weddings, like in traditional Christian weddings, the groom gives his bride a ring to symbolize his commitment to her – and perhaps to mark her as a married woman. Many contemporary Jewish couples exchange two rings: both partners offering a gift to mark their marriage and presenting a symbol of their union to the world. While some see this shift as an adaptation to American culture, realistically, the dual-ring ceremony is a relatively new development in both American Christian and American Jewish marriage ceremonies.

Finally, Jewish weddings traditionally end when the groom stomps on and breaks a glass, and the entire crowd yells “Mazel tov” to congratulate the couple. People debate the symbolism of the broken glass. Some say it reminds us that life contains both joy and sorrow, or that it recalls a foundational crisis in Jewish history: the destruction of the Second Temple in Jerusalem in 70 C.E. Others say it is a reminder that life is fragile, or that marriage, unlike the glass, is an unbreakable covenant.

A man and woman, both wearing white, smile as they raise their joined hands above their heads.
Yulia Tagil and Stas Granin celebrate their union on July 25, 2010, at a square in Tel Aviv. The couple held a public wedding to protest Israeli marriage guidelines set by the chief rabbinate.
Uriel Sinai/Getty Images

Whatever the symbolism, in some contemporary weddings both partners step on glasses, or one partner places their foot on top of the other’s so that the newlyweds can break the glass together. The couple symbolize their commitment to equality – and both get to take part in a fun wedding custom.

There are many other innovations in contemporary Jewish weddings that have much less to do with feminism and egalitarianism, such as personalized wedding canopies or wedding programs. But these key changes represent how the wedding ceremony itself has become more egalitarian in response to both feminism and marriage equality.

The Conversation

Samira Mehta receives funding from the Henry Luce Foundation for work on Jews of Color.

ref. From the marriage contract to breaking the glass under the chuppah, many Jewish couples adapt their weddings to celebrate gender equality – https://theconversation.com/from-the-marriage-contract-to-breaking-the-glass-under-the-chuppah-many-jewish-couples-adapt-their-weddings-to-celebrate-gender-equality-229084

I’m a business professor who asked dozens of former students how they define success. Here are their lessons for today’s grads

Source: The Conversation – USA (2) – By Patrick Abouchalache, Lecturer in Strategy and Innovation, Boston University

As the Class of 2025 graduates into an uncertain and fast-changing working world, they face a crucial question: What does it mean to be successful?

Is it better to take a job that pays more, or one that’s more prestigious? Should you prioritize advancement, relationship building, community impact or even the opportunity to live somewhere new? Sorting through these questions can feel overwhelming.

I am a business school professor who spends a lot of time mentoring students and alumni in Generation Z – those born between 1997 and 2012. As part of this effort, I’ve surveyed about 300 former undergraduate students and spoken at length with about 50 of them.

Through these conversations, I’ve watched them wrestle with the classic conflicts of young adulthood – such as having to balance external rewards like money against internal motivations like wanting to be of service.

I recently revisited their stories and reflections, and I compiled the most enduring insights to offer to the next generation of graduates.

Here’s their collective advice to the Class of 2025:

1. Define what matters most to you

Success starts with self-reflection. It means setting aside society’s noise and defining your own values.

When people are driven by internal rewards like curiosity, purpose or pleasure in an activity itself – rather than outside benefits such as money – psychologists say they have “intrinsic motivation.”

Research shows that people driven by intrinsic motivation tend to display higher levels of performance, persistence and satisfaction. Harvard Business School professor Teresa Amabile’s componential theory further suggests that creativity flourishes when people’s skills align with their strongest intrinsic interests.

The alternative is to “get caught up in society’s expectations of success,” as one consulting alum put it. She described struggling to choose between a job offer at a Fortune 500 company and one at a lesser-known independent firm. In the end, she chose to go with the smaller business. It was, she stressed, “the right choice for me.” This is crucial advice: Make yourself proud, not others.

One related principle I share with students is the “Tell your story” rule. If a job doesn’t allow you to tell your story – in other words, if it doesn’t mirror your vision, values, talents and goals – keep looking for a new role.

2. Strive for balance, not burnout

A fulfilling life includes time for relationships, health and rest. While many young professionals feel endless pressure to hustle, the most fulfilled alumni I spoke with learned to take steps to protect their personal well-being.

For example, a banking alum told me that business once dominated his thoughts “24/7.” He continued, “I’m happier now that I make more time for a social life and paying attention to all my relationships – professional, personal, community, and let’s not forget myself.”

And remember that balance and motivations can change throughout your life. As one alum explained: “Your goals change and therefore your definition of success changes. I think some of the most successful people are always adapting what success means to them – chasing success even if they are already successful.”

3. Be kind, serve others and maximize your ‘happy circle’

“Some people believe to have a positive change in the world you must be a CEO or have a ton of money,” another alum told me. “But spreading happiness or joy can happen at any moment, has no cost, and the results are priceless.”

Many alumni told me that success isn’t just a matter of personal achievement – it’s about giving back to society. That could be through acts of kindness, creativity, innovation, or other ways of improving people’s lives. A retail alum shared advice from her father: “When your circle is happy, you are going to be happy,” she said. “It’s sort of an upward spiral.”

Your “happy circle” doesn’t need to consist of people you know. An alum who went into the pharmaceutical industry said his work’s true reward was measured in “tens of thousands if not millions of people” in better health thanks to his efforts.

In fact, your happy circle doesn’t even need to be exclusively human. An alum who works in ranching said he valued the well-being of animals – and their riders – more than money or praise.

4. Be a good long-term steward of your values

Success isn’t just about today – it’s about what you stand for.

Several alumni spoke passionately about stewardship: the act of preserving and passing on values, relationships and traditions. This mindset extended beyond family to employees, customers and communities. As one alum who majored in economics put it, success is “leaving a mark on the world and creating a legacy that extends beyond one’s quest for monetary gain.”

One alum defined success as creating happiness and stability not just for herself, but for her loved ones. Another, who works in hospitality, said he had a duty to further his employees’ ambitions and help them grow and develop – creating a legacy that will outlast any title or paycheck.

In an analysis by the organizational consulting firm Korn Ferry, Gen Z employees were found to be more prone to burnout when their employers lacked clear values. These findings reinforce what my students already know: Alignment between your values and your work is key to success.

Final words for the Class of 2025

To the latest crop of grads, I offer this advice: Wherever life takes you next – a family business or corporate office, Wall Street or Silicon Valley, or somewhere you can’t even imagine now – remember that your career will be long and full of ups and downs.

You’ll make tough choices. You’ll face pressures. But if you stay grounded, invest in your well-being, celebrate your happy circle and honor your values, you’ll look back one day and see not just a job well done, but a life well lived.

Bon voyage!

The Conversation

Patrick Abouchalache does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. I’m a business professor who asked dozens of former students how they define success. Here are their lessons for today’s grads – https://theconversation.com/im-a-business-professor-who-asked-dozens-of-former-students-how-they-define-success-here-are-their-lessons-for-todays-grads-256189

When does a kid become an adult?

Source: The Conversation – USA (2) – By Jonathan B. Santo, Professor of Psychology, University of Nebraska Omaha

They might not be grown-ups yet. Klaus Vedfelt/DigitalVision via Getty Images

Curious Kids is a series for children of all ages. If you have a question you’d like an expert to answer, send it to CuriousKidsUS@theconversation.com.


When does a kid become an adult? – Avery, age 8, Los Angeles


Not everyone grows up at the same pace, even though U.S. law holds that you reach adulthood when you turn 18. That is the age at which you are treated as an adult in terms of criminal responsibility. However, states differ on the “civil age of majority,” which means you don’t necessarily get all the rights and privileges reserved for grown-ups at that point.

For example, U.S. citizens may vote or get a tattoo without their parents’ consent when they’re 18, but they can’t legally buy or consume alcohol until their 21st birthday. Young Americans are subject to extra restrictions and fees if they want to rent a car before they’re 25 – even if they got a driver’s license when they turned 16 and have been earning a living for years.

Even physical signs of maturity don’t provide an easy answer to this question. Puberty brings about physical changes associated with adulthood like facial hair or breast development. It also marks the onset of sexual maturity – being able to have children.

Those changes don’t happen at the same time for everyone.

For example, girls typically start going through puberty and beginning to look like adults at an earlier age than boys. Some people don’t look like grown-ups until they’re well into their 20s.

In my view, as a professor of developmental psychology, what really matters in terms of becoming an adult is how people feel and behave, and the responsibilities they handle.

18th Birthday cake with fruit and chocolate.
Even if you’ve developed a sophisticated palate by the time you turn 18, you still aren’t necessarily a full-fledged adult.
nedomacki/Getty Images

Age at milestones may vary

Because everybody is unique, there’s no standard timeline for growing up. Some people learn how to control their emotions, develop the judgment to make good decisions and manage to earn enough to support themselves by the age of 18.

Others take longer.

Coming of age also varies due to cultural differences. In some families, it’s expected that you’ll remain financially dependent on your parents until your mid-20s as you get a college education or job training.

Even within one family, your personality, experiences, career path and specific circumstances can influence how soon you’d be expected to shoulder adult responsibilities.

A young blonde woman stands while her photo is taken.
Drew Barrymore attends a movie premiere at the age of 15 – one year after a judge declared her to be an adult in the eyes of the law through emancipation.
Ron Galella, Ltd. via Getty Images

Some young people technically enter adulthood before they turn 18 through a process called “emancipation” – a legal status indicating that a young person is responsible for their own financial affairs and medical obligations.

Economic independence is hard for young teens to attain, however, because child labor in the U.S. is restricted and regulated by federal law, with states setting some of the rules. States also determine how old you have to be to get married. In most states, that’s 18 years old. But some states allow marriage at any age.

Differentiating between kids and adults

Understanding the differences between how children and adults think can help explain when a kid becomes an adult.

For example, children tend to think concretely and may struggle more than adults with abstract concepts like justice or hypothetical scenarios.

Kids and teens also have shorter attention spans than adults and are more easily distracted, whereas adults are generally better at filtering out distractions.

What’s more, children, especially little ones, tend to have more trouble controlling their emotions. They’re more prone than adults to crying or screaming when they are frustrated or upset.

One reason it may not be possible to be fully grown up by the time you turn 18 or even 21 is the brain itself. The prefrontal cortex, the part of the brain that plays a crucial role in planning and weighing risks, doesn’t fully develop in most people before their 25th birthday.

Making choices that have lifelong consequences

The delay in the brain’s maturity can make it hard for young adults to fully consider the real-world consequences of their actions and choices. This mismatch may explain why adolescents and people in their early 20s often engage in risky or even reckless behavior – such as driving too fast, not wearing a seatbelt, using dangerous drugs, binge drinking or stealing things.

Despite the medical evidence about the late maturation of the brain, the law doesn’t provide any leeway for whether someone has truly matured if they’re accused of breaking the law. Once they’re 18 years old, Americans can legally be tried as adults for serious crimes, including murder.

These still-developing parts of the brain also help explain why children are more susceptible to peer pressure. For instance, adolescents are more prone to confess to crimes they didn’t commit under police interrogation, partly because they can’t properly weigh the long-term consequences of their decisions.

However, there are benefits to adolescents’ higher tolerance for risk and risk-taking. This can help explain why many young people are motivated to engage in protests over climate change and other causes.

Feeling like a real adult

In North America, some young people who by many standards are adults – in that they are over 20 years old, own a car and have a job – may not feel like they’re grown-ups regardless of what the law has to say about it. The psychologist Jeffrey Arnett coined the term “emerging adults” to describe Americans who are 21-25 years old but don’t yet feel like they’re grown-ups.

When someone becomes an adult, regardless of what the law says, really depends on the person.

There are 25-year-olds with full-time jobs and their own children who may still not feel like adults and still rely on their parents for a lot of things grown-ups typically handle. There are 17-year-olds who make all of their own doctor’s appointments, take care of their younger siblings or grandparents, and do all the grocery shopping, meal planning and laundry for their household. They probably see themselves as adults.

Growing up is about gaining experiences, making mistakes and learning from them, while also taking responsibility for your own actions. As there’s no single definition of adulthood, everyone has to decide for themselves whether or not they’ve turned into a grown-up yet.


Hello, curious kids! Do you have a question you’d like an expert to answer? Ask an adult to send your question to CuriousKidsUS@theconversation.com. Please tell us your name, age and the city where you live.

And since curiosity has no age limit – adults, let us know what you’re wondering, too. We won’t be able to answer every question, but we will do our best.

The Conversation

Jonathan B. Santo does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. When does a kid become an adult? – https://theconversation.com/when-does-a-kid-become-an-adult-246287